WorldWideScience

Sample records for providing quantitative estimates

  1. Data Service Provider Cost Estimation Tool

    Science.gov (United States)

    Fontaine, Kathy; Hunolt, Greg; Booth, Arthur L.; Banks, Mel

    2011-01-01

    The Data Service Provider Cost Estimation Tool (CET) and Comparables Database (CDB) package provides to NASA's Earth Science Enterprise (ESE) the ability to generate year-by-year lifecycle cost estimates for the implementation and operation of the data service providers that ESE requires to support its science and applications programs. The CET can make estimates dealing with staffing costs, supplies, facility costs, network services, hardware and maintenance, commercial off-the-shelf (COTS) software licenses, software development and sustaining engineering, and the changes in costs that result from changes in workload. Data service providers may be stand-alone or embedded in flight projects, field campaigns, research or applications projects, or other activities. The CET and CDB package employs a cost-estimation-by-analogy approach. It is based on a new, general data service provider reference model that provides a framework for construction of a database describing existing data service providers that are analogs (or comparables) to planned, new ESE data service providers. The CET implements the staff effort and cost estimation algorithms that access the CDB and generates the lifecycle cost estimate for a new data service provider. These data create a common basis on which an ESE proposal evaluator can consider projected data service provider costs.
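    The abstract does not spell out the estimation algorithm, but cost estimation by analogy generally means scaling the observed costs of the closest comparable to the planned workload. A minimal sketch of that idea, assuming a single workload driver; the names (ComparableProvider, estimate_staff_fte) and numbers are illustrative, not the CET's actual interface or data:

        from dataclasses import dataclass

        @dataclass
        class ComparableProvider:
            name: str
            annual_volume_tb: float  # workload driver: data volume served per year
            staff_fte: float         # observed staffing at that workload

        def estimate_staff_fte(comparables, planned_volume_tb):
            # Pick the closest analog by workload and scale its staffing linearly.
            analog = min(comparables,
                         key=lambda c: abs(c.annual_volume_tb - planned_volume_tb))
            return analog.staff_fte * planned_volume_tb / analog.annual_volume_tb

        cdb = [ComparableProvider("provider-A", 50.0, 12.0),
               ComparableProvider("provider-B", 200.0, 30.0)]
        print(estimate_staff_fte(cdb, 120.0))  # provider-A is the closer analog -> 28.8

    A real comparables database would carry several workload drivers and cost categories; the scaling step is where the estimation algorithms described above do their work.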

  2. Quantitative estimation of pollution in groundwater and surface ...

    African Journals Online (AJOL)

    Quantitative estimation of pollution in groundwater and surface water in Benin City and environs. ... socio-economic and public health concerns. Constant monitoring is recommended to provide information on water quality and health guide. Keywords: Microbial load, Physico-chemical properties, Water sources, Benin City ...

  3. Unrecorded Alcohol Consumption: Quantitative Methods of Estimation

    OpenAIRE

    Razvodovsky, Y. E.

    2010-01-01

    Keywords: unrecorded alcohol; methods of estimation. In this paper we focus on methods for estimating the level of unrecorded alcohol consumption. Existing methods allow only an approximate estimate of this level. Taking into consideration the extreme importance of such data, further investigation is necessary to improve the reliability of methods for estimating unrecorded alcohol consumption.

  4. River Forecasting Center Quantitative Precipitation Estimate Archive

    Data.gov (United States)

    U.S. Geological Survey, Department of the Interior — Radar-indicated, rain gage-verified and corrected hourly precipitation estimates on a corrected ~4 km HRAP grid. This archive contains hourly estimates of precipitation...

  5. Quantitative volumetric breast density estimation using phase contrast mammography

    Science.gov (United States)

    Wang, Zhentian; Hauser, Nik; Kubik-Huch, Rahel A.; D'Isidoro, Fabio; Stampanoni, Marco

    2015-05-01

    Phase contrast mammography using a grating interferometer is an emerging technology for breast imaging. It provides complementary information to conventional absorption-based methods. Additional diagnostic value could be obtained by retrieving quantitative information from the three physical signals (absorption, differential phase and small-angle scattering) yielded simultaneously. We report a non-parametric quantitative volumetric breast density estimation method that exploits the ratio (dubbed the R value) of the absorption signal to the small-angle scattering signal. The R value is used to determine breast composition, and the volumetric breast density (VBD) of the whole breast is obtained analytically by deducing the relationship between the R value and the pixel-wise breast density. The proposed method is tested in a phantom study and on a group of 27 mastectomy samples. In the clinical evaluation, the estimated VBD values from both cranio-caudal (CC) and anterior-posterior (AP) views are compared with the ACR scores given by radiologists to the pre-surgical mammograms. The results show that the estimated VBD results using the proposed method are consistent with the pre-surgical ACR scores, indicating the effectiveness of this method in breast density estimation. A positive correlation is found between the estimated VBD and the diagnostic ACR score for both the CC view (p = 0.033) and the AP view (p = 0.001). A linear regression between the results of the CC view and AP view showed a correlation coefficient γ = 0.77, which indicates the robustness of the proposed method and the quantitative character of the additional information obtained with our approach.
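    As a rough illustration of the R-value idea, the per-pixel ratio of absorption to small-angle scattering is mapped to a fibroglandular fraction through a calibration derived from a phantom study, then averaged into a whole-breast VBD. The calibration function is deliberately left as a placeholder, since the paper's analytic relationship is not reproduced here:

        import numpy as np

        def volumetric_breast_density(absorption, scattering, calibrate_density):
            # R value: ratio of absorption signal to small-angle scattering signal.
            R = absorption / np.maximum(scattering, 1e-12)  # guard against division by zero
            density = calibrate_density(R)  # pixel-wise breast density in [0, 1]
            return float(np.mean(density))  # VBD as the mean density over the breast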

  6. Quantitative Estimate of Weeds of Sugarcane ( Saccharum ...

    African Journals Online (AJOL)

    A quantitative method was employed for the enumeration of weeds. Quadrats were laid along transects and individual weed species in each quadrat was identified and counted. Simpson's diversity index, Sorensen similarity index and relative abundance were used to determine the weed community structure. A total of 51 ...

  7. Novel method for quantitative estimation of biofilms

    DEFF Research Database (Denmark)

    Syal, Kirtimaan

    2017-01-01

    were quantified by the proposed protocol. For ease of reference, this method has been described as the Syal method for biofilm quantification. This new method was found to be useful for the estimation of early-phase biofilm and the aerobic biofilm layer formed at the liquid-air interface. The biofilms...... formed by all three tested bacteria - B. subtilis, E. coli and M. smegmatis - were precisely quantified....

  8. Smile line assessment comparing quantitative measurement and visual estimation

    NARCIS (Netherlands)

    Geld, P. Van der; Oosterveld, P.; Schols, J.; Kuijpers-Jagtman, A.M.

    2011-01-01

    INTRODUCTION: Esthetic analysis of dynamic functions such as spontaneous smiling is feasible by using digital videography and computer measurement for lip line height and tooth display. Because quantitative measurements are time-consuming, digital videography and semiquantitative (visual) estimation

  9. The effects of communicating uncertainty in quantitative health risk estimates.

    Science.gov (United States)

    Longman, Thea; Turner, Robin M; King, Madeleine; McCaffery, Kirsten J

    2012-11-01

    To examine the effects of communicating uncertainty in quantitative health risk estimates on participants' understanding, risk perception and perceived credibility of the risk information source. 120 first-year psychology students were given a hypothetical health-care scenario, with source of risk information (clinician, pharmaceutical company) varied between subjects and uncertainty (point, small-range and large-range risk estimate format) varied within subjects. Communicating uncertainty as either a small or a large range reduced accurate understanding, and a large range increased perceptions of risk compared with a point estimate. Communicating uncertainty also reduced the perceived credibility of the information source, though for the clinician this was only the case when a large range was presented. The findings suggest that even for highly educated adults, communicating uncertainty as a range risk estimate has the potential to negatively affect understanding, increase risk perceptions and decrease perceived credibility. Communicating uncertainty in risk using a numeric range should be carefully considered by health-care providers. More research is needed to develop alternative strategies to effectively communicate the uncertainty in health risks to consumers. Copyright © 2012 Elsevier Ireland Ltd. All rights reserved.

  10. Quantitative Compactness Estimates for Hamilton-Jacobi Equations

    Science.gov (United States)

    Ancona, Fabio; Cannarsa, Piermarco; Nguyen, Khai T.

    2016-02-01

    We study quantitative compactness estimates in W^{1,1}_loc for the map S_t, t > 0, that associates with given initial data u_0 ∈ Lip(R^N) the corresponding solution S_t u_0 of a Hamilton-Jacobi equation u_t + H(∇_x u) = 0, t ≥ 0, x ∈ R^N, with a uniformly convex Hamiltonian H = H(p). We provide upper and lower estimates of order 1/ε^N on the Kolmogorov ε-entropy in W^{1,1} of the image through the map S_t of sets of bounded, compactly supported initial data. Estimates of this type are inspired by a question posed by Lax (Course on Hyperbolic Systems of Conservation Laws. XXVII Scuola Estiva di Fisica Matematica, Ravello, 2002) within the context of conservation laws, and could provide a measure of the order of "resolution" of a numerical method implemented for this equation.
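    For reference, the quantity being bounded is the Kolmogorov ε-entropy, a standard definition rather than anything specific to this paper; the two-sided bound below is a schematic restatement of the abstract's "upper and lower estimates of order 1/ε^N", with the constants and their dependence on t and the initial data set K left to the paper:

        \[
        \mathcal{H}_{\varepsilon}(K \mid W^{1,1}) = \log_{2} N_{\varepsilon}(K),
        \qquad
        c_{1}\,\varepsilon^{-N} \;\le\; \mathcal{H}_{\varepsilon}\bigl(S_{t}(K) \mid W^{1,1}\bigr) \;\le\; c_{2}\,\varepsilon^{-N},
        \]

    where N_ε(K) is the minimal number of ε-balls in W^{1,1} needed to cover K.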

  11. nowCOAST's Map Service for NOAA Quantitative Precipitation Estimates (Time Enabled)

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — Map Information: This nowCOAST time-enabled map service provides maps depicting the NWS Multi-Radar Multi-Sensor (MRMS) quantitative precipitation estimate mosaics...

  12. Do quantitative decadal forecasts from GCMs provide decision relevant skill?

    Science.gov (United States)

    Suckling, E. B.; Smith, L. A.

    2012-04-01

    It is widely held that only physics-based simulation models can capture the dynamics required to provide decision-relevant probabilistic climate predictions. This fact in itself provides no evidence that predictions from today's GCMs are fit for purpose. Empirical (data-based) models are employed to make probability forecasts on decadal timescales, where it is argued that these 'physics free' forecasts provide a quantitative 'zero skill' target for the evaluation of forecasts based on more complicated models. It is demonstrated that these zero-skill models are competitive with GCMs on decadal scales for probability forecasts evaluated over the last 50 years. Complications of statistical interpretation due to the 'hindcast' nature of this experiment, and the likely relevance of arguments that the lack of hindcast skill is irrelevant as the signal will soon 'come out of the noise', are discussed. A lack of decision-relevant quantitative skill does not bring the science-based insights of anthropogenic warming into doubt, but it does call for a clear quantification of limits, as a function of lead time, for spatial and temporal scales on which decisions based on such model output are expected to prove maladaptive. Failing to do so may risk the credibility of science in support of policy in the long term. The performance of a collection of simulation models is evaluated by transforming ensembles of point forecasts into probability distributions through the kernel dressing procedure [1], scoring them with a selection of proper skill scores [2], and contrasting them with purely data-based empirical models. Data-based models are unlikely to yield realistic forecasts for future climate change if the Earth system moves away from the conditions observed in the past, upon which the models are constructed; in this sense the empirical model defines zero skill. When should a decision-relevant simulation model be expected to significantly outperform such empirical models? Probability
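    A minimal sketch of the kernel dressing step [1] paired with a proper score [2], assuming Gaussian kernels and a fixed bandwidth (in practice the bandwidth is fitted to a forecast-outcome archive):

        import numpy as np

        def kernel_dressed_density(ensemble, x, bandwidth):
            # Place a Gaussian kernel on each ensemble member and average.
            z = (x - ensemble[:, None]) / bandwidth
            return np.mean(np.exp(-0.5 * z ** 2) / (bandwidth * np.sqrt(2.0 * np.pi)), axis=0)

        def ignorance_score(ensemble, outcome, bandwidth):
            # Proper skill score: negative log of the predictive density at the outcome.
            p = kernel_dressed_density(ensemble, np.array([outcome]), bandwidth)[0]
            return -np.log2(p)

        members = np.array([0.10, 0.30, 0.20, 0.25])  # ensemble of point forecasts
        print(ignorance_score(members, outcome=0.28, bandwidth=0.05))

    Comparing this score for GCM ensembles against the same score for the empirical model is the zero-skill test described above.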

  13. WetLab-2: Providing Quantitative PCR Capabilities on ISS

    Science.gov (United States)

    Parra, Macarena; Jung, Jimmy Kar Chuen; Almeida, Eduardo; Boone, Travis David; Schonfeld, Julie; Tran, Luan Hoang

    2015-01-01

    The objective of NASA Ames Research Center's WetLab-2 Project is to place on the ISS a system capable of conducting gene expression analysis via quantitative real-time PCR (qRT-PCR) of biological specimens sampled or cultured on orbit. The WetLab-2 system is capable of processing sample types ranging from microbial cultures to animal tissues dissected on-orbit. The project has developed an RNA preparation module that can lyse cells and extract RNA of sufficient quality and quantity for use as templates in qRT-PCR reactions. Our protocol has the advantage that it uses no toxic chemicals, alcohols or other organics. The resulting RNA is transferred into a pipette and then dispensed into reaction tubes that contain all the lyophilized reagents needed to perform qRT-PCR reactions. These reaction tubes are mounted on rotors to centrifuge the liquid to the reaction window of the tube using a cordless drill. System operations require simple and limited crew actions including syringe pushes, valve turns and pipette dispenses. The resulting process takes less than 30 min to have tubes ready for loading into the qRT-PCR unit. The project has selected a Commercial-Off-The-Shelf (COTS) qRT-PCR unit, the Cepheid SmartCycler, that will fly in its COTS configuration. The SmartCycler has a number of advantages including modular design (16 independent PCR modules), low power consumption, rapid thermal ramp times and four-color detection. The ability to detect up to four fluorescent channels will enable multiplex assays that can be used to normalize for RNA concentration and integrity, and to study multiple genes of interest in each module. The WetLab-2 system will have the capability to downlink data from the ISS to the ground after a completed run and to uplink new programs. The ability to conduct qRT-PCR on-orbit eliminates the confounding effects on gene expression of reentry stresses and shock acting on live cells and organisms, or the concern of RNA degradation of fixed samples. The

  14. Quantitative estimation of urinary protein excretion by refractometry.

    Science.gov (United States)

    Kumar, S; Visweswaran, K; Sobha, A; Sarasa, G; Nampoory, M R

    1992-09-01

    Quantitative estimation of proteinuria by the refractometric method was compared with the sulphosalicylic acid method and the biuret method in 102 urine samples. Analysis of the results by Student's t test showed no statistically significant difference between the three methods. It is concluded that quantitative estimation of urinary protein excretion by the refractometric method is a simple, cheap and reliable method that can be performed easily in the outpatient clinic. The instrument is quite handy and can be carried in the pocket.

  15. Quantitative Estimates of Bio-Remodeling on Coastal Rock Surfaces

    Directory of Open Access Journals (Sweden)

    Marta Pappalardo

    2016-05-01

    Full Text Available Remodeling of rocky coasts and erosion rates have been widely studied in past years, but not all the processes acting on rock surfaces have been quantitatively evaluated yet. The first goal of this paper is to review the different methodologies employed to quantify the effect of biotic agents on rocks exposed to coastal morphologic agents, comparing their efficiency. Secondly, we focus on geological methods to assess and quantify bio-remodeling, presenting case studies from an area of the Mediterranean Sea in which different geological methods, inspired by the reviewed literature, have been tested in order to provide a quantitative assessment of the effects some biological covers exert on rocky platforms in tidal and supra-tidal environments. In particular, different experimental designs based on Schmidt hammer test results have been applied in order to estimate rock hardness related to different orders of littoral platforms and the bio-erosive/bio-protective role of Chthamalus spp. and Verrucaria adriatica. All data collected have been analyzed using statistical tests to evaluate the significance of the measures and methodologies. The effectiveness of this approach is analyzed, and its limits are highlighted. In order to overcome the latter, a strategy combining geological and experimental-computational approaches is proposed, potentially capable of revealing novel clues on bio-erosion dynamics. An experimental-computational proposal to assess the indirect effects of the biofilm coverage of rocky shores is presented in this paper, focusing on the shear forces exerted during hydration-dehydration cycles. The results of computational modeling can be compared to experimental evidence, from nanoscopic to macroscopic scales.

  16. A quantitative approach for sex estimation based on cranial morphology.

    Science.gov (United States)

    Nikita, Efthymia; Michopoulou, Efrossyni

    2017-12-19

    This paper proposes a method for the quantification of the shape of sexually dimorphic cranial traits, namely the glabella, mastoid process and external occipital protuberance. The proposed method was developed using 165 crania from the documented Athens Collection and tested on 20 Cretan crania. It is based on digital photographs of the lateral view of the cranium, drawing of the profile of three sexually dimorphic structures and calculation of variables that express the shape of these structures. The combinations of variables that provide optimum discrimination between sexes are identified by means of binary logistic regression and discriminant analysis. The best cross-validated results are obtained when variables from all three structures are combined and range from 75.8 to 85.1% and 81.1 to 94.6% for males and females, respectively. The success rate is 86.3-94.1% for males and 83.9-93.5% for females when half of the sample is used for training and the rest for prediction. Correct classification for the Cretan material based upon the standards developed for the Athens sample was 80-90% for the optimum combinations of discriminant variables. The proposed method provides an effective way to capture quantitatively the shape of sexually dimorphic cranial structures; it gives more accurate results relative to other existing methods and it does not require specialized equipment. Equations for sex estimation based on combinations of variables are provided, along with instructions on how to use the method and Excel macros for calculation of discriminant variables with automated implementation of the optimum equations. © 2017 Wiley Periodicals, Inc.

  17. Uncertainty estimations for quantitative in vivo MRI T1 mapping

    Science.gov (United States)

    Polders, Daniel L.; Leemans, Alexander; Luijten, Peter R.; Hoogduin, Hans

    2012-11-01

    Mapping the longitudinal relaxation time (T1) of brain tissue is of great interest for both clinical research and MRI sequence development. For an unambiguous interpretation of in vivo variations in T1 images, it is important to understand the degree of variability that is associated with the quantitative T1 parameter. This paper presents a general framework for estimating the uncertainty in quantitative T1 mapping by combining a slice-shifted multi-slice inversion recovery EPI technique with the statistical wild-bootstrap approach. Both simulations and experimental analyses were performed to validate this novel approach and to evaluate the estimated T1 uncertainty in several brain regions across four healthy volunteers. By estimating the T1 uncertainty, it is shown that the variation in T1 within anatomic regions for similar tissue types is larger than the uncertainty in the measurement. This indicates that heterogeneity of the inspected tissue and/or partial volume effects can be the main determinants for the observed variability in the estimated T1 values. The proposed approach to estimate T1 and its uncertainty without the need for repeated measurements may also prove to be useful for calculating effect sizes that are deemed significant when comparing group differences.
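    A hedged sketch of the wild-bootstrap idea for T1 uncertainty: fit a signal model once, flip the sign of each residual at random (Rademacher weights), refit, and take the spread of refitted T1 values as the uncertainty. The three-parameter magnitude inversion-recovery model below is a common choice, not necessarily the paper's exact model:

        import numpy as np
        from scipy.optimize import curve_fit

        def ir_signal(ti, a, b, t1):
            # Magnitude inversion-recovery signal model.
            return np.abs(a - b * np.exp(-ti / t1))

        def wild_bootstrap_t1(ti, signal, n_boot=500, seed=0):
            rng = np.random.default_rng(seed)
            p0 = [signal.max(), 2.0 * signal.max(), 1000.0]  # rough starting values (ms)
            popt, _ = curve_fit(ir_signal, ti, signal, p0=p0, maxfev=5000)
            residuals = signal - ir_signal(ti, *popt)
            t1_samples = []
            for _ in range(n_boot):
                flips = rng.choice([-1.0, 1.0], size=residuals.size)  # Rademacher weights
                boot = ir_signal(ti, *popt) + flips * residuals
                p_b, _ = curve_fit(ir_signal, ti, boot, p0=popt, maxfev=5000)
                t1_samples.append(p_b[2])
            return popt[2], float(np.std(t1_samples))  # T1 estimate and its uncertainty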

  18. How Accurately Can Emergency Department Providers Estimate Patient Satisfaction?

    Directory of Open Access Journals (Sweden)

    Lalena M. Yarris

    2012-09-01

    Full Text Available Introduction: Patient satisfaction is an important measure of emergency department (ED) quality of care. Little is known about providers' ability to estimate patient satisfaction. We aimed to measure providers' ability to assess patient satisfaction and hypothesized that providers could accurately estimate overall patient satisfaction. Methods: We surveyed ED patients regarding satisfaction with their care. Treating providers completed analogous surveys, estimating patients' responses. Sexual assault victims and non-English-speaking or severely ill patients were excluded. Satisfaction responses were categorized as "satisfied" or "not satisfied." Patient satisfaction scores were considered the "gold standard," and providers' perceptions of the patient satisfaction were considered tests. Measures of diagnostic accuracy, such as positive predictive value (PPV) and sensitivity, were used to assess how accurately the provider could estimate his or her patient's satisfaction. Results: Here, 242/457 eligible patients (53%) completed the survey; 227 providers (94%) completed a corresponding survey. Subject-reported overall satisfaction was 96.6%, compared with a provider-estimated rate of 94.4%. The sensitivity and PPV of the provider's estimate of the patient's satisfaction were 95.2 (95% confidence interval [CI] 91.4, 97.7) and 97.5 (95% CI 94.4, 99.2), respectively, for overall patient satisfaction. The PPV was similar for clarity of communication. The PPV was 78.9 for perceived length of ED stay (99% CI 70.8, 85.6) and 82.6 for quality of pain control (95% CI 68.6, 92.2). Accuracy of attending and resident estimates of patient satisfaction did not differ significantly. The agreement between patient-reported and provider-estimated patient satisfaction was not associated with age, gender, patient disposition, or ED divert status. Conclusion: Providers are able to assess overall patient satisfaction and clarity of
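    The diagnostic-accuracy framing above reduces to a 2x2 table in which patient-reported satisfaction is the gold standard and the provider's estimate is the test. A worked example with invented counts chosen only to roughly reproduce the reported rates:

        def sensitivity_ppv(tp, fp, fn):
            # tp: provider predicts satisfied, patient is satisfied
            # fp: provider predicts satisfied, patient is not
            # fn: provider predicts not satisfied, patient is satisfied
            return tp / (tp + fn), tp / (tp + fp)

        print(sensitivity_ppv(tp=220, fp=6, fn=11))  # -> (0.952..., 0.973...)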

  19. Computer Monte Carlo simulation in quantitative resource estimation

    Science.gov (United States)

    Root, D.H.; Menzie, W.D.; Scott, W.A.

    1992-01-01

    The method of making quantitative assessments of mineral resources sufficiently detailed for economic analysis is outlined in three steps. The steps are (1) determination of types of deposits that may be present in an area, (2) estimation of the numbers of deposits of the permissible deposit types, and (3) combination by Monte Carlo simulation of the estimated numbers of deposits with the historical grades and tonnages of these deposits to produce a probability distribution of the quantities of contained metal. Two examples of the estimation of the number of deposits (step 2) are given. The first example is for mercury deposits in southwestern Alaska and the second is for lode tin deposits in the Seward Peninsula. The flow of the Monte Carlo simulation program is presented with particular attention to the dependencies between grades and tonnages of deposits and between grades of different metals in the same deposit. © 1992 Oxford University Press.
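    A minimal sketch of step 3, assuming a Poisson number of deposits and independent lognormal grade/tonnage draws; all distribution parameters are placeholders, and the sketch deliberately ignores the grade-tonnage and grade-grade dependencies that the program described above takes care to model:

        import numpy as np

        rng = np.random.default_rng(42)

        def contained_metal_distribution(n_trials=10_000):
            totals = np.empty(n_trials)
            for i in range(n_trials):
                n_deposits = rng.poisson(lam=2.0)  # step 2: estimated deposit count
                tonnage = rng.lognormal(mean=13.0, sigma=1.5, size=n_deposits)  # tonnes
                grade = rng.lognormal(mean=-5.0, sigma=0.7, size=n_deposits)    # metal fraction
                totals[i] = np.sum(tonnage * grade)  # tonnes of contained metal
            return totals

        totals = contained_metal_distribution()
        print(np.percentile(totals, [10, 50, 90]))  # summary of the output distribution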

  20. A uniform quantitative stiff stability estimate for BDF schemes

    Directory of Open Access Journals (Sweden)

    Winfried Auzinger

    2006-01-01

    Full Text Available The concepts of stability regions, A- and A(α)-stability - albeit based on scalar models - turned out to be essential for the identification of implicit methods suitable for the integration of stiff ODEs. However, for multistep methods, knowledge of the stability region provides no information on the quantitative stability behavior of the scheme. In this paper we fill this gap for the important class of Backward Differentiation Formulas (BDF). Quantitative stability bounds are derived which are uniformly valid in the stability region of the method. Our analysis is based on a study of the separation of the characteristic roots and a special similarity decomposition of the associated companion matrix.

  1. Quantitative estimation of Nipah virus replication kinetics in vitro

    Directory of Open Access Journals (Sweden)

    Hassan Sharifah

    2006-06-01

    Full Text Available Abstract Background Nipah virus is a zoonotic virus isolated from an outbreak in Malaysia in 1998. The virus causes infections in humans, pigs, and several other domestic animals. It has also been isolated from fruit bats. The pathogenesis of Nipah virus infection is still not well described. In the present study, Nipah virus replication kinetics were estimated from infection of African green monkey kidney cells (Vero) using the one-step SYBR® Green I-based quantitative real-time reverse transcriptase-polymerase chain reaction (qRT-PCR) assay. Results The qRT-PCR had a dynamic range of at least seven orders of magnitude and can detect Nipah virus at levels as low as one PFU/μL. Following initiation of infection, it was estimated that Nipah virus RNA doubles every ~40 minutes and attained a peak intracellular virus RNA level of ~8.4 log PFU/μL at about 32 hours post-infection (PI). Significant extracellular Nipah virus RNA release occurred only after 8 hours PI and the level peaked at ~7.9 log PFU/μL at 64 hours PI. The estimated rate of Nipah virus RNA release into the cell culture medium was ~0.07 log PFU/μL per hour, and less than 10% of the released Nipah virus RNA was infectious. Conclusion The SYBR® Green I-based qRT-PCR assay enabled quantitative assessment of Nipah virus RNA synthesis in Vero cells. A low rate of Nipah virus extracellular RNA release and a low infectious virus yield, together with extensive syncytial formation during infection, support a cell-to-cell spread mechanism for Nipah virus infection.

  2. Handling uncertainty in quantitative estimates in integrated resource planning

    Energy Technology Data Exchange (ETDEWEB)

    Tonn, B.E. [Oak Ridge National Lab., TN (United States); Wagner, C.G. [Univ. of Tennessee, Knoxville, TN (United States). Dept. of Mathematics

    1995-01-01

    This report addresses uncertainty in Integrated Resource Planning (IRP). IRP is a planning and decision-making process employed by utilities, usually at the behest of Public Utility Commissions (PUCs), to develop plans to ensure that utilities have resources necessary to meet consumer demand at reasonable cost. IRP has been used to assist utilities in developing plans that include not only traditional electricity supply options but also demand-side management (DSM) options. Uncertainty is a major issue for IRP. Future values for numerous important variables (e.g., future fuel prices, future electricity demand, stringency of future environmental regulations) cannot ever be known with certainty. Many economically significant decisions are so unique that statistically-based probabilities cannot even be calculated. The entire utility strategic planning process, including IRP, encompasses different types of decisions that are made with different time horizons and at different points in time. Because of fundamental pressures for change in the industry, including competition in generation, gone is the time when utilities could easily predict increases in demand, enjoy long lead times to bring on new capacity, and bank on steady profits. The purpose of this report is to address in detail one aspect of uncertainty in IRP: dealing with uncertainty in quantitative estimates, such as the future demand for electricity or the cost to produce a megawatt (MW) of power. A theme which runs throughout the report is that every effort must be made to honestly represent what is known about a variable that can be used to estimate its value, what cannot be known, and what is not known due to operational constraints. Applying this philosophy to the representation of uncertainty in quantitative estimates, it is argued that imprecise probabilities are superior to classical probabilities for IRP.

  3. Uncertainty Model For Quantitative Precipitation Estimation Using Weather Radars

    Directory of Open Access Journals (Sweden)

    Ernesto Gómez Vargas

    2016-06-01

    Full Text Available This paper introduces an uncertainty model for quantitative precipitation estimation using weather radars. The model considers various key aspects associated with radar calibration, attenuation, and the tradeoff between accuracy and radar coverage. An S-band radar case study is presented to illustrate fractional-uncertainty calculations for various typical radar-calibration elements such as the antenna, transmitter, receiver, and other general elements included in the radar equation. This paper is based on the "Guide to the Expression of Uncertainty in Measurement", and the results show that the fractional uncertainty calculated by the model was 40% for the reflectivity and 30% for the precipitation using the Marshall-Palmer Z-R relationship.
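    In the GUM framework, independent fractional uncertainties combine in quadrature, and a power-law Z-R relationship scales the result by its exponent. A toy sketch with invented element values (not the paper's figures):

        import numpy as np

        elements = {"antenna": 0.15, "transmitter": 0.20, "receiver": 0.25}  # illustrative
        u_Z = np.sqrt(sum(u ** 2 for u in elements.values()))  # fractional uncertainty of Z
        u_R = u_Z / 1.6  # R = (Z/200)**(1/1.6), so fractional uncertainty scales by 1/b
        print(f"u_Z = {u_Z:.2f}, u_R = {u_R:.2f}")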

  4. Radar-Derived Quantitative Precipitation Estimation Based on Precipitation Classification

    Directory of Open Access Journals (Sweden)

    Lili Yang

    2016-01-01

    Full Text Available A method for improving radar-derived quantitative precipitation estimation is proposed. Tropical vertical profiles of reflectivity (VPRs) are first determined from multiple VPRs. Upon identifying a tropical VPR, the event can be further classified as either tropical-stratiform or tropical-convective rainfall by a fuzzy logic (FL) algorithm. Based on the precipitation-type fields, the reflectivity values are converted into rainfall rate using a Z-R relationship. In order to evaluate the performance of this rainfall classification scheme, three experiments were conducted using three months of data and two study cases. In Experiment I, the Weather Surveillance Radar-1988 Doppler (WSR-88D) default Z-R relationship was applied. In Experiment II, the precipitation regime was separated into convective and stratiform rainfall using the FL algorithm, and the corresponding Z-R relationships were used. In Experiment III, the precipitation regime was separated into convective, stratiform, and tropical rainfall, and the corresponding Z-R relationships were applied. The results show that the rainfall rates obtained from all three experiments match closely with the gauge observations, although Experiment II reduced the underestimation seen in Experiment I. Experiment III significantly reduced this underestimation and generated the most accurate radar estimates of rain rate among the three experiments.
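    The conversion step is a simple inversion of Z = a * R**b with coefficients chosen per precipitation type. The coefficients below are commonly cited defaults used here for illustration; the paper fits its own relationships:

        ZR = {"stratiform": (200.0, 1.6),   # Marshall-Palmer
              "convective": (300.0, 1.4),   # WSR-88D default
              "tropical":   (250.0, 1.2)}   # Rosenfeld tropical

        def rain_rate(dbz, ptype):
            a, b = ZR[ptype]
            z_linear = 10.0 ** (dbz / 10.0)     # dBZ -> linear Z (mm^6 m^-3)
            return (z_linear / a) ** (1.0 / b)  # rain rate in mm/h

        print(rain_rate(40.0, "convective"))  # ~12 mm/h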

  5. Quantitative Risk reduction estimation Tool For Control Systems, Suggested Approach and Research Needs

    Energy Technology Data Exchange (ETDEWEB)

    Miles McQueen; Wayne Boyer; Mark Flynn; Sam Alessi

    2006-03-01

    For the past year we have applied a variety of risk assessment technologies to evaluate the risk to critical infrastructure from cyber attacks on control systems. More recently, we identified the need for a stand-alone control system risk reduction estimation tool to provide owners and operators of control systems with a more useable, reliable, and credible method for managing the risks from cyber attack. Risk is defined as the probability of a successful attack times the value of the resulting loss, typically measured in lives and dollars. Qualitative and ad hoc techniques for measuring risk do not provide sufficient support for cost-benefit analyses associated with cyber security mitigation actions. To address the need for better quantitative risk reduction models, we surveyed previous quantitative risk assessment research; evaluated currently available tools; developed new quantitative techniques [17] [18]; implemented a prototype analysis tool to demonstrate how such a tool might be used; used the prototype to test a variety of underlying risk calculation engines (e.g., attack tree, attack graph); and identified technical and research needs. We concluded that significant gaps still exist and difficult research problems remain for quantitatively assessing the risk to control system components and networks, but that a useable quantitative risk reduction estimation tool is not beyond reach.

  6. Novel whole brain segmentation and volume estimation using quantitative MRI

    Energy Technology Data Exchange (ETDEWEB)

    West, J. [Linkoeping University, Radiation Physics, Department of Medical and Health Sciences, Faculty of Health Sciences, Linkoeping (Sweden); Linkoeping University, Center for Medical Imaging Science and Visualization (CMIV), Linkoeping (Sweden); SyntheticMR AB, Linkoeping (Sweden); Warntjes, J.B.M. [Linkoeping University, Center for Medical Imaging Science and Visualization (CMIV), Linkoeping (Sweden); SyntheticMR AB, Linkoeping (Sweden); Linkoeping University and Department of Clinical Physiology UHL, County Council of Oestergoetland, Clinical Physiology, Department of Medical and Health Sciences, Faculty of Health Sciences, Linkoeping (Sweden); Lundberg, P. [Linkoeping University, Center for Medical Imaging Science and Visualization (CMIV), Linkoeping (Sweden); Linkoeping University and Department of Radiation Physics UHL, County Council of Oestergoetland, Radiation Physics, Department of Medical and Health Sciences, Faculty of Health Sciences, Linkoeping (Sweden); Linkoeping University and Department of Radiology UHL, County Council of Oestergoetland, Radiology, Department of Medical and Health Sciences, Faculty of Health Sciences, Linkoeping (Sweden)

    2012-05-15

    Brain segmentation and volume estimation of grey matter (GM), white matter (WM) and cerebro-spinal fluid (CSF) are important for many neurological applications. Volumetric changes are observed in multiple sclerosis (MS), Alzheimer's disease and dementia, and in normal aging. A novel method is presented to segment brain tissue based on quantitative magnetic resonance imaging (qMRI) of the longitudinal relaxation rate R1, the transverse relaxation rate R2 and the proton density, PD. Previously reported qMRI values for WM, GM and CSF were used to define tissues, and a Bloch simulation was performed to investigate R1, R2 and PD for tissue mixtures in the presence of noise. Based on the simulations a lookup grid was constructed to relate tissue partial volume to the R1-R2-PD space. The method was validated in 10 healthy subjects. MRI data were acquired using six resolutions and three geometries. Repeatability for different resolutions was 3.2% for WM, 3.2% for GM, 1.0% for CSF and 2.2% for total brain volume. Repeatability for different geometries was 8.5% for WM, 9.4% for GM, 2.4% for CSF and 2.4% for total brain volume. We propose a new robust qMRI-based approach which we demonstrate in a patient with MS. (orig.)

  7. Quantitative measures of walking and strength provide insight into brain corticospinal tract pathology in multiple sclerosis

    Directory of Open Access Journals (Sweden)

    Nora E Fritz

    2017-01-01

    Quantitative measures of strength and walking are associated with brain corticospinal tract pathology. The addition of these quantitative measures to basic clinical information explains more of the variance in corticospinal tract fractional anisotropy and magnetization transfer ratio than the basic clinical information alone. Outcome measurement for multiple sclerosis clinical trials has been notoriously challenging; the use of quantitative measures of strength and walking along with tract-specific imaging methods may improve our ability to monitor disease change over time, with intervention, and provide needed guidelines for developing more effective targeted rehabilitation strategies.

  8. Quantitative estimates of the volatility of ambient organic aerosol

    Directory of Open Access Journals (Sweden)

    C. D. Cappa

    2010-06-01

    Full Text Available Measurements of the sensitivity of organic aerosol (OA, and its components) mass to changes in temperature were recently reported by Huffman et al. (2009) using a tandem thermodenuder-aerosol mass spectrometer (TD-AMS) system in Mexico City and the Los Angeles area. Here, we use these measurements to derive quantitative estimates of aerosol volatility within the framework of absorptive partitioning theory using a kinetic model of aerosol evaporation in the TD. OA volatility distributions (or "basis sets") are determined using several assumptions as to the enthalpy of vaporization (ΔHvap). We present two definitions of "non-volatile OA," one being a global and one a local definition. Based on these definitions, our analysis indicates that a substantial fraction of the organic aerosol is comprised of non-volatile components that will not evaporate under any atmospheric conditions; on the order of 50-80% when the most realistic ΔHvap assumptions are considered. The sensitivity of the total OA mass to dilution and ambient changes in temperature has been assessed for the various ΔHvap assumptions. The temperature sensitivity is relatively independent of the particular ΔHvap assumptions whereas dilution sensitivity is found to be greatest for the low (ΔHvap = 50 kJ/mol) and lowest for the high (ΔHvap = 150 kJ/mol) assumptions. This difference arises from the high ΔHvap assumptions yielding volatility distributions with a greater fraction of non-volatile material than the low ΔHvap assumptions. If the observations are fit using a 1- or 2-component model the sensitivity of the OA to dilution is unrealistically high. An empirical method introduced by Faulhaber et al. (2009) has also been used to independently estimate a volatility distribution for the ambient OA and is found to give results consistent with the

  9. Quantitative estimation of regional brain iron with magnetic resonance imaging.

    Science.gov (United States)

    Martin, W R Wayne

    2009-12-01

    Biochemical studies have reported increased iron content in the substantia nigra pars compacta (SNc) in Parkinson disease (PD), with changes most marked in severe disease, suggesting that measurement of regional iron content in the nigra may provide an indication of the pathologic severity of the disease. Although basal ganglia structures, including the substantia nigra, are readily visualized with MRI, in part because of their high iron content, conventional imaging techniques have failed to show definitive abnormalities in individuals with PD. We have developed MRI-based methodology to estimate regional iron content utilizing a 1.5 tesla system and have shown a correlation between age and striatal iron, as well as a significant increase in putaminal and pallidal iron in PD that correlated with the severity of clinical symptomatology. Several investigators have utilized novel MR techniques implemented on 3 tesla magnets and have suggested the presence of increased nigral iron content in treated patients with PD, in addition to a correlation between nigral iron and simple reaction time. We have applied a modification of our original method to determine whether SNc changes evident at 3 tesla corresponded anatomically to the distribution of neuropathologic changes reported previously. Our results indicate the presence of lateral SNc abnormalities in untreated patients with early PD, consistent with increased iron content and corresponding to the known distribution of neuronal loss occurring in this disorder. We suggest that this may ultimately provide an imaging marker for disease progression in PD, although longitudinal studies are required.

  10. Peptide-Centric Approaches Provide an Alternative Perspective To Re-Examine Quantitative Proteomic Data.

    Science.gov (United States)

    Ning, Zhibin; Zhang, Xu; Mayne, Janice; Figeys, Daniel

    2016-02-16

    Quantitative proteomics can provide rich information on changes in biological functions and processes. However, its accuracy is affected by the inherent information degeneration found in bottom-up proteomics. Therefore, the precise protein inference from identified peptides can be mistaken since an ad hoc rule is used for generating a list of protein groups that depends on both the sample type and the sampling depth. Herein, we propose an alternative approach for examining quantitative proteomic data which is peptide-centric instead of protein-centric. We discuss the feasibility of the peptide-centric approach which was tested on several quantitative proteomic data sets. We show that peptide-centric quantification has several advantages over protein level analysis: (1) it is more sensitive for sample segregation, (2) it avoids the issues associated with protein inference, and (3) it can retrieve significant peptides lost in protein-centric quantification for further downstream analysis.

  11. Quantitative difference method for estimation of fertilizer nitrogen ...

    African Journals Online (AJOL)

    The percentage recovery and balance of fertilizer nitrogen can be determined by different methods. In this study, the quantitative difference method for recovery of nitrogen in above-ground dry matter was applied to investigate the uptake of field-applied nitrogen by maize cultivated in an orthic oxisol soil. It was found that the ...

  12. Quantitative estimation of the parameters for self-motion driven by difference in surface tension.

    Science.gov (United States)

    Suematsu, Nobuhiko J; Sasaki, Tomohiro; Nakata, Satoshi; Kitahata, Hiroyuki

    2014-07-15

    Quantitative information on the parameters associated with self-propelled objects would enhance the potential of this research field; for example, finding a realistic way to develop a functional self-propelled object and quantitatively understanding the mechanism of self-motion. We therefore estimated five main parameters, including the driving force, of a camphor boat as a simple self-propelled object that spontaneously moves on water due to a difference in surface tension. The experimental results and mathematical model indicated that the camphor boat generated a driving force of 4.2 μN, which corresponds to a difference in surface tension of 1.1 mN m(-1). The methods used in this study are not restricted to evaluating the parameters of self-motion of a camphor boat, but can be applied to other self-propelled objects driven by a difference in surface tension. Thus, our investigation provides a novel method to quantitatively estimate the parameters of self-propelled objects driven by an interfacial tension difference.
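    A consistency check on the two reported numbers, assuming the driving force acts across an effective contact width w (an assumption of this note, not a statement from the paper):

        \[
        w \;=\; \frac{F}{\Delta\gamma}
          \;=\; \frac{4.2\ \mu\mathrm{N}}{1.1\ \mathrm{mN\,m^{-1}}}
          \;\approx\; 3.8\ \mathrm{mm},
        \]

    a plausible width for a small camphor boat, so the force and surface tension estimates are mutually consistent.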

  13. Gender differences in pension wealth: estimates using provider data.

    Science.gov (United States)

    Johnson, R W; Sambamoorthi, U; Crystal, S

    1999-06-01

    Information from pension providers was examined to investigate gender differences in pension wealth at midlife. For full-time wage and salary workers approaching retirement age who had pension coverage, median pension wealth on the current job was 76% greater for men than women. Differences in wages, years of job tenure, and industry between men and women accounted for most of the gender gap in pension wealth on the current job. Less than one third of the wealth difference could not be explained by gender differences in education, demographics, or job characteristics. The less-advantaged employment situation of working women currently in midlife carries over into worse retirement income prospects. However, the gender gap in pensions is likely to narrow in the future as married women's employment experiences increasingly resemble those of men.

  14. Providing Open-Access Know How for Directors of Quantitative and Mathematics Support Centers

    Directory of Open Access Journals (Sweden)

    Michael Schuckers

    2017-01-01

    Full Text Available The purpose of this editorial is to introduce the quantitative literacy community to the newly published A Handbook for Directors of Quantitative and Mathematics Centers. QMaSCs (pronounced “Q-masks”) can be broadly defined as centers that have supporting students in quantitative fields of study as part of their mission. Some focus only on calculus or mathematics; others concentrate on numeracy or quantitative literacy, and some do all of that. A QMaSC may be embedded in a mathematics department, or part of a learning commons, or a stand-alone center. There are hundreds of these centers in the U.S. The new handbook, which is the outgrowth of a 2013 NSF-sponsored, national workshop attended by 23 QMaSC directors from all quarters of the U.S., is available open access on the USF Scholar Commons and in hard copy from Amazon.com. This editorial by the handbook’s editors provides background and overview of the 20 detailed chapters on center leadership and management; community interactions; staffing, hiring and training; center assessment; and starting a center; and then a collection of ten case studies from research universities, four-year state colleges, liberal arts colleges, and a community college. The editorial ends by pointing out the need and potential benefits of a professional organization for QMaSC directors.

  15. The Centiloid Project: standardizing quantitative amyloid plaque estimation by PET.

    Science.gov (United States)

    Klunk, William E; Koeppe, Robert A; Price, Julie C; Benzinger, Tammie L; Devous, Michael D; Jagust, William J; Johnson, Keith A; Mathis, Chester A; Minhas, Davneet; Pontecorvo, Michael J; Rowe, Christopher C; Skovronsky, Daniel M; Mintun, Mark A

    2015-01-01

    Although amyloid imaging with PiB-PET ([C-11]Pittsburgh Compound-B positron emission tomography), and now with F-18-labeled tracers, has produced remarkably consistent qualitative findings across a large number of centers, there has been considerable variability in the exact numbers reported as quantitative outcome measures of tracer retention. In some cases this is as trivial as the choice of units, in some cases it is scanner dependent, and of course, different tracers yield different numbers. Our working group was formed to standardize quantitative amyloid imaging measures by scaling the outcome of each particular analysis method or tracer to a 0 to 100 scale, anchored by young controls (≤ 45 years) and typical Alzheimer's disease patients. The units of this scale have been named "Centiloids." Basically, we describe a "standard" method of analyzing PiB PET data and then a method for scaling any "nonstandard" method of PiB PET analysis (or any other tracer) to the Centiloid scale. Copyright © 2015 The Alzheimer's Association. Published by Elsevier Inc. All rights reserved.
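    Schematically, the anchoring amounts to a linear rescaling of each method's outcome measure (for the standard PiB analysis, an SUVr); the exact anchor values come from the young-control and typical-AD groups analyzed in the paper:

        \[
        \mathrm{CL} \;=\; 100 \times
        \frac{\mathrm{SUVr}_{\mathrm{subject}} - \overline{\mathrm{SUVr}}_{\mathrm{YC}}}
             {\overline{\mathrm{SUVr}}_{\mathrm{AD}} - \overline{\mathrm{SUVr}}_{\mathrm{YC}}},
        \]

    so that young controls average 0 Centiloids and typical Alzheimer's disease patients average 100.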

  16. Estimation of Heterosis and Inbreeding Depression in Quantitative ...

    African Journals Online (AJOL)

    Heterosis and inbreeding depression were estimated in 8×8 half-diallel crosses of rice. The planted materials consisted of eight parental inbred lines, their F1 hybrids and F2 populations, using a randomized complete block design with three replications. Data were collected on number of days to 50% flowering, plant height, ...

  17. Estimation of the number of fluorescent end-members for quantitative analysis of multispectral FLIM data.

    Science.gov (United States)

    Gutierrez-Navarro, Omar; Campos-Delgado, Daniel U; Arce-Santana, Edgar R; Maitland, Kristen C; Cheng, Shuna; Jabbour, Joey; Malik, Bilal; Cuenca, Rodrigo; Jo, Javier A

    2014-05-19

    Multispectral fluorescence lifetime imaging (m-FLIM) can potentially allow identifying the endogenous fluorophores present in biological tissue. Quantitative description of such data requires estimating the number of components in the sample, their characteristic fluorescent decays, and their relative contributions or abundances. Unfortunately, this inverse problem usually requires prior knowledge about the data, which is seldom available in biomedical applications. This work presents a new methodology to estimate the number of potential endogenous fluorophores present in biological tissue samples from time-domain m-FLIM data. Furthermore, a completely blind linear unmixing algorithm is proposed. The method was validated using both synthetic and experimental m-FLIM data. The experimental m-FLIM data include in-vivo measurements from healthy and cancerous hamster cheek-pouch epithelial tissue, and ex-vivo measurements from human coronary atherosclerotic plaques. The analysis of m-FLIM data from in-vivo hamster oral mucosa identified healthy from precancerous lesions, based on the relative concentration of their characteristic fluorophores. The algorithm also provided a better description of atherosclerotic plaques in term of their endogenous fluorophores. These results demonstrate the potential of this methodology to provide quantitative description of tissue biochemical composition.

  18. Quantitative estimation of source complexity in tsunami-source inversion

    Science.gov (United States)

    Dettmer, Jan; Cummins, Phil R.; Hawkins, Rhys; Jakir Hossen, M.

    2016-04-01

    This work analyses tsunami waveforms to infer the spatiotemporal evolution of sea-surface displacement (the tsunami source) caused by earthquakes or other sources. Since the method considers sea-surface displacement directly, no assumptions about the fault or seafloor deformation are required. While this approach has no ability to study seismic aspects of rupture, it greatly simplifies the tsunami source estimation, making it much less dependent on subjective fault and deformation assumptions. This results in a more accurate sea-surface displacement evolution in the source region. The spatial discretization is by wavelet decomposition represented by a trans-D Bayesian tree structure. Wavelet coefficients are sampled by a reversible jump algorithm and additional coefficients are only included when required by the data. Therefore, source complexity is consistent with data information (parsimonious) and the method can adapt locally in both time and space. Since the source complexity is unknown and locally adapts, no regularization is required, resulting in more meaningful displacement magnitudes. By estimating displacement uncertainties in a Bayesian framework we can study the effect of parametrization choice on the source estimate. Uncertainty arises from observation errors and limitations in the parametrization to fully explain the observations. As a result, parametrization choice is closely related to uncertainty estimation and profoundly affects inversion results. Therefore, parametrization selection should be included in the inference process. Our inversion method is based on Bayesian model selection, a process which includes the choice of parametrization in the inference process and makes it data driven. A trans-dimensional (trans-D) model for the spatio-temporal discretization is applied here to include model selection naturally and efficiently in the inference by sampling probabilistically over parameterizations. The trans-D process results in better

  19. [Quantitative estimation of evapotranspiration from Tahe forest ecosystem, Northeast China].

    Science.gov (United States)

    Qu, Di; Fan, Wen-Yi; Yang, Jin-Ming; Wang, Xu-Peng

    2014-06-01

    Evapotranspiration (ET) is an important parameter in agricultural, meteorological and hydrological research, and an important part of the global hydrological cycle. This paper applied the improved DHSVM distributed hydrological model to estimate the daily ET of the Tahe area in 2007, using leaf area index and other surface data extracted from TM remote sensing data, and slope, aspect and other topographic indices obtained from the digital elevation model. The relationship between daily ET and daily watershed outlet flow was built by a BP neural network, and a water balance equation was established for the studied watershed; together these were used to test the accuracy of the estimation. The results showed that the model could be applied in the study area. The annual total ET of the Tahe watershed was 234.01 mm. ET had a significant seasonal variation. ET was highest in summer, with an average daily value of 1.56 mm. The average daily ET in autumn and spring was 0.30 and 0.29 mm, respectively, and winter had the lowest ET. Land cover type had a great effect on ET, with broadleaf forest showing a higher ET than mixed forest, followed by needle-leaf forest.

  20. Method for estimating total attenuation from a spatial map of attenuation slope for quantitative ultrasound imaging.

    Science.gov (United States)

    Pawlicki, Alexander D; O'Brien, William D

    2013-04-01

    Estimating total ultrasound attenuation from backscatter data is essential in the field of quantitative ultrasound (QUS) because of the need to compensate for attenuation when estimating the backscatter coefficient and QUS parameters. This work uses a reference phantom method of attenuation estimation to create a spatial map of attenuation slope (AS) from backscatter radio-frequency (RF) data of three phantoms and a rat mammary adenocarcinoma tumor (MAT). The attenuation maps show changes in attenuation between different regions of the phantoms and the MAT tumor. Analyses of the attenuation maps of the phantoms suggest that the AS estimates are in good quantitative agreement with the known values for the phantoms. Furthermore, estimates of total attenuation from the attenuation maps are likewise in good quantitative agreement with known values.
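    A hedged sketch of the reference phantom method at a single frequency: the log ratio of sample to reference power spectra decays linearly with depth at a rate proportional to the attenuation difference, so a line fit recovers the sample's attenuation once the reference phantom's known value is added back. The two-way factor of 4 and the unit conversion (1 Np = 8.686 dB) assume linear-with-frequency attenuation and matched sound speeds:

        import numpy as np

        def attenuation_coeff(spec_sample, spec_ref, depths_cm, alpha_ref_db):
            # spec_*: power spectra at one frequency along a column of depths
            log_ratio = np.log(spec_sample / spec_ref)
            slope = np.polyfit(depths_cm, log_ratio, 1)[0]  # Np/cm over the two-way path
            alpha_diff_db = -slope / 4.0 * 8.686            # one-way difference, dB/cm
            return alpha_ref_db + alpha_diff_db             # sample attenuation, dB/cm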

  1. 49 CFR 375.409 - May household goods brokers provide estimates?

    Science.gov (United States)

    2010-10-01

    ... 49 Transportation 5 2010-10-01 2010-10-01 false May household goods brokers provide estimates? 375... REGULATIONS TRANSPORTATION OF HOUSEHOLD GOODS IN INTERSTATE COMMERCE; CONSUMER PROTECTION REGULATIONS Estimating Charges § 375.409 May household goods brokers provide estimates? A household goods broker must not...

  2. An approach to quantitative assessment of crew well-being for providing safety of long-term space missions

    Science.gov (United States)

    Bartsev, S. I.; Mezhevikin, V. V.; Okhonin, V. A.

    The main destination of Life Support Systems - to support life and provide crew safety - put the problem of the most effective providing this function. In the scope of the whole mission the safety of crew depends on many interrelating features of space ship, LSS, and scenario of given mission itself. Effective risk mitigation needs optimal minimizing of all risk factors. Effective minimization presumes quantitative presentation of these factors. In the paper an approach to quantitative assessment of quality of life in the scope of previously introduced integrated coefficient of maximum reliability. One of the most significant risk factors is crew fatal mistake. There is always other-than-zero probability of a fatal human mistake in controlling the vehicle, landing module, nuclear reactor or other vital device. It is difficult to estimate the probability of such a mistake, but it is apparent that this probability increases with impaired human health. Under closed air cycling such a condition is highly probable as demonstrated by the Sick Building Syndrome (SBS) in highly sealed, so-called "energy efficient" buildings. Seemingly, the cause of SBS is a set of not completely identified factors, yet, it should be noted that in spite of complete pressurization the crew of Bios-3 did not have complaints typical for SBS. It cannot be ruled out that the higher plants may be the most realistic remedy to reduce the probability of the crew's fatal mistakes. All this gives the way to convert so difficultly formalizable parameter as quality of life into probability of accident. A simple monotonous dependence of deterioration of crew health and probability of a fatal mistake on mission time is discussed. Possible medical-biological experiments for more detailed estimations of this dependency are considered.

  3. Reef-associated crustacean fauna: biodiversity estimates using semi-quantitative sampling and DNA barcoding

    Science.gov (United States)

    Plaisance, L.; Knowlton, N.; Paulay, G.; Meyer, C.

    2009-12-01

    The cryptofauna associated with coral reefs accounts for a major part of the biodiversity in these ecosystems but has been largely overlooked in biodiversity estimates because the organisms are hard to collect and identify. We combine a semi-quantitative sampling design and a DNA barcoding approach to provide metrics for the diversity of reef-associated crustaceans. Twenty-two similar-sized dead heads of Pocillopora were sampled at 10 m depth from five central Pacific Ocean localities (four atolls in the Northern Line Islands and in Moorea, French Polynesia). All crustaceans were removed, and partial cytochrome oxidase subunit I was sequenced from 403 individuals, yielding 135 distinct taxa using a species-level criterion of 5% similarity. Most crustacean species were rare; 44% of the OTUs were represented by a single individual, and an additional 33% were represented by several specimens found only in one of the five localities. The Northern Line Islands and Moorea shared only 11 OTUs. Total numbers estimated by species richness statistics (Chao1 and ACE) suggest at least 90 species of crustaceans in Moorea and 150 in the Northern Line Islands for this habitat type. However, rarefaction curves for each region failed to approach an asymptote, and Chao1 and ACE estimators did not stabilize after sampling eight heads in Moorea, so even these diversity figures are underestimates. Nevertheless, even this modest sampling effort from a very limited habitat resulted in surprisingly high species numbers.
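    For readers unfamiliar with the estimator, Chao1 in its classic form is driven by the rare classes:

        \[
        \hat{S}_{\mathrm{Chao1}} \;=\; S_{\mathrm{obs}} + \frac{F_{1}^{2}}{2 F_{2}},
        \]

    where S_obs is the number of observed OTUs and F₁, F₂ are the numbers of OTUs seen exactly once and twice. With 44% of the 135 OTUs being singletons, the large F₁ term is what pushes the estimates well above the observed counts.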

  4. Quantitative Estimating Salt Content of Saline Soil Using Laboratory Hyperspectral Data Treated by Fractional Derivative

    Directory of Open Access Journals (Sweden)

    Dong Zhang

    2016-01-01

    Most research on estimating soil salinity from hyperspectral data has focused on spectral reflectance or its integer-order derivatives, ignoring the fractional-derivative information in the data. Motivated by this, the selected study area is the Ebinur Lake basin, located on the southwestern border of the Xinjiang Uygur Autonomous Region, China, an area of severe salinization. Field work was conducted from 15 to 25 October 2014, and a total of 180 soil samples were collected from 45 sampling sites. After measuring soil salt content and spectral reflectance in the laboratory, the order range from 0 to 2 was divided into 11 orders (interval 0.2), and the hyperspectral data were treated with 4 kinds of mathematical transformations and 11 orders of fractional derivatives. Combined with the soil salt content, the partial least squares regression method was applied for model calibration and prediction, and several indexes were used to evaluate the performance of the models. The results showed that the retrieval model built from 250 bands based on the 1.2-order derivative of 1/lgR had excellent capacity for estimating soil salt content in the study area (RMSEC = 14.685 g/kg, RMSEP = 14.713 g/kg, R2C = 0.782, R2P = 0.768, and RPD = 2.080). This study provides an application reference for the quantitative estimation of other land-surface parameters and for other applications of hyperspectral technology.
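    Fractional derivatives of sampled spectra, as used in this record, are commonly implemented with the Grünwald-Letnikov approximation, which generalizes finite differences to non-integer orders. A minimal Python sketch under that assumption (not the authors' implementation; the input spectrum is synthetic):

```python
import numpy as np

def gl_fractional_derivative(signal, order):
    """Grunwald-Letnikov fractional derivative of a uniformly sampled
    1-D signal (unit band spacing), applied band-by-band to a spectrum."""
    signal = np.asarray(signal, dtype=float)
    n = len(signal)
    # Weights w_k = (-1)^k * binom(order, k), built with the stable recursion
    w = np.empty(n)
    w[0] = 1.0
    for k in range(1, n):
        w[k] = w[k - 1] * (k - 1 - order) / k
    # Early bands use fewer terms (edge effect of the truncated series)
    return np.array([np.dot(w[:i + 1], signal[i::-1]) for i in range(n)])

reflectance = np.sin(np.linspace(0.0, 3.0, 200))   # stand-in for a measured spectrum
d12 = gl_fractional_derivative(reflectance, 1.2)   # the 1.2-order derivative
```

    Setting order=1.0 reduces the weights to (1, -1, 0, ...), recovering the ordinary first difference; intermediate orders interpolate between the raw spectrum and its integer derivatives.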

  5. Method for Estimating Total Attenuation from a Spatial Map of Attenuation Slope for Quantitative Ultrasound Imaging

    OpenAIRE

    Pawlicki, Alexander D.; O'Brien, William D.

    2013-01-01

    Estimating total ultrasound attenuation from backscatter data is essential in the field of quantitative ultrasound (QUS) because of the need to compensate for attenuation when estimating the backscatter coefficient and QUS parameters. This work uses a reference phantom method of attenuation estimation to create a spatial map of attenuation slope (AS) from backscatter radio-frequency (RF) data of three phantoms and a rat mammary adenocarcinoma tumor (MAT). The attenuation maps show changes in ...

  6. Improving statistical inference on pathogen densities estimated by quantitative molecular methods: malaria gametocytaemia as a case study.

    Science.gov (United States)

    Walker, Martin; Basáñez, María-Gloria; Ouédraogo, André Lin; Hermsen, Cornelus; Bousema, Teun; Churcher, Thomas S

    2015-01-16

    Quantitative molecular methods (QMMs) such as quantitative real-time polymerase chain reaction (q-PCR), reverse-transcriptase PCR (qRT-PCR) and quantitative nucleic acid sequence-based amplification (QT-NASBA) are increasingly used to estimate pathogen density in a variety of clinical and epidemiological contexts. These methods are often classified as semi-quantitative, yet estimates of reliability or sensitivity are seldom reported. Here, a statistical framework is developed for assessing the reliability (uncertainty) of pathogen densities estimated using QMMs and the associated diagnostic sensitivity. The method is illustrated with quantification of Plasmodium falciparum gametocytaemia by QT-NASBA. The reliability of pathogen (e.g. gametocyte) densities, and the accompanying diagnostic sensitivity, estimated by two contrasting statistical calibration techniques, are compared; a traditional method and a mixed model Bayesian approach. The latter accounts for statistical dependence of QMM assays run under identical laboratory protocols and permits structural modelling of experimental measurements, allowing precision to vary with pathogen density. Traditional calibration cannot account for inter-assay variability arising from imperfect QMMs and generates estimates of pathogen density that have poor reliability, are variable among assays and inaccurately reflect diagnostic sensitivity. The Bayesian mixed model approach assimilates information from replica QMM assays, improving reliability and inter-assay homogeneity, providing an accurate appraisal of quantitative and diagnostic performance. Bayesian mixed model statistical calibration supersedes traditional techniques in the context of QMM-derived estimates of pathogen density, offering the potential to improve substantially the depth and quality of clinical and epidemiological inference for a wide variety of pathogens.
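    For contrast with the Bayesian mixed-model approach described above, traditional calibration amounts to fitting a single standard curve and inverting it; the Bayesian variant additionally places random effects on the per-assay curves. A minimal Python sketch of the traditional step, with invented standard-curve values:

```python
import numpy as np

# Hypothetical standard curve: known densities vs assay readout
known_density = np.array([10.0, 100.0, 1000.0, 10000.0])  # e.g. gametocytes/uL
readout = np.array([8.1, 12.0, 16.2, 19.9])               # invented signal values

# Traditional calibration: one straight line, readout vs log10(density)
slope, intercept = np.polyfit(np.log10(known_density), readout, 1)

def estimate_density(sample_readout):
    """Invert the fitted standard curve to recover density from a readout."""
    return 10.0 ** ((sample_readout - intercept) / slope)

print(f"estimated density: {estimate_density(14.0):.0f} per uL")
```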

  7. Quantitative ultrasound characterization of locally advanced breast cancer by estimation of its scatterer properties

    Energy Technology Data Exchange (ETDEWEB)

    Tadayyon, Hadi [Physical Sciences, Sunnybrook Research Institute, Sunnybrook Health Sciences Centre, Toronto, Ontario M4N 3M5 (Canada); Department of Medical Biophysics, Faculty of Medicine, University of Toronto, Toronto, Ontario M5G 2M9 (Canada); Sadeghi-Naini, Ali; Czarnota, Gregory, E-mail: Gregory.Czarnota@sunnybrook.ca [Physical Sciences, Sunnybrook Research Institute, Sunnybrook Health Sciences Centre, Toronto, Ontario M4N 3M5 (Canada); Department of Medical Biophysics, Faculty of Medicine, University of Toronto, Toronto, Ontario M5G 2M9 (Canada); Department of Radiation Oncology, Odette Cancer Centre, Sunnybrook Health Sciences Centre, Toronto, Ontario M4N 3M5 (Canada); Department of Radiation Oncology, Faculty of Medicine, University of Toronto, Toronto, Ontario M5T 1P5 (Canada); Wirtzfeld, Lauren [Department of Physics, Ryerson University, Toronto, Ontario M5B 2K3 (Canada); Wright, Frances C. [Division of Surgical Oncology, Sunnybrook Health Sciences Centre, Toronto, Ontario M4N 3M5 (Canada)

    2014-01-15

    Purpose: Tumor grading is an important part of breast cancer diagnosis and currently requires biopsy as its standard. Here, the authors investigate quantitative ultrasound parameters in locally advanced breast cancers that can potentially separate tumors from normal breast tissue and differentiate tumor grades. Methods: Ultrasound images and radiofrequency data from 42 locally advanced breast cancer patients were acquired and analyzed. Parameters related to the linear regression of the power spectrum (midband fit, slope, and 0-MHz intercept) were determined from breast tumors and normal breast tissues. Mean scatterer spacing was estimated from the spectral autocorrelation, and the effective scatterer diameter and effective acoustic concentration were estimated from the Gaussian form factor. Parametric maps of each quantitative ultrasound parameter were constructed from the gated radiofrequency segments in tumor and normal tissue regions of interest. In addition to the mean values of the parametric maps, higher-order statistical features computed from gray-level co-occurrence matrices were also determined and used for characterization. Finally, linear and quadratic discriminant analyses were performed using combinations of quantitative ultrasound parameters to classify breast tissues. Results: Quantitative ultrasound parameters were found to be statistically different between tumor and normal tissue (p < 0.05). The combination of effective acoustic concentration and mean scatterer spacing could separate tumor from normal tissue with 82% accuracy, while the addition of effective scatterer diameter to the combination did not provide significant improvement (83% accuracy). Furthermore, the two advanced parameters, including effective scatterer diameter and mean scatterer spacing, were found to be statistically differentiating among grade I, II, and III tumors (p = 0.014 for scatterer spacing, p = 0.035 for effective scatterer diameter). The separation of the tumor
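    The spectral parameters named above (midband fit, slope, and 0-MHz intercept) come from fitting a straight line to the calibrated power spectrum over the usable bandwidth. A minimal Python sketch, assuming an illustrative 3-8 MHz analysis band rather than the authors' settings:

```python
import numpy as np

def spectral_fit(freq_mhz, power_db, band=(3.0, 8.0)):
    """Straight-line fit to a calibrated power spectrum over the analysis
    band; returns slope (dB/MHz), 0-MHz intercept (dB), and midband fit (dB)."""
    f = np.asarray(freq_mhz)
    p = np.asarray(power_db)
    mask = (f >= band[0]) & (f <= band[1])
    slope, intercept = np.polyfit(f[mask], p[mask], 1)
    midband = slope * (band[0] + band[1]) / 2.0 + intercept
    return slope, intercept, midband

# Toy spectrum: a -1.5 dB/MHz trend plus noise
f = np.linspace(1.0, 10.0, 200)
p = -1.5 * f - 20.0 + 0.5 * np.random.default_rng(0).standard_normal(200)
print(spectral_fit(f, p))   # approximately (-1.5, -20.0, midband value)
```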

  8. Quantitative Proteomic Analysis Provides Novel Insights into Cold Stress Responses in Petunia Seedlings.

    Science.gov (United States)

    Zhang, Wei; Zhang, Huilin; Ning, Luyun; Li, Bei; Bao, Manzhu

    2016-01-01

    Low temperature is a major adverse environmental factor that impairs petunia growth and development. To better understand the molecular mechanisms of cold stress adaptation of petunia plants, a quantitative proteomic analysis using iTRAQ technology was performed to detect the effects of cold stress on protein expression profiles in petunia seedlings that had been subjected to 2°C for 5 days. Of the 2430 proteins whose levels were quantitated, a total of 117 proteins were discovered to be differentially expressed under low temperature stress in comparison to unstressed controls. As an initial study, 44 proteins, including well-known and novel cold-responsive proteins, were successfully annotated. By integrating the results of two independent Gene Ontology (GO) enrichment analyses, seven common GO terms were found, of which "oxidation-reduction process" was the most notable for the cold-responsive proteins. By using the subcellular localization tool Plant-mPLoc predictor, as much as 40.2% of the cold-responsive protein group was found to be located within chloroplasts, suggesting that the chloroplast proteome is particularly affected by cold stress. Gene expression analyses of 11 cold-responsive proteins by real-time PCR demonstrated that the mRNA levels were not strongly correlated with the respective protein levels. Further activity assays of antioxidative enzymes showed distinct alterations in cold-treated petunia seedlings. Our investigation has highlighted the role of antioxidation mechanisms and also epigenetic factors in the regulation of cold stress responses. Our work has provided novel insights into the plant response to cold stress and should facilitate further studies regarding the molecular mechanisms which determine how plant cells cope with environmental perturbation. The data have been deposited to the ProteomeXchange with identifier PXD002189.

  9. Quantitative Functional Imaging Using Dynamic Positron Computed Tomography and Rapid Parameter Estimation Techniques

    Science.gov (United States)

    Koeppe, Robert Allen

    Positron computed tomography (PCT) is a diagnostic imaging technique that provides both three dimensional imaging capability and quantitative measurements of local tissue radioactivity concentrations in vivo. This allows the development of non-invasive methods that employ the principles of tracer kinetics for determining physiological properties such as mass specific blood flow, tissue pH, and rates of substrate transport or utilization. A physiologically based, two-compartment tracer kinetic model was derived to mathematically describe the exchange of a radioindicator between blood and tissue. The model was adapted for use with dynamic sequences of data acquired with a positron tomograph. Rapid estimation techniques were implemented to produce functional images of the model parameters by analyzing each individual pixel sequence of the image data. A detailed analysis of the performance characteristics of three different parameter estimation schemes was performed. The analysis included examination of errors caused by statistical uncertainties in the measured data, errors in the timing of the data, and errors caused by violation of various assumptions of the tracer kinetic model. Two specific radioindicators were investigated. 18F-fluoromethane, an inert freely diffusible gas, was used for local quantitative determinations of both cerebral blood flow and tissue:blood partition coefficient. A method was developed that did not require direct sampling of arterial blood for the absolute scaling of flow values. The arterial input concentration time course was obtained by assuming that the alveolar or end-tidal expired breath radioactivity concentration is proportional to the arterial blood concentration. The scale of the input function was obtained from a series of venous blood concentration measurements. The method of absolute scaling using venous samples was validated in four studies, performed on normal volunteers, in which directly measured arterial concentrations
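    The compartmental exchange described above can be illustrated with the standard one-tissue-compartment rate equation dC/dt = K1*Ca(t) - k2*C(t); fitting K1 and k2 pixel-by-pixel is what yields the functional images mentioned, and in this standard model K1/k2 gives the tissue:blood partition coefficient. A minimal Python sketch with an invented arterial input function (not the dissertation's code):

```python
import numpy as np

def tissue_curve(t, ca, k1, k2):
    """Euler integration of the one-tissue-compartment rate equation
    dC/dt = k1*Ca(t) - k2*C(t) on the sampling grid t."""
    c = np.zeros_like(t)
    for i in range(1, len(t)):
        dt = t[i] - t[i - 1]
        c[i] = c[i - 1] + dt * (k1 * ca[i - 1] - k2 * c[i - 1])
    return c

t = np.linspace(0.0, 10.0, 601)            # minutes
ca = 5.0 * t * np.exp(-t)                  # invented arterial input function
tac = tissue_curve(t, ca, k1=0.6, k2=0.5)  # uptake and clearance parameters
```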

  10. SPECTRAL FEATURE ANALYSIS FOR QUANTITATIVE ESTIMATION OF CYANOBACTERIA CHLOROPHYLL-A

    Directory of Open Access Journals (Sweden)

    Y. Lin

    2016-06-01

    for quantitative estimation of chlorophyll-a, and more effective than the traditional single band model; the best regression models for SR and NDVI with chlorophyll-a are linear and power, respectively. Under conditions without water disturbance, the single band model works best. For the SR index, there are two optimal band combinations, comprised of infrared (700nm-900nm) and blue-green (450nm-550nm) ranges, or infrared and red (600nm-650nm) ranges, respectively, with band widths between 45nm and 125nm. For NDVI, the optimal band combination includes the ranges from 750nm to 900nm and from 700nm to 750nm, with band widths less than 30nm. For the single band model, the band center should be located between 733nm and 935nm, and its width must not exceed the interval in which the band center is located. This study proved that, for SR and NDVI, band centers and widths are crucial factors for quantitatively estimating chlorophyll-a. As for remote sensors, a proper spectral channel can not only improve the accuracy of recognizing cyanobacteria blooms but also reduce the redundancy of hyperspectral data. These results will provide a better reference for designing suitable spectral channels of customized sensors for cyanobacteria bloom monitoring at low altitude. In other words, this study is also basic research for developing real-time remote sensing monitoring systems with high temporal and spatial resolution.

  11. Spectral Feature Analysis for Quantitative Estimation of Cyanobacteria Chlorophyll-A

    Science.gov (United States)

    Lin, Yi; Ye, Zhanglin; Zhang, Yugan; Yu, Jie

    2016-06-01

    estimation of chlorophyll-a, and more effective than the traditional single band model; the best regression models for SR and NDVI with chlorophyll-a are linear and power, respectively. Under conditions without water disturbance, the single band model works best. For the SR index, there are two optimal band combinations, comprised of infrared (700nm-900nm) and blue-green (450nm-550nm) ranges, or infrared and red (600nm-650nm) ranges, respectively, with band widths between 45nm and 125nm. For NDVI, the optimal band combination includes the ranges from 750nm to 900nm and from 700nm to 750nm, with band widths less than 30nm. For the single band model, the band center should be located between 733nm and 935nm, and its width must not exceed the interval in which the band center is located. This study proved that, for SR and NDVI, band centers and widths are crucial factors for quantitatively estimating chlorophyll-a. As for remote sensors, a proper spectral channel can not only improve the accuracy of recognizing cyanobacteria blooms but also reduce the redundancy of hyperspectral data. These results will provide a better reference for designing suitable spectral channels of customized sensors for cyanobacteria bloom monitoring at low altitude. In other words, this study is also basic research for developing real-time remote sensing monitoring systems with high temporal and spatial resolution.
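    The SR and NDVI indices discussed in both versions of this record reduce to ratios of band-averaged reflectances. A minimal Python sketch using the reported optimal band combinations (the spectrum here is a synthetic stand-in):

```python
import numpy as np

def band_mean(wl, refl, lo, hi):
    """Average reflectance over one spectral channel [lo, hi] nm."""
    m = (wl >= lo) & (wl <= hi)
    return refl[m].mean()

def sr_ndvi(wl, refl):
    # One reported optimal SR pair: infrared vs blue-green
    sr = band_mean(wl, refl, 700, 900) / band_mean(wl, refl, 450, 550)
    # Reported optimal NDVI pair: 750-900 nm vs 700-750 nm
    a = band_mean(wl, refl, 750, 900)
    b = band_mean(wl, refl, 700, 750)
    return sr, (a - b) / (a + b)

wl = np.arange(400.0, 901.0)                            # nm
refl = 0.02 + 0.3 / (1.0 + np.exp(-(wl - 700.0) / 15))  # toy red-edge spectrum
print(sr_ndvi(wl, refl))
```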

  12. Providing Open-Access Know How for Directors of Quantitative and Mathematics Support Centers

    OpenAIRE

    Michael Schuckers; Mary B. O'Neill; Grace Coulombe

    2017-01-01

    The purpose of this editorial is to introduce the quantitative literacy community to the newly published A Handbook for Directors of Quantitative and Mathematics Centers. QMaSCs (pronounced “Q-masks”) can be broadly defined as centers that have supporting students in quantitative fields of study as part of their mission. Some focus only on calculus or mathematics; others concentrate on numeracy or quantitative literacy, and some do all of that. A QMaSC may be embedded in a mathematics departm...

  13. Quantitative evaluation of fiber fuse initiation with exposure to arc discharge provided by a fusion splicer.

    Science.gov (United States)

    Todoroki, Shin-Ichi

    2016-05-03

    The optical communication industry and power-over-fiber applications face a dilemma as a result of the expanding demand for light power delivery and the potential risks of high-power light manipulation, including the fiber fuse phenomenon, a continuous destruction of the fiber core pumped by the propagating light and triggered by a heat-induced strong absorption of silica glass. However, we have limited knowledge of its initiation process from the viewpoint of energy flow in the reactive area. Therefore, the conditions required for fiber fuse initiation in standard single-mode fibers were determined quantitatively, namely the power of a 1480 nm fiber laser and the intensity of an arc discharge provided by a fusion splicer for one second as an external heat source. Systematic investigation of the energy flow balance between these energy sources revealed that the initiation process consists of two steps: the generation of a precursor at the heated spot and the transition to a stable fiber fuse. The latter step needs a certain degree of heat accumulation at the core, where waveguide deformation is ongoing competitively. This method is useful for comparing the tolerance to fiber fuse initiation among various fibers under a fixed energy input, a comparison that was not available before.

  14. Application of quantitative structure-property relationship analysis to estimate the vapor pressure of pesticides.

    Science.gov (United States)

    Goodarzi, Mohammad; Coelho, Leandro dos Santos; Honarparvar, Bahareh; Ortiz, Erlinda V; Duchowicz, Pablo R

    2016-06-01

    The application of molecular descriptors in describing Quantitative Structure-Property Relationships (QSPR) for the estimation of the vapor pressure (VP) of pesticides is of ongoing interest. In this study, QSPR models were developed using multiple linear regression (MLR) methods to predict the vapor pressure values of 162 pesticides. Several feature selection methods, namely the replacement method (RM), genetic algorithms (GA), stepwise regression (SR) and forward selection (FS), were used to select the most relevant molecular descriptors from a pool of variables. The optimum subset of molecular descriptors was used to build a QSPR model to estimate the vapor pressures of the selected pesticides. The replacement method improved the prediction of vapor pressure and was the most reliable feature selection method for these pesticides. The results provided MLR models with satisfactory predictive ability, which will be important for predicting vapor pressure values for compounds with unknown values. This study may open new opportunities for designing and developing new pesticides. Copyright © 2016 Elsevier Inc. All rights reserved.

  15. Providing Quantitative Information and a Nudge to Undergo Stool Testing in a Colorectal Cancer Screening Decision Aid: A Randomized Clinical Trial.

    Science.gov (United States)

    Schwartz, Peter H; Perkins, Susan M; Schmidt, Karen K; Muriello, Paul F; Althouse, Sandra; Rawl, Susan M

    2017-08-01

    Guidelines recommend that patient decision aids should provide quantitative information about probabilities of potential outcomes, but the impact of this information is unknown. Behavioral economics suggests that patients confused by quantitative information could benefit from a "nudge" towards one option. We conducted a pilot randomized trial to estimate the effect sizes of presenting quantitative information and a nudge. Primary care patients (n = 213) eligible for colorectal cancer screening viewed basic screening information and were randomized to view (a) quantitative information (quantitative module), (b) a nudge towards stool testing with the fecal immunochemical test (FIT) (nudge module), (c) neither a nor b, or (d) both a and b. Outcome measures were perceived colorectal cancer risk, screening intent, preferred test, and decision conflict, measured before and after viewing the decision aid, and screening behavior at 6 months. Patients viewing the quantitative module were more likely to be screened than those who did not (P = 0.012). Patients viewing the nudge module had a greater increase in perceived colorectal cancer risk than those who did not (P = 0.041). Those viewing the quantitative module had a smaller increase in perceived risk than those who did not (P = 0.046), and the effect was moderated by numeracy. Among patients with high numeracy who did not view the nudge module, those who viewed the quantitative module had a greater increase in intent to undergo FIT (P = 0.028) than did those who did not. The limitations of this study were the limited sample size and single healthcare system. Adding quantitative information to a decision aid increased uptake of colorectal cancer screening, while adding a nudge to undergo FIT did not increase uptake. Further research on quantitative information in decision aids is warranted.

  16. Dual Respiratory and Cardiac Motion Estimation in PET Imaging: Methods Design and Quantitative Evaluation.

    Science.gov (United States)

    Feng, Tao; Wang, Jizhe; Tsui, Benjamin M W

    2018-02-05

    The goal of this study is to develop and evaluate four post-reconstruction respiratory and cardiac (R&C) motion vector field (MVF) estimation methods for cardiac 4D PET data. In Method 1, the dual R&C motions were estimated directly from the dual R&C gated images. In Method 2, respiratory motion (RM) and cardiac motion (CM) were estimated separately from the respiratory-gated-only and cardiac-gated-only images. The effects of RM on CM estimation were modeled in Method 3 by applying an image-based RM correction on the cardiac gated images before CM estimation, while the effects of CM on RM estimation were neglected. Method 4 iteratively models the mutual effects of RM and CM during dual R&C motion estimation. Realistic simulation data were generated for quantitative evaluation of the four methods. Almost noise-free PET projection data were generated from the 4D XCAT phantom with realistic R&C MVF using Monte Carlo simulation. Poisson noise was added to the scaled projection data to generate additional datasets at two more noise levels. All the projection data were reconstructed using a 4D image reconstruction method to obtain dual R&C gated images. The four dual R&C MVF estimation methods were applied to the dual R&C gated images and the accuracy of motion estimation was quantitatively evaluated using the root mean square error (RMSE) of the estimated MVFs. Results show that among the four estimation methods, Method 2 performed the worst in the noise-free case while Method 1 performed the worst in the noisy cases in terms of quantitative accuracy of the estimated MVF. Methods 3 and 4 showed comparable results and achieved RMSEs up to 35% lower than Method 1 in the noisy cases. In conclusion, we have developed and evaluated four different post-reconstruction R&C MVF estimation methods for use in 4D PET imaging. Comparison of the performance of the four methods on simulated data indicates separate R&C estimation with modeling of RM before CM estimation (Method 3) to be the

  17. Estimating the Potential Toxicity of Chemicals Associated with Hydraulic Fracturing Operations Using Quantitative Structure-Activity Relationship Modeling.

    Science.gov (United States)

    Yost, Erin E; Stanek, John; DeWoskin, Robert S; Burgoon, Lyle D

    2016-07-19

    The United States Environmental Protection Agency (EPA) identified 1173 chemicals associated with hydraulic fracturing fluids, flowback, or produced water, of which 1026 (87%) lack chronic oral toxicity values for human health assessments. To facilitate the ranking and prioritization of chemicals that lack toxicity values, it may be useful to employ toxicity estimates from quantitative structure-activity relationship (QSAR) models. Here we describe an approach for applying the results of a QSAR model from the TOPKAT program suite, which provides estimates of the rat chronic oral lowest-observed-adverse-effect level (LOAEL). Of the 1173 chemicals, TOPKAT was able to generate LOAEL estimates for 515 (44%). To address the uncertainty associated with these estimates, we assigned qualitative confidence scores (high, medium, or low) to each TOPKAT LOAEL estimate, and found 481 to be high-confidence. For 48 chemicals that had both a high-confidence TOPKAT LOAEL estimate and a chronic oral reference dose from EPA's Integrated Risk Information System (IRIS) database, Spearman rank correlation identified 68% agreement between the two values (permutation p-value = 1 × 10⁻¹¹). These results provide support for the use of TOPKAT LOAEL estimates in identifying and prioritizing potentially hazardous chemicals. High-confidence TOPKAT LOAEL estimates were available for 389 of 1026 hydraulic fracturing-related chemicals that lack chronic oral RfVs and OSFs from EPA-identified sources, including a subset of chemicals that are frequently used in hydraulic fracturing fluids.
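    The agreement statistic reported above is a Spearman rank correlation with a permutation p-value. A minimal Python/SciPy sketch of that computation on invented paired values (not the EPA data):

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)

# Invented paired values standing in for TOPKAT LOAELs vs IRIS reference doses
topkat = rng.lognormal(sigma=1.0, size=48)
iris = topkat * rng.lognormal(sigma=0.5, size=48)   # correlated by construction

rho, _ = stats.spearmanr(topkat, iris)

# Permutation p-value: how often does shuffling match the observed agreement?
n_perm = 2000
perm = np.array([stats.spearmanr(topkat, rng.permutation(iris))[0]
                 for _ in range(n_perm)])
p_value = (np.sum(perm >= rho) + 1) / (n_perm + 1)
print(f"rho = {rho:.2f}, permutation p <= {p_value:.4f}")
```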

  18. Sensitivity of quantitative groundwater recharge estimates to volumetric and distribution uncertainty in rainfall forcing products

    Science.gov (United States)

    Werner, Micha; Westerhoff, Rogier; Moore, Catherine

    2017-04-01

    Quantitative estimates of recharge due to precipitation excess are an important input to determining sustainable abstraction of groundwater resources, as well as providing one of the boundary conditions required for numerical groundwater modelling. Simple water balance models are widely applied for calculating recharge. In these models, precipitation is partitioned between different processes and stores, including surface runoff and infiltration, storage in the unsaturated zone, evaporation, capillary processes, and recharge to groundwater. Clearly, the estimation of recharge amounts will depend on the estimation of precipitation volumes, which may vary depending on the source of precipitation data used. However, the partitioning between the different processes is in many cases governed by (variable) intensity thresholds. This means that the estimates of recharge will be sensitive not only to input parameters such as soil type, texture, land use, and potential evaporation, but mainly to the precipitation volume and intensity distribution. In this paper we explore the sensitivity of recharge estimates to differences in precipitation volumes and intensity distribution in the rainfall forcing over the Canterbury region in New Zealand. We compare recharge rates and volumes using a simple water balance model that is forced using rainfall and evaporation data from: the NIWA Virtual Climate Station Network (VCSN) data (which is considered the reference dataset); the ERA-Interim/WATCH dataset at 0.25 degrees and 0.5 degrees resolution; the TRMM-3B42 dataset; the CHIRPS dataset; and the recently released MSWEP dataset. Recharge rates are calculated at a daily time step over the 14-year period from 2000 to 2013 for the full Canterbury region, as well as at eight selected points distributed over the region. Lysimeter data with observed estimates of recharge are available at four of these points, as well as recharge estimates from the NGRM model, an independent model
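    A simple water balance model of the kind described partitions each day's precipitation among runoff, evaporation, storage, and recharge, which is why recharge depends on rainfall intensity and not just volume. A minimal bucket-model sketch in Python (all parameter values are invented, not those of the study):

```python
def daily_recharge(rain, pet, capacity=50.0, runoff_frac=0.1):
    """Minimal daily bucket model: each day's rain is split into surface
    runoff, actual evaporation, soil storage, and recharge (drainage of
    any storage above the soil's holding capacity)."""
    store, recharge = 0.0, []
    for p, e in zip(rain, pet):
        store += p * (1.0 - runoff_frac)   # fixed-fraction surface runoff
        store = max(store - e, 0.0)        # evaporation limited by storage
        r = max(store - capacity, 0.0)     # excess drains to groundwater
        store -= r
        recharge.append(r)
    return recharge

# Equal rain volume, different intensity distribution -> different recharge
uniform = [5.0] * 20
bursty = [0.0] * 18 + [50.0, 50.0]
pet = [3.0] * 20
print(sum(daily_recharge(uniform, pet)), sum(daily_recharge(bursty, pet)))
```

    With these invented parameters the uniform series produces no recharge while the bursty series with the same total rainfall does, illustrating the intensity sensitivity discussed above.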

  19. Improving satellite quantitative precipitation estimates by incorporating deep convective cloud optical depth

    Science.gov (United States)

    Stenz, Ronald D.

    As Deep Convective Systems (DCSs) are responsible for most severe weather events, increased understanding of these systems along with more accurate satellite precipitation estimates will improve NWS (National Weather Service) warnings and monitoring of hazardous weather conditions. A DCS can be classified into convective core (CC) regions (heavy rain), stratiform (SR) regions (moderate-light rain), and anvil (AC) regions (no rain). These regions share similar infrared (IR) brightness temperatures (BT), which can create large errors for many existing rain detection algorithms. This study assesses the performance of the National Mosaic and Multi-sensor Quantitative Precipitation Estimation System (NMQ) Q2, and a simplified version of the GOES-R Rainfall Rate algorithm (also known as the Self-Calibrating Multivariate Precipitation Retrieval, or SCaMPR), over the state of Oklahoma (OK) using OK MESONET observations as ground truth. While the average annual Q2 precipitation estimates were about 35% higher than MESONET observations, there were very strong correlations between these two data sets for multiple temporal and spatial scales. Additionally, the Q2 estimated precipitation distributions over the CC, SR, and AC regions of DCSs strongly resembled the MESONET observed ones, indicating that Q2 can accurately capture the precipitation characteristics of DCSs although it has a wet bias. SCaMPR retrievals were typically three to four times higher than the collocated MESONET observations, with relatively weak correlations during a year of comparisons in 2012. Overestimates from SCaMPR retrievals that produced a high false alarm rate were primarily caused by precipitation retrievals from the anvil regions of DCSs when collocated MESONET stations recorded no precipitation. A modified SCaMPR retrieval algorithm, employing both cloud optical depth and IR temperature, has the potential to make significant improvements to reduce the SCaMPR false alarm rate of retrieved

  20. Quantitative magnetization transfer provides information complementary to grey matter atrophy in Alzheimer's disease brains.

    Science.gov (United States)

    Giulietti, Giovanni; Bozzali, Marco; Figura, Viviana; Spanò, Barbara; Perri, Roberta; Marra, Camillo; Lacidogna, Giordano; Giubilei, Franco; Caltagirone, Carlo; Cercignani, Mara

    2012-01-16

    Preliminary studies, based on a region-of-interest approach, suggest that quantitative magnetization transfer (qMT), an extension of magnetization transfer imaging, provides complementary information to conventional magnetic resonance imaging (MRI) in the characterisation of Alzheimer's disease (AD). The aim of this study was to extend these findings to the whole brain, using a voxel-wise approach. We recruited 19 AD patients and 11 healthy subjects (HS). All subjects had an MRI acquisition at 3.0 T including a T1-weighted volume, 12 MT-weighted volumes for qMT, and data for computing T1 and B1 maps. The T1-weighted volumes were processed to yield grey matter (GM) volumetric maps, while the other sequences were used to compute qMT parametric maps of the whole brain. qMT maps were warped to standard space and smoothed, and subsequently compared between groups. Of all the qMT parameters considered, only the forward exchange rate, RM0B, showed significant group differences. These images were therefore retained for the multimodal statistical analysis, designed to locate brain regions of RM0B differences between AD and HS groups, adjusting for local GM atrophy. Widespread areas of reduced RM0B were found in AD patients, mainly located in the hippocampus, in the temporal lobe, in the posterior cingulate and in the parietal cortex. These results indicate that, among qMT parameters, RM0B is the most sensitive to AD pathology. This quantity is altered in the hippocampus of patients with AD (as found by previous works) but also in other brain areas that PET studies have highlighted as involved with both reduced glucose metabolism and amyloid-β deposition. RM0B might reflect, through the measurement of the efficiency of MT exchange, some information with a specific pathological counterpart. Given previous evidence of a strict relationship between RM0B and intracellular pH, an intriguing speculation is that our findings might reflect metabolic

  1. Estimation of genetic parameters and detection of quantitative trait loci for metabolites in Danish Holstein milk

    DEFF Research Database (Denmark)

    Buitenhuis, Albert Johannes; Sundekilde, Ulrik; Poulsen, Nina Aagaard

    2013-01-01

    Small components and metabolites in milk are significant for the utilization of milk, not only in dairy food production but also as disease predictors in dairy cattle. This study focused on estimation of genetic parameters and detection of quantitative trait loci for metabolites in bovine milk. F...... for lactic acid to >0.8 for orotic acid and β-hydroxybutyrate. A single SNP association analysis revealed 7 genome-wide significant quantitative trait loci [malonate: Bos taurus autosome (BTA)2 and BTA7; galactose-1-phosphate: BTA2; cis-aconitate: BTA11; urea: BTA12; carnitine: BTA25...

  2. Estimating bioerosion rate on fossil corals: a quantitative approach from Oligocene reefs (NW Italy)

    Science.gov (United States)

    Silvestri, Giulia

    2010-05-01

    Bioerosion of coral reefs, especially when related to the activity of macroborers, is considered to be one of the major processes influencing framework development in present-day reefs. Macroboring communities affecting both living and dead corals are also widely distributed in the fossil record, and their role is presumed to be analogously important in determining the flourishing vs demise of coral bioconstructions. Nevertheless, many aspects concerning the environmental factors controlling the incidence of bioerosion, shifts in the composition of macroboring communities, and the estimation of bioerosion rate in different contexts are still poorly documented and understood. This study presents an attempt to quantify bioerosion rate on reef limestones characteristic of some Oligocene outcrops of the Tertiary Piedmont Basin (NW Italy) and deposited under terrigenous sedimentation within prodelta and delta fan systems. Branching coral rubble-dominated facies have been recognized as prevailing in this context. Depositional patterns, textures, and the generally low incidence of taphonomic features, such as fragmentation and abrasion, suggest relatively quiet waters where coral remains were deposited almost in situ. Thus taphonomic signatures occurring on corals can be reliably used to reconstruct environmental parameters affecting these particular branching coral assemblages during their life and to compare them with those typical of classical clear-water reefs. Bioerosion is sparsely distributed within coral facies and consists of a limited suite of traces, mostly referred to clionid sponges and polychaete and sipunculid worms. The incidence of boring bivalves seems to be generally lower. Together with semi-quantitative analysis of bioerosion rate along vertical logs and horizontal levels, two quantitative methods have been assessed and compared. These consist of the processing of high-resolution scanned thin sections with image analysis software (Photoshop CS3) and point

  3. Quantitative analysis of low-density SNP data for parentage assignment and estimation of family contributions to pooled samples.

    Science.gov (United States)

    Henshall, John M; Dierens, Leanne; Sellars, Melony J

    2014-09-02

    While much attention has focused on the development of high-density single nucleotide polymorphism (SNP) assays, the costs of developing and running low-density assays have fallen dramatically. This makes it feasible to develop and apply SNP assays for agricultural species beyond the major livestock species. Although low-cost low-density assays may not have the accuracy of the high-density assays widely used in human and livestock species, we show that when combined with statistical analysis approaches that use quantitative instead of discrete genotypes, their utility may be improved. The data used in this study are from a 63-SNP marker Sequenom® iPLEX Platinum panel for the Black Tiger shrimp, for which high-density SNP assays are not currently available. For quantitative genotypes that could be estimated, in 5% of cases the most likely genotype for an individual at a SNP had a probability of less than 0.99. Matrix formulations of maximum likelihood equations for parentage assignment were developed for the quantitative genotypes and also for discrete genotypes perturbed by an assumed error term. Assignment rates that were based on maximum likelihood with quantitative genotypes were similar to those based on maximum likelihood with perturbed genotypes but, for more than 50% of cases, the two methods resulted in individuals being assigned to different families. Treating genotypes as quantitative values allows the same analysis framework to be used for pooled samples of DNA from multiple individuals. Resulting correlations between allele frequency estimates from pooled DNA and individual samples were consistently greater than 0.90, and as high as 0.97 for some pools. Estimates of family contributions to the pools based on quantitative genotypes in pooled DNA had a correlation of 0.85 with estimates of contributions from DNA-derived pedigree. Even with low numbers of SNPs of variable quality, parentage testing and family assignment from pooled samples are

  4. Improving high-resolution quantitative precipitation estimation via fusion of multiple radar-based precipitation products

    Science.gov (United States)

    Rafieeinasab, Arezoo; Norouzi, Amir; Seo, Dong-Jun; Nelson, Brian

    2015-12-01

    For monitoring and prediction of water-related hazards in urban areas such as flash flooding, high-resolution hydrologic and hydraulic modeling is necessary. Because of the large sensitivity and scale dependence of rainfall-runoff models to errors in quantitative precipitation estimates (QPE), it is very important that the accuracy of QPE be improved in high-resolution hydrologic modeling to the greatest extent possible. With the availability of multiple radar-based precipitation products in many areas, one may now consider fusing them to produce more accurate high-resolution QPE for a wide spectrum of applications. In this work, we formulate and comparatively evaluate four relatively simple procedures for such fusion based on Fisher estimation and its conditional bias-penalized variant: Direct Estimation (DE), Bias Correction (BC), Reduced-Dimension Bias Correction (RBC) and Simple Estimation (SE). They are applied to fuse the Multisensor Precipitation Estimator (MPE) and radar-only Next Generation QPE (Q2) products at the 15-min 1-km resolution (Experiment 1), and the MPE and Collaborative Adaptive Sensing of the Atmosphere (CASA) QPE products at the 15-min 500-m resolution (Experiment 2). The resulting fused estimates are evaluated using the 15-min rain gauge observations from the City of Grand Prairie in the Dallas-Fort Worth Metroplex (DFW) in north Texas. The main criterion used for evaluation is that the fused QPE improves over the ingredient QPEs at their native spatial resolutions, and that, at the higher resolution, the fused QPE improves not only over the ingredient higher-resolution QPE but also over the ingredient lower-resolution QPE trivially disaggregated using the ingredient high-resolution QPE. All four procedures assume that the ingredient QPEs are unbiased, which is not likely to hold true in reality even if real-time bias correction is in operation. To test robustness under more realistic conditions, the fusion procedures were evaluated with and
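    For unbiased collocated estimates, Fisher estimation in its simplest form reduces to an inverse-error-variance weighted average. The Python sketch below illustrates that baseline idea on a toy grid; it is not the paper's DE/BC/RBC/SE procedures, and the error variances are invented:

```python
import numpy as np

def fuse_inverse_variance(fields, variances):
    """Pixel-wise inverse-error-variance weighted average of collocated,
    assumed-unbiased precipitation fields (the simplest Fisher-type fusion)."""
    w = [1.0 / v for v in variances]
    num = sum(wi * np.asarray(f, float) for wi, f in zip(w, fields))
    return num / sum(w)

mpe = np.array([[2.0, 0.0], [1.5, 3.0]])   # coarse product resampled to the fine grid
q2 = np.array([[2.4, 0.2], [1.1, 3.6]])    # fine-resolution radar-only product
print(fuse_inverse_variance([mpe, q2], variances=[0.8, 0.4]))
```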

  5. On sweat analysis for quantitative estimation of dehydration during physical exercise.

    Science.gov (United States)

    Ring, Matthias; Lohmueller, Clemens; Rauh, Manfred; Eskofier, Bjoern M

    2015-08-01

    Quantitative estimation of water loss during physical exercise is important because dehydration can impair both muscular strength and aerobic endurance. A physiological indicator for a deficit of total body water (TBW) might be the concentration of electrolytes in sweat. It has been shown that concentrations differ after physical exercise depending on whether water loss was replaced by fluid intake or not. However, to the best of our knowledge, this fact has not been examined for its potential to quantitatively estimate TBW loss. Therefore, we conducted a study in which sweat samples were collected continuously during two hours of physical exercise without fluid intake. A statistical analysis of these sweat samples revealed significant correlations between the chloride concentration in sweat and TBW loss (r = 0.41), and between sweat osmolality and TBW loss (r = 0.43). These results suggest that TBW loss might be estimated from sweat samples.

  6. An operational weather radar-based Quantitative Precipitation Estimation and its application in catchment water resources modeling

    DEFF Research Database (Denmark)

    He, Xin; Vejen, Flemming; Stisen, Simon

    2011-01-01

    The Danish Meteorological Institute operates a radar network consisting of five C-band Doppler radars. Quantitative precipitation estimation (QPE) using radar data is performed on a daily basis. Radar QPE is considered to have the potential to significantly improve the spatial representation...... of precipitation compared with rain-gauge-based methods, thus providing the basis for better water resources assessments. The radar QPE algorithm called ARNE is a distance-dependent areal estimation method that merges radar data with ground surface observations. The method was applied to the Skjern River catchment...... in western Denmark where alternative precipitation estimates were also used as input to an integrated hydrologic model. The hydrologic responses from the model were analyzed by comparing radar- and ground-based precipitation input scenarios. Results showed that radar QPE products are able to generate

  7. Providing effective trauma care: the potential for service provider views to enhance the quality of care (qualitative study nested within a multicentre longitudinal quantitative study).

    Science.gov (United States)

    Beckett, Kate; Earthy, Sarah; Sleney, Jude; Barnes, Jo; Kellezi, Blerina; Barker, Marcus; Clarkson, Julie; Coffey, Frank; Elder, Georgina; Kendrick, Denise

    2014-07-08

    To explore views of service providers caring for injured people on: the extent to which services meet patients' needs and their perspectives on factors contributing to any identified gaps in service provision. Qualitative study nested within a quantitative multicentre longitudinal study assessing longer term impact of unintentional injuries in working age adults. Sampling frame for service providers was based on patient-reported service use in the quantitative study, patient interviews and advice of previously injured lay research advisers. Service providers' views were elicited through semistructured interviews. Data were analysed using thematic analysis. Participants were recruited from a range of settings and services in acute hospital trusts in four study centres (Bristol, Leicester, Nottingham and Surrey) and surrounding areas. 40 service providers from a range of disciplines. Service providers described two distinct models of trauma care: an 'ideal' model, informed by professional knowledge of the impact of injury and awareness of best models of care, and a 'real' model based on the realities of National Health Service (NHS) practice. Participants' 'ideal' model was consistent with standards of high-quality effective trauma care and while there were examples of services meeting the ideal model, 'real' care could also be fragmented and inequitable with major gaps in provision. Service provider accounts provide evidence of comprehensive understanding of patients' needs, awareness of best practice, compassion and research but reveal significant organisational and resource barriers limiting implementation of knowledge in practice. Service providers envisage an 'ideal' model of trauma care which is timely, equitable, effective and holistic, but this can differ from the care currently provided. Their experiences provide many suggestions for service improvements to bridge the gap between 'real' and 'ideal' care. Using service provider views to inform service design

  8. Quantitative shape analysis with weighted covariance estimates for increased statistical efficiency.

    Science.gov (United States)

    Ragheb, Hossein; Thacker, Neil A; Bromiley, Paul A; Tautz, Diethard; Schunke, Anja C

    2013-04-02

    The introduction and statistical formalisation of landmark-based methods for analysing biological shape has made a major impact on comparative morphometric analyses. However, a satisfactory solution for including information from 2D/3D shapes represented by 'semi-landmarks' alongside well-defined landmarks into the analyses is still missing. Also, current approaches do not integrate a statistical treatment of measurement error. We propose a procedure based upon the description of landmarks with measurement covariance, which extends statistical linear modelling processes to semi-landmarks for further analysis. Our formulation is based upon a self-consistent approach to the construction of likelihood-based parameter estimation and includes corrections for parameter bias, induced by the degrees of freedom within the linear model. The method has been implemented and tested on measurements from 2D fly wing, 2D mouse mandible and 3D mouse skull data. We use these data to explore possible advantages and disadvantages over the use of standard Procrustes/PCA analysis via a combination of Monte Carlo studies and quantitative statistical tests. In the process we show how appropriate weighting provides not only greater stability but also more efficient use of the available landmark data. The set of new landmarks generated in our procedure ('ghost points') can then be used in any further downstream statistical analysis. Our approach provides a consistent way of including different forms of landmarks into an analysis and reduces instabilities due to poorly defined points. Our results suggest that the method has the potential to be utilised for the analysis of 2D/3D data, and in particular, for the inclusion of information from surfaces represented by multiple landmark points.

  9. Predicting urban stormwater runoff with quantitative precipitation estimates from commercial microwave links

    Science.gov (United States)

    Pastorek, Jaroslav; Fencl, Martin; Stránský, David; Rieckermann, Jörg; Bareš, Vojtěch

    2017-04-01

    Reliable and representative rainfall data are crucial for urban runoff modelling. However, traditional precipitation measurement devices often fail to provide sufficient information about the spatial variability of rainfall, especially when heavy storm events (which determine the design of urban stormwater systems) are considered. Commercial microwave links (CMLs), typically very dense in urban areas, allow for indirect precipitation detection with the desired spatial and temporal resolution. Fencl et al. (2016) recognised the high bias in quantitative precipitation estimates (QPEs) from CMLs, which significantly limits their usability, and, in order to reduce the bias, suggested a novel method for adjusting the QPEs to existing rain gauge networks. Studies evaluating the potential of CMLs for rainfall detection have so far focused primarily on direct comparison of the QPEs from CMLs to ground observations. In contrast, this investigation evaluates the suitability of these innovative rainfall data for stormwater runoff modelling in a case study of a small urban catchment, ungauged in the long-term perspective, in Prague-Letňany, Czech Republic (Fencl et al., 2016). We compare the runoff measured at the outlet from the catchment with the outputs of a rainfall-runoff model operated using (i) CML data adjusted by distant rain gauges, (ii) rainfall data from the distant gauges alone and (iii) data from a single temporary rain gauge located directly in the catchment, as is common practice in drainage engineering. Uncertainties of the simulated runoff are analysed using the Bayesian method for uncertainty evaluation incorporating a statistical bias description as formulated by Del Giudice et al. (2013). Our results show that adjusted CML data are able to yield reliable runoff modelling results, primarily for rainfall events of convective character. Performance statistics, most significantly the timing of maximal discharge, reach better (less uncertain) values with the adjusted CML data

  10. Do group-specific equations provide the best estimates of stature?

    Science.gov (United States)

    Albanese, John; Osley, Stephanie E; Tuck, Andrew

    2016-04-01

    An estimate of stature can be used by a forensic anthropologist with the preliminary identification of an unknown individual when human skeletal remains are recovered. Fordisc is a computer application that can be used to estimate stature; like many other methods, it requires the user to assign an unknown individual to a specific group defined by sex, race/ancestry, and century of birth before an equation is applied. The assumption is that a group-specific equation controls for group differences and should provide the best results most often. In this paper we assess the utility and benefits of using group-specific equations to estimate stature using Fordisc. Using the maximum length of the humerus and the maximum length of the femur from individuals with documented stature, we address the question: do sex-, race/ancestry- and century-specific stature equations provide the best results when estimating stature? The data for our sample of 19th Century White males (n=28) were entered into Fordisc and stature was estimated using 22 different equation options for a total of 616 trials: 19th and 20th Century Black males, 19th and 20th Century Black females, 19th and 20th Century White females, 19th and 20th Century White males, 19th and 20th Century any, and 20th Century Hispanic males. The equations were assessed for utility in any one case (how many times the estimated range bracketed the documented stature) and in aggregate using 1-way ANOVA and other approaches. The group-specific equation that should have provided the best results was outperformed by several other equations for both the femur and the humerus. These results suggest that group-specific equations do not provide better results for estimating stature, while at the same time being more difficult to apply, because an unknown individual must be allocated to a given group before stature can be estimated. Copyright © 2016 Elsevier Ireland Ltd. All rights reserved.

  11. Software project estimation the fundamentals for providing high quality information to decision makers

    CERN Document Server

    Abran, Alain

    2015-01-01

    Software projects are often late and over-budget, and this leads to major problems for software customers. Clearly, there is a serious issue in estimating a realistic software project budget. Furthermore, generic estimation models cannot be trusted to provide credible estimates for projects as complex as software projects. This book presents a number of examples using data collected over the years from various organizations building software. It also presents an overview of the not-for-profit organization which collects data on software projects, the International Software Benchmarking Stan

  12. FPGA-based fused smart-sensor for tool-wear area quantitative estimation in CNC machine inserts.

    Science.gov (United States)

    Trejo-Hernandez, Miguel; Osornio-Rios, Roque Alfredo; de Jesus Romero-Troncoso, Rene; Rodriguez-Donate, Carlos; Dominguez-Gonzalez, Aurelio; Herrera-Ruiz, Gilberto

    2010-01-01

    Manufacturing processes are of great relevance nowadays, when there is a constant claim for better productivity with high quality at low cost. The contribution of this work is the development of an FPGA-based fused smart-sensor to improve the online quantitative estimation of flank-wear area in CNC machine inserts from the information provided by two primary sensors: the monitoring current output of a servoamplifier, and a 3-axis accelerometer. Results from experimentation show that the fusion of both parameters makes it possible to obtain three times better accuracy than that obtained from the current and vibration signals used individually.

  13. FPGA-Based Fused Smart-Sensor for Tool-Wear Area Quantitative Estimation in CNC Machine Inserts

    Directory of Open Access Journals (Sweden)

    Miguel Trejo-Hernandez

    2010-04-01

    Manufacturing processes are of great relevance nowadays, when there is a constant claim for better productivity with high quality at low cost. The contribution of this work is the development of an FPGA-based fused smart-sensor to improve the online quantitative estimation of flank-wear area in CNC machine inserts from the information provided by two primary sensors: the monitoring current output of a servoamplifier, and a 3-axis accelerometer. Results from experimentation show that the fusion of both parameters makes it possible to obtain three times better accuracy than that obtained from the current and vibration signals used individually.

  14. MRI estimates of brain iron concentration in normal aging using quantitative susceptibility mapping.

    Science.gov (United States)

    Bilgic, Berkin; Pfefferbaum, Adolf; Rohlfing, Torsten; Sullivan, Edith V; Adalsteinsson, Elfar

    2012-02-01

    Quantifying tissue iron concentration in vivo is instrumental for understanding the role of iron in physiology and in neurological diseases associated with abnormal iron distribution. Herein, we use recently developed Quantitative Susceptibility Mapping (QSM) methodology to estimate the tissue magnetic susceptibility based on MRI signal phase. To investigate the effect of different regularization choices, we implement and compare ℓ1- and ℓ2-norm regularized QSM algorithms. These regularized approaches solve for the underlying magnetic susceptibility distribution, a sensitive measure of the tissue iron concentration, that gives rise to the observed signal phase. Regularized QSM methodology also involves a pre-processing step that removes, by dipole fitting, unwanted background phase effects due to bulk susceptibility variations between air and tissue and requires data acquisition only at a single field strength. For validation, the performance of the two QSM methods was measured against published estimates of regional brain iron from postmortem and in vivo data. The in vivo comparison was based on data previously acquired using Field-Dependent Relaxation Rate Increase (FDRI), an estimate of MRI relaxivity enhancement due to increased main magnetic field strength, requiring data acquired at two different field strengths. The QSM analysis was based on susceptibility-weighted images acquired at 1.5 T, whereas the FDRI analysis used Multi-Shot Echo-Planar Spin Echo images collected at 1.5 T and 3.0 T. Both datasets were collected in the same healthy young and elderly adults. The in vivo estimates of regional iron concentration comported well with published postmortem measurements; both QSM approaches yielded the same rank ordering of iron concentration by brain structure, with the lowest in white matter and the highest in globus pallidus. Further validation was provided by comparison of the in vivo measurements, ℓ1-regularized QSM versus FDRI and ℓ2-regularized QSM
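    ℓ2-regularized QSM admits a closed-form solution in k-space: the local field map is deconvolved with the unit dipole kernel under a Tikhonov penalty. A minimal Python sketch assuming a background-removed field map and B0 along z (the regularization weight is invented; this is not the paper's solver):

```python
import numpy as np

def l2_qsm(local_field, voxel_size=(1.0, 1.0, 1.0), lam=0.05):
    """Closed-form Tikhonov (l2) dipole inversion in k-space:
    chi = argmin ||D chi - f||^2 + lam ||chi||^2,
    for a background-removed local field map f (in ppm), B0 along z."""
    kx = np.fft.fftfreq(local_field.shape[0], voxel_size[0])
    ky = np.fft.fftfreq(local_field.shape[1], voxel_size[1])
    kz = np.fft.fftfreq(local_field.shape[2], voxel_size[2])
    KX, KY, KZ = np.meshgrid(kx, ky, kz, indexing="ij")
    k2 = KX**2 + KY**2 + KZ**2
    with np.errstate(divide="ignore", invalid="ignore"):
        D = 1.0 / 3.0 - KZ**2 / k2        # unit dipole kernel
    D[k2 == 0.0] = 0.0                    # undefined DC term set to zero
    chi_k = D * np.fft.fftn(local_field) / (D**2 + lam)
    return np.real(np.fft.ifftn(chi_k))

# Toy use: a 32^3 field map of zeros with a small "source" region
field = np.zeros((32, 32, 32))
field[14:18, 14:18, 14:18] = 0.1
chi = l2_qsm(field, lam=0.05)
```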

  15. Epithelium percentage estimation facilitates epithelial quantitative protein measurement in tissue specimens.

    Science.gov (United States)

    Chen, Jing; Toghi Eshghi, Shadi; Bova, George Steven; Li, Qing Kay; Li, Xingde; Zhang, Hui

    2013-12-01

    The rapid advancement of high-throughput tools for quantitative measurement of proteins has demonstrated the potential for the identification of proteins associated with cancer. However, quantitative results on cancer tissue specimens are usually confounded by tissue heterogeneity, e.g. regions with cancer usually have significantly higher epithelium content yet lower stromal content. It is therefore necessary to develop a tool to facilitate the interpretation of the results of protein measurements in tissue specimens. Epithelial cell adhesion molecule (EpCAM) and cathepsin L (CTSL) are two epithelial proteins whose expression in normal and tumorous prostate tissues was confirmed by measuring staining intensity with immunohistochemical (IHC) staining. The expression of these proteins was measured by ELISA in protein extracts from OCT-embedded frozen prostate tissues. To eliminate the influence of tissue heterogeneity on epithelial protein quantification measured by ELISA, a color-based segmentation method was developed in-house for estimating epithelium content using H&E histology slides from the same prostate tissues, and the estimated epithelium percentage was used to normalize the ELISA results. The epithelium content of the same slides was also estimated by a pathologist and used to normalize the ELISA results. The computer-based results were compared with the pathologist's reading. We found that both EpCAM and CTSL levels, as measured by the ELISA assay itself, were greatly affected by the epithelium content of the tissue specimens. Without adjusting for epithelium percentage, both EpCAM and CTSL levels appeared significantly higher in tumor tissues than normal tissues, with a p value less than 0.001. However, after normalization by the epithelium percentage, ELISA measurements of both EpCAM and CTSL were in agreement with IHC staining results, showing a significant increase only in EpCAM with no difference in CTSL expression in cancer tissues. These results
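    The normalization step described above is arithmetically simple: divide the bulk-tissue ELISA value by the estimated epithelium fraction. A minimal Python sketch with invented numbers, showing how an apparent tumor-normal difference can vanish after adjustment:

```python
def normalize_by_epithelium(elisa_value, epithelium_fraction):
    """Express a bulk-tissue ELISA measurement per unit of epithelium, so
    that specimens with different epithelium content become comparable."""
    if not 0.0 < epithelium_fraction <= 1.0:
        raise ValueError("epithelium fraction must be in (0, 1]")
    return elisa_value / epithelium_fraction

# Invented numbers: tumor-rich (80% epithelium) vs normal (30% epithelium)
print(normalize_by_epithelium(8.0, 0.8))   # 10.0 -> the apparent difference...
print(normalize_by_epithelium(3.0, 0.3))   # 10.0 -> ...vanishes after adjustment
```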

  16. Comparison of quantitative flow cytometric data provided by panels with lower and higher color numbers

    Science.gov (United States)

    Bocsi, József; Mittag, Anja; Pierzchalski, Arkadiusz; Baumgartner, Adolf; Dähnert, Ingo; Tárnok, Attila

    2012-03-01

    The flow cytometry (FCM) industry continues to introduce new generations of commercial clinical instruments. Long-term clinical studies face the dilemma that moving to new instruments capable of more complex cell analysis makes it difficult to compare new data with data obtained on older instruments with less complex analysis panels. For 15 years we have conducted follow-up studies on children with congenital heart diseases. In this period we moved from 2- to 3- and now to 10-color FCM immunophenotyping panels. The question arises of how to compare and transfer data from lower to higher levels of complexity. Two comparable antibody panels for leukocyte immunophenotyping (12-tube 2-color and 9-tube 4-color) were measured on a BD FACSCalibur flow cytometer (calibration: Spherotech beads) in 19 blood samples from children with congenital heart disease. The increase in colors was accompanied by moving antibodies that were either FITC- or PE-labeled in the 2-color panel to red dyes such as PerCP or APC. Algorithms were developed for bridging data for quantitative characterization of antigen expression (mean fluorescence intensity) and frequency of different cell subpopulations in combination with rainbow bead standard data. This approach worked well for the most relevant antibodies (CD3, CD4, CD8, etc.) but left substantial uncertainty for activation markers (CD69, etc.). Our techniques are particularly well suited to analysis in long-term studies and have the potential to compare older and recent results in a standardized way.
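    One simple form such a bridging algorithm can take is a log-linear mapping fitted to the same rainbow bead peaks measured under both panel configurations. The sketch below uses hypothetical bead intensities and is an illustration, not the authors' published algorithm.

```python
import numpy as np

def bead_calibration(bead_mfi_old, bead_mfi_new):
    """Fit a log-linear mapping from the old-panel to the new-panel intensity
    scale, using the same rainbow bead peaks measured under both setups."""
    slope, intercept = np.polyfit(np.log10(bead_mfi_old), np.log10(bead_mfi_new), 1)
    return slope, intercept

def bridge_mfi(mfi_old, slope, intercept):
    """Transfer a mean fluorescence intensity onto the new panel's scale."""
    return 10 ** (slope * np.log10(mfi_old) + intercept)

# Hypothetical 8-peak rainbow bead readings from the two panel configurations.
old_peaks = np.array([120, 350, 900, 2500, 7000, 19000, 52000, 140000])
new_peaks = np.array([95, 300, 810, 2300, 6600, 18500, 50000, 138000])
s, b = bead_calibration(old_peaks, new_peaks)
print(bridge_mfi(4200.0, s, b))  # a CD4 MFI re-expressed on the new scale
```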

  17. Toward quantitative estimation of material properties with dynamic mode atomic force microscopy: a comparative study

    Science.gov (United States)

    Ghosal, Sayan; Gannepalli, Anil; Salapaka, Murti

    2017-08-01

    In this article, we explore methods that enable estimation of material properties with dynamic mode atomic force microscopy suitable for soft matter investigation. The article presents the viewpoint of casting the system, comprising a flexure probe interacting with the sample, as an equivalent cantilever system, and compares a steady-state analysis based method with a recursive estimation technique for determining the parameters of the equivalent cantilever system in real time. The steady-state analysis of the equivalent cantilever model, which has been implicitly assumed in studies on material property determination, is validated analytically and experimentally. We show that the steady-state based technique yields results that quantitatively agree with the recursive method in the domain of its validity. The steady-state technique is considerably simpler to implement, but slower than the recursive technique. The parameters of the equivalent system are utilized to interpret the storage and dissipative properties of the sample. Finally, the article identifies key pitfalls that need to be avoided toward the quantitative estimation of material properties.
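    As a sketch of the steady-state idea, the parameters of an equivalent driven damped oscillator can be recovered in closed form from the measured amplitude and phase at the drive frequency; the operating-point numbers below are hypothetical, and this is a generic illustration rather than the paper's method.

```python
import numpy as np

def equivalent_params(amp_m, phase_rad, drive_force_n, omega_rad_s, mass_kg):
    """Steady-state inversion of m*x'' + c*x' + k*x = F0*cos(w*t), with
    response x = A*cos(w*t - phi):
        k = m*w^2 + (F0/A)*cos(phi),   c = (F0/A)*sin(phi) / w."""
    ratio = drive_force_n / amp_m
    k = mass_kg * omega_rad_s**2 + ratio * np.cos(phase_rad)
    c = ratio * np.sin(phase_rad) / omega_rad_s
    return k, c

# Hypothetical dynamic-mode operating point: 20 nm amplitude, 60 deg lag.
print(equivalent_params(amp_m=20e-9, phase_rad=np.pi / 3,
                        drive_force_n=1e-9, omega_rad_s=2 * np.pi * 70e3,
                        mass_kg=5e-12))
```

    Changes in the recovered equivalent stiffness k and damping c as the probe engages the sample are what get interpreted as the storage and dissipative properties, respectively.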

  18. Speech graphs provide a quantitative measure of thought disorder in psychosis.

    Directory of Open Access Journals (Sweden)

    Natalia B Mota

    Full Text Available BACKGROUND: Psychosis has various causes, including mania and schizophrenia. Since the differential diagnosis of psychosis is exclusively based on subjective assessments of oral interviews with patients, an objective quantification of the speech disturbances that characterize mania and schizophrenia is in order. In principle, such quantification could be achieved by the analysis of speech graphs. A graph represents a network with nodes connected by edges; in speech graphs, nodes correspond to words and edges correspond to semantic and grammatical relationships. METHODOLOGY/PRINCIPAL FINDINGS: To quantify speech differences related to psychosis, interviews with schizophrenics, manics and normal subjects were recorded and represented as graphs. Manics scored significantly higher than schizophrenics in ten graph measures. Psychopathological symptoms such as logorrhea, poor speech, and flight of thoughts were grasped by the analysis even when verbosity differences were discounted. Binary classifiers based on speech graph measures sorted schizophrenics from manics with up to 93.8% sensitivity and 93.7% specificity. In contrast, sorting based on the scores of two standard psychiatric scales (BPRS and PANSS) reached only 62.5% sensitivity and specificity. CONCLUSIONS/SIGNIFICANCE: The results demonstrate that alterations of the thought process manifested in the speech of psychotic patients can be objectively measured using graph-theoretical tools, developed to capture specific features of the normal and dysfunctional flow of thought, such as divergence and recurrence. The quantitative analysis of speech graphs is not redundant with standard psychometric scales but rather complementary, as it yields a very accurate sorting of schizophrenics and manics. Overall, the results point to automated psychiatric diagnosis based not on what is said, but on how it is said.
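    A minimal version of the graph construction (nodes as words, directed edges linking consecutive words) and a few representative measures can be sketched as below; the study's actual measure set is richer, so this is illustrative only.

```python
import networkx as nx

def speech_graph(transcript):
    """Word-trajectory graph: nodes are words, directed edges join consecutive words."""
    words = transcript.lower().split()
    g = nx.DiGraph()
    g.add_edges_from(zip(words, words[1:]))
    return g

def graph_measures(g):
    """A small subset of graph attributes of the kind used to separate groups."""
    components = list(nx.connected_components(g.to_undirected()))
    return {
        "nodes": g.number_of_nodes(),
        "edges": g.number_of_edges(),
        "largest_connected_component": max(len(c) for c in components),
        "average_total_degree": 2 * g.number_of_edges() / g.number_of_nodes(),
        "density": nx.density(g),
    }

g = speech_graph("i went out and then i went home and then i slept")
print(graph_measures(g))  # recurrent words create loops, raising edge counts
```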

  19. A unified Maximum Likelihood framework for simultaneous motion and T1 estimation in quantitative MR T1 mapping

    NARCIS (Netherlands)

    Ramos-Llorden, Gabriel; den Dekker, A.J.; Van Steenkiste, G.; Jeurissen, Ben; Vanhevel, Floris; Audekerke, Johan Van; Verhoye, Marleen; Sijbers, Jan

    2017-01-01

    In quantitative MR T1 mapping, the spin-lattice relaxation time T1 of tissues is estimated from a series of T1-weighted images. As the T1 estimation is a voxel-wise estimation procedure, correct spatial alignment of the T1-weighted images is crucial. Conventionally, the T1-weighted images are
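    As an illustration of the voxel-wise estimation step, a magnitude inversion-recovery model can be fitted per voxel as below. This sketch assumes an inversion-recovery acquisition with already-aligned images (the motion-correction component of the unified framework is omitted) and uses synthetic data.

```python
import numpy as np
from scipy.optimize import curve_fit

def ir_signal(ti_ms, a, t1_ms):
    """Magnitude inversion-recovery model |A * (1 - 2*exp(-TI/T1))|."""
    return np.abs(a * (1.0 - 2.0 * np.exp(-ti_ms / t1_ms)))

def fit_t1(ti_ms, signal):
    """Single-voxel T1 fit across a series of T1-weighted images."""
    p0 = (signal.max(), 800.0)  # initial guess: amplitude, T1 in ms
    popt, _ = curve_fit(ir_signal, ti_ms, signal, p0=p0, maxfev=5000)
    return popt[1]

ti = np.array([50, 150, 400, 800, 1600, 3200.0])  # inversion times in ms
noisy = ir_signal(ti, 1000.0, 900.0) + np.random.default_rng(0).normal(0, 5, ti.size)
print(fit_t1(ti, noisy))  # recovers roughly 900 ms
```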

  20. RESEARCH CONCERNING THE ESTIMATION OF QUANTITATIVE AND QUALITATIVE PHYSIOLOGICAL GROUPS OF BACTERIA IN PEAT SAMPLES

    Directory of Open Access Journals (Sweden)

    ADRIANA CRISTE

    2008-05-01

    Full Text Available The total aerobic microflora can be determined on solid media for aerobic bacteria, which reveals the quantity of microorganisms in the peat samples. The quantitative evaluation was done using solid nutrient media, which allows estimation of the number of CFU/g as well as observation of colony morphology, and characterization of the isolated strains by their morphological and biochemical properties. The evaluations were done through the dilution method, using selective liquid media. Every day the characteristic reaction of the respective group was observed, either through metabolization of the substrate or through the appearance of a catabolic product in the medium.
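    The CFU/g arithmetic behind such dilution plate counts is straightforward; the sample mass, diluent volume, and colony count below are hypothetical.

```python
def cfu_per_gram(colonies, dilution_factor, plated_volume_ml,
                 sample_mass_g, suspension_volume_ml):
    """CFU/g from a plate count on one dilution of a peat suspension."""
    cfu_per_ml = colonies * dilution_factor / plated_volume_ml
    return cfu_per_ml * suspension_volume_ml / sample_mass_g

# 10 g of peat in 90 mL diluent (~100 mL suspension); plating 0.1 mL of the
# 10^-4 dilution yields 37 colonies.
print(cfu_per_gram(37, 10**4, 0.1, 10.0, 100.0))  # ~3.7e7 CFU/g
```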

  1. Moving Beyond Blind Men and Elephants: Providing Total Estimated Annual Costs Improves Health Insurance Decision Making.

    Science.gov (United States)

    Barnes, Andrew J; Hanoch, Yaniv; Rice, Thomas; Long, Sharon K

    2017-10-01

    Health insurance is among the most important financial and health-related decisions that people make. Choosing a health insurance plan that offers sufficient risk protection is difficult, in part because total expected health care costs are not transparent. This study examines the effect of providing total cost estimates on health insurance decisions using a series of hypothetical choice experiments given to 7,648 individuals responding to the fall 2015 Health Reform Monitoring Survey. Participants were given two health scenarios presented in random order asking which of three insurance plans would best meet their needs. Half received total estimated costs, which increased the probability of choosing a cost-minimizing plan by 3.0 to 10.6 percentage points, depending on the scenario (p < .01). With many consumers choosing or failing to switch out of plans that offer insufficient coverage, incorporating insights on consumer decision making with personalized information to estimate costs can improve the quality of health insurance choices.
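    The kind of total-cost summary tested in these experiments can be illustrated with a simple calculation of annual premiums plus expected out-of-pocket spending; the plan figures below are hypothetical.

```python
def total_estimated_annual_cost(monthly_premium, expected_out_of_pocket):
    """Annual premiums plus expected out-of-pocket spending for one plan."""
    return 12 * monthly_premium + expected_out_of_pocket

# (monthly premium, expected out-of-pocket) for three hypothetical plans
plans = {"bronze": (250.0, 3800.0), "silver": (340.0, 2400.0), "gold": (450.0, 1500.0)}
costs = {name: total_estimated_annual_cost(p, oop) for name, (p, oop) in plans.items()}
print(min(costs, key=costs.get), costs)  # 'silver' minimizes total cost here
```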

  2. Reproducibility of CSF quantitative culture methods for estimating rate of clearance in cryptococcal meningitis.

    Science.gov (United States)

    Dyal, Jonathan; Akampurira, Andrew; Rhein, Joshua; Morawski, Bozena M; Kiggundu, Reuben; Nabeta, Henry W; Musubire, Abdu K; Bahr, Nathan C; Williams, Darlisha A; Bicanic, Tihana; Larsen, Robert A; Meya, David B; Boulware, David R

    2016-05-01

    Quantitative cerebrospinal fluid (CSF) cultures provide a measure of disease severity in cryptococcal meningitis. The fungal clearance rate by quantitative cultures has become a primary endpoint for phase II clinical trials. This study determined the inter-assay accuracy of three different quantitative culture methodologies. Among 91 participants with meningitis symptoms in Kampala, Uganda, during August-November 2013, 305 CSF samples were prospectively collected from patients at multiple time points during treatment. Samples were simultaneously cultured by three methods: (1) St. George's method, using a 100 mcl input volume of CSF with five 1:10 serial dilutions; (2) the AIDS Clinical Trials Group (ACTG) method, using 1000, 100, and 10 mcl input volumes, and two 1:100 dilutions with 100 and 10 mcl input volume per dilution, on seven agar plates; and (3) a 10 mcl calibrated loop of undiluted and 1:100 diluted CSF (loop). Quantitative culture values did not statistically differ between the St. George and ACTG methods (P = .09) but did for the St. George and 10 mcl loop methods (P < .001). Correlation between methods was high (r ≥ 0.88). For detecting sterility, the ACTG method had the highest negative predictive value of 97% (91% St. George, 60% loop), but the ACTG method had occasional (∼10%) difficulties in quantification due to colony clumping. For CSF clearance rate, the St. George and ACTG methods did not differ overall (mean −0.05 ± 0.07 log10 CFU/mL/day; P = .14) on a group level; however, individual-level clearance varied. The St. George and ACTG quantitative CSF culture methods produced comparable but not identical results. Quantitative cultures can inform treatment management strategies. © The Author 2016. Published by Oxford University Press on behalf of The International Society for Human and Animal Mycology. All rights reserved. For permissions, please e-mail: journals.permissions@oup.com.
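    The clearance-rate endpoint referenced here is the slope of log10(CFU/mL) over time, which can be computed as below with hypothetical serial culture values.

```python
import numpy as np

def clearance_rate(days, cfu_per_ml):
    """Slope of log10(CFU/mL) vs time, in log10 CFU/mL/day (negative = clearing)."""
    slope, _intercept = np.polyfit(days, np.log10(cfu_per_ml), 1)
    return slope

days = np.array([0, 3, 7, 10, 14])
cfu = np.array([1e5, 3e4, 4e3, 8e2, 1e2])
print(clearance_rate(days, cfu))  # ~ -0.21 log10 CFU/mL/day
```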

  3. Quantitative Cyber Risk Reduction Estimation Methodology for a Small Scada Control System

    Energy Technology Data Exchange (ETDEWEB)

    Miles A. McQueen; Wayne F. Boyer; Mark A. Flynn; George A. Beitel

    2006-01-01

    We propose a new methodology for obtaining a quick quantitative measurement of the risk reduction achieved when a control system is modified with the intent to improve cyber security defense against external attackers. The proposed methodology employs a directed graph called a compromise graph, where the nodes represent stages of a potential attack and the edges represent the expected time-to-compromise for differing attacker skill levels. Time-to-compromise is modeled as a function of known vulnerabilities and attacker skill level. The methodology was used to calculate risk reduction estimates for a specific SCADA system and for a specific set of control system security remedial actions. Despite an 86% reduction in the total number of vulnerabilities, the estimated time-to-compromise was increased by only about 3 to 30%, depending on target and attacker skill level.
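    The core computation can be sketched as a shortest-path search over a weighted directed graph, re-run after a remedial action changes an edge's expected time-to-compromise. The topology and weights below are hypothetical, not the paper's case study.

```python
import networkx as nx

# Hypothetical compromise graph: nodes are attack stages, edge weights are
# expected time-to-compromise (days) for one attacker skill level.
g = nx.DiGraph()
g.add_weighted_edges_from([
    ("internet", "dmz_server", 2.0),
    ("dmz_server", "historian", 5.0),
    ("internet", "vpn_gateway", 9.0),
    ("vpn_gateway", "control_lan", 3.0),
    ("historian", "control_lan", 4.0),
    ("control_lan", "plc", 1.5),
])

# Dominant attack path = minimum total expected time-to-compromise.
t_before = nx.shortest_path_length(g, "internet", "plc", weight="weight")

# A remedial action (e.g., hardening the DMZ server) raises one edge weight.
g["internet"]["dmz_server"]["weight"] = 8.0
t_after = nx.shortest_path_length(g, "internet", "plc", weight="weight")
print(t_before, t_after, (t_after - t_before) / t_before)  # 12.5 -> 13.5, +8%
```

    The small relative increase despite a large edge change mirrors the paper's observation: alternative attack paths can cap the benefit of removing individual vulnerabilities.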

  4. Comparison of the scanning linear estimator (SLE) and ROI methods for quantitative SPECT imaging

    Science.gov (United States)

    Könik, Arda; Kupinski, Meredith; Hendrik Pretorius, P.; King, Michael A.; Barrett, Harrison H.

    2015-08-01

    In quantitative emission tomography, tumor activity is typically estimated from calculations on a region of interest (ROI) identified in the reconstructed slices. In these calculations, unpredictable bias arising from the null functions of the imaging system affects ROI estimates. The magnitude of this bias depends upon the tumor size and location. In prior work it has been shown that the scanning linear estimator (SLE), which operates on the raw projection data, is an unbiased estimator of activity when the size and location of the tumor are known. In this work, we performed analytic simulation of SPECT imaging with a parallel-hole medium-energy collimator. Distance-dependent system spatial resolution and non-uniform attenuation were included in the imaging simulation. We compared the task of activity estimation by the ROI and SLE methods for a range of tumor sizes (diameter: 1-3 cm) and activities (contrast ratio: 1-10) added to uniform and non-uniform liver backgrounds. Using the correct value for the tumor shape and location is an idealized approximation to how task estimation would occur clinically. Thus we determined how perturbing this idealized prior knowledge impacted the performance of both techniques. To implement the SLE for the non-uniform background, we used a novel iterative algorithm for pre-whitening stationary noise within a compact region. Estimation task performance was compared using the ensemble mean-squared error (EMSE) as the criterion. The SLE method performed substantially better than the ROI method (i.e. EMSE(SLE) was 23-174 times lower) when the background is uniform and tumor location and size are known accurately. The variance of the SLE increased when a non-uniform liver texture was introduced but the EMSE(SLE) continued to be 5-20 times lower than the ROI method. In summary, SLE outperformed ROI under almost all conditions that we tested.

  5. Uncertainty in Quantitative Precipitation Estimates and Forecasts in a Hydrologic Modeling Context (Invited)

    Science.gov (United States)

    Gourley, J. J.; Kirstetter, P.; Hong, Y.; Hardy, J.; Flamig, Z.

    2013-12-01

    This study presents a methodology to account for uncertainty in radar-based rainfall rate estimation using NOAA/NSSL's Multi-Radar Multi-Sensor (MRMS) products. The focus of the study is on flood forecasting, including flash floods, in ungauged catchments throughout the conterminous US. An error model is used to derive probability distributions of rainfall rates that explicitly account for rain typology and uncertainty in the reflectivity-to-rainfall relationships. This approach preserves the fine space/time sampling properties (2 min/1 km) of the radar and conditions probabilistic quantitative precipitation estimates (PQPE) on the rain rate and rainfall type. Uncertainty in rainfall amplitude is the primary factor that is accounted for in the PQPE development. Additional uncertainties due to rainfall structures, locations, and timing must be considered when using quantitative precipitation forecast (QPF) products as forcing to a hydrologic model. A new method will be presented that shows how QPF ensembles are used in a hydrologic modeling context to derive probabilistic flood forecast products. This method considers the forecast rainfall intensity and morphology superimposed on pre-existing hydrologic conditions to identify basin scales that are most at risk.

  6. The Fidelity Index provides a systematic quantitation of star activity of DNA restriction endonucleases.

    Science.gov (United States)

    Wei, Hua; Therrien, Caitlin; Blanchard, Aine; Guan, Shengxi; Zhu, Zhenyu

    2008-05-01

    Restriction endonucleases are the basic tools of molecular biology. Many restriction endonucleases show relaxed sequence recognition, called star activity, as an inherent property under various digestion conditions including the optimal ones. To quantify this property we propose the concept of the Fidelity Index (FI), which is defined as the ratio of the maximum enzyme amount showing no star activity to the minimum amount needed for complete digestion at the cognate recognition site for any particular restriction endonuclease. Fidelity indices for a large number of restriction endonucleases are reported here. The effects of reaction vessel, reaction volume, incubation mode, substrate differences, reaction time, reaction temperature and additional glycerol, DMSO, ethanol and Mn(2+) on the FI are also investigated. The FI provides a practical guideline for the use of restriction endonucleases and defines a fundamental property by which restriction endonucleases can be characterized.
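    The FI itself is a simple ratio, as the sketch below illustrates with hypothetical titration values.

```python
def fidelity_index(max_units_no_star, min_units_complete_digestion):
    """FI = highest enzyme amount showing no star activity divided by the
    lowest amount needed for complete digestion at the cognate site."""
    return max_units_no_star / min_units_complete_digestion

# Hypothetical titration: no star activity up to 64 units; complete
# digestion achieved from 0.5 units upward.
print(fidelity_index(64.0, 0.5))  # FI = 128
```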

  7. Quantifying the Extent of Emphysema: Factors Associated with Radiologists' Estimations and Quantitative Indices of Emphysema Severity Using the ECLIPSE Cohort

    NARCIS (Netherlands)

    Gietema, Hester A.; Mueller, Nestor L.; Fauerbach, Paola V. Nasute; Sharma, Sanjay; Edwards, Lisa D.; Camp, Pat G.; Coxson, Harvey O.

    Rationale and Objectives: This study investigated what factors radiologists take into account when estimating emphysema severity and assessed quantitative computed tomography (CT) measurements of low attenuation areas. Materials and Methods: CT scans and spirometry were obtained on 1519 chronic

  8. Estimation of the interrelation between qualitative and quantitative characteristics affecting tourist numbers in the hospitality industry

    Directory of Open Access Journals (Sweden)

    Tatyana P. Levchenko

    2011-01-01

    Full Text Available The article considers methods for estimating the interrelation between qualitative and quantitative characteristics that affect tourist numbers in the hospitality industry, and offers current techniques for calculating these indicators.

  9. Noninvasive IDH1 mutation estimation based on a quantitative radiomics approach for grade II glioma

    Energy Technology Data Exchange (ETDEWEB)

    Yu, Jinhua [Fudan University, Department of Electronic Engineering, Shanghai (China); Computing and Computer-Assisted Intervention, Key Laboratory of Medical Imaging, Shanghai (China); Shi, Zhifeng; Chen, Liang; Mao, Ying [Fudan University, Department of Neurosurgery, Huashan Hospital, Shanghai (China); Lian, Yuxi; Li, Zeju; Liu, Tongtong; Gao, Yuan; Wang, Yuanyuan [Fudan University, Department of Electronic Engineering, Shanghai (China)

    2017-08-15

    The status of isocitrate dehydrogenase 1 (IDH1) is highly correlated with the development, treatment and prognosis of glioma. We explored a noninvasive method to reveal IDH1 status by using a quantitative radiomics approach for grade II glioma. A primary cohort consisting of 110 patients pathologically diagnosed with grade II glioma was retrospectively studied. The radiomics method developed in this paper includes image segmentation, high-throughput feature extraction, radiomics sequencing, feature selection and classification. Using the leave-one-out cross-validation (LOOCV) method, the classification result was compared with the true IDH1 status from Sanger sequencing. Another independent validation cohort containing 30 patients was utilised to further test the method. A total of 671 high-throughput features were extracted and quantized; 110 features were selected by an improved genetic algorithm. In LOOCV, the noninvasive IDH1 status estimation based on the proposed approach presented an estimation accuracy of 0.80, sensitivity of 0.83 and specificity of 0.74. Area under the receiver operating characteristic curve reached 0.86. Further validation on the independent cohort of 30 patients produced similar results. Radiomics is a potentially useful approach for estimating IDH1 mutation status noninvasively using conventional T2-FLAIR MRI images. The estimation accuracy could potentially be improved by using multiple imaging modalities. (orig.)
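    A minimal sketch of the leave-one-out evaluation loop is shown below with stand-in features and labels; the study's actual pipeline (genetic-algorithm feature selection and its particular classifier) is not reproduced here.

```python
import numpy as np
from sklearn.model_selection import LeaveOneOut
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC
from sklearn.metrics import accuracy_score

def loocv_accuracy(features, labels):
    """Leave-one-out cross-validated classification accuracy."""
    preds = np.empty_like(labels)
    for train, test in LeaveOneOut().split(features):
        model = make_pipeline(StandardScaler(), SVC(kernel="linear"))
        model.fit(features[train], labels[train])
        preds[test] = model.predict(features[test])
    return accuracy_score(labels, preds)

rng = np.random.default_rng(1)
X = rng.normal(size=(110, 20))  # stand-in for selected radiomic features
y = rng.integers(0, 2, 110)     # stand-in for sequenced IDH1 status
print(loocv_accuracy(X, y))     # ~0.5 on random data, as expected
```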

  10. Visual estimation versus different quantitative coronary angiography methods to assess lesion severity in bifurcation lesions.

    Science.gov (United States)

    Grundeken, Maik J; Collet, Carlos; Ishibashi, Yuki; Généreux, Philippe; Muramatsu, Takashi; LaSalle, Laura; Kaplan, Aaron V; Wykrzykowska, Joanna J; Morel, Marie-Angèle; Tijssen, Jan G; de Winter, Robbert J; Onuma, Yoshinobu; Leon, Martin B; Serruys, Patrick W

    2017-08-24

    To compare visual estimation with different quantitative coronary angiography (QCA) methods (single-vessel versus bifurcation software) to assess coronary bifurcation lesions. QCA has been developed to overcome the limitations of visual estimation. Conventional QCA, however, developed in "straight vessels," has proved to be inaccurate in bifurcation lesions. Therefore, bifurcation QCA was developed. However, the impact of these different modalities on bifurcation lesion severity classification is as yet unknown. METHODS: From a randomized controlled trial investigating a novel bifurcation stent (Clinicaltrials.gov NCT01258972), patients with baseline assessment of lesion severity by means of visual estimation, single-vessel QCA, 2D bifurcation QCA and 3D bifurcation QCA were included. We included 113 bifurcation lesions in which all 5 modalities were assessed. The primary end-point was to evaluate how the different modalities affected the classification of bifurcation lesion severity and extent of disease. On visual estimation, 100% of lesions had side-branch diameter stenosis (%DS) >50%, whereas in 83% with single-vessel QCA, 27% with 2D bifurcation QCA and 26% with 3D bifurcation QCA a side-branch %DS >50% was found (P < 0.0001). With regard to the percentage of "true" bifurcation lesions, there was a significant difference between visual estimation (100%), single-vessel QCA (75%), and bifurcation QCA (17% with 2D bifurcation software and 13% with 3D bifurcation software, P < 0.0001). Our study showed that bifurcation lesion complexity was significantly affected when more advanced bifurcation QCA software was used. The "true" bifurcation lesion rate was 100% on visual estimation, but as low as 13% when analyzed with dedicated bifurcation QCA software. © 2017 Wiley Periodicals, Inc.
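    The diameter stenosis figure shared by all of these modalities is a simple ratio of minimal lumen diameter (MLD) to reference vessel diameter (RVD); the values below are hypothetical.

```python
def percent_ds(mld_mm, rvd_mm):
    """QCA diameter stenosis: %DS = (1 - MLD / RVD) * 100."""
    return (1.0 - mld_mm / rvd_mm) * 100.0

print(percent_ds(1.1, 2.8))  # ~61 %DS: classified as a significant stenosis
```

    The modalities differ chiefly in how the reference diameter is derived at the bifurcation, which is why the same lesion can cross the 50% threshold under one method and not another.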

  11. Assimilation of radar quantitative precipitation estimations in the Canadian Precipitation Analysis (CaPA)

    Science.gov (United States)

    Fortin, Vincent; Roy, Guy; Donaldson, Norman; Mahidjiba, Ahmed

    2015-12-01

    The Canadian Precipitation Analysis (CaPA) is a data analysis system used operationally at the Canadian Meteorological Centre (CMC) since April 2011 to produce gridded 6-h and 24-h precipitation accumulations in near real-time on a regular grid covering all of North America. The current resolution of the product is 10 km. Due to the low density of the observational network in most of Canada, the system relies on a background field provided by the Regional Deterministic Prediction System (RDPS) of Environment Canada, which is a short-term weather forecasting system for North America. For this reason, the North American configuration of CaPA is known as the Regional Deterministic Precipitation Analysis (RDPA). Early in the development of the CaPA system, weather radar reflectivity was identified as a very promising additional data source for the precipitation analysis, but necessary quality control procedures and bias-correction algorithms were lacking for the radar data. After three years of development and testing, a new version of the CaPA-RDPA system was implemented in November 2014 at CMC. This version is able to assimilate radar quantitative precipitation estimates (QPEs) from all 31 operational Canadian weather radars. The radar QPE is used as an observation source and not as a background field, and is subject to a strict quality control procedure, like any other observation source. The November 2014 upgrade to CaPA-RDPA was implemented at the same time as an upgrade to the RDPS system, which brought minor changes to the skill and bias of CaPA-RDPA. This paper uses the frequency bias indicator (FBI), the equitable threat score (ETS) and the departure from the partial mean (DPM) in order to assess the improvements to CaPA-RDPA brought by the assimilation of radar QPE. Verification focuses on the 6-h accumulations, and is done against a network of 65 synoptic stations (approximately two stations per radar) that were withheld from the station data assimilated by CaPA.
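    The FBI and ETS cited here are standard contingency-table verification scores and can be computed as below; the counts are hypothetical.

```python
def verification_scores(hits, false_alarms, misses, correct_negatives):
    """Frequency bias (FBI) and equitable threat score (ETS) from a 2x2
    contingency table of event/no-event forecasts vs observations."""
    n = hits + false_alarms + misses + correct_negatives
    fbi = (hits + false_alarms) / (hits + misses)
    hits_random = (hits + false_alarms) * (hits + misses) / n  # chance hits
    ets = (hits - hits_random) / (hits + false_alarms + misses - hits_random)
    return fbi, ets

print(verification_scores(hits=42, false_alarms=18, misses=10, correct_negatives=430))
```

    FBI > 1 indicates over-forecasting of the precipitation threshold, FBI < 1 under-forecasting; ETS rewards hits beyond those expected by chance, so improvements from radar QPE assimilation show up as higher ETS at a comparable FBI.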

  12. Estimating the national cost of treating people with HIV disease: patient, payer, and provider data.

    Science.gov (United States)

    Hellinger, F J; Fleishman, J A

    2000-06-01

    Existing estimates of the national cost of treating all people with HIV disease use data from a sample of people with HIV disease to extrapolate the cost of treating all people with HIV disease (patient-based approach). This study derives estimates using two novel approaches (i.e., payer-based and provider-based) and compares these with existing estimates. Data sources include the Health Insurance Association of America and the American Council of Life Insurance 1996 HIV survey, the 1996 State Inpatient Databases (SID) maintained by the Agency for Healthcare Research and Quality, and the IMS America Ltd. survey of independent and chain drugstores. The cost of treating all people with HIV disease in 1996 was between $6.7 and $7.8 billion U.S., and the average annual cost of treating a person with HIV disease was between $20,000 and $24,700 U.S. Analysts should derive estimates of the cost of treating people with HIV disease using several different approaches.

  13. The perspective of healthcare providers and patients on health literacy: a systematic review of the quantitative and qualitative studies.

    Science.gov (United States)

    Rajah, Retha; Ahmad Hassali, Mohamed Azmi; Jou, Lim Ching; Murugiah, Muthu Kumar

    2017-10-01

    Health literacy (HL) is a multifaceted concept; thus, understanding the perspectives of healthcare providers, patients, and the system is vital. This systematic review examines and synthesises the available studies on HL-related knowledge, attitudes, practices, and perceived barriers. CINAHL and Medline (via EBSCOhost), Google Scholar, PubMed, ProQuest, Sage Journals, and Science Direct were searched. Both quantitative and/or qualitative studies in the English language were included. Intervention studies and studies focusing on HL assessment tools and the prevalence of low HL were excluded. The risk of bias was reduced by having two reviewers independently assess study eligibility and quality. A total of 30 studies were included, consisting of 19 quantitative, 9 qualitative, and 2 mixed-method studies. Of 17 studies, 13 reported deficient HL-related knowledge among healthcare providers and 1 among patients. Three studies showed a positive attitude of healthcare providers towards learning about HL. Another three studies demonstrated that patients feel shame about exposing their literacy and undergoing HL assessment. Common HL communication techniques reportedly practiced by healthcare providers were the use of everyday language, the teach-back method, and providing patients with reading materials and aids, while time constraints were the most frequently reported perceived HL barrier for both healthcare providers and patients. Significant gaps exist in HL knowledge among healthcare providers and patients, and these need immediate intervention, such as greater effort placed on creating a health system that provides opportunities for healthcare providers to learn about HL and for patients to access health information, taking into consideration their perceived barriers.

  15. Using extended genealogy to estimate components of heritability for 23 quantitative and dichotomous traits.

    Science.gov (United States)

    Zaitlen, Noah; Kraft, Peter; Patterson, Nick; Pasaniuc, Bogdan; Bhatia, Gaurav; Pollack, Samuela; Price, Alkes L

    2013-05-01

    Important knowledge about the determinants of complex human phenotypes can be obtained from the estimation of heritability, the fraction of phenotypic variation in a population that is determined by genetic factors. Here, we make use of extensive phenotype data in Iceland, long-range phased genotypes, and a population-wide genealogical database to examine the heritability of 11 quantitative and 12 dichotomous phenotypes in a sample of 38,167 individuals. Most previous estimates of heritability are derived from family-based approaches such as twin studies, which may be biased upwards by epistatic interactions or shared environment. Our estimates of heritability, based on both closely and distantly related pairs of individuals, are significantly lower than those from previous studies. We examine phenotypic correlations across a range of relationships, from siblings to first cousins, and find that the excess phenotypic correlation in these related individuals is predominantly due to shared environment as opposed to dominance or epistasis. We also develop a new method to jointly estimate narrow-sense heritability and the heritability explained by genotyped SNPs. Unlike existing methods, this approach permits the use of information from both closely and distantly related pairs of individuals, thereby reducing the variance of estimates of heritability explained by genotyped SNPs while preventing upward bias. Our results show that common SNPs explain a larger proportion of the heritability than previously thought, with SNPs present on Illumina 300K genotyping arrays explaining more than half of the heritability for the 23 phenotypes examined in this study. Much of the remaining heritability is likely to be due to rare alleles that are not captured by standard genotyping arrays.

  16. An Ensemble Generator for Quantitative Precipitation Estimation Based on Censored Shifted Gamma Distributions

    Science.gov (United States)

    Wright, D.; Kirschbaum, D.; Yatheendradas, S.

    2016-12-01

    The considerable uncertainties associated with quantitative precipitation estimates (QPE), whether from satellite platforms, ground-based weather radar, or numerical weather models, suggest that such QPE should be expressed as distributions or ensembles of possible values, rather than as single values. In this research, we borrow a framework from the weather forecast verification community to "correct" satellite precipitation and generate ensemble QPE. This approach is based on the censored shifted gamma distribution (CSGD). The probability of precipitation, central tendency (i.e. mean), and the uncertainty can be captured by the three parameters of the CSGD. The CSGD can then be applied for simulation of rainfall ensembles using a flexible nonlinear regression framework, whereby the CSGD parameters can be conditioned on one or more reference rainfall datasets and on other time-varying covariates such as modeled or measured estimates of precipitable water and relative humidity. We present the framework and initial results by generating precipitation ensembles based on the Tropical Rainfall Measuring Mission Multi-satellite Precipitation Analysis (TMPA) dataset, using both NLDAS and PERSIANN-CDR precipitation datasets as references. We also incorporate a number of covariates from MERRA2 reanalysis, including model-estimated precipitation, precipitable water, relative humidity, and lifting condensation level. We explore the prospects for applying the framework and other ensemble error models globally, including in regions where high-quality "ground truth" rainfall estimates are lacking. We compare the ensemble outputs against those of an independent rain gage-based ensemble rainfall dataset. "Pooling" of regional rainfall observations is explored as one option for improving ensemble estimates of rainfall extremes. The approach has potential applications in near-realtime, retrospective, and scenario modeling of rainfall-driven hazards such as floods and landslides
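    Sampling from a CSGD is straightforward once its three parameters are set: draw from the gamma, apply the shift, and censor at zero. The parameter values below are hypothetical, not fitted to any of the datasets named above.

```python
import numpy as np

def csgd_ensemble(shape, scale, shift, n_members, seed=None):
    """Draw precipitation amounts from a censored shifted gamma distribution:
    y = max(0, g + shift), g ~ Gamma(shape, scale). A negative shift creates
    the point mass at zero that represents dry outcomes."""
    rng = np.random.default_rng(seed)
    g = rng.gamma(shape, scale, size=n_members)
    return np.maximum(0.0, g + shift)

members = csgd_ensemble(shape=1.8, scale=4.0, shift=-2.5, n_members=1000, seed=0)
print("P(dry) ~", (members == 0).mean(), " mean QPE ~", members.mean(), "mm")
```

    In the regression framework described above, the three parameters would vary from pixel to pixel and time step to time step as functions of the satellite estimate and the covariates, rather than being fixed as here.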

  17. A test for Improvement of high resolution Quantitative Precipitation Estimation for localized heavy precipitation events

    Science.gov (United States)

    Lee, Jung-Hoon; Roh, Joon-Woo; Park, Jeong-Gyun

    2017-04-01

    Accurate estimation of precipitation is one of the most difficult and significant tasks in weather diagnosis and forecasting. On the Korean Peninsula, heavy precipitation is caused by various physical mechanisms, which are affected by shortwave troughs, quasi-stationary moisture convergence zones among varying air masses, and direct/indirect effects of tropical cyclones. In addition, varied geographical and topographical elements make the temporal and spatial distribution of precipitation very complicated. In particular, localized heavy rainfall events in South Korea generally arise from mesoscale convective systems embedded in these synoptic-scale disturbances. Even with weather radar data of high temporal and spatial resolution, accurate estimation of rain rate from radar reflectivity is difficult. The Z-R relationship (Marshall and Palmer 1948) has typically been adopted. In addition, several methods such as support vector machines (SVM), neural networks, fuzzy logic, and kriging have been utilized to improve the accuracy of rain rates. These methods yield different quantitative precipitation estimates (QPE), and their accuracy differs across heavy precipitation cases. In this study, in order to improve the accuracy of QPE for localized heavy precipitation, an ensemble method combining the Z-R relationship with various calibration techniques was tested. This QPE ensemble method was developed on the concept of exploiting the respective advantages of the precipitation calibration methods. The ensemble members were produced from combinations of different Z-R coefficients and calibration methods.
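    The basic building block of such an ensemble, inverting Z = aR^b for a set of plausible coefficient pairs, can be sketched as below; the coefficient list is illustrative (Marshall-Palmer plus two alternatives of the kind used for convective and stratiform rain).

```python
import numpy as np

def rain_rate(z_dbz, a, b):
    """Invert Z = a * R^b: convert reflectivity (dBZ) to rain rate (mm/h)."""
    z_linear = 10.0 ** (z_dbz / 10.0)  # dBZ -> mm^6/m^3
    return (z_linear / a) ** (1.0 / b)

coefficient_pairs = [(200.0, 1.6), (300.0, 1.4), (250.0, 1.2)]
z = 45.0  # dBZ, a heavy-rain echo
ensemble = np.array([rain_rate(z, a, b) for a, b in coefficient_pairs])
print(ensemble, ensemble.mean())  # member QPEs and a simple ensemble mean
```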

  18. Estimating the development assistance for health provided to faith-based organizations, 1990-2013.

    Directory of Open Access Journals (Sweden)

    Annie Haakenstad

    Full Text Available Faith-based organizations (FBOs) have been active in the health sector for decades. Recently, the role of FBOs in global health has been of increased interest. However, little is known about the magnitude and trends in development assistance for health (DAH) channeled through these organizations. Data were collected from the 21 most recent editions of the Report of Voluntary Agencies. These reports provide information on the revenue and expenditure of organizations. Project-level data were also collected and reviewed from the Bill & Melinda Gates Foundation and the Global Fund to Fight AIDS, Tuberculosis and Malaria. More than 1,900 non-governmental organizations received funds from at least one of these three organizations. Background information on these organizations was examined by two independent reviewers to identify the amount of funding channeled through FBOs. In 2013, total spending by the FBOs identified in the VolAg amounted to US$1.53 billion. In 1990, FBOs spent 34.1% of total DAH provided by private voluntary organizations reported in the VolAg. In 2013, FBOs expended 31.0%. Funds provided by the Global Fund to FBOs have grown since 2002, amounting to $80.9 million in 2011, or 16.7% of the Global Fund's contributions to NGOs. In 2011, the Gates Foundation's contributions to FBOs amounted to $7.1 million, or 1.1% of the total provided to NGOs. Development assistance partners exhibit a range of preferences with respect to the amount of funds provided to FBOs. Overall, estimates show that FBOs have maintained a substantial and consistent share over time, in line with overall spending in global health on NGOs. These estimates provide the foundation for further research on the spending trends and effectiveness of FBOs in global health.

  19. Parameter estimation using the genetic algorithm and its impact on quantitative precipitation forecast

    Directory of Open Access Journals (Sweden)

    Y. H. Lee

    2006-12-01

    Full Text Available In this study, optimal parameter estimation is performed for both physical and computational parameters in a mesoscale meteorological model, and their impacts on the quantitative precipitation forecast (QPF) are assessed for a heavy rainfall case that occurred on the Korean Peninsula in June 2005. Experiments are carried out using the PSU/NCAR MM5 model and the genetic algorithm (GA) for two parameters: the reduction rate of the convective available potential energy in the Kain-Fritsch (KF) scheme for cumulus parameterization, and the Asselin filter parameter for numerical stability. The fitness function is defined based on a QPF skill score. It turns out that each optimized parameter significantly improves the QPF skill. Such improvement is maximized when the two optimized parameters are used simultaneously. Our results indicate that optimization of computational parameters as well as physical parameters, and their adequate application, is essential in improving model performance.
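    A compact, generic GA of the kind used for such two-parameter estimation is sketched below. The fitness function is a stand-in for the QPF skill score, which in practice requires running the model and verifying the forecast; the parameter bounds are hypothetical.

```python
import numpy as np

rng = np.random.default_rng(0)

def fitness(params):
    """Stand-in for a QPF skill score evaluated at (CAPE reduction rate,
    Asselin filter parameter); a real run would execute MM5 and verify."""
    target = np.array([0.9, 0.1])  # hypothetical optimum
    return -np.sum((params - target) ** 2)

def ga_optimize(bounds, pop_size=20, generations=50, mut_sigma=0.05):
    lo, hi = np.array(bounds).T
    pop = rng.uniform(lo, hi, size=(pop_size, len(lo)))
    for _ in range(generations):
        fit = np.array([fitness(p) for p in pop])
        parents = pop[np.argsort(fit)[-pop_size // 2:]]   # truncation selection
        children = []
        while len(children) < pop_size:
            a, b = parents[rng.integers(len(parents), size=2)]
            mask = rng.random(len(lo)) < 0.5               # uniform crossover
            child = np.where(mask, a, b) + rng.normal(0, mut_sigma, len(lo))
            children.append(np.clip(child, lo, hi))        # Gaussian mutation
        pop = np.array(children)
    return pop[np.argmax([fitness(p) for p in pop])]

print(ga_optimize(bounds=[(0.0, 1.0), (0.0, 0.5)]))
```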

  20. Toward quantitative forecasts of volcanic ash dispersal: Using satellite retrievals for optimal estimation of source terms

    Science.gov (United States)

    Zidikheri, Meelis J.; Lucas, Christopher; Potts, Rodney J.

    2017-08-01

    Airborne volcanic ash is a hazard to aviation. There is an increasing demand for quantitative forecasts of ash properties such as ash mass load to allow airline operators to better manage the risks of flying through airspace likely to be contaminated by ash. In this paper we show how satellite-derived mass load information at times prior to the issuance of the latest forecast can be used to estimate various model parameters that are not easily obtained by other means such as the distribution of mass of the ash column at the volcano. This in turn leads to better forecasts of ash mass load. We demonstrate the efficacy of this approach using several case studies.

  1. Accuracy in the estimation of quantitative minimal area from the diversity/area curve.

    Science.gov (United States)

    Vives, Sergi; Salicrú, Miquel

    2005-05-01

    The problem of representativity is fundamental in ecological studies. A qualitative minimal area that gives a good representation of the species pool [C.M. Bouderesque, Methodes d'etude qualitative et quantitative du benthos (en particulier du phytobenthos), Tethys 3(1) (1971) 79] can be discerned from a quantitative minimal area, which reflects the structural complexity of the community [F.X. Niell, Sobre la biologia de Ascophyllum nosodum (L.) Le Jolis en Galicia, Invest. Pesq. 43 (1979) 501]. This suggests that the populational diversity can be considered as the value of the horizontal asymptote of the sample diversity/biomass curve [F.X. Niell, Les applications de l'index de Shannon a l'etude de la vegetation interdidale, Soc. Phycol. Fr. Bull. 19 (1974) 238]. In this study we develop an expression to determine minimal areas and use it to obtain certain information about the community structure based on diversity/area curve graphs. This expression is based on the functional relationship between the expected value of the diversity and the sample size used to estimate it. In order to establish the quality of the estimation process, we obtained the confidence intervals as a particularization of the (h,phi)-entropies proposed in [M. Salicru, M.L. Menendez, D. Morales, L. Pardo, Asymptotic distribution of (h,phi)-entropies, Commun. Stat. (Theory Methods) 22 (7) (1993) 2015]. As an example used to demonstrate the possibilities of this method, and only for illustrative purposes, data from a study of the rocky intertidal seaweed populations in the Ria of Vigo (N.W. Spain) are analyzed [F.X. Niell, Estudios sobre la estructura, dinamica y produccion del Fitobentos intermareal (Facies rocosa) de la Ria de Vigo. Ph.D. Mem. University of Barcelona, Barcelona, 1979].

  2. Cancer and the LGBTQ Population: Quantitative and Qualitative Results from an Oncology Providers' Survey on Knowledge, Attitudes, and Practice Behaviors.

    Science.gov (United States)

    Tamargo, Christina L; Quinn, Gwendolyn P; Sanchez, Julian A; Schabath, Matthew B

    2017-10-07

    Despite growing social acceptance, the LGBTQ population continues to face barriers to healthcare, including fear of stigmatization by healthcare providers and providers' lack of knowledge about LGBTQ-specific health issues. This analysis focuses on the assessment of quantitative and qualitative responses from a subset of providers who identified as specialists treating one or more of the seven cancers that may be disproportionate in LGBTQ patients. A 32-item web-based survey was emailed to 388 oncology providers at a single institution. The survey assessed demographics, knowledge, attitudes, and practice behaviors. Oncology providers specializing in the seven cancer types had poor knowledge of LGBTQ-specific health needs, with fewer than half of the surveyed providers (49.5%) correctly answering knowledge questions. Most providers had overall positive attitudes toward LGBTQ patients, with 91.7% agreeing they would be comfortable treating this population and would support education and/or training on LGBTQ-related cancer health issues. Results suggest that despite generally positive attitudes toward the LGBTQ population, oncology providers who treat the cancer types most prevalent among this population lack knowledge of its unique health issues. Knowledge and practice behaviors may improve with enhanced education and training on this population's specific needs.

  3. Age estimation during the blow fly intra-puparial period: a qualitative and quantitative approach using micro-computed tomography.

    Science.gov (United States)

    Martín-Vega, Daniel; Simonsen, Thomas J; Wicklein, Martina; Hall, Martin J R

    2017-05-04

    Minimum post-mortem interval (minPMI) estimates often rely on the use of developmental data from blow flies (Diptera: Calliphoridae), which are generally the first colonisers of cadavers and, therefore, exemplar forensic indicators. Developmental data of the intra-puparial period are of particular importance, as it can account for more than half of the developmental duration of the blow fly life cycle. During this period, the insect undergoes metamorphosis inside the opaque, barrel-shaped puparium, formed by the hardening and darkening of the third instar larval cuticle, which shows virtually no external changes until adult emergence. Regrettably, estimates based on the intra-puparial period are severely limited due to the lack of reliable, non-destructive ageing methods and are frequently based solely on qualitative developmental markers. In this study, we use non-destructive micro-computed tomography (micro-CT) for (i) performing qualitative and quantitative analyses of the morphological changes taking place during the intra-puparial period of two forensically relevant blow fly species, Calliphora vicina and Lucilia sericata, and (ii) developing a novel and reliable method for estimating insect age in forensic practice. We show that micro-CT provides age-diagnostic qualitative characters for most 10% time intervals of the total intra-puparial period, which can be used over a range of temperatures and with a resolution comparable to more invasive and time-consuming traditional imaging techniques. Moreover, micro-CT can be used to yield a quantitative measure of the development of selected organ systems to be used in combination with qualitative markers. Our results confirm micro-CT as an emerging, powerful tool in medico-legal investigations.

  4. Ultrasonic 3-D Vector Flow Method for Quantitative In Vivo Peak Velocity and Flow Rate Estimation.

    Science.gov (United States)

    Holbek, Simon; Ewertsen, Caroline; Bouzari, Hamed; Pihl, Michael Johannes; Hansen, Kristoffer Lindskov; Stuart, Matthias Bo; Thomsen, Carsten; Nielsen, Michael Bachmann; Jensen, Jorgen Arendt

    2017-03-01

    Current clinical ultrasound (US) systems are limited to showing blood flow movement in either 1-D or 2-D. In this paper, a method for estimating 3-D vector velocities in a plane using the transverse oscillation method, a 32×32 element matrix array, and the experimental US scanner SARUS is presented. The aim of this paper is to estimate precise flow rates and peak velocities derived from 3-D vector flow estimates. The emission sequence provides 3-D vector flow estimates at up to 1.145 frames/s in a plane, and was used to estimate 3-D vector flow in a cross-sectional image plane. The method is validated in two phantom studies, where flow rates are measured in a flow-rig, providing a constant parabolic flow, and in a straight-vessel phantom (∅ = 8 mm) connected to a flow pump capable of generating time-varying waveforms. Flow rates are estimated to be 82.1 ± 2.8 L/min in the flow-rig compared with the expected 79.8 L/min, and to 2.68 ± 0.04 mL/stroke in the pulsating environment compared with the expected 2.57 ± 0.08 mL/stroke. Flow rates estimated in the common carotid artery of a healthy volunteer are compared with magnetic resonance imaging (MRI) measured flow rates using a 1-D through-plane velocity sequence. Mean flow rates were 333 ± 31 mL/min for the presented method and 346 ± 2 mL/min for the MRI measurements.
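    Once a through-plane velocity map is available, the flow-rate computation is a discrete integral of velocity over the lumen cross-section. The sketch below builds a synthetic parabolic profile in an 8 mm vessel (peak velocity assumed, not taken from the paper) and recovers a physiologically plausible flow rate.

```python
import numpy as np

def flow_rate_ml_per_min(v_m_s, dx_m, lumen_mask):
    """Q = sum of through-plane velocity * pixel area over the lumen."""
    q_m3_s = (v_m_s[lumen_mask] * dx_m * dx_m).sum()
    return q_m3_s * 6e7  # m^3/s -> mL/min (1e6 mL/m^3 * 60 s/min)

n = 64
x = np.linspace(-4e-3, 4e-3, n)           # 8 mm field of view
xx, yy = np.meshgrid(x, x)
r2 = xx**2 + yy**2
mask = r2 <= (4e-3) ** 2                  # circular lumen, radius 4 mm
v = 0.2 * (1.0 - r2 / (4e-3) ** 2)        # parabolic profile, 0.2 m/s peak
print(flow_rate_ml_per_min(v, x[1] - x[0], mask))  # ~300 mL/min
```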

  5. Measurement of bubble size distributions in vesiculated rocks with implications for quantitative estimation of eruption processes

    Science.gov (United States)

    Toramaru, Atsushi

    1990-10-01

    This paper outlines methods for determining a bubble size distribution (BSD) and the moments of the BSD function in vesiculated clasts produced by volcanic eruptions. It reports the results of applications of the methods to 11 natural samples and discusses the implications for quantitative estimates of eruption processes. The analysis is based on a quantitative morphological (stereological) method for 2-dimensional imaging of cross-sections of samples. One method determines, with some assumptions, the complete shape of the BSD function from the chord lengths cut by bubbles. The other determines the 1st, 2nd and 3rd moments of distribution functions by measurement of the number of bubbles per unit area, the surface area per unit volume, and the volume fraction of bubbles. Comparison of procedures and results of these two distinct methods shows that the latter yields rather more reliable results than the former, though the results coincide in absolute and relative magnitudes. Results of the analysis for vesiculated rocks from eleven sub-Plinian to Plinian eruptions show some interesting systematic correlations both between moments of the BSD and between a moment and the eruption column height or the SiO2 content of magma. These correlations are successfully interpreted in terms of the nucleation and growth processes of bubbles in ascending magmas. This suggests that bubble coalescence does not predominate in sub-Plinian to Plinian explosive eruptions. The moment-moment correlations put constraints on the style of the nucleation and growth process of bubbles. The scaling argument suggests that a single nucleation event and subsequent growth with any kind of bubble interaction under continuous depressurization, which leads to an intermediate growth law between the diffusional growth (R_m ∝ t^{2/3}) at a constant depressurization rate and the Ostwald ripening (R_m ∝ t^{1/3}) under a constant pressure, where R_m and t are the mean radius of bubble and the

  6. Quantitative estimation of carbonation and chloride penetration in reinforced concrete by laser-induced breakdown spectroscopy

    Energy Technology Data Exchange (ETDEWEB)

    Eto, Shuzo, E-mail: eto@criepi.denken.or.jp [Central Research Institute of Electric Power Industry, 2-6-1 Nagasaka, Yokosuka, Kanagawa 240-0196 (Japan); Matsuo, Toyofumi; Matsumura, Takuro; Fujii, Takashi [Central Research Institute of Electric Power Industry, 2-6-1 Nagasaka, Yokosuka, Kanagawa 240-0196 (Japan); Tanaka, Masayoshi Y. [Interdisciplinary Graduate School of Engineering Sciences, Kyushu University, 2-6-1 Nagasaka, Yokosuka, Kanagawa 240-0196 (Japan)

    2014-11-01

    The penetration profile of chlorine in a reinforced concrete (RC) specimen was determined by laser-induced breakdown spectroscopy (LIBS). The concrete core was prepared from RC beams with cracking damage induced by bending load and salt water spraying. LIBS was performed using a specimen that was obtained by splitting the concrete core, and the line scan of laser pulses gave the two-dimensional emission intensity profiles of 100 × 80 mm² within one hour. The two-dimensional profile of the emission intensity suggests that the presence of the crack had less effect on the emission intensity when the measurement interval was larger than the crack width. The chlorine emission spectrum was measured by collinear double-pulse LIBS, without using the buffer gas that is usually required for chlorine measurement. The apparent diffusion coefficient, which is one of the most important parameters for chloride penetration in concrete, was estimated using the depth profile of chlorine emission intensity and Fick's law. The carbonation depth was estimated on the basis of the relationship between carbon and calcium emission intensities. When the carbon emission intensity was statistically higher than the calcium emission intensity at the measurement point, we determined that the point was carbonated. The estimation results were consistent with the spraying test results using phenolphthalein solution. These results suggest that the quantitative estimation by LIBS of carbonation depth and chloride penetration can be performed simultaneously. - Highlights: • We estimated the carbonation depth and the apparent diffusion coefficient of chlorine and sodium in the reinforced concrete with cracking damage by LIBS. • Two-dimensional profile measurement of the emission intensity of each element was performed to visualize the chloride penetration and the carbonation in the reinforced concrete. • Apparent diffusion coefficient of chlorine and sodium can be estimated using the Fick
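    The apparent diffusion coefficient estimation described here amounts to fitting the erfc solution of Fick's second law (constant surface concentration) to the chlorine depth profile; the profile values and exposure time below are hypothetical.

```python
import numpy as np
from scipy.special import erfc
from scipy.optimize import curve_fit

def chloride_profile(depth_mm, surface_conc, d_mm2_per_yr, t_yr=10.0):
    """Fick's second law with constant surface concentration:
    C(x, t) = C_s * erfc(x / (2 * sqrt(D * t)))."""
    return surface_conc * erfc(depth_mm / (2.0 * np.sqrt(d_mm2_per_yr * t_yr)))

# Hypothetical chlorine emission-intensity depth profile (arbitrary units).
depth = np.array([0, 5, 10, 15, 20, 30, 40.0])   # mm
intensity = np.array([1.0, 0.78, 0.55, 0.38, 0.24, 0.08, 0.02])
popt, _ = curve_fit(lambda x, cs, d: chloride_profile(x, cs, d),
                    depth, intensity, p0=(1.0, 10.0))
print("apparent D ~", popt[1], "mm^2/yr")
```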

  7. Quantitative PCR-based genome size estimation of the astigmatid mites Sarcoptes scabiei, Psoroptes ovis and Dermatophagoides pteronyssinus

    Directory of Open Access Journals (Sweden)

    Mounsey Kate E

    2012-01-01

    Full Text Available Abstract Background The lack of genomic data available for mites limits our understanding of their biology. Evolving high-throughput sequencing technologies promise to deliver rapid advances in this area; however, estimates of genome size are initially required to ensure sufficient coverage. Methods Quantitative real-time PCR was used to estimate the genome sizes of the burrowing ectoparasitic mite Sarcoptes scabiei, the non-burrowing ectoparasitic mite Psoroptes ovis, and the free-living house dust mite Dermatophagoides pteronyssinus. Additionally, the chromosome number of S. scabiei was determined by chromosomal spreads of embryonic cells derived from single eggs. Results S. scabiei cells were shown to contain 17 or 18 small chromosomes. The estimated genome sizes of S. scabiei and P. ovis were 96 (± 7) Mb and 86 (± 2) Mb respectively, among the smallest arthropod genomes reported to date. The D. pteronyssinus genome was estimated to be larger than its parasitic counterparts, at 151 Mb in female mites and 218 Mb in male mites. Conclusions These data provide a starting point for understanding the genetic organisation and evolution of these astigmatid mites, informing future sequencing projects. A comparative genomic approach including these three closely related mites is likely to reveal key insights on mite biology, parasitic adaptations and immune evasion.
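    The genome-size arithmetic behind such qPCR estimates divides the number of base pairs contained in a known DNA mass by the measured copy number of a single-copy gene; the copy number below is hypothetical, chosen to reproduce the ~96 Mb figure.

```python
AVOGADRO = 6.022e23
BP_MASS_G_PER_MOL = 650.0  # average mass of one double-stranded base pair

def genome_size_mb(single_copy_gene_copies_per_ng):
    """Genome size from the qPCR copy number of a single-copy gene per ng of
    genomic DNA: 1 ng contains (1e-9 * N_A / 650) bp spread over N genomes."""
    total_bp_per_ng = 1e-9 * AVOGADRO / BP_MASS_G_PER_MOL  # ~9.3e11 bp
    return total_bp_per_ng / single_copy_gene_copies_per_ng / 1e6

print(genome_size_mb(9.6e3))  # ~96 Mb, matching the S. scabiei estimate
```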

  8. Indirect enzyme-linked immunosorbent assay for the quantitative estimation of lysergic acid diethylamide in urine.

    Science.gov (United States)

    Kerrigan, S; Brooks, D E

    1998-05-01

    A new antibody to lysergic acid diethylamide (LSD) was used to develop a novel indirect ELISA for the quantification of the drug in urine. Evaluation of the new assay against the commercially available LSD ELISA (STC Diagnostics) shows improved performance. The test requires 50 microL of urine, which is used to measure concentrations of drug in the microg/L to ng/L range. The limit of detection was 8 ng/L compared with 85 ng/L in the commercial assay, and analytical recoveries were 98-106%. Our test detected 0.1 microg/L of LSD in urine with an intra-assay CV of 2.4% (n = 8) compared with 6.0% for a 0.5 microg/L sample in the commercial assay (n = 20). The upper and lower limits of quantification were estimated to be 7 microg/L and 50 ng/L, respectively. Specificity was evaluated by measuring the extent of cross-reactivity with 24 related substances. Drug determination using the new assay offers both improved sensitivity and precision compared with existing methods, thus facilitating the preliminary quantitative estimation of LSD in urine at lower concentrations with a greater degree of certainty.
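    Quantification in such an assay typically proceeds by inverting a calibration curve fitted to standards; the four-parameter logistic (4PL) form below is a common choice rather than the paper's stated model, and the calibrator values are hypothetical.

```python
import numpy as np
from scipy.optimize import curve_fit

def four_pl(conc, bottom, top, ec50, hill):
    """Four-parameter logistic calibration curve (competitive format:
    absorbance falls as analyte concentration rises)."""
    return bottom + (top - bottom) / (1.0 + (conc / ec50) ** hill)

std_conc = np.array([0.01, 0.05, 0.1, 0.5, 1.0, 5.0])   # ug/L calibrators
std_abs = np.array([1.95, 1.70, 1.45, 0.80, 0.55, 0.20])
popt, _ = curve_fit(four_pl, std_conc, std_abs, p0=(0.1, 2.0, 0.5, 1.0))

def estimate_conc(absorbance, bottom, top, ec50, hill):
    """Invert the fitted 4PL to read a sample concentration off the curve."""
    return ec50 * ((top - bottom) / (absorbance - bottom) - 1.0) ** (1.0 / hill)

print(estimate_conc(1.0, *popt))  # ug/L for a sample absorbance of 1.0
```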

  9. Improved radar data processing algorithms for quantitative rainfall estimation in real time.

    Science.gov (United States)

    Krämer, S; Verworn, H R

    2009-01-01

    This paper describes a new methodology to process C-band radar data for direct use as rainfall input to hydrologic and hydrodynamic models and in real-time control of urban drainage systems. In contrast to the adjustment of radar data with the help of rain gauges, the new approach accounts for the microphysical properties of current rainfall. In a first step, radar data are corrected for attenuation. This phenomenon has been identified as the main cause of the general underestimation of radar rainfall. Systematic variation of the attenuation coefficients within predefined bounds allows robust reflectivity profiling. Secondly, event-specific R-Z relations are applied to the corrected radar reflectivity data in order to generate quantitatively reliable radar rainfall estimates. The results of the methodology are validated against a network of 37 rain gauges located in the Emscher and Lippe river basins. Finally, the relevance of the correction methodology for radar rainfall forecasts is demonstrated. The new methodology clearly and significantly improves radar rainfall estimation and rainfall forecasts. The algorithms are applicable in real time.

  10. Quantitative estimation of 21st-century urban greenspace changes in Chinese populous cities.

    Science.gov (United States)

    Chen, Bin; Nie, Zhen; Chen, Ziyue; Xu, Bing

    2017-12-31

    Understanding the spatiotemporal changes of urban greenspace is a critical requirement for supporting urban planning and maintaining the function of urban areas. Although plenty of previous studies have attempted to estimate urban greenspace changes in China, there remain shortcomings such as inconsistent surveying procedures and insufficient spatial resolution and city samples. Using cloud-free Landsat image composites in circa years 2000 and 2014, and the Defense Meteorological Satellite Program's Operational Linescan System (DMSP/OLS) nighttime lights dataset, we quantitatively estimated the urban greenspace changes regarding both administrative divisions and urban core boundaries across 98 Chinese populous cities. Results showed that a consistent decline of urban greenspace coverage was identified at both old and new urban areas in the majority of analyzed cities (i.e., 81.63% of cities regarding the administrative boundaries, and 86.73% of cities regarding the urban core boundaries). Partial correlation analysis also revealed that total urban greenspace area shrank as a linear function of the core urban expansion (R2=0.28, P<0.001) […] cities included in this study (R2=0.11, P<0.001). Copyright © 2017. Published by Elsevier B.V.

  11. SU-F-I-33: Estimating Radiation Dose in Abdominal Fat Quantitative CT

    Energy Technology Data Exchange (ETDEWEB)

    Li, X; Yang, K; Liu, B [Massachusetts General Hospital, Boston, MA (United States)]

    2016-06-15

    Purpose: To compare size-specific dose estimate (SSDE) in abdominal fat quantitative CT with another dose estimate, Dsize,L, that also takes into account scan length. Methods: This study complied with the requirements of the Health Insurance Portability and Accountability Act. At our institution, abdominal fat CT is performed with a scan length of 1 cm and CTDIvol = 4.66 mGy (referenced to the body CTDI phantom). A previously developed CT simulation program was used to simulate single-rotation axial scans of 6-55 cm diameter water cylinders, and the integral of the longitudinal dose profile over the central 1 cm was used to predict the dose at the center of the 1-cm scan range. SSDE and Dsize,L were assessed for 182 consecutive abdominal fat CT examinations with a mean water-equivalent diameter (WED) of 27.8 cm ± 6.0 (range, 17.9 - 42.2 cm). Patient age ranged from 18 to 75 years, and weight ranged from 39 to 163 kg. Results: Mean SSDE was 6.37 mGy ± 1.33 (range, 3.67 - 8.95 mGy); mean Dsize,L was 2.99 mGy ± 0.85 (range, 1.48 - 4.88 mGy); and the mean Dsize,L/SSDE ratio was 0.46 ± 0.04 (range, 0.40 - 0.55). Conclusion: The conversion factors for the size-specific dose estimate in AAPM Report No. 204 were generated using 15 - 30 cm scan lengths. One needs to be cautious in applying SSDE to short-length CT scans. For abdominal fat CT, SSDE was 80-150% higher than the dose of the 1 cm scan length.
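
    SSDE is obtained by scaling CTDIvol with a conversion factor that decays exponentially with water-equivalent diameter. A minimal sketch using the widely quoted exponential fit from AAPM Report No. 204 for the 32-cm body phantom; the short-scan-length caveat in the comment is exactly the abstract's point:

    ```python
    import math

    def ssde(ctdi_vol_mGy, wed_cm):
        # AAPM Report 204 conversion factor for the 32-cm body phantom:
        # f(WED) ~ 3.7043 * exp(-0.0367 * WED); SSDE = f * CTDIvol.
        # The report's factors were derived from 15-30 cm scan lengths, so
        # they can substantially overstate the dose of a 1-cm scan.
        f = 3.704369 * math.exp(-0.03671937 * wed_cm)
        return f * ctdi_vol_mGy

    print(ssde(4.66, 27.8))  # ~6.2 mGy, near the study's mean SSDE of 6.37 mGy
    ```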

  12. Estimated Nutritive Value of Low-Price Model Lunch Sets Provided to Garment Workers in Cambodia.

    Science.gov (United States)

    Makurat, Jan; Pillai, Aarati; Wieringa, Frank T; Chamnan, Chhoun; Krawinkel, Michael B

    2017-07-21

    The establishment of staff canteens is expected to improve the nutritional situation of Cambodian garment workers. The objective of this study is to assess the nutritive value of low-price model lunch sets provided at a garment factory in Phnom Penh, Cambodia. Exemplary lunch sets were served to female workers through a temporary canteen at a garment factory in Phnom Penh. Dish samples were collected repeatedly to examine mean serving sizes of individual ingredients. Food composition tables and NutriSurvey software were used to assess mean amounts and contributions to recommended dietary allowances (RDAs) or adequate intake of energy, macronutrients, dietary fiber, vitamin C (VitC), iron, vitamin A (VitA), folate and vitamin B12 (VitB12). On average, lunch sets provided roughly one third of the RDA or adequate intake of energy, carbohydrates, fat and dietary fiber. The contribution to the RDA of protein was high (46% RDA). The sets contained a high mean share of VitC (159% RDA), VitA (66% RDA), and folate (44% RDA), but were low in VitB12 (29% RDA) and iron (20% RDA). Overall, lunches satisfied recommendations on caloric content and macronutrient composition. Sets on average contained a beneficial amount of VitC, VitA and folate. Adjustments are needed for a higher iron content; alternative iron-rich foods are expected to be better suited than increasing the portions of costly meat/fish components. Lunch provision at Cambodian garment factories holds the potential to improve the food security of workers, at a cost of approximately <1 USD/person/day at large scale. Data on quantitative total dietary intake as well as physical activity among workers are needed to further optimize the concept of staff canteens.
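
    The nutrient evaluation reduces to percent-of-RDA arithmetic. A minimal sketch; the per-serving amounts and reference values below are illustrative stand-ins chosen to land near the reported percentages, not the study's food-composition data:

    ```python
    # Hypothetical per-serving amounts and reference RDAs (illustrative values,
    # not taken from the study's food composition tables).
    serving = {"energy_kcal": 700, "protein_g": 21, "iron_mg": 3.6, "vitC_mg": 120}
    rda     = {"energy_kcal": 2100, "protein_g": 46, "iron_mg": 18.0, "vitC_mg": 75}

    for nutrient, amount in serving.items():
        pct = 100 * amount / rda[nutrient]
        print(f"{nutrient}: {pct:.0f}% RDA")
    # -> ~33% energy, ~46% protein, ~20% iron, ~160% vitamin C
    ```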

  13. Estimated Nutritive Value of Low-Price Model Lunch Sets Provided to Garment Workers in Cambodia

    Directory of Open Access Journals (Sweden)

    Jan Makurat

    2017-07-01

    Full Text Available Background: The establishment of staff canteens is expected to improve the nutritional situation of Cambodian garment workers. The objective of this study is to assess the nutritive value of low-price model lunch sets provided at a garment factory in Phnom Penh, Cambodia. Methods: Exemplary lunch sets were served to female workers through a temporary canteen at a garment factory in Phnom Penh. Dish samples were collected repeatedly to examine mean serving sizes of individual ingredients. Food composition tables and NutriSurvey software were used to assess mean amounts and contributions to recommended dietary allowances (RDAs) or adequate intake of energy, macronutrients, dietary fiber, vitamin C (VitC), iron, vitamin A (VitA), folate and vitamin B12 (VitB12). Results: On average, lunch sets provided roughly one third of the RDA or adequate intake of energy, carbohydrates, fat and dietary fiber. The contribution to the RDA of protein was high (46% RDA). The sets contained a high mean share of VitC (159% RDA), VitA (66% RDA), and folate (44% RDA), but were low in VitB12 (29% RDA) and iron (20% RDA). Conclusions: Overall, lunches satisfied recommendations on caloric content and macronutrient composition. Sets on average contained a beneficial amount of VitC, VitA and folate. Adjustments are needed for a higher iron content; alternative iron-rich foods are expected to be better suited than increasing the portions of costly meat/fish components. Lunch provision at Cambodian garment factories holds the potential to improve the food security of workers, at a cost of approximately <1 USD/person/day at large scale. Data on quantitative total dietary intake as well as physical activity among workers are needed to further optimize the concept of staff canteens.

  14. Myocardial blood flow estimates from dynamic contrast-enhanced magnetic resonance imaging: three quantitative methods

    Science.gov (United States)

    Borrazzo, Cristian; Galea, Nicola; Pacilio, Massimiliano; Altabella, Luisa; Preziosi, Enrico; Carnì, Marco; Ciolina, Federica; Vullo, Francesco; Francone, Marco; Catalano, Carlo; Carbone, Iacopo

    2018-02-01

    Dynamic contrast-enhanced cardiovascular magnetic resonance imaging can be used to quantitatively assess the myocardial blood flow (MBF), recovering the tissue impulse response function for the transit of a gadolinium bolus through the myocardium. Several deconvolution techniques are available, using various models for the impulse response. The method of choice may influence the results, producing differences that have not been deeply investigated yet. Three methods for quantifying myocardial perfusion have been compared: Fermi function modelling (FFM), the Tofts model (TM) and the gamma function model (GF), with the latter traditionally used in brain perfusion MRI. Thirty human subjects were studied at rest as well as under cold pressor test stress (submerging hands in ice-cold water), and a single bolus of gadolinium (0.1 ± 0.05 mmol kg-1) was injected. Perfusion estimate differences between the methods were analysed by paired comparisons with Student's t-test, linear regression analysis, and Bland-Altman plots, as well as with two-way ANOVA, considering the MBF values of all patients grouped according to two categories: calculation method and rest/stress conditions. Perfusion estimates obtained by the various methods in both rest and stress conditions were not significantly different, and were in good agreement with the literature. The results obtained during the first-pass transit time (20 s) yielded p-values in the range 0.20-0.28 for Student's t-test, linear regression analysis slopes between 0.98-1.03, and R values between 0.92-1.01. From the Bland-Altman plots, the paired comparisons yielded a bias (and a 95% CI), expressed as ml/min/g: for FFM versus TM, -0.01 (-0.20, 0.17) or 0.02 (-0.49, 0.52) at rest or under stress respectively; for FFM versus GF, -0.05 (-0.29, 0.20) or -0.07 (-0.55, 0.41) at rest or under stress; and for TM versus GF, -0.03 (-0.30, 0.24) or -0.09 (-0.43, 0
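
    In the Fermi function model, the myocardial curve is the arterial input function (AIF) convolved with a Fermi-shaped impulse response, and the MBF estimate is tied to the response amplitude. A minimal sketch on synthetic curves, assuming a hypothetical gamma-variate AIF and illustrative parameter values (not the study's data or units):

    ```python
    import numpy as np
    from scipy.optimize import curve_fit

    rng = np.random.default_rng(0)
    t = np.linspace(0, 60, 121)            # time (s), 0.5 s sampling
    dt = t[1] - t[0]

    def fermi(t, amp, tau, k):
        # Fermi-shaped impulse response; its amplitude near t=0 tracks MBF
        return amp / (1.0 + np.exp((t - tau) / k))

    def tissue_curve(t, amp, tau, k, aif):
        # Myocardial curve = AIF convolved with the impulse response
        return np.convolve(aif, fermi(t, amp, tau, k))[: t.size] * dt

    aif = 5.0 * (t / 8.0) ** 2 * np.exp(-t / 8.0)     # hypothetical bolus shape
    true_params = (0.9, 6.0, 2.0)                     # "unknown" ground truth
    measured = tissue_curve(t, *true_params, aif) + rng.normal(0, 0.01, t.size)

    popt, _ = curve_fit(lambda tt, a, b, c: tissue_curve(tt, a, b, c, aif),
                        t, measured, p0=(0.5, 5.0, 1.0))
    print("recovered MBF-like amplitude:", round(popt[0], 3))   # ~0.9
    ```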

  15. Smoking duration alone provides stronger risk estimates of chronic obstructive pulmonary disease than pack-years.

    Science.gov (United States)

    Bhatt, Surya P; Kim, Young-Il; Harrington, Kathy F; Hokanson, John E; Lutz, Sharon M; Cho, Michael H; DeMeo, Dawn L; Wells, James M; Make, Barry J; Rennard, Stephen I; Washko, George R; Foreman, Marilyn G; Tashkin, Donald P; Wise, Robert A; Dransfield, Mark T; Bailey, William C

    2018-01-11

    Cigarette smoking is the strongest risk factor for COPD. Smoking burden is frequently measured in pack-years, but the relative contribution of cigarettes smoked per day versus duration towards the development of structural lung disease, airflow obstruction and functional outcomes is not known. We analysed cross-sectional data from a large multicentre cohort (COPDGene) of current and former smokers. The primary outcome was airflow obstruction (FEV1/FVC); secondary outcomes included five additional measures of disease: FEV1, CT emphysema, CT gas trapping, functional capacity (6 min walk distance, 6MWD) and respiratory morbidity (St George's Respiratory Questionnaire, SGRQ). Generalised linear models were estimated to compare the relative contribution of each smoking variable to the outcomes, after adjustment for age, race, sex, body mass index, CT scanner, centre, age of smoking onset and current smoking status. We also estimated adjusted means of each outcome by categories of pack-years and combined groups of categorised smoking duration and cigarettes/day, and estimated linear trends of adjusted means for each outcome by categorised cigarettes/day, smoking duration and pack-years. 10 187 subjects were included. For FEV1/FVC, the standardised beta coefficient for smoking duration was greater than for cigarettes/day and pack-years (P<0.001). After categorisation, there was a linear decline in adjusted mean FEV1/FVC with increasing pack-years (regression coefficient β=-0.023±SE0.003; P=0.003) and duration over all ranges of cigarettes/day (β=-0.041±0.004; P<0.001), but a relatively flat slope for cigarettes/day across all ranges of smoking duration (β=-0.009±0.009; P=0.34). The strength of association of duration was similarly greater than pack-years for emphysema, gas trapping, FEV1, 6MWD and SGRQ. Smoking duration alone provides stronger risk estimates of COPD than the composite index of pack-years. Post-results; NCT00608764.
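
    Pack-years is the composite index the abstract argues against relying on: (cigarettes per day / 20) x years smoked. A minimal sketch of the decomposition, showing two exposures with identical pack-years but very different durations:

    ```python
    def pack_years(cigs_per_day: float, years: float) -> float:
        # Composite smoking burden: packs per day times duration in years
        return (cigs_per_day / 20.0) * years

    # Same pack-years, very different durations -- the abstract's point is
    # that these two exposures would not carry the same COPD risk.
    print(pack_years(40, 10))   # 20 pack-years accumulated over 10 years
    print(pack_years(10, 40))   # 20 pack-years accumulated over 40 years
    ```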

  16. Quantitative Analysis of VIIRS DNB Nightlight Point Source for Light Power Estimation and Stability Monitoring

    Directory of Open Access Journals (Sweden)

    Changyong Cao

    2014-12-01

    Full Text Available The high sensitivity and advanced onboard calibration of the Visible Infrared Imaging Radiometer Suite (VIIRS) Day/Night Band (DNB) enable accurate measurements of low light radiances, which leads to enhanced quantitative applications at night. The finer spatial resolution of the DNB also allows users to examine socioeconomic activities at urban scales. Given the growing interest in the use of DNB data, there is a pressing need for better understanding of the calibration stability and absolute accuracy of the DNB at low radiances. The low light calibration accuracy was previously estimated at a moderate 15% using extended sources, while the long-term stability has yet to be characterized. There are also several science-related questions to be answered, for example: how the Earth's atmosphere and surface variability contribute to the stability of the DNB measured radiances; how to separate them from instrument calibration stability; whether or not SI (International System of Units) traceable active light sources can be designed and installed at selected sites to monitor the calibration stability, radiometric and geolocation accuracy, and point spread functions of the DNB; and, furthermore, whether or not such active light sources can be used for detecting environmental changes, such as aerosols. This paper explores the quantitative analysis of nightlight point sources, such as those from fishing vessels, bridges, and cities, using fundamental radiometry and radiative transfer, which would be useful for a number of applications including search and rescue in severe weather events, as well as calibration/validation of the DNB. Time series of the bridge light data are used to assess the stability of the light measurements and the calibration of the VIIRS DNB. It was found that the light radiant power computed from the VIIRS DNB data matched relatively well with independent assessments based on the in situ light installations, although estimates have to be

  17. A reliable and accurate portable device for rapid quantitative estimation of iodine content in different types of edible salt.

    Science.gov (United States)

    Yadav, Kapil; Kumar, Rakesh; Chakrabarty, Arijit; Pandav, Chandrakant S

    2015-01-01

    Continuous monitoring of salt iodization to ensure the success of the Universal Salt Iodization (USI) program can be significantly strengthened by the use of a simple, safe, and rapid method of salt iodine estimation. This study assessed the validity of a new portable device, iCheck Iodine, developed by BioAnalyt GmbH, to estimate the iodine content in salt. Validation of the device was conducted in the laboratory of the South Asia regional office of the International Council for Control of Iodine Deficiency Disorders (ICCIDD). The validity of the device was assessed using device-specific indicators, comparison of the iCheck Iodine device with iodometric titration, and comparison between iodine estimation using 1 g and 10 g of salt by iCheck Iodine, using 116 salt samples procured from various small-, medium-, and large-scale salt processors across India. The intra- and interassay imprecision for 10 parts per million (ppm), 30 ppm, and 50 ppm concentrations of iodized salt were 2.8%, 6.1%, and 3.1%, and 2.4%, 2.2%, and 2.1%, respectively. Interoperator imprecision was 6.2%, 6.3%, and 4.6% for the salt with iodine concentrations of 10 ppm, 30 ppm, and 50 ppm, respectively. The correlation coefficient between measurements by the two methods was 0.934, and the correlation coefficient between measurements using 1 g of iodized salt and 10 g of iodized salt by the iCheck Iodine device was 0.983. The iCheck Iodine device is reliable and provides a valid method for the quantitative estimation of the iodine content of iodized salt fortified with potassium iodate in the field setting and in different types of salt.

  18. Quantitative estimation of undiscovered mineral resources - a case study of US Forest Service Wilderness tracts in the Pacific Mountain system.

    Science.gov (United States)

    Drew, L.J.

    1986-01-01

    The need by land managers and planners for more quantitative measures of mineral values has prompted scientists at the U.S. Geological Survey to test a probabilistic method of mineral resource assessment on a portion of the wilderness lands that have been studied during the past 20 years. A quantitative estimate of undiscovered mineral resources is made by linking the techniques of subjective estimation, geologic mineral deposit models, and Monte Carlo simulation. The study considers 91 U.S. Forest Service wilderness tracts in California, Nevada, Oregon, and Washington. -from Authors
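
    In probabilistic assessments of this kind, a subjective distribution for the number of undiscovered deposits is combined with a tonnage model by Monte Carlo simulation. A minimal sketch, assuming hypothetical deposit-count probabilities and a hypothetical lognormal tonnage model (not the study's deposit models):

    ```python
    import numpy as np

    rng = np.random.default_rng(0)
    n_trials = 20_000

    # Hypothetical subjective distribution for the number of undiscovered deposits
    counts = rng.choice([0, 1, 2, 3], size=n_trials, p=[0.5, 0.3, 0.15, 0.05])

    # Hypothetical lognormal tonnage model for one deposit type (tonnes of metal)
    def total_tonnage(n):
        return rng.lognormal(mean=np.log(50_000), sigma=1.0, size=n).sum()

    totals = np.array([total_tonnage(n) for n in counts])
    print("P(any resource) =", (totals > 0).mean())
    print("mean tonnage    =", round(totals.mean()))
    print("90th percentile =", round(np.percentile(totals, 90)))
    ```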

  19. Quantitative estimation of groundwater recharge ratio along the riparian of the Yellow River.

    Science.gov (United States)

    Yan, Zhang; Fadong, Li; Jing, Li; Qiang, Liu; Guangshuai, Zhao

    2013-01-01

    Quantitative estimation of groundwater recharge is crucial for the management of limited water resources. A combination of isotopic and chemical indicators has been used to evaluate the relationship between surface water, groundwater, and rainfall around the riparian zone of the Yellow River in the North China Plain (NCP). The ion molar ratio of sodium to chloride in surface water and groundwater is 0.6 and 0.9, respectively, indicating cation exchange of Ca(2+) and/or Mg(2+) for Na(+) in groundwater. The δD and δ(18)O values in rainfall varied from -64.4 to -33.4‰ and from -8.39 to -4.49‰. The groundwater samples have δD values in the range of -68.7 to -58.0‰ and δ(18)O from -9.29 to -6.85‰. The δ(18)O and δD in surface water varied from -8.51 to -7.23‰ and from -64.42 to -53.73‰. On average, δD and δ(18)O in surface water are 3.92‰ and 0.57‰ higher, respectively, than in groundwater. The isotopic composition indicated that the groundwater in the riparian area of the Yellow River was influenced by heavy rainfall events and seepage of surface water. A mass balance was applied for the first time to estimate the recharge contributions, which are probably 6% and 94% from rainfall and surface water, respectively.
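
    The recharge fractions follow from a two-endmember isotope mass balance. A minimal sketch of the mixing arithmetic; the δ18O endmember values below are illustrative, chosen only to reproduce fractions near the reported 6%/94% split, not the study's weighted means:

    ```python
    def mixing_fractions(delta_gw, delta_rain, delta_sw):
        # Two-endmember isotope mass balance:
        # delta_gw = f_rain*delta_rain + (1 - f_rain)*delta_sw
        f_rain = (delta_gw - delta_sw) / (delta_rain - delta_sw)
        return f_rain, 1.0 - f_rain

    # Illustrative delta-18O endmembers (per mil), not the study's means
    f_rain, f_sw = mixing_fractions(delta_gw=-8.38, delta_rain=-5.0, delta_sw=-8.6)
    print(f"rainfall: {f_rain:.0%}, surface water: {f_sw:.0%}")  # ~6% / ~94%
    ```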

  20. Improving Satellite Quantitative Precipitation Estimation Using GOES-Retrieved Cloud Optical Depth

    Energy Technology Data Exchange (ETDEWEB)

    Stenz, Ronald; Dong, Xiquan; Xi, Baike; Feng, Zhe; Kuligowski, Robert J.

    2016-02-01

    To address significant gaps in ground-based radar coverage and rain gauge networks in the U.S., geostationary satellite quantitative precipitation estimates (QPEs) such as the Self-Calibrating Multivariate Precipitation Retrievals (SCaMPR) can be used to fill in both the spatial and temporal gaps of ground-based measurements. Additionally, with the launch of GOES-R, the temporal resolution of satellite QPEs may be comparable to that of Weather Service Radar-1988 Doppler (WSR-88D) volume scans as GOES images will be available every five minutes. However, while satellite QPEs have strengths in spatial coverage and temporal resolution, they face limitations particularly during convective events. Deep Convective Systems (DCSs) have large cloud shields with similar brightness temperatures (BTs) over nearly the entire system, but widely varying precipitation rates beneath these clouds. Geostationary satellite QPEs relying on the indirect relationship between BTs and precipitation rates often suffer from large errors because anvil regions (little/no precipitation) cannot be distinguished from rain-cores (heavy precipitation) using only BTs. However, a combination of BTs and optical depth (τ) has been found to reduce overestimates of precipitation in anvil regions (Stenz et al. 2014). A new rain mask algorithm incorporating both τ and BTs has been developed, and its application to the existing SCaMPR algorithm was evaluated. The performance of the modified SCaMPR was evaluated using traditional skill scores and a more detailed analysis of performance in individual DCS components by utilizing the Feng et al. (2012) classification algorithm. SCaMPR estimates with the new rain mask applied benefited from significantly reduced overestimates of precipitation in anvil regions and overall improvements in skill scores.
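
    The rain mask idea is a joint threshold: a pixel is allowed to rain only if it is both cold in brightness temperature and optically thick. A minimal sketch, assuming illustrative thresholds rather than the values used in SCaMPR or Stenz et al. (2014):

    ```python
    import numpy as np

    def rain_mask(bt_k, tau, bt_max=235.0, tau_min=20.0):
        # Flag pixels as potentially raining only when they are both cold
        # (BT below bt_max, in K) and optically thick (tau above tau_min).
        # Thresholds here are illustrative assumptions.
        return (bt_k < bt_max) & (tau > tau_min)

    bt  = np.array([[210., 220.], [228., 205.]])
    tau = np.array([[45.,   8.], [30.,   5.]])
    print(rain_mask(bt, tau))
    # [[ True False]
    #  [ True False]]  -> cold-but-thin anvil pixels are screened out
    ```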

  1. Can high resolution 3D topographic surveys provide reliable grain size estimates in gravel bed rivers?

    Science.gov (United States)

    Pearson, E.; Smith, M. W.; Klaar, M. J.; Brown, L. E.

    2017-09-01

    High resolution topographic surveys such as those provided by Structure-from-Motion (SfM) contain a wealth of information that is not always exploited in the generation of Digital Elevation Models (DEMs). In particular, several authors have related sub-metre scale topographic variability (or 'surface roughness') to sediment grain size by deriving empirical relationships between the two. In fluvial applications, such relationships permit rapid analysis of the spatial distribution of grain size over entire river reaches, providing improved data to drive three-dimensional hydraulic models, allowing rapid geomorphic monitoring of sub-reach river restoration projects, and enabling more robust characterisation of riverbed habitats. However, comparison of previously published roughness-grain-size relationships shows substantial variability between field sites. Using a combination of over 300 laboratory and field-based SfM surveys, we demonstrate the influence of inherent survey error, irregularity of natural gravels, particle shape, grain packing structure, sorting, and form roughness on roughness-grain-size relationships. Roughness analysis from SfM datasets can accurately predict the diameter of smooth hemispheres, though natural, irregular gravels result in a higher roughness value for a given diameter and different grain shapes yield different relationships. A suite of empirical relationships is presented as a decision tree which improves predictions of grain size. By accounting for differences in patch facies, large improvements in D50 prediction are possible. SfM is capable of providing accurate grain size estimates, although further refinement is needed for poorly sorted gravel patches, for which c-axis percentiles are better predicted than b-axis percentiles.
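
    Roughness-grain-size relationships of this kind are simple empirical regressions between a patch's detrended surface roughness and a measured grain-size percentile. A minimal sketch, assuming hypothetical calibration patches rather than the paper's survey data:

    ```python
    import numpy as np

    # Hypothetical calibration patches: detrended surface roughness (m)
    # versus measured D50 (m) -- stand-ins for patch-scale field data.
    sigma_z = np.array([0.005, 0.010, 0.018, 0.025, 0.032])
    d50     = np.array([0.011, 0.022, 0.038, 0.055, 0.068])

    m, c = np.polyfit(sigma_z, d50, 1)          # D50 = m*sigma_z + c
    print(f"D50 ~ {m:.2f}*sigma_z + {c:.4f}")

    new_patch_roughness = 0.020                 # roughness of an unsampled patch
    print("predicted D50 (m):", m * new_patch_roughness + c)
    ```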

  2. A new TLC bioautographic assay for qualitative and quantitative estimation of lipase inhibitors.

    Science.gov (United States)

    Tang, Jihe; Zhou, Jinge; Tang, Qingjiu; Wu, Tao; Cheng, Zhihong

    2016-01-01

    Lipase inhibitory assays based on TLC bioautography have made recent progress; however, an assay with greater substrate specificity and quantitative capabilities would advance the efficacy of this particular bioassay. To address these limitations, a new TLC bioautographic assay for detecting lipase inhibitors was developed and validated in this study. The new TLC bioautographic assay was based on the reaction of lipase with β-naphthyl myristate and the subsequent formation of a purple dye between β-naphthol and Fast Blue B salt (FBB). The relative lipase inhibitory capacity (RLIC) was determined by TLC densitometry with fluorescence detection, expressed as orlistat equivalents in millimoles on a per-sample-weight basis. Six pure compounds and three natural extracts were evaluated for their potential lipase inhibitory activities by this TLC bioautographic assay. β-Naphthyl myristate as the substrate improved the detection sensitivity and specificity significantly. The limit of detection (LOD) of this assay was 0.01 ng for orlistat, the current treatment for obesity. This assay has acceptable accuracy (92.07-105.39%), intra-day and inter-day precisions [relative standard deviation (RSD), 2.64-4.40%], as well as intra-plate and inter-plate precisions (RSD, 1.8-4.9%). The developed method is rapid, simple, stable, and specific for the screening and estimation of potential lipase inhibitors. Copyright © 2015 John Wiley & Sons, Ltd.

  3. Motor unit number estimation and quantitative needle electromyography in stroke patients.

    Science.gov (United States)

    Kouzi, Ioanna; Trachani, Eftichia; Anagnostou, Evangelos; Rapidi, Christina-Anastasia; Ellul, John; Sakellaropoulos, George C; Chroni, Elisabeth

    2014-12-01

    To evaluate the effect of upper motor neuron damage upon motor unit function by means of two separate and supplementary electrophysiological methods. The abductor digiti minimi muscle of the non-paretic and the paretic side was studied in forty-six stroke patients with (a) motor unit number estimation (MUNE), using an adapted multiple point stimulation method, and (b) computerized quantitative needle electromyography (EMG), assessing the configuration of voluntarily recruited motor unit potentials. Main outcome comparisons were focused on differences between the non-paretic and paretic sides. On the affected hands, the mean MUNE value was significantly lower and the mean area of the surface-recorded single motor unit potentials was significantly larger than the corresponding ones on the non-paretic hands. EMG findings did not reveal remarkable differences between the two sides. Neither severity nor chronicity of stroke was related to MUNE or EMG parameters. MUNE results, which suggested reduced motor unit numbers in stroke patients, in conjunction with the normal EMG features in these same muscles, have given rise to different interpretations. In a clinical setting, reinnervation-type changes in the EMG similar to those occurring in neuronopathies or axonal neuropathies should not be expected in muscles with a central neurogenic lesion. Copyright © 2014 Elsevier Ltd. All rights reserved.

  4. Quantitative Precipitation Estimation over Ocean Using Bayesian Approach from Microwave Observations during the Typhoon Season

    Directory of Open Access Journals (Sweden)

    Jen-Chi Hu

    2009-01-01

    Full Text Available We have developed a new Bayesian approach to retrieve oceanic rain rate from the Tropical Rainfall Measuring Mission (TRMM) Microwave Imager (TMI), with an emphasis on typhoon cases in the West Pacific. Retrieved rain rates are validated with measurements of rain gauges located on Japanese islands. To demonstrate improvement, retrievals are also compared with those from the TRMM/Precipitation Radar (PR), the Goddard Profiling Algorithm (GPROF), and a multi-channel linear regression statistical method (MLRS). We have found that, qualitatively, all methods retrieved similar horizontal distributions in terms of locations of eyes and rain bands of typhoons. Quantitatively, our new Bayesian retrievals have the best linearity and the smallest root mean square (RMS) error against rain gauge data for 16 typhoon overpasses in 2004. The correlation coefficient and RMS of our retrievals are 0.95 and ~2 mm hr-1, respectively. In particular, at heavy rain rates, our Bayesian retrievals outperform those retrieved from GPROF and MLRS. Overall, the new Bayesian approach accurately retrieves surface rain rate for typhoon cases. Accurate rain rate estimates from this method can be assimilated in models to improve forecasts and prevent potential damages in Taiwan during typhoon seasons.
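
    A Bayesian retrieval of this kind amounts to a posterior-mean rain rate: database entries of brightness temperatures and rain rates are weighted by the likelihood of the observed brightness-temperature vector. A minimal sketch on a synthetic two-channel database; the forward relation and noise levels are invented for illustration:

    ```python
    import numpy as np

    def bayes_rain_rate(tb_obs, tb_db, rr_db, sigma=2.0):
        # Posterior-mean rain rate: weight each database entry by a Gaussian
        # likelihood of the observed brightness-temperature vector.
        d2 = np.sum((tb_db - tb_obs) ** 2, axis=1) / sigma**2
        w = np.exp(-0.5 * d2)
        return np.sum(w * rr_db) / np.sum(w)

    rng = np.random.default_rng(1)
    rr_db = rng.gamma(shape=2.0, scale=3.0, size=5000)           # mm/h database
    tb_db = np.column_stack([260 - 4 * rr_db, 280 - 2 * rr_db])  # toy 2-channel TBs
    tb_db += rng.normal(0, 2.0, tb_db.shape)

    tb_obs = np.array([260 - 4 * 8.0, 280 - 2 * 8.0])            # "truth" = 8 mm/h
    print(bayes_rain_rate(tb_obs, tb_db, rr_db))                 # ~8 mm/h
    ```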

  5. The estimation of quantitative parameters of oligonucleotides immobilization on mica surface

    Science.gov (United States)

    Sharipov, T. I.; Bakhtizin, R. Z.

    2017-05-01

    Immobilization of nucleic acids on the surface of various materials is increasingly being used in research and some practical applications. Currently, DNA chip technology is developing rapidly. The basis of the immobilization process can be both physical adsorption and chemisorption. A useful way to control the immobilization of nucleic acids on a surface is atomic force microscopy, which allows the surface topography to be investigated by direct imaging at high resolution. Usually, cations are used to fix DNA on the surface of mica; they mediate the interaction between the mica surface and the DNA molecules. In our work, we have developed a method for estimating a quantitative parameter of oligonucleotide immobilization, namely the degree of aggregation, as a function of the fixation conditions on the surface of mica. Results on the aggregation of oligonucleotides immobilized on a mica surface are presented. Single oligonucleotide molecules have been imaged clearly, their surface areas have been calculated, and a calibration curve has been plotted.

  6. Model-based estimation of quantitative ultrasound variables at the proximal femur.

    Science.gov (United States)

    Dencks, Stefanie; Barkmann, Reinhard; Padilla, Frédéric; Laugier, Pascal; Schmitz, Georg; Glüer, Claus-C

    2008-01-01

    To improve the prediction of the osteoporotic fracture risk at the proximal femur we are developing a scanner for quantitative ultrasound (QUS) measurements at this site. Due to multipath transmission in this complex shaped bone, conventional signal processing techniques developed for QUS measurements at peripheral sites frequently fail. Therefore, we propose a model-based estimation of the QUS variables and analyze the performance of the new algorithm. Applying the proposed method to QUS scans of excised proximal femurs increased the fraction of evaluable signals from approx. 60% (using conventional algorithms) to 97%. The correlation of the standard QUS variables broadband ultrasound attenuation (BUA) and speed of sound (SOS) with the established variable bone mineral density (BMD) reported in previous studies is maintained (BUA/BMD: r2 = 0.69; SOS/BMD: r2 = 0.71; SOS+BUA/BMD: r2 = 0.88). Additionally, different wave types could be clearly detected and characterized in the trochanteric region. The ability to separate superimposed signals with this approach opens up further diagnostic potential for evaluating waves of different sound paths and wave types through bone tissue.

  7. Development of combination tapered fiber-optic biosensor dip probe for quantitative estimation of interleukin-6 in serum samples

    Science.gov (United States)

    Wang, Chun Wei; Manne, Upender; Reddy, Vishnu B.; Oelschlager, Denise K.; Katkoori, Venkat R.; Grizzle, William E.; Kapoor, Rakesh

    2010-11-01

    A combination tapered fiber-optic biosensor (CTFOB) dip probe for rapid and cost-effective quantification of proteins in serum samples has been developed. This device relies on diode laser excitation and a charge-coupled device spectrometer, and functions by a sandwich immunoassay technique. As a proof of principle, the technique was applied to the quantitative estimation of interleukin-6 (IL-6). The probes detected IL-6 at picomolar levels in serum samples obtained from a patient with lupus, an autoimmune disease, and a patient with lymphoma. The estimated concentration of IL-6 in the lupus sample was 5.9 +/- 0.6 pM, and in the lymphoma sample, it was below the detection limit. These concentrations were verified by a procedure involving bead-based xMAP technology, and a similar trend in the concentrations was observed. The specificity of the CTFOB dip probes was assessed by analysis with receiver operating characteristics. This analysis suggests that the dip probes can detect 5-pM or higher concentrations of IL-6 in these samples with specificities of 100%. The results provide information for guiding further studies in the utilization of these probes to quantify other analytes in body fluids with high specificity and sensitivity.

  8. Estimation of immunization providers' activities cost, medication cost, and immunization dose errors cost in Iraq.

    Science.gov (United States)

    Al-lela, Omer Qutaiba B; Bahari, Mohd Baidi; Al-abbassi, Mustafa G; Salih, Muhannad R M; Basher, Amena Y

    2012-06-06

    The immunization status of children is improved by interventions that increase community demand for compulsory and non-compulsory vaccines; among the most important of these are interventions aimed at immunization providers. The aim of this study is to evaluate the activities of immunization providers in terms of activity time and cost, to calculate the cost of immunization doses, and to determine the cost of immunization dose errors. A time-motion and cost analysis study design was used. Five public health clinics in Mosul, Iraq, participated in the study. Fifty (50) vaccine doses were required to estimate activity times and costs. The micro-costing method was used; time and cost data were collected for each immunization-related activity performed by the clinic staff. A stopwatch was used to measure the duration of activity interactions between the parents and clinic staff. The immunization service cost was calculated by multiplying the average salary per minute by the activity time in minutes. 528 immunization cards of Iraqi children were scanned to determine the number and the cost of immunization dose errors (extra immunization doses and invalid doses). The average time for child registration was 6.7 min per immunization dose, and the physician spent more than 10 min per dose. Nurses needed more than 5 min to complete child vaccination. The total cost of immunization activities was 1.67 US$ per immunization dose. The measles vaccine (fifth dose) has a lower price (0.42 US$) than all other immunization doses. The cost of a total of 288 invalid doses was 744.55 US$, and the cost of a total of 195 extra immunization doses was 503.85 US$. The time spent on physicians' activities was longer than that spent on registrars' and nurses' activities. Physician total cost was higher than registrar cost and nurse cost. The total immunization cost will increase by about 13.3% owing to dose errors. Copyright © 2012 Elsevier Ltd. All rights reserved.

  9. Quantitative estimation of temperature variations in plantar angiosomes: a study case for diabetic foot.

    Science.gov (United States)

    Peregrina-Barreto, H; Morales-Hernandez, L A; Rangel-Magdaleno, J J; Avina-Cervantes, J G; Ramirez-Cortes, J M; Morales-Caporal, R

    2014-01-01

    Thermography is a useful tool since it provides information that may help in the diagnosis of several diseases in a noninvasive and fast way. In particular, thermography has been applied in the study of the diabetic foot. However, most of these studies report only qualitative information, making it difficult to measure significant parameters such as temperature variations. These variations are important in the analysis of the diabetic foot, since they could bring knowledge, for instance, regarding ulceration risks. The early detection of ulceration risks is considered an important research topic in the medical field, as its objective is to avoid major complications that might lead to limb amputation. The absence of symptoms in the early phase of ulceration is the main obstacle to providing a timely diagnosis in subjects with neuropathy. Since the relation between temperature and ulceration risks is well established in the literature, a methodology that obtains quantitative temperature differences in the plantar area of the diabetic foot to detect ulceration risks is proposed in this work. Such a methodology is based on the angiosome concept and image processing.

  10. Quantitative Estimation of Temperature Variations in Plantar Angiosomes: A Study Case for Diabetic Foot

    Directory of Open Access Journals (Sweden)

    H. Peregrina-Barreto

    2014-01-01

    Full Text Available Thermography is a useful tool since it provides information that may help in the diagnosis of several diseases in a noninvasive and fast way. In particular, thermography has been applied in the study of the diabetic foot. However, most of these studies report only qualitative information, making it difficult to measure significant parameters such as temperature variations. These variations are important in the analysis of the diabetic foot, since they could bring knowledge, for instance, regarding ulceration risks. The early detection of ulceration risks is considered an important research topic in the medical field, as its objective is to avoid major complications that might lead to limb amputation. The absence of symptoms in the early phase of ulceration is the main obstacle to providing a timely diagnosis in subjects with neuropathy. Since the relation between temperature and ulceration risks is well established in the literature, a methodology that obtains quantitative temperature differences in the plantar area of the diabetic foot to detect ulceration risks is proposed in this work. Such a methodology is based on the angiosome concept and image processing.

  11. 49 CFR 375.405 - How must I provide a non-binding estimate?

    Science.gov (United States)

    2010-10-01

    ... TRANSPORTATION OF HOUSEHOLD GOODS IN INTERSTATE COMMERCE; CONSUMER PROTECTION REGULATIONS Estimating Charges...-binding estimate as an attachment to be made an integral part of the bill of lading contract. (5) You must... services are necessary to properly service a shipment after the bill of lading has been issued, you must...

  12. 49 CFR 375.403 - How must I provide a binding estimate?

    Science.gov (United States)

    2010-10-01

    ... TRANSPORTATION OF HOUSEHOLD GOODS IN INTERSTATE COMMERCE; CONSUMER PROTECTION REGULATIONS Estimating Charges... integral part of the bill of lading contract. (4) You must clearly indicate upon each binding estimate's... service a shipment after the bill of lading has been issued, you must inform the individual shipper what...

  13. Quantitative precipitation estimation in complex orography using quasi-vertical profiles of dual polarization radar variables

    Science.gov (United States)

    Montopoli, Mario; Roberto, Nicoletta; Adirosi, Elisa; Gorgucci, Eugenio; Baldini, Luca

    2017-04-01

    Weather radars are nowadays a unique tool for quantitatively estimating rain precipitation near the surface. This is an important task for many applications: for example, feeding hydrological models, mitigating the impact of severe storms at the ground by using radar information in modern warning tools, and aiding validation studies of satellite-based rain products. With respect to the latter application, several ground validation studies of the Global Precipitation Mission (GPM) products have recently highlighted the importance of accurate QPE from ground-based weather radars. To date, many works have analyzed the performance of various QPE algorithms using actual and synthetic experiments, possibly trained by measurements of particle size distributions and electromagnetic models. Most of these studies support the use of dual polarization variables not only to ensure a good level of radar data quality but also as a direct input to the rain estimation equations. Among others, one of the most important limiting factors in radar QPE accuracy is the vertical variability of the particle size distribution, which affects, at different levels, all the acquired radar variables as well as the rain rates. This is particularly impactful in mountainous areas, where the altitude of the radar sampling volume is likely several hundred meters above the surface. In this work, we analyze the impact of vertical profile variations of rain precipitation on several dual polarization radar QPE algorithms when they are tested in a complex-orography scenario. So far, in weather radar studies, more emphasis has been given to extrapolation strategies that make use of the signature of the vertical profiles in terms of radar co-polar reflectivity. This may limit the use of radar vertical profiles when dual polarization QPE algorithms are considered, because in that case all the radar variables used in the rain estimation process should be consistently extrapolated to the surface.

  14. GENE ACTION AND HERITABILITY ESTIMATES OF QUANTITATIVE CHARACTERS AMONG LINES DERIVED FROM VARIETAL CROSSES OF SOYBEAN

    Directory of Open Access Journals (Sweden)

    Lukman Hakim

    2017-09-01

    Full Text Available The knowledge of gene action, heritability and genetic variability is useful and permits plant breeders to design efficient breeding strategies in soybean. The objectives of this study were to determine gene action, genetic variability, heritability and genetic advance of quantitative characters that could be realized through selection of segregating progenies. The F1 population and F2 progenies of six crosses among five soybean varieties were evaluated at Muneng Experimental Station, East Java, during the dry season of 2014. The lines were planted in a randomized block design with four replications. The seeds of each F1 and F2 progeny and the parents were planted in four rows of 3 m long, with 40 cm x 20 cm plant spacing, one plant per hill. The results showed that pod number per plant, seed yield, plant yield and harvest index were predominantly controlled by additive gene effects. Seed size was also controlled by additive gene effects, with small seed dominant to large seed size. Plant height was found to be controlled by both additive and nonadditive gene effects. Similarly, days to maturity was due mainly to additive and nonadditive gene effects, with earliness dominant to lateness. Days to maturity had the highest heritability estimate of 49.3%, followed by seed size (47.0%), harvest index (45.8%), and pod number per plant (45.5%). Therefore, these characters could be used in the selection of high yielding soybean genotypes in the F3 generation.
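
    Heritability estimates like those quoted are ratios of variance components across generations. A minimal sketch of a broad-sense estimate, with the environmental variance taken from the non-segregating generations; the variances below are hypothetical numbers chosen to land near the reported 49.3%, not the study's data:

    ```python
    import statistics

    def broad_sense_heritability(var_f2, var_p1, var_p2, var_f1):
        # H^2 = (VF2 - VE) / VF2, with the environmental variance VE estimated
        # as the mean variance of the non-segregating generations (P1, P2, F1)
        var_e = statistics.mean([var_p1, var_p2, var_f1])
        return (var_f2 - var_e) / var_f2

    # Hypothetical phenotypic variances for days to maturity in one cross
    print(f"H^2 = {broad_sense_heritability(18.9, 9.0, 10.2, 9.5):.1%}")  # ~49%
    ```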

  15. Ecosystem Services Provided by Agroecosystems: A Qualitative and Quantitative Assessment of this Relationship in the Pampa Region, Argentina

    Science.gov (United States)

    Rositano, Florencia; Ferraro, Diego Omar

    2014-03-01

    The development of an analytical framework relating agricultural conditions and ecosystem services (ES) provision could be very useful for developing land-use systems which sustain natural resources for future use. Accordingly, a conceptual network was developed, based on literature review and expert knowledge, describing the functional relationships between agricultural management and ES provision in the Pampa region (Argentina). We selected eight ES to develop this conceptual network: (1) carbon (C) balance, (2) nitrogen (N) balance, (3) groundwater contamination control, (4) soil water balance, (5) soil structural maintenance, (6) N2O emission control, (7) regulation of biotic adversities, and (8) biodiversity maintenance. This conceptual network revealed a high degree of interdependence among ES provided by Pampean agroecosystems, including two trade-offs and two synergies. We then analyzed the conceptual network structure and found that both environmental and management variables influenced ES provision. Finally, we selected four ES to parameterize and quantify across 10 growing seasons (2000/2001-2009/2010) through a probabilistic methodology called Bayesian Networks. Only the N balance was negatively impacted by agricultural management, while the C balance, groundwater contamination control, and N2O emission control were not. The outcomes of our work emphasize the idea that qualitative and quantitative methodologies should be implemented together to assess ES provision in Pampean agroecosystems, as well as in other agricultural systems.

  16. Regadenoson provides perfusion results comparable to adenosine in heterogeneous patient populations: a quantitative analysis from the ADVANCE MPI trials.

    Science.gov (United States)

    Mahmarian, John J; Peterson, Leif E; Xu, Jiaqiong; Cerqueira, Manuel D; Iskandrian, Ami E; Bateman, Timothy M; Thomas, Gregory S; Nabi, Faisal

    2015-04-01

    Total and reversible left ventricular (LV) perfusion defect size (PDS) predict patient outcome. Limited data exist as to whether regadenoson induces similar perfusion abnormalities as observed with adenosine. We sought to determine whether regadenoson induces a similar LV PDS as seen with adenosine across varying patient populations. The ADVANCE MPI studies were prospective, double-blind randomized trials comparing regadenoson to standard adenosine myocardial perfusion tomography (SPECT). Following an initial adenosine SPECT, patients were randomized to either regadenoson (N = 1284) or a second adenosine study (N = 660). SPECT quantification was performed blinded to randomization and image sequence. Propensity analysis was used to define comparability of regadenoson and adenosine perfusion results. Baseline clinical and SPECT results were similar in the two randomized groups. There was a close correlation between adenosine- and regadenoson-induced total PDS (r2 = 0.98), irrespective of age, gender, diabetic status, body mass index, or prior cardiovascular history. By propensity analysis, regadenoson-induced total PDS was significantly larger than observed with adenosine. This is the first study to show that regadenoson induces similar, if not larger, perfusion defects than those observed with adenosine across different patient populations, and it demonstrates the value of quantitative analysis for defining serial changes in SPECT perfusion results. Regadenoson should provide comparable diagnostic and prognostic SPECT information to that obtained with adenosine.

  17. Quantitative Phosphoproteomic Analysis Provides Insight into the Response to Short-Term Drought Stress in Ammopiptanthus mongolicus Roots

    Directory of Open Access Journals (Sweden)

    Huigai Sun

    2017-10-01

    Full Text Available Drought is one of the major abiotic stresses that negatively affect plant growth and development. Ammopiptanthus mongolicus is an ecologically important shrub in the mid-Asia desert region and is used as a model for abiotic tolerance research in trees. Protein phosphorylation participates in the regulation of various biological processes; however, knowledge of the phosphorylation events associated with drought stress signaling and response in plants is still limited. Here, we conducted a quantitative phosphoproteomic analysis of the response of A. mongolicus roots to short-term drought stress. Data are available via the iProx database with project ID IPX0000971000. In total, 7841 phosphorylation sites were found from the 2019 identified phosphopeptides, corresponding to 1060 phosphoproteins. Drought stress resulted in significant changes in the abundance of 103 phosphopeptides, corresponding to 90 differentially-phosphorylated phosphoproteins (DPPs). Motif-x analysis identified two motifs, [pSP] and [RXXpS], from these DPPs. Functional enrichment and protein-protein interaction analysis showed that the DPPs were mainly involved in signal transduction and transcriptional regulation, osmotic adjustment, stress response and defense, RNA splicing and transport, protein synthesis, folding and degradation, and epigenetic regulation. These drought-responsive phosphoproteins and the related signaling and metabolic pathways probably play important roles in drought stress signaling and response in A. mongolicus roots. Our results provide new information for understanding the molecular mechanism of the abiotic stress response in plants at the posttranslational level.

  18. Estimation of spinopelvic muscles' volumes in young asymptomatic subjects: a quantitative analysis.

    Science.gov (United States)

    Amabile, Celia; Moal, Bertrand; Chtara, Oussama Arous; Pillet, Helene; Raya, Jose G; Iannessi, Antoine; Skalli, Wafa; Lafage, Virginie; Bronsard, Nicolas

    2017-04-01

    Muscles have been shown to be a major component in postural regulation during pathological evolution or aging. In particular, spinopelvic muscles are recruited for compensatory mechanisms such as pelvic retroversion or knee flexion. A change in muscle volume could, therefore, be a marker of greater postural degradation. Yet it is difficult to interpret spinopelvic muscular degradation, as there are few reported values for young asymptomatic adults to compare to. The objective was to provide such reference values for spinopelvic muscles. A model predicting the muscular volume from a reduced set of MRI segmented images was investigated. A total of 23 asymptomatic subjects younger than 24 years old underwent an MRI acquisition from T12 to the knee. Spinopelvic muscles were segmented to obtain an accurate 3D reconstruction, allowing precise computation of muscle volume. A model computing the volume of muscular groups from fewer than six MRI segmented slices was investigated. Baseline values have been reported in tables. For all muscles, invariance was found for the shape factor [the ratio of volume over (area times length)]. Muscle values for a reference population have been reported. A new model predicting muscle volumes from a reduced set of MRI slices is proposed. While this model still needs to be validated on other populations, the current study appears promising for clinical use to determine, quantitatively, the muscular degradation.

  19. Noninvasive and quantitative intracranial pressure estimation using ultrasonographic measurement of optic nerve sheath diameter.

    Science.gov (United States)

    Wang, Li-Juan; Yao, Yan; Feng, Liang-Shu; Wang, Yu-Zhi; Zheng, Nan-Nan; Feng, Jia-Chun; Xing, Ying-Qi

    2017-02-07

    We aimed to quantitatively assess intracranial pressure (ICP) using optic nerve sheath diameter (ONSD) measurements. We recruited 316 neurology patients in whom ultrasonographic ONSD was measured before lumbar puncture. They were randomly divided into a modeling and a test group at a ratio of 7:3. In the modeling group, we conducted univariate and multivariate analyses to assess associations between ICP and ONSD, age, sex, BMI, mean arterial blood pressure, and diastolic blood pressure. We derived the mathematical function "Xing & Wang" from the modeling group to predict ICP and evaluated the function in the test group. In the modeling group, ICP was strongly correlated with ONSD (r = 0.758; Durbin-Watson value = 1.94). In the test group, a significant correlation was found between the observed and predicted ICP (r = 0.76, p < 0.001). Bland-Altman analysis yielded a mean difference between measurements of -0.07 ± 41.55 mmH2O. The intraclass correlation coefficient and its 95% CIs for noninvasive ICP assessments using our prediction model was 0.86 (0.79-0.90). Ultrasonographic ONSD measurements provide a potential noninvasive method to quantify ICP that can be conducted at the bedside.
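
    A prediction function of this type is, at its core, a regression of lumbar-puncture ICP on ONSD fitted in the modeling group and applied to the test group. A minimal sketch on synthetic data; the linear relation, noise level, and recovered coefficients are illustrative, not the published "Xing & Wang" function:

    ```python
    import numpy as np

    rng = np.random.default_rng(1)

    # Synthetic "modeling group": ONSD (mm) vs lumbar-puncture ICP (mmH2O);
    # the underlying relation and noise are invented for illustration.
    onsd = rng.uniform(3.5, 6.5, 220)
    icp = 57.0 * onsd - 110.0 + rng.normal(0, 30.0, onsd.size)

    a, b = np.polyfit(onsd, icp, 1)        # fit ICP ~ a*ONSD + b
    print(f"fitted function: ICP = {a:.1f}*ONSD + {b:.1f} (mmH2O)")
    print("predicted ICP for ONSD = 5.4 mm:", round(a * 5.4 + b, 1))
    ```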

  20. Mechanistic implications for the formation of the diiron cluster in ribonucleotide reductase provided by quantitative EPR spectroscopy.

    Science.gov (United States)

    Pierce, Brad S; Elgren, Timothy E; Hendrich, Michael P

    2003-07-23

    -peptide (beta(II)) approximately 25 Å away. Furthermore, we show that metal incorporation into beta(II) occurs only during the O(2) activation chemistry of the beta(I)-peptide. This is the first direct evidence of an allosteric interaction between the two beta-peptides of R2. Furthermore, this model can explain the generally observed low Fe occupancy of R2. We also demonstrate that metal uptake and this newly observed allosteric effect are buffer dependent. Higher levels of glycerol cause loss of the allosteric effect. Reductive cycling of samples in the presence of Mn(II) produced a novel mixed-metal Fe(III)Mn(III)R2 species within the active site of R2. The magnitudes of the exchange coupling (J) for the Mn(II)2R2 and Fe(III)Mn(III)R2 species were determined to be -1.8 +/- 0.3 and -18 +/- 3 cm(-1), respectively. Quantitative spectral simulations for the Fe(III)Mn(III)R2 and mononuclear Mn(II)R2 species are provided. This work represents the first instance where both X- and Q-band simulations of perpendicular and parallel mode spectra were used to quantitatively predict the concentration of a protein-bound mononuclear Mn(II) species.

  1. Integrating field plots, lidar, and landsat time series to provide temporally consistent annual estimates of biomass from 1990 to present

    Science.gov (United States)

    Warren B. Cohen; Hans-Erik Andersen; Sean P. Healey; Gretchen G. Moisen; Todd A. Schroeder; Christopher W. Woodall; Grant M. Domke; Zhiqiang Yang; Robert E. Kennedy; Stephen V. Stehman; Curtis Woodcock; Jim Vogelmann; Zhe Zhu; Chengquan. Huang

    2015-01-01

    We are developing a system that provides temporally consistent biomass estimates for national greenhouse gas inventory reporting to the United Nations Framework Convention on Climate Change. Our model-assisted estimation framework relies on remote sensing to scale from plot measurements to lidar strip samples, to Landsat time series-based maps. As a demonstration, new...

  2. A quantitative framework for estimating risk of collision between marine mammals and boats

    Science.gov (United States)

    Martin, Julien; Sabatier, Quentin; Gowan, Timothy A.; Giraud, Christophe; Gurarie, Eliezer; Calleson, Scott; Ortega-Ortiz, Joel G.; Deutsch, Charles J.; Rycyk, Athena; Koslovsky, Stacie M.

    2016-01-01

    Speed regulations of watercraft in protected areas are designed to reduce lethal collisions with wildlife but can have economic consequences. We present a quantitative framework for investigating the risk of deadly collisions between boats and wildlife.

  3. Mesoscale and Local Scale Evaluations of Quantitative Precipitation Estimates by Weather Radar Products during a Heavy Rainfall Event

    Directory of Open Access Journals (Sweden)

    Basile Pauthier

    2016-01-01

    Full Text Available A 24-hour heavy rainfall event occurred in northeastern France from November 3 to 4, 2014. The accuracy of the quantitative precipitation estimation (QPE) by PANTHERE and ANTILOPE radar-based gridded products during this particular event is examined at both mesoscale and local scale, in comparison with two reference rain-gauge networks. Mesoscale accuracy was assessed for the total rainfall accumulated during the 24-hour event, using the Météo France operational rain-gauge network. Local-scale accuracy was assessed for both total event rainfall and hourly rainfall accumulations, using the recently developed HydraVitis high-resolution rain-gauge network. Evaluation shows that (1) PANTHERE radar-based QPE underestimates rainfall fields at mesoscale and local scale; (2) both PANTHERE and ANTILOPE successfully reproduced the spatial variability of rainfall at local scale; (3) PANTHERE underestimates can be significantly improved at local scale by merging these data with rain gauge data interpolation (i.e., ANTILOPE). This study provides a preliminary evaluation of radar-based QPE at local scale, suggesting that merged products are invaluable for applications at very high resolution. The results obtained underline the importance of using high-density rain-gauge networks to obtain information at high spatial and temporal resolution, for better understanding of local rainfall variation and to calibrate remotely sensed rainfall products.

  4. Estimation of low quantity genes: a hierarchical model for analyzing censored quantitative real-time PCR data.

    Science.gov (United States)

    Boyer, Tim C; Hanson, Tim; Singer, Randall S

    2013-01-01

    Analysis of gene quantities measured by quantitative real-time PCR (qPCR) can be complicated by observations that are below the limit of quantification (LOQ) of the assay. A hierarchical model estimated using MCMC methods was developed to analyze qPCR data of genes with observations that fall below the LOQ (censored observations). Simulated datasets with moderate to very high levels of censoring were used to assess the performance of the model; model results were compared to approaches that replace censored observations with a value on the log scale approximating zero or with values ranging from one to the LOQ of ten gene copies. The model was also compared to a Tobit regression model. Finally, all approaches for handling censored observations were evaluated with DNA extracted from samples that were spiked with known quantities of the antibiotic resistance gene tetL. For the simulated datasets, the model outperformed substitution of all values from 1-10 under all censoring scenarios in terms of bias, mean square error, and coverage of 95% confidence intervals for regression parameters. The model performed as well or better than substitution of a value approximating zero under two censoring scenarios (approximately 57% and 79% censored values). The model also performed as well or better than Tobit regression in two of three censoring scenarios (approximately 79% and 93% censored values). Under the levels of censoring present in the three scenarios of this study, substitution of any values greater than 0 produced the least accurate results. When applied to data produced from spiked samples, the model produced the lowest mean square error of the three approaches. This model provides a good alternative for analyzing large amounts of left-censored qPCR data when the goal is estimation of population parameters. The flexibility of this approach can accommodate complex study designs such as longitudinal studies.
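
    The key modelling idea is that a below-LOQ observation contributes the probability mass below the LOQ, rather than a density value, to the likelihood. A minimal frequentist (Tobit-style) sketch of that censored likelihood on synthetic data, standing in for the paper's hierarchical MCMC model:

    ```python
    import numpy as np
    from scipy import stats, optimize

    rng = np.random.default_rng(3)
    log_loq = np.log(10.0)                       # LOQ of ten gene copies
    x = rng.normal(loc=np.log(8.0), scale=1.0, size=300)   # latent log quantities
    observed = x[x >= log_loq]                   # quantified values
    n_cens = np.sum(x < log_loq)                 # only the below-LOQ count is known

    def neg_log_lik(params):
        mu, log_sigma = params
        sigma = np.exp(log_sigma)
        ll_obs = stats.norm.logpdf(observed, mu, sigma).sum()
        ll_cens = n_cens * stats.norm.logcdf((log_loq - mu) / sigma)
        return -(ll_obs + ll_cens)

    res = optimize.minimize(neg_log_lik, x0=[np.log(10.0), 0.0])
    mu_hat, sigma_hat = res.x[0], np.exp(res.x[1])
    print(mu_hat, sigma_hat)   # close to log(8) and 1.0 despite heavy censoring
    ```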

  5. Estimation of low quantity genes: a hierarchical model for analyzing censored quantitative real-time PCR data.

    Directory of Open Access Journals (Sweden)

    Tim C Boyer

    Full Text Available Analysis of gene quantities measured by quantitative real-time PCR (qPCR) can be complicated by observations that are below the limit of quantification (LOQ) of the assay. A hierarchical model estimated using MCMC methods was developed to analyze qPCR data of genes with observations that fall below the LOQ (censored observations). Simulated datasets with moderate to very high levels of censoring were used to assess the performance of the model; model results were compared to approaches that replace censored observations with a value on the log scale approximating zero or with values ranging from one to the LOQ of ten gene copies. The model was also compared to a Tobit regression model. Finally, all approaches for handling censored observations were evaluated with DNA extracted from samples that were spiked with known quantities of the antibiotic resistance gene tetL. For the simulated datasets, the model outperformed substitution of all values from 1-10 under all censoring scenarios in terms of bias, mean square error, and coverage of 95% confidence intervals for regression parameters. The model performed as well or better than substitution of a value approximating zero under two censoring scenarios (approximately 57% and 79% censored values). The model also performed as well or better than Tobit regression in two of three censoring scenarios (approximately 79% and 93% censored values). Under the levels of censoring present in the three scenarios of this study, substitution of any values greater than 0 produced the least accurate results. When applied to data produced from spiked samples, the model produced the lowest mean square error of the three approaches. This model provides a good alternative for analyzing large amounts of left-censored qPCR data when the goal is estimation of population parameters. The flexibility of this approach can accommodate complex study designs such as longitudinal studies.

  6. NEXRAD quantitative precipitation estimates, data acquisition, and processing for the DuPage County, Illinois, streamflow-simulation modeling system

    Science.gov (United States)

    Ortel, Terry W.; Spies, Ryan R.

    2015-11-19

    Next-Generation Radar (NEXRAD) has become an integral component in the estimation of precipitation (Kitzmiller and others, 2013). The high spatial and temporal resolution of NEXRAD has revolutionized the ability to estimate precipitation across vast regions, which is especially beneficial in areas without a dense rain-gage network. With the improved precipitation estimates, hydrologic models can produce reliable streamflow forecasts for areas across the United States. NEXRAD data from the National Weather Service (NWS) has been an invaluable tool used by the U.S. Geological Survey (USGS) for numerous projects and studies; NEXRAD data processing techniques similar to those discussed in this Fact Sheet have been developed within the USGS, including the NWS Quantitative Precipitation Estimates archive developed by Blodgett (2013).

  7. Estimation of Low Quantity Genes: A Hierarchical Model for Analyzing Censored Quantitative Real-Time PCR Data

    OpenAIRE

    Boyer, Tim C.; Tim Hanson; Singer, Randall S.

    2013-01-01

    Analysis of gene quantities measured by quantitative real-time PCR (qPCR) can be complicated by observations that are below the limit of quantification (LOQ) of the assay. A hierarchical model estimated using MCMC methods was developed to analyze qPCR data of genes with observations that fall below the LOQ (censored observations). Simulated datasets with moderate to very high levels of censoring were used to assess the performance of the model; model results were compared to approaches that r...

  8. Quantitative estimation of minimum offset for multichannel surface-wave survey with actively exciting source

    Science.gov (United States)

    Xu, Y.; Xia, J.; Miller, R.D.

    2006-01-01

    Multichannel analysis of surface waves is a developing method widely used in shallow subsurface investigations. The field procedures and related parameters are very important for successful applications. Among these parameters, the source-receiver offset range is seldom discussed in theory and normally determined by empirical or semi-quantitative methods in current practice. This paper discusses the problem from a theoretical perspective. A formula for quantitatively evaluating a layered homogeneous elastic model was developed. The analytical results based on simple models and experimental data demonstrate that the formula is correct for surface wave surveys for near-surface applications. © 2005 Elsevier B.V. All rights reserved.

  9. Quantitative precipitation estimates for the northeastern Qinghai-Tibetan Plateau over the last 18,000 years

    Science.gov (United States)

    Li, Jianyong; Dodson, John; Yan, Hong; Cheng, Bo; Zhang, Xiaojian; Xu, Qinghai; Ni, Jian; Lu, Fengyan

    2017-05-01

    Quantitative information regarding the long-term variability of precipitation and vegetation during the period covering both the Late Glacial and the Holocene on the Qinghai-Tibetan Plateau (QTP) is scarce. Herein, we provide new quantitative reconstructions of annual mean precipitation (PANN) and vegetation history over the last 18,000 years using high-resolution pollen data from Lakes Dalianhai and Qinghai on the northeastern QTP. Five calibration techniques, including weighted averaging, weighted average-partial least squares regression, the modern analogue technique, locally weighted weighted averaging regression, and maximum likelihood, were employed to construct robust inference models and to produce reliable PANN estimates on the QTP. The biomization method was applied for reconstructing the vegetation dynamics. The study area was dominated by steppe and characterized by a highly variable, relatively dry climate at 18,000-11,000 cal years B.P. PANN increased from the early Holocene, reached a maximum at 8000-3000 cal years B.P. with coniferous-temperate mixed forest as the dominant biome, and thereafter declined toward the present. The PANN reconstructions are broadly consistent with other proxy-based paleoclimatic records from the northeastern QTP and the northern region of monsoonal China. The possible mechanisms behind the precipitation changes may be tentatively attributed to the internal feedback processes of higher-latitude (e.g., North Atlantic) and lower-latitude (e.g., subtropical monsoon) competing climatic regimes, which are primarily modulated by solar energy output as the external driving force. These findings may provide important insights into understanding future Asian precipitation dynamics under the projected global warming.
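
    Of the five listed calibration techniques, plain weighted averaging (WA) is the simplest to illustrate: each taxon's precipitation optimum is the abundance-weighted mean of PANN over a modern training set, and a fossil spectrum is reconstructed as the abundance-weighted mean of those optima. A minimal sketch with invented pollen percentages:

        # Weighted-averaging (WA) transfer-function sketch; all values are made up.
        # Rows = modern samples, columns = pollen taxa; pann = observed precipitation.
        import numpy as np

        modern = np.array([[30, 50, 20],
                           [60, 30, 10],
                           [10, 40, 50]], dtype=float)   # pollen percentages
        pann = np.array([350.0, 250.0, 500.0])           # mm/yr at the modern sites

        # Taxon optima: abundance-weighted mean of PANN across the training set.
        optima = (modern * pann[:, None]).sum(axis=0) / modern.sum(axis=0)

        # Reconstruction for a fossil spectrum: weighted mean of the taxon optima.
        fossil = np.array([20.0, 45.0, 35.0])
        pann_hat = (fossil * optima).sum() / fossil.sum()
        print(optima, pann_hat)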

  10. A high en-face resolution AS-OCT providing quantitative ability to measure layered corneal opacities

    Science.gov (United States)

    Chiu, Yu-Kuang; Chen, Wei-Li; Tsai, Cheng-Tsung; Yang, Chang-Hao; Huang, Sheng-Lung

    2017-07-01

    An in-vivo anterior-segment optical coherence tomography system with sub-micron isotropic resolution is demonstrated on rat cornea. The opacity of the layered cornea was quantitatively analyzed. The morphology of the corneal layers was well depicted by the en-face image.

  11. Quantitative Estimation of the Velocity of Urbanization in China Using Nighttime Luminosity Data

    Directory of Open Access Journals (Sweden)

    Ting Ma

    2016-01-01

    Rapid urbanization with sizeable enhancements of urban population and built-up land in China creates challenging planning and management issues due to the complexity of both the urban development and the socioeconomic drivers of environmental change. Improved understanding of the spatio-temporal characteristics of urbanization processes is increasingly important for investigating urban expansion and environmental responses to corresponding socioeconomic and landscape dynamics. In this study, we present an artificial-luminosity-derived index of the velocity of urbanization, defined as the ratio of the temporal trend to the spatial gradient of mean annual stable nighttime brightness, to estimate the pace of urbanization and consequent changes in land cover in China for the period 2000–2010. Using the Defense Meteorological Satellite Program–derived time series of nighttime light data and corresponding satellite-based land cover maps, our results show that the geometric mean velocity of urban dispersal at the country level was 0.21 km·yr⁻¹ across 88.58 × 10³ km² of urbanizing areas, in which ~23% of areas originally made of natural and cultivated lands were converted to artificial surfaces between 2000 and 2010. The speed of urbanization varies among urban agglomerations and cities with different development stages and urban forms. In particular, the Yangtze River Delta conurbation shows the fastest (0.39 km·yr⁻¹) and most extensive (16.12 × 10³ km²) urban growth in China over the 10-year period. Moreover, if the current velocity holds, our estimates suggest that an additional 13.29 × 10³ km² of land area will be converted to human-built features while high-density socioeconomic activities across the current urbanizing regions and urbanized areas will greatly increase from 52.44 × 10³ km² in 2010 to 62.73 × 10³ km² in China's mainland during the next several decades. Our findings may provide potential insights into the pace of urbanization in
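
    A hedged sketch of the index itself: the per-pixel temporal trend of mean annual brightness divided by the magnitude of its spatial gradient (synthetic data; the DMSP intercalibration and stable-light preprocessing are omitted):

        # Velocity-of-urbanization sketch: temporal trend / spatial gradient of
        # nighttime brightness (digital numbers, DN), on a synthetic array.
        import numpy as np

        years = np.arange(2000, 2011)
        rng = np.random.default_rng(0)
        brightness = rng.uniform(0, 63, size=(len(years), 50, 50))   # DN[t, y, x], made up

        # Per-pixel temporal trend (DN per year) by least squares on centered years.
        t = years - years.mean()
        anom = brightness - brightness.mean(axis=0)
        trend = (t[:, None, None] * anom).sum(axis=0) / (t ** 2).sum()

        # Spatial gradient magnitude (DN per km) of the time-mean image, 1-km pixels assumed.
        gy, gx = np.gradient(brightness.mean(axis=0), 1.0)
        grad = np.hypot(gx, gy)

        velocity = trend / np.where(grad > 0, grad, np.nan)          # km per year
        print(np.nanmean(np.abs(velocity)))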

  12. Comparison of different glomerular filtration methods in the elderly: which formula provides better estimates?

    Science.gov (United States)

    Aras, Sevgi; Varli, Murat; Uzun, Burcu; Atli, Teslime; Keven, Kenan; Turgay, Murat

    2012-01-01

    Technetium-99m diethylenetriaminepentaacetic acid ((99m)Tc-DTPA) is an ideal radioisotopic method having a high correlation with inulin clearance for the determination of glomerular filtration rate (GFR). Different formulas, like creatinine clearance (CrCl) in 24 h urine samples, the Cockcroft-Gault formula (CGF), and modification of diet in renal disease (MDRD), are used to come up with an estimate. In this study, we compared (99m)Tc-DTPA with the formulas mentioned above in an attempt to identify the method that would yield the nearly ideal GFR estimate in the elderly. In 76 patients who were admitted to our clinic, we measured 24 h urine volume (V), urine creatinine (Ucr), and serum creatinine (Scr) levels together with CrCl, Scr, serum urea (Su), and albumin (Alb) levels. By using coefficients identified for age, gender, and race, we calculated modification of diet in renal disease 1 (MDRD1). Different from MDRD1, we calculated modification of diet in renal disease 2 (MDRD2), which does not include the Su and Alb parameters, and formulas like CGF that include Scr, age, gender, and weight parameters to come up with GFR levels. All patients underwent the (99m)Tc-DTPA procedure. The mean of the GFR values measured by (99m)Tc-DTPA was 54.3 ± 19.9. The means of the GFR values calculated by CrCl, MDRD1, MDRD2, and CGF were 58.0 ± 30.5, 60.9 ± 22.1, 54.4 ± 20.1, and 57.9 ± 22.4, respectively. GFR as measured by (99m)Tc-DTPA showed statistically significant correlations with the results of the other methods (p < 0.001 for all methods). The most significant correlation was with MDRD1. MDRD1 can be used for near-ideal and accurate prediction of GFR in the elderly in daily practice.
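
    For orientation, two of the compared bedside formulas can be written directly; the sketch below shows the standard Cockcroft-Gault equation and the abbreviated 4-variable MDRD equation (the study's MDRD1 additionally uses serum urea and albumin, which is not reproduced here):

        # Hedged sketch of two common GFR estimates; Scr in mg/dL, weight in kg.
        # The 6-variable MDRD1 (with urea and albumin) is not reproduced.
        def cockcroft_gault(age, weight_kg, scr, female):
            """Creatinine clearance, mL/min."""
            crcl = (140 - age) * weight_kg / (72.0 * scr)
            return crcl * 0.85 if female else crcl

        def mdrd2(age, scr, female, black=False):
            """Abbreviated 4-variable MDRD eGFR, mL/min/1.73 m^2."""
            gfr = 186.0 * scr ** -1.154 * age ** -0.203
            if female:
                gfr *= 0.742
            if black:
                gfr *= 1.212
            return gfr

        print(cockcroft_gault(75, 70, 1.2, female=True))
        print(mdrd2(75, 1.2, female=True))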

  13. The influence of the design matrix on treatment effect estimates in the quantitative analyses of single-subject experimental design research.

    Science.gov (United States)

    Moeyaert, Mariola; Ugille, Maaike; Ferron, John M; Beretvas, S Natasha; Van den Noortgate, Wim

    2014-09-01

    The quantitative methods for analyzing single-subject experimental data have expanded during the last decade, including the use of regression models to statistically analyze the data, but many questions remain. One question is how to specify predictors in a regression model to account for the specifics of the design and estimate the effect size of interest. These quantitative effect sizes are used in retrospective analyses and allow synthesis of single-subject experimental study results, which is informative for evidence-based decision making, research and theory building, and policy discussions. We discuss different design matrices that can be used for the most common single-subject experimental designs (SSEDs), namely, the multiple-baseline designs, reversal designs, and alternating treatment designs, and provide empirical illustrations. The purpose of this article is to guide single-subject experimental data analysts interested in analyzing and meta-analyzing SSED data. © The Author(s) 2014.

  14. High-density surface electromyography provides reliable estimates of motor unit behavior.

    Science.gov (United States)

    Martinez-Valdes, E; Laine, C M; Falla, D; Mayer, F; Farina, D

    2016-06-01

    To assess the intra- and inter-session reliability of estimates of motor unit behavior and muscle fiber properties derived from high-density surface electromyography (HDEMG). Ten healthy subjects performed submaximal isometric knee extensions during three recording sessions (separate days) at 10%, 30%, 50% and 70% of their maximum voluntary effort. The discharge timings of motor units of the vastus lateralis and medialis muscles were automatically identified from HDEMG by a decomposition algorithm. We characterized the number of detected motor units, their discharge rates, the coefficient of variation of their inter-spike intervals (CoVisi), the action potential conduction velocity and peak-to-peak amplitude. Reliability was assessed for each motor unit characteristic by intra-class correlation coefficient (ICC). Additionally, a pulse-to-noise ratio (PNR) was calculated to verify the accuracy of the decomposition. Good to excellent reliability within and between sessions was found for all motor unit characteristics at all force levels (ICCs > 0.8), with the exception of CoVisi, which presented poor reliability; decomposition accuracy verified by the PNR was high (>95%). Motor unit features can be assessed non-invasively and reliably within and across sessions over a wide range of force levels. These results suggest that it is possible to characterize motor units in longitudinal intervention studies. Copyright © 2016 International Federation of Clinical Neurophysiology. Published by Elsevier Ireland Ltd. All rights reserved.

  15. Validation and measurement uncertainty estimation in food microbiology: differences between quantitative and qualitative methods

    Directory of Open Access Journals (Sweden)

    Vesna Režić Dereani

    2010-09-01

    The aim of this research is to describe quality control procedures, procedures for validation, and measurement uncertainty (MU) determination as an important element of quality assurance in a food microbiology laboratory for qualitative and quantitative types of analysis. Accreditation is conducted according to the standard ISO 17025:2007, General requirements for the competence of testing and calibration laboratories, which guarantees compliance with standard operating procedures and the technical competence of the staff involved in the tests; it has recently been widely introduced in food microbiology laboratories in Croatia. In addition to the introduction of a quality manual and many general documents, some of the most demanding procedures in routine microbiology laboratories are measurement uncertainty (MU) procedures and the design of validation experiments. Those procedures are not yet standardized even at the international level, and they require practical microbiological knowledge together with statistical competence. Differences between validation experiment designs for quantitative and qualitative food microbiology analyses are discussed in this research, and practical solutions are briefly described. MU for quantitative determinations is a more demanding issue than qualitative MU calculation. MU calculations are based on external proficiency testing data and internal validation data. In this paper, practical schematic descriptions of both procedures are shown.

  16. Estimation of haplotype associated with several quantitative phenotypes based on maximization of area under a receiver operating characteristic (ROC) curve.

    Science.gov (United States)

    Kamitsuji, Shigeo; Kamatani, Naoyuki

    2006-01-01

    An algorithm for estimating haplotypes associated with several quantitative phenotypes is proposed. The concept of a receiver operating characteristic (ROC) curve was introduced, and a linear combination of the quantitative phenotypic values was considered. This set of values was divided into two parts: values for subjects with and without a particular haplotype. The goodness of this partition was evaluated by the area under the ROC curve (AUC). The AUC value varied from 0 to 1; this value was close to 1 when the partition had high accuracy. Therefore, the strength of association between phenotypes and haplotypes was considered to be proportional to the AUC value. In our algorithm, the parameters representing the degree of association between the haplotypes and phenotypes were estimated so as to maximize the AUC value; further, the haplotype with the maximum AUC value was considered to be the best haplotype associated with the phenotypes. This algorithm was implemented in the R language. The effectiveness of our algorithm was evaluated by applying it to real genotype data of the Calpain-10 gene obtained from diabetic patients. The results showed that our algorithm was more reasonable and advantageous for use with several quantitative phenotypes than the generalized linear model or the neural network model.
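
    Since the AUC equals the normalized Mann-Whitney U statistic, the partition score at the heart of the algorithm can be computed without constructing the ROC curve explicitly; a minimal sketch with fixed (not optimized) phenotype weights and invented data:

        # AUC of the partition "haplotype carriers vs. non-carriers" over a combined
        # phenotype score, computed as the normalized Mann-Whitney U statistic.
        import numpy as np

        def auc(scores_with, scores_without):
            wins = 0.0
            for a in scores_with:
                for b in scores_without:
                    wins += 1.0 if a > b else (0.5 if a == b else 0.0)
            return wins / (len(scores_with) * len(scores_without))

        # Linear combination of quantitative phenotypes; in the authors' algorithm
        # these weights would be optimized to maximize the AUC, here they are fixed.
        w = np.array([0.6, 0.4])
        pheno_with = np.array([[5.1, 2.0], [6.3, 1.1], [4.8, 2.5]]) @ w
        pheno_without = np.array([[3.2, 1.9], [2.8, 2.2]]) @ w
        print(auc(pheno_with, pheno_without))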

  17. Modified DTW for a quantitative estimation of the similarity between rainfall time series

    Science.gov (United States)

    Djallel Dilmi, Mohamed; Barthès, Laurent; Mallet, Cécile; Chazottes, Aymeric

    2017-04-01

    Precipitation results from complex meteorological phenomena and can be described as an intermittent process. The spatial and temporal variability of this phenomenon is significant and covers large scales. To analyze and model this variability and/or structure, several studies use a network of rain gauges providing several time series of precipitation measurements. To compare these different time series, the authors compute for each time series some parameters (PDF, rain peak intensity, occurrence, amount, duration, intensity …). However, despite the calculation of these parameters, the comparison between two measurement series remains qualitative. Due to advection processes, when different sensors of an observation network measure precipitation time series that are identical in terms of intermittency or intensities, there is a time lag between the different measured series. Analyzing and extracting relevant information on physical phenomena from these precipitation time series implies the development of automatic analytical methods capable of comparing two time series of precipitation measured by different sensors or at two different locations, and thus quantifying the difference/similarity. The limits of the Euclidean distance for measuring the similarity between time series of precipitation have been well demonstrated and explained (e.g., the Euclidean distance is very sensitive to phase-shift effects: between two identical but slightly shifted time series, this distance is not negligible). To quantify and analyze these time lags, correlation functions are well established, normalized, and commonly used to measure the spatial dependences that are required by many applications. However, authors generally observe that there is considerable scatter in the inter-rain-gauge correlation coefficients obtained from individual pairs of rain gauges. Because of a substantial dispersion of estimated time lag, the
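
    The classic dynamic time warping recurrence underlying such a comparison can be sketched in a few lines (the paper's rainfall-specific modifications are not reproduced):

        # Classic dynamic time warping (DTW) distance between two 1-D series,
        # tolerant of the time lags that defeat the Euclidean distance.
        import numpy as np

        def dtw(a, b):
            n, m = len(a), len(b)
            D = np.full((n + 1, m + 1), np.inf)
            D[0, 0] = 0.0
            for i in range(1, n + 1):
                for j in range(1, m + 1):
                    cost = abs(a[i - 1] - b[j - 1])
                    D[i, j] = cost + min(D[i - 1, j], D[i, j - 1], D[i - 1, j - 1])
            return D[n, m]

        rain1 = np.array([0, 0, 2.5, 7.1, 3.0, 0, 0])
        rain2 = np.array([0, 2.4, 7.0, 3.2, 0, 0, 0])   # same event, shifted one step
        print(dtw(rain1, rain2))    # small, unlike the Euclidean distance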

  18. Modeling real-time PCR kinetics: Richards reparametrized equation for quantitative estimation of European hake (Merluccius merluccius).

    Science.gov (United States)

    Sánchez, Ana; Vázquez, José A; Quinteiro, Javier; Sotelo, Carmen G

    2013-04-10

    Real-time PCR is the most sensitive method for detection and precise quantification of specific DNA sequences, but it is not usually applied as a quantitative method in seafood. In general, benchmark techniques, mainly the cycle threshold (Ct), are the routine method for quantitative estimations, but they are not the most precise approaches for a standard assay. In the present work, amplification data from European hake (Merluccius merluccius) DNA samples were accurately modeled by three sigmoid reparametrized equations, where the lag phase parameter (λc) from the four-parameter Richards equation was demonstrated to be the perfect substitute for Ct for PCR quantification. The concentrations of primers and probes were subsequently optimized by means of that selected kinetic parameter. Finally, the linear correlation between DNA concentration and λc was also confirmed.
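
    As an illustration of the approach, a four-parameter Richards curve with an explicit lag parameter can be fitted to fluorescence-versus-cycle data with scipy; the Zwietering-style reparametrization used below is one common choice and may differ from the paper's exact equation:

        # Hedged sketch: fit a reparametrized Richards curve to synthetic qPCR
        # fluorescence data so the lag parameter lam can replace Ct.
        import numpy as np
        from scipy.optimize import curve_fit

        def richards(t, A, mu, lam, nu):
            # Zwietering-style modified Richards model with lag phase lam.
            z = 1.0 + nu * np.exp(1.0 + nu) * np.exp(
                mu / A * (1.0 + nu) ** (1.0 + 1.0 / nu) * (lam - t))
            return A * z ** (-1.0 / nu)

        cycles = np.arange(1, 41, dtype=float)
        fluor = richards(cycles, 1.0, 0.12, 18.0, 0.8)          # synthetic curve
        fluor += np.random.default_rng(1).normal(0, 0.005, 40)  # measurement noise

        popt, _ = curve_fit(richards, cycles, fluor, p0=[1.0, 0.1, 15.0, 1.0],
                            maxfev=20000)
        A, mu, lam, nu = popt
        print(lam)   # lag-phase parameter, linearly related to log DNA concentration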

  19. Improved TLC Bioautographic Assay for Qualitative and Quantitative Estimation of Tyrosinase Inhibitors in Natural Products.

    Science.gov (United States)

    Zhou, Jinge; Tang, Qingjiu; Wu, Tao; Cheng, Zhihong

    2017-03-01

    TLC bioautography for tyrosinase inhibitors has made recent progress; however, an assay with a relatively low consumption of enzyme and quantitative capability would greatly advance the efficacy of related TLC bioautographic assays. An improved TLC bioautographic assay for detecting tyrosinase inhibitors was developed and validated in this study. L-DOPA (better water solubility than L-tyrosine) was used as the substrate instead of the previously reported L-tyrosine. The effects of enzyme and substrate concentrations, reaction temperatures and times, and pH values of the reaction system, as well as different plate types, on the TLC bioautographic assay were optimised. The quantitative analysis was conducted by densitometric scanning of spot areas and expressed as the relative tyrosinase inhibitory capacity (RTIC) using a positive control (kojic acid) equivalent. The limit of detection (LOD) of this assay was 1.0 ng for kojic acid. This assay has acceptable accuracy (101.73-102.90%), intra- and inter-day and intra- and inter-plate precisions [relative standard deviation (RSD) less than 7.0%], and ruggedness (RSD less than 3.5%). The consumption of enzyme (75 U/mL) is relatively low. Two tyrosinase inhibitory compounds, naringenin and 1-O-β-D-glucopyranosyl-4-allylbenzene, have been isolated from Rhodiola sacra guided by this TLC bioautographic assay. Our improved assay is a relatively low-cost, sensitive, and quantitative method compared to the reported TLC bioautographic assays. Copyright © 2016 John Wiley & Sons, Ltd.

  20. Statistical estimation of correlated genome associations to a quantitative trait network.

    Directory of Open Access Journals (Sweden)

    Seyoung Kim

    2009-08-01

    Many complex disease syndromes, such as asthma, consist of a large number of highly related, rather than independent, clinical or molecular phenotypes. This raises a new technical challenge in identifying genetic variations associated simultaneously with correlated traits. In this study, we propose a new statistical framework called graph-guided fused lasso (GFlasso) to directly and effectively incorporate the correlation structure of multiple quantitative traits such as clinical metrics and gene expressions in association analysis. Our approach represents correlation information explicitly among the quantitative traits as a quantitative trait network (QTN) and then leverages this network to encode structured regularization functions in a multivariate regression model over the genotypes and traits. The result is that the genetic markers that jointly influence subgroups of highly correlated traits can be detected jointly with high sensitivity and specificity. While most of the traditional methods examined each phenotype independently and combined the results afterwards, our approach analyzes all of the traits jointly in a single statistical framework. This allows our method to borrow information across correlated phenotypes to discover the genetic markers that perturb a subset of the correlated traits synergistically. Using simulated datasets based on the HapMap consortium and an asthma dataset, we compared the performance of our method with other methods based on single-marker analysis and regression-based methods that do not use any of the relational information in the traits. We found that our method showed an increased power in detecting causal variants affecting correlated traits. Our results showed that, when correlation patterns among traits in a QTN are considered explicitly and directly during a structured multivariate genome association analysis using our proposed methods, the power of detecting true causal SNPs with possibly pleiotropic

  1. Modeling Bone Surface Morphology: A Fully Quantitative Method for Age-at-Death Estimation Using the Pubic Symphysis.

    Science.gov (United States)

    Slice, Dennis E; Algee-Hewitt, Bridget F B

    2015-07-01

    The pubic symphysis is widely used in age estimation for the adult skeleton. Standard practice requires the visual comparison of surface morphology against criteria representing predefined phases and the estimation of case-specific age from an age range associated with the chosen phase. Known problems of method and observer error necessitate alternative tools to quantify age-related change in pubic morphology. This paper presents an objective, fully quantitative method for estimating age-at-death from the skeleton, which exploits a variance-based score of surface complexity computed from vertices obtained from a scanner sampling the pubic symphysis. For laser scans from 41 modern American male skeletons, this method produces results that are significantly associated with known age-at-death (RMSE = 17.15 years). Chronological age is therefore predicted equally well, if not better, with this robust, objective, and fully quantitative method than with prevailing phase-aging systems. This method contributes to forensic casework by responding to medico-legal expectations for evidence standards. © 2015 American Academy of Forensic Sciences.

  2. THE QUADRANTS METHOD TO ESTIMATE QUANTITATIVE VARIABLES IN MANAGEMENT PLANS IN THE AMAZON

    Directory of Open Access Journals (Sweden)

    Gabriel da Silva Oliveira

    2015-12-01

    This work aimed to evaluate the accuracy of estimates of abundance, basal area, and commercial volume per hectare by the quadrants method applied to an area of 1,000 hectares of rain forest in the Amazon. Samples were simulated by random and systematic processes with different sample sizes, ranging from 100 to 200 sampling points. The amounts estimated from the samples were compared with the parametric values recorded in the census. In the analysis, the population was defined as all trees with diameter at breast height equal to or greater than 40 cm. The quadrants method did not reach the desired level of accuracy for the variables basal area and commercial volume, overestimating the values recorded in the census. However, the accuracy of the abundance estimates was satisfactory, supporting application of the method in forest inventories for management plans in the Amazon.
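
    The quadrants method is also known as the point-centered quarter method; a hedged sketch under the classic Cottam-Curtis assumption that stem density equals the inverse of the squared mean point-to-tree distance (distances and diameters below are invented):

        # Point-centered quarter ("quadrants") sketch: at each sample point the
        # nearest tree (dbh >= 40 cm) is measured in each of four quadrants.
        # Classic Cottam-Curtis estimator assumed: density = 1 / mean_distance^2.
        import numpy as np

        # distances[point, quadrant] in metres; dbh in cm (made-up values)
        distances = np.array([[4.2, 6.1, 3.8, 5.5],
                              [7.0, 4.9, 6.3, 5.1],
                              [3.9, 5.8, 4.4, 6.6]])
        dbh_cm = np.array([[52, 44, 61, 40],
                           [47, 55, 42, 70],
                           [65, 41, 49, 58]])

        mean_d = distances.mean()
        trees_per_m2 = 1.0 / mean_d ** 2
        trees_per_ha = trees_per_m2 * 10_000

        basal_area_m2 = np.pi * (dbh_cm / 200.0) ** 2        # per measured tree
        ba_per_ha = basal_area_m2.mean() * trees_per_ha      # scale by density
        print(trees_per_ha, ba_per_ha)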

  3. Bottom-up modeling approach for the quantitative estimation of parameters in pathogen-host interactions

    Directory of Open Access Journals (Sweden)

    Teresa eLehnert

    2015-06-01

    Opportunistic fungal pathogens can cause bloodstream infection and severe sepsis upon entering the blood stream of the host. The early immune response in human blood comprises the elimination of pathogens by antimicrobial peptides and innate immune cells, such as neutrophils or monocytes. Mathematical modeling is a predictive method to examine these complex processes and to quantify the dynamics of pathogen-host interactions. Since model parameters are often not directly accessible from experiment, their estimation is required by calibrating model predictions with experimental data. Depending on the complexity of the mathematical model, parameter estimation can be associated with excessively high computational costs in terms of run time and memory. We apply a strategy for reliable parameter estimation where different modeling approaches with increasing complexity are used that build on one another. This bottom-up modeling approach is applied to an experimental human whole-blood infection assay for Candida albicans. Aiming for the quantification of the relative impact of different routes of the immune response against this human-pathogenic fungus, we start from a non-spatial state-based model (SBM), because this level of model complexity allows estimating a priori unknown transition rates between various system states by the global optimization method simulated annealing. Building on the non-spatial SBM, an agent-based model (ABM) is implemented that incorporates the migration of interacting cells in three-dimensional space. The ABM takes advantage of estimated parameters from the non-spatial SBM, leading to a decreased dimensionality of the parameter space. This space can be scanned using a local optimization approach, i.e., least-squares error estimation based on an adaptive regular grid search, to predict cell migration parameters that are not accessible in experiment.

  4. Methods for the quantitative comparison of molecular estimates of clade age and the fossil record.

    Science.gov (United States)

    Clarke, Julia A; Boyd, Clint A

    2015-01-01

    Approaches quantifying the relative congruence, or incongruence, of molecular divergence estimates and the fossil record have been limited. Previously proposed methods are largely node specific, assessing incongruence at particular nodes for which both fossil data and molecular divergence estimates are available. These existing metrics, and other methods that quantify incongruence across topologies including entirely extinct clades, have so far not taken into account uncertainty surrounding both the divergence estimates and the ages of fossils. They have also treated molecular divergence estimates younger than previously assessed fossil minimum estimates of clade age as if they were the same as cases in which they were older. However, these cases are not the same. Recovered divergence dates younger than compared oldest known occurrences require prior hypotheses regarding the phylogenetic position of the compared fossil record and standard assumptions about the relative timing of morphological and molecular change to be incorrect. Older molecular dates, by contrast, are consistent with an incomplete fossil record and do not require prior assessments of the fossil record to be unreliable in some way. Here, we compare previous approaches and introduce two new descriptive metrics. Both metrics explicitly incorporate information on uncertainty by utilizing the 95% confidence intervals on estimated divergence dates and data on stratigraphic uncertainty concerning the age of the compared fossils. Metric scores are maximized when these ranges are overlapping. MDI (minimum divergence incongruence) discriminates between situations where molecular estimates are younger or older than known fossils reporting both absolute fit values and a number score for incompatible nodes. DIG range (divergence implied gap range) allows quantification of the minimum increase in implied missing fossil record induced by enforcing a given set of molecular-based estimates. These metrics are used

  5. Improved power-law estimates from multiple samples provided by millennium climate simulations

    Science.gov (United States)

    Henriksson, S. V.; Räisänen, P.; Silen, J.; Järvinen, H.; Laaksonen, A.

    2015-02-01

    Using the long annual mean temperature time series provided by millennium Earth System Model simulations and a method of discrete Fourier transform with varying starting point and length of time window together with averaging, we get good fits to power laws between two characteristic oscillatory timescales of the model climate: multidecadal (50-80 years) and El Nino (3-6 years) timescales. For global mean temperature, we fit β ≈ 0.35 in a relation S(f) ∼ f^−β in a simulation without external climate forcing and β over 0.7 in a simulation with external forcing included. The power law is found both with and without external forcing despite the forcings, e.g. the volcanic forcing, not showing similar behaviour, indicating a nonlinear temperature response to time-varying forcing. We also fit a power law with β ≈ 8 to the narrow frequency range between El Nino frequencies (up to 1/(3.2 years)) and the Nyquist frequency (1/(2 years)). Also, monthly mean temperature time series are considered and a decent power-law fit for frequencies above 1/year is obtained. Regional variability in best-fit β is explored, and the impact of choosing the frequency range on the result is illustrated. When all resolved frequencies are used, land areas seem to have lower βs than ocean areas on average, but when fits are restricted to frequencies below 1/(6 years), this difference disappears, while regional differences still remain. Results compare well with measurements both for global mean temperature and for the central England temperature record.
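
    The core computation can be sketched as a periodogram followed by a log-log regression over a chosen frequency band (synthetic series; the paper's varying-window averaging is omitted):

        # Sketch: estimate the spectral slope beta in S(f) ~ f^(-beta) from an
        # annual-mean series via a periodogram and log-log regression.
        import numpy as np

        rng = np.random.default_rng(2)
        x = np.cumsum(rng.normal(size=1000)) * 0.01   # synthetic red-ish series

        x = x - x.mean()
        spec = np.abs(np.fft.rfft(x)) ** 2
        freq = np.fft.rfftfreq(len(x), d=1.0)         # cycles per year

        # Restrict the fit to a frequency band, e.g. multidecadal-to-ENSO scales.
        band = (freq >= 1 / 80) & (freq <= 1 / 3)
        slope, intercept = np.polyfit(np.log(freq[band]), np.log(spec[band]), 1)
        beta = -slope
        print(beta)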

  6. Quantitative Estimation of Land Surface Characteristic Parameters and Actual Evapotranspiration in the Nagqu River Basin over the Tibetan Plateau

    Science.gov (United States)

    Zhong, L.; Ma, Y.; Ma, W.; Zou, M.; Hu, Y.

    2016-12-01

    Actual evapotranspiration (ETa) is an important component of the water cycle in the Tibetan Plateau. It is controlled by many hydrological and meteorological factors. Therefore, it is of great significance to estimate ETa accurately and continuously. Understanding land surface parameters and land-atmosphere water exchange processes in small, watershed-scale areas is also drawing much attention from the scientific community. Based on in-situ meteorological data in the Nagqu river basin and surrounding regions, the main meteorological factors affecting the evaporation process were quantitatively analyzed and point-scale ETa estimation models for the study area were successfully built. On the other hand, multi-source satellite data (such as SPOT, MODIS, FY-2C) were used to derive the surface characteristics in the river basin. A time series processing technique was applied to remove cloud cover and reconstruct the data series. Then improved land surface albedo, improved downward shortwave radiation flux, and reconstructed normalized difference vegetation index (NDVI) were coupled into the topographically enhanced surface energy balance system to estimate ETa. The model-estimated results were compared with ETa values determined by the combinatory method. The results indicated that the model-estimated ETa agreed well with in-situ measurements, with a correlation coefficient of 0.836, a mean bias error of 0.087 mm/h, and a root mean square error of 0.140 mm/h.

  7. Stereological estimates of nuclear volume and other quantitative variables in supratentorial brain tumors. Practical technique and use in prognostic evaluation

    DEFF Research Database (Denmark)

    Sørensen, Flemming Brandt; Braendgaard, H; Chistiansen, A O

    1991-01-01

    the practical technique. The continuous variables were correlated with the subjective, qualitative WHO classification of brain tumors, and the prognostic value of the parameters was assessed. Well differentiated astrocytomas (n = 14) had smaller estimates of the volume-weighted mean nuclear volume and mean......p approximately 0.05, respectively). Age above the median and short duration of symptoms were significantly associated with short survival (2p = 0.01). Further investigations of larger series of patients are needed to define the clinical usefulness of these objective, reproducible, and quantitative...

  8. Estimating marginal properties of quantitative real-time PCR data using nonlinear mixed models

    DEFF Research Database (Denmark)

    Gerhard, Daniel; Bremer, Melanie; Ritz, Christian

    2014-01-01

    A unified modeling framework based on a set of nonlinear mixed models is proposed for flexible modeling of gene expression in real-time PCR experiments. Focus is on estimating the marginal or population-based derived parameters: cycle thresholds and ΔΔCt, but retaining the conditional mixed mod...
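
    For orientation, the classical (non-mixed-model) ΔΔCt calculation that these marginal parameters generalize is a two-line computation; the 100% amplification efficiency assumed below is the textbook simplification:

        # Classical delta-delta-Ct sketch: relative expression of a target gene,
        # normalized to a reference gene and a control condition.
        def ddct(ct_target_trt, ct_ref_trt, ct_target_ctl, ct_ref_ctl):
            d_trt = ct_target_trt - ct_ref_trt
            d_ctl = ct_target_ctl - ct_ref_ctl
            ddct_value = d_trt - d_ctl
            return ddct_value, 2.0 ** -ddct_value   # fold change assumes 100% efficiency

        print(ddct(24.1, 18.0, 26.4, 18.2))         # (ddct, fold change)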

  9. Ultrasonic 3-D Vector Flow Method for Quantitative In Vivo Peak Velocity and Flow Rate Estimation

    DEFF Research Database (Denmark)

    Holbek, Simon; Ewertsen, Caroline; Bouzari, Hamed

    2017-01-01

    Current clinical ultrasound (US) systems are limited to show blood flow movement in either 1-D or 2-D. In this paper, a method for estimating 3-D vector velocities in a plane using the transverse oscillation method, a 32×32 element matrix array, and the experimental US scanner SARUS is presented....

  10. A subagging regression method for estimating the qualitative and quantitative state of groundwater

    Science.gov (United States)

    Jeong, Jina; Park, Eungyu; Han, Weon Shik; Kim, Kue-Young

    2017-08-01

    A subsample aggregating (subagging) regression (SBR) method for the analysis of groundwater data, addressing trend estimation and its associated uncertainty, is proposed. The SBR method is validated against synthetic data competitively with other conventional robust and non-robust methods. From the results, it is verified that the estimation accuracy of the SBR method is consistent and superior to that of the other methods, and the uncertainties are reasonably estimated, whereas the other methods have no uncertainty analysis option. For further validation, actual groundwater data are employed and analyzed comparatively with Gaussian process regression (GPR). For all cases, the trend and the associated uncertainties are reasonably estimated by both SBR and GPR, regardless of Gaussian or non-Gaussian skewed data. However, it is expected that GPR has a limitation in application to data severely corrupted by outliers, owing to its non-robustness. From the implementations, it is determined that the SBR method has the potential to be further developed as an effective tool for anomaly detection or outlier identification in groundwater state data such as the groundwater level and contaminant concentration.
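
    The subagging idea can be sketched with a simple linear-trend base learner: fit the trend on many random half-samples and read the estimate and its uncertainty off the ensemble (illustrative only; the paper's base regressor and subsample tuning are not reproduced):

        # Subsample-aggregating (subagging) sketch for trend estimation with
        # uncertainty, using a linear base learner on random half-samples.
        import numpy as np

        rng = np.random.default_rng(3)
        t = np.linspace(0, 10, 120)
        level = 12.0 - 0.15 * t + rng.normal(0, 0.4, t.size)   # synthetic groundwater level

        n_boot, frac = 500, 0.5
        slopes = np.empty(n_boot)
        for k in range(n_boot):
            idx = rng.choice(t.size, size=int(frac * t.size), replace=False)
            slopes[k] = np.polyfit(t[idx], level[idx], 1)[0]   # base learner: line fit

        trend = slopes.mean()
        lo, hi = np.percentile(slopes, [2.5, 97.5])            # ensemble uncertainty
        print(trend, (lo, hi))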

  11. Line Transect and Triangulation Surveys Provide Reliable Estimates of the Density of Kloss' Gibbons (Hylobates klossii) on Siberut Island, Indonesia.

    Science.gov (United States)

    Höing, Andrea; Quinten, Marcel C; Indrawati, Yohana Maria; Cheyne, Susan M; Waltert, Matthias

    2013-02-01

    Estimating population densities of key species is crucial for many conservation programs. Density estimates provide baseline data and enable monitoring of population size. Several different survey methods are available, and the choice of method depends on the species and study aims. Few studies have compared the accuracy and efficiency of different survey methods for large mammals, particularly for primates. Here we compare estimates of density and abundance of Kloss' gibbons (Hylobates klossii) using two of the most common survey methods: line transect distance sampling and triangulation. Line transect surveys (survey effort: 155.5 km) produced a total of 101 auditory and visual encounters and a density estimate of 5.5 gibbon clusters (groups or subgroups of primate social units)/km². Triangulation conducted from 12 listening posts during the same period revealed a similar density estimate of 5.0 clusters/km². Coefficients of variation of cluster density estimates were slightly higher from triangulation (0.24) than from line transects (0.17), resulting in somewhat less precision in detecting changes in cluster densities with triangulation; depending on survey aims, however, the triangulation method also may be appropriate.

  12. Skill Assessment of An Hybrid Technique To Estimate Quantitative Precipitation Forecast For Galicia (nw Spain)

    Science.gov (United States)

    Lage, A.; Taboada, J. J.

    Precipitation is the most obvious of the weather elements in its effects on normal life. Numerical weather prediction (NWP) is generally used to produce quantitative precipitation forecasts (QPF) beyond the 1-3 h time frame. These models often fail to predict small-scale variations of rain because of spin-up problems and their coarse spatial and temporal resolution (Antolik, 2000). Moreover, there are some uncertainties about the behaviour of the NWP models in extreme situations (de Bruijn and Brandsma, 2000). Hybrid techniques, combining the benefits of NWP and statistical approaches in a flexible way, are very useful for achieving a good QPF. In this work, a new QPF technique for Galicia (NW of Spain) is presented. This region has a percentage of rainy days per year greater than 50%, with quantities that may cause floods, with human and economic damage. The technique is composed of a NWP model (ARPS) and a statistical downscaling process based on an automated classification scheme of atmospheric circulation patterns for the Iberian Peninsula (Ribalaygua and Boren, 1995). Results show that QPF for Galicia is improved using this hybrid technique. [1] Antolik, M.S. 2000. "An Overview of the National Weather Service's centralized statistical quantitative precipitation forecasts". Journal of Hydrology, 239, pp. 306-337. [2] de Bruijn, E.I.F. and Brandsma, T. 2000. "Rainfall prediction for a flooding event in Ireland caused by the remnants of Hurricane Charley". Journal of Hydrology, 239, pp. 148-161. [3] Ribalaygua, J. and Boren, R. 1995. "Clasificación de patrones espaciales de precipitación diaria sobre la España Peninsular". Informes N 3 y 4 del Servicio de Análisis e Investigación del Clima. Instituto Nacional de Meteorología. Madrid. 53 pp.

  13. HER-2 and INT-2 amplification estimated by quantitative PCR in paraffin-embedded ovarian cancer tissue samples.

    Science.gov (United States)

    Hruza, C; Dobianer, K; Beck, A; Czerwenka, K; Hanak, H; Klein, M; Leodolter, S; Medl, M; Müllauer-Ertl, S; Preiser, J

    1993-01-01

    Competitive polymerase chain reaction (PCR) systems were developed for rapid and quantitative estimation of HER-2 (c-erbB-2) and INT-2 oncogene amplification in paraffin-embedded ovarian cancer tissue samples. The beta-globin gene was used as the reference, and DNA from paraffin-embedded placenta tissue served as a single-copy control. The reliability of the PCR method was demonstrated by comparing dot blot data with PCR data from identical tumour samples. The PCR method was used to determine HER-2 and INT-2 copy numbers in 196 ovarian cancer samples. HER-2 and INT-2 were found to be amplified in 40% and 19% of samples, respectively. In 8%, HER-2 copy numbers were greater than five, but no high INT-2 copy numbers were noted. Kaplan-Meier estimates did not reveal a significant association with overall survival. An indirect correlation between HER-2 and INT-2 amplification was observed. The present PCR system is a valuable method for prospective and retrospective studies.

  14. Quantitative estimation of parthenolide in Tanacetum parthenium (L.) Schultz-Bip. cultivated in Egypt.

    Science.gov (United States)

    El-Shamy, Ali M; El-Hawary, Seham S; Rateb, Mostafa E M

    2007-01-01

    Parthenolide, a germacranolide-type sesquiterpene lactone, was estimated in Tanacetum parthenium (L.) cultivated in Egypt by using colorimetric, planar chromatographic, and high-performance liquid chromatographic (HPLC) methods. Parthenolide levels in the open-field herb and aseptically germinated shoots were also compared by using the HPLC method. Parthenolide was produced and estimated for the first time in the callus culture of the plant. In addition, 2 Egyptian market preparations were analyzed for their parthenolide content by using the HPLC method. The relative standard deviations were 0.093, 0.095, and 0.098% (n = 5, 5, and 7, respectively), and the corresponding recoveries were 98.2, 98.9, and 99.4% for the colorimetric, planar chromatographic, and HPLC determinations, respectively.

  15. A quantitative method to estimate high gloss polished tool steel surfaces

    Energy Technology Data Exchange (ETDEWEB)

    Rebeggiani, S; Rosen, B-G [Halmstad University, The Functional Surfaces Research Group, Box 823, SE-301 18 HALMSTAD (Sweden); Sandberg, A, E-mail: sabina.rebeggiani@hh.se [Uddeholms AB, SE-683 85 Hagfors (Sweden)

    2011-08-19

    Visual estimation is today the most common way to assess the surface quality of moulds and dies; a method that is both subjective and, with today's high demands on surfaces, hardly usable for distinguishing between the finest surface qualities. Instead, a method based on non-contact 3D surface texture analysis is suggested. Several types of tool steel samples, manually as well as machine polished, were analysed to study different types of surface defects such as pitting, orange peel, and outward features. The classification of the defect structures serves as a catalogue where known defects are described. Suggestions for different levels of 'high surface quality', defined in numerical values adapted to high-gloss polished tool steel surfaces, are presented. The final goal is to develop a new manual that can work as a 'standard' for the estimation of tool steel surfaces for steel producers, mould makers, polishers, etc.

  16. Quantitative estimation of itopride hydrochloride and rabeprazole sodium from capsule formulation.

    Science.gov (United States)

    Pillai, S; Singhvi, I

    2008-09-01

    Two simple, accurate, economical, and reproducible UV spectrophotometric methods and one HPLC method for simultaneous estimation of the two-component drug mixture of itopride hydrochloride and rabeprazole sodium from a combined capsule dosage form have been developed. The first method involves the formation and solving of simultaneous equations using 265.2 nm and 290.8 nm as the two wavelengths. The second method is based on two-wavelength calculation; the wavelengths selected for estimation of itopride hydrochloride were 278.0 nm and 298.8 nm, and for rabeprazole sodium, 253.6 nm and 275.2 nm. The developed HPLC method is a reversed-phase chromatographic method using a Phenomenex C18 column and acetonitrile:phosphate buffer (35:65 v/v), pH 7.0, as the mobile phase. All developed methods obey Beer's law in the concentration ranges employed for the respective methods. Results of analysis were validated statistically and by recovery studies.
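
    The simultaneous-equation (Vierordt) method reduces to solving a 2×2 linear system in the two concentrations; a sketch with hypothetical absorptivity values standing in for the published calibration constants:

        # Vierordt simultaneous-equation sketch at 265.2 nm and 290.8 nm:
        # A(lam) = a_ito(lam) * C_ito + a_rab(lam) * C_rab. The absorptivities
        # below are hypothetical placeholders, not the published values.
        import numpy as np

        E = np.array([[0.072, 0.031],     # a(265.2): itopride, rabeprazole
                      [0.018, 0.064]])    # a(290.8): itopride, rabeprazole
        A = np.array([0.815, 0.642])      # measured absorbances of the mixture

        C = np.linalg.solve(E, A)         # concentrations in, e.g., ug/mL
        print(C)                          # [C_itopride, C_rabeprazole]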

  17. Contemporary group estimates adjusted for climatic effects provide a finer definition of the unknown environmental challenges experienced by growing pigs.

    Science.gov (United States)

    Guy, S Z Y; Li, L; Thomson, P C; Hermesch, S

    2017-12-01

    Environmental descriptors derived from mean performances of contemporary groups (CGs) are assumed to capture any known and unknown environmental challenges. The objective of this paper was to obtain a finer definition of the unknown challenges, by adjusting CG estimates for the known climatic effects of monthly maximum air temperature (MaxT), minimum air temperature (MinT) and monthly rainfall (Rain). As the unknown component could include infection challenges, these refined descriptors may help to better model varying responses of sire progeny to environmental infection challenges for the definition of disease resilience. Data were recorded from 1999 to 2013 at a piggery in south-east Queensland, Australia (n = 31,230). Firstly, CG estimates of average daily gain (ADG) and backfat (BF) were adjusted for MaxT, MinT and Rain, which were fitted as splines. In the models used to derive CG estimates for ADG, MaxT and MinT were significant variables. The models that contained these significant climatic variables had CG estimates with a lower variance compared to models without significant climatic variables. Variance component estimates were similar across all models, suggesting that these significant climatic variables accounted for some known environmental variation captured in CG estimates. No climatic variables were significant in the models used to derive the CG estimates for BF. These CG estimates were used to categorize environments. There was no observable sire by environment interaction (Sire×E) for ADG when using the environmental descriptors based on CG estimates of BF. For the environmental descriptors based on CG estimates of ADG, there was significant Sire×E only when MinT was included in the model (p = .01). Therefore, this new definition of the environment, preadjusted by MinT, increased the ability to detect Sire×E. While the unknown challenges captured in refined CG estimates need verification for infection challenges, this may provide a

  18. Methane emission estimation from landfills in Korea (1978-2004): quantitative assessment of a new approach.

    Science.gov (United States)

    Kim, Hyun-Sun; Yi, Seung-Muk

    2009-01-01

    Quantifying methane emission from landfills is important to evaluating measures for reduction of greenhouse gas (GHG) emissions. To quantify GHG emissions and identify sensitive parameters for their measurement, a new assessment approach consisting of six different scenarios was developed using Tier 1 (mass balance method) and Tier 2 (the first-order decay method) methodologies for GHG estimation from landfills, suggested by the Intergovernmental Panel on Climate Change (IPCC). Methane emissions using Tier 1 correspond to trends in disposed waste amount, whereas emissions from Tier 2 gradually increase as disposed waste decomposes over time. The results indicate that the amount of disposed waste and the decay rate for anaerobic decomposition were decisive parameters for emission estimation using Tier 1 and Tier 2. As for the different scenarios, methane emissions were highest under Scope 1 (scenarios I and II), in which all landfills in Korea were regarded as one landfill. Methane emissions under scenarios III, IV, and V, which separated the dissimilated fraction of degradable organic carbon (DOC(F)) by waste type and/or revised the methane correction factor (MCF) by waste layer, were underestimated compared with scenarios II and III. This indicates that the methodology of scenario I, which has been used in most previous studies, may lead to an overestimation of methane emissions. Additionally, separate DOC(F) and revised MCF were shown to be important parameters for methane emission estimation from landfills, and revised MCF by waste layer played an important role in emission variations. Therefore, more precise information on each landfill and careful determination of parameter values and characteristics of disposed waste in Korea should be used to accurately estimate methane emissions from landfills.
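
    The Tier 1 mass-balance calculation follows the familiar IPCC default equation; a sketch with placeholder parameter values (country-specific and IPCC default values should be substituted):

        # IPCC Tier 1 (mass balance) methane sketch; the parameter values below
        # are placeholders, not the study's or the IPCC default values.
        def tier1_ch4(msw_t, msw_f=0.8, mcf=0.6, doc=0.15, doc_f=0.5,
                      f=0.5, recovered=0.0, ox=0.0):
            """CH4 emission (Gg) from waste disposed in one year (Gg).

            msw_t: total waste generated; msw_f: fraction sent to landfill;
            mcf: methane correction factor; doc: degradable organic carbon;
            doc_f: dissimilated fraction of DOC; f: CH4 fraction of landfill gas.
            """
            ch4 = msw_t * msw_f * mcf * doc * doc_f * f * (16.0 / 12.0)
            return (ch4 - recovered) * (1.0 - ox)

        print(tier1_ch4(1800.0))   # Gg CH4 for 1800 Gg of waste generated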

  19. Quantitative estimates of changes in marine and terrestrial primary productivity over the past 300 million years

    OpenAIRE

    Beerling, D. J.

    1999-01-01

    Changes in marine primary production over geological time have influenced a network of global biogeochemical cycles with corresponding feedbacks on climate. However, these changes continue to remain largely unquantified because of uncertainties in calculating global estimates from sedimentary palaeoproductivity indicators. I therefore describe a new approach to the problem using a mass balance analysis of the stable isotopes (18O/16O) of oxygen with modelled O2 fluxes and isotopic exchanges b...

  20. Disaster metrics: quantitative estimation of the number of ambulances required in trauma-related multiple casualty events.

    Science.gov (United States)

    Bayram, Jamil D; Zuabi, Shawki; El Sayed, Mazen J

    2012-10-01

    Estimating the number of ambulances needed in trauma-related Multiple Casualty Events (MCEs) is a challenging task. Emergency medical services (EMS) regions in the United States have varying "best practices" for the required number of ambulances in an MCE, none of which is based on metric criteria. The objective of this study was to estimate the number of ambulances required to respond to the scene of a trauma-related MCE in order to initiate treatment and complete the transport of critical (T1) and moderate (T2) patients. The proposed model takes into consideration the different transport times and capacities of receiving hospitals, the time interval from injury occurrence, the number of patients per ambulance, and the pre-designated time frame allowed from injury until the transfer of care of T1 and T2 patients. The main theoretical framework for this model was based on prehospital time intervals described in the literature and used by EMS systems to evaluate operational and patient care issues. The North Atlantic Treaty Organization (NATO) triage categories (T1-T4) were used for simplicity. The minimum number of ambulances required to respond to the scene of an MCE was modeled as being primarily dependent on the number of critical patients (T1) present at the scene at any particular time. A robust quantitative model was also proposed to dynamically estimate the number of ambulances needed at any time during an MCE to treat, transport, and transfer the care of T1 and T2 patients. A new quantitative model for estimation of the number of ambulances needed during the prehospital response in trauma-related multiple casualty events has been proposed. Prospective studies of this model are needed to examine its validity and applicability.
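
    The paper's model is only described qualitatively above; as an illustrative sketch consistent with that description (not the authors' exact formulation), the fleet size can be bounded by how many round trips each ambulance completes within the pre-designated time frame:

        # Illustrative sketch only: minimum ambulances so all T1 and T2 patients
        # are transported within their respective time windows.
        import math

        def min_ambulances(n_patients, window_min, round_trip_min, per_ambulance=1):
            """Each ambulance completes floor(window/round_trip) trips in the window."""
            trips_each = max(1, window_min // round_trip_min)
            return math.ceil(n_patients / (trips_each * per_ambulance))

        n_t1, n_t2 = 12, 30    # hypothetical triage counts at the scene
        amb_t1 = min_ambulances(n_t1, window_min=60, round_trip_min=40)
        amb_t2 = min_ambulances(n_t2, window_min=240, round_trip_min=40, per_ambulance=2)
        print(amb_t1 + amb_t2)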

  1. Quantitative phosphoproteomics of murine Fmr1-KO cell lines provides new insights into FMRP-dependent signal transduction mechanisms.

    Science.gov (United States)

    Matic, Katarina; Eninger, Timo; Bardoni, Barbara; Davidovic, Laetitia; Macek, Boris

    2014-10-03

    Fragile X mental retardation protein (FMRP) is an RNA-binding protein that has a major effect on neuronal protein synthesis. Transcriptional silencing of the FMR1 gene leads to loss of FMRP and development of Fragile X syndrome (FXS), the most common known hereditary cause of intellectual impairment and autism. Here we utilize SILAC-based quantitative phosphoproteomics to analyze murine FMR1(-) and FMR1(+) fibroblastic cell lines derived from FMR1-KO embryos to identify proteins and phosphorylation sites dysregulated as a consequence of FMRP loss. We quantify FMRP-related changes in the levels of 5,023 proteins and 6,133 phosphorylation events and map them onto major signal transduction pathways. Our study confirms global downregulation of the MAPK/ERK pathway and decrease in phosphorylation level of ERK1/2 in the absence of FMRP, which is connected to attenuation of long-term potentiation. We detect differential expression of several key proteins from the p53 pathway, pointing to the involvement of p53 signaling in dysregulated cell cycle control in FXS. Finally, we detect differential expression and phosphorylation of proteins involved in pre-mRNA processing and nuclear transport, as well as Wnt and calcium signaling, such as PLC, PKC, NFAT, and cPLA2. We postulate that calcium homeostasis is likely affected in molecular pathogenesis of FXS.

  2. Quantitative analysis of oyster larval proteome provides new insights into the effects of multiple climate change stressors

    KAUST Repository

    Dineshram, Ramadoss

    2016-03-19

    The metamorphosis of planktonic larvae of the Pacific oyster (Crassostrea gigas) underpins their complex life-history strategy by switching on the molecular machinery required for sessile life and building calcite shells. Metamorphosis becomes a survival bottleneck, which will be pressured by different anthropogenically induced climate change-related variables. Therefore, it is important to understand how metamorphosing larvae interact with emerging climate change stressors. To predict how larvae might be affected in a future ocean, we examined changes in the proteome of metamorphosing larvae under multiple stressors: decreased pH (pH 7.4), increased temperature (30 °C), and reduced salinity (15 psu). Quantitative protein expression profiling using iTRAQ-LC-MS/MS identified more than 1300 proteins. Decreased pH had a negative effect on metamorphosis by down-regulating several proteins involved in energy production, metabolism, and protein synthesis. However, warming switched on these down-regulated pathways at pH 7.4. Under multiple stressors, cell signaling, energy production, growth, and developmental pathways were up-regulated, although metamorphosis was still reduced. Despite the lack of lethal effects, significant physiological responses to both individual and interacting climate change related stressors were observed at proteome level. The metamorphosing larvae of the C. gigas population in the Yellow Sea appear to have adequate phenotypic plasticity at the proteome level to survive in future coastal oceans, but with developmental and physiological costs. © 2016 John Wiley & Sons Ltd.

  3. Extra petals in the buttercup (Ranunculus repens) provide a quick method to estimate the age of meadows.

    Science.gov (United States)

    Warren, John

    2009-09-01

    There is a widely used crude method to estimate the age of hedgerows (Hooper's rule) based on species richness. The aim of this study was to try to establish a similar field method for estimating the age of grasslands based on the accumulation of macro-somatic mutations. A countrywide survey was carried out by the British public to investigate the relationship between grassland age and the number of Ranunculus repens (creeping buttercup) plants with extra petals. In addition, the relationship between grassland age and R. repens pollen viability was also investigated. Each plant with extra-petalled flowers in a sample of 100 was found to equate to approximately 7 years. A higher significant correlation was observed between pollen viability and population age; however, this is not amenable to providing field estimates. The age of British grasslands can be easily and reliably estimated in the field by counting the number of flowers with additional petals in R. repens in meadows up to 200 years old. An attempt to estimate the heritability of extra petals suggests that the phenotype results from the slow accumulation of somatic mutations in a species that primarily reproduces vegetatively.

  4. Estimation of the Accuracy of Method for Quantitative Determination of Volatile Compounds in Alcohol Products

    CERN Document Server

    Charepitsa, S V; Zadreyko, Y V; Sytova, S N

    2016-01-01

    Results of the estimation of the precision of the determination of volatile compounds in alcohol-containing products by gas chromatography (acetaldehyde, methyl acetate, ethyl acetate, methanol, isopropyl alcohol, propyl alcohol, isobutyl alcohol, butyl alcohol, and isoamyl alcohol) are presented. To determine the accuracy, measurements were planned in accordance with ISO 5725 and performed on a Crystal-5000 gas chromatograph. The standard deviations of repeatability and intermediate precision, and their limits, were derived from the experimental data. The measurement uncertainty was calculated on the basis of an "empirical" method. The obtained accuracy values indicate that the developed method yields an expanded measurement uncertainty of 2 to 20%, depending on the analyzed compound and the measured concentration.
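
    As a rough illustration of the precision measures named above, the Python sketch below computes a repeatability SD, an intermediate-precision SD, and an expanded uncertainty with coverage factor k = 2 from replicate measurements grouped by day. The data values and the simple pooling (in place of a full ISO 5725 ANOVA) are assumptions of the sketch, not the record's procedure.

        import numpy as np

        # Illustrative replicate GC measurements (mg/L) of one volatile
        # compound, grouped by day; values are made up for this sketch.
        runs_by_day = [np.array([12.1, 12.3, 11.9]),
                       np.array([12.6, 12.4, 12.5]),
                       np.array([11.8, 12.0, 12.2])]

        # Repeatability: pooled within-day standard deviation.
        within_vars = [r.var(ddof=1) for r in runs_by_day]
        s_r = np.sqrt(np.mean(within_vars))

        # Intermediate precision: between-day spread combined with repeatability.
        day_means = np.array([r.mean() for r in runs_by_day])
        s_I = np.sqrt(s_r**2 + day_means.std(ddof=1)**2)

        # Expanded relative uncertainty with coverage factor k = 2 (~95%).
        grand_mean = np.concatenate(runs_by_day).mean()
        U_rel = 100.0 * 2.0 * s_I / grand_mean
        print(f"s_r={s_r:.3f}, s_I={s_I:.3f}, expanded uncertainty ~{U_rel:.1f}%")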

  5. Quantitative estimation of AgNORs in normal, dysplastic and malignant oral mucosa.

    Science.gov (United States)

    Chowdhry, Aman; Deshmukh, Revati Shailesh; Shukla, Deepika; Bablani, Deepika; Mishra, Shashwat

    2014-06-01

    Silver-stainable nucleolar organizer regions (AgNORs) have received a great deal of attention recently because their frequency within the nuclei is significantly higher in malignant cells than in normal, reactive or benign neoplastic cells. The objective of this study was to carry out a quantitative assessment of large and small AgNORs in normal oral mucosa, precancerous lesions and infiltrating squamous cell carcinomas. The study comprised 110 formalin-fixed, paraffin-embedded oral mucosal biopsies: 30 oral dysplasias, 60 oral squamous cell carcinomas and 20 normal oral mucosa specimens. AgNORs were counted in each nucleus and categorized as small or large, and the mean numbers of small, large and total AgNORs per cell were calculated. The mean values of small, large and total AgNORs increased progressively from normal mucosa to dysplastic lesions to squamous cell carcinomas. The study clearly indicates that in oral squamous cell carcinomas, AgNORs diminish in size as they increase in number. Further, AgNOR counts increase as the degree of malignant potential of the cell increases. By combining the enumeration of AgNORs with their size, a good distinction can be made between normal mucosa, dysplastic lesions and infiltrating squamous cell carcinomas. This could help in the early diagnosis and prognosis of dysplastic mucosal lesions and their malignant transformation.

  6. Tree Root System Characterization and Volume Estimation by Terrestrial Laser Scanning and Quantitative Structure Modeling

    Directory of Open Access Journals (Sweden)

    Aaron Smith

    2014-12-01

    The accurate characterization of three-dimensional (3D) root architecture, volume, and biomass is important for a wide variety of applications in forest ecology and for better understanding tree and soil stability. Technological advancements have led to increasingly digitized and automated procedures, which have been used to describe the 3D structure of root systems more accurately and quickly. Terrestrial laser scanners (TLS) have successfully been used to describe aboveground structures of individual trees and stand structure, but have only recently been applied to the 3D characterization of whole root systems. In this study, 13 recently harvested Norway spruce root systems were mechanically pulled from the soil, cleaned, and their volumes were measured by displacement. The root systems were suspended, scanned with TLS from three different angles, and the root surfaces from the co-registered point clouds were modeled with a 3D quantitative structure model to determine root architecture and volume. The modeling procedure facilitated the rapid derivation of root volume, diameters, break point diameters, linear root length, cumulative percentages, and root fraction counts. The modeled root systems underestimated root system volume by 4.4%. The modeling procedure is widely applicable and easily adapted to derive other important topological and volumetric root variables.
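
    Since a quantitative structure model represents each root segment as a fitted cylinder, total root volume reduces to a sum of cylinder volumes that can be checked against the displacement measurement. A minimal Python sketch, with made-up segment dimensions chosen only to mirror the kind of underestimation the study reports:

        import math

        # Each fitted QSM segment approximated as a cylinder: (radius_m, length_m).
        # Values are illustrative, not taken from the study.
        segments = [(0.05, 0.40), (0.03, 0.60), (0.02, 0.85), (0.015, 1.10)]

        # Total modeled volume is the sum of cylinder volumes pi * r^2 * l.
        qsm_volume = sum(math.pi * r**2 * l for r, l in segments)

        displacement_volume = 0.0070  # m^3, measured by water displacement
        bias_pct = 100 * (qsm_volume - displacement_volume) / displacement_volume
        print(f"QSM volume: {qsm_volume*1e3:.2f} L, "
              f"bias vs displacement: {bias_pct:+.1f}%")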

  7. Quantitative modelling to estimate the transfer of pharmaceuticals through the food production system.

    Science.gov (United States)

    Chiţescu, Carmen Lidia; Nicolau, Anca Ioana; Römkens, Paul; Van Der Fels-Klerx, H J

    2014-01-01

    Use of pharmaceuticals in animal production may create an indirect route of contamination of food products of animal origin. This study aimed to assess, through mathematical modelling, the transfer of pharmaceuticals from contaminated soil, through plant uptake, into the dairy food production chain. The scenarios, model parameters, and values cover contaminant emission in slurry production, storage time, immission into soil, plant uptake, bioaccumulation in the animal's body, and transfer to meat and milk. Modelling results confirm the possibility of contamination of dairy cows' meat and milk due to the ingestion of contaminated feed by the cattle. The estimated concentrations of pharmaceutical residues in meat ranged from 0 to 6 ng kg(-1) for oxytetracycline, from 0.011 to 0.181 μg kg(-1) for sulfamethoxazole, and from 4.70 to 11.86 μg kg(-1) for ketoconazole. The estimated concentrations in milk were zero for oxytetracycline, lower than 40 ng L(-1) for sulfamethoxazole, and from 0.98 to 2.48 μg L(-1) for ketoconazole. The results obtained for the three selected pharmaceuticals indicate a minor risk for human health. This study showed that supply chain modelling can be an effective tool for assessing the indirect contamination of feedstuff and animal products by residues of pharmaceuticals. The model can easily be adjusted to other contaminants and supply chains and, in this way, presents a valuable tool to underpin decision making.
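
    The transfer chain described (soil to feed crop to cow to milk) can be sketched as a sequence of multiplications by uptake and transfer factors. The Python below is a deliberately simplified steady-state version of such a chain; every parameter value is hypothetical and not taken from the study.

        # Hypothetical steady-state transfer chain for one pharmaceutical.
        c_soil = 5.0        # ug per kg soil, after slurry application
        bcf_plant = 0.2     # plant uptake factor: (ug/kg feed) per (ug/kg soil)
        feed_intake = 20.0  # kg feed dry matter ingested per cow per day
        tf_milk = 2e-4      # milk transfer factor, day/L (fraction of the
                            # daily dose appearing in each litre of milk)

        c_feed = bcf_plant * c_soil        # ug/kg in the feed crop
        daily_dose = c_feed * feed_intake  # ug/day ingested by the cow
        c_milk = tf_milk * daily_dose      # ug/L in milk at steady state
        print(f"feed: {c_feed:.2f} ug/kg, dose: {daily_dose:.1f} ug/day, "
              f"milk: {c_milk*1000:.1f} ng/L")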

  8. Methods for estimating epidemiological effects of quantitative resistance to plant diseases.

    Science.gov (United States)

    Leonard, K J; Mundt, C C

    1984-01-01

    A model developed by R.C. Lewontin relating the rate of population increase to key parameters of an organism's fecundity curve is described and adapted for use with plant pathogenic fungi. For diseases such as cereal rusts, rice blast, and powdery mildew and downy mildew of cucumber, the sporulation curves of the pathogens have been shown to follow an approximately triangular pattern. In the Lewontin model the key features of the pattern are: A, the time from inoculation to first sporulation (i.e. the latent period); T, the time of peak daily spore production; W, the time at which sporulation ceases; and S, the area of the triangle (total reproduction per generation). For exponential increase, the values of A, T, W, and S are related to r1, the rate of population increase, according to the following equation: [Formula: see text] This equation was used to generate families of curves showing the effects on r1 of changes in the position of the triangle (altering latent period) or its area (altering reproduction per generation). Data for barley leaf rust, oat crown rust, wheat leaf rust, wheat stem rust, rice blast, cucumber downy mildew, and cucumber powdery mildew were analyzed according to the model to show the effects of different components of resistance on r1 for each disease. Predictions from the model for barley leaf rust were compared with published data on components of resistance and rates of disease increase for eight barley cultivars. For cultivars of similar crop canopy type (two cultivars with sparse and six with dense canopies), the predicted r1 values corresponded closely to observed values. Applications of the model to cultivar mixtures and to integrated control (involving protectant fungicides in combination with quantitative resistance) are also discussed.
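
    The equation behind the placeholder above can be written out, following Lewontin's published formulation, as an Euler-Lotka relation in which the reproductive function m(x) is the triangle with onset A, peak T, end W and area S. This is a hedged reconstruction of the standard form, not the record's exact typesetting:

        1 = \int_{A}^{W} e^{-r_1 x}\, m(x)\, dx,
        \qquad
        m(x) =
        \begin{cases}
          \dfrac{2S\,(x - A)}{(W - A)(T - A)}, & A \le x \le T, \\[1ex]
          \dfrac{2S\,(W - x)}{(W - A)(W - T)}, & T \le x \le W,
        \end{cases}

    with r1 obtained by solving the integral equation numerically for each combination of A, T, W and S.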

  9. Quantitative estimation of lithofacies from seismic data in a tertiary turbidite system in the North Sea

    Energy Technology Data Exchange (ETDEWEB)

    Joerstad, A.K.; Avseth, P.Aa; Mukerji, T.; Mavko, G.; Granli, J.R.

    1998-12-31

    Deep-water clastic systems and associated turbidite reservoirs are often characterized by very complex sand distributions, and reservoir description based on conventional seismic and well-log stratigraphic analysis may be very uncertain in these depositional environments. It has been shown that reservoirs in turbidite systems are produced very inefficiently under conventional development: more than 70% of the mobile oil is commonly left behind because of the heterogeneous nature of these reservoirs. This study examines a turbidite system in the North Sea, with five available wells and 3-D seismic near- and far-offset stacks, to establish most-likely estimates of facies and pore fluid within the cube. 5 figs.

  10. Phytochemical studies for quantitative estimation of iridoid glycosides in Picrorhiza kurroa Royle.

    Science.gov (United States)

    Sultan, Phalisteen; Jan, Arif; Pervaiz, Qazi

    2016-12-01

    Picrorhiza kurroa Royle, commonly known as 'Kutki' or 'Kutaki', is an important medicinal plant in the Ayurvedic system of medicine and has traditionally been used to treat disorders of the liver and upper respiratory tract. The plant is the principal source of the iridoid glycosides picroside-I, picroside-II and kutkoside, used in various herbal drug formulations mainly as strong hepatoprotective and immunomodulatory compounds. The species has become endangered, nearly to extinction, due to unregulated collection from the wild, slow plant growth and ecological destruction of its natural habitats. There is a severe shortage of plant material, while the market demand is ever increasing. Hence, it is very important to apply a simple and precise analytical method to determine and validate the concentration of the major bioactive constituents in different populations of this plant species, with a view to developing a high-yielding chemotype for large-scale production and commercial exploitation on scientific lines. This study assessed and validated a fast and reliable chromatographic method for the determination of picroside-I and picroside-II in different populations of this prioritized medicinal plant species. Separation and resolution of the picrosides were carried out on a reversed-phase (C-18) column using a mobile phase of methanol and water (40:60 v/v), with detection at 270 nm. The average levels of the two major marker compounds in the seven accessions showed significant quantitative variation (ANOVA, p < 0.05) in mean levels and in their accumulation in different parts of the plant, viz. roots, rhizomes and leaves. The highest content of picroside-I was found in the accession from Gurez (altitude 3750 masl), while the highest content of picroside-II was found in the accession from Keller, Shopian (altitude 3300 masl), demonstrating that picroside accumulation is directly correlated with altitude. The method was validated in terms of

  11. Systematic feasibility analysis of a quantitative elasticity estimation for breast anatomy using supine/prone patient postures.

    Science.gov (United States)

    Hasse, Katelyn; Neylon, John; Sheng, Ke; Santhanam, Anand P

    2016-03-01

    Breast elastography is a critical tool for improving the targeted radiotherapy treatment of breast tumors. Current breast radiotherapy imaging protocols involve only prone and supine CT scans. There is a lack of knowledge on the quantitative accuracy with which breast elasticity can be systematically measured using only prone and supine CT datasets. The purpose of this paper is to describe a quantitative elasticity estimation technique for breast anatomy using only these supine/prone patient postures. Using biomechanical, high-resolution breast geometry obtained from CT scans, a systematic assessment was performed in order to determine the feasibility of this methodology for clinically relevant elasticity distributions. A model-guided inverse analysis approach is presented in this paper. A graphics processing unit (GPU)-based linear elastic biomechanical model was employed as a forward model for the inverse analysis, with the breast geometry in a prone position. The elasticity estimation was performed using a gradient-based iterative optimization scheme and a fast simulated annealing (FSA) algorithm. Numerical studies were conducted to systematically analyze the feasibility of elasticity estimation. For simulating gravity-induced breast deformation, the breast geometry was anchored at its base, resembling the chest-wall/breast tissue interface. Ground-truth elasticity distributions were assigned to the model, representing tumor presence within breast tissue. Model geometry resolution was varied to estimate its influence on convergence of the system. A priori information was approximated and utilized to record its effect on the time and accuracy of convergence. The role of the FSA process was also recorded. A novel error metric that combined elasticity and displacement error was used to quantify the systematic feasibility study. For the authors' purposes, convergence was defined as each voxel of tissue being within 1 mm of the ground-truth deformation. The authors

  12. Provider report of the existence of detection and care of perinatal depression: quantitative evidence from public obstetric units in Mexico

    Directory of Open Access Journals (Sweden)

    Filipa de Castro

    2016-07-01

    Objective. To provide evidence on perinatal mental healthcare in Mexico. Materials and methods. Descriptive and bivariate analyses of data from a cross-sectional probabilistic survey of 211 public obstetric units. Results. Over half (64.0%) of units offer mental healthcare; fewer offer perinatal depression (PND) detection (37.1%) and care (40.3%). More units had protocols/guidelines for PND detection and for care, respectively, in Mexico City-Mexico State (76.7%; 78.1%) than in Southern (26.5%; 36.4%), Northern (27.3%; 28.1%) and Central Mexico (50.0%; 52.7%). Conclusion. Protocols and provider training in PND, implementation of brief screening tools, and psychosocial interventions delivered by non-clinical personnel are needed. DOI: http://dx.doi.org/10.21149/spm.v58i4.8028

  13. A quantitative framework to estimate the relative importance of environment, spatial variation and patch connectivity in driving community composition.

    Science.gov (United States)

    Monteiro, Viviane F; Paiva, Paulo C; Peres-Neto, Pedro R

    2017-03-01

    Perhaps the most widely used quantitative approach in metacommunity ecology is the estimation of the importance of local environment vs. spatial structuring using the variation partitioning framework. Contrary to metapopulation models, however, current empirical studies of metacommunity structure using variation partitioning assume a space-for-dispersal substitution due to the lack of analytical frameworks that incorporate patch connectivity predictors of dispersal dynamics. Here, a method is presented that allows estimating the relative importance of environment, spatial variation and patch connectivity in driving community composition variation within metacommunities. The proposed approach is illustrated by a study designed to understand the factors driving the structure of a soft-bottom marine polychaete metacommunity. Using a standard variation partitioning scheme (i.e. where only environmental and spatial predictors are used), only about 13% of the variation in metacommunity structure was explained. With the connectivity set of predictors, the total amount of explained variation increased to 51%. These results highlight the importance of considering predictors of patch connectivity rather than just spatial predictors. Given that information on connectivity can be estimated from commonly available data on species distributions for a number of taxa, the framework presented here can readily be applied to past studies as well, facilitating a more robust evaluation of the factors contributing to metacommunity structure. © 2016 The Authors. Journal of Animal Ecology © 2016 British Ecological Society.
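
    A schematic version of the proposed three-way partitioning can be written in a few lines: fit the community response against every combination of the environmental (E), spatial (S) and connectivity (C) predictor sets and compare the explained variation. The Python sketch below uses ordinary least-squares R2 on synthetic data purely for illustration; real applications use redundancy analysis with adjusted R2, and all variable names here are assumptions of the sketch.

        import numpy as np
        from itertools import combinations

        rng = np.random.default_rng(0)
        n = 60
        # Hypothetical predictor sets: environment (E), spatial eigenvectors (S),
        # patch connectivity (C); response y stands in for a community axis.
        E = rng.normal(size=(n, 2))
        S = rng.normal(size=(n, 3))
        C = rng.normal(size=(n, 2))
        y = E @ [0.5, -0.3] + C @ [0.8, 0.4] + rng.normal(size=n)

        def r2(X, y):
            # R2 of an OLS fit with intercept.
            X1 = np.column_stack([np.ones(len(y)), X])
            beta, *_ = np.linalg.lstsq(X1, y, rcond=None)
            resid = y - X1 @ beta
            return 1.0 - resid.var() / y.var()

        sets = {"E": E, "S": S, "C": C}
        for k in range(1, 4):
            for combo in combinations(sets, k):
                X = np.column_stack([sets[c] for c in combo])
                print("+".join(combo), f"R2 = {r2(X, y):.2f}")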

  14. Numerical model for a quantitative estimation of sliver formation in shearing process

    Science.gov (United States)

    Selvaraj, Ramya; Quagliato, Luca; Jang, Sewon; Kim, Naksoo

    2017-09-01

    In the present research work, a model for estimating the probability of sliver formation during the shearing process, based on the amount of damage in the elements adjacent to the shearing edge, is proposed. A full material characterization, considering different temperatures and strain rates, was carried out to determine the model constants for the Johnson-Cook (JC) flow stress and damage models. A 3D numerical simulation replicating the shearing process was implemented in ABAQUS/Explicit, and the resulting sheared surface was compared with laboratory experiments, proving the validity of the developed simulation. Finally, while varying holder force, clearance and punch velocity, the average damage in the elements adjacent to the shearing edge was calculated. The results show that punch velocity has the strongest influence on damage at the shearing edge, but also that holder force and clearance percentage cannot be neglected: too high a punch velocity, as well as a higher holder force, increases the damage state, whereas a higher clearance percentage reduces it. Based on the proposed correlation between damage in the burrs and the probability of sliver formation, the combination of process parameters that assures a reduced probability of sliver occurrence can be identified.
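
    For reference, the Johnson-Cook flow stress and damage models named above have the standard published forms below; the study's calibrated constants are not given in the record:

        \sigma = \left(A + B\,\varepsilon^{n}\right)
                 \left(1 + C\,\ln\dot{\varepsilon}^{*}\right)
                 \left(1 - T^{*\,m}\right),
        \qquad
        \varepsilon_{f} = \left(D_{1} + D_{2}\,e^{D_{3}\sigma^{*}}\right)
                          \left(1 + D_{4}\,\ln\dot{\varepsilon}^{*}\right)
                          \left(1 + D_{5}\,T^{*}\right),
        \qquad
        D = \sum_{i} \frac{\Delta\varepsilon_{i}}{\varepsilon_{f,i}}

    where ε̇* is the plastic strain rate normalized by a reference rate, T* the homologous temperature, and σ* the stress triaxiality; in the record's model, sliver risk grows with the damage D accumulated in the elements at the shearing edge.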

  15. Quantitative estimation of plum pox virus targets acquired and transmitted by a single Myzus persicae.

    Science.gov (United States)

    Moreno, Aranzazu; Fereres, Alberto; Cambra, Mariano

    2009-01-01

    The viral charge acquired and inoculated by single aphids in non-circulative transmission is estimated using plum pox virus (PPV). A combination of electrical penetration graph and TaqMan real-time RT-PCR techniques was used to establish the average number of PPV RNA targets inoculated by an aphid in a single probe (26,750), approximately half of the number acquired. This number of PPV targets produced a systemic infection rate of 20% in the inoculated receptor plants. No significant differences were found between the numbers of PPV RNA targets acquired after one and after five intracellular punctures (pd), but the frequency of infected receptor plants was higher after 5 pd. The percentage of PPV-positive leaf discs after an inoculation probe of just 1 pd (28%/4,603 targets) was lower than after 5 pd (45.8%/135 × 10^6 targets). The methodology employed could easily be extended to other virus-vector-host combinations to improve the accuracy of models used in virus epidemiology.

  16. A generalized estimating equations approach to quantitative trait locus detection of non-normal traits

    Directory of Open Access Journals (Sweden)

    Thomson Peter C

    2003-05-01

    To date, most statistical developments in QTL detection methodology have been directed at continuous traits with an underlying normal distribution. This paper presents a method for QTL analysis of non-normal traits using a generalized linear mixed model approach. Development of this method was motivated by a backcross experiment involving two inbred lines of mice, conducted in order to locate a QTL for litter size. A Poisson regression form is used to model litter size, with allowance made for under- as well as over-dispersion, as suggested by the experimental data. In addition to fixed parity effects, random animal effects are also included in the model. However, the method is not fully parametric, as the model is specified only in terms of means, variances and covariances, and not as a full probability model. Consequently, a generalized estimating equations (GEE) approach is used to fit the model. For statistical inference, permutation tests and bootstrap procedures are used. The method is illustrated with simulated as well as experimental mouse data. Overall, the method is found to be quite reliable and, with modification, can be used for QTL detection for a range of other non-normally distributed traits.
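
    The modelling strategy described (a Poisson-form model of litter size with parity effects, observations grouped by animal, and GEE fitting) can be sketched with statsmodels. The data below are simulated, and the 0.25 "QTL effect" is an arbitrary illustration, not the paper's estimate.

        import numpy as np
        import pandas as pd
        import statsmodels.api as sm

        rng = np.random.default_rng(1)
        n_animals, parities = 40, 3
        # Hypothetical backcross data: litter size per parity, a candidate
        # QTL genotype (0/1), with repeated measures grouped by animal.
        animal = np.repeat(np.arange(n_animals), parities)
        parity = np.tile(np.arange(parities), n_animals)
        qtl = np.repeat(rng.integers(0, 2, n_animals), parities)
        mu = np.exp(1.8 + 0.25 * qtl - 0.05 * parity)
        litter = rng.poisson(mu)

        # GEE with Poisson mean structure and exchangeable within-animal
        # correlation; inference in the paper uses permutation/bootstrap.
        X = sm.add_constant(pd.DataFrame({"qtl": qtl, "parity": parity}))
        model = sm.GEE(litter, X, groups=animal,
                       family=sm.families.Poisson(),
                       cov_struct=sm.cov_struct.Exchangeable())
        print(model.fit().summary())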

  17. Quantitative Estimation of Risks for Production Unit Based on OSHMS and Process Resilience

    Science.gov (United States)

    Nyambayar, D.; Koshijima, I.; Eguchi, H.

    2017-06-01

    Three principal elements of the production field in the chemical/petrochemical industry are (i) production units, (ii) production plant personnel and (iii) the production support system (computer systems introduced to improve productivity). Each principal element has production process resilience, i.e. a capability to restrain disruptive signals occurring inside and outside the production field, and for each element risk assessment is indispensable. In production facilities, occupational safety and health management systems (hereafter referred to as OSHMS) have been introduced to reduce the risk of accidents and troubles that may occur during production. In OSHMS, risk assessment is specified to reduce potential risks in production facilities such as factories, and PDCA activities are required for continual improvement of safe production environments. However, there is no clear statement of how to adopt the OSHMS standard in the production field. This study introduces a metric that estimates the resilience of the production field from the resilience generated by the production plant personnel and the results of risk assessment in the production field. A method for evaluating how systematically OSHMS functions are installed in the production field is also discussed, based on the resilience of the three principal elements.

  18. Quantitative estimation of UV light dose concomitant to irradiation with ionizing radiation

    Energy Technology Data Exchange (ETDEWEB)

    Petin, Vladislav G.; Morozov, Ivan I. [Biophysical Laboratory, Medical Radiological Research Center, 249036 Obninsk, Kaluga Region (Russian Federation); Kim, Jin Kyu, E-mail: jkkim@kaeri.re.k [Korea Atomic Energy Research Institute, Advanced Radiation Technology Institute, Jeongeup 580-185 (Korea, Republic of); Semkina, Maria A. [Biophysical Laboratory, Medical Radiological Research Center, 249036 Obninsk, Kaluga Region (Russian Federation)

    2011-01-15

    A simple mathematical model for the biological estimation of the UV light dose concomitant to ionizing radiation is suggested. This approach was applied to determine how the equivalent UV light dose accompanying 100 Gy of ionizing radiation depends on the energy of sparsely ionizing radiation and on the volume of the exposed cell suspension. It was revealed that the relative contribution of excitations to the total lethal effect, and the equivalent UV dose, increased greatly with increasing energy of the ionizing radiation and volume of the irradiated suspension. These observations agree with the supposition that Cerenkov emission is responsible for the production of UV light damage and for the phenomenon of photoreactivation observed after ionizing exposure of bacterial and yeast cells hypersensitive to UV light. A possible synergistic interaction of the damage produced by ionizations and excitations, as well as a probable participation of the UV component of ionizing radiation in the mechanisms of hormesis and adaptive response observed after ionizing radiation exposure, is discussed.

  19. Assessing the Benefits Provided by SWOT Data Towards Estimating Reservoir Residence Time in the Mekong River Basin

    Science.gov (United States)

    Bonnema, M.; Hossain, F.

    2016-12-01

    The Mekong River Basin is undergoing rapid hydropower development. Nine dams are planned on the main stem of the Mekong and many more on its extensive tributaries. Understanding the effects that current and future dams have on the river system and the water cycle as a whole is vital for the millions of people living in the basin. Reservoir residence time, the amount of time water spends stored in a reservoir, is a key parameter for investigating these impacts. The forthcoming Surface Water and Ocean Topography (SWOT) mission is poised to provide an unprecedented amount of surface water observations. SWOT, when augmented by current satellite missions, will provide the necessary information to estimate the residence time of reservoirs across the entire basin in a more comprehensive way than ever before. In this study, we first combine observations from current satellite missions (altimetry, spectral imaging, precipitation) to estimate the residence times of existing reservoirs. We then use this information to project how future reservoirs will increase the residence time of the river system. Next, we explore how SWOT observations can be used to improve residence time estimation by examining the accuracy of reservoir surface area and elevation observations as well as the accuracy of river discharge observations.
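
    At its simplest, reservoir residence time is storage volume divided by mean throughflow, which is what satellite-derived storage and discharge estimates feed into. A minimal Python sketch with hypothetical values:

        # Hypothetical reservoir: residence time = storage volume / mean outflow.
        storage_km3 = 7.5     # storage, e.g. from satellite area-elevation data
        outflow_m3s = 2400.0  # mean discharge below the dam, m^3/s

        seconds_per_day = 86400.0
        residence_days = storage_km3 * 1e9 / (outflow_m3s * seconds_per_day)
        print(f"Mean residence time: {residence_days:.0f} days")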

  20. Quantitative Estimation of Temperature Variations in Plantar Angiosomes: A Study Case for Diabetic Foot

    OpenAIRE

    Peregrina-Barreto, H.; Morales-Hernandez, L. A.; Rangel-Magdaleno, J. J.; Avina-Cervantes, J. G.; Ramirez-Cortes, J. M.; Morales-Caporal, R.

    2014-01-01

    Thermography is a useful tool since it provides information that may help in the diagnosis of several diseases in a noninvasive and fast way. In particular, thermography has been applied to the study of the diabetic foot. However, most of these studies report only qualitative information, making it difficult to measure significant parameters such as temperature variations. These variations are important in the analysis of the diabetic foot since they could bring knowledge, for instance, regard...

  1. A Quantitative Method to Estimate Vulnerability. Case Study: Motozintla de Mendoza, Chiapas

    Science.gov (United States)

    Rodriguez, F.; Novelo-Casanova, D. A.

    2011-12-01

    The community of Motozintla de Mendoza is located in the State of Chiapas, México (15° 22′ N, 92° 15′ W), near the international border with Guatemala. Due to its location, this community is continuously exposed to many different hazards. Motozintla has a population of 20,000 inhabitants and has suffered the impact of two disasters in recent years. In view of these scenarios, we carried out the present research with the objective of quantifying the vulnerability of this community. We prepared a tool that allowed us to document physical vulnerability by conducting interviews with people in risk situations. Our tool included the analysis of five elements: household structure, public services, socioeconomic characteristics, community preparation for facing a disaster situation, and the risk perception of the inhabitants, using a statistically significant sample. Three field campaigns were carried out (October and November 2009, and October 2010) and 444 interviews were registered. Five levels of vulnerability were considered: very high, high, middle, moderate and low. Our region of study was classified spatially and the different estimated levels of vulnerability were georeferenced on maps. Our results indicate that the locality has a high level of physical vulnerability: about 74% of the population reports that their household has suffered damage in the past; 86% of the households are built of low-resistance materials; 70% of the interviewed families have a daily income of between five and fifteen dollars; 66% of the population does not know of any existing Civil Protection Plan; 83% of the population considers that they live at a high level of risk due to floods; finally, community organization is practically nonexistent. In conclusion, the level of vulnerability of Motozintla is high due to the many hazards to which it is exposed, in addition to the structural, socioeconomic and cultural characteristics of its inhabitants. Evidently, those elements of

  2. Quantitative and qualitative estimates of cross-border tobacco shopping and tobacco smuggling in France.

    Science.gov (United States)

    Lakhdar, C Ben

    2008-02-01

    In France, cigarette sales have fallen sharply, especially in border areas, since the price increases of 2003 and 2004. It was proposed that these falls were due not to people quitting smoking but rather to increased cross-border sales of tobacco and/or smuggling. This paper aims to test this proposition. Three approaches were used. First, cigarette sales data from French sources for the period 1999-2006 were collected, and a simulation of the changes within these sales was carried out to estimate what the sales situation would have looked like without the presence of foreign tobacco. Second, the tobacco consumption reported by the French population was compared with registered tobacco sales. Finally, in order to identify the countries of origin of foreign tobacco entering France, we collected a random sample of cigarette packs from a waste collection centre. According to the first method, cross-border shopping and smuggling of tobacco accounted for 8,635 tonnes of tobacco in 2004, 9,934 in 2005, and 9,930 in 2006, i.e. between 14% and 17% of total sales. The second method gave larger results: the difference between registered cigarette sales and cigarettes declared as smoked was around 12,000 to 13,000 tonnes in 2005, equivalent to 20% of legal sales. The collection of cigarette packs at a waste collection centre showed that foreign cigarettes accounted for 18.6% of our sample in 2005 and 15.5% in 2006. France seems mainly to be a victim of cross-border purchasing of tobacco products, with the contraband market for tobacco remaining modest. To discourage cross-border purchases, increased harmonization of national policies on the taxation of tobacco products needs to be envisaged by the European Union.

  3. Spiritual care competence for contemporary nursing practice: A quantitative exploration of the guidance provided by fundamental nursing textbooks.

    Science.gov (United States)

    Timmins, Fiona; Neill, Freda; Murphy, Maryanne; Begley, Thelma; Sheaf, Greg

    2015-11-01

    Spirituality is receiving unprecedented attention in the nursing literature. Both the volume and the scope of literature on the topic are expanding, and it is clear that the topic is of interest to nurses. There is consensus that the spiritual care required by clients receiving health care ought to be an integrated effort across the health care team. Although undergraduate nurses receive some education on the topic, it is ad hoc and inconsistent across universities. Textbooks are clearly a key resource in this area; however, the extent to which they form a comprehensive guide for nursing students and nurses is unclear. This study provides a hitherto unperformed analysis of core nursing textbooks to ascertain spirituality-related content. 543 books were examined, yielding a range of useful information about inclusions and omissions in this field. Findings revealed that spirituality is not strongly portrayed as a component of holistic care and that specific direction for the provision of spiritual care is lacking. Fundamental textbooks used by nurses and nursing students ought to inform and guide integrated spiritual care and reflect a more holistic approach to nursing care. The religious and/or spiritual needs of an increasingly diverse community need to be taken seriously within scholarly texts so that this commitment to individual clients' needs can be mirrored in practice. Copyright © 2015 Elsevier Ltd. All rights reserved.

  4. Quantitative precipitation estimation based on high-resolution numerical weather prediction and data assimilation with WRF – a performance test

    Directory of Open Access Journals (Sweden)

    Hans-Stefan Bauer

    2015-04-01

    Quantitative precipitation estimation and forecasting (QPE and QPF) are among the most challenging tasks in the atmospheric sciences. In this work, QPE based on numerical modelling and data assimilation is investigated. Key components are the Weather Research and Forecasting (WRF) model in combination with its 3D variational assimilation scheme, applied on the convection-permitting scale with sophisticated model physics over central Europe. The system is operated in a 1-hour rapid update cycle and processes a large set of in situ observations, data from French radar systems, the European GPS network and satellite sensors. Additionally, a free forecast driven by the ECMWF operational analysis is included as a reference run representing current operational precipitation forecasting. The verification is done both qualitatively and quantitatively by comparisons of reflectivity, accumulated precipitation fields and derived verification scores for a complex synoptic situation that developed on 26 and 27 September 2012. The investigation shows that even the downscaling from ECMWF represents the synoptic situation reasonably well. However, significant improvements are seen in the results of the WRF QPE setup, especially when the French radar data are assimilated. The frontal structure is more defined and the timing of the frontal movement is improved compared with observations. Even mesoscale band-like precipitation structures on the rear side of the cold front are reproduced, as seen by radar. The improvement in performance is also confirmed by a quantitative comparison of the 24-hourly accumulated precipitation over Germany. The mean correlation of the model simulations with observations improved from 0.2 in the downscaling experiment and 0.29 in the assimilation experiment without radar data to 0.56 in the WRF QPE experiment including the assimilation of French radar data.

  5. Hawaii Clean Energy Initiative (HCEI) Scenario Analysis: Quantitative Estimates Used to Facilitate Working Group Discussions (2008-2010)

    Energy Technology Data Exchange (ETDEWEB)

    Braccio, R.; Finch, P.; Frazier, R.

    2012-03-01

    This report provides details on the Hawaii Clean Energy Initiative (HCEI) scenario analysis, which aimed to identify potential policy options and evaluate their impact on reaching the 70% HCEI goal, to present possible pathways for attaining the goal based on currently available technology, with an eye to initiatives under way in Hawaii, and to provide an 'order-of-magnitude' cost estimate and a jump-start to action that would be adjusted as understanding of the technologies and markets improves.

  6. HistoFlex-a microfluidic device providing uniform flow conditions enabling highly sensitive, reproducible and quantitative in situ hybridizations

    DEFF Research Database (Denmark)

    Søe, Martin Jensen; Okkels, Fridolin; Sabourin, David

    2011-01-01

    A microfluidic device (the HistoFlex) designed to perform and monitor molecular biological assays under dynamic flow conditions on microscope slide-substrates, with special emphasis on analyzing histological tissue sections, is presented. Microscope slides were reversibly sealed onto a cast... Tissue sections were not visually damaged during assaying, which enabled adapting a complete in situ hybridization (ISH) assay for detection of microRNAs (miRNA). The effects of flow-based incubations on the hybridization, antibody incubation and Tyramide Signal Amplification (TSA) steps were investigated upon adapting the ISH assay for performing in the HistoFlex. The hybridization step was significantly enhanced using flow-based incubations due to improved hybridization efficiency. The HistoFlex device enabled a fast miRNA ISH assay (3 hours) which provided higher hybridization signal intensity compared to using conventional techniques (5...

  7. Estimation of types I and III collagens in whole tissue by quantitation of CNBr peptides on SDS-polyacrylamide gels.

    Science.gov (United States)

    Light, N D

    1982-03-18

    The electrophoretic and staining characteristics of CNBr peptides of purified bovine and human types I and III collagens were investigated on SDS-polyacrylamide slab gels. All the major CNBr peptides of both types of collagen showed linear staining characteristics with Coomassie brilliant blue up to a total protein concentration of 150 micrograms per gel track. The amount of each type of collagen present in model mixtures was calculated from quantitations of the alpha 1(I)CB8 (type I) and alpha 1(III)CB8 (type III) peptides after resolution on 10% (w/v) SDS-polyacrylamide slab gels. The accuracy of the method was assessed and shown to give less than 15% error in mixtures containing more than 15% type III, and its applicability to the estimation of ratios of type I and type III collagens in whole tissue was determined.

  8. Quantitative estimation of cellular infiltration of the small intestinal mucosa in children with cow's milk and gluten intolerance.

    Science.gov (United States)

    Kaczmarski, M; Lisiecka, M; Kurpatkowska, B; Jastrzebska, J

    1989-01-01

    Quantitative estimation of infiltration of the mucosa by intraepithelial lymphocytes (LIE) and eosinophils was carried out in 21 children with cow's milk intolerance and 35 children with gluten intolerance. Before dietary treatment, a statistically significant increase in LIE infiltration was found, to a mean value of 34.1 cells in children with milk intolerance and 39.0 cells in children with gluten intolerance, values that differed statistically significantly from the mean LIE value of the control group (19.0 cells/100 epithelial cells). Eosinophilic infiltration in this phase of the disease was noted in 38% of children with cow's milk intolerance (16.9 cells/mm2) and in 27% of children with gluten intolerance (28.6 cells/mm2). After 8-24 months of elimination diets, a decrease in the mean LIE infiltration of the mucosa was revealed in both treated groups.

  9. Patient and healthcare provider barriers to hypertension awareness, treatment and follow up: a systematic review and meta-analysis of qualitative and quantitative studies.

    Directory of Open Access Journals (Sweden)

    Rasha Khatib

    BACKGROUND: Although the importance of detecting, treating, and controlling hypertension has been recognized for decades, the majority of patients with hypertension remain uncontrolled. The path from evidence to practice contains many potential barriers, but their role has not been reviewed systematically. This review aimed to synthesize and identify important barriers to hypertension control as reported by patients and healthcare providers. METHODS: The electronic databases MEDLINE, EMBASE and Global Health were searched systematically up to February 2013. Two reviewers independently selected eligible studies. Two reviewers categorized barriers based on a theoretical framework of behavior change. The theoretical framework suggests that a change in behavior requires a strong commitment to change [intention], the necessary skills and abilities to adopt the behavior [capability], and an absence of health system and support constraints. FINDINGS: Twenty-five qualitative studies and 44 quantitative studies met the inclusion criteria. In qualitative studies, health system barriers were most commonly discussed in studies of both patients and health care providers. Quantitative studies identified disagreement with clinical recommendations as the most common barrier among health care providers. Quantitative studies of patients yielded different results: lack of knowledge was the most common barrier to hypertension awareness. Stress, anxiety and depression were most commonly reported as barriers that hindered or delayed adoption of a healthier lifestyle. In terms of hypertension treatment adherence, patients mostly reported forgetting to take their medication. Finally, priority-setting barriers were most commonly reported by patients in terms of following up with their health care providers. CONCLUSIONS: This review identified a wide range of barriers facing patients and health care providers pursuing hypertension control, indicating the need for targeted multi

  10. Logistic quantile regression provides improved estimates for bounded avian counts: A case study of California Spotted Owl fledgling production

    Science.gov (United States)

    Cade, Brian S.; Noon, Barry R.; Scherer, Rick D.; Keane, John J.

    2017-01-01

    Counts of avian fledglings, nestlings, or clutch size that are bounded below by zero and above by some small integer form a discrete random variable distribution that is not approximated well by conventional parametric count distributions such as the Poisson or negative binomial. We developed a logistic quantile regression model to provide estimates of the empirical conditional distribution of a bounded discrete random variable. The logistic quantile regression model requires that counts are randomly jittered to a continuous random variable, logit transformed to bound them between specified lower and upper values, then estimated in conventional linear quantile regression, repeating the 3 steps and averaging estimates. Back-transformation to the original discrete scale relies on the fact that quantiles are equivariant to monotonic transformations. We demonstrate this statistical procedure by modeling 20 years of California Spotted Owl fledgling production (0−3 per territory) on the Lassen National Forest, California, USA, as related to climate, demographic, and landscape habitat characteristics at territories. Spotted Owl fledgling counts increased nonlinearly with decreasing precipitation in the early nesting period, in the winter prior to nesting, and in the prior growing season; with increasing minimum temperatures in the early nesting period; with adult compared to subadult parents; when there was no fledgling production in the prior year; and when percentage of the landscape surrounding nesting sites (202 ha) with trees ≥25 m height increased. Changes in production were primarily driven by changes in the proportion of territories with 2 or 3 fledglings. Average variances of the discrete cumulative distributions of the estimated fledgling counts indicated that temporal changes in climate and parent age class explained 18% of the annual variance in owl fledgling production, which was 34% of the total variance. Prior fledgling production explained as much of
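
    The three-step procedure described above (jitter the counts to a continuous variable, logit-transform between the bounds, fit a conventional linear quantile regression, then average over repeated jitters and back-transform) can be sketched directly with statsmodels' QuantReg. The simulated covariate and coefficients below are illustrative, not the owl data.

        import numpy as np
        import statsmodels.api as sm

        rng = np.random.default_rng(2)
        n = 300
        x = rng.uniform(-1, 1, n)                  # e.g. a climate covariate
        counts = rng.binomial(3, 0.35 + 0.15 * x)  # fledglings, bounded 0-3

        lower, upper = 0.0, 3.0 + 1.0              # jittered values lie in (0, 4)
        tau, n_jitter = 0.9, 20
        X = sm.add_constant(x)

        betas = []
        for _ in range(n_jitter):
            z = counts + rng.uniform(0.001, 0.999, n)    # 1) jitter
            y = np.log((z - lower) / (upper - z))        # 2) logit transform
            betas.append(sm.QuantReg(y, X).fit(q=tau).params)  # 3) linear QR
        beta = np.mean(betas, axis=0)                    # average over jitters

        # Back-transform; quantiles are equivariant to monotonic
        # transformations, and flooring recovers the discrete count quantile.
        eta = X @ beta
        qhat = (lower + upper * np.exp(eta)) / (1.0 + np.exp(eta))
        print("mean coefficients:", beta)
        print("estimated 0.9 quantile of counts:", np.floor(qhat[:5]))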

  11. Improved bioautographic assay on TLC layers for qualitative and quantitative estimation of xanthine oxidase inhibitors and superoxide scavengers.

    Science.gov (United States)

    Kong, Yao; Li, Xiangkun; Zhang, Na; Miao, Yu; Feng, Haiyan; Wu, Tao; Cheng, Zhihong

    2018-02-20

    A new agar-free bioautographic assay for xanthine oxidase (XO) inhibitors and superoxide scavengers on TLC layers was developed and validated. Compared to the first version of the TLC bioautographic agar overlay method, our assay greatly improved sensitivity and quantification ability. The limit of detection (LOD) of this assay was 0.017 ng for allopurinol. Quantitative estimation of XO inhibitors and superoxide scavengers was achieved by densitometric scanning, expressed as allopurinol equivalents in millimoles on a per-sample-weight basis. This assay has acceptable accuracy (95.37-99.23%), intra-day and inter-day precision (RSD, 2.56-6.69%), as well as intra-plate and inter-plate precision (RSD, 2.93-9.62%). Six pure compounds and three herbal extracts were evaluated for their potential XO inhibitory and superoxide scavenging activity by this bioautographic assay on TLC layers. Four active components were separated, located and identified in Astragalus membranaceus var. mongholicus extract by the bioautographic assay after TLC separation. The developed method is rapid, simple, sensitive and stable for screening and estimation of potential XO inhibitors and superoxide scavengers. Copyright © 2017 Elsevier B.V. All rights reserved.

  12. Quantitative PCR: a quality control assay for estimation of viable virus content in live attenuated goat pox vaccine.

    Science.gov (United States)

    Kallesh, D J; Hosamani, M; Balamurugan, V; Bhanuprakash, V; Yadav, V; Singh, R K

    2009-11-01

    The efficacy of a live viral vaccine and vaccine-induced sero-conversion depend on the optimum number of live virus particles in a vaccine dose, which is one of the important aspects of quality control. In the present study, a TaqMan probe quantitative polymerase chain reaction (QPCR) assay based on the conserved DNA pol gene of capripoxvirus was developed for the quality control of attenuated monovalent goatpox and combined attenuated goatpox and peste des petits ruminants (PPR) vaccines. Cells infected with vaccine virus were harvested at a critical time point and subjected to QPCR. The critical time point for harvest of Vero cells infected with various log10 dilutions of reference virus was determined to be 36 h (highest slope, 3.062) by comparison of the slopes of standard curves established with harvests at different time intervals. The assay was evaluated using different batches of goatpox vaccine and of bivalent goatpox and PPR vaccine. The titers estimated by QPCR and the TCID50 method were comparable. The QPCR assay could thus be used as an alternative method or supplementary tool for estimation of live GTPV particles in monovalent goatpox or bivalent goatpox and PPR vaccines.

  13. PEPIS: A Pipeline for Estimating Epistatic Effects in Quantitative Trait Locus Mapping and Genome-Wide Association Studies.

    Directory of Open Access Journals (Sweden)

    Wenchao Zhang

    2016-05-01

    The term epistasis refers to interactions between multiple genetic loci. Genetic epistasis is important in regulating biological function and is considered to explain part of the 'missing heritability,' which involves marginal genetic effects that cannot be accounted for in genome-wide association studies. Thus, the study of epistasis is of great interest to geneticists. However, estimating epistatic effects for quantitative traits is challenging due to the large number of interaction effects that must be estimated, which significantly increases computing demands. Here, we present a new web server-based tool, the Pipeline for estimating EPIStatic genetic effects (PEPIS), for analyzing polygenic epistatic effects. The PEPIS software package is based on a new linear mixed model that has been used to predict the performance of hybrid rice. The PEPIS includes two main sub-pipelines: the first for kinship matrix calculation, and the second for polygenic component analyses and genome scanning for main and epistatic effects. To accommodate the demand for high-performance computation, the PEPIS utilizes C/C++ for mathematical matrix computing. In addition, the modules for kinship matrix calculation and for main- and epistatic-effect genome scanning employ parallel computing technology that effectively utilizes multiple computer nodes across our networked cluster, significantly improving the computational speed. For example, when analyzing the same immortalized F2 rice population genotypic data examined in a previous study, the PEPIS returned identical results at each analysis step to the original prototype R code, but the computational time was reduced from more than one month to about five minutes. These advances will help overcome the bottleneck frequently encountered in genome-wide epistatic genetic effect analysis and enable accommodation of the high computational demand. The PEPIS is publicly available at http://bioinfo.noble.org/PolyGenic_QTL/.

  14. Size-specific dose estimate (SSDE) provides a simple method to calculate organ dose for pediatric CT examinations

    Energy Technology Data Exchange (ETDEWEB)

    Moore, Bria M.; Brady, Samuel L., E-mail: samuel.brady@stjude.org; Kaufman, Robert A. [Department of Radiological Sciences, St Jude Children' s Research Hospital, Memphis, Tennessee 38105 (United States); Mirro, Amy E. [Department of Biomedical Engineering, Washington University, St Louis, Missouri 63130 (United States)

    2014-07-15

    ...previously published pediatric patient doses that accounted for patient size in their dose calculation, and was found to agree in the chest to better than an average of 5% (27.6/26.2) and in the abdominopelvic region to better than 2% (73.4/75.0). Conclusions: For organs fully covered within the scan volume, the average correlation of SSDE and organ absolute dose was found to be better than ±10%. In addition, this study provides a complete list of organ dose correlation factors (CF_SSDE^organ) for the chest and abdominopelvic regions, and describes a simple methodology to estimate individual pediatric patient organ dose based on patient SSDE.
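
    The methodology summarized above amounts to a one-step calculation once the correlation factors are known: multiply the patient's SSDE by the region- and organ-specific factor. A minimal Python sketch with hypothetical factor values; the study's actual CF_SSDE^organ table is not reproduced in the record.

        # Hypothetical organ correlation factors (dimensionless); the study
        # reports such factors per organ for chest and abdominopelvic scans.
        cf_organ = {"lung": 1.05, "liver": 0.98, "stomach": 1.02}
        ssde_mgy = 8.4  # size-specific dose estimate for this patient (mGy)

        for organ, cf in cf_organ.items():
            # Organ dose estimated as CF * SSDE.
            print(f"{organ}: {cf * ssde_mgy:.1f} mGy")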

  15. Quantitative estimation of climatic parameters from vegetation data in North America by the mutual climatic range technique

    Science.gov (United States)

    Anderson, Katherine H.; Bartlein, Patrick J.; Strickland, Laura E.; Pelltier, Richard T.; Thompson, Robert S.; Shafer, Sarah L.

    2012-01-01

    The mutual climatic range (MCR) technique is perhaps the most widely used method for estimating past climatic parameters from fossil assemblages, largely because it can be conducted on a simple list of the taxa present in an assemblage. When applied to plant macrofossil data, this unweighted approach (MCRun) will frequently identify a large range for a given climatic parameter where the species in an assemblage can theoretically live together. To narrow this range, we devised a new weighted approach (MCRwt) that employs information from the modern relations between climatic parameters and plant distributions to lessen the influence of the "tails" of the distributions of the climatic data associated with the taxa in an assemblage. To assess the performance of the MCR approaches, we applied them to a set of modern climatic data and plant distributions on a 25-km grid for North America, and compared observed and estimated climatic values for each grid point. In general, MCRwt was superior to MCRun in providing smaller anomalies, less bias, and better correlations between observed and estimated values. However, by the same measures, the results of Modern Analog Technique (MAT) approaches were superior to MCRwt. Although this might be reason to favor MAT approaches, they are based on assumptions that may not be valid for paleoclimatic reconstructions, including that: 1) the absence of a taxon from a fossil sample is meaningful, 2) plant associations were largely unaffected by past changes in either levels of atmospheric carbon dioxide or in the seasonal distributions of solar radiation, and 3) plant associations of the past are adequately represented on the modern landscape. To illustrate the application of these MCR and MAT approaches to paleoclimatic reconstructions, we applied them to a Pleistocene paleobotanical assemblage from the western United States. From our examinations of the estimates of modern and past climates from vegetation assemblages, we conclude that
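
    The unweighted MCRun estimate is simply the intersection of the modern climatic ranges of the taxa present in an assemblage; the weighted MCRwt variant then downweights the tails of each taxon's climate distribution. The Python sketch below implements only the unweighted intersection, with hypothetical taxa and tolerance ranges:

        # Modern climatic ranges (min, max) of mean July temperature (deg C)
        # for taxa in a fossil assemblage; values are hypothetical.
        taxon_ranges = {
            "Pinus edulis":          (18.0, 27.0),
            "Juniperus osteosperma": (16.0, 26.0),
            "Artemisia tridentata":  (14.0, 24.0),
        }

        # Unweighted MCR: the mutual range is the intersection of all ranges.
        lo = max(r[0] for r in taxon_ranges.values())
        hi = min(r[1] for r in taxon_ranges.values())
        if lo <= hi:
            print(f"MCRun estimate: {lo:.1f}-{hi:.1f} C "
                  f"(midpoint {0.5 * (lo + hi):.1f} C)")
        else:
            print("No mutual climatic range: no modern analogue for assemblage")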

  16. Medical care price indexes for patients with employer-provided insurance: nationally representative estimates from MarketScan Data.

    Science.gov (United States)

    Dunn, Abe; Liebman, Eli; Pack, Sarah; Shapiro, Adam Hale

    2013-06-01

    Commonly observed shifts in the utilization of medical care services to treat diseases may pose problems for official price indexes at the Bureau of Labor Statistics (BLS) that do not account for service shifts. We examine how these shifts may lead to different price estimates than those observed in official BLS price statistics. We use a convenience sample of enrollees with employer-provided insurance from the MarketScan database for the years 2003 to 2007. Population weights that account for the age, sex, and geographic distribution of enrollees are assigned to construct representative estimates. We compare two types of price indexes: (1) a Service Price Index (SPI), similar to the BLS index, which holds services fixed and measures the prices of the underlying treatments; and (2) a Medical Care Expenditure Index (MCE), which measures the cost of treating diseases and allows for utilization shifts. Over the entire period of study, the compound annual growth rate of the SPI grows 0.7 percentage points faster than that of the preferred MCE index. Our findings suggest that the health component of inflation may be overstated by 0.7 percentage points per year, and real GDP growth may be understated by a similar amount. However, more work may be necessary to precisely replicate the BLS indexes and obtain a more accurate measure of these price differences. © Health Research and Educational Trust.

  17. Technique for Determination of Rational Boundaries in Combining Construction and Installation Processes Based on Quantitative Estimation of Technological Connections

    Science.gov (United States)

    Gusev, E. V.; Mukhametzyanov, Z. R.; Razyapov, R. V.

    2017-11-01

    The problems of existing methods for determining which technologically interlinked construction processes and activities can be combined are considered under the conditions of modern construction of various facilities. The necessity of identifying common parameters that characterize the nature of the interaction of all technologically related construction and installation processes and activities is shown. The technologies of construction and installation processes for buildings and structures were studied with the goal of determining a common parameter for evaluating the relationship between technologically interconnected processes and construction works. The result of this research was a quantitative evaluation of the interaction of construction and installation processes and activities, expressed as the minimum technologically necessary volume of the preceding process that allows planning and organizing the execution of a subsequent, technologically interconnected process. This quantitative evaluation is used as the basis for calculating the optimum range over which processes and activities can be combined. The calculation method is based on graph theory. The authors applied a generic characterization parameter to reveal the technological links between construction and installation processes, and the proposed technique has adaptive properties that are key to its wide use in forming organizational decisions. The article also describes the practical significance of the developed technique.

  18. Improved accuracy of quantitative parameter estimates in dynamic contrast-enhanced CT study with low temporal resolution

    Energy Technology Data Exchange (ETDEWEB)

    Kim, Sun Mo, E-mail: Sunmo.Kim@rmp.uhn.on.ca [Radiation Medicine Program, Princess Margaret Hospital/University Health Network, Toronto, Ontario M5G 2M9 (Canada); Haider, Masoom A. [Department of Medical Imaging, Sunnybrook Health Sciences Centre, Toronto, Ontario M4N 3M5, Canada and Department of Medical Imaging, University of Toronto, Toronto, Ontario M5G 2M9 (Canada); Jaffray, David A. [Radiation Medicine Program, Princess Margaret Hospital/University Health Network, Toronto, Ontario M5G 2M9, Canada and Department of Radiation Oncology, University of Toronto, Toronto, Ontario M5G 2M9 (Canada); Yeung, Ivan W. T. [Radiation Medicine Program, Princess Margaret Hospital/University Health Network, Toronto, Ontario M5G 2M9 (Canada); Department of Medical Physics, Stronach Regional Cancer Centre, Southlake Regional Health Centre, Newmarket, Ontario L3Y 2P9 (Canada); Department of Radiation Oncology, University of Toronto, Toronto, Ontario M5G 2M9 (Canada)

    2016-01-15

    ...quantitative histogram parameters of the volume transfer constant [standard deviation (SD), 98th percentile, and range], rate constant (SD), blood volume fraction (mean, SD, 98th percentile, and range), and blood flow (mean, SD, median, 98th percentile, and range) for sampling intervals between 10 and 15 s. Conclusions: The proposed method of PCA filtering combined with the AIF estimation technique allows low-frequency scanning in DCE-CT studies to reduce patient radiation dose. The results indicate that the method is useful in pixel-by-pixel kinetic analysis of DCE-CT data for patients with cervical cancer.

  19. Towards a quantitative, measurement-based estimate of the uncertainty in photon mass attenuation coefficients at radiation therapy energies.

    Science.gov (United States)

    Ali, E S M; Spencer, B; McEwen, M R; Rogers, D W O

    2015-02-21

    In this study, a quantitative estimate is derived for the uncertainty in the XCOM photon mass attenuation coefficients in the energy range of interest to external beam radiation therapy, i.e. 100 keV (orthovoltage) to 25 MeV, using direct comparisons of experimental data against Monte Carlo models and theoretical XCOM data. Two independent datasets are used. The first dataset is from our recent transmission measurements and the corresponding EGSnrc calculations (Ali et al 2012 Med. Phys. 39 5990-6003) for 10-30 MV photon beams from the research linac at the National Research Council Canada. The attenuators are graphite and lead, with a total of 140 data points and an experimental uncertainty of ∼0.5% (k = 1). An optimum energy-independent cross section scaling factor that minimizes the discrepancies between measurements and calculations is used to deduce the cross section uncertainty. The second dataset is from the aggregate of cross section measurements in the literature for graphite and lead (49 experiments, 288 data points). This dataset is compared to the sum of the XCOM data plus the IAEA photonuclear data. Again, an optimum energy-independent cross section scaling factor is used to deduce the cross section uncertainty. Using the average result from the two datasets, the energy-independent cross section uncertainty estimate is 0.5% (68% confidence) and 0.7% (95% confidence). The potential for energy-dependent errors is discussed. Photon cross section uncertainty is shown to be smaller than the current qualitative 'envelope of uncertainty' of the order of 1-2%, as given by Hubbell (1999 Phys. Med. Biol. 44 R1-22).
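
    The optimum energy-independent scaling factor described above has a closed form under a weighted least-squares criterion. A minimal sketch of that step, not the authors' code; the data arrays and the 0.5% uncertainty model are illustrative:

```python
import numpy as np

# Hypothetical measured and calculated attenuation values (illustrative only).
measured = np.array([0.512, 0.498, 0.471])    # transmission-derived values
calculated = np.array([0.515, 0.494, 0.475])  # XCOM / Monte Carlo values
sigma = 0.005 * measured                      # ~0.5% (k = 1) experimental uncertainty

# Weighted least squares: minimize sum(((m - s*c)/sigma)^2) over the scalar s.
# Setting the derivative with respect to s to zero gives the closed form below.
w = 1.0 / sigma**2
s_opt = np.sum(w * measured * calculated) / np.sum(w * calculated**2)
print(f"optimum scaling factor: {s_opt:.4f}")
```

    The deviation of the optimum factor from unity, relative to its uncertainty, is what probes the cross-section error.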

  20. Quantitative assessment of islet cell products: estimating the accuracy of the existing protocol and accounting for islet size distribution.

    Science.gov (United States)

    Buchwald, Peter; Wang, Xiaojing; Khan, Aisha; Bernal, Andres; Fraker, Chris; Inverardi, Luca; Ricordi, Camillo

    2009-01-01

    The ability to consistently and reliably assess the total number and the size distribution of isolated pancreatic islet cells from a small sample is of crucial relevance for the adequate characterization of islet cell preparations used for research or transplantation purposes. Here, data from a large number of isolations were used to establish a continuous probability density function describing the size distribution of human pancreatic islets. This function was then used to generate a polymeric microsphere mixture with a composition resembling that of isolated islets, which, in turn, was used to quantitatively assess the accuracy, reliability, and operator-dependent variability of the currently utilized manual standard procedure for quantifying islet cell preparations. Furthermore, on the basis of the best-fit probability density function, which corresponds to a Weibull distribution, a slightly modified scale of islet equivalent number (IEQ) conversion factors is proposed that incorporates the size distribution of islets and accounts for the decreasing probability of finding larger islets within each size group. Compared to the current calculation method, these factors introduce a 4-8% downward correction of the total IEQ estimate, but they reflect a statistically more accurate contribution of differently sized islets.
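
    A sketch of how size-aware IEQ conversion factors of this kind could be computed from a fitted diameter distribution; the Weibull shape and scale below are placeholders, not the paper's fitted parameters:

```python
from scipy.stats import weibull_min
from scipy.integrate import quad

# Placeholder Weibull parameters for islet diameter in micrometers; the paper
# fits its own distribution to isolation data.
dist = weibull_min(1.6, scale=110.0)

def ieq_factor(lo, hi, ref=150.0):
    """Volume-based IEQ conversion factor for one diameter bin, weighting each
    diameter by its (decreasing) probability of occurring within the bin."""
    num, _ = quad(lambda d: d**3 * dist.pdf(d), lo, hi)
    den, _ = quad(dist.pdf, lo, hi)
    return (num / den) / ref**3

for lo in range(50, 350, 50):  # the standard 50-um bins used in islet counting
    print(f"{lo}-{lo + 50} um: factor {ieq_factor(lo, lo + 50):.3f}")
```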

  1. Health care providers' perceptions of and attitudes towards induced abortions in sub-Saharan Africa and Southeast Asia: a systematic literature review of qualitative and quantitative data.

    Science.gov (United States)

    Rehnström Loi, Ulrika; Gemzell-Danielsson, Kristina; Faxelid, Elisabeth; Klingberg-Allvin, Marie

    2015-02-12

    Unsafe abortions are a serious public health problem and a major human rights issue. In low-income countries, where restrictive abortion laws are common, safe abortion care is not always available to women in need. Health care providers have an important role in the provision of abortion services. However, the shortage of health care providers in low-income countries is critical and exacerbated by the unwillingness of some health care providers to provide abortion services. The aim of this study was to identify, summarise and synthesise available research addressing health care providers' perceptions of and attitudes towards induced abortions in sub-Saharan Africa and Southeast Asia. A systematic literature search of three databases was conducted in November 2014, as well as a manual search of reference lists. The selection criteria included quantitative and qualitative research studies written in English, regardless of the year of publication, exploring health care providers' perceptions of and attitudes towards induced abortions in sub-Saharan Africa and Southeast Asia. The quality of all articles that met the inclusion criteria was assessed. The studies were critically appraised, and thematic analysis was used to synthesise the data. Thirty-six studies, published between 1977 and 2014 and including data from 15 different countries, met the inclusion criteria. Nine key themes were identified as influencing the health care providers' attitudes towards induced abortions: 1) human rights, 2) gender, 3) religion, 4) access, 5) unpreparedness, 6) quality of life, 7) ambivalence, 8) quality of care and 9) stigma and victimisation. Health care providers in sub-Saharan Africa and Southeast Asia have moral-, social- and gender-based reservations about induced abortion. These reservations influence attitudes towards induced abortions and subsequently affect the relationship between the health care provider and the pregnant woman who wishes to have an abortion. A values

  2. Estimates of genetic and environmental contribution to 43 quantitative traits support sharing of a homogeneous environment in an isolated population from South Tyrol, Italy.

    Science.gov (United States)

    Marroni, Fabio; Grazio, Daniela; Pattaro, Cristian; Devoto, Marcella; Pramstaller, Peter

    2008-01-01

    As part of the genomic health care program 'GenNova', we measured 43 quantitative traits in 1,136 subjects living in three isolated villages in South Tyrol (Italy), for which extended genealogical information was available. Thirty-seven of the studied traits had been previously investigated in other populations, while six of them are, to the best of our knowledge, studied here for the first time. For all 43 traits we estimated narrow-sense heritability, individual-specific environmental effects, and shared environmental effects. Estimates of narrow-sense heritability were in good agreement with previous findings. We found significant heritability for all traits; after correcting for multiple testing, all traits except serum concentration of glutamic oxaloacetic transaminase (GOT) and potassium still showed significant heritability. In contrast, the effect of living in the same sibship or village (the so-called sibship and household effects, respectively) was significant for a few traits only, and after correcting for multiple testing no trait showed significant shared environment effect. We suggest that the sharing of a highly similar environment by the subjects included in this study explains the low contribution of the household effects to the overall trait variation. This peculiarity should provide an advantage in gene-mapping projects by reducing environmental bias. (c) 2007 S. Karger AG, Basel
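
    For readers less familiar with these quantities, a toy variance-components calculation (all numbers invented) showing how narrow-sense heritability and the shared-environment proportion reported here are defined:

```python
# Decompose phenotypic variance into additive genetic (va), shared-environment
# (vc, the sibship/household effect) and residual (ve) components.
va, vc, ve = 0.45, 0.05, 0.50   # invented variance components

vp = va + vc + ve               # total phenotypic variance
h2 = va / vp                    # narrow-sense heritability
c2 = vc / vp                    # proportion explained by shared environment
print(f"h2 = {h2:.2f}, c2 = {c2:.2f}")
```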

  3. Quantitative fluorescence kinetic analysis of NADH and FAD in human plasma using three- and four-way calibration methods capable of providing the second-order advantage

    Energy Technology Data Exchange (ETDEWEB)

    Kang, Chao [School of Chemistry and Chemical Engineering, Guizhou University, Guiyang 550025 (China); Wu, Hai-Long, E-mail: hlwu@hnu.edu.cn [State Key Laboratory of Chemo/Biosensing and Chemometrics, College of Chemistry and Chemical Engineering, Hunan University, Changsha 410082 (China); Zhou, Chang; Xiang, Shou-Xia; Zhang, Xiao-Hua; Yu, Yong-Jie; Yu, Ru-Qin [State Key Laboratory of Chemo/Biosensing and Chemometrics, College of Chemistry and Chemical Engineering, Hunan University, Changsha 410082 (China)

    2016-03-03

    The metabolic coenzymes reduced nicotinamide adenine dinucleotide (NADH) and flavin adenine dinucleotide (FAD) are the primary electron donor and acceptor, respectively, and participate in almost all biological metabolic pathways. This study develops a novel method for the quantitative kinetic analysis of the degradation reaction of NADH and the formation reaction of FAD in human plasma containing an uncalibrated interferent, using three-way calibration based on a multi-way fluorescence technique. In the three-way analysis, by using the calibration set in a static manner, we directly predicted the concentrations of both analytes in the mixture at any time after the start of their reactions, even in the presence of an uncalibrated spectral interferent and a varying background interferent. The satisfactory quantitative results indicate that the proposed method allows one to directly monitor the concentration of each analyte in the mixture as a function of time, in real time and nondestructively, instead of determining the concentration after analytical separation. Thereafter, we fitted the first-order rate law to the concentration data throughout the reactions. Additionally, a four-way calibration procedure is developed as an alternative for highly collinear systems. The results of the four-way analysis confirmed the results of the three-way analysis and revealed that both the degradation reaction of NADH and the formation reaction of FAD in human plasma fit the first-order rate law. The proposed methods could be expected to provide promising tools for simultaneous kinetic analysis of multiple reactions in complex systems in real time and nondestructively. - Highlights: • A novel three-way calibration method for the quantitative kinetic analysis of NADH and FAD in human plasma is proposed. • The method can directly monitor the concentration of each analyte in the reaction in real time and nondestructively. • The method has the second-order advantage. • A

  4. A dynamic neuro-fuzzy model providing bio-state estimation and prognosis prediction for wearable intelligent assistants.

    Science.gov (United States)

    Wang, Yu; Winters, Jack M

    2005-06-28

    Intelligent management of wearable applications in rehabilitation requires an understanding of the current context, which is constantly changing over the rehabilitation process because of changes in the person's status and environment. This paper presents a dynamic recurrent neuro-fuzzy system that implements expert- and evidence-based reasoning. It is intended to provide context-awareness for wearable intelligent agents/assistants (WIAs). The model structure includes the following types of signals: inputs, states, outputs and outcomes. Inputs are facts or events which have effects on patients' physiological and rehabilitative states; different classes of inputs (e.g., facts, context, medication, therapy) have different nonlinear mappings to a fuzzy "effect." States are dimensionless linguistic fuzzy variables that change based on causal rules, as implemented by a fuzzy inference system (FIS). The FIS, with rules based on expertise and evidence, essentially defines the nonlinear state equations that are implemented by nuclei of dynamic neurons. Outputs, a function of the weighting of states and effective inputs using conventional or fuzzy mapping, can perform actions, predict performance, or assist with decision-making. Outcomes are scalars to be extremized that are a function of outputs and states. The first example demonstrates setup and use for a large-scale stroke neurorehabilitation application (with 16 inputs, 12 states, 5 outputs and 3 outcomes), showing how this modelling tool can successfully capture causal dynamic change in context-relevant states (e.g., impairments, pain) as a function of input event patterns (e.g., medications). The second example demonstrates the use of scientific evidence to develop rule-based dynamic models, here for predicting changes in muscle strength with short-term fatigue and long-term strength-training. A neuro-fuzzy modelling framework is developed for estimating rehabilitative change that can be applied in any field of rehabilitation.

  5. A Dynamic Neuro-Fuzzy Model Providing Bio-State Estimation and Prognosis Prediction for Wearable Intelligent Assistants

    Directory of Open Access Journals (Sweden)

    Winters Jack M

    2005-06-01

    Full Text Available Abstract Background Intelligent management of wearable applications in rehabilitation requires an understanding of the current context, which is constantly changing over the rehabilitation process because of changes in the person's status and environment. This paper presents a dynamic recurrent neuro-fuzzy system that implements expert- and evidence-based reasoning. It is intended to provide context-awareness for wearable intelligent agents/assistants (WIAs). Methods The model structure includes the following types of signals: inputs, states, outputs and outcomes. Inputs are facts or events which have effects on patients' physiological and rehabilitative states; different classes of inputs (e.g., facts, context, medication, therapy) have different nonlinear mappings to a fuzzy "effect." States are dimensionless linguistic fuzzy variables that change based on causal rules, as implemented by a fuzzy inference system (FIS). The FIS, with rules based on expertise and evidence, essentially defines the nonlinear state equations that are implemented by nuclei of dynamic neurons. Outputs, a function of the weighting of states and effective inputs using conventional or fuzzy mapping, can perform actions, predict performance, or assist with decision-making. Outcomes are scalars to be extremized that are a function of outputs and states. Results The first example demonstrates setup and use for a large-scale stroke neurorehabilitation application (with 16 inputs, 12 states, 5 outputs and 3 outcomes), showing how this modelling tool can successfully capture causal dynamic change in context-relevant states (e.g., impairments, pain) as a function of input event patterns (e.g., medications). The second example demonstrates the use of scientific evidence to develop rule-based dynamic models, here for predicting changes in muscle strength with short-term fatigue and long-term strength-training. Conclusion A neuro-fuzzy modelling framework is developed for estimating

  6. Quantitative investigation of the edge enhancement in in-line phase contrast projections and tomosynthesis provided by distributing microbubbles on the interface between two tissues: a phantom study

    Science.gov (United States)

    Wu, Di; Donovan Wong, Molly; Li, Yuhua; Fajardo, Laurie; Zheng, Bin; Wu, Xizeng; Liu, Hong

    2017-12-01

    The objective of this study was to quantitatively investigate the ability to distribute microbubbles along the interface between two tissues, in an effort to improve the edge and/or boundary features in phase contrast imaging. The experiments were conducted by employing a custom designed tissue simulating phantom, which also simulated a clinical condition where the ligand-targeted microbubbles are self-aggregated on the endothelium of blood vessels surrounding malignant cells. Four different concentrations of microbubble suspensions were injected into the phantom: 0%, 0.1%, 0.2%, and 0.4%. A time delay of 5 min was implemented before image acquisition to allow the microbubbles to become distributed at the interface between the acrylic and the cavity simulating a blood vessel segment. For comparison purposes, images were acquired using three system configurations for both projection and tomosynthesis imaging with a fixed radiation dose delivery: conventional low-energy contact mode, low-energy in-line phase contrast and high-energy in-line phase contrast. The resultant images illustrate the edge feature enhancements in the in-line phase contrast imaging mode when the microbubble concentration is extremely low. The quantitative edge-enhancement-to-noise ratio calculations not only agree with the direct image observations, but also indicate that the edge feature enhancement can be improved by increasing the microbubble concentration. In addition, high-energy in-line phase contrast imaging provided better performance in detecting low-concentration microbubble distributions.

  7. A decision tree model to estimate the value of information provided by a groundwater quality monitoring network

    Science.gov (United States)

    Khader, A. I.; Rosenberg, D. E.; McKee, M.

    2013-05-01

    Groundwater contaminated with nitrate poses a serious health risk to infants when this contaminated water is used for culinary purposes. To avoid this health risk, people need to know whether their culinary water is contaminated or not. Therefore, there is a need to design an effective groundwater monitoring network, acquire information on groundwater conditions, and use acquired information to inform management options. These actions require time, money, and effort. This paper presents a method to estimate the value of information (VOI) provided by a groundwater quality monitoring network located in an aquifer whose water poses a spatially heterogeneous and uncertain health risk. A decision tree model describes the structure of the decision alternatives facing the decision-maker and the expected outcomes from these alternatives. The alternatives include (i) ignore the health risk of nitrate-contaminated water, (ii) switch to alternative water sources such as bottled water, or (iii) implement a previously designed groundwater quality monitoring network that takes into account uncertainties in aquifer properties, contaminant transport processes, and climate (Khader, 2012). The VOI is estimated as the difference between the expected costs of implementing the monitoring network and the lowest-cost uninformed alternative. We illustrate the method for the Eocene Aquifer, West Bank, Palestine, where methemoglobinemia (blue baby syndrome) is the main health problem associated with the principal contaminant nitrate. The expected cost of each alternative is estimated as the weighted sum of the costs and probabilities (likelihoods) associated with the uncertain outcomes resulting from the alternative. Uncertain outcomes include actual nitrate concentrations in the aquifer, concentrations reported by the monitoring system, whether people abide by manager recommendations to use/not use aquifer water, and whether people get sick from drinking contaminated water. Outcome costs
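
    The expected-cost comparison at the heart of the VOI calculation can be sketched in a few lines. The probabilities and costs below are invented, and the real tree also models reporting errors, compliance and sickness outcomes:

```python
# Invented inputs for a stripped-down version of the decision tree.
p_contaminated = 0.3        # prior probability the aquifer water is unsafe
cost_sick = 50_000.0        # expected cost if contaminated water is consumed
cost_bottled = 2_000.0      # cost of switching to bottled water
cost_monitoring = 500.0     # cost of operating the monitoring network

# Uninformed alternatives: ignore the risk, or always buy bottled water.
ec_ignore = p_contaminated * cost_sick
ec_bottled = cost_bottled
best_uninformed = min(ec_ignore, ec_bottled)

# Informed alternative: monitor (assumed perfectly accurate here), then
# switch to bottled water only when contamination is reported.
ec_monitor = cost_monitoring + p_contaminated * cost_bottled

voi = best_uninformed - ec_monitor
print(f"value of information: {voi:,.0f}")
```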

  8. A decision tree model to estimate the value of information provided by a groundwater quality monitoring network

    Directory of Open Access Journals (Sweden)

    A. I. Khader

    2013-05-01

    Full Text Available Groundwater contaminated with nitrate poses a serious health risk to infants when this contaminated water is used for culinary purposes. To avoid this health risk, people need to know whether their culinary water is contaminated or not. Therefore, there is a need to design an effective groundwater monitoring network, acquire information on groundwater conditions, and use acquired information to inform management options. These actions require time, money, and effort. This paper presents a method to estimate the value of information (VOI) provided by a groundwater quality monitoring network located in an aquifer whose water poses a spatially heterogeneous and uncertain health risk. A decision tree model describes the structure of the decision alternatives facing the decision-maker and the expected outcomes from these alternatives. The alternatives include (i) ignore the health risk of nitrate-contaminated water, (ii) switch to alternative water sources such as bottled water, or (iii) implement a previously designed groundwater quality monitoring network that takes into account uncertainties in aquifer properties, contaminant transport processes, and climate (Khader, 2012). The VOI is estimated as the difference between the expected costs of implementing the monitoring network and the lowest-cost uninformed alternative. We illustrate the method for the Eocene Aquifer, West Bank, Palestine, where methemoglobinemia (blue baby syndrome) is the main health problem associated with the principal contaminant nitrate. The expected cost of each alternative is estimated as the weighted sum of the costs and probabilities (likelihoods) associated with the uncertain outcomes resulting from the alternative. Uncertain outcomes include actual nitrate concentrations in the aquifer, concentrations reported by the monitoring system, whether people abide by manager recommendations to use/not use aquifer water, and whether people get sick from drinking contaminated water

  9. The health system burden of chronic disease care: an estimation of provider costs of selected chronic diseases in Uganda.

    Science.gov (United States)

    Settumba, Stella Nalukwago; Sweeney, Sedona; Seeley, Janet; Biraro, Samuel; Mutungi, Gerald; Munderi, Paula; Grosskurth, Heiner; Vassall, Anna

    2015-06-01

    To explore the chronic disease services in Uganda: their level of utilisation, the total service costs and unit costs per visit. Full financial and economic cost data were collected from 12 facilities in two districts, from the provider's perspective. A combination of ingredients-based and step-down allocation costing approaches was used. The diseases under study were diabetes, hypertension, chronic obstructive pulmonary disease (COPD), epilepsy and HIV infection. Data were collected through a review of facility records, direct observation and structured interviews with health workers. Provision of chronic care services was concentrated at higher-level facilities. Excluding drugs, the total costs for NCD care fell below 2% of total facility costs. Unit costs per visit varied widely, both across different levels of the health system, and between facilities of the same level. This variability was driven by differences in clinical and drug prescribing practices. Most patients reported directly to higher-level facilities, bypassing nearby peripheral facilities. NCD services in Uganda are underfunded particularly at peripheral facilities. There is a need to estimate the budget impact of improving NCD care and to standardise treatment guidelines. © 2015 The Authors. Tropical Medicine & International Health Published by John Wiley & Sons Ltd.

  10. Quantitative Estimation of Ising-Type Magnetic Anisotropy in a Family of C3 -Symmetric CoII Complexes.

    Science.gov (United States)

    Mondal, Amit Kumar; Jover, Jesús; Ruiz, Eliseo; Konar, Sanjit

    2017-09-12

    In this paper, the influence of structural and chemical effects on the Ising-type magnetic anisotropy of pentacoordinate CoII complexes has been investigated using a combined experimental and theoretical approach. For this, the deliberate design and synthesis of four pentacoordinate CoII complexes [Co(tpa)Cl]⋅ClO4 (1), [Co(tpa)Br]⋅ClO4 (2), [Co(tbta)Cl]⋅(ClO4)⋅(MeCN)2⋅(H2O) (3) and [Co(tbta)Br]⋅ClO4 (4) using the tripodal ligands tris(2-methylpyridyl)amine (tpa) and tris[(1-benzyl-1H-1,2,3-triazole-4-yl)methyl]amine (tbta) have been carried out. Detailed dc and ac measurements show the existence of field-induced slow magnetic relaxation of the CoII centers with Ising-type magnetic anisotropy. A quantitative estimation of the zero-field splitting (ZFS) parameters has been effectively achieved using detailed ab initio calculations. Computational studies reveal that the wavefunction of all the studied complexes has a very strong multiconfigurational character that stabilizes the largest ms = ±3/2 components of the quartet state and hence produces a large negative contribution to the ZFS parameters. The difference in the magnitudes of the Ising-type anisotropy can be explained through ligand field theory considerations; that is, D is larger and negative in the case of weak equatorial σ-donating and strong apical π-donating ligands. To elucidate the role of intermolecular interactions in the magnetic relaxation behavior between adjacent CoII centers, a diamagnetic isostructural ZnII analog (5) was synthesized and a magnetic dilution experiment was performed. © 2017 Wiley-VCH Verlag GmbH & Co. KGaA, Weinheim.
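
    What D and E quantify can be made concrete with the standard S = 3/2 zero-field-splitting Hamiltonian, H = D(Sz² − S(S+1)/3) + E(Sx² − Sy²). A short numerical sketch; the D and E values are invented, not the paper's ab initio results:

```python
import numpy as np

S = 1.5
m = np.arange(S, -S - 1, -1)                 # basis: ms = 3/2, 1/2, -1/2, -3/2
Sz = np.diag(m)
# Raising operator: <m+1|S+|m> = sqrt(S(S+1) - m(m+1))
Sp = np.diag(np.sqrt(S * (S + 1) - m[1:] * (m[1:] + 1)), k=1)
Sx = (Sp + Sp.T) / 2
Sy = (Sp - Sp.T) / 2j

D, E = -30.0, 2.0                            # cm^-1, illustrative Ising-type values
H = D * (Sz @ Sz - S * (S + 1) / 3 * np.eye(4)) + E * (Sx @ Sx - Sy @ Sy)

gap = np.ptp(np.linalg.eigvalsh(H))          # splitting of the two Kramers doublets
print(gap, 2 * np.sqrt(D**2 + 3 * E**2))     # matches the analytic 2*sqrt(D^2+3E^2)
```

    A negative D puts the ms = ±3/2 doublet lowest, which is the Ising-type anisotropy discussed above.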

  11. Joint Analysis of Near-Isogenic and Recombinant Inbred Line Populations Yields Precise Positional Estimates for Quantitative Trait Loci

    Directory of Open Access Journals (Sweden)

    Kristen L. Kump

    2010-11-01

    Full Text Available Data generated for initial quantitative trait loci (QTL) mapping using recombinant inbred line (RIL) populations are usually ignored during subsequent fine-mapping using near-isogenic lines (NILs). Combining both datasets would increase the number of recombination events sampled and generate better position and effect estimates. Previously, several QTL for resistance to southern leaf blight of maize were mapped in two RIL populations, each independently derived from a cross between the lines B73 and Mo17. In each case the largest QTL was in bin 3.04. Here, two NIL pairs differing for this QTL were derived and used to create two distinct F family populations that were assessed for southern leaf blight (SLB) resistance. By accounting for segregation of the other QTL in the original RIL data, we were able to combine these data with the new genotypic and phenotypic data from the F families. Joint analysis yielded a narrower QTL support interval than analysis of any one of the datasets alone, localizing the QTL to an interval of less than 0.5 cM. Candidate genes identified within this interval are discussed. This methodology allows joint QTL analysis in which data from RIL populations are combined with data from NIL populations segregating for the same pair of alleles. It improves mapping resolution over the conventional approach with virtually no additional resources. Because datasets of this type are commonly produced, this approach is likely to prove widely applicable.

  12. The Impact of Quantitative Data Provided by a Multi-spectral Digital Skin Lesion Analysis Device on Dermatologists' Decisions to Biopsy Pigmented Lesions.

    Science.gov (United States)

    Farberg, Aaron S; Winkelmann, Richard R; Tucker, Natalie; White, Richard; Rigel, Darrell S

    2017-09-01

    BACKGROUND: Early diagnosis of melanoma is critical to survival. New technologies, such as a multi-spectral digital skin lesion analysis (MSDSLA) device [MelaFind, STRATA Skin Sciences, Horsham, Pennsylvania], may be useful to enhance clinician evaluation of concerning pigmented skin lesions. Previous studies evaluated the effect of only the binary output. OBJECTIVE: The objective of this study was to determine how the decisions dermatologists make regarding pigmented lesion biopsies are impacted by providing both the underlying classifier score (CS) and the associated probability risk provided by MSDSLA. This outcome was also compared against the improvement reported with the provision of only the binary output. METHODS: Dermatologists attending an educational conference evaluated 50 pigmented lesions (25 melanomas and 25 benign lesions). Participants were asked if they would biopsy the lesion based on clinical images, and were asked this question again after being shown MSDSLA data that included the probability graphs and classifier score. RESULTS: Data were analyzed from a total of 160 United States board-certified dermatologists. Biopsy sensitivity for melanoma improved from 76 percent following clinical evaluation to 92 percent after the quantitative MSDSLA information was provided. Specificity also improved with the MSDSLA data (64% vs. 86%). CONCLUSION: Providing quantitative MSDSLA data for pigmented lesions led to both increased sensitivity and specificity, thereby resulting in more accurate biopsy decisions.

  13. Quantitative analysis of photoreceptor layer reflectivity on en-face optical coherence tomography as an estimator of cone density.

    Science.gov (United States)

    Saleh, Maher; Flores, Mathieu; Gauthier, Anne Sophie; Elphege, Emeric; Delbosc, Bernard

    2017-11-01

    To investigate whether outer retinal reflectivity on en-face optical coherence tomography (OCT) can be considered an estimator of cone density measured in the same area. Forty-one points of comparison were studied in 9 eyes (n = 6 patients) presenting maculopathies with various degrees of impairment of the photoreceptor layer. The inner segment ellipsoid zone (EZ), interdigitation zone (IZ), and retinal pigment epithelium (RPE) reflectivity were measured on coronal reconstructions of the photoreceptor layer using homemade dedicated software (Matlab, MathWorks Inc., Natick, USA). The cone metrics were measured in the same perifoveal region of interest using a high-resolution flood-illumination adaptive optics camera. A semi-automatic cone counting method was adopted, and all photoreceptor densities provided by the manufacturer's software were recounted manually by two experienced readers. Mean manual cone count was 21,522 ± 6700 cells/mm2 (range, 5908-31,233 cells/mm2). Both EZ and IZ reflectivity values were closely correlated with cone density in the area studied (r2: 0.80 and 0.62, respectively). Outer retinal reflectivity on en-face optical coherence tomography therefore correlates well with photoreceptor density. This cone density estimation method based on retinal reflectivity could have interesting applications in the exploration and management of maculopathies.
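
    The correlation analysis reported here is an ordinary least-squares fit of cone density against band reflectivity. A minimal sketch with invented paired observations:

```python
import numpy as np

# Invented pairs: EZ-band reflectivity (arbitrary units) vs. manually counted
# cone density (cells/mm^2); the study's own data are not reproduced here.
reflectivity = np.array([0.42, 0.55, 0.61, 0.70, 0.78, 0.85])
cone_density = np.array([9000, 14000, 17000, 21000, 25000, 29000])

slope, intercept = np.polyfit(reflectivity, cone_density, 1)
pred = slope * reflectivity + intercept
r2 = 1 - np.sum((cone_density - pred) ** 2) / np.sum(
    (cone_density - cone_density.mean()) ** 2)
print(f"density ~ {slope:.0f} * reflectivity + {intercept:.0f}  (r^2 = {r2:.2f})")
```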

  14. Estimation of genetic parameters and detection of quantitative trait loci for minerals in Danish Holstein and Danish Jersey milk

    DEFF Research Database (Denmark)

    Buitenhuis, Albert Johannes; Poulsen, Nina Aagaard; Sehested, Jakob

    2015-01-01

    Background Bovine milk provides important minerals, essential for human nutrition and dairy product quality. For changing the mineral composition of milk to improve dietary needs in human nutrition and the technological properties of milk, a thorough understanding of the genetics underlying milk mineral contents is important. Therefore the aim of this study was to 1) estimate the genetic parameters for individual minerals in Danish Holstein (DH) (n = 371) and Danish Jersey (DJ) (n = 321) milk, and 2) detect genomic regions associated with mineral content in the milk using a genome-wide association study (GWAS). The results show that Ca, Zn, P and Mg have high heritabilities. In combination with the GWAS results, this opens up possibilities to select for specific minerals in bovine milk.

  15. Quantitative Estimation of Above Ground Crop Biomass using Ground-based, Airborne and Spaceborne Low Frequency Polarimetric Synthetic Aperture Radar

    Science.gov (United States)

    Koyama, C.; Watanabe, M.; Shimada, M.

    2016-12-01

    Estimation of crop biomass is one of the important challenges in environmental remote sensing related to agricultural as well as hydrological and meteorological applications. Usually, passive optical data (photographs, spectral data) operating in the visible and near-infrared bands are used for such purposes. The value of optical remote sensing for yield estimation, however, is rather limited, as visible light can only provide information about the chemical characteristics of the canopy surface. Low-frequency microwave signals with wavelengths longer than 20 cm have the potential to penetrate through the canopy and provide information about the whole vertical structure of vegetation, from the top of the canopy down to the very soil surface. This phenomenon has long been known and exploited to detect targets under vegetation in the military radar application known as FOPEN (foliage penetration). With the availability of polarimetric interferometric SAR data, the use of PolInSAR techniques to retrieve vertical vegetation structure has become an attractive tool. However, PolInSAR is still highly experimental and suitable data are not yet widely available. In this study we focus on the use of operational dual-polarization L-band (1.27 GHz) SAR, which has been available worldwide since the launch of Japan's Advanced Land Observing Satellite (ALOS, 2006-2011). Since 2014, ALOS-2 has continued to deliver this kind of partial polarimetric data for the entire land surface. In addition to these spaceborne datasets we use airborne L-band SAR data acquired by the Japanese Pi-SAR-L2, as well as ultra-wideband (UWB) ground-based SAR data operating in the frequency range from 1-4 GHz. By exploiting the complex dual-polarization [C2] covariance matrix information, the scattering contributions from the canopy can be well separated from the ground reflections, allowing for the establishment of semi-empirical relationships between measured radar reflectivity and the amount of fresh-weight above-ground biomass.
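
    A semi-empirical backscatter-to-biomass relationship of the kind mentioned above is often a regression of biomass on cross-polarized backscatter in decibels. A sketch with invented coefficients (the study fits its own from the separated canopy contribution):

```python
import numpy as np

def to_db(sigma0_linear):
    """Convert linear backscatter (m^2/m^2) to decibels."""
    return 10.0 * np.log10(sigma0_linear)

# Invented canopy HV backscatter values and regression coefficients.
sigma0_hv = np.array([0.004, 0.009, 0.020])
a, b = 180.0, 6.5                        # hypothetical intercept/slope (t/ha per dB)

fresh_biomass = a + b * to_db(sigma0_hv)
print(fresh_biomass)                     # fresh-weight above-ground biomass, t/ha
```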

  16. Quantitative estimation of compliance of human systemic veins by occlusion plethysmography with radionuclide. Methodology and the effect of nitroglycerin

    Energy Technology Data Exchange (ETDEWEB)

    Takatsu, Hisato; Gotoh, Kohshi; Suzuki, Takahiko; Ohsumi, Yukio; Yagi, Yasuo; Tsukamoto, Tatsuo; Terashima, Yasushi; Nagashima, Kenshi; Hirakawa, Senri (Gifu Univ. (Japan). Faculty of Medicine)

    1989-03-01

    Volume-pressure relationships and the compliance of human systemic veins were estimated quantitatively and noninvasively using radionuclide. The effect of nitroglycerin (NTG) on these parameters was examined. Plethysmography with radionuclide (RN) was performed using the occlusion method on the forearm in 56 patients with various cardiac diseases after RN angiocardiography with 99mTc-RBC. The RN counts-venous pressure curve was constructed from the changes in radioactivity in a region of interest on the forearm, considered to reflect changes in forearm blood volume, and the changes in forearm venous pressure due to venous occlusion. The specific compliance of the forearm veins, Csp.fv = (1/V)·(ΔV/ΔP), was obtained graphically from this curve at each patient's venous pressure (Pv). Csp.fv was 0.044 mmHg−1 in class I, 0.033 mmHg−1 in class II, and 0.019 mmHg−1 in class III of the previous NYHA classification of work tolerance. The systemic venous blood volume (Vsv) was determined by subtracting the central blood volume, measured by RN angiocardiography, from the total blood volume, measured by the indicator dilution method utilizing 131I-human serum albumin. Systemic venous compliance (Csv) was calculated from Csv = Csp.fv·Vsv. The Csv was 127.2 ml mmHg−1 in class I, 101.1 ml mmHg−1 in class II and 62.2 ml mmHg−1 in class III; the differences among the three classes were significant. The class I Csv of 127.2 ml mmHg−1 corresponds to a Csv/body weight of 2.3 ml mmHg−1 kg−1. The administration of NTG increased Csv significantly in all cases.
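
    The two relations in this abstract combine into a short worked example. Csp.fv is the reported class I value; the blood volumes are assumed for illustration:

```python
# Csv = Csp.fv * Vsv, with Vsv = total blood volume - central blood volume.
csp_fv = 0.044                 # specific forearm-vein compliance, mmHg^-1 (class I)
total_blood_volume = 5000.0    # ml, indicator dilution (assumed value)
central_blood_volume = 2100.0  # ml, RN angiocardiography (assumed value)

vsv = total_blood_volume - central_blood_volume
csv = csp_fv * vsv
print(f"Vsv = {vsv:.0f} ml, Csv = {csv:.1f} ml/mmHg")
# With these assumed volumes Csv ~ 128 ml/mmHg, of the order of the
# 127.2 ml/mmHg reported for class I patients.
```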

  17. Effects on the estimated cause-specific mortality fraction of providing physician reviewers with different formats of verbal autopsy data

    Directory of Open Access Journals (Sweden)

    Chow Clara

    2011-08-01

    a cause of death did not substantively influence the pattern of mortality estimated. Substantially abbreviated and simplified verbal autopsy questionnaires might provide robust information about high-level mortality patterns.

  18. Quantitative Estimation of Coastal Changes along Selected Locations of Karnataka, India: A GIS and Remote Sensing Approach

    Digital Repository Service at National Institute of Oceanography (India)

    Vinayaraj, P.; Johnson, G.; Dora, G.U.; Philip, C.S.; SanilKumar, V.; Gowthaman, R.

    Qualitative and quantitative studies on changes in the coastal geomorphology and shoreline of Karnataka, India have been carried out using toposheets of the Survey of India and satellite imageries (IRS-P6 and IRS-1D). Changes during a 30-year period...

  19. Towards a Quantitative Use of Satellite Remote Sensing in Crop Growth Models for Large Scale Agricultural Production Estimate (Invited)

    Science.gov (United States)

    Defourny, P.

    2013-12-01

    such as the Green Area Index (GAI), fAPAR and fcover, usually retrieved from MODIS, MERIS or SPOT-Vegetation, described the quality of green vegetation development. The GLOBAM (Belgium) and EU FP-7 MOCCCASIN (Russia) projects improved the standard products and were demonstrated at large scale. The GAI retrieved from MODIS time series using a purity index criterion successfully depicted the inter-annual variability. Furthermore, the quantitative assimilation of these GAI time series into a crop growth model improved the yield estimates across years. These results showed that GAI assimilation works best at the district or provincial level. In the context of the GEO Ag., the Joint Experiment for Crop Assessment and Monitoring (JECAM) was designed to enable the global agricultural monitoring community to compare such methods and results over a variety of regional cropping systems. For a network of test sites around the world, satellite and field measurements are currently being collected and will be made available for collaborative effort. This experiment should facilitate international standards for data products and reporting, eventually supporting the development of a global system of systems for agricultural crop assessment and monitoring.

  20. Evaluation of Landsat-Based METRIC Modeling to Provide High-Spatial Resolution Evapotranspiration Estimates for Amazonian Forests

    Directory of Open Access Journals (Sweden)

    Izaya Numata

    2017-01-01

    Full Text Available While forest evapotranspiration (ET) dynamics in the Amazon have been studied both as point estimates using flux towers and as spatially coarse surfaces using satellite data, higher-resolution (e.g., 30 m) ET estimates are necessary to address the finer spatial variability associated with forest biophysical characteristics and their changes due to natural and human impacts. The objective of this study is to evaluate the potential of the Landsat-based METRIC (Mapping Evapotranspiration at high Resolution with Internalized Calibration) model to estimate high-resolution (30 m) forest ET by comparison to flux tower ET (FT ET) data collected over seasonally dry tropical forests in Rondônia, in the southwestern region of the Amazon. Analyses were conducted at daily, monthly and seasonal scales for the dry seasons (June–September for Rondônia) of 2000–2002. Overall daily ET comparison between FT ET and METRIC ET across the study site showed r2 = 0.67 with RMSE = 0.81 mm. For the seasonal ET comparison, METRIC-derived ET estimates showed an agreement with FT ET measurements during the dry season of r2 > 0.70 and %MAE < 15%. We also discuss some challenges and potential applications of METRIC for Amazonian forests.
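
    The agreement statistics used in this evaluation (r², RMSE, %MAE) are straightforward to compute from paired daily values. A sketch with invented numbers:

```python
import numpy as np

def et_agreement(ft_et, metric_et):
    """r^2, RMSE (mm) and %MAE between flux-tower and METRIC daily ET."""
    ft_et, metric_et = np.asarray(ft_et), np.asarray(metric_et)
    r2 = np.corrcoef(ft_et, metric_et)[0, 1] ** 2
    rmse = np.sqrt(np.mean((metric_et - ft_et) ** 2))
    pmae = 100.0 * np.mean(np.abs(metric_et - ft_et)) / np.mean(ft_et)
    return r2, rmse, pmae

# Invented dry-season daily ET values (mm/day), not the study's data.
ft = [3.1, 3.4, 2.9, 3.8, 3.3]
metric = [3.0, 3.9, 2.5, 3.6, 3.5]
print("r2 = %.2f, RMSE = %.2f mm, %%MAE = %.1f%%" % et_agreement(ft, metric))
```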

  1. Scientific Opinion on a quantitative estimation of the public health impact of setting a new target for the reduction of Salmonella in laying hens

    DEFF Research Database (Denmark)

    Hald, Tine; Nørrung, Birgit; Chriél, Mariann

    sampling protocols. Diversion of eggs from flocks that test positive in the EU Salmonella control programme to the production of egg products subjected to heat treatment may lead to increased health risks, as heat treatment of egg products should not be considered an absolute barrier to Salmonella. Research on within-flock dynamics of Salmonella and the harvesting of data on the production of Salmonella-contaminated eggs under field conditions would contribute to improving the accuracy of future quantitative estimates.

  2. Simplifying ART cohort monitoring: Can pharmacy stocks provide accurate estimates of patients retained on antiretroviral therapy in Malawi?

    Directory of Open Access Journals (Sweden)

    Tweya Hannock

    2012-07-01

    Full Text Available Abstract Background Routine monitoring of patients on antiretroviral therapy (ART) is crucial for measuring program success and accurate drug forecasting. However, compiling data from patient registers to measure retention in ART is labour-intensive. To address this challenge, we conducted a pilot study in Malawi to assess whether patient ART retention could be determined using pharmacy records, as compared to estimates of retention based on standardized paper-based or electronic cohort reports. Methods Twelve ART facilities were included in the study: six used paper-based registers and six used electronic data systems. One ART facility implemented an electronic data system in quarter three and was included as a paper-based facility in quarter two only. Routine patient retention cohort reports, paper or electronic, were collected from facilities for both quarter two [April–June] and quarter three [July–September], 2010. Pharmacy stock data were also collected from the 12 ART facilities over the same period. The numbers of ART continuation bottles recorded on pharmacy stock cards at the beginning and end of each quarter were documented. These pharmacy data were used to calculate the total bottles dispensed to patients in each quarter, with the intent to estimate the number of patients retained on ART. Information on the time required to determine ART retention was gathered through interviews with clinicians tasked with compiling the data. Results Among ART clinics with paper-based systems, three of six facilities in quarter two and four of five facilities in quarter three had similar numbers of patients retained on ART when comparing cohort reports to pharmacy stock records. In ART clinics with electronic systems, five of six facilities in quarter two and five of seven facilities in quarter three had similar numbers of patients retained on ART when comparing retention numbers from electronically generated cohort reports to pharmacy stock records. Among

  3. A computerised sampling strategy for therapeutic drug monitoring of lithium provides precise estimates and significantly reduces dose-finding time

    DEFF Research Database (Denmark)

    Høgberg, Lotte Christine Groth; Jürgens, Gesche; Zederkof, Vivian Wederking

    2012-01-01

    citrate. A Bayesian approach was used in the intervention groups, and an estimate of the lithium steady-state trough concentration was obtained from a non-steady-state blood sample collected about 12 hr after the first lithium study dose. The estimate was compared with the actually measured steady-state concentration. In the control group, lithium monitoring was traditionally performed with steady-state blood sampling. Predicted and measured lithium concentrations were comparable. The desired lithium dose was reached significantly faster in the intervention group than in the control group: 2.47 ± 2.22 days versus 9.96 ± 11.24 days (mean ± S.D.) (p = 0.0003). The Bayesian approach was an advantage for the clinicians as a fast and safe aid to obtaining the optimal lithium treatment dose.
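
    The prediction step rests on linear-kinetics superposition: a trough measured after the first dose scales up to the steady-state trough by an accumulation factor. A heavily simplified sketch with assumed values; the study itself used a full Bayesian estimator combining a population prior with the measurement:

```python
import math

c_12h = 0.25        # mmol/L measured ~12 h after the first dose (assumed)
half_life_h = 20.0  # individual elimination half-life (assumed; the Bayesian
                    # step effectively individualizes this parameter)
tau_h = 12.0        # dosing interval, hours

ke = math.log(2) / half_life_h
accumulation = 1.0 / (1.0 - math.exp(-ke * tau_h))   # linear-kinetics factor
c_ss_trough = c_12h * accumulation
print(f"predicted steady-state trough ~ {c_ss_trough:.2f} mmol/L")
```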

  4. Derelict Fishing Line Provides a Useful Proxy for Estimating Levels of Non-Compliance with No-Take Marine Reserves

    OpenAIRE

    Williamson, David H.; Ceccarelli, Daniela M.; Evans, Richard D.; Hill, Jos K.; Russ, Garry R.

    2014-01-01

    No-take marine reserves (NTMRs) are increasingly being established to conserve or restore biodiversity and to enhance the sustainability of fisheries. Although effectively designed and protected NTMR networks can yield conservation and fishery benefits, reserve effects often fail to manifest in systems where there are high levels of non-compliance by fishers (poaching). Obtaining reliable estimates of NTMR non-compliance can be expensive and logistically challenging, particularly in areas with limited or non-existent resources for conducting surveillance and enforcement.

  5. Comparison Of Quantitative Precipitation Estimates Derived From Rain Gauge And Radar Derived Algorithms For Operational Flash Flood Support.

    Science.gov (United States)

    Streubel, D. P.; Kodama, K.

    2014-12-01

    To provide continuous flash flood situational awareness and to better differentiate the severity of individual precipitation events, the National Weather Service Research Distributed Hydrologic Model (RDHM) is being implemented over Hawaii and Alaska. In the implementation of RDHM, three gridded precipitation analyses are used as forcing. The first analysis is a radar-only precipitation estimate derived from WSR-88D digital hybrid reflectivity and a Z-R relationship, aggregated onto an hourly ¼ HRAP grid. The second analysis is derived from a rain gauge network and interpolated onto an hourly ¼ HRAP grid using PRISM climatology. The third analysis is derived from a rain gauge network in which gauges are assigned static pre-determined weights to derive a uniform mean areal precipitation applied over a catchment on a ¼ HRAP grid. To assess the effect of the different QPE analyses on the accuracy of RDHM simulations, and to potentially identify a preferred analysis for operational use, each QPE was used to force RDHM to simulate stream flow for 20 USGS peak flow events. The evaluation of the RDHM simulations focused on peak flow magnitude, peak flow timing, and event volume accuracy, as these are most relevant for operational use. Results showed that RDHM simulations based on the observed rain gauge amounts were more accurate in simulating peak flow magnitude and event volume than those based on the radar-derived analysis. However, this result was not consistent across all 20 events, nor for a few of the rainfall events where an annual peak flow was recorded at more than one USGS gage. This suggests that a more robust QPE forcing incorporating uncertainty derived from the three analyses may provide a better input for simulating extreme peak flow events.
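
    The radar-only analysis starts from a Z-R power law, Z = a·R^b. A sketch of the dBZ-to-rain-rate conversion; a = 300, b = 1.4 is a common WSR-88D convective default and may differ from the operational parameters used here:

```python
def rain_rate_mm_per_h(dbz, a=300.0, b=1.4):
    """Invert Z = a * R**b for rain rate R, given reflectivity in dBZ."""
    z = 10.0 ** (dbz / 10.0)      # dBZ -> linear reflectivity factor (mm^6/m^3)
    return (z / a) ** (1.0 / b)

for dbz in (20, 35, 50):
    print(f"{dbz} dBZ -> {rain_rate_mm_per_h(dbz):.1f} mm/h")
```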

  6. Effects of calibration methods on quantitative material decomposition in photon-counting spectral computed tomography using a maximum a posteriori estimator.

    Science.gov (United States)

    Curtis, Tyler E; Roeder, Ryan K

    2017-10-01

    Advances in photon-counting detectors have enabled quantitative material decomposition using multi-energy or spectral computed tomography (CT). Supervised methods for material decomposition utilize an estimated attenuation for each material of interest at each photon energy level, which must be calibrated based upon calculated or measured values for known compositions. Measurements using a calibration phantom can advantageously account for system-specific noise, but the effect of calibration methods on the material basis matrix and subsequent quantitative material decomposition has not been experimentally investigated. Therefore, the objective of this study was to investigate the influence of the range and number of contrast agent concentrations within a modular calibration phantom on the accuracy of quantitative material decomposition in the image domain. Gadolinium was chosen as a model contrast agent in imaging phantoms, which also contained bone tissue and water as negative controls. The maximum gadolinium concentration (30, 60, and 90 mM) and total number of concentrations (2, 4, and 7) were independently varied to systematically investigate effects of the material basis matrix and scaling factor calibration on the quantitative (root mean squared error, RMSE) and spatial (sensitivity and specificity) accuracy of material decomposition. Images of calibration and sample phantoms were acquired using a commercially available photon-counting spectral micro-CT system with five energy bins selected to normalize photon counts and leverage the contrast agent k-edge. Material decomposition of gadolinium, calcium, and water was performed for each calibration method using a maximum a posteriori estimator. Both the quantitative and spatial accuracy of material decomposition were most improved by using an increased maximum gadolinium concentration (range) in the basis matrix calibration; the effects of using a greater number of concentrations were relatively small in
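
    The decomposition step itself can be sketched as a per-pixel linear inversion against the calibrated basis matrix. The study used a maximum a posteriori estimator; plain non-negative least squares stands in for it below, and all numbers are invented:

```python
import numpy as np
from scipy.optimize import nnls

# Hypothetical basis matrix: rows = 5 energy bins, columns = attenuation per
# unit of gadolinium (per mM), calcium and water, from calibration scans.
A = np.array([
    [0.031, 0.52, 0.210],
    [0.044, 0.41, 0.195],   # bin straddling the Gd k-edge (illustrative)
    [0.025, 0.33, 0.185],
    [0.021, 0.28, 0.180],
    [0.018, 0.24, 0.176],
])

true_x = np.array([30.0, 0.0, 1.0])          # 30 mM Gd in water, no calcium
pixel = A @ true_x
pixel += np.random.default_rng(0).normal(0.0, 0.01, size=5)  # detector noise

x_hat, _ = nnls(A, pixel)                    # non-negative per-pixel estimate
print(f"Gd = {x_hat[0]:.1f} mM, Ca = {x_hat[1]:.2f}, water = {x_hat[2]:.2f}")
```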

  7. Quantitative estimation of the influence of external vibrations on the measurement error of a coriolis mass-flow meter

    NARCIS (Netherlands)

    van de Ridder, Bert; Hakvoort, Wouter; van Dijk, Johannes; Lötters, Joost Conrad; de Boer, Andries; Dimitrovova, Z.; de Almeida, J.R.

    2013-01-01

    In this paper the quantitative influence of external vibrations on the measurement value of a Coriolis mass-flow meter for low flows is investigated, with the eventual goal of reducing the influence of vibrations. Model results are compared with experimental results to improve the knowledge on how

  8. Quantitative integration of seismic and GPR reflections to derive unique estimates for water saturation and porosity in subsoil

    NARCIS (Netherlands)

    Ghose, R.; Slob, E.C.

    2006-01-01

    For shallow subsoil, the estimates of in-situ porosity and water saturation are important, but until now it has been difficult to estimate them reliably. We relate seismic and GPR reflection coefficients to porosity and water saturation using a shared earth model. Using this model, we propose a

  9. Social Media and Language Processing: How Facebook and Twitter Provide the Best Frequency Estimates for Studying Word Recognition.

    Science.gov (United States)

    Herdağdelen, Amaç; Marelli, Marco

    2017-05-01

    Corpus-based word frequencies are one of the most important predictors in language processing tasks. Frequencies based on conversational corpora (such as movie subtitles) are shown to better capture the variance in lexical decision tasks compared to traditional corpora. In this study, we show that frequencies computed from social media are currently the best frequency-based estimators of lexical decision reaction times (up to 3.6% increase in explained variance). The results are robust (observed for Twitter- and Facebook-based frequencies on American English and British English datasets) and are still substantial when we control for corpus size. © 2016 The Authors. Cognitive Science published by Wiley Periodicals, Inc. on behalf of Cognitive Science Society.
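
    The benchmark behind these comparisons is how much variance in lexical decision reaction times a log-frequency predictor explains. A toy version with invented data:

```python
import numpy as np

# Invented word frequencies (per million tokens) and mean lexical decision
# reaction times (ms); real evaluations use large behavioral datasets.
freq_per_million = np.array([1.2, 8.5, 30.0, 120.0, 900.0])
rt_ms = np.array([720.0, 660.0, 625.0, 590.0, 540.0])

log_f = np.log10(freq_per_million)
slope, intercept = np.polyfit(log_f, rt_ms, 1)
pred = slope * log_f + intercept
r2 = 1 - np.sum((rt_ms - pred) ** 2) / np.sum((rt_ms - rt_ms.mean()) ** 2)
print(f"RT ~ {slope:.0f} * log10(freq) + {intercept:.0f} ms  (r^2 = {r2:.2f})")
```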

  10. Derelict fishing line provides a useful proxy for estimating levels of non-compliance with no-take marine reserves.

    Science.gov (United States)

    Williamson, David H; Ceccarelli, Daniela M; Evans, Richard D; Hill, Jos K; Russ, Garry R

    2014-01-01

    No-take marine reserves (NTMRs) are increasingly being established to conserve or restore biodiversity and to enhance the sustainability of fisheries. Although effectively designed and protected NTMR networks can yield conservation and fishery benefits, reserve effects often fail to manifest in systems where there are high levels of non-compliance by fishers (poaching). Obtaining reliable estimates of NTMR non-compliance can be expensive and logistically challenging, particularly in areas with limited or non-existent resources for conducting surveillance and enforcement. Here we assess the utility of density estimates and re-accumulation rates of derelict (lost and abandoned) fishing line as a proxy for fishing effort and NTMR non-compliance on fringing coral reefs in three island groups of the Great Barrier Reef Marine Park (GBRMP), Australia. Densities of derelict fishing line were consistently lower on reefs within old (>20 year) NTMRs than on non-NTMR reefs (significantly in the Palm and Whitsunday Islands), whereas line densities did not differ significantly between reefs in new NTMRs (5 years of protection) and non-NTMR reefs. A manipulative experiment in which derelict fishing lines were removed from a subset of the monitoring sites demonstrated that lines re-accumulated on NTMR reefs at approximately one third (32.4%) of the rate observed on non-NTMR reefs over a thirty-two month period. Although these inshore NTMRs have long been considered some of the best protected within the GBRMP, evidence presented here suggests that the level of non-compliance with NTMR regulations is higher than previously assumed.

  11. Derelict fishing line provides a useful proxy for estimating levels of non-compliance with no-take marine reserves.

    Directory of Open Access Journals (Sweden)

    David H Williamson

    Full Text Available No-take marine reserves (NTMRs) are increasingly being established to conserve or restore biodiversity and to enhance the sustainability of fisheries. Although effectively designed and protected NTMR networks can yield conservation and fishery benefits, reserve effects often fail to manifest in systems where there are high levels of non-compliance by fishers (poaching). Obtaining reliable estimates of NTMR non-compliance can be expensive and logistically challenging, particularly in areas with limited or non-existent resources for conducting surveillance and enforcement. Here we assess the utility of density estimates and re-accumulation rates of derelict (lost and abandoned) fishing line as a proxy for fishing effort and NTMR non-compliance on fringing coral reefs in three island groups of the Great Barrier Reef Marine Park (GBRMP), Australia. Densities of derelict fishing line were consistently lower on reefs within old (>20 year) NTMRs than on non-NTMR reefs (significantly in the Palm and Whitsunday Islands), whereas line densities did not differ significantly between reefs in new NTMRs (5 years of protection) and non-NTMR reefs. A manipulative experiment in which derelict fishing lines were removed from a subset of the monitoring sites demonstrated that lines re-accumulated on NTMR reefs at approximately one third (32.4%) of the rate observed on non-NTMR reefs over a thirty-two month period. Although these inshore NTMRs have long been considered some of the best protected within the GBRMP, evidence presented here suggests that the level of non-compliance with NTMR regulations is higher than previously assumed.

  12. Estimating the Potential Toxicity of Chemicals Associated with Hydraulic Fracturing Operations Using Quantitative Structure-Activity Relationship Modeling

    Science.gov (United States)

    Researchers facilitated the evaluation of chemicals that lack chronic oral toxicity values by using a QSAR model to develop estimates of potential toxicity for chemicals used in hydraulic fracturing (HF) fluids or found in flowback or produced water.

  13. Estimation of the economic value of the ecosystem services provided by the Blue Nile Basin in Ethiopia

    NARCIS (Netherlands)

    Tesfaye, A.; Wolanios, N.; Brouwer, R.

    2016-01-01

    This paper aims to quantify and economically value the main ecosystem services provided by the Blue Nile basin in Ethiopia. It is the first study of its kind to do so in a consistent and comprehensive manner using the same valuation approach. Water flows are linked to corresponding economic market

  14. Quantitative Electroencephalographic Analysis Provides an Early-Stage Indicator of Disease Onset and Progression in the zQ175 Knock-In Mouse Model of Huntington's Disease

    Science.gov (United States)

    Fisher, Simon P.; Schwartz, Michael D.; Wurts-Black, Sarah; Thomas, Alexia M.; Chen, Tsui-Ming; Miller, Michael A.; Palmerston, Jeremiah B.; Kilduff, Thomas S.; Morairty, Stephen R.

    2016-01-01

    intervention and improve outcomes for patients with HD. Citation: Fisher SP, Schwartz MD, Wurts-Black S, Thomas AM, Chen TM, Miller MA, Palmerston JB, Kilduff TS, Morairty SR. Quantitative electroencephalographic analysis provides an early-stage indicator of disease onset and progression in the zQ175 knock-in mouse model of Huntington's disease. SLEEP 2016;39(2):379–391. PMID:26446107

  15. Similar gene estimates from circular and linear standards in quantitative PCR analyses using the prokaryotic 16S rRNA gene as a model.

    Directory of Open Access Journals (Sweden)

    Athenia L Oldham

    Full Text Available Quantitative PCR (qPCR) is one of the most widely used tools for quantifying absolute numbers of microbial gene copies in test samples. A recent publication showed that circular plasmid DNA standards grossly overestimated the numbers of a target gene, by as much as 8-fold, in a eukaryotic system using qPCR analysis. Overestimation of microbial numbers is a serious concern in industrial settings where qPCR estimates form the basis for quality control or mitigation decisions. Unlike eukaryotes, bacteria and archaea most commonly have circular genomes and plasmids and therefore may not be subject to the same levels of overestimation. Therefore, the feasibility of using circular DNA plasmids as standards for 16S rRNA gene estimates was assayed using these two prokaryotic systems, with the practical advantage being rapid standard preparation for ongoing qPCR analyses. Full-length 16S rRNA gene sequences from Thermovirga lienii and Archaeoglobus fulgidus were cloned and used to generate standards for bacterial and archaeal qPCR reactions, respectively. Estimates of 16S rRNA gene copies were made based on circular and linearized DNA conformations using two genomes from each domain: Desulfovibrio vulgaris, Pseudomonas aeruginosa, Archaeoglobus fulgidus, and Methanocaldococcus jannaschii. The ratio of estimated to predicted 16S rRNA gene copies ranged from 0.5 to 2.2-fold in bacterial systems and 0.5 to 1.0-fold in archaeal systems, demonstrating that circular plasmid standards did not lead to the gross overestimates previously reported for eukaryotic systems.
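
    Building such standards relies on the usual mass-to-copies conversion for the standard curve. A small helper using the standard formula; the 4.5 kb vector size is illustrative:

```python
from scipy.constants import Avogadro  # ~6.022e23 molecules/mol

def plasmid_copies(mass_ng, length_bp, g_per_mol_per_bp=650.0):
    """copies = mass(g) * Avogadro / (length_bp * average molar mass per bp)."""
    return mass_ng * 1e-9 * Avogadro / (length_bp * g_per_mol_per_bp)

# Illustrative: 1 ng of a ~4.5 kb vector carrying a full-length 16S insert.
print(f"{plasmid_copies(1.0, 4500):.3e} copies")
```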

  16. Estimating the effect of plant-provided food supplements on pest consumption by omnivorous predators: lessons from two coccinellid beetles.

    Science.gov (United States)

    Schuldiner-Harpaz, Tarryn; Coll, Moshe

    2017-05-01

    Plant-provided food supplements can influence biological pest control by omnivorous predators in two counteracting ways: they can (i) enhance predator populations, but (ii) reduce pest consumption by individual predators. Yet the majority of studies address only one of these aspects. Here, we first tested the influence of canola (Brassica napus L.) pollen supplements on the life history of two ladybeetle species: Hippodamia variegata (Goeze) and Coccinella septempunctata (L.). We then developed a theoretical model to simulate total pest consumption in the presence and absence of pollen supplements. Supplementing a prey diet with canola pollen increased H. variegata larval survival from 50 to 82%, and C. septempunctata female oviposition by 1.6-fold. Model simulations revealed a greater benefit of pollen supplements when relying on C. septempunctata for pest suppression than on H. variegata. For these two predators, the tested pollen serves as an essential supplement to a diet of prey. However, the benefit of a mixed prey-pollen diet was not always sufficient to overcome the individual decrease in pest consumption. Taken together, our study highlights the importance of addressing both positive and negative roles of plant-provided food supplements in considering the outcome for biological control efforts that rely on omnivorous predators. © 2016 Society of Chemical Industry.
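
    The trade-off the authors model, pollen raising predator survival while depressing per-capita prey consumption, can be illustrated with a minimal cohort simulation. This is not the paper's model, only a sketch of its logic; all parameter values below are hypothetical.

```python
# Minimal sketch of the counteracting effects described above: pollen raises
# predator survival (more individuals) but lowers per-capita prey consumption.

def total_consumption(days, n0, survival, percap_prey_per_day):
    """Cumulative prey eaten by a predator cohort with daily survival probability."""
    n, eaten = n0, 0.0
    for _ in range(days):
        eaten += n * percap_prey_per_day
        n *= survival
    return eaten

prey_only = total_consumption(days=30, n0=100, survival=0.95, percap_prey_per_day=10.0)
prey_plus_pollen = total_consumption(days=30, n0=100, survival=0.99, percap_prey_per_day=7.0)

print(f"prey-only diet:   {prey_only:.0f} prey consumed")
print(f"prey+pollen diet: {prey_plus_pollen:.0f} prey consumed")
# Whether the supplement helps biological control depends on which effect dominates.
```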

  17. Stereological estimates of nuclear volume and other quantitative variables in supratentorial brain tumors. Practical technique and use in prognostic evaluation

    DEFF Research Database (Denmark)

    Sørensen, Flemming Brandt; Braendgaard, H; Chistiansen, A O

    1991-01-01

    the practical technique. The continuous variables were correlated with the subjective, qualitative WHO classification of brain tumors, and the prognostic value of the parameters was assessed. Well differentiated astrocytomas (n = 14) had smaller estimates of the volume-weighted mean nuclear volume and mean … was significantly increased in glioblastomas (2p = 0.01). Three-dimensional, shape-independent estimates of macroscopical tumor volume were not different in anaplastic astrocytomas and glioblastomas (2p = 0.39). Histological type of tumor and mitotic index were of significant prognostic value (2p = 8.2 × 10⁻⁶ and 2 …) … techniques in the prognostic evaluation of primary brain tumors.

  18. Development and Validation of RP-HPLC Method for Quantitative Estimation of Vinpocetine in Pure and Pharmaceutical Dosage Forms

    Directory of Open Access Journals (Sweden)

    Subrata Bhadra

    2011-01-01

    Full Text Available A simple, precise, specific, and accurate reversed phase high performance liquid chromatographic (RP-HPLC) method was developed and validated for determination of vinpocetine in pure and pharmaceutical dosage forms. The different analytical performance parameters such as linearity, accuracy, specificity, precision, and sensitivity (limit of detection and limit of quantitation) were determined according to International Conference on Harmonization ICH Q2(R1) guidelines. RP-HPLC was conducted on a Zorbax C18 column (150 mm length × 4.6 mm ID, 5 μm). The mobile phase consisted of buffer (1.54% w/v ammonium acetate solution) and acetonitrile in the ratio 40:60 (v/v), and the flow rate was maintained at 1.0 mL min−1. Vinpocetine was monitored using an Agilent 1200 series system equipped with a photodiode array detector (λ = 280 nm). Linearity was observed in the concentration range of 160–240 μg mL−1, and the correlation coefficient was excellent (R2 = 0.999). All the system suitability parameters were found within range. The proposed method is rapid and cost-effective and can be used as a quality-control tool for routine quantitative analysis of vinpocetine in pure and pharmaceutical dosage forms.
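
    For context, the linearity and sensitivity figures reported in validations like this follow the standard ICH Q2(R1) formulas, LOD = 3.3σ/S and LOQ = 10σ/S, where σ is the residual standard deviation of the calibration regression and S its slope. A minimal sketch with invented peak areas:

```python
# Sketch of ICH Q2(R1)-style linearity and sensitivity calculations behind a
# method validation. Peak areas below are invented placeholders, not study data.
import numpy as np

conc = np.array([160, 180, 200, 220, 240], dtype=float)      # µg/mL
area = np.array([1602, 1795, 2011, 2198, 2405], dtype=float) # detector response

slope, intercept = np.polyfit(conc, area, 1)
pred = slope * conc + intercept
ss_res = np.sum((area - pred) ** 2)
ss_tot = np.sum((area - area.mean()) ** 2)
r2 = 1 - ss_res / ss_tot

sigma = np.sqrt(ss_res / (len(conc) - 2))  # residual standard deviation
lod = 3.3 * sigma / slope                  # ICH Q2(R1) detection limit
loq = 10.0 * sigma / slope                 # ICH Q2(R1) quantitation limit

print(f"R^2 = {r2:.4f}, LOD = {lod:.2f} µg/mL, LOQ = {loq:.2f} µg/mL")
```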

  19. A quantitative real time polymerase chain reaction approach for estimating processed animal proteins in feed: preliminary data

    Directory of Open Access Journals (Sweden)

    Maria Cesarina Abete

    2013-04-01

    Full Text Available Lifting of the ban on the use of processed animal proteins (PAPs) from non-ruminants in non-ruminant feed is expected, provided that intraspecies recycling is avoided. Discrimination of species will be performed through polymerase chain reaction (PCR), which is at the moment a merely qualitative method. Nevertheless, quantification of PAPs in feed is needed. The aim of this study was to approach the quantitative determination of PAPs in feed through the Real Time (RT-)PCR technique; three different protocols taken from the literature were tested. Three different kinds of matrices were examined: pure animal meals (bovine, chicken and pork); one feed sample certified by the European reference laboratory on animal proteins (EURL AP) in feed, spiked with 0.1% bovine meal; and genomic DNAs from bovine, chicken and pork muscles. The limit of detection (LOD) of each of the three protocols was determined. All three protocols failed in the quantification process, most likely due to the uncertain copy numbers of the analytical targets chosen. This preliminary study will allow us to address further investigations, with the purpose of developing a quantitative RT-PCR method.

  20. Stereological estimation of nuclear volume and other quantitative histopathological parameters in the prognostic evaluation of supraglottic laryngeal squamous cell carcinoma

    DEFF Research Database (Denmark)

    Sørensen, Flemming Brandt; Bennedbaek, O; Pilgaard, J

    1989-01-01

    The aim of this study was to investigate various approaches to the grading of malignancy in pre-treatment biopsies from patients with supraglottic laryngeal squamous cell carcinoma. The prospects of objective malignancy grading based on stereological estimation of the volume-weighted mean nuclear...

  1. Bayesian estimation and use of high-throughput remote sensing indices for quantitative genetic analyses of leaf growth.

    Science.gov (United States)

    Baker, Robert L; Leong, Wen Fung; An, Nan; Brock, Marcus T; Rubin, Matthew J; Welch, Stephen; Weinig, Cynthia

    2017-10-20

    We develop Bayesian function-valued trait models that mathematically isolate genetic mechanisms underlying leaf growth trajectories by factoring out genotype-specific differences in photosynthesis. Remote sensing data can be used instead of leaf-level physiological measurements. Characterizing the genetic basis of traits that vary during ontogeny and affect plant performance is a major goal in evolutionary biology and agronomy. Describing genetic programs that specifically regulate morphological traits can be complicated by genotypic differences in physiological traits. We describe the growth trajectories of leaves using novel Bayesian function-valued trait (FVT) modeling approaches in Brassica rapa recombinant inbred lines raised in heterogeneous field settings. While frequentist approaches estimate parameter values by treating each experimental replicate discretely, Bayesian models can utilize information in the global dataset, potentially leading to more robust trait estimation. We illustrate this principle by estimating growth asymptotes in the face of missing data and comparing heritabilities of growth trajectory parameters estimated by Bayesian and frequentist approaches. Using pseudo-Bayes factors, we compare the performance of an initial Bayesian logistic growth model and a model that incorporates carbon assimilation (Amax) as a cofactor, thus statistically accounting for genotypic differences in carbon resources. We further evaluate two remotely sensed spectroradiometric indices, photochemical reflectance (pri2) and the MERIS Terrestrial Chlorophyll Index (mtci), as covariates in lieu of Amax, because these two indices were genetically correlated with Amax across years and treatments yet allow much higher throughput compared to direct leaf-level gas-exchange measurements. For leaf lengths in uncrowded settings, including Amax improves model fit over the initial model. The mtci and pri2 indices also outperform direct Amax measurements. Of particular …
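
    As a point of reference for the modeling above, the frequentist baseline amounts to fitting a parametric growth curve per replicate. The sketch below fits a three-parameter logistic curve to synthetic leaf-length data; the logistic form and all parameter values are illustrative, not the paper's fitted model.

```python
# Sketch: frequentist baseline for a function-valued growth trait.
import numpy as np
from scipy.optimize import curve_fit

def logistic(t, asymptote, rate, t_mid):
    return asymptote / (1.0 + np.exp(-rate * (t - t_mid)))

t = np.linspace(0, 40, 15)                    # days after sowing (hypothetical)
rng = np.random.default_rng(0)
y = logistic(t, 60.0, 0.25, 18.0) + rng.normal(0, 1.5, t.size)  # leaf length, mm

params, cov = curve_fit(logistic, t, y, p0=[50.0, 0.2, 15.0])
asym, rate, t_mid = params
print(f"asymptote={asym:.1f} mm, growth rate={rate:.3f}/day, midpoint={t_mid:.1f} d")
```

    A hierarchical Bayesian version would place priors on these per-replicate parameters and pool information across the dataset, which is what lets the asymptote be estimated even when late-season measurements are missing.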

  2. Logistic quantile regression provides improved estimates for bounded avian counts: a case study of California Spotted Owl fledgling production

    Science.gov (United States)

    Brian S. Cade; Barry R. Noon; Rick D. Scherer; John J. Keane

    2017-01-01

    Counts of avian fledglings, nestlings, or clutch size that are bounded below by zero and above by some small integer form a discrete random variable distribution that is not approximated well by conventional parametric count distributions such as the Poisson or negative binomial. We developed a logistic quantile regression model to provide estimates of the empirical...
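
    The core idea of logistic quantile regression for a doubly bounded count can be sketched as a logit transform onto the real line, ordinary linear quantile regression, and a back-transform. The example below uses simulated data and a hypothetical covariate, not the Spotted Owl dataset; the small boundary offset is one common convention for handling counts that sit exactly on the bounds.

```python
# Hedged sketch of logistic quantile regression for a count bounded on [0, 3].
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(1)
n, y_min, y_max, eps = 200, 0.0, 3.0, 0.5
habitat = rng.uniform(0, 1, n)                          # hypothetical covariate
fledglings = np.clip(np.round(habitat * 3 + rng.normal(0, 0.8, n)), 0, 3)

# Logit transform of the bounded response onto the real line
z = np.log((fledglings - y_min + eps) / (y_max + eps - fledglings))

X = sm.add_constant(habitat)
fit = sm.QuantReg(z, X).fit(q=0.9)                      # 90th percentile model

z_hat = fit.predict(X)
y_hat = ((y_min - eps) + (y_max + eps) * np.exp(z_hat)) / (1 + np.exp(z_hat))
print(fit.params, y_hat[:5])
```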

  3. Bone Structure and Estimated Bone Strength in Obese Patients Evaluated by High-Resolution Peripheral Quantitative Computed Tomography

    DEFF Research Database (Denmark)

    Andersen, Stine; Frederiksen, Katrine Diemer; Hansen, Stinus

    2014-01-01

    females, age 25-56 years and BMI 33.2-57.6 kg/m(2)) matched with healthy controls (age 25-54 years and BMI 19.5-24.8 kg/m(2)) in regard to gender, menopausal status, age (±6 years) and height (±6 cm) using high resolution peripheral quantitative computed tomography and dual energy X-ray absorptiometry. … In radius, total bone area and trabecular area were significantly higher in obese patients (both p < …) … radius. Trabecular integrity was strengthened in obese patients compared with controls in radius and tibia, with higher trabecular number (p = 0.002 and p < …) … radius in obese patients. FL was significantly …

  4. Quantitative estimation of granitoid composition from thermal infrared multispectral scanner (TIMS) data, Desolation Wilderness, northern Sierra Nevada, California

    Science.gov (United States)

    Sabine, Charles; Realmuto, Vincent J.; Taranik, James V.

    1994-01-01

    We have produced images that quantitatively depict modal and chemical parameters of granitoids using an image processing algorithm called MINMAP that fits Gaussian curves to normalized emittance spectra recovered from thermal infrared multispectral scanner (TIMS) radiance data. We applied the algorithm to TIMS data from the Desolation Wilderness, an extensively glaciated area near the northern end of the Sierra Nevada batholith that is underlain by Jurassic and Cretaceous plutons that range from diorite and anorthosite to leucogranite. The wavelength corresponding to the calculated emittance minimum, λmin, varies linearly with quartz content, SiO2, and other modal and chemical parameters. Thematic maps of quartz and silica content derived from λmin values distinguish bodies of diorite from surrounding granite, identify outcrops of anorthosite, and separate felsic, intermediate, and mafic rocks.

  5. Quantitative estimation of the spin-wave features supported by a spin-torque-driven magnetic waveguide

    Energy Technology Data Exchange (ETDEWEB)

    Consolo, Giancarlo, E-mail: gconsolo@unime.it; Currò, Carmela; Valenti, Giovanna [Department of Mathematics and Computer Science, University of Messina, V.le F. Stagno D'Alcontres 31, I-98166 Messina (Italy)

    2014-12-07

    The main features of the spin-waves excited at threshold via spin-polarized currents in a one-dimensional normally-to-plane magnetized waveguide are quantitatively determined both analytically and numerically. In particular, the dependence of the threshold current, frequency, wavenumber, and decay length is investigated as a function of the size of the nanocontact area through which the electric current is injected. From the analytical viewpoint, such a goal required solving the linearized Landau-Lifshitz-Gilbert-Slonczewski equation together with boundary and matching conditions associated with the waveguide geometry. Owing to the complexity of the resulting transcendental system, particular solutions have been obtained in the cases of elongated and contracted nanocontacts. These results have been successfully compared with those arising from numerical integration of the abovementioned transcendental system and with micromagnetic simulations. This quantitative agreement has been achieved thanks to the model considered here, which explicitly takes into account the diagonal demagnetizing factors of a rectangular prism as well as the dependence of the relaxation rate on the wavenumber. Our analysis confirmed that the spin-wave features supported by such a waveguide geometry are significantly different from the ones observed in classical two-dimensional nanocontact devices. Moreover, it has been proved that the characteristic parameters depend strongly on the material properties and on the modulus of the external field, but they could be independent of the nanocontact length. Finally, it is shown that spin-transfer oscillators based on contracted nanocontacts have a better capability to transmit spin-waves over large distances.

  6. Methodology for estimating dietary data from the semi-quantitative food frequency questionnaire of the Mexican National Health and Nutrition Survey 2012

    Directory of Open Access Journals (Sweden)

    Ivonne Ramírez-Silva

    2016-12-01

    Full Text Available Objective. To describe the methodology used to clean up and estimate dietary intake (DI) data from the Semi-Quantitative Food Frequency Questionnaire (SFFQ) of the Mexican National Health and Nutrition Survey 2012. Materials and methods. DI was collected through a short-term SFFQ regarding 140 foods (from October 2011 to May 2012). Energy and nutrient intake was calculated according to a nutrient database constructed specifically for the SFFQ. Results. A total of 133 nutrients, including energy and fiber, were generated from SFFQ data. Between 4.8 and 9.6% of the survey sample was excluded as a result of the cleaning process. Valid DI data were obtained regarding energy and nutrients consumed by 1 212 pre-school children, 1 323 school children, 1 961 adolescents, 2 027 adults and 526 older adults. Conclusions. We documented the methodology used to clean up and estimate DI from the SFFQ used in national dietary assessments in Mexico.

  7. Skinfold Prediction Equations Fail to Provide an Accurate Estimate of Body Composition in Elite Rugby Union Athletes of Caucasian and Polynesian Ethnicity.

    Science.gov (United States)

    Zemski, Adam J; Broad, Elizabeth M; Slater, Gary J

    2018-01-01

    Body composition in elite rugby union athletes is routinely assessed using surface anthropometry, which can be utilized to provide estimates of absolute body composition using regression equations. This study aims to assess the ability of available skinfold equations to estimate body composition in elite rugby union athletes who have unique physique traits and divergent ethnicity. The development of sport-specific and ethnicity-sensitive equations was also pursued. Forty-three male international Australian rugby union athletes of Caucasian and Polynesian descent underwent surface anthropometry and dual-energy X-ray absorptiometry (DXA) assessment. Body fat percent (BF%) was estimated using five previously developed equations and compared to DXA measures. Novel sport and ethnicity-sensitive prediction equations were developed using forward selection multiple regression analysis. Existing skinfold equations provided unsatisfactory estimates of BF% in elite rugby union athletes, with all equations demonstrating a 95% prediction interval in excess of 5%. The equations tended to underestimate BF% at low levels of adiposity, whilst overestimating BF% at higher levels of adiposity, regardless of ethnicity. The novel equations created explained a similar amount of variance to those previously developed (Caucasians 75%, Polynesians 90%). The use of skinfold equations, including the created equations, cannot be supported to estimate absolute body composition. Until a population-specific equation is established that can be validated to precisely estimate body composition, it is advocated to use a proven method, such as DXA, when absolute measures of lean and fat mass are desired, and raw anthropometry data routinely to derive an estimate of body composition change.

  8. Probabilistic quantitative microbial risk assessment model of norovirus from wastewater irrigated vegetables in Ghana using genome copies and fecal indicator ratio conversion for estimating exposure dose.

    Science.gov (United States)

    Owusu-Ansah, Emmanuel de-Graft Johnson; Sampson, Angelina; Amponsah, Samuel K; Abaidoo, Robert C; Dalsgaard, Anders; Hald, Tine

    2017-12-01

    The need to replace the commonly applied fecal indicator conversion ratio (an assumption of 1:10−5 virus to fecal indicator organism) in Quantitative Microbial Risk Assessment (QMRA) with models based on quantitative data on the virus of interest has gained prominence due to the different physical and environmental factors that might influence the reliability of using indicator organisms in microbial risk assessment. The challenges facing analytical studies on virus enumeration (genome copies or particles) have contributed to the already existing lack of data in QMRA modelling. This study attempts to fit a QMRA model to genome copies of norovirus data. The model estimates the risk of norovirus infection from the intake of vegetables irrigated with wastewater from different sources. The results were compared to the results of a corresponding model using the fecal indicator conversion ratio to estimate the norovirus count. In all scenarios of using different water sources, the application of the fecal indicator conversion ratio underestimated the norovirus disease burden, measured by Disability Adjusted Life Years (DALYs), when compared to results using the genome copies norovirus data. In some cases the difference was >2 orders of magnitude. All scenarios using genome copies met the 10−4 DALY per person per year threshold for consumption of vegetables irrigated with wastewater, although these results are considered to be highly conservative risk estimates. The fecal indicator conversion ratio model of stream-water and drain-water sources of wastewater achieved the 10−6 DALY per person per year threshold, which tends to indicate an underestimation of health risk when compared to using genome copies for estimating the dose. Copyright © 2017 Elsevier B.V. All rights reserved.
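
    A stylized view of the QMRA chain described above, exposure dose from genome copies, a dose-response model, and an annualized DALY burden, can be sketched with Monte Carlo sampling. Every distribution and parameter below is an invented placeholder, not the study's fitted inputs; the beta-Poisson form is one commonly used dose-response approximation.

```python
# Illustrative Monte Carlo QMRA sketch for norovirus on irrigated lettuce.
import numpy as np

rng = np.random.default_rng(42)
n_sim = 100_000

gc_per_ml = rng.lognormal(mean=np.log(10.0), sigma=1.0, size=n_sim)  # genome copies/mL
ml_water_per_g = rng.uniform(0.01, 0.11, n_sim)   # wastewater retained on lettuce
grams_per_serving = 30.0
dose = gc_per_ml * ml_water_per_g * grams_per_serving

alpha, beta = 0.04, 0.055                          # hypothetical beta-Poisson params
p_inf = 1.0 - (1.0 + dose / beta) ** (-alpha)      # per-exposure infection risk

servings_per_year = 100
annual_risk = 1.0 - (1.0 - p_inf) ** servings_per_year
daly_per_case, p_ill_given_inf = 9e-4, 0.7         # placeholder health metrics
daly = annual_risk * p_ill_given_inf * daly_per_case

print(f"median annual infection risk: {np.median(annual_risk):.3g}")
print(f"median DALY per person per year: {np.median(daly):.3g}")
```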

  9. Quantitative estimation of renal function with dynamic contrast-enhanced MRI using a modified two-compartment model.

    Directory of Open Access Journals (Sweden)

    Bin Chen

    Full Text Available To establish a simple two-compartment model for glomerular filtration rate (GFR) and renal plasma flow (RPF) estimation by dynamic contrast-enhanced magnetic resonance imaging (DCE-MRI). A total of eight New Zealand white rabbits were included in DCE-MRI. The two-compartment model was modified with the impulse residue function in this study. First, the reliability of GFR measurement with the proposed model was compared with other published models in Monte Carlo simulation at different noise levels. Then, functional parameters were estimated in six healthy rabbits to test the feasibility of the new model. Moreover, in order to investigate the validity of GFR estimation, two rabbits underwent an acute ischemia surgical procedure in a unilateral kidney before DCE-MRI, and pixel-wise measurements were implemented to detect the cortical GFR alterations between normal and abnormal kidneys. The lowest variability of GFR and RPF measurements was found with the proposed model in the comparison. Mean GFR was 3.03±1.1 ml/min and mean RPF was 2.64±0.5 ml/g/min in normal animals, which were in good agreement with published values. Moreover, a large GFR decline was found in dysfunctional kidneys compared to the contralateral control group. Results in our study demonstrate that measurement of renal kinetic parameters based on the proposed model is feasible and that it has the ability to discriminate GFR changes in healthy and diseased kidneys.
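
    Conceptually, tracer-kinetic models of this kind predict the tissue enhancement curve as plasma flow times the convolution of an arterial input function (AIF) with an impulse residue function; fitting that forward model per voxel yields GFR- and RPF-related parameters. The sketch below uses toy functions and invented parameter values, not the paper's modified model.

```python
# Conceptual sketch of a compartmental tracer-kinetic forward model.
import numpy as np

dt = 1.0                                   # seconds per sample
t = np.arange(0, 300, dt)

def aif(t):
    """Toy gamma-variate arterial input function."""
    return (t / 30.0) ** 2 * np.exp(-t / 30.0)

def residue(t, tc=8.0, k=0.02):
    """Impulse residue: full retention during vascular transit, then washout
    at a rate loosely tied to filtration (GFR-like)."""
    return np.where(t < tc, 1.0, np.exp(-k * (t - tc)))

flow = 2.5 / 60.0                          # plasma flow, mL/g/s (hypothetical)
c_tissue = flow * np.convolve(aif(t), residue(t))[: t.size] * dt

# In a real fit, (flow, tc, k) would be estimated per voxel by least squares
# against the measured enhancement curve, yielding RPF- and GFR-related maps.
print(c_tissue.max())
```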

  10. Deterministic quantitative risk assessment development

    Energy Technology Data Exchange (ETDEWEB)

    Dawson, Jane; Colquhoun, Iain [PII Pipeline Solutions Business of GE Oil and Gas, Cramlington Northumberland (United Kingdom)

    2009-07-01

    Current risk assessment practice in pipeline integrity management is to use a semi-quantitative index-based or model-based methodology. This approach has been found to be very flexible and to provide useful results for identifying high risk areas and for prioritizing physical integrity assessments. However, as pipeline operators progressively adopt an operating strategy of continual risk reduction with a view to minimizing total expenditures within safety, environmental, and reliability constraints, the need for quantitative assessments of risk levels is becoming evident. Whereas reliability-based quantitative risk assessments can be and are routinely carried out on a site-specific basis, they require significant amounts of quantitative data for the results to be meaningful. This need for detailed and reliable data tends to make these methods unwieldy for system-wide risk assessment applications. This paper describes methods for estimating risk quantitatively through the calibration of semi-quantitative estimates to failure rates for peer pipeline systems. The methods involve the analysis of the failure rate distribution, and techniques for mapping the rate to the distribution of likelihoods available from currently available semi-quantitative programs. By applying point value probabilities to the failure rates, deterministic quantitative risk assessment (QRA) provides greater rigor and objectivity than can usually be achieved through the implementation of semi-quantitative risk assessment results. The method permits a fully quantitative approach or a mixture of QRA and semi-QRA to suit the operator's data availability and quality, and analysis needs. For example, consequence analysis can be quantitative or can address qualitative ranges for consequence categories. Likewise, failure likelihoods can be output as classical probabilities or as expected failure frequencies as required. (author)
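
    The calibration step described, mapping semi-quantitative index scores onto failure-rate distributions from peer systems, might look like the following sketch, where the anchor percentile rates and the log-linear mapping are illustrative assumptions rather than the paper's method.

```python
# Sketch: calibrating semi-quantitative likelihood scores to failure frequencies.
import numpy as np

# Peer pipeline incident data (hypothetical): failures per km-year percentiles
peer_rates = {10: 1e-5, 50: 1e-4, 90: 1e-3}

# Semi-quantitative index runs 0 (best) to 100 (worst); map it log-linearly
# between the 10th and 90th percentile anchor rates.
def index_to_rate(score: float) -> float:
    lo, hi = np.log10(peer_rates[10]), np.log10(peer_rates[90])
    return 10 ** (lo + (hi - lo) * score / 100.0)

for score in (10, 50, 90):
    print(f"index {score:3d} -> {index_to_rate(score):.2e} failures/km-yr")
# Sanity check: the midpoint of the scale should land near the peer median.
print(f"index 50 maps to {index_to_rate(50):.1e}; peer median is {peer_rates[50]:.1e}")
```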

  11. Development of response surface methodology for optimization of extraction parameters and quantitative estimation of embelin from Embelia ribes Burm by high performance liquid chromatography

    Science.gov (United States)

    Alam, Md. Shamsir; Damanhouri, Zoheir A.; Ahmad, Aftab; Abidin, Lubna; Amir, Mohd; Aqil, Mohd; Khan, Shah Alam; Mujeeb, Mohd

    2015-01-01

    Background: Embelia ribes Burm is a widely used medicinal plant for the treatment of different types of disorders in the Indian traditional systems of medicine. Objective: The present work aimed to optimize the extraction parameters of embelin from E. ribes fruits and also to quantify the embelin content in different extracts of the plant. Materials and Methods: Optimization of extraction parameters such as solvent:drug ratio, temperature and time was carried out by response surface methodology (RSM). Quantitative estimation of embelin in different extracts of E. ribes fruits was done through high performance liquid chromatography. Results: The optimal conditions determined for extraction of embelin through RSM were: extraction time (27.50 min), extraction temperature (45°C) and solvent:drug ratio (8:1). Under the optimized conditions, the embelin yield (32.71%) was comparable to the expected yield (31.07%, P > 0.05). These results showed that the developed model is satisfactory and suitable for the extraction process of embelin. The analysis of variance showed a high goodness of model fit and the success of the RSM method for improving embelin extraction from the fruits of E. ribes. Conclusion: It is concluded that this may be a useful method for the extraction and quantitative estimation of embelin from the fruits of E. ribes. PMID:26109763

  12. Surface scattering dominated magnetotransport for improved quantitative estimation of particle size in Ag{sub 100−x}Co{sub x} nanogranular films

    Energy Technology Data Exchange (ETDEWEB)

    Kumar, Dinesh; Chaudhary, Sujeet, E-mail: sujeetc@physics.iitd.ac.in; Pandya, Dinesh K.

    2014-12-15

    The size and distribution of cobalt particles in 100 nm thin films of Ag{sub 100−x}Co{sub x} (x=11.8–21.1 at%) co-sputtered at room temperature are determined from the fitting of their room temperature magnetoresistance data by a Langevin function using a log-normal particle moment distribution. The systematically combined magnetoresistance and magnetization data indicate the narrow distribution and the progressively interacting nature of the magnetic particles with increase in ‘x’. Instead of the conventional magnetization data, the magnetotransport data are proposed for improved quantitative estimation of the particle size owing to their capability to track the surface effects associated with the ultrafine nanoparticles. The particle sizes obtained in the range of 3.9–6.4 nm from this alternate approach are in excellent agreement with those determined from transmission electron microscopy (TEM). On the other hand, the particle sizes determined from the magnetization measurements are systematically larger than those determined from TEM. The particle size probing ability of magnetotransport is interpreted by explicitly taking account of the spin-dependent electron scattering within the magnetic particles as well as scattering from the surface of the magnetic particle. - Highlights: • MR is proposed for improved quantitative estimation of the particle size. • Particle size obtained from MR is in excellent agreement with TEM. • Particle size determined from the magnetization data is overestimated relative to TEM. • Role of surface scattering in obtaining the particle size is explored.
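
    The fitting machinery referenced above rests on the Langevin function of classical superparamagnetism. The sketch below fits a single-moment Langevin curve to synthetic field-sweep data; the paper's analysis additionally weights this term by a log-normal moment distribution and works with magnetoresistance (MR ∝ L²) rather than magnetization, so this is only a simplified illustration.

```python
# Sketch: extracting an average particle moment from a Langevin fit.
import numpy as np
from scipy.optimize import curve_fit

MU_B = 9.274e-24        # Bohr magneton (J/T)
KB_T = 1.381e-23 * 300  # thermal energy at 300 K (J)

def langevin(x):
    # coth(x) - 1/x, kept numerically stable near x = 0
    x = np.where(np.abs(x) < 1e-6, 1e-6, x)
    return 1.0 / np.tanh(x) - 1.0 / x

def model(H, m_sat, mu_in_bohr):
    """Normalized magnetization for particles of moment mu_in_bohr (in µB)."""
    return m_sat * langevin(mu_in_bohr * MU_B * H / KB_T)

H = np.linspace(-5, 5, 81)  # applied field (T)
data = model(H, 1.0, 8000) + np.random.default_rng(3).normal(0, 0.01, H.size)

(m_sat, mu_fit), _ = curve_fit(model, H, data, p0=[1.0, 5000])
# A moment of several thousand µB corresponds to a few-nm Co particle,
# the size scale reported above.
print(f"fitted moment: {mu_fit:.0f} µB")
```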

  13. Development of response surface methodology for optimization of extraction parameters and quantitative estimation of embelin from Embelia ribes Burm by high performance liquid chromatography.

    Science.gov (United States)

    Alam, Md Shamsir; Damanhouri, Zoheir A; Ahmad, Aftab; Abidin, Lubna; Amir, Mohd; Aqil, Mohd; Khan, Shah Alam; Mujeeb, Mohd

    2015-05-01

    Embelia ribes Burm is a widely used medicinal plant for the treatment of different types of disorders in the Indian traditional systems of medicine. The present work aimed to optimize the extraction parameters of embelin from E. ribes fruits and also to quantify the embelin content in different extracts of the plant. Optimization of extraction parameters such as solvent:drug ratio, temperature and time was carried out by response surface methodology (RSM). Quantitative estimation of embelin in different extracts of E. ribes fruits was done through high performance liquid chromatography. The optimal conditions determined for extraction of embelin through RSM were: extraction time (27.50 min), extraction temperature (45°C) and solvent:drug ratio (8:1). Under the optimized conditions, the embelin yield (32.71%) was comparable to the expected yield (31.07%, P > 0.05). These results showed that the developed model is satisfactory and suitable for the extraction process of embelin. The analysis of variance showed a high goodness of model fit and the success of the RSM method for improving embelin extraction from the fruits of E. ribes. It is concluded that this may be a useful method for the extraction and quantitative estimation of embelin from the fruits of E. ribes.

  14. Estimates of the genetic parameters, optimum sample size and conversion of quantitative data in multiple categories for soybean genotypes/Estimativas de parametros geneticos, do tamanho otimo da amostra e conversao de dados quantitativos em multicategoricos para genotipos de soja

    National Research Council Canada - National Science Library

    Matsuo, Eder; Sediyama, Tuneo; Cruz, Cosme Damiao; Oliveira, Rita de Cassia Teixeira; Cadore, Luiz Renato

    2012-01-01

    The objective of this study was to estimate the genetic parameters and optimal sample size for the lengths of the hypocotyl and epicotyls and to analyze the conversion of quantitative data in multiple...

  15. Quantitative estimates of coral reef substrate and species type derived objectively from photographic images taken at twenty-eight sites in the Hawaiian islands, 2002-2004 (NODC Accession 0002313)

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — This dataset consists of CRAMP surveys taken in 2002-2004 and includes quantitative estimates of substrate and species type. From the data percent coverage of a...

  16. Super-resolution T1 estimation: Quantitative high resolution T1 mapping from a set of low resolution T1-weighted images with different slice orientations.

    Science.gov (United States)

    Van Steenkiste, Gwendolyn; Poot, Dirk H J; Jeurissen, Ben; den Dekker, Arnold J; Vanhevel, Floris; Parizel, Paul M; Sijbers, Jan

    2017-05-01

    Quantitative T1 mapping is a magnetic resonance imaging technique that estimates the spin-lattice relaxation time of tissues. Even though T1 mapping has a broad range of potential applications, it is not routinely used in clinical practice as accurate and precise high resolution T1 mapping requires infeasibly long acquisition times. To improve the trade-off between the acquisition time, signal-to-noise ratio and spatial resolution, we acquire a set of low resolution T1-weighted images and directly estimate a high resolution T1 map by means of super-resolution reconstruction. Simulation and in vivo experiments show an increased spatial resolution of the T1 map, while preserving a high signal-to-noise ratio and short scan time. Moreover, the proposed method outperforms conventional estimation in terms of root-mean-square error. Super-resolution T1 estimation enables resolution enhancement in T1 mapping with the use of standard (inversion recovery) T1 acquisition sequences. Magn Reson Med 77:1818-1830, 2017. © 2016 International Society for Magnetic Resonance in Medicine.

  17. SU-G-IeP1-06: Estimating Relative Tissue Density From Quantitative MR Images: A Novel Perspective for MRI-Only Heterogeneity Corrected Dose Calculation

    Energy Technology Data Exchange (ETDEWEB)

    Soliman, A; Hashemi, M; Safigholi, H [Sunnybrook Research Institute, Toronto, ON (Canada); Sunnybrook Health Sciences Centre, Toronto, ON (Canada); Tchistiakova, E [Sunnybrook Health Sciences Centre, Toronto, ON (Canada); University of Toronto, Toronto, ON (Canada); Song, W [Sunnybrook Research Institute, Toronto, ON (Canada); Sunnybrook Health Sciences Centre, Toronto, ON (Canada); University of Toronto, Toronto, ON (Canada)

    2016-06-15

    Purpose: To explore the feasibility of extracting relative density from quantitative MRI measurements and to estimate a correlation between the extracted measures and CT Hounsfield units. Methods: MRI has the ability to separate water and fat signals, producing two separate images for each component. By performing appropriate corrections on the separated images, quantitative measurements of water and fat mass density can be estimated. This work aims to test this hypothesis at 1.5T. Peanut oil was used as the fat representative, and agar as the water representative. Gadolinium chloride III and sodium chloride were added to the agar solution to adjust the relaxation times and the medium conductivity, respectively. Peanut oil was added to the agar solution at different percentages: 0%, 3%, 5%, 10%, 20%, 30%, 40%, 50%, 60%, 70%, 80%, 90% and 100%. The phantom was scanned on a 1.5T GE Optima 450W with the body coil using a multi-gradient echo sequence. Water/fat separation was performed while correcting for main field (B0) inhomogeneity and T{sub 2}* relaxation time. B1+ inhomogeneities were ignored. The phantom was subsequently scanned on a Philips Brilliance CT Big Bore. MR-corrected fat signals from all vials were normalized to the 100% fat signal. CT Hounsfield values were then compared to those obtained from the normalized MR-corrected fat values as well as to the phantom preparation for validation. Results: Good agreement was found between CT HU and the MR-extracted fat values (R{sup 2} = 0.98). CT HU also showed excellent agreement with the prepared fat fractions (R{sup 2}=0.99). Vials with 70%, 80%, and 90% fat percentages showed inhomogeneous distributions; however, their results were included for completeness. Conclusion: Quantitative MRI water/fat imaging can potentially be used to extract relative tissue density. Further in-vivo validation is required.
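
    Downstream of the water/fat separation, the quantitative step reduces to a per-voxel fat fraction and a calibration against CT numbers. A minimal sketch follows; the calibration coefficients are hypothetical placeholders, not the abstract's fitted relationship.

```python
# Minimal sketch of the water/fat processing step.
import numpy as np

def fat_fraction(water: np.ndarray, fat: np.ndarray) -> np.ndarray:
    """Per-voxel fat signal fraction from separated water/fat images."""
    total = water + fat
    return np.divide(fat, total, out=np.zeros_like(fat), where=total > 0)

# Hypothetical linear calibration: HU ~ a * fat_fraction + b, fitted on phantom vials
a, b = -200.0, 60.0

w = np.array([[80.0, 20.0], [50.0, 5.0]])   # toy water image
f = np.array([[20.0, 80.0], [50.0, 0.0]])   # toy fat image
ff = fat_fraction(w, f)
print(ff, a * ff + b, sep="\n")
```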

  18. Integration of quantitated expression estimates from polyA-selected and rRNA-depleted RNA-seq libraries.

    Science.gov (United States)

    Bush, Stephen J; McCulloch, Mary E B; Summers, Kim M; Hume, David A; Clark, Emily L

    2017-06-13

    The availability of fast alignment-free algorithms has greatly reduced the computational burden of RNA-seq processing, especially for relatively poorly assembled genomes. Using these approaches, previous RNA-seq datasets could potentially be processed and integrated with newly sequenced libraries. Confounding factors in such integration include sequencing depth and methods of RNA extraction and selection. Different selection methods (typically, either polyA-selection or rRNA-depletion) omit different RNAs, resulting in different fractions of the transcriptome being sequenced. In particular, rRNA-depleted libraries sample a broader fraction of the transcriptome than polyA-selected libraries. This study aimed to develop a systematic means of accounting for library type that allows data from these two methods to be compared. The method was developed by comparing two RNA-seq datasets from ovine macrophages, identical except for RNA selection method. Gene-level expression estimates were obtained using a two-part process centred on the high-speed transcript quantification tool Kallisto. Firstly, a set of reference transcripts was defined that constitute a standardised RNA space, with expression from both datasets quantified against it. Secondly, a simple ratio-based correction was applied to the rRNA-depleted estimates. The outcome is an almost perfect correlation between gene expression estimates, independent of library type and across the full range of levels of expression. A combination of reference transcriptome filtering and a ratio-based correction can create equivalent expression profiles from both polyA-selected and rRNA-depleted libraries. This approach will allow meta-analysis and integration of existing RNA-seq data into transcriptional atlas projects.
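
    The two-part process lends itself to a compact illustration: quantify both library types against the same filtered reference set, then apply a single ratio-based scaling to the rRNA-depleted estimates. The data frame layout and column names below are assumptions made for illustration, not the study's pipeline code.

```python
# Hedged sketch of a ratio-based correction across RNA-seq library types.
import pandas as pd

expr = pd.DataFrame({
    "gene":        ["g1", "g2", "g3"],
    "tpm_polya":   [120.0, 30.0, 50.0],   # polyA-selected estimates
    "tpm_rrnadep": [100.0, 26.0, 40.0],   # rRNA-depleted estimates
})

# Both quantifications are assumed to be restricted to the shared reference
# transcript space already; compute one global correction factor.
ratio = expr["tpm_polya"].sum() / expr["tpm_rrnadep"].sum()
expr["tpm_rrnadep_corrected"] = expr["tpm_rrnadep"] * ratio

print(f"correction ratio: {ratio:.3f}")
print(expr)
```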

  19. Quantitative evaluation of leg lymphedema by MR imaging. Estimation of therapeutic effect by intra-arterial injection of lymphocytes

    Energy Technology Data Exchange (ETDEWEB)

    Makimoto, Yumi; Harada, Masafumi; Matsuzaki, Kenji; Hayashi, Yoshinori; Nishitani, Hiromu; Yoshizumi, Masanori; Yoshida, Osamu; Katoh, Itsuo [Tokushima Univ. (Japan). School of Medicine

    1994-12-01

    The purpose of this study was to obtain characteristic MRI findings of lymphedema and of reactions after intra-arterial lymphocyte injection therapy, and to quantitatively evaluate the effect of intra-arterial lymphocyte injection. Five patients were treated by several intra-arterial lymphocyte injections. We measured the T{sub 2} value in edematous tissue using a triple echo sequence and short TI IR (STIR) images to assess the extent of lymphedema. MRI was performed before and after each intra-arterial lymphocyte injection. Mean T{sub 2} and the standard deviation (SD) of the T{sub 2} distribution were obtained from T{sub 2}-calculated images. Characteristic findings of lymphedema were thickening of the subcutaneous tissue, a meshed pattern of fluid, thickening of the skin, and fluid on the fascia. After therapy, the thickness of subcutaneous tissue and the meshed pattern were greatly decreased, but the thickening of skin and fluid on the fascia still remained. Mean T{sub 2} and SD of T{sub 2} in edematous tissue were much higher than those of normal tissue. Both of them decreased markedly after therapy in improved cases but did not change in less effective cases. STIR could differentiate water from adipose tissue and clearly indicated the distribution of water. Mean T{sub 2} and SD of the T{sub 2} distribution were useful for the evaluation of lymphedema and assessment of therapy. (author).

  20. A Comparison of Three Quantitative Methods to Estimate G6PD Activity in the Chittagong Hill Tracts, Bangladesh

    Science.gov (United States)

    Ley, Benedikt; Alam, Mohammad Shafiul; O’Donnell, James J.; Hossain, Mohammad Sharif; Kibria, Mohammad Golam; Jahan, Nusrat; Khan, Wasif A.; Thriemer, Kamala; Chatfield, Mark D.; Price, Ric N.; Richards, Jack S.

    2017-01-01

    Background Glucose-6-phosphate-dehydrogenase deficiency (G6PDd) is a major risk factor for primaquine-induced haemolysis. There is a need for improved point-of-care and laboratory-based G6PD diagnostics to ensure safe use of primaquine. Methods G6PD activities of participants in a cross-sectional survey in Bangladesh were assessed using two novel quantitative assays, the modified WST-8 test and the CareStart™ G6PD Biosensor (Access Bio). The results were compared with a gold standard UV spectrophotometry assay (Randox). The handheld CareStart™ Hb instrument (Access Bio) is designed to be a companion instrument to the CareStart™ G6PD Biosensor, and its performance was compared to the well-validated HemoCue™ method. All quantitative G6PD results were normalized with the HemoCue™ result. Results A total of 1002 individuals were enrolled. The adjusted male median (AMM) derived by spectrophotometry was 7.03 U/g Hb (interquartile range (IQR): 5.38–8.69), by WST-8 was 7.03 U/g Hb (IQR: 5.22–8.16) and by Biosensor was 8.61 U/g Hb (IQR: 6.71–10.08). The AMM did not differ between spectrophotometry and WST-8 (p = 1.0) but differed significantly between spectrophotometry and Biosensor (p < …). Both assays correlated with spectrophotometry (rs = 0.5 and rs = 0.4, both p < …). The mean difference was … between spectrophotometry and WST-8, and -1.74 U/g Hb (95% LoA: -7.63 to 4.23) between spectrophotometry and Biosensor. The WST-8 identified 55.1% (49/89) and the Biosensor 19.1% (17/89) of individuals with G6PD activity below 30% as measured by spectrophotometry. Areas under the ROC curve did not differ significantly for the WST-8 and Biosensor irrespective of the cut-off activity applied (all p>0.05). Sensitivity and specificity for detecting G6PD activity <30% were 0.55 (95% confidence interval (95%CI): 0.44–0.66) and 0.98 (95%CI: 0.97–0.99) respectively for the WST-8, and 0.19 (95%CI: 0.12–0.29) and 0.99 (95%CI: 0.98–0.99) respectively for the Biosensor. Hb concentrations measured by HemoCue™ and CareStart™ Hb were strongly correlated (rs = 0.8, p < 0…
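
    The normalization underpinning these comparisons, activity per gram of haemoglobin with 100% defined by an adjusted male median (AMM), can be sketched as follows. The 10%-exclusion rule used here is one common convention for computing the AMM, and all values are simulated, not the survey data.

```python
# Sketch of G6PD activity normalization and adjusted male median (AMM).
import numpy as np

rng = np.random.default_rng(7)
activity_u_dl = rng.normal(100.0, 25.0, 500)   # measured G6PD activity (U/dL)
hb_g_dl = rng.normal(14.0, 1.2, 500)           # HemoCue haemoglobin (g/dL)
is_male = rng.random(500) < 0.5

u_per_g_hb = activity_u_dl / hb_g_dl           # normalize to U/g Hb

male = u_per_g_hb[is_male]
crude_median = np.median(male)
# Drop grossly deficient males (<10% of crude median), then re-take the median.
amm = np.median(male[male >= 0.1 * crude_median])

deficient_30 = u_per_g_hb < 0.3 * amm          # the <30% screening threshold
print(f"AMM = {amm:.2f} U/g Hb; {deficient_30.mean():.1%} below 30% activity")
```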

  1. Forensic comparison and matching of fingerprints: using quantitative image measures for estimating error rates through understanding and predicting difficulty.

    Science.gov (United States)

    Kellman, Philip J; Mnookin, Jennifer L; Erlikhman, Gennady; Garrigan, Patrick; Ghose, Tandra; Mettler, Everett; Charlton, David; Dror, Itiel E

    2014-01-01

    Latent fingerprint examination is a complex task that, despite advances in image processing, still fundamentally depends on the visual judgments of highly trained human examiners. Fingerprints collected from crime scenes typically contain less information than fingerprints collected under controlled conditions. Specifically, they are often noisy and distorted and may contain only a portion of the total fingerprint area. Expertise in fingerprint comparison, like other forms of perceptual expertise, such as face recognition or aircraft identification, depends on perceptual learning processes that lead to the discovery of features and relations that matter in comparing prints. Relatively little is known about the perceptual processes involved in making comparisons, and even less is known about what characteristics of fingerprint pairs make particular comparisons easy or difficult. We measured expert examiner performance and judgments of difficulty and confidence on a new fingerprint database. We developed a number of quantitative measures of image characteristics and used multiple regression techniques to discover objective predictors of error as well as perceived difficulty and confidence. A number of useful predictors emerged, and these included variables related to image quality metrics, such as intensity and contrast information, as well as measures of information quantity, such as the total fingerprint area. Also included were configural features that fingerprint experts have noted, such as the presence and clarity of global features and fingerprint ridges. Within the constraints of the overall low error rates of experts, a regression model incorporating the derived predictors demonstrated reasonable success in predicting objective difficulty for print pairs, as shown both in goodness of fit measures to the original data set and in a cross validation test. The results indicate the plausibility of using objective image metrics to predict expert performance and
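
    The statistical core of this approach is ordinary multiple regression of difficulty (or error) on objective image metrics, validated out of sample. The sketch below uses random placeholder data and stand-in feature names, not the study's metrics or fingerprint database.

```python
# Sketch: predicting comparison difficulty from objective image metrics.
import numpy as np
from sklearn.linear_model import LinearRegression
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)
n_pairs = 200
X = np.column_stack([
    rng.uniform(0, 1, n_pairs),    # contrast metric (hypothetical)
    rng.uniform(0, 1, n_pairs),    # intensity metric (hypothetical)
    rng.uniform(0, 1, n_pairs),    # visible fingerprint area (hypothetical)
    rng.uniform(0, 1, n_pairs),    # ridge clarity score (hypothetical)
])
difficulty = X @ np.array([0.5, 0.2, -0.6, -0.4]) + rng.normal(0, 0.1, n_pairs)

model = LinearRegression().fit(X, difficulty)
r2_cv = cross_val_score(model, X, difficulty, cv=5, scoring="r2")
print(f"coefficients: {model.coef_}, cross-validated R^2: {r2_cv.mean():.2f}")
```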

  2. Forensic comparison and matching of fingerprints: using quantitative image measures for estimating error rates through understanding and predicting difficulty.

    Directory of Open Access Journals (Sweden)

    Philip J Kellman

    Full Text Available Latent fingerprint examination is a complex task that, despite advances in image processing, still fundamentally depends on the visual judgments of highly trained human examiners. Fingerprints collected from crime scenes typically contain less information than fingerprints collected under controlled conditions. Specifically, they are often noisy and distorted and may contain only a portion of the total fingerprint area. Expertise in fingerprint comparison, like other forms of perceptual expertise, such as face recognition or aircraft identification, depends on perceptual learning processes that lead to the discovery of features and relations that matter in comparing prints. Relatively little is known about the perceptual processes involved in making comparisons, and even less is known about what characteristics of fingerprint pairs make particular comparisons easy or difficult. We measured expert examiner performance and judgments of difficulty and confidence on a new fingerprint database. We developed a number of quantitative measures of image characteristics and used multiple regression techniques to discover objective predictors of error as well as perceived difficulty and confidence. A number of useful predictors emerged, and these included variables related to image quality metrics, such as intensity and contrast information, as well as measures of information quantity, such as the total fingerprint area. Also included were configural features that fingerprint experts have noted, such as the presence and clarity of global features and fingerprint ridges. Within the constraints of the overall low error rates of experts, a regression model incorporating the derived predictors demonstrated reasonable success in predicting objective difficulty for print pairs, as shown both in goodness of fit measures to the original data set and in a cross validation test. The results indicate the plausibility of using objective image metrics to predict expert

  3. Secondary dentine as a sole parameter for age estimation: Comparison and reliability of qualitative and quantitative methods among North Western adult Indians

    Directory of Open Access Journals (Sweden)

    Jasbir Arora

    2016-06-01

    Full Text Available The indestructible nature of teeth against most environmental abuses makes them useful in disaster victim identification (DVI). The present study was undertaken to examine the reliability of Gustafson's qualitative method and Kedici's quantitative method of measuring secondary dentine for age estimation among North Western adult Indians. 196 (M = 85; F = 111) single-rooted teeth were collected from the Department of Oral Health Sciences, PGIMER, Chandigarh. Ground sections were prepared and the amount of secondary dentine formed was scored qualitatively according to Gustafson's (0–3) scoring system (method 1) and quantitatively following Kedici's micrometric measurement method (method 2). Out of 196 teeth, 180 samples (M = 80; F = 100) were found to be suitable for measuring secondary dentine following Kedici's method. The absolute mean error of age was calculated for both methodologies. Results clearly showed that in the pooled data, method 1 gave an error of ±10.4 years whereas method 2 exhibited an error of approximately ±13 years. A statistically significant difference was noted in the absolute mean error of age between the two methods of measuring secondary dentine for age estimation. Further, it was revealed that teeth extracted for periodontal reasons severely decreased the accuracy of Kedici's method; however, the disease had no effect when estimating age by Gustafson's method. No significant gender differences were noted in the absolute mean error of age by either method, which suggests that there is no need to separate data on the basis of gender.

  4. Definition of polytrauma: Discussion on the objective definition based on quantitative estimation of multiply injured patients during wartime.

    Science.gov (United States)

    Lovrić, Zvonimir

    2015-11-01

    There is a clear lack of consensus on a validated definition of the term "polytrauma". This study presents and classifies the extent of injuries during wartime in Croatia using the Revised Trauma Score and Injury Severity Score (TRISS) and compares the scores with a clinical estimation based on subjective assessments of polytraumatised and non-polytraumatised patients. We analysed the data from 426 war victims who sustained multiple injuries and were managed at Osijek University Hospital from September 1st 1991 to December 31st 1991. The victims were divided into polytraumatised (n=149) and multitraumatised (n=277) patients according to the initial clinical estimation of the extent of injury. Patients classified as monotraumatised were excluded from this study. The assessment was based on the following definition of polytrauma: simultaneous injury of two or more body regions or anatomical systems with at least one injury being life-threatening. All data were scored retrospectively using TRISS methodology. Two patients classified as polytraumatised had an ISS of less than 16, and one patient classified as multitraumatised had an ISS of more than 16. The difference between the actual (29.5%) and expected (40.44%) postoperative mortality in the polytraumatised group was statistically significant (p=0.0016), whereas in the multitraumatised group, the difference between the actual (3.2%) and expected (3.04%) postoperative mortality was not significant (p=0.6103). The data show that clinical and subjective assessment of polytraumatised patients can be useful in the management of such cases and can be tested retrospectively using TRISS methodology. Copyright © 2015 Elsevier Ltd. All rights reserved.

  5. Health care providers' perceptions of and attitudes towards induced abortions in sub-Saharan Africa and Southeast Asia : a systematic literature review of qualitative and quantitative data.

    OpenAIRE

    Rehnström Loi, Ulrika; Gemzell-Danielsson, Kristina; Faxelid, Elisabeth; Klingberg-Allvin, Marie

    2015-01-01

    Background Unsafe abortions are a serious public health problem and a major human rights issue. In low-income countries, where restrictive abortion laws are common, safe abortion care is not always available to women in need. Health care providers have an important role in the provision of abortion services. However, the shortage of health care providers in low-income countries is critical and exacerbated by the unwillingness of some health care providers to provide abortion services. The aim...

  6. Quantitative testing of the methodology for genome size estimation in plants using flow cytometry: a case study of the Primulina genus

    Directory of Open Access Journals (Sweden)

    Jing Wang

    2015-05-01

    Full Text Available Flow cytometry (FCM) is a commonly used method for estimating genome size in many organisms. The use of flow cytometry in plants is influenced by endogenous fluorescence inhibitors and may cause an inaccurate estimation of genome size, thus falsifying the relationship between genome size and phenotypic traits/ecological performance. Quantitative optimization of FCM methodology minimizes such errors, yet there are few studies detailing this methodology. We selected the genus Primulina, one of the most representative and diverse genera of the Old World Gesneriaceae, to evaluate the effect of methodology on determining genome size. Our results showed that buffer choice significantly affected genome size estimation in six out of the eight species examined and altered the 2C-value (DNA content) by as much as 21.4%. The staining duration and propidium iodide (PI) concentration slightly affected the 2C-value. Our experiments showed better histogram quality when the samples were stained for 40 minutes at a PI concentration of 100 µg ml-1. The quality of the estimates was not improved by one-day incubation in the dark at 4 °C or by centrifugation. Thus, our study determined an optimum protocol for genome size measurement in Primulina: LB01 buffer supplemented with 100 µg ml-1 PI and staining for 40 minutes. This protocol also demonstrated high universality in other Gesneriaceae genera. We report the genome size of nine Gesneriaceae species for the first time. The results showed substantial genome size variation both within and among the species, with the 2C-value ranging between 1.62 and 2.71 pg. Our study highlights the necessity of optimizing the FCM methodology prior to obtaining reliable genome size estimates in a given taxon.
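
    Once a protocol is fixed, the genome-size arithmetic itself is a simple ratio of G1 peak positions between the sample and a co-processed internal reference standard. A minimal sketch, with hypothetical peak values and a hypothetical standard 2C-value:

```python
# Sketch: 2C DNA content from flow cytometry peak means.
def genome_size_2c(sample_peak: float, standard_peak: float, standard_2c_pg: float) -> float:
    """2C DNA content (pg) from G1 peak means of sample and internal standard."""
    return standard_2c_pg * sample_peak / standard_peak

# e.g., peak means from a Primulina sample co-stained with a known standard
value = genome_size_2c(sample_peak=210.0, standard_peak=420.0, standard_2c_pg=4.38)
print(f"2C = {value:.2f} pg")
```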

  7. Development and Validation of an RP-HPLC Method for Quantitative Estimation of Eslicarbazepine Acetate in Bulk Drug and Tablets.

    Science.gov (United States)

    Singh, M; Kumar, L; Arora, P; Mathur, S C; Saini, P K; Singh, R M; Singh, G N

    2013-11-01

    A convenient, simple, accurate, precise and reproducible RP-HPLC method was developed and validated for the estimation of eslicarbazepine acetate in bulk drug and tablet dosage form. The objective was achieved under optimised chromatographic conditions on a Dionex RP-HPLC system with a Dionex C18 column (250×4.6 mm, 5 μm particle size) using a mobile phase composed of methanol and ammonium acetate (0.005 M) in the ratio of 70:30 v/v. The separation was achieved using an isocratic elution method with a flow rate of 1.0 ml/min at room temperature. The effluent was monitored at 230 nm using a diode array detector. The retention time of eslicarbazepine acetate was found to be 4.9 min and the standard calibration plot was linear over a concentration range of 10-90 μg/ml with r(2)=0.9995. The limits of detection and quantification were found to be 3.144 and 9.52 μg/ml, respectively. The amount of eslicarbazepine acetate in bulk and tablet dosage form was found to be 99.19 and 97.88%, respectively. The method was validated statistically using the percent relative standard deviation and the values were found to be within limits. Recovery studies were performed and the percentage recoveries were found to be 98.33±0.5%.

  8. Isolation, Characterization of a Potential Degradation Product of Aspirin and an HPLC Method for Quantitative Estimation of Its Impurities.

    Science.gov (United States)

    Acharya, Subasranjan; Daniel, Alex; Gyadangi, Bharath; Ramsamy, Sriramulu

    2015-10-01

    In this work, a new degradation product of Aspirin was isolated, characterized and analyzed along with other impurities. A new unknown degradation product, referred to as UP, was observed exceeding the ICH Q3B identification threshold in the stability study of an Aspirin and Dipyridamole capsule. The UP isolated from the thermal degradation sample was further studied by IR, mass and (1)H NMR spectrometry, revealing structural similarities with the parent molecule. Finally, UP was identified as a new compound generated from the interaction of Aspirin and Salicylic acid to form a dehydrated product. A specific HPLC method was developed and validated for the analysis of UP and other Aspirin impurities (A, B, C, E and other unknown degradation products). The proposed method was successfully employed for the estimation of Aspirin impurities in a pharmaceutical preparation of Aspirin (Immediate Release) and Dipyridamole (Extended Release) Capsules. © The Author 2015. Published by Oxford University Press. All rights reserved. For Permissions, please email: journals.permissions@oup.com.

  9. Precipitation evidences on X-Band Synthetic Aperture Radar imagery: an approach for quantitative detection and estimation

    Science.gov (United States)

    Mori, Saverio; Marzano, Frank S.; Montopoli, Mario; Pulvirenti, Luca; Pierdicca, Nazzareno

    2017-04-01

    … al. 2014 and Mori et al. 2012); ancillary data, such as local incident angle and land cover, are used. This stage is necessary to tune the precipitation map stage and to avoid severe misinterpretations in the precipitation map routines. The second stage consists of estimating the local cloud attenuation. Finally, the precipitation map is estimated using the retrieval algorithm developed by Marzano et al. (2011), applied only to pixels where rain is known to be present. Within the FP7 project EartH2Observe we have applied this methodology to 14 study cases, acquired within the TSX and CSK missions over Italy and the United States. This choice allows analysing both hurricane-like intense events and continental mid-latitude precipitation, with the possibility to verify and validate the proposed methodology through the available weather radar networks. Moreover, it allows, to some extent, analysing the contribution of orography and the quality of ancillary data (i.e., land cover). In this work we will discuss the results obtained so far in terms of improved rain cell localization and precipitation quantification.

  10. Quantitative Analysis of the Usage of a Pedagogical Tool Combining Questions Listed as Learning Objectives and Answers Provided as Online Videos

    Directory of Open Access Journals (Sweden)

    Odette Laneuville

    2015-05-01

    Full Text Available To improve the learning of basic concepts in molecular biology in an undergraduate science class, a pedagogical tool was developed, consisting of learning objectives listed at the end of each lecture and answers to those objectives made available as videos online. The aim of this study was to determine if the pedagogical tool was used by students as instructed, and to explore students' perception of its usefulness. A combination of quantitative survey data and measures of online viewing was used to evaluate the usage of the pedagogical practice. A total of 77 short videos linked to 11 lectures were made available to 71 students, and 64 completed the survey. Using online tracking tools, a total of 7046 views were recorded. Survey data indicated that most students (73.4%) accessed all videos, and the majority (98.4%) found the videos to be useful in assisting their learning. Interestingly, approximately half of the students (53.1%) always or most of the time used the pedagogical tool as recommended, and consistently answered the learning objectives before watching the videos. While the proposed pedagogical tool was used by the majority of students outside the classroom, only half used it as recommended, limiting the impact on students' involvement in the learning of the material presented in class.

  11. Quantitative relationships between huntingtin levels, polyglutamine length, inclusion body formation, and neuronal death provide novel insight into Huntington’s disease molecular pathogenesis

    Science.gov (United States)

    Miller, Jason; Arrasate, Montserrat; Shaby, Benjamin A.; Mitra, Siddhartha; Masliah, Eliezer; Finkbeiner, Steven

    2010-01-01

    An expanded polyglutamine (polyQ) stretch in the protein huntingtin (htt) induces self-aggregation into inclusion bodies (IBs) and causes Huntington’s disease (HD). Defining precise relationships between early observable variables and neuronal death at the molecular and cellular levels should improve our understanding of HD pathogenesis. Here, we utilized an automated microscope that can track thousands of neurons individually over their entire lifetime to quantify interconnected relationships between early variables, such as htt levels, polyQ length, and IB formation, and neuronal death in a primary striatal model of HD. The resulting model revealed that: mutant htt increases the risk of death by tonically interfering with homeostatic coping mechanisms rather than producing accumulated damage to the neuron; htt toxicity is saturable; the rate limiting steps for inclusion body formation and death can be traced to different conformational changes in monomeric htt; and IB formation reduces the impact of a neuron’s starting levels of htt on its risk of death. Finally, the model that emerges from our quantitative measurements places critical limits on the potential mechanisms by which mutant htt might induce neurodegeneration, which should help direct future research. PMID:20685997

  12. QUANTITATIVE ESTIMATION OF SOIL EROSION IN THE DRĂGAN RIVER WATERSHED WITH THE U.S.L.E. TYPE ROMSEM MODEL

    Directory of Open Access Journals (Sweden)

    Csaba HORVÁTH

    2008-05-01

    Sediment delivered from water erosion causes substantial waterway damage and water quality degradation. A number of factors such as drainage area size, basin slope, climate, and land use/land cover may affect sediment delivery processes. The goal of this study is to define a computationally effective, suitable soil erosion model for the Drăgan river watershed, for future sedimentation studies. A Geographic Information System (GIS) is used to determine the Universal Soil Loss Equation (U.S.L.E.) values of the studied water basin. The methods and approaches used in this study are expected to be applicable in future research and to watersheds in other regions.
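
    The U.S.L.E. itself is a single multiplicative formula, A = R * K * LS * C * P, evaluated per raster cell in the GIS. The sketch below shows that computation on a toy 3 x 3 grid; all factor values are placeholders, not the Drăgan watershed data.

    ```python
    import numpy as np

    R = np.full((3, 3), 110.0)   # rainfall erosivity
    K = np.full((3, 3), 0.03)    # soil erodibility
    LS = np.array([[0.5, 1.2, 2.0],
                   [0.8, 1.5, 2.4],
                   [1.0, 1.8, 3.1]])  # slope length-steepness factor
    C = np.full((3, 3), 0.2)     # cover-management factor
    P = np.ones((3, 3))          # support practice factor

    A = R * K * LS * C * P       # mean annual soil loss per cell (t/ha/yr)
    print(A.round(2))
    ```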

  13. Statistical alloy design for superalloys, with emphasis on quantitative estimation of physical properties

    Energy Technology Data Exchange (ETDEWEB)

    Tsuji, I. (Mitsubishi Heavy Industries, Ltd., Tokyo (Japan))

    1991-05-27

    The statistical design method for heat-resistant superalloys was studied. Data on both the mechanical and physical properties of 50 kinds of Ni-base heat-resistant superalloys were compiled from alloy manufacturers' publications and technical handbooks. Quantitative predictive equations for superalloy properties were derived by regression analysis, with alloy compositions (such as Co and Ti contents) as independent variables and mechanical and physical properties as dependent variables. A statistical alloy design program was then built on these equations to select the most suitable alloy compositions for target heat-resistant superalloys. The composition of a new Ni-base heat-resistant superalloy for gas turbine combustor material was selected with the program, and the thermal conductivity, coefficient of thermal expansion, and dynamic modulus of elasticity were measured on a 0.5 mm plate made of this alloy. The measured values agreed well with the values estimated by the program, demonstrating the high applicability of the method. 6 refs., 4 figs., 1 tab.
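
    The regression step described above amounts to ordinary least squares with composition contents as predictors. The sketch below is an assumption-laden illustration: the 50 synthetic alloys, the two predictors (Co and Ti content), and the coefficients are invented, not the compiled handbook data.

    ```python
    import numpy as np

    rng = np.random.default_rng(1)
    X = rng.uniform(0, 20, size=(50, 2))       # wt% Co, wt% Ti for 50 alloys
    X1 = np.column_stack([np.ones(50), X])     # design matrix with intercept
    # Synthetic property, e.g. thermal conductivity (W/m/K), with noise.
    y = 11.0 - 0.15 * X[:, 0] - 0.30 * X[:, 1] + rng.normal(0, 0.2, 50)

    beta, *_ = np.linalg.lstsq(X1, y, rcond=None)  # ordinary least squares
    print("intercept, Co coeff, Ti coeff:", beta.round(3))
    ```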

  14. Processing of next generation weather radar-multisensor precipitation estimates and quantitative precipitation forecast data for the DuPage County streamflow simulation system

    Science.gov (United States)

    Bera, Maitreyee; Ortel, Terry W.

    2018-01-12

    The U.S. Geological Survey, in cooperation with DuPage County Stormwater Management Department, is testing a near real-time streamflow simulation system that assists in the management and operation of reservoirs and other flood-control structures in the Salt Creek and West Branch DuPage River drainage basins in DuPage County, Illinois. As part of this effort, the U.S. Geological Survey maintains a database of hourly meteorological and hydrologic data for use in this near real-time streamflow simulation system. Among these data are next generation weather radar-multisensor precipitation estimates and quantitative precipitation forecast data, which are retrieved from the North Central River Forecasting Center of the National Weather Service. The DuPage County streamflow simulation system uses these quantitative precipitation forecast data to create streamflow predictions for the two simulated drainage basins. This report discusses in detail how these data are processed for inclusion in the Watershed Data Management files used in the streamflow simulation system for the Salt Creek and West Branch DuPage River drainage basins.

  15. Major bioactive phenolics in Bergenia species from the Indian Himalayan region: Method development, validation and quantitative estimation using UHPLC-QqQLIT-MS/MS

    Science.gov (United States)

    Pandey, Renu; Kumar, Brijesh; Meena, Baleshwar; Srivastava, Mukesh; Mishra, Tripti; Tiwari, Vandana; Pal, Mahesh; Nair, Narayanan K.; Upreti, Dalip K.

    2017-01-01

    Bergenia species are important medicinal plants used in indigenous systems of medicine for their antilithiatic and diuretic properties. An ultra-high performance liquid chromatography coupled to hybrid linear ion trap triple quadrupole mass spectrometry (UHPLC-QqQLIT-MS/MS) method has been developed and validated for estimating the quantitative variation of eight major bioactive phenolics in the rhizomes (150 samples) of four species of this herb: B. ciliata, B. ligulata, B. purpurascens and B. stracheyi. Chromatographic separation was obtained on a Waters ACQUITY UPLC BEH (ethylene bridged hybrid) C18 column with a mobile phase consisting of 0.1% (v/v) aqueous formic acid and acetonitrile under gradient elution. The hybrid linear ion trap triple quadrupole mass spectrometer was operated in negative electrospray ionization mode with multiple reaction monitoring for detection and quantification of the eight compounds. The validated method demonstrated good linearity (r2 ≥ 0.9991), precision (RSD ≤ 1.87%) and accuracy (95.16–102.11%, RSD ≤ 1.83%) for all reference analytes. The quantitative results revealed that B. ligulata contains the highest amount of the major active marker, bergenin. The results also suggest that this sensitive, accurate and convenient UHPLC-QqQLIT-MS/MS method could be helpful in identifying potential accessions, in rapid quality control, and in establishing the authenticity of Bergenia species as raw material for pharmaceutical industries. PMID:28749965

  16. Contrast-enhanced 3T MR Perfusion of Musculoskeletal Tumours: T1 Value Heterogeneity Assessment and Evaluation of the Influence of T1 Estimation Methods on Quantitative Parameters.

    Science.gov (United States)

    Gondim Teixeira, Pedro Augusto; Leplat, Christophe; Chen, Bailiang; De Verbizier, Jacques; Beaumont, Marine; Badr, Sammy; Cotten, Anne; Blum, Alain

    2017-12-01

    To evaluate intra-tumour and striated muscle T1 value heterogeneity and the influence of different methods of T1 estimation on the variability of quantitative perfusion parameters. Eighty-two patients with a histologically confirmed musculoskeletal tumour were prospectively included in this study and, with ethics committee approval, underwent contrast-enhanced MR perfusion and T1 mapping. T1 value variations in viable tumour areas and in normal-appearing striated muscle were assessed. In 20 cases, normal muscle perfusion parameters were calculated using three different methods: signal-based, and gadolinium-concentration-based with either fixed or variable T1 values. Tumour and normal muscle T1 values were significantly different (p = 0.0008). T1 value heterogeneity was higher in tumours than in normal muscle (variation of 19.8% versus 13%). The T1 estimation method had a considerable influence on the variability of perfusion parameters. Fixed T1 values yielded higher coefficients of variation than variable T1 values (mean 109.6 ± 41.8% and 58.3 ± 14.1%, respectively). Area under the curve was the least variable parameter (36%). T1 values in musculoskeletal tumours are significantly different from, and more heterogeneous than, those of normal muscle. Patient-specific T1 estimation is needed for direct inter-patient comparison of perfusion parameters. • T1 value variation in musculoskeletal tumours is considerable. • T1 values in muscle and tumours are significantly different. • Patient-specific T1 estimation is needed for inter-patient comparison of perfusion parameters. • Technical variation is higher in permeability parameters than in semiquantitative perfusion parameters.

  17. Pain in patients with multiple sclerosis: a complex assessment including quantitative and qualitative measurements provides for a disease-related biopsychosocial pain model

    Directory of Open Access Journals (Sweden)

    Michalski D

    2011-08-01

    Dominik Michalski*, Stefanie Liebig*, Eva Thomae, Andreas Hinz, Florian Then Bergh (Department of Neurology, Translational Centre for Regenerative Medicine (TRM), and Department of Medical Psychology and Medical Sociology, University of Leipzig, Leipzig, Germany; *these authors contributed equally). Background: Pain of various causes is a common phenomenon in patients with multiple sclerosis (MS). A biopsychosocial perspective has proven a useful theoretical construct in other chronic pain conditions and has also been introduced for MS. To support such an approach, we aimed to investigate pain in MS with special emphasis on separating quantitative and qualitative aspects, and on its interrelation with behavioral and physical aspects. Materials and methods: Pain intensity (NRS) and quality (SES) were measured in 38 consecutive outpatients with MS (mean age, 42.0 ± 11.5 years; 82% women). Pain-related behavior (FSR), health care utilization, bodily complaints (GBB-24) and fatigue (WEIMuS) were assessed by questionnaires, and MS-related neurological impairment by a standardized neurological examination (EDSS). Results: Mean pain intensity was 4.0 (range, 0–10) and mean EDSS 3.7 (range, 0–8) in the overall sample. Currently present pain was reported by 81.6% of all patients. Disease duration and EDSS did not differ between patients with and without pain and were not correlated with quality or intensity of pain. Patients with pain had significantly higher scores for musculoskeletal complaints, but equal scores for exhaustion, gastrointestinal and cardiovascular complaints. Pain intensity correlated only with physical aspects, whereas quality of pain was additionally associated with increased avoidance, resignation and cognitive fatigue. Conclusion: As in other conditions, pain in MS must be assessed in a multidimensional way. Further research should be devoted to adapting existing models to an MS-specific model of pain. Keywords: pain intensity, quality of pain, pain

  18. Novel approach for quantitatively estimating element retention and material balances in soil profiles of recharge basins used for wastewater reclamation.

    Science.gov (United States)

    Eshel, Gil; Lin, Chunye; Banin, Amos

    2015-01-01

    We investigated changes in element content and distribution in soil profiles in a study designed to monitor the geochemical changes accruing in soil due to long-term secondary effluent recharge, and their impact on the sustainability of the Soil Aquifer Treatment (SAT) system. Since the initial elemental contents of the soils at the studied site were not available, we reconstructed them using scandium (Sc) as a conservative tracer. With this approach, we were able to produce a mass balance for 18 elements and evaluate the geochemical changes resulting from 19 years of effluent recharge. The approach also provides a better understanding of the role of soils as an adsorption filter for the heavy metals contained in the effluent. The soil mass balance suggests that 19 years of effluent recharge caused significant enrichment in Cu, Cr, Ni, Zn, Mg, K, Na, S and P contents in the upper 4 m of the soil profile. Combining the element load records over the 19 years suggests that Cr, Ni, and P inputs may not reach the groundwater (20 m deep), whereas the other elements may. Conversely, we found that 58, 60, and 30% of the initial content of Mn, Ca and Co, respectively, leached from the upper 2 m of the soil profile. These high percentages of Mn and Ca depletion from the basin soils may reduce the soil's ability to buffer decreases in redox potential (pe) and pH, respectively, which could initiate a reduction in the soil's holding capacity for heavy metals. Copyright © 2014 Elsevier B.V. All rights reserved.
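
    The conservative-tracer reconstruction reduces to scaling a reference element-to-Sc ratio by each horizon's measured Sc and differencing. The sketch below illustrates the arithmetic under invented numbers; the reference ratio and depth profile are assumptions, not the study's data.

    ```python
    # Reconstruct initial Cu from Sc, then compute the net change per horizon.
    cu_measured = [35.0, 28.0, 22.0]       # mg/kg at three depths
    sc_measured = [8.0, 7.5, 7.9]          # mg/kg at the same depths
    cu_sc_reference = 2.5                  # (Cu/Sc) ratio of unaffected soil

    for cu, sc in zip(cu_measured, sc_measured):
        cu_initial = cu_sc_reference * sc  # reconstructed pre-recharge Cu
        print(f"net Cu change: {cu - cu_initial:+.1f} mg/kg")
    ```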

  19. Inter- and intra-observer agreement of BI-RADS-based subjective visual estimation of amount of fibroglandular breast tissue with magnetic resonance imaging: comparison to automated quantitative assessment

    Energy Technology Data Exchange (ETDEWEB)

    Wengert, G.J.; Helbich, T.H.; Woitek, R.; Kapetas, P.; Clauser, P.; Baltzer, P.A. [Medical University of Vienna/ Vienna General Hospital, Department of Biomedical Imaging and Image-guided Therapy, Division of Molecular and Gender Imaging, Vienna (Austria); Vogl, W.D. [Medical University of Vienna, Department of Biomedical Imaging and Image-guided Therapy, Computational Imaging Research Lab, Wien (Austria); Weber, M. [Medical University of Vienna, Department of Biomedical Imaging and Image-guided Therapy, Division of General and Pediatric Radiology, Wien (Austria); Meyer-Baese, A. [State University of Florida, Department of Scientific Computing in Medicine, Tallahassee, FL (United States); Pinker, Katja [Medical University of Vienna/ Vienna General Hospital, Department of Biomedical Imaging and Image-guided Therapy, Division of Molecular and Gender Imaging, Vienna (Austria); State University of Florida, Department of Scientific Computing in Medicine, Tallahassee, FL (United States); Memorial Sloan-Kettering Cancer Center, Department of Radiology, Molecular Imaging and Therapy Services, New York City, NY (United States)

    2016-11-15

    To evaluate the inter-/intra-observer agreement of BI-RADS-based subjective visual estimation of the amount of fibroglandular tissue (FGT) with magnetic resonance imaging (MRI), and to investigate whether FGT assessment benefits from an automated, observer-independent, quantitative MRI measurement, by comparing both approaches. Eighty women with no imaging abnormalities (BI-RADS 1 and 2) were included in this institutional review board (IRB)-approved prospective study. All women underwent unenhanced breast MRI. Four radiologists independently assessed FGT with MRI by subjective visual estimation according to BI-RADS. Automated observer-independent quantitative measurement of FGT with MRI was performed using a previously described measurement system. Inter-/intra-observer agreements of qualitative and quantitative FGT measurements were assessed using Cohen's kappa (k). Inexperienced readers achieved moderate inter- and intra-observer agreement, and experienced readers substantial inter- and perfect intra-observer agreement, for subjective visual estimation of FGT. Practice and experience reduced observer-dependency. Automated observer-independent quantitative measurement of FGT was successfully performed and revealed only fair to moderate agreement (k = 0.209-0.497) with subjective visual estimations of FGT. Subjective visual estimation of FGT with MRI shows moderate intra-/inter-observer agreement, which can be improved by practice and experience. Automated observer-independent quantitative measurements of FGT are necessary to allow standardized risk evaluation. (orig.)
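
    Inter-observer agreement of the kind reported above is computed with Cohen's kappa, which discounts chance agreement. A minimal sketch follows; the two readers' category sequences are invented for illustration.

    ```python
    from collections import Counter

    reader1 = list("aabbccddabcdaabb")   # BI-RADS-style FGT categories a-d
    reader2 = list("aabbccdcabcdabbb")

    n = len(reader1)
    p_obs = sum(x == y for x, y in zip(reader1, reader2)) / n
    c1, c2 = Counter(reader1), Counter(reader2)
    p_exp = sum(c1[k] * c2[k] for k in set(c1) | set(c2)) / n ** 2
    kappa = (p_obs - p_exp) / (1 - p_exp)
    print(f"kappa = {kappa:.3f}")
    ```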

  20. Validation of Nutrient Intake Estimates Derived Using a Semi-Quantitative FFQ against 3 Day Diet Records in the Baltimore Longitudinal Study of Aging.

    Science.gov (United States)

    Talegawkar, S A; Tanaka, T; Maras, J E; Ferrucci, L; Tucker, K L

    2015-12-01

    To examine the relative validity of a multicultural FFQ used to derive nutrient intake estimates in a community-dwelling cohort of younger and older men and women, compared with those derived from 3 day (3d) diet records during the same time frame. Cross-sectional analyses. The Baltimore Longitudinal Study of Aging (BLSA), conducted in the Baltimore, MD and District of Columbia areas. A subset (n=468, aged 26 to 95 years (y), 47% female, 65% non-Hispanic white) from the BLSA, with complete data for nutrient estimates from an FFQ and 3d diet records. Pearson's correlation coefficients (energy adjusted and de-attenuated) for intakes of energy and 26 nutrients estimated from the FFQ and the mean of 3d diet records were calculated in a cross-sectional analysis. Rankings of individuals based on the FFQ for various nutrient intakes were compared to corresponding rankings based on the average of the 3d diet records. Bland-Altman plots were examined for a visual representation of agreement between the two assessment methods. All analyses were stratified by sex and age (above and below 65 y). Median nutrient intake estimates tended to be higher from the FFQ than from the average of the 3d diet records. Energy-adjusted and de-attenuated correlations between FFQ intake estimates and records ranged from 0.23 (sodium intake in men) to 0.81 (alcohol intake in women). The FFQ classified more than 70 percent of participants into either the same or an adjacent quartile category for all nutrients examined. Bland-Altman plots demonstrated good agreement between the assessment methods for most nutrients. This FFQ provides reasonably valid estimates of dietary intakes of younger and older participants of the BLSA.
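
    The "de-attenuated" correlations mentioned above are commonly obtained with a Rosner-Willett-style correction, which inflates the observed correlation for day-to-day variability in a finite number of record days. The variance components below are invented for illustration.

    ```python
    import math

    r_obs = 0.45        # observed FFQ vs 3d-record correlation
    s2_within = 1.8     # within-person (day-to-day) variance of records
    s2_between = 1.0    # between-person variance
    n_days = 3          # days of diet records

    lam = s2_within / s2_between
    r_true = r_obs * math.sqrt(1 + lam / n_days)
    print(f"de-attenuated r = {r_true:.2f}")
    ```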

  1. Development of quantitative structure-activity relationship (QSAR) models to predict the carcinogenic potency of chemicals I. Alternative toxicity measures as an estimator of carcinogenic potency.

    Science.gov (United States)

    Venkatapathy, Raghuraman; Wang, Ching Yi; Bruce, Robert Mark; Moudgal, Chandrika

    2009-01-15

    Determining the carcinogenicity and carcinogenic potency of new chemicals is both a labor-intensive and time-consuming process. In order to expedite the screening process, there is a need to identify alternative toxicity measures that may be used as surrogates for carcinogenic potency. Alternative toxicity measures for carcinogenic potency currently used in the literature include lethal dose (the dose that kills 50% of a study population, LD50), lowest-observed-adverse-effect level (LOAEL) and maximum tolerated dose (MTD). The purpose of this study was to investigate the correlation between tumor dose (TD50) and three alternative toxicity measures as estimators of carcinogenic potency. A second aim of this study was to develop a Classification and Regression Tree (CART) between TD50 and estimated/experimental predictor variables to predict the carcinogenic potency of new chemicals. Rat TD50s of 590 structurally diverse chemicals were obtained from the Cancer Potency Database, and the three alternative toxicity measures considered in this study were estimated using TOPKAT, a toxicity estimation software. Though poor correlations were obtained between carcinogenic potency and the three alternative toxicity measures (both experimental and TOPKAT) for the CPDB chemicals, a CART developed using experimental data with no missing values as predictor variables provided reasonable estimates of TD50 for nine chemicals that were part of an external validation set. However, if experimental values for the three alternative measures, mutagenicity and logP are not available in the literature, then either the CART developed using missing experimental values or estimated values may be used for making a prediction.
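
    A CART of the kind described is a regression tree over the predictor measures. The sketch below uses scikit-learn's DecisionTreeRegressor on synthetic data; the three descriptors standing in for log LD50, log LOAEL and log MTD, and the target, are invented.

    ```python
    import numpy as np
    from sklearn.tree import DecisionTreeRegressor

    rng = np.random.default_rng(3)
    X = rng.normal(size=(200, 3))     # stand-ins for log LD50, LOAEL, MTD
    y = 0.6 * X[:, 2] + 0.2 * X[:, 1] + rng.normal(0, 0.3, 200)  # log TD50

    tree = DecisionTreeRegressor(max_depth=3).fit(X[:150], y[:150])
    print("held-out R^2:", round(tree.score(X[150:], y[150:]), 2))
    ```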

  2. Genomic quantitative genetics to study evolution in the wild

    NARCIS (Netherlands)

    Gienapp, P.; Fior, Simone; Guillaume, Frédéric; Lasky, Jesse R.; Sork, Victoria L.; Csilléry, Katalin

    2017-01-01

    Quantitative genetic theory provides a means of estimating the evolutionary potential of natural populations. However, this approach was previously only feasible in systems where the genetic relatedness between individuals could be inferred from pedigrees or experimental crosses. The genomic

  3. MO-E-17A-04: Size-Specific Dose Estimate (SSDE) Provides a Simple Method to Calculate Organ Dose for Pediatric CT Examinations

    Energy Technology Data Exchange (ETDEWEB)

    Moore, B; Brady, S; Kaufman, R [St Jude Children's Research Hospital, Memphis, TN (United States); Mirro, A [Washington University, St. Louis, MO (United States)

    2014-06-15

    Purpose: To investigate the correlation of SSDE with organ dose in a pediatric population. Methods: Four anthropomorphic phantoms, representing a range of pediatric body habitus, were scanned with MOSFET dosimeters placed at 23 organ locations to determine absolute organ dosimetry. Phantom organ dosimetry was divided by phantom SSDE to determine the correlation between organ dose and SSDE. Correlation factors were then multiplied by patient SSDE to estimate patient organ dose. Patient demographics consisted of 352 chest and 241 abdominopelvic CT examinations, 22 ± 15 kg (range 5−55 kg) mean weight, and 6 ± 5 years (range 4 months to 23 years) mean age. Patient organ dose estimates were compared to published pediatric Monte Carlo study results. Results: Phantom effective diameters were matched with patient population effective diameters to within 4 cm. 23 organ correlation factors were determined in the chest and abdominopelvic regions across nine pediatric weight subcategories. For organs fully covered by the scan volume, correlation in the chest (average 1.1; range 0.7−1.4) and abdominopelvic region (average 0.9; range 0.7−1.3) was near unity. For organs that extended beyond the scan volume (i.e., skin, bone marrow, and bone surface), correlation was poor (average 0.3; range 0.1−0.4) in both the chest and abdominopelvic regions. Pediatric organ dosimetry was compared to published values and was found to agree in the chest to better than an average of 5% (27.6/26.2) and in the abdominopelvic region to better than 2% (73.4/75.0). Conclusion: Average correlation of SSDE and organ dosimetry was found to be better than ±10% for organs fully covered by the scan volume. This study provides a list of organ dose correlation factors for the chest and abdominopelvic regions, and describes a simple methodology to estimate individual pediatric patient organ dose based on patient SSDE.
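
    Once the correlation factors are tabulated, the estimation step is a single multiplication per organ: organ dose = k_organ x SSDE. The factors and SSDE below are placeholders, not the study's tabulated values.

    ```python
    ssde_mgy = 4.2                                  # patient SSDE (mGy)
    k = {"liver": 1.1, "lungs": 1.2, "skin": 0.3}   # illustrative factors

    for organ, factor in k.items():
        print(f"{organ}: {factor * ssde_mgy:.1f} mGy")
    ```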

  4. Skin temperature over the carotid artery provides an accurate noninvasive estimation of core temperature in infants and young children during general anesthesia.

    Science.gov (United States)

    Jay, Ollie; Molgat-Seon, Yannick; Chou, Shirley; Murto, Kimmo

    2013-12-01

    The accurate measurement of core temperature is an essential aspect of intraoperative management in children. Invasive measurement sites are accurate but carry some health risks and cannot be used in certain patients. An accurate form of noninvasive thermometry is therefore needed. Our aim was to develop, and subsequently validate, separate models for estimating core temperature using different skin temperatures with an individualized correction factor. Forty-eight pediatric patients (0-36 months) undergoing elective surgery were separated into a modeling group (MG, n = 28) and a validation group (VG, n = 20). Skin temperature was measured over the carotid artery (Tsk_carotid), upper abdomen (Tsk_abd), and axilla (Tsk_axilla), while nasopharyngeal temperature (Tnaso) was measured as a reference. In the MG, the derived models for estimating Tnaso were: Tsk_carotid + 0.52; Tsk_abd + (0.076[body mass] + 0.02); and Tsk_axilla + (0.081[body mass] − 0.66). After adjusting raw Tsk_carotid, Tsk_abd, and Tsk_axilla values in the independent VG using these models, the mean bias (predicted Tnaso − actual Tnaso [with 95% confidence intervals]) was +0.03 [+0.53, −0.50]°C, −0.05 [+1.02, −1.07]°C, and −0.06 [+1.21, −1.28]°C, respectively. The percentage of values within ±0.5°C of Tnaso was 93.2%, 75.4%, and 66.1% for Tsk_carotid, Tsk_abd, and Tsk_axilla, respectively. Sensitivity and specificity for detecting hypothermia were also assessed. Skin temperature over the carotid artery, with a simple correction factor of +0.52°C, provides a viable noninvasive estimate of Tnaso in young children during elective surgery with a general anesthetic. © 2013 John Wiley & Sons Ltd.
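
    The three site-specific models quoted above translate directly into code; only the example inputs (skin temperatures and body mass) are invented.

    ```python
    def tnaso_from_carotid(t_skin):
        return t_skin + 0.52

    def tnaso_from_abdomen(t_skin, mass_kg):
        return t_skin + (0.076 * mass_kg + 0.02)

    def tnaso_from_axilla(t_skin, mass_kg):
        return t_skin + (0.081 * mass_kg - 0.66)

    print(f"carotid: {tnaso_from_carotid(36.1):.2f} C")
    print(f"abdomen: {tnaso_from_abdomen(35.8, 9.0):.2f} C")   # 9 kg infant
    print(f"axilla:  {tnaso_from_axilla(36.3, 9.0):.2f} C")
    ```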

  5. Will circumcision provide even more protection from HIV to women and men? New estimates of the population impact of circumcision interventions.

    Science.gov (United States)

    Hallett, Timothy B; Alsallaq, Ramzi A; Baeten, Jared M; Weiss, Helen; Celum, Connie; Gray, Ron; Abu-Raddad, Laith

    2011-03-01

    Mathematical modelling has indicated that expansion of male circumcision services in high HIV prevalence settings can substantially reduce population-level HIV transmission. However, these projections need revision to incorporate new data on the effect of male circumcision on the risk of acquiring and transmitting HIV. Recent data on the effect of male circumcision during wound healing and on the risk of HIV transmission to women were synthesised based on four trials of circumcision among adults and new observational data on HIV transmission rates in stable partnerships from men circumcised at younger ages. New estimates were generated for the impact of circumcision interventions in two mathematical models, representing the HIV epidemics in Zimbabwe and Kisumu, Kenya. The models did not capture the interaction between circumcision, HIV and other sexually transmitted infections. An increase in the risk of HIV acquisition and transmission during wound healing is unlikely to have a major impact on circumcision interventions. However, it was estimated that circumcision confers a 46% reduction in the rate of male-to-female HIV transmission. If this reduction begins 2 years after the procedure, the impact of circumcision is substantially enhanced and accelerated compared with previous projections with no such effect: the infections averted by the intervention overall increase by 40%, and the number of infections averted among women doubles. Communities, and especially women, may benefit much more from circumcision interventions than had previously been predicted, and these results provide an even greater imperative to increase scale-up of safe male circumcision services.

  6. Real-time quantitative PCR with SYBR Green I detection for estimating copy numbers of nine drug resistance candidate genes in Plasmodium falciparum

    Directory of Open Access Journals (Sweden)

    Cravo Pedro VL

    2006-01-01

    Background: Evaluating copy numbers of given genes in Plasmodium falciparum parasites is of major importance for laboratory-based studies and epidemiological surveys. For instance, pfmdr1 gene amplification has been associated with resistance to quinine derivatives, and several genes involved in anti-oxidant defence may play an important role in resistance to antimalarial drugs, although their potential involvement has been overlooked. Methods: The ΔΔCt method of relative quantification using real-time quantitative PCR with SYBR Green I detection was adapted and optimized to estimate copy numbers of three genes previously indicated as putative candidates of resistance to quinolines and artemisinin derivatives, pfmdr1, pfatp6 (SERCA) and pftctp, and of six further genes involved in oxidative stress responses. Results: Using carefully designed specific RT-qPCR oligonucleotides, the methods were optimized for each gene and validated by the accurate measurement of the previously known number of copies of the pfmdr1 gene in the laboratory reference strains P. falciparum 3D7 and Dd2. Subsequently, Standard Operating Procedures (SOPs) were developed for the remaining genes under study and successfully applied to DNA obtained from dried filter blood spots of field isolates of P. falciparum collected in São Tomé & Principe, West Africa. Conclusion: The SOPs reported here may be used as a high-throughput tool to investigate the role of these drug resistance gene candidates in laboratory studies or large-scale epidemiological surveys.

  7. Real-time quantitative PCR with SYBR Green I detection for estimating copy numbers of nine drug resistance candidate genes in Plasmodium falciparum.

    Science.gov (United States)

    Ferreira, Isabel D; Rosário, Virgílio E do; Cravo, Pedro V L

    2006-01-18

    Evaluating copy numbers of given genes in Plasmodium falciparum parasites is of major importance for laboratory-based studies and epidemiological surveys. For instance, pfmdr1 gene amplification has been associated with resistance to quinine derivatives, and several genes involved in anti-oxidant defence may play an important role in resistance to antimalarial drugs, although their potential involvement has been overlooked. The ΔΔCt method of relative quantification using real-time quantitative PCR with SYBR Green I detection was adapted and optimized to estimate copy numbers of three genes previously indicated as putative candidates of resistance to quinolines and artemisinin derivatives, pfmdr1, pfatp6 (SERCA) and pftctp, and of six further genes involved in oxidative stress responses. Using carefully designed specific RT-qPCR oligonucleotides, the methods were optimized for each gene and validated by the accurate measurement of the previously known number of copies of the pfmdr1 gene in the laboratory reference strains P. falciparum 3D7 and Dd2. Subsequently, Standard Operating Procedures (SOPs) were developed for the remaining genes under study and successfully applied to DNA obtained from dried filter blood spots of field isolates of P. falciparum collected in São Tomé & Principe, West Africa. The SOPs reported here may be used as a high-throughput tool to investigate the role of these drug resistance gene candidates in laboratory studies or large-scale epidemiological surveys.
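
    The ΔΔCt arithmetic behind this relative quantification is compact enough to show directly. The sketch assumes ~100% PCR efficiency (one doubling per cycle); the Ct values are invented for illustration.

    ```python
    def copy_number_ratio(ct_target_sample, ct_ref_sample,
                          ct_target_calib, ct_ref_calib):
        """Relative copy number of target vs a single-copy reference gene,
        normalized to a calibrator strain (e.g., 3D7), via 2**(-ddCt)."""
        d_ct_sample = ct_target_sample - ct_ref_sample
        d_ct_calib = ct_target_calib - ct_ref_calib
        return 2.0 ** -(d_ct_sample - d_ct_calib)

    # A field isolate whose pfmdr1 amplifies ~2 cycles earlier than in 3D7:
    print(round(copy_number_ratio(22.1, 24.0, 23.5, 23.4), 1))  # -> 4.0
    ```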

  8. Towards cheminformatics-based estimation of drug therapeutic index: Predicting the protective index of anticonvulsants using a new quantitative structure-index relationship approach.

    Science.gov (United States)

    Chen, Shangying; Zhang, Peng; Liu, Xin; Qin, Chu; Tao, Lin; Zhang, Cheng; Yang, Sheng Yong; Chen, Yu Zong; Chui, Wai Keung

    2016-06-01

    The overall efficacy and safety profile of a new drug is partially evaluated by the therapeutic index in clinical studies and by the protective index (PI) in preclinical studies. In-silico predictive methods may facilitate the assessment of these indicators. Although QSAR and QSTR models can be used for predicting PI, their predictive capability has not been evaluated. To test this capability, we developed QSAR and QSTR models for predicting the activity and toxicity of anticonvulsants at accuracy levels above the literature-reported threshold (LT) of good QSAR models, as tested by both internal 5-fold cross-validation and an external validation method. These models showed significantly compromised PI predictive capability due to the cumulative errors of the QSAR and QSTR models. Therefore, in this investigation a new quantitative structure-index relationship (QSIR) model was devised, and it showed improved PI predictive capability that superseded the LT of good QSAR models. The QSAR, QSTR and QSIR models were developed using the support vector regression (SVR) method, with parameters optimized using a greedy search. The molecular descriptors relevant to the prediction of anticonvulsant activities, toxicities and PIs were analyzed by a recursive feature elimination method. The selected molecular descriptors are primarily associated with drug-like, pharmacological and toxicological features, and with those used in the published anticonvulsant QSAR and QSTR models. This study suggests that QSIR is useful for estimating the therapeutic index of drug candidates. Copyright © 2016. Published by Elsevier Inc.
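
    The key design choice above is to regress the index directly rather than divide two separately modelled quantities. A minimal SVR sketch of that direct fit follows; the descriptor matrix and target are synthetic stand-ins, not the study's data.

    ```python
    import numpy as np
    from sklearn.svm import SVR

    rng = np.random.default_rng(4)
    X = rng.normal(size=(120, 10))                           # descriptors
    y = X[:, 0] - 0.5 * X[:, 3] + rng.normal(0, 0.2, 120)    # "log PI"

    model = SVR(kernel="rbf", C=10.0, epsilon=0.1).fit(X[:90], y[:90])
    print("held-out R^2:", round(model.score(X[90:], y[90:]), 2))
    ```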

  9. Is life cycle assessment (LCA) a suitable method for quantitative CO2 saving estimations? The impact of field input on the LCA results for a pure vegetable oil chain

    Energy Technology Data Exchange (ETDEWEB)

    Chiaramonti, David [University of Florence, Mech. Eng. Faculty, CREAR and Department of Energetics 'Sergio Stecco', Via S. Marta, 3 - 50139 Florence (Italy); Recchia, Lucia [University of Florence, Agriculture Faculty, CREAR and Department of Agricultural and Forestry Engineering, Piazzale delle Cascine, 15 - 50144 Florence (Italy)

    2010-05-15

    The environmental and social sustainability of biofuel production and use is today the most critical issue for the development of support policies in this sector. The Life Cycle Assessment (LCA) methodology is commonly agreed to be the main tool for estimating the impact of biofuel chains, even in quantitative terms. This is also reflected in the recently issued EU Directive on the promotion of the use of energy from renewable sources (Renewable Energy Directive, RED). However, the results of Life Cycle Assessment work largely depend on the quality of the information given as input to the study, as very recent research has begun to investigate; in addition, comparing a large number of very different (technically, geographically, agronomically) biofuel chains, as some Life Cycle Assessments and reviews have tried to do, is a very difficult task due to the extremely large number of variable conditions and parameters. This paper, by considering a very specific biofuel chain (production and use of pure/straight sunflower oil in North-Central Italy), discusses some limits and constraints of the application of the LCA method. The work investigated within which boundaries Life Cycle Assessment could be implemented to perform quantitative assessments, as requested by the current supporting policies in the biofuel area. Results showed very large variations in the calculated CO2-equivalent emissions, illustrating how the achievable results depend on local agricultural practices and performances, even for such a small and well-defined biofuel chain. The adoption of the present standardized Life Cycle Assessment approach for generalized evaluations in the bioenergy sector and, in particular, for quantitative assessments should therefore be reconsidered. Concluding, LCA studies, even when addressing very specific and well-defined chains, should always report the bias of the calculations, given this range of variation of Life Cycle Assessment results.

  10. Source apportionment of the carbonaceous aerosol in Norway – quantitative estimates based on 14C, thermal-optical and organic tracer analysis

    Directory of Open Access Journals (Sweden)

    K. Stenström

    2011-09-01

    In the present study, source apportionment of the ambient summer and winter time particulate carbonaceous matter (PCM) in aerosol particles (PM1 and PM10) has been conducted for the Norwegian urban and rural background environment. Statistical treatment of data from thermal-optical, 14C and organic tracer analysis using Latin Hypercube Sampling has allowed quantitative estimates of seven different sources contributing to the ambient carbonaceous aerosol. These are: elemental carbon from combustion of biomass (ECbb) and fossil fuel (ECff); primary and secondary organic carbon arising from combustion of biomass (OCbb) and fossil fuel (OCff); primary biological aerosol particles (OCPBAP), which include plant debris (OCpbc) and fungal spores (OCpbs); and secondary organic aerosol from biogenic precursors (OCBSOA). Our results show that emissions from natural sources were particularly abundant in summer, with a more pronounced influence at the rural than at the urban background site. 80% of total carbon (TCp, corrected for the positive artefact) in PM10 and ca. 70% of TCp in PM1 could be attributed to natural sources at the rural background site in summer. Natural sources account for about 50% of TCp in PM10 at the urban background site as well. The natural source contribution was always dominated by OCBSOA, regardless of season, site and size fraction. During winter, anthropogenic sources totally dominated the carbonaceous aerosol (80–90%). Combustion of biomass contributed slightly more than fossil-fuel sources in winter, whereas emissions from fossil-fuel sources were more abundant in summer. Mass closure calculations show that PCM made significant contributions to the mass concentration of the ambient PM regardless of size fraction, season, and site. A larger fraction of PM1 (ca. 40–60%) was accounted for by carbonaceous matter compared to PM10 (ca. 40–50%), but only by a small margin. In general, there were no pronounced differences in the

  11. Quantitative estimation of AgNORs in inflammatory gingival overgrowth in pediatric patients and its correlation with the dental plaque status

    Directory of Open Access Journals (Sweden)

    Mukhopadhyay S

    2009-01-01

    Background and Objectives: Nucleolar organizer regions (NORs) are situated within the nucleolus of a cell. Their associated proteins are selectively stained by the silver colloid technique, known as the AgNOR technique; the AgNOR stain is visualized as black dots under the optical microscope. The present study aimed to quantitatively estimate AgNORs in the epithelial cells of various grades of gingival overgrowth and to compare them with normal gingival tissue. Materials and Methods: Only preadolescent and adolescent patients aged up to 14 years were selected: 20 normal and 31 diseased cases of gingival overgrowth. The tissue sections were stained with the hematoxylin and eosin (H&E) technique for routine histological evaluation, while AgNOR counts were performed with the improved one-step method of Ploton et al. Results: H&E staining revealed five different types of gingival overgrowth. The plaque index (PI), gingival index (GI), and AgNOR count were not significantly (P > 0.05) higher than those of control cases in pyogenic granuloma, puberty gingivitis, and drug-induced gingival overgrowth cases. In gingival fibromatosis cases, t-tests were used to compare the different indices. The PI, when compared with the AgNOR count, was significant at the 5% and 0.1% levels for mixed and permanent dentition, respectively. The GI, when compared with the AgNOR count, was significant at the 1% and 0.1% levels in mixed and permanent dentitions, respectively.

  12. High-Performance Liquid Chromatographic and High-Performance Thin-Layer Chromatographic Method for the Quantitative Estimation of Dolutegravir Sodium in Bulk Drug and Pharmaceutical Dosage Form.

    Science.gov (United States)

    Bhavar, Girija B; Pekamwar, Sanjay S; Aher, Kiran B; Thorat, Ravindra S; Chaudhari, Sanjay R

    2016-01-01

    Simple, sensitive, precise, and specific high-performance liquid chromatographic (HPLC) and high-performance thin-layer chromatographic (HPTLC) methods for the determination of dolutegravir sodium in bulk drug and pharmaceutical dosage form were developed and validated. In the HPLC method, analysis of the drug was carried out on an ODS C18 column (150 × 4.6 mm, 5 μm particle size) using a mixture of acetonitrile:water (pH 7.5) in the ratio of 80:20 v/v as the mobile phase, at a flow rate of 1 mL/min, with detection at 260 nm. This method was found to be linear in the concentration range of 5-35 μg/mL. The peak for dolutegravir sodium was observed at 3.0 ± 0.1 minutes. In the HPTLC method, analysis was performed on aluminum-backed plates pre-coated with silica gel G60 F254, using methanol:chloroform:formic acid in the proportion of 8:2:0.5 v/v/v as the mobile phase. This solvent system was found to give compact spots for dolutegravir sodium with an Rf value of 0.77 ± 0.01. Densitometric analysis of dolutegravir sodium was carried out in absorbance mode at 265 nm. Linear regression analysis showed good linearity with respect to peak area in the concentration range of 200-900 ng/spot. The methods were validated for precision, limit of detection (LOD), limit of quantitation (LOQ), accuracy, and specificity. Statistical analysis showed that both methods are repeatable and specific for the estimation of the said drug. The methods can be used for routine quality control analysis of dolutegravir sodium.
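
    Validation figures like these come from an ordinary least-squares calibration line: r2 measures linearity, and LOD/LOQ are conventionally taken as 3.3σ/S and 10σ/S (the ICH convention), with σ the residual standard deviation and S the slope. The sketch below uses synthetic peak areas over the 5-35 μg/mL range; nothing in it is the paper's data.

    ```python
    import numpy as np

    conc = np.array([5, 10, 15, 20, 25, 30, 35], dtype=float)   # ug/mL
    area = 120.0 * conc + 30 + np.random.default_rng(2).normal(0, 40, conc.size)

    slope, intercept = np.polyfit(conc, area, 1)
    pred = slope * conc + intercept
    r2 = 1 - ((area - pred) ** 2).sum() / ((area - area.mean()) ** 2).sum()

    sigma = (area - pred).std(ddof=2)   # residual SD (n - 2 dof)
    lod = 3.3 * sigma / slope
    loq = 10.0 * sigma / slope
    print(f"r2 = {r2:.4f}, LOD = {lod:.2f} ug/mL, LOQ = {loq:.2f} ug/mL")
    ```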

  13. Quantitative analysis of oyster larval proteome provides new insights into the effects of multiple climate change stressors, supplement to: Dineshram, R; Chandramouli, K; Ko, W K Ginger; Zhang, Huoming; Qian, Pei Yuan; Ravasi, Timothy; Thiyagarajan, Vengatesen (2016): Quantitative analysis of oyster larval proteome provides new insights into the effects of multiple climate change stressors. Global Change Biology, 22(6), 2054-2068

    KAUST Repository

    Dineshram, R

    2016-01-01

    The metamorphosis of planktonic larvae of the Pacific oyster (Crassostrea gigas) underpins their complex life-history strategy by switching on the molecular machinery required for sessile life and building calcite shells. Metamorphosis becomes a survival bottleneck, which will be pressured by different anthropogenically induced climate change-related variables. Therefore, it is important to understand how metamorphosing larvae interact with emerging climate change stressors. To predict how larvae might be affected in a future ocean, we examined changes in the proteome of metamorphosing larvae under multiple stressors: decreased pH (pH 7.4), increased temperature (30 °C), and reduced salinity (15 psu). Quantitative protein expression profiling using iTRAQ-LC-MS/MS identified more than 1300 proteins. Decreased pH had a negative effect on metamorphosis by down-regulating several proteins involved in energy production, metabolism, and protein synthesis. However, warming switched on these down-regulated pathways at pH 7.4. Under multiple stressors, cell signaling, energy production, growth, and developmental pathways were up-regulated, although metamorphosis was still reduced. Despite the lack of lethal effects, significant physiological responses to both individual and interacting climate change related stressors were observed at proteome level. The metamorphosing larvae of the C. gigas population in the Yellow Sea appear to have adequate phenotypic plasticity at the proteome level to survive in future coastal oceans, but with developmental and physiological costs.

  14. Ionization Energies, Electron Affinities, and Polarization Energies of Organic Molecular Crystals: Quantitative Estimations from a Polarizable Continuum Model (PCM)–Tuned Range-Separated Density Functional Approach

    KAUST Repository

    Sun, Haitao

    2016-05-16

    We propose a new methodology for the first-principles description of the electronic properties relevant for charge transport in organic molecular crystals. This methodology, which is based on the combination of a non-empirical, optimally tuned range-separated hybrid functional with the polarizable continuum model, is applied to a series of eight representative molecular semiconductor crystals. We show that it provides ionization energies, electron affinities, and transport gaps in very good agreement with experimental values as well as with the results of many-body perturbation theory within the GW approximation at a fraction of the computational costs. Hence, this approach represents an easily applicable and computationally efficient tool to estimate the gas-to-crystal-phase shifts of the frontier-orbital quasiparticle energies in organic electronic materials.

  15. Rigour in quantitative research.

    Science.gov (United States)

    Claydon, Leica Sarah

    2015-07-22

    This article, which forms part of the research series, addresses scientific rigour in quantitative research. It explores the basis and use of quantitative research and the nature of scientific rigour. It examines how the reader may determine whether quantitative research results are accurate, the questions that should be asked to determine accuracy, and the checklists that may be used in this process. Quantitative research has advantages in nursing, since it can provide numerical data to help answer questions encountered in everyday practice.

  16. LiST modelling with monitoring data to estimate impact on child mortality of an ORS and zinc programme with public sector providers in Bihar, India.

    Science.gov (United States)

    Ayyanat, Jayachandran A; Harbour, Catherine; Kumar, Sanjeev; Singh, Manjula

    2018-01-05

    Many interventions have attempted to increase vulnerable and remote populations' access to ORS and zinc to reduce child mortality from diarrhoea. However, the impact of these interventions is difficult to measure. From 2010 to 2015, the Micronutrient Initiative (MI) worked with the public sector in Bihar, India to enable community health workers to treat and report uncomplicated child diarrhoea with ORS and zinc. We describe how we estimated the programme's impact on child mortality with Lives Saved Tool (LiST) modelling and data from MI's management information system (MIS). This study demonstrates that using LiST modelling and MIS data is a viable option for evaluating programmes to reduce child mortality. We used MI's programme monitoring data to estimate coverage rates and LiST modelling software to estimate programme impact on child mortality. Four scenarios estimated the effects of different rates of programme scale-up and programme coverage on estimated child mortality by measuring children's lives saved. The programme saved an estimated 806-975 children under 5 who had diarrhoea during the five-year project phase. Increasing ORS and zinc coverage rates to 19.8% and 18.3%, respectively, under public sector coverage with effective treatment would have increased the programme's impact on child mortality and could have achieved the project goal of saving 4200 children's lives during the five-year programme. Programme monitoring data can be used with LiST modelling software to estimate coverage rates and programme impact on child mortality. This modelling approach may cost less and yield estimates sooner than directly measuring programme impact with population-based surveys. However, users must be cautious about relying on modelled estimates of impact and ensure that the programme monitoring data used are complete and precise about the programme aspects that are modelled. Otherwise, LiST may mis-estimate impact on child mortality. Further, LiST software may require modifications

  17. Estimativa de estro em vacas leiteiras utilizando métodos quantitativos preditivos / Dairy cows estrus estimation using predictive and quantitative methods

    Directory of Open Access Journals (Sweden)

    Irenilza de Alencar Nääs

    2008-11-01

    The increase in milk production was due to the use of several technologies that have been developed for the sector, mainly those related to genetics and herd management. Accurate estrus detection in dairy cows is a limiting factor in the reproduction efficiency of dairy cattle, and it has been considered the most important deficiency in the field of reproduction. Failing to detect estrus efficiently may cause losses for the producer. Quantitative predictive methods based on historical data and specialist knowledge may allow, from an organized database, the prediction of estrus patterns with lower error. This research compared the precision of estrus prediction techniques for freestall-confined Holstein dairy cows using quantitative predictive methods, through the interpolation of intermediate points of a historical herd data set. A rule base was formulated, with the weight of each statement taking a value in the interval 0 to 1; these limits were used to generate a fuzzy membership function whose output was the estrus prediction. In the following stage, a data mining technique was applied using the parameters of movement rate, milk production, days of lactation and mounting behavior, and a decision tree was built to analyse the most significant parameters for predicting estrus in dairy cows. The results indicate that the prediction of estrus incidence may be achieved either using the association of the cow's movement (87%, with an estimated error of 4%) or the observation of mounting behavior (78%, with an estimated error of 11%).

  18. Pyrolysis and co-composting of municipal organic waste in Bangladesh: A quantitative estimate of recyclable nutrients, greenhouse gas emissions, and economic benefits.

    Science.gov (United States)

    Mia, Shamim; Uddin, Md Ektear; Kader, Md Abdul; Ahsan, Amimul; Mannan, M A; Hossain, Mohammad Monjur; Solaiman, Zakaria M

    2018-02-10

    Waste causes environmental pollution and greenhouse gas (GHG) emissions when it is not managed sustainably. In Bangladesh, municipal organic waste (MOW) is only partially collected and is landfilled, causing deterioration of the environment and urging a recycling-oriented waste management system. In this study, we propose a waste management system based on pyrolysis of selected MOW for biochar production and composting of the remainder with biochar as an additive. We estimated the carbon (C), nitrogen (N), phosphorus (P) and potassium (K) recycling potentials of the new waste management techniques. Waste generation of a city was calculated using population density and the per capita waste generation rate (PWGR). Two indicators of economic development, gross domestic product (GDP) and per capita gross national income (GNI), were used to adjust the PWGR, with a projected contribution of 5-20% to waste generation. The projected PWGR was then validated with a survey. Waste generation from urban areas of Bangladesh in 2016 was estimated at between 15,507 and 15,888 t/day, with a large share (∼75%) of organic waste. Adoption of the proposed system could produce 3936 t/day of biochar-blended compost with an annual return of US$210 million in 2016, while reducing GHG emissions substantially (-503 CO2e per tonne of municipal waste). Moreover, the proposed system would be able to recover ∼46%, 54%, 54% and 61% of the total C, N, P and K content of the initial waste, respectively. We also provide a projection of waste generation and nutrient recycling potentials for the year 2035. The proposed method could be a self-sustaining policy option for waste management, as it would generate ∼US$51 from each tonne of waste. Moreover, a significant amount of nutrients can be recycled to agriculture while contributing to the reduction of environmental pollution and GHG emissions. Copyright © 2018 Elsevier Ltd. All rights reserved.

  19. Rapid exposure and loss estimates for the May 12, 2008 Mw 7.9 Wenchuan earthquake provided by the U.S. Geological Survey's PAGER system

    Science.gov (United States)

    Earle, P.S.; Wald, D.J.; Allen, T.I.; Jaiswal, K.S.; Porter, K.A.; Hearne, M.G.

    2008-01-01

    One half-hour after the May 12th Mw 7.9 Wenchuan, China earthquake, the U.S. Geological Survey's Prompt Assessment of Global Earthquakes for Response (PAGER) system distributed an automatically generated alert stating that 1.2 million people were exposed to severe-to-extreme shaking (Modified Mercalli Intensity VIII or greater). It was immediately clear that a large-scale disaster had occurred. These alerts were widely distributed and referenced by major media outlets, and were used by governments and by scientific and relief agencies to guide their responses. The PAGER alerts and Web pages included predictive ShakeMaps showing estimates of ground shaking, maps of population density, and a list of estimated intensities at impacted cities. Manually revised alerts issued in the following hours included the dimensions of the fault rupture. Within half a day, PAGER's estimates of the population exposed to strong shaking levels stabilized at 5.2 million people. A coordinated research effort is underway to extend PAGER's capability to include estimates of the number of casualties. We are pursuing loss models that will allow PAGER the flexibility to use detailed inventory and engineering results in regions where these data are available, while also calculating loss estimates in regions where little is known about the type and strength of the built infrastructure. Prototype PAGER fatality estimates are currently implemented and can be manually triggered. In the hours following the Wenchuan earthquake, these models predicted fatalities in the tens of thousands.

  20. The added value that increasing levels of diagnostic information provide in prognostic models to estimate hospital mortality for adult intensive care patients

    NARCIS (Netherlands)

    de Keizer, N. F.; Bonsel, G. J.; Goldfad, C.; Rowan, K. M.

    2000-01-01

    To investigate in a systematic, reproducible way the potential of adding increasing levels of diagnostic information to prognostic models for estimating hospital mortality. Prospective cohort study. Thirty UK intensive care units (ICUs) participating in the ICNARC Case Mix Programme. Eight thousand

  1. So You Think You Look Young? Matching Older Adults' Subjective Ages with Age Estimations Provided by Younger, Middle-Aged, and Older Adults

    Science.gov (United States)

    Kotter-Gruhn, Dana; Hess, Thomas M.

    2012-01-01

    Perceived age plays an important role in the context of age identity and social interactions. To examine how accurate individuals are in estimating how old they look and how old others are, younger, middle-aged, and older adults rated photographs of older target persons (for whom we had information about objective and subjective age) in terms of…

  2. Evaluation of postprocessing dual-energy quantitative computed tomography

    NARCIS (Netherlands)

    C. van Kuijk (Cornelis)

    1991-01-01

    CT scanners can be used to provide quantitative information on body composition. The main application is bone mineral content estimation within the lumbar vertebral body, usually done with a single-energy technique. The estimates obtained with this technique are influenced

  3. Bias in the Cq value observed with hydrolysis probe based quantitative PCR can be corrected with the estimated PCR efficiency value

    NARCIS (Netherlands)

    Tuomi, Jari Michael; Voorbraak, Frans; Jones, Douglas L.; Ruijter, Jan M.

    2010-01-01

    For real-time monitoring of PCR amplification of DNA, quantitative PCR (qPCR) assays use various fluorescent reporters. DNA binding molecules and hybridization reporters (primers and probes) only fluoresce when bound to DNA and result in the non-cumulative increase in observed fluorescence.
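
    The correction named in the title rests on the exponential amplification model: with estimated efficiency E (E = 2 means perfect doubling), the starting quantity scales as N0 = Nq / E^Cq, so quantification ratios should use each amplicon's own E. The sketch below shows how even a small efficiency error biases an uncorrected ratio; all values are invented.

    ```python
    def ratio_efficiency_corrected(e_target, cq_target, e_ref, cq_ref):
        """Target/reference starting-quantity ratio using per-amplicon
        efficiencies (Pfaffl-style correction)."""
        return (e_target ** -cq_target) / (e_ref ** -cq_ref)

    # Same Cq for target and reference, but the target amplifies at 95%
    # efficiency (E = 1.9) instead of the assumed 100% (E = 2.0):
    print(round(ratio_efficiency_corrected(1.9, 24.0, 2.0, 24.0), 2))  # ~3.42
    ```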

  4. Comparison of visual scoring and quantitative planimetry methods for estimation of global infarct size on delayed enhanced cardiac MRI and validation with myocardial enzymes

    Energy Technology Data Exchange (ETDEWEB)

    Mewton, Nathan, E-mail: nmewton@gmail.com [Hopital Cardiovasculaire Louis Pradel, 28, Avenue Doyen Lepine, 69677 Bron cedex, Hospices Civils de Lyon (France); CREATIS-LRMN (Centre de Recherche et d'Applications en Traitement de l'Image et du Signal), Universite Claude Bernard Lyon 1, UMR CNRS 5220, U 630 INSERM (France); Revel, Didier [Hopital Cardiovasculaire Louis Pradel, 28, Avenue Doyen Lepine, 69677 Bron cedex, Hospices Civils de Lyon (France); CREATIS-LRMN (Centre de Recherche et d'Applications en Traitement de l'Image et du Signal), Universite Claude Bernard Lyon 1, UMR CNRS 5220, U 630 INSERM (France); Bonnefoy, Eric [Hopital Cardiovasculaire Louis Pradel, 28, Avenue Doyen Lepine, 69677 Bron cedex, Hospices Civils de Lyon (France); Ovize, Michel [Hopital Cardiovasculaire Louis Pradel, 28, Avenue Doyen Lepine, 69677 Bron cedex, Hospices Civils de Lyon (France); INSERM Unite 886 (France); Croisille, Pierre [Hopital Cardiovasculaire Louis Pradel, 28, Avenue Doyen Lepine, 69677 Bron cedex, Hospices Civils de Lyon (France); CREATIS-LRMN (Centre de Recherche et d'Applications en Traitement de l'Image et du Signal), Universite Claude Bernard Lyon 1, UMR CNRS 5220, U 630 INSERM (France)

    2011-04-15

    Purpose: Although delayed enhanced CMR has become a reference method for infarct size quantification, there is no ideal method to quantify total infarct size in routine clinical practice. In a prospective study we compared the performance and post-processing time of a global visual scoring method to standard quantitative planimetry, and we compared both methods to the peak values of myocardial biomarkers. Materials and methods: This study had local ethics committee approval; all patients gave written informed consent. One hundred and three patients admitted with reperfused AMI to our intensive care unit had a complete CMR study with gadolinium-contrast injection 4 ± 2 days after admission. A global visual score was defined on a 17-segment model and compared with the quantitative planimetric evaluation of hyperenhancement. The peak values of serum troponin I (TnI) and creatine kinase (CK) release were measured in each patient. Results: The mean percentage of total left ventricular myocardium with hyperenhancement determined by the quantitative planimetry method was 20.1 ± 14.6%, with a range of 1-68%. There was an excellent correlation between quantitative planimetry and visual global scoring for the measurement of hyperenhancement extent (r = 0.94; y = 1.093x + 0.87; SEE = 1.2; P < 0.001). The Bland-Altman plot showed good concordance between the two approaches (mean of the differences = 1.9%, with a standard deviation of 4.7). Mean post-processing time for quantitative planimetry was significantly longer than for visual scoring (23.7 ± 5.7 min vs 5.0 ± 1.1 min, respectively; P < 0.001). Correlation between peak CK and quantitative planimetry was r = 0.82 (P < 0.001), and r = 0.83 (P < 0.001) with visual global scoring. Correlation between peak troponin I and quantitative planimetry was r = 0.86 (P < 0.001), and r = 0.85 (P < 0.001) with visual global scoring. Conclusion: A visual approach based on a 17-segment model allows a rapid
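
    The Bland-Altman concordance figures quoted above (mean difference and its standard deviation) are straightforward to compute from paired measurements. The sketch below uses invented pairs, not the study's patients.

    ```python
    import numpy as np

    planimetry = np.array([12.0, 25.5, 8.2, 40.1, 19.7])  # % LV, method A
    visual = np.array([13.1, 27.0, 9.0, 42.5, 21.0])      # % LV, method B

    diff = visual - planimetry
    bias = diff.mean()                    # mean of the differences
    loa = 1.96 * diff.std(ddof=1)         # 95% limits-of-agreement half-width
    print(f"bias = {bias:.1f}%, limits of agreement = +/-{loa:.1f}%")
    ```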

  5. Fusion of multiple radar-based quantitative precipitation estimates (QPE) for high-resolution flash flood forecasting in large urban areas

    Science.gov (United States)

    Rafieei Nasab, A.; Norouzi, A.; Kim, B.; Seo, D. J.

    2014-12-01

    With increasingly widespread use of weather radars, multiple radar-based QPEs are now routinely available in many places. In the Dallas-Fort Worth Metroplex (DFW), for example, the Multisensor Precipitation Estimator (MPE), Q2 (Next Generation QPE) and CASA (Collaborative Adaptive Sensing of Atmosphere) QPEs are available. Because these products are based on different radar systems, different sources of additional information, and/or processing algorithms, they have different error characteristics and spatiotemporal resolutions. In this work, we explore improving the accuracy of the highest-resolution radar QPE by fusing it with lower-resolution QPE(s). Two approaches are examined. The first is to pose fusion as a Fisher estimation problem in which the state vector is the true unknown precipitation at the highest resolution and the observation vector is made of all radar QPEs at their native resolutions. The second is to upscale the higher resolution QPE(s) to the lowest resolution, merge them via optimal estimation, and disaggregate the merged estimates based on the spatiotemporal patterns of precipitation in the high resolution QPE. In both approaches, we compare Fisher estimation with conditional bias-penalized Fisher-like estimation which improves estimation of heavy-to-extreme precipitation. For evaluation, we compare the precipitation estimates from the two approaches with rain gauge observations in the DFW area.
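
    The merging step of the second approach can be illustrated with a minimal inverse-variance (Fisher-style) weighting of two co-located QPE grids; the error variances and fields below are assumed, and the actual study additionally works at native resolutions and adds a conditional-bias penalty:

        import numpy as np

        rng = np.random.default_rng(0)
        qpe_a = rng.gamma(2.0, 1.5, (100, 100))            # hypothetical high-res QPE (mm)
        qpe_b = qpe_a + rng.normal(0.0, 1.2, qpe_a.shape)  # hypothetical coarser, noisier QPE
        var_a, var_b = 0.8, 1.5                            # assumed error variances

        w_a = (1.0 / var_a) / (1.0 / var_a + 1.0 / var_b)  # inverse-variance weights (sum to 1)
        merged = w_a * qpe_a + (1.0 - w_a) * qpe_b         # minimum-variance linear estimate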

  6. Hypoxia off the Changjiang (Yangtze River) estuary and in the adjacent East China Sea: Quantitative approaches to estimating the tidal impact and nutrient regeneration.

    Science.gov (United States)

    Zhu, Zhuo-Yi; Wu, Hui; Liu, Su-Mei; Wu, Ying; Huang, Da-Ji; Zhang, Jing; Zhang, Guo-Sen

    2017-12-15

    Large areas of hypoxia have been reported off the Changjiang Estuary and in the East China Sea. Five cruises, covering winter, spring, and summer, were carried out from 2007 to 2013 in this region, and in August 2013 (summer) an extensive hypoxic event (11,150 km²) was observed, characterized by an estimated bulk oxygen depletion of 5.1 million tons. A strong tidal impact was observed in association with the bottom oxygen depletion, with the periodicity of diel variations in dissolved oxygen being 12 h (i.e., similar to the tidal cycle). A conservative estimate of nutrient regeneration suggested that during the hypoxic event of August 2013, the amount of regenerated nitrogen (as nitrate) and phosphorus (as dissolved inorganic phosphorus) was 27,000-30,000 tons and 1300-41,000 tons, respectively. Estimates of the absolute (bulk) regenerated nutrient fluxes were much greater than the conservative estimates. Copyright © 2017 Elsevier Ltd. All rights reserved.

  7. Critical Quantitative Inquiry in Context

    Science.gov (United States)

    Stage, Frances K.; Wells, Ryan S.

    2014-01-01

    This chapter briefly traces the development of the concept of critical quantitative inquiry, provides an expanded conceptualization of the tasks of critical quantitative research, offers theoretical explanation and justification for critical research using quantitative methods, and previews the work of quantitative criticalists presented in this…

  8. Estimation of Land Surface Temperature for the Quantitative Analysis of Land Cover of Lower Areas of Sindh to Assess the Impacts of Climate Variability

    Science.gov (United States)

    Qaisar, Maha

    2016-07-01

    Due to present land use practices and climate variability, drastic shifts in regional climate and land cover are readily observed, and their future losses and gains are well predicted. There is therefore an increasing need for data on land-cover changes at narrow and broad spatial scales. In this study, a remote sensing-based technique for land-cover-change analysis is applied to the lower Sindh areas for the last decade. Landsat satellite products were analyzed on an alternate-year basis from 1990 to 2016, and land-cover-change magnitudes were measured and mapped for alternate years. Land Surface Temperature (LST) is one of the critical elements in the natural phenomena of surface energy and water balance at local and global extents. LST was computed from the Landsat thermal bands via brightness temperature together with a vegetation index; the normalized difference vegetation index (NDVI) was interpreted and mapped. LST reflected NDVI patterns, with the complexity of the vegetation patterns. In addition, Object Based Image Analysis (OBIA) was performed to classify five major classes (water, vegetation, urban, marshy lands and barren lands) with significant map layouts. The Pakistan Meteorological Department provided the climate data, in which rainfall, temperature and air temperature are included. Once the LST and OBIA were obtained, overlay analysis was performed to correlate the LST results with OBIA, and the LST with the meteorological data, to ascertain the changes in land cover due to increasing LST. The satellite-derived LST was also correlated with climate data for environmental analysis and to estimate Land Surface Temperature for assessing the adverse impacts of climate variability. This study's results demonstrate that the land-cover changes in the lower areas of Sindh, including the Indus Delta, mostly involve variations in land-cover conditions due to inter-annual climatic variability and temporary shifts in seasonality. It is also concluded
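
    The LST workflow outlined above (thermal band to brightness temperature to NDVI-based emissivity to LST) can be sketched as follows; the calibration and thermal constants are published Landsat 5 TM band-6 values, while the emissivity thresholds and inputs are assumptions:

        import numpy as np

        def land_surface_temperature(dn, red, nir):
            """Single-band LST sketch for Landsat 5 TM; dn, red, nir are float arrays."""
            radiance = 0.055375 * dn + 1.18243           # TOA radiance, band-6 gain/offset
            k1, k2 = 607.76, 1260.56                     # TM band-6 thermal constants
            bt = k2 / np.log(k1 / radiance + 1.0)        # brightness temperature (K)
            ndvi = (nir - red) / (nir + red + 1e-9)
            pv = np.clip((ndvi - 0.2) / (0.5 - 0.2), 0.0, 1.0) ** 2  # vegetation proportion
            emissivity = 0.986 + 0.004 * pv              # NDVI-threshold emissivity (assumed)
            lam, rho = 11.5e-6, 1.438e-2                 # band-6 wavelength (m), h*c/k_B (m K)
            return bt / (1.0 + (lam * bt / rho) * np.log(emissivity))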

  9. The Need to Provide for Security in Old Age in Hierarchy of Needs-An Estimation of Its Ranking within the Polish Population

    Science.gov (United States)

    Roszkiewicz, Malgorzata

    2004-01-01

    The results of studies conducted in the last 5 years in Poland formed the basis for the assumption that amongst many needs an individual or a Polish household seeks to satisfy, the need to provide for security in old age takes a prominent position. Determining the position of this need among other needs as defined in Schrab's classification…

  10. Evaluation of the health impact of environmental pollution and quantitative evaluation of health risks

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    2005-09-15

    The calculation of a health impact is of great interest to decision-makers and to all of the actors concerned. It constitutes a first step in organizing a public debate around risk acceptance, in analyzing the feasibility of an epidemiological inquiry or surveillance, or in scaling an activity that emits pollutants into the natural environment. Several conclusions emerge: it is justified to estimate a health impact from an excess health risk, especially one derived from animal data. It is conceivable to go beyond an estimate of individual risk alone and to calculate a number of excess cases in the population concerned. The working group underlines that the characteristics of the situation are the determining factor for the type of response to provide. The size of the population is an important element, and a situation must not be underestimated merely because the excess-case calculation yields a number of cases below one, suggesting that the impact is minor or negligible while the individual probability is in fact high. The health impact, expressed as the number of excess cancer cases in an exposed population, is quantified as the mean value of the excess health risk multiplied by the population size, and is expressed with a confidence interval. The health impact can also be expressed as a percentage of the population present in the exposure area, which goes beyond the comparison benchmarks usually put forward; this practice should be encouraged. An analysis of uncertainties must be performed as often as possible. (N.C.)
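
    The impact calculation described, excess cases as mean excess risk times population size carried through with a confidence interval, reduces to a one-line computation; all numbers below are hypothetical:

        risk_low, risk_mean, risk_high = 0.5e-5, 2.0e-5, 5.0e-5  # lifetime excess risk (hypothetical)
        population = 120_000                                     # exposed population (hypothetical)

        cases = [r * population for r in (risk_low, risk_mean, risk_high)]
        print(f"excess cases: {cases[1]:.1f} (CI {cases[0]:.1f} to {cases[2]:.1f})")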

  11. Isolation and Quantitative Estimation of Diesel Exhaust and Carbon Black Particles Ingested by Lung Epithelial Cells and Alveolar Macrophages In Vitro

    Science.gov (United States)

    A new procedure for isolating and estimating ingested carbonaceous diesel exhaust particles (DEP) or carbon black (CB) particles by lung epithelial cells and macrophages is described. Cells were incubated with DEP or CB to examine cell-particle interaction and ingestion. After va...

  12. Probabilistic quantitative microbial risk assessment model of norovirus from wastewater irrigated vegetables in Ghana using genome copies and fecal indicator ratio conversion for estimating exposure dose

    DEFF Research Database (Denmark)

    Owusu-Ansah, Emmanuel de-Graft Johnson; Sampson, Angelina; Amponsah, Samuel K.

    2017-01-01

    In some cases the difference was >2 orders of magnitude. All scenarios using genome copies met the 10⁻⁴ DALY per person per year benchmark for consumption of vegetables irrigated with wastewater, although these results are considered to be highly conservative risk estimates. The fecal indicator conversion ratio...

  13. Estimating systemic fibrosis by combining galectin-3 and ST2 provides powerful risk stratification value for patients after acute decompensated heart failure.

    Science.gov (United States)

    Wang, Chao-Hung; Yang, Ning-I; Liu, Min-Hui; Hsu, Kuang-Hung; Kuo, Li-Tang

    2016-01-01

    Two fibrosis biomarkers, galectin-3 (Gal-3) and suppression of tumorigenicity 2 (ST2), provide prognostic value additive to natriuretic peptides and traditional risk factors in patients with heart failure (HF). However, it is to be investigated whether their combined measurement before discharge provides incremental risk stratification for patients after acute HF. A total of 344 patients with acute HF were analyzed with Gal-3, and ST2 measured. Patients were prospectively followed for 3.7 ± 1.3 years for deaths, and composite events (death/HF-related re-hospitalizations). The levels of Gal-3 and ST2 were only slightly related (r = 0.20, p risk factors. According to the cutoff at median values, patients were separated into four subgroups based on high and low Gal-3 (HG and LG, respectively) and ST2 levels (HS and LS, respectively). Kaplan-Meier survival curves showed that HGHS powerfully identified patients at risk of mortality (Log rank = 21.27, p risk stratification value.

  14. Using laser ablation inductively coupled plasma mass spectrometry (LA-ICP-MS) to characterize copper, zinc and mercury along grizzly bear hair providing estimate of diet

    Energy Technology Data Exchange (ETDEWEB)

    Noël, Marie, E-mail: marie.noel@stantec.com [Stantec Consulting Ltd. 2042 Mills Road, Unit 11, Sidney BC V8L 4X2 (Canada); Christensen, Jennie R., E-mail: jennie.christensen@stantec.com [Stantec Consulting Ltd. 2042 Mills Road, Unit 11, Sidney BC V8L 4X2 (Canada); Spence, Jody, E-mail: jodys@uvic.ca [School of Earth and Ocean Sciences, Bob Wright Centre A405, University of Victoria, PO BOX 3065 STN CSC, Victoria, BC V8W 3V6 (Canada); Robbins, Charles T., E-mail: ctrobbins@wsu.edu [School of the Environment and School of Biological Sciences, Washington State University, Pullman, WA 99164-4236 (United States)

    2015-10-01

    We enhanced an existing technique, laser ablation inductively coupled plasma mass spectrometry (LA-ICP-MS), to function as a non-lethal tool in the temporal characterization of trace element exposure in wild mammals. Mercury (Hg), copper (Cu), cadmium (Cd), lead (Pb), iron (Fe) and zinc (Zn) were analyzed along the hair of captive and wild grizzly bears (Ursus arctos horribilis). Laser parameters were optimized (consecutive 2000 μm line scans along the middle line of the hair at a speed of 50 μm/s; spot size = 30 μm) for consistent ablation of the hair. A pressed pellet of reference material DOLT-2 and sulfur were used as external and internal standards, respectively. Our newly adapted method passed the quality control tests, with strong correlations between trace element concentrations obtained using LA-ICP-MS and those obtained with regular solution-ICP-MS (r² = 0.92, 0.98, 0.63, 0.57, 0.99 and 0.90 for Hg, Fe, Cu, Zn, Cd and Pb, respectively). Cross-correlation analyses revealed good reproducibility between trace element patterns obtained from hair collected from the same bear. One exception was Cd, for which external contamination was observed, resulting in poor reproducibility. In order to validate the method, we used LA-ICP-MS on the hair of five captive grizzly bears fed known and varying amounts of cutthroat trout over a period of 33 days. Trace element patterns along the hair revealed strong Hg, Cu and Zn signals coinciding with fish consumption. Accordingly, significant correlations between Hg, Cu, and Zn in the hair and Hg, Cu, and Zn intake were evident, and we were able to develop accumulation models for each of these elements. While the use of LA-ICP-MS for the monitoring of trace elements in wildlife is in its infancy, this study highlights the robustness and applicability of this newly adapted method. - Highlights: • LA-ICP-MS provides temporal trace metal exposure information for wild grizzly bears. • Cu and Zn temporal exposures provide

  15. Comparing Top-down and Bottom-up Estimates of Methane Emissions across Multiple U.S. Basins Provides Insights into National Oil and Gas Emissions and Mitigation Strategies

    Science.gov (United States)

    Hamburg, S.; Alvarez, R.; Lyon, D. R.; Zavala-Araiza, D.

    2016-12-01

    Several recent studies quantified regional methane emissions in U.S. oil and gas (O&G) basins using top-down approaches such as airborne mass balance measurements. These studies apportioned total methane emissions to O&G based on hydrocarbon ratios or subtracting bottom-up estimates of other sources. In most studies, top-down estimates of O&G methane emissions exceeded bottom-up emission inventories. An exception is the Barnett Shale Coordinated Campaign, which found agreement between aircraft mass balance estimates and a custom emission inventory. Reconciliation of Barnett Shale O&G emissions depended on two key features: 1) matching the spatial domains of top-down and bottom-up estimates, and 2) accounting for fat-tail sources in site-level emission factors. We construct spatially explicit custom emission inventories for domains with top-down O&G emission estimates in eight major U.S. oil and gas production basins using a variety of data sources including a spatially-allocated U.S. EPA Greenhouse Gas Inventory, the EPA Greenhouse Gas Reporting Program, state emission inventories, and recently published measurement studies. A comparison of top-down and our bottom-up estimates of O&G emissions constrains the gap between these approaches and elucidates regional variability in production-normalized loss rates. A comparison of component-level and site-level emission estimates of production sites in the Barnett Shale region - where comprehensive activity data and emissions estimates are available - indicates that abnormal process conditions contribute about 20% of regional O&G emissions. Combining these two analyses provides insights into the relative importance of different equipment, processes, and malfunctions to emissions in each basin. These data allow us to estimate the U.S. O&G supply chain loss rate, recommend mitigation strategies to reduce emissions from existing infrastructure, and discuss how a similar approach can be applied internationally.

  16. Improving satellite quantitative precipitation estimates through the use of high-resolution numerical weather predictions: Similarities and contrasts between the Alps and Blue Nile region

    Science.gov (United States)

    Bartsotas, Nikolaos; Nikolopoulos, Efthymios; Anagnostou, Emmanouil; Kallos, George

    2017-04-01

    Estimation of heavy precipitation events (HPEs) over high mountainous terrain is a particularly challenging task due to the limited availability of in-situ observations. Proper analysis and thorough understanding of the characteristics of HPEs over complex terrain is thus hampered by insufficient precipitation information. Rain gauge networks usually have insufficient density and quality-control issues in such areas, and radar rainfall estimates, wherever available, are heavily affected by terrain blockage. In this context, remote sensing has been attributed a major role. However, this does not come without blemishes, as strong underestimation of precipitation associated with low-level orographic enhancement introduces significant error in satellite estimates. In this study, we evaluate a satellite precipitation error-correction approach that can be implemented in the absence of ground observations and is based on utilization of precipitation information from high-resolution (1-2 km) NWP simulations. Two quasi-global satellite precipitation products (CMORPH-8km and PERSIANN-4km) are used in more than 20 identified HPEs over two mountainous areas, the Alps and Ethiopia's Blue Nile. High-resolution atmospheric simulations from RAMS/ICLAMS are evaluated against rain gauge networks and radar estimates, then utilized to derive error-correction functions for the corresponding satellite precipitation data. Consequently, a PDF matching is applied, and conclusions on the dependence of the method on synoptic atmospheric conditions, which reveal to a certain degree the predictability of error properties, as well as on the possibility of a global approach, are thoroughly discussed.
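
    A minimal sketch of the PDF (quantile) matching step, assuming collocated one-dimensional samples of satellite and reference rain rates, could look like this:

        import numpy as np

        def pdf_match(satellite, reference):
            """Map each satellite value to the same quantile of the reference distribution."""
            q = np.linspace(0.0, 100.0, 101)
            sat_q = np.percentile(satellite, q)        # satellite CDF support
            ref_q = np.percentile(reference, q)        # reference CDF support
            return np.interp(satellite, sat_q, ref_q)  # corrected satellite rain rates

        rng = np.random.default_rng(1)
        sat = rng.gamma(1.2, 2.0, 5000)                # underestimating product (hypothetical)
        ref = rng.gamma(1.2, 3.0, 5000)                # NWP/gauge reference (hypothetical)
        corrected = pdf_match(sat, ref)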

  17. Statistical significance of quantitative PCR

    Directory of Open Access Journals (Sweden)

    Mazza Christian

    2007-04-01

    Background: PCR has the potential to detect and precisely quantify specific DNA sequences, but it is not yet often used as a fully quantitative method. A number of data collection and processing strategies have been described for the implementation of quantitative PCR. However, they can be experimentally cumbersome, their relative performances have not been evaluated systematically, and they often remain poorly validated statistically and/or experimentally. In this study, we evaluated the performance of known methods, and compared them with newly developed data processing strategies in terms of resolution, precision and robustness. Results: Our results indicate that simple methods that do not rely on the estimation of the efficiency of the PCR amplification may provide reproducible and sensitive data, but that they do not quantify DNA with precision. Other evaluated methods based on sigmoidal or exponential curve fitting were generally of both poor resolution and precision. A statistical analysis of the parameters that influence efficiency indicated that it depends mostly on the selected amplicon and to a lesser extent on the particular biological sample analyzed. Thus, we devised various strategies based on individual or averaged efficiency values, which were used to assess the regulated expression of several genes in response to a growth factor. Conclusion: Overall, qPCR data analysis methods differ significantly in their performance, and this analysis identifies methods that provide DNA quantification estimates of high precision, robustness and reliability. These methods allow reliable estimation of relative expression ratios of two-fold or higher, and our analysis provides an estimation of the number of biological samples that have to be analyzed to achieve a given precision.
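
    One of the strategy families compared here, efficiency-corrected relative quantification, can be sketched with a Pfaffl-style ratio as below; all Ct values and efficiencies are hypothetical:

        e_target, e_ref = 1.95, 1.90                 # amplification efficiencies (2.0 = perfect doubling)
        ct_target_ctrl, ct_target_trt = 24.1, 21.8   # target gene Ct, control vs treated
        ct_ref_ctrl, ct_ref_trt = 18.2, 18.0         # reference gene Ct, control vs treated

        ratio = (e_target ** (ct_target_ctrl - ct_target_trt)) / \
                (e_ref ** (ct_ref_ctrl - ct_ref_trt))
        print(f"relative expression ratio = {ratio:.2f}")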

  18. Estimating the flash flood quantitative parameters affecting the oil-fields infrastructures in Ras Sudr, Sinai, Egypt, during the January 2010 event

    Directory of Open Access Journals (Sweden)

    Safwat Gabr

    2015-12-01

    This paper aims to quantify the hydrological parameters of the flash flood event of 17 January 2010 in Sinai, using multiple sets of remote sensing data and fieldwork, for the ungauged catchments (approximately 2100 sq km) of the wadis affecting the Ras Sudr area, which is heavily occupied by numerous oil fields and related activities. The affected areas were visited, and several cross sections of the main active channels were surveyed to estimate the peak discharge rates. Tropical Rainfall Measuring Mission (TRMM) data were used to estimate rainfall parameters for the catchments in the absence of in situ data. The digital elevation model (DEM) of the Shuttle Radar Topography Mission (SRTM) was used to extract the hydrographic data following standard procedures and techniques of Geographic Information Systems (GIS). Both the surveyed and the extracted parameters for the active channels were integrated in GIS to estimate the runoff parameters using Manning's open-channel flow equation. The simulated hydrographs show that the total discharge exceeded 5.7 million cubic meters and that the peak discharge rate was 70 cubic meters per second. Mitigation of extreme flash floods is possible by altering the natural flow dispersion over the alluvial fan and conveying the resulting flows into one adjusted channel.
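
    The runoff estimation step can be illustrated with Manning's open-channel flow equation, Q = (1/n) A R^(2/3) S^(1/2); the cross-section values and roughness coefficient below are hypothetical, not the surveyed ones:

        area = 30.0        # flow cross-sectional area A (m^2), hypothetical surveyed section
        perimeter = 26.0   # wetted perimeter P (m)
        slope = 0.005      # channel slope S (m/m), e.g. from the DEM
        n = 0.035          # Manning roughness (assumed, natural channel)

        hydraulic_radius = area / perimeter
        q_peak = (1.0 / n) * area * hydraulic_radius ** (2.0 / 3.0) * slope ** 0.5
        print(f"peak discharge ~ {q_peak:.0f} m^3/s")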

  19. Estimating the Costs and Benefits of Providing Free Public Transit Passes to Students in Los Angeles County: Lessons Learned in Applying a Health Lens to Decision-Making

    Directory of Open Access Journals (Sweden)

    Lauren N. Gase

    2014-10-01

    In spite of the increased focus by public health on engaging and working with non-health-sector partners to improve the health of the general population as well as special populations, only a paucity of studies have described and disseminated emerging lessons and promising practices that can be used to undertake this work. This article describes the process used to conduct a Health Impact Assessment of a proposal to provide free public transportation passes to students in Los Angeles County. This illustrative case example describes opportunities and challenges encountered in working with an array of cross-sector partners and highlights four important lessons learned: (1) the benefits and challenges associated with broad conceptualization of public issues; (2) the need for more comprehensive, longitudinal data systems and dynamic simulation models to inform decision-making; (3) the importance of having a comprehensive policy assessment strategy that considers health impacts as well as costs and feasibility; and (4) the need for additional efforts to delineate the interconnectivity between health and other agency priorities. As public health advances cross-sector work in the community, further development of these priorities will help advance meaningful collaboration among all partners.

  20. Is the ECB so special? A qualitative and quantitative analysis

    OpenAIRE

    Fourçans, André; Vranceanu, Radu

    2006-01-01

    This paper analyses the European Central Bank (ECB) monetary policy over the period 1999-2005, both from a qualitative and a quantitative perspective, and compares it with the Federal Reserve Bank. The qualitative approach builds on information conveyed by various speeches of the central bank officers, mainly the President of the ECB, Jean-Claude Trichet. The quantitative analysis provides several estimates of what could have been the ECB and Fed interest rate rules. It also develops a VAR mo...

  1. Estimates of the genetic parameters, optimum sample size and conversion of quantitative data in multiple categories for soybean genotypes

    National Research Council Canada - National Science Library

    Rita de Cássia Teixeira Oliveira; Cosme Damião Cruz; Tuneo Sediyama; Éder Matsuo; Luiz Renato Cadore

    2012-01-01

    The objective of this study was to estimate the genetic parameters and optimal sample size for the lengths of the hypocotyl and epicotyls and to analyze the conversion of quantitative data in multiple...

  2. Medicare Provider Data - Hospice Providers

    Data.gov (United States)

    U.S. Department of Health & Human Services — The Hospice Utilization and Payment Public Use File provides information on services provided to Medicare beneficiaries by hospice providers. The Hospice PUF...

  3. Qualitative and quantitative estimation of comprehensive synaptic connectivity in short- and long-term cultured rat hippocampal neurons with new analytical methods inspired by Scatchard and Hill plots

    Energy Technology Data Exchange (ETDEWEB)

    Tanamoto, Ryo; Shindo, Yutaka; Niwano, Mariko [Department of Biosciences and Informatics, Faculty of Science and Technology, Keio University (Japan); Matsumoto, Yoshinori [Department of Applied Physics and Physico-Informatics, Faculty of Science and Technology, Keio University (Japan); Miki, Norihisa [Department of Mechanical Engineering, Faculty of Science and Technology, Keio University, 3-14-1 Hiyoshi, Kohoku-ku, Yokohama, Kanagawa, 223-8522 (Japan); Hotta, Kohji [Department of Biosciences and Informatics, Faculty of Science and Technology, Keio University (Japan); Oka, Kotaro, E-mail: oka@bio.keio.ac.jp [Department of Biosciences and Informatics, Faculty of Science and Technology, Keio University (Japan)

    2016-03-18

    To investigate comprehensive synaptic connectivity, we examined Ca²⁺ responses to quantitative electric current stimulation using an indium-tin-oxide (ITO) glass electrode with high transparency and electro-conductivity. The number of neurons with Ca²⁺ responses was low during the application of stepwise increases of electric current in short-term cultured neurons (less than 17 days in vitro (DIV)). Neurons cultured over 17 DIV showed two types of responses, S-shaped (sigmoid) and monotonous saturated responses, and Scatchard plots illustrated the difference between these two responses well. Furthermore, sigmoid-like neural network responses over 17 DIV were converted to monotonous saturated ones by the application of a mixture of AP5 and CNQX, specific blockers of NMDA and AMPA receptors, respectively. This alteration was also characterized by a change in Hill coefficients. These findings indicate that a neural network with sigmoid-like responses has strong synergetic or cooperative synaptic connectivity via excitatory glutamate synapses. - Highlights: • We succeeded in evaluating the maturation of neural networks by Scatchard and Hill plots. • Long-term cultured neurons showed two types of responses: sigmoid and monotonous. • The sigmoid-like increase indicates the cooperativity of neural networks. • Excitatory glutamate synapses cause the cooperativity of neural networks.
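
    A minimal sketch of how a Hill coefficient might be extracted from stimulus-response data; the values are hypothetical, not the study's measurements:

        import numpy as np
        from scipy.optimize import curve_fit

        def hill(x, top, k, n):
            return top * x ** n / (k ** n + x ** n)

        current = np.array([1, 2, 4, 8, 16, 32, 64], dtype=float)       # stimulation (hypothetical)
        response = np.array([0.02, 0.05, 0.15, 0.45, 0.80, 0.95, 1.0])  # normalized Ca response

        (top, k, n), _ = curve_fit(hill, current, response, p0=[1.0, 8.0, 2.0])
        print(f"Hill coefficient n = {n:.2f}")  # n > 1 suggests cooperative connectivity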

  4. Evaluation of antioxidant activity, and quantitative estimation of flavonoids, saponins and phenols in crude extract and dry fractions of Medicago lupulina aerial parts.

    Science.gov (United States)

    Kicel, Agnieszka; Olszewska, Monika Anna

    2015-03-01

    This study was designed to evaluate the flavonoid, saponin (TSC) and phenolic (TPC) contents and the in vitro antioxidant activity of the crude extract (CME) and of dry extracts and fractions of Medicago lupulina L. aerial parts. A validated RP-HPLC method was used to quantify flavonols (kaempferol, quercetin and myricetin) and flavones (apigenin and luteolin) in the hydrolyzed extract. TSC and TPC were assayed spectrophotometrically at 560 and 760 nm, respectively. The antioxidant activity of the CME and the dry fractions was followed in vitro by the DPPH free radical and ferric reducing antioxidant power (FRAP) methods. The flavonoid content of the CME was 1.27 mg/g dw; the prevailing flavonoids were luteolin and myricetin, at concentrations of 0.37 and 0.36 mg/g dw, respectively. TSC and TPC were detected in the CME at levels of 90.4 mg ESE/g dw and 12.9 mg GAE/g dw, respectively. In the DPPH and FRAP tests, the CME exhibited antioxidant capacity with TEAA and FRAP values of 45.4 μmol Trolox®/g dw and 0.2 mmol Fe²⁺/g dw, respectively. The diethyl ether dry fraction was the most valuable one, showing the highest antioxidant activity (TEAA = 726.1 μmol Trolox®/g dw, FRAP = 2349.4 μmol Fe²⁺/g dw), in accordance with its high TPC (162.4 mg/g dw).

  5. A quantitative trait locus for the number of days from sowing to ...

    African Journals Online (AJOL)

    Quantitative trait locus (QTL) mapping provides useful information for breeding programs since it allows the estimation of genomic locations and genetic effects of chromosomal regions related to the expression of quantitative traits. The number of days from sowing to seedling emergence (NDSSE) is an important agronomic ...

  6. Estimating travel reduction associated with the use of telemedicine by patients and healthcare professionals: proposal for quantitative synthesis in a systematic review

    Directory of Open Access Journals (Sweden)

    Bahaadinbeigy Kambiz

    2011-08-01

    Background: A major benefit offered by telemedicine is the avoidance of travel, by patients, their carers and health care professionals. Unfortunately, there is very little published information about the extent of avoided travel. We propose to undertake a systematic review of literature which reports credible data on the reductions in travel associated with the use of telemedicine. Method: The conventional approach to quantitative synthesis of the results from multiple studies is to conduct a meta-analysis. However, too much heterogeneity exists between available studies to allow a meaningful meta-analysis of the avoided travel when telemedicine is used across all possible settings. We propose instead to consider all credible evidence on avoided travel through telemedicine by fitting a linear model which takes into account the relevant factors in the circumstances of the studies performed. We propose the use of stepwise multiple regression to identify which factors are significant. Discussion: Our proposed approach is illustrated by the example of teledermatology. In a preliminary review of the literature we found 20 studies in which the percentage of avoided travel through telemedicine could be inferred (a total of 5199 patients). The mean percentage of avoided travel reported in the 12 store-and-forward studies was 43%. In the 7 real-time studies and in a single study with a hybrid technique, 70% of the patients avoided travel. A simplified model based on the modality of telemedicine employed (i.e. real-time or store-and-forward) explained 29% of the variance. The use of store-and-forward teledermatology alone was associated with 43% avoided travel. The increase in the proportion of patients who avoided travel (25%) when real-time telemedicine was employed was significant (P = 0.014). Service planners can use this information to weigh up the costs and benefits of the two approaches.
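
    The proposed synthesis, a linear model of study-level avoided travel weighted by study size, can be sketched as follows with hypothetical study data and modality as the single predictor:

        import numpy as np

        pct_avoided = np.array([40.0, 45.0, 43.0, 68.0, 72.0, 71.0])  # % avoided travel per study
        realtime = np.array([0.0, 0.0, 0.0, 1.0, 1.0, 1.0])           # 1 = real-time telemedicine
        n_patients = np.array([300.0, 150.0, 500.0, 200.0, 250.0, 100.0])

        X = np.column_stack([np.ones_like(realtime), realtime])
        W = np.diag(n_patients)                                       # weight studies by size
        beta = np.linalg.solve(X.T @ W @ X, X.T @ W @ pct_avoided)    # weighted least squares
        print(f"store-and-forward: {beta[0]:.1f}%, real-time effect: +{beta[1]:.1f}%")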

  7. Quantitative estimates of Asian dust input to the western Philippine Sea in the mid-late Quaternary and its potential significance for paleoenvironment

    Science.gov (United States)

    Xu, Zhaokai; Li, Tiegang; Clift, Peter D.; Lim, Dhongil; Wan, Shiming; Chen, Hongjin; Tang, Zheng; Jiang, Fuqing; Xiong, Zhifang

    2015-09-01

    We present a new high-resolution multiproxy data set of Sr-Nd isotopes, rare earth element, soluble iron, and total organic carbon data from International Marine Global Change Study Core MD06-3047, located in the western Philippine Sea. We integrate our new data with published clay mineralogy, rare earth element chemistry, thermocline depth, and δ13C differences between benthic and planktonic foraminifera, in order to quantitatively constrain Asian dust input to the basin. We explore the relationship between Philippine Sea and high-latitude Pacific eolian fluxes, as well as its significance for marine productivity and atmospheric CO2 during the mid-late Quaternary. Three different indices indicate that Asian dust contributes between ~15% and ~50% to the detrital fraction of the sediments. The eolian dust flux in Core MD06-3047 is similar to that in polar southern Pacific sediment. Coherent changes for most dust flux maxima/minima indicate that dust generation in interhemispheric source areas might have had a common response to climatic variation over the mid-late Quaternary. Furthermore, we note relatively good coherence between Asian dust input, soluble iron concentration, local marine productivity, and even global atmospheric CO2 concentration over the entire study interval. This suggests that dust-borne iron fertilization of marine phytoplankton might have been a periodic process operating at glacial/interglacial time scales over the past 700 ka. We suggest that strengthening of the biological pump in the Philippine Sea, and elsewhere in the tropical western Pacific, during mid-late Quaternary glacial periods may have contributed to the lowering of atmospheric CO2 concentrations during ice ages.

  8. Estimating travel reduction associated with the use of telemedicine by patients and healthcare professionals: proposal for quantitative synthesis in a systematic review.

    Science.gov (United States)

    Wootton, Richard; Bahaadinbeigy, Kambiz; Hailey, David

    2011-08-08

    A major benefit offered by telemedicine is the avoidance of travel, by patients, their carers and health care professionals. Unfortunately, there is very little published information about the extent of avoided travel. We propose to undertake a systematic review of literature which reports credible data on the reductions in travel associated with the use of telemedicine. The conventional approach to quantitative synthesis of the results from multiple studies is to conduct a meta-analysis. However, too much heterogeneity exists between available studies to allow a meaningful meta-analysis of the avoided travel when telemedicine is used across all possible settings. We propose instead to consider all credible evidence on avoided travel through telemedicine by fitting a linear model which takes into account the relevant factors in the circumstances of the studies performed. We propose the use of stepwise multiple regression to identify which factors are significant. Our proposed approach is illustrated by the example of teledermatology. In a preliminary review of the literature we found 20 studies in which the percentage of avoided travel through telemedicine could be inferred (a total of 5199 patients). The mean percentage of avoided travel reported in the 12 store-and-forward studies was 43%. In the 7 real-time studies and in a single study with a hybrid technique, 70% of the patients avoided travel. A simplified model based on the modality of telemedicine employed (i.e. real-time or store-and-forward) explained 29% of the variance. The use of store-and-forward teledermatology alone was associated with 43% avoided travel. The increase in the proportion of patients who avoided travel (25%) when real-time telemedicine was employed was significant (P = 0.014). Service planners can use this information to weigh up the costs and benefits of the two approaches.

  9. [Study on the quantitative estimation method for VOCs emission from petrochemical storage tanks based on tanks 4.0.9d model].

    Science.gov (United States)

    Li, Jing; Wang, Min-Yan; Zhang, Jian; He, Wan-Qing; Nie, Lei; Shao, Xia

    2013-12-01

    VOCs emission from petrochemical storage tanks is one of the important emission sources in the petrochemical industry. To determine the amount of VOCs emitted from petrochemical storage tanks, the Tanks 4.0.9d model was used to calculate VOCs emissions from different kinds of storage tanks. As an example, VOCs emissions from a horizontal tank, a vertical fixed-roof tank, an internal floating-roof tank and an external floating-roof tank were calculated. The handling of site meteorological information, sealing information, tank content information and unit conversion when using the Tanks 4.0.9d model in China was also discussed. The Tanks 4.0.9d model can be used to estimate VOCs emissions from petrochemical storage tanks in China as a simple and highly accurate method.

  10. Quantitative easing

    OpenAIRE

    Faustino, Rui Alexandre Rodrigues Veloso

    2012-01-01

    A Work Project, presented as part of the requirements for the Award of a Masters Degree in Economics from the NOVA – School of Business and Economics. Since November 2008, the Federal Reserve of the United States pursued a series of large-scale asset purchases, known as Quantitative Easing. In this Work Project, I describe the context, the objectives and the implementation of the Quantitative Easing. Additionally, I discuss its expected effects. Finally, I present empirical evidence of the ...

  11. Environmental contamination with Toxocara eggs: a quantitative approach to estimate the relative contributions of dogs, cats and foxes, and to assess the efficacy of advised interventions in dogs.

    Science.gov (United States)

    Nijsse, Rolf; Mughini-Gras, Lapo; Wagenaar, Jaap A; Franssen, Frits; Ploeger, Harm W

    2015-07-28

    Environmental contamination with Toxocara eggs is considered the main source of human toxocariasis. The contribution of different groups of hosts to this contamination is largely unknown. Current deworming advice focuses mainly on dogs. However, controversy exists about blind deworming regimens for >6-month-old dogs, as most of them do not actually shed Toxocara eggs. We aim to estimate the contribution of different non-juvenile hosts to the environmental Toxocara egg contamination and to assess the effects of different Toxocara-reducing interventions for dogs. A stochastic model was developed to quantify the relative contribution to the environmental contamination with Toxocara eggs of household dogs, household cats, stray cats, and foxes, all older than 6 months, in areas with varying degrees of urbanization. The model was built upon an existing model developed by Morgan et al. (2013). We used both original and published data on host density, prevalence and intensity of infection, coprophagic behaviour, faeces disposal by owners, and cats' outdoor access. Scenario analyses were performed to assess the expected reduction in dogs' egg output under different deworming regimens and faeces clean-up compliance. Estimates referred to the Netherlands, a country free of stray dogs. Household dogs accounted for 39% of the overall egg output of >6-month-old hosts in the Netherlands, followed by stray cats (27%), household cats (19%), and foxes (15%). In urban areas, egg output was dominated by stray cats (81%). Intervention scenarios revealed that only with high compliance (90%) with the four-times-a-year deworming advice would dogs' contribution drop from 39 to 28%. Alternatively, if 50% of owners always removed their dogs' faeces, dogs' contribution would drop to 20%. Among final hosts of Toxocara older than 6 months, dogs are the main contributors to the environmental egg contamination, though cats in total (i.e. both owned and stray) transcend this
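
    A minimal sketch of such a stochastic contribution model, with all parameter values as hypothetical placeholders rather than the study's inputs:

        import numpy as np

        rng = np.random.default_rng(1)
        groups = {  # (hosts, prevalence, mean eggs/host/day), all hypothetical
            "household dogs": (10_000, 0.045, 8e4),
            "household cats": (8_000, 0.06, 4e4),
            "stray cats": (3_000, 0.15, 4e4),
            "foxes": (150, 0.40, 6e4),
        }

        n_sim = 10_000
        eggs = {name: rng.binomial(hosts, prev, n_sim)           # shedding hosts
                      * rng.lognormal(np.log(mean), 0.5, n_sim)  # uncertain shedding intensity
                for name, (hosts, prev, mean) in groups.items()}
        total = sum(eggs.values())
        shares = {name: float(np.mean(v / total)) for name, v in eggs.items()}
        print(shares)  # mean share of each host group in total egg output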

  12. Quantitative estimates of vascularity in a collagen-based cell scaffold containing basic fibroblast growth factor by non-invasive near-infrared spectroscopy for regenerative medicine

    Science.gov (United States)

    Kushibiki, Toshihiro; Awazu, Kunio

    2008-04-01

    Successful tissue regeneration requires both cells with high proliferative and differentiation potential and an environment permissive for regeneration. These conditions can be achieved by providing cell scaffolds and growth factors that induce angiogenesis and cell proliferation. Angiogenesis within cell scaffolds is typically determined by histological examination with immunohistochemical markers for endothelium. Unfortunately, this approach requires removal of the tissue and the scaffold. In this study, we examined the hemoglobin content of implanted collagen-based cell scaffolds containing basic fibroblast growth factor (bFGF) in vivo by non-invasive near-infrared spectroscopy (NIRS). We also compared the hemoglobin levels measured by NIRS to the hemoglobin content measured with a conventional biological assay. Non-invasive NIRS recordings were performed with a custom-built near-infrared spectrometer using light-guide-coupled reflectance measurements. NIRS recordings revealed that absorbance increased after implantation of collagen scaffolds containing bFGF. This result correlated (R² = 0.93) with our subsequent conventional hemoglobin assay. The NIRS technique provides a non-invasive method for measuring the degree of vascularization in cell scaffolds. This technique may be advantageous for monitoring angiogenesis within different cell scaffolds, a prerequisite for effective tissue regeneration.

  13. Development and validation of a sensitive liquid chromatography/mass spectrometry method for quantitation of flavopiridol in plasma enables accurate estimation of pharmacokinetic parameters with a clinically active dosing schedule.

    Science.gov (United States)

    Phelps, Mitch A; Rozewski, Darlene M; Johnston, Jeffrey S; Farley, Katherine L; Albanese, Katie A; Byrd, John C; Lin, Thomas S; Grever, Michael R; Dalton, James T

    2008-06-01

    A high-performance liquid chromatographic assay with tandem mass spectrometric detection was developed and validated for quantitation of the broad spectrum kinase inhibitor, flavopiridol, in human plasma. Sample preparation conditions included liquid-liquid extraction in acetonitrile (ACN), drying, and reconstitution in 20/80 water/ACN. Flavopiridol and the internal standard (IS), genistein, were separated by reversed phase chromatography using a C-18 column and a gradient of water with 25 mM ammonium formate and ACN. Electrospray ionization and detection of flavopiridol and genistein were accomplished with single reaction monitoring of m/z 402.09 > 341.02 and 271.09 > 152.90, respectively, in positive-ion mode [M+H]⁺ on a triple quadrupole mass spectrometer. Recovery was greater than 90% throughout the linear range of 3-1000 nM. Replicate sample analysis indicated within- and between-run accuracy and precision to be less than 13% throughout the linear range. This method has the lowest lower limit of quantitation (LLOQ) reported to date for flavopiridol, and it allows for more accurate determination of terminal phase concentrations and improved pharmacokinetic parameter estimation in patients receiving an active dosing schedule of flavopiridol.

  14. Quantitative research.

    Science.gov (United States)

    Watson, Roger

    2015-04-01

    This article describes the basic tenets of quantitative research. The concepts of dependent and independent variables are addressed and the concept of measurement and its associated issues, such as error, reliability and validity, are explored. Experiments and surveys – the principal research designs in quantitative research – are described and key features explained. The importance of the double-blind randomised controlled trial is emphasised, alongside the importance of longitudinal surveys, as opposed to cross-sectional surveys. Essential features of data storage are covered, with an emphasis on safe, anonymous storage. Finally, the article explores the analysis of quantitative data, considering what may be analysed and the main uses of statistics in analysis.

  15. Quantitative Estimation of Soil Carbon Sequestration in Three Land Use Types (Orchard, Paddy Rice and Forest in a Part of Ramsar Lands, Northern Iran

    Directory of Open Access Journals (Sweden)

    zakieh pahlavan yali

    2017-02-01

    Introduction: The increase of greenhouse gases in the atmosphere is the main cause of climate and ecosystem change. The most important greenhouse gas is CO2, which causes global warming, or the greenhouse effect. One known solution that reduces atmospheric carbon and helps to improve the situation is carbon sequestration in vegetation cover and soil. Carbon sequestration refers to the transformation of atmospheric CO2 into organic carbon compounds by plants and its capture for a certain time. Ecosystems with different vegetation have a considerable influence on soil carbon sequestration (SCS). Soil, as the main component of these ecosystems, is a worldwide indicator known to play an important role in the global balance of carbon sequestration. Furthermore, carbon sequestration can become a standard of world trade and be guaranteed. The cost of transferring CO2 from the atmosphere into the soil, based on the negative effects of increased CO2 on weather, is always increasing; developing countries can face this issue by creating a new industry, especially when conservation and restoration of rangeland follow. This research was concerned with the estimation of SCS in three land use types (orchard, paddy rice and forest) in a part of the Ramsar lands, northern Iran. Materials and Methods: Ramsar city, with an area of about 729.7 km², is located in the western part of Mazandaran province. Its height above sea level is 20 meters, and it is situated in a temperate and humid climate. The land area is covered by forest, orchards and paddy rice. After field inspection of the area, detailed topographic maps of the zone specified for the study were also examined. In each of the three land types, 500 hectares in each growing area, 1,500 hectares in total, were selected as the study area. To evaluate carbon sequestration in the different vegetation systems, 15 soil profiles were selected and samples were taken from depths of 0 to 100 centimetres of each profile

  16. Quantitative Literacy.

    Science.gov (United States)

    Daniele, Vincent A.

    1993-01-01

    Quantitative literacy for students with deafness is addressed, noting work by the National Council of Teachers of Mathematics to establish curriculum standards for grades K-12. The standards stress problem solving, communication, reasoning, making mathematical connections, and the need for educators of the deaf to pursue mathematics literacy with…

  17. UPLC-ESI-MS/MS and HPTLC Method for Quantitative Estimation of Cytotoxic Glycosides and Aglycone in Bioactivity Guided Fractions of Solanum nigrum L.

    Directory of Open Access Journals (Sweden)

    Karishma Chester

    2017-07-01

    Solanum nigrum L. is traditionally used for the management of various liver disorders. The effect of polarity-based fractionation of S. nigrum on its hepatoprotective activity was investigated in HepG2 cells in vitro, providing a basis for the activity by quantifying the steroidal glycosides responsible for the hepatoprotective potential. A new UPLC-ESI-MS/MS method, alongside a high-performance thin-layer chromatography (HPTLC) method, was developed and validated for quantification of the steroidal glycosides and aglycone (solasonine, solamargine, and solasodine, respectively). The in vitro antioxidant potential, total phenolics, and flavonoid content were also determined in the different fractions. The newly developed UPLC-ESI-MS/MS and HPTLC methods were linear (r² ≥ 0.99), precise, and accurate, showing recovery of more than 97%. The n-butanol-enriched fraction of S. nigrum berries was found to be the most potent hepatoprotective fraction among all fractions, as it showed significantly (p < 0.01) better in vitro antioxidant potential than the other fractions. Quantification by both methods revealed that the content of steroidal glycosides and aglycones is more than 20% in the n-butanol fraction as compared to the other fractions. The screened steroidal glycoside n-butanol-enriched fraction underwent bioefficacy studies against D-galactosamine- and H2O2-induced toxicity in the HepG2 cell line, showing significant (p < 0.05) liver protection. The developed method can be used for quality control analysis with respect to the targeted metabolites and can be explored for pharmacokinetic and pharmacodynamic analysis in the future.

  18. A statistical estimation approach for quantitative concentrations of compounds lacking authentic standards/surrogates based on linear correlations between directly measured detector responses and carbon number of different functional groups.

    Science.gov (United States)

    Kim, Yong-Hyun; Kim, Ki-Hyun

    2013-01-01

    A statistical approach was investigated to estimate the concentration of compounds lacking authentic standards/surrogates (CLASS). As a means to assess the reliability of this approach, the response factor (RF) of CLASS is derived by predictive equations based on a linear regression (LR) analysis between the actual RF (by external calibration) of 18 reference volatile organic compounds (VOCs) consisting of six original functional groups and their physicochemical parameters ((1) carbon number (CN), (2) molecular weight (MW), and (3) boiling point (BP)). If the experimental bias is estimated in terms of percent difference (PD) between the actual and projected RF, the least bias for 18 VOCs is found from CN (17.9 ± 19.0%). In contrast, the PD values against MW and BP are 40.6% and 81.5%, respectively. Predictive equations were hence derived via an LR analysis between the actual RF and CN for 29 groups: (1) one group consisting of all 18 reference VOCs, (2) three out of six original functional groups, and (3) 25 groups formed randomly from the six functional groups. The applicability of this method was tested by fitting these 29 equations into each of the six original functional groups. According to this approach, the mean PD for 18 compounds dropped as low as 5.60 ± 5.63%. This approach can thus be used as a practical tool to assess the quantitative data for CLASS.
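
    The core of the approach, a linear fit of response factor against carbon number judged by percent difference, can be sketched as follows; the values are hypothetical and the PD uses one common definition:

        import numpy as np

        cn = np.array([2, 3, 4, 5, 6, 7, 8], dtype=float)   # carbon numbers (hypothetical)
        rf = np.array([0.9, 1.4, 1.8, 2.4, 2.9, 3.3, 3.9])  # measured response factors

        slope, intercept = np.polyfit(cn, rf, 1)            # RF = slope*CN + intercept
        rf_pred = slope * cn + intercept
        pd = 100.0 * np.abs(rf - rf_pred) / ((rf + rf_pred) / 2.0)  # percent difference
        print(f"RF = {slope:.3f}*CN + {intercept:.3f}, mean PD = {pd.mean():.1f}%")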

  19. Estimating rice yield related traits and quantitative trait loci analysis under different nitrogen treatments using a simple tower-based field phenotyping system with modified single-lens reflex cameras

    Science.gov (United States)

    Naito, Hiroki; Ogawa, Satoshi; Valencia, Milton Orlando; Mohri, Hiroki; Urano, Yutaka; Hosoi, Fumiki; Shimizu, Yo; Chavez, Alba Lucia; Ishitani, Manabu; Selvaraj, Michael Gomez; Omasa, Kenji

    2017-03-01

    Application of field-based high-throughput phenotyping (FB-HTP) methods for monitoring plant performance in real field conditions has a high potential to accelerate the breeding process. In this paper, we discuss the use of a simple tower-based remote sensing platform using modified single-lens reflex cameras for phenotyping yield traits in rice under different nitrogen (N) treatments over three years. This tower-based phenotyping platform has the advantages of simplicity, ease and stability in terms of introduction, maintenance and continual operation under field conditions. Of the six phenological stages of rice analyzed, the flowering stage was the most useful for the estimation of yield performance under field conditions. We found a high correlation between several vegetation indices (simple ratio (SR), normalized difference vegetation index (NDVI), transformed vegetation index (TVI), corrected transformed vegetation index (CTVI), soil-adjusted vegetation index (SAVI) and modified soil-adjusted vegetation index (MSAVI)) and multiple yield traits (panicle number, grain weight and shoot biomass) across three trials. Among all of the indices studied, SR exhibited the best performance for the estimation of grain weight (R2 = 0.80). Under our tower-based field phenotyping system (TBFPS), we identified quantitative trait loci (QTL) for yield-related traits using a mapping population of chromosome segment substitution lines (CSSLs) and a single nucleotide polymorphism data set. Our findings suggest the TBFPS can be useful for the estimation of yield performance during early crop development. This can be a major opportunity for rice breeders who desire high-throughput phenotypic selection for yield performance traits.
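
    The band-ratio indices named above can be computed directly from red and near-infrared reflectance; the sketch below shows SR, NDVI and SAVI for hypothetical reflectance values:

        import numpy as np

        red = np.array([0.08, 0.10, 0.07])             # red reflectance (hypothetical)
        nir = np.array([0.45, 0.40, 0.50])             # near-infrared reflectance

        sr = nir / red                                 # simple ratio
        ndvi = (nir - red) / (nir + red)               # normalized difference vegetation index
        savi = 1.5 * (nir - red) / (nir + red + 0.5)   # soil-adjusted vegetation index (L = 0.5)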

  20. Quantitative tools for addressing hospital readmissions

    Directory of Open Access Journals (Sweden)

    Lagoe Ronald J

    2012-11-01

    Background: Increased interest in health care cost containment is focusing attention on reduction of hospital readmissions. Major payors have already developed financial penalties for providers that generate excess readmissions. This subject has benefitted from the development of resources such as the Potentially Preventable Readmissions software, which has encouraged hospitals to renew efforts to improve these outcomes. The aim of this study was to describe quantitative tools, such as definitions, risk estimation, and tracking of patients, for reducing hospital readmissions. Findings: This study employed the Potentially Preventable Readmissions software to develop quantitative tools for addressing hospital readmissions. These tools included two definitions of readmissions that support identification and management of patients. They also included analytical approaches for estimation of the risk of readmission for individual patients by age, discharge status of the initial admission, and severity of illness, as well as patient-specific spreadsheets for tracking target populations and for evaluating the impact of interventions. Conclusions: The study demonstrated that quantitative tools, including definitions of readmissions, estimation of the risk of readmission, and patient-specific spreadsheets, can contribute to the improvement of patient outcomes in hospitals.

  1. Quantitative Decision Making.

    Science.gov (United States)

    Baldwin, Grover H.

    The use of quantitative decision making tools provides the decision maker with a range of alternatives among which to decide, permits acceptance and use of the optimal solution, and decreases risk. Training line administrators in the use of these tools can help school business officials obtain reliable information upon which to base district…

  2. Quantitative Risks

    Science.gov (United States)

    2015-02-24

    the review” to “what fraction of the critical requirements have been shown to have been met.” The MCE contains estimates of the recurring and non-recurring costs, for each variant in the Family of Vehicles, against a detailed standardized Ground System Architecture. The 19 Government...and maintainability requirements, (b) reliability (mean miles between system abort) and maintainability (maintenance ratio, mean time to repair, max

  3. Quantitative Communication Research: Review, Trends, and Critique

    Directory of Open Access Journals (Sweden)

    Timothy R. Levine

    2013-01-01

    Trends in quantitative communication research are reviewed. A content analysis of 48 articles reporting original communication research published in 1988-1991 and 2008-2011 is reported. Survey research and self-report measurement remain common approaches to research. Null hypothesis significance testing remains the dominant approach to statistical analysis. Reporting the shapes of distributions, estimates of statistical power, and confidence intervals remain uncommon. Trends over time include the increased popularity of health communication and computer mediated communication as topics of research, and increased attention to mediator and moderator variables. The implications of these practices for scientific progress are critically discussed, and suggestions for the future are provided.

  4. Quantitative Computertomographie

    Directory of Open Access Journals (Sweden)

    Engelke K

    2002-01-01

    Quantitative computed tomography (QCT) is, alongside dual X-ray absorptiometry (DXA), a standard method in bone densitometry. The most important measurement sites, for which commercial solutions also exist, are the lumbar spine and the distal forearm; measurements of the tibial or femoral shaft are of minor importance by comparison. Lumbar spine examinations are performed with clinical whole-body CT scanners, for which dedicated acquisition and evaluation protocols exist. For QCT measurements at peripheral sites (pQCT), in particular the distal forearm, compact CT scanners have been developed that are now offered as tabletop devices. The decisive advantages of QCT compared with DXA are the exact three-dimensional localization of the measured volume, the isolated assessment of this volume without superposition of the surrounding tissue, and the separation of trabecular and cortical bone. QCT determines the concentration of bone mineral within a defined region of interest (ROI). This concentration is typically termed bone mineral density (BMD) and is given in g/cm³. By contrast, the projective DXA method determines only an areal concentration in g/cm², which by analogy with QCT is termed areal density. The difference between density (QCT) and areal density (DXA) is, however, mostly neglected in the literature.

  5. Quantitative analyses

    Science.gov (United States)

    Hübner, Philipp

    The holy grail of all analytics is to be able to determine the true value. This requires quantitative measurement methods, which have been available in molecular analytics for some time now. The general problem with quantification is that we usually neither know the true value nor are able to determine it! For this reason we make do with approximations to the true value, either by calculating the median or the (robust) mean from interlaboratory comparison studies, or by calculating an expected value based on how the sample material was prepared. In these attempts to approximate the true value, a deliberate standardization of the analytics takes place, either according to the democratic principle that the majority decides, or through the provision of suitable certified reference material. We must therefore be aware that, while this procedure guarantees that the majority of analytical laboratories measure alike, we do not know whether they all measure equally well or, for that matter, equally badly.

  6. Magnetic Resonance Imaging Provides Added Value to the Prostate Cancer Prevention Trial Risk Calculator for Patients With Estimated Risk of High-grade Prostate Cancer Less Than or Equal to 10.

    Science.gov (United States)

    Kim, Eric H; Weaver, John K; Shetty, Anup S; Vetter, Joel M; Andriole, Gerald L; Strope, Seth A

    2017-04-01

    To determine the added value of prostate magnetic resonance imaging (MRI) to the Prostate Cancer Prevention Trial risk calculator. Between January 2012 and December 2015, 339 patients underwent prostate MRI prior to biopsy at our institution. MRI was considered positive if there was at least one Prostate Imaging Reporting and Data System (PI-RADS) 4 or 5 suspicious region. Logistic regression was used to develop 2 models: biopsy outcome as a function of the Prostate Cancer Prevention Trial risk calculator (1) alone and (2) combined with MRI findings. When including all patients, the Prostate Cancer Prevention Trial models with and without MRI performed similarly (area under the curve [AUC] = 0.74 and 0.78, P = .06). When restricting the cohort to patients with estimated risk of high-grade (Gleason ≥7) prostate cancer ≤10%, the model with MRI outperformed the Prostate Cancer Prevention Trial alone model (AUC = 0.69 and 0.60, P = .01). Within this cohort of patients, there was no significant difference in discrimination between models for those with previous negative biopsy (AUC = 0.61 vs 0.63, P = .76), whereas there was a significant improvement in discrimination with the MRI model for biopsy-naïve patients (AUC = 0.72 vs 0.60, P = .01). The use of prostate MRI in addition to the Prostate Cancer Prevention Trial risk calculator provides a significant improvement in clinical risk discrimination for patients with estimated risk of high-grade (Gleason ≥7) prostate cancer ≤10%. Prebiopsy prostate MRI should be strongly considered for these patients.
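
    The two models compared above are ordinary logistic regressions that differ only in whether an MRI positivity indicator enters as a covariate. A minimal sketch of that comparison on synthetic data (Python with scikit-learn; the data-generating coefficients are invented, and the AUCs are in-sample rather than cross-validated):

        import numpy as np
        from sklearn.linear_model import LogisticRegression
        from sklearn.metrics import roc_auc_score

        rng = np.random.default_rng(0)
        n = 300
        pcpt_risk = rng.uniform(0.01, 0.4, n)           # calculator-estimated risk
        mri_pos = rng.integers(0, 2, n)                 # 1 = PI-RADS 4/5 region present
        logit = -2.5 + 4.0 * pcpt_risk + 1.2 * mri_pos  # assumed true model
        y = rng.random(n) < 1.0 / (1.0 + np.exp(-logit))  # biopsy outcome

        X1 = pcpt_risk.reshape(-1, 1)                   # model 1: calculator alone
        X2 = np.column_stack([pcpt_risk, mri_pos])      # model 2: calculator + MRI
        for name, X in [("PCPT alone", X1), ("PCPT + MRI", X2)]:
            m = LogisticRegression().fit(X, y)
            print(name, "AUC =", round(roc_auc_score(y, m.predict_proba(X)[:, 1]), 3))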

  7. Comparison of two quantitative indexes for the estimation of alfalfa development stages

    Directory of Open Access Journals (Sweden)

    M.L. Bernáldez

    2006-12-01

    The developmental stage of alfalfa (Medicago sativa L.) is a usual variable of study when evaluating cultivars because of its relationship with chemical composition and pasture growth rate. Determination of quantitative indexes such as "mean stage by count" and "mean stage by weight" (MSC and MSW, respectively) makes it possible to describe the developmental phenological stages of alfalfa pastures in a more objective and reproducible way. Both the MSC and MSW indexes describe the developmental stages of alfalfa equally well when the pasture is close to the recommended utilisation time in practice. The advantage of estimating MSC rather than MSW lies in the higher operative efficiency offered by the former in generating the data for its calculation.

  8. The quantitative Morse theorem

    OpenAIRE

    Loi, Ta Le; Phien, Phan

    2013-01-01

    In this paper, we give a proof of the quantitative Morse theorem stated by Y. Yomdin in [Y1]. The proof is based on the quantitative Sard theorem, the quantitative inverse function theorem and the quantitative Morse lemma.

  9. Simulating realistic predator signatures in quantitative fatty acid signature analysis

    Science.gov (United States)

    Bromaghin, Jeffrey F.

    2015-01-01

    Diet estimation is an important field within quantitative ecology, providing critical insights into many aspects of ecology and community dynamics. Quantitative fatty acid signature analysis (QFASA) is a prominent method of diet estimation, particularly for marine mammal and bird species. Investigators using QFASA commonly use computer simulation to evaluate statistical characteristics of diet estimators for the populations they study. Similar computer simulations have been used to explore and compare the performance of different variations of the original QFASA diet estimator. In both cases, computer simulations involve bootstrap sampling prey signature data to construct pseudo-predator signatures with known properties. However, bootstrap sample sizes have been selected arbitrarily and pseudo-predator signatures therefore may not have realistic properties. I develop an algorithm to objectively establish bootstrap sample sizes that generates pseudo-predator signatures with realistic properties, thereby enhancing the utility of computer simulation for assessing QFASA estimator performance. The algorithm also appears to be computationally efficient, resulting in bootstrap sample sizes that are smaller than those commonly used. I illustrate the algorithm with an example using data from Chukchi Sea polar bears (Ursus maritimus) and their marine mammal prey. The concepts underlying the approach may have value in other areas of quantitative ecology in which bootstrap samples are post-processed prior to their use.
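
    The pseudo-predator construction described above amounts to bootstrap-resampling each prey type's signature library and mixing the resampled means according to a known diet. A simplified sketch (Python/NumPy; it deliberately omits the calibration coefficients and the algorithm for choosing the bootstrap sample size, which is the paper's actual contribution):

        import numpy as np

        rng = np.random.default_rng(1)

        def pseudo_predator(prey_sigs, diet, n_boot):
            """Construct one pseudo-predator fatty acid signature.

            prey_sigs: dict of prey type -> (n_samples, n_fatty_acids) array
                       of signatures whose rows sum to 1.
            diet:      dict of prey type -> known diet proportion.
            n_boot:    bootstrap sample size drawn per prey type.
            """
            parts = []
            for prey, pi in diet.items():
                sig = prey_sigs[prey]
                idx = rng.integers(0, sig.shape[0], n_boot)  # bootstrap resample
                parts.append(pi * sig[idx].mean(axis=0))     # diet-weighted mean
            s = np.sum(parts, axis=0)
            return s / s.sum()                               # renormalize

        # Toy example: two prey types, four fatty acids.
        prey = {"seal": rng.dirichlet(np.ones(4), 50),
                "fish": rng.dirichlet(np.ones(4), 50)}
        print(pseudo_predator(prey, {"seal": 0.7, "fish": 0.3}, n_boot=20))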

  10. Understanding quantitative research: part 1.

    Science.gov (United States)

    Hoe, Juanita; Hoare, Zoë

    This article, which is the first in a two-part series, provides an introduction to understanding quantitative research, basic statistics and terminology used in research articles. Critical appraisal of research articles is essential to ensure that nurses remain up to date with evidence-based practice to provide consistent and high-quality nursing care. This article focuses on developing critical appraisal skills and understanding the use and implications of different quantitative approaches to research. Part two of this article will focus on explaining common statistical terms and the presentation of statistical data in quantitative research.

  11. Quantitative EPR A Practitioners Guide

    CERN Document Server

    Eaton, Gareth R; Barr, David P; Weber, Ralph T

    2010-01-01

    This is the first comprehensive yet practical guide for people who perform quantitative EPR measurements. No existing book provides this level of practical guidance to ensure the successful use of EPR. There is a growing need in both industrial and academic research to provide meaningful and accurate quantitative EPR results. This text discusses the various sample, instrument and software related aspects required for EPR quantitation. Specific topics include: choosing a reference standard, resonator considerations (Q, B1, Bm), power saturation characteristics, sample positioning, and finally, putting all the factors together to obtain an accurate spin concentration of a sample.

  12. qfasar: quantitative fatty acid signature analysis with R

    Science.gov (United States)

    Bromaghin, Jeffrey

    2017-01-01

    Knowledge of predator diets provides essential insights into their ecology, yet diet estimation is challenging and remains an active area of research. Quantitative fatty acid signature analysis (QFASA) is a popular method of estimating diet composition that continues to be investigated and extended. However, software to implement QFASA has only recently become publicly available. I summarize a new R package, qfasar, for diet estimation using QFASA methods. The package also provides functionality to evaluate and potentially improve the performance of a library of prey signature data, compute goodness-of-fit diagnostics, and support simulation-based research. Several procedures in the package have not previously been published. qfasar makes traditional and recently published QFASA diet estimation methods accessible to ecologists for the first time. Use of the package is illustrated with signature data from Chukchi Sea polar bears and potential prey species.

  13. Regularized Tyler's Scatter Estimator: Existence, Uniqueness, and Algorithms

    Science.gov (United States)

    Sun, Ying; Babu, Prabhu; Palomar, Daniel P.

    2014-10-01

    This paper considers the regularized Tyler's scatter estimator for elliptical distributions, which has received considerable attention recently. Various types of shrinkage Tyler's estimators have been proposed in the literature and proven to work effectively in the "small n, large p" scenario. Nevertheless, the existence and uniqueness properties of the estimators have not been thoroughly studied, and in certain cases the algorithms may fail to converge. In this work, we provide a general result that analyzes the sufficient condition for the existence of a family of shrinkage Tyler's estimators, which quantitatively shows that regularization indeed reduces the number of samples required for estimation, and we establish the convergence of the algorithms for the estimators. For two specific shrinkage Tyler's estimators, we also prove that the condition is necessary and that the estimator is unique. Finally, we show that the two estimators are actually equivalent. Numerical algorithms are also derived based on the majorization-minimization framework, under which convergence is analyzed systematically.
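
    As a concrete instance, one diagonally loaded variant of the Tyler fixed point from this literature can be iterated as below. A minimal sketch (Python/NumPy), assuming the shrinkage form Sigma <- (1 - rho)(p/n) * sum_i x_i x_i' / (x_i' Sigma^{-1} x_i) + rho * I with a trace normalization; the paper analyzes a family of such estimators, of which this parameterization is one member:

        import numpy as np

        def shrinkage_tyler(X, rho, n_iter=100, tol=1e-8):
            """Regularized Tyler scatter estimator via fixed-point iteration.

            X:   (n, p) data matrix, rows assumed centered.
            rho: shrinkage weight in (0, 1].
            """
            n, p = X.shape
            S = np.eye(p)
            for _ in range(n_iter):
                Sinv = np.linalg.inv(S)
                w = np.einsum("ij,jk,ik->i", X, Sinv, X)  # x_i' S^{-1} x_i
                S_new = (1 - rho) * (p / n) * (X.T / w) @ X + rho * np.eye(p)
                S_new *= p / np.trace(S_new)              # fix the scale ambiguity
                if np.linalg.norm(S_new - S) < tol:
                    return S_new
                S = S_new
            return S

        rng = np.random.default_rng(2)
        X = rng.standard_normal((30, 10))  # a "small n, large p"-flavored example
        print(np.round(shrinkage_tyler(X, rho=0.3)[:3, :3], 2))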

  14. Use of quantitative-structure property relationship (QSPR) and artificial neural network (ANN) based approaches for estimating the octanol-water partition coefficients of the 209 chlorinated trans-azobenzene congeners.

    Science.gov (United States)

    Wilczyńska-Piliszek, Agata J; Piliszek, Sławomir; Falandysz, Jerzy

    2012-01-01

    Polychlorinated azobenzenes (PCABs) can be found as contaminant by-products in 3,4-dichloroaniline and its derivatives and in the herbicides Diuron, Linuron, Methazole, Neburon, Propanil and SWEP. Trans congeners of PCABs are physically and chemically more stable, and so environmentally relevant, when compared to the unstable cis congeners. In this study, to fill gaps in the environmentally relevant partitioning properties of PCABs, the values of the n-octanol/water partition coefficient (log K(OW)) were determined for 209 congeners of chloro-trans-azobenzene (Ct-AB) by means of a quantitative structure-property relationship (QSPR) approach and the predictive ability of artificial neural networks (ANN). The QSPR methods used are based on geometry optimization and quantum-chemical structural descriptors, which were computed at the level of density functional theory (DFT) using the B3LYP functional and the 6-311++G basis set in Gaussian 03, and with the semi-empirical quantum chemistry method (PM6) of the molecular orbital package (MOPAC). Polychlorinated dibenzo-p-dioxins (PCDDs), -furans (PCDFs) and -biphenyls (PCBs), to which PCABs are related, were the reference compounds in this study. Experimentally obtained data on the physical and chemical properties of PCDD/Fs and PCBs were the reference data for ANN predictions of the log K(OW) values of Ct-ABs. Both calculation methods gave similar results in terms of absolute log K(OW) values, while the models generated by PM6 are considered highly efficient in terms of time spent, when compared to those by DFT. The estimated log K(OW) values of the 209 Ct-ABs varied between 5.22-5.57 and 5.45-5.60 for Mono-, 5.56-6.00 and 5.59-6.07 for Di-, 5.89-6.56 and 5.91-6.46 for Tri-, 6.10-7.05 and 6.13-6.80 for Tetra-, 6.43-7.39 and 6.48-7.14 for Penta-, 6.61-7.78 and 6.98-7.42 for Hexa-, 7.41-7.94 and 7.34-7.86 for Hepta-, 7.99-8.17 and 7.72-8.20 for Octa-, 8.35-8.42 and 8.10-8.62 for NonaCt-ABs, and 8.52-8.60 and 8.81-8.83 for DecaCt-AB. These log K(OW) values

  15. Quantitative subsurface analysis using frequency modulated thermal wave imaging

    Science.gov (United States)

    Subhani, S. K.; Suresh, B.; Ghali, V. S.

    2018-01-01

    Quantitative depth analysis of an anomaly with enhanced depth resolution is a challenging task in estimating the depth of subsurface anomalies using thermography. Frequency modulated thermal wave imaging, introduced earlier, provides complete depth scanning of the object by stimulating it with a suitable band of frequencies and then analyzing the subsequent thermal response with a suitable post-processing approach to resolve subsurface details. However, the conventional Fourier-transform-based methods used for post-processing unscramble the frequencies with limited frequency resolution and thus yield a finite depth resolution. The spectral zooming provided by the chirp z-transform offers enhanced frequency resolution, which can further improve the depth resolution so that the finest subsurface features can be explored axially. Quantitative depth analysis with this augmented depth resolution is proposed to provide a close estimate of the actual depth of a subsurface anomaly. This manuscript experimentally validates the enhanced depth resolution using non-stationary thermal wave imaging and offers a first, unique solution for quantitative depth estimation in frequency modulated thermal wave imaging.
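
    The spectral zooming step can be reproduced with SciPy's chirp z-transform helpers: the output bins are spent on a narrow band instead of the whole spectrum, giving much finer spectral sampling in the band of interest. A sketch (Python, assuming SciPy >= 1.8 for scipy.signal.zoom_fft; the two-tone signal is a generic stand-in, not thermal data):

        import numpy as np
        from scipy.signal import zoom_fft  # chirp z-transform based zoom

        fs = 1000.0
        t = np.arange(0, 2.0, 1 / fs)
        # Two closely spaced components, 50.0 and 50.7 Hz.
        x = np.sin(2 * np.pi * 50.0 * t) + 0.5 * np.sin(2 * np.pi * 50.7 * t)

        # A plain FFT spends its bins on the full 0..500 Hz band (0.5 Hz spacing);
        # the zoom spends 2048 bins on 45-55 Hz alone (~0.005 Hz spacing).
        Xz = zoom_fft(x, [45.0, 55.0], m=2048, fs=fs)
        freqs = np.linspace(45.0, 55.0, 2048, endpoint=False)
        print("zoomed peak near:", freqs[np.argmax(np.abs(Xz))], "Hz")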

  16. Applied quantitative finance

    CERN Document Server

    Chen, Cathy; Overbeck, Ludger

    2017-01-01

    This volume provides practical solutions and introduces recent theoretical developments in risk management, pricing of credit derivatives, quantification of volatility and copula modeling. This third edition is devoted to modern risk analysis based on quantitative methods and textual analytics to meet the current challenges in banking and finance. It includes 14 new contributions and presents a comprehensive, state-of-the-art treatment of cutting-edge methods and topics, such as collateralized debt obligations, the high-frequency analysis of market liquidity, and realized volatility. The book is divided into three parts: Part 1 revisits important market risk issues, while Part 2 introduces novel concepts in credit risk and its management along with updated quantitative methods. The third part discusses the dynamics of risk management and includes risk analysis of energy markets and for cryptocurrencies. Digital assets, such as blockchain-based currencies, have become popular but are theoretically challenging...

  17. Compact, common path quantitative phase microscopic techniques ...

    Indian Academy of Sciences (India)

    2014-01-05

    Quantitative phase contrast techniques, which directly provide information about the phase of the object wavefront, can be used to quantitatively image the object under investigation. Typically, interferometric techniques are used for quantitative phase imaging. 2. Digital holographic microscopy. Holograms ...

  18. HCG blood test - quantitative

    Science.gov (United States)

    ... blood test - quantitative; Beta-HCG blood test - quantitative; Pregnancy test - blood - quantitative ... of a screening test for Down syndrome. This test is also done to diagnose abnormal conditions not related to pregnancy that can raise the HCG level.

  19. Reliability and precision of pellet-group counts for estimating landscape-level deer density

    Science.gov (United States)

    David S. deCalesta

    2013-01-01

    This study provides hitherto unavailable methodology for reliably and precisely estimating deer density within forested landscapes, enabling quantitative rather than qualitative deer management. Reliability and precision of the deer pellet-group technique were evaluated in 1 small and 2 large forested landscapes. Density estimates, adjusted to reflect deer harvest and...

  20. Quantitative Decision Support Requires Quantitative User Guidance

    Science.gov (United States)

    Smith, L. A.

    2009-12-01

    Is it conceivable that models run on 2007 computer hardware could provide robust and credible probabilistic information for decision support and user guidance at the ZIP code level for sub-daily meteorological events in 2060? In 2090? Retrospectively, how informative would output from today's models have proven in 2003? Or in the 1930s? Consultancies in the United Kingdom, including the Met Office, are offering services to "future-proof" their customers from climate change. How is a US- or European-based user or policy maker to determine the extent to which exciting new Bayesian methods are relevant here, or when a commercial supplier is vastly overselling the insights of today's climate science? How are policy makers and academic economists to make the closely related decisions facing them? How can we communicate deep uncertainty in the future at small length-scales without undermining the firm foundation established by climate science regarding global trends? Three distinct aspects of communicating the uses of climate model output to users and policy makers, as well as to other specialist adaptation scientists, are discussed. First, a brief scientific evaluation is provided of the length and time scales at which climate model output is likely to become uninformative, including a note on the applicability of the latest Bayesian methodology to the output of current state-of-the-art general circulation models. Second, a critical evaluation is given of the language often employed in communicating climate model output: language which accurately states that models are "better", have "improved" and now "include" and "simulate" relevant meteorological processes, without clearly identifying where the current information is thought to be uninformative or misleading, both for the current climate and as a function of the state of each climate simulation. And third, a general approach for evaluating the relevance of quantitative climate model output

  1. Sagebrush Biomass Estimation Using Terrestrial Laser Scanning

    Science.gov (United States)

    Olsoy, P.; Glenn, N. F.; Clark, P. E.; Spaete, L.; Mitchell, J.; Shrestha, R.

    2012-12-01

    LiDAR (Light Detection and Ranging) is a proven tool for inventory of many vegetation types. Airborne laser scanning (ALS) has been demonstrated for estimating the biomass of trees, but the relatively low density of laser points (1-10 points m-2) typical of ALS datasets makes estimating the biomass of shrubs and small-stature vegetation challenging. This study uses terrestrial laser scanning (TLS) to estimate sagebrush biomass (Artemisia tridentata subsp. wyomingensis) by relating destructively sampled estimates to TLS-derived volumetric estimates. At close range, TLS can commonly provide in excess of 100,000 3-D points for a single sagebrush of approximately 1 m3 in volume. In this study, thirty sagebrush were scanned and destructively sampled at 6 sites within the Reynolds Creek Experimental Watershed in southwestern Idaho, USA. The 3-D point cloud data are converted into 1-cm voxels to give quantitative estimates of shrub volume. The accuracy of the TLS-based metrics for estimating biomass is then compared to several traditional plot sampling methods, including point-intercept and simple crown dimension measurements. The findings of this study are expected to provide guidance on methods for data collection and analysis such that biomass can be accurately estimated across plot scales (e.g., 100 m x 100 m).
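
    The voxel step reduces to flooring each coordinate onto a 1-cm grid and counting the distinct occupied cells; destructively sampled biomass is then regressed on the resulting volumes. A sketch (Python/NumPy; the point cloud is random filler, and a real scan would first be filtered for ground returns):

        import numpy as np

        def voxel_volume(points_m, voxel=0.01):
            """Occupied-voxel volume (m^3) of a point cloud.

            points_m: (n, 3) array of x, y, z in meters.
            voxel:    voxel edge length in meters (0.01 = 1-cm grid).
            """
            idx = np.floor(points_m / voxel).astype(np.int64)  # voxel index per point
            n_occupied = np.unique(idx, axis=0).shape[0]       # duplicates collapse
            return n_occupied * voxel**3

        rng = np.random.default_rng(3)
        cloud = rng.uniform(0.0, 1.0, (100_000, 3))  # stand-in for a ~1 m^3 shrub scan
        print("occupied-voxel volume (m^3):", voxel_volume(cloud))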

  2. Method of calibration of a fluorescence microscope for quantitative studies.

    Science.gov (United States)

    Kedziora, Katarzyna M; Prehn, Johen H M; Dobrucki, Jurek; Bernas, Tytus

    2011-10-01

    Confocal microscopy is based on measurement of intensity of fluorescence originating from a limited volume in the imaged specimen. The intensity is quantized in absolute (albeit arbitrary) units, producing a digital 3D micrograph. Thus, one may obtain quantitative information on local concentration of biomolecules in cells and tissues. This approach requires estimation of precision of light measurement (limited by noise) and conversion of the digital intensity units to absolute values of concentration (or number) of molecules of interest. To meet the first prerequisite we propose a technique for measurement of signal and noise. This method involves registration of a time series of images of any stationary microscope specimen. The analysis is a multistep process, which separates monotonic, periodic and random components of pixel intensity change. This approach permits simultaneous determination of dark and photonic components of noise. Consequently, confidence interval (total noise estimation) is obtained for every level of signal. The algorithm can also be applied to detect mechanical instability of a microscope and instability of illumination source. The presented technique is combined with a simple intensity standard to provide conversion of relative intensity units into their absolute counterparts (the second prerequisite of quantitative imaging). Moreover, photobleaching kinetics of the standard is used to estimate the power of light delivered to a microscope specimen. Thus, the proposed method provides in one step an absolute intensity calibration, estimate of precision and sensitivity of a microscope system.
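
    A much-simplified stand-in for the signal-and-noise measurement described above is the classical mean-variance (photon transfer) analysis of a time series of a stationary specimen: for shot-noise-limited detection, the temporal variance of each pixel is linear in its mean, with slope equal to the detector gain and intercept equal to the dark variance. A sketch on simulated data (Python/NumPy; the paper's method additionally separates monotonic and periodic intensity components, which this omits):

        import numpy as np

        rng = np.random.default_rng(4)

        # Simulated time series of 200 frames of a static "specimen":
        # counts = gain * Poisson(photons) + Gaussian dark noise.
        gain, dark_sd = 2.0, 3.0
        photons = rng.uniform(5, 200, size=(64, 64))
        stack = (gain * rng.poisson(photons, size=(200, 64, 64))
                 + rng.normal(0.0, dark_sd, (200, 64, 64)))

        # Per-pixel temporal mean and variance; var = gain * mean + dark variance.
        m = stack.mean(axis=0).ravel()
        v = stack.var(axis=0, ddof=1).ravel()
        slope, intercept = np.polyfit(m, v, 1)
        print("estimated gain:", slope, " estimated dark variance:", intercept)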

  3. Evolution of quantitative traits in the wild: mind the ecology.

    Science.gov (United States)

    Pemberton, Josephine M

    2010-08-27

    Recent advances in the quantitative genetics of traits in wild animal populations have created new interest in whether natural selection, and genetic response to it, can be detected within long-term ecological studies. However, such studies have re-emphasized the fact that ecological heterogeneity can confound our ability to infer selection on genetic variation and detect a population's response to selection by conventional quantitative genetics approaches. Here, I highlight three manifestations of this issue: counter gradient variation, environmentally induced covariance between traits and the correlated effects of a fluctuating environment. These effects are symptomatic of the oversimplifications and strong assumptions of the breeder's equation when it is applied to natural populations. In addition, methods to assay genetic change in quantitative traits have overestimated the precision with which change can be measured. In the future, a more conservative approach to inferring quantitative genetic response to selection, or genomic approaches allowing the estimation of selection intensity and responses to selection at known quantitative trait loci, will provide a more precise view of evolution in ecological time.
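
    For reference, the breeder's equation whose assumptions the article questions, in its univariate and multivariate (Lande) forms:

        R = h^2 S, \qquad \Delta\bar{z} = G P^{-1} s = G\beta

    where R is the response to selection, S (or s) is the selection differential, h^2 the narrow-sense heritability, G and P the additive-genetic and phenotypic (co)variance matrices, and \beta the selection gradient. The ecological effects listed in the abstract act, roughly, as environmentally induced covariances between traits and fitness that these formulas do not account for.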

  4. On nonparametric hazard estimation.

    Science.gov (United States)

    Hobbs, Brian P

    The Nelson-Aalen estimator provides the basis for the ubiquitous Kaplan-Meier estimator, and therefore is an essential tool for nonparametric survival analysis. This article reviews martingale theory and its role in demonstrating that the Nelson-Aalen estimator is uniformly consistent for estimating the cumulative hazard function for right-censored continuous time-to-failure data.
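
    For reference: with d_i failures among n_i subjects at risk at the ordered failure times t_i, the Nelson-Aalen estimate of the cumulative hazard, and its relation to the Kaplan-Meier survival estimate, are

        \hat{H}(t) = \sum_{t_i \le t} \frac{d_i}{n_i}, \qquad
        \hat{S}_{\mathrm{KM}}(t) = \prod_{t_i \le t} \Bigl(1 - \frac{d_i}{n_i}\Bigr) \approx \exp\bigl(-\hat{H}(t)\bigr).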

  5. Generalized PSF modeling for optimized quantitation in PET imaging

    Science.gov (United States)

    Ashrafinia, Saeed; Mohy-ud-Din, Hassan; Karakatsanis, Nicolas A.; Jha, Abhinav K.; Casey, Michael E.; Kadrmas, Dan J.; Rahmim, Arman

    2017-06-01

    Point-spread function (PSF) modeling offers the ability to account for resolution degrading phenomena within the PET image generation framework. PSF modeling improves resolution and enhances contrast, but at the same time significantly alters image noise properties and induces edge overshoot effect. Thus, studying the effect of PSF modeling on quantitation task performance can be very important. Frameworks explored in the past involved a dichotomy of PSF versus no-PSF modeling. By contrast, the present work focuses on quantitative performance evaluation of standard uptake value (SUV) PET images, while incorporating a wide spectrum of PSF models, including those that under- and over-estimate the true PSF, for the potential of enhanced quantitation of SUVs. The developed framework first analytically models the true PSF, considering a range of resolution degradation phenomena (including photon non-collinearity, inter-crystal penetration and scattering) as present in data acquisitions with modern commercial PET systems. In the context of oncologic liver FDG PET imaging, we generated 200 noisy datasets per image-set (with clinically realistic noise levels) using an XCAT anthropomorphic phantom with liver tumours of varying sizes. These were subsequently reconstructed using the OS-EM algorithm with varying PSF modelled kernels. We focused on quantitation of both SUVmean and SUVmax, including assessment of contrast recovery coefficients, as well as noise-bias characteristics (including both image roughness and coefficient of-variability), for different tumours/iterations/PSF kernels. It was observed that overestimated PSF yielded more accurate contrast recovery for a range of tumours, and typically improved quantitative performance. For a clinically reasonable number of iterations, edge enhancement due to PSF modeling (especially due to over-estimated PSF) was in fact seen to lower SUVmean bias in small tumours. Overall, the results indicate that exactly matched PSF

  6. Estimating constituent loads

    Science.gov (United States)

    Cohn, T.A.; DeLong, L.L.; Gilroy, E.J.; Hirsch, R.M.; Wells, D.K.

    1989-01-01

    This paper compares the bias and variance of three procedures that can be used with log linear regression models: the traditional rating curve estimator, a modified rating curve method, and a minimum variance unbiased estimator (MVUE). Analytical derivations of the bias and efficiency of all three estimators are presented. It is shown that for many conditions the traditional and the modified estimator can provide satisfactory estimates. However, other conditions exist where they have substantial bias and a large mean square error. These conditions commonly occur when sample sizes are small, or when loads are estimated during high-flow conditions. The MVUE, however, is unbiased and always performs nearly as well or better than the rating curve estimator or the modified estimator provided that the hypothesis of the log linear model is correct. Since an efficient unbiased estimator is available, there seems to be no reason to employ biased estimators.
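
    The bias at issue is the classical retransformation problem: fitting \ln L = \alpha + \beta \ln Q + \varepsilon with \varepsilon \sim N(0, \sigma^2) and naively exponentiating the fitted line underestimates the conditional mean load, since

        \mathrm{E}[L \mid Q] = e^{\alpha + \beta \ln Q}\,\mathrm{E}[e^{\varepsilon}] = e^{\alpha + \beta \ln Q}\, e^{\sigma^2/2}.

    The simple correction factor e^{\hat{\sigma}^2/2} is itself biased in small samples, which is the gap the MVUE closes with an exact finite-sample adjustment.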

  7. PCA-based groupwise image registration for quantitative MRI

    NARCIS (Netherlands)

    Huizinga, W.; Poot, D. H J; Guyader, J. M.; Klaassen, R.; Coolen, B. F.; Van Kranenburg, M.; Van Geuns, R. J M; Uitterdijk, A.; Polfliet, M.; Vandemeulebroucke, J.; Leemans, A.; Niessen, W. J.; Klein, S.

    2016-01-01

    Quantitative magnetic resonance imaging (qMRI) is a technique for estimating quantitative tissue properties, such as the T1 and T2 relaxation times, apparent diffusion coefficient (ADC), and various perfusion measures. This estimation is achieved by acquiring multiple images with different

  8. EFSA Panel on Biological Hazards (BIOHAZ); Scientific Opinion on a quantitative estimation of the public health impact of setting a new target for the reduction of Salmonella in broilers

    DEFF Research Database (Denmark)

    Hald, Tine

    This assessment relates the percentage of broiler-associated human salmonellosis cases to different Salmonella prevalences in broiler flocks in the European Union. It considers the contribution and relevance of different Salmonella serovars found in broilers to human salmonellosis. The model (…-SAM model) employs data from the EU Baseline Surveys and EU statutory monitoring on Salmonella in animal-food sources, data on the incidence of human salmonellosis, and food availability data. It is estimated that around 2.4%, 65%, 28% and 4.5% of the human salmonellosis cases are attributable to broilers, laying hens (eggs), pigs and turkeys, respectively. Of the broiler-associated human salmonellosis cases, around 42% and 23% are estimated to be due to the serovars Salmonella Enteritidis and Salmonella Infantis, respectively, while other serovars individually contributed less than 5%. Different scenarios...

  9. Validity and reproducibility of a self-administered semi-quantitative food-frequency questionnaire for estimating usual daily fat, fibre, alcohol, caffeine and theobromine intakes among Belgian post-menopausal women.

    Science.gov (United States)

    Bolca, Selin; Huybrechts, Inge; Verschraegen, Mia; De Henauw, Stefaan; Van de Wiele, Tom

    2009-01-01

    A novel food-frequency questionnaire (FFQ) was developed and validated to assess the usual daily fat, saturated, mono-unsaturated and poly-unsaturated fatty acid, fibre, alcohol, caffeine, and theobromine intakes among Belgian post-menopausal women participating in dietary intervention trials with phyto-oestrogens. The relative validity of the FFQ was estimated by comparison with 7 day (d) estimated diet records (EDR; n = 64) and its reproducibility was evaluated by repeated administrations 6 weeks apart (n = 79). Although the questionnaire underestimated significantly all intakes compared to the 7 d EDR, it had a good ranking ability (r = 0.47-0.94; weighted kappa = 0.25-0.66) and it could reliably distinguish extreme intakes for all the estimated nutrients, except for saturated fatty acids. Furthermore, the correlation between repeated administrations was high (r = 0.71-0.87) with a maximal misclassification of 7% (weighted kappa = 0.33-0.80). In conclusion, these results compare favourably with those reported by others and indicate that the FFQ is a satisfactorily reliable and valid instrument for ranking individuals within this study population.

  10. Quantitative confocal laser scanning microscopy

    Directory of Open Access Journals (Sweden)

    Merete Krog Raarup

    2011-05-01

    This paper discusses recent advances in confocal laser scanning microscopy (CLSM) for imaging of 3D structure as well as quantitative characterization of biomolecular interactions and diffusion behaviour by means of one- and two-photon excitation. The use of CLSM for improved stereological length estimation in thick (up to 0.5 mm) tissue is proposed. The techniques of FRET (Fluorescence Resonance Energy Transfer), FLIM (Fluorescence Lifetime Imaging Microscopy), FCS (Fluorescence Correlation Spectroscopy) and FRAP (Fluorescence Recovery After Photobleaching) are introduced and their applicability for quantitative imaging of biomolecular (co-)localization and trafficking in live cells described. The advantage of two-photon versus one-photon excitation in relation to these techniques is discussed.

  11. Quantitative imaging methods in osteoporosis.

    Science.gov (United States)

    Oei, Ling; Koromani, Fjorda; Rivadeneira, Fernando; Zillikens, M Carola; Oei, Edwin H G

    2016-12-01

    Osteoporosis is characterized by a decreased bone mass and quality resulting in an increased fracture risk. Quantitative imaging methods are critical in the diagnosis and follow-up of treatment effects in osteoporosis. Prior radiographic vertebral fractures and bone mineral density (BMD) as a quantitative parameter derived from dual-energy X-ray absorptiometry (DXA) are among the strongest known predictors of future osteoporotic fractures. Therefore, current clinical decision making relies heavily on accurate assessment of these imaging features. Further, novel quantitative techniques are being developed to appraise additional characteristics of osteoporosis including three-dimensional bone architecture with quantitative computed tomography (QCT). Dedicated high-resolution (HR) CT equipment is available to enhance image quality. At the other end of the spectrum, by utilizing post-processing techniques such as the trabecular bone score (TBS) information on three-dimensional architecture can be derived from DXA images. Further developments in magnetic resonance imaging (MRI) seem promising to not only capture bone micro-architecture but also characterize processes at the molecular level. This review provides an overview of various quantitative imaging techniques based on different radiological modalities utilized in clinical osteoporosis care and research.

  12. [Structure and function of the cardiotocographic score (CTG-score) calculated by the "quantitative cardiotocography" computer method. Determining the significance of its components for the accuracy of the estimates for the pH of the fetus].

    Science.gov (United States)

    Ignatov, P; Atanasov, B

    2011-01-01

    In the last three years, "quantitative cardiotocography" has become the main method for fetal monitoring during late pregnancy and birth in Sheynovo hospital, Sofia, Bulgaria. Our previous studies presented opportunities for increasing the diagnostic potential of the methodology. In this paper we offer a new approach to further improve the accuracy of prognostic values for fetal pH during labor. This is achieved by analyzing the individual components of the CTG-score (microfluctuation - OSZ, basic fetal heart rate - FRQ, and decelerations - DEC). Several groups of CTG-scores were formed, according to the composition of the score and the correlation between forecast and actual results for the pH of the fetus. For each of the 171 stored recordings we compared the CTG-score produced prior to delivery with the pH measured in the umbilical artery (UA) before cutting the umbilical cord. As the fetal pH forecast is based strictly on the CTG-score value, the difference between actual and prognostic results for the pH shows how accurate the CTG-score itself is. We used the standard deviation (Std. dev.) to assess this variability. We defined several groups of CTG-scores based on their composition and the respective standard deviations; each group includes CTG-scores with no statistically significant difference between the calculated standard deviations: CTG-scores with low (composed of OSZ; Std. dev. 0.065), satisfactory (composed of OSZ + FRQ and FRQ; Std. dev. 0.048 and 0.044), high (composed of OSZ + DEC and DEC; Std. dev. 0.032 and 0.027) and very high (composed of FRQ + DEC and OSZ + FRQ + DEC; Std. dev. 0.019 and 0.012) predictive value. We observed substantial variety in the prognostic results, depending on which components of the CTG-score are involved in the evaluation of pH. The composition of the CTG-score seems to be crucial for the accuracy of the prognostic fetal pH values. In order to organize the gathered information it is necessary to develop clinical

  13. Mapping Mendelian Factors Underlying Quantitative Traits Using RFLP Linkage Maps

    Science.gov (United States)

    Lander, E. S.; Botstein, D.

    1989-01-01

    The advent of complete genetic linkage maps consisting of codominant DNA markers [typically restriction fragment length polymorphisms (RFLPs)] has stimulated interest in the systematic genetic dissection of discrete Mendelian factors underlying quantitative traits in experimental organisms. We describe here a set of analytical methods that modify and extend the classical theory for mapping such quantitative trait loci (QTLs). These include: (i) a method of identifying promising crosses for QTL mapping by exploiting a classical formula of Sewall Wright; (ii) a method (interval mapping) for exploiting the full power of RFLP linkage maps by adapting the approach of LOD score analysis used in human genetics, to obtain accurate estimates of the genetic location and phenotypic effect of QTLs; and (iii) a method (selective genotyping) that allows a substantial reduction in the number of progeny that need to be scored with the DNA markers. In addition to the exposition of the methods, explicit graphs are provided that allow experimental geneticists to estimate, in any particular case, the number of progeny required to map QTLs underlying a quantitative trait.
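
    For reference, the LOD score evaluated at each test position in interval mapping compares the maximized likelihood of a model with a QTL at that position against the no-QTL null:

        \mathrm{LOD} = \log_{10} \frac{\max_{\theta_1} L(\mathrm{data} \mid \mathrm{QTL\ at\ test\ position},\ \theta_1)}{\max_{\theta_0} L(\mathrm{data} \mid \mathrm{no\ QTL},\ \theta_0)},

    with a putative QTL declared wherever the LOD curve exceeds a genome-wide significance threshold.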

  14. Using Popular Culture to Teach Quantitative Reasoning

    Science.gov (United States)

    Hillyard, Cinnamon

    2007-01-01

    Popular culture provides many opportunities to develop quantitative reasoning. This article describes a junior-level, interdisciplinary, quantitative reasoning course that uses examples from movies, cartoons, television, magazine advertisements, and children's literature. Some benefits from and cautions to using popular culture to teach…

  15. A quantitative approach to weighted Carleson condition

    Directory of Open Access Journals (Sweden)

    Rivera-Ríos Israel P.

    2017-01-01

    Quantitative versions of the weighted estimates obtained by F. Ruiz and J.L. Torrea [30, 31] for the operator are obtained. As a consequence, some sufficient conditions for the boundedness of M in the two-weight setting are derived, in the spirit of the results obtained by C. Pérez and E. Rela [26] and, very recently, by M. Lacey and S. Spencer [17] for the Hardy-Littlewood maximal operator. As a byproduct, some new quantitative estimates for the Poisson integral are obtained.

  16. Quantitative dispersion microscopy

    OpenAIRE

    Fu, Dan; Choi, Wonshik; Sung, Yongjin; Yaqoob, Zahid; Ramachandra R Dasari; Feld, Michael

    2010-01-01

    Refractive index dispersion is an intrinsic optical property and a useful source of contrast in biological imaging studies. In this report, we present the first dispersion phase imaging of living eukaryotic cells. We have developed quantitative dispersion microscopy based on the principle of quantitative phase microscopy. The dual-wavelength quantitative phase microscope makes phase measurements at 310 nm and 400 nm wavelengths to quantify dispersion (refractive index increment ratio) of live...

  17. Quantitative estimation of diacerein in bulk and in capsule formulation using hydrotropic solubilizing agents by UV-spectrophotometry and the first order derivative using the area under curve method.

    Science.gov (United States)

    Pandey, Ramchandra; Patil, Pravin O; Patil, Manohar U; Deshmukh, Prashant K; Bari, Sanjay B

    2012-01-01

    This study was designed to develop and validate two simple, rapid, and economical methods, UV-spectrophotometry and first-order derivative spectrophotometry using the area under the curve method, for the estimation of diacerein in bulk and in capsule formulation. Hydrotropic solutions of 8 M urea and 0.5 M potassium citrate were employed as solubilizing agents for diacerein, a poorly water-soluble drug. For determination of the areas, the wavelength pairs 252.0 nm and 266.2 nm (UV-spectrophotometry) and 259.4 nm and 274.2 nm (first-order derivative spectrophotometry) were selected in 8 M urea, and the pairs 247.8 nm and 267.4 nm (UV-spectrophotometry) and 259.2 nm and 274.2 nm (first-order derivative spectrophotometry) in 0.5 M potassium citrate. The hydrotropic agents used did not interfere in the spectrophotometric analysis of diacerein. Diacerein followed linearity in the concentration range of 2-12 μg/mL with a correlation coefficient of 0.999 for both methods. The amounts of drug estimated by both proposed methods are in good accord with the label claim. The % RSD values in recovery, precision, and ruggedness studies were found to be less than 2, indicating that the methods are accurate, precise, and rugged.
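
    The area-under-curve step itself is numerical integration of the absorbance (or derivative) spectrum between the two selected wavelengths, after which concentration is read off a calibration line of area versus concentration. A sketch (Python/NumPy; the spectrum is a synthetic stand-in, not diacerein data):

        import numpy as np

        wl = np.linspace(240.0, 280.0, 201)              # wavelength grid, nm
        absorbance = np.exp(-((wl - 259.0) / 8.0) ** 2)  # hypothetical band

        def auc(wl, a, lo, hi):
            """Trapezoidal area under the spectrum between two wavelengths."""
            mask = (wl >= lo) & (wl <= hi)
            return np.trapz(a[mask], wl[mask])

        # Area between the wavelengths selected in 8 M urea (252.0-266.2 nm).
        print("AUC 252.0-266.2 nm:", auc(wl, absorbance, 252.0, 266.2))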

  1. Quantitative Algebraic Reasoning

    DEFF Research Database (Denmark)

    Mardare, Radu Iulian; Panangaden, Prakash; Plotkin, Gordon

    2016-01-01

    We develop a quantitative analogue of equational reasoning which we call quantitative algebra. We define an equality relation indexed by rationals: a =_ε b, which we think of as saying that "a is approximately equal to b up to an error of ε". We have four interesting examples in which a quantitative equational theory has free algebras corresponding to well-known structures; in each case we have finitary and continuous versions. The four cases are: Hausdorff metrics from quantitative semilattices; p-Wasserstein metrics (hence also the Kantorovich metric) from barycentric algebras and also from pointed...

  2. Quantitative analysis of 'calanchi'

    Science.gov (United States)

    Agnesi, Valerio; Cappadonia, Chiara; Conoscenti, Christian; Costanzo, Dario; Rotigliano, Edoardo

    2010-05-01

    Three years (2006-2009) of monitoring data from two calanchi sites located in the western Sicilian Apennines are analyzed and discussed; the data come from two networks of erosion pins and a rainfall gauge station. The aim of the present research is to quantitatively analyze the effects of erosion by water and to investigate their relationships with rainfall trends and specific properties of the two calanchi fronts. Each of the sites was equipped with a grid of randomly distributed erosion pins, comprising 41 nodes for the "Catalfimo" site and 13 nodes for the "Ottosalme" site (in light of the general homogeneity of its geomorphologic conditions); the erosion pins consist of iron stakes graduated at 2-cm intervals, 100 cm long, with a circular section 1.6 cm in diameter. Repeated readings at the erosion pins allowed point topographic height variations to be estimated; a total of 21 surveys have been made remotely by acquiring high resolution photographs from a fixed view point. Since the two calanchi sites are very close to each other (a few hundred meters), a single rainfall gauge station was installed, assuming strict climatic homogeneity of the investigated area. Rainfall data have been processed to derive the rain erosivity index signal, detecting a total of 27 erosive events. Despite the short distance between the two sites, because of a different geologic setting the calanchi fronts are characterized by the outcropping of different levels of the same formation (Terravecchia fm., Middle-Late Miocene); as a consequence, the mineralogical, textural and geotechnical (index) properties, as well as the topographic and geomorphologic characteristics, change. Therefore, in order to define the "framework" in which the two erosion pin grids have been installed, 40 rock samples have been analyzed and a detailed geomorphologic survey has been carried out; in particular, plasticity index, liquid limit, carbonate, pH, granulometric fractions and their mineralogic

  3. Towards alignment independent quantitative assessment of homology detection.

    Directory of Open Access Journals (Sweden)

    Avihay Apatoff

    Identification of homologous proteins provides a basis for protein annotation. Sequence alignment tools reliably identify homologs sharing high sequence similarity. However, identification of homologs that share low sequence similarity remains a challenge. Lowering the cutoff value could enable the identification of diverged homologs, but also introduces numerous false hits. Methods are being continuously developed to minimize this problem. Estimation of the fraction of homologs in a set of protein alignments can help in the assessment and development of such methods, and provides the users with an intuitive quantitative assessment of protein alignment results. Herein, we present a computational approach that estimates the number of homologs in a set of protein pairs. The method requires a prevalent and detectable protein feature that is conserved between homologs. By analyzing the feature prevalence in a set of pairwise protein alignments, the method can estimate the number of homolog pairs in the set independently of the alignments' quality. Using the HomoloGene database as a standard of truth, we implemented this approach in a proteome-wide analysis. The results revealed that this approach, which is independent of the alignments themselves, works well for estimating the number of homologous proteins in a wide range of homology values. In summary, the presented method can accompany homology searches and method development, provides validation to search results, and allows tuning of tools and methods.

  4. Quantitative Methods for Molecular Diagnostic and Therapeutic Imaging

    OpenAIRE

    Li, Quanzheng

    2013-01-01

    This theme issue provides an overview of the basic quantitative methods, an in-depth discussion of cutting-edge quantitative analysis approaches, and their applications for both static and dynamic molecular diagnostic and therapeutic imaging.

  5. Electrical estimating methods

    CERN Document Server

    Del Pico, Wayne J

    2014-01-01

    Simplify the estimating process with the latest data, materials, and practices Electrical Estimating Methods, Fourth Edition is a comprehensive guide to estimating electrical costs, with data provided by leading construction database RS Means. The book covers the materials and processes encountered by the modern contractor, and provides all the information professionals need to make the most precise estimate. The fourth edition has been updated to reflect the changing materials, techniques, and practices in the field, and provides the most recent Means cost data available. The complexity of el

  6. Sensitivity analysis in quantitative microbial risk assessment.

    Science.gov (United States)

    Zwieterin, M H; van Gerwen, S J

    2000-07-15

    The occurrence of foodborne disease remains a widespread problem in both the developing and the developed world. A systematic and quantitative evaluation of food safety is important to control the risk of foodborne diseases. World-wide, many initiatives are being taken to develop quantitative risk analysis. However, the quantitative evaluation of food safety in all its aspects is very complex, especially since in many cases specific parameter values are not available. Often many variables have large statistical variability while the quantitative effect of various phenomena is unknown. Therefore, sensitivity analysis can be a useful tool to determine the main risk-determining phenomena, as well as the aspects that mainly determine the inaccuracy in the risk estimate. This paper presents three stages of sensitivity analysis. First, deterministic analysis selects the most relevant determinants for risk. Overlooking of exceptional, but relevant cases is prevented by a second, worst-case analysis. This analysis finds relevant process steps in worst-case situations, and shows the relevance of variations of factors for risk. The third, stochastic analysis, studies the effects of variations of factors for the variability of risk estimates. Care must be taken that the assumptions made as well as the results are clearly communicated. Stochastic risk estimates are, like deterministic ones, just as good (or bad) as the available data, and the stochastic analysis must not be used to mask lack of information. Sensitivity analysis is a valuable tool in quantitative risk assessment by determining critical aspects and effects of variations.
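
    The stochastic stage described above is typically implemented as Monte Carlo propagation of the input distributions through the exposure model, followed by ranking the inputs by their rank correlation with the risk output. A toy sketch (Python with SciPy; the exposure model, the distributions and the dose-response parameter are all invented for illustration):

        import numpy as np
        from scipy import stats

        rng = np.random.default_rng(5)
        n = 10_000

        # Hypothetical exposure model: dose = concentration * growth * serving,
        # with an exponential dose-response P(ill) = 1 - exp(-r * dose).
        conc = rng.lognormal(mean=0.0, sigma=1.0, size=n)    # cfu/g at retail
        growth = rng.lognormal(mean=1.0, sigma=0.5, size=n)  # growth before eating
        serving = rng.normal(50.0, 10.0, size=n).clip(1.0)   # serving size, g
        r = 1e-4
        risk = 1.0 - np.exp(-r * conc * growth * serving)

        # Spearman rank correlation of each input with risk ranks the
        # risk-determining factors.
        for name, x in [("concentration", conc), ("growth", growth),
                        ("serving", serving)]:
            print(name, round(stats.spearmanr(x, risk)[0], 2))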

  7. Quantitative evaluation of Alzheimer's disease

    Science.gov (United States)

    Duchesne, S.; Frisoni, G. B.

    2009-02-01

    We propose a single quantitative metric called the disease evaluation factor (DEF) and assess its efficiency at estimating disease burden in normal control subjects (CTRL) and probable Alzheimer's disease (AD) patients. The study group consisted of 75 patients with a diagnosis of probable AD and 75 age-matched normal CTRL without neurological or neuropsychological deficit. We calculated a reference eigenspace of MRI appearance from reference data, into which our CTRL and probable AD subjects were projected. We then calculated the multi-dimensional hyperplane separating the CTRL and probable AD groups. The DEF was estimated via a multidimensional weighted distance of eigencoordinates for a given subject from the CTRL group mean, along the salient principal components forming the separating hyperplane. We used quantile plots, Kolmogorov-Smirnov and χ2 tests to compare the DEF values and to test that their distribution was normal. We used a linear discriminant test to separate CTRL from probable AD based on the DEF factor, and reached an accuracy of 87%. A quantitative biomarker in AD would act as an important surrogate marker of disease status and progression.
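
    One plausible reading of the DEF as described, a weighted distance from the control-group mean along the salient principal components, in code (Python/NumPy; the component indices and weights would come from the discriminant analysis and are invented here):

        import numpy as np

        def def_score(x, ctrl_mean, components, weights):
            """Weighted distance of a subject's eigencoordinates from the CTRL mean.

            x:          subject coordinates in the reference eigenspace, shape (k,)
            ctrl_mean:  CTRL group mean coordinates, shape (k,)
            components: indices of the salient principal components
            weights:    per-component weights along the separating hyperplane
            """
            d = (x - ctrl_mean)[components]
            return float(np.sqrt(np.sum(weights * d**2)))

        # Toy usage: 5 eigencoordinates, 2 salient components.
        print(def_score(np.array([1.0, 0.2, -0.5, 0.0, 0.3]),
                        np.zeros(5), np.array([0, 2]), np.array([0.7, 0.3])))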

  8. Use of demographic and quantitative admissions data to predict academic difficulty among professional physical therapist students.

    Science.gov (United States)

    Utzman, Ralph R; Riddle, Daniel L; Jewell, Dianne V

    2007-09-01

    The purpose of this study was to determine whether admissions data could be used to estimate physical therapist students' risk for academic difficulty. A nationally representative sample of 20 physical therapist education programs provided data on 3,582 students. Programs provided data regarding student demographic characteristics, undergraduate grade point average (uGPA), quantitative and verbal Graduate Record Examination scores (qGRE, vGRE), and academic difficulty. Data were analyzed using logistic regression. Rules for predicting risk of academic difficulty were developed. A prediction rule that included uGPA, vGRE, qGRE, age, and race or ethnicity was developed from the entire sample. Prediction rules for individual programs showed large variation. Undergraduate grade point average, GRE scores, age, and race or ethnicity can be useful for estimating student academic risk. Programs should calculate their own estimates of student risk. Academic programs should use risk estimates in combination with other data to recruit, admit, and retain students.

  9. Therapy Provider Phase Information

    Data.gov (United States)

    U.S. Department of Health & Human Services — The Therapy Provider Phase Information dataset is a tool for providers to search by their National Provider Identifier (NPI) number to determine their phase for...

  10. Assessing the Reliability of Quantitative Imaging of Sm-153

    Science.gov (United States)

    Poh, Zijie; Dagan, Maáyan; Veldman, Jeanette; Trees, Brad

    2013-03-01

    Samarium-153 is used for palliation of, and has recently been investigated as therapy for, bone metastases. Patient-specific dosing of Sm-153 is based on quantitative single-photon emission computed tomography (SPECT) and requires knowledge of the accuracy and precision of image-based estimates of the in vivo activity distribution. Physical phantom studies are useful for estimating these in simple objects, but do not model realistic activity distributions. We are using realistic Monte Carlo simulations combined with a realistic digital phantom modeling human anatomy to assess the accuracy and precision of Sm-153 SPECT. Preliminary data indicate that we can simulate projection images and reconstruct them with compensation for various physical image-degrading factors, such as attenuation and scatter in the body as well as non-idealities in the imaging system, to provide realistic SPECT images.

  11. Quantitative Assessment of Parkinsonian Tremor Based on an Inertial Measurement Unit

    OpenAIRE

    Houde Dai; Pengyue Zhang; Lueth, Tim C

    2015-01-01

    Quantitative assessment of parkinsonian tremor based on inertial sensors can provide reliable feedback on the effect of medication. In this regard, the features of parkinsonian tremor and its unique properties, such as motor fluctuations and dyskinesia, are taken into account. Least-squares estimation models are used to assess the severities of rest, postural, and action tremors. In addition, a time-frequency signal analysis algorithm for tremor state detection was also included in the tremor as...

  12. Estimation of nitrogen emission quantities and of the associated marginal abatement cost curves for the sectors and regions of a river basin: an application to the Rhine basin

    OpenAIRE

    Saulnier, J.

    2008-01-01

    In this article, we examine questions related to the estimation of nitrogen emission quantities and of the associated marginal abatement cost curves in a river basin. The application and the empirical calculations are carried out for the economic sectors and the regions of the Rhine basin. We first review the environmental objectives formulated by the International Commission for the Protection of the Rhine. The integration of existing regulatory constraints...

  13. Quantitative diagnostics of stratospheric mixing

    Science.gov (United States)

    Sobel, Adam Harrison

    1998-12-01

    This thesis addresses the planetary-scale mixing of tracers along isentropic surfaces in the extratropical winter stratosphere. The primary goal is a more fully quantitative understanding of the mixing than is available at present. The general problem of representing eddy mixing in a one-dimensional mean representation of a two-dimensional flow is discussed. The limitations of the eddy diffusion model are reviewed, and alternatives explored. The stratosphere may, for some purposes, be viewed as consisting of relatively well-mixed regions separated by moving, internal transport barriers. Methods for diagnosing transport across moving surfaces, such as tracer isosurfaces, from given flow and tracer fields are reviewed. The central results of the thesis involve diagnostic studies of output from a shallow water model of the stratosphere. It is first proved that in an inviscid shallow water atmosphere subject to mass sources and sinks, if the mass enclosed by a potential vorticity (PV) contour is steady in time, then the integral of the mass source over the area enclosed by the contour must be zero. Next, two different approaches are used to diagnose the time-averaged transport across PV contours in the model simulations. The first is the modified Lagrangian mean (MLM) approach, which relates the transport across PV contours to PV sources and sinks. The second is called 'local gradient reversal' (LGR), and is similar to contour advection with surgery. The model includes a sixth-order hyperdiffusion on the vorticity field. Except in a thin outer 'entrainment zone', the hyperdiffusion term has only a very weak effect on the MLM mass budget of the polar vortex edge. In the entrainment zone, the hyperdiffusion term has a significant effect. The LGR results capture this behavior, providing good quantitative estimates of the hyperdiffusion term, which is equivalent to the degree of radiative disequilibrium at a PV contour. This agreement shows that the main role of the

  14. Parameter Estimation

    DEFF Research Database (Denmark)

    Sales-Cruz, Mauricio; Heitzig, Martina; Cameron, Ian

    2011-01-01

    In this chapter the importance of parameter estimation in model development is illustrated through various applications related to reaction systems. In particular, rate constants in a reaction system are obtained through parameter estimation methods. These approaches often require the application of optimisation techniques coupled with dynamic solution of the underlying model. Linear and nonlinear approaches to parameter estimation are investigated. There is also the application of maximum likelihood principles in the estimation of parameters, as well as the use of orthogonal collocation to generate a set of algebraic equations as the basis for parameter estimation. These approaches are illustrated using estimations of kinetic constants from reaction system models.

  15. How to make 137Cs erosion estimation more useful: An uncertainty perspective

    Science.gov (United States)

    The cesium-137 technique has been widely used in the past 50 years to provide quantitative soil redistribution estimates at a point scale. Recently its usefulness has been challenged by a few researchers questioning the validity of the key assumption that the spatial distribution of fallout cesium-...

  16. Quantitative cardiac ultrasound

    NARCIS (Netherlands)

    H. Rijsterborgh (Hans)

    1990-01-01

    This thesis is about the various aspects of quantitative cardiac ultrasound. The first four chapters are mainly devoted to the reproducibility of echocardiographic measurements. These are focussed on the variation of echocardiographic measurements within patients. An important

  17. On Quantitative Rorschach Scales.

    Science.gov (United States)

    Haggard, Ernest A.

    1978-01-01

    Two types of quantitative Rorschach scales are discussed: first, those based on the response categories of content, location, and the determinants, and second, global scales based on the subject's responses to all ten stimulus cards. (Author/JKS)

  18. Objective evaluation of reconstruction methods for quantitative SPECT imaging in the absence of ground truth.

    Science.gov (United States)

    Jha, Abhinav K; Song, Na; Caffo, Brian; Frey, Eric C

    2015-04-13

    Quantitative single-photon emission computed tomography (SPECT) imaging is emerging as an important tool in clinical studies and biomedical research. There is thus a need for optimization and evaluation of systems and algorithms that are being developed for quantitative SPECT imaging. An appropriate objective method to evaluate these systems is by comparing their performance in the end task that is required in quantitative SPECT imaging, such as estimating the mean activity concentration in a volume of interest (VOI) in a patient image. This objective evaluation can be performed if the true value of the estimated parameter is known, i.e. we have a gold standard. However, very rarely is this gold standard known in human studies. Thus, no-gold-standard techniques to optimize and evaluate systems and algorithms in the absence of gold standard are required. In this work, we developed a no-gold-standard technique to objectively evaluate reconstruction methods used in quantitative SPECT when the parameter to be estimated is the mean activity concentration in a VOI. We studied the performance of the technique with realistic simulated image data generated from an object database consisting of five phantom anatomies with all possible combinations of five sets of organ uptakes, where each anatomy consisted of eight different organ VOIs. Results indicate that the method provided accurate ranking of the reconstruction methods. We also demonstrated the application of consistency checks to test the no-gold-standard output.

  19. Stereological estimation of the mean and variance of nuclear volume from vertical sections

    DEFF Research Database (Denmark)

    Sørensen, Flemming Brandt

    1991-01-01

    The application of assumption-free, unbiased stereological techniques for estimation of the volume-weighted mean nuclear volume, nuclear vv, from vertical sections of benign and malignant nuclear aggregates in melanocytic skin tumours is described. Combining sampling of nuclei with uniform...... size variability within benign and malignant nuclear populations can for all practical purposes be reduced to 2-D measurement of nuclear profile areas. These new powerful stereological estimators of nuclear volume and nuclear size variability provide an attractive approach to quantitative...

  20. Quantitative physics tasks

    OpenAIRE

    Snětinová, Marie

    2015-01-01

    Title: Quantitative Physics Tasks Author: Mgr. Marie Snětinová Department: Department of Physics Education Supervisor of the doctoral thesis: doc. RNDr. Leoš Dvořák, CSc., Department of Physics Education Abstract: The doctoral thesis concerns problem solving in physics, especially students' attitudes to solving quantitative physics tasks, and various methods of developing students' problem-solving skills in physics. It contains a brief overview of the theoretical framework of proble...

  1. Evolutionary Quantitative Genomics of Populus trichocarpa.

    Directory of Open Access Journals (Sweden)

    Ilga Porth

    Full Text Available Forest trees generally show high levels of local adaptation and efforts focusing on understanding adaptation to climate will be crucial for species survival and management. Here, we address fundamental questions regarding the molecular basis of adaptation in undomesticated forest tree populations to past climatic environments by employing an integrative quantitative genetics and landscape genomics approach. Using this comprehensive approach, we studied the molecular basis of climate adaptation in 433 Populus trichocarpa (black cottonwood) genotypes originating across western North America. Variation in 74 field-assessed traits (growth, ecophysiology, phenology, leaf stomata, wood, and disease resistance) was investigated for signatures of selection (comparing QST-FST) using clustering of individuals by climate of origin (temperature and precipitation). 29,354 SNPs were investigated employing three different outlier detection methods, and marker-inferred relatedness was estimated to obtain the narrow-sense estimate of population differentiation in wild populations. In addition, we compared our results with previously assessed selection of candidate SNPs using the 25 topographical units (drainages) across the P. trichocarpa sampling range as population groupings. Narrow-sense QST for 53% of distinct field traits was significantly divergent from expectations of neutrality (indicating adaptive trait variation); 2,855 SNPs showed signals of diversifying selection and of these, 118 SNPs (within 81 genes) were associated with adaptive traits (based on significant QST). Many SNPs were putatively pleiotropic for functionally uncorrelated adaptive traits, such as autumn phenology, height, and disease resistance. Evolutionary quantitative genomics in P. trichocarpa provides an enhanced understanding regarding the molecular basis of climate-driven selection in forest trees and we highlight that important loci underlying adaptive trait variation also show

  2. Applications of Microfluidics in Quantitative Biology.

    Science.gov (United States)

    Bai, Yang; Gao, Meng; Wen, Lingling; He, Caiyun; Chen, Yuan; Liu, Chenli; Fu, Xiongfei; Huang, Shuqiang

    2017-10-04

    Quantitative biology is dedicated to taking advantage of quantitative reasoning and advanced engineering technologies to make biology more predictable. Microfluidics, as an emerging technique, provides new approaches to precisely control fluidic conditions on small scales and collect data in high-throughput and quantitative manners. In this review, the authors present the relevant applications of microfluidics to quantitative biology based on two major categories (channel-based microfluidics and droplet-based microfluidics), and their typical features. We also envision some other microfluidic techniques that may not be employed in quantitative biology right now, but have great potential in the near future. © 2017 The Authors. Biotechnology Journal Published by Wiley-VCH Verlag GmbH & Co. KGaA.

  3. Quantitative phase imaging of arthropods

    Science.gov (United States)

    Sridharan, Shamira; Katz, Aron; Soto-Adames, Felipe; Popescu, Gabriel

    2015-01-01

    Abstract. Classification of arthropods is performed by characterization of fine features such as setae and cuticles. An unstained whole arthropod specimen mounted on a slide can be preserved for many decades, but is difficult to study since current methods require sample manipulation or tedious image processing. Spatial light interference microscopy (SLIM) is a quantitative phase imaging (QPI) technique that is an add-on module to a commercial phase contrast microscope. We use SLIM to image a whole organism springtail Ceratophysella denticulata mounted on a slide. This is the first time, to our knowledge, that an entire organism has been imaged using QPI. We also demonstrate the ability of SLIM to image fine structures in addition to providing quantitative data that cannot be obtained by traditional bright field microscopy. PMID:26334858

  4. Quantitative Measurements using Ultrasound Vector Flow Imaging

    DEFF Research Database (Denmark)

    Jensen, Jørgen Arendt

    2016-01-01

    Duplex Vector Flow Imaging (VFI) is introduced as a replacement for spectral Doppler, as it automatically can yield fully quantitative flow estimates without angle correction. Continuous VFI data over 9 s for 10 pulse cycles were acquired by a 3 MHz convex probe connected to the SARUS scanner for pulsating flow mimicking the femoral artery from a CompuFlow 1000 pump (Shelley Medical). Data were used in four estimators based on directional transverse oscillation for velocity, flow angle, volume flow, and turbulence estimation and their respective precisions. An adaptive lag scheme gave ... mL/stroke (true: 1.15 mL/stroke, bias: 12.2%). Measurements down to 160 mm were obtained with a relative standard deviation and bias of less than 10% for the lateral component for stationary, parabolic flow. The method can, thus, find quantitative velocities, angles, and volume flows at sites currently...

  5. Quantitative Luminescence Imaging System

    Energy Technology Data Exchange (ETDEWEB)

    Batishko, C.R.; Stahl, K.A.; Fecht, B.A.

    1992-12-31

    The goal of the MEASUREMENT OF CHEMILUMINESCENCE project is to develop and deliver a suite of imaging radiometric instruments for measuring spatial distributions of chemiluminescence. Envisioned deliverables include instruments working at the microscopic, macroscopic, and life-sized scales. Both laboratory and field portable instruments are envisioned. The project also includes development of phantoms as enclosures for the diazoluminomelanin (DALM) chemiluminescent chemistry. A suite of either phantoms in a variety of typical poses, or phantoms that could be adjusted to a variety of poses, is envisioned. These are to include small mammals (rats), mid-sized mammals (monkeys), and human body parts. A complete human phantom that can be posed is a long-term goal of the development. Taken together, the chemistry and instrumentation provide a means for imaging rf dosimetry based on chemiluminescence induced by the heat resulting from rf energy absorption. The first delivered instrument, the Quantitative Luminescence Imaging System (QLIS), resulted in a patent, and an R&D Magazine 1991 R&D 100 award, recognizing it as one of the 100 most significant technological developments of 1991. The current status of the project is that three systems have been delivered, several related studies have been conducted, two preliminary human hand phantoms have been delivered, system upgrades have been implemented, and calibrations have been maintained. Current development includes sensitivity improvements to the microscope-based system; extension of the large-scale (potentially life-sized targets) system to field portable applications; extension of the 2-D large-scale system to 3-D measurement; imminent delivery of a more refined human hand phantom and a rat phantom; rf, thermal and imaging subsystem integration; and continued calibration and upgrade support.

  6. Stepwise quantitative risk assessment as a tool for characterization of microbiological food safety.

    Science.gov (United States)

    van Gerwen, S J; te Giffel, M C; van't Riet, K; Beumer, R R; Zwietering, M H

    2000-06-01

    This paper describes a system for microbiological quantitative risk assessment of food products and their production processes. The system applies a stepwise risk assessment, allowing the main problems to be addressed before focusing on less important problems. First, risks are assessed broadly, using order of magnitude estimates. Characteristic numbers are used to quantitatively characterize microbial behaviour during the production process. These numbers help to highlight the major risk-determining phenomena, and to find negligible aspects. Second, the risk-determining phenomena are studied in more detail. Both general and/or specific models can be used for this and varying situations can be simulated to quantitatively describe the risk-determining phenomena. Third, even more detailed studies can be performed where necessary, for instance by using stochastic variables. The system for quantitative risk assessment has been implemented as a decision supporting expert system called SIEFE: Stepwise and Interactive Evaluation of Food safety by an Expert System. SIEFE performs bacterial risk assessments in a structured manner, using various information sources. Because all steps are transparent, every step can easily be scrutinized. In the current study the effectiveness of SIEFE is shown for a cheese spread. With this product, quantitative data concerning the major risk-determining factors were not completely available to carry out a full detailed assessment. However, this did not necessarily hamper adequate risk estimation. Using ranges of values instead helped identify the quantitatively most important parameters and the magnitude of their impact. This example shows that SIEFE provides quantitative insights into production processes and their risk-determining factors to both risk assessors and decision makers, and highlights critical gaps in knowledge.
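
A minimal sketch of an order-of-magnitude "characteristic number" for a single process step, in the spirit of the first assessment stage (the temperatures, times, and exponential-growth assumption are illustrative, not SIEFE's actual rules):

```python
# One "characteristic number": decimal logs of microbial growth during a step,
# assuming simple exponential growth (values are invented for illustration).
import math

def log10_growth(hours_at_temp: float, doubling_time_h: float) -> float:
    generations = hours_at_temp / doubling_time_h
    return generations * math.log10(2)

# Hypothetical storage step: 48 h at a temperature giving a 6 h doubling time.
cg = log10_growth(48.0, 6.0)
print(f"characteristic growth ~ 10^{cg:.1f}")  # ~10^2.4: flag for detailed study
```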

  7. Characterization of persistent postoperative pain by quantitative sensory testing

    DEFF Research Database (Denmark)

    Werner, Mads U.; Kehlet, Henrik

    2010-01-01

    Postoperative pain remains inadequately treated, and it has been estimated that 5-10% of patients undergoing surgery will develop moderate to severe persistent pain leading to chronic physical disability and psychosocial distress. Quantitative sensory testing (QST) is a graded, standardized activation

  8. Short Course Introduction to Quantitative Mineral Resource Assessments

    Science.gov (United States)

    Singer, Donald A.

    2007-01-01

    This is an abbreviated text supplementing the content of three sets of slides used in a short course that has been presented by the author at several workshops. The slides should be viewed in the order of (1) Introduction and models, (2) Delineation and estimation, and (3) Combining estimates and summary. References cited in the slides are listed at the end of this text. The purpose of the three-part form of mineral resource assessments discussed in the accompanying slides is to make unbiased quantitative assessments in a format needed in decision-support systems so that consequences of alternative courses of action can be examined. The three-part form of mineral resource assessments was developed to assist policy makers in evaluating the consequences of alternative courses of action with respect to land use and mineral-resource development. The audience for three-part assessments is a governmental or industrial policy maker, a manager of exploration, a planner of regional development, or similar decision-maker. Some of the tools and models presented here will be useful for selection of exploration sites, but that is a side benefit, not the goal. To provide unbiased information, we recommend the three-part form of mineral resource assessments where general locations of undiscovered deposits are delineated from a deposit type's geologic setting, frequency distributions of tonnages and grades of well-explored deposits serve as models of grades and tonnages of undiscovered deposits, and numbers of undiscovered deposits are estimated probabilistically by type. The internally consistent descriptive, grade and tonnage, deposit density, and economic models used in the design of the three-part form of assessments reduce the chances of biased estimates of the undiscovered resources. What and why quantitative resource assessments: The kind of assessment recommended here is founded in decision analysis in order to provide a framework for making decisions concerning mineral
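
As a toy illustration of combining the three parts probabilistically (the deposit-number probabilities and the lognormal tonnage model below are invented, not from any real assessment):

```python
# Combine an elicited probability mass function for the number of undiscovered
# deposits with a lognormal tonnage model by Monte Carlo (numbers invented).
import numpy as np

rng = np.random.default_rng(2)
n_pmf = {0: 0.2, 1: 0.4, 2: 0.3, 5: 0.1}          # P(N undiscovered deposits)
counts = rng.choice(list(n_pmf), p=list(n_pmf.values()), size=20_000)

# Tonnage model: median 1 Mt, 90th percentile 10 Mt (factor-of-10 spread).
sigma = np.log(10) / 1.645
totals = np.array([rng.lognormal(0.0, sigma, n).sum() for n in counts])
print(f"P(total undiscovered tonnage > 5 Mt) = {(totals > 5).mean():.2f}")
```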

  9. Quantitative modelling in cognitive ergonomics: predicting signals passed at danger.

    Science.gov (United States)

    Moray, Neville; Groeger, John; Stanton, Neville

    2017-02-01

    This paper shows how to combine field observations, experimental data and mathematical modelling to produce quantitative explanations and predictions of complex events in human-machine interaction. As an example, we consider a major railway accident. In 1999, a commuter train passed a red signal near Ladbroke Grove, UK, into the path of an express. We use the Public Inquiry Report, 'black box' data, and accident and engineering reports to construct a case history of the accident. We show how to combine field data with mathematical modelling to estimate the probability that the driver observed and identified the state of the signals, and checked their status. Our methodology can explain the SPAD ('Signal Passed At Danger'), generate recommendations about signal design and placement and provide quantitative guidance for the design of safer railway systems' speed limits and the location of signals. Practitioner Summary: Detailed ergonomic analysis of railway signals and rail infrastructure reveals problems of signal identification at this location. A record of driver eye movements measures attention, from which a quantitative model for signal placement and permitted speeds can be derived. The paper is an example of how to combine field data, basic research and mathematical modelling to solve ergonomic design problems.

  10. Quantitative susceptibility mapping of small objects using volume constraints.

    Science.gov (United States)

    Liu, Saifeng; Neelavalli, Jaladhar; Cheng, Yu-Chung N; Tang, Jin; Mark Haacke, E

    2013-03-01

    Microbleeds have been implicated in many neurovascular and neurodegenerative diseases. The diameter of each microbleed has been used previously as a possible quantitative measure for grading microbleeds. We propose that magnetic susceptibility provides a new quantitative measure of extravasated blood. Recently, a Fourier-based method has been used that allows susceptibility quantification from phase images for any arbitrarily shaped structures. However, when very small objects, such as microbleeds, are considered, the accuracy of this susceptibility mapping method still remains to be evaluated. In this article, air bubbles and glass beads are taken as microbleed surrogates to evaluate the quantitative accuracy of the susceptibility mapping method. We show that when an object occupies only a few voxels, an estimate of the true volume of the object is necessary for accurate susceptibility quantification. Remnant errors in the quantified susceptibilities and their sources are evaluated. We show that quantifying magnetic moment, rather than the susceptibility of these small structures, may be a better and more robust alternative. Copyright © 2012 Wiley Periodicals, Inc.

  11. [An allelism test for quantitative trait genes].

    Science.gov (United States)

    Smiriaev, A V

    2011-04-01

    Analytical modeling has been used to test assumptions on the mode of inheritance of a quantitative trait in diallel crosses between pure strains that are sufficient for the adequacy of a simple regression model. This model frequently proved to be adequate in the analysis of numerous data on diallel crosses of wheat and maize. An allelism test for quantitative trait genes is suggested. Computer simulation has been used to estimate the effect of random experimental errors and deviations from the suggested model.

  12. Preferred provider organizations.

    Science.gov (United States)

    Davy, J D

    1984-05-01

    The 1980s marked the beginning of a new alternative health care delivery system: the preferred provider organization (PPO). This system has developed from the health maintenance organization model and is predominant in California and Colorado. A PPO is a group of providers, usually hospitals and doctors, who agree to provide health care to subscribers for a negotiated fee that is usually discounted. Preferred provider organizations are subject to peer review and strict use controls in exchange for a consistent volume of patients and speedy turnaround on claims payments. This article describes the factors leading to the development of PPOs and the implications for occupational therapy.

  13. Quantitative approaches in climate change ecology

    DEFF Research Database (Denmark)

    Brown, Christopher J.; Schoeman, David S.; Sydeman, William J.

    2011-01-01

    Contemporary impacts of anthropogenic climate change on ecosystems are increasingly being recognized. Documenting the extent of these impacts requires quantitative tools for analyses of ecological observations to distinguish climate impacts in noisy data and to understand interactions between...... climate variability and other drivers of change. To assist the development of reliable statistical approaches, we review the marine climate change literature and provide suggestions for quantitative approaches in climate change ecology. We compiled 267 peer‐reviewed articles that examined relationships...

  14. Quantitative genetics of disease traits.

    Science.gov (United States)

    Wray, N R; Visscher, P M

    2015-04-01

    John James authored two key papers on the theory of risk to relatives for binary disease traits and the relationship between parameters on the observed binary scale and an unobserved scale of liability (James Annals of Human Genetics, 1971; 35: 47; Reich, James and Morris Annals of Human Genetics, 1972; 36: 163). These two papers are John James' most cited papers (198 and 328 citations, November 2014). They have been influential in human genetics and have recently gained renewed popularity because of their relevance to the estimation of quantitative genetics parameters for disease traits using SNP data. In this review, we summarize the two early papers and put them into context. We show recent extensions of the theory for ascertained case-control data and review recent applications in human genetics. © 2015 Blackwell Verlag GmbH.
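
The observed-to-liability transformation that these papers underpin is compact enough to show directly (Dempster-Lerner form; K is the population prevalence, z the standard normal density at the liability threshold):

```python
# Dempster-Lerner transformation from the observed 0/1 scale to the liability
# scale: h2_liab = h2_obs * K(1-K) / z^2, with z the normal density at the
# threshold Phi^-1(1-K).
from scipy.stats import norm

def liability_h2(h2_observed: float, K: float) -> float:
    z = norm.pdf(norm.ppf(1 - K))
    return h2_observed * K * (1 - K) / z**2

# e.g. heritability 0.20 on the 0/1 scale for a disease with 10% prevalence:
print(f"liability-scale h2 = {liability_h2(0.20, K=0.10):.2f}")
```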

  15. Estimation of hydrologic properties of an unsaturated, fractured rock mass

    Energy Technology Data Exchange (ETDEWEB)

    Klavetter, E.A.; Peters, R.R.

    1986-07-01

    In this document, two distinctly different approaches are used to develop continuum models to evaluate water movement in a fractured rock mass. Both models provide methods for estimating rock-mass hydrologic properties. Comparisons made over a range of different tuff properties show good qualitative and quantitative agreement between estimates of rock-mass hydrologic properties made by the two models. This document presents a general discussion of: (1) the hydrology of Yucca Mountain, and the conceptual hydrological model currently being used for the Yucca Mountain site, (2) the development of two models that may be used to estimate the hydrologic properties of a fractured, porous rock mass, and (3) a comparison of the hydrologic properties estimated by these two models. Although the models were developed in response to hydrologic characterization requirements at Yucca Mountain, they can be applied to water movement in any fractured rock mass that satisfies the given assumptions.

  16. Adaptive Spectral Doppler Estimation

    DEFF Research Database (Denmark)

    Gran, Fredrik; Jakobsson, Andreas; Jensen, Jørgen Arendt

    2009-01-01

    In this paper, 2 adaptive spectral estimation techniques are analyzed for spectral Doppler ultrasound. The purpose is to minimize the observation window needed to estimate the spectrogram to provide a better temporal resolution and gain more flexibility when designing the data acquisition sequence...

  17. Quantitative Reasoning Learning Progressions for Environmental Science: Developing a Framework

    Directory of Open Access Journals (Sweden)

    Robert L. Mayes

    2013-01-01

    Full Text Available Quantitative reasoning is a complex concept with many definitions and a diverse account in the literature. The purpose of this article is to establish a working definition of quantitative reasoning within the context of science, construct a quantitative reasoning framework, and summarize research on key components in that framework. Context underlies all quantitative reasoning; for this review, environmental science serves as the context. In the framework, we identify four components of quantitative reasoning: the quantification act, quantitative literacy, quantitative interpretation of a model, and quantitative modeling. Within each of these components, the framework provides elements that comprise the four components. The quantification act includes the elements of variable identification, communication, context, and variation. Quantitative literacy includes the elements of numeracy, measurement, proportional reasoning, and basic probability/statistics. Quantitative interpretation includes the elements of representations, science diagrams, statistics and probability, and logarithmic scales. Quantitative modeling includes the elements of logic, problem solving, modeling, and inference. A brief comparison of the quantitative reasoning framework with the AAC&U Quantitative Literacy VALUE rubric is presented, demonstrating a mapping of the components and illustrating differences in structure. The framework serves as a precursor for a quantitative reasoning learning progression which is currently under development.

  18. Building Service Provider Capabilities

    DEFF Research Database (Denmark)

    Brandl, Kristin; Jaura, Manya; Ørberg Jensen, Peter D.

    In this paper we study whether and how the interaction between clients and the service providers contributes to the development of capabilities in service provider firms. In situations where such a contribution occurs, we analyze how different types of activities in the production process...

  19. Providing free autopoweroff plugs

    DEFF Research Database (Denmark)

    Jensen, Carsten Lynge; Hansen, Lars Gårn; Fjordbak, Troels

    2012-01-01

    Experimental evidence of the effect of providing households with cheap energy saving technology is sparse. We present results from a field experiment in which autopoweroff plugs were provided free of charge to randomly selected households. We use propensity score matching to find treatment effects...

  20. A simple method for unbiased quantitation of adoptively transferred cells in solid tissues

    DEFF Research Database (Denmark)

    Petersen, Mikkel; Petersen, Charlotte Christie; Agger, Ralf

    2006-01-01

    In a mouse model, we demonstrate how to obtain a direct, unbiased estimate of the total number of adoptively transferred cells in a variety of organs at different time points. The estimate is obtained by a straightforward method based on the optical fractionator principle. Specifically, non-stimulated C57BL/6J mouse splenocytes were labelled with carboxyfluorescein diacetate succinimidyl ester (CFSE) and adoptively transferred to normal C57BL/6J mice by intravenous injection. The total number of CFSE-positive cells was subsequently determined in lung, spleen, liver, kidney, and inguinal lymph node at six different time points following adoptive transfer (from 60 s to 1 week), providing a quantitative estimate of the organ distribution of the transferred cells over time. These estimates were obtained by microscopy of uniform samples of thick sections from the respective organs. Importantly...

  1. High-Frequency Quantitative Ultrasound Imaging of Cancerous Lymph Nodes

    Science.gov (United States)

    Mamou, Jonathan; Coron, Alain; Hata, Masaki; Machi, Junji; Yanagihara, Eugene; Laugier, Pascal; Feleppa, Ernest J.

    2009-07-01

    High-frequency ultrasound (HFU) offers a means of investigating biological tissue at the microscopic level. High-frequency, quantitative-ultrasound (QUS) methods were developed to characterize freshly-dissected lymph nodes of cancer patients. Three-dimensional (3D) ultrasound data were acquired from lymph nodes using a 25.6-MHz center-frequency transducer. Each node was inked prior to 3D histological fixation to recover orientation after sectioning. Backscattered echo signals were processed to yield two QUS estimates associated with tissue microstructure: scatterer size and acoustic concentration. The QUS estimates were computed following established methods using a Gaussian scattering model. Four lymph nodes from a patient with stage-3 colon cancer were evaluated as an illustrative case. QUS images were generated for this patient by expressing QUS estimates as color-encoded pixels and overlaying them on conventional gray-scale B-mode images. The single metastatic node had an average scatterer size that was significantly larger than the average scatterer size of the other nodes, and the statistics of both QUS estimates in the metastatic node showed greater variance than the statistics of the other nodes. Results indicate that the methods may provide a useful means of identifying small metastatic foci in dissected lymph nodes that might not be detectable using current standard pathology procedures.

  2. Quantitation and localization of pospiviroids in aphids.

    Science.gov (United States)

    Van Bogaert, N; De Jonghe, K; Van Damme, E J M; Maes, M; Smagghe, G

    2015-01-01

    In this paper, the potential role of aphids in viroid transmission was explored. Apterous aphids were fed on pospiviroid-infected plants and viroid targets in the aphids were consequently quantified through RT-qPCR and localized within the aphid body using fluorescence in situ hybridization (FISH). Based on the analytical sensitivity test, the limit of detection (LOD) was estimated at 1.69×10^6 viroid copies per individual aphid body. To localize the viroids in the aphids, a pospiviroid-generic Cy5-labelled probe was used and the fluorescent signal was determined by confocal microscopy. Viroids were clearly observed in the aphid's stylet and stomach, but not in the embryos. Viroids were detected in 29% of the aphids after a 24 h feeding period, which suggests only a partial and low-concentration viroid uptake by the aphid population, including viroid concentrations under the LOD. However, these results show that viroids can be ingested by aphids while feeding on infected plants, thus potentially increasing the transmission risk. The combination of FISH and RT-qPCR provides reliable and fast localization and quantitation of viroid targets in individual aphids and thus constitutes a valuable tool in future epidemiological research. Copyright © 2014 Elsevier B.V. All rights reserved.
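
The copy-number arithmetic behind such a per-aphid LOD comes from a qPCR standard curve; a sketch (the slope and intercept are illustrative placeholders, not the paper's calibration):

```python
# Invert the standard curve Cq = intercept + slope * log10(copies).
def copies_from_cq(cq: float, slope: float = -3.32, intercept: float = 38.0) -> float:
    return 10 ** ((cq - intercept) / slope)

print(f"{copies_from_cq(17.2):.2e} viroid copies")  # ~1.8e+06, near the reported LOD
```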

  3. Quantitative estimation of biogas produced from the leaves and stem ...

    African Journals Online (AJOL)

    The concept of using aquatic plants for the production of energy (methane) is gaining attention in tropical and sub-tropical regions of the world, where warm climate is connected to the plant growth throughout the year. This research ... Keywords: Water hyacinth, biomass, biogas, methane, anaerobic digestion. International ...

  4. A quantitative framework for estimating water resources in India

    Digital Repository Service at National Institute of Oceanography (India)

    Shankar, D.; Kotamraju, V.; Shetye, S.R.

    it to the hydrology of the Mandovi river on the western slopes of the Sahyadris; it is typical of the rivers along the Indian west coast. Most of the catchment area of the river is in Goa, but parts of the river also flow through Karnataka and Maharashtra. We use a...

  5. Estimating the quantitative relation between incongruent information and response time.

    Science.gov (United States)

    Kerzel, Dirk; Weigelt, Matthias; Bosbach, Simone

    2006-07-01

    In Eriksen's flanker paradigm, participants' responses are slower and more error-prone when task-relevant and simultaneously available task-irrelevant cues are incongruent. The influence of task-irrelevant information decreases as its distance from the task-relevant information increases. Here, we manipulated the quantity of task-irrelevant information while keeping the distance constant. We asked whether the impact on response selection processes was stronger the more incongruent information was available, or whether the impact depended only on its presence or absence. We conducted an experiment in which subjects had to discriminate the direction of motion of a central point-light walker that was flanked by two, four, or eight point-light walkers at an equal distance from the center. The experiment showed that reaction times increased with the number of incongruent walkers. This effect was modulated by the total number of walkers, showing that the effect of incongruent information saturates when the display is cluttered.

  6. Quantitative estimation of biogas produced from the leaves and stem ...

    African Journals Online (AJOL)

    The concept of using aquatic plants for the production of energy (methane) is gaining attention in tropical and sub-tropical regions of the world, where warm climate is connected to the plant growth throughout the year. This research work investigated the overall quantity of biogas produced by the leaves, stem and the ...

  7. A QUANTITATIVE ESTIMATE OF WEEDS OF SUGARCANE ...

    African Journals Online (AJOL)

    Osondu


  8. Antioxidant activity and quantitative estimation of azadirachtin and ...

    African Journals Online (AJOL)

    The leaf and bark fraction extracts of Azadirachta indica A. Juss. (neem) grown in the foothills (subtropical region) of Nepal were evaluated for their antioxidant activity, total phenolic (TP) and total flavonoid (TF) contents. HPLC method was employed to quantify the amount of azadirachtin and nimbin present in the seed, leaf ...

  9. Quantitative estimates of the surface habitability of Kepler-452b

    Science.gov (United States)

    Silva, Laura; Vladilo, Giovanni; Murante, Giuseppe; Provenzale, Antonello

    2017-09-01

    Kepler-452b is currently the best example of an Earth-size planet in the habitable zone of a sun-like star, a type of planet whose number of detections is expected to increase in the future. Searching for biosignatures in the supposedly thin atmospheres of these planets is a challenging goal that requires a careful selection of the targets. Under the assumption of a rocky-dominated nature for Kepler-452b, we considered it as a test case to calculate a temperature-dependent habitability index, h050, designed to maximize the potential presence of biosignature-producing activity. The surface temperature has been computed for a broad range of climate factors using a climate model designed for terrestrial-type exoplanets. After fixing the planetary data according to the experimental results, we changed the surface gravity, CO2 abundance, surface pressure, orbital eccentricity, rotation period, axis obliquity and ocean fraction within the range of validity of our model. For most choices of parameters, we find habitable solutions with h050 > 0.2 only for CO2 partial pressure p_CO_2 ≲ 0.04 bar. At this limiting value of CO2 abundance, the planet is still habitable if the total pressure is p ≲ 2 bar. In all cases, the habitability drops for eccentricity e ≳ 0.3. Changes of rotation period and obliquity affect the habitability through their impact on the equator-pole temperature difference rather than on the mean global temperature. We calculated the variation of h050 resulting from the luminosity evolution of the host star for a wide range of input parameters. Only a small combination of parameters yields habitability-weighted lifetimes ≳2 Gyr, sufficiently long to develop atmospheric biosignatures still detectable at the present time.
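
A hedged sketch of what a temperature-based index like h050 plausibly computes, assuming it is the area- and time-weighted fraction of the surface with temperatures in a 0-50 °C liquid-water range (the exact definition is the authors'; the toy climatology below is invented):

```python
# Toy h050: fraction of (time, latitude) cells with 0 <= T <= 50 degC,
# weighted by the cosine of latitude (relative area of each band).
import numpy as np

def h050(T, lat_deg):
    w = np.cos(np.radians(lat_deg))          # area weight per latitude band
    habitable = (T >= 0) & (T <= 50)
    return (habitable * w).sum() / (w.sum() * T.shape[0])

lat = np.linspace(-85, 85, 18)
T = 35 - 40 * np.sin(np.radians(lat)) ** 2   # invented annual-mean profile, degC
T = np.tile(T, (12, 1))                      # 12 "months", no seasonal cycle here
print(f"h050 = {h050(T, lat):.2f}")
```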

  10. Antioxidant activity and quantitative estimation of azadirachtin and ...

    African Journals Online (AJOL)

    2009-07-06

    Jul 6, 2009 ... HPLC method was employed to quantify the amount of azadirachtin and nimbin present in the seed, leaf and the bark extracts .... rose gels and visualized using ethidium bromide staining. Fe2+ chelating activity ..... weaker polyphenol content (66.63 µg/mg) among the solvents used. Likewise, the butanol ...

  11. Imperfect Channel State Estimation

    Directory of Open Access Journals (Sweden)

    Tao Qin

    2010-01-01

    in a multiuser OFDM CR system. A simple back-off scheme is proposed, and simulation results are provided which show that the proposed scheme is very effective in mitigating the negative impact of channel estimation errors.

  12. Extending Quantitative Easing

    DEFF Research Database (Denmark)

    Hallett, Andrew Hughes; Fiedler, Salomon; Kooths, Stefan

    The notes in this compilation address the pros and cons associated with the extension of the ECB's quantitative easing programme of asset purchases. The notes have been requested by the Committee on Economic and Monetary Affairs as an input for the February 2017 session of the Monetary Dialogue....

  13. Quantitative Management in Libraries

    Science.gov (United States)

    Heinritz, Fred J.

    1970-01-01

    Based on a position paper originally presented at the Institute on Quantitative Methods in Librarianship at Ohio State University Libraries in August, 1969, this discusses some of the elements of management: motion, time and cost studies, operations research and other mathematical techniques, and data processing equipment. (Author)

  14. Estimating directional epistasis

    Science.gov (United States)

    Le Rouzic, Arnaud

    2014-01-01

    Epistasis, i.e., the fact that gene effects depend on the genetic background, is a direct consequence of the complexity of genetic architectures. Despite this, most of the models used in evolutionary and quantitative genetics pay scant attention to genetic interactions. For instance, the traditional decomposition of genetic effects models epistasis as noise around the evolutionarily-relevant additive effects. Such an approach is only valid if it is assumed that there is no general pattern among interactions—a highly speculative scenario. Systematic interactions generate directional epistasis, which has major evolutionary consequences. In spite of its importance, directional epistasis is rarely measured or reported by quantitative geneticists, not only because its relevance is generally ignored, but also due to the lack of simple, operational, and accessible methods for its estimation. This paper describes conceptual and statistical tools that can be used to estimate directional epistasis from various kinds of data, including QTL mapping results, phenotype measurements in mutants, and artificial selection responses. As an illustration, I measured directional epistasis from a real-life example. I then discuss the interpretation of the estimates, showing how they can be used to draw meaningful biological inferences. PMID:25071828

  15. Health service providers in Somalia: their readiness to provide malaria case-management

    Directory of Open Access Journals (Sweden)

    Moonen Bruno

    2009-05-01

    Full Text Available Abstract Background Studies have highlighted the inadequacies of the public health sector in sub-Saharan African countries in providing appropriate malaria case management. The readiness of the public health sector to provide malaria case-management in Somalia, a country where there has been no functioning central government for almost two decades, was investigated. Methods Three districts were purposively sampled in each of the two self-declared states of Puntland and Somaliland and the south-central region of Somalia, in April-November 2007. A survey and mapping of all public and private health service providers was undertaken. Information was recorded on services provided, types of anti-malarial drugs used and stock, numbers and qualifications of staff, sources of financial support and presence of malaria diagnostic services, new treatment guidelines and job aides for malaria case-management. All settlements were mapped and a semi-quantitative approach was used to estimate their population size. Distances from settlements to public health services were computed. Results There were 45 public health facilities, 227 public health professionals, and 194 private pharmacies for approximately 0.6 million people in the three districts. The median distance to public health facilities was 6 km. 62.3% of public health facilities prescribed the nationally recommended anti-malarial drug and 37.7% prescribed chloroquine as first-line therapy. 66.7% of public facilities did not have in stock the recommended first-line malaria therapy. Diagnosis of malaria using rapid diagnostic tests (RDT or microscopy was performed routinely in over 90% of the recommended public facilities but only 50% of these had RDT in stock at the time of survey. National treatment guidelines were available in 31.3% of public health facilities recommended by the national strategy. Only 8.8% of the private pharmacies prescribed artesunate plus sulphadoxine/pyrimethamine, while 53
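
The settlement-to-facility distance computation mentioned in the Methods reduces to great-circle geometry; a minimal sketch (the coordinates below are invented, and a real analysis would use the full settlement and facility registers):

```python
# Great-circle (haversine) distance from each settlement to its nearest facility.
import numpy as np

def haversine_km(lat1, lon1, lat2, lon2):
    lat1, lon1, lat2, lon2 = map(np.radians, (lat1, lon1, lat2, lon2))
    a = (np.sin((lat2 - lat1) / 2) ** 2
         + np.cos(lat1) * np.cos(lat2) * np.sin((lon2 - lon1) / 2) ** 2)
    return 2 * 6371 * np.arcsin(np.sqrt(a))   # Earth radius ~6371 km

settlements = [(2.05, 45.32), (9.56, 44.06)]  # (lat, lon), invented
facilities  = [(2.04, 45.34), (9.49, 44.08)]
nearest = [min(haversine_km(s[0], s[1], f[0], f[1]) for f in facilities)
           for s in settlements]
print("km to nearest facility:", [round(d, 1) for d in nearest])
print("median distance:", round(float(np.median(nearest)), 1), "km")
```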

  16. Peptide Selection for Targeted Protein Quantitation.

    Science.gov (United States)

    Chiva, Cristina; Sabidó, Eduard

    2017-03-03

    Targeted proteomics methods in their different flavors rely on the use of a few peptides as proxies for protein quantitation, which need to be specified either prior to or after data acquisition. However, in contrast with discovery methods that use all identified peptides for a given protein to estimate its abundance, targeted proteomics methods are limited in the number of peptides that are used for protein quantitation. Because only a few peptides per protein are acquired or extracted in targeted experiments, the selection of peptides that are used for targeted protein quantitation becomes crucial. Several rules have been proposed to guide peptide selection for targeted proteomics studies, which have generally been based on the amino acid composition of the peptide sequences. However, compliance with these rules does not imply that nonconforming peptides are not reproducibly generated, nor does it guarantee that the selected peptides correctly represent the behavior of the protein abundance under different conditions.
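
A sketch of the kind of composition-based selection rules the abstract questions (naive tryptic digest; the length and residue filters are common heuristics shown only as an example, not a definitive rule set):

```python
# Naive tryptic digest plus commonly cited composition filters (length bounds,
# no Met/Cys); thresholds vary between labs.
def candidate_peptides(protein_seq: str, min_len: int = 7, max_len: int = 25):
    peptides, start = [], 0
    for i, aa in enumerate(protein_seq):
        if aa in "KR":                        # cleave after K/R (ignores the proline rule
            peptides.append(protein_seq[start:i + 1])  # and missed cleavages)
            start = i + 1
    peptides.append(protein_seq[start:])
    return [p for p in peptides
            if min_len <= len(p) <= max_len     # measurable length
            and "M" not in p and "C" not in p]  # avoid oxidation/alkylation variants

print(candidate_peptides("MKTAYIAKQRQISFVKSHFSRQLEERLGLIEVQAPILSR"))
```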

  17. Providing driving rain data for hygrothermal calculations

    DEFF Research Database (Denmark)

    Kragh, Mikkel Kristian

    1996-01-01

    Due to a wish for driving rain data as input for hygrothermal calculations, this report deals with utilizing commonly applied empirical relations and standard meteorological data, in an attempt to provide realistic estimates rather than exact correlations.

  18. Power system state estimation

    CERN Document Server

    Ahmad, Mukhtar

    2012-01-01

    State estimation is one of the most important functions in power system operation and control. This area is concerned with the overall monitoring, control, and contingency evaluation of power systems. It is mainly aimed at providing a reliable estimate of system voltages. State estimator information flows to control centers, where critical decisions are made concerning power system design and operations. This valuable resource provides thorough coverage of this area, helping professionals overcome challenges involving system quality, reliability, security, stability, and economy. Engineers are
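
At the core of most state estimators is a weighted least-squares fit of redundant measurements to a network model; a minimal linear (DC-style) sketch with invented numbers:

```python
# Weighted least squares: x_hat = (H' W H)^-1 H' W z, with W the inverse
# measurement-error covariance. Three measurements, two states (redundancy).
import numpy as np

H = np.array([[1.0,  0.0],    # direct measurement of state 1 (e.g. a voltage)
              [0.0,  1.0],    # direct measurement of state 2
              [1.0, -1.0]])   # a difference measurement (e.g. a flow)
z = np.array([1.02, 0.98, 0.05])                     # measured values
W = np.diag(1 / np.array([0.02, 0.02, 0.01]) ** 2)   # inverse error variances

x_hat = np.linalg.solve(H.T @ W @ H, H.T @ W @ z)
print("estimated state:", x_hat.round(4))
```

The redundancy is the point: with more measurements than states, the estimator averages out measurement noise and can flag inconsistent (bad) data.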

  19. Targeted quantitation of proteins by mass spectrometry.

    Science.gov (United States)

    Liebler, Daniel C; Zimmerman, Lisa J

    2013-06-04

    Quantitative measurement of proteins is one of the most fundamental analytical tasks in a biochemistry laboratory, but widely used immunochemical methods often have limited specificity and high measurement variation. In this review, we discuss applications of multiple-reaction monitoring (MRM) mass spectrometry, which allows sensitive, precise quantitative analyses of peptides and the proteins from which they are derived. Systematic development of MRM assays is permitted by databases of peptide mass spectra and sequences, software tools for analysis design and data analysis, and rapid evolution of tandem mass spectrometer technology. Key advantages of MRM assays are the ability to target specific peptide sequences, including variants and modified forms, and the capacity for multiplexing that allows analysis of dozens to hundreds of peptides. Different quantitative standardization methods provide options that balance precision, sensitivity, and assay cost. Targeted protein quantitation by MRM and related mass spectrometry methods can advance biochemistry by transforming approaches to protein measurement.
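
One standardization option the review weighs, stable-isotope dilution, is simple arithmetic; a sketch with illustrative numbers (the peak areas and spike amount are made up):

```python
# Stable-isotope dilution: endogenous = (light/heavy area ratio) * spiked amount.
light_area = 8.4e5      # endogenous (light) transition peak area
heavy_area = 4.2e5      # spiked heavy-standard transition peak area
spiked_fmol = 50.0      # known amount of heavy standard added

endogenous_fmol = (light_area / heavy_area) * spiked_fmol
print(f"endogenous peptide: {endogenous_fmol:.0f} fmol")  # -> 100 fmol
```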

  20. Quantitative Characterization of Nanostructured Materials

    Energy Technology Data Exchange (ETDEWEB)

    Dr. Frank (Bud) Bridges, University of California-Santa Cruz

    2010-08-05

    The two-and-a-half day symposium on the "Quantitative Characterization of Nanostructured Materials" will be the first comprehensive meeting on this topic held under the auspices of a major U.S. professional society. Spring MRS Meetings provide a natural venue for this symposium as they attract a broad audience of researchers that represents a cross-section of the state-of-the-art regarding synthesis, structure-property relations, and applications of nanostructured materials. Close interactions among the experts in local structure measurements and materials researchers will help both to identify measurement needs pertinent to real-world materials problems and to familiarize the materials research community with the state-of-the-art local structure measurement techniques. We have chosen invited speakers that reflect the multidisciplinary and international nature of this topic and the need to continually nurture productive interfaces among university, government and industrial laboratories. The intent of the symposium is to provide an interdisciplinary forum for discussion and exchange of ideas on the recent progress in quantitative characterization of structural order in nanomaterials using different experimental techniques and theory. The symposium is expected to facilitate discussions on optimal approaches for determining atomic structure at the nanoscale using combined inputs from multiple measurement techniques.

  1. Transverse spectral velocity estimation.

    Science.gov (United States)

    Jensen, Jørgen

    2014-11-01

    A transverse oscillation (TO)-based method for calculating the velocity spectrum for fully transverse flow is described. Current methods yield the mean velocity at one position, whereas the new method reveals the transverse velocity spectrum as a function of time at one spatial location. A convex array probe is used along with two different estimators based on the correlation of the received signal. They can estimate the velocity spectrum as a function of time as for ordinary spectrograms, but they also work at a beam-to-flow angle of 90°. The approach is validated using simulations of pulsatile flow using the Womersley-Evans flow model. The relative bias of the mean estimated frequency is 13.6% and the mean relative standard deviation is 14.3% at 90°, where a traditional estimator yields zero velocity. Measurements have been conducted with an experimental scanner and a convex array transducer. A pump generated artificial femoral and carotid artery flow in the phantom. The estimated spectra degrade when the angle is different from 90°, but are usable down to 60° to 70°. Below this angle the traditional spectrum is best and should be used. The conventional approach can automatically be corrected for angles from 0° to 70° to give fully quantitative velocity spectra without operator intervention.

  2. Estimating welfare in insurance markets using variation in prices

    Science.gov (United States)

    Einav, Liran; Finkelstein, Amy; Cullen, Mark R.

    2009-01-01

    We provide a graphical illustration of how standard consumer and producer theory can be used to quantify the welfare loss associated with inefficient pricing in insurance markets with selection. We then show how this welfare loss can be estimated empirically using identifying variation in the price of insurance. Such variation, together with quantity data, allows us to estimate the demand for insurance. The same variation, together with cost data, allows us to estimate how insurers' costs vary as market participants endogenously respond to price. The slope of this estimated cost curve provides a direct test for both the existence and nature of selection, and the combination of demand and cost curves can be used to estimate welfare. We illustrate our approach by applying it to data on employer-provided health insurance from one specific company. We detect adverse selection but estimate that the quantitative welfare implications associated with inefficient pricing in our particular application are small, in both absolute and relative terms. PMID:21218182
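
A toy version of the estimation strategy described (the price variation, shares, and average costs below are invented; the linear curves and the welfare triangle follow the graphical logic of the abstract):

```python
# Fit linear demand and average-cost curves to the price variation, derive the
# marginal cost curve, and integrate the demand-MC wedge over the uncovered range.
import numpy as np

price    = np.array([460.0, 500.0, 540.0])   # experimentally varied prices
share    = np.array([0.80, 0.60, 0.40])      # fraction buying at each price
avg_cost = np.array([480.0, 510.0, 540.0])   # average cost of those who buy

bD = np.polyfit(share, price, 1)      # inverse demand P(q)
bA = np.polyfit(share, avg_cost, 1)   # average cost AC(q)
q = np.linspace(0, 1, 1001)
demand = np.polyval(bD, q)
mc = bA[1] + 2 * bA[0] * q            # MC(q) = d[q * AC(q)]/dq

# Competitive equilibrium where P = AC; efficient point where P = MC
# (q = 1 here, since demand exceeds MC everywhere in this toy example).
q_eqm = q[np.argmin(np.abs(demand - np.polyval(bA, q)))]
q_eff = 1.0 if (demand >= mc).all() else q[np.argmin(np.abs(demand - mc))]
mask = (q >= q_eqm) & (q <= q_eff)
dwl = np.trapz((demand - mc)[mask], q[mask])   # welfare loss from underinsurance
print(f"q_eqm = {q_eqm:.2f}, q_eff = {q_eff:.2f}, welfare loss ~ {dwl:.0f} per capita")
```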

  3. Provider of Services File

    Data.gov (United States)

    U.S. Department of Health & Human Services — The POS file consists of two data files, one for CLIA labs and one for 18 other provider types. The file names are CLIA and OTHER. If downloading the file, note it...

  4. The Provident Principal.

    Science.gov (United States)

    McCall, John R.

    This monograph offers leadership approaches for school principals. Discussion applies the business leadership theory of Warren Bennis and Burt Nanus to the role of the principal. Each of the booklet's three parts concludes with discussion questions. Part 1, "Visions and Values for the Provident Principal," demonstrates the importance of…

  5. What HERA may provide?

    Energy Technology Data Exchange (ETDEWEB)

    Jung, Hannes [DESY, Hamburg (Germany); De Roeck, Albert [CERN, Genf (Switzerland); Bartels, Jochen [Univ. Hamburg (DE). Institut fuer Theoretische Physik II] (and others)

    2008-09-15

    More than 100 people participated in a discussion session at the DIS08 workshop on the topic 'What HERA may provide'. A summary of the discussion with a structured outlook and list of desirable measurements and theory calculations is given. (orig.)

  6. care Providers in Ibadan

    African Journals Online (AJOL)

    Three hundred and eighty-six respondents (77.7%) were aware of intermittent preventive treatment (IPT). Awareness ... Key Words: malaria in pregnancy, intermittent preventive treatment, malaria control, health care providers. Department of Obstetrics .... Auxiliary nurses do not have formal training prior to employment.

  7. Estimating abundance: Chapter 27

    Science.gov (United States)

    Royle, J. Andrew

    2016-01-01

    This chapter provides a non-technical overview of ‘closed population capture–recapture’ models, a class of well-established models that are widely applied in ecology, such as removal sampling, covariate models, and distance sampling. These methods are regularly adopted for studies of reptiles, in order to estimate abundance from counts of marked individuals while accounting for imperfect detection. Thus, the chapter describes some classic closed population models for estimating abundance, with considerations for some recent extensions that provide a spatial context for the estimation of abundance, and therefore density. Finally, the chapter suggests some software for use in data analysis, such as the Windows-based program MARK, and provides an example of estimating abundance and density of reptiles using an artificial cover object survey of Slow Worms (Anguis fragilis).
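
For flavor, the simplest closed-population estimator in this model class, the Lincoln-Petersen estimator in Chapman's bias-corrected form (the counts below are invented):

```python
# Chapman's bias-corrected Lincoln-Petersen estimator for a closed population.
def chapman(n1: int, n2: int, m2: int) -> float:
    """n1 marked in session 1, n2 captured in session 2, m2 marked recaptures."""
    return (n1 + 1) * (n2 + 1) / (m2 + 1) - 1

# e.g. 40 slow worms marked, 35 captured on a second visit, 12 of them marked:
print(f"abundance estimate: {chapman(40, 35, 12):.0f}")  # ~113
```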

  8. Biomarkers and Surrogate Endpoints in Uveitis: The Impact of Quantitative Imaging.

    Science.gov (United States)

    Denniston, Alastair K; Keane, Pearse A; Srivastava, Sunil K

    2017-05-01

    Uveitis is a major cause of sight loss across the world. The reliable assessment of intraocular inflammation in uveitis ('disease activity') is essential in order to score disease severity and response to treatment. In this review, we describe how 'quantitative imaging', the approach of using automated analysis and measurement algorithms across both standard and emerging imaging modalities, can develop objective instrument-based measures of disease activity. This is a narrative review based on searches of the current world literature using terms related to quantitative imaging techniques in uveitis, supplemented by clinical trial registry data, and expert knowledge of surrogate endpoints and outcome measures in ophthalmology. Current measures of disease activity are largely based on subjective clinical estimation, and are relatively insensitive, with poor discrimination and reliability. The development of quantitative imaging in uveitis is most established in the use of optical coherence tomographic (OCT) measurement of central macular thickness (CMT) to measure severity of macular edema (ME). The transformative effect of CMT in clinical assessment of patients with ME provides a paradigm for the development and impact of other forms of quantitative imaging. Quantitative imaging approaches are now being developed and validated for other key inflammatory parameters such as anterior chamber cells, vitreous haze, retinovascular leakage, and chorioretinal infiltrates. As new forms of quantitative imaging in uveitis are proposed, the uveitis community will need to evaluate these tools against the current subjective clinical estimates and reach a new consensus for how disease activity in uveitis should be measured. The development, validation, and adoption of sensitive and discriminatory measures of disease activity is an unmet need that has the potential to transform both drug development and routine clinical care for the patient with uveitis.

  9. Innovations in Quantitative Risk Management

    CERN Document Server

    Scherer, Matthias; Zagst, Rudi

    2015-01-01

    Quantitative models are omnipresent, but often controversially discussed, in today's risk management practice. New regulations, innovative financial products, and advances in valuation techniques provide a continuous flow of challenging problems for financial engineers and risk managers alike. Designing a sound stochastic model requires finding a careful balance between parsimonious model assumptions, mathematical viability, and interpretability of the output. Moreover, data requirements and end-user training are to be considered as well. The KPMG Center of Excellence in Risk Management conference "Risk Management Reloaded" and this proceedings volume contribute to bridging the gap between academia, which provides methodological advances, and practice, which has a firm understanding of the economic conditions in which a given model is used. Discussed fields of application range from asset management, credit risk, and energy to risk management issues in insurance. Methodologically, dependence modeling...

  10. Energy & Climate: Getting Quantitative

    Science.gov (United States)

    Wolfson, Richard

    2011-11-01

    A noted environmentalist claims that buying an SUV instead of a regular car is energetically equivalent to leaving your refrigerator door open for seven years. A fossil-fuel apologist argues that solar energy is a pie-in-the-sky dream promulgated by naïve environmentalists, because there's nowhere near enough solar energy to meet humankind's energy demand. A group advocating shutdown of the Vermont Yankee nuclear plant claims that 70% of its electrical energy is lost in transmission lines. Around the world, thousands agitate for climate action, under the numerical banner "350." Neither the environmentalist, the fossil-fuel apologist, the antinuclear activists, nor most of those marching under the "350" banner can back up their assertions with quantitative arguments. Yet questions about energy and its environmental impacts almost always require quantitative answers. Physics can help! This poster gives some cogent examples, based on the newly published second edition of the author's textbook Energy, Environment, and Climate.

  11. Internet Medline providers.

    Science.gov (United States)

    Vine, D L; Coady, T R

    1998-01-01

    Each database in this review has features that will appeal to some users. Each provides a credible interface to information available within the Medline database. The major differences are pricing and interface design. In this context, features that cost more and might seem trivial to the occasional searcher may actually save time and money when used by the professional. Internet Grateful Med is free, but Ms. Coady and I agree the availability of only three ANDable search fields is a major functional limitation. PubMed is also free but much more powerful. The command line interface that permits very sophisticated searches requires a commitment that casual users will find intimidating. Ms. Coady did not believe the feedback currently provided during a search was sufficient for sustained professional use. Paper Chase and Knowledge Finder are mature, modestly priced Medline search services. Paper Chase provides a menu-driven interface that is very easy to use, yet permits the user to search virtually all of Medline's data fields. Knowledge Finder emphasizes the use of natural language queries but fully supports more traditional search strategies. The impact of the tradeoff between fuzzy and Boolean strategies offered by Knowledge Finder is unclear and beyond the scope of this review. Additional software must be downloaded to use all of Knowledge Finder's features. Other providers required no software beyond the basic Internet browser, and this requirement prevented Ms. Coady from evaluating Knowledge Finder. Ovid and Silver Platter offer well-designed interfaces that simplify the construction of complex queries. These are clearly services designed for professional users. While pricing eliminates these for casual use, it should be emphasized that Medline citation access is only a portion of the service provided by these high-end vendors. Finally, we should comment that each of the vendors and government-sponsored services provided prompt and useful feedback to e

  12. [Reflection of estimating postmortem interval in forensic entomology and the Daubert standard].

    Science.gov (United States)

    Xie, Dan; Peng, Yu-Long; Guo, Ya-Dong; Cai, Ji-Feng

    2013-08-01

    Estimating the postmortem interval (PMI) is a perennial focus, and a perennial difficulty, in forensic practice, and forensic entomology plays an indispensable role in it. The theories and technologies of forensic entomology have grown increasingly rich in recent years, yet many problems remain in both research and practice. With the introduction of the Daubert standard, greater demands are placed on the reliability and accuracy of PMI estimation by forensic entomology. This review summarizes the application of the Daubert standard to several aspects of ecology, quantitative genetics, population genetics, molecular biology, and microbiology in the practice of forensic entomology. It builds a bridge between basic research and forensic practice, with the aim of providing higher accuracy for PMI estimation by forensic entomology.

  13. A quantitative reconstruction software suite for SPECT imaging

    Science.gov (United States)

    Namías, Mauro; Jeraj, Robert

    2017-11-01

    Quantitative Single Photon Emission Computed Tomography (SPECT) imaging allows for measurement of activity concentrations of a given radiotracer in vivo. Although SPECT has usually been perceived as non-quantitative by the medical community, the introduction of accurate CT-based attenuation correction and scatter correction from hybrid SPECT/CT scanners has enabled SPECT systems to be as quantitative as Positron Emission Tomography (PET) systems. We implemented a software suite to reconstruct quantitative SPECT images from hybrid or dedicated SPECT systems with a separate CT scanner. Attenuation, scatter, and collimator response corrections were included in an Ordered Subset Expectation Maximization (OSEM) algorithm. A novel scatter fraction estimation technique was introduced. The SPECT/CT system was calibrated with a cylindrical phantom and quantitative accuracy was assessed with an anthropomorphic phantom and a NEMA/IEC image quality phantom. Accurate activity measurements were achieved at an organ level. This software suite helps increase the quantitative accuracy of SPECT scanners.
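
    At the heart of such a suite sits the iterative reconstruction loop. The sketch below implements a bare-bones MLEM update on a toy system matrix; OSEM applies the same update over ordered subsets of the detector bins for speed. It is a generic illustration of the algorithm family, not the authors' software, and it omits the attenuation, scatter, and collimator-response corrections that make the real reconstruction quantitative.

        import numpy as np

        rng = np.random.default_rng(0)

        # Toy system: 40 detector bins observing a 25-pixel image.
        # A[i, j] = probability a photon emitted in pixel j lands in bin i
        # (here random and column-normalized, purely for illustration).
        A = rng.random((40, 25))
        A /= A.sum(axis=0, keepdims=True)

        x_true = rng.uniform(0.5, 2.0, size=25)       # activity image
        y = rng.poisson(A @ x_true * 200)             # measured counts

        # MLEM iterations: x <- x / sens * A^T (y / (A x)).
        sens = A.sum(axis=0)                          # sensitivity image
        x = np.ones(25)
        for _ in range(100):
            proj = A @ x
            x *= (A.T @ (y / np.maximum(proj, 1e-12))) / sens

        err = np.linalg.norm(x / 200 - x_true) / np.linalg.norm(x_true)
        print(f"relative reconstruction error: {err:.3f}")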

  14. Development of an SRM method for absolute quantitation of MYDGF/C19orf10 protein.

    Science.gov (United States)

    Dwivedi, Ravi C; Krokhin, Oleg V; El-Gabalawy, Hani S; Wilkins, John A

    2016-06-01

    To develop an MS-based selected reaction monitoring (SRM) assay for quantitation of myeloid-derived growth factor (MYDGF), formerly chromosome 19 open reading frame 10 (C19orf10). Candidate reporter peptides were identified in digests of recombinant MYDGF. Isotopically labeled forms of these reporter peptides were employed as internal standards for assay development. Two reference peptides were selected, SYLYFQTFFK and GAEIEYAMAYSK, with respective LOQs of 42 and 380 attomoles per injection. Application of the assay to human serum and synovial fluid showed that assay sensitivity was reduced in these matrices and quantitation was not achievable. However, partial depletion of albumin and immunoglobulin from synovial fluids provided estimates of 300-650 femtomoles per injection (0.7-1.6 nanomolar (nM) fluid concentrations) in three of the six samples analyzed. A validated, sensitive assay for the quantitation of MYDGF in biological fluids was developed. However, the endogenous levels of MYDGF in such fluids are at or below the current limits of quantitation. The levels of MYDGF are lower than those previously reported using an ELISA. The current results suggest that additional steps may be required to remove high-abundance proteins or to enrich MYDGF for SRM-based quantitation. © 2015 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
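
    The quantitation step in an SRM assay of this kind is a simple ratio against the spiked, isotopically labeled standard. The sketch below shows that arithmetic with invented peak areas; the spiked amount and effective injection volume are assumptions chosen for illustration, not figures reported in the study.

        # Internal-standard SRM quantitation with invented example values.
        heavy_spiked_fmol = 500.0   # known amount of labeled standard injected
        light_peak_area = 1.8e5     # measured endogenous (light) transition area
        heavy_peak_area = 2.4e5     # measured labeled (heavy) transition area

        # Endogenous amount = spiked amount * light/heavy signal ratio
        # (assumes the pair ionizes and fragments with equal efficiency).
        endogenous_fmol = heavy_spiked_fmol * light_peak_area / heavy_peak_area

        injection_volume_ul = 400.0  # assumed fluid volume per injection
        conc_nM = endogenous_fmol / injection_volume_ul  # fmol/uL == nmol/L

        print(f"{endogenous_fmol:.0f} fmol on column = {conc_nM:.2f} nM in fluid")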

  15. Quantitative assessment of structural image quality.

    Science.gov (United States)

    Rosen, Adon F G; Roalf, David R; Ruparel, Kosha; Blake, Jason; Seelaus, Kevin; Villa, Lakshmi P; Ciric, Rastko; Cook, Philip A; Davatzikos, Christos; Elliott, Mark A; Garcia de La Garza, Angel; Gennatas, Efstathios D; Quarmley, Megan; Schmitt, J Eric; Shinohara, Russell T; Tisdall, M Dylan; Craddock, R Cameron; Gur, Raquel E; Gur, Ruben C; Satterthwaite, Theodore D

    2017-12-24

    Data quality is increasingly recognized as one of the most important confounding factors in brain imaging research. It is particularly important for studies of brain development, where age is systematically related to in-scanner motion and data quality. Prior work has demonstrated that in-scanner head motion biases estimates of structural neuroimaging measures. However, objective measures of data quality are not available for most structural brain images. Here we sought to identify quantitative measures of data quality for T1-weighted volumes, describe how these measures relate to cortical thickness, and delineate how this in turn may bias inference regarding associations with age in youth. Three highly trained raters provided manual ratings of 1840 raw T1-weighted volumes. These images included a training set of 1065 images from the Philadelphia Neurodevelopmental Cohort (PNC), a test set of 533 images from the PNC, as well as an external test set of 242 adults acquired on a different scanner. Manual ratings were compared to automated quality measures provided by the Preprocessed Connectomes Project's Quality Assurance Protocol (QAP), as well as FreeSurfer's Euler number, which summarizes the topological complexity of the reconstructed cortical surface. Results revealed that the Euler number was consistently correlated with manual ratings across samples. Furthermore, the Euler number could be used to identify images scored "unusable" by human raters with a high degree of accuracy (AUC: 0.98-0.99), and outperformed proxy measures from functional time series acquired in the same scanning session. The Euler number also was significantly related to cortical thickness in a regionally heterogeneous pattern that was consistent across datasets and replicated prior results. Finally, data quality both inflated and obscured associations with age during adolescence. Taken together, these results indicate that reliable measures of data quality can be automatically derived from T1
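
    The usability-classification result can be pictured as a small ROC analysis: given one quality score per image (standing in here for the Euler number) and a binary "unusable" label from the raters, the AUC measures how well the score separates the classes. The sketch below computes an AUC on simulated scores, not on the PNC data; group sizes and distributions are invented.

        import numpy as np

        rng = np.random.default_rng(1)

        # Simulated stand-in for FreeSurfer's Euler number: usable images
        # tend to have higher (less negative) values than unusable ones.
        usable = rng.normal(-40, 15, size=450)
        unusable = rng.normal(-120, 30, size=50)

        # AUC via the rank-sum (Mann-Whitney) identity: the probability
        # that a randomly chosen unusable image scores below a randomly
        # chosen usable one.
        auc = np.mean(unusable[:, None] < usable[None, :])
        print(f"AUC = {auc:.3f}")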

  16. Providing plastic zone extrusion

    Science.gov (United States)

    Manchiraju, Venkata Kiran; Feng, Zhili; David, Stan A.; Yu, Zhenzhen

    2017-04-11

    Plastic zone extrusion may be provided. First, a compressor may generate frictional heat in stock to place the stock in a plastic zone of the stock. Then, a conveyer may receive the stock in its plastic zone from the compressor and transport the stock in its plastic zone from the compressor. Next, a die may receive the stock in its plastic zone from the conveyer and extrude the stock to form a wire.

  17. Attitude Estimation or Quaternion Estimation?

    Science.gov (United States)

    Markley, F. Landis

    2003-01-01

    The attitude of spacecraft is represented by a 3x3 orthogonal matrix with unity determinant, which belongs to the three-dimensional special orthogonal group SO(3). The fact that all three-parameter representations of SO(3) are singular or discontinuous for certain attitudes has led to the use of higher-dimensional nonsingular parameterizations, especially the four-component quaternion. In attitude estimation, we are faced with the alternatives of using an attitude representation that is either singular or redundant. Estimation procedures fall into three broad classes. The first estimates a three-dimensional representation of attitude deviations from a reference attitude parameterized by a higher-dimensional nonsingular parameterization. The deviations from the reference are assumed to be small enough to avoid any singularity or discontinuity of the three-dimensional parameterization. The second class, which estimates a higher-dimensional representation subject to enough constraints to leave only three degrees of freedom, is difficult to formulate and apply consistently. The third class estimates a representation of SO(3) with more than three dimensions, treating the parameters as independent. We refer to the most common member of this class as quaternion estimation, to contrast it with attitude estimation. We analyze the first and third of these approaches in the context of an extended Kalman filter with simplified kinematics and measurement models.
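
    The contrast drawn here can be made concrete in a few lines of code. The sketch below, a generic illustration rather than any specific filter from the literature, integrates the quaternion kinematics for a constant body rate and shows the brute-force renormalization step that the quaternion-estimation view relies on once the four components are treated as independent.

        import numpy as np

        def quat_mult(q, p):
            """Hamilton product of quaternions in [x, y, z, w] order."""
            xq, yq, zq, wq = q
            xp, yp, zp, wp = p
            return np.array([
                wq * xp + wp * xq + yq * zp - zq * yp,
                wq * yp + wp * yq + zq * xp - xq * zp,
                wq * zp + wp * zq + xq * yp - yq * xp,
                wq * wp - xq * xp - yq * yp - zq * zp,
            ])

        def propagate(q, omega, dt):
            """First-order step of q_dot = 0.5 * q x [omega, 0]."""
            dq = 0.5 * quat_mult(q, np.array([*(omega * dt), 0.0]))
            q_new = q + dq
            # Treating the four components as independent lets q drift
            # off the unit sphere; renormalize to stay a valid rotation.
            return q_new / np.linalg.norm(q_new)

        q = np.array([0.0, 0.0, 0.0, 1.0])       # identity attitude
        omega = np.array([0.01, -0.02, 0.005])   # body rates, rad/s
        for _ in range(1000):
            q = propagate(q, omega, dt=0.1)
        print("unit norm preserved:", np.isclose(np.linalg.norm(q), 1.0))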

  18. A Survey of Cost Estimating Methodologies for Distributed Spacecraft Missions

    Science.gov (United States)

    Foreman, Veronica L.; Le Moigne, Jacqueline; de Weck, Oliver L.

    2016-01-01

    Satellite constellations and Distributed Spacecraft Mission (DSM) architectures offer unique benefits to Earth observation scientists and unique challenges to cost estimators. The Cost and Risk (CR) module of the Tradespace Analysis Tool for Constellations (TAT-C) being developed by NASA Goddard seeks to address some of these challenges by providing a new approach to cost modeling, which aggregates existing Cost Estimating Relationships (CER) from respected sources, cost estimating best practices, and data from existing and proposed satellite designs. Cost estimation through this tool is approached from two perspectives: parametric cost estimating relationships and analogous cost estimation techniques. The dual approach utilized within the TAT-C CR module is intended to address prevailing concerns regarding early design stage cost estimates, and to provide increased transparency and fidelity by offering two preliminary perspectives on mission cost. This work outlines the existing cost model, details assumptions built into the model, and explains what measures have been taken to address the particular challenges of constellation cost estimating. The risk estimation portion of the TAT-C CR module is still in development and will be presented in future work. The cost estimate produced by the CR module is not intended to be an exact mission valuation, but rather a comparative tool to assist in the exploration of the constellation design tradespace. Previous work has noted that estimating the cost of satellite constellations is difficult given that no comprehensive model for constellation cost estimation has yet been developed, and as such, quantitative assessment of multiple spacecraft missions has many remaining areas of uncertainty. By incorporating well-established CERs with preliminary approaches to addressing these uncertainties, the CR module offers a more complete approach to constellation costing than has previously been available to mission architects or Earth
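
    Parametric cost estimating of the kind the CR module aggregates usually takes the form of power-law cost estimating relationships fit to historical missions. The sketch below fits and applies one such CER, cost = a * mass^b, to invented historical data, and adds a crude learning-curve discount for a multi-unit constellation buy. It illustrates the technique only; neither the data nor the coefficients come from TAT-C.

        import numpy as np

        # Invented historical data: dry mass (kg) vs. bus cost ($M).
        mass = np.array([150.0, 320.0, 500.0, 780.0, 1200.0])
        cost = np.array([18.0, 32.0, 47.0, 65.0, 95.0])

        # Fit the power-law CER cost = a * mass^b by regression in logs.
        b, log_a = np.polyfit(np.log(mass), np.log(cost), deg=1)
        a = np.exp(log_a)
        print(f"CER: cost = {a:.2f} * mass^{b:.2f}")

        # Apply to a proposed 400 kg constellation satellite, with an
        # assumed 90% learning curve for N nearly identical units.
        unit1 = a * 400.0 ** b
        learning = 0.9
        n = 24
        total = unit1 * sum((i + 1) ** np.log2(learning) for i in range(n))
        print(f"first unit ${unit1:.1f}M, {n}-unit block ${total:.1f}M")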

  19. Uncertainty Evaluation of Weibull Estimators through Monte Carlo Simulation: Applications for Crack Initiation Testing

    Directory of Open Access Journals (Sweden)

    Jae Phil Park

    2016-06-01

    The typical experimental procedure for testing stress corrosion cracking initiation involves an interval-censored reliability test. Based on these test results, the parameters of a Weibull distribution, which is a widely accepted crack initiation model, can be estimated using maximum likelihood estimation or median rank regression. However, it is difficult to determine the appropriate number of test specimens and censoring intervals required to obtain sufficiently accurate Weibull estimators. In this study, we compare maximum likelihood estimation and median rank regression using a Monte Carlo simulation to examine the effects of the total number of specimens, test duration, censoring interval, and shape parameters of the true Weibull distribution on the estimator uncertainty. Finally, we provide the quantitative uncertainties of both Weibull estimators, compare them with the true Weibull parameters, and suggest proper experimental conditions for developing a probabilistic crack initiation model through crack initiation tests.
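
    The skeleton of such a simulation is easy to sketch. The code below repeatedly samples from a known Weibull distribution, fits each sample by maximum likelihood, and summarizes the scatter of the estimators. It deliberately simplifies the study's setting by ignoring interval censoring and omitting the median rank regression comparison, and every parameter choice is illustrative.

        import numpy as np
        from scipy.stats import weibull_min

        rng = np.random.default_rng(42)

        shape_true, scale_true = 2.0, 100.0  # "true" crack-initiation model
        n_specimens = 20                     # specimens per simulated test
        n_trials = 2000                      # Monte Carlo repetitions

        shapes, scales = [], []
        for _ in range(n_trials):
            sample = weibull_min.rvs(shape_true, scale=scale_true,
                                     size=n_specimens, random_state=rng)
            # MLE fit; location fixed at 0 for a two-parameter Weibull.
            c_hat, _, scale_hat = weibull_min.fit(sample, floc=0)
            shapes.append(c_hat)
            scales.append(scale_hat)

        print(f"shape: mean {np.mean(shapes):.2f}, sd {np.std(shapes):.2f} "
              f"(true {shape_true})")
        print(f"scale: mean {np.mean(scales):.1f}, sd {np.std(scales):.1f} "
              f"(true {scale_true})")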

  1. Whole cell, label free protein quantitation with data independent acquisition: quantitation at the MS2 level.

    Science.gov (United States)

    McQueen, Peter; Spicer, Vic; Schellenberg, John; Krokhin, Oleg; Sparling, Richard; Levin, David; Wilkins, John A

    2015-01-01

    Label free quantitation by measurement of peptide fragment signal intensity (MS2 quantitation) is a technique that has seen limited use due to the stochastic nature of data dependent acquisition (DDA). However, data independent acquisition has the potential to make large scale MS2 quantitation a more viable technique. In this study we used an implementation of data independent acquisition, SWATH, to perform label free protein quantitation in the model bacterium Clostridium stercorarium. Four tryptic digests analyzed by SWATH were probed by an ion library containing information on peptide mass and retention time obtained from DDA experiments. Application of this ion library to SWATH data quantified 1030 proteins with at least two peptides quantified (∼40% of predicted proteins in the C. stercorarium genome) in each replicate. Quantitative results obtained were very consistent between biological replicates (R² ∼ 0.960). Protein quantitation by summation of peptide fragment signal intensities was also highly consistent between biological replicates (R² ∼ 0.930), indicating that this approach may have increased viability compared to recent applications in label free protein quantitation. SWATH based quantitation was able to consistently detect differences in relative protein quantity and it provided coverage for a number of proteins that were missed in some samples by DDA analysis. © 2014 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
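
    The quantitation scheme described, summing peptide fragment intensities up to the protein level and checking agreement between replicates, reduces to a little array arithmetic. The sketch below demonstrates it on invented intensities; the array shapes, the fragment-to-protein mapping, and the noise model are assumptions, not the study's data.

        import numpy as np

        rng = np.random.default_rng(7)

        # Invented fragment-level intensities for two biological replicates;
        # each fragment maps to one of 50 proteins (indices are arbitrary).
        n_fragments = 600
        protein_of_fragment = rng.integers(0, 50, size=n_fragments)
        rep1 = rng.lognormal(10, 1, size=n_fragments)
        rep2 = rep1 * rng.lognormal(0, 0.15, size=n_fragments)  # noisy copy

        # MS2-level protein quantitation: sum fragment intensities per protein.
        def protein_quant(intensities):
            totals = np.zeros(50)
            np.add.at(totals, protein_of_fragment, intensities)
            return totals

        p1, p2 = protein_quant(rep1), protein_quant(rep2)

        # Replicate consistency on log-transformed protein quantities.
        r = np.corrcoef(np.log(p1), np.log(p2))[0, 1]
        print(f"R^2 between replicates: {r ** 2:.3f}")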

  2. Providing Compassion through Flow

    Directory of Open Access Journals (Sweden)

    Lydia Royeen

    2015-07-01

    Meg Kral, MS, OTR/L, CLT, is the cover artist for the Summer 2015 issue of The Open Journal of Occupational Therapy. Her untitled piece of art is an oil painting and is a re-creation of a photograph taken while on vacation. Meg is currently supervisor of outpatient services at Rush University Medical Center. She is lymphedema certified and has a specific interest in breast cancer lymphedema. Art and occupational therapy serve similar purposes for Meg: both provide a sense of flow. She values the outcomes, whether it is a piece of art or improved functional status.

  3. Providing Contraception to Adolescents.

    Science.gov (United States)

    Raidoo, Shandhini; Kaneshiro, Bliss

    2015-12-01

    Adolescents have high rates of unintended pregnancy and face unique reproductive health challenges. Providing confidential contraceptive services to adolescents is important in reducing the rate of unintended pregnancy. Long-acting contraceptives such as the intrauterine device and the contraceptive implant are recommended as first-line options for adolescents because they are highly effective with few side effects. The use of barrier methods to prevent sexually transmitted infections should be encouraged. Adolescents have limited knowledge of reproductive health and contraceptive options, and their sources of information are often unreliable. Access to contraception is available through a variety of resources that continue to expand. Copyright © 2015 Elsevier Inc. All rights reserved.

  4. Ysla S. Catalina & Providence

    OpenAIRE

    Diazgranados, Carlos Nicolás; Torres Carreño, Guillermo Andrés; Castell, Edmon; Moreno, Santiago; Ramirez, Natalia

    2010-01-01

    This handout belongs to the temporary exhibition "Ysla S. Catalina & Providence". It contains a historical summary of the islands of Santa Catalina and Providencia, in English and Spanish, and a sixth-century map that makes the piece more didactic, supported by cut-out figures. The exhibit is part of the IDA y VUELTA project of the Sistema de Patrimonio Cultural y Museos (SPM), which manages the decentralization of the cultural heritage of the Universidad Nacional de Colombia to other cities in the country...

  5. Quantitative feedback versus standard training for cervical and thoracic manipulation.

    Science.gov (United States)

    Triano, John J; Rogers, Carolyn M; Combs, Sarah; Potts, David; Sorrels, Kenneth

    2003-01-01

    To quantify elements of spinal manipulation therapy performance and to test the strategy of combined rehearsal and quantitative feedback as a means of enhancing student skill development for cervical and thoracic manipulative procedures. Randomized, controlled study. Chiropractic college. Thirty-nine chiropractic student volunteers entering the manipulation technique training course. Student performance of cervical and thoracic spinal manipulation was quantified at the beginning, middle, and end of a trimester using a Leader 900 Z series manipulation table (Leader International, Port Orchard, Wash) embedded with an AMTI force plate. Passive loads acting through the targeted (C2 or T7) functional spinal units were estimated using inverse dynamics. Participating students rehearsed the index transverse (C2) and single pisiform-transverse (T7) procedures following either the standard curriculum alone or a modified curriculum adding the Dynadjust Instrument training aid (Labarge, Inc.), as assigned on a randomized basis. Student t and chi-square tests were used to explore and describe biomechanical parameter changes over time as the semester progressed. Significant changes in performance between the standard curriculum and the modified curriculum (with the Dynadjust) were observed for several, but different, biomechanical parameters of cervical and thoracic procedures. This project used a rehearsal program that provided quantitative feedback on an empirically defined schedule that was self-administered by the student. Results demonstrated significant changes in performance of spinal manipulation by students using the Dynadjust Instrument versus those who did not. Using quantitative feedback provided from training aids and biomechanical measurement systems, future training programs may be optimized and tested.
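
    The between-curriculum comparison reported above boils down to a two-sample test per biomechanical parameter. The sketch below runs a Welch (unequal-variance) two-sample t-test, a close relative of the Student t tests the study reports, on simulated peak-force values; the parameter, group sizes, and distributions are invented for illustration and are not taken from the study.

        import numpy as np
        from scipy import stats

        rng = np.random.default_rng(3)

        # Simulated end-of-trimester peak applied force (N) for one procedure.
        standard = rng.normal(310, 45, size=20)   # standard curriculum
        modified = rng.normal(355, 40, size=19)   # curriculum with training aid

        t, p = stats.ttest_ind(standard, modified, equal_var=False)
        print(f"Welch t = {t:.2f}, p = {p:.4f}")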

  6. Comparative analysis of quantitative methodologies for Vibrionaceae biofilms.

    Science.gov (United States)

    Chavez-Dozal, Alba A; Nourabadi, Neda; Erken, Martina; McDougald, Diane; Nishiguchi, Michele K

    2016-11-01

    Multiple symbiotic and free-living Vibrio spp. grow as a form of microbial community known as a biofilm. In the laboratory, methods to quantify Vibrio biofilm mass include crystal violet staining, direct colony-forming unit (CFU) counting, dry biofilm cell mass measurement, and observation of development of wrinkled colonies. Another approach for bacterial biofilms also involves the use of tetrazolium (XTT) assays (used widely in studies of fungi) that are an appropriate measure of metabolic activity and vitality of cells within the biofilm matrix. This study systematically tested five techniques, among which the XTT assay and wrinkled colony measurement provided the most reproducible, accurate, and efficient methods for the quantitative estimation of Vibrionaceae biofilms.

  7. Quantitative metagenomic analyses based on average genome size normalization

    DEFF Research Database (Denmark)

    Frank, Jeremy Alexander; Sørensen, Søren Johannes

    2011-01-01

    Over the past quarter-century, microbiologists have used DNA sequence information to aid in the characterization of microbial communities. During the last decade, this has expanded from single genes to microbial community genomics, or metagenomics, in which the gene content of an environment can provide not just a census of the community members but direct information on metabolic capabilities and potential interactions among community members. Here we introduce a method for the quantitative characterization and comparison of microbial communities based on the normalization of metagenomic data by estimating average genome sizes. This normalization can relieve comparative biases introduced by differences in community structure, number of sequencing reads, and sequencing read lengths between different metagenomes. We demonstrate the utility of this approach by comparing metagenomes from two different...
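
    The normalization itself is compact: dividing each gene family's read count by the number of genome equivalents sequenced (total sequenced bases over the estimated average genome size) puts samples with different depths, read lengths, and community structures on a comparable per-genome footing. The sketch below shows this arithmetic on invented counts; it illustrates the general idea rather than the authors' exact software.

        import numpy as np

        # Invented data: raw read counts for 4 gene families in 2 metagenomes.
        counts = np.array([[120.0,  95.0, 310.0,  40.0],    # metagenome A
                           [260.0, 150.0, 640.0, 100.0]])   # metagenome B

        total_reads = np.array([1.2e6, 2.5e6])       # reads per sample
        avg_genome_size = np.array([3.1e6, 4.8e6])   # estimated AGS (bp)
        read_length = 150.0                          # assumed mean read length

        # Genome equivalents sequenced = total bases / average genome size.
        genome_equivalents = total_reads * read_length / avg_genome_size

        # Abundance in copies per genome equivalent, comparable across samples.
        normalized = counts / genome_equivalents[:, None]
        print(np.round(normalized, 4))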

  8. A General Model For Estimating Macroevolutionary Landscapes.

    Science.gov (United States)

    Boucher, Florian C; Démery, Vincent; Conti, Elena; Harmon, Luke J; Uyeda, Josef

    2017-09-22

    The evolution of quantitative characters over long timescales is often studied using stochastic diffusion models. The current toolbox available to students of macroevolution is, however, limited to two main models: Brownian motion and the Ornstein-Uhlenbeck process, plus some of their extensions. Here we present a very general model for inferring the dynamics of quantitative characters evolving under both random diffusion and deterministic forces of any possible shape and strength, which can accommodate interesting evolutionary scenarios like directional trends, disruptive selection, or macroevolutionary landscapes with multiple peaks. This model is based on a general partial differential equation widely used in statistical mechanics: the Fokker-Planck equation, also known in population genetics as the Kolmogorov forward equation. We thus call the model FPK, for Fokker-Planck-Kolmogorov. We first explain how this model can be used to describe macroevolutionary landscapes over which quantitative traits evolve and, more importantly, we detail how it can be fitted to empirical data. Using simulations, we show that the model has good behavior both in terms of discrimination from alternative models and in terms of parame
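
    For reference, the model rests on the standard one-dimensional Fokker-Planck (Kolmogorov forward) equation for the density p(x, t) of a quantitative trait x; writing the deterministic force as the negative gradient of a potential V(x) gives the macroevolutionary-landscape reading described above. The notation below is the textbook form of the equation, not necessarily the paper's:

        \frac{\partial p(x,t)}{\partial t}
          = -\frac{\partial}{\partial x}\left[ \mu(x)\, p(x,t) \right]
            + \frac{1}{2}\,\frac{\partial^{2}}{\partial x^{2}}
              \left[ \sigma^{2}(x)\, p(x,t) \right],
        \qquad \mu(x) = -\frac{dV(x)}{dx}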