WorldWideScience

Sample records for proteins valid estimates

  1. Development, standardization and validation of nuclear based technologies for estimating microbial protein supply in ruminant livestock for improving productivity

    International Nuclear Information System (INIS)

    Makkar, H.P.S.

    2004-01-01

    The primary constraint to livestock production in developing countries is the scarcity and fluctuating quantity and quality of the year-round feed supply. These countries experience serious shortages of animal feeds and fodders of the conventional type. Natural forages are very variable in both quality and quantity, conventional agro-industrial by-products are scarce and vary seasonally, and grains are required almost exclusively for human consumption. Small farmers in developing countries have limited resources available for feeding their ruminant livestock. Poor nutrition results in low rates of reproduction and production as well as increased susceptibility to disease and mortality. Providing adequate good-quality feed to livestock to raise and maintain their productivity is a major challenge for agricultural scientists and policy makers all over the world. Recent advances in ration balancing include manipulation of feed to increase the quantity and quality of protein and energy delivered to the small intestine. Selecting feeds that combine high dry matter digestibility with a high efficiency of microbial protein synthesis in the rumen, and developing feeding strategies around both, will increase the post-ruminal protein supply. The strategy for improving production has therefore been to maximize the efficiency of utilization of available feed resources in the rumen by providing optimum conditions for microbial growth, and then to supplement dietary nutrients to complement and balance the products of rumen digestion against the animal's requirements.

  2. Content validity and its estimation

    Directory of Open Access Journals (Sweden)

    Yaghmale F

    2003-04-01

    Full Text Available Background: Measuring the content validity of instruments is important. This type of validity can help to ensure construct validity and give confidence to readers and researchers about instruments. Content validity refers to the degree to which the instrument covers the content that it is supposed to measure. For content validity two judgments are necessary: the measurable extent of each item for defining the traits, and the set of items that represents all aspects of the traits. Purpose: To develop a content-valid scale for assessing experience with computer usage. Methods: First, a review of 2 volumes of the International Journal of Nursing Studies was conducted; only 1 article out of 13 documented content validity, and it did so using a 4-point content validity index (CVI) and the judgment of 3 experts. Then a scale with 38 items was developed. The experts were asked to rate each item for relevance, clarity, simplicity and ambiguity on the four-point scale. The Content Validity Index (CVI) for each item was determined. Result: Of the 38 items, those with a CVI over 0.75 remained and the rest were discarded, resulting in a 25-item scale. Conclusion: Although documenting the content validity of an instrument may seem expensive in terms of time and human resources, its importance warrants greater attention when a valid assessment instrument is to be developed. Keywords: Content Validity, Measuring Content Validity
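
    The item-level content validity index described above is a simple proportion; a minimal sketch of how it might be computed is shown below. The 4-point relevance scale and the 0.75 retention cut-off follow the abstract, while the rating matrix itself is hypothetical.

```python
# Minimal sketch of an item-level Content Validity Index (CVI) calculation.
# Assumptions: experts rate each item on a 4-point relevance scale, ratings of
# 3 or 4 count as "relevant", and items with CVI <= 0.75 are discarded
# (retention cut-off taken from the abstract). The ratings below are hypothetical.

def item_cvi(ratings):
    """CVI = proportion of experts rating the item 3 or 4."""
    relevant = sum(1 for r in ratings if r >= 3)
    return relevant / len(ratings)

# ratings[item] = list of ratings, one per expert
ratings = {
    "item_01": [4, 4, 3],   # CVI = 1.00 -> retained
    "item_02": [2, 3, 2],   # CVI = 0.33 -> discarded
    "item_03": [3, 4, 4],   # CVI = 1.00 -> retained
}

retained = {item: item_cvi(r) for item, r in ratings.items() if item_cvi(r) > 0.75}
print(retained)  # {'item_01': 1.0, 'item_03': 1.0}
```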

  3. Summary of the co-ordinated research project on development, standardization and validation of nuclear based technologies for estimating microbial protein supply in ruminant livestock for improving productivity

    International Nuclear Information System (INIS)

    Jayasuriya, M.C.N.

    1999-01-01

    A major constraint to animal production in developing countries is poor nutrition due to inadequate or fluctuating nutrient supply. This results in low rates of reproduction and production as well as increased susceptibility to disease and mortality. Microbial cells formed as a result of rumen degradation of carbohydrates under anaerobic conditions are a major source of protein for ruminants. They provide the majority of the amino acids that the host animal requires for tissue maintenance, growth and production. In roughage-fed ruminants, micro-organisms are virtually the only source of protein. Therefore, knowledge of the microbial contribution to the nutrition of the host animal is essential for developing feed supplementation strategies to improve ruminant production. While this has been recognized for many years, it has been extremely difficult to determine the microbial protein contribution to ruminant nutrition. The methods generally used for determining microbial protein production depend on the use of natural microbial markers such as RNA (ribonucleic acid) and DAPA (diaminopimelic acid) or of the isotopes 35S, 15N or 32P. However, these methods involve surgical intervention such as post-rumen cannulation and complex procedures that require accurate and quantitative information on both digesta and microbial marker flow. A colorimetric technique using enzymatic procedures was developed for measuring purine derivatives (PD) in urine under a Technical Contract. With knowledge of the amount of PD excreted in the urine, the microbial protein supply to the host animal can be estimated. The principle of the method is that nucleic acids leaving the rumen are essentially of microbial origin. The nucleic acids are extensively digested in the small intestine and the resulting purines are absorbed.
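
    The purine-derivative approach summarized above reduces to a short calculation once daily PD excretion is known. The sketch below assumes the form of the widely cited Chen & Gomes (FAO/IAEA) equations for cattle; the coefficients and the input values are taken from that commonly quoted source rather than from this record and should be treated as illustrative only.

```python
# Sketch of microbial protein supply estimated from urinary purine derivatives (PD).
# The relationships below follow the commonly cited Chen & Gomes (FAO/IAEA)
# equations for cattle; the coefficients are assumptions taken from that source,
# not from this record, and the input values are illustrative.

def microbial_protein_from_pd(pd_excretion_mmol_d, body_weight_kg):
    w075 = body_weight_kg ** 0.75                                   # metabolic body weight
    # PD excretion = 0.85 * absorbed purines + endogenous contribution (0.385 * W^0.75)
    absorbed_purines = (pd_excretion_mmol_d - 0.385 * w075) / 0.85  # mmol/d
    # Microbial N (g/d) = 70 * X / (0.116 * 0.83 * 1000) ~= 0.727 * X
    microbial_n = 70.0 * absorbed_purines / (0.116 * 0.83 * 1000.0)
    return microbial_n * 6.25                                       # crude protein, g/d

print(round(microbial_protein_from_pd(pd_excretion_mmol_d=150.0, body_weight_kg=400.0), 1))
```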

  4. Estimating uncertainty of inference for validation

    Energy Technology Data Exchange (ETDEWEB)

    Booker, Jane M [Los Alamos National Laboratory; Langenbrunner, James R [Los Alamos National Laboratory; Hemez, Francois M [Los Alamos National Laboratory; Ross, Timothy J [UNM

    2010-09-30

    We present a validation process based upon the concept that validation is an inference-making activity. This has always been true, but the association has not been as important before as it is now. Previously, theory had been confirmed by more data, and predictions were possible based on data. The process today is to infer from theory to code and from code to prediction, making the role of prediction somewhat automatic, a machine function. Validation is defined as determining the degree to which a model and its code are an accurate representation of experimental test data. Embedded in validation is the intention to use the computer code to predict. To predict is to accept the conclusion that an observable final state will manifest; therefore, prediction is an inference whose goodness relies on the validity of the code. Quantifying the uncertainty of a prediction amounts to quantifying the uncertainty of validation, and this involves the characterization of uncertainties inherent in theory/models/codes and the corresponding data. An introduction to inference making and its associated uncertainty is provided as a foundation for the validation problem. A mathematical construction for estimating the uncertainty in the validation inference is then presented, including a possibility distribution constructed to represent the inference uncertainty for validation under uncertainty. The estimation of inference uncertainty for validation is illustrated using data and calculations from Inertial Confinement Fusion (ICF). The ICF measurements of neutron yield and ion temperature were obtained for direct-drive inertial fusion capsules at the Omega laser facility. The glass capsules, containing the fusion gas, were systematically selected with the intent of establishing a reproducible baseline of high-yield 10^13-10^14 neutron output. The deuterium-tritium ratio in these experiments was varied to study its influence upon yield. This paper on validation inference is the

  5. Validation of protein carbonyl measurement

    DEFF Research Database (Denmark)

    Augustyniak, Edyta; Adam, Aisha; Wojdyla, Katarzyna

    2015-01-01

    Protein carbonyls are widely analysed as a measure of protein oxidation. Several different methods exist for their determination. A previous study had described orders of magnitude variance that existed when protein carbonyls were analysed in a single laboratory by ELISA using different commercial...... protein carbonyl analysis across Europe. ELISA and Western blotting techniques detected an increase in protein carbonyl formation between 0 and 5min of UV irradiation irrespective of method used. After irradiation for 15min, less oxidation was detected by half of the laboratories than after 5min...... irradiation. Three of the four ELISA carbonyl results fell within 95% confidence intervals. Likely errors in calculating absolute carbonyl values may be attributed to differences in standardisation. Out of up to 88 proteins identified as containing carbonyl groups after tryptic cleavage of irradiated...

  6. Validation-driven protein-structure improvement

    NARCIS (Netherlands)

    Touw, W.G.

    2016-01-01

    High-quality protein structure models are essential for many Life Science applications, such as protein engineering, molecular dynamics, drug design, and homology modelling. The WHAT_CHECK model validation project and the PDB_REDO model optimisation project have shown that many structure models in

  7. Clinical validity of the estimated energy requirement and the average protein requirement for nutritional status change and wound healing in older patients with pressure ulcers: A multicenter prospective cohort study.

    Science.gov (United States)

    Iizaka, Shinji; Kaitani, Toshiko; Nakagami, Gojiro; Sugama, Junko; Sanada, Hiromi

    2015-11-01

    Adequate nutritional intake is essential for pressure ulcer healing. Recently, the estimated energy requirement (30 kcal/kg) and the average protein requirement (0.95 g/kg) necessary to maintain metabolic balance have been reported. The purpose was to evaluate the clinical validity of these requirements in older hospitalized patients with pressure ulcers by assessing nutritional status and wound healing. This multicenter prospective study, carried out as a secondary analysis of a clinical trial, included 194 patients with pressure ulcers aged ≥65 years from 29 institutions. Nutritional status, including anthropometry and biochemical tests, and wound status, assessed with a structured severity tool, were evaluated over 3 weeks. Energy and protein intake were determined from medical records on a typical day and dichotomized by meeting the estimated average requirement. Longitudinal data were analyzed with a multivariate mixed-effects model. Meeting the energy requirement was associated with changes in weight (P clinically validated for prevention of nutritional decline and of impaired healing of deep pressure ulcers. © 2014 Japan Geriatrics Society.
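
    The two requirement figures quoted above (30 kcal/kg for energy, 0.95 g/kg for protein) translate directly into per-patient daily targets; a trivial sketch, using a hypothetical body weight, is shown below.

```python
# Per-patient nutritional targets from the requirements quoted in the abstract:
# estimated energy requirement 30 kcal/kg/day, average protein requirement
# 0.95 g/kg/day. The body weight is a hypothetical example value.

def nutritional_targets(weight_kg, energy_kcal_per_kg=30.0, protein_g_per_kg=0.95):
    return {
        "energy_kcal_per_day": energy_kcal_per_kg * weight_kg,
        "protein_g_per_day": protein_g_per_kg * weight_kg,
    }

print(nutritional_targets(weight_kg=50.0))
# {'energy_kcal_per_day': 1500.0, 'protein_g_per_day': 47.5}
```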

  8. Estimation of rumen microbial protein production from purine derivatives in urine. A laboratory manual for the FAO/IAEA co-ordinated research programme on development, standardization and validation of nuclear based technologies for measuring microbial protein supply in ruminant livestock for improving productivity

    International Nuclear Information System (INIS)

    1997-05-01

    This laboratory manual contains the methodologies used in the standardization and validation of the urine purine derivative technique for estimating microbial protein supply to the rumen. It includes descriptions of methods that involve both radioactive and stable isotopes as well as non isotopic techniques such as chemical assays, since it has been recognised that while isotopic trace techniques provide a powerful tool for nutrition research they can not and should not be used in isolation. Refs, figs, tabs

  9. How Valid are Estimates of Occupational Illness?

    Science.gov (United States)

    Hilaski, Harvey J.; Wang, Chao Ling

    1982-01-01

    Examines some of the methods of estimating occupational diseases and suggests that a consensus on the adequacy and reliability of estimates by the Bureau of Labor Statistics and others is not likely. (SK)

  10. Experimental validation of pulsed column inventory estimators

    International Nuclear Information System (INIS)

    Beyerlein, A.L.; Geldard, J.F.; Weh, R.; Eiben, K.; Dander, T.; Hakkila, E.A.

    1991-01-01

    Near-real-time accounting (NRTA) for reprocessing plants relies on the timely measurement of all transfers through the process area and all inventory in the process. It is difficult to measure the inventory of the solvent contactors; therefore, estimation techniques are considered. We have used experimental data obtained at the TEKO facility in Karlsruhe and have applied computer codes developed at Clemson University to analyze these data. For uranium extraction, the computer predictions agree to within 15% of the measured inventories. We believe this study is significant in demonstrating that using theoretical models with a minimum amount of process data may be an acceptable approach to column inventory estimation for NRTA. 15 refs., 7 figs

  11. Validation of Core Temperature Estimation Algorithm

    Science.gov (United States)

    2016-01-20

    based on an extended Kalman filter, which was developed using field data from 17 young male U.S. Army soldiers with core temperatures ranging from... [The remainder of this record consists of fragments of a MATLAB function, KFMODEL, that estimates core temperature from a heart rate time series (batch or streaming mode), with a default starting core temperature CTstart = 37.1 °C and extended Kalman filter parameters a = 1, gamma = 0.022^2, b_0 = -7887.1, b_1 = ... (truncated).]
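
    A minimal sketch of the kind of extended Kalman filter the fragments describe is given below. The state-transition and process-noise parameters (a = 1, gamma = 0.022^2) and the observation-model intercept (b_0 = -7887.1) come from the code fragments above; the remaining observation-model coefficients and the observation variance are truncated in the record, so the values used here are assumed placeholders for illustration only, and this is not a reconstruction of the original KFMODEL function.

```python
# Sketch of an extended Kalman filter estimating core temperature (CT, deg C)
# from a heart rate (HR) time series. A, GAMMA and B0 are taken from the code
# fragments in the record; B1, B2 and OBS_VAR are truncated there, so the values
# below are assumed placeholders for illustration only.

A = 1.0                                   # state transition (random-walk CT model)
GAMMA = 0.022 ** 2                        # process noise variance
B0, B1, B2 = -7887.1, 384.4286, -4.5714   # HR = B2*CT^2 + B1*CT + B0 (B1, B2 assumed)
OBS_VAR = 18.88 ** 2                      # observation noise variance (assumed)

def ekf_core_temp(hr_series, ct_start=37.1, var_start=0.0):
    ct, var = ct_start, var_start
    estimates = []
    for hr in hr_series:
        # Predict step
        ct_pred = A * ct
        var_pred = A * var * A + GAMMA
        # Update step: quadratic observation model linearised at ct_pred
        hr_pred = B2 * ct_pred ** 2 + B1 * ct_pred + B0
        h = 2.0 * B2 * ct_pred + B1                      # Jacobian dHR/dCT
        k = var_pred * h / (h * var_pred * h + OBS_VAR)  # Kalman gain
        ct = ct_pred + k * (hr - hr_pred)
        var = (1.0 - k * h) * var_pred
        estimates.append(ct)
    return estimates

print([round(x, 3) for x in ekf_core_temp([110, 120, 135, 150])])
```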

  12. Validating estimates of problematic drug use in England

    Directory of Open Access Journals (Sweden)

    Heatlie Heath

    2007-10-01

    Full Text Available Abstract Background UK Government expenditure on combatting drug abuse is based on estimates of illicit drug users, yet the validity of these estimates is unknown. This study aims to assess the face validity of problematic drug use (PDU) and injecting drug use (IDU) estimates for all English Drug Action Teams (DATs) in 2001. The estimates were derived from a statistical model using the Multiple Indicator Method (MIM). Methods Questionnaire study, in which the 149 English Drug Action Teams were asked to evaluate the MIM estimates for their DAT. Results The response rate was 60% and there were no indications of selection bias. Of responding DATs, 64% thought the PDU estimates were about right or did not dispute them, while 27% thought they were too low and 9% too high. The corresponding figures for the IDU estimates were 52% (about right), 44% (too low) and 3% (too high). Conclusion This is the first UK study to determine the validity of estimates of problematic and injecting drug misuse. The results highlight the need to consider criterion and face validity when evaluating estimates of the number of drug users.

  13. A probabilistic approach for validating protein NMR chemical shift assignments

    International Nuclear Information System (INIS)

    Wang Bowei; Wang, Yunjun; Wishart, David S.

    2010-01-01

    It has been estimated that more than 20% of the proteins in the BMRB are improperly referenced and that about 1% of all chemical shift assignments are mis-assigned. These statistics also reflect the likelihood that any newly assigned protein will have shift assignment or shift referencing errors. The relatively high frequency of these errors continues to be a concern for the biomolecular NMR community. While several programs do exist to detect and/or correct chemical shift mis-referencing or chemical shift mis-assignments, most can only do one or the other. The one program (SHIFTCOR) that is capable of handling both chemical shift mis-referencing and mis-assignments requires the 3D structure coordinates of the target protein. Given that chemical shift mis-assignments and chemical shift re-referencing issues should ideally be addressed prior to 3D structure determination, there is a clear need to develop a structure-independent approach. Here, we present a new structure-independent protocol, which is based on using residue-specific and secondary structure-specific chemical shift distributions calculated over small (3-6 residue) fragments to identify mis-assigned resonances. The method is also able to identify and re-reference mis-referenced chemical shift assignments. Comparisons against existing re-referencing or mis-assignment detection programs show that the method is as good as or superior to existing approaches. The protocol described here has been implemented into a freely available Java program called 'Probabilistic Approach for protein Nmr Assignment Validation (PANAV)' and as a web server (http://redpoll.pharmacy.ualberta.ca/PANAV) which can be used to validate and/or correct as well as re-reference assigned protein chemical shifts.
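
    As a generic illustration of the kind of structure-independent check described above (not the actual PANAV algorithm), the sketch below flags assigned shifts that fall far outside a residue- and atom-specific reference distribution; the reference means and standard deviations used here are hypothetical placeholder values, not BMRB statistics.

```python
# Generic sketch of flagging possible chemical shift mis-assignments by comparing
# each assigned shift against a residue- and atom-specific reference distribution.
# This illustrates the idea only and is not the PANAV algorithm; the reference
# means/standard deviations below are hypothetical placeholders, not BMRB values.

REFERENCE = {
    ("ALA", "CA"): (53.1, 2.0),   # (mean ppm, std ppm) -- placeholder values
    ("GLY", "CA"): (45.3, 1.5),
}

def flag_outliers(assignments, z_cutoff=4.0):
    """assignments: list of (residue_type, atom, shift_ppm)."""
    flagged = []
    for res, atom, shift in assignments:
        mean, std = REFERENCE.get((res, atom), (None, None))
        if mean is None:
            continue
        z = abs(shift - mean) / std
        if z > z_cutoff:
            flagged.append((res, atom, shift, round(z, 1)))
    return flagged

print(flag_outliers([("ALA", "CA", 53.5), ("GLY", "CA", 62.0)]))
# [('GLY', 'CA', 62.0, 11.1)]
```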

  14. Benchmarking protein classification algorithms via supervised cross-validation

    NARCIS (Netherlands)

    Kertész-Farkas, A.; Dhir, S.; Sonego, P.; Pacurar, M.; Netoteia, S.; Nijveen, H.; Kuzniar, A.; Leunissen, J.A.M.; Kocsor, A.; Pongor, S.

    2008-01-01

    Development and testing of protein classification algorithms are hampered by the fact that the protein universe is characterized by groups vastly different in the number of members, in average protein size, similarity within group, etc. Datasets based on traditional cross-validation (k-fold,

  15. Estimating activity energy expenditure: how valid are physical activity questionnaires?

    Science.gov (United States)

    Neilson, Heather K; Robson, Paula J; Friedenreich, Christine M; Csizmadi, Ilona

    2008-02-01

    Activity energy expenditure (AEE) is the modifiable component of total energy expenditure (TEE) derived from all activities, both volitional and nonvolitional. Because AEE may affect health, there is interest in its estimation in free-living people. Physical activity questionnaires (PAQs) could be a feasible approach to AEE estimation in large populations, but it is unclear whether or not any PAQ is valid for this purpose. Our aim was to explore the validity of existing PAQs for estimating usual AEE in adults, using doubly labeled water (DLW) as a criterion measure. We reviewed 20 publications that described PAQ-to-DLW comparisons, summarized study design factors, and appraised criterion validity using mean differences (AEE(PAQ) - AEE(DLW), or TEE(PAQ) - TEE(DLW)), 95% limits of agreement, and correlation coefficients (AEE(PAQ) versus AEE(DLW) or TEE(PAQ) versus TEE(DLW)). Only 2 of 23 PAQs assessed most types of activity over the past year and indicated acceptable criterion validity, with mean differences (TEE(PAQ) - TEE(DLW)) of 10% and 2% and correlation coefficients of 0.62 and 0.63, respectively. At the group level, neither overreporting nor underreporting was more prevalent across studies. We speculate that, aside from reporting error, discrepancies between PAQ and DLW estimates may be partly attributable to 1) PAQs not including key activities related to AEE, 2) PAQs and DLW ascertaining different time periods, or 3) inaccurate assignment of metabolic equivalents to self-reported activities. Small sample sizes, use of correlation coefficients, and limited information on individual validity were problematic. Future research should address these issues to clarify the true validity of PAQs for estimating AEE.
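
    The criterion-validity statistics used in this review (mean difference, 95% limits of agreement, correlation coefficient) are straightforward to compute; a minimal sketch with hypothetical paired PAQ and DLW values follows.

```python
import numpy as np

# Sketch of the criterion-validity statistics used above: mean difference
# (PAQ - DLW), 95% limits of agreement, and the Pearson correlation.
# The paired values below are hypothetical.

paq = np.array([2400.0, 2750.0, 2100.0, 3050.0, 2600.0])   # TEE from questionnaire, kcal/d
dlw = np.array([2500.0, 2650.0, 2300.0, 2900.0, 2700.0])   # TEE from doubly labeled water, kcal/d

diff = paq - dlw
mean_diff = diff.mean()
loa = (mean_diff - 1.96 * diff.std(ddof=1), mean_diff + 1.96 * diff.std(ddof=1))
r = np.corrcoef(paq, dlw)[0, 1]

print(f"mean difference: {mean_diff:.0f} kcal/d")
print(f"95% limits of agreement: {loa[0]:.0f} to {loa[1]:.0f} kcal/d")
print(f"Pearson r: {r:.2f}")
```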

  16. Temporal validation for landsat-based volume estimation model

    Science.gov (United States)

    Renaldo J. Arroyo; Emily B. Schultz; Thomas G. Matney; David L. Evans; Zhaofei Fan

    2015-01-01

    Satellite imagery can potentially reduce the costs and time associated with ground-based forest inventories; however, for satellite imagery to provide reliable forest inventory data, it must produce consistent results from one time period to the next. The objective of this study was to temporally validate a Landsat-based volume estimation model in a four county study...

  17. Validity of Edgeworth expansions for realized volatility estimators

    DEFF Research Database (Denmark)

    Hounyo, Ulrich; Veliyev, Bezirgen

    (2009). Second, we show that the validity of the Edgeworth expansions for realized volatility may not cover the optimal two-point distribution wild bootstrap proposed by Gonçalves and Meddahi (2009). Then, we propose a new optimal nonlattice distribution which ensures the second-order correctness...... of the bootstrap. Third, in the presence of microstructure noise, based on our Edgeworth expansions, we show that the new optimal choice proposed in the absence of noise is still valid in noisy data for the pre-averaged realized volatility estimator proposed by Podolskij and Vetter (2009). Finally, we show how...

  18. Worldwide Protein Data Bank validation information: usage and trends.

    Science.gov (United States)

    Smart, Oliver S; Horský, Vladimír; Gore, Swanand; Svobodová Vařeková, Radka; Bendová, Veronika; Kleywegt, Gerard J; Velankar, Sameer

    2018-03-01

    Realising the importance of assessing the quality of the biomolecular structures deposited in the Protein Data Bank (PDB), the Worldwide Protein Data Bank (wwPDB) partners established Validation Task Forces to obtain advice on the methods and standards to be used to validate structures determined by X-ray crystallography, nuclear magnetic resonance spectroscopy and three-dimensional electron cryo-microscopy. The resulting wwPDB validation pipeline is an integral part of the wwPDB OneDep deposition, biocuration and validation system. The wwPDB Validation Service webserver (https://validate.wwpdb.org) can be used to perform checks prior to deposition. Here, it is shown how validation metrics can be combined to produce an overall score that allows the ranking of macromolecular structures and domains in search results. The ValTrends DB database provides users with a convenient way to access and analyse validation information and other properties of X-ray crystal structures in the PDB, including investigating trends in and correlations between different structure properties and validation metrics.

  19. Predicting and validating protein interactions using network structure.

    Directory of Open Access Journals (Sweden)

    Pao-Yang Chen

    2008-07-01

    Full Text Available Protein interactions play a vital part in the function of a cell. As experimental techniques for detection and validation of protein interactions are time consuming, there is a need for computational methods for this task. Protein interactions appear to form a network with a relatively high degree of local clustering. In this paper we exploit this clustering by suggesting a score based on triplets of observed protein interactions. The score utilises both protein characteristics and network properties. Our score based on triplets is shown to complement existing techniques for predicting protein interactions, outperforming them on data sets which display a high degree of clustering. The predicted interactions score highly against test measures for accuracy. Compared to a similar score derived from pairwise interactions only, the triplet score displays higher sensitivity and specificity. By looking at specific examples, we show how an experimental set of interactions can be enriched and validated. As part of this work we also examine the effect of different prior databases upon the accuracy of prediction and find that the interactions from the same kingdom give better results than from across kingdoms, suggesting that there may be fundamental differences between the networks. These results all emphasize that network structure is important and helps in the accurate prediction of protein interactions. The protein interaction data set and the program used in our analysis, and a list of predictions and validations, are available at http://www.stats.ox.ac.uk/bioinfo/resources/PredictingInteractions.
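
    The paper's triplet score combines protein characteristics with local network structure; as a simplified, purely topological stand-in (not the published score), the sketch below counts the triangles a candidate interaction would close in an observed interaction network.

```python
from collections import defaultdict

# Simplified, purely topological stand-in for a triplet-based interaction score:
# count how many triangles a candidate edge (u, v) would close, i.e. the number
# of common interaction partners of u and v. This is not the published score,
# which also uses protein characteristics; the toy network below is hypothetical.

def build_adjacency(edges):
    adj = defaultdict(set)
    for u, v in edges:
        adj[u].add(v)
        adj[v].add(u)
    return adj

def triplet_score(adj, u, v):
    return len(adj[u] & adj[v])   # common neighbours = triangles closed by (u, v)

observed = [("A", "B"), ("B", "C"), ("A", "C"), ("C", "D"), ("B", "D")]
adj = build_adjacency(observed)
print(triplet_score(adj, "A", "D"))   # 2 -> candidate A-D is supported by neighbours B and C
```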

  20. Validation of limited sampling models (LSM) for estimating AUC in therapeutic drug monitoring - is a separate validation group required?

    NARCIS (Netherlands)

    Proost, J. H.

    Objective: Limited sampling models (LSM) for estimating AUC in therapeutic drug monitoring are usually validated in a separate group of patients, according to published guidelines. The aim of this study is to evaluate the validation of LSM by comparing independent validation with cross-validation
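
    The comparison the study makes, independent (separate-group) validation versus cross-validation of a limited sampling model, can be illustrated with a simple regression predicting AUC from a few concentration time points. The sketch below uses synthetic data and ordinary least squares and is not the author's actual procedure.

```python
import numpy as np

# Illustration of the two validation strategies compared above for a limited
# sampling model (LSM): a separate validation group versus leave-one-out
# cross-validation. The LSM here is an ordinary least-squares regression
# predicting AUC from two concentration time points; all data are synthetic.

rng = np.random.default_rng(0)
n = 40
c1 = rng.uniform(2, 10, n)                         # concentration at t1
c2 = rng.uniform(1, 6, n)                          # concentration at t2
auc = 3.0 * c1 + 5.0 * c2 + rng.normal(0, 2, n)    # "true" AUC with noise
X = np.column_stack([np.ones(n), c1, c2])

def rmse(y, yhat):
    return float(np.sqrt(np.mean((y - yhat) ** 2)))

# (a) separate validation group: fit on the first 25 patients, validate on the last 15
coef = np.linalg.lstsq(X[:25], auc[:25], rcond=None)[0]
rmse_holdout = rmse(auc[25:], X[25:] @ coef)

# (b) leave-one-out cross-validation on the full data set
preds = []
for i in range(n):
    mask = np.arange(n) != i
    c = np.linalg.lstsq(X[mask], auc[mask], rcond=None)[0]
    preds.append(X[i] @ c)
rmse_loo = rmse(auc, np.array(preds))

print(f"holdout RMSE: {rmse_holdout:.2f}, leave-one-out RMSE: {rmse_loo:.2f}")
```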

  1. Validation of Structures in the Protein Data Bank.

    Science.gov (United States)

    Gore, Swanand; Sanz García, Eduardo; Hendrickx, Pieter M S; Gutmanas, Aleksandras; Westbrook, John D; Yang, Huanwang; Feng, Zukang; Baskaran, Kumaran; Berrisford, John M; Hudson, Brian P; Ikegawa, Yasuyo; Kobayashi, Naohiro; Lawson, Catherine L; Mading, Steve; Mak, Lora; Mukhopadhyay, Abhik; Oldfield, Thomas J; Patwardhan, Ardan; Peisach, Ezra; Sahni, Gaurav; Sekharan, Monica R; Sen, Sanchayita; Shao, Chenghua; Smart, Oliver S; Ulrich, Eldon L; Yamashita, Reiko; Quesada, Martha; Young, Jasmine Y; Nakamura, Haruki; Markley, John L; Berman, Helen M; Burley, Stephen K; Velankar, Sameer; Kleywegt, Gerard J

    2017-12-05

    The Worldwide PDB recently launched a deposition, biocuration, and validation tool: OneDep. At various stages of OneDep data processing, validation reports for three-dimensional structures of biological macromolecules are produced. These reports are based on recommendations of expert task forces representing crystallography, nuclear magnetic resonance, and cryoelectron microscopy communities. The reports provide useful metrics with which depositors can evaluate the quality of the experimental data, the structural model, and the fit between them. The validation module is also available as a stand-alone web server and as a programmatically accessible web service. A growing number of journals require the official wwPDB validation reports (produced at biocuration) to accompany manuscripts describing macromolecular structures. Upon public release of the structure, the validation report becomes part of the public PDB archive. Geometric quality scores for proteins in the PDB archive have improved over the past decade. Copyright © 2017 The Authors. Published by Elsevier Ltd. All rights reserved.

  2. Estimates of selection parameters in protein mutants of spring barley

    International Nuclear Information System (INIS)

    Gaul, H.; Walther, H.; Seibold, K.H.; Brunner, H.; Mikaelsen, K.

    1976-01-01

    Detailed studies have been made with induced protein mutants regarding a possible genetic advance in selection, including the estimation of the genetic variation and heritability coefficients. Estimates were obtained for protein content and protein yield. The variation of mutant lines in different environments was found to be many times as large as the variation of the line means. The detection of improved protein mutants therefore seems possible only in trials with more than one environment. The heritability of protein content and protein yield was estimated in different sets of environments and was found to be low. However, higher values were found with an increasing number of environments. At least four environments seem to be necessary to obtain reliable heritability estimates. The genetic component of the variation between lines was significant for protein content in all environmental combinations. For protein yield, only some environmental combinations showed significant differences. The expected genetic advance with one selection step was small for both protein traits. Genetically significant differences between protein micromutants give, however, a first indication that selection among protein mutants with small differences also seems possible. (author)
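
    The quantities estimated in this study, heritability from variance components across environments and the expected genetic advance from one cycle of selection, follow textbook quantitative-genetics formulas; a minimal sketch with hypothetical variance components is given below.

```python
import math

# Sketch of the textbook formulas behind the estimates discussed above:
# heritability on a line-mean basis across environments, and expected genetic
# advance from one cycle of selection (breeder's equation, R = i * h^2 * sigma_P).
# The variance components and selection intensity below are hypothetical.

def heritability_line_mean(var_genetic, var_gxe, var_error, n_env, n_reps):
    var_phenotypic_mean = var_genetic + var_gxe / n_env + var_error / (n_env * n_reps)
    return var_genetic / var_phenotypic_mean, var_phenotypic_mean

def genetic_advance(h2, var_phenotypic_mean, selection_intensity=2.06):
    # i = 2.06 corresponds to selecting roughly the top 5%
    return selection_intensity * h2 * math.sqrt(var_phenotypic_mean)

h2, vp = heritability_line_mean(var_genetic=0.15, var_gxe=0.60, var_error=0.90,
                                n_env=4, n_reps=3)
print(f"h2 = {h2:.2f}, expected advance = {genetic_advance(h2, vp):.2f} (trait units)")
```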

  3. Urine protein concentration estimation for biomarker discovery

    OpenAIRE

    Mistry, Hiten D.; Bramham, Kate; Weston, Andrew; Ward, Malcolm; Thompson, Andrew; Chappell, Lucy C.

    2013-01-01

    Recent advances have been made in the study of urinary proteomics as a diagnostic tool for renal disease and pre-eclampsia, which requires accurate measurement of urinary protein. We compared different protein assays (Bicinchoninic acid (BCA), Lowry and Bradford) against the 'gold standard' amino-acid assay in urine from 43 women (8 non-pregnant, 34 pregnant, including 8 with pre-eclampsia). The BCA assay was superior to both the Lowry and Bradford assays (Bland-Altman bias: 0.08) compared to amino-aci...

  4. Validation of equations for pleural effusion volume estimation by ultrasonography.

    Science.gov (United States)

    Hassan, Maged; Rizk, Rana; Essam, Hatem; Abouelnour, Ahmed

    2017-12-01

    To validate the accuracy of previously published equations that estimate pleural effusion volume using ultrasonography. Only equations using simple measurements were tested. Three measurements were taken at the posterior axillary line for each case with effusion: lateral height of effusion (H), distance between collapsed lung and chest wall (C) and distance between lung and diaphragm (D). Cases whose effusion was aspirated to dryness were included and drained volume was recorded. The intra-class correlation coefficient (ICC) was used to determine the predictive accuracy of five equations against the actual volume of aspirated effusion. 46 cases with effusion were included. The most accurate equation in predicting effusion volume was (H + D) × 70 (ICC 0.83). The simplest and yet accurate equation was H × 100 (ICC 0.79). Pleural effusion height measured by ultrasonography gives a reasonable estimate of effusion volume. Incorporating the distance between lung base and diaphragm into the estimation improves accuracy from 79% with the first method to 83% with the latter.
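
    The two equations singled out in the abstract are simple enough to apply at the bedside; the sketch below implements them with hypothetical ultrasound measurements (distances in cm, volumes in mL).

```python
# The two equations highlighted in the abstract, applied to hypothetical
# ultrasound measurements. H = lateral height of the effusion and D = distance
# between lung base and diaphragm, both in cm; estimated volumes are in mL.

def volume_h_plus_d(h_cm, d_cm):
    """Most accurate equation in the study: V = (H + D) * 70 (ICC 0.83)."""
    return (h_cm + d_cm) * 70.0

def volume_h_only(h_cm):
    """Simplest accurate equation: V = H * 100 (ICC 0.79)."""
    return h_cm * 100.0

h, d = 8.0, 3.5   # hypothetical measurements, cm
print(volume_h_plus_d(h, d), volume_h_only(h))   # 805.0 800.0 mL
```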

  5. Well-founded cost estimation validated by experience

    International Nuclear Information System (INIS)

    LaGuardia, T.S.

    2005-01-01

    to build consistency into its cost estimates. A standardized list of decommissioning activities needs to be adopted internationally so estimates can be prepared on a consistent basis, and to facilitate tracking of actual costs against the estimate. The OECD/NEA Standardized List incorporates the consensus of international experts as to the elements of cost and activities that should be included in the estimate. A significant effort was made several years ago to promote universal adoption of this standard. Using the standardized list of activities as a template, a questionnaire was distributed to gather actual decommissioning costs (and other parameters) from international projects. The results of cost estimate contributions from many countries were analyzed and evaluated as to reactor types, decommissioning strategies, cost drivers, and waste disposal quantities. The results were reported in the literature. A standardized list of activities will only be valuable if the underlying cost elements and methodology are clearly identified in the estimate. While no one would expect perfect correlation of every element of cost in a large project estimate-versus-actual cost comparison, the variances should be visible so the basis for the difference can be examined and evaluated. For the nuclear power industry to grow to meet the increasing demand for electricity, the investors, regulators and the public must understand the total cost of the nuclear fuel cycle. The costs for decommissioning and the funding requirements to provide for safe closure and dismantling of these units are well recognized to represent a significant liability to the owner utilities and governmental agencies. Owners and government regulatory agencies need benchmarked decommissioning costs to test the validity of each proposed cost and funding request. The benchmarking process requires the oversight of decommissioning experts to evaluate contributed cost data in a meaningful manner. An international

  6. Probabilistic validation of protein NMR chemical shift assignments

    International Nuclear Information System (INIS)

    Dashti, Hesam; Tonelli, Marco; Lee, Woonghee; Westler, William M.; Cornilescu, Gabriel; Ulrich, Eldon L.; Markley, John L.

    2016-01-01

    Data validation plays an important role in ensuring the reliability and reproducibility of studies. NMR investigations of the functional properties, dynamics, chemical kinetics, and structures of proteins depend critically on the correctness of chemical shift assignments. We present a novel probabilistic method named ARECA for validating chemical shift assignments that relies on the nuclear Overhauser effect data. ARECA has been evaluated through its application to 26 case studies and has been shown to be complementary to, and usually more reliable than, approaches based on chemical shift databases. ARECA is available online at http://areca.nmrfam.wisc.edu/

  7. Probabilistic validation of protein NMR chemical shift assignments

    Energy Technology Data Exchange (ETDEWEB)

    Dashti, Hesam [University of Wisconsin-Madison, Graduate Program in Biophysics, Biochemistry Department (United States); Tonelli, Marco; Lee, Woonghee; Westler, William M.; Cornilescu, Gabriel [University of Wisconsin-Madison, Biochemistry Department, National Magnetic Resonance Facility at Madison (United States); Ulrich, Eldon L. [University of Wisconsin-Madison, BioMagResBank, Biochemistry Department (United States); Markley, John L., E-mail: markley@nmrfam.wisc.edu, E-mail: jmarkley@wisc.edu [University of Wisconsin-Madison, Biochemistry Department, National Magnetic Resonance Facility at Madison (United States)

    2016-01-15

    Data validation plays an important role in ensuring the reliability and reproducibility of studies. NMR investigations of the functional properties, dynamics, chemical kinetics, and structures of proteins depend critically on the correctness of chemical shift assignments. We present a novel probabilistic method named ARECA for validating chemical shift assignments that relies on the nuclear Overhauser effect data. ARECA has been evaluated through its application to 26 case studies and has been shown to be complementary to, and usually more reliable than, approaches based on chemical shift databases. ARECA is available online at http://areca.nmrfam.wisc.edu/.

  8. Identification and validation of novel small proteins in Pseudomonas putida

    DEFF Research Database (Denmark)

    Yang, Xiaochen; Ingemann Jensen, Sheila; Wulff, Tune

    2016-01-01

    Small proteins of fifty amino acids or less have been understudied due to difficulties that impede their annotation and detection. In order to obtain information on small open reading frames (sORFs) in P. putida, bioinformatic and proteomic approaches were used to identify putative small open...... reading frames (sORFs) in the well-characterized strain KT2440. A plasmid-based system was established for sORF validation, enabling expression of C-terminal sequential peptide affinity (SPA) tagged variants and their detection via protein immunoblotting. Out of 22 tested putative sORFs, the expression...... of fourteen sORFs was confirmed, where all except one are novel. All of the validated sORFs except one are located adjacent to annotated genes on the same strand and three are in close proximity to genes with known functions. These include an ABC transporter operon and the two transcriptional regulators Fis...

  9. Validation of Persian rapid estimate of adult literacy in dentistry.

    Science.gov (United States)

    Pakpour, Amir H; Lawson, Douglas M; Tadakamadla, Santosh K; Fridlund, Bengt

    2016-05-01

    The aim of the present study was to establish the psychometric properties of the Rapid Estimate of adult Literacy in Dentistry-99 (REALD-99) in the Persian language for use in an Iranian population (IREALD-99). A total of 421 participants with a mean age of 28 years (59% male) were included in the study. Participants included those who were 18 years or older and those residing in Quazvin (a city close to Tehran), Iran. A forward-backward translation process was used for the IREALD-99. The Test of Functional Health Literacy in Dentistry (TOFHLiD) was also administered. The validity of the IREALD-99 was investigated by comparing the IREALD-99 across the categories of education and income levels. To further investigate, the correlation of IREALD-99 with TOFHLiD was computed. A principal component analysis (PCA) was performed on the data to assess unidimensionality and strong first factor. The Rasch mathematical model was used to evaluate the contribution of each item to the overall measure, and whether the data were invariant to differences in sex. Reliability was estimated with Cronbach's α and test-retest correlation. Cronbach's alpha for the IREALD-99 was 0.98, indicating strong internal consistency. The test-retest correlation was 0.97. IREALD-99 scores differed by education levels. IREALD-99 scores were positively related to TOFHLiD scores (rho = 0.72, P < 0.01). In addition, IREALD-99 showed positive correlation with self-rated oral health status (rho = 0.31, P < 0.01) as evidence of convergent validity. The PCA indicated a strong first component, five times the strength of the second component and nine times the third. The empirical data were a close fit with the Rasch mathematical model. There was not a significant difference in scores with respect to income level (P = 0.09), and only the very lowest income level was significantly different (P < 0.01). The IREALD-99 exhibited excellent reliability on repeated administrations, as well as internal

  10. Validation of estimated glomerular filtration rate equations for Japanese children.

    Science.gov (United States)

    Gotoh, Yoshimitsu; Uemura, Osamu; Ishikura, Kenji; Sakai, Tomoyuki; Hamasaki, Yuko; Araki, Yoshinori; Hamda, Riku; Honda, Masataka

    2018-01-25

    The gold standard for evaluation of kidney function is renal inulin clearance (Cin). However, the methodology for Cin is complicated and difficult, especially for younger children and/or patients with bladder dysfunction. Therefore, we developed a simple and easier method for obtaining the estimated glomerular filtration rate (eGFR) using equations and values for several biomarkers, i.e., serum creatinine (Cr), serum cystatin C (cystC), serum beta-2 microglobulin (β2MG), and creatinine clearance (Ccr). The purpose of the present study was to validate these equations with a new data set. To validate each equation, we used data of 140 patients with CKD with clinical need for Cin, using the measured GFR (mGFR). We compared the results for each eGFR equation with the mGFR using mean error (ME), root mean square error (RMSE), P30, and Bland-Altman analysis. The ME of the Cr-, cystC-, β2MG-, and Ccr-based eGFR was 15.8 ± 13.0, 17.2 ± 16.5, 15.4 ± 14.3, and 10.6 ± 13.0 ml/min/1.73 m^2, respectively. The RMSE was 29.5, 23.8, 20.9, and 16.7, respectively. The P30 was 79.4, 71.1, 69.5, and 92.9%, respectively. The Bland-Altman bias analysis showed values of 4.0 ± 18.6, 5.3 ± 16.8, 12.7 ± 17.0, and 2.5 ± 17.2 ml/min/1.73 m^2, respectively, for these parameters. The bias of each eGFR equation was not large. Therefore, each eGFR equation could be used.
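
    The four agreement statistics used in this validation (ME, RMSE, P30 and Bland-Altman bias) are easy to reproduce given paired estimated and measured GFR values; a minimal sketch with hypothetical data follows.

```python
import numpy as np

# Sketch of the agreement statistics reported above for an eGFR equation against
# measured GFR (mGFR): mean error (ME), root mean square error (RMSE), P30
# (share of estimates within +/-30% of mGFR) and the Bland-Altman bias.
# The paired values below are hypothetical, in ml/min/1.73 m^2.

egfr = np.array([52.0, 75.0, 33.0, 90.0, 61.0])
mgfr = np.array([48.0, 66.0, 40.0, 81.0, 59.0])

err = egfr - mgfr
me = err.mean()
rmse = float(np.sqrt(np.mean(err ** 2)))
p30 = float(np.mean(np.abs(err) / mgfr <= 0.30) * 100)
bias, bias_sd = me, err.std(ddof=1)   # Bland-Altman bias and its standard deviation

print(f"ME = {me:.1f}, RMSE = {rmse:.1f}, P30 = {p30:.0f}%, bias = {bias:.1f} +/- {bias_sd:.1f}")
```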

  11. Targeted estimation of nuisance parameters to obtain valid statistical inference.

    Science.gov (United States)

    van der Laan, Mark J

    2014-01-01

    In order to obtain concrete results, we focus on estimation of the treatment specific mean, controlling for all measured baseline covariates, based on observing independent and identically distributed copies of a random variable consisting of baseline covariates, a subsequently assigned binary treatment, and a final outcome. The statistical model only assumes possible restrictions on the conditional distribution of treatment, given the covariates, the so-called propensity score. Estimators of the treatment specific mean involve estimation of the propensity score and/or estimation of the conditional mean of the outcome, given the treatment and covariates. In order to make these estimators asymptotically unbiased at any data distribution in the statistical model, it is essential to use data-adaptive estimators of these nuisance parameters such as ensemble learning, and specifically super-learning. Because such estimators involve optimal trade-off of bias and variance w.r.t. the infinite dimensional nuisance parameter itself, they result in a sub-optimal bias/variance trade-off for the resulting real-valued estimator of the estimand. We demonstrate that additional targeting of the estimators of these nuisance parameters guarantees that this bias for the estimand is second order and thereby allows us to prove theorems that establish asymptotic linearity of the estimator of the treatment specific mean under regularity conditions. These insights result in novel targeted minimum loss-based estimators (TMLEs) that use ensemble learning with additional targeted bias reduction to construct estimators of the nuisance parameters. In particular, we construct collaborative TMLEs (C-TMLEs) with known influence curve allowing for statistical inference, even though these C-TMLEs involve variable selection for the propensity score based on a criterion that measures how effective the resulting fit of the propensity score is in removing bias for the estimand. As a particular special

  12. Observers for vehicle tyre/road forces estimation: experimental validation

    Science.gov (United States)

    Doumiati, M.; Victorino, A.; Lechner, D.; Baffet, G.; Charara, A.

    2010-11-01

    The motion of a vehicle is governed by the forces generated between the tyres and the road. Knowledge of these vehicle dynamic variables is important for vehicle control systems that aim to enhance vehicle stability and passenger safety. This study introduces a new estimation process for tyre/road forces. It presents many benefits over the existing state-of-art works, within the dynamic estimation framework. One of these major contributions consists of discussing in detail the vertical and lateral tyre forces at each tyre. The proposed method is based on the dynamic response of a vehicle instrumented with potentially integrated sensors. The estimation process is separated into two principal blocks. The role of the first block is to estimate vertical tyre forces, whereas in the second block two observers are proposed and compared for the estimation of lateral tyre/road forces. The different observers are based on a prediction/estimation Kalman filter. The performance of this concept is tested and compared with real experimental data using a laboratory car. Experimental results show that the proposed approach is a promising technique to provide accurate estimation. Thus, it can be considered as a practical low-cost solution for calculating vertical and lateral tyre/road forces.

  13. Validity of Submaximal Cycle Ergometry for Estimating Aerobic Capacity

    National Research Council Canada - National Science Library

    Myhre, Loren

    1998-01-01

    ... that allows early selection of the most appropriate test work load. A computerized version makes it possible for non-trained personnel to safely administer this test for estimating aerobic capacity...

  14. Validation of Transverse Oscillation Vector Velocity Estimation In-Vivo

    DEFF Research Database (Denmark)

    Hansen, Kristoffer Lindskov; Udesen, Jesper; Thomsen, Carsten

    2007-01-01

    Conventional Doppler methods for blood velocity estimation only estimate the velocity component along the ultrasound (US) beam direction. This implies that a Doppler angle under examination close to 90deg results in unreliable information about the true blood direction and blood velocity. The novel...... the presented angle independent 2-D vector velocity method. The results give reason to believe that the TO method can be a useful alternative to conventional Doppler systems bringing forth new information to the US examination of blood flow....

  15. On the validity of time-dependent AUC estimators.

    Science.gov (United States)

    Schmid, Matthias; Kestler, Hans A; Potapov, Sergej

    2015-01-01

    Recent developments in molecular biology have led to the massive discovery of new marker candidates for the prediction of patient survival. To evaluate the predictive value of these markers, statistical tools for measuring the performance of survival models are needed. We consider estimators of discrimination measures, which are a popular approach to evaluate survival predictions in biomarker studies. Estimators of discrimination measures are usually based on regularity assumptions such as the proportional hazards assumption. Based on two sets of molecular data and a simulation study, we show that violations of the regularity assumptions may lead to over-optimistic estimates of prediction accuracy and may therefore result in biased conclusions regarding the clinical utility of new biomarkers. In particular, we demonstrate that biased medical decision making is possible even if statistical checks indicate that all regularity assumptions are satisfied. © The Author 2013. Published by Oxford University Press. For Permissions, please email: journals.permissions@oup.com.

  16. Estimate of body composition by Hume's equation: validation with DXA.

    Science.gov (United States)

    Carnevale, Vincenzo; Piscitelli, Pamela Angela; Minonne, Rita; Castriotta, Valeria; Cipriani, Cristiana; Guglielmi, Giuseppe; Scillitani, Alfredo; Romagnoli, Elisabetta

    2015-05-01

    We investigated how the Hume's equation, using the antipyrine space, could perform in estimating fat mass (FM) and lean body mass (LBM). In 100 (40 male ad 60 female) subjects, we estimated FM and LBM by the equation and compared these values with those measured by a last generation DXA device. The correlation coefficients between measured and estimated FM were r = 0.940 (p LBM were r = 0.913 (p LBM, though the equation underestimated FM and overestimated LBM in respect to DXA. The mean difference for FM was 1.40 kg (limits of agreement of -6.54 and 8.37 kg). For LBM, the mean difference in respect to DXA was 1.36 kg (limits of agreement -8.26 and 6.52 kg). The root mean square error was 3.61 kg for FM and 3.56 kg for LBM. Our results show that in clinically stable subjects the Hume's equation could reliably assess body composition, and the estimated FM and LBM approached those measured by a modern DXA device.

  17. Estimation of the 24-h urinary protein excretion based on the estimated urinary creatinine output.

    Science.gov (United States)

    Ubukata, Masamitsu; Takei, Takashi; Nitta, Kosaku

    2016-06-01

    The urinary protein/creatinine ratio [Up/Ucr (g/gCr)] has been used in the clinical management of patients with chronic kidney disease (CKD). However, a discrepancy is often noted between the Up/Ucr and the 24-h urinary protein excretion [24hUp (g/day)] in patients with extremes of muscle mass. We devised a method for precise estimation of the 24-h urinary protein excretion (E-24hUp) based on estimation of the 24-h urinary creatinine output (E-24hCr). Three parameters, spot Up/Ucr, 24hUp and E-24hUp (=Up/Ucr × E-24hCr), were determined in 116 adult patients with CKD. The correlations among the groups were analyzed. There was a significant correlation between the Up/Ucr and 24hUp (p high urinary protein group (>3.5 g/day). There was a significant correlation between the Up/Ucr and 24hUp in the low (p = 0.04) and high urinary protein (p = 0.01) groups, whereas the correlation coefficient was lower in the intermediate urinary protein (p = 0.07) group. Thus, we found a significant correlation between 24hUp and E-24hUp in the study population overall (p high urinary protein group (p < 0.001). We conclude that a poor correlation exists between the Up/Ucr and 24hUp in patients with intermediate urinary protein excretion levels. The recommended parameter for monitoring proteinuria in such patients may be the E-24hUp, which is calculated using the E-24hCr.
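
    The estimator the paper evaluates is a simple product: E-24hUp = (Up/Ucr) × E-24hCr. The sketch below takes the estimated 24-h creatinine output as an input, since the prediction equation used to obtain it is not given in this record; the input values are hypothetical.

```python
# Sketch of the estimator evaluated above:
#   E-24hUp (g/day) = spot Up/Ucr (g/gCr) * E-24hCr (g/day)
# The anthropometric prediction equation used for E-24hCr is not given in this
# record, so it is passed in as an input here; the values are hypothetical.

def estimated_24h_urinary_protein(up_ucr_g_per_gcr, est_24h_creatinine_g):
    return up_ucr_g_per_gcr * est_24h_creatinine_g

print(estimated_24h_urinary_protein(up_ucr_g_per_gcr=1.8, est_24h_creatinine_g=1.2))
# 2.16 g/day
```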

  18. Importance of Statistical Evidence in Estimating Valid DEA Scores.

    Science.gov (United States)

    Barnum, Darold T; Johnson, Matthew; Gleason, John M

    2016-03-01

    Data Envelopment Analysis (DEA) allows healthcare scholars to measure productivity in a holistic manner. It combines a production unit's multiple outputs and multiple inputs into a single measure of its overall performance relative to other units in the sample being analyzed. It accomplishes this task by aggregating a unit's weighted outputs and dividing the output sum by the unit's aggregated weighted inputs, choosing output and input weights that maximize its output/input ratio when the same weights are applied to other units in the sample. Conventional DEA assumes that inputs and outputs are used in different proportions by the units in the sample. So, for the sample as a whole, inputs have been substituted for each other and outputs have been transformed into each other. Variables are assigned different weights based on their marginal rates of substitution and marginal rates of transformation. If in truth inputs have not been substituted nor outputs transformed, then there will be no marginal rates and therefore no valid basis for differential weights. This paper explains how to statistically test for the presence of substitutions among inputs and transformations among outputs. Then, it applies these tests to the input and output data from three healthcare DEA articles, in order to identify the effects on DEA scores when input substitutions and output transformations are absent in the sample data. It finds that DEA scores are badly biased when substitution and transformation are absent and conventional DEA models are used.
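
    The weighted output/input ratio described above is the classical CCR ratio form of DEA; a standard statement of that model (not specific to the healthcare studies re-analysed in this paper) is sketched below, where unit 0 is the unit being scored, y_rj and x_ij are the outputs and inputs of unit j, and u_r, v_i are the weights.

```latex
% Classical CCR ratio form of DEA for the unit being evaluated (unit 0).
\[
\theta_0 \;=\; \max_{u,\,v}\;
  \frac{\sum_{r} u_r\, y_{r0}}{\sum_{i} v_i\, x_{i0}}
\quad\text{subject to}\quad
  \frac{\sum_{r} u_r\, y_{rj}}{\sum_{i} v_i\, x_{ij}} \le 1 \;\;\forall j,
\qquad u_r \ge 0,\; v_i \ge 0 .
\]
```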

  19. Catalytic hydrolysis of ammonia borane: Intrinsic parameter estimation and validation

    Energy Technology Data Exchange (ETDEWEB)

    Basu, S.; Gore, J.P. [School of Mechanical Engineering, Purdue University, West Lafayette, IN 47907-2088 (United States); School of Chemical Engineering, Purdue University, West Lafayette, IN 47907-2100 (United States); Energy Center in Discovery Park, Purdue University, West Lafayette, IN 47907-2022 (United States); Zheng, Y. [School of Mechanical Engineering, Purdue University, West Lafayette, IN 47907-2088 (United States); Energy Center in Discovery Park, Purdue University, West Lafayette, IN 47907-2022 (United States); Varma, A.; Delgass, W.N. [School of Chemical Engineering, Purdue University, West Lafayette, IN 47907-2100 (United States); Energy Center in Discovery Park, Purdue University, West Lafayette, IN 47907-2022 (United States)

    2010-04-02

    Ammonia borane (AB) hydrolysis is a potential process for on-board hydrogen generation. This paper presents isothermal hydrogen release rate measurements of dilute AB (1 wt%) hydrolysis in the presence of a carbon-supported ruthenium catalyst (Ru/C). The ranges of investigated catalyst particle sizes and temperature were 20-181 µm and 26-56 °C, respectively. The obtained rate data included both kinetic and diffusion-controlled regimes, where the latter was evaluated using the catalyst effectiveness approach. A Langmuir-Hinshelwood kinetic model was adopted to interpret the data, with intrinsic kinetic and diffusion parameters determined by a nonlinear fitting algorithm. The AB hydrolysis was found to have an activation energy of 60.4 kJ mol^-1, pre-exponential factor of 1.36 x 10^10 mol (kg-cat)^-1 s^-1, adsorption energy of -32.5 kJ mol^-1, and effective mass diffusion coefficient of 2 x 10^-10 m^2 s^-1. These parameters, obtained under dilute AB conditions, were validated by comparing measurements with simulations of AB consumption rates during the hydrolysis of concentrated AB solutions (5-20 wt%), and also with the axial temperature distribution in a 0.5 kW continuous-flow packed-bed reactor. (author)
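
    A rough sketch of how the reported parameters enter a Langmuir-Hinshelwood rate expression is given below. The activation energy, pre-exponential factor and adsorption energy are taken from the abstract; the pre-exponential factor of the adsorption equilibrium constant is not reported, so the value used here is an assumed placeholder, and the exact rate form used by the authors may differ.

```python
import math

# Rough sketch of a Langmuir-Hinshelwood rate for ammonia borane (AB) hydrolysis:
#   r = k * K * C_AB / (1 + K * C_AB),  k = A * exp(-Ea/RT),  K = K0 * exp(-dH_ads/RT)
# A, EA and DH_ADS are taken from the abstract; K0 is not reported there, so the
# value below is an assumed placeholder, and the authors' exact rate form may differ.

R = 8.314          # J mol^-1 K^-1
A = 1.36e10        # mol (kg-cat)^-1 s^-1
EA = 60.4e3        # J mol^-1
DH_ADS = -32.5e3   # J mol^-1
K0 = 1.0e-3        # m^3 mol^-1, assumed placeholder

def lh_rate(conc_ab_mol_m3, temp_k):
    k = A * math.exp(-EA / (R * temp_k))        # Arrhenius rate constant
    K = K0 * math.exp(-DH_ADS / (R * temp_k))   # adsorption equilibrium constant
    return k * K * conc_ab_mol_m3 / (1.0 + K * conc_ab_mol_m3)   # mol (kg-cat)^-1 s^-1

print(f"{lh_rate(conc_ab_mol_m3=320.0, temp_k=313.15):.3f} mol (kg-cat)^-1 s^-1")
```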

  20. Development and validation of satellite based estimates of surface visibility

    Science.gov (United States)

    Brunner, J.; Pierce, R. B.; Lenzen, A.

    2015-10-01

    A satellite based surface visibility retrieval has been developed using Moderate Resolution Imaging Spectroradiometer (MODIS) measurements as a proxy for Advanced Baseline Imager (ABI) data from the next generation of Geostationary Operational Environmental Satellites (GOES-R). The retrieval uses a multiple linear regression approach to relate satellite aerosol optical depth, fog/low cloud probability and thickness retrievals, and meteorological variables from numerical weather prediction forecasts to National Weather Service Automated Surface Observing System (ASOS) surface visibility measurements. Validation using independent ASOS measurements shows that the GOES-R ABI surface visibility retrieval (V) has an overall success rate of 64.5% for classifying Clear (V ≥ 30 km), Moderate (10 km ≤ V United States Environmental Protection Agency (EPA) and National Park Service (NPS) Interagency Monitoring of Protected Visual Environments (IMPROVE) network, and provide useful information to the regional planning offices responsible for developing mitigation strategies required under the EPA's Regional Haze Rule, particularly during regional haze events associated with smoke from wildfires.

  1. Validity and practicability of smartphone-based photographic food records for estimating energy and nutrient intake.

    Science.gov (United States)

    Kong, Kaimeng; Zhang, Lulu; Huang, Lisu; Tao, Yexuan

    2017-05-01

    Image-assisted dietary assessment methods are frequently used to record individual eating habits. This study tested the validity of a smartphone-based photographic food recording approach by comparing the results obtained with those of a weighed food record. We also assessed the practicality of the method by using it to measure the energy and nutrient intake of college students. The experiment was implemented in two phases, each lasting 2 weeks. In the first phase, a labelled menu and a photograph database were constructed. The energy and nutrient content of 31 randomly selected dishes in three different portion sizes were then estimated by the photograph-based method and compared with a weighed food record. In the second phase, we combined the smartphone-based photographic method with the WeChat smartphone application and applied this to 120 randomly selected participants to record their energy and nutrient intake. The Pearson correlation coefficients for energy, protein, fat, and carbohydrate content between the weighed and the photographic food record were 0.997, 0.936, 0.996, and 0.999, respectively. Bland-Altman plots showed good agreement between the two methods. The estimated protein, fat, and carbohydrate intake by participants was in accordance with values in the Chinese Residents' Nutrition and Chronic Disease report (2015). Participants expressed satisfaction with the new approach and the compliance rate was 97.5%. The smartphone-based photographic dietary assessment method combined with the WeChat instant messaging application was effective and practical for use by young people.

  2. Uncertainty and validation. Effect of user interpretation on uncertainty estimates

    International Nuclear Information System (INIS)

    Kirchner, G.; Peterson, R.

    1996-11-01

    Uncertainty in predictions of environmental transfer models arises from, among other sources, the adequacy of the conceptual model, the approximations made in coding the conceptual model, the quality of the input data, the uncertainty in parameter values, and the assumptions made by the user. In recent years efforts to quantify the confidence that can be placed in predictions have been increasing, but have concentrated on a statistical propagation of the influence of parameter uncertainties on the calculational results. The primary objective of this Working Group of BIOMOVS II was to test user's influence on model predictions on a more systematic basis than has been done before. The main goals were as follows: To compare differences between predictions from different people all using the same model and the same scenario description with the statistical uncertainties calculated by the model. To investigate the main reasons for different interpretations by users. To create a better awareness of the potential influence of the user on the modeling results. Terrestrial food chain models driven by deposition of radionuclides from the atmosphere were used. Three codes were obtained and run with three scenarios by a maximum of 10 users. A number of conclusions can be drawn some of which are general and independent of the type of models and processes studied, while others are restricted to the few processes that were addressed directly: For any set of predictions, the variation in best estimates was greater than one order of magnitude. Often the range increased from deposition to pasture to milk probably due to additional transfer processes. The 95% confidence intervals about the predictions calculated from the parameter distributions prepared by the participants did not always overlap the observations; similarly, sometimes the confidence intervals on the predictions did not overlap. Often the 95% confidence intervals of individual predictions were smaller than the

  3. Uncertainty and validation. Effect of user interpretation on uncertainty estimates

    Energy Technology Data Exchange (ETDEWEB)

    Kirchner, G. [Univ. of Bremen (Germany)]; Peterson, R. [AECL, Chalk River, ON (Canada)] [and others]

    1996-11-01

    Uncertainty in predictions of environmental transfer models arises from, among other sources, the adequacy of the conceptual model, the approximations made in coding the conceptual model, the quality of the input data, the uncertainty in parameter values, and the assumptions made by the user. In recent years efforts to quantify the confidence that can be placed in predictions have been increasing, but have concentrated on a statistical propagation of the influence of parameter uncertainties on the calculational results. The primary objective of this Working Group of BIOMOVS II was to test the user's influence on model predictions on a more systematic basis than has been done before. The main goals were as follows: To compare differences between predictions from different people all using the same model and the same scenario description with the statistical uncertainties calculated by the model. To investigate the main reasons for different interpretations by users. To create a better awareness of the potential influence of the user on the modeling results. Terrestrial food chain models driven by deposition of radionuclides from the atmosphere were used. Three codes were obtained and run with three scenarios by a maximum of 10 users. A number of conclusions can be drawn, some of which are general and independent of the type of models and processes studied, while others are restricted to the few processes that were addressed directly: For any set of predictions, the variation in best estimates was greater than one order of magnitude. Often the range increased from deposition to pasture to milk, probably due to additional transfer processes. The 95% confidence intervals about the predictions calculated from the parameter distributions prepared by the participants did not always overlap the observations; similarly, sometimes the confidence intervals on the predictions did not overlap. Often the 95% confidence intervals of individual predictions were smaller than the

  4. Uncertainty and validation. Effect of model complexity on uncertainty estimates

    International Nuclear Information System (INIS)

    Elert, M.

    1996-09-01

    deterministic case, and the uncertainty bands did not always overlap. This suggests that there are considerable model uncertainties present, which were not considered in this study. Concerning possible constraints in the application domain of different models, the results of this exercise suggest that if only the evolution of the root zone concentration is to be predicted, all of the studied models give comparable results. However, if the flux to the groundwater is also to be predicted, considerably more detail is needed in the model and its parameterization. This applies to the hydrological as well as the transport modelling. The difference in model predictions and the magnitude of uncertainty was quite small for some of the end-points predicted, while for others it could span many orders of magnitude. Of special importance were end-points where delay in the soil was involved, e.g. release to the groundwater. In such cases the influence of radioactive decay gave rise to strongly non-linear effects. The work in the subgroup has provided many valuable insights into the effects of model simplifications, e.g. discretization in the model, averaging of the time-varying input parameters and the assignment of uncertainties to parameters. The conclusions that have been drawn concerning these are primarily valid for the studied scenario. However, we believe that they are to a large extent also generally applicable. The subgroup has had many opportunities to study the pitfalls involved in model comparison. The intention was to provide a well-defined scenario for the subgroup, but despite several iterations, misunderstandings and ambiguities remained. The participants have been forced to scrutinize their models to try to explain differences in the predictions, and most, if not all, of the participants have improved their models as a result of this

  5. Valid and efficient manual estimates of intracranial volume from magnetic resonance images

    International Nuclear Information System (INIS)

    Klasson, Niklas; Olsson, Erik; Rudemo, Mats; Eckerström, Carl; Malmgren, Helge; Wallin, Anders

    2015-01-01

    Manual segmentations of the whole intracranial vault in high-resolution magnetic resonance images are often regarded as very time-consuming. Therefore it is common to only segment a few linearly spaced intracranial areas to estimate the whole volume. The purpose of the present study was to evaluate how the validity of intracranial volume estimates is affected by the chosen interpolation method, orientation of the intracranial areas and the linear spacing between them. Intracranial volumes were manually segmented on 62 participants from the Gothenburg MCI study using 1.5 T, T1-weighted magnetic resonance images. Estimates of the intracranial volumes were then derived using subsamples of linearly spaced coronal, sagittal or transversal intracranial areas from the same volumes. The subsamples of intracranial areas were interpolated into volume estimates by three different interpolation methods. The linear spacing between the intracranial areas ranged from 2 to 50 mm and the validity of the estimates was determined by comparison with the entire intracranial volumes. A progressive decrease in intra-class correlation and an increase in percentage error could be seen with increased linear spacing between intracranial areas. With small linear spacing (≤15 mm), orientation of the intracranial areas and interpolation method had negligible effects on the validity. With larger linear spacing, the best validity was achieved using cubic spline interpolation with either coronal or sagittal intracranial areas. Even at a linear spacing of 50 mm, cubic spline interpolation on either coronal or sagittal intracranial areas had a mean absolute agreement intra-class correlation with the entire intracranial volumes above 0.97. Cubic spline interpolation in combination with linearly spaced sagittal or coronal intracranial areas overall resulted in the most valid and robust estimates of intracranial volume. Using this method, valid ICV estimates could be obtained in less than five
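
    A sketch of the interpolation step described above: fit a cubic spline to linearly spaced cross-sectional areas and integrate it along the slice direction to estimate intracranial volume. The 15-mm spacing and the area values are invented for illustration.

      import numpy as np
      from scipy.interpolate import CubicSpline

      def icv_from_sampled_areas(positions_mm, areas_mm2):
          """Estimate intracranial volume (mm^3) by cubic-spline interpolation of the
          sampled area profile followed by integration over the slice positions."""
          spline = CubicSpline(positions_mm, areas_mm2)
          return spline.integrate(positions_mm[0], positions_mm[-1])

      # hypothetical coronal areas (mm^2) sampled every 15 mm along the head
      positions = np.arange(0, 181, 15)
      areas = np.array([0, 4200, 9800, 13600, 15900, 16800, 16500, 15200,
                        12900, 9400, 5600, 2100, 0], dtype=float)

      icv_cm3 = icv_from_sampled_areas(positions, areas) / 1000.0
      print(f"Estimated ICV: {icv_cm3:.0f} cm^3")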

  6. Validation of protein carbonyl measurement: A multi-centre study

    Directory of Open Access Journals (Sweden)

    Edyta Augustyniak

    2015-04-01

    Full Text Available Protein carbonyls are widely analysed as a measure of protein oxidation. Several different methods exist for their determination. A previous study had described orders of magnitude variance that existed when protein carbonyls were analysed in a single laboratory by ELISA using different commercial kits. We have further explored the potential causes of variance in carbonyl analysis in a ring study. A soluble protein fraction was prepared from rat liver and exposed to 0, 5 and 15 min of UV irradiation. Lyophilised preparations were distributed to six different laboratories that routinely undertook protein carbonyl analysis across Europe. ELISA and Western blotting techniques detected an increase in protein carbonyl formation between 0 and 5 min of UV irradiation irrespective of method used. After irradiation for 15 min, less oxidation was detected by half of the laboratories than after 5 min irradiation. Three of the four ELISA carbonyl results fell within 95% confidence intervals. Likely errors in calculating absolute carbonyl values may be attributed to differences in standardisation. Out of up to 88 proteins identified as containing carbonyl groups after tryptic cleavage of irradiated and control liver proteins, only seven were common in all three liver preparations. Lysine and arginine residues modified by carbonyls are likely to be resistant to tryptic proteolysis. Use of a cocktail of proteases may increase the recovery of oxidised peptides. In conclusion, standardisation is critical for carbonyl analysis and heavily oxidised proteins may not be effectively analysed by any existing technique.

  7. Classification in hyperspectral images by independent component analysis, segmented cross-validation and uncertainty estimates

    Directory of Open Access Journals (Sweden)

    Beatriz Galindo-Prieto

    2018-02-01

    Full Text Available Independent component analysis combined with various strategies for cross-validation, uncertainty estimates by jack-knifing and critical Hotelling’s T2 limits estimation, proposed in this paper, is used for classification purposes in hyperspectral images. To the best of our knowledge, the combined approach of methods used in this paper has not been previously applied to hyperspectral imaging analysis for interpretation and classification in the literature. The data analysis performed here aims to distinguish between four different types of plastics, some of them containing brominated flame retardants, from their near infrared hyperspectral images. The results showed that the approach used here can be applied successfully to unsupervised classification. A comparison of validation approaches, especially leave-one-out cross-validation and a regions-of-interest validation scheme, is also presented.

  8. Estimation of daily protein intake based on spot urine urea nitrogen concentration in chronic kidney disease patients.

    Science.gov (United States)

    Kanno, Hiroko; Kanda, Eiichiro; Sato, Asako; Sakamoto, Kaori; Kanno, Yoshihiko

    2016-04-01

    Determination of daily protein intake in the management of chronic kidney disease (CKD) requires precision. Inaccuracies in recording dietary intake occur, and estimation from total urea excretion presents hurdles owing to the difficulty of collecting whole urine for 24 h. Spot urine has been used for measuring daily sodium intake and urinary protein excretion. In this cross-sectional study, we investigated whether urea nitrogen (UN) concentration in spot urine can be used to predict daily protein intake instead of the 24-h urine collection in 193 Japanese CKD patients (Stages G1-G5). After patient randomization into 2 datasets for the development and validation of models, bootstrapping was used to develop protein intake estimation models. The parameters for the candidate multivariate regression models were male gender, age, body mass index (BMI), diabetes mellitus, dyslipidemia, proteinuria, estimated glomerular filtration rate, serum albumin level, spot urinary UN and creatinine level, and spot urinary UN/creatinine levels. The final model contained BMI and spot urinary UN level. The final model was selected because of the higher correlation between the predicted and measured protein intakes, r = 0.558 (95% confidence interval 0.400, 0.683), and the narrower distribution of the difference between the measured and predicted protein intakes than those of the other models. The results suggest that UN concentration in spot urine may be used to estimate daily protein intake and that a prediction formula would be useful for nutritional control in CKD patients.
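
    The bootstrapped model development described above can be sketched as repeated resampling and refitting of a linear model relating measured protein intake to BMI and spot urinary UN. All data and coefficients below are synthetic placeholders, not the published prediction formula.

      import numpy as np

      rng = np.random.default_rng(0)

      def fit_ols(X, y):
          """Ordinary least squares with an intercept column; returns [intercept, b_BMI, b_spotUN]."""
          A = np.column_stack([np.ones(len(X)), X])
          coef, *_ = np.linalg.lstsq(A, y, rcond=None)
          return coef

      # synthetic development data: BMI (kg/m2), spot urinary UN (mg/dL), measured protein intake (g/day)
      X_dev = rng.normal([23.0, 600.0], [3.0, 200.0], size=(100, 2))
      y_dev = 1.5 * X_dev[:, 0] + 0.04 * X_dev[:, 1] + rng.normal(0, 8, 100)

      boot = []
      for _ in range(1000):
          idx = rng.integers(0, len(y_dev), len(y_dev))   # resample patients with replacement
          boot.append(fit_ols(X_dev[idx], y_dev[idx]))
      boot = np.array(boot)

      print("bootstrap coefficient medians:", np.round(np.median(boot, axis=0), 3))
      print("bootstrap 95% intervals:", np.round(np.percentile(boot, [2.5, 97.5], axis=0), 3))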

  9. Validation of Temperature Histories for Structural Steel Welds Using Estimated Heat-Affected-Zone Edges

    Science.gov (United States)

    2016-10-12

    Naval Research Laboratory, Washington, DC 20375-5320, report NRL/MR/6394--16-9690, by S.G. Lambrakos: Validation of Temperature Histories for Structural Steel Welds Using Estimated Heat-Affected-Zone Edges. Only report-documentation fragments and reference-list entries (e.g. a welding metallurgy text, 2nd ed., John Wiley & Sons, 2003, DOI: 10.1002/0471434027; O. Grong, Metallurgical Modelling of Welding, 2nd ed.) were recovered in place of an abstract.

  10. Validity and reliability of Nike + Fuelband for estimating physical activity energy expenditure.

    Science.gov (United States)

    Tucker, Wesley J; Bhammar, Dharini M; Sawyer, Brandon J; Buman, Matthew P; Gaesser, Glenn A

    2015-01-01

    The Nike + Fuelband is a commercially available, wrist-worn accelerometer used to track physical activity energy expenditure (PAEE) during exercise. However, validation studies assessing the accuracy of this device for estimating PAEE are lacking. Therefore, this study examined the validity and reliability of the Nike + Fuelband for estimating PAEE during physical activity in young adults. Secondarily, we compared PAEE estimation of the Nike + Fuelband with the previously validated SenseWear Armband (SWA). Twenty-four participants (n = 24) completed two 60-min semi-structured routines consisting of sedentary/light-intensity, moderate-intensity, and vigorous-intensity physical activity. Participants wore a Nike + Fuelband and SWA, while oxygen uptake was measured continuously with an Oxycon Mobile (OM) metabolic measurement system (criterion). The Nike + Fuelband (ICC = 0.77) and SWA (ICC = 0.61) both demonstrated moderate to good validity. PAEE estimates provided by the Nike + Fuelband (246 ± 67 kcal) and SWA (238 ± 57 kcal) did not differ statistically from OM (243 ± 67 kcal). Both devices also displayed similar mean absolute percent errors for PAEE estimates (Nike + Fuelband = 16 ± 13 %; SWA = 18 ± 18 %). Test-retest reliability for PAEE indicated good stability for Nike + Fuelband (ICC = 0.96) and SWA (ICC = 0.90). The Nike + Fuelband provided valid and reliable estimates of PAEE, similar to those of the previously validated SWA, during a routine that included approximately equal amounts of sedentary/light-, moderate- and vigorous-intensity physical activity.
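
    Mean absolute percent error against the criterion measure is one of the headline statistics above; a minimal sketch with invented PAEE values follows (the ICC computation is omitted for brevity).

      import numpy as np

      def mape(criterion, device):
          """Mean absolute percent error of device estimates relative to the criterion."""
          criterion, device = np.asarray(criterion, float), np.asarray(device, float)
          return np.mean(np.abs(device - criterion) / criterion) * 100.0

      # hypothetical PAEE values (kcal) from indirect calorimetry (OM) and two wearables
      om = np.array([243.0, 251.0, 230.0, 265.0, 240.0])
      fuelband = np.array([246.0, 238.0, 255.0, 248.0, 236.0])
      armband = np.array([238.0, 260.0, 221.0, 251.0, 249.0])

      for name, device in [("Fuelband", fuelband), ("SenseWear", armband)]:
          print(f"{name}: MAPE = {mape(om, device):.1f}%, mean bias = {np.mean(device - om):+.1f} kcal")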

  11. Validation of protein models by a neural network approach

    Directory of Open Access Journals (Sweden)

    Fantucci Piercarlo

    2008-01-01

    Full Text Available Abstract Background The development and improvement of reliable computational methods designed to evaluate the quality of protein models is relevant in the context of protein structure refinement, which has been recently identified as one of the bottlenecks limiting the quality and usefulness of protein structure prediction. Results In this contribution, we present a computational method (Artificial Intelligence Decoys Evaluator: AIDE) which is able to consistently discriminate between correct and incorrect protein models. In particular, the method is based on neural networks that use as input 15 structural parameters, which include energy, solvent accessible surface, hydrophobic contacts and secondary structure content. The results obtained with AIDE on a set of decoy structures were evaluated using statistical indicators such as Pearson correlation coefficients, Znat, fraction enrichment, as well as ROC plots. AIDE's performance turned out to be comparable and often complementary to that of available state-of-the-art learning-based methods. Conclusion In light of the results obtained with AIDE, as well as its comparison with available learning-based methods, it can be concluded that AIDE can be successfully used to evaluate the quality of protein structures. The use of AIDE in combination with other evaluation tools is expected to further enhance protein refinement efforts.

  12. Robust Backlash Estimation for Industrial Drive-Train Systems—Theory and Validation

    DEFF Research Database (Denmark)

    Papageorgiou, Dimitrios; Blanke, Mogens; Niemann, Hans Henrik

    2018-01-01

    Backlash compensation is used in modern machine-tool controls to ensure high-accuracy positioning. When wear of a machine causes deadzone width to increase, high-accuracy control may be maintained if the deadzone is accurately estimated. Deadzone estimation is also an important parameter to indica......-of-the-art Siemens equipment. The experiments validate the theory and show that expected performance and robustness to parameter uncertainties are both achieved....

  13. Validation of radiation dose estimations in VRdose: comparing estimated radiation doses with observed radiation doses

    International Nuclear Information System (INIS)

    Nystad, Espen; Sebok, Angelia; Meyer, Geir

    2004-04-01

    The Halden Virtual Reality Centre has developed work-planning software that predicts the radiation exposure of workers in contaminated areas. To validate the accuracy of the predicted radiation dosages, it is necessary to compare predicted doses to actual dosages. During an experimental study conducted at the Halden Boiling Water Reactor (HBWR) hall, the radiation exposure was measured for all participants throughout the test session, ref. HWR-681 [3]. Data from this experimental study have also been used to model tasks in the work-planning software and gather data for predicted radiation exposure. Two different methods were used to predict radiation dosages: one method used all radiation data from all the floor levels in the HBWR (all-data method); the other used only data from the floor level where the task was conducted (isolated-data method). The study showed that the all-data method gave predictions that were on average 2.3 times the actual radiation dosages. The isolated-data method gave predictions on average 0.9 times the actual dosages. (Author)

  14. Implementing an X-ray validation pipeline for the Protein Data Bank

    International Nuclear Information System (INIS)

    Gore, Swanand; Velankar, Sameer; Kleywegt, Gerard J.

    2012-01-01

    The implementation of a validation pipeline, based on community recommendations, for future depositions of X-ray crystal structures in the Protein Data Bank is described. There is an increasing realisation that the quality of the biomacromolecular structures deposited in the Protein Data Bank (PDB) archive needs to be assessed critically using established and powerful validation methods. The Worldwide Protein Data Bank (wwPDB) organization has convened several Validation Task Forces (VTFs) to advise on the methods and standards that should be used to validate all of the entries already in the PDB as well as all structures that will be deposited in the future. The recommendations of the X-ray VTF are currently being implemented in a software pipeline. Here, ongoing work on this pipeline is briefly described as well as ways in which validation-related information could be presented to users of structural data

  15. Implementing an X-ray validation pipeline for the Protein Data Bank

    Energy Technology Data Exchange (ETDEWEB)

    Gore, Swanand; Velankar, Sameer; Kleywegt, Gerard J., E-mail: gerard@ebi.ac.uk [EMBL–EBI, Wellcome Trust Genome Campus, Hinxton, Cambridge CB10 1SD (United Kingdom)]

    2012-04-01

    The implementation of a validation pipeline, based on community recommendations, for future depositions of X-ray crystal structures in the Protein Data Bank is described. There is an increasing realisation that the quality of the biomacromolecular structures deposited in the Protein Data Bank (PDB) archive needs to be assessed critically using established and powerful validation methods. The Worldwide Protein Data Bank (wwPDB) organization has convened several Validation Task Forces (VTFs) to advise on the methods and standards that should be used to validate all of the entries already in the PDB as well as all structures that will be deposited in the future. The recommendations of the X-ray VTF are currently being implemented in a software pipeline. Here, ongoing work on this pipeline is briefly described as well as ways in which validation-related information could be presented to users of structural data.

  16. An Improved Fuzzy Based Missing Value Estimation in DNA Microarray Validated by Gene Ranking

    Directory of Open Access Journals (Sweden)

    Sujay Saha

    2016-01-01

    Full Text Available Most of the gene expression data analysis algorithms require the entire gene expression matrix without any missing values. Hence, it is necessary to devise methods which would impute missing data values accurately. There exist a number of imputation algorithms to estimate those missing values. This work starts with a microarray dataset containing multiple missing values. We first apply a modified version of the existing fuzzy-theory-based method LRFDVImpute to impute multiple missing values of time-series gene expression data and then validate the result of imputation by a genetic algorithm (GA) based gene ranking methodology along with some regular statistical validation techniques, such as the RMSE method. Gene ranking, as far as we know, has not yet been used to validate the result of missing value estimation. Firstly, the proposed method has been tested on the very popular Spellman dataset and results show that error margins have been drastically reduced compared to some previous works, which indirectly validates the statistical significance of the proposed method. Then it has been applied on four other 2-class benchmark datasets, namely the Colorectal Cancer tumours dataset (GDS4382), the Breast Cancer dataset (GSE349-350), the Prostate Cancer dataset, and DLBCL-FL (Leukaemia), for both missing value estimation and ranking the genes, and the results show that the proposed method can reach 100% classification accuracy with very few dominant genes, which indirectly validates the biological significance of the proposed method.
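
    A generic version of the mask-and-score validation used above can be sketched as follows; scikit-learn's KNNImputer stands in for LRFDVImpute, and the expression matrix is random placeholder data.

      import numpy as np
      from sklearn.impute import KNNImputer

      rng = np.random.default_rng(3)
      complete = rng.normal(size=(200, 20))             # stand-in expression matrix (genes x conditions)
      mask = rng.random(complete.shape) < 0.05          # hide 5% of the known entries
      with_missing = np.where(mask, np.nan, complete)

      imputed = KNNImputer(n_neighbors=10).fit_transform(with_missing)
      rmse = np.sqrt(np.mean((imputed[mask] - complete[mask]) ** 2))
      print(f"RMSE on artificially masked entries: {rmse:.3f}")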

  17. Validity of 20-metre multi stage shuttle run test for estimation of ...

    African Journals Online (AJOL)

    Validity of 20-metre multi-stage shuttle run test for estimation of maximum oxygen uptake in Indian male university students. P Chatterjee, AK Banerjee, P Debnath, P Bas, B Chatterjee. No abstract available. South African Journal for Physical, Health Education, Recreation and Dance, Vol. 12(4) 2006: pp. 461-467.

  18. Relative validation of a food frequency questionnaire to estimate food intake in an adult population.

    Science.gov (United States)

    Steinemann, Nina; Grize, Leticia; Ziesemer, Katrin; Kauf, Peter; Probst-Hensch, Nicole; Brombach, Christine

    2017-01-01

    Background: Scientifically valid descriptions of dietary intake at population level are crucial for investigating diet effects on health and disease. Food frequency questionnaires (FFQs) are the most common dietary tools used in large epidemiological studies. Objective: To examine the relative validity of a newly developed FFQ to be used as a dietary assessment tool in epidemiological studies. Design: Validity was evaluated by comparing the FFQ and a 4-day weighed food record (4-d FR) at nutrient and food group levels; Spearman's correlations, Bland-Altman analysis and Wilcoxon rank-sum tests were used. Fifty-six participants completed a paper-format FFQ and a 4-d FR within 4 weeks. Results: Corrected correlations between the two instruments ranged from 0.27 (carbohydrates) to 0.55 (protein), and at food group level from 0.09 (soup) to 0.92 (alcohol). Nine out of 25 food groups showed correlations > 0.5, indicating moderate validity. More than half the food groups were overestimated in the FFQ, especially vegetables (82.8%) and fruits (56.3%). Water, tea and coffee were underestimated (-14.0%). Conclusions: The FFQ showed moderate relative validity for protein and the food groups fruits, egg, meat, sausage, nuts, salty snacks and beverages. This study supports the use of the FFQ as an acceptable tool for assessing nutrition as a health determinant in large epidemiological studies.

  19. A short 18 items food frequency questionnaire biochemically validated to estimate zinc status in humans.

    Science.gov (United States)

    Trame, Sarah; Wessels, Inga; Haase, Hajo; Rink, Lothar

    2018-02-21

    Inadequate dietary zinc intake is widespread in the world's population. Despite the clinical significance of zinc deficiency, there is no established method or biomarker to reliably evaluate the zinc status. The aim of our study was to develop a biochemically validated questionnaire as a clinically useful tool that can predict the risk of an individual being zinc deficient. Blood and urine samples were collected from 71 subjects aged 18-55 years. Zinc concentrations in serum and urine were determined by atomic absorption spectrometry. A food frequency questionnaire (FFQ) including 38 items was filled out, representing consumption during the last 6 months, to obtain nutrient diet scores. The latter were calculated by multiplying the frequency of consumption, the nutrient content of the respective portion size and the extent of the consumed quantity. Results from the FFQ were compared with nutrient intake information gathered in 24-h dietary recalls. A hemogram was performed and cytokine concentrations were obtained using an enzyme-linked immunosorbent assay. Reducing the items of the primary FFQ from 38 to 18 did not result in a significant variance between both calculated scores. Zinc diet scores showed highly significant correlation with serum zinc (r = 0.37; p < 0.01) and urine zinc concentrations (r = 0.34; p < 0.01). Serum zinc concentrations and zinc diet scores showed a significant positive correlation with animal protein intake (r = 0.37; p < 0.01/r = 0.54; p < 0.0001). Higher zinc diet scores were found in omnivores compared to vegetarians (213.5 vs. 111.9; p < 0.0001). The 18-item FFQ seems to be a sufficient tool to provide a good estimation of the zinc status. Moreover, shortening of the questionnaire to 18 items without a loss of predictive efficiency enables a facilitated and resource-saving routine use. A validation of the questionnaire in other cohorts could enable the progression towards clinical
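
    The diet-score construction described above (frequency of consumption x nutrient content of the portion x consumed quantity) reduces to a weighted sum over FFQ items; a sketch with invented item values follows.

      # hypothetical FFQ items: (servings per week, zinc mg per standard portion, portion-size multiplier)
      ffq_items = {
          "beef": (2.0, 6.2, 1.0),
          "whole grains": (7.0, 1.1, 1.5),
          "legumes": (3.0, 1.3, 1.0),
          "dairy": (10.0, 1.0, 0.5),
      }

      def zinc_diet_score(items):
          """Sum of frequency x zinc content x consumed-quantity factor over all items."""
          return sum(freq * zinc_mg * portion for freq, zinc_mg, portion in items.values())

      print(f"weekly zinc diet score: {zinc_diet_score(ffq_items):.1f}")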

  20. Validation of differential gene expression algorithms: Application comparing fold-change estimation to hypothesis testing

    Directory of Open Access Journals (Sweden)

    Bickel David R

    2010-01-01

    Full Text Available Abstract Background Sustained research on the problem of determining which genes are differentially expressed on the basis of microarray data has yielded a plethora of statistical algorithms, each justified by theory, simulation, or ad hoc validation and yet differing in practical results from equally justified algorithms. Recently, a concordance method that measures agreement among gene lists has been introduced to assess various aspects of differential gene expression detection. This method has the advantage of basing its assessment solely on the results of real data analyses, but as it requires examining gene lists of given sizes, it may be unstable. Results Two methodologies for assessing predictive error are described: a cross-validation method and a posterior predictive method. As a nonparametric method of estimating prediction error from observed expression levels, cross validation provides an empirical approach to assessing algorithms for detecting differential gene expression that is fully justified for large numbers of biological replicates. Because it leverages the knowledge that only a small portion of genes are differentially expressed, the posterior predictive method is expected to provide more reliable estimates of algorithm performance, allaying concerns about limited biological replication. In practice, the posterior predictive method can assess when its approximations are valid and when they are inaccurate. Under conditions in which its approximations are valid, it corroborates the results of cross validation. Both comparison methodologies are applicable to both single-channel and dual-channel microarrays. For the data sets considered, estimating prediction error by cross validation demonstrates that empirical Bayes methods based on hierarchical models tend to outperform algorithms based on selecting genes by their fold changes or by non-hierarchical model-selection criteria. (The latter two approaches have comparable
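
    A minimal sketch of estimating prediction error by cross-validation on two-group expression data, as discussed above; the data are simulated, and the logistic-regression classifier is a stand-in for the detection algorithms compared in the paper.

      import numpy as np
      from sklearn.linear_model import LogisticRegression
      from sklearn.model_selection import StratifiedKFold, cross_val_score

      rng = np.random.default_rng(7)
      n_samples, n_genes = 40, 500
      X = rng.normal(size=(n_samples, n_genes))
      y = np.repeat([0, 1], n_samples // 2)
      X[y == 1, :10] += 1.0   # ten truly differentially expressed genes

      cv = StratifiedKFold(n_splits=5, shuffle=True, random_state=0)
      accuracy = cross_val_score(LogisticRegression(max_iter=1000), X, y, cv=cv)
      print(f"cross-validated accuracy: {accuracy.mean():.2f} +/- {accuracy.std():.2f}")
      print(f"estimated prediction error: {1 - accuracy.mean():.2f}")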

  1. Validity of Bioelectrical Impedance Analysis to Estimation Fat-Free Mass in the Army Cadets.

    Science.gov (United States)

    Langer, Raquel D; Borges, Juliano H; Pascoa, Mauro A; Cirolini, Vagner X; Guerra-Júnior, Gil; Gonçalves, Ezequiel M

    2016-03-11

    Bioelectrical Impedance Analysis (BIA) is a fast, practical, non-invasive, and frequently used method for fat-free mass (FFM) estimation. The aims of this study were to validate predictive equations of BIA to FFM estimation in Army cadets and to develop and validate a specific BIA equation for this population. A total of 396 males, Brazilian Army cadets, aged 17-24 years were included. The study used eight published predictive BIA equations, a specific equation in FFM estimation, and dual-energy X-ray absorptiometry (DXA) as a reference method. Student's t-test (for paired samples), linear regression analysis, and Bland-Altman method were used to test the validity of the BIA equations. Predictive BIA equations showed significant differences in FFM compared to DXA (p < 0.05) and large limits of agreement by Bland-Altman. Predictive BIA equations explained 68% to 88% of FFM variance. Specific BIA equations showed no significant differences in FFM, compared to DXA values. Published BIA predictive equations showed poor accuracy in this sample. The specific BIA equations, developed in this study, demonstrated validity for this sample, although they should be used with caution in samples with a large range of FFM.

  2. Validity of Bioelectrical Impedance Analysis to Estimation Fat-Free Mass in the Army Cadets

    Directory of Open Access Journals (Sweden)

    Raquel D. Langer

    2016-03-01

    Full Text Available Background: Bioelectrical Impedance Analysis (BIA) is a fast, practical, non-invasive, and frequently used method for fat-free mass (FFM) estimation. The aims of this study were to validate predictive equations of BIA to FFM estimation in Army cadets and to develop and validate a specific BIA equation for this population. Methods: A total of 396 males, Brazilian Army cadets, aged 17–24 years were included. The study used eight published predictive BIA equations, a specific equation in FFM estimation, and dual-energy X-ray absorptiometry (DXA) as a reference method. Student’s t-test (for paired samples), linear regression analysis, and Bland–Altman method were used to test the validity of the BIA equations. Results: Predictive BIA equations showed significant differences in FFM compared to DXA (p < 0.05) and large limits of agreement by Bland–Altman. Predictive BIA equations explained 68% to 88% of FFM variance. Specific BIA equations showed no significant differences in FFM, compared to DXA values. Conclusion: Published BIA predictive equations showed poor accuracy in this sample. The specific BIA equations, developed in this study, demonstrated validity for this sample, although they should be used with caution in samples with a large range of FFM.

  3. Validation of generic cost estimates for construction-related activities at nuclear power plants: Final report

    International Nuclear Information System (INIS)

    Simion, G.; Sciacca, F.; Claiborne, E.; Watlington, B.; Riordan, B.; McLaughlin, M.

    1988-05-01

    This report represents a validation study of the cost methodologies and quantitative factors derived in Labor Productivity Adjustment Factors and Generic Methodology for Estimating the Labor Cost Associated with the Removal of Hardware, Materials, and Structures From Nuclear Power Plants. This cost methodology was developed to support NRC analysts in determining generic estimates of removal, installation, and total labor costs for construction-related activities at nuclear generating stations. In addition to the validation discussion, this report reviews the generic cost analysis methodology employed. It also discusses each of the individual cost factors used in estimating the costs of physical modifications at nuclear power plants. The generic estimating approach presented uses the "greenfield" or new plant construction installation costs compiled in the Energy Economic Data Base (EEDB) as a baseline. These baseline costs are then adjusted to account for labor productivity, radiation fields, learning curve effects, and impacts on ancillary systems or components. For comparisons of estimated vs actual labor costs, approximately four dozen actual cost data points (as reported by 14 nuclear utilities) were obtained. Detailed background information was collected on each individual data point to give the best understanding possible so that the labor productivity factors, removal factors, etc., could judiciously be chosen. This study concludes that cost estimates that are typically within 40% of the actual values can be generated by prudently using the methodologies and cost factors investigated herein.
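
    The adjustment of "greenfield" baseline costs by productivity, radiation-field and learning-curve factors amounts to a product of multipliers plus a removal component; all factor values below are illustrative placeholders, not those derived in the report.

      def adjusted_labor_cost(greenfield_install_hours, labor_rate,
                              productivity_factor, radiation_factor,
                              learning_factor=1.0, removal_fraction=0.3):
          """Scale a greenfield installation estimate into an operating-plant estimate.
          All factors are placeholders for the report's labor productivity, radiation
          field and learning-curve adjustments."""
          install_hours = (greenfield_install_hours * productivity_factor
                           * radiation_factor * learning_factor)
          removal_hours = install_hours * removal_fraction
          return (install_hours + removal_hours) * labor_rate

      estimate = adjusted_labor_cost(1200, 65.0, productivity_factor=2.1, radiation_factor=1.4)
      print(f"estimated total labor cost: ${estimate:,.0f}")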

  4. Constructing valid density matrices on an NMR quantum information processor via maximum likelihood estimation

    Energy Technology Data Exchange (ETDEWEB)

    Singh, Harpreet; Arvind; Dorai, Kavita, E-mail: kavita@iisermohali.ac.in

    2016-09-07

    Estimation of quantum states is an important step in any quantum information processing experiment. A naive reconstruction of the density matrix from experimental measurements can often give density matrices which are not positive, and hence not physically acceptable. How do we ensure that at all stages of reconstruction, we keep the density matrix positive? Recently a method has been suggested based on maximum likelihood estimation, wherein the density matrix is guaranteed to be positive definite. We experimentally implement this protocol on an NMR quantum information processor. We discuss several examples and compare with the standard method of state estimation. - Highlights: • State estimation using the maximum likelihood method was performed on an NMR quantum information processor. • Physically valid density matrices were obtained every time, in contrast to standard quantum state tomography. • Density matrices of several different entangled and separable states were reconstructed for two and three qubits.
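
    The key idea behind the maximum-likelihood approach mentioned above is to parameterize the density matrix so that positivity is built in, commonly as rho = T†T / Tr(T†T) with T lower-triangular; the single-qubit sketch below shows the parameterization only (the likelihood optimization over the parameters is omitted).

      import numpy as np

      def density_from_params(t):
          """Cholesky-style parameterization: rho = T^dagger T / Tr(T^dagger T).
          For one qubit, t holds 4 real parameters (two diagonal, one complex off-diagonal)."""
          T = np.array([[t[0], 0.0],
                        [t[2] + 1j * t[3], t[1]]])
          rho = T.conj().T @ T
          return rho / np.trace(rho).real

      rho = density_from_params([0.9, 0.4, 0.1, -0.2])
      print("eigenvalues:", np.round(np.linalg.eigvalsh(rho), 4))  # non-negative by construction
      print("trace:", round(np.trace(rho).real, 6))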

  5. Infant bone age estimation based on fibular shaft length: model development and clinical validation

    International Nuclear Information System (INIS)

    Tsai, Andy; Stamoulis, Catherine; Bixby, Sarah D.; Breen, Micheal A.; Connolly, Susan A.; Kleinman, Paul K.

    2016-01-01

    Bone age in infants (<1 year old) is generally estimated using hand/wrist or knee radiographs, or by counting ossification centers. The accuracy and reproducibility of these techniques are largely unknown. To develop and validate an infant bone age estimation technique using fibular shaft length and compare it to conventional methods. We retrospectively reviewed negative skeletal surveys of 247 term-born low-risk-of-abuse infants (no persistent child protection team concerns) from July 2005 to February 2013, and randomized them into two datasets: (1) model development (n = 123) and (2) model testing (n = 124). Three pediatric radiologists measured all fibular shaft lengths. An ordinary linear regression model was fitted to dataset 1, and the model was evaluated using dataset 2. Readers also estimated infant bone ages in dataset 2 using (1) the hemiskeleton method of Sontag, (2) the hemiskeleton method of Elgenmark, (3) the hand/wrist atlas of Greulich and Pyle, and (4) the knee atlas of Pyle and Hoerr. For validation, we selected lower-extremity radiographs of 114 normal infants with no suspicion of abuse. Readers measured the fibulas and also estimated bone ages using the knee atlas. Bone age estimates from the proposed method were compared to the other methods. The proposed method outperformed all other methods in accuracy and reproducibility. Its accuracy was similar for the testing and validating datasets, with root-mean-square error of 36 days and 37 days; mean absolute error of 28 days and 31 days; and error variability of 22 days and 20 days, respectively. This study provides strong support for an infant bone age estimation technique based on fibular shaft length as a more accurate alternative to conventional methods. (orig.)
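
    A sketch of the development/testing split and the error metrics quoted above, using an ordinary least-squares fit of age on fibular shaft length; the lengths, ages and coefficients are simulated, not the study data.

      import numpy as np

      rng = np.random.default_rng(5)

      # simulated infants: fibular shaft length (mm) and chronological age (days), n = 247
      length = rng.uniform(55, 115, 247)
      age = 3.2 * length - 160 + rng.normal(0, 30, length.size)

      split = rng.permutation(247)
      dev, test = split[:123], split[123:]
      slope, intercept = np.polyfit(length[dev], age[dev], 1)

      error = slope * length[test] + intercept - age[test]
      print(f"RMSE = {np.sqrt(np.mean(error**2)):.0f} days, "
            f"MAE = {np.mean(np.abs(error)):.0f} days, "
            f"error SD = {error.std(ddof=1):.0f} days")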

  6. Infant bone age estimation based on fibular shaft length: model development and clinical validation

    Energy Technology Data Exchange (ETDEWEB)

    Tsai, Andy; Stamoulis, Catherine; Bixby, Sarah D.; Breen, Micheal A.; Connolly, Susan A.; Kleinman, Paul K. [Boston Children's Hospital, Harvard Medical School, Department of Radiology, Boston, MA (United States)]

    2016-03-15

    Bone age in infants (<1 year old) is generally estimated using hand/wrist or knee radiographs, or by counting ossification centers. The accuracy and reproducibility of these techniques are largely unknown. To develop and validate an infant bone age estimation technique using fibular shaft length and compare it to conventional methods. We retrospectively reviewed negative skeletal surveys of 247 term-born low-risk-of-abuse infants (no persistent child protection team concerns) from July 2005 to February 2013, and randomized them into two datasets: (1) model development (n = 123) and (2) model testing (n = 124). Three pediatric radiologists measured all fibular shaft lengths. An ordinary linear regression model was fitted to dataset 1, and the model was evaluated using dataset 2. Readers also estimated infant bone ages in dataset 2 using (1) the hemiskeleton method of Sontag, (2) the hemiskeleton method of Elgenmark, (3) the hand/wrist atlas of Greulich and Pyle, and (4) the knee atlas of Pyle and Hoerr. For validation, we selected lower-extremity radiographs of 114 normal infants with no suspicion of abuse. Readers measured the fibulas and also estimated bone ages using the knee atlas. Bone age estimates from the proposed method were compared to the other methods. The proposed method outperformed all other methods in accuracy and reproducibility. Its accuracy was similar for the testing and validating datasets, with root-mean-square error of 36 days and 37 days; mean absolute error of 28 days and 31 days; and error variability of 22 days and 20 days, respectively. This study provides strong support for an infant bone age estimation technique based on fibular shaft length as a more accurate alternative to conventional methods. (orig.)

  7. A validated HPTLC method for estimation of moxifloxacin hydrochloride in tablets.

    Science.gov (United States)

    Dhillon, Vandana; Chaudhary, Alok Kumar

    2010-10-01

    A simple HPTLC method having high accuracy, precision and reproducibility was developed for the routine estimation of moxifloxacin hydrochloride in tablets available in the market and was validated for various parameters according to ICH guidelines. Moxifloxacin hydrochloride was estimated at 292 nm by densitometry using Silica gel 60 F254 as the stationary phase and a premix of methylene chloride: methanol: strong ammonia solution and acetonitrile (10:10:5:10) as the mobile phase. The method was found to be linear over a range of 9-54 nanograms with a correlation coefficient >0.99. The regression equation was: AUC = 65.57 × (Amount in nanograms) + 163 (r² = 0.9908).
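
    Given the reported regression line, the amount applied can be back-calculated from a measured peak area; the AUC readings below are hypothetical.

      def amount_from_auc(auc, slope=65.57, intercept=163.0):
          """Back-calculate the amount (ng) from the peak area using AUC = slope*amount + intercept."""
          return (auc - intercept) / slope

      for auc in (800.0, 1800.0, 3400.0):
          print(f"AUC {auc:.0f} -> {amount_from_auc(auc):.1f} ng moxifloxacin hydrochloride")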

  8. Development and validation of GFR-estimating equations using diabetes, transplant and weight

    DEFF Research Database (Denmark)

    Stevens, L.A.; Schmid, C.H.; Zhang, Y.L.

    2009-01-01

    BACKGROUND: We have reported a new equation (CKD-EPI equation) that reduces bias and improves accuracy for GFR estimation compared to the MDRD study equation while using the same four basic predictor variables: creatinine, age, sex and race. Here, we describe the development and validation of this equation as well as other equations that incorporate diabetes, transplant and weight as additional predictor variables. METHODS: Linear regression was used to relate log-measured GFR (mGFR) to sex, race, diabetes, transplant, weight, various transformations of creatinine and age with and without interactions. Equations were developed in a pooled database of 10 studies [2/3 (N = 5504) for development and 1/3 (N = 2750) for internal validation], and final model selection occurred in 16 additional studies [external validation (N = 3896)]. RESULTS: The mean mGFR was 68, 67 and 68 ml/min/1.73 m(2)...

  9. Development, Validation, and Verification of a Self-Assessment Tool to Estimate Agnibala (Digestive Strength).

    Science.gov (United States)

    Singh, Aparna; Singh, Girish; Patwardhan, Kishor; Gehlot, Sangeeta

    2017-01-01

    According to Ayurveda, the traditional system of healthcare of Indian origin, Agni is the factor responsible for digestion and metabolism. Four functional states (Agnibala) of Agni have been recognized: regular, irregular, intense, and weak. The objective of the present study was to develop and validate a self-assessment tool to estimate Agnibala. The developed tool was evaluated for its reliability and validity by administering it to 300 healthy volunteers of either gender belonging to the 18- to 40-year age group. Besides confirming the statistical validity and reliability, the practical utility of the newly developed tool was also evaluated by recording serum lipid parameters of all the volunteers. The results show that the lipid parameters vary significantly according to the status of Agni. The tool, therefore, may be used to screen the normal population to look for possible susceptibility to certain health conditions. © The Author(s) 2016.

  10. Improved best estimate plus uncertainty methodology including advanced validation concepts to license evolving nuclear reactors

    International Nuclear Information System (INIS)

    Unal, Cetin; Williams, Brian; McClure, Patrick; Nelson, Ralph A.

    2010-01-01

    Many evolving nuclear energy programs plan to use advanced predictive multi-scale multi-physics simulation and modeling capabilities to reduce cost and time from design through licensing. Historically, experiments were the primary tool for the design and understanding of nuclear system behavior, while modeling and simulation played the subordinate role of supporting experiments. In the new era of multi-scale multi-physics computation-based technology development, experiments will still be needed, but they will be performed at different scales to calibrate and validate models leading to predictive simulations. Cost-saving goals of programs will require us to minimize the required number of validation experiments. Utilization of more multi-scale multi-physics models introduces complexities in the validation of predictive tools. Traditional methodologies will have to be modified to address these arising issues. This paper lays out the basic aspects of a methodology that can be potentially used to address these new challenges in design and licensing of evolving nuclear technology programs. The main components of the proposed methodology are verification, validation, calibration, and uncertainty quantification. An enhanced calibration concept is introduced and is accomplished through data assimilation. The goal is to enable best-estimate prediction of system behaviors in both normal and safety-related environments. To achieve this goal requires the additional steps of estimating the domain of validation and quantification of uncertainties that allow for extension of results to areas of the validation domain that are not directly tested with experiments, which might include extension of the modeling and simulation (M and S) capabilities for application to full-scale systems. The new methodology suggests a formalism to quantify an adequate level of validation (predictive maturity) with respect to required selective data so that required testing can be minimized for

  11. Improved best estimate plus uncertainty methodology including advanced validation concepts to license evolving nuclear reactors

    Energy Technology Data Exchange (ETDEWEB)

    Unal, Cetin [Los Alamos National Laboratory]; Williams, Brian [Los Alamos National Laboratory]; Mc Clure, Patrick [Los Alamos National Laboratory]; Nelson, Ralph A [IDAHO NATIONAL LAB]

    2010-01-01

    Many evolving nuclear energy programs plan to use advanced predictive multi-scale multi-physics simulation and modeling capabilities to reduce cost and time from design through licensing. Historically, experiments were the primary tool for the design and understanding of nuclear system behavior, while modeling and simulation played the subordinate role of supporting experiments. In the new era of multi-scale multi-physics computation-based technology development, experiments will still be needed, but they will be performed at different scales to calibrate and validate models leading to predictive simulations. Cost-saving goals of programs will require us to minimize the required number of validation experiments. Utilization of more multi-scale multi-physics models introduces complexities in the validation of predictive tools. Traditional methodologies will have to be modified to address these arising issues. This paper lays out the basic aspects of a methodology that can be potentially used to address these new challenges in design and licensing of evolving nuclear technology programs. The main components of the proposed methodology are verification, validation, calibration, and uncertainty quantification. An enhanced calibration concept is introduced and is accomplished through data assimilation. The goal is to enable best-estimate prediction of system behaviors in both normal and safety-related environments. To achieve this goal requires the additional steps of estimating the domain of validation and quantification of uncertainties that allow for extension of results to areas of the validation domain that are not directly tested with experiments, which might include extension of the modeling and simulation (M&S) capabilities for application to full-scale systems. The new methodology suggests a formalism to quantify an adequate level of validation (predictive maturity) with respect to required selective data so that required testing can be minimized for cost

  12. Endogenous protein "barcode" for data validation and normalization in quantitative MS analysis.

    Science.gov (United States)

    Lee, Wooram; Lazar, Iulia M

    2014-07-01

    Quantitative proteomic experiments with mass spectrometry detection are typically conducted by using stable isotope labeling and label-free quantitation approaches. Proteins with housekeeping functions and stable expression levels, such as actin, tubulin, and glyceraldehyde-3-phosphate dehydrogenase, are frequently used as endogenous controls. Recent studies have shown that the expression level of such common housekeeping proteins is, in fact, dependent on various factors such as cell type, cell cycle, or disease status and can change in response to a biochemical stimulation. The interference of such phenomena can, therefore, substantially compromise their use for data validation, alter the interpretation of results, and lead to erroneous conclusions. In this work, we advance the concept of a protein "barcode" for data normalization and validation in quantitative proteomic experiments. The barcode comprises a novel set of proteins that was generated from cell cycle experiments performed with MCF7, an estrogen receptor positive breast cancer cell line, and MCF10A, a nontumorigenic immortalized breast cell line. The protein set was selected from a list of ~3700 proteins identified in different cellular subfractions and cell cycle stages of MCF7/MCF10A cells, based on the stability of spectral count data generated with an LTQ ion trap mass spectrometer. A total of 11 proteins qualified as endogenous standards for the nuclear and 62 for the cytoplasmic barcode, respectively. The validation of the protein sets was performed with a complementary SKBR3/Her2+ cell line.

  13. Validation and Intercomparison of Ocean Color Algorithms for Estimating Particulate Organic Carbon in the Oceans

    Directory of Open Access Journals (Sweden)

    Hayley Evers-King

    2017-08-01

    Full Text Available Particulate Organic Carbon (POC) plays a vital role in the ocean carbon cycle. Though relatively small compared with other carbon pools, the POC pool is responsible for large fluxes and is linked to many important ocean biogeochemical processes. The satellite ocean-color signal is influenced by particle composition, size, and concentration and provides a way to observe variability in the POC pool at a range of temporal and spatial scales. To provide accurate estimates of POC concentration from satellite ocean color data requires algorithms that are well validated, with uncertainties characterized. Here, a number of algorithms to derive POC using different optical variables are applied to merged satellite ocean color data provided by the Ocean Color Climate Change Initiative (OC-CCI) and validated against the largest database of in situ POC measurements currently available. The results of this validation exercise indicate satisfactory levels of performance from several algorithms (highest performance was observed from the algorithms of Loisel et al., 2002; Stramski et al., 2008) and uncertainties that are within the requirements of the user community. Estimates of the standing stock of POC can be made by applying these algorithms, and yield an estimated mixed-layer integrated global stock of POC between 0.77 and 1.3 Pg C. Performance of the algorithms varies regionally, suggesting that blending of region-specific algorithms may provide the best way forward for generating global POC products.

  14. Validation of statistical models for estimating hospitalization associated with influenza and other respiratory viruses.

    Directory of Open Access Journals (Sweden)

    Lin Yang

    Full Text Available BACKGROUND: Reliable estimates of disease burden associated with respiratory viruses are key to the deployment of preventive strategies such as vaccination and resource allocation. Such estimates are particularly needed in tropical and subtropical regions where some methods commonly used in temperate regions are not applicable. While a number of alternative approaches to assess the influenza-associated disease burden have been recently reported, none of these models have been validated with virologically confirmed data. Even fewer methods have been developed for other common respiratory viruses such as respiratory syncytial virus (RSV), parainfluenza and adenovirus. METHODS AND FINDINGS: We recently conducted a prospective population-based study of virologically confirmed hospitalization for acute respiratory illnesses in persons <18 years residing on Hong Kong Island. Here we used this dataset to validate two commonly used models for estimation of influenza disease burden, namely the rate difference model and the Poisson regression model, and also explored the applicability of these models to estimate the disease burden of other respiratory viruses. The Poisson regression models with different link functions all yielded estimates well correlated with the virologically confirmed influenza-associated hospitalization, especially in children older than two years. The disease burden estimates for RSV, parainfluenza and adenovirus were less reliable with wide confidence intervals. The rate difference model was not applicable to RSV, parainfluenza and adenovirus and grossly underestimated the true burden of influenza-associated hospitalization. CONCLUSION: The Poisson regression model generally produced satisfactory estimates in calculating the disease burden of respiratory viruses in a subtropical region such as Hong Kong.
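
    A minimal sketch of the Poisson regression approach: regress weekly admissions on virology proxies and a seasonal term, then take the difference between fitted values with and without the influenza term as the influenza-attributable burden. The weekly counts are simulated, and statsmodels is assumed to be available.

      import numpy as np
      import statsmodels.api as sm

      rng = np.random.default_rng(11)
      weeks = 156
      flu_proxy = rng.poisson(20, weeks)     # weekly influenza-positive isolates
      rsv_proxy = rng.poisson(15, weeks)     # weekly RSV-positive isolates
      season = np.sin(2 * np.pi * np.arange(weeks) / 52.18)

      true_rate = np.exp(3.0 + 0.01 * flu_proxy + 0.008 * rsv_proxy + 0.2 * season)
      admissions = rng.poisson(true_rate)

      X = sm.add_constant(np.column_stack([flu_proxy, rsv_proxy, season]))
      fit = sm.GLM(admissions, X, family=sm.families.Poisson()).fit()

      X_no_flu = X.copy()
      X_no_flu[:, 1] = 0   # column 1 is the influenza proxy
      attributable = fit.predict(X) - fit.predict(X_no_flu)
      print(f"estimated influenza-associated admissions over {weeks} weeks: {attributable.sum():.0f}")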

  15. Biased binomial assessment of cross-validated estimation of classification accuracies illustrated in diagnosis predictions

    Directory of Open Access Journals (Sweden)

    Quentin Noirhomme

    2014-01-01

    Full Text Available Multivariate classification is used in neuroimaging studies to infer brain activation or in medical applications to infer diagnosis. Their results are often assessed through either a binomial or a permutation test. Here, we simulated classification results of generated random data to assess the influence of the cross-validation scheme on the significance of results. Distributions built from classification of random data with cross-validation did not follow the binomial distribution. The binomial test is therefore not adapted. On the contrary, the permutation test was unaffected by the cross-validation scheme. The influence of the cross-validation was further illustrated on real data from a brain–computer interface experiment in patients with disorders of consciousness and from an fMRI study on patients with Parkinson disease. Three out of 16 patients with disorders of consciousness had significant accuracy on binomial testing, but only one showed significant accuracy using permutation testing. In the fMRI experiment, the mental imagery of gait could discriminate significantly between idiopathic Parkinson's disease patients and healthy subjects according to the permutation test but not according to the binomial test. Hence, binomial testing could lead to biased estimation of significance and false positive or negative results. In our view, permutation testing is thus recommended for clinical application of classification with cross-validation.
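
    The comparison above can be reproduced in outline with scikit-learn's permutation_test_score on pure-noise data: the permutation p-value stays near chance while a binomial test on the pooled cross-validated accuracy can look optimistic. The classifier, sample size and permutation count are arbitrary choices for illustration.

      import numpy as np
      from scipy.stats import binomtest
      from sklearn.model_selection import StratifiedKFold, permutation_test_score
      from sklearn.svm import SVC

      rng = np.random.default_rng(2)
      X = rng.normal(size=(32, 10))      # pure-noise features
      y = np.repeat([0, 1], 16)

      cv = StratifiedKFold(n_splits=4, shuffle=True, random_state=0)
      accuracy, _, p_perm = permutation_test_score(
          SVC(kernel="linear"), X, y, cv=cv, n_permutations=500, random_state=0)

      n_correct = int(round(accuracy * len(y)))
      p_binom = binomtest(n_correct, len(y), 0.5, alternative="greater").pvalue
      print(f"CV accuracy = {accuracy:.2f}, permutation p = {p_perm:.3f}, binomial p = {p_binom:.3f}")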

  16. Biased binomial assessment of cross-validated estimation of classification accuracies illustrated in diagnosis predictions.

    Science.gov (United States)

    Noirhomme, Quentin; Lesenfants, Damien; Gomez, Francisco; Soddu, Andrea; Schrouff, Jessica; Garraux, Gaëtan; Luxen, André; Phillips, Christophe; Laureys, Steven

    2014-01-01

    Multivariate classification is used in neuroimaging studies to infer brain activation or in medical applications to infer diagnosis. Their results are often assessed through either a binomial or a permutation test. Here, we simulated classification results of generated random data to assess the influence of the cross-validation scheme on the significance of results. Distributions built from classification of random data with cross-validation did not follow the binomial distribution. The binomial test is therefore not adapted. On the contrary, the permutation test was unaffected by the cross-validation scheme. The influence of the cross-validation was further illustrated on real data from a brain-computer interface experiment in patients with disorders of consciousness and from an fMRI study on patients with Parkinson disease. Three out of 16 patients with disorders of consciousness had significant accuracy on binomial testing, but only one showed significant accuracy using permutation testing. In the fMRI experiment, the mental imagery of gait could discriminate significantly between idiopathic Parkinson's disease patients and healthy subjects according to the permutation test but not according to the binomial test. Hence, binomial testing could lead to biased estimation of significance and false positive or negative results. In our view, permutation testing is thus recommended for clinical application of classification with cross-validation.

  17. Protein array staining methods for undefined protein content, manufacturing quality control, and performance validation.

    Science.gov (United States)

    Schabacker, Daniel S; Stefanovska, Ivana; Gavin, Igor; Pedrak, Casandra; Chandler, Darrell P

    2006-12-01

    Methods to assess the quality and performance of protein microarrays fabricated from undefined protein content are required to elucidate slide-to-slide variability and interpolate resulting signal intensity values after an interaction assay. We therefore developed several simple total- and posttranslational modification-specific, on-chip staining methods to quantitatively assess the quality of gel element protein arrays manufactured with whole-cell lysate in vitro protein fractions derived from two-dimensional liquid-phase fractionation (PF2D) technology. A linear dynamic range of at least 3 logs was observed for protein stains and immobilized protein content, with a lower limit of detection at 8 pg of protein per gel element with Deep Purple protein stain and a field-portable microarray imager. Data demonstrate the successful isolation, separation, transfer, and immobilization of putative transmembrane proteins from Yersinia pestis KIM D27 with the combined PF2D and gel element array method. Internal bovine serum albumin standard curves provided a method to assess on-chip PF2D transfer and quantify total protein immobilized per gel element. The basic PF2D array fabrication and quality assurance/quality control methods described here therefore provide a standard operating procedure and basis for developing whole-proteome arrays for interrogating host-pathogen interactions, independent of sequenced genomes, affinity tags, or a priori knowledge of target cell composition.

  18. The Identification and Validation of Novel Small Proteins in Pseudomonas Putida KT-2440

    DEFF Research Database (Denmark)

    Yang, Xiaochen; Long, Katherine

    2014-01-01

    and activities and may lead to the discovery of novel antimicrobial agents. Our project focuses on the identification, validation and characterization of novel s-proteins in the bacterium Pseudomonas putida KT-2440. As there is virtually no information on s-proteins in pseudomonads, the first step......, total protein samples are prepared, fractionated, and analyzed with mass spectrometry (MS/MS). The MS/MS data are compared to a custom database containing >80000 putative sORF sequences to identify candidates for validation. A total of 56 and 22 putative sORFs were obtained from MS/MS data...... and bioinformatics prediction, respectively, where there is no overlap between the putative sORFs obtained from the two approaches. The sequences encoding the putative sORFs will be integrated onto the Tn7 site on the chromosome as well as on a plasmid expression vector for validation....

  19. Development and prospective validation of a model estimating risk of readmission in cancer patients.

    Science.gov (United States)

    Schmidt, Carl R; Hefner, Jennifer; McAlearney, Ann S; Graham, Lisa; Johnson, Kristen; Moffatt-Bruce, Susan; Huerta, Timothy; Pawlik, Timothy M; White, Susan

    2018-02-26

    Hospital readmissions among cancer patients are common. While several models estimating readmission risk exist, models specific for cancer patients are lacking. A logistic regression model estimating risk of unplanned 30-day readmission was developed using inpatient admission data from a 2-year period (n = 18 782) at a tertiary cancer hospital. Readmission risk estimates derived from the model were then calculated prospectively over a 10-month period (n = 8616 admissions) and compared with actual incidence of readmission. There were 2478 (13.2%) unplanned readmissions. Model factors associated with readmission included: emergency department visit within 30 days, >1 admission within 60 days, non-surgical admission, solid malignancy, gastrointestinal cancer, emergency admission, length of stay >5 days, abnormal sodium, hemoglobin, or white blood cell count. The c-statistic for the model was 0.70. During the 10-month prospective evaluation, estimates of readmission from the model were associated with higher actual readmission incidence from 20.7% for the highest risk category to 9.6% for the lowest. An unplanned readmission risk model developed specifically for cancer patients performs well when validated prospectively. The specificity of the model for cancer patients, EMR incorporation, and prospective validation justify use of the model in future studies designed to reduce and prevent readmissions. © 2018 Wiley Periodicals, Inc.
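
    As a rough illustration of the two evaluation steps named above (discrimination via the c-statistic and the prospective comparison of predicted risk with observed readmission incidence), the numpy sketch below computes both on synthetic predicted probabilities; the simulated risks and outcomes are stand-ins, not the study's data or model.

    ```python
    import numpy as np

    rng = np.random.default_rng(1)

    # synthetic stand-in for model output: predicted 30-day readmission risk and observed outcome
    n = 8616                                    # size of the prospective cohort in the abstract
    risk = rng.beta(2, 12, size=n)              # hypothetical predicted probabilities
    readmit = rng.random(n) < risk              # hypothetical observed outcomes

    # c-statistic (area under the ROC curve) via the rank-sum formulation
    order = np.argsort(risk)
    ranks = np.empty(n)
    ranks[order] = np.arange(1, n + 1)
    n_pos, n_neg = readmit.sum(), (~readmit).sum()
    c_stat = (ranks[readmit].sum() - n_pos * (n_pos + 1) / 2) / (n_pos * n_neg)
    print(f"c-statistic = {c_stat:.2f}")

    # prospective-style check: compare predicted risk categories with observed readmission incidence
    quintile = np.digitize(risk, np.quantile(risk, [0.2, 0.4, 0.6, 0.8]))
    for q in range(5):
        sel = quintile == q
        print(f"risk quintile {q + 1}: mean predicted {risk[sel].mean():.3f}, observed {readmit[sel].mean():.3f}")
    ```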

  20. Electrostatics of cysteine residues in proteins: Parameterization and validation of a simple model

    Science.gov (United States)

    Salsbury, Freddie R.; Poole, Leslie B.; Fetrow, Jacquelyn S.

    2013-01-01

    One of the most popular and simple models for the calculation of pKas from a protein structure is the semi-macroscopic electrostatic model MEAD. This model requires empirical parameters for each residue to calculate pKas. Analysis of current, widely used empirical parameters for cysteine residues showed that they did not reproduce expected cysteine pKas; thus, we set out to identify parameters consistent with the CHARMM27 force field that capture both the behavior of typical cysteines in proteins and the behavior of cysteines which have perturbed pKas. The new parameters were validated in three ways: (1) calculation across a large set of typical cysteines in proteins (where the calculations are expected to reproduce expected ensemble behavior); (2) calculation across a set of perturbed cysteines in proteins (where the calculations are expected to reproduce the shifted ensemble behavior); and (3) comparison to experimentally determined pKa values (where the calculation should reproduce the pKa within experimental error). Both the general behavior of cysteines in proteins and the perturbed pKa in some proteins can be predicted reasonably well using the newly determined empirical parameters within the MEAD model for protein electrostatics. This study provides the first general analysis of the electrostatics of cysteines in proteins, with specific attention paid to capturing both the behavior of typical cysteines in a protein and the behavior of cysteines whose pKa should be shifted, and validation of force field parameters for cysteine residues. PMID:22777874
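
    Why a shifted cysteine pKa matters can be seen directly from the Henderson-Hasselbalch relation: the short sketch below (illustrative pKa values, not taken from the paper) compares the reactive thiolate fraction at physiological pH for a typical cysteine and for one with a strongly acid-shifted pKa.

    ```python
    import numpy as np

    def thiolate_fraction(pH, pKa):
        """Fraction of a cysteine thiol that is deprotonated (thiolate) at a given pH."""
        return 1.0 / (1.0 + 10.0 ** (pKa - pH))

    pH = 7.4
    # illustrative pKa values only (roughly typical vs. strongly perturbed), not from the study
    for label, pKa in [("typical cysteine", 8.5), ("perturbed (acid-shifted) cysteine", 5.5)]:
        print(f"{label}: pKa {pKa}, thiolate fraction at pH {pH} = {thiolate_fraction(pH, pKa):.2f}")
    ```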

  1. In silico modelling and validation of differential expressed proteins in lung cancer

    Directory of Open Access Journals (Sweden)

    Bhagavathi S

    2012-05-01

    Full Text Available Objective: The present study aims to predict the three-dimensional structures of three major proteins responsible for causing lung cancer. Methods: These are differentially expressed proteins in a lung cancer dataset. Initially, a structural template for each protein was identified from a structural database using homology search, and homology modelling was performed to predict its native 3D structure. Each three-dimensional model obtained was validated using Ramachandran plot analysis to assess its reliability. Results: Four proteins were differentially expressed and significant in causing lung cancer. Among the four, matrix metalloproteinase (P39900) had a known 3D structure and hence was not considered for modelling. The remaining proteins, Polo-like kinase 1 (Q58A51), Trophinin (B1AKF1) and Thrombomodulin (P07204), were modelled and validated. Conclusions: The three-dimensional structure of a protein provides insights into its functional and regulatory aspects. Thus, this study will be a breakthrough for further lung cancer related studies.

  2. Implementing an X-ray validation pipeline for the Protein Data Bank.

    Science.gov (United States)

    Gore, Swanand; Velankar, Sameer; Kleywegt, Gerard J

    2012-04-01

    There is an increasing realisation that the quality of the biomacromolecular structures deposited in the Protein Data Bank (PDB) archive needs to be assessed critically using established and powerful validation methods. The Worldwide Protein Data Bank (wwPDB) organization has convened several Validation Task Forces (VTFs) to advise on the methods and standards that should be used to validate all of the entries already in the PDB as well as all structures that will be deposited in the future. The recommendations of the X-ray VTF are currently being implemented in a software pipeline. Here, ongoing work on this pipeline is briefly described as well as ways in which validation-related information could be presented to users of structural data.

  3. Protein and amino acid bioavailability estimates for canine foods

    NARCIS (Netherlands)

    Hendriks, W.H.; Bakker, E.J.; Bosch, G.

    2015-01-01

    Estimates of nutrient bioavailability are required for establishing dietary nutrient requirements and to evaluate the nutritional value of food ingredients or foods that are exposed to processing or extended storage. This study aimed to generate estimates for the bioavailability of dietary CP and AA

  4. Protein structure estimation from NMR data by matrix completion.

    Science.gov (United States)

    Li, Zhicheng; Li, Yang; Lei, Qiang; Zhao, Qing

    2017-09-01

    Knowledge of protein structures is very important to understand their corresponding physical and chemical properties. Nuclear Magnetic Resonance (NMR) spectroscopy is one of the main methods to measure protein structure. In this paper, we propose a two-stage approach to calculate the structure of a protein from a highly incomplete distance matrix, where most data are obtained from NMR. We first randomly "guess" a small part of unobservable distances by utilizing the triangle inequality, which is crucial for the second stage. Then we use matrix completion to calculate the protein structure from the obtained incomplete distance matrix. We apply the accelerated proximal gradient algorithm to solve the corresponding optimization problem. Furthermore, the recovery error of our method is analyzed, and its efficiency is demonstrated by several practical examples.
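
    The sketch below illustrates the underlying idea of completing a sparse distance matrix and recovering coordinates, using a simple alternating-projection heuristic with classical MDS rather than the accelerated proximal gradient matrix-completion algorithm described in the paper; the toy coordinates, 35% sampling fraction and iteration count are all assumptions made for the example.

    ```python
    import numpy as np

    rng = np.random.default_rng(2)

    # toy "protein": random 3-D coordinates for 30 atoms (stand-in for a real structure)
    n = 30
    X_true = rng.normal(size=(n, 3))
    D_true = np.square(X_true[:, None, :] - X_true[None, :, :]).sum(-1)   # squared distances

    # keep only ~35% of the pairwise distances, mimicking a highly incomplete distance matrix
    mask = rng.random((n, n)) < 0.35
    mask = np.triu(mask, 1)
    mask = mask | mask.T | np.eye(n, dtype=bool)

    def edm_to_coords(D, dim=3):
        # classical MDS: double-centre the squared-distance matrix and keep the top eigenpairs
        J = np.eye(len(D)) - np.ones_like(D) / len(D)
        G = -0.5 * J @ D @ J
        w, V = np.linalg.eigh(G)
        w = np.clip(w[::-1][:dim], 0, None)
        V = V[:, ::-1][:, :dim]
        return V * np.sqrt(w)

    # alternating projection: re-impose the observed entries, then project back to a rank-3 geometry
    D = np.where(mask, D_true, D_true[mask].mean())     # crude initial guess for the missing entries
    for _ in range(200):
        X = edm_to_coords(D)
        D = np.square(X[:, None, :] - X[None, :, :]).sum(-1)
        D[mask] = D_true[mask]

    err = np.abs(np.sqrt(D) - np.sqrt(D_true)).max()
    print(f"largest pairwise distance error after completion: {err:.3f}")
    ```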

  5. Estimating Rooftop Suitability for PV: A Review of Methods, Patents, and Validation Techniques

    Energy Technology Data Exchange (ETDEWEB)

    Melius, J. [National Renewable Energy Lab. (NREL), Golden, CO (United States); Margolis, R. [National Renewable Energy Lab. (NREL), Golden, CO (United States); Ong, S. [National Renewable Energy Lab. (NREL), Golden, CO (United States)

    2013-12-01

    A number of methods have been developed using remote sensing data to estimate rooftop area suitable for the installation of photovoltaics (PV) at various geospatial resolutions. This report reviews the literature and patents on methods for estimating rooftop-area appropriate for PV, including constant-value methods, manual selection methods, and GIS-based methods. This report also presents NREL's proposed method for estimating suitable rooftop area for PV using Light Detection and Ranging (LiDAR) data in conjunction with a GIS model to predict areas with appropriate slope, orientation, and sunlight. NREL's method is validated against solar installation data from New Jersey, Colorado, and California to compare modeled results to actual on-the-ground measurements.

  6. The relative validity and repeatability of an FFQ for estimating intake of zinc and its absorption modifiers in young and older Saudi adults.

    Science.gov (United States)

    Alsufiani, Hadeil M; Yamani, Fatmah; Kumosani, Taha A; Ford, Dianne; Mathers, John C

    2015-04-01

    To assess the relative validity and repeatability of a sixty-four-item FFQ for estimating dietary intake of Zn and its absorption modifiers in Saudi adults. In addition, we used the FFQ to investigate the effect of age and gender on these intakes. To assess validity, all participants completed the FFQ (FFQ1) and a 3 d food record. After 1 month, the FFQ was administered for a second time (FFQ2) to assess repeatability. Jeddah, Saudi Arabia. One hundred males and females aged 20-30 years and 60-70 years participated. Mean intakes of Zn and protein from FFQ1 were significantly higher than those from the food record, while there were no detectable differences between tools for measurement of phytic acid intake. Estimated intakes of Zn, protein and phytate by both approaches were strongly correlated. For Zn and phytic acid, the difference between the two tools increased with increasing mean intake. Zn and protein intakes from FFQ1 and FFQ2 were highly correlated (r>0·68). Older adults consumed less Zn and protein compared with young adults. Intakes of all dietary components were lower in females than in males. The FFQ developed and tested in the current study demonstrated reasonable relative validity and high repeatability and was capable of detecting differences in intakes between age and gender groups.
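
    Two of the statistics commonly used in FFQ validation studies of this kind, energy adjustment by the residual method and de-attenuation of the correlation for within-person variation in the reference method, are sketched below with hypothetical intake data and variance components (none of the numbers come from the study).

    ```python
    import numpy as np

    rng = np.random.default_rng(3)

    # hypothetical paired intakes: FFQ vs mean of repeated food records for n participants
    n = 100
    energy_ffq, energy_rec = rng.normal(2000, 400, n), rng.normal(1900, 380, n)
    zinc_ffq = 0.004 * energy_ffq + rng.normal(0, 1.5, n)
    zinc_rec = 0.004 * energy_rec + rng.normal(0, 1.2, n)

    def residual_adjust(nutrient, energy):
        # Willett residual method: regress nutrient on energy and keep the residuals
        slope, intercept = np.polyfit(energy, nutrient, 1)
        return nutrient - (slope * energy + intercept)

    def pearson(a, b):
        return np.corrcoef(a, b)[0, 1]

    r_crude = pearson(zinc_ffq, zinc_rec)
    r_adj = pearson(residual_adjust(zinc_ffq, energy_ffq), residual_adjust(zinc_rec, energy_rec))

    # de-attenuation for within-person variation in the reference method:
    # r_true = r_obs * sqrt(1 + (s_within^2 / s_between^2) / d), with d replicate records per person
    s_within2, s_between2, d = 1.0, 2.5, 3            # hypothetical variance components
    r_deatt = r_adj * np.sqrt(1 + (s_within2 / s_between2) / d)

    print(f"crude r={r_crude:.2f}  energy-adjusted r={r_adj:.2f}  de-attenuated r={r_deatt:.2f}")
    ```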

  7. Long-term monitoring of endangered Laysan ducks: Index validation and population estimates 1998–2012

    Science.gov (United States)

    Reynolds, Michelle H.; Courtot, Karen; Brinck, Kevin W.; Rehkemper, Cynthia; Hatfield, Jeffrey

    2015-01-01

    Monitoring endangered wildlife is essential to assessing management or recovery objectives and learning about population status. We tested assumptions of a population index for endangered Laysan duck (or teal; Anas laysanensis) monitored using mark–resight methods on Laysan Island, Hawai’i. We marked 723 Laysan ducks between 1998 and 2009 and identified seasonal surveys through 2012 that met accuracy and precision criteria for estimating population abundance. Our results provide a 15-y time series of seasonal population estimates at Laysan Island. We found differences in detection among seasons and how observed counts related to population estimates. The highest counts and the strongest relationship between count and population estimates occurred in autumn (September–November). The best autumn surveys yielded population abundance estimates that ranged from 674 (95% CI = 619–730) in 2003 to 339 (95% CI = 265–413) in 2012. A population decline of 42% was observed between 2010 and 2012 after consecutive storms and Japan’s Tōhoku earthquake-generated tsunami in 2011. Our results show positive correlations between the seasonal maximum counts and population estimates from the same date, and support the use of standardized bimonthly counts of unmarked birds as a valid index to monitor trends among years within a season at Laysan Island.

  8. Development and validation of a two-dimensional fast-response flood estimation model

    Energy Technology Data Exchange (ETDEWEB)

    Judi, David R [Los Alamos National Laboratory; Mcpherson, Timothy N [Los Alamos National Laboratory; Burian, Steven J [UNIV OF UTAH

    2009-01-01

    A finite difference formulation of the shallow water equations using an upwind differencing method was developed maintaining computational efficiency and accuracy such that it can be used as a fast-response flood estimation tool. The model was validated using both laboratory-controlled experiments and an actual dam breach. Through the laboratory experiments, the model was shown to give good estimations of depth and velocity when compared to the measured data, as well as when compared to a more complex two-dimensional model. Additionally, the model was compared to high water mark data obtained from the failure of the Taum Sauk dam. The simulated inundation extent agreed well with the observed extent, with the most notable differences resulting from the inability to model sediment transport. The results of these validation studies show that a relatively simple numerical scheme used to solve the complete shallow water equations can be used to accurately estimate flood inundation. Future work will focus on further reducing the computation time needed to provide flood inundation estimates for fast-response analyses. This will be accomplished through the efficient use of multi-core, multi-processor computers coupled with an efficient domain-tracking algorithm, as well as an understanding of the impacts of grid resolution on model results.
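
    For readers unfamiliar with this class of solver, the sketch below integrates the one-dimensional shallow-water equations for an idealized dam break with a first-order local Lax-Friedrichs (Rusanov) scheme; it is only a conceptual illustration, not the paper's two-dimensional upwind formulation, and the domain, initial depths and CFL number are arbitrary.

    ```python
    import numpy as np

    g, N, L = 9.81, 200, 10.0                        # gravity (m/s^2), cells, domain length (m)
    dx = L / N
    x = (np.arange(N) + 0.5) * dx

    # idealized dam break: deep water on the left, shallow water on the right
    h = np.where(x < L / 2, 2.0, 0.5)                # depth (m)
    hu = np.zeros(N)                                 # momentum (m^2/s)

    def flux(h, hu):
        u = hu / h
        return np.array([hu, hu * u + 0.5 * g * h * h])

    t, t_end = 0.0, 0.5
    while t < t_end:
        U = np.array([h, hu])
        F = flux(h, hu)
        c = np.abs(hu / h) + np.sqrt(g * h)          # local wave speeds
        dt = 0.4 * dx / c.max()                      # CFL-limited time step
        # local Lax-Friedrichs (Rusanov) fluxes at the interior cell interfaces
        a = np.maximum(c[:-1], c[1:])
        F_iface = 0.5 * (F[:, :-1] + F[:, 1:]) - 0.5 * a * (U[:, 1:] - U[:, :-1])
        U[:, 1:-1] -= dt / dx * (F_iface[:, 1:] - F_iface[:, :-1])
        h, hu = U                                    # boundary cells held fixed (simple open ends)
        t += dt

    print(f"depth range after {t_end} s: {h.min():.2f} m to {h.max():.2f} m")
    ```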

  9. Validation of Molecular Dynamics Simulations for Prediction of Three-Dimensional Structures of Small Proteins.

    Science.gov (United States)

    Kato, Koichi; Nakayoshi, Tomoki; Fukuyoshi, Shuichi; Kurimoto, Eiji; Oda, Akifumi

    2017-10-12

    Although various higher-order protein structure prediction methods have been developed, almost all of them were developed based on the three-dimensional (3D) structure information of known proteins. Here we predicted short protein structures by molecular dynamics (MD) simulations in which only Newton's equations of motion were used and 3D structural information of known proteins was not required. To evaluate the ability of MD simulation to predict protein structures, we calculated seven short test proteins (10-46 residues) starting from the denatured state and compared their predicted and experimental structures. The predicted structure for Trp-cage (20 residues) was close to the experimental structure after a 200-ns MD simulation. For proteins shorter or longer than Trp-cage, root-mean-square deviation values were larger than those for Trp-cage. However, secondary structures could be reproduced by MD simulations for proteins with 10-34 residues. Simulations by replica exchange MD were performed, but the results were similar to those from normal MD simulations. These results suggest that normal MD simulations can roughly predict short protein structures and that 200-ns simulations are frequently sufficient for estimating the secondary structures of proteins of approximately 20 residues. Structure prediction methods using only fundamental physical laws are useful for investigating non-natural proteins, such as primitive proteins and artificial proteins for peptide-based drug delivery systems.
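
    Comparisons like those above hinge on the RMSD between predicted and experimental coordinates after optimal superposition; a minimal numpy implementation of the Kabsch alignment and RMSD is sketched below on toy coordinates (the structures and noise level are made up for the example).

    ```python
    import numpy as np

    def kabsch_rmsd(P, Q):
        """RMSD between two (N, 3) coordinate sets after optimal superposition (Kabsch)."""
        P, Q = P - P.mean(0), Q - Q.mean(0)
        V, S, Wt = np.linalg.svd(P.T @ Q)
        d = np.sign(np.linalg.det(V @ Wt))          # avoid improper rotations (reflections)
        R = V @ np.diag([1.0, 1.0, d]) @ Wt
        diff = P @ R - Q
        return np.sqrt((diff * diff).sum() / len(P))

    # toy check: a structure and a rotated/translated noisy copy (stand-ins for predicted vs experimental)
    rng = np.random.default_rng(4)
    ref = rng.normal(size=(20, 3))                  # e.g. 20 C-alpha positions, Trp-cage-sized
    theta = 0.7
    Rz = np.array([[np.cos(theta), -np.sin(theta), 0],
                   [np.sin(theta),  np.cos(theta), 0],
                   [0, 0, 1]])
    model = ref @ Rz.T + 5.0 + rng.normal(0, 0.2, size=ref.shape)
    print(f"C-alpha RMSD: {kabsch_rmsd(model, ref):.2f} Å")
    ```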

  10. Improved best estimate plus uncertainty methodology, including advanced validation concepts, to license evolving nuclear reactors

    International Nuclear Information System (INIS)

    Unal, C.; Williams, B.; Hemez, F.; Atamturktur, S.H.; McClure, P.

    2011-01-01

    Research highlights: → The best estimate plus uncertainty methodology (BEPU) is one option in the licensing of nuclear reactors. → The challenges for extending the BEPU method for fuel qualification for an advanced reactor fuel are primarily driven by schedule, the need for data, and the sufficiency of the data. → In this paper we develop an extended BEPU methodology that can potentially be used to address these new challenges in the design and licensing of advanced nuclear reactors. → The main components of the proposed methodology are verification, validation, calibration, and uncertainty quantification. → The methodology includes a formalism to quantify an adequate level of validation (predictive maturity) with respect to existing data, so that required new testing can be minimized, saving cost by demonstrating that further testing will not enhance the quality of the predictive tools. - Abstract: Many evolving nuclear energy technologies use advanced predictive multiscale, multiphysics modeling and simulation (M and S) capabilities to reduce the cost and schedule of design and licensing. Historically, the role of experiments has been as a primary tool for the design and understanding of nuclear system behavior, while M and S played the subordinate role of supporting experiments. In the new era of multiscale, multiphysics computational-based technology development, this role has been reversed. The experiments will still be needed, but they will be performed at different scales to calibrate and validate the models leading to predictive simulations for design and licensing. Minimizing the required number of validation experiments produces cost and time savings. The use of multiscale, multiphysics models introduces challenges in validating these predictive tools - traditional methodologies will have to be modified to address these challenges. This paper gives the basic aspects of a methodology that can potentially be used to address these new challenges in

  11. Estimation of in-vivo neurotransmitter release by brain microdialysis: the issue of validity.

    Science.gov (United States)

    Di Chiara, G.; Tanda, G.; Carboni, E.

    1996-11-01

    Although microdialysis is commonly understood as a method of sampling low molecular weight compounds in the extracellular compartment of tissues, this definition appears insufficient to specifically describe brain microdialysis of neurotransmitters. In fact, transmitter overflow from the brain into dialysates is critically dependent upon the composition of the perfusing Ringer. Therefore, the dialysing Ringer not only recovers the transmitter from the extracellular brain fluid but is a main determinant of its in-vivo release. Two types of brain microdialysis are distinguished: quantitative micro-dialysis and conventional microdialysis. Quantitative microdialysis provides an estimate of neurotransmitter concentrations in the extracellular fluid in contact with the probe. However, this information might poorly reflect the kinetics of neurotransmitter release in vivo. Conventional microdialysis involves perfusion at a constant rate with a transmitter-free Ringer, resulting in the formation of a steep neurotransmitter concentration gradient extending from the Ringer into the extracellular fluid. This artificial gradient might be critical for the ability of conventional microdialysis to detect and resolve phasic changes in neurotransmitter release taking place in the implanted area. On the basis of these characteristics, conventional microdialysis of neurotransmitters can be conceptualized as a model of the in-vivo release of neurotransmitters in the brain. As such, the criteria of face-validity, construct-validity and predictive-validity should be applied to select the most appropriate experimental conditions for estimating neurotransmitter release in specific brain areas in relation to behaviour.

  12. Estimating misclassification error: a closer look at cross-validation based methods

    Directory of Open Access Journals (Sweden)

    Ounpraseuth Songthip

    2012-11-01

    Full Text Available Abstract Background: To estimate a classifier’s error in predicting future observations, bootstrap methods have been proposed as reduced-variation alternatives to traditional cross-validation (CV) methods based on sampling without replacement. Monte Carlo (MC) simulation studies aimed at estimating the true misclassification error conditional on the training set are commonly used to compare CV methods. We conducted an MC simulation study to compare a new method of bootstrap CV (BCV) to k-fold CV for estimating classification error. Findings: For the low-dimensional conditions simulated, the modest positive bias of k-fold CV contrasted sharply with the substantial negative bias of the new BCV method. This behavior was corroborated using a real-world dataset of prognostic gene-expression profiles in breast cancer patients. Our simulation results demonstrate some extreme characteristics of variance and bias that can occur due to a fault in the design of CV exercises aimed at estimating the true conditional error of a classifier, and that appear not to have been fully appreciated in previous studies. Although CV is a sound practice for estimating a classifier’s generalization error, using CV to estimate the fixed misclassification error of a trained classifier conditional on the training set is problematic. While MC simulation of this estimation exercise can correctly represent the average bias of a classifier, it will overstate the between-run variance of the bias. Conclusions: We recommend k-fold CV over the new BCV method for estimating a classifier’s generalization error. The extreme negative bias of BCV is too high a price to pay for its reduced variance.
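
    The kind of Monte Carlo comparison described above can be reproduced in outline as follows: repeatedly draw a small training set, compute the true error of the trained rule on a large independent sample, and compare it with a k-fold CV estimate and an out-of-bag bootstrap estimate. The sketch below uses a nearest-centroid classifier and a generic out-of-bag bootstrap, not the authors' BCV procedure or their simulation settings.

    ```python
    import numpy as np

    rng = np.random.default_rng(5)

    def make_data(n, p=5, delta=0.8):
        y = rng.integers(0, 2, n)
        X = rng.normal(size=(n, p)) + delta * y[:, None]      # class 1 shifted by delta
        return X, y

    def fit_predict(Xtr, ytr, Xte):
        m0, m1 = Xtr[ytr == 0].mean(0), Xtr[ytr == 1].mean(0)
        return (np.linalg.norm(Xte - m1, axis=1) < np.linalg.norm(Xte - m0, axis=1)).astype(int)

    def kfold_error(X, y, k=10):
        folds = np.array_split(rng.permutation(len(y)), k)
        return np.mean([np.mean(fit_predict(np.delete(X, f, 0), np.delete(y, f), X[f]) != y[f])
                        for f in folds])

    def oob_bootstrap_error(X, y, B=50):
        errs = []
        for _ in range(B):
            b = rng.integers(0, len(y), len(y))               # bootstrap training sample
            oob = np.setdiff1d(np.arange(len(y)), b)          # out-of-bag cases used as the test set
            errs.append(np.mean(fit_predict(X[b], y[b], X[oob]) != y[oob]))
        return np.mean(errs)

    bias_k, bias_b = [], []
    for _ in range(100):                                      # Monte Carlo runs
        X, y = make_data(40)
        Xbig, ybig = make_data(5000)
        true_err = np.mean(fit_predict(X, y, Xbig) != ybig)   # error conditional on this training set
        bias_k.append(kfold_error(X, y) - true_err)
        bias_b.append(oob_bootstrap_error(X, y) - true_err)

    print(f"mean bias: 10-fold CV {np.mean(bias_k):+.3f}, out-of-bag bootstrap {np.mean(bias_b):+.3f}")
    ```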

  13. Fiber-bound nitrogen in gorilla diets: implications for estimating dietary protein intake of primates.

    Science.gov (United States)

    Rothman, Jessica M; Chapman, Colin A; Pell, Alice N

    2008-07-01

    Protein is essential for living organisms, but digestibility of crude protein is poorly understood and difficult to predict. Nitrogen is used to estimate protein content because nitrogen is a component of the amino acids that comprise protein, but a substantial portion of the nitrogen in plants may be bound to fiber in an indigestible form. To estimate the amount of crude protein that is unavailable in the diets of mountain gorillas (Gorilla beringei) in Bwindi Impenetrable National Park, Uganda, foods routinely eaten were analyzed to determine the amount of nitrogen bound to the acid-detergent fiber residue. The amount of fiber-bound nitrogen varied among plant parts: herbaceous leaves 14.5 ± 8.9% (reported as a percentage of crude protein on a dry matter (DM) basis), tree leaves (16.1 ± 6.7% DM), pith/herbaceous peel (26.2 ± 8.9% DM), fruit (34.7 ± 17.8% DM), bark (43.8 ± 15.6% DM), and decaying wood (85.2 ± 14.6% DM). When crude protein and available protein intake of adult gorillas was estimated over a year, 15.1% of the dietary crude protein was indigestible. These results indicate that the proportion of fiber-bound protein in primate diets should be considered when estimating protein intake, food selection, and food/habitat quality.

  14. A stepwise validation of a wearable system for estimating energy expenditure in field-based research

    International Nuclear Information System (INIS)

    Rumo, Martin; Mäder, Urs; Amft, Oliver; Tröster, Gerhard

    2011-01-01

    Regular physical activity (PA) is an important contributor to a healthy lifestyle. Currently, standard sensor-based methods to assess PA in field-based research rely on a single accelerometer mounted near the body's center of mass. This paper introduces a wearable system that estimates energy expenditure (EE) based on seven recognized activity types. The system was developed with data from 32 healthy subjects and consists of a chest mounted heart rate belt and two accelerometers attached to a thigh and dominant upper arm. The system was validated with 12 other subjects under restricted lab conditions and simulated free-living conditions against indirect calorimetry, as well as in subjects' habitual environments for 2 weeks against the doubly labeled water method. Our stepwise validation methodology gradually trades reference information from the lab against realistic data from the field. The average accuracy for EE estimation was 88% for restricted lab conditions, 55% for simulated free-living conditions and 87% and 91% for the estimation of average daily EE over the period of 1 and 2 weeks

  15. DeepQA: Improving the estimation of single protein model quality with deep belief networks

    OpenAIRE

    Cao, Renzhi; Bhattacharya, Debswapna; Hou, Jie; Cheng, Jianlin

    2016-01-01

    Background Protein quality assessment (QA) useful for ranking and selecting protein models has long been viewed as one of the major challenges for protein tertiary structure prediction. Especially, estimating the quality of a single protein model, which is important for selecting a few good models out of a large model pool consisting of mostly low-quality models, is still a largely unsolved problem. Results We introduce a novel single-model quality assessment method DeepQA based on deep belie...

  16. MR-based water content estimation in cartilage: design and validation of a method

    DEFF Research Database (Denmark)

    Shiguetomi Medina, Juan Manuel; Kristiansen, Maja Sophie; Ringgaard, Steffen

    Purpose: Design and validation of an MR-based method that allows the calculation of the water content in cartilage tissue. Methods and Materials: Cartilage tissue T1-map-based water content MR sequences were used on a system held stable at 37 °C. The T1 map intensity signal was analyzed on 6...... cartilage samples from living animals (pig) and on 8 gelatin samples whose water content was already known. For the data analysis, a T1 intensity signal map software analyzer was used. Finally, the method was validated by measuring and comparing 3 more cartilage samples in a living animal (pig). The obtained...... map-based water content sequences can provide information that, after being analyzed using T1-map analysis software, can be interpreted as the water contained inside the cartilage tissue. The amount of water estimated using this method was similar to that obtained with the dry-freeze procedure...

  17. Low-cost extrapolation method for maximal LTE radio base station exposure estimation: test and validation.

    Science.gov (United States)

    Verloock, Leen; Joseph, Wout; Gati, Azeddine; Varsier, Nadège; Flach, Björn; Wiart, Joe; Martens, Luc

    2013-06-01

    An experimental validation of a low-cost method for extrapolation and estimation of the maximal electromagnetic-field exposure from long-term evolution (LTE) radio base station installations is presented. No knowledge on downlink band occupation or service characteristics is required for the low-cost method. The method is applicable in situ. It only requires a basic spectrum analyser with appropriate field probes, without the need for expensive dedicated LTE decoders. The method is validated both in laboratory and in situ, for a single-input single-output antenna LTE system and a 2×2 multiple-input multiple-output system, with low deviations in comparison with signals measured using dedicated LTE decoders.

  18. Low-cost extrapolation method for maximal lte radio base station exposure estimation: Test and validation

    International Nuclear Information System (INIS)

    Verloock, L.; Joseph, W.; Gati, A.; Varsier, N.; Flach, B.; Wiart, J.; Martens, L.

    2013-01-01

    An experimental validation of a low-cost method for extrapolation and estimation of the maximal electromagnetic-field exposure from long-term evolution (LTE) radio base station installations is presented. No knowledge on down-link band occupation or service characteristics is required for the low-cost method. The method is applicable in situ. It only requires a basic spectrum analyser with appropriate field probes, without the need for expensive dedicated LTE decoders. The method is validated both in laboratory and in situ, for a single-input single-output antenna LTE system and a 2x2 multiple-input multiple-output system, with low deviations in comparison with signals measured using dedicated LTE decoders. (authors)

  19. Estimating the effect of fermentation yeast on distillers grains protein

    Science.gov (United States)

    Distillers dried grains with solubles (DDGS) is the key co-product of bio-ethanol production from grains. Major factors affecting its quality and market values include protein quantity (concentration) and quality (amino acid composition). Yet, the effect of fermentation yeast on DDGS quality has no...

  20. MSX-3D: a tool to validate 3D protein models using mass spectrometry.

    Science.gov (United States)

    Heymann, Michaël; Paramelle, David; Subra, Gilles; Forest, Eric; Martinez, Jean; Geourjon, Christophe; Deléage, Gilbert

    2008-12-01

    The technique of chemical cross-linking followed by mass spectrometry has proven to bring valuable information about the protein structure and interactions between protein subunits. It is an effective and efficient way to experimentally investigate some aspects of a protein structure when NMR and X-ray crystallography data are lacking. We introduce MSX-3D, a tool specifically geared to validate protein models using mass spectrometry. In addition to classical peptide identification, it allows an interactive 3D visualization of the distance constraints derived from a cross-linking experiment. Freely available at http://proteomics-pbil.ibcp.fr

  1. On-Road Validation of a Simplified Model for Estimating Real-World Fuel Economy: Preprint

    Energy Technology Data Exchange (ETDEWEB)

    Wood, Eric; Gonder, Jeff; Jehlik, Forrest

    2017-01-01

    On-road fuel economy is known to vary significantly between individual trips in real-world driving conditions. This work introduces a methodology for rapidly simulating a specific vehicle's fuel economy over the wide range of real-world conditions experienced across the country. On-road test data collected using a highly instrumented vehicle is used to refine and validate this modeling approach. Model accuracy relative to on-road data collection is relevant to the estimation of 'off-cycle credits' that compensate for real-world fuel economy benefits that are not observed during certification testing on a chassis dynamometer.

  2. Convergent validity of ActiGraph and Actical accelerometers for estimating physical activity in adults

    DEFF Research Database (Denmark)

    Duncan, Scott; Stewart, Tom; Bo Schneller, Mikkel

    2018-01-01

    PURPOSE: The aim of the present study was to examine the convergent validity of two commonly-used accelerometers for estimating time spent in various physical activity intensities in adults. METHODS: The sample comprised 37 adults (26 males) with a mean (SD) age of 37.6 (12.2) years from San Diego......, USA. Participants wore ActiGraph GT3X+ and Actical accelerometers for three consecutive days. Percent agreement was used to compare time spent within four physical activity intensity categories under three counts per minute (CPM) threshold protocols: (1) using thresholds developed specifically......Graph and Actical accelerometers provide significantly different estimates of time spent in various physical activity intensities. Regression and threshold adjustment were able to reduce these differences, although some level of non-agreement persisted. Researchers should be aware of the inherent limitations...

  3. Model-based PSF and MTF estimation and validation from skeletal clinical CT images.

    Science.gov (United States)

    Pakdel, Amirreza; Mainprize, James G; Robert, Normand; Fialkov, Jeffery; Whyne, Cari M

    2014-01-01

    A method was developed to correct for systematic errors in estimating the thickness of thin bones due to image blurring in CT images using bone interfaces to estimate the point-spread-function (PSF). This study validates the accuracy of the PSFs estimated using said method from various clinical CT images featuring cortical bones. Gaussian PSFs, characterized by a different extent in the z (scan) direction than in the x and y directions, were obtained using our method from 11 clinical CT scans of a cadaveric craniofacial skeleton. These PSFs were estimated for multiple combinations of scanning parameters and reconstruction methods. The actual PSF for each scan setting was measured using the slanted-slit technique within the image slice plane and the longitudinal axis. The Gaussian PSF and the corresponding modulation transfer function (MTF) are compared against the actual PSF and MTF for validation. The differences (errors) between the actual and estimated full-width half-max (FWHM) of the PSFs were 0.09 ± 0.05 and 0.14 ± 0.11 mm for the xy and z axes, respectively. The overall errors in the predicted frequencies measured at 75%, 50%, 25%, 10%, and 5% MTF levels were 0.06 ± 0.07 and 0.06 ± 0.04 cycles/mm for the xy and z axes, respectively. The accuracy of the estimates was dependent on whether they were reconstructed with a standard kernel (Toshiba's FC68, mean error of 0.06 ± 0.05 mm, MTF mean error 0.02 ± 0.02 cycles/mm) or a high resolution bone kernel (Toshiba's FC81, PSF FWHM error 0.12 ± 0.03 mm, MTF mean error 0.09 ± 0.08 cycles/mm). The method is accurate in 3D for an image reconstructed using a standard reconstruction kernel, which conforms to the Gaussian PSF assumption, but less accurate when using a high resolution bone kernel. The method is a practical and self-contained means of estimating the PSF in clinical CT images featuring cortical bones, without the need for phantoms or any prior knowledge about the scanner-specific parameters.
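
    For a Gaussian PSF the MTF has a closed form, so the frequencies at fixed MTF levels (as reported above) follow directly from the FWHM; the sketch below shows the relationship using a hypothetical in-plane FWHM rather than the paper's estimates.

    ```python
    import numpy as np

    def gaussian_mtf(f, fwhm):
        """MTF of a Gaussian PSF with the given FWHM (analytic Fourier transform)."""
        sigma = fwhm / (2.0 * np.sqrt(2.0 * np.log(2.0)))
        return np.exp(-2.0 * (np.pi * sigma * f) ** 2)

    def freq_at_mtf(level, fwhm):
        """Spatial frequency (cycles/mm) at which the MTF drops to `level`."""
        sigma = fwhm / (2.0 * np.sqrt(2.0 * np.log(2.0)))
        return np.sqrt(-np.log(level) / 2.0) / (np.pi * sigma)

    fwhm_xy = 0.8          # hypothetical in-plane PSF FWHM in mm (not the paper's value)
    for level in (0.75, 0.50, 0.25, 0.10, 0.05):
        print(f"MTF {int(level * 100):>2d}%: {freq_at_mtf(level, fwhm_xy):.2f} cycles/mm")
    ```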

  4. Model-based PSF and MTF estimation and validation from skeletal clinical CT images

    International Nuclear Information System (INIS)

    Pakdel, Amirreza; Mainprize, James G.; Robert, Normand; Fialkov, Jeffery; Whyne, Cari M.

    2014-01-01

    Purpose: A method was developed to correct for systematic errors in estimating the thickness of thin bones due to image blurring in CT images using bone interfaces to estimate the point-spread-function (PSF). This study validates the accuracy of the PSFs estimated using said method from various clinical CT images featuring cortical bones. Methods: Gaussian PSFs, characterized by a different extent in the z (scan) direction than in the x and y directions were obtained using our method from 11 clinical CT scans of a cadaveric craniofacial skeleton. These PSFs were estimated for multiple combinations of scanning parameters and reconstruction methods. The actual PSF for each scan setting was measured using the slanted-slit technique within the image slice plane and the longitudinal axis. The Gaussian PSF and the corresponding modulation transfer function (MTF) are compared against the actual PSF and MTF for validation. Results: The differences (errors) between the actual and estimated full-width half-max (FWHM) of the PSFs were 0.09 ± 0.05 and 0.14 ± 0.11 mm for the xy and z axes, respectively. The overall errors in the predicted frequencies measured at 75%, 50%, 25%, 10%, and 5% MTF levels were 0.06 ± 0.07 and 0.06 ± 0.04 cycles/mm for the xy and z axes, respectively. The accuracy of the estimates was dependent on whether they were reconstructed with a standard kernel (Toshiba's FC68, mean error of 0.06 ± 0.05 mm, MTF mean error 0.02 ± 0.02 cycles/mm) or a high resolution bone kernel (Toshiba's FC81, PSF FWHM error 0.12 ± 0.03 mm, MTF mean error 0.09 ± 0.08 cycles/mm). Conclusions: The method is accurate in 3D for an image reconstructed using a standard reconstruction kernel, which conforms to the Gaussian PSF assumption but less accurate when using a high resolution bone kernel. The method is a practical and self-contained means of estimating the PSF in clinical CT images featuring cortical bones, without the need phantoms or any prior knowledge about the

  5. Model-based PSF and MTF estimation and validation from skeletal clinical CT images

    Energy Technology Data Exchange (ETDEWEB)

    Pakdel, Amirreza [Sunnybrook Research Institute, Toronto, Ontario M4N 3M5, Canada and Institute of Biomaterials and Biomedical Engineering, University of Toronto, Toronto, Ontario M5S 3M2 (Canada); Mainprize, James G.; Robert, Normand [Sunnybrook Research Institute, Toronto, Ontario M4N 3M5 (Canada); Fialkov, Jeffery [Division of Plastic Surgery, Sunnybrook Health Sciences Center, Toronto, Ontario M4N 3M5, Canada and Department of Surgery, University of Toronto, Toronto, Ontario M5S 3M2 (Canada); Whyne, Cari M., E-mail: cari.whyne@sunnybrook.ca [Sunnybrook Research Institute, Toronto, Ontario M4N 3M5, Canada and Department of Surgery, Institute of Biomaterials and Biomedical Engineering, University of Toronto, Toronto, Ontario M5S 3M2 (Canada)

    2014-01-15

    Purpose: A method was developed to correct for systematic errors in estimating the thickness of thin bones due to image blurring in CT images using bone interfaces to estimate the point-spread-function (PSF). This study validates the accuracy of the PSFs estimated using said method from various clinical CT images featuring cortical bones. Methods: Gaussian PSFs, characterized by a different extent in the z (scan) direction than in the x and y directions were obtained using our method from 11 clinical CT scans of a cadaveric craniofacial skeleton. These PSFs were estimated for multiple combinations of scanning parameters and reconstruction methods. The actual PSF for each scan setting was measured using the slanted-slit technique within the image slice plane and the longitudinal axis. The Gaussian PSF and the corresponding modulation transfer function (MTF) are compared against the actual PSF and MTF for validation. Results: The differences (errors) between the actual and estimated full-width half-max (FWHM) of the PSFs were 0.09 ± 0.05 and 0.14 ± 0.11 mm for the xy and z axes, respectively. The overall errors in the predicted frequencies measured at 75%, 50%, 25%, 10%, and 5% MTF levels were 0.06 ± 0.07 and 0.06 ± 0.04 cycles/mm for the xy and z axes, respectively. The accuracy of the estimates was dependent on whether they were reconstructed with a standard kernel (Toshiba's FC68, mean error of 0.06 ± 0.05 mm, MTF mean error 0.02 ± 0.02 cycles/mm) or a high resolution bone kernel (Toshiba's FC81, PSF FWHM error 0.12 ± 0.03 mm, MTF mean error 0.09 ± 0.08 cycles/mm). Conclusions: The method is accurate in 3D for an image reconstructed using a standard reconstruction kernel, which conforms to the Gaussian PSF assumption but less accurate when using a high resolution bone kernel. The method is a practical and self-contained means of estimating the PSF in clinical CT images featuring cortical bones, without the need phantoms or any prior knowledge

  6. Type-specific human papillomavirus biological features: validated model-based estimates.

    Directory of Open Access Journals (Sweden)

    Iacopo Baussano

    Full Text Available Infection with high-risk (hr) human papillomavirus (HPV) is considered the necessary cause of cervical cancer. Vaccination against HPV16 and 18 types, which are responsible for about 75% of cervical cancer worldwide, is expected to have a major global impact on cervical cancer occurrence. Valid estimates of the parameters that regulate the natural history of hrHPV infections are crucial to draw reliable projections of the impact of vaccination. We devised a mathematical model to estimate the probability of infection transmission, the rate of clearance, and the patterns of immune response following the clearance of infection of 13 hrHPV types. To test the validity of our estimates, we fitted the same transmission model to two large independent datasets from Italy and Sweden and assessed the consistency of the findings. The two populations, both unvaccinated, differed substantially by sexual behaviour, age distribution, and study setting (screening for cervical cancer or Chlamydia trachomatis infection). Estimated transmission probabilities of hrHPV types (80% for HPV16, 73%-82% for HPV18, and above 50% for most other types); clearance rates decreasing as a function of time since infection; and partial protection against re-infection with the same hrHPV type (approximately 20% for HPV16 and 50% for the other types) were similar in the two countries. The model could accurately predict the HPV16 prevalence observed in Italy among women who were not infected three years before. In conclusion, our models inform on biological parameters that cannot at the moment be measured directly from any empirical data but are essential to forecast the impact of HPV vaccination programmes.

  7. Validity and reliability of central blood pressure estimated by upper arm oscillometric cuff pressure.

    Science.gov (United States)

    Climie, Rachel E D; Schultz, Martin G; Nikolic, Sonja B; Ahuja, Kiran D K; Fell, James W; Sharman, James E

    2012-04-01

    Noninvasive central blood pressure (BP) independently predicts mortality, but current methods are operator-dependent, requiring skill to obtain quality recordings. The aims of this study were first, to determine the validity of an automatic, upper arm oscillometric cuff method for estimating central BP (O(CBP)) by comparison with the noninvasive reference standard of radial tonometry (T(CBP)). Second, we determined the intratest and intertest reliability of O(CBP). To assess validity, central BP was estimated by O(CBP) (Pulsecor R6.5B monitor) and compared with T(CBP) (SphygmoCor) in 47 participants free from cardiovascular disease (aged 57 ± 9 years) in supine, seated, and standing positions. Brachial mean arterial pressure (MAP) and diastolic BP (DBP) from the O(CBP) device were used for calibration in both devices. Duplicate measures were recorded in each position on the same day to assess intratest reliability, and participants returned within 10 ± 7 days for repeat measurements to assess intertest reliability. There was a strong intraclass correlation (ICC = 0.987) and a small mean difference (1.2 ± 2.2 mm Hg) for central systolic BP (SBP) determined by O(CBP) compared with T(CBP). Ninety-six percent of all comparisons (n = 495 acceptable recordings) were within 5 mm Hg. With respect to reliability, there were strong correlations but higher limits of agreement for the intratest (ICC = 0.975; mean difference 0.6 ± 4.5 mm Hg) and intertest (ICC = 0.895; mean difference 4.3 ± 8.0 mm Hg) comparisons. Estimation of central SBP using cuff oscillometry is comparable to radial tonometry and has good reproducibility. As a noninvasive, relatively operator-independent method, O(CBP) may be as useful as T(CBP) for estimating central BP in clinical practice.
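
    The agreement statistics referred to above (mean difference, Bland-Altman limits of agreement, and the proportion of paired readings within 5 mm Hg) can be computed as in the sketch below, which uses simulated paired central SBP values rather than the study's measurements.

    ```python
    import numpy as np

    rng = np.random.default_rng(6)

    # hypothetical paired central SBP estimates (mm Hg): cuff-based device vs radial tonometry
    n = 47
    tonometry = rng.normal(120, 12, n)
    cuff = tonometry + rng.normal(1.2, 2.2, n)     # small offset plus random disagreement (made-up values)

    diff = cuff - tonometry
    bias = diff.mean()
    loa = 1.96 * diff.std(ddof=1)                  # Bland-Altman 95% limits of agreement
    within_5 = np.mean(np.abs(diff) <= 5) * 100

    print(f"mean difference {bias:+.1f} mm Hg, limits of agreement {bias - loa:.1f} to {bias + loa:.1f} mm Hg")
    print(f"{within_5:.0f}% of paired readings within 5 mm Hg")
    ```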

  8. Relative validity of an FFQ to estimate daily food and nutrient intakes for Chilean adults.

    Science.gov (United States)

    Dehghan, Mahshid; Martinez, Solange; Zhang, Xiaohe; Seron, Pamela; Lanas, Fernando; Islam, Shofiqul; Merchant, Anwar T

    2013-10-01

    FFQ are commonly used to rank individuals by their food and nutrient intakes in large epidemiological studies. The purpose of the present study was to develop and validate an FFQ to rank individuals participating in an ongoing Prospective Urban and Rural Epidemiological (PURE) study in Chile. An FFQ and four 24 h dietary recalls were completed over 1 year. Pearson correlation coefficients, energy-adjusted and de-attenuated correlations and weighted kappa were computed between the dietary recalls and the FFQ. The level of agreement between the two dietary assessment methods was evaluated by Bland-Altman analysis. Temuco, Chile. Overall, 166 women and men enrolled in the present study. One hundred men and women participated in FFQ development and sixty-six individuals participated in FFQ validation. The FFQ consisted of 109 food items. For nutrients, the crude correlation coefficients between the dietary recalls and FFQ varied from 0.14 (protein) to 0.44 (fat). Energy adjustment and de-attenuation improved correlation coefficients and almost all correlation coefficients exceeded 0.40. Similar correlation coefficients were observed for food groups; the highest de-attenuated energy adjusted correlation coefficient was found for margarine and butter (0.75) and the lowest for potatoes (0.12). The FFQ showed moderate to high agreement for most nutrients and food groups, and can be used to rank individuals based on energy, nutrient and food intakes. The validation study was conducted in a unique setting and indicated that the tool is valid for use by adults in Chile.

  9. Development and Statistical Validation of Spectrophotometric Methods for the Estimation of Nabumetone in Tablet Dosage Form

    Directory of Open Access Journals (Sweden)

    A. R. Rote

    2010-01-01

    Full Text Available Three new simple, economical spectrophotometric methods were developed and validated for the estimation of nabumetone in bulk and tablet dosage form. The first method determines nabumetone at its absorption maximum of 330 nm, the second uses the area under the curve in the wavelength range of 326-334 nm, and the third uses first-order derivative spectra with a scaling factor of 4. Beer's law was obeyed in the concentration range of 10-30 μg/mL for all three methods. The correlation coefficients were found to be 0.9997, 0.9998 and 0.9998 for absorption maximum, area under curve and first-order derivative spectra, respectively. Results of analysis were validated statistically and by performing recovery studies. The mean percent recoveries were found satisfactory for all three methods. The developed methods were also compared statistically using one-way ANOVA. The proposed methods have been successfully applied for the estimation of nabumetone in bulk and pharmaceutical tablet dosage form.

  10. Validation of walk score for estimating neighborhood walkability: an analysis of four US metropolitan areas.

    Science.gov (United States)

    Duncan, Dustin T; Aldstadt, Jared; Whalen, John; Melly, Steven J; Gortmaker, Steven L

    2011-11-01

    Neighborhood walkability can influence physical activity. We evaluated the validity of Walk Score(®) for assessing neighborhood walkability based on GIS (objective) indicators of neighborhood walkability with addresses from four US metropolitan areas with several street network buffer distances (i.e., 400-, 800-, and 1,600-meter buffers). Address data come from the YMCA-Harvard After School Food and Fitness Project, an obesity prevention intervention involving children aged 5-11 years and their families participating in YMCA-administered, after-school programs located in four geographically diverse metropolitan areas in the US (n = 733). GIS data were used to measure multiple objective indicators of neighborhood walkability. Walk Scores were also obtained for the participants' residential addresses. Spearman correlations between Walk Scores and the GIS neighborhood walkability indicators were calculated, as well as Spearman correlations accounting for spatial autocorrelation. There were many significant moderate correlations between Walk Scores and the GIS neighborhood walkability indicators, such as density of retail destinations and intersection density. Correlations generally became stronger with a larger spatial scale, and there were some geographic differences. Walk Score(®) is free and publicly available for public health researchers and practitioners. Results from our study suggest that Walk Score(®) is a valid measure for estimating certain aspects of neighborhood walkability, particularly at the 1,600-meter buffer. As such, our study confirms and extends the generalizability of previous findings demonstrating that Walk Score is a valid measure for estimating neighborhood walkability in multiple geographic locations and at multiple spatial scales.

  11. Validation of a physical anthropology methodology using mandibles for gender estimation in a Brazilian population

    Science.gov (United States)

    CARVALHO, Suzana Papile Maciel; BRITO, Liz Magalhães; de PAIVA, Luiz Airton Saavedra; BICUDO, Lucilene Arilho Ribeiro; CROSATO, Edgard Michel; de OLIVEIRA, Rogério Nogueira

    2013-01-01

    Validation studies of physical anthropology methods in the different population groups are extremely important, especially in cases in which the population variations may cause problems in the identification of a native individual by the application of norms developed for different communities. Objective This study aimed to estimate the gender of skeletons by application of the method of Oliveira, et al. (1995), previously used in a population sample from Northeast Brazil. Material and Methods The accuracy of this method was assessed for a population from Southeast Brazil and validated by statistical tests. The method used two mandibular measurements, namely the bigonial distance and the mandibular ramus height. The sample was composed of 66 skulls and the method was applied by two examiners. The results were statistically analyzed by the paired t test, logistic discriminant analysis and logistic regression. Results The results demonstrated that the application of the method of Oliveira, et al. (1995) in this population achieved very different outcomes between genders, with 100% for females and only 11% for males, which may be explained by ethnic differences. However, statistical adjustment of measurement data for the population analyzed allowed accuracy of 76.47% for males and 78.13% for females, with the creation of a new discriminant formula. Conclusion It was concluded that methods involving physical anthropology present high rate of accuracy for human identification, easy application, low cost and simplicity; however, the methodologies must be validated for the different populations due to differences in ethnic patterns, which are directly related to the phenotypic aspects. In this specific case, the method of Oliveira, et al. (1995) presented good accuracy and may be used for gender estimation in Brazil in two geographic regions, namely Northeast and Southeast; however, for other regions of the country (North, Central West and South), previous methodological

  12. Validation of the Maslach Burnout Inventory-Human Services Survey for Estimating Burnout in Dental Students.

    Science.gov (United States)

    Montiel-Company, José María; Subirats-Roig, Cristian; Flores-Martí, Pau; Bellot-Arcís, Carlos; Almerich-Silla, José Manuel

    2016-11-01

    The aim of this study was to examine the validity and reliability of the Maslach Burnout Inventory-Human Services Survey (MBI-HSS) as a tool for assessing the prevalence and level of burnout in dental students in Spanish universities. The survey was adapted from English to Spanish. A sample of 533 dental students from 15 Spanish universities and a control group of 188 medical students self-administered the survey online, using the Google Drive service. The test-retest reliability or reproducibility showed an Intraclass Correlation Coefficient of 0.95. The internal consistency of the survey was 0.922. Testing the construct validity showed two components with an eigenvalue greater than 1.5, which explained 51.2% of the total variance. Factor I (36.6% of the variance) comprised the items that estimated emotional exhaustion and depersonalization. Factor II (14.6% of the variance) contained the items that estimated personal accomplishment. The cut-off point for the existence of burnout achieved a sensitivity of 92.2%, a specificity of 92.1%, and an area under the curve of 0.96. Comparison of the total dental students sample and the control group of medical students showed significantly higher burnout levels for the dental students (50.3% vs. 40.4%). In this study, the MBI-HSS was found to be viable, valid, and reliable for measuring burnout in dental students. Since the study also found that the dental students suffered from high levels of this syndrome, these results suggest the need for preventive burnout control programs.

  13. Validation of a physical anthropology methodology using mandibles for gender estimation in a Brazilian population

    Directory of Open Access Journals (Sweden)

    Suzana Papile Maciel Carvalho

    2013-07-01

    Full Text Available Validation studies of physical anthropology methods in the different population groups are extremely important, especially in cases in which the population variations may cause problems in the identification of a native individual by the application of norms developed for different communities. OBJECTIVE: This study aimed to estimate the gender of skeletons by application of the method of Oliveira, et al. (1995, previously used in a population sample from Northeast Brazil. MATERIAL AND METHODS: The accuracy of this method was assessed for a population from Southeast Brazil and validated by statistical tests. The method used two mandibular measurements, namely the bigonial distance and the mandibular ramus height. The sample was composed of 66 skulls and the method was applied by two examiners. The results were statistically analyzed by the paired t test, logistic discriminant analysis and logistic regression. RESULTS: The results demonstrated that the application of the method of Oliveira, et al. (1995 in this population achieved very different outcomes between genders, with 100% for females and only 11% for males, which may be explained by ethnic differences. However, statistical adjustment of measurement data for the population analyzed allowed accuracy of 76.47% for males and 78.13% for females, with the creation of a new discriminant formula. CONCLUSION: It was concluded that methods involving physical anthropology present high rate of accuracy for human identification, easy application, low cost and simplicity; however, the methodologies must be validated for the different populations due to differences in ethnic patterns, which are directly related to the phenotypic aspects. In this specific case, the method of Oliveira, et al. (1995 presented good accuracy and may be used for gender estimation in Brazil in two geographic regions, namely Northeast and Southeast; however, for other regions of the country (North, Central West and South

  14. The Air Force Mobile Forward Surgical Team (MFST): Using the Estimating Supplies Program to Validate Clinical Requirement

    National Research Council Canada - National Science Library

    Nix, Ralph E; Onofrio, Kathleen; Konoske, Paula J; Galarneau, Mike R; Hill, Martin

    2004-01-01

    .... The primary objective of the study was to provide the Air Force with the ability to validate clinical requirements of the MFST assemblage, with the goal of using NHRC's Estimating Supplies Program (ESP...

  15. Reproducibility and relative validity of a food frequency questionnaire to estimate intake of dietary phylloquinone and menaquinones.

    NARCIS (Netherlands)

    Zwakenberg, S R; Engelen, A I P; Dalmeijer, G W; Booth, S L; Vermeer, C; Drijvers, J J M M; Ocke, M C; Feskens, E J M; van der Schouw, Y T; Beulens, J W J

    2017-01-01

    This study aims to investigate the reproducibility and relative validity of the Dutch food frequency questionnaire (FFQ), to estimate intake of dietary phylloquinone and menaquinones compared with 24-h dietary recalls (24HDRs) and plasma markers of vitamin K status.

  16. Electrostatics of cysteine residues in proteins: parameterization and validation of a simple model.

    Science.gov (United States)

    Salsbury, Freddie R; Poole, Leslie B; Fetrow, Jacquelyn S

    2012-11-01

    One of the most popular and simple models for the calculation of pK(a) s from a protein structure is the semi-macroscopic electrostatic model MEAD. This model requires empirical parameters for each residue to calculate pK(a) s. Analysis of current, widely used empirical parameters for cysteine residues showed that they did not reproduce expected cysteine pK(a) s; thus, we set out to identify parameters consistent with the CHARMM27 force field that capture both the behavior of typical cysteines in proteins and the behavior of cysteines which have perturbed pK(a) s. The new parameters were validated in three ways: (1) calculation across a large set of typical cysteines in proteins (where the calculations are expected to reproduce expected ensemble behavior); (2) calculation across a set of perturbed cysteines in proteins (where the calculations are expected to reproduce the shifted ensemble behavior); and (3) comparison to experimentally determined pK(a) values (where the calculation should reproduce the pK(a) within experimental error). Both the general behavior of cysteines in proteins and the perturbed pK(a) in some proteins can be predicted reasonably well using the newly determined empirical parameters within the MEAD model for protein electrostatics. This study provides the first general analysis of the electrostatics of cysteines in proteins, with specific attention paid to capturing both the behavior of typical cysteines in a protein and the behavior of cysteines whose pK(a) should be shifted, and validation of force field parameters for cysteine residues. Copyright © 2012 Wiley Periodicals, Inc.

  17. On the validity of the incremental approach to estimate the impact of cities on air quality

    Science.gov (United States)

    Thunis, Philippe

    2018-01-01

    The question of how much cities are the sources of their own air pollution is not only theoretical, as it is critical to the design of effective strategies for urban air quality planning. In this work, we assess the validity of the commonly used incremental approach to estimate the likely impact of cities on their air pollution. With the incremental approach, the city impact (i.e. the concentration change generated by the city emissions) is estimated as the concentration difference between a rural background and an urban background location, also known as the urban increment. We show that the city impact is in reality made up of the urban increment and two additional components, and consequently two assumptions need to be fulfilled for the urban increment to be representative of the urban impact. The first assumption is that the rural background location is not influenced by emissions from within the city, whereas the second requires that background concentration levels, obtained with zero city emissions, are equal at both locations. Because the urban impact is not measurable, the SHERPA modelling approach, based on a full air quality modelling system, is used in this work to assess the validity of these assumptions for some European cities. Results indicate that for PM2.5, these two assumptions are far from being fulfilled for many large or medium city sizes. For such cities, urban increments largely underestimate city impacts. Although results are in better agreement for NO2, similar issues are met. In many situations the incremental approach is therefore not an adequate estimate of the urban impact on air pollution. This poses issues of interpretation when these increments are used to define strategic options for air quality planning. We finally illustrate the value of comparing modelled and measured increments to improve our confidence in the model results.
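The decomposition described above can be made concrete with four modelled concentrations: the urban and rural background values with all emissions on, and the same two values with the city's emissions switched off. The numbers below are hypothetical and only illustrate how the urban increment, the true city impact, and the two correction terms relate; they are not SHERPA outputs.

```python
# Illustrative decomposition, assuming four (hypothetical) modelled PM2.5 values:
# concentrations at the urban and rural background sites, with and without city emissions.
c_urban_full, c_rural_full = 22.0, 14.0    # ug/m3, all emissions on (hypothetical)
c_urban_zero, c_rural_zero = 12.0, 13.5    # ug/m3, city emissions set to zero (hypothetical)

urban_increment = c_urban_full - c_rural_full   # what monitoring data alone can give
city_impact = c_urban_full - c_urban_zero       # what planners actually need

# The gap between the two is made up of the two terms discussed above:
city_influence_on_rural = c_rural_full - c_rural_zero   # assumption 1 violated if nonzero
background_difference = c_rural_zero - c_urban_zero     # assumption 2 violated if nonzero

assert abs(city_impact - (urban_increment + city_influence_on_rural + background_difference)) < 1e-9
print(urban_increment, city_impact)   # here the increment (8.0) underestimates the impact (10.0)
```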

  18. Validation and Refinement of Prediction Models to Estimate Exercise Capacity in Cancer Survivors Using the Steep Ramp Test

    NARCIS (Netherlands)

    Stuiver, Martijn M.; Kampshoff, Caroline S.; Persoon, Saskia; Groen, Wim; van Mechelen, Willem; Chinapaw, Mai J. M.; Brug, Johannes; Nollet, Frans; Kersten, Marie-José; Schep, Goof; Buffart, Laurien M.

    2017-01-01

    Objective: To further test the validity and clinical usefulness of the steep ramp test (SRT) in estimating exercise tolerance in cancer survivors by external validation and extension of previously published prediction models for peak oxygen consumption (VO2peak) and peak power output (Wpeak).

  19. Estimating incidence from prevalence in generalised HIV epidemics: methods and validation.

    Directory of Open Access Journals (Sweden)

    Timothy B Hallett

    2008-04-01

    Full Text Available HIV surveillance of generalised epidemics in Africa primarily relies on prevalence at antenatal clinics, but estimates of incidence in the general population would be more useful. Repeated cross-sectional measures of HIV prevalence are now becoming available for general populations in many countries, and we aim to develop and validate methods that use these data to estimate HIV incidence. Two methods were developed that decompose observed changes in prevalence between two serosurveys into the contributions of new infections and mortality. Method 1 uses cohort mortality rates, and method 2 uses information on survival after infection. The performance of these two methods was assessed using simulated data from a mathematical model and actual data from three community-based cohort studies in Africa. Comparison with simulated data indicated that these methods can accurately estimate incidence rates and changes in incidence in a variety of epidemic conditions. Method 1 is simple to implement but relies on locally appropriate mortality data, whilst method 2 can make use of the same survival distribution in a wide range of scenarios. The estimates from both methods are within the 95% confidence intervals of almost all actual measurements of HIV incidence in adults and young people, and the patterns of incidence over age are correctly captured. It is possible to estimate incidence from cross-sectional prevalence data with sufficient accuracy to monitor the HIV epidemic. Although these methods will theoretically work in any context, we have been able to test them only in southern and eastern Africa, where HIV epidemics are mature and generalised. The choice of method will depend on the local availability of HIV mortality data.
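A stripped-down illustration of the underlying bookkeeping (not the paper's actual estimators): between two surveys, the change in the number of infected adults equals new infections minus deaths among the infected, so incidence can be backed out once mortality among the infected is supplied. All inputs below are hypothetical, and the sketch assumes a closed population of constant size.

```python
# Simplified illustration of decomposing a prevalence change into new infections and
# mortality. Not the published estimators; assumes a closed population of fixed size.
def incidence_between_surveys(n_pop, prev_t1, prev_t2, years, mort_rate_infected):
    infected_t1 = n_pop * prev_t1
    infected_t2 = n_pop * prev_t2
    deaths_infected = infected_t1 * mort_rate_infected * years      # deaths among the infected
    new_infections = (infected_t2 - infected_t1) + deaths_infected  # what must have replaced them
    person_years_susceptible = n_pop * (1 - (prev_t1 + prev_t2) / 2) * years
    return new_infections / person_years_susceptible                # incidence per person-year

# Example: prevalence rising from 10% to 12% over 3 years, 5%/yr mortality among the infected.
print(round(incidence_between_surveys(10_000, 0.10, 0.12, 3, 0.05), 4))
```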

  20. Full automation and validation of a flexible ELISA platform for host cell protein and protein A impurity detection in biopharmaceuticals.

    Science.gov (United States)

    Rey, Guillaume; Wendeler, Markus W

    2012-11-01

    Monitoring host cell protein (HCP) and protein A impurities is important to ensure successful development of recombinant antibody drugs. Here, we report the full automation and validation of an ELISA platform on a robotic system that allows the detection of Chinese hamster ovary (CHO) HCPs and residual protein A of in-process control samples and final drug substance. The ELISA setup is designed to serve three main goals: high sample throughput, high quality of results, and sample handling flexibility. The processing of analysis requests, determination of optimal sample dilutions, and calculation of impurity content is performed automatically by a spreadsheet. Up to 48 samples in three unspiked and spiked dilutions each are processed within 24 h. The dilution of each sample is individually prepared based on the drug concentration and the expected impurity content. Adaptable dilution protocols allow the analysis of sample dilutions ranging from 1:2 to 1:2×10⁷. The validity of results is assessed by automatic testing for dilutional linearity and spike recovery for each sample. This automated impurity ELISA facilitates multi-project process development, is easily adaptable to other impurity ELISA formats, and increases analytical capacity by combining flexible sample handling with high data quality. Copyright © 2012 Elsevier B.V. All rights reserved.

  1. [Estimates of the availability of energy and protein by province for the period 1985-2000].

    Science.gov (United States)

    Jus'at, I; Idrus, D

    1984-06-01

    An attempt is made to estimate the availability of adequate per capita food supplies and protein in the provinces of Indonesia over the period 1985 to 2000. The estimates are based on official population projections. Variations in food supply by province are discussed. (summary in ENG)

  2. A new generation of crystallographic validation tools for the protein data bank.

    Science.gov (United States)

    Read, Randy J; Adams, Paul D; Arendall, W Bryan; Brunger, Axel T; Emsley, Paul; Joosten, Robbie P; Kleywegt, Gerard J; Krissinel, Eugene B; Lütteke, Thomas; Otwinowski, Zbyszek; Perrakis, Anastassis; Richardson, Jane S; Sheffler, William H; Smith, Janet L; Tickle, Ian J; Vriend, Gert; Zwart, Peter H

    2011-10-12

    This report presents the conclusions of the X-ray Validation Task Force of the worldwide Protein Data Bank (PDB). The PDB has expanded massively since current criteria for validation of deposited structures were adopted, allowing a much more sophisticated understanding of all the components of macromolecular crystals. The size of the PDB creates new opportunities to validate structures by comparison with the existing database, and the now-mandatory deposition of structure factors creates new opportunities to validate the underlying diffraction data. These developments highlighted the need for a new assessment of validation criteria. The Task Force recommends that a small set of validation data be presented in an easily understood format, relative to both the full PDB and the applicable resolution class, with greater detail available to interested users. Most importantly, we recommend that referees and editors judging the quality of structural experiments have access to a concise summary of well-established quality indicators. Copyright © 2011 Elsevier Ltd. All rights reserved.

  3. Development of robust flexible OLED encapsulations using simulated estimations and experimental validations

    International Nuclear Information System (INIS)

    Lee, Chang-Chun; Shih, Yan-Shin; Wu, Chih-Sheng; Tsai, Chia-Hao; Yeh, Shu-Tang; Peng, Yi-Hao; Chen, Kuang-Jung

    2012-01-01

    This work analyses the overall stress/strain characteristic of flexible encapsulations with organic light-emitting diode (OLED) devices. A robust methodology composed of a mechanical model of multi-thin film under bending loads and related stress simulations based on nonlinear finite element analysis (FEA) is proposed, and validated to be more reliable compared with related experimental data. With various geometrical combinations of cover plate, stacked thin films and plastic substrate, the position of the neutral axis (NA) plate, which is regarded as a key design parameter to minimize stress impact for the concerned OLED devices, is acquired using the present methodology. The results point out that both the thickness and mechanical properties of the cover plate help in determining the NA location. In addition, several concave and convex radii are applied to examine the reliable mechanical tolerance and to provide an insight into the estimated reliability of foldable OLED encapsulations. (paper)
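As a rough analytical counterpart to the neutral-axis (NA) discussion above, the NA height of a layered stack under pure bending can be estimated with the classical transformed-section formula z_NA = Σ(E_i·t_i·z_i) / Σ(E_i·t_i), where z_i is the mid-plane height of layer i. The sketch below only illustrates that formula with hypothetical layer thicknesses and moduli; the study itself relies on nonlinear FEA rather than this closed-form estimate.

```python
# Back-of-envelope neutral-axis estimate for a layered stack under pure bending, using
# z_NA = sum(E_i * t_i * z_i) / sum(E_i * t_i). Layer values are hypothetical.
layers = [                      # (name, thickness in um, Young's modulus in GPa), bottom to top
    ("plastic substrate", 100.0, 4.0),
    ("OLED thin-film stack", 2.0, 80.0),
    ("cover plate", 100.0, 70.0),
]

z = 0.0
weighted_moment, stiffness = 0.0, 0.0
for name, t, e in layers:
    z_mid = z + t / 2.0                 # mid-plane height of this layer
    weighted_moment += e * t * z_mid
    stiffness += e * t
    z += t

z_na = weighted_moment / stiffness
print(f"neutral axis at {z_na:.1f} um above the substrate bottom (stack height {z:.1f} um)")
```

Placing the NA close to the brittle OLED layers (here by choosing a stiff, thick cover plate) is exactly the design lever the abstract describes.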

  4. An examination of healthy aging across a conceptual continuum: prevalence estimates, demographic patterns, and validity.

    Science.gov (United States)

    McLaughlin, Sara J; Jette, Alan M; Connell, Cathleen M

    2012-06-01

    Although the notion of healthy aging has gained wide acceptance in gerontology, measuring the phenomenon is challenging. Guided by a prominent conceptualization of healthy aging, we examined how shifting from a more to less stringent definition of healthy aging influences prevalence estimates, demographic patterns, and validity. Data are from adults aged 65 years and older who participated in the Health and Retirement Study. We examined four operational definitions of healthy aging. For each, we calculated prevalence estimates and examined the odds of healthy aging by age, education, gender, and race-ethnicity in 2006. We also examined the association between healthy aging and both self-rated health and death. Across definitions, the prevalence of healthy aging ranged from 3.3% to 35.5%. For all definitions, those classified as experiencing healthy aging had lower odds of fair or poor self-rated health and death over an 8-year period. The odds of being classified as "healthy" were lower among those of advanced age, those with less education, and women than for their corresponding counterparts across all definitions. Moving across the conceptual continuum--from a more to less rigid definition of healthy aging--markedly increases the measured prevalence of healthy aging. Importantly, results suggest that all examined definitions identified a subgroup of older adults who had substantially lower odds of reporting fair or poor health and dying over an 8-year period, providing evidence of the validity of our definitions. Conceptualizations that emphasize symptomatic disease and functional health may be particularly useful for public health purposes.

  5. Validation of equations and proposed reference values to estimate fat mass in Chilean university students.

    Science.gov (United States)

    Gómez Campos, Rossana; Pacheco Carrillo, Jaime; Almonacid Fierro, Alejandro; Urra Albornoz, Camilo; Cossío-Bolaños, Marco

    2018-03-01

    (i) To propose regression equations based on anthropometric measures to estimate fat mass (FM) using dual energy X-ray absorptiometry (DXA) as reference method, and (ii) to establish population reference standards for equation-derived FM. A cross-sectional study on 6,713 university students (3,354 males and 3,359 females) from Chile aged 17.0 to 27.0 years. Anthropometric measures (weight, height, waist circumference) were taken in all participants. Whole body DXA was performed in 683 subjects. A total of 478 subjects were selected to develop regression equations, and 205 for their cross-validation. Data from 6,030 participants were used to develop reference standards for FM. Equations were generated using stepwise multiple regression analysis. Percentiles were developed using the LMS method. Equations for men were: (i) FM = -35,997.486 + 232.285 × Weight + 432.216 × CC (R² = 0.73, SEE = 4.1); (ii) FM = -37,671.303 + 309.539 × Weight + 66,028.109 × ICE (R² = 0.76, SEE = 3.8), while equations for women were: (iii) FM = -13,216.917 + 461.302 × Weight + 91.898 × CC (R² = 0.70, SEE = 4.6), and (iv) FM = -14,144.220 + 464.061 × Weight + 16,189.297 × ICE (R² = 0.70, SEE = 4.6). Percentiles proposed included p10, p50, p85, and p95. The developed equations provide valid and accurate estimation of FM in both sexes. The values obtained using the equations may be analyzed from percentiles that allow for categorizing body fat levels by age and sex. Copyright © 2017 SEEN y SED. Published by Elsevier España, S.L.U. All rights reserved.
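Equation (i) for men can be applied directly as printed. Two assumptions are made in the sketch below: CC is taken to be the waist circumference in cm (the only circumference listed among the measurements), and the output is interpreted as fat mass in grams, which the magnitudes suggest but the abstract does not state explicitly.

```python
# Equation (i) for men, exactly as printed above.
# Assumptions: weight in kg, CC = waist circumference in cm, output in grams of fat mass.
def fat_mass_men_eq1(weight_kg: float, waist_cm: float) -> float:
    return -35997.486 + 232.285 * weight_kg + 432.216 * waist_cm

fm_g = fat_mass_men_eq1(weight_kg=70.0, waist_cm=80.0)
print(f"estimated fat mass: {fm_g / 1000:.1f} kg")   # ~14.8 kg for this example
```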

  6. Lactate minimum in a ramp protocol and its validity to estimate the maximal lactate steady state

    Directory of Open Access Journals (Sweden)

    Emerson Pardono

    2009-01-01

    Full Text Available http://dx.doi.org/10.5007/1980-0037.2009v11n2p174   The objectives of this study were to evaluate the validity of the lactate minimum (LM) using a ramp protocol for the determination of LM intensity (LMI), and to estimate the exercise intensity corresponding to maximal blood lactate steady state (MLSS). In addition, the possibility of determining aerobic and anaerobic fitness was investigated. Fourteen male cyclists of regional level performed one LM protocol on a cycle ergometer (Excalibur–Lode) consisting of an incremental test at an initial workload of 75 Watts, with increments of 1 Watt every 6 seconds. Hyperlactatemia was induced by a 30-second Wingate anaerobic test (WAT) (Monark–834E) at a workload corresponding to 8.57% of the volunteer's body weight. Peak power (11.5±2 Watts/kg), mean power output (9.8±1.7 Watts/kg), fatigue index (33.7±2.3%) and lactate 7 min after the WAT (10.5±2.3 mmol/L) were determined. The incremental test identified the LMI (207.8±17.7 Watts) and its respective blood lactate concentration (2.9±0.7 mmol/L), heart rate (153.6±10.6 bpm), and also maximal aerobic power (305.2±31.0 Watts). MLSS intensity was identified by 2 to 4 constant exercise tests (207.8±17.7 Watts), with no difference compared to LMI and good agreement between the two parameters. The LM test using a ramp protocol seems to be a valid method for the identification of LMI and estimation of MLSS intensity in regional cyclists. In addition, both anaerobic and aerobic fitness parameters were identified during a single session.

  7. Analytical errors in measuring radioactivity in cell proteins and their effect on estimates of protein turnover in L cells

    International Nuclear Information System (INIS)

    Silverman, J.A.; Mehta, J.; Brocher, S.; Amenta, J.S.

    1985-01-01

    Previous studies on protein turnover in ³H-labelled L-cell cultures have shown recovery of total ³H at the end of a three-day experiment to be always significantly in excess of the ³H recovered at the beginning of the experiment. A number of possible sources for this error in measuring radioactivity in cell proteins has been reviewed. ³H-labelled proteins, when dissolved in NaOH and counted for radioactivity in a liquid-scintillation spectrometer, showed losses of 30-40% of the radioactivity; neither external nor internal standardization compensated for this loss. Hydrolysis of these proteins with either Pronase or concentrated HCl significantly increased the measured radioactivity. In addition, 5-10% of the cell protein is left on the plastic culture dish when cells are recovered in phosphate-buffered saline. Furthermore, this surface-adherent protein, after pulse labelling, contains proteins of high radioactivity that turn over rapidly and make a major contribution to the accumulating radioactivity in the medium. These combined errors can account for up to 60% of the total radioactivity in the cell culture. Similar analytical errors have been found in studies of other cell cultures. The effect of these analytical errors on estimates of protein turnover in cell cultures is discussed. (author)

  8. Validity of anthropometric procedures to estimate body density and body fat percent in military men

    Directory of Open Access Journals (Sweden)

    Ciro Romélio Rodriguez-Añez

    1999-12-01

    Full Text Available The objective of this study was to verify the validity of the Katch e McArdle’s equation (1973,which uses the circumferences of the arm, forearm and abdominal to estimate the body density and the procedure of Cohen (1986 which uses the circumferences of the neck and abdominal to estimate the body fat percent (%F in military men. Therefore data from 50 military men, with mean age of 20.26 ± 2.04 years serving in Santa Maria, RS, was collected. The circumferences were measured according with Katch e McArdle (1973 and Cohen (1986 procedures. The body density measured (Dm obtained under water weighting was used as criteria and its mean value was 1.0706 ± 0.0100 g/ml. The residual lung volume was estimated using the Goldman’s e Becklake’s equation (1959. The %F was obtained with the Siri’s equation (1961 and its mean value was 12.70 ± 4.71%. The validation criterion suggested by Lohman (1992 was followed. The analysis of the results indicated that the procedure developed by Cohen (1986 has concurrent validity to estimate %F in military men or in other samples with similar characteristics with standard error of estimate of 3.45%. . RESUMO Através deste estudo objetivou-se verificar a validade: da equação de Katch e McArdle (1973 que envolve os perímetros do braço, antebraço e abdômen, para estimar a densidade corporal; e, o procedimento de Cohen (1986 que envolve os perímetros do pescoço e abdômen, para estimar o % de gordura (%G; para militares. Para tanto, coletou-se os dados de 50 militares masculinos, com idade média de 20,26 ± 2,04 anos, lotados na cidade de Santa Maria, RS. Mensurou-se os perímetros conforme procedimentos de Katch e McArdle (1973 e Cohen (1986. Utilizou-se a densidade corporal mensurada (Dm através da pesagem hidrostática como critério de validação, cujo valor médio foi de 1,0706 ± 0,0100 g/ml. Estimou-se o volume residual pela equação de Goldman e Becklake (1959. O %G derivado da Dm estimou

  9. Development and Validation of Protein Microarray Technology for Simultaneous Inflammatory Mediator Detection in Human Sera

    Directory of Open Access Journals (Sweden)

    Senthooran Selvarajah

    2014-01-01

    Full Text Available Biomarkers, including cytokines, can help in the diagnosis, prognosis, and prediction of treatment response across a wide range of disease settings. Consequently, the recent emergence of protein microarray technology, which is able to quantify a range of inflammatory mediators in a large number of samples simultaneously, has become highly desirable. However, the cost of commercial systems remains somewhat prohibitive. Here we show the development, validation, and implementation of an in-house microarray platform which enables the simultaneous quantitative analysis of multiple protein biomarkers. The accuracy and precision of the in-house microarray system were investigated according to the Food and Drug Administration (FDA) guidelines for pharmacokinetic assay validation. The assay fell within these limits for all but the very low-abundant cytokines, such as interleukin-10 (IL-10). Additionally, there were no significant differences between cytokine detection using our microarray system and the "gold standard" ELISA format. Crucially, future biomarker detection need not be limited to the 16 cytokines shown here but could be expanded as required. In conclusion, we detail a bespoke protein microarray system, utilizing well-validated ELISA reagents, that allows accurate, precise, and reproducible multiplexed biomarker quantification, comparable with commercial ELISA, and allowing customization beyond that of similar commercial microarrays.

  10. Development and Fit-for-Purpose Validation of a Soluble Human Programmed Death-1 Protein Assay.

    Science.gov (United States)

    Ni, Yan G; Yuan, Xiling; Newitt, John A; Peterson, Jon E; Gleason, Carol R; Haulenbeek, Jonathan; Santockyte, Rasa; Lafont, Virginie; Marsilio, Frank; Neely, Robert J; DeSilva, Binodh; Piccoli, Steven P

    2015-07-01

    Programmed death-1 (PD-1) protein is a co-inhibitory receptor which negatively regulates immune cell activation and permits tumors to evade normal immune defense. Anti-PD-1 antibodies have been shown to restore immune cell activation and effector function-an exciting breakthrough in cancer immunotherapy. Recent reports have documented a soluble form of PD-1 (sPD-1) in the circulation of normal and disease state individuals. A clinical assay to quantify sPD-1 would contribute to the understanding of sPD-1-function and facilitate the development of anti-PD-1 drugs. Here, we report the development and validation of a sPD-1 protein assay. The assay validation followed the framework for full validation of a biotherapeutic pharmacokinetic assay. A purified recombinant human PD-1 protein was characterized extensively and was identified as the assay reference material which mimics the endogenous analyte in structure and function. The lower limit of quantitation (LLOQ) was determined to be 100 pg/mL, with a dynamic range spanning three logs to 10,000 pg/mL. The intra- and inter-assay imprecision were ≤15%, and the assay bias (percent deviation) was ≤10%. Potential matrix effects were investigated in sera from both normal healthy volunteers and selected cancer patients. Bulk-prepared frozen standards and pre-coated Streptavidin plates were used in the assay to ensure consistency in assay performance over time. This assay appears to specifically measure total sPD-1 protein since the human anti-PD-1 antibody, nivolumab, and the endogenous ligands of PD-1 protein, PDL-1 and PDL-2, do not interfere with the assay.

  11. Estimation of dynamic rotor loads for the rotor systems research aircraft: Methodology development and validation

    Science.gov (United States)

    Duval, R. W.; Bahrami, M.

    1985-01-01

    The Rotor Systems Research Aircraft uses load cells to isolate the rotor/transmission system from the fuselage. A mathematical model relating applied rotor loads and inertial loads of the rotor/transmission system to the load cell response is required to allow the load cells to be used to estimate rotor loads from flight data. Such a model is derived analytically by applying a force and moment balance to the isolated rotor/transmission system. The model is tested by comparing its estimated values of applied rotor loads with measured values obtained from a ground-based shake test. Discrepancies in the comparison are used to isolate sources of unmodeled external loads. Once the structure of the mathematical model has been validated by comparison with experimental data, the parameters must be identified. Since the parameters may vary with flight condition, it is desirable to identify the parameters directly from the flight data. A Maximum Likelihood identification algorithm is derived for this purpose and tested using a computer simulation of load cell data. The identification is found to converge within 10 samples. The rapid convergence facilitates tracking of time-varying parameters of the load cell model in flight.
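For a linear measurement model with Gaussian noise, maximum-likelihood parameter identification reduces to a least-squares fit, so a simplified stand-in for the identification step (not the report's actual algorithm, and with purely illustrative dimensions and values) looks like this:

```python
import numpy as np

# Simplified stand-in for the identification step: load-cell response modelled as a linear
# function of applied loads plus Gaussian noise, so the ML estimate is the least-squares fit.
rng = np.random.default_rng(0)
true_params = np.array([2.0, -0.5, 1.2])                        # unknown model coefficients
loads = rng.normal(size=(50, 3))                                # simulated applied-load history
response = loads @ true_params + 0.05 * rng.normal(size=50)     # simulated load-cell output

estimate, *_ = np.linalg.lstsq(loads, response, rcond=None)
print(np.round(estimate, 3))                                    # converges close to true_params
```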

  12. Secretin-stimulated ultrasound estimation of pancreatic secretion in cystic fibrosis validated by magnetic resonance imaging

    International Nuclear Information System (INIS)

    Engjom, Trond; Dimcevski, Georg; Tjora, Erling; Wathle, Gaute; Erchinger, Friedemann; Laerum, Birger N.; Gilja, Odd H.; Haldorsen, Ingfrid Salvesen

    2018-01-01

    Secretin-stimulated magnetic resonance imaging (s-MRI) is the best validated radiological modality assessing pancreatic secretion. The purpose of this study was to compare volume output measures from secretin-stimulated transabdominal ultrasonography (s-US) to s-MRI for the diagnosis of exocrine pancreatic failure in cystic fibrosis (CF). We performed transabdominal ultrasonography and MRI before and at timed intervals during 15 minutes after secretin stimulation in 21 CF patients and 13 healthy controls. To clearly identify the subjects with reduced exocrine pancreatic function, we classified CF patients as pancreas-sufficient or -insufficient by secretin-stimulated endoscopic short test and faecal elastase. Pancreas-insufficient CF patients had reduced pancreatic secretions compared to pancreas-sufficient subjects based on both imaging modalities (p < 0.001). Volume output estimates assessed by s-US correlated to that of s-MRI (r = 0.56-0.62; p < 0.001). Both s-US (AUC: 0.88) and s-MRI (AUC: 0.99) demonstrated good diagnostic accuracy for exocrine pancreatic failure. Pancreatic volume-output estimated by s-US corresponds well to exocrine pancreatic function in CF patients and yields comparable results to that of s-MRI. s-US provides a simple and feasible tool in the assessment of pancreatic secretion. (orig.)

  13. Regional GRACE-based estimates of water mass variations over Australia: validation and interpretation

    Science.gov (United States)

    Seoane, L.; Ramillien, G.; Frappart, F.; Leblanc, M.

    2013-04-01

    Time series of regional 2°-by-2° GRACE solutions have been computed from 2003 to 2011 with a 10 day resolution by using an energy integral method over Australia [112° E 156° E; 44° S 10° S]. This approach uses the dynamical orbit analysis of GRACE Level 1 measurements, and especially accurate along-track K-Band Range Rate (KBRR) residuals (1 μm s⁻¹ level of error), to estimate the total water mass over continental regions. The advantages of regional solutions are a significant reduction of GRACE aliasing errors (i.e. north-south stripes), providing a more accurate estimation of water mass balance for hydrological applications. In this paper, the validation of these regional solutions over Australia is presented, as well as their ability to describe water mass change as a response to climate forcings such as El Niño. Principal component analysis of GRACE-derived total water storage maps shows spatial and temporal patterns that are consistent with independent datasets (e.g. rainfall, climate index and in-situ observations). Regional TWS show higher spatial correlations with in-situ water table measurements over the Murray-Darling drainage basin (80-90%), and they offer a better localization of hydrological structures than classical GRACE global solutions (i.e. Level 2 GRGS products and 400 km ICA solutions as a linear combination of GFZ, CSR and JPL GRACE solutions).

  14. Estimation of skull table thickness with clinical CT and validation with microCT.

    Science.gov (United States)

    Lillie, Elizabeth M; Urban, Jillian E; Weaver, Ashley A; Powers, Alexander K; Stitzel, Joel D

    2015-01-01

    Brain injuries resulting from motor vehicle crashes (MVC) are extremely common, yet the details of the mechanism of injury remain to be well characterized. Skull deformation is believed to be a contributing factor to some types of traumatic brain injury (TBI). Understanding biomechanical contributors to skull deformation would provide further insight into the mechanism of head injury resulting from blunt trauma. In particular, skull thickness is thought to be a very important factor governing deformation of the skull and its propensity for fracture. Current computed tomography (CT) technology is limited in its ability to accurately measure cortical thickness using standard techniques. A method to evaluate cortical thickness using cortical density measured from CT data has been developed previously. This effort validates this technique for measurement of skull table thickness in clinical head CT scans using two postmortem human specimens. Bone samples were harvested from the skulls of two cadavers and scanned with microCT to evaluate the accuracy of the estimated cortical thickness measured from clinical CT. Clinical scans were collected at 0.488 and 0.625 mm in-plane resolution with 0.625 mm thickness. The overall cortical thickness error was determined to be 0.078 ± 0.58 mm for cortical samples thinner than 4 mm. It was determined that 91.3% of these differences fell within the scanner resolution. Color maps of clinical CT thickness estimations are comparable to color maps of microCT thickness measurements, indicating good quantitative agreement. These data confirm that the cortical density algorithm successfully estimates skull table thickness from clinical CT scans. The application of this technique to clinical CT scans enables evaluation of cortical thickness in population-based studies. © 2014 Anatomical Society.

  15. Epithelium percentage estimation facilitates epithelial quantitative protein measurement in tissue specimens.

    Science.gov (United States)

    Chen, Jing; Toghi Eshghi, Shadi; Bova, George Steven; Li, Qing Kay; Li, Xingde; Zhang, Hui

    2013-12-01

    The rapid advancement of high-throughput tools for quantitative measurement of proteins has demonstrated the potential for the identification of proteins associated with cancer. However, the quantitative results on cancer tissue specimens are usually confounded by tissue heterogeneity, e.g. regions with cancer usually have significantly higher epithelium content yet lower stromal content. It is therefore necessary to develop a tool to facilitate the interpretation of the results of protein measurements in tissue specimens. Epithelial cell adhesion molecule (EpCAM) and cathepsin L (CTSL) are two epithelial proteins whose expressions in normal and tumorous prostate tissues were confirmed by measuring staining intensity with immunohistochemical staining (IHC). The expressions of these proteins were measured by ELISA in protein extracts from OCT-embedded frozen prostate tissues. To eliminate the influence of tissue heterogeneity on epithelial protein quantification measured by ELISA, a color-based segmentation method was developed in-house for estimation of epithelium content using H&E histology slides from the same prostate tissues, and the estimated epithelium percentage was used to normalize the ELISA results. The epithelium contents of the same slides were also estimated by a pathologist and used to normalize the ELISA results. The computer-based results were compared with the pathologist's reading. We found that both EpCAM and CTSL levels, as measured by the ELISA assays themselves, were greatly affected by the epithelium content in the tissue specimens. Without adjusting for epithelium percentage, both EpCAM and CTSL levels appeared significantly higher in tumor tissues than normal tissues with a p value less than 0.001. However, after normalization by the epithelium percentage, ELISA measurements of both EpCAM and CTSL were in agreement with IHC staining results, showing a significant increase only in EpCAM with no difference in CTSL expression in cancer tissues. These results
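The normalization step described above amounts, in its simplest form, to dividing the ELISA readout by the epithelium fraction estimated from the matched H&E slide, so that tumour/normal comparisons are not driven by differences in epithelium content. The values below are hypothetical and only illustrate how a raw difference can disappear after adjustment.

```python
# Divide the ELISA readout (e.g. ng protein per mg extract) by the estimated epithelium
# fraction, yielding a per-epithelium level. Values are hypothetical.
def normalize_by_epithelium(elisa_conc: float, epithelium_percent: float) -> float:
    return elisa_conc / (epithelium_percent / 100.0)

tumour = normalize_by_epithelium(elisa_conc=12.0, epithelium_percent=60.0)
normal = normalize_by_epithelium(elisa_conc=5.0, epithelium_percent=25.0)
print(round(tumour, 1), round(normal, 1))   # 20.0 vs 20.0: the apparent difference vanishes
```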

  16. Validity of bioelectrical impedance analysis in estimation of fat-free mass in colorectal cancer patients.

    Science.gov (United States)

    Ræder, Hanna; Kværner, Ane Sørlie; Henriksen, Christine; Florholmen, Geir; Henriksen, Hege Berg; Bøhn, Siv Kjølsrud; Paur, Ingvild; Smeland, Sigbjørn; Blomhoff, Rune

    2018-02-01

    Bioelectrical impedance analysis (BIA) is an accessible and cheap method to measure fat-free mass (FFM). However, BIA estimates are subject to uncertainty in patient populations with altered body composition and hydration. The aim of the current study was to validate a whole-body and a segmental BIA device against dual-energy X-ray absorptiometry (DXA) in colorectal cancer (CRC) patients, and to investigate the ability of different empiric equations for BIA to predict DXA FFM (FFMDXA). Forty-three non-metastatic CRC patients (aged 50-80 years) were enrolled in this study. Whole-body and segmental BIA FFM estimates (FFMwhole-bodyBIA, FFMsegmentalBIA) were calculated using 14 empiric equations, including the equations from the manufacturers, before comparison to FFMDXA estimates. Strong linear relationships were observed between FFMBIA and FFMDXA estimates for all equations (R² = 0.94-0.98 for both devices). However, there were large discrepancies in FFM estimates depending on the equations used, with mean differences in the ranges -6.5-6.8 kg and -11.0-3.4 kg for whole-body and segmental BIA, respectively. For whole-body BIA, 77% of BIA-derived FFM estimates were significantly different from FFMDXA, whereas for segmental BIA, 85% were significantly different. For whole-body BIA, the Schols* equation gave the highest agreement with FFMDXA, with mean difference ±SD of -0.16 ± 1.94 kg (p = 0.582). The manufacturer's equation gave a small overestimation of FFM with 1.46 ± 2.16 kg (p FFMDXA (0.17 ± 1.83 kg (p = 0.546)). Using the manufacturer's equation, no difference in FFM estimates was observed (-0.34 ± 2.06 kg (p = 0.292)), however, a clear proportional bias was detected (r = 0.69, p FFM compared to DXA using the optimal equation. In a population of non-metastatic CRC patients, mostly consisting of Caucasian adults and with a wide range of body composition measures, both the whole-body BIA and segmental BIA device

  17. ValidatorDB: database of up-to-date validation results for ligands and non-standard residues from the Protein Data Bank.

    Science.gov (United States)

    Sehnal, David; Svobodová Vařeková, Radka; Pravda, Lukáš; Ionescu, Crina-Maria; Geidl, Stanislav; Horský, Vladimír; Jaiswal, Deepti; Wimmerová, Michaela; Koča, Jaroslav

    2015-01-01

    Following the discovery of serious errors in the structure of biomacromolecules, structure validation has become a key topic of research, especially for ligands and non-standard residues. ValidatorDB (freely available at http://ncbr.muni.cz/ValidatorDB) offers a new step in this direction, in the form of a database of validation results for all ligands and non-standard residues from the Protein Data Bank (all molecules with seven or more heavy atoms). Model molecules from the wwPDB Chemical Component Dictionary are used as reference during validation. ValidatorDB covers the main aspects of validation of annotation, and additionally introduces several useful validation analyses. The most significant is the classification of chirality errors, allowing the user to distinguish between serious issues and minor inconsistencies. Other such analyses are able to report, for example, completely erroneous ligands, alternate conformations or complete identity with the model molecules. All results are systematically classified into categories, and statistical evaluations are performed. In addition to detailed validation reports for each molecule, ValidatorDB provides summaries of the validation results for the entire PDB, for sets of molecules sharing the same annotation (three-letter code) or the same PDB entry, and for user-defined selections of annotations or PDB entries. © The Author(s) 2014. Published by Oxford University Press on behalf of Nucleic Acids Research.

  18. Development and Cross-Validation of Equation for Estimating Percent Body Fat of Korean Adults According to Body Mass Index

    Directory of Open Access Journals (Sweden)

    Hoyong Sung

    2017-06-01

    Full Text Available Background: Using BMI as an independent variable is the easiest way to estimate percent body fat. Thus far, few studies have investigated the development and cross-validation of an equation for estimating the percent body fat of Korean adults according to the BMI. The goals of this study were the development and cross-validation of an equation for estimating the percent fat of representative Korean adults using the BMI. Methods: Samples were obtained from the Korea National Health and Nutrition Examination Survey between 2008 and 2011. The samples from 2008-2009 and 2010-2011 were labeled as the validation group (n=10,624) and the cross-validation group (n=8,291), respectively. The percent fat was measured using dual-energy X-ray absorptiometry, and the body mass index, gender, and age were included as independent variables to estimate the measured percent fat. The coefficient of determination (R²), standard error of estimation (SEE), and total error (TE) were calculated to examine the accuracy of the developed equation. Results: The cross-validated R² was 0.731 for Model 1 and 0.735 for Model 2. The SEE was 3.978 for Model 1 and 3.951 for Model 2. The equations developed in this study are more accurate for estimating percent fat of the cross-validation group than those previously published by other researchers. Conclusion: The newly developed equations are comparatively accurate for the estimation of the percent fat of Korean adults.
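For reference, the accuracy statistics named above are commonly computed as follows; the authors' exact degrees-of-freedom conventions may differ (SEE is usually the regression standard error in the validation group, while TE is the root-mean-square difference against the criterion in the cross-validation group). Here y is the DXA-measured percent fat, y_hat the value predicted from BMI, gender, and age, and the numbers are hypothetical.

```python
import numpy as np

# Common definitions of R-squared and total error (TE) used in this kind of cross-validation.
def r_squared(y, y_hat):
    ss_res = np.sum((y - y_hat) ** 2)
    ss_tot = np.sum((y - np.mean(y)) ** 2)
    return 1.0 - ss_res / ss_tot

def total_error(y, y_hat):                     # RMS difference between prediction and criterion
    return np.sqrt(np.mean((y - y_hat) ** 2))

y     = np.array([22.1, 30.5, 18.9, 27.4, 35.0])   # hypothetical DXA percent fat
y_hat = np.array([23.0, 29.1, 20.2, 26.0, 33.5])   # hypothetical BMI-based predictions
print(round(r_squared(y, y_hat), 3), round(total_error(y, y_hat), 2))
```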

  19. Validation of computer code TRAFIC used for estimation of charcoal heatup in containment ventilation systems

    International Nuclear Information System (INIS)

    Yadav, D.H.; Datta, D.; Malhotra, P.K.; Ghadge, S.G.; Bajaj, S.S.

    2005-01-01

    Full text of publication follows: Standard Indian PHWRs are provided with a Primary Containment Filtration and Pump-Back System (PCFPB) incorporating charcoal filters in the ventilation circuit to remove radioactive iodine that may be released from the reactor core into the containment during LOCA+ECCS failure, which is a Design Basis Accident for containment of radioactive release. This system is provided with two identical air circulation loops, each having 2 full capacity fans (1 operating and 1 standby) for a bank of four combined charcoal and High Efficiency Particulate Air (HEPA) filters, in addition to other filters. While the filtration circuit is designed to operate under forced flow conditions, it is of interest to understand the performance of the charcoal filters in the event of failure of the fans after operating for some time, i.e., when the radio-iodine inventory is at its peak value. It is of interest to check whether the buoyancy-driven natural circulation occurring in the filtration circuit is sufficient to keep the temperature in the charcoal under safe limits. A computer code TRAFIC (Transient Analysis of Filters in Containment) was developed using a conservative one-dimensional model to analyze the system. Suitable parametric studies were carried out to understand the problem and to identify the safety of the existing system. The TRAFIC code has two important components. The first one estimates the heat generation in the charcoal filter based on the 'Source Term', while the other one performs thermal-hydraulic computations. In an attempt to validate the code, experimental studies have been carried out. For this purpose, an experimental set-up comprising a scaled-down model of the filtration circuit, with heating coils embedded in charcoal to simulate the heating effect due to radio-iodine, has been constructed. The present work of validation consists of utilizing the results obtained from experiments conducted for different heat loads, elevations and adsorbent

  20. Estimating mortality from external causes using data from retrospective surveys: A validation study in Niakhar (Senegal

    Directory of Open Access Journals (Sweden)

    Gilles Pison

    2018-03-01

    Full Text Available Background: In low- and middle-income countries (LMICs), data on causes of death are often inaccurate or incomplete. In this paper, we test whether adding a few questions about injuries and accidents to mortality questionnaires used in representative household surveys would yield accurate estimates of the extent of mortality due to external causes (accidents, homicides, or suicides). Methods: We conduct a validation study in Niakhar (Senegal), during which we compare reported survey data to high-quality prospective records of deaths collected by a health and demographic surveillance system (HDSS). Results: Survey respondents more frequently list the deaths of their adult siblings who die of external causes than the deaths of those who die from other causes. The specificity of survey data is high, but sensitivity is low. Among reported deaths, less than 60% of the deaths classified as due to external causes by the HDSS are also classified as such by survey respondents. Survey respondents better report deaths due to road-traffic accidents than deaths from suicides and homicides. Conclusions: Asking questions about deaths resulting from injuries and accidents during surveys might help measure mortality from external causes in LMICs, but the resulting data displays systematic bias in a rural population of Senegal. Future studies should (1) investigate whether similar biases also apply in other settings and (2) test new methods to further improve the accuracy of survey data on mortality from external causes. Contribution: This study helps strengthen the monitoring of sustainable development targets in LMICs by validating a simple approach for the measurement of mortality from external causes.
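The sensitivity and specificity quoted above come from cross-tabulating the survey classification against the HDSS reference; a minimal sketch of that calculation, with hypothetical counts chosen to match the reported pattern (high specificity, sensitivity below 60%):

```python
# Sensitivity and specificity of the survey classification against the HDSS reference,
# from a 2x2 table of reported sibling deaths. Counts below are hypothetical.
true_external, true_other = 40, 360            # HDSS cause-of-death classification
survey_external_given_external = 23            # survey also reports an external cause
survey_external_given_other = 11               # survey wrongly reports an external cause

sensitivity = survey_external_given_external / true_external              # below 60%
specificity = (true_other - survey_external_given_other) / true_other     # high
print(round(sensitivity, 2), round(specificity, 2))
```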

  1. Pattern of protein retention in growing boars of different breeds, and estimation of maximum protein retention

    DEFF Research Database (Denmark)

    Tauson, A H; Chwalibog, André; Jakobsen, K

    1998-01-01

    Protein and energy metabolism in boars of different breeds, 10 each of Hampshire, Duroc and Danish Landrace was measured in balance and respiration experiments by means of indirect calorimetry in an open-air circulation system. Measurements were performed in four periods (Period I-IV) covering th...

  2. Development and validation of risk prediction equations to estimate survival in patients with colorectal cancer: cohort study

    OpenAIRE

    Hippisley-Cox, Julia; Coupland, Carol

    2017-01-01

    Objective: To develop and externally validate risk prediction equations to estimate absolute and conditional survival in patients with colorectal cancer. Design: Cohort study. Setting: General practices in England providing data for the QResearch database linked to the national cancer registry. Participants: 44 145 patients aged 15-99 with colorectal cancer from 947 practices to derive the equations. The equations were validated in 15 214 patients with colorectal cancer ...

  3. Validation of a spectrophotometer-based method for estimating daily sperm production and deferent duct transit.

    Science.gov (United States)

    Froman, D P; Rhoads, D D

    2012-10-01

    The objectives of the present work were 3-fold. First, a new method for estimating daily sperm production was validated. This method, in turn, was used to evaluate testis output as well as deferent duct throughput. Next, this analytical approach was evaluated in 2 experiments. The first experiment compared left and right reproductive tracts within roosters. The second experiment compared reproductive tract throughput in roosters from low and high sperm mobility lines. Standard curves were constructed from which unknown concentrations of sperm cells and sperm nuclei could be predicted from observed absorbance. In each case, the independent variable was based upon hemacytometer counts, and absorbance was a linear function of concentration. Reproductive tracts were excised, semen recovered from each duct, and the extragonadal sperm reserve determined by multiplying volume by sperm cell concentration. Testicular sperm nuclei were procured by homogenization of a whole testis, overlaying a 20-mL volume of homogenate upon 15% (wt/vol) Accudenz (Accurate Chemical and Scientific Corporation, Westbury, NY), and then washing nuclei by centrifugation through the Accudenz layer. Daily sperm production was determined by dividing the predicted number of sperm nuclei within the homogenate by 4.5 d (i.e., the time sperm with elongated nuclei spend within the testis). Sperm transit through the deferent duct was estimated by dividing the extragonadal reserve by daily sperm production. Neither the efficiency of sperm production (sperm per gram of testicular parenchyma per day) nor deferent duct transit differed between left and right reproductive tracts (P > 0.05). Whereas efficiency of sperm production did not differ (P > 0.05) between low and high sperm mobility lines, deferent duct transit differed between lines (P < 0.001). On average, this process required 2.2 and 1.0 d for low and high lines, respectively. In summary, we developed and then tested a method for quantifying male
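The calculation chain described above lends itself to a short numerical sketch: a linear standard curve maps absorbance to sperm-nuclei concentration, daily sperm production (DSP) divides the testicular nuclei count by the 4.5-day residence time, and deferent-duct transit divides the extragonadal reserve by DSP. All numbers below are made up for illustration; only the 4.5-day constant comes from the abstract.

```python
import numpy as np

# Standard curve: nuclei concentration (from hemacytometer counts) as a linear function
# of absorbance. Readings and counts below are hypothetical.
known_conc = np.array([0.5, 1.0, 2.0, 4.0]) * 1e9      # nuclei/mL
absorbance = np.array([0.11, 0.22, 0.45, 0.88])        # spectrophotometer readings
slope, intercept = np.polyfit(absorbance, known_conc, 1)

homogenate_abs, homogenate_ml = 0.60, 25.0
testis_nuclei = (slope * homogenate_abs + intercept) * homogenate_ml

dsp = testis_nuclei / 4.5                              # daily sperm production (4.5-day residence)

extragonadal_reserve = 1.6e10                          # hypothetical volume x concentration
transit_days = extragonadal_reserve / dsp              # deferent-duct transit time
print(f"DSP = {dsp:.2e} sperm/day, deferent duct transit = {transit_days:.1f} d")
```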

  4. Validity of Two New Brief Instruments to Estimate Vegetable Intake in Adults

    Directory of Open Access Journals (Sweden)

    Janine Wright

    2015-08-01

    Full Text Available Cost-effective population-based monitoring tools are needed for nutritional surveillance and interventions. The aim was to evaluate the relative validity of two new brief instruments (three-item: VEG3, and five-item: VEG5) for estimating usual total vegetable intake in comparison to a 7-day dietary record (7DDR). Sixty-four Australian adult volunteers aged 30 to 69 years (30 males, mean age ± SD 56.3 ± 9.2 years, and 34 females, mean age ± SD 55.3 ± 10.0 years). Pearson correlations between the 7DDR and VEG3 and VEG5 were modest, at 0.50 and 0.56, respectively. VEG3 significantly (p < 0.001) underestimated mean vegetable intake compared to 7DDR measures (2.9 ± 1.3 vs. 3.6 ± 1.6 serves/day, respectively), whereas mean vegetable intake assessed by VEG5 did not differ from 7DDR measures (3.3 ± 1.5 vs. 3.6 ± 1.6 serves/day). VEG5 was also able to correctly identify 95%, 88% and 75% of those subjects not consuming five, four and three serves/day of vegetables according to their 7DDR classification. VEG5, but not VEG3, can estimate usual total vegetable intake of population groups and had superior performance to VEG3 in identifying those not meeting different levels of vegetable intake. VEG5, a brief instrument, shows measurement characteristics useful for population-based monitoring and intervention targeting.

  5. Body protein losses estimated by nitrogen balance and potassium-40 counting

    International Nuclear Information System (INIS)

    Belyea, R.L.; Babbitt, C.L.; Sedgwick, H.T.; Zinn, G.M.

    1986-01-01

    Body protein losses estimated from N balance were compared with those estimated by ⁴⁰K counting. Six nonlactating dairy cows were fed an adequate N diet for 7 wk, a low N diet for 9 wk, and a replete N diet for 3 wk. The low N diet contained high cell wall grass hay plus ground corn, starch, and molasses. Soybean meal was added to the low N diet to increase N in the adequate N and replete N diets. Intake was measured daily. Digestibilities, N balance, and body composition (estimated by ⁴⁰K counting) were determined during each dietary regimen. During low N treatment, hay dry matter intake declined 2 kg/d, and supplement increased about .5 kg/d. Dry matter digestibility was not altered by N treatment. Protein and acid detergent fiber digestibilities decreased from 40 and 36% during adequate N to 20 and 2%, respectively, during low N. Fecal and urinary N also declined when cows were fed the low N diet. By the end of repletion, total intake, fiber, and protein digestibilities as well as N partition were similar to or exceeded those during adequate N intake. Body protein (N) loss was estimated by N balance to be about 3 kg compared with 8 kg by ⁴⁰K counting. Body fat losses (32 kg) were large because of low energy digestibility and intake. Seven kilograms of body fat were regained during repletion, but there was no change in body protein

  6. Assessing the external validity of model-based estimates of the incidence of heart attack in England: a modelling study

    Directory of Open Access Journals (Sweden)

    Peter Scarborough

    2016-11-01

    Full Text Available Background: The DisMod II model is designed to estimate epidemiological parameters on diseases where measured data are incomplete and has been used to provide estimates of disease incidence for the Global Burden of Disease study. We assessed the external validity of the DisMod II model by comparing modelled estimates of the incidence of first acute myocardial infarction (AMI) in England in 2010 with estimates derived from a linked dataset of hospital records and death certificates. Methods: Inputs for DisMod II were prevalence rates of ever having had an AMI taken from a population health survey, total mortality rates and AMI mortality rates taken from death certificates. By definition, remission rates were zero. We estimated first AMI incidence in an external dataset from England in 2010 using a linked dataset including all hospital admissions and death certificates since 1998. 95% confidence intervals were derived around estimates from the external dataset and DisMod II estimates based on sampling variance and reported uncertainty in prevalence estimates, respectively. Results: Estimates of the incidence rate for the whole population were higher in the DisMod II results than the external dataset (+54% for men and +26% for women). Age-specific results showed that the DisMod II results over-estimated incidence for all but the oldest age groups. Confidence intervals for the DisMod II and external dataset estimates did not overlap for most age groups. Conclusion: By comparison with AMI incidence rates in England, DisMod II did not achieve external validity for age-specific incidence rates, but did provide global estimates of incidence that are of similar magnitude to measured estimates. The model should be used with caution when estimating age-specific incidence rates.

  7. Validation of a Robust Neural Real-Time Voltage Estimator for Active Distribution Grids on Field Data

    DEFF Research Database (Denmark)

    Pertl, Michael; Douglass, Philip James; Heussen, Kai

    2018-01-01

    The installation of measurements in distribution grids enables the development of data-driven methods for the power system. However, these methods have to be validated in order to understand the limitations and capabilities for their use. This paper presents a systematic validation of a neural network approach for voltage estimation in active distribution grids by means of measured data from two feeders of a real low-voltage distribution grid. The approach enables a real-time voltage estimation at locations in the distribution grid where otherwise only non-real-time measurements are available.

  8. Validation of the iPhone app using the force platform to estimate vertical jump height.

    Science.gov (United States)

    Carlos-Vivas, Jorge; Martin-Martinez, Juan P; Hernandez-Mocholi, Miguel A; Perez-Gomez, Jorge

    2018-03-01

    Vertical jump performance has been evaluated with several devices: force platforms, contact mats, Vertec, accelerometers, infrared cameras and high-velocity cameras; however, the force platform is considered the gold standard for measuring vertical jump height. The purpose of this study was to validate an iPhone app called My Jump, which measures vertical jump height, by comparing it with other methods that use the force platform to estimate vertical jump height, namely, vertical velocity at take-off and time in the air. A total of 40 sport sciences students (age 21.4±1.9 years) completed five countermovement jumps (CMJs) over a force platform. Thus, 200 CMJ heights were evaluated from the vertical velocity at take-off and the time in the air using the force platform, and from the time in the air with the My Jump mobile application. The heights obtained were compared using the intraclass correlation coefficient (ICC). Correlation between the app and the force platform using the time in the air was perfect (ICC = 1.000). My Jump is an appropriate method to evaluate vertical jump performance; however, vertical jump height is slightly overestimated compared with that of the force platform.

  9. Development and validation of RP-HPLC method for estimation of eplerenone in spiked human plasma

    Directory of Open Access Journals (Sweden)

    Paraag Gide

    2012-10-01

    Full Text Available A rapid and simple high performance liquid chromatography (HPLC) method with UV detection (241 nm) was developed and validated for estimation of eplerenone from spiked human plasma. The analyte and the internal standard (valdecoxib) were extracted with a mixture of dichloromethane and diethyl ether. The chromatographic separation was performed on a HiQSil C-18HS column (250 mm × 4.6 mm, 5 μm) with a mobile phase consisting of acetonitrile:water (50:50, v/v) at a flow rate of 1 mL/min. The calibration curve was linear in the range 100–3200 ng/mL and the heteroscedasticity was minimized by using weighted least squares regression with weighting factor 1/X. Keywords: Eplerenone, Liquid–liquid extraction, Weighted regression, HPLC–UV
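A minimal sketch of a 1/X-weighted calibration fit of the kind described above, which keeps low concentrations from being swamped by the top of the 100–3200 ng/mL range. The peak-area ratios are hypothetical; note that numpy.polyfit's w argument takes the square root of the desired residual weights.

```python
import numpy as np

# Weighted least-squares calibration with weighting factor 1/X: polyfit minimizes
# sum((w_i * residual_i)^2), so w = sqrt(1/X) gives 1/X-weighted squared residuals.
conc = np.array([100, 200, 400, 800, 1600, 3200], dtype=float)      # ng/mL
peak_ratio = np.array([0.052, 0.101, 0.208, 0.395, 0.810, 1.640])   # hypothetical analyte/IS ratios

slope, intercept = np.polyfit(conc, peak_ratio, 1, w=np.sqrt(1.0 / conc))
back_calc = (peak_ratio - intercept) / slope
print(np.round(back_calc, 1))   # back-calculated concentrations, used to check the fit
```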

  10. Validation of equations using anthropometric and bioelectrical impedance for estimating body composition of the elderly

    Directory of Open Access Journals (Sweden)

    Cassiano Ricardo Rech

    2006-08-01

    Full Text Available The increase of the elderly population has enhanced the need for studying aging-related issues. In this context, the analysis of morphological alterations occurring with age has been discussed thoroughly. Evidence indicates that there is little information on valid methods for estimating body composition of senior citizens in Brazil. Therefore, the objective of this study was to cross-validate equations using either anthropometric or bioelectrical impedance (BIA) data for estimation of body fat (%BF) and of fat-free mass (FFM) in a sample of older individuals from Florianópolis-SC, having dual energy x-ray absorptiometry (DEXA) as the criterion measurement. The group was composed of 180 subjects (60 men and 120 women) who participated in four community groups for the elderly and were systematically randomly selected by a telephone interview, with age ranging from 60 to 81 years. The variables stature, body mass, body circumferences, skinfold thickness, reactance and resistance were measured in the morning at the Sports Center of the Federal University of Santa Catarina. The DEXA evaluation was performed in the afternoon at the Diagnosis Center through Image in Florianópolis-SC. Twenty anthropometric and 8 BIA equations were analyzed for cross-validation. For those equations that estimate body density, the equation of Siri (1961) and the adapted equation by Deurenberg et al. (1989) were used for conversion into %BF. The analyses were performed with the statistical package SPSS, version 11.5, establishing the level of significance at 5%. The criteria of cross-validation suggested by Lohman (1992) and the graphic dispersion analyses in relation to the mean, as proposed by Bland and Altman (1986), were used. The group presented values for the body mass index (BMI) between 18.4 kg·m⁻² and 39.3 kg·m⁻². The mean %BF was 23.1% (sd=5.8) for men and 37.3% (sd=6.9) in women, varying from 6% to 51.4%. There were no differences among the estimates of the equations

  11. Genepleio software for effective estimation of gene pleiotropy from protein sequences.

    Science.gov (United States)

    Chen, Wenhai; Chen, Dandan; Zhao, Ming; Zou, Yangyun; Zeng, Yanwu; Gu, Xun

    2015-01-01

    Though pleiotropy, which refers to the phenomenon of a gene affecting multiple traits, has long played a central role in genetics, development, and evolution, estimation of the number of pleiotropy components remains a hard mission to accomplish. In this paper, we report a newly developed software package, Genepleio, to estimate the effective gene pleiotropy from phylogenetic analysis of protein sequences. Since this estimate can be interpreted as the minimum pleiotropy of a gene, it is used to play a role of reference for many empirical pleiotropy measures. This work would facilitate our understanding of how gene pleiotropy affects the pattern of genotype-phenotype map and the consequence of organismal evolution.

  12. Design, synthesis, and validation of a β-turn mimetic library targeting protein-protein and peptide-receptor interactions.

    Science.gov (United States)

    Whitby, Landon R; Ando, Yoshio; Setola, Vincent; Vogt, Peter K; Roth, Bryan L; Boger, Dale L

    2011-07-06

    The design and synthesis of a β-turn mimetic library as a key component of a small-molecule library targeting the major recognition motifs involved in protein-protein interactions is described. Analysis of a geometric characterization of 10,245 β-turns in the Protein Data Bank (PDB) suggested that trans-pyrrolidine-3,4-dicarboxamide could serve as an effective and synthetically accessible library template. This was confirmed by initially screening select compounds against a series of peptide-activated GPCRs that recognize a β-turn structure in their endogenous ligands. This validation study was highlighted by identification of both nonbasic and basic small molecules with high affinities (K(i) = 390 and 23 nM, respectively) for the κ-opioid receptor (KOR). Consistent with the screening capabilities of collaborators and following the design validation, the complete library was assembled as 210 mixtures of 20 compounds, providing a total of 4200 compounds designed to mimic all possible permutations of 3 of the 4 residues in a naturally occurring β-turn. Unique to the design and because of the C(2) symmetry of the template, a typical 20 × 20 × 20-mix (8000 compounds prepared as 400 mixtures of 20 compounds) needed to represent 20 variations in the side chains of three amino acid residues reduces to a 210 × 20-mix, thereby simplifying the library synthesis and subsequent screening. The library was prepared using a solution-phase synthetic protocol with liquid-liquid or liquid-solid extractions for purification and conducted on a scale that ensures its long-term availability for screening campaigns. Screening the library against the human opioid receptors (KOR, MOR, and DOR) identified not only the activity of library members expected to mimic the opioid receptor peptide ligands but also additional side-chain combinations that provided enhanced receptor binding selectivities (>100-fold) and affinities (as low as K(i) = 80 nM for KOR). A key insight to emerge from

  13. Measurement of the incorporation rates of four amino acids into proteins for estimating bacterial production.

    Science.gov (United States)

    Servais, P

    1995-03-01

    In aquatic ecosystems, [(3)H]thymidine incorporation into bacterial DNA and [(3)H]leucine incorporation into proteins are usually used to estimate bacterial production. The incorporation rates of four amino acids (leucine, tyrosine, lysine, alanine) into proteins of bacteria were measured in parallel on natural freshwater samples from the basin of the river Meuse (Belgium). Comparison of the incorporation into proteins and into the total macromolecular fraction showed that more than 90% of each of these amino acids was incorporated into proteins. From incorporation measurements at four subsaturated concentrations (range, 2-77 nM), the maximum incorporation rates were determined. Strong correlations (r > 0.91 for all the calculated correlations) were found between the maximum incorporation rates of the different tested amino acids over a range of two orders of magnitude of bacterial activity. Bacterial production estimates were calculated using theoretical and experimental conversion factors. The production estimates calculated from the incorporation rates of the four amino acids were in good agreement, especially when the experimental conversion factors were used (slope range, 0.91-1.11, and r > 0.91). This study suggests that the incorporation of various amino acids into proteins can be used to estimate bacterial production.
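
    A minimal sketch of how an incorporation rate is turned into a production estimate with an empirical conversion factor; both numbers below are placeholders, not values reported in the study.

        # Illustrative only: converting an amino acid incorporation rate into a
        # bacterial production estimate with an empirical conversion factor
        # (placeholder numbers, not values from the study).
        leucine_incorporation = 0.25   # nmol leucine L^-1 h^-1 (maximum incorporation rate)
        conversion_factor = 1.5e9      # assumed cells produced per nmol leucine incorporated
        cells_per_litre_per_hour = leucine_incorporation * conversion_factor
        print(f"bacterial production ≈ {cells_per_litre_per_hour:.2e} cells L^-1 h^-1")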

  14. Validating alternative methodologies to estimate the hydrological regime of temporary streams when flow data are unavailable

    Science.gov (United States)

    Llorens, Pilar; Gallart, Francesc; Latron, Jérôme; Cid, Núria; Rieradevall, Maria; Prat, Narcís

    2016-04-01

    Aquatic life in temporary streams is strongly conditioned by the temporal variability of the hydrological conditions that control the occurrence and connectivity of diverse mesohabitats. In this context, the software TREHS (Temporary Rivers' Ecological and Hydrological Status) has been developed, in the framework of the LIFE Trivers project, to help managers adequately implement the Water Framework Directive in this type of water body. TREHS, using the methodology described in Gallart et al (2012), defines six temporal 'aquatic states', based on the hydrological conditions representing different mesohabitats, for a given reach at a particular moment. Nevertheless, hydrological data for assessing the regime of temporary streams are often non-existent or scarce. The scarcity of flow data frequently makes it impossible to characterize the hydrological regimes of temporary streams and, as a consequence, to select the correct periods and methods to determine their ecological status. Because of its qualitative nature, the TREHS approach allows the use of alternative methodologies to assess the regime of temporary streams in the absence of observed flow data. However, to adapt TREHS to such qualitative data, both the temporal scheme (from monthly to seasonal) and the number of aquatic states (from 6 to 3) have been modified. Two alternative, complementary methodologies were tested within the TREHS framework to assess the regime of temporary streams: interviews and aerial photographs. All the gauging stations (13) belonging to the Catalan Internal Catchments (NE Spain) with recurrent zero-flow periods were selected to validate both methodologies. On one hand, unstructured interviews were carried out with inhabitants of villages and small towns near the gauging stations. Flow permanence metrics for input into TREHS were drawn from the notes taken during the interviews. On the other hand, the historical series of available aerial photographs (typically 10

  15. Prevalence Estimation and Validation of New Instruments in Psychiatric Research: An Application of Latent Class Analysis and Sensitivity Analysis

    Science.gov (United States)

    Pence, Brian Wells; Miller, William C.; Gaynes, Bradley N.

    2009-01-01

    Prevalence and validation studies rely on imperfect reference standard (RS) diagnostic instruments that can bias prevalence and test characteristic estimates. The authors illustrate 2 methods to account for RS misclassification. Latent class analysis (LCA) combines information from multiple imperfect measures of an unmeasurable latent condition to…
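
    The record's point, that an imperfect reference standard biases prevalence estimates, can be illustrated with the simpler classical Rogan-Gladen correction shown below; this is not the latent class analysis used by the authors, and the sensitivity, specificity and apparent prevalence are assumed values.

        def rogan_gladen(apparent_prevalence, sensitivity, specificity):
            """Classical correction of apparent prevalence for a test with known
            sensitivity and specificity (Rogan & Gladen, 1978)."""
            return (apparent_prevalence + specificity - 1.0) / (sensitivity + specificity - 1.0)

        # Illustrative numbers only.
        p_apparent = 0.18
        print(f"adjusted prevalence ≈ {rogan_gladen(p_apparent, sensitivity=0.85, specificity=0.95):.3f}")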

  16. The validity and reproducibility of food-frequency questionnaire–based total antioxidant capacity estimates in Swedish women

    Science.gov (United States)

    Total antioxidant capacity (TAC) provides an assessment of antioxidant activity and synergistic interactions of redox molecules in foods and plasma. We investigated the validity and reproducibility of food frequency questionnaire (FFQ)–based TAC estimates assessed by oxygen radical absorbance capaci...

  17. A comparative study and validation of state estimation algorithms for Li-ion batteries in battery management systems

    International Nuclear Information System (INIS)

    Klee Barillas, Joaquín; Li, Jiahao; Günther, Clemens; Danzer, Michael A.

    2015-01-01

    Highlights: • Description of state observers for estimating the battery’s SOC. • Implementation of four estimation algorithms in a BMS. • Reliability and performance study of BMS regarding the estimation algorithms. • Analysis of the robustness and code properties of the estimation approaches. • Guide to evaluate estimation algorithms to improve the BMS performance. - Abstract: To increase lifetime, safety, and energy usage, battery management systems (BMS) for Li-ion batteries have to be capable of estimating the state of charge (SOC) of the battery cells with a very low estimation error. Accurate SOC estimation and real time reliability are critical issues for a BMS. In general, an increasing complexity of the estimation methods leads to higher accuracy. On the other hand, it also leads to a higher computational load and may exceed the BMS limitations or increase its costs. An approach to evaluate and verify estimation algorithms is presented as a requisite prior to the release of the battery system. The approach consists of an analysis concerning the SOC estimation accuracy, the code properties, complexity, the computation time, and the memory usage. Furthermore, a procedure is proposed for the evaluation and validation of estimation methods with respect to convergence behavior, parameter sensitivity, initialization error, and performance. In this work, the introduced analysis is demonstrated with four of the most published model-based estimation algorithms including the Luenberger observer, sliding-mode observer, Extended Kalman Filter and Sigma-point Kalman Filter. Experiments under dynamic current conditions are used to verify the real time functionality of the BMS. The results show that a simple estimation method like the sliding-mode observer can compete with the Kalman-based methods while requiring less computation time and memory. Depending on the battery system’s application the estimation algorithm has to be selected to fulfill the
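
    The observer structure being benchmarked can be illustrated with a toy one-state Kalman filter that corrects coulomb counting with a linearized open-circuit-voltage measurement; it is not one of the four algorithms studied in the paper, and all cell parameters and noise levels below are assumptions.

        import numpy as np

        # Toy one-state SOC observer: coulomb-counting prediction corrected by a
        # linearized OCV measurement, i.e. a scalar Kalman filter.
        dt, capacity_As = 1.0, 2.3 * 3600.0          # 1 s step, assumed 2.3 Ah cell
        ocv0, ocv_slope, r_int = 3.0, 1.0, 0.01      # OCV ≈ ocv0 + slope*SOC, ohmic drop r_int*I
        q_proc, r_meas = 1e-7, 1e-3                  # assumed process / measurement noise variances

        soc_est, p_cov = 0.5, 0.1                    # deliberately wrong initial estimate
        soc_true = 0.8
        rng = np.random.default_rng(0)
        for k in range(3600):                        # one hour of 1 A discharge
            current = 1.0
            soc_true -= current * dt / capacity_As
            v_meas = ocv0 + ocv_slope * soc_true - r_int * current + rng.normal(0, 0.02)

            # Predict (coulomb counting) then correct with the voltage measurement.
            soc_est -= current * dt / capacity_As
            p_cov += q_proc
            innovation = v_meas - (ocv0 + ocv_slope * soc_est - r_int * current)
            gain = p_cov * ocv_slope / (ocv_slope**2 * p_cov + r_meas)
            soc_est += gain * innovation
            p_cov *= (1.0 - gain * ocv_slope)

        print(f"true SOC = {soc_true:.3f}, estimated SOC = {soc_est:.3f}")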

  18. Model Based Optimal Control, Estimation, and Validation of Lithium-Ion Batteries

    Science.gov (United States)

    Perez, Hector Eduardo

    This dissertation focuses on developing and experimentally validating model based control techniques to enhance the operation of lithium ion batteries safely. An overview of the contributions that address the challenges which arise is provided below. Chapter 1: This chapter provides an introduction to battery fundamentals, models, and control and estimation techniques. Additionally, it provides motivation for the contributions of this dissertation. Chapter 2: This chapter examines reference governor (RG) methods for satisfying state constraints in Li-ion batteries. Mathematically, these constraints are formulated from a first principles electrochemical model. Consequently, the constraints explicitly model specific degradation mechanisms, such as lithium plating, lithium depletion, and overheating. This contrasts with the present paradigm of limiting measured voltage, current, and/or temperature. The critical challenges, however, are that (i) the electrochemical states evolve according to a system of nonlinear partial differential equations, and (ii) the states are not physically measurable. Assuming available state and parameter estimates, this chapter develops RGs for electrochemical battery models. The results demonstrate how electrochemical model state information can be utilized to ensure safe operation, while simultaneously enhancing energy capacity, power, and charge speeds in Li-ion batteries. Chapter 3: Complex multi-partial differential equation (PDE) electrochemical battery models are characterized by parameters that are often difficult to measure or identify. This parametric uncertainty influences the state estimates of electrochemical model-based observers for applications such as state-of-charge (SOC) estimation. This chapter develops two sensitivity-based interval observers that map bounded parameter uncertainty to state estimation intervals, within the context of electrochemical PDE models and SOC estimation. Theoretically, this chapter extends the

  19. In vitro estimation of rumen microbial protein synthesis of water buffaloes using 35S as tracer

    International Nuclear Information System (INIS)

    Hendratno, C.; Abidin, Z.; Bahaudin, R.; Sastrapradja, D.

    1977-01-01

    An experiment to study the effect of diet and of individual differences between animals on the in vitro estimation of rumen microbial protein synthesis in young female water buffaloes, using the technique of inorganic 35S incorporation, is described. The dietary treatments were four combinations of roughage supplemented with cassava meal. From the values of the rate constant for dilution of radioactivity in the sulphide pool and the percentage of inorganic 35S incorporated into microbial protein, it can be concluded that individual differences between animals have no influence on the efficiency of microbial protein synthesis. Feed composition, on the other hand, tends to have some influence on the efficiency of protein synthesis (P<0.15). (author)

  20. A method for fast energy estimation and visualization of protein-ligand interaction

    Science.gov (United States)

    Tomioka, Nobuo; Itai, Akiko; Iitaka, Yoichi

    1987-10-01

    A new computational and graphical method for facilitating ligand-protein docking studies is developed on a three-dimensional computer graphics display. Various physical and chemical properties inside the ligand binding pocket of a receptor protein, whose structure is elucidated by X-ray crystal analysis, are calculated on three-dimensional grid points and are stored in advance. By utilizing those tabulated data, it is possible to estimate the non-bonded and electrostatic interaction energy and the number of possible hydrogen bonds between protein and ligand molecules in real time during an interactive docking operation. The method also provides a comprehensive visualization of the local environment inside the binding pocket. With this method, it becomes easier to find a roughly stable geometry of ligand molecules, and one can therefore make a rapid survey of the binding capability of many drug candidates. The method will be useful for drug design as well as for the examination of protein-ligand interactions.
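
    The core idea, tabulating a potential on three-dimensional grid points in advance and interpolating it at ligand atom positions during interactive docking, can be sketched as follows; the grid, charges and coordinates are stand-in values, not the original implementation.

        import numpy as np

        def trilinear(grid, origin, spacing, point):
            """Trilinearly interpolate a scalar 3-D grid (e.g. a tabulated electrostatic
            potential) at an arbitrary coordinate inside the grid."""
            f = (np.asarray(point) - origin) / spacing      # fractional grid coordinates
            i = np.floor(f).astype(int)
            t = f - i
            value = 0.0
            for dx in (0, 1):
                for dy in (0, 1):
                    for dz in (0, 1):
                        w = ((1 - t[0]) if dx == 0 else t[0]) * \
                            ((1 - t[1]) if dy == 0 else t[1]) * \
                            ((1 - t[2]) if dz == 0 else t[2])
                        value += w * grid[i[0] + dx, i[1] + dy, i[2] + dz]
            return value

        # Illustrative use: interaction energy of ligand atoms with a precomputed grid.
        grid = np.random.default_rng(1).normal(size=(20, 20, 20))   # stand-in for a tabulated potential
        origin, spacing = np.zeros(3), 0.5                           # Å
        ligand_xyz = np.array([[3.2, 4.1, 5.0], [3.9, 4.4, 5.6]])
        charges = np.array([0.4, -0.4])
        energy = sum(q * trilinear(grid, origin, spacing, xyz) for q, xyz in zip(charges, ligand_xyz))
        print(f"interpolated electrostatic interaction energy ≈ {energy:.3f} (arbitrary units)")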

  1. The APEX Quantitative Proteomics Tool: Generating protein quantitation estimates from LC-MS/MS proteomics results

    Directory of Open Access Journals (Sweden)

    Saeed Alexander I

    2008-12-01

    Full Text Available Abstract Background Mass spectrometry (MS) based label-free protein quantitation has mainly focused on analysis of ion peak heights and peptide spectral counts. Most analyses of tandem mass spectrometry (MS/MS) data begin with an enzymatic digestion of a complex protein mixture to generate smaller peptides that can be separated and identified by an MS/MS instrument. Peptide spectral counting techniques attempt to quantify protein abundance by counting the number of detected tryptic peptides and their corresponding MS spectra. However, spectral counting is confounded by the fact that peptide physicochemical properties severely affect MS detection resulting in each peptide having a different detection probability. Lu et al. (2007) described a modified spectral counting technique, Absolute Protein Expression (APEX), which improves on basic spectral counting methods by including a correction factor for each protein (called the Oi value) that accounts for variable peptide detection by MS techniques. The technique uses machine learning classification to derive peptide detection probabilities that are used to predict the number of tryptic peptides expected to be detected for one molecule of a particular protein (Oi). This predicted spectral count is compared to the protein's observed MS total spectral count during APEX computation of protein abundances. Results The APEX Quantitative Proteomics Tool, introduced here, is a free open source Java application that supports the APEX protein quantitation technique. The APEX tool uses data from standard tandem mass spectrometry proteomics experiments and provides computational support for APEX protein abundance quantitation through a set of graphical user interfaces that partition the parameter controls for the various processing tasks. The tool also provides a Z-score analysis for identification of significant differential protein expression, a utility to assess APEX classifier performance via cross validation, and a
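
    A compact sketch of the APEX abundance formula as summarized above: the observed spectral count is normalized by the expected count per molecule (Oi) and scaled by an assumed total number of protein molecules per cell. The counts and constants below are hypothetical, not values from the tool or the study.

        # APEX_i = (n_i / O_i) / sum_k (n_k / O_k) * C  (following Lu et al., 2007)
        observed = {"protA": 120, "protB": 40, "protC": 8}     # n_i, observed MS/MS spectral counts
        expected = {"protA": 6.0, "protB": 2.5, "protC": 1.2}  # O_i, predicted detectable peptides per molecule
        C = 2.0e6                                              # assumed total molecules per cell

        norm = {p: observed[p] / expected[p] for p in observed}
        total = sum(norm.values())
        apex = {p: norm[p] / total * C for p in norm}
        for p, a in apex.items():
            print(f"{p}: ≈ {a:.2e} molecules per cell")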

  2. Validation of protein evaluation systems by means of milk production experiments with dairy cows.

    NARCIS (Netherlands)

    Straalen, van W.M.; Salaün, C.; Veen, W.A.G.; Rypkema, Y.S.; Hof, G.; Boxem, T.J.

    1994-01-01

    Protein evaluation systems (crude protein (CP), digestible crude protein (DCP), protein digested in the intestine (PDI), amino acids truly absorbed in the small intestine (AAT), absorbed protein (AP), metabolizable protein (MP), crude protein flow at the duodenum (AAS) and digestible protein in

  3. Validation and uncertainty estimation of fast neutron activation analysis method for Cu, Fe, Al, Si elements in sediment samples

    International Nuclear Information System (INIS)

    Sunardi; Samin Prihatin

    2010-01-01

    Validation and uncertainty estimation of the Fast Neutron Activation Analysis (FNAA) method for the elements Cu, Fe, Al and Si in sediment samples has been conducted. The aim of the research is to confirm whether the FNAA method still complies with the ISO/IEC 17025-2005 standard. The research covered verification, performance, validation of the FNAA method and uncertainty estimation. The SRM 8704 standard and sediment samples were weighed, irradiated with 14 MeV fast neutrons and then counted using gamma spectrometry. The validation results for Cu, Fe, Al and Si showed that the accuracy was in the range of 95.89-98.68 %, while the precision was in the range of 1.13-2.29 %. The estimated uncertainties for Cu, Fe, Al and Si were 2.67, 1.46, 1.71 and 1.20 %, respectively. From these data, it can be concluded that the FNAA method is still reliable and valid for the analysis of element contents in samples, because the accuracy is above 95 % and the precision is under 5 %, while the uncertainties are relatively small and within the 5 % maximum acceptable at the 95 % level of confidence. (author)

  4. Validity of parent-reported weight and height of preschool children measured at home or estimated without home measurement: a validation study

    Directory of Open Access Journals (Sweden)

    Cox Bianca

    2011-07-01

    Full Text Available Abstract Background Parental reports are often used in large-scale surveys to assess children's body mass index (BMI). Therefore, it is important to know to what extent these parental reports are valid and whether it makes a difference if the parents measured their children's weight and height at home or whether they simply estimated these values. The aim of this study is to compare the validity of parent-reported height, weight and BMI values of preschool children (3-7 y-old), when measured at home or estimated by parents without actual measurement. Methods The subjects were 297 Belgian preschool children (52.9% male). Participation rate was 73%. A questionnaire including questions about height and weight of the children was completed by the parents. Nurses measured height and weight following standardised procedures. International age- and sex-specific BMI cut-off values were employed to determine categories of weight status and obesity. Results On the group level, no important differences in accuracy of reported height, weight and BMI were identified between parent-measured or estimated values. However, for all 3 parameters, the correlations between parental reports and nurse measurements were higher in the group of children whose body dimensions were measured by the parents. Sensitivity for underweight and overweight/obesity was 73% and 47%, respectively, when parents measured their child's height and weight, and 55% and 47% when parents estimated values without measurement. Specificity for underweight and overweight/obesity was 82% and 97%, respectively, when parents measured the children, and 75% and 93% with parent estimations. Conclusions Diagnostic measures were more accurate when parents measured their child's weight and height at home than when those dimensions were based on parental judgements. When parent-reported data on an individual level are used, the accuracy could be improved by encouraging the parents to measure weight and height
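
    For reference, sensitivity and specificity figures such as those quoted above come from a simple 2x2 comparison of the parental classification against the nurse measurements; the counts below are illustrative only, not the study data.

        # Illustrative 2x2 counts: parental classification vs. nurse-measured classification.
        true_pos, false_neg = 33, 12      # overweight/obese by nurse: detected / missed by parents
        true_neg, false_pos = 230, 7      # not overweight by nurse: correctly ruled out / over-called

        sensitivity = true_pos / (true_pos + false_neg)
        specificity = true_neg / (true_neg + false_pos)
        print(f"sensitivity = {sensitivity:.2f}, specificity = {specificity:.2f}")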

  5. Differences in the validity of a visual estimation method for determining patients' meal intake between various meal types and supplied food items.

    Science.gov (United States)

    Kawasaki, Yui; Akamatsu, Rie; Tamaura, Yuki; Sakai, Masashi; Fujiwara, Keiko; Tsutsuura, Satomi

    2018-02-12

    The aim of this study was to examine differences in the validity of a visual estimation method for determining patients' meal intake between various meal types and supplied food items in hospitals, and to identify factors influencing the validity of the visual estimation method. Information on the dietary intake of patients in these hospitals was obtained by two procedures: visual assessment of the meal trays at the time of their clearing by the attending nursing staff, and weighing conducted by researchers. The following criteria were set for the target trays: A) standard or therapeutic meals, monitored by a doctor for energy and/or protein and/or sodium; B) regular, bite-sized, minced and pureed meal textures; and C) half-portion meals. Visual assessment results were tested for validity by comparison with the corresponding weighed results, and differences between the two methods indicated how far the estimated values departed from the weighed values of nutrient intake. A total of 255 (76.1%) of the 335 possible trays were included in the analysis, and the results indicated that the energy consumption estimates obtained by the visual and weighing procedures were not significantly different (412 ± 173 kcal, p = 0.15). However, the mean protein consumption (16.3 ± 6.7 g/tray) was significantly different between the two methods. Energy intake was significantly misestimated for trays with added supplied food items (66 ± 58 kcal/tray) compared to trays with no additions (32 ± 39 kcal/tray), and supplied food items were significantly associated with increased odds of a difference between the two methods (OR: 3.84; 95% confidence interval [CI]: 1.07-13.85). There were high correlations between the visual estimation method and the weighing method for measuring patients' dietary intake across various meal types and textures, except for meals with added supplied food items. Nursing staff need to be attentive to supplied food items. Copyright © 2018 Elsevier Ltd and European Society for Clinical Nutrition and Metabolism.

  6. Validation of the CHIRPS Satellite Rainfall Estimates over Eastern of Africa

    Science.gov (United States)

    Dinku, T.; Funk, C. C.; Tadesse, T.; Ceccato, P.

    2017-12-01

    Long and temporally consistent rainfall time series are essential in climate analyses and applications. Rainfall data from station observations are inadequate over many parts of the world due to sparse or non-existent observation networks, or limited reporting of gauge observations. As a result, satellite rainfall estimates have been used as an alternative or as a supplement to station observations. However, many satellite-based rainfall products with long time series suffer from coarse spatial and temporal resolutions and inhomogeneities caused by variations in satellite inputs. There are some satellite rainfall products with reasonably consistent time series, but they are often limited to specific geographic areas. The Climate Hazards Group Infrared Precipitation (CHIRP) and CHIRP combined with station observations (CHIRPS) are recently produced satellite-based rainfall products with relatively high spatial and temporal resolutions and quasi-global coverage. In this study, CHIRP and CHIRPS were evaluated over East Africa at daily, dekadal (10-day) and monthly time scales. The evaluation was done by comparing the satellite products with rain gauge data from about 1200 stations, an unprecedented number of validation stations for this region. The results provide a unique region-wide understanding of how satellite products perform over different climatic/geographic regions (lowlands, mountainous regions, and coastal areas). The CHIRP and CHIRPS products were also compared with two similar satellite rainfall products: the African Rainfall Climatology version 2 (ARC2) and the latest release of the Tropical Applications of Meteorology using Satellite data (TAMSAT). The results show that both CHIRP and CHIRPS products are significantly better than ARC2 with higher skill and low or no bias. These products were also found to be slightly better than the latest version of the TAMSAT product. A comparison was also done between the latest release of the TAMSAT product

  7. Milk protein concentration, estimated breeding value for fertility, and reproductive performance in lactating dairy cows.

    Science.gov (United States)

    Morton, John M; Auldist, Martin J; Douglas, Meaghan L; Macmillan, Keith L

    2017-07-01

    Milk protein concentration in dairy cows has been positively associated with a range of measures of reproductive performance, and genetic factors affecting both milk protein concentration and reproductive performance may contribute to the observed phenotypic associations. It was of interest to assess whether these beneficial phenotypic associations are accounted for or interact with the effects of estimated breeding values for fertility. The effects of a multitrait estimated breeding value for fertility [the Australian breeding value for daughter fertility (ABV fertility)] on reproductive performance were also of interest. Interactions of milk protein concentration and ABV fertility with the interval from calving date to the start of the herd's seasonally concentrated breeding period were also assessed. A retrospective single cohort study was conducted using data collected from 74 Australian seasonally and split calving dairy herds. Associations between milk protein concentration, ABV fertility, and reproductive performance in Holstein cows were assessed using random effects logistic regression. Between 52,438 and 61,939 lactations were used for analyses of 4 reproductive performance measures. Milk protein concentration was strongly and positively associated with reproductive performance in dairy cows, and this effect was not accounted for by the effects of ABV fertility. Increases in ABV fertility had important additional beneficial effects on the probability of pregnancy by wk 6 and 21 of the herd's breeding period. For cows calved before the start of the breeding period, the effects of increases in both milk protein concentration and ABV fertility were beneficial regardless of their interval from calving to the start of the breeding period. These findings demonstrate the potential for increasing reproductive performance through identifying the causes of the association between milk protein concentration and reproductive performance and then devising management

  8. Estimating patient dose from CT exams that use automatic exposure control: Development and validation of methods to accurately estimate tube current values.

    Science.gov (United States)

    McMillan, Kyle; Bostani, Maryam; Cagnon, Christopher H; Yu, Lifeng; Leng, Shuai; McCollough, Cynthia H; McNitt-Gray, Michael F

    2017-08-01

    The vast majority of body CT exams are performed with automatic exposure control (AEC), which adapts the mean tube current to the patient size and modulates the tube current either angularly, longitudinally or both. However, most radiation dose estimation tools are based on fixed tube current scans. Accurate estimates of patient dose from AEC scans require knowledge of the tube current values, which is usually unavailable. The purpose of this work was to develop and validate methods to accurately estimate the tube current values prescribed by one manufacturer's AEC system to enable accurate estimates of patient dose. Methods were developed that took into account available patient attenuation information, user selected image quality reference parameters and x-ray system limits to estimate tube current values for patient scans. Methods consistent with AAPM Report 220 were developed that used patient attenuation data that were: (a) supplied by the manufacturer in the CT localizer radiograph and (b) based on a simulated CT localizer radiograph derived from image data. For comparison, actual tube current values were extracted from the projection data of each patient. Validation of each approach was based on data collected from 40 pediatric and adult patients who received clinically indicated chest (n = 20) and abdomen/pelvis (n = 20) scans on a 64 slice multidetector row CT (Sensation 64, Siemens Healthcare, Forchheim, Germany). For each patient dataset, the following were collected with Institutional Review Board (IRB) approval: (a) projection data containing actual tube current values at each projection view, (b) CT localizer radiograph (topogram) and (c) reconstructed image data. Tube current values were estimated based on the actual topogram (actual-topo) as well as the simulated topogram based on image data (sim-topo). Each of these was compared to the actual tube current values from the patient scan. In addition, to assess the accuracy of each method in estimating
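
    One patient attenuation metric consistent with AAPM Report 220 that such methods rely on is the water-equivalent diameter of an axial image; a sketch of that calculation is given below with synthetic image data. This is not the authors' full tube-current estimation pipeline, only the attenuation metric step.

        import numpy as np

        def water_equivalent_diameter(ct_slice_hu, pixel_area_mm2, body_mask):
            """Water-equivalent diameter of one axial CT image (AAPM Report 220 style):
            Dw = 2 * sqrt((mean_HU/1000 + 1) * A_roi / pi)."""
            mean_hu = ct_slice_hu[body_mask].mean()
            area_mm2 = body_mask.sum() * pixel_area_mm2
            return 2.0 * np.sqrt((mean_hu / 1000.0 + 1.0) * area_mm2 / np.pi)

        # Synthetic stand-in: an elliptical water-equivalent "patient" on an air background.
        img = np.full((512, 512), -1000.0)
        yy, xx = np.ogrid[:512, :512]
        mask = ((xx - 256) / 180.0) ** 2 + ((yy - 256) / 120.0) ** 2 <= 1.0
        img[mask] = 0.0                               # 0 HU, i.e. water-equivalent tissue
        print(f"Dw ≈ {water_equivalent_diameter(img, pixel_area_mm2=0.7**2, body_mask=mask):.1f} mm")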

  9. Validity of the Remote Food Photography Method (RFPM) for estimating energy and nutrient intake in near real-time

    OpenAIRE

    Martin, C. K.; Correa, J. B.; Han, H.; Allen, H. R.; Rood, J.; Champagne, C. M.; Gunturk, B. K.; Bray, G. A.

    2011-01-01

    Two studies are reported; a pilot study to demonstrate feasibility followed by a larger validity study. Study 1’s objective was to test the effect of two ecological momentary assessment (EMA) approaches that varied in intensity on the validity/accuracy of estimating energy intake with the Remote Food Photography Method (RFPM) over six days in free-living conditions. When using the RFPM, Smartphones are used to capture images of food selection and plate waste and to send the images to a server...

  10. Validation of podocalyxin-like protein as a biomarker of poor prognosis in colorectal cancer

    International Nuclear Information System (INIS)

    Larsson, Anna; Fridberg, Marie; Gaber, Alexander; Nodin, Björn; Levéen, Per; Jönsson, Göran; Uhlén, Mathias; Birgisson, Helgi; Jirström, Karin

    2012-01-01

    Podocalyxin-like 1 (PODXL) is a cell-adhesion glycoprotein and stem cell marker that has been associated with an aggressive tumour phenotype and adverse outcome in several cancer types. We recently demonstrated that overexpression of PODXL is an independent factor of poor prognosis in colorectal cancer (CRC). The aim of this study was to validate these results in two additional independent patient cohorts and to examine the correlation between PODXL mRNA and protein levels in a subset of tumours. PODXL protein expression was analyzed by immunohistochemistry in tissue microarrays with tumour samples from a consecutive, retrospective cohort of 270 CRC patients (cohort 1) and a prospective cohort of 337 CRC patients (cohort 2). The expression of PODXL mRNA was measured by real-time quantitative PCR in a subgroup of 62 patients from cohort 2. Spearman's rho and chi-square tests were used for analysis of correlations between PODXL expression and clinicopathological parameters. Kaplan-Meier analysis and Cox proportional hazards modelling were applied to assess the relationship between PODXL expression and time to recurrence (TTR), disease free survival (DFS) and overall survival (OS). High PODXL protein expression was significantly associated with unfavourable clinicopathological characteristics in both cohorts. In cohort 1, high PODXL expression was associated with a significantly shorter 5-year OS in both univariable (HR = 2.28; 95% CI 1.43-3.63, p = 0.001) and multivariable analysis (HR = 2.07; 95% CI 1.25-3.43, p = 0.005). In cohort 2, high PODXL expression was associated with a shorter TTR (HR = 2.93; 95% CI 1.26-6.82, p = 0.013) and DFS (HR = 2.44; 95% CI 1.32-4.54, p = 0.005), remaining significant in multivariable analysis, HR = 2.50; 95% CI 1.05-5.96, p = 0.038 for TTR and HR = 2.11; 95% CI 1.13-3.94, p = 0.019 for DFS. No significant correlation could be found between mRNA levels and protein expression of PODXL and there was no association between mRNA levels

  11. Deorphanization and target validation of cross-tick species conserved novel Amblyomma americanum tick saliva protein.

    Science.gov (United States)

    Mulenga, Albert; Kim, Tae Kwon; Ibelli, Adriana Mércia Guaratini

    2013-05-01

    We previously identified a cross-tick species conserved tick feeding stimuli responsive Amblyomma americanum (Aam) AV422 gene. This study demonstrates that AamAV422 belongs to a novel group of arthropod proteins that is characterized by 14 cysteine amino acid residues: C(23)-X7/9-C(33)-X23/24-C(58)-X8-C(67)-X7-C(75)-X23-C(99)-X15-C(115)-X10-C(126)-X24/25/33-C(150)C(151)-X7-C(159)-X8-C(168)-X23/24-C(192)-X9/10-C(202) predicted to form seven disulfide bonds. We show that AamAV422 protein is a ubiquitously expressed protein that is injected into the host within the first 24h of the tick attaching onto the host as revealed by Western blotting analyses of recombinant (r)AamAV422, tick saliva and dissected tick organ protein extracts using antibodies to 24 and 48 h tick saliva proteins. Native AamAV422 is apparently involved with mediating tick anti-hemostasis and anti-complement functions in that rAamAV422 delayed plasma clotting time in a dose responsive manner by up to ≈ 160 s, prevented platelet aggregation by up to ≈ 16% and caused ≈ 24% reduction in production of terminal complement complexes. Target validation analysis revealed that rAamAV422 is a potential candidate for a cocktail or multivalent tick vaccine preparation in that RNA interference (RNAi)-mediated silencing of AamAV422 mRNA caused a statistically significant (≈ 44%) reduction in tick engorgement weights, which is a proxy for the amount of ingested blood. We speculate that AamAV422 is a potential target antigen for development of the highly desired universal tick vaccine in that, consistent with its high conservation among ticks, antibodies to 24h Ixodes scapularis tick saliva proteins specifically bound rAamAV422. We discuss the data in this study in the context of advancing the understanding of tick feeding physiology and the discovery of potential target antigens for tick vaccine development. Copyright © 2013 Australian Society for Parasitology Inc. Published by Elsevier Ltd. All rights reserved.

  12. An estimated 5% of new protein structures solved today represent a new Pfam family

    International Nuclear Information System (INIS)

    Mistry, Jaina; Kloppmann, Edda; Rost, Burkhard; Punta, Marco

    2013-01-01

    This study uses the Pfam database to show that the sequence redundancy of protein structures deposited in the PDB is increasing. The possible reasons behind this trend are discussed. High-resolution structural knowledge is key to understanding how proteins function at the molecular level. The number of entries in the Protein Data Bank (PDB), the repository of all publicly available protein structures, continues to increase, with more than 8000 structures released in 2012 alone. The authors of this article have studied how structural coverage of the protein-sequence space has changed over time by monitoring the number of Pfam families that acquired their first representative structure each year from 1976 to 2012. Twenty years ago, for every 100 new PDB entries released, an estimated 20 Pfam families acquired their first structure. By 2012, this decreased to only about five families per 100 structures. The reasons behind the slower pace at which previously uncharacterized families are being structurally covered were investigated. It was found that although more than 50% of current Pfam families are still without a structural representative, this set is enriched in families that are small, functionally uncharacterized or rich in problem features such as intrinsically disordered and transmembrane regions. While these are important constraints, the reasons why it may not yet be time to give up the pursuit of a targeted but more comprehensive structural coverage of the protein-sequence space are discussed

  13. Estimation of microbial protein supply in ruminants using urinary purine derivatives

    International Nuclear Information System (INIS)

    Makkar, H.P.S.; Chen, X.B.

    2004-01-01

    This publication presents various models, describing the quantitative excretion of purine derivatives in urine, developed for various breeds of cattle and for sheep, goat, camel and buffalo and their use for estimation of microbial protein supply in ruminant livestock. It also describes progress made over the last decade in analytical methods for determining purine derivatives, and a unique approach for estimating microbial protein supply using spot urine samples developed under the FAO/IAEA CRP. This approach of using spot urine samples dispenses with quantitative recovery of urine, enabling its use by field and extension workers for evaluation of the nutritional status of farm animals. Future areas of research are also highlighted in the book. This book is a good source of reference for research workers, students and extension workers alike

  14. Validation by theoretical approach to the experimental estimation of efficiency for gamma spectrometry of gas in 100 ml standard flask

    International Nuclear Information System (INIS)

    Mohan, V.; Chudalayandi, K.; Sundaram, M.; Krishnamony, S.

    1996-01-01

    Estimation of gaseous activity forms an important component of air monitoring at Madras Atomic Power Station (MAPS). The gases of importance are argon-41, an air activation product, and the fission product noble gas xenon-133. For estimating the concentration, the experimental method is used in which a grab sample is collected in a 100 ml volumetric standard flask. The activity of the gas is then computed by gamma spectrometry using a predetermined efficiency estimated experimentally. An attempt is made, using a theoretical approach, to validate the experimental method of efficiency estimation. Two analytical models, named the relative flux model and the absolute activity model, were developed independently of each other. Attention is focussed on the efficiencies for 41Ar and 133Xe. Results show that the present method of sampling and analysis using a 100 ml volumetric flask is adequate and acceptable. (author). 5 refs., 2 tabs

  15. Altered plasma apolipoprotein modifications in patients with pancreatic cancer: protein characterization and multi-institutional validation.

    Directory of Open Access Journals (Sweden)

    Kazufumi Honda

    Full Text Available BACKGROUND: Among the more common human malignancies, invasive ductal carcinoma of the pancreas has the worst prognosis. The poor outcome seems to be attributable to difficulty in early detection. METHODS: We compared the plasma protein profiles of 112 pancreatic cancer patients with those of 103 sex- and age-matched healthy controls (Cohort 1) using a newly developed matrix-assisted laser desorption/ionization (oMALDI) QqTOF (quadrupole time-of-flight) mass spectrometry (MS) system. RESULTS: We found that hemi-truncated apolipoprotein AII dimer (ApoAII-2; 17252 m/z), unglycosylated apolipoprotein CIII (ApoCIII-0; 8766 m/z), and their summed value were significantly decreased in the pancreatic cancer patients [P = 1.36×10(-21), P = 4.35×10(-14), and P = 1.83×10(-24) (Mann-Whitney U-test); area-under-curve values of 0.877, 0.798, and 0.903, respectively]. The significance was further validated in a total of 1099 plasma/serum samples, consisting of 2 retrospective cohorts [Cohort 2 (n = 103) and Cohort 3 (n = 163)] and a prospective cohort [Cohort 4 (n = 833)] collected from 8 medical institutions in Japan and Germany. CONCLUSIONS: We have constructed a robust quantitative MS profiling system and used it to validate alterations of modified apolipoproteins in multiple cohorts of patients with pancreatic cancer.

  16. Estimating rice grain protein contents with SPOT/HRV data acquired at maturing stage

    International Nuclear Information System (INIS)

    Asaka, D.; Shiga, H.

    2003-01-01

    Rice grain protein content, which plays an important role in the eating quality of rice, can be estimated from leaf color at the maturing stage. In order to investigate the distribution of paddy rice grain protein over a wide area, we employed SPOT/HRV data from August to September for 4 successive years, selecting the town of Naganuma, Hokkaido as the study area. The relationships between the individual spectral bands and ground survey data were examined. The results showed that grain protein content could be estimated using the normalized difference vegetation index (NDVI) with an absolute root mean square error of less than 0.4%, under the condition that the time lag between the satellite observation date and the maturing stage was within 20 days. In this period, we would have enough chances to obtain clear observation data every year under the weather conditions in the study area using the SPOT/HRV sensors, which have pointing capability. For the major rice varieties cultivated in Hokkaido, the same relationship between NDVI and protein content was observed. Thus, we conclude that the method proposed in this study is operational in rice production
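
    The estimation step can be sketched as an NDVI calculation from red and near-infrared reflectances followed by a linear calibration against measured protein content; all reflectances, protein values and fitted coefficients below are placeholders, not those of the study.

        import numpy as np

        # NDVI from red and near-infrared reflectances, then a simple linear calibration.
        red = np.array([0.08, 0.10, 0.07, 0.12, 0.09])
        nir = np.array([0.42, 0.38, 0.45, 0.33, 0.40])
        protein_obs = np.array([7.2, 7.8, 6.9, 8.4, 7.5])     # measured grain protein (%), illustrative

        ndvi = (nir - red) / (nir + red)
        slope, intercept = np.polyfit(ndvi, protein_obs, 1)
        protein_pred = slope * ndvi + intercept
        rmse = np.sqrt(np.mean((protein_pred - protein_obs) ** 2))
        print(f"protein% ≈ {slope:.2f} * NDVI + {intercept:.2f}, RMSE = {rmse:.2f}%")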

  17. DeepQA: improving the estimation of single protein model quality with deep belief networks.

    Science.gov (United States)

    Cao, Renzhi; Bhattacharya, Debswapna; Hou, Jie; Cheng, Jianlin

    2016-12-05

    Protein quality assessment (QA), useful for ranking and selecting protein models, has long been viewed as one of the major challenges for protein tertiary structure prediction. Especially, estimating the quality of a single protein model, which is important for selecting a few good models out of a large model pool consisting of mostly low-quality models, is still a largely unsolved problem. We introduce a novel single-model quality assessment method DeepQA based on deep belief network that utilizes a number of selected features describing the quality of a model from different perspectives, such as energy, physico-chemical characteristics, and structural information. The deep belief network is trained on several large datasets consisting of models from the Critical Assessment of Protein Structure Prediction (CASP) experiments, several publicly available datasets, and models generated by our in-house ab initio method. Our experiments demonstrate that the deep belief network has better performance compared to Support Vector Machines and Neural Networks on the protein model quality assessment problem, and our method DeepQA achieves the state-of-the-art performance on the CASP11 dataset. It also outperformed two well-established methods in selecting good outlier models from a large set of models of mostly low quality generated by ab initio modeling methods. DeepQA is a useful deep learning tool for protein single model quality assessment and protein structure prediction. The source code, executable, document and training/test datasets of DeepQA for Linux is freely available to non-commercial users at http://cactus.rnet.missouri.edu/DeepQA/ .

  18. Systematic Testing of Belief-Propagation Estimates for Absolute Free Energies in Atomistic Peptides and Proteins.

    Science.gov (United States)

    Donovan-Maiye, Rory M; Langmead, Christopher J; Zuckerman, Daniel M

    2018-01-09

    Motivated by the extremely high computing costs associated with estimates of free energies for biological systems using molecular simulations, we further the exploration of existing "belief propagation" (BP) algorithms for fixed-backbone peptide and protein systems. The precalculation of pairwise interactions among discretized libraries of side-chain conformations, along with representation of protein side chains as nodes in a graphical model, enables direct application of the BP approach, which requires only ∼1 s of single-processor run time after the precalculation stage. We use a "loopy BP" algorithm, which can be seen as an approximate generalization of the transfer-matrix approach to highly connected (i.e., loopy) graphs, and it has previously been applied to protein calculations. We examine the application of loopy BP to several peptides as well as the binding site of the T4 lysozyme L99A mutant. The present study reports on (i) the comparison of the approximate BP results with estimates from unbiased estimators based on the Amber99SB force field; (ii) investigation of the effects of varying library size on BP predictions; and (iii) a theoretical discussion of the discretization effects that can arise in BP calculations. The data suggest that, despite their approximate nature, BP free-energy estimates are highly accurate; indeed, they never fall outside confidence intervals from unbiased estimators for the systems where independent results could be obtained. Furthermore, we find that libraries of sufficiently fine discretization (which diminish library-size sensitivity) can be obtained with standard computing resources in most cases. Altogether, the extremely low computing times and accurate results suggest the BP approach warrants further study.
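
    As a reference point for what loopy BP approximates on loopy graphs, the sketch below computes the exact free energy F = -kT ln Z for a loop-free chain of "residues", each with a small discretized state library, via the transfer-matrix accumulation mentioned above; the pairwise energies are random placeholders rather than precalculated side-chain interactions.

        import numpy as np

        # On a chain (loop-free graph) belief propagation is exact and reduces to the
        # transfer-matrix method: Z is accumulated residue by residue, then F = -kT ln Z.
        rng = np.random.default_rng(2)
        kT = 0.593                       # kcal/mol at ~298 K
        n_residues, n_states = 6, 10
        pair_E = [rng.normal(0.0, 1.0, size=(n_states, n_states)) for _ in range(n_residues - 1)]

        message = np.ones(n_states)                      # uniform start over residue 1 states
        for E in pair_E:
            message = np.exp(-E / kT).T @ message        # sum over the previous residue's states
        Z = message.sum()
        print(f"F = -kT ln Z ≈ {-kT * np.log(Z):.2f} kcal/mol")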

  19. Development and validation of a CFD based methodology to estimate the pressure loss of flow through perforated plates

    International Nuclear Information System (INIS)

    Barros Filho, Jose A.; Navarro, Moyses A.; Santos, Andre A.C. dos; Jordao, E.

    2011-01-01

    In spite of the recent great development of Computational Fluid Dynamics (CFD), there are still open issues about how to assess its accuracy. This work presents the validation of a CFD methodology devised to estimate the pressure drop of water flow through perforated plates similar to the ones used in some reactor core components. This was accomplished by comparing the results of CFD simulations against experimental data for 5 perforated plates with different geometric characteristics. The proposed methodology correlates the experimental data within a range of ± 7.5%. The validation procedure recommended by the ASME Standard for Verification and Validation in Computational Fluid Dynamics and Heat Transfer (V&V 20) is also evaluated. The conclusion is that it is not adequate for this specific use. (author)

  20. A novel approach to sequence validating protein expression clones with automated decision making

    Directory of Open Access Journals (Sweden)

    Mohr Stephanie E

    2007-06-01

    Full Text Available Abstract Background Whereas the molecular assembly of protein expression clones is readily automated and routinely accomplished in high throughput, sequence verification of these clones is still largely performed manually, an arduous and time consuming process. The ultimate goal of validation is to determine if a given plasmid clone matches its reference sequence sufficiently to be "acceptable" for use in protein expression experiments. Given the accelerating increase in availability of tens of thousands of unverified clones, there is a strong demand for rapid, efficient and accurate software that automates clone validation. Results We have developed an Automated Clone Evaluation (ACE) system – the first comprehensive, multi-platform, web-based plasmid sequence verification software package. ACE automates the clone verification process by defining each clone sequence as a list of multidimensional discrepancy objects, each describing a difference between the clone and its expected sequence including the resulting polypeptide consequences. To evaluate clones automatically, this list can be compared against user acceptance criteria that specify the allowable number of discrepancies of each type. This strategy allows users to re-evaluate the same set of clones against different acceptance criteria as needed for use in other experiments. ACE manages the entire sequence validation process including contig management, identifying and annotating discrepancies, determining if discrepancies correspond to polymorphisms and clone finishing. Designed to manage thousands of clones simultaneously, ACE maintains a relational database to store information about clones at various completion stages, project processing parameters and acceptance criteria. In a direct comparison, the automated analysis by ACE took less time and was more accurate than a manual analysis of a 93 gene clone set. Conclusion ACE was designed to facilitate high throughput clone sequence
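
    The discrepancy-list and acceptance-criteria idea described above can be illustrated with the toy check below; it is not ACE's actual data model, API or acceptance criteria.

        from collections import Counter
        from dataclasses import dataclass

        @dataclass
        class Discrepancy:
            kind: str        # e.g. "silent", "missense", "frameshift"
            position: int

        def acceptable(discrepancies, limits):
            """Accept a clone if the count of each discrepancy type stays within the
            user-supplied limits (types absent from `limits` are not allowed at all)."""
            counts = Counter(d.kind for d in discrepancies)
            return all(counts[k] <= limits.get(k, 0) for k in counts)

        clone = [Discrepancy("silent", 123), Discrepancy("silent", 456), Discrepancy("missense", 789)]
        criteria = {"silent": 3, "missense": 0}
        print("accept clone:", acceptable(clone, criteria))   # False: missense not allowed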

  1. Validation of an efficient visual method for estimating leaf area index ...

    African Journals Online (AJOL)

    This study aimed to evaluate the accuracy and applicability of a visual method for estimating LAI in clonal Eucalyptus grandis × E. urophylla plantations and to compare it with hemispherical photography, ceptometer and LAI-2000® estimates. Destructive sampling for direct determination of the actual LAI was performed in ...

  2. Estimating and validating ground-based timber harvesting production through computer simulation

    Science.gov (United States)

    Jingxin Wang; Chris B. LeDoux

    2003-01-01

    Estimating ground-based timber harvesting systems production with an object oriented methodology was investigated. The estimation model developed generates stands of trees, simulates chain saw, drive-to-tree feller-buncher, swing-to-tree single-grip harvester felling, and grapple skidder and forwarder extraction activities, and analyzes costs and productivity. It also...

  3. Genepleio Software for Effective Estimation of Gene Pleiotropy from Protein Sequences

    Directory of Open Access Journals (Sweden)

    Wenhai Chen

    2015-01-01

    Full Text Available Though pleiotropy, which refers to the phenomenon of a gene affecting multiple traits, has long played a central role in genetics, development, and evolution, estimating the number of pleiotropy components remains difficult. In this paper, we report a newly developed software package, Genepleio, to estimate the effective gene pleiotropy from phylogenetic analysis of protein sequences. Since this estimate can be interpreted as the minimum pleiotropy of a gene, it serves as a reference for many empirical pleiotropy measures. This work would facilitate our understanding of how gene pleiotropy affects the pattern of the genotype-phenotype map and the consequences of organismal evolution.

  4. CANDU radiotoxicity inventories estimation: A calculated experiment cross-check for data verification and validation

    International Nuclear Information System (INIS)

    Pavelescu, Alexandru Octavian; Cepraga, Dan Gabriel

    2007-01-01

    This paper is related to the Clearance Potential Index, Ingestion and Inhalation Hazard Factors of nuclear spent fuel and radioactive wastes. This study required a complex activity that consisted of various phases such as: the acquisition, setting up, validation and application of procedures, codes and libraries. The paper reflects the validation phase of this study. Its objective was to compare the measured inventories of selected actinide and fission product radionuclides in a fuel element from a Pickering CANDU reactor with inventories predicted using a recent version of ORIGEN-ARP from SCALE 5 coupled with the time dependent cross sections library, CANDU 28.lib, produced by the sequence SAS2H of SCALE 4.4a. In this way, the procedures, codes and libraries for the characterization of radioactive material in terms of radioactive inventories, clearance, and biological hazard factors are being qualified and validated, in support of the safe management of radioactive wastes. (authors)

  5. Validity of the Remote Food Photography Method (RFPM) for estimating energy and nutrient intake in near real-time.

    Science.gov (United States)

    Martin, Corby K; Correa, John B; Han, Hongmei; Allen, H Raymond; Rood, Jennifer C; Champagne, Catherine M; Gunturk, Bahadir K; Bray, George A

    2012-04-01

    Two studies are reported; a pilot study to demonstrate feasibility followed by a larger validity study. Study 1's objective was to test the effect of two ecological momentary assessment (EMA) approaches that varied in intensity on the validity/accuracy of estimating energy intake (EI) with the Remote Food Photography Method (RFPM) over 6 days in free-living conditions. When using the RFPM, Smartphones are used to capture images of food selection and plate waste and to send the images to a server for food intake estimation. Consistent with EMA, prompts are sent to the Smartphones reminding participants to capture food images. During Study 1, EI estimated with the RFPM and the gold standard, doubly labeled water (DLW), were compared. Participants were assigned to receive Standard EMA Prompts (n = 24) or Customized Prompts (n = 16) (the latter received more reminders delivered at personalized meal times). The RFPM differed significantly from DLW at estimating EI when Standard (mean ± s.d. = -895 ± 770 kcal/day, P < 0.0001), but not Customized Prompts (-270 ± 748 kcal/day, P = 0.22) were used. Error (EI from the RFPM minus that from DLW) was significantly smaller with Customized vs. Standard Prompts. The objectives of Study 2 included testing the RFPM's ability to accurately estimate EI in free-living adults (N = 50) over 6 days, and energy and nutrient intake in laboratory-based meals. The RFPM did not differ significantly from DLW at estimating free-living EI (-152 ± 694 kcal/day, P = 0.16). During laboratory-based meals, estimating energy and macronutrient intake with the RFPM did not differ significantly compared to directly weighed intake.

  6. Validity of a Commercial Linear Encoder to Estimate Bench Press 1 RM from the Force-Velocity Relationship

    Science.gov (United States)

    Bosquet, Laurent; Porta-Benache, Jeremy; Blais, Jérôme

    2010-01-01

    The aim of this study was to assess the validity and accuracy of a commercial linear encoder (Musclelab, Ergotest, Norway) to estimate bench press 1 repetition maximum (1RM) from the force-velocity relationship. Twenty-seven physical education students and teachers (5 women and 22 men) with a heterogeneous history of strength training participated in this study. They performed a 1 RM test and a force-velocity test using a bench press lifting task in a random order. Mean 1 RM was 61.8 ± 15.3 kg (range: 34 to 100 kg), while 1 RM estimated by the Musclelab software from the force-velocity relationship was 56.4 ± 14.0 kg (range: 33 to 91 kg). Actual and estimated 1 RM were very highly correlated (r = 0.93, p<0.001) but largely different (Bias: 5.4 ± 5.7 kg, p < 0.001, ES = 1.37). The 95% limits of agreement were ±11.2 kg, which represented ±18% of actual 1 RM. It was concluded that 1 RM estimated from the force-velocity relationship was a good measure for monitoring training induced adaptations, but also that it was not accurate enough to prescribe training intensities. Additional studies are required to determine whether accuracy is affected by age, sex or initial level. Key points Some commercial devices allow users to estimate 1 RM from the force-velocity relationship. These estimations are valid. However, their accuracy is not high enough to be of practical help for training intensity prescription. Day-to-day reliability of force and velocity measured by the linear encoder has been shown to be very high, but the specific reliability of 1 RM estimated from the force-velocity relationship has to be determined before concluding on the usefulness of this approach in the monitoring of training induced adaptations. PMID:24149641
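
    A common way to obtain 1 RM from the force-velocity (load-velocity) relationship is to fit mean lifting velocity against load and extrapolate to a minimal-velocity threshold, as sketched below; the data, threshold and method are illustrative and not necessarily what the Musclelab software implements.

        import numpy as np

        # Illustrative load-velocity extrapolation of a bench-press 1 RM.
        load_kg = np.array([20, 30, 40, 50, 60])
        mean_velocity = np.array([1.10, 0.92, 0.74, 0.55, 0.38])   # m/s, placeholder data
        v_at_1rm = 0.17                                            # assumed minimal velocity threshold

        slope, intercept = np.polyfit(load_kg, mean_velocity, 1)
        one_rm = (v_at_1rm - intercept) / slope
        print(f"estimated 1 RM ≈ {one_rm:.1f} kg")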

  7. Validity of the Remote Food Photography Method (RFPM) for estimating energy and nutrient intake in near real-time

    Science.gov (United States)

    Martin, C. K.; Correa, J. B.; Han, H.; Allen, H. R.; Rood, J.; Champagne, C. M.; Gunturk, B. K.; Bray, G. A.

    2014-01-01

    Two studies are reported; a pilot study to demonstrate feasibility followed by a larger validity study. Study 1’s objective was to test the effect of two ecological momentary assessment (EMA) approaches that varied in intensity on the validity/accuracy of estimating energy intake with the Remote Food Photography Method (RFPM) over six days in free-living conditions. When using the RFPM, Smartphones are used to capture images of food selection and plate waste and to send the images to a server for food intake estimation. Consistent with EMA, prompts are sent to the Smartphones reminding participants to capture food images. During Study 1, energy intake estimated with the RFPM and the gold standard, doubly labeled water (DLW), were compared. Participants were assigned to receive Standard EMA Prompts (n=24) or Customized Prompts (n=16) (the latter received more reminders delivered at personalized meal times). The RFPM differed significantly from DLW at estimating energy intake when Standard (mean±SD = −895±770 kcal/day, p<.0001), but not Customized Prompts (−270±748 kcal/day, p=.22) were used. Error (energy intake from the RFPM minus that from DLW) was significantly smaller with Customized vs. Standard Prompts. The objectives of Study 2 included testing the RFPM’s ability to accurately estimate energy intake in free-living adults (N=50) over six days, and energy and nutrient intake in laboratory-based meals. The RFPM did not differ significantly from DLW at estimating free-living energy intake (−152±694 kcal/day, p=0.16). During laboratory-based meals, estimating energy and macronutrient intake with the RFPM did not differ significantly compared to directly weighed intake. PMID:22134199

  8. Validation of abundance estimates from mark–recapture and removal techniques for rainbow trout captured by electrofishing in small streams

    Science.gov (United States)

    Rosenberger, Amanda E.; Dunham, Jason B.

    2005-01-01

    Estimation of fish abundance in streams using the removal model or the Lincoln–Petersen mark–recapture model is a common practice in fisheries. These models produce misleading results if their assumptions are violated. We evaluated the assumptions of these two models via electrofishing of rainbow trout Oncorhynchus mykiss in central Idaho streams. For one-, two-, three-, and four-pass sampling effort in closed sites, we evaluated the influences of fish size and habitat characteristics on sampling efficiency and the accuracy of removal abundance estimates. We also examined the use of models to generate unbiased estimates of fish abundance through adjustment of total catch or biased removal estimates. Our results suggested that the assumptions of the mark–recapture model were satisfied and that abundance estimates based on this approach were unbiased. In contrast, the removal model assumptions were not met. Decreasing sampling efficiencies over removal passes resulted in underestimated population sizes and overestimates of sampling efficiency. This bias decreased, but was not eliminated, with increased sampling effort. Biased removal estimates based on different levels of effort were highly correlated with each other but were less correlated with unbiased mark–recapture estimates. Stream size decreased sampling efficiency, and stream size and instream wood increased the negative bias of removal estimates. We found that reliable estimates of population abundance could be obtained from models of sampling efficiency for different levels of effort. Validation of abundance estimates requires extra attention to routine sampling considerations but can help fisheries biologists avoid pitfalls associated with biased data and facilitate standardized comparisons among studies that employ different sampling methods.
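
    For readers unfamiliar with the two abundance models being compared, the closed-form estimators are simple enough to state directly. The sketch below gives the Chapman-corrected Lincoln–Petersen estimator and the standard two-pass removal (Seber–Le Cren) estimator; it is a generic textbook formulation with made-up pass data, not the estimation code used in the study.

```python
def lincoln_petersen_chapman(marked: int, captured: int, recaptured: int) -> float:
    """Chapman's bias-corrected Lincoln-Petersen abundance estimate.

    marked      -- fish marked and released on the first pass (M)
    captured    -- fish captured on the second pass (C)
    recaptured  -- marked fish among the second-pass catch (R)
    """
    return (marked + 1) * (captured + 1) / (recaptured + 1) - 1


def two_pass_removal(catch1: int, catch2: int) -> tuple[float, float]:
    """Two-pass removal estimate of abundance and capture probability.

    Assumes equal capture probability on both passes and catch1 > catch2.
    """
    if catch1 <= catch2:
        raise ValueError("Removal estimator requires a declining catch (catch1 > catch2).")
    n_hat = catch1 ** 2 / (catch1 - catch2)
    p_hat = 1 - catch2 / catch1
    return n_hat, p_hat


# Hypothetical pass data for one closed stream site.
print(lincoln_petersen_chapman(marked=45, captured=50, recaptured=20))  # ~ 111
print(two_pass_removal(catch1=60, catch2=25))                            # (~102.9, ~0.58)
```

    The removal estimator's sensitivity to declining capture efficiency across passes is exactly the bias discussed in the abstract: if the second-pass efficiency drops, catch2 stays artificially high and the estimated abundance is pulled downward.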

  9. Top-down approach in protein RDC data analysis: de novo estimation of the alignment tensor

    International Nuclear Information System (INIS)

    Chen Kang; Tjandra, Nico

    2007-01-01

    In solution NMR spectroscopy the residual dipolar coupling (RDC) is invaluable in improving both the precision and accuracy of NMR structures during their structural refinement. The RDC also provides a potential to determine protein structure de novo. These procedures are only effective when an accurate estimate of the alignment tensor has already been made. Here we present a top-down approach, starting from the secondary structure elements and finishing at the residue level, for RDC data analysis in order to obtain a better estimate of the alignment tensor. Using only the RDCs from N-H bonds of residues in α-helices and CA-CO bonds in β-strands, we are able to determine the offset and the approximate amplitude of the RDC modulation curve for each secondary structure element, which are subsequently used as targets for global minimization. The alignment order parameters and the orientation of the major principal axis of an individual helix or strand, with respect to the alignment frame, can be determined in each of the eight quadrants of a sphere. The subsequent minimization against RDCs of all residues within the helix or strand segment can be carried out with fixed alignment order parameters to improve the accuracy of the orientation. For the helical protein Bax, the three components Axx, Ayy and Azz of the alignment order can be determined with this method on average to within 2.3% deviation from the values calculated with the available atomic coordinates. Similarly, for the β-sheet protein Ubiquitin they agree on average to within 8.5%. The larger discrepancy in β-strand parameters comes from both the diversity of the β-sheet structure and the lower precision of CA-CO RDCs. This top-down approach is a robust method for alignment tensor estimation and also holds a promise for providing a protein topological fold using limited sets of RDCs.
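
    As background for the modulation-curve fitting described above, the residual dipolar coupling of an internuclear vector in the alignment frame is usually written in the following standard form; the notation is the common NMR convention, not taken verbatim from the record.

```latex
D(\theta,\varphi) \;=\; D_a \left[ \left( 3\cos^{2}\theta - 1 \right)
  \;+\; \tfrac{3}{2}\, R \,\sin^{2}\theta \,\cos 2\varphi \right]
```

    Here $D_a$ is the axial component of the alignment tensor, $R$ its rhombicity, and $(\theta,\varphi)$ are the polar angles of the bond vector in the principal axis frame; the offset and amplitude of the per-element modulation curves therefore constrain $D_a$, $R$ and the orientation of each helix or strand.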

  10. In Vivo Validation of a Blood Vector Velocity Estimator with MR Angiography

    DEFF Research Database (Denmark)

    Hansen, Kristoffer Lindskov; Udesen, Jesper; Thomsen, Carsten

    2009-01-01

    Conventional Doppler methods for blood velocity estimation only estimate the velocity component along the ultrasound beam direction. This implies that a Doppler angle under examination close to 90° results in unreliable information about the true blood direction and blood velocity. The novel method...... indicate that reliable vector velocity estimates can be obtained in vivo using the presented angle-independent 2-D vector velocity method. The TO method can be a useful alternative to conventional Doppler systems by avoiding the angle artifact, thus giving quantitative velocity information....

  11. Use of Modern Chemical Protein Synthesis and Advanced Fluorescent Assay Techniques to Experimentally Validate the Functional Annotation of Microbial Genomes

    Energy Technology Data Exchange (ETDEWEB)

    Kent, Stephen [University of Chicago

    2012-07-20

    The objective of this research program was to prototype methods for the chemical synthesis of predicted protein molecules in annotated microbial genomes. High throughput chemical methods were to be used to make large numbers of predicted proteins and protein domains, based on microbial genome sequences. Microscale chemical synthesis methods for the parallel preparation of peptide-thioester building blocks were developed; these peptide segments are used for the parallel chemical synthesis of proteins and protein domains. Ultimately, it is envisaged that these synthetic molecules would be ‘printed’ in spatially addressable arrays. The unique ability of total synthesis to precision label protein molecules with dyes and with chemical or biochemical ‘tags’ can be used to facilitate novel assay technologies adapted from state-of-the art single molecule fluorescence detection techniques. In the future, in conjunction with modern laboratory automation this integrated set of techniques will enable high throughput experimental validation of the functional annotation of microbial genomes.

  12. Validation of SMAP Root Zone Soil Moisture Estimates with Improved Cosmic-Ray Neutron Probe Observations

    Science.gov (United States)

    Babaeian, E.; Tuller, M.; Sadeghi, M.; Franz, T.; Jones, S. B.

    2017-12-01

    Soil Moisture Active Passive (SMAP) soil moisture products are commonly validated based on point-scale reference measurements, despite the exorbitant spatial scale disparity. The difference between the measurement depth of point-scale sensors and the penetration depth of SMAP further complicates evaluation efforts. Cosmic-ray neutron probes (CRNP) with an approximately 500-m radius footprint provide an appealing alternative for SMAP validation. This study is focused on the validation of SMAP level-4 root zone soil moisture products with 9-km spatial resolution based on CRNP observations at twenty U.S. reference sites with climatic conditions ranging from semiarid to humid. The CRNP measurements are often biased by additional hydrogen sources such as surface water, atmospheric vapor, or mineral lattice water, which sometimes yield unrealistic moisture values in excess of the soil water storage capacity. These effects were removed during CRNP data analysis. Comparison of SMAP data with corrected CRNP observations revealed a very high correlation for most of the investigated sites, which opens new avenues for validation of current and future satellite soil moisture products.

  13. Repeated holdout Cross-Validation of Model to Estimate Risk of Lyme Disease by Landscape Attributes

    Science.gov (United States)

    We previously modeled Lyme disease (LD) risk at the landscape scale; here we evaluate the model's overall goodness-of-fit using holdout validation. Landscapes were characterized within road-bounded analysis units (AU). Observed LD cases (obsLD) were ascertained per AU. Data were ...

  14. Validation of traffic-related air pollution exposure estimates for long-term studies

    NARCIS (Netherlands)

    Van Roosbroeck, S.

    2007-01-01

    This thesis describes a series of studies that investigate the validity of using outdoor concentrations and/or traffic-related indicator exposure variables as a measure for exposure assessment in epidemiological studies on the long-term effect of traffic-related air pollution. A pilot study was

  15. Reliability and validity of food portion size estimation from images using manual flexible digital virtual meshes

    Science.gov (United States)

    The eButton takes frontal images at 4 second intervals throughout the day. A three-dimensional (3D) manually administered wire mesh procedure has been developed to quantify portion sizes from the two-dimensional (2D) images. This paper reports a test of the interrater reliability and validity of use...

  16. Validation and measurement uncertainty estimation in food microbiology: differences between quantitative and qualitative methods

    Directory of Open Access Journals (Sweden)

    Vesna Režić Dereani

    2010-09-01

    Full Text Available The aim of this research is to describe quality control procedures, procedures for validation and measurement uncertainty (MU determination as an important element of quality assurance in food microbiology laboratory for qualitative and quantitative type of analysis. Accreditation is conducted according to the standard ISO 17025:2007. General requirements for the competence of testing and calibration laboratories, which guarantees the compliance with standard operating procedures and the technical competence of the staff involved in the tests, recently are widely introduced in food microbiology laboratories in Croatia. In addition to quality manual introduction, and a lot of general documents, some of the most demanding procedures in routine microbiology laboratories are measurement uncertainty (MU procedures and validation experiment design establishment. Those procedures are not standardized yet even at international level, and they require practical microbiological knowledge, altogether with statistical competence. Differences between validation experiments design for quantitative and qualitative food microbiology analysis are discussed in this research, and practical solutions are shortly described. MU for quantitative determinations is more demanding issue than qualitative MU calculation. MU calculations are based on external proficiency testing data and internal validation data. In this paper, practical schematic descriptions for both procedures are shown.

  17. A comparative study of soft sensor design for lipid estimation of microalgal photobioreactor system with experimental validation.

    Science.gov (United States)

    Yoo, Sung Jin; Jung, Dong Hwi; Kim, Jung Hun; Lee, Jong Min

    2015-03-01

    This study examines the applicability of various nonlinear estimators for online estimation of the lipid concentration in a microalgae cultivation system. Lipid is a useful bio-product that has many applications, including biofuels and bioactives. However, the improvement of lipid productivity using real-time monitoring and control with experimental validation is limited because measurement of lipid in microalgae is a difficult and time-consuming task. In this study, estimation of lipid concentration from other measurable sources such as biomass or glucose sensors was studied. The extended Kalman filter (EKF), unscented Kalman filter (UKF), and particle filter (PF) were compared in various cases for their applicability to photobioreactor systems. Furthermore, simulation studies to identify appropriate types of sensors for estimating lipid were also performed. Based on the case studies, the most effective case was validated with experimental data, and it was found that UKF and PF with time-varying system noise covariance are effective for the microalgal photobioreactor system. Copyright © 2014 Elsevier Ltd. All rights reserved.
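
    The abstract compares EKF, UKF and PF as soft sensors; the extended Kalman filter loop is the easiest to sketch. The code below is a generic discrete-time EKF with a deliberately toy two-state model (biomass measured, lipid unmeasured, with assumed growth and accumulation rates), offered only to show the predict/update structure, not the authors' photobioreactor model or tuning.

```python
import numpy as np

def ekf_step(x, P, z, f, F_jac, h, H_jac, Q, R):
    """One predict/update cycle of a discrete extended Kalman filter."""
    # Predict
    x_pred = f(x)
    F = F_jac(x)
    P_pred = F @ P @ F.T + Q
    # Update with measurement z
    H = H_jac(x_pred)
    y = z - h(x_pred)                         # innovation
    S = H @ P_pred @ H.T + R                  # innovation covariance
    K = P_pred @ H.T @ np.linalg.inv(S)       # Kalman gain
    x_new = x_pred + K @ y
    P_new = (np.eye(len(x)) - K @ H) @ P_pred
    return x_new, P_new

# Toy model: x = [biomass, lipid]; lipid accumulates in proportion to biomass.
dt, mu, k = 0.1, 0.05, 0.02
f     = lambda x: np.array([x[0] * (1 + mu * dt), x[1] + k * x[0] * dt])
F_jac = lambda x: np.array([[1 + mu * dt, 0.0], [k * dt, 1.0]])
h     = lambda x: np.array([x[0]])            # only biomass is measured
H_jac = lambda x: np.array([[1.0, 0.0]])

x, P = np.array([1.0, 0.1]), np.eye(2) * 0.1
Q, R = np.eye(2) * 1e-4, np.array([[1e-2]])
for z in [1.02, 1.06, 1.12, 1.18]:            # hypothetical biomass readings
    x, P = ekf_step(x, P, np.array([z]), f, F_jac, h, H_jac, Q, R)
print("estimated lipid concentration:", x[1])
```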

  18. Dry matter yield, chemical composition and estimated extractable protein of legume and grass species during the spring growth

    DEFF Research Database (Denmark)

    Solati, Zeinab; Jørgensen, Uffe; Eriksen, Jørgen

    2017-01-01

    Carbohydrate and Protein System across six harvests during the spring growth. RESULTS The estimated extractable protein [g kg−1 dry matter (DM)] defined as the easily available fractions B1+B2 was significantly higher in white clover and lucerne at all harvests while, if the more cell wall attached fraction B3...... for protein production purpose in a biorefinery due to its high extractable protein content per kg DM. In order to maximise the protein production capacity, harvest should take place during early growth due to a decline in protein extractability with maturity. The final economy of the concept will depend...

  19. Estimating population cause-specific mortality fractions from in-hospital mortality: validation of a new method.

    Directory of Open Access Journals (Sweden)

    Christopher J L Murray

    2007-11-01

    Full Text Available Cause-of-death data for many developing countries are not available. Information on deaths in hospital by cause is available in many low- and middle-income countries but is not a representative sample of deaths in the population. We propose a method to estimate population cause-specific mortality fractions (CSMFs) using data already collected in many middle-income and some low-income developing nations, yet rarely used: in-hospital death records. For a given cause of death, a community's hospital deaths are equal to total community deaths multiplied by the proportion of deaths occurring in hospital. If we can estimate the proportion dying in hospital, we can estimate the proportion dying in the population using deaths in hospital. We propose to estimate the proportion of deaths for an age, sex, and cause group that die in hospital from the subset of the population where vital registration systems function or from another population. We evaluated our method using nearly complete vital registration (VR) data from Mexico 1998-2005, which records whether a death occurred in a hospital. In this validation test, we used 45 disease categories. We validated our method in two ways: nationally and between communities. First, we investigated how the method's accuracy changes as we decrease the amount of Mexican VR used to estimate the proportion of each age, sex, and cause group dying in hospital. Decreasing VR data used for this first step from 100% to 9% produces only a 12% maximum relative error between estimated and true CSMFs. Even if Mexico collected full VR information only in its capital city with 9% of its population, our estimation method would produce an average relative error in CSMFs across the 45 causes of just over 10%. Second, we used VR data for the capital zone (Distrito Federal and Estado de Mexico) and estimated CSMFs for the three lowest-development states. Our estimation method gave an average relative error of 20%, 23%, and 31% for
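
    The core identity of the proposed method (community deaths for a cause equal hospital deaths divided by the proportion of that cause's deaths occurring in hospital, then renormalised to fractions) can be written in a few lines. The sketch below is a minimal illustration with made-up numbers for a single age-sex stratum, not the authors' validation code.

```python
def estimate_csmf(hospital_deaths: dict, prop_in_hospital: dict) -> dict:
    """Estimate population cause-specific mortality fractions (CSMFs).

    hospital_deaths   -- deaths observed in hospital, by cause
    prop_in_hospital  -- estimated proportion of deaths from that cause
                         (within an age/sex stratum) that occur in hospital
    """
    # Scale hospital deaths up to estimated total community deaths per cause.
    total_by_cause = {c: hospital_deaths[c] / prop_in_hospital[c] for c in hospital_deaths}
    grand_total = sum(total_by_cause.values())
    # Normalise to cause-specific mortality fractions.
    return {c: d / grand_total for c, d in total_by_cause.items()}

# Hypothetical example for three causes in one age-sex stratum.
hosp = {"ischaemic heart disease": 120, "road injury": 30, "diarrhoeal disease": 15}
p_hosp = {"ischaemic heart disease": 0.60, "road injury": 0.25, "diarrhoeal disease": 0.15}
print(estimate_csmf(hosp, p_hosp))   # fractions sum to 1
```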

  20. Optical Tracking Data Validation and Orbit Estimation for Sparse Observations of Satellites by the OWL-Net.

    Science.gov (United States)

    Choi, Jin; Jo, Jung Hyun; Yim, Hong-Suh; Choi, Eun-Jung; Cho, Sungki; Park, Jang-Hyun

    2018-06-07

    An Optical Wide-field patroL-Network (OWL-Net) has been developed for maintaining Korean low Earth orbit (LEO) satellites' orbital ephemeris. The OWL-Net consists of five optical tracking stations. Brightness signals of reflected sunlight of the targets were detected by a charged coupled device (CCD). A chopper system was adopted for fast astrometric data sampling, maximum 50 Hz, within a short observation time. The astrometric accuracy of the optical observation data was validated with precise orbital ephemeris such as Consolidated Prediction File (CPF) data and precise orbit determination result with onboard Global Positioning System (GPS) data from the target satellite. In the optical observation simulation of the OWL-Net for 2017, an average observation span for a single arc of 11 LEO observation targets was about 5 min, while an average optical observation separation time was 5 h. We estimated the position and velocity with an atmospheric drag coefficient of LEO observation targets using a sequential-batch orbit estimation technique after multi-arc batch orbit estimation. Post-fit residuals for the multi-arc batch orbit estimation and sequential-batch orbit estimation were analyzed for the optical measurements and reference orbit (CPF and GPS data). The post-fit residuals with reference show few tens-of-meters errors for in-track direction for multi-arc batch and sequential-batch orbit estimation results.

  1. Optical Tracking Data Validation and Orbit Estimation for Sparse Observations of Satellites by the OWL-Net

    Directory of Open Access Journals (Sweden)

    Jin Choi

    2018-06-01

    Full Text Available An Optical Wide-field patroL-Network (OWL-Net) has been developed for maintaining Korean low Earth orbit (LEO) satellites’ orbital ephemeris. The OWL-Net consists of five optical tracking stations. Brightness signals of reflected sunlight of the targets were detected by a charged coupled device (CCD). A chopper system was adopted for fast astrometric data sampling, maximum 50 Hz, within a short observation time. The astrometric accuracy of the optical observation data was validated with precise orbital ephemeris such as Consolidated Prediction File (CPF) data and precise orbit determination result with onboard Global Positioning System (GPS) data from the target satellite. In the optical observation simulation of the OWL-Net for 2017, an average observation span for a single arc of 11 LEO observation targets was about 5 min, while an average optical observation separation time was 5 h. We estimated the position and velocity with an atmospheric drag coefficient of LEO observation targets using a sequential-batch orbit estimation technique after multi-arc batch orbit estimation. Post-fit residuals for the multi-arc batch orbit estimation and sequential-batch orbit estimation were analyzed for the optical measurements and reference orbit (CPF and GPS data). The post-fit residuals with reference show few tens-of-meters errors for in-track direction for multi-arc batch and sequential-batch orbit estimation results.

  2. Validity of a Commercial Linear Encoder to Estimate Bench Press 1 RM from the Force-Velocity Relationship.

    Science.gov (United States)

    Bosquet, Laurent; Porta-Benache, Jeremy; Blais, Jérôme

    2010-01-01

    The aim of this study was to assess the validity and accuracy of a commercial linear encoder (Musclelab, Ergotest, Norway) to estimate Bench press 1 repetition maximum (1RM) from the force-velocity relationship. Twenty-seven physical education students and teachers (5 women and 22 men) with a heterogeneous history of strength training participated in this study. They performed a 1 RM test and a force-velocity test using a Bench press lifting task in a random order. Mean 1 RM was 61.8 ± 15.3 kg (range: 34 to 100 kg), while 1 RM estimated by the Musclelab's software from the force-velocity relationship was 56.4 ± 14.0 kg (range: 33 to 91 kg). Actual and estimated 1 RM were very highly correlated (r = 0.93, p < 0.001) but largely different (bias: 5.4 ± 5.7 kg, p < 0.001, ES = 1.37). The 95% limits of agreement were ±11.2 kg, which represented ±18% of actual 1 RM. It was concluded that 1 RM estimated from the force-velocity relationship was a good measure for monitoring training-induced adaptations, but also that it was not accurate enough to prescribe training intensities. Additional studies are required to determine whether accuracy is affected by age, sex or initial level. Key points: Some commercial devices allow estimation of 1 RM from the force-velocity relationship. These estimates are valid. However, their accuracy is not high enough to be of practical help for training intensity prescription. Day-to-day reliability of force and velocity measured by the linear encoder has been shown to be very high, but the specific reliability of 1 RM estimated from the force-velocity relationship has to be determined before concluding on the usefulness of this approach in the monitoring of training-induced adaptations.

  3. Relative Validity and Reproducibility of a Food-Frequency Questionnaire for Estimating Food Intakes among Flemish Preschoolers

    Directory of Open Access Journals (Sweden)

    Inge Huybrechts

    2009-01-01

    Full Text Available The aims of this study were to assess the relative validity and reproducibility of a semi-quantitative food-frequency questionnaire (FFQ) applied in a large region-wide survey among 2.5-6.5 year-old children for estimating food group intakes. Parents/guardians were used as a proxy. Estimated diet records (3d) were used as reference method and reproducibility was measured by repeated FFQ administrations five weeks apart. In total 650 children were included in the validity analyses and 124 in the reproducibility analyses. Comparing median FFQ1 to FFQ2 intakes, almost all evaluated food groups showed median differences within a range of ± 15%. However, for median vegetables, fruit and cheese intake, FFQ1 was > 20% higher than FFQ2. For most foods a moderate correlation (0.5-0.7) was obtained between FFQ1 and FFQ2. For cheese, sugared drinks and fruit juice intakes correlations were even > 0.7. For median differences between the 3d EDR and the FFQ, six food groups (potatoes & grains; vegetables; fruit; cheese; meat, game, poultry and fish; and sugared drinks) gave a difference > 20%. The largest corrected correlations (>0.6) were found for the intake of potatoes and grains, fruit, milk products, cheese, sugared drinks, and fruit juice, while the lowest correlations (<0.4) were found for bread and meat products. The proportion of subjects classified within one quartile (in the same/adjacent category) by FFQ and EDR ranged from 67% (for meat products) to 88% (for fruit juice). Extreme misclassification into the opposite quartiles was for all food groups < 10%. The results indicate that our newly developed FFQ gives reproducible estimates of food group intake. Overall, moderate levels of relative validity were observed for estimates of food group intake.

  4. Development and test validation of a computational scheme for high-fidelity fluence estimations of the Swiss BWRs

    International Nuclear Information System (INIS)

    Vasiliev, A.; Wieselquist, W.; Ferroukhi, H.; Canepa, S.; Heldt, J.; Ledergerber, G.

    2011-01-01

    One of the current objectives within reactor analysis related projects at the Paul Scherrer Institut is the establishment of a comprehensive computational methodology for fast neutron fluence (FNF) estimations of reactor pressure vessels (RPV) and internals for both PWRs and BWRs. In the recent past, such an integral calculational methodology based on the CASMO-4/SIMULATE-3/MCNPX system of codes was developed for PWRs and validated against RPV scraping tests. Based on the very satisfactory validation results, the methodology was recently applied for predictive FNF evaluations of a Swiss PWR to support the national nuclear safety inspectorate in the framework of life-time estimations. Today, focus is given at PSI to developing a corresponding advanced methodology for high-fidelity FNF estimations of BWR reactors. In this paper, the preliminary steps undertaken in that direction are presented. To start, the concepts of the PWR computational scheme and its transfer/adaptation to BWRs are outlined. Then, the modelling of a Swiss BWR characterized by very heterogeneous core designs is presented along with preliminary sensitivity studies carried out to assess the level of detail required for the complex core region. Finally, a first validation test case is presented on the basis of two dosimeter monitors irradiated during two recent cycles of the given BWR reactor. The achieved computational results show a satisfactory agreement with measured dosimeter data and thereby illustrate the feasibility of applying the PSI FNF computational scheme also to BWRs. Further sensitivity/optimization studies are nevertheless necessary in order to consolidate the scheme and to continuously increase the fidelity and reliability of the BWR FNF estimations. (author)

  5. MR-based Water Content Estimation in Cartilage: Design and Validation of a Method

    DEFF Research Database (Denmark)

    Shiguetomi Medina, Juan Manuel; Kristiansen, Maja Sofie; Ringgaard, Steffen

    2012-01-01

    Objective Design and validation of an MR-based method that allows the calculation of the water content in cartilage tissue. Material and Methods We modified and adapted to cartilage tissue the T1 map based water content MR sequences commonly used in the neurology field. Using a 37 Celsius degree stable...... was custom-built and programmed. Finally, we validated the method after measuring and comparing 3 more cartilage samples in a living animal (pig). The obtained data was analyzed and the water content calculated. Then, the same samples were freeze-dried (this technique allows removal of all the water that a tissue...... contains) and we measured the water they contained. Results We could reproduce the 37 Celsius degree system twice and could perform the measurements in a similar way. We found that the MR T1 map based water content sequences can provide information that, after being analyzed with special software, can...

  6. PEANO, a toolbox for real-time process signal validation and estimation

    Energy Technology Data Exchange (ETDEWEB)

    Fantoni, Paolo F.; Figedy, Stefan; Racz, Attila

    1998-02-01

    PEANO (Process Evaluation and Analysis by Neural Operators), a toolbox for real time process signal validation and condition monitoring, has been developed. This system analyses the signals, which are e.g. the readings of process monitoring sensors, computes their expected values and alerts if real values deviate from the expected ones by more than the limits allow. The reliability level of the current analysis is also produced. The system is based on neuro-fuzzy techniques. Artificial Neural Networks and Fuzzy Logic models can be combined to exploit the learning and generalisation capability of the first technique with the approximate reasoning embedded in the second approach. Real-time process signal validation is an application field where the use of this technique can improve the diagnosis of faulty sensors and the identification of outliers in a robust and reliable way. This study implements a fuzzy and possibilistic clustering algorithm to classify the operating region where the validation process has to be performed. The possibilistic approach (rather than probabilistic) allows a "don't know" classification that results in a fast detection of unforeseen plant conditions or outliers. Specialised Artificial Neural Networks are used for the validation process, one for each fuzzy cluster in which the operating map has been divided. There are two main advantages in using this technique: the accuracy and generalisation capability is increased compared to the case of a single network working in the entire operating region, and the ability to identify abnormal conditions, where the system is not capable of operating with satisfactory accuracy, is improved. This model has been tested in a simulated environment on a French PWR, to monitor safety-related reactor variables over the entire power-flow operating map. (author)

  7. PEANO, a toolbox for real-time process signal validation and estimation

    International Nuclear Information System (INIS)

    Fantoni, Paolo F.; Figedy, Stefan; Racz, Attila

    1998-02-01

    PEANO (Process Evaluation and Analysis by Neural Operators), a toolbox for real time process signal validation and condition monitoring, has been developed. This system analyses the signals, which are e.g. the readings of process monitoring sensors, computes their expected values and alerts if real values deviate from the expected ones by more than the limits allow. The reliability level of the current analysis is also produced. The system is based on neuro-fuzzy techniques. Artificial Neural Networks and Fuzzy Logic models can be combined to exploit the learning and generalisation capability of the first technique with the approximate reasoning embedded in the second approach. Real-time process signal validation is an application field where the use of this technique can improve the diagnosis of faulty sensors and the identification of outliers in a robust and reliable way. This study implements a fuzzy and possibilistic clustering algorithm to classify the operating region where the validation process has to be performed. The possibilistic approach (rather than probabilistic) allows a "don't know" classification that results in a fast detection of unforeseen plant conditions or outliers. Specialised Artificial Neural Networks are used for the validation process, one for each fuzzy cluster in which the operating map has been divided. There are two main advantages in using this technique: the accuracy and generalisation capability is increased compared to the case of a single network working in the entire operating region, and the ability to identify abnormal conditions, where the system is not capable of operating with satisfactory accuracy, is improved. This model has been tested in a simulated environment on a French PWR, to monitor safety-related reactor variables over the entire power-flow operating map. (author)
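
    The two PEANO records above describe the same pipeline: classify the current operating point with a possibilistic (typicality-based) clustering step, refuse to validate when no cluster is typical enough, and otherwise compare each signal against the expected value produced by the model assigned to that cluster. The sketch below is a heavily simplified, numpy-only illustration of that control flow; the cluster centres, widths, per-cluster models and thresholds are all hypothetical, and simple linear models stand in for the specialised neural networks used by the real toolbox.

```python
import numpy as np

# Hypothetical operating-region clusters (centres and widths in normalised units).
centres = np.array([[0.3, 0.4], [0.8, 0.9]])     # (power, flow) per cluster
widths  = np.array([0.15, 0.15])

# Hypothetical per-cluster linear models predicting a validated signal
# (e.g. a temperature) from the operating point: y = w . x + b
weights = np.array([[50.0, 10.0], [40.0, 25.0]])
biases  = np.array([250.0, 260.0])

TYPICALITY_MIN = 0.2      # below this, report "don't know"
DEVIATION_MAX  = 5.0      # alert threshold on |measured - expected|

def validate(operating_point, measured_signal):
    # Possibilistic memberships: each cluster is judged on its own, without
    # normalisation, so an unforeseen operating point can be atypical of every
    # cluster at once (the fast "don't know" detection described in the record).
    d2 = np.sum((centres - operating_point) ** 2, axis=1)
    typicality = np.exp(-d2 / widths ** 2)
    best = int(np.argmax(typicality))
    if typicality[best] < TYPICALITY_MIN:
        return "don't know (operating region not recognised)"
    expected = weights[best] @ operating_point + biases[best]
    if abs(measured_signal - expected) > DEVIATION_MAX:
        return f"alert: measured {measured_signal:.1f}, expected {expected:.1f}"
    return f"validated (expected {expected:.1f})"

print(validate(np.array([0.32, 0.41]), measured_signal=270.3))
print(validate(np.array([0.55, 0.10]), measured_signal=270.3))   # unforeseen condition
```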

  8. Engineering C-integral estimates for generalised creep behaviour and finite element validation

    International Nuclear Information System (INIS)

    Kim, Yun-Jae; Kim, Jin-Su; Huh, Nam-Su; Kim, Young-Jin

    2002-01-01

    This paper proposes an engineering method to estimate the creep C-integral for realistic creep laws to assess defective components operating at elevated temperatures. The proposed estimation method is mainly for the steady-state C*-integral, but a suggestion is also given for estimating the transient C(t)-integral. The reference stress approach is the basis of the proposed equation, but an enhancement in terms of accuracy is made through the definition of the reference stress. The proposed estimation equations are compared with extensive elastic-creep FE results employing various creep-deformation constitutive laws for six different geometries, including two-dimensional, axi-symmetric and three-dimensional geometries. Overall good agreement between the proposed method and the FE results provides confidence in the use of the proposed method for defect assessment of components at elevated temperatures. Moreover, it is shown that for surface cracks the proposed method can be used to estimate C* at any location along the crack front.
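
    For context, the reference stress approach mentioned above is commonly quoted in the generic form below; the paper's contribution lies in an enhanced definition of the reference stress, so this expression should be read as the textbook starting point rather than the proposed equation itself.

```latex
C^{*} \;\approx\; \sigma_{\mathrm{ref}}\,
      \dot{\varepsilon}_{c}\!\left(\sigma_{\mathrm{ref}}\right)
      \left(\frac{K}{\sigma_{\mathrm{ref}}}\right)^{2},
\qquad
\sigma_{\mathrm{ref}} = \frac{P}{P_{L}}\,\sigma_{y}
```

    Here $\dot{\varepsilon}_{c}(\sigma_{\mathrm{ref}})$ is the creep strain rate evaluated at the reference stress (so any realistic creep law can be inserted), $K$ is the stress intensity factor, $P$ the applied load, $P_{L}$ the plastic limit load and $\sigma_{y}$ the yield stress.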

  9. Uncertainty estimates of purity measurements based on current information: toward a "live validation" of purity methods.

    Science.gov (United States)

    Apostol, Izydor; Kelner, Drew; Jiang, Xinzhao Grace; Huang, Gang; Wypych, Jette; Zhang, Xin; Gastwirt, Jessica; Chen, Kenneth; Fodor, Szilan; Hapuarachchi, Suminda; Meriage, Dave; Ye, Frank; Poppe, Leszek; Szpankowski, Wojciech

    2012-12-01

    The objective was to predict precision and other performance characteristics of chromatographic purity methods, which represent the most widely used form of analysis in the biopharmaceutical industry. We have conducted a comprehensive survey of purity methods, and show that all performance characteristics fall within narrow measurement ranges. This observation was used to develop a model called Uncertainty Based on Current Information (UBCI), which expresses these performance characteristics as a function of the signal and noise levels, hardware specifications, and software settings. We applied the UBCI model to assess the uncertainty of purity measurements, and compared the results to those from conventional qualification. We demonstrated that the UBCI model is suitable to dynamically assess method performance characteristics, based on information extracted from individual chromatograms. The model provides an opportunity for streamlining qualification and validation studies by implementing a "live validation" of test results utilizing UBCI as a concurrent assessment of measurement uncertainty. Therefore, UBCI can potentially mitigate the challenges associated with laborious conventional method validation and facilitate the introduction of more advanced analytical technologies during the method lifecycle.

  10. Does altered protein metabolism interfere with postmortem degradation analysis for PMI estimation?

    Science.gov (United States)

    Zissler, A; Ehrenfellner, B; Foditsch, E E; Monticelli, F C; Pittner, S

    2018-03-02

    An accurate estimation of the postmortem interval (PMI) is a central aspect in forensic routine. Recently, a novel approach based on the analysis of postmortem muscle protein degradation has been proposed. However, a number of questions remain to be answered until sensible application of this method to a broad variety of forensic cases is possible. To evaluate whether altered in vivo protein metabolism interferes with postmortem degradation patterns, we conducted a comparative study. We developed a standardized animal degradation model in rats, and collected additional muscle samples from animals recovering from muscle injury and from rats with developed disuse muscle atrophy after induced spinal cord injury. All samples were analyzed by SDS-PAGE and Western blot, labeling well-characterized muscle proteins. Tropomyosin was found to be stable throughout the investigated PMI and no alterations were detected in regenerating and atrophic muscles. In contrast, significant predictable postmortem changes occurred in desmin and vinculin protein band patterns. While no significant deviations from native patterns were detected in at-death samples of disuse muscle atrophy, interestingly, samples of rats recovering from muscle injury revealed additional desmin and vinculin degradation bands that did not occur in this form in any of the examined postmortem samples regardless of PMI. It remains to be investigated whether in vivo-altered metabolism influences postmortem degradation kinetics or if such muscle samples undergo postmortem degradation in a regular fashion.

  11. Validation of a food quantification picture book and portion sizes estimation applying perception and memory methods.

    Science.gov (United States)

    Szenczi-Cseh, J; Horváth, Zs; Ambrus, Á

    2017-12-01

    We tested the applicability of EPIC-SOFT food picture series used in the context of a Hungarian food consumption survey gathering data for exposure assessment, and investigated errors in food portion estimation resulted from the visual perception and conceptualisation-memory. Sixty-two participants in three age groups (10 to foods. The results were considered acceptable if the relative difference between average estimated and actual weight obtained through the perception method was ≤25%, and the relative standard deviation of the individual weight estimates was food items were rated acceptable. Small portion sizes were tended to be overestimated, large ones were tended to be underestimated. Portions of boiled potato and creamed spinach were all over- and underestimated, respectively. Recalling the portion sizes resulted in overestimation with larger differences (up to 60.7%).

  12. In-vivo validation of fast spectral velocity estimation techniques – preliminary results

    DEFF Research Database (Denmark)

    Hansen, Kristoffer Lindskov; Gran, Fredrik; Pedersen, Mads Møller

    2008-01-01

    Spectral Doppler is a common way to estimate blood velocities in medical ultrasound (US). The standard way of estimating spectrograms is by using Welch's method (WM). WM is dependent on a long observation window (OW) (about 100 transmissions) to produce spectrograms with sufficient spectral...... resolution and contrast. Two adaptive filterbank methods have been suggested to circumvent this problem: the Blood spectral Power Capon method (BPC) and the Blood Amplitude and Phase Estimation method (BAPES). Previously, simulations and flow rig experiments have indicated that the two adaptive methods can...... was scanned using the experimental ultrasound scanner RASMUS and a B-K Medical 5 MHz linear array transducer with an angle of insonation not exceeding 60deg. All 280 spectrograms were then randomised and presented to a radiologist blinded for method and OW for visual evaluation: useful or not useful. WMbw...

  13. Validity and feasibility of a satellite imagery-based method for rapid estimation of displaced populations.

    Science.gov (United States)

    Checchi, Francesco; Stewart, Barclay T; Palmer, Jennifer J; Grundy, Chris

    2013-01-23

    Estimating the size of forcibly displaced populations is key to documenting their plight and allocating sufficient resources to their assistance, but is often not done, particularly during the acute phase of displacement, due to methodological challenges and inaccessibility. In this study, we explored the potential use of very high resolution satellite imagery to remotely estimate forcibly displaced populations. Our method consisted of multiplying (i) manual counts of assumed residential structures on a satellite image and (ii) estimates of the mean number of people per structure (structure occupancy) obtained from publicly available reports. We computed population estimates for 11 sites in Bangladesh, Chad, Democratic Republic of Congo, Ethiopia, Haiti, Kenya and Mozambique (six refugee camps, three internally displaced persons' camps and two urban neighbourhoods with a mixture of residents and displaced) ranging in population from 1,969 to 90,547, and compared these to "gold standard" reference population figures from census or other robust methods. Structure counts by independent analysts were reasonably consistent. Between one and 11 occupancy reports were available per site and most of these reported people per household rather than per structure. The imagery-based method had a precision relative to reference population figures of layout. For each site, estimates were produced in 2-5 working person-days. In settings with clearly distinguishable individual structures, the remote, imagery-based method had reasonable accuracy for the purposes of rapid estimation, was simple and quick to implement, and would likely perform better in more current application. However, it may have insurmountable limitations in settings featuring connected buildings or shelters, a complex pattern of roofs and multi-level buildings. Based on these results, we discuss possible ways forward for the method's development.
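
    The estimation step described above is a multiplication of a structure count by an assumed occupancy, and its uncertainty is dominated by those two inputs. The sketch below illustrates that arithmetic with a simple propagation of relative errors added in quadrature; the error model and all numbers are assumptions for illustration, not values from the study.

```python
import math

def population_estimate(structure_count, occupancy, rel_err_count=0.10, rel_err_occ=0.25):
    """Estimate a displaced population as structures x people-per-structure.

    rel_err_count -- assumed relative uncertainty of the manual structure count
    rel_err_occ   -- assumed relative uncertainty of the occupancy figure
    """
    estimate = structure_count * occupancy
    # Combine the two relative uncertainties in quadrature (independence assumed).
    rel_err = math.sqrt(rel_err_count ** 2 + rel_err_occ ** 2)
    return estimate, estimate * (1 - rel_err), estimate * (1 + rel_err)

# Hypothetical camp: 3,500 counted shelters, 4.8 people per shelter on average.
est, low, high = population_estimate(3500, 4.8)
print(f"{est:,.0f} people (range {low:,.0f} - {high:,.0f})")
```

    In practice the occupancy figure is the weaker input, which is consistent with the abstract's reliance on published household- or structure-occupancy reports.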

  14. Validation and scale dependencies of the triangle method for the evaporative fraction estimation over heterogeneous areas

    DEFF Research Database (Denmark)

    de Tomás, Alberto; Nieto, Héctor; Guzinski, Radoslaw

    2014-01-01

    Remote sensing has proved to be a consistent tool for monitoring water fluxes at regional scales. The triangle method, in particular, estimates the evaporative fraction (EF), defined as the ratio of latent heat flux (LE) to available energy, based on the relationship between satellite observations...... of land surface temperature and a vegetation index. Among other methodologies, this approach has been commonly used as an approximation to estimate LE, mainly over large semi-arid areas with uniform landscape features. In this study, an interpretation of the triangular space has been applied over...
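
    Although the abstract is truncated, the quantity it refers to can be stated: in the triangle (land surface temperature versus vegetation index) method, the evaporative fraction of a pixel is parameterised by its position between the dry and wet edges of the scatter space. One widely used simplified form is shown below; the exact formulation applied in the cited study may differ.

```latex
EF \;=\; \frac{LE}{R_{n}-G} \;\approx\;
EF_{\max}\,\frac{T_{\max}(VI)-T_{s}}{T_{\max}(VI)-T_{\min}(VI)}
```

    Here $T_{s}$ is the observed land surface temperature of the pixel, $T_{\max}(VI)$ and $T_{\min}(VI)$ are the dry- and wet-edge temperatures at the pixel's vegetation index, $R_{n}-G$ is the available energy, and $EF_{\max}$ is an upper bound on the evaporative fraction reached at the wet edge.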

  15. Validity of Standing Posture Eight-electrode Bioelectrical Impedance to Estimate Body Composition in Taiwanese Elderly

    Directory of Open Access Journals (Sweden)

    Ling-Chun Lee

    2014-09-01

    Conclusion: The results of this study showed that the impedance index and LST in the whole body, upper limbs, and lower limbs derived from DXA findings were highly correlated. The LST and BF% estimated by BIA8 in whole body and various body segments were highly correlated with the corresponding DXA results; however, BC-418 overestimates the participants' appendicular LST and underestimates whole body BF%. Therefore, caution is needed when interpreting the results of appendicular LST and whole body BF% estimated for elderly adults.

  16. Validation of a rapid, non-radioactive method to quantify internalisation of G-protein coupled receptors

    NARCIS (Netherlands)

    Jongsma, Maikel; Florczyk, Urszula M.; Hendriks-Balk, Marieelle C.; Michel, Martin C.; Peters, Stephan L. M.; Alewijnse, Astrid E.

    2007-01-01

    Agonist exposure can cause internalisation of G-protein coupled receptors (GPCRs), which may be a part of desensitisation but also of cellular signaling. Previous methods to study internalisation have been tedious or only poorly quantitative. Therefore, we have developed and validated a quantitative

  17. [Prognostic estimation in critical patients. Validation of a new and very simple system of prognostic estimation of survival in an intensive care unit].

    Science.gov (United States)

    Abizanda, R; Padron, A; Vidal, B; Mas, S; Belenguer, A; Madero, J; Heras, A

    2006-04-01

    To validate a new system for the prognostic estimation of survival in critical patients (EPEC) seen in a multidisciplinary intensive care unit (ICU). Prospective analysis of a patient cohort seen in the ICU of a multidisciplinary Intensive Medicine Service of a reference teaching hospital with 19 beds. Four hundred eighty-four patients admitted consecutively over 6 months in 2003. Data collection of a basic minimum data set that includes patient identification data (gender, age), reason for admission and their origin, and prognostic estimation of survival by EPEC, MPM II₀ and SAPS II (the latter two considered the gold standard). Mortality was evaluated on hospital discharge. EPEC validation was done with analysis of its discriminating capacity (ROC curve), calibration of its prognostic capacity (Hosmer-Lemeshow C test), and resolution of the 2 × 2 contingency tables around different probability values (20%, 50%, 70% and the mean value of the prognostic estimation). The standardized mortality rate (SMR) for each one of the methods was calculated. Linear regression of the EPEC with respect to the MPM II₀ and SAPS II was established and concordance analyses (Bland-Altman test) of the prediction of mortality by the three systems were done. In spite of an apparently good linear correlation, similar accuracy of prediction and discrimination capacity, EPEC is not well calibrated (no likelihood of death greater than 50%) and the concordance analyses show that more than 10% of the pairs were outside the 95% confidence interval. In spite of its ease of application and calculation and of incorporating delay of admission to the ICU as a variable, EPEC does not offer any predictive advantage over MPM II₀ or SAPS II, and its predictions fit reality less well.
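
    For reference, the standardized mortality rate (SMR) computed for each prognostic system is the ratio of observed deaths to the deaths expected from the model's individual predictions:

```latex
SMR \;=\; \frac{O}{E} \;=\; \frac{\sum_{i=1}^{n} d_{i}}{\sum_{i=1}^{n} \hat{p}_{i}}
```

    where $d_{i}$ is 1 if patient $i$ died and 0 otherwise, and $\hat{p}_{i}$ is the probability of death assigned to that patient by EPEC, MPM II₀ or SAPS II; values close to 1 indicate that predicted and observed hospital mortality agree.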

  18. Development and Validation of a Multiplexed Protein Quantitation Assay for the Determination of Three Recombinant Proteins in Soybean Tissues by Liquid Chromatography with Tandem Mass Spectrometry.

    Science.gov (United States)

    Hill, Ryan C; Oman, Trent J; Shan, Guomin; Schafer, Barry; Eble, Julie; Chen, Cynthia

    2015-08-26

    Currently, traditional immunochemistry technologies such as enzyme-linked immunosorbent assays (ELISA) are the predominant analytical tool used to measure levels of recombinant proteins expressed in genetically engineered (GE) plants. Recent advances in agricultural biotechnology have created a need to develop methods capable of selectively detecting and quantifying multiple proteins in complex matrices because of increasing numbers of transgenic proteins being coexpressed or "stacked" to achieve tolerance to multiple herbicides or to provide multiple modes of action for insect control. A multiplexing analytical method utilizing liquid chromatography with tandem mass spectrometry (LC-MS/MS) has been developed and validated to quantify three herbicide-tolerant proteins in soybean tissues: aryloxyalkanoate dioxygenase (AAD-12), 5-enol-pyruvylshikimate-3-phosphate synthase (2mEPSPS), and phosphinothricin acetyltransferase (PAT). Results from the validation showed high recovery and precision over multiple analysts and laboratories. Results from this method were comparable to those obtained with ELISA with respect to protein quantitation, and the described method was demonstrated to be suitable for multiplex quantitation of transgenic proteins in GE crops.

  19. Application of biomimetic HPLC to estimate lipophilicity, protein and phospholipid binding of potential peptide therapeutics

    Directory of Open Access Journals (Sweden)

    Klara Livia Valko

    2018-06-01

    Full Text Available Peptide therapeutics are new modalities offering several challenges to drug discovery. They are generally less stable and permeable in vivo. The characterization of their lipophilicity cannot be carried out using the traditional in silico or wet octanol/water partition coefficients. The prediction of their in vivo distribution and permeability is also challenging. In this paper, it is demonstrated that the biomimetic properties such as lipophilicity, protein and phospholipid binding can be easily assessed by HPLC using chemically bonded protein and immobilized artificial membrane (IAM) stationary phases. The obtained properties for a set of potential therapeutic peptides with 3 to 33 amino acids have been analysed and it was found that similar characteristics of the properties could be observed as for small molecule drugs. The albumin binding showed correlation with their measured lipophilicity on the C-18 stationary phase, with acidic peptides showing stronger than expected albumin binding. The IAM chromatography revealed peptide membrane affinity, which was stronger for positively charged peptides (containing arginine) and showed correlation to the alpha-1-acid glycoprotein (AGP) binding, which was also stronger for positively charged compounds. The in vivo volume of distribution and drug efficiency of the peptides have been estimated using the models developed for small molecules. One of the candidate linear peptides has been assessed in various cellular and in vivo assays and the results have confirmed the estimated cell partition and brain to plasma ratio. It can be demonstrated that, up to 21 amino acids, the peaks of the peptides obtained on the protein phase were symmetrical and narrow. The interaction of larger peptides with the protein stationary phases resulted in wide peaks showing multiple equilibrium processes with slow kinetics during chromatography. The larger peptides showed narrow and symmetrical peaks on the IAM column enabling

  20. Simple knowledge-based descriptors to predict protein-ligand interactions. Methodology and validation

    Science.gov (United States)

    Nissink, J. Willem M.; Verdonk, Marcel L.; Klebe, Gerhard

    2000-11-01

    A new type of shape descriptor is proposed to describe the spatial orientation for non-covalent interactions. It is built from simple, anisotropic Gaussian contributions that are parameterised by 10 adjustable values. The descriptors have been used to fit propensity distributions derived from scatter data stored in the IsoStar database. This database holds composite pictures of possible interaction geometries between a common central group and various interacting moieties, as extracted from small-molecule crystal structures. These distributions can be related to probabilities for the occurrence of certain interaction geometries among different functional groups. A fitting procedure is described that generates the descriptors in a fully automated way. For this purpose, we apply a similarity index that is tailored to the problem, the Split Hodgkin Index. It accounts for the similarity in regions of either high or low propensity in a separate way. Although dependent on the division into these two subregions, the index is robust and performs better than the regular Hodgkin index. The reliability and coverage of the fitted descriptors was assessed using SuperStar. SuperStar usually operates on the raw IsoStar data to calculate propensity distributions, e.g., for a binding site in a protein. For our purpose we modified the code to have it operate on our descriptors instead. This resulted in a substantial reduction in calculation time (factor of five to eight) compared to the original implementation. A validation procedure was performed on a set of 130 protein-ligand complexes, using four representative interacting probes to map the properties of the various binding sites: ammonium nitrogen, alcohol oxygen, carbonyl oxygen, and methyl carbon. The predicted `hot spots' for the binding of these probes were compared to the actual arrangement of ligand atoms in experimentally determined protein-ligand complexes. Results indicate that the version of SuperStar that applies to
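
    The similarity measure underlying the fitting procedure is the Hodgkin index; the split variant used in the paper applies the same expression separately to the high- and low-propensity subregions. In its standard form, for two propensity distributions $\rho_{A}$ and $\rho_{B}$:

```latex
H_{AB} \;=\; \frac{2\int \rho_{A}\,\rho_{B}\,\mathrm{d}v}
                  {\int \rho_{A}^{2}\,\mathrm{d}v \;+\; \int \rho_{B}^{2}\,\mathrm{d}v}
```

    which equals 1 for identical distributions and approaches 0 when they do not overlap.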

  1. Relative validity of fruit and vegetable intake estimated by the food frequency questionnaire used in the Danish National Birth Cohort

    DEFF Research Database (Denmark)

    Mikkelsen, Tina B.; Olsen, Sjurdur F.; Rasmussen, Salka E.

    2007-01-01

    Objective: To validate the fruit and vegetable intake estimated from the Food Frequency Questionnaire (FFQ) used in the Danish National Birth Cohort (DNBC). Subjects and setting: The DNBC is a cohort of 101,042 pregnant women in Denmark, who received a FFQ by mail in gestation week 25. A validation study with 88 participants was made. A seven-day weighed food diary (FD) and three different biomarkers were employed as comparison methods. Results: Significant correlations between FFQ and FD-based estimates were found for fruit (r=0.66); vegetables (r=0.32); juice (r=0.52); fruit and vegetables (F&V) (r=0.57); and fruit, vegetables, and juice (F&V&J) (r=0.62). Sensitivities of correct classification by FFQ into the two lowest and the two highest quintiles of F&V&J intake were 58-67% and 50-74%, respectively, and specificities were 71-79% and 65-83%, respectively. F&V&J intake estimated from......

  2. Sulphur levels in saliva as an estimation of sulphur status in cattle: a validation study

    NARCIS (Netherlands)

    Dermauw, V.; Froidmont, E.; Dijkstra, J.; Boever, de J.L.; Vyverman, W.; Debeer, A.E.; Janssens, G.P.J.

    2012-01-01

    Effective assessment of sulphur (S) status in cattle is important for optimal health, yet remains difficult. Rumen fluid S concentrations are preferred, but difficult to sample under practical conditions. This study aimed to evaluate salivary S concentration as estimator of S status in cattle.

  3. Validating diagnoses from hospital discharge registers change risk estimates for acute coronary syndrome

    DEFF Research Database (Denmark)

    Joensen, Albert Marni; Schmidt, E.B.; Dethlefsen, Claus

    2007-01-01

    of acute coronary syndrome (ACS) diagnoses identified in a hospital discharge register changed the relative risk estimates of well-established risk factors for ACS. Methods All first-time ACS diagnoses (n=1138) in the Danish National Patient Registry were identified among male participants in the Danish...

  4. Validation of the Ejike-Ijeh equations for the estimation of body fat ...

    African Journals Online (AJOL)

    The Ejike-Ijeh equations for the estimation of body fat percentage makes it possible for the body fat content of individuals and populations to be determined without the use of costly equipment. However, because the equations were derived using data from a young-adult (18-29 years old) Nigerian population, it is important ...

  5. Development and validation of a method to estimate body weight in ...

    African Journals Online (AJOL)

    Mid-arm circumference (MAC) has previously been used as a surrogate indicator of habitus, and the objective of this study was to determine whether MAC cut-off values could be used to predict habitus scores (HSs) to create an objective and standardised weight estimation methodology, the PAWPER XL-MAC method.

  6. Simultaneous Validation of Seven Physical Activity Questionnaires Used in Japanese Cohorts for Estimating Energy Expenditure: A Doubly Labeled Water Study.

    Science.gov (United States)

    Sasai, Hiroyuki; Nakata, Yoshio; Murakami, Haruka; Kawakami, Ryoko; Nakae, Satoshi; Tanaka, Shigeho; Ishikawa-Takata, Kazuko; Yamada, Yosuke; Miyachi, Motohiko

    2018-04-28

    Physical activity questionnaires (PAQs) used in large-scale Japanese cohorts have rarely been simultaneously validated against the gold standard doubly labeled water (DLW) method. This study examined the validity of seven PAQs used in Japan for estimating energy expenditure against the DLW method. Twenty healthy Japanese adults (9 men; mean age, 32.4 [standard deviation {SD}, 9.4] years, mainly researchers and students) participated in this study. Fifteen-day daily total energy expenditure (TEE) and basal metabolic rate (BMR) were measured using the DLW method and a metabolic chamber, respectively. Activity energy expenditure (AEE) was calculated as TEE - BMR - 0.1 × TEE. Seven PAQs were self-administered to estimate TEE and AEE. The mean measured values of TEE and AEE were 2,294 (SD, 318) kcal/day and 721 (SD, 161) kcal/day, respectively. All of the PAQs indicated moderate-to-strong correlations with the DLW method in TEE (rho = 0.57-0.84). Two PAQs (Japan Public Health Center Study [JPHC]-PAQ Short and JPHC-PAQ Long) showed significant equivalence in TEE and moderate intra-class correlation coefficients (ICC). None of the PAQs showed significantly equivalent AEE estimates, with differences ranging from -547 to 77 kcal/day. Correlations and ICCs in AEE were mostly weak or fair (rho = 0.02-0.54, and ICC = 0.00-0.44). Only JPHC-PAQ Short provided significant and fair agreement with the DLW method. TEE estimated by the PAQs showed moderate or strong correlations with the results of DLW. Two PAQs showed equivalent TEE and moderate agreement. None of the PAQs showed equivalent AEE estimation to the gold standard, with weak-to-fair correlations and agreements. Further studies with larger sample sizes are needed to confirm these findings.
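
    The activity energy expenditure definition used in the study is simple enough to verify against the reported group means; with TEE = 2,294 kcal/day and AEE = 721 kcal/day, the implied mean BMR is about 1,344 kcal/day (the 0.1 TEE term being the assumed diet-induced thermogenesis):

```latex
AEE \;=\; TEE - BMR - 0.1\,TEE \;=\; 0.9\,TEE - BMR
\quad\Longrightarrow\quad
BMR \;=\; 0.9 \times 2294 - 721 \;\approx\; 1344\ \text{kcal/day}
```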

  7. A new validation technique for estimations of body segment inertia tensors: Principal axes of inertia do matter.

    Science.gov (United States)

    Rossi, Marcel M; Alderson, Jacqueline; El-Sallam, Amar; Dowling, James; Reinbolt, Jeffrey; Donnelly, Cyril J

    2016-12-08

    The aims of this study were to: (i) establish a new criterion method to validate inertia tensor estimates by setting the experimental angular velocity data of an airborne objects as ground truth against simulations run with the estimated tensors, and (ii) test the sensitivity of the simulations to changes in the inertia tensor components. A rigid steel cylinder was covered with reflective kinematic markers and projected through a calibrated motion capture volume. Simulations of the airborne motion were run with two models, using inertia tensor estimated with geometric formula or the compound pendulum technique. The deviation angles between experimental (ground truth) and simulated angular velocity vectors and the root mean squared deviation angle were computed for every simulation. Monte Carlo analyses were performed to assess the sensitivity of simulations to changes in magnitude of principal moments of inertia within ±10% and to changes in orientation of principal axes of inertia within ±10° (of the geometric-based inertia tensor). Root mean squared deviation angles ranged between 2.9° and 4.3° for the inertia tensor estimated geometrically, and between 11.7° and 15.2° for the compound pendulum values. Errors up to 10% in magnitude of principal moments of inertia yielded root mean squared deviation angles ranging between 3.2° and 6.6°, and between 5.5° and 7.9° when lumped with errors of 10° in principal axes of inertia orientation. The proposed technique can effectively validate inertia tensors from novel estimation methods of body segment inertial parameter. Principal axes of inertia orientation should not be neglected when modelling human/animal mechanics. Copyright © 2016 Elsevier Ltd. All rights reserved.
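
    The validation metric in this record, the deviation angle between experimental and simulated angular velocity vectors and its root mean square over the flight, is straightforward to compute. The sketch below shows that computation for hypothetical time series; it is not the authors' pipeline.

```python
import numpy as np

def deviation_angles_deg(omega_exp, omega_sim):
    """Angle (degrees) between paired angular velocity vectors, frame by frame."""
    dots = np.sum(omega_exp * omega_sim, axis=1)
    norms = np.linalg.norm(omega_exp, axis=1) * np.linalg.norm(omega_sim, axis=1)
    cosang = np.clip(dots / norms, -1.0, 1.0)   # guard against rounding outside [-1, 1]
    return np.degrees(np.arccos(cosang))

def rms(values):
    return float(np.sqrt(np.mean(np.square(values))))

# Hypothetical ground-truth and simulated angular velocities (rad/s), one row per frame.
omega_exp = np.array([[1.0, 0.20, 0.10], [1.1, 0.25, 0.12], [1.2, 0.30, 0.15]])
omega_sim = np.array([[0.98, 0.22, 0.12], [1.05, 0.27, 0.10], [1.15, 0.33, 0.18]])

angles = deviation_angles_deg(omega_exp, omega_sim)
print("per-frame deviation (deg):", np.round(angles, 2))
print("RMS deviation angle (deg):", round(rms(angles), 2))
```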

  8. An intercomparison and validation of satellite-based surface radiative energy flux estimates over the Arctic

    Science.gov (United States)

    Riihelä, Aku; Key, Jeffrey R.; Meirink, Jan Fokke; Kuipers Munneke, Peter; Palo, Timo; Karlsson, Karl-Göran

    2017-05-01

    Accurate determination of radiative energy fluxes over the Arctic is of crucial importance for understanding atmosphere-surface interactions, melt and refreezing cycles of the snow and ice cover, and the role of the Arctic in the global energy budget. Satellite-based estimates can provide comprehensive spatiotemporal coverage, but the accuracy and comparability of the existing data sets must be ascertained to facilitate their use. Here we compare radiative flux estimates from Clouds and the Earth's Radiant Energy System (CERES) Synoptic 1-degree (SYN1deg)/Energy Balanced and Filled, Global Energy and Water Cycle Experiment (GEWEX) surface energy budget, and our own experimental FluxNet / Satellite Application Facility on Climate Monitoring cLoud, Albedo and RAdiation (CLARA) data against in situ observations over Arctic sea ice and the Greenland Ice Sheet during summer of 2007. In general, CERES SYN1deg flux estimates agree best with in situ measurements, although with two particular limitations: (1) over sea ice the upwelling shortwave flux in CERES SYN1deg appears to be underestimated because of an underestimated surface albedo and (2) the CERES SYN1deg upwelling longwave flux over sea ice saturates during midsummer. The Advanced Very High Resolution Radiometer-based GEWEX and FluxNet-CLARA flux estimates generally show a larger range in retrieval errors relative to CERES, with contrasting tendencies relative to each other. The largest source of retrieval error in the FluxNet-CLARA downwelling shortwave flux is shown to be an overestimated cloud optical thickness. The results illustrate that satellite-based flux estimates over the Arctic are not yet homogeneous and that further efforts are necessary to investigate the differences in the surface and cloud properties which lead to disagreements in flux retrievals.

  9. Are traditional body fat equations and anthropometry valid to estimate body fat in children and adolescents living with HIV?

    Science.gov (United States)

    Lima, Luiz Rodrigo Augustemak de; Martins, Priscila Custódio; Junior, Carlos Alencar Souza Alves; Castro, João Antônio Chula de; Silva, Diego Augusto Santos; Petroski, Edio Luiz

    The aim of this study was to assess the validity of traditional anthropometric equations and to develop predictive equations of total body and trunk fat for children and adolescents living with HIV based on anthropometric measurements. Forty-eight children and adolescents of both sexes (24 boys) aged 7-17 years, living in Santa Catarina, Brazil, participated in the study. Dual-energy X-ray absorptiometry was used as the reference method to evaluate total body and trunk fat. Height, body weight, circumferences and triceps, subscapular, abdominal and calf skinfolds were measured. The traditional equations of Lohman and Slaughter were used to estimate body fat. Multiple regression models were fitted to predict total body fat (Model 1) and trunk fat (Model 2) using a backward selection procedure. Model 1 had an R² = 0.85 and a standard error of the estimate of 1.43. Model 2 had an R² = 0.80 and a standard error of the estimate of 0.49. The traditional equations of Lohman and Slaughter showed poor performance in estimating body fat in children and adolescents living with HIV. The prediction models using anthropometry provided reliable estimates and can be used by clinicians and healthcare professionals to monitor total body and trunk fat in children and adolescents living with HIV. Copyright © 2017 Sociedade Brasileira de Infectologia. Published by Elsevier Editora Ltda. All rights reserved.

  10. Validation of a simple evaporation-transpiration scheme (SETS) to estimate evaporation using micro-lysimeter measurements

    Science.gov (United States)

    Ghazanfari, Sadegh; Pande, Saket; Savenije, Hubert

    2014-05-01

    Several methods exist to estimate evaporation (E) and transpiration (T). The Penman-Monteith or Priestley-Taylor methods, along with the Jarvis scheme for estimating vegetation resistance, are commonly used to estimate these fluxes as a function of land cover, atmospheric forcing and soil moisture content. In this study, a simple evaporation transpiration method is developed based on the MOSAIC Land Surface Model that explicitly accounts for soil moisture. Soil evaporation and transpiration estimated by SETS is validated on a single column of soil profile with measured evaporation data from three micro-lysimeters located at Ferdowsi University of Mashhad synoptic station, Iran, for the year 2005. SETS is run using both implicit and explicit computational schemes. Results show that the implicit scheme estimates the vapor flux close to that by the explicit scheme. The mean difference between the implicit and explicit scheme is -0.03 mm/day. The paired t-test of mean difference (p-value = 0.042 and t-value = 2.04) shows that there is no significant difference between the two methods. The sum of soil evaporation and transpiration from SETS is also compared with the P-M equation and micro-lysimeter measurements. SETS predicts the actual evaporation with a lower bias (1.24 mm/day) than P-M (1.82 mm/day) and with an R² value of 0.82.
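
    The implicit-versus-explicit comparison above rests on a paired t-test of daily flux estimates; a short sketch of that test with synthetic daily evaporation values (not the Mashhad micro-lysimeter data) is given below.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(42)
# Synthetic daily evaporation estimates (mm/day) from two computational schemes.
explicit = rng.normal(loc=2.0, scale=0.5, size=60)
implicit = explicit + rng.normal(loc=-0.03, scale=0.1, size=60)  # small mean offset

t_stat, p_value = stats.ttest_rel(implicit, explicit)
print(f"mean difference = {np.mean(implicit - explicit):.3f} mm/day")
print(f"t = {t_stat:.2f}, p = {p_value:.3f}")
```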

  11. Validation of an elastic registration technique to estimate anatomical lung modification in Non-Small-Cell Lung Cancer Tomotherapy

    International Nuclear Information System (INIS)

    Faggiano, Elena; Cattaneo, Giovanni M; Ciavarro, Cristina; Dell'Oca, Italo; Persano, Diego; Calandrino, Riccardo; Rizzo, Giovanna

    2011-01-01

    The study of lung parenchyma anatomical modification is useful to estimate dose discrepancies during the radiation treatment of Non-Small-Cell Lung Cancer (NSCLC) patients. We propose and validate a method, based on free-form deformation and mutual information, to elastically register planning kVCT with daily MVCT images, to estimate lung parenchyma modification during Tomotherapy. We analyzed 15 registrations between the planning kVCT and 3 MVCT images for each of the 5 NSCLC patients. Image registration accuracy was evaluated by visual inspection and, quantitatively, by Correlation Coefficients (CC) and Target Registration Errors (TRE). Finally, a lung volume correspondence analysis was performed to specifically evaluate registration accuracy in lungs. Results showed that elastic registration was always satisfactory, both qualitatively and quantitatively: TRE after elastic registration (average value of 3.6 mm) remained comparable and often smaller than voxel resolution. Lung volume variations were well estimated by elastic registration (average volume and centroid errors of 1.78% and 0.87 mm, respectively). Our results demonstrate that this method is able to estimate lung deformations in thorax MVCT, with an accuracy within 3.6 mm comparable or smaller than the voxel dimension of the kVCT and MVCT images. It could be used to estimate lung parenchyma dose variations in thoracic Tomotherapy

  12. Global temperature estimates in the troposphere and stratosphere: a validation study of COSMIC/FORMOSAT-3 measurements

    Directory of Open Access Journals (Sweden)

    P. Kishore

    2009-02-01

    Full Text Available This paper mainly focuses on the validation of temperature estimates derived with the newly launched Constellation Observing System for Meteorology Ionosphere and Climate (COSMIC)/Formosa Satellite 3 (FORMOSAT-3) system. The analysis is based on the radio occultation (RO) data samples collected during the first year of observation, from April 2006 to April 2007. For the validation, we have used the operational stratospheric analyses including the National Centers for Environmental Prediction - Reanalysis (NCEP), the Japanese 25-year Reanalysis (JRA-25), and the United Kingdom Met Office (MetO) data sets. Comparisons done in different formats reveal good agreement between the COSMIC and reanalysis outputs. Spatially, the largest deviations are noted in the polar latitudes, while in the vertical the maximum differences (2–4 K) occur in the tropical tropopause region. We found that among the three reanalysis data sets the NCEP data sets show the closest resemblance to the COSMIC measurements.

  13. The Validity of Value-Added Estimates from Low-Stakes Testing Contexts: The Impact of Change in Test-Taking Motivation and Test Consequences

    Science.gov (United States)

    Finney, Sara J.; Sundre, Donna L.; Swain, Matthew S.; Williams, Laura M.

    2016-01-01

    Accountability mandates often prompt assessment of student learning gains (e.g., value-added estimates) via achievement tests. The validity of these estimates has been questioned when performance on tests is low stakes for students. To assess the effects of motivation on value-added estimates, we assigned students to one of three test consequence…

  14. Evaluation of Different Estimation Methods for Accuracy and Precision in Biological Assay Validation.

    Science.gov (United States)

    Yu, Binbing; Yang, Harry

    2017-01-01

    Biological assays (bioassays) are procedures to estimate the potency of a substance by studying its effects on living organisms, tissues, and cells. Bioassays are essential tools for gaining insight into biologic systems and processes including, for example, the development of new drugs and monitoring environmental pollutants. Two of the most important parameters of bioassay performance are relative accuracy (bias) and precision. Although general strategies and formulas are provided in USP, a comprehensive understanding of the definitions of bias and precision remains elusive. Additionally, whether there is a beneficial use of data transformation in estimating intermediate precision remains unclear. Finally, there are various statistical estimation methods available that often pose a dilemma for the analyst who must choose the most appropriate method. To address these issues, we provide both a rigorous definition of bias and precision as well as three alternative methods for calculating relative standard deviation (RSD). All methods perform similarly when the RSD ≤ 10%. However, the USP estimates result in larger bias and root-mean-square error (RMSE) compared to the three proposed methods when the actual variation was large. Therefore, the USP method should not be used for routine analysis. For data with moderate skewness and deviation from normality, the estimates based on the original scale perform well. The original scale method is preferred, and the method based on log-transformation may be used for noticeably skewed data. LAY ABSTRACT: Biological assays, or bioassays, are essential in the development and manufacture of biopharmaceutical products for potency testing and quality monitoring. Two important parameters of assay performance are relative accuracy (bias) and precision. The definitions of bias and precision in USP 〈1033〉 are elusive and confusing. Another complicating issue is whether log-transformation should be used for calculating the
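
    To make the scale question concrete, the sketch below contrasts an RSD computed on the original potency scale with a geometric %CV derived from log-transformed values (100·sqrt(exp(s²) − 1)); this is one common formulation, not necessarily the USP 〈1033〉 procedure, and the assay results are synthetic.

```python
import numpy as np

def rsd_original_scale(x: np.ndarray) -> float:
    """Relative standard deviation (%) computed directly on the reported potencies."""
    return 100.0 * np.std(x, ddof=1) / np.mean(x)

def rsd_from_log_scale(x: np.ndarray) -> float:
    """Geometric %CV from the SD of log-transformed data: 100*sqrt(exp(s^2) - 1)."""
    s = np.std(np.log(x), ddof=1)
    return 100.0 * np.sqrt(np.exp(s ** 2) - 1.0)

if __name__ == "__main__":
    rng = np.random.default_rng(1)
    potencies = rng.lognormal(mean=np.log(100.0), sigma=0.08, size=24)  # synthetic assay results
    print(f"RSD (original scale): {rsd_original_scale(potencies):.1f}%")
    print(f"RSD (log scale / geometric CV): {rsd_from_log_scale(potencies):.1f}%")
```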

  15. Gradient HPLC method development and validation for Simultaneous estimation of Rosiglitazone and Gliclazide.

    Directory of Open Access Journals (Sweden)

    Uttam Singh Baghel

    2012-10-01

    Full Text Available Objective: The aim of the present work was to develop a gradient RP-HPLC method for simultaneous analysis of rosiglitazone and gliclazide in a tablet dosage form. Method: The chromatographic system was optimized using a Hypersil C18 (250 mm x 4.6 mm, 5 μm) column with potassium dihydrogen phosphate (pH 7.0) and acetonitrile in the ratio of 60:40 as mobile phase, at a flow rate of 1.0 ml/min. Detection was carried out at 225 nm by a SPD-20A prominence UV/Vis detector. Result: Rosiglitazone and gliclazide were eluted with retention times of 17.36 and 7.06 min, respectively. The Beer-Lambert law was obeyed over the concentration ranges of 5 to 70 μg/ml and 2 to 12 μg/ml for rosiglitazone and gliclazide, respectively. Conclusion: The high recovery and low coefficients of variation confirm the suitability of the method for simultaneous analysis of both drugs in a tablet dosage form. Statistical analysis proves that the method is sensitive and significant for the analysis of rosiglitazone and gliclazide in pure and in pharmaceutical dosage form without any interference from the excipients. The method was validated in accordance with ICH guidelines. Validation revealed the method is specific, rapid, accurate, precise, reliable, and reproducible.

  16. Validation of energy intake estimated from a food frequency questionnaire: a doubly labelled water study.

    Science.gov (United States)

    Andersen, L Frost; Tomten, H; Haggarty, P; Løvø, A; Hustvedt, B-E

    2003-02-01

    The validation of dietary assessment methods is critical in the evaluation of the relation between dietary intake and health. The aim of this study was to assess the validity of a food frequency questionnaire by comparing energy intake with energy expenditure measured with the doubly labelled water method. Total energy expenditure was measured with the doubly labelled water (DLW) method during a 10 day period. Furthermore, the subjects filled in the food frequency questionnaire about 18-35 days after the DLW phase of the study was completed. Twenty-one healthy, non-pregnant females volunteered to participate in the study; only 17 subjects completed the study. The group energy intake was on average 10% lower than the energy expenditure, but the difference was not statistically significant. However, there was a wide range in reporting accuracy: seven subjects were identified as acceptable reporters, eight as under-reporters and two were identified as over-reporters. The 95% limits of agreement in a Bland-Altman plot for energy intake and energy expenditure ranged from -5 to 3 MJ. The data showed that there was substantial variability in the accuracy of the food frequency questionnaire at the individual level. Furthermore, the results showed that the questionnaire was more accurate for groups than individuals.
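
    A minimal sketch of the Bland-Altman agreement calculation referenced above (bias ± 1.96 SD of the individual intake-expenditure differences), using synthetic intake and expenditure values rather than the study data.

```python
import numpy as np

def bland_altman_limits(intake_mj: np.ndarray, expenditure_mj: np.ndarray):
    """Mean bias and 95% limits of agreement (bias ± 1.96 SD of the differences)."""
    diff = intake_mj - expenditure_mj
    bias = np.mean(diff)
    sd = np.std(diff, ddof=1)
    return bias, bias - 1.96 * sd, bias + 1.96 * sd

if __name__ == "__main__":
    rng = np.random.default_rng(7)
    tee = rng.normal(9.5, 1.2, size=17)        # synthetic DLW energy expenditure (MJ/day)
    ei = tee * rng.normal(0.9, 0.15, size=17)  # synthetic FFQ energy intake, ~10% under-reported
    bias, lo, hi = bland_altman_limits(ei, tee)
    print(f"bias = {bias:.2f} MJ/day, 95% limits of agreement = [{lo:.2f}, {hi:.2f}] MJ/day")
```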

  17. CSNI Integral Test Facility Matrices for Validation of Best-Estimate Thermal-Hydraulic Computer Codes

    International Nuclear Information System (INIS)

    Glaeser, H.

    2008-01-01

    Internationally agreed Integral Test Facility (ITF) matrices for validation of realistic thermal hydraulic system computer codes were established. ITF development is mainly for Pressurised Water Reactors (PWRs) and Boiling Water Reactors (BWRs). A separate activity was for Russian Pressurised Water-cooled and Water-moderated Energy Reactors (WWER). Firstly, the main physical phenomena that occur during considered accidents are identified, test types are specified, and test facilities suitable for reproducing these aspects are selected. Secondly, a list of selected experiments carried out in these facilities has been set down. The criteria to achieve the objectives are outlined. In this paper some specific examples from the ITF matrices will also be provided. The matrices will be a guide for code validation, will be a basis for comparisons of code predictions performed with different system codes, and will contribute to the quantification of the uncertainty range of code model predictions. In addition to this objective, the construction of such a matrix is an attempt to record information which has been generated around the world over the last years, so that it is more accessible to present and future workers in that field than would otherwise be the case.

  18. A Validated RP-HPLC Method for Simultaneous Estimation of Atenolol and Indapamide in Pharmaceutical Formulations

    Directory of Open Access Journals (Sweden)

    G. Tulja Rani

    2011-01-01

    Full Text Available A simple, fast, precise, selective and accurate RP-HPLC method was developed and validated for the simultaneous determination of atenolol and indapamide from bulk and formulations. Chromatographic separation was achieved isocratically on a Waters C18 column (250×4.6 mm, 5 µm particle size) using a mobile phase of methanol and water (adjusted to pH 2.7 with 1% orthophosphoric acid) in the ratio of 80:20. The flow rate was 1 mL/min and the effluent was detected at 230 nm. The retention times of atenolol and indapamide were 1.766 min and 3.407 min, respectively. Linearity was observed in the concentration range of 12.5-150 µg/mL for atenolol and 0.625-7.5 µg/mL for indapamide. Percent recoveries obtained for both the drugs were 99.74-100.06% and 98.65-99.98%, respectively. The method was validated according to the ICH guidelines with respect to specificity, linearity, accuracy, precision and robustness. The method developed can be used for the routine analysis of atenolol and indapamide from their combined dosage form.

  19. Validating novel air pollution sensors to improve exposure estimates for epidemiological analyses and citizen science.

    Science.gov (United States)

    Jerrett, Michael; Donaire-Gonzalez, David; Popoola, Olalekan; Jones, Roderic; Cohen, Ronald C; Almanza, Estela; de Nazelle, Audrey; Mead, Iq; Carrasco-Turigas, Glòria; Cole-Hunter, Tom; Triguero-Mas, Margarita; Seto, Edmund; Nieuwenhuijsen, Mark

    2017-10-01

    Low cost, personal air pollution sensors may reduce exposure measurement errors in epidemiological investigations and contribute to citizen science initiatives. Here we assess the validity of a low cost personal air pollution sensor. Study participants were drawn from two ongoing epidemiological projects in Barcelona, Spain. Participants repeatedly wore the pollution sensor - which measured carbon monoxide (CO), nitric oxide (NO), and nitrogen dioxide (NO 2 ). We also compared personal sensor measurements to those from more expensive instruments. Our personal sensors had moderate to high correlations with government monitors with averaging times of 1-h and 30-min epochs (r ~ 0.38-0.8) for NO and CO, but had low to moderate correlations with NO 2 (~0.04-0.67). Correlations between the personal sensors and more expensive research instruments were higher than with the government monitors. The sensors were able to detect high and low air pollution levels in agreement with expectations (e.g., high levels on or near busy roadways and lower levels in background residential areas and parks). Our findings suggest that the low cost, personal sensors have potential to reduce exposure measurement error in epidemiological studies and provide valid data for citizen science studies. Copyright © 2017 Elsevier Inc. All rights reserved.

  20. Parameters estimation for reactive transport: A way to test the validity of a reactive model

    Science.gov (United States)

    Aggarwal, Mohit; Cheikh Anta Ndiaye, Mame; Carrayrou, Jérôme

    The chemical parameters used in reactive transport models are not known accurately due to the complexity and the heterogeneous conditions of a real domain. We will present an efficient algorithm in order to estimate the chemical parameters using a Monte Carlo method. Monte Carlo methods are very robust for the optimisation of the highly non-linear mathematical model describing reactive transport. Reactive transport of tributyltin (TBT) through natural quartz sand at seven different pHs is taken as the test case. Our algorithm will be used to estimate the chemical parameters of the sorption of TBT onto the natural quartz sand. By testing and comparing three models of surface complexation, we show that the proposed adsorption model cannot explain the experimental data.
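
    The abstract does not give the authors' sampling scheme, so the sketch below shows only the generic Monte Carlo idea: draw candidate parameter values at random and keep the one that minimizes the misfit to observations. The toy single-parameter sorption model and the data are stand-ins, not the TBT surface-complexation model.

```python
import numpy as np

def sorbed_fraction(log_k: float, ph: np.ndarray) -> np.ndarray:
    """Toy pH-dependent sorption model standing in for the real surface-complexation model."""
    return 1.0 / (1.0 + 10.0 ** (-(ph - log_k)))

def monte_carlo_fit(ph, observed, n_draws=10000, bounds=(2.0, 10.0), seed=0):
    """Draw candidate parameters at random and keep the one with the smallest RMSE."""
    rng = np.random.default_rng(seed)
    best_param, best_rmse = None, np.inf
    for log_k in rng.uniform(*bounds, size=n_draws):
        rmse = np.sqrt(np.mean((sorbed_fraction(log_k, ph) - observed) ** 2))
        if rmse < best_rmse:
            best_param, best_rmse = log_k, rmse
    return best_param, best_rmse

if __name__ == "__main__":
    ph = np.array([3.0, 4.0, 5.0, 6.0, 7.0, 8.0, 9.0])  # seven experimental pH values
    observed = sorbed_fraction(6.2, ph) + np.random.default_rng(1).normal(0, 0.02, size=ph.size)
    param, rmse = monte_carlo_fit(ph, observed)
    print(f"best log_k = {param:.2f}, RMSE = {rmse:.3f}")
```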

  1. Theoretical estimation and validation of radiation field in alkaline hydrolysis plant

    Energy Technology Data Exchange (ETDEWEB)

    Singh, Sanjay; Krishnamohanan, T.; Gopalakrishnan, R.K., E-mail: singhs@barc.gov.in [Radiation Safety Systems Division, Bhabha Atomic Research Centre, Mumbai (India); Anand, S. [Health Physics Division, Bhabha Atomic Research Centre, Mumbai (India); Pancholi, K. C. [Waste Management Division, Bhabha Atomic Research Centre, Mumbai (India)

    2014-07-01

    Spent organic solvent (30% TBP + 70% n-Dodecane) from the reprocessing facility is treated at ETP in the Alkaline Hydrolysis Plant (AHP) and Organic Waste Incineration (ORWIN) Facility. In AHP-ORWIN, there are three horizontal cylindrical tanks of 2.0 m³ operating capacity used for waste storage and transfer. The three tanks are the Aqueous Waste Tank (AWT), the Waste Receiving Tank (WRT) and the Dodecane Waste Tank (DWT). These tanks are housed in a shielded room in this facility. The Monte Carlo N-Particle (MCNP) radiation transport code was used to estimate ambient radiation field levels when the storage tanks contain hold-up volumes of the desired specific activity levels. In this paper, the theoretically estimated radiation field values are compared with the actual measured dose.

  2. Validity of a Commercial Linear Encoder to Estimate Bench Press 1 RM from the Force-Velocity Relationship

    OpenAIRE

    Bosquet, Laurent; Porta-Benache, Jeremy; Blais, Jérôme

    2010-01-01

    The aim of this study was to assess the validity and accuracy of a commercial linear encoder (Musclelab, Ergotest, Norway) to estimate Bench press 1 repetition maximum (1RM) from the force - velocity relationship. Twenty seven physical education students and teachers (5 women and 22 men) with a heterogeneous history of strength training participated in this study. They performed a 1 RM test and a force - velocity test using a Bench press lifting task in a random order. Mean 1 RM was 61.8 ± 15...

  3. Towards valid 'serious non-fatal injury' indicators for international comparisons based on probability of admission estimates

    DEFF Research Database (Denmark)

    Cryer, Colin; Miller, Ted R; Lyons, Ronan A

    2017-01-01

    in regions of Canada, Denmark, Greece, Spain and the USA. International Classification of Diseases (ICD)-9 or ICD-10 4-digit/character injury diagnosis-specific ED attendance and inpatient admission counts were provided, based on a common protocol. Diagnosis-specific and region-specific PrAs with 95% CIs...... diagnoses with high estimated PrAs. These diagnoses can be used as the basis for more valid international comparisons of life-threatening injury, based on hospital discharge data, for countries with well-developed healthcare and data collection systems....

  4. Lightning stroke distance estimation from single station observation and validation with WWLLN data

    Directory of Open Access Journals (Sweden)

    V. Ramachandran

    2007-07-01

    Full Text Available A simple technique to estimate the distance d of lightning strikes with a single VLF electromagnetic wave receiver at a single station is described. The technique is based on the recording of oscillatory waveforms of the electric fields of sferics. Even though the process of estimating d using the waveform is a rather classical one, a novel and simple procedure for finding d is proposed in this paper. The procedure adopted provides two independent estimates of the distance of the stroke. The accuracy of measurements has been improved by employing high speed (333 ns sampling rate) signal processing techniques. GPS time is used as the reference time, which enables us to compare the calculated distances of the lightning strikes, by both methods, with those calculated from the data obtained by the World-Wide Lightning Location Network (WWLLN), which uses a multi-station technique. The estimated distances of the 77 lightning strikes whose times correlated ranged from ~3000–16 250 km. For the closer strokes, the deviation of d compared with the distances calculated with the multi-station lightning location system is ~4.7%, while for all the strokes it was ~8.8%. One lightning stroke recorded by WWLLN, for which both the field pattern and the spectrogram of the sferic were recorded at the site, is analysed in detail. The deviations in d calculated from the field pattern and from the arrival time of the sferic were 3.2% and 1.5%, respectively, compared to d calculated from the WWLLN location. FFT analysis of the waveform showed that only a narrow band of frequencies is received at the site, which is confirmed by the intensity of the corresponding sferic in the spectrogram.

  5. Validation of real-time zenith tropospheric delay estimation with TOMION software within WAGNSS networks

    OpenAIRE

    Graffigna, Victoria

    2017-01-01

    The TOmographic Model of the IONospheric electron content (TOMION) software implements a simultaneous precise geodetic and ionospheric modeling, which can be used to test new approaches for real-time precise GNSS modeling (positioning, ionospheric and tropospheric delays, clock errors, among others). In this work, the software is used to estimate the Zenith Tropospheric Delay (ZTD) emulating real time and its performance is evaluated through a comparative analysis with a built-in GIPSY estima...

  6. Validity of rapid estimation of erythrocyte volume in the diagnosis of polycytemia vera

    Energy Technology Data Exchange (ETDEWEB)

    Nielsen, S.; Roedbro, P.

    1989-01-01

    In the diagnosis of polycythemia vera, estimation of erythrocyte volume (EV) from plasma volume (PV) and venous hematocrit (Hct_v) is usually thought inadvisable, because the ratio of whole body hematocrit to venous hematocrit (f ratio) is higher in patients with splenomegaly than in normal subjects, and varies considerably between individuals. We determined the mean f ratio in 232 consecutive patients suspected of polycythemia vera (mean f = 0.967; SD 0.048) and used it with each patient's PV and Hct_v to calculate an estimated normalised erythrocyte volume, EV_n. With measured EV as a reference value, EV_n was investigated as a diagnostic test. By means of two cut-off levels the EV_n values could be divided into an EV_n elevated group, an EV_n not elevated group (both with high predictive values), and an EV_n borderline group. The size of the borderline EV_n group ranged from 5% to 46% depending on the position of the cut-off levels, i.e. on the efficiency demanded from the diagnostic test. EV can safely and rapidly be estimated from PV and Hct_v, if the mean f ratio is determined from the relevant population, and if the results in an easily definable borderline range of EV_n values are supplemented by direct EV determination.
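
    The abstract does not spell out the estimation formula. One common formulation, assuming whole-body hematocrit equals f × Hct_v, gives EV_n = PV · (f · Hct_v) / (1 − f · Hct_v); the sketch below uses that assumed relation and hypothetical patient values, not the paper's exact procedure.

```python
def estimated_erythrocyte_volume(pv_ml: float, hct_v: float, f_ratio: float = 0.967) -> float:
    """
    Normalised erythrocyte volume estimate EV_n from plasma volume and venous hematocrit,
    under the assumption that whole-body hematocrit = f_ratio * Hct_v (one common
    formulation; the paper's exact formula is not given in the abstract):
        EV_n = PV * (f * Hct_v) / (1 - f * Hct_v)
    """
    body_hct = f_ratio * hct_v
    return pv_ml * body_hct / (1.0 - body_hct)

if __name__ == "__main__":
    # Hypothetical patient values: PV = 3000 mL, venous hematocrit = 0.52.
    print(f"EV_n ≈ {estimated_erythrocyte_volume(3000.0, 0.52):.0f} mL")
```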

  7. A transdisciplinary approach to the initial validation of a single cell protein as an alternative protein source for use in aquafeeds

    Directory of Open Access Journals (Sweden)

    Michael Tlusty

    2017-04-01

    Full Text Available The human population is growing and, globally, we must meet the challenge of increased protein needs required to feed this population. Single cell proteins (SCP), when coupled to aquaculture production, offer a means to ensure future protein needs can be met without direct competition with food for people. To demonstrate a given type of SCP has potential as a protein source for use in aquaculture feed, a number of steps need to be validated including demonstrating that the SCP is accepted by the species in question, leads to equivalent survival and growth, does not result in illness or other maladies, is palatable to the consumer, is cost effective to produce and can easily be incorporated into diets using existing technology. Here we examine white shrimp (Litopenaeus vannamei) growth and consumer taste preference, smallmouth grunt (Haemulon chrysargyreum) growth, survival, health and gut microbiota, and Atlantic salmon (Salmo salar) digestibility when fed diets that substitute the bacterium Methylobacterium extorquens at a level of 30% (grunts), 100% (shrimp), or 55% (salmon) of the fishmeal in a compound feed. In each of these tests, animals performed equivalently when fed diets containing M. extorquens as when fed a standard aquaculture diet. This transdisciplinary approach is a first validation of this bacterium as a potential SCP protein substitute in aquafeeds. Given the ease of producing this SCP through an aerobic fermentation process, the broad applicability for use in aquaculture indicates the promise of M. extorquens in leading toward greater food security in the future.

  8. Validation of tumor protein marker quantification by two independent automated immunofluorescence image analysis platforms

    Science.gov (United States)

    Peck, Amy R; Girondo, Melanie A; Liu, Chengbao; Kovatich, Albert J; Hooke, Jeffrey A; Shriver, Craig D; Hu, Hai; Mitchell, Edith P; Freydin, Boris; Hyslop, Terry; Chervoneva, Inna; Rui, Hallgeir

    2016-01-01

    Protein marker levels in formalin-fixed, paraffin-embedded tissue sections traditionally have been assayed by chromogenic immunohistochemistry and evaluated visually by pathologists. Pathologist scoring of chromogen staining intensity is subjective and generates low-resolution ordinal or nominal data rather than continuous data. Emerging digital pathology platforms now allow quantification of chromogen or fluorescence signals by computer-assisted image analysis, providing continuous immunohistochemistry values. Fluorescence immunohistochemistry offers greater dynamic signal range than chromogen immunohistochemistry, and combined with image analysis holds the promise of enhanced sensitivity and analytic resolution, and consequently more robust quantification. However, commercial fluorescence scanners and image analysis software differ in features and capabilities, and claims of objective quantitative immunohistochemistry are difficult to validate as pathologist scoring is subjective and there is no accepted gold standard. Here we provide the first side-by-side validation of two technologically distinct commercial fluorescence immunohistochemistry analysis platforms. We document highly consistent results by (1) concordance analysis of fluorescence immunohistochemistry values and (2) agreement in outcome predictions both for objective, data-driven cutpoint dichotomization with Kaplan–Meier analyses or employment of continuous marker values to compute receiver-operating curves. The two platforms examined rely on distinct fluorescence immunohistochemistry imaging hardware, microscopy vs line scanning, and functionally distinct image analysis software. Fluorescence immunohistochemistry values for nuclear-localized and tyrosine-phosphorylated Stat5a/b computed by each platform on a cohort of 323 breast cancer cases revealed high concordance after linear calibration, a finding confirmed on an independent 382 case cohort, with concordance correlation coefficients >0
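
    The concordance analysis mentioned above is typically quantified with Lin's concordance correlation coefficient; a short sketch with synthetic paired platform values (not the breast cancer cohort data) follows.

```python
import numpy as np

def lins_ccc(x: np.ndarray, y: np.ndarray) -> float:
    """Lin's concordance correlation coefficient between two sets of continuous scores."""
    mx, my = np.mean(x), np.mean(y)
    vx, vy = np.var(x), np.var(y)  # population variances, as in Lin (1989)
    cov = np.mean((x - mx) * (y - my))
    return 2.0 * cov / (vx + vy + (mx - my) ** 2)

if __name__ == "__main__":
    rng = np.random.default_rng(3)
    platform_a = rng.normal(50.0, 15.0, size=323)                    # synthetic marker values
    platform_b = 1.05 * platform_a + rng.normal(0.0, 4.0, size=323)  # linearly calibrated counterpart
    print(f"CCC = {lins_ccc(platform_a, platform_b):.3f}")
```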

  9. Cosmic Ray Neutron Sensing: Use, Calibration and Validation for Soil Moisture Estimation

    International Nuclear Information System (INIS)

    2017-03-01

    Nuclear and related techniques can help develop climate-smart agricultural practices by optimizing water use efficiency. The measurement of soil water content is essential to improve the use of this resource in agriculture. However, most sensors monitor small areas (less than 1m in radius), hence a large number of sensors are needed to obtain soil water content across a large area. This can be both costly and labour intensive and so larger scale measuring devices are needed as an alternative to traditional point-based soil moisture sensing techniques. The cosmic ray neutron sensor (CRNS) is such a device that monitors soil water content in a non-invasive and continuous way. This publication provides background information about this novel technique, and explains in detail the calibration and validation process.

  10. Validation of the Visible Occlusal Plaque Index (VOPI) in estimating caries lesion activity

    DEFF Research Database (Denmark)

    Carvalho, J.C.; Mestrinho, H D; Oliveira, L S

    2017-01-01

    ). RESULTS: Construct validity was assumed based on qualitative assessment as no plaque (score 0) and thin plaque (score 1) reflected the theoretical knowledge that a regular disorganization of the dental biofilm either maintains the caries process at sub-clinical levels or inactivate it clinically. The VOPI...... of the VOPI was evidenced with multivariable analysis (GEE), by its ability to discriminate between the groups of adolescents with different oral hygiene status; negative association between adolescents with thick and heavy plaque and those with sound occlusal surfaces was found (OR=0.3, p... of oral hygiene and caries lesion activity. The VOPI is recommended to standardize and categorize information on the occlusal biofilm, thus being suitable for direct application in research and clinical settings....

  11. Development and Validation of UV Spectrophotometric Method For Estimation of Dolutegravir Sodium in Tablet Dosage Form

    International Nuclear Information System (INIS)

    Balasaheb, B.G.

    2015-01-01

    A simple, rapid, precise and accurate spectrophotometric method has been developed for quantitative analysis of Dolutegravir sodium in tablet formulations. The initial stock solution of Dolutegravir sodium was prepared in methanol solvent and subsequent dilution was done in water. The standard solution of Dolutegravir sodium in water showed maximum absorption at wavelength 259.80 nm. The drug obeyed the Beer-Lambert law in the concentration range of 5-40 μg/mL, with a coefficient of correlation (R²) of 0.9992. The method was validated as per the ICH guidelines. The developed method can be adopted in routine analysis of Dolutegravir sodium in bulk or tablet dosage form and it involves relatively low cost solvents and no complex extraction techniques. (author)

  12. Development and Validation of Liquid Chromatographic Method for Estimation of Naringin in Nanoformulation

    Directory of Open Access Journals (Sweden)

    Kranti P. Musmade

    2014-01-01

    Full Text Available A simple, precise, accurate, rapid, and sensitive reverse phase high performance liquid chromatography (RP-HPLC) method with UV detection has been developed and validated for quantification of naringin (NAR) in a novel pharmaceutical formulation. NAR is a polyphenolic flavonoid present in most of the citrus plants having a variety of pharmacological activities. Method optimization was carried out by considering the various parameters such as effect of pH and column. The analyte was separated by employing a C18 (250.0 × 4.6 mm, 5 μm) column at ambient temperature in isocratic conditions using phosphate buffer pH 3.5: acetonitrile (75:25% v/v) as mobile phase pumped at a flow rate of 1.0 mL/min. UV detection was carried out at 282 nm. The developed method was validated according to ICH guidelines Q2(R1). The method was found to be precise and accurate on statistical evaluation with a linearity range of 0.1 to 20.0 μg/mL for NAR. The intra- and interday precision studies showed good reproducibility with coefficients of variation (CV) less than 1.0%. The mean recovery of NAR was found to be 99.33 ± 0.16%. The proposed method was found to be highly accurate, sensitive, and robust. The proposed liquid chromatographic method was successfully employed for the routine analysis of said compound in developed novel nanopharmaceuticals. The presence of excipients did not show any interference on the determination of NAR, indicating method specificity.

  13. Development and Validation of a Prediction Model to Estimate Individual Risk of Pancreatic Cancer.

    Science.gov (United States)

    Yu, Ami; Woo, Sang Myung; Joo, Jungnam; Yang, Hye-Ryung; Lee, Woo Jin; Park, Sang-Jae; Nam, Byung-Ho

    2016-01-01

    There is no reliable screening tool to identify people with high risk of developing pancreatic cancer even though pancreatic cancer represents the fifth-leading cause of cancer-related death in Korea. The goal of this study was to develop an individualized risk prediction model that can be used to screen for asymptomatic pancreatic cancer in Korean men and women. Gender-specific risk prediction models for pancreatic cancer were developed using the Cox proportional hazards model based on an 8-year follow-up of a cohort study of 1,289,933 men and 557,701 women in Korea who had biennial examinations in 1996-1997. The performance of the models was evaluated with respect to their discrimination and calibration ability based on the C-statistic and Hosmer-Lemeshow type χ2 statistic. A total of 1,634 (0.13%) men and 561 (0.10%) women were newly diagnosed with pancreatic cancer. Age, height, BMI, fasting glucose, urine glucose, smoking, and age at smoking initiation were included in the risk prediction model for men. Height, BMI, fasting glucose, urine glucose, smoking, and drinking habit were included in the risk prediction model for women. Smoking was the most significant risk factor for developing pancreatic cancer in both men and women. The risk prediction model exhibited good discrimination and calibration ability, and in external validation it had excellent prediction ability. Gender-specific risk prediction models for pancreatic cancer were developed and validated for the first time. The prediction models will be a useful tool for detecting high-risk individuals who may benefit from increased surveillance for pancreatic cancer.

  14. P185-M Protein Identification and Validation of Results in Workflows that Integrate over Various Instruments, Datasets, Search Engines

    Science.gov (United States)

    Hufnagel, P.; Glandorf, J.; Körting, G.; Jabs, W.; Schweiger-Hufnagel, U.; Hahner, S.; Lubeck, M.; Suckau, D.

    2007-01-01

    Analysis of complex proteomes often results in long protein lists, but falls short in measuring the validity of identification and quantification results on a greater number of proteins. Biological and technical replicates are mandatory, as is the combination of the MS data from various workflows (gels, 1D-LC, 2D-LC), instruments (TOF/TOF, trap, qTOF or FTMS), and search engines. We describe a database-driven study that combines two workflows, two mass spectrometers, and four search engines with protein identification following a decoy database strategy. The sample was a tryptically digested lysate (10,000 cells) of a human colorectal cancer cell line. Data from two LC-MALDI-TOF/TOF runs and a 2D-LC-ESI-trap run using capillary and nano-LC columns were submitted to the proteomics software platform ProteinScape. The combined MALDI data and the ESI data were searched using Mascot (Matrix Science), Phenyx (GeneBio), ProteinSolver (Bruker and Protagen), and Sequest (Thermo) against a decoy database generated from IPI-human in order to obtain one protein list across all workflows and search engines at a defined maximum false-positive rate of 5%. ProteinScape combined the data to one LC-MALDI and one LC-ESI dataset. The initial separate searches from the two combined datasets generated eight independent peptide lists. These were compiled into an integrated protein list using the ProteinExtractor algorithm. An initial evaluation of the generated data led to the identification of approximately 1200 proteins. Result integration on a peptide level allowed discrimination of protein isoforms that would not have been possible with a mere combination of protein lists.

  15. VALIDITY OF A COMMERCIAL LINEAR ENCODER TO ESTIMATE BENCH PRESS 1 RM FROM THE FORCE-VELOCITY RELATIONSHIP

    Directory of Open Access Journals (Sweden)

    Laurent Bosquet

    2010-09-01

    Full Text Available The aim of this study was to assess the validity and accuracy of a commercial linear encoder (Musclelab, Ergotest, Norway) to estimate Bench press 1 repetition maximum (1RM) from the force - velocity relationship. Twenty seven physical education students and teachers (5 women and 22 men) with a heterogeneous history of strength training participated in this study. They performed a 1 RM test and a force - velocity test using a Bench press lifting task in a random order. Mean 1 RM was 61.8 ± 15.3 kg (range: 34 to 100 kg), while 1 RM estimated by the Musclelab's software from the force-velocity relationship was 56.4 ± 14.0 kg (range: 33 to 91 kg). Actual and estimated 1 RM were very highly correlated (r = 0.93, p < 0.001) but largely different (bias: 5.4 ± 5.7 kg, p < 0.001, ES = 1.37). The 95% limits of agreement were ±11.2 kg, which represented ±18% of actual 1 RM. It was concluded that 1 RM estimated from the force-velocity relationship was a good measure for monitoring training induced adaptations, but also that it was not accurate enough to prescribe training intensities. Additional studies are required to determine whether accuracy is affected by age, sex or initial level.
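
    The Musclelab software's exact algorithm is not described in the abstract; a common approach is to fit a linear load-velocity relationship over submaximal sets and extrapolate to an assumed velocity at 1RM. The sketch below uses hypothetical loads, velocities and an assumed 0.17 m/s threshold.

```python
import numpy as np

def estimate_1rm_from_load_velocity(loads_kg, mean_velocities, v_at_1rm=0.17):
    """
    Fit a linear load-velocity relationship across submaximal bench-press sets and
    extrapolate to the velocity assumed at 1RM (v_at_1rm; 0.17 m/s is an assumption).
    """
    slope, intercept = np.polyfit(mean_velocities, loads_kg, deg=1)
    return slope * v_at_1rm + intercept

if __name__ == "__main__":
    loads = np.array([20.0, 30.0, 40.0, 50.0])       # hypothetical submaximal loads (kg)
    velocities = np.array([1.10, 0.88, 0.63, 0.42])  # hypothetical mean concentric velocities (m/s)
    print(f"Estimated 1RM ≈ {estimate_1rm_from_load_velocity(loads, velocities):.1f} kg")
```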

  16. Left ventricular strain and its pattern estimated from cine CMR and validation with DENSE

    International Nuclear Information System (INIS)

    Gao, Hao; Luo, Xiaoyu; Allan, Andrew; McComb, Christie; Berry, Colin

    2014-01-01

    Measurement of local strain provides insight into the biomechanical significance of viable myocardium. We attempted to estimate myocardial strain from cine cardiovascular magnetic resonance (CMR) images by using a b-spline deformable image registration method. Three healthy volunteers and 41 patients with either recent or chronic myocardial infarction (MI) were studied at 1.5 Tesla with both cine and DENSE CMR. Regional circumferential and radial left ventricular strains were estimated from cine and DENSE acquisitions. In all healthy volunteers, there was no difference for peak circumferential strain (− 0.18 ± 0.04 versus − 0.18 ± 0.03, p = 0.76) between cine and DENSE CMR, however peak radial strain was overestimated from cine (0.84 ± 0.37 versus 0.49 ± 0.2, p < 0.01). In the patient study, the peak strain patterns predicted by cine were similar to the patterns from DENSE, including the strain evolution related to recovery time and strain patterns related to MI scar extent. Furthermore, cine-derived strain disclosed different strain patterns in MI and non-MI regions, and regions with transmural and non-transmural MI as DENSE. Although there were large variations with radial strain measurements from cine CMR images, useful circumferential strain information can be obtained from routine clinical CMR imaging. Cine strain analysis has potential to improve the diagnostic yield from routine CMR imaging in clinical practice. (paper)

  17. Left ventricular strain and its pattern estimated from cine CMR and validation with DENSE.

    Science.gov (United States)

    Gao, Hao; Allan, Andrew; McComb, Christie; Luo, Xiaoyu; Berry, Colin

    2014-07-07

    Measurement of local strain provides insight into the biomechanical significance of viable myocardium. We attempted to estimate myocardial strain from cine cardiovascular magnetic resonance (CMR) images by using a b-spline deformable image registration method. Three healthy volunteers and 41 patients with either recent or chronic myocardial infarction (MI) were studied at 1.5 Tesla with both cine and DENSE CMR. Regional circumferential and radial left ventricular strains were estimated from cine and DENSE acquisitions. In all healthy volunteers, there was no difference for peak circumferential strain (- 0.18 ± 0.04 versus - 0.18 ± 0.03, p = 0.76) between cine and DENSE CMR, however peak radial strain was overestimated from cine (0.84 ± 0.37 versus 0.49 ± 0.2, p < 0.01). In the patient study, the peak strain patterns predicted by cine were similar to the patterns from DENSE, including the strain evolution related to recovery time and strain patterns related to MI scar extent. Furthermore, cine-derived strain disclosed different strain patterns in MI and non-MI regions, and regions with transmural and non-transmural MI as DENSE. Although there were large variations with radial strain measurements from cine CMR images, useful circumferential strain information can be obtained from routine clinical CMR imaging. Cine strain analysis has potential to improve the diagnostic yield from routine CMR imaging in clinical practice.

  18. Empirical models validation to estimate global solar irradiance on a horizontal plan in Ouargla, Algeria

    Science.gov (United States)

    Gougui, Abdelmoumen; Djafour, Ahmed; Khelfaoui, Narimane; Boutelli, Halima

    2018-05-01

    In this paper, a comparison between three models for predicting the total solar flux falling on a horizontal surface is presented. The Capderou, Perrin & Brichambaut and Hottel models used to estimate global solar radiation are identified and evaluated in the MATLAB environment. The recorded data have been obtained from a small weather station installed at the LAGE laboratory of Ouargla University, Algeria. Solar radiation data have been recorded on four sample days, the 15th day of each month (March, April, May and October). The Root Mean Square Error (RMSE), Correlation Coefficient (CC) and Mean Absolute Percentage Error (MAPE) have also been calculated to test the reliability of the proposed models. Comparisons between the measured and the calculated values have been made. The results obtained in this study show that the Perrin & Brichambaut and Capderou models are more effective for estimating the total solar intensity on a horizontal surface under clear sky over Ouargla city (latitude 31.95° N, longitude 5.24° E, altitude 0.141 km above mean sea level); these models are derived from meteorological parameters, geographical location and the number of days since the first of January. The Perrin & Brichambaut and Capderou models give the best agreement, with CCs of 0.985-0.999 and 0.932-0.995, respectively, while Hottel gives a CC of 0.617-0.942.
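
    A minimal sketch of the three agreement statistics used above (RMSE, MAPE and CC), applied to hypothetical hourly irradiance values rather than the Ouargla station record.

```python
import numpy as np

def rmse(measured, modelled):
    return float(np.sqrt(np.mean((np.asarray(modelled) - np.asarray(measured)) ** 2)))

def mape(measured, modelled):
    measured, modelled = np.asarray(measured), np.asarray(modelled)
    return float(100.0 * np.mean(np.abs((modelled - measured) / measured)))

def correlation_coefficient(measured, modelled):
    return float(np.corrcoef(measured, modelled)[0, 1])

if __name__ == "__main__":
    # Hypothetical hourly global irradiance (W/m^2): station record vs. a clear-sky model.
    measured = np.array([120.0, 340.0, 560.0, 720.0, 810.0, 760.0, 590.0, 380.0])
    modelled = np.array([135.0, 355.0, 545.0, 735.0, 795.0, 780.0, 600.0, 360.0])
    print(f"RMSE = {rmse(measured, modelled):.1f} W/m^2")
    print(f"MAPE = {mape(measured, modelled):.1f} %")
    print(f"CC   = {correlation_coefficient(measured, modelled):.3f}")
```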

  19. Development and Validation of RP-HPLC Method for Simultaneous Estimation of Aspirin and Esomeprazole Magnesium in Tablet Dosage Form

    Directory of Open Access Journals (Sweden)

    Dipali Patel

    2013-01-01

    Full Text Available A simple, specific, precise, and accurate reversed-phase HPLC method was developed and validated for simultaneous estimation of aspirin and esomeprazole magnesium in tablet dosage forms. The separation was achieved by a HyperChrom ODS-BP C18 column (200 mm × 4.6 mm; 5.0 μm) using acetonitrile: methanol: 0.05 M phosphate buffer at pH 3 adjusted with orthophosphoric acid (25:25:50, v/v) as eluent, at a flow rate of 1 mL/min. Detection was carried out at wavelength 230 nm. The retention times of aspirin and esomeprazole magnesium were 4.29 min and 6.09 min, respectively. The linearity was established over the concentration ranges of 10–70 μg/mL and 10–30 μg/mL with correlation coefficients (r²) of 0.9986 and 0.9973 for aspirin and esomeprazole magnesium, respectively. The mean recoveries were found to be in the ranges of 99.80–100.57% and 99.70–100.83% for aspirin and esomeprazole magnesium, respectively. The proposed method has been validated as per ICH guidelines and successfully applied to the estimation of aspirin and esomeprazole magnesium in their combined tablet dosage form.

  20. Validity and Reliability of the Brazilian Version of the Rapid Estimate of Adult Literacy in Dentistry--BREALD-30.

    Science.gov (United States)

    Junkes, Monica C; Fraiz, Fabian C; Sardenberg, Fernanda; Lee, Jessica Y; Paiva, Saul M; Ferreira, Fernanda M

    2015-01-01

    The aim of the present study was to translate, perform the cross-cultural adaptation of the Rapid Estimate of Adult Literacy in Dentistry to Brazilian-Portuguese language and test the reliability and validity of this version. After translation and cross-cultural adaptation, interviews were conducted with 258 parents/caregivers of children in treatment at the pediatric dentistry clinics and health units in Curitiba, Brazil. To test the instrument's validity, the scores of Brazilian Rapid Estimate of Adult Literacy in Dentistry (BREALD-30) were compared based on occupation, monthly household income, educational attainment, general literacy, use of dental services and three dental outcomes. The BREALD-30 demonstrated good internal reliability. Cronbach's alpha ranged from 0.88 to 0.89 when words were deleted individually. The analysis of test-retest reliability revealed excellent reproducibility (intraclass correlation coefficient = 0.983 and Kappa coefficient ranging from moderate to nearly perfect). In the bivariate analysis, BREALD-30 scores were significantly correlated with the level of general literacy (rs = 0.593) and income (rs = 0.327) and significantly associated with occupation, educational attainment, use of dental services, self-rated oral health and the respondent's perception regarding his/her child's oral health. However, only the association between the BREALD-30 score and the respondent's perception regarding his/her child's oral health remained significant in the multivariate analysis. The BREALD-30 demonstrated satisfactory psychometric properties and is therefore applicable to adults in Brazil.
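
    For reference, a short sketch of the internal-consistency statistic reported above (Cronbach's alpha), computed on a simulated 258 × 30 matrix of binary word-reading items; the data are synthetic, not the BREALD-30 responses.

```python
import numpy as np

def cronbach_alpha(item_scores: np.ndarray) -> float:
    """Cronbach's alpha for an (n_respondents x n_items) matrix of item scores."""
    item_scores = np.asarray(item_scores, dtype=float)
    n_items = item_scores.shape[1]
    item_variances = item_scores.var(axis=0, ddof=1)
    total_variance = item_scores.sum(axis=1).var(ddof=1)
    return (n_items / (n_items - 1.0)) * (1.0 - item_variances.sum() / total_variance)

if __name__ == "__main__":
    rng = np.random.default_rng(5)
    ability = rng.normal(size=258)  # synthetic latent literacy level
    # 30 synthetic binary items (word read correctly = 1), loosely driven by ability.
    items = (rng.normal(size=(258, 30)) + ability[:, None] > 0).astype(float)
    print(f"Cronbach's alpha ≈ {cronbach_alpha(items):.2f}")
```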

  1. Estimating structure quality trends in the Protein Data Bank by equivalent resolution.

    Science.gov (United States)

    Bagaria, Anurag; Jaravine, Victor; Güntert, Peter

    2013-10-01

    The quality of protein structures obtained by different experimental and ab-initio calculation methods varies considerably. The methods have been evolving over time by improving both experimental designs and computational techniques, and since the primary aim of these developments is the procurement of reliable and high-quality data, better techniques resulted on average in an evolution toward higher quality structures in the Protein Data Bank (PDB). Each method leaves a specific quantitative and qualitative "trace" in the PDB entry. Certain information relevant to one method (e.g. dynamics for NMR) may be lacking for another method. Furthermore, some standard measures of quality for one method cannot be calculated for other experimental methods, e.g. crystal resolution or NMR bundle RMSD. Consequently, structures are classified in the PDB by the method used. Here we introduce a method to estimate a measure of equivalent X-ray resolution (e-resolution), expressed in units of Å, to assess the quality of any type of monomeric, single-chain protein structure, irrespective of the experimental structure determination method. We showed and compared the trends in the quality of structures in the Protein Data Bank over the last two decades for five different experimental techniques, excluding theoretical structure predictions. We observed that as new methods are introduced, they undergo a rapid method development evolution: within several years the e-resolution score becomes similar for structures obtained from the five methods and they improve from initially poor performance to acceptable quality, comparable with previously established methods, the performance of which is essentially stable. Copyright © 2013 Elsevier Ltd. All rights reserved.

  2. In vitro estimation of rumen protein degradability using 35S to label the bacterial mass

    International Nuclear Information System (INIS)

    Khristov, A.; Aleksandrov, S.; Aleksiev, I.

    1994-01-01

    An experiment was carried out in order to simplify a previously developed 15N method for in vitro estimation of rumen protein degradability. Casein (Cas), whole soybeans (Sb) heated at 120 °C for 20 min (SbTherm) and sunflower (Sfl) were incubated at 39 °C for 4 hours in a water bath shaker with the following media: McDougall's buffer, strained rumen fluid enriched with particle-associated bacteria (2:1), and rapidly (maltose, sucrose, glucose) and more slowly (pectin, soluble starch) degradable carbohydrates with a final concentration of 815 mg/100 ml and 21.7 μCi/100 ml of 35S (from Na2 35SO4). After the incubation was stopped, a bacterial fraction was isolated through differential centrifugation and the specific activity of bacterial (Bac) and high-speed total solids (TS) nitrogen was measured. The ratio was used to calculate the bacterial mass in TS and, through the Kjeldahl nitrogen concentration in TS, the net bacterial growth (against control vessels without protein). The level of ammonia-N in the supernate after blank correction was used to find the ammonia-N released from protein degradation. The data showed that the rate (and extent) of degradation for the Cas (as a standard protein) was lower compared to those obtained through the 15N method but higher than the rate derived through another in vitro method. The Cas equivalent of the Sb was higher than the figure we found in a previous experiment with solvent-extracted soybean meal, suggesting that the 35S method underestimated the degradability of the Cas. After being tested on a wider range of foodstuffs, the proposed 35S method might be considered as an alternative procedure which is less laborious than the 15N method. (author)

  3. Model validation and error estimation of tsunami runup using high resolution data in Sadeng Port, Gunungkidul, Yogyakarta

    Science.gov (United States)

    Basith, Abdul; Prakoso, Yudhono; Kongko, Widjo

    2017-07-01

    A tsunami model using high resolution geometric data is indispensable in efforts towards tsunami mitigation, especially in tsunami-prone areas. It is one of the factors that affect the accuracy of numerical tsunami modeling. Sadeng Port is new infrastructure on the southern coast of Java which could potentially be hit by a massive tsunami originating from a seismic gap. This paper discusses the validation and error estimation of a tsunami model created using high resolution geometric data for Sadeng Port. The tsunami model is validated against the wave height of the 2006 Pangandaran tsunami recorded by the Sadeng tide gauge. The tsunami model will be used for numerical modeling involving earthquake-tsunami parameters derived from the seismic gap. The validation results using Student's t-test show that the modeled tsunami heights and the Sadeng tide gauge observations are statistically equal at the 95% confidence level; the RMSE and NRMSE values are 0.428 m and 22.12%, respectively, while the difference in tsunami wave travel time is 12 minutes.

  4. Online dietary intake estimation: reproducibility and validity of the Food4Me food frequency questionnaire against a 4-day weighed food record.

    Science.gov (United States)

    Fallaize, Rosalind; Forster, Hannah; Macready, Anna L; Walsh, Marianne C; Mathers, John C; Brennan, Lorraine; Gibney, Eileen R; Gibney, Michael J; Lovegrove, Julie A

    2014-08-11

    Advances in nutritional assessment are continuing to embrace developments in computer technology. The online Food4Me food frequency questionnaire (FFQ) was created as an electronic system for the collection of nutrient intake data. To ensure its accuracy in assessing both nutrient and food group intake, further validation against data obtained using a reliable, but independent, instrument and assessment of its reproducibility are required. The aim was to assess the reproducibility and validity of the Food4Me FFQ against a 4-day weighed food record (WFR). Reproducibility of the Food4Me FFQ was assessed using test-retest methodology by asking participants to complete the FFQ on 2 occasions 4 weeks apart. To assess the validity of the Food4Me FFQ against the 4-day WFR, half the participants were also asked to complete a 4-day WFR 1 week after the first administration of the Food4Me FFQ. Level of agreement between nutrient and food group intakes estimated by the repeated Food4Me FFQ and the Food4Me FFQ and 4-day WFR were evaluated using Bland-Altman methodology and classification into quartiles of daily intake. Crude unadjusted correlation coefficients were also calculated for nutrient and food group intakes. In total, 100 people participated in the assessment of reproducibility (mean age 32, SD 12 years), and 49 of these (mean age 27, SD 8 years) also took part in the assessment of validity. Crude unadjusted correlations for repeated Food4Me FFQ ranged from .65 (vitamin D) to .90 (alcohol). The mean cross-classification into "exact agreement plus adjacent" was 92% for both nutrient and food group intakes, and Bland-Altman plots showed good agreement for energy-adjusted macronutrient intakes. Agreement between the Food4Me FFQ and 4-day WFR varied, with crude unadjusted correlations ranging from .23 (vitamin D) to .65 (protein, % total energy) for nutrient intakes and .11 (soups, sauces and miscellaneous foods) to .73 (yogurts) for food group intake. The mean cross

  5. Development and validation of analytical method for the estimation of nateglinide in rabbit plasma

    Directory of Open Access Journals (Sweden)

    Nihar Ranjan Pani

    2012-12-01

    Full Text Available Nateglinide has been widely used in the treatment of type 2 diabetes as an insulin secretagogue. A reliable, rapid, simple and sensitive reversed-phase high performance liquid chromatography (RP-HPLC) method was developed and validated for determination of nateglinide in rabbit plasma. The method was developed on a Hypersil BDS C18 column (250 mm × 4.6 mm, 5 μm) using a mobile phase of 10 mM phosphate buffer (pH 2.5) and acetonitrile (35:65, v/v). The eluate was monitored with a UV–vis detector at 210 nm with a flow rate of 1 mL/min. The calibration curve was linear over the concentration range of 25–2000 ng/mL. The retention times of nateglinide and the internal standard (gliclazide) were 9.608 min and 11.821 min, respectively. The developed RP-HPLC method can be successfully applied to the determination of quantitative pharmacokinetic parameters of nateglinide in a rabbit model. Keywords: HPLC, Nateglinide, Rabbit plasma, Pharmacokinetics

  6. HPLC method development and validation for the estimation of axitinibe in rabbit plasma

    Directory of Open Access Journals (Sweden)

    Achanta Suneetha

    2017-10-01

    Full Text Available ABSTRACT A rapid, sensitive, and accurate high performance liquid chromatography method for the determination of axitinibe (AN) in rabbit plasma is developed using crizotinibe as an internal standard (IS). Axitinibe is a tyrosine kinase inhibitor used in the treatment of advanced kidney cancer, which works by slowing or stopping the growth of cancer cells. The chromatographic separation was performed on a Waters 2695 system with a Kromasil (150 mm × 4.6 mm, 5 µm) column using a mobile phase containing buffer (pH 4.6) and acetonitrile in the ratio of 65:35 v/v at a flow rate of 1 mL/min. The analyte and internal standard were extracted using liquid-liquid extraction with acetonitrile. The elution was detected by a photodiode array detector at 320 nm. The total chromatographic runtime is 10.0 min, with retention times for axitinibe and the IS of 5.685 and 3.606 min, respectively. The method was validated over a dynamic linear range of 0.002–0.2 µg/mL for axitinibe with a correlation coefficient (r²) of 0.999.

  7. Development and validation of satellite-based estimates of surface visibility

    Science.gov (United States)

    Brunner, J.; Pierce, R. B.; Lenzen, A.

    2016-02-01

    A satellite-based surface visibility retrieval has been developed using Moderate Resolution Imaging Spectroradiometer (MODIS) measurements as a proxy for Advanced Baseline Imager (ABI) data from the next generation of Geostationary Operational Environmental Satellites (GOES-R). The retrieval uses a multiple linear regression approach to relate satellite aerosol optical depth, fog/low cloud probability and thickness retrievals, and meteorological variables from numerical weather prediction forecasts to National Weather Service Automated Surface Observing System (ASOS) surface visibility measurements. Validation using independent ASOS measurements shows that the GOES-R ABI surface visibility retrieval (V) has an overall success rate of 64.5 % for classifying clear (V ≥ 30 km), moderate (10 km ≤ V United States Environmental Protection Agency (EPA) and National Park Service (NPS) Interagency Monitoring of Protected Visual Environments (IMPROVE) network and provide useful information to the regional planning offices responsible for developing mitigation strategies required under the EPA's Regional Haze Rule, particularly during regional haze events associated with smoke from wildfires.

  8. Validation of Walk Score® for Estimating Neighborhood Walkability: An Analysis of Four US Metropolitan Areas

    Science.gov (United States)

    Duncan, Dustin T.; Aldstadt, Jared; Whalen, John; Melly, Steven J.; Gortmaker, Steven L.

    2011-01-01

    Neighborhood walkability can influence physical activity. We evaluated the validity of Walk Score® for assessing neighborhood walkability based on GIS (objective) indicators of neighborhood walkability with addresses from four US metropolitan areas with several street network buffer distances (i.e., 400-, 800-, and 1,600-meters). Address data come from the YMCA-Harvard After School Food and Fitness Project, an obesity prevention intervention involving children aged 5–11 years and their families participating in YMCA-administered, after-school programs located in four geographically diverse metropolitan areas in the US (n = 733). GIS data were used to measure multiple objective indicators of neighborhood walkability. Walk Scores were also obtained for the participant’s residential addresses. Spearman correlations between Walk Scores and the GIS neighborhood walkability indicators were calculated as well as Spearman correlations accounting for spatial autocorrelation. There were many significant moderate correlations between Walk Scores and the GIS neighborhood walkability indicators such as density of retail destinations and intersection density (p walkability. Correlations generally became stronger with a larger spatial scale, and there were some geographic differences. Walk Score® is free and publicly available for public health researchers and practitioners. Results from our study suggest that Walk Score® is a valid measure of estimating certain aspects of neighborhood walkability, particularly at the 1600-meter buffer. As such, our study confirms and extends the generalizability of previous findings demonstrating that Walk Score is a valid measure of estimating neighborhood walkability in multiple geographic locations and at multiple spatial scales. PMID:22163200

  9. Validity of segmental bioelectrical impedance analysis for estimating fat-free mass in children including overweight individuals.

    Science.gov (United States)

    Ohta, Megumi; Midorikawa, Taishi; Hikihara, Yuki; Masuo, Yoshihisa; Sakamoto, Shizuo; Torii, Suguru; Kawakami, Yasuo; Fukunaga, Tetsuo; Kanehisa, Hiroaki

    2017-02-01

    This study examined the validity of segmental bioelectrical impedance (BI) analysis for predicting the fat-free masses (FFMs) of the whole body and body segments in children, including overweight individuals. The FFM and impedance (Z) values of the arms, trunk, legs, and whole body were determined using dual-energy X-ray absorptiometry and segmental BI analyses, respectively, in 149 boys and girls aged 6 to 12 years, who were divided into model-development (n = 74), cross-validation (n = 35), and overweight (n = 40) groups. Simple regression analysis was applied to (length)²/Z (the BI index) for each of the whole body and the 3 segments to develop prediction equations for the measured FFM of the related body part. In the model-development group, the BI index of each of the 3 segments and the whole body was significantly correlated with the measured FFM (R² = 0.867-0.932, standard error of estimation = 0.18-1.44 kg (5.9%-8.7%)). There was no significant difference between the measured and predicted FFM values, without systematic error. The application of each equation derived in the model-development group to the cross-validation and overweight groups did not produce significant differences between the measured and predicted FFM values or systematic errors, with the exception that the arm FFM in the overweight group was overestimated. Segmental bioelectrical impedance analysis is useful for predicting the FFM of the whole body and of each body segment in children, including overweight individuals, although its application for estimating arm FFM in overweight individuals requires a certain modification.
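
    The prediction step described here (a simple regression of FFM on the BI index, length²/Z) can be sketched as below; all lengths, impedances and DXA reference values are hypothetical placeholders.

```python
# Sketch of the segmental-BIA prediction step: fit FFM against the BI index
# (segment length squared / impedance). Numbers are hypothetical placeholders.
import numpy as np

length_cm = np.array([60., 62., 70., 75., 68., 72.])        # segment lengths, hypothetical
impedance = np.array([250., 240., 200., 190., 210., 205.])  # ohms, hypothetical
ffm_dxa_kg = np.array([10.2, 10.8, 13.5, 14.9, 12.8, 13.9]) # DXA reference, hypothetical

bi_index = length_cm ** 2 / impedance

# simple (one-predictor) least-squares regression, as in the model-development group
slope, intercept = np.polyfit(bi_index, ffm_dxa_kg, 1)
predicted = slope * bi_index + intercept
see = np.sqrt(np.sum((ffm_dxa_kg - predicted) ** 2) / (len(ffm_dxa_kg) - 2))
print(f"FFM ≈ {slope:.3f} * L²/Z + {intercept:.2f}, SEE = {see:.2f} kg")
```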

  10. Estimation and Validation of Land Surface Temperatures from Chinese Second-Generation Polar-Orbit FY-3A VIRR Data

    Directory of Open Access Journals (Sweden)

    Bo-Hui Tang

    2015-03-01

    Full Text Available This work estimated and validated the land surface temperature (LST from thermal-infrared Channels 4 (10.8 µm and 5 (12.0 µm of the Visible and Infrared Radiometer (VIRR onboard the second-generation Chinese polar-orbiting FengYun-3A (FY-3A meteorological satellite. The LST, mean emissivity and atmospheric water vapor content (WVC were divided into several tractable sub-ranges with little overlap to improve the fitting accuracy. The experimental results showed that the root mean square errors (RMSEs were proportional to the viewing zenith angles (VZAs and WVC. The RMSEs were below 1.0 K for VZA sub-ranges less than 30° or for VZA sub-ranges less than 60° and WVC less than 3.5 g/cm2, provided that the land surface emissivities were known. A preliminary validation using independently simulated data showed that the estimated LSTs were quite consistent with the actual inputs, with a maximum RMSE below 1 K for all VZAs. An inter-comparison using the Moderate Resolution Imaging Spectroradiometer (MODIS-derived LST product MOD11_L2 showed that the minimum RMSE was 1.68 K for grass, and the maximum RMSE was 3.59 K for barren or sparsely vegetated surfaces. In situ measurements at the Hailar field site in northeastern China from October, 2013, to September, 2014, were used to validate the proposed method. The result showed that the RMSE between the LSTs calculated from the ground measurements and derived from the VIRR data was 1.82 K.

  11. Development and validation of a Kalman filter-based model for vehicle slip angle estimation

    Science.gov (United States)

    Gadola, M.; Chindamo, D.; Romano, M.; Padula, F.

    2014-01-01

    It is well known that vehicle slip angle is one of the most difficult parameters to measure on a vehicle during testing or racing activities. Moreover, the appropriate sensor is very expensive and it is often difficult to fit to a car, especially on race cars. We propose here a strategy to eliminate the need for this sensor by using a mathematical tool which gives a good estimation of the vehicle slip angle. A single-track car model, coupled with an extended Kalman filter, was used in order to achieve the result. Moreover, a tuning procedure is proposed that takes into consideration both nonlinear and saturation characteristics typical of vehicle lateral dynamics. The effectiveness of the proposed algorithm has been proven by both simulation results and real-world data.
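
    The record gives only a high-level description of the estimator; the sketch below shows what a single-track (bicycle) model coupled with an extended Kalman filter for slip-angle estimation can look like. All vehicle parameters, tire stiffnesses, noise covariances and the numerical-Jacobian shortcut are illustrative assumptions, not the authors' implementation or tuning procedure.

```python
# Minimal sketch of a single-track (bicycle) model + extended Kalman filter for
# slip-angle estimation. Vehicle parameters, tire stiffnesses and noise levels
# are hypothetical placeholders, not the values used in the cited study.
import numpy as np

m, Iz, a, b = 1500.0, 2500.0, 1.2, 1.6      # mass, yaw inertia, axle distances (hypothetical)
Cf, Cr = 80000.0, 90000.0                   # cornering stiffnesses (hypothetical)
dt = 0.01                                   # time step (s)

def f(x, delta, vx):
    """Discrete single-track dynamics; state x = [lateral velocity vy, yaw rate r]."""
    vy, r = x
    Fyf = Cf * (delta - (vy + a * r) / vx)  # linear tire model, enough for the sketch
    Fyr = Cr * (-(vy - b * r) / vx)
    vy_dot = (Fyf + Fyr) / m - vx * r
    r_dot = (a * Fyf - b * Fyr) / Iz
    return np.array([vy + dt * vy_dot, r + dt * r_dot])

def h(x, delta, vx):
    """Measurement model: lateral acceleration and yaw rate."""
    vy, r = x
    Fyf = Cf * (delta - (vy + a * r) / vx)
    Fyr = Cr * (-(vy - b * r) / vx)
    return np.array([(Fyf + Fyr) / m, r])

def jacobian(func, x, *args, eps=1e-6):
    """Numerical Jacobian of func at x (keeps the sketch short)."""
    y0 = func(x, *args)
    J = np.zeros((len(y0), len(x)))
    for i in range(len(x)):
        dx = np.zeros_like(x)
        dx[i] = eps
        J[:, i] = (func(x + dx, *args) - y0) / eps
    return J

x = np.zeros(2)                      # initial state estimate [vy, r]
P = np.eye(2) * 0.1                  # state covariance
Q = np.diag([1e-3, 1e-4])            # process noise (hypothetical tuning)
R = np.diag([0.05, 1e-4])            # measurement noise (hypothetical tuning)

def ekf_step(x, P, z, delta, vx):
    # predict
    F = jacobian(f, x, delta, vx)
    x_pred = f(x, delta, vx)
    P_pred = F @ P @ F.T + Q
    # update with measurement z = [lateral acceleration, yaw rate]
    H = jacobian(h, x_pred, delta, vx)
    y = z - h(x_pred, delta, vx)
    S = H @ P_pred @ H.T + R
    K = P_pred @ H.T @ np.linalg.inv(S)
    x_new = x_pred + K @ y
    P_new = (np.eye(2) - K @ H) @ P_pred
    beta = np.arctan2(x_new[0], vx)   # estimated slip angle
    return x_new, P_new, beta

# one filter step with a hypothetical measurement at 20 m/s
x, P, beta = ekf_step(x, P, z=np.array([2.0, 0.12]), delta=0.03, vx=20.0)
print(f"estimated slip angle: {np.degrees(beta):.2f} deg")
```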

  12. Validation of estimating food intake in gray wolves by 22Na turnover

    Science.gov (United States)

    DelGiudice, G.D.; Duquette, L.S.; Seal, U.S.; Mech, L.D.

    1991-01-01

    We studied sodium-22 (22Na) turnover as a means of estimating food intake in 6 captive, adult gray wolves (Canis lupus) (2 F, 4 M) over a 31-day feeding period. Wolves were fed white-tailed deer (Odocoileus virginianus) meat only. The mean mass-specific exchangeable Na pool was 44.8 ± 0.7 mEq/kg; there was no difference between males and females. Total exchangeable Na was related (r² = 0.85, P food consumption (g/kg/day) in wolves over a 32-day period. Sampling blood and weighing wolves every 1-4 days permitted identification of several potential sources of error, including changes in the size of exchangeable Na pools, exchange of 22Na with gastrointestinal and bone Na, and rapid loss of the isotope by urinary excretion.

  13. A novel method for coil efficiency estimation: Validation with a 13C birdcage

    DEFF Research Database (Denmark)

    Giovannetti, Giulio; Frijia, Francesca; Hartwig, Valentina

    2012-01-01

    Coil efficiency, defined as the B1 magnetic field induced at a given point divided by the square root of the supplied power P, is an important parameter that characterizes both the transmit and receive performance of a radiofrequency (RF) coil. Maximizing coil efficiency will also maximize the signal-to-noise ratio. In this work, we propose a novel method for RF coil efficiency estimation based on the use of a perturbing loop. The proposed method consists of loading the coil with a known resistor by inductive coupling and measuring the quality factor with and without the load. We tested the method by measuring the efficiency of a 13C birdcage coil tuned at 32.13 MHz and verified its accuracy by comparing the results with the nuclear magnetic resonance nutation experiment. The method allows coil performance characterization in a short time and with great accuracy, and it can be used both on the bench...

  14. Validity of energy intake estimated by digital photography plus recall in overweight and obese young adults.

    Science.gov (United States)

    Ptomey, Lauren T; Willis, Erik A; Honas, Jeffery J; Mayo, Matthew S; Washburn, Richard A; Herrmann, Stephen D; Sullivan, Debra K; Donnelly, Joseph E

    2015-09-01

    Recent reports have questioned the adequacy of self-report measures of dietary intake as the basis for scientific conclusions regarding the associations of dietary intake and health, and reports have recommended the development and evaluation of better methods for the assessment of dietary intake in free-living individuals. We developed a procedure that used pre- and post-meal digital photographs in combination with dietary recalls (DP+R) to assess energy intake during ad libitum eating in a cafeteria setting. To compare mean daily energy intake of overweight and obese young adults assessed by a DP+R method with mean total daily energy expenditure assessed by doubly labeled water (TDEE(DLW)). Energy intake was assessed using the DP+R method in 91 overweight and obese young adults (age = 22.9±3.2 years, body mass index [BMI; calculated as kg/m²]=31.2±5.6, female=49%) over 7 days of ad libitum eating in a university cafeteria. Foods consumed outside the cafeteria (ie, snacks, non-cafeteria meals) were assessed using multiple-pass recall procedures, using food models and standardized, neutral probing questions. TDEE(DLW) was assessed in all participants over the 14-day period. The mean energy intakes estimated by DP+R and TDEE(DLW) were not significantly different (DP+R=2912±661 kcal/d; TDEE(DLW)=2849±748 kcal/d, P=0.42). The DP+R method overestimated TDEE(DLW) by 63±750 kcal/d (6.8±28%). Results suggest that the DP+R method provides estimates of energy intake comparable to those obtained by TDEE(DLW). Copyright © 2015 Academy of Nutrition and Dietetics. Published by Elsevier Inc. All rights reserved.

  15. Validity of a self-administered food frequency questionnaire (FFQ) and its generalizability to the estimation of dietary folate intake in Japan

    Directory of Open Access Journals (Sweden)

    Iso Hiroyasu

    2005-10-01

    Full Text Available Abstract Background In an epidemiological study, it is essential to test the validity of the food frequency questionnaire (FFQ) for its ability to estimate dietary intake. The objectives of our study were to 1) validate a FFQ for estimating folate intake, and to identify the foods that contribute to inter-individual variation of folate intake in the Japanese population. Methods Validity of the FFQ was evaluated using 28-day weighed dietary records (DRs) as the gold standard in the two groups independently. In the group for which the FFQ was developed, validity was evaluated by Spearman's correlation coefficients (CCs), and linear regression analysis was used to identify foods with large inter-individual variation. The cumulative mean intake of these foods was compared with total intake estimated by the DR. The external validity of the FFQ and intake from foods on the same list were evaluated in the other group to verify generalizability. Subjects were a subsample from the Japan Public Health Center-based prospective Study who volunteered to participate in the FFQ validation study. Results CCs for the internal validity of the FFQ were 0.49 for men and 0.29 for women, while CCs for external validity were 0.33 for men and 0.42 for women. CCs for cumulative folate intake from 33 foods selected by regression analysis were also applicable to an external population. Conclusion Our FFQ was valid for and generalizable to the estimation of folate intake. Foods identified as predictors of inter-individual variation in folate intake were also generalizable in Japanese populations. The FFQ with 138 foods was valid for the estimation of folate intake, while that with 33 foods might be useful for estimating inter-individual variation and ranking of individual folate intake.

  16. Validity of a self-administered food frequency questionnaire (FFQ) and its generalizability to the estimation of dietary folate intake in Japan

    Science.gov (United States)

    Ishihara, Junko; Yamamoto, Seiichiro; Iso, Hiroyasu; Inoue, Manami; Tsugane, Shoichiro

    2005-01-01

    Background In an epidemiological study, it is essential to test the validity of the food frequency questionnaire (FFQ) for its ability to estimate dietary intake. The objectives of our study were to 1) validate a FFQ for estimating folate intake, and to identify the foods that contribute to inter-individual variation of folate intake in the Japanese population. Methods Validity of the FFQ was evaluated using 28-day weighed dietary records (DRs) as gold standard in the two groups independently. In the group for which the FFQ was developed, validity was evaluated by Spearman's correlation coefficients (CCs), and linear regression analysis was used to identify foods with large inter-individual variation. The cumulative mean intake of these foods was compared with total intake estimated by the DR. The external validity of the FFQ and intake from foods on the same list were evaluated in the other group to verify generalizability. Subjects were a subsample from the Japan Public Health Center-based prospective Study who volunteered to participate in the FFQ validation study. Results CCs for the internal validity of the FFQ were 0.49 for men and 0.29 for women, while CCs for external validity were 0.33 for men and 0.42 for women. CCs for cumulative folate intake from 33 foods selected by regression analysis were also applicable to an external population. Conclusion Our FFQ was valid for and generalizable to the estimation of folate intake. Foods identified as predictors of inter-individual variation in folate intake were also generalizable in Japanese populations. The FFQ with 138 foods was valid for the estimation of folate intake, while that with 33 foods might be useful for estimating inter-individual variation and ranking of individual folate intake. PMID:16202175

  17. Methods for validating the presence of and characterizing proteins deposited onto an array

    Science.gov (United States)

    Schabacker, Daniel S.

    2010-09-21

    A method of determining if proteins have been transferred from liquid-phase protein fractions to an array comprising staining the array with a total protein stain and imaging the array, optionally comparing the staining with a standard curve generated by staining known amounts of a known protein on the same or a similar array; a method of characterizing proteins transferred from liquid-phase protein fractions to an array including staining the array with a post-translational modification-specific (PTM-specific) stain and imaging the array and, optionally, after staining the array with a PTM-specific stain and imaging the array, washing the array, re-staining the array with a total protein stain, imaging the array, and comparing the imaging with the PTM-specific stain with the imaging with the total protein stain; stained arrays; and images of stained arrays.

  18. Development and validation of a method to estimate the potential wind erosion risk in Germany

    Science.gov (United States)

    Funk, Roger; Deumlich, Detlef; Völker, Lidia

    2017-04-01

    The introduction of the Cross Compliance (CC) regulations for soil protection resulted in the demand for a nationwide classification of the wind erosion risk on agricultural areas in Germany. A spatially highly resolved method was needed, based on uniform data sets and validation principles, which provides a fair and equivalent procedure for all affected farmers. A GIS procedure was developed that derives the site-specific wind erosion risk from the main influencing factors, soil texture, wind velocity, wind direction and landscape structure, following the German standard DIN 19706. The procedure enables different approaches in the Federal States and comparable classification results. Here, we present the approach of the Federal State of Brandenburg. In a first step, a complete soil data map was composed at a grid size of 10 x 10 m. Data were taken from 1.) the Soil quality Appraisal (scale 1:10.000), 2.) the Medium-scale Soil Mapping (MMK, 1:25.000), 3.) extrapolation of the MMK, and 4.) the new Soil quality Appraisal (new areas after coal mining). Based on texture and carbon content, the wind erosion susceptibility was divided into 6 classes. This map was combined with data on the annual average wind velocity, resulting in an increase of the risk classes for wind velocities > 5 m s-1 and a decrease for […] structure is regarded by allocating a height to each landscape element, corresponding to the described features in the digital "Biotope and Land Use Map". The "hill shade" procedure of ArcGIS was used to set virtual shadows behind the landscape elements for eight directions. The relative frequency of wind from each direction was used as a weighting factor and multiplied with the numerical values of the shadowed cells. Depending on the distance to the landscape element, the shadowing effect was combined with the risk classes. The results show that the wind erosion risk is clearly reduced by integrating landscape structures into the risk assessment. After the renewed

  19. Effect of thermal processing on estimated metabolizable protein supply to dairy cattle from camelina seeds: relationship with protein molecular structural changes.

    Science.gov (United States)

    Peng, Quanhui; Khan, Nazir A; Wang, Zhisheng; Zhang, Xuewei; Yu, Peiqiang

    2014-08-20

    This study evaluated the effect of thermal processing on the estimated metabolizable protein (MP) supply to dairy cattle from camelina seeds (Camelina sativa L. Crantz) and determined the relationship between heat-induced changes in protein molecular structural characteristics and the MP supply. Seeds from two camelina varieties were sampled in two consecutive years and were either kept raw or were heated in an autoclave (moist heating) or in an air-draft oven (dry heating) at 120 °C for 1 h. The MP supply to dairy cattle was modeled by three commonly used protein evaluation systems. The protein molecular structures were analyzed by Fourier transform/infrared-attenuated total reflectance molecular spectroscopy. The results showed that both the dry and moist heating increased the contents of truly absorbable rumen-undegraded protein (ARUP) and total MP and decreased the degraded protein balance (DPB). However, the moist-heated camelina seeds had a significantly higher (P seeds. The regression equations showed that intensities of the protein molecular structural bands can be used to estimate the contents of ARUP, MP, and DPB with high accuracy (R² > 0.70). These results show that protein molecular structural characteristics can be used to rapidly assess the MP supply to dairy cattle from raw and heat-treated camelina seeds.

  20. Development and validation of a noncontact spectroscopic device for hemoglobin estimation at point-of-care

    Science.gov (United States)

    Sarkar, Probir Kumar; Pal, Sanchari; Polley, Nabarun; Aich, Rajarshi; Adhikari, Aniruddha; Halder, Animesh; Chakrabarti, Subhananda; Chakrabarti, Prantar; Pal, Samir Kumar

    2017-05-01

    Anemia severely and adversely affects human health and socioeconomic development. Measuring hemoglobin with the minimal involvement of human and financial resources has always been challenging. We describe a translational spectroscopic technique for noncontact hemoglobin measurement at low-resource point-of-care settings in human subjects, independent of their skin color, age, and sex, by measuring the optical spectrum of the blood flowing in the vascular bed of the bulbar conjunctiva. We developed software on the LabVIEW platform for automatic data acquisition and interpretation by nonexperts. The device is calibrated by comparing the differential absorbance of light of wavelength 576 and 600 nm with the clinical hemoglobin level of the subject. Our proposed method is consistent with the results obtained using the current gold standard, the automated hematology analyzer. The proposed noncontact optical device for hemoglobin estimation is highly efficient, inexpensive, feasible, and extremely useful in low-resource point-of-care settings. The device output correlates with the different degrees of anemia with absolute and trending accuracy similar to those of widely used invasive methods. Moreover, the device can instantaneously transmit the generated report to a medical expert through e-mail, text messaging, or mobile apps.
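
    The calibration idea described here (relating the differential absorbance at 576 nm and 600 nm to clinically measured hemoglobin) can be sketched with a simple linear fit; all absorbance and hemoglobin values below are hypothetical, not the device's calibration data.

```python
# Sketch of the calibration idea described above: relate the differential
# absorbance at 576 nm and 600 nm to clinically measured hemoglobin, then use
# the fitted line to predict new values. All numbers are hypothetical.
import numpy as np

diff_abs = np.array([0.21, 0.28, 0.35, 0.40, 0.47, 0.55])   # A(576) - A(600), hypothetical
hb_clinical = np.array([8.1, 9.5, 11.0, 12.2, 13.8, 15.6])  # g/dL from hematology analyzer

slope, intercept = np.polyfit(diff_abs, hb_clinical, 1)      # linear calibration

def estimate_hb(differential_absorbance: float) -> float:
    """Predict hemoglobin (g/dL) from a new conjunctival spectrum."""
    return slope * differential_absorbance + intercept

print(f"Hb ≈ {slope:.1f} * dA + {intercept:.1f};  dA=0.33 -> {estimate_hb(0.33):.1f} g/dL")
```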

  1. Parametric validations of analytical lifetime estimates for radiation belt electron diffusion by whistler waves

    Directory of Open Access Journals (Sweden)

    A. V. Artemyev

    2013-04-01

    Full Text Available The lifetimes of electrons trapped in Earth's radiation belts can be calculated from quasi-linear pitch-angle diffusion by whistler-mode waves, provided that their frequency spectrum is broad enough and/or their average amplitude is not too large. Extensive comparisons between improved analytical lifetime estimates and full numerical calculations have been performed in a broad parameter range representative of a large part of the magnetosphere, from L ~ 2 to 6. The effects of observed very oblique whistler waves are taken into account in both numerical and analytical calculations. Analytical lifetimes (and pitch-angle diffusion coefficients) are found to be in good agreement with full numerical calculations based on CRRES and Cluster hiss and lightning-generated wave measurements inside the plasmasphere, and on Cluster lower-band chorus wave measurements in the outer belt, for electron energies ranging from 100 keV to 5 MeV. Comparisons with lifetimes recently obtained from electron flux measurements on SAMPEX, SCATHA, SAC-C and DEMETER also show reasonable agreement.

  2. A statistical approach to the estimation of mechanical unfolding parameters from the unfolding patterns of protein heteropolymers

    International Nuclear Information System (INIS)

    Beddard, G S; Brockwell, D J

    2010-01-01

    A statistical calculation is described with which the saw-tooth-like unfolding patterns of concatenated heteropolymeric proteins can be used to estimate the forced unfolding parameters of a previously uncharacterized protein. The chance of observing the various sequences of unfolding events, such as ABAABBB or BBAAABB etc., for two proteins of types A and B is calculated using proteins with various ratios of A and B and at different values of effective unfolding rate constants. If the experimental rate constant for forced unfolding, k0, and distance to the transition state xu are known for one protein, then the calculation allows an estimation of values for the other. The predictions are compared with Monte Carlo simulations and experimental data. (communication)
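
    A toy Monte Carlo makes the idea of counting unfolding-event sequences concrete. The sketch below assumes a simple kinetic competition (the next domain to unfold is of type A with probability proportional to the number of folded A domains times its effective rate constant); it is an illustration of the concept, not the statistical calculation of the cited paper, and the rate constants are hypothetical.

```python
# Toy Monte Carlo for the order of unfolding events in an A/B concatamer,
# assuming a simple kinetic competition: the next domain to unfold is type A
# with probability nA*kA / (nA*kA + nB*kB). Rate constants are hypothetical and
# this is not the statistical calculation of the cited paper.
import random
from collections import Counter

def unfolding_sequence(nA, nB, kA, kB):
    seq = []
    while nA + nB > 0:
        pA = nA * kA / (nA * kA + nB * kB)
        if random.random() < pA:
            seq.append("A"); nA -= 1
        else:
            seq.append("B"); nB -= 1
    return "".join(seq)

counts = Counter(unfolding_sequence(nA=3, nB=4, kA=1.0, kB=0.3) for _ in range(100_000))
for pattern, n in counts.most_common(5):
    print(pattern, n / 100_000)   # estimated chance of each unfolding pattern
```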

  3. Validation of a Dietary History Questionnaire against a 7-D Weighed Record for Estimating Nutrient Intake among Rural Elderly Malays.

    Science.gov (United States)

    Shahar, S; Earland, J; Abdulrahman, S

    2000-03-01

    Energy and nutrient intake estimated using a pre-coded dietary history questionnaire (DHQ) was compared with results obtained from a 7-d weighed intake record (WI) in a group of 37 elderly Malays residing in rural areas of Mersing District, Johor, Malaysia, to determine the validity of the DHQ. The DHQ consists of a pre-coded dietary history with a qualitative food frequency questionnaire, which was developed to obtain information on food intake and usual dietary habits. The 7-d WI requires subjects to weigh each food immediately before eating and to weigh any leftovers. The medians of intake from the two methods were rather similar and varied by less than 30% for every nutrient, except for vitamin C (114%). For most of the nutrients, analysis of group means using the Wilcoxon matched pairs signed rank sum test showed no significant difference between the estimation of intake from the DHQ and from the WI, with the exceptions of vitamin C and niacin. The DHQ significantly overestimated the intake of vitamin C compared to the WI. In a population with a high prevalence of illiteracy, a specially designed DHQ can provide estimations very similar to those obtained from a 7-d WI.

  4. Collective estimation of multiple bivariate density functions with application to angular-sampling-based protein loop modeling

    KAUST Repository

    Maadooliat, Mehdi

    2015-10-21

    This paper develops a method for simultaneous estimation of density functions for a collection of populations of protein backbone angle pairs using a data-driven, shared basis that is constructed by bivariate spline functions defined on a triangulation of the bivariate domain. The circular nature of angular data is taken into account by imposing appropriate smoothness constraints across boundaries of the triangles. Maximum penalized likelihood is used to fit the model and an alternating blockwise Newton-type algorithm is developed for computation. A simulation study shows that the collective estimation approach is statistically more efficient than estimating the densities individually. The proposed method was used to estimate neighbor-dependent distributions of protein backbone dihedral angles (i.e., Ramachandran distributions). The estimated distributions were applied to protein loop modeling, one of the most challenging open problems in protein structure prediction, by feeding them into an angular-sampling-based loop structure prediction framework. Our estimated distributions compared favorably to the Ramachandran distributions estimated by fitting a hierarchical Dirichlet process model; and in particular, our distributions showed significant improvements on the hard cases where existing methods do not work well.

  5. Collective estimation of multiple bivariate density functions with application to angular-sampling-based protein loop modeling

    KAUST Repository

    Maadooliat, Mehdi; Zhou, Lan; Najibi, Seyed Morteza; Gao, Xin; Huang, Jianhua Z.

    2015-01-01

    This paper develops a method for simultaneous estimation of density functions for a collection of populations of protein backbone angle pairs using a data-driven, shared basis that is constructed by bivariate spline functions defined on a triangulation of the bivariate domain. The circular nature of angular data is taken into account by imposing appropriate smoothness constraints across boundaries of the triangles. Maximum penalized likelihood is used to fit the model and an alternating blockwise Newton-type algorithm is developed for computation. A simulation study shows that the collective estimation approach is statistically more efficient than estimating the densities individually. The proposed method was used to estimate neighbor-dependent distributions of protein backbone dihedral angles (i.e., Ramachandran distributions). The estimated distributions were applied to protein loop modeling, one of the most challenging open problems in protein structure prediction, by feeding them into an angular-sampling-based loop structure prediction framework. Our estimated distributions compared favorably to the Ramachandran distributions estimated by fitting a hierarchical Dirichlet process model; and in particular, our distributions showed significant improvements on the hard cases where existing methods do not work well.

  6. Nuclear based technologies for estimating microbial protein supply in ruminant livestock. Proceedings of the second research co-ordination meeting of a co-ordinated research project (phase 1)

    International Nuclear Information System (INIS)

    1999-06-01

    The Joint FAO/IAEA Division of Nuclear Techniques in Food and Agriculture through its Co-ordinated Research Projects (CRPs), has been assisting national agricultural research systems in Member States to develop and apply nuclear and related techniques for improving livestock productivity. The programmes have focused on animal nutrition, animal reproduction and more recently on animal nutrition/reproduction interactions with emphasis on smallholder farming systems. The measurement of microbial protein supply to ruminant livestock has been an important area of research in ruminant nutrition. An estimate of microbial protein contribution to the intestinal protein flow is important for estimating the protein requirement of ruminant animals. Understanding the process of microbial protein synthesis has been difficult however, and due to the lack of simple and accurate methods for measuring microbial protein production in vivo, the methods used are based on complex microbial markers which require surgically prepared animals. As a result of a consultants meeting held in May 1995 to advise the Joint FAO/IAEA Division on the feasibility of using nuclear and related techniques for the development and validation of techniques for measuring microbial protein supply in ruminant animals, an FAO/IAEA Co-ordinated Research Project on Development, Standardization and Validation of Nuclear Based Technologies for Measuring Microbial Protein Supply in Ruminant Livestock for Improving Productivity was initiated in 1996, with a view to validating and adapting this technology for use in developing countries. To assist scientists participating in the CRP, a laboratory manual containing experimental protocols and methodologies for standardization and validation of the urine purine derivative technique and the development of models to suit local conditions, was published as IAEA-TECDOC-945. The present publication contains the final reports from participants in Phase 1 of the project

  7. Nuclear based technologies for estimating microbial protein supply in ruminant livestock. Proceedings of the second research co-ordination meeting of a co-ordinated research project (phase 1)

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    1999-06-01

    The Joint FAO/IAEA Division of Nuclear Techniques in Food and Agriculture through its Co-ordinated Research Projects (CRPs), has been assisting national agricultural research systems in Member States to develop and apply nuclear and related techniques for improving livestock productivity. The programmes have focused on animal nutrition, animal reproduction and more recently on animal nutrition/reproduction interactions with emphasis on smallholder farming systems. The measurement of microbial protein supply to ruminant livestock has been an important area of research in ruminant nutrition. An estimate of microbial protein contribution to the intestinal protein flow is important for estimating the protein requirement of ruminant animals. Understanding the process of microbial protein synthesis has been difficult however, and due to the lack of simple and accurate methods for measuring microbial protein production in vivo, the methods used are based on complex microbial markers which require surgically prepared animals. As a result of a consultants meeting held in May 1995 to advise the Joint FAO/IAEA Division on the feasibility of using nuclear and related techniques for the development and validation of techniques for measuring microbial protein supply in ruminant animals, an FAO/IAEA Co-ordinated Research Project on Development, Standardization and Validation of Nuclear Based Technologies for Measuring Microbial Protein Supply in Ruminant Livestock for Improving Productivity was initiated in 1996, with a view to validating and adapting this technology for use in developing countries. To assist scientists participating in the CRP, a laboratory manual containing experimental protocols and methodologies for standardization and validation of the urine purine derivative technique and the development of models to suit local conditions, was published as IAEA-TECDOC-945. The present publication contains the final reports from participants in Phase 1 of the project

  8. Assessment of heat transfer correlations for supercritical water in the frame of best-estimate code validation

    International Nuclear Information System (INIS)

    Jaeger, Wadim; Espinoza, Victor H. Sanchez; Schneider, Niko; Hurtado, Antonio

    2009-01-01

    Within the frame of the Generation IV International Forum, six innovative reactor concepts are the subject of comprehensive investigations. In some projects supercritical water will be considered as coolant and moderator (as for the High Performance Light Water Reactor) or as secondary working fluid (one possible option for Liquid Metal-cooled Fast Reactors). Supercritical water is characterized by a pronounced change of the thermo-physical properties when crossing the pseudo-critical line, which goes hand in hand with a change in the heat transfer (HT) behavior. Hence, it is essential to estimate, in a proper way, the heat-transfer coefficient and subsequently the wall temperature. The scope of this paper is to present and discuss the activities at the Institute for Reactor Safety (IRS) related to the implementation of correlations for wall-to-fluid HT at supercritical conditions in Best-Estimate codes like TRACE, as well as its validation. It is important to validate TRACE before applying it to safety analyses of the HPLWR or of other reactor systems. In the past 3 decades various experiments have been performed all over the world to reveal the peculiarities of wall-to-fluid HT at supercritical conditions. Several different heat transfer phenomena such as HT enhancement (due to higher Prandtl numbers in the vicinity of the pseudo-critical point) or HT deterioration (due to strong property variations) were observed. Since TRACE is a component-based system code with a finite volume method, the resolution capabilities are limited and not all physical phenomena can be modeled properly. But Best-Estimate system codes are nowadays the preferred option for safety-related investigations of full plants or other integral systems. Thus, increasing the confidence in such codes is of high priority. In this paper, the post-test analysis of experiments with supercritical parameters will be presented. For that reason various correlations for the HT, which consider the characteristics

  9. Validation of commercially available automated canine-specific immunoturbidimetric method for measuring canine C-reactive protein

    DEFF Research Database (Denmark)

    Hillström, Anna; Hagman, Ragnvi; Tvedten, Harold

    2014-01-01

    BACKGROUND: Measurement of C-reactive protein (CRP) is used for diagnosing and monitoring systemic inflammatory disease in canine patients. An automated human immunoturbidimetric assay has been validated for measuring canine CRP, but cross-reactivity with canine CRP is unpredictable. OBJECTIVE: The purpose of the study was to validate a new automated canine-specific immunoturbidimetric CRP method (Gentian cCRP). METHODS: Studies of imprecision, accuracy, prozone effect, interference, limit of quantification, and stability under different storage conditions were performed. The new method was compared with a human CRP assay previously validated for canine CRP determination. Samples from 40 healthy dogs were analyzed to establish a reference interval. RESULTS: Total imprecision was

  10. Statistical Estimation of the Protein-Ligand Binding Free Energy Based On Direct Protein-Ligand Interaction Obtained by Molecular Dynamics Simulation

    Directory of Open Access Journals (Sweden)

    Haruki Nakamura

    2012-09-01

    Full Text Available We have developed a method for estimating the protein-ligand binding free energy (ΔG) based on the direct protein-ligand interaction obtained by a molecular dynamics simulation. Using this method, we estimated the ΔG value statistically from the average values of the van der Waals and electrostatic interactions between each amino acid of the target protein and the ligand molecule. In addition, we introduced fluctuations in the accessible surface area (ASA) and dihedral angles of the protein-ligand complex system as the entropy terms of the ΔG estimation. The present method included the fluctuation term of structural change of the protein and the effective dielectric constant. We applied this method to 34 protein-ligand complex structures. As a result, the correlation coefficient between the experimental and calculated ΔG values was 0.81, and the average error of ΔG was 1.2 kcal/mol with the use of the fixed parameters. These results were obtained from a 2 ns molecular dynamics simulation.
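
    The general idea, regressing experimental binding free energies on averaged interaction and fluctuation terms extracted from MD trajectories, can be sketched as below; the feature set, coefficients and numbers are hypothetical placeholders, not the authors' parameterization.

```python
# Sketch of the general idea: regress experimental binding free energies on
# averaged interaction/fluctuation terms from MD trajectories. The feature set
# and all numbers are hypothetical, not the authors' parameterization.
import numpy as np

# columns: mean vdW, mean electrostatic, ASA fluctuation, dihedral fluctuation
features = np.array([
    [-35.2, -12.1, 4.3, 2.1],
    [-28.7, -20.4, 3.8, 1.7],
    [-40.1,  -8.3, 5.1, 2.6],
    [-31.5, -15.9, 4.0, 1.9],
    [-26.3, -18.2, 3.5, 1.5],
    [-33.8, -10.6, 4.5, 2.3],
    [-22.9, -14.7, 3.1, 1.2],
    [-38.4, -17.5, 4.8, 2.4],
])
dg_exp = np.array([-9.8, -8.9, -10.6, -9.1, -8.2, -9.3, -7.4, -10.9])  # kcal/mol, hypothetical

X = np.column_stack([features, np.ones(len(dg_exp))])  # add an intercept term
coef, *_ = np.linalg.lstsq(X, dg_exp, rcond=None)

dg_calc = X @ coef
r = np.corrcoef(dg_calc, dg_exp)[0, 1]
mae = np.mean(np.abs(dg_calc - dg_exp))
print(f"correlation={r:.2f}, mean absolute error={mae:.2f} kcal/mol")
```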

  11. ENERGY AND PROTEIN REQUIREMENTS OF GROWING PELIBUEY SHEEP UNDER TROPICAL CONDITIONS ESTIMATED FROM A LITERATURE DATABASE ANALYSES

    Directory of Open Access Journals (Sweden)

    Fernando Duarte

    2012-01-01

    Full Text Available Data from previous studies were used to estimate the metabolizable energy and protein requirements for maintenance and growth, as well as the basal metabolism energy requirement, of male Pelibuey sheep under tropical conditions. In addition, the empty body weight and mature weight of male and female Pelibuey sheep were also estimated. Basal metabolism energy requirements were estimated with the Cornell Net Carbohydrate and Protein System - Sheep (CNCPS-S) model using the a1 factor of the maintenance equation. Mature weight was estimated to be 69 kg for males and 45 kg for females. Empty body weight was estimated to be 81% of live weight. Metabolizable energy and protein requirements for growth were 0.106 Mcal MEm/kg LW0.75 and 2.4 g MP/kg LW0.75 for males. The collected information did not allow an appropriate estimation of female requirements. The basal metabolism energy requirement was estimated to be 0.039 Mcal MEm/kg LW0.75. Energy requirements for basal metabolism were lower in Pelibuey sheep than those reported for wool breeds, even though their total requirements were similar.

  12. Accuracy and feasibility of estimated tumour volumetry in primary gastric gastrointestinal stromal tumours: validation using semiautomated technique in 127 patients.

    Science.gov (United States)

    Tirumani, Sree Harsha; Shinagare, Atul B; O'Neill, Ailbhe C; Nishino, Mizuki; Rosenthal, Michael H; Ramaiya, Nikhil H

    2016-01-01

    To validate estimated tumour volumetry in primary gastric gastrointestinal stromal tumours (GISTs) using semiautomated volumetry. In this IRB-approved retrospective study, we measured the three longest diameters in the x, y, z axes on CTs of primary gastric GISTs in 127 consecutive patients (52 women, 75 men, mean age 61 years) at our institute between 2000 and 2013. Segmented volumes (Vsegmented) were obtained using commercial software by two radiologists. Estimated volumes (V1-V6) were obtained using formulae for spheres and ellipsoids. Intra- and interobserver agreement of Vsegmented and agreement of V1-6 with Vsegmented were analysed with concordance correlation coefficients (CCC) and Bland-Altman plots. Median Vsegmented and V1-V6 were 75.9, 124.9, 111.6, 94.0, 94.4, 61.7 and 80.3 cm³, respectively. There was strong intra- and interobserver agreement for Vsegmented. Agreement with Vsegmented was highest for V6 (scalene ellipsoid, x ≠ y ≠ z), with a CCC of 0.96 [95% CI 0.95-0.97]. The mean relative difference was smallest for V6 (0.6%), while it was -19.1% for V5, +14.5% for V4, +17.9% for V3, +32.6% for V2 and +47% for V1. Ellipsoidal approximations of volume using three measured axes may be used to closely estimate Vsegmented when semiautomated techniques are unavailable. Estimation of tumour volume in primary GIST using mathematical formulae is feasible. Gastric GISTs are rarely spherical. Segmented volumes are highly concordant with three-axis-based scalene ellipsoid volumes. Ellipsoid volume can be used as an alternative to automated tumour volumetry.
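
    The volume approximation compared here, a scalene ellipsoid built from the three longest orthogonal diameters (V = π/6 · x · y · z), and its agreement with a reference volume via Lin's concordance correlation coefficient, can be sketched as follows; all diameters and reference volumes are hypothetical.

```python
# Sketch of the volume approximations compared above: a scalene-ellipsoid
# estimate from the three longest orthogonal diameters versus a reference
# (segmented) volume, summarized with Lin's concordance correlation
# coefficient. Diameters and reference volumes are hypothetical.
import numpy as np

x = np.array([6.1, 4.2, 8.0, 5.5])                  # longest diameters in cm, hypothetical
y = np.array([5.0, 3.8, 6.9, 4.7])
z = np.array([4.4, 3.1, 6.2, 4.0])
v_segmented = np.array([70.2, 26.0, 178.5, 55.1])   # cm^3, hypothetical reference

v_ellipsoid = np.pi / 6.0 * x * y * z        # scalene ellipsoid, x != y != z
v_sphere = np.pi / 6.0 * x ** 3              # sphere from the single longest axis

def ccc(a, b):
    """Lin's concordance correlation coefficient."""
    cov = np.cov(a, b, ddof=1)[0, 1]
    return 2 * cov / (a.var(ddof=1) + b.var(ddof=1) + (a.mean() - b.mean()) ** 2)

print("ellipsoid CCC:", round(ccc(v_ellipsoid, v_segmented), 3))
print("sphere    CCC:", round(ccc(v_sphere, v_segmented), 3))
```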

  13. Validation of a protocol for the estimation of three-dimensional body center of mass kinematics in sport.

    Science.gov (United States)

    Mapelli, Andrea; Zago, Matteo; Fusini, Laura; Galante, Domenico; Colombo, Andrea; Sforza, Chiarella

    2014-01-01

    Since it is strictly related to balance and stability control, body center of mass (CoM) kinematics is a relevant quantity in sport studies. Many methods have been proposed to estimate CoM displacement. Among them, the segmental method appears to be suitable for investigating CoM kinematics in sport: the human body is assumed to be a system of rigid bodies, hence the whole-body CoM is calculated as the weighted average of the CoM of each segment. The number of landmarks represents a crucial choice in the protocol design process: one has to find the proper compromise between accuracy and invasiveness. In this study, using a motion analysis system, a protocol based upon the segmental method is validated, adopting an anatomical model comprising 14 landmarks. Two sets of experiments were conducted. Firstly, our protocol was compared to the ground reaction force method (GRF), regarded as a standard in CoM estimation. In the second experiment, we investigated the aerial phase typical of many disciplines, comparing our protocol with: (1) an absolute reference, the parabolic regression of the vertical CoM trajectory during the time of flight; (2) two common approaches to estimating CoM kinematics in gait, known as the sacrum and reconstructed pelvis methods. Recognized accuracy indexes proved that the results obtained were comparable to those of the GRF method; what is more, during the aerial phases our protocol proved to be significantly more accurate than the two other methods. The assessed protocol can therefore be adopted as a reliable tool for CoM kinematics estimation in further sport research. Copyright © 2013 Elsevier B.V. All rights reserved.
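
    The core computation of the segmental method, a mass-weighted average of segment CoM positions, is sketched below; the segment names, mass fractions and coordinates are hypothetical placeholders, not the 14-landmark model of the study.

```python
# Sketch of the segmental method: the whole-body CoM is the mass-weighted
# average of the segment CoM positions. Five illustrative segments are used;
# mass fractions and coordinates are hypothetical, not the 14-landmark model.
import numpy as np

mass_fraction = np.array([0.50, 0.16, 0.16, 0.09, 0.09])  # fractions of body mass, sum to 1
segment_com = np.array([                                  # 3-D CoM of each segment (m)
    [0.02,  0.00, 1.10],
    [0.05,  0.10, 0.75],
    [0.05, -0.10, 0.75],
    [0.04,  0.25, 0.35],
    [0.04, -0.25, 0.35],
])

whole_body_com = (mass_fraction[:, None] * segment_com).sum(axis=0) / mass_fraction.sum()
print("whole-body CoM (x, y, z):", np.round(whole_body_com, 3))
```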

  14. Are cannabis prevalence estimates comparable across countries and regions? A cross-cultural validation using search engine query data.

    Science.gov (United States)

    Steppan, Martin; Kraus, Ludwig; Piontek, Daniela; Siciliano, Valeria

    2013-01-01

    Prevalence estimation of cannabis use is usually based on self-report data. Although there is evidence on the reliability of this data source, its cross-cultural validity is still a major concern. External objective criteria are needed for this purpose. In this study, cannabis-related search engine query data are used as an external criterion. Data on cannabis use were taken from the 2007 European School Survey Project on Alcohol and Other Drugs (ESPAD). Provincial data came from three Italian nation-wide studies using the same methodology (2006-2008; ESPAD-Italia). Information on cannabis-related search engine query data was based on Google search volume indices (GSI). (1) Reliability analysis was conducted for GSI. (2) Latent measurement models of "true" cannabis prevalence were tested using perceived availability, web-based cannabis searches and self-reported prevalence as indicators. (3) Structure models were set up to test the influences of response tendencies and geographical position (latitude, longitude). In order to test the stability of the models, analyses were conducted on country level (Europe, US) and on provincial level in Italy. Cannabis-related GSI were found to be highly reliable and constant over time. The overall measurement model was highly significant in both data sets. On country level, no significant effects of response bias indicators and geographical position on perceived availability, web-based cannabis searches and self-reported prevalence were found. On provincial level, latitude had a significant positive effect on availability indicating that perceived availability of cannabis in northern Italy was higher than expected from the other indicators. Although GSI showed weaker associations with cannabis use than perceived availability, the findings underline the external validity and usefulness of search engine query data as external criteria. The findings suggest an acceptable relative comparability of national (provincial) prevalence

  15. Simultaneous Estimation and Validation of Atorvastatin Calcium and Aspirin in Combined Capsule Dosage Form by RP HPLC Method

    Directory of Open Access Journals (Sweden)

    B. V. Suma

    2012-01-01

    Full Text Available A new simple, specific, precise and accurate reverse-phase liquid chromatography method has been developed for the simultaneous estimation of atorvastatin calcium (AST) and aspirin (ASP) in a combined capsule dosage form. The chromatographic separation was achieved on a 5-micron C18 column (250 x 4.6 mm) using a mobile phase consisting of a mixture of acetonitrile and 0.02 M ammonium acetate buffer (68:32, pH 4.5). The flow rate was maintained at 0.8 ml/min. Detection of the constituents was done using a UV detector at 245 nm for both AST and ASP. The retention times of AST and ASP were found to be 4.5915 ± 0.0031 min and 3.282 ± 0.0024 min, respectively. The developed method was validated for accuracy, linearity, precision, limit of detection (LOD), limit of quantification (LOQ) and robustness as per the ICH guidelines.

  16. Imaging mass spectrometry in papillary thyroid carcinoma for the identification and validation of biomarker proteins.

    Science.gov (United States)

    Min, Kyueng-Whan; Bang, Joo-Young; Kim, Kwang Pyo; Kim, Wan-Seop; Lee, Sang Hwa; Shanta, Selina Rahman; Lee, Jeong Hwa; Hong, Ji Hye; Lim, So Dug; Yoo, Young-Bum; Na, Chan-Hyun

    2014-07-01

    Direct tissue imaging mass spectrometry (IMS) by matrix-assisted laser desorption ionization and time-of-flight (MALDI-TOF) mass spectrometry has become increasingly important in biology and medicine, because this technology can detect the relative abundance and spatial distribution of interesting proteins in tissues. Five thyroid cancer samples, along with normal tissue, were sliced and transferred onto conductive glass slides. After laser scanning by MALDI-TOF equipped with a smart beam laser, images were created for individual masses and proteins were classified at 200-µm spatial resolution. Based on the spatial distribution, region-specific proteins on a tumor lesion could be identified by protein extraction from tumor tissue and analysis using liquid chromatography with tandem mass spectrometry (LC-MS/MS). Using all the spectral data at each spot, various intensities of a specific peak were detected in the tumor and normal regions of the thyroid. Differences in the molecular weights of expressed proteins between tumor and normal regions were analyzed using unsupervised and supervised clustering. To verify the presence of discovered proteins through IMS, we identified ribosomal protein P2, which is specific for cancer. We have demonstrated the feasibility of IMS as a useful tool for the analysis of tissue sections, and identified the tumor-specific protein ribosomal protein P2.

  17. Simulating the influence of plasma protein on measured receptor affinity in biochemical assays reveals the utility of Schild analysis for estimating compound affinity for plasma proteins.

    Science.gov (United States)

    Blakeley, D; Sykes, D A; Ensor, P; Bertran, E; Aston, P J; Charlton, S J

    2015-11-01

    Plasma protein binding (PPB) influences the free fraction of drug available to bind to its target and is therefore an important consideration in drug discovery. While traditional methods for assessing PPB (e.g. rapid equilibrium dialysis) are suitable for comparing compounds with relatively weak PPB, they are not able to accurately discriminate between highly bound compounds (typically >99.5%). The aim of the present work was to use mathematical modelling to explore the potential utility of receptor binding and cellular functional assays to estimate the affinity of compounds for plasma proteins. Plasma proteins are routinely added to in vitro assays, so a secondary goal was to investigate the effect of plasma proteins on observed ligand-receptor interactions. Using the principle of conservation of mass and the law of mass action, a cubic equation was derived describing the ligand-receptor complex [LR] in the presence of plasma protein at equilibrium. The model demonstrates the profound influence of PPB on in vitro assays and identifies the utility of Schild analysis, which is usually applied to determine receptor-antagonist affinities, for calculating affinity at plasma proteins (termed KP ). We have also extended this analysis to functional effects using operational modelling and demonstrate that these approaches can also be applied to cell-based assay systems. These mathematical models can potentially be used in conjunction with experimental data to estimate drug-plasma protein affinities in the earliest phases of drug discovery programmes. © 2015 The British Pharmacological Society.
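
    The equilibrium described here can be illustrated with a generic competitive-binding calculation: the ligand partitions between its receptor and the plasma protein, and solving the free-ligand mass balance gives the complex concentration [LR]. The sketch below is not the authors' derived cubic equation or operational model; it only applies the stated conservation/mass-action assumptions, and all concentrations and affinities are hypothetical.

```python
# Sketch of the equilibrium described above: ligand partitions between its
# receptor (dissociation constant Kd) and plasma protein (Kp). Solving the
# free-ligand mass balance numerically gives [LR]; this is a generic
# competitive-binding calculation, not the authors' cubic equation itself.
# All concentrations and affinities are hypothetical.
from scipy.optimize import brentq

def bound_complex(L_total, R_total, P_total, Kd, Kp):
    """Return [LR] at equilibrium (all quantities in the same concentration unit)."""
    def free_ligand_balance(L):
        LR = R_total * L / (Kd + L)      # receptor-bound ligand
        LP = P_total * L / (Kp + L)      # plasma-protein-bound ligand
        return L + LR + LP - L_total     # conservation of ligand
    L_free = brentq(free_ligand_balance, 0.0, L_total)
    return R_total * L_free / (Kd + L_free)

# example (nM): a 1 nM-affinity ligand, receptor at 0.01 nM, plasma protein at 600 uM
no_protein = bound_complex(L_total=10.0, R_total=0.01, P_total=0.0,   Kd=1.0, Kp=1e4)
with_protein = bound_complex(L_total=10.0, R_total=0.01, P_total=6e5, Kd=1.0, Kp=1e4)
print(f"[LR] without plasma protein: {no_protein:.4f} nM, with: {with_protein:.6f} nM")
```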

  18. Single-Molecule Force Spectroscopy Trajectories of a Single Protein and Its Polyproteins Are Equivalent: A Direct Experimental Validation Based on A Small Protein NuG2.

    Science.gov (United States)

    Lei, Hai; He, Chengzhi; Hu, Chunguang; Li, Jinliang; Hu, Xiaodong; Hu, Xiaotang; Li, Hongbin

    2017-05-22

    Single-molecule force spectroscopy (SMFS) has become a powerful tool in investigating the mechanical unfolding/folding of proteins at the single-molecule level. Polyproteins made of tandem identical repeats have been widely used in atomic force microscopy (AFM)-based SMFS studies, where polyproteins not only serve as fingerprints to identify single-molecule stretching events, but may also improve statistics of data collection. However, the inherent assumption of such experiments is that all the domains in the polyprotein are equivalent and one SMFS trajectory of stretching a polyprotein made of n domains is equivalent to n trajectories of stretching a single domain. Such an assumption has not been validated experimentally. Using a small protein NuG2 and its polyprotein (NuG2)4 as model systems, here we use optical trapping (OT) to directly validate this assumption. Our results show that OT experiments on NuG2 and (NuG2)4 lead to identical parameters describing the unfolding and folding kinetics of NuG2, demonstrating that indeed stretching a polyprotein of NuG2 is equivalent to stretching single NuG2 in force spectroscopy experiments and thus validating the use of polyproteins in SMFS experiments. © 2017 Wiley-VCH Verlag GmbH & Co. KGaA, Weinheim.

  19. Validation of the method for the simultaneous estimation of bioactive marker gallic acid and quercetin in Abutilon indicum by HPTLC

    Directory of Open Access Journals (Sweden)

    Md. Sarfaraj Hussain

    2012-05-01

    Full Text Available Objective: To establish and validate a method for the simultaneous estimation of the two biomarker compounds gallic acid (GA) and quercetin (QE) in a methanolic extract of Abutilon indicum (AI). Methods: Chromatography was performed on aluminium foil-backed silica gel 60 F254 HPTLC plates with the binary mobile phase toluene-ethyl acetate-formic acid (5:4:1, v/v/v). Ultraviolet detection was performed densitometrically at the maximum absorbance wavelength, 270 nm. The method was validated for precision, recovery, robustness, specificity, and detection and quantification limits, in accordance with ICH guidelines. Results: The system was found to give compact spots for GA and QE (Rf values of 0.31 and 0.50, respectively). The limits of detection (23 and 41 ng band-1), limits of quantification (69 and 123 ng band-1), recovery (99.4-99.9% and 98.7-99.4%) and precision (≤ 1.98 and 1.97) were satisfactory for gallic acid and quercetin, respectively. The linearity ranges for GA and QE were 100-1000 ng band-1 (r² = 0.9991) and 150-900 ng band-1 (r² = 0.9956), and the contents were estimated as 0.69% ± 0.01% and 0.57% ± 0.01% w/w, respectively. Conclusions: This simple, precise and accurate method gave good resolution from the other constituents present in the extract. The method has been successfully applied in the analysis and routine quality control of herbal material and formulations containing AI.

  20. Validity and Reliability of the Brazilian Version of the Rapid Estimate of Adult Literacy in Dentistry – BREALD-30

    Science.gov (United States)

    Junkes, Monica C.; Fraiz, Fabian C.; Sardenberg, Fernanda; Lee, Jessica Y.; Paiva, Saul M.; Ferreira, Fernanda M.

    2015-01-01

    Objective The aim of the present study was to translate, perform the cross-cultural adaptation of the Rapid Estimate of Adult Literacy in Dentistry to Brazilian-Portuguese language and test the reliability and validity of this version. Methods After translation and cross-cultural adaptation, interviews were conducted with 258 parents/caregivers of children in treatment at the pediatric dentistry clinics and health units in Curitiba, Brazil. To test the instrument's validity, the scores of Brazilian Rapid Estimate of Adult Literacy in Dentistry (BREALD-30) were compared based on occupation, monthly household income, educational attainment, general literacy, use of dental services and three dental outcomes. Results The BREALD-30 demonstrated good internal reliability. Cronbach’s alpha ranged from 0.88 to 0.89 when words were deleted individually. The analysis of test-retest reliability revealed excellent reproducibility (intraclass correlation coefficient = 0.983 and Kappa coefficient ranging from moderate to nearly perfect). In the bivariate analysis, BREALD-30 scores were significantly correlated with the level of general literacy (rs = 0.593) and income (rs = 0.327) and significantly associated with occupation, educational attainment, use of dental services, self-rated oral health and the respondent’s perception regarding his/her child's oral health. However, only the association between the BREALD-30 score and the respondent’s perception regarding his/her child's oral health remained significant in the multivariate analysis. Conclusion The BREALD-30 demonstrated satisfactory psychometric properties and is therefore applicable to adults in Brazil. PMID:26158724

  1. Developing and Validating Path-Dependent Uncertainty Estimates for use with the Regional Seismic Travel Time (RSTT) Model

    Science.gov (United States)

    Begnaud, M. L.; Anderson, D. N.; Phillips, W. S.; Myers, S. C.; Ballard, S.

    2016-12-01

    The Regional Seismic Travel Time (RSTT) tomography model has been developed to improve travel time predictions for regional phases (Pn, Sn, Pg, Lg) in order to increase seismic location accuracy, especially for explosion monitoring. The RSTT model is specifically designed to exploit regional phases for location, especially when combined with teleseismic arrivals. The latest RSTT model (version 201404um) has been released (http://www.sandia.gov/rstt). Travel time uncertainty estimates for RSTT are determined using one-dimensional (1D), distance-dependent error models that have the benefit of being very fast to use in standard location algorithms, but do not account for path-dependent variations in error or for structural inadequacy of the RSTT model (e.g., model error). Although global in extent, the RSTT tomography model is only defined in areas where data exist. A simple 1D error model does not accurately model areas where RSTT has not been calibrated. We are developing and validating a new error model for RSTT phase arrivals by mathematically deriving this multivariate model directly from a unified model of RSTT embedded into a statistical random effects model that captures distance, path and model error effects. The initial method developed is a two-dimensional, path-distributed method based on residuals. The goals for any RSTT uncertainty method are that it be readily usable by the standard RSTT user and that it improve travel time uncertainty estimates for location. We have successfully tested the new error model for Pn phases and will demonstrate the method and validation of the error model for Sn, Pg, and Lg phases.

  2. Design and validation of new genotypic tools for easy and reliable estimation of HIV tropism before using CCR5 antagonists.

    Science.gov (United States)

    Poveda, Eva; Seclén, Eduardo; González, María del Mar; García, Federico; Chueca, Natalia; Aguilera, Antonio; Rodríguez, Jose Javier; González-Lahoz, Juan; Soriano, Vincent

    2009-05-01

    Genotypic tools may allow easier and less expensive estimation of HIV tropism before prescription of CCR5 antagonists compared with the Trofile assay (Monogram Biosciences, South San Francisco, CA, USA). Paired genotypic and Trofile results were compared in plasma samples derived from the maraviroc expanded access programme (EAP) in Europe. A new genotypic approach was built to improve the sensitivity to detect X4 variants based on an optimization of the webPSSM algorithm. Then, the new tool was validated in specimens from patients included in the ALLEGRO trial, a multicentre study conducted in Spain to assess the prevalence of R5 variants in treatment-experienced HIV patients. A total of 266 specimens from the maraviroc EAP were tested. Overall geno/pheno concordance was above 72%. A high specificity was generally seen for the detection of X4 variants using genotypic tools (ranging from 58% to 95%), while sensitivity was low (ranging from 31% to 76%). The PSSM score was then optimized to enhance the sensitivity to detect X4 variants changing the original threshold for R5 categorization. The new PSSM algorithms, PSSM(X4R5-8) and PSSM(SINSI-6.4), considered as X4 all V3 scoring values above -8 or -6.4, respectively, increasing the sensitivity to detect X4 variants up to 80%. The new algorithms were then validated in 148 specimens derived from patients included in the ALLEGRO trial. The sensitivity/specificity to detect X4 variants was 93%/69% for PSSM(X4R5-8) and 93%/70% for PSSM(SINSI-6.4). PSSM(X4R5-8) and PSSM(SINSI-6.4) may confidently assist therapeutic decisions for using CCR5 antagonists in HIV patients, providing an easier and rapid estimation of tropism in clinical samples.
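
    The optimized algorithms described above reduce to a simple decision rule: a V3 sequence is called X4 whenever its PSSM score exceeds a fixed cutoff (-8 for PSSM(X4R5-8), -6.4 for PSSM(SINSI-6.4)). The sketch below illustrates only that thresholding step; the scores themselves are assumed to come from an existing PSSM implementation such as webPSSM, which is not reproduced here.

    ```python
    # Minimal sketch of the cutoff rule used by the optimized PSSM algorithms:
    # a V3 sequence is called X4 when its (precomputed) PSSM score exceeds the cutoff.
    CUTOFFS = {"PSSM_X4R5": -8.0, "PSSM_SINSI": -6.4}

    def call_tropism(pssm_score: float, algorithm: str = "PSSM_X4R5") -> str:
        """Return 'X4' if the score is above the algorithm's cutoff, otherwise 'R5'."""
        return "X4" if pssm_score > CUTOFFS[algorithm] else "R5"

    # Hypothetical scores for three V3 sequences
    for score in (-12.3, -7.1, -2.4):
        print(score, call_tropism(score), call_tropism(score, "PSSM_SINSI"))
    ```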

  3. Estimation of Relationship Between In Situ and In Vitro Rumen Protein Degradability of Extruded Full Fat Soybean

    Directory of Open Access Journals (Sweden)

    Arzu Erol Tunç

    2017-10-01

    Full Text Available The objectives of this study were to estimate the protein degradability of extruded full fat soybean (ESB) by the in situ (nylon bag) and in vitro enzymatic methods and to develop an equation to predict in situ degradability from in vitro values. As the in vitro method, an enzymatic technique was used: hydrolysis for 1 h (INV1) and 24 h (INV24) by a purified protease extracted from Streptomyces griseus in a borate-phosphate buffer at pH 8. The relationships between in situ effective protein degradability (INSE) and in vitro degradability after 1 and 24 hours of incubation (INV1 and INV24) were determined. In situ protein degradability was measured at 0, 2, 4, 8, 16, 24, 48 and 72 h of incubation in the rumen of 3 Holstein cows. INSE, INV1 and INV24 were determined as 58.05%, 20.24% and 41.46%, respectively. Although there were differences between the in situ and in vitro protein degradability values, the correlation coefficients between in situ and in vitro protein degradability of ESB were high, and the regression equations for estimating in situ from in vitro values were significant. In conclusion, in vitro enzymatic protein degradability (INV1 and INV24) can be used to estimate the in situ effective protein degradability of extruded full fat soybean.
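
    Since the abstract reports the mean degradability values but not the fitted coefficients, the following is only a generic sketch of how such a prediction equation could be derived: an ordinary least-squares fit of in situ effective degradability (INSE) on an in vitro value (here INV24), using made-up illustrative data rather than the study's measurements.

    ```python
    # Sketch: fit a simple linear prediction equation INSE = a + b * INV24.
    # The data points below are illustrative placeholders, not the study's data.
    import numpy as np

    inv24 = np.array([35.0, 38.5, 41.5, 44.0, 47.5])   # in vitro degradability, %
    inse  = np.array([50.2, 54.1, 58.0, 61.3, 65.8])   # in situ effective degradability, %

    b, a = np.polyfit(inv24, inse, deg=1)               # slope and intercept
    r = np.corrcoef(inv24, inse)[0, 1]

    print(f"INSE = {a:.2f} + {b:.2f} * INV24   (r = {r:.3f})")
    print("Predicted INSE at INV24 = 41.46%:", round(a + b * 41.46, 1))
    ```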

  4. Development and Validation of RP-HPLC Method for Simultaneous Estimation of Ramipril, Aspirin and Atorvastatin in Pharmaceutical Preparations

    Directory of Open Access Journals (Sweden)

    Rajesh Sharma

    2012-01-01

    Full Text Available A simple, sensitive, accurate and rapid reverse phase high performance liquid chromatographic method was developed for the simultaneous estimation of ramipril, aspirin and atorvastatin in pharmaceutical preparations. Chromatography was performed on a 25 cm × 4.6 mm i.d., 5 µm particle, C18 column, with a mixture of (A) acetonitrile-methanol (65:35) and (B) 10 mM sodium dihydrogen phosphate monohydrate (NaH2PO4.H2O) buffer, combined as A:B (60:40 v/v) and adjusted to pH 3.0 with o-phosphoric acid (5% v/v), used as the mobile phase at a flow rate of 1.5 ml min-1. UV detection was performed at 230 nm. Total run time was less than 12 min; retention times for ramipril, aspirin and atorvastatin were 3.620, 4.920 and 11.710 min, respectively. The method was validated for accuracy, precision, linearity, specificity and sensitivity in accordance with ICH guidelines. Validation revealed that the method is specific, rapid, accurate, precise, reliable, and reproducible. Calibration plots were linear over the concentration ranges 5-50 µg mL-1 for ramipril, 5-100 µg mL-1 for aspirin and 2-20 µg mL-1 for atorvastatin. Limits of detection were 0.014, 0.10 and 0.0095 ng mL-1, and limits of quantification were 0.043, 0.329 and 0.029 ng mL-1, for ramipril, aspirin and atorvastatin, respectively. The high recovery and low coefficients of variation confirm the suitability of the method for simultaneous analysis of the three drugs in dosage forms. The validated method was successfully used for quantitative analysis of marketed pharmaceutical preparations.

  5. Primary Sclerosing Cholangitis Risk Estimate Tool (PREsTo) Predicts Outcomes in PSC: A Derivation & Validation Study Using Machine Learning.

    Science.gov (United States)

    Eaton, John E; Vesterhus, Mette; McCauley, Bryan M; Atkinson, Elizabeth J; Schlicht, Erik M; Juran, Brian D; Gossard, Andrea A; LaRusso, Nicholas F; Gores, Gregory J; Karlsen, Tom H; Lazaridis, Konstantinos N

    2018-05-09

    Improved methods are needed to risk stratify and predict outcomes in patients with primary sclerosing cholangitis (PSC). Therefore, we sought to derive and validate a new prediction model and compare its performance to existing surrogate markers. The model was derived using 509 subjects from a multicenter North American cohort and validated in an international multicenter cohort (n=278). Gradient boosting, a machine learning technique, was used to create the model. The endpoint was hepatic decompensation (ascites, variceal hemorrhage or encephalopathy). Subjects with advanced PSC or cholangiocarcinoma at baseline were excluded. The PSC risk estimate tool (PREsTo) consists of 9 variables: bilirubin, albumin, serum alkaline phosphatase (SAP) times the upper limit of normal (ULN), platelets, AST, hemoglobin, sodium, patient age and the number of years since PSC was diagnosed. Validation in an independent cohort confirms PREsTo accurately predicts decompensation (C statistic 0.90, 95% confidence interval (CI) 0.84-0.95) and performed well compared to the MELD score (C statistic 0.72, 95% CI 0.57-0.84), the Mayo PSC risk score (C statistic 0.85, 95% CI 0.77-0.92) and SAP (C statistic 0.65, 95% CI 0.55-0.73). PREsTo remained accurate among individuals with bilirubin below a defined cutoff (C statistic 0.90, 95% CI 0.82-0.96) and when the score was re-applied at a later point in the disease course (C statistic 0.82, 95% CI 0.64-0.95). PREsTo accurately predicts hepatic decompensation in PSC and exceeds the performance of other widely available, noninvasive prognostic scoring systems. This article is protected by copyright. All rights reserved. © 2018 by the American Association for the Study of Liver Diseases.
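
    PREsTo itself is not publicly reproduced here; the following is only a minimal sketch of how a gradient-boosting classifier over the nine listed variables could be trained with scikit-learn, assuming a tabular dataset with a binary hepatic-decompensation outcome. The variable names and data are placeholders, not the study's cohort or coefficients.

    ```python
    # Sketch: gradient boosting on nine PREsTo-style predictors (simulated placeholder data).
    import numpy as np
    from sklearn.ensemble import GradientBoostingClassifier
    from sklearn.model_selection import train_test_split
    from sklearn.metrics import roc_auc_score

    rng = np.random.default_rng(0)
    n = 500
    # Columns: bilirubin, albumin, SAP_xULN, platelets, AST, hemoglobin, sodium,
    #          age, years_since_diagnosis  (all simulated, for illustration only)
    X = rng.normal(size=(n, 9))
    y = (X[:, 0] + 0.5 * X[:, 2] - 0.7 * X[:, 1] + rng.normal(size=n) > 0.5).astype(int)

    X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)
    model = GradientBoostingClassifier(n_estimators=200, learning_rate=0.05, max_depth=3)
    model.fit(X_tr, y_tr)

    # The C statistic reported in the abstract corresponds to the ROC AUC.
    print("C statistic (AUC):", round(roc_auc_score(y_te, model.predict_proba(X_te)[:, 1]), 3))
    ```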

  6. Global Validation of MODIS Atmospheric Profile-Derived Near-Surface Air Temperature and Dew Point Estimates

    Science.gov (United States)

    Famiglietti, C.; Fisher, J.; Halverson, G. H.

    2017-12-01

    This study validates a method of remote sensing near-surface meteorology that vertically interpolates MODIS atmospheric profiles to surface pressure level. The extraction of air temperature and dew point observations at a two-meter reference height from 2001 to 2014 yields global moderate- to fine-resolution near-surface temperature distributions that are compared to geographically and temporally corresponding measurements from 114 ground meteorological stations distributed worldwide. This analysis is the first robust, large-scale validation of the MODIS-derived near-surface air temperature and dew point estimates, both of which serve as key inputs in models of energy, water, and carbon exchange between the land surface and the atmosphere. Results show strong linear correlations between remotely sensed and in-situ near-surface air temperature measurements (R2 = 0.89), as well as between dew point observations (R2 = 0.77). Performance is relatively uniform across climate zones. The extension of mean climate-wise percent errors to the entire remote sensing dataset allows for the determination of MODIS air temperature and dew point uncertainties on a global scale.

  7. Validation of myocardial blood flow estimation with nitrogen-13 ammonia PET by the argon inert gas technique in humans

    International Nuclear Information System (INIS)

    Kotzerke, J.; Glatting, G.; Neumaier, B.; Reske, S.N.; Hoff, J. van den; Hoeher, M.; Woehrle, J.

    2001-01-01

    We simultaneously determined global myocardial blood flow (MBF) by the argon inert gas technique and by nitrogen-13 ammonia positron emission tomography (PET) to validate PET-derived MBF values in humans. A total of 19 patients were investigated at rest (n=19) and during adenosine-induced hyperaemia (n=16). Regional coronary artery stenoses were ruled out by angiography. The argon inert gas method uses the difference of arterial and coronary sinus argon concentrations during inhalation of a mixture of 75% argon and 25% oxygen to estimate global MBF. It can be considered as valid as the microspheres technique, which, however, cannot be applied in humans. Dynamic PET was performed after injection of 0.8±0.2 GBq of 13N-ammonia and MBF was calculated applying a two-tissue compartment model. MBF values derived from the argon method at rest and during the hyperaemic state were 1.03±0.24 ml min-1 g-1 and 2.64±1.02 ml min-1 g-1, respectively. MBF values derived from ammonia PET at rest and during hyperaemia were 0.95±0.23 ml min-1 g-1 and 2.44±0.81 ml min-1 g-1, respectively. The correlation between the two methods was close (y=0.92x+0.14, r=0.96), supporting the validity of global MBF estimation in humans with 13N-ammonia PET. (orig.)

  8. PIG's Speed Estimated with Pressure Transducers and Hall Effect Sensor: An Industrial Application of Sensors to Validate a Testing Laboratory.

    Science.gov (United States)

    Lima, Gustavo F; Freitas, Victor C G; Araújo, Renan P; Maitelli, André L; Salazar, Andrés O

    2017-09-15

    Pipeline inspection using a device called a Pipeline Inspection Gauge (PIG) is safe and reliable when the PIG travels at low speed during inspection. We built a Testing Laboratory, containing a testing loop and a supervisory system, to study speed control techniques for PIGs. The objective of this work is to present and validate the Testing Laboratory, which will allow development of a speed controller for PIGs and solve an existing problem in the oil industry. The experimental methodology used throughout the project is also presented. We installed pressure transducers on the pipeline outer walls to detect the PIG's movement and, with data from the supervisory system, calculated an average speed of 0.43 m/s. At the same time, the electronic board inside the PIG received data from the odometer and calculated an average speed of 0.45 m/s. We found an error of 4.44%, which is experimentally acceptable. The results showed that it is possible to successfully build a Testing Laboratory to detect the PIG's passage and estimate its speed. The validation of the Testing Laboratory using data from the odometer and its auxiliary electronics was very successful. Lastly, we hope to develop more research in the oil industry using this Testing Laboratory.
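
    The 4.44% figure is simply the relative difference between the two independent speed estimates. A small worked example is shown below; the transducer spacing and timestamps are assumed placeholders chosen to reproduce the reported 0.43 m/s.

    ```python
    # Average speed from two pressure-transducer timestamps (placeholder distance/times),
    # plus the relative error between the transducer- and odometer-based estimates.
    distance_between_transducers = 10.0      # metres (assumed for illustration)
    time_first, time_second = 0.0, 23.2558   # seconds at which each transducer fired

    speed_transducers = distance_between_transducers / (time_second - time_first)
    speed_odometer = 0.45                    # m/s, as reported from the PIG's odometer

    relative_error = abs(speed_odometer - speed_transducers) / speed_odometer * 100
    print(f"transducer speed ≈ {speed_transducers:.2f} m/s, error ≈ {relative_error:.2f} %")
    # With 0.43 m/s vs 0.45 m/s this gives ≈ 4.44 %, matching the abstract.
    ```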

  9. Protein structure validation and refinement using amide proton chemical shifts derived from quantum mechanics

    DEFF Research Database (Denmark)

    Christensen, Anders Steen; Linnet, Troels Emtekær; Borg, Mikael

    2013-01-01

    We present the ProCS method for the rapid and accurate prediction of protein backbone amide proton chemical shifts - sensitive probes of the geometry of key hydrogen bonds that determine protein structure. ProCS is parameterized against quantum mechanical (QM) calculations and reproduces high level...

  10. MODIS Observation of Aerosols over Southern Africa During SAFARI 2000: Data, Validation, and Estimation of Aerosol Radiative Forcing

    Science.gov (United States)

    Ichoku, Charles; Kaufman, Yoram; Remer, Lorraine; Chu, D. Allen; Mattoo, Shana; Tanre, Didier; Levy, Robert; Li, Rong-Rong; Kleidman, Richard; Lau, William K. M. (Technical Monitor)

    2001-01-01

    Aerosol properties, including optical thickness and size parameters, are retrieved operationally from the MODIS sensor onboard the Terra satellite launched on 18 December 1999. The predominant aerosol type over the Southern African region is smoke, which is generated from biomass burning on land and transported over the southern Atlantic Ocean. The SAFARI-2000 period experienced smoke aerosol emissions from the regular biomass burning activities as well as from the prescribed burns administered under the auspices of the experiment. The MODIS Aerosol Science Team (MAST) formulates and implements strategies for the retrieval of aerosol products from MODIS, as well as for validating and analyzing them in order to estimate aerosol effects in the radiative forcing of climate as accurately as possible. These activities are carried out not only from a global perspective, but also with a focus on specific regions identified as having interesting characteristics, such as the biomass burning phenomenon in southern Africa and the associated smoke aerosol, particulate, and trace gas emissions. Indeed, the SAFARI-2000 aerosol measurements from the ground and from aircraft, along with MODIS, provide excellent data sources for a more intensive validation and a closer study of the aerosol characteristics over Southern Africa. The SAFARI-2000 ground-based measurements of aerosol optical thickness (AOT) from both the automatic Aerosol Robotic Network (AERONET) and handheld Sun photometers have been used to validate MODIS retrievals, based on a sophisticated spatio-temporal technique. The average global monthly distribution of aerosol from MODIS has been combined with other data to calculate the southern African aerosol daily averaged (24 hr) radiative forcing over the ocean for September 2000. It is estimated that on average, for cloud-free conditions over an area of 9 million square km, this predominantly smoke aerosol exerts a forcing of -30 W/square m, close to the terrestrial

  11. Development and Validation of Spectrophotometric Methods for Simultaneous Estimation of Valsartan and Hydrochlorothiazide in Tablet Dosage Form

    Directory of Open Access Journals (Sweden)

    Monika L. Jadhav

    2014-01-01

    Full Text Available Two UV-spectrophotometric methods have been developed and validated for simultaneous estimation of valsartan and hydrochlorothiazide in a tablet dosage form. The first method employed solving of simultaneous equations based on the measurement of absorbance at two wavelengths, 249.4 nm and 272.6 nm, the λmax for valsartan and hydrochlorothiazide, respectively. The second method was the absorbance ratio method, which involves formation of a Q-absorbance equation at 258.4 nm (isoabsorptive point) and at 272.6 nm (λmax of hydrochlorothiazide). The methods were found to be linear over the ranges of 5–30 µg/mL for valsartan and 4–24 μg/mL for hydrochlorothiazide using 0.1 N NaOH as solvent. The mean percentage recovery was found to be 100.20% and 100.19% for the simultaneous equation method and 98.56% and 97.96% for the absorbance ratio method, for valsartan and hydrochlorothiazide, respectively, at three different levels of standard additions. The precision (intraday, interday) of the methods was within limits (RSD < 2%). It could be concluded from the results obtained in the present investigation that the two methods for simultaneous estimation of valsartan and hydrochlorothiazide in tablet dosage form are simple, rapid, accurate, precise and economical and can be used, successfully, in the quality control of pharmaceutical formulations and other routine laboratory analysis.
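
    The simultaneous-equation (Vierordt) method amounts to solving a 2×2 linear system in which the mixture absorbance at each wavelength is the sum of the contributions of the two drugs. The sketch below shows that calculation; the absorptivity values and absorbance readings are hypothetical placeholders, not those determined in the study.

    ```python
    # Vierordt's simultaneous-equation method for a two-component mixture:
    #   A(249.4 nm) = a_val_249 * C_val + a_hctz_249 * C_hctz
    #   A(272.6 nm) = a_val_272 * C_val + a_hctz_272 * C_hctz
    # Solving the 2x2 system gives both concentrations from two absorbance readings.
    import numpy as np

    # Absorptivities (A per µg/mL per cm) -- hypothetical values for illustration only.
    E = np.array([[0.0350, 0.0210],    # row: 249.4 nm  (valsartan, hydrochlorothiazide)
                  [0.0120, 0.0480]])   # row: 272.6 nm

    A = np.array([0.612, 0.558])       # measured mixture absorbances (hypothetical)

    C_val, C_hctz = np.linalg.solve(E, A)
    print(f"valsartan ≈ {C_val:.1f} µg/mL, hydrochlorothiazide ≈ {C_hctz:.1f} µg/mL")
    ```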

  12. Development and Validation of a Calculator for Estimating the Probability of Urinary Tract Infection in Young Febrile Children.

    Science.gov (United States)

    Shaikh, Nader; Hoberman, Alejandro; Hum, Stephanie W; Alberty, Anastasia; Muniz, Gysella; Kurs-Lasky, Marcia; Landsittel, Douglas; Shope, Timothy

    2018-06-01

    Accurately estimating the probability of urinary tract infection (UTI) in febrile preverbal children is necessary to appropriately target testing and treatment. To develop and test a calculator (UTICalc) that can first estimate the probability of UTI based on clinical variables and then update that probability based on laboratory results. Review of electronic medical records of febrile children aged 2 to 23 months who were brought to the emergency department of Children's Hospital of Pittsburgh, Pittsburgh, Pennsylvania. An independent training database comprising 1686 patients brought to the emergency department between January 1, 2007, and April 30, 2013, and a validation database of 384 patients were created. Five multivariable logistic regression models for predicting risk of UTI were trained and tested. The clinical model included only clinical variables; the remaining models incorporated laboratory results. Data analysis was performed between June 18, 2013, and January 12, 2018. Documented temperature of 38°C or higher in children aged 2 months to less than 2 years. With the use of culture-confirmed UTI as the main outcome, cutoffs for high and low UTI risk were identified for each model. The resultant models were incorporated into a calculation tool, UTICalc, which was used to evaluate medical records. A total of 2070 children were included in the study. The training database comprised 1686 children, of whom 1216 (72.1%) were female and 1167 (69.2%) white. The validation database comprised 384 children, of whom 291 (75.8%) were female and 200 (52.1%) white. Compared with the American Academy of Pediatrics algorithm, the clinical model in UTICalc reduced testing by 8.1% (95% CI, 4.2%-12.0%) and decreased the number of UTIs that were missed from 3 cases to none. Compared with empirically treating all children with a leukocyte esterase test result of 1+ or higher, the dipstick model in UTICalc would have reduced the number of treatment delays by 10.6% (95% CI
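
    UTICalc's clinical model is a multivariable logistic regression; the calculator itself is not reproduced here. The following sketch only shows the general form of such a model with scikit-learn on simulated, placeholder predictors (not the study's variables or coefficients).

    ```python
    # Sketch: a clinical-style logistic regression returning a predicted probability of UTI.
    import numpy as np
    from sklearn.linear_model import LogisticRegression

    rng = np.random.default_rng(1)
    n = 1000
    # Placeholder binary clinical predictors (illustrative only).
    X = rng.integers(0, 2, size=(n, 5)).astype(float)
    logit = -3.0 + X @ np.array([0.6, 0.8, 1.1, 0.9, 0.5])
    y = rng.binomial(1, 1 / (1 + np.exp(-logit)))

    clf = LogisticRegression().fit(X, y)

    new_patient = np.array([[1, 1, 1, 1, 0]])
    print("Predicted UTI probability:", round(clf.predict_proba(new_patient)[0, 1], 3))
    ```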

  13. Estimation of nonpaternity in the Mexican population of Nuevo Leon: a validation study with blood group markers.

    Science.gov (United States)

    Cerda-Flores, R M; Barton, S A; Marty-Gonzalez, L F; Rivas, F; Chakraborty, R

    1999-07-01

    A method for estimating the general rate of nonpaternity in a population was validated using phenotype data on seven blood groups (A1A2BO, MNSs, Rh, Duffy, Lutheran, Kidd, and P) on 396 mother, child, and legal father trios from Nuevo León, Mexico. In all, 32 legal fathers were excluded as the possible father based on genetic exclusions at one or more loci (combined average exclusion probability of 0.694 for specific mother-child phenotype pairs). The maximum likelihood estimate of the general nonpaternity rate in the population was 0.118 +/- 0.020. The nonpaternity rates in Nuevo León were also seen to be inversely related with the socioeconomic status of the families, i.e., the highest in the low and the lowest in the high socioeconomic class. We further argue that with the moderately low (69.4%) power of exclusion for these seven blood group systems, the traditional critical values of paternity index (PI > or = 19) were not good indicators of true paternity, since a considerable fraction (307/364) of nonexcluded legal fathers had a paternity index below 19 based on the seven markers. Implications of these results in the context of genetic-epidemiological studies as well as for detection of true fathers for child-support adjudications are discussed, implying the need to employ a battery of genetic markers (possibly DNA-based tests) that yield a higher power of exclusion. We conclude that even though DNA markers are more informative, the probabilistic approach developed here would still be needed to estimate the true rate of nonpaternity in a population or to evaluate the precision of detecting true fathers.
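
    The core of this estimate is that only a fraction of nonpaternity cases are detectable: with an average exclusion probability of 0.694, the observed exclusion rate understates the true nonpaternity rate by roughly that factor. A back-of-the-envelope check reproducing the reported value is shown below; this is a simplified moment-style estimator, whereas the paper's maximum-likelihood treatment also propagates the pair-specific exclusion probabilities.

    ```python
    # Simplified estimate: true nonpaternity rate ≈ observed exclusion rate / P(exclusion).
    import math

    n_trios = 396
    n_excluded = 32
    p_exclusion = 0.694      # combined average exclusion probability of the 7 blood groups

    observed_rate = n_excluded / n_trios
    nonpaternity = observed_rate / p_exclusion

    # Approximate standard error via the binomial variance of the observed rate.
    se = math.sqrt(observed_rate * (1 - observed_rate) / n_trios) / p_exclusion

    print(f"estimated nonpaternity rate ≈ {nonpaternity:.3f} ± {se:.3f}")
    # ≈ 0.116 ± 0.020, consistent with the reported 0.118 ± 0.020.
    ```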

  14. Validating accelerometry estimates of energy expenditure across behaviours using heart rate data in a free-living seabird.

    Science.gov (United States)

    Hicks, Olivia; Burthe, Sarah; Daunt, Francis; Butler, Adam; Bishop, Charles; Green, Jonathan A

    2017-05-15

    Two main techniques have dominated the field of ecological energetics: the heart rate and doubly labelled water methods. Although well established, they are not without their weaknesses, namely expense, intrusiveness and lack of temporal resolution. A new technique has been developed using accelerometers; it uses the overall dynamic body acceleration (ODBA) of an animal as a calibrated proxy for energy expenditure. This method provides high-resolution data without the need for surgery. Significant relationships exist between the rate of oxygen consumption (V̇O2) and ODBA in controlled conditions across a number of taxa; however, it is not known whether ODBA represents a robust proxy for energy expenditure consistently in all natural behaviours and there have been specific questions over its validity during diving, in diving endotherms. Here, we simultaneously deployed accelerometers and heart rate loggers in a wild population of European shags (Phalacrocorax aristotelis). Existing calibration relationships were then used to make behaviour-specific estimates of energy expenditure for each of these two techniques. Compared with heart rate-derived estimates, the ODBA method predicts energy expenditure well during flight and diving behaviour, but overestimates the cost of resting behaviour. We then combined these two datasets to generate a new calibration relationship between ODBA and V̇O2 that accounts for this by being informed by heart rate-derived estimates. Across behaviours we found a good relationship between ODBA and V̇O2. Within individual behaviours, we found useable relationships between ODBA and V̇O2 for flight and resting, and a poor relationship during diving. The error associated with these new calibration relationships mostly originates from the previous heart rate calibration rather than the error associated with the ODBA method. The equations provide tools for understanding how energy constrains ecology across the complex behaviour

  15. Better estimation of protein-DNA interaction parameters improve prediction of functional sites

    Directory of Open Access Journals (Sweden)

    O'Flanagan Ruadhan A

    2008-12-01

    Full Text Available Abstract Background Characterizing transcription factor binding motifs is a common bioinformatics task. For transcription factors with variable binding sites, we need to get many suboptimal binding sites in our training dataset to get accurate estimates of free energy penalties for deviating from the consensus DNA sequence. One procedure to do that involves a modified SELEX (Systematic Evolution of Ligands by Exponential Enrichment) method designed to produce many such sequences. Results We analyzed low stringency SELEX data for E. coli Catabolic Activator Protein (CAP), and we show here that appropriate quantitative analysis improves our ability to predict in vitro affinity. To obtain the large number of sequences required for this analysis we used a SELEX SAGE protocol developed by Roulet et al. The sequences obtained were subjected to bioinformatic analysis. The resulting bioinformatic model characterizes the sequence specificity of the protein more accurately than the sequence specificities predicted from previous analyses that used only a few known binding sites available in the literature. The consequences of this increase in accuracy for prediction of in vivo binding sites (and especially functional ones) in the E. coli genome are also discussed. We measured the dissociation constants of several putative CAP binding sites by EMSA (Electrophoretic Mobility Shift Assay) and compared the affinities to the bioinformatics scores provided by methods like the weight matrix method and QPMEME (Quadratic Programming Method of Energy Matrix Estimation) trained on known binding sites as well as on the new sites from the SELEX SAGE data. We also checked predicted genome sites for conservation in the related species S. typhimurium. We found that bioinformatics scores based on SELEX SAGE data do better in terms of prediction of physical binding energies as well as in detecting functional sites. Conclusion We think that training binding site detection

  16. [Evaluation of the adjusted amino acid score by digestibility for estimating the protein quality and protein available in food and diets].

    Science.gov (United States)

    Pak, N; Vera, G; Araya, H

    1985-03-01

    The purpose of the present study was to evaluate the amino acid score adjusted by digestibility to estimate protein quality and utilizable protein in foods and diets, considering net protein utilization (NPU) as a biological reference method. Ten foods of vegetable origin and ten of animal origin, as well as eight mixtures of foods of vegetable and animal origin were studied. When all the foods were considered, a positive (r = 0.83) and highly significant correlation (p less than 0.001) between NPU and the amino acid score adjusted by digestibility was found. When the foods were separated according to their origin, this correlation was positive only for the foods of vegetable origin (r = 0.93) and statistically significant (p less than 0.001). Also, only in those foods were similar values found between NPU and amino acid score adjusted by digestibility, as well as in utilizable protein estimated considering both methods. Caution is required to interpret protein quality and utilizable protein values of foods of animal origin and mixtures of foods of vegetable and animal origin when the amino acid score method adjusted by digestibility, or NPU, are utilized.
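
    The adjustment evaluated here is the familiar one in which the amino acid (chemical) score is multiplied by true protein digestibility, and utilizable protein is the food's protein content multiplied by that quality estimate (or by NPU). A minimal sketch of the arithmetic, with placeholder values, follows.

    ```python
    # Digestibility-adjusted amino acid score and utilizable protein (placeholder numbers).

    def adjusted_score(limiting_aa_food_mg_g, limiting_aa_reference_mg_g, true_digestibility):
        """Amino acid score of the limiting amino acid, adjusted by true digestibility (0-1)."""
        score = limiting_aa_food_mg_g / limiting_aa_reference_mg_g
        return min(score, 1.0) * true_digestibility

    def utilizable_protein(protein_g_per_100g, quality):
        """Utilizable protein per 100 g of food, given a quality estimate (adjusted score or NPU)."""
        return protein_g_per_100g * quality

    # Example: a legume limited by sulfur amino acids (values are illustrative only).
    q = adjusted_score(limiting_aa_food_mg_g=22.0, limiting_aa_reference_mg_g=25.0,
                       true_digestibility=0.78)
    print("adjusted amino acid score:", round(q, 2))
    print("utilizable protein (g/100 g):", round(utilizable_protein(21.0, q), 1))
    ```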

  17. Near-native protein loop sampling using nonparametric density estimation accommodating sparcity.

    Science.gov (United States)

    Joo, Hyun; Chavan, Archana G; Day, Ryan; Lennox, Kristin P; Sukhanov, Paul; Dahl, David B; Vannucci, Marina; Tsai, Jerry

    2011-10-01

    Unlike the core structural elements of a protein like regular secondary structure, template based modeling (TBM) has difficulty with loop regions due to their variability in sequence and structure as well as the sparse sampling from a limited number of homologous templates. We present a novel, knowledge-based method for loop sampling that leverages homologous torsion angle information to estimate a continuous joint backbone dihedral angle density at each loop position. The φ,ψ distributions are estimated via a Dirichlet process mixture of hidden Markov models (DPM-HMM). Models are quickly generated based on samples from these distributions and were enriched using an end-to-end distance filter. The performance of the DPM-HMM method was evaluated against a diverse test set in a leave-one-out approach. Candidates as low as 0.45 Å RMSD and with a worst case of 3.66 Å were produced. For the canonical loops like the immunoglobulin complementarity-determining regions (mean RMSD 7.0 Å), this sampling method produces a population of loop structures to around 3.66 Å for loops up to 17 residues. In a direct test of sampling to the Loopy algorithm, our method demonstrates the ability to sample nearer native structures for both the canonical CDRH1 and non-canonical CDRH3 loops. Lastly, in the realistic test conditions of the CASP9 experiment, successful application of DPM-HMM for 90 loops from 45 TBM targets shows the general applicability of our sampling method in loop modeling problem. These results demonstrate that our DPM-HMM produces an advantage by consistently sampling near native loop structure. The software used in this analysis is available for download at http://www.stat.tamu.edu/~dahl/software/cortorgles/.
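
    The DPM-HMM machinery itself is beyond a short example, but the basic idea of drawing backbone (φ, ψ) angles from an estimated continuous joint density can be illustrated with a plain Gaussian mixture, as sketched below. This is an illustration of density-based sampling only, not the authors' model; the component means, covariances and weights are invented placeholders.

    ```python
    # Illustration only: sample (phi, psi) pairs from a small 2-component Gaussian mixture,
    # standing in for the per-position joint dihedral density a DPM-HMM would estimate.
    import numpy as np

    rng = np.random.default_rng(42)

    # Mixture components roughly centred on alpha-helical and beta-sheet regions (degrees).
    means = np.array([[-63.0, -43.0], [-120.0, 130.0]])
    covs = np.array([[[100.0, 0.0], [0.0, 100.0]],
                     [[225.0, 0.0], [0.0, 225.0]]])
    weights = np.array([0.6, 0.4])

    def sample_phi_psi(n_samples: int) -> np.ndarray:
        comps = rng.choice(len(weights), size=n_samples, p=weights)
        return np.array([rng.multivariate_normal(means[c], covs[c]) for c in comps])

    for phi, psi in sample_phi_psi(5):
        print(f"phi = {phi:7.1f}, psi = {psi:7.1f}")
    ```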

  18. Near-native protein loop sampling using nonparametric density estimation accommodating sparcity.

    Directory of Open Access Journals (Sweden)

    Hyun Joo

    2011-10-01

    Full Text Available Unlike the core structural elements of a protein like regular secondary structure, template based modeling (TBM) has difficulty with loop regions due to their variability in sequence and structure as well as the sparse sampling from a limited number of homologous templates. We present a novel, knowledge-based method for loop sampling that leverages homologous torsion angle information to estimate a continuous joint backbone dihedral angle density at each loop position. The φ,ψ distributions are estimated via a Dirichlet process mixture of hidden Markov models (DPM-HMM). Models are quickly generated based on samples from these distributions and were enriched using an end-to-end distance filter. The performance of the DPM-HMM method was evaluated against a diverse test set in a leave-one-out approach. Candidates as low as 0.45 Å RMSD and with a worst case of 3.66 Å were produced. For the canonical loops like the immunoglobulin complementarity-determining regions (mean RMSD 7.0 Å), this sampling method produces a population of loop structures to around 3.66 Å for loops up to 17 residues. In a direct test of sampling to the Loopy algorithm, our method demonstrates the ability to sample nearer native structures for both the canonical CDRH1 and non-canonical CDRH3 loops. Lastly, in the realistic test conditions of the CASP9 experiment, successful application of DPM-HMM for 90 loops from 45 TBM targets shows the general applicability of our sampling method in loop modeling problem. These results demonstrate that our DPM-HMM produces an advantage by consistently sampling near native loop structure. The software used in this analysis is available for download at http://www.stat.tamu.edu/~dahl/software/cortorgles/.

  19. Near-Native Protein Loop Sampling Using Nonparametric Density Estimation Accommodating Sparcity

    Science.gov (United States)

    Day, Ryan; Lennox, Kristin P.; Sukhanov, Paul; Dahl, David B.; Vannucci, Marina; Tsai, Jerry

    2011-01-01

    Unlike the core structural elements of a protein like regular secondary structure, template based modeling (TBM) has difficulty with loop regions due to their variability in sequence and structure as well as the sparse sampling from a limited number of homologous templates. We present a novel, knowledge-based method for loop sampling that leverages homologous torsion angle information to estimate a continuous joint backbone dihedral angle density at each loop position. The φ,ψ distributions are estimated via a Dirichlet process mixture of hidden Markov models (DPM-HMM). Models are quickly generated based on samples from these distributions and were enriched using an end-to-end distance filter. The performance of the DPM-HMM method was evaluated against a diverse test set in a leave-one-out approach. Candidates as low as 0.45 Å RMSD and with a worst case of 3.66 Å were produced. For the canonical loops like the immunoglobulin complementarity-determining regions (mean RMSD 7.0 Å), this sampling method produces a population of loop structures to around 3.66 Å for loops up to 17 residues. In a direct test of sampling to the Loopy algorithm, our method demonstrates the ability to sample nearer native structures for both the canonical CDRH1 and non-canonical CDRH3 loops. Lastly, in the realistic test conditions of the CASP9 experiment, successful application of DPM-HMM for 90 loops from 45 TBM targets shows the general applicability of our sampling method in loop modeling problem. These results demonstrate that our DPM-HMM produces an advantage by consistently sampling near native loop structure. The software used in this analysis is available for download at http://www.stat.tamu.edu/~dahl/software/cortorgles/. PMID:22028638

  20. Estimation of rumen microbial protein production from urinary purine derivatives in zebu cattle and water buffalo

    International Nuclear Information System (INIS)

    Liang, J.B.; Pimpa, O.; Abdullah, N.; Jelan, Z.A.; Nolan, J.V.

    1999-01-01

    Two experiments were conducted in order to develop equations for predicting rumen microbial protein production for indigenous Kedah-Kelantan (KK) cattle and swamp buffaloes in Malaysia, using urinary purine derivatives (PD) excretion rates. Endogenous PD excretion rates determined by a fasting procedure for KK cattle and swamp buffalo were 275 and 370 μmol/kg W0.75/day, respectively. Urinary PD excretion rate per kg digestible organic matter intake (DOMI) for KK cattle was higher than that for swamp buffalo, reconfirming the earlier findings. Glomerular filtration rate, allantoin and uric acid tubular load and PD re-absorption rate for swamp buffalo were generally higher than those for KK cattle. However, due to the large variations among animals within species, these parameters were not significantly different between species. Nevertheless, the higher PD reabsorption in swamp buffalo provides support for the earlier postulation that the lower urinary PD excretion rate of swamp buffalo was due to their higher recycling of plasma PD as compared to KK cattle. Labelled 8-14C uric acid was used to estimate the ratio of renal to non-renal PD excretion. The recovery rates of the radioactive tracer via the renal route for both species were much lower than values reported previously for unlabelled PD for European cattle. (author)
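
    The prediction rests on subtracting an endogenous PD contribution (scaled by metabolic body weight) from total urinary PD excretion before converting the remainder to microbial purine absorption. A minimal sketch of that first step, using the endogenous values reported above and a placeholder total excretion, is shown below; the subsequent conversion to microbial protein requires species-specific recovery coefficients not given in the abstract.

    ```python
    # Estimate PD of microbial origin by subtracting endogenous excretion (illustrative values).

    ENDOGENOUS_PD = {"kk_cattle": 275.0, "swamp_buffalo": 370.0}   # µmol per kg W^0.75 per day

    def microbial_pd(total_pd_mmol_per_day: float, live_weight_kg: float, species: str) -> float:
        """Urinary PD attributable to microbial purines, in mmol/day."""
        endogenous_mmol = ENDOGENOUS_PD[species] * live_weight_kg ** 0.75 / 1000.0
        return total_pd_mmol_per_day - endogenous_mmol

    # Hypothetical animal: a 350 kg KK cow excreting 60 mmol PD/day in total.
    print(round(microbial_pd(60.0, 350.0, "kk_cattle"), 1), "mmol PD/day of microbial origin")
    ```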

  1. Identifying Potential Protein Targets for Toluene Using a Molecular Similarity Search, in Silico Docking and in Vitro Validation

    Science.gov (United States)

    2015-01-01

    [Only fragments of the abstract are recoverable from the source PDF:] ... performed under standard conditions. Analysis of purified hemoglobin using SDS and native polyacrylamide gel electrophoresis (PAGE) indicated that the ... search of T3DB. They represent several families of proteins (calcium-transporting ATPases, sodium/potassium-transporting ATPase, cytochrome P450 ...)

  2. Validation and Refinement of Prediction Models to Estimate Exercise Capacity in Cancer Survivors Using the Steep Ramp Test.

    Science.gov (United States)

    Stuiver, Martijn M; Kampshoff, Caroline S; Persoon, Saskia; Groen, Wim; van Mechelen, Willem; Chinapaw, Mai J M; Brug, Johannes; Nollet, Frans; Kersten, Marie-José; Schep, Goof; Buffart, Laurien M

    2017-11-01

    To further test the validity and clinical usefulness of the steep ramp test (SRT) in estimating exercise tolerance in cancer survivors by external validation and extension of previously published prediction models for peak oxygen consumption (Vo2peak) and peak power output (Wpeak). Cross-sectional study. Multicenter. Cancer survivors (N=283) in 2 randomized controlled exercise trials. Not applicable. Prediction model accuracy was assessed by intraclass correlation coefficients (ICCs) and limits of agreement (LOA). Multiple linear regression was used for model extension. Clinical performance was judged by the percentage of accurate endurance exercise prescriptions. ICCs of SRT-predicted Vo2peak and Wpeak with these values as obtained by the cardiopulmonary exercise test were 0.61 and 0.73, respectively, using the previously published prediction models. 95% LOA were ±705 mL/min with a bias of 190 mL/min for Vo2peak and ±59 W with a bias of 5 W for Wpeak. Modest improvements were obtained by adding body weight and sex to the regression equation for the prediction of Vo2peak (ICC, 0.73; 95% LOA, ±608 mL/min) and by adding age, height, and sex for the prediction of Wpeak (ICC, 0.81; 95% LOA, ±48 W). Accuracy of endurance exercise prescription improved from 57% accurate prescriptions to 68% accurate prescriptions with the new prediction model for Wpeak. Predictions of Vo2peak and Wpeak based on the SRT are adequate at the group level, but insufficiently accurate in individual patients. The multivariable prediction model for Wpeak can be used cautiously (e.g., supplemented with a Borg score) to aid endurance exercise prescription. Copyright © 2017 American Congress of Rehabilitation Medicine. Published by Elsevier Inc. All rights reserved.
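
    The agreement statistics quoted above (bias and 95% limits of agreement) follow the standard Bland-Altman calculation: the bias is the mean difference between SRT-predicted and measured values, and the limits are bias ± 1.96 SD of the differences. A generic sketch with placeholder data is given below.

    ```python
    # Bland-Altman style bias and 95% limits of agreement for predicted vs measured values.
    import numpy as np

    def limits_of_agreement(predicted: np.ndarray, measured: np.ndarray):
        diff = predicted - measured
        bias = diff.mean()
        half_width = 1.96 * diff.std(ddof=1)
        return bias, bias - half_width, bias + half_width

    # Placeholder VO2peak values (mL/min), for illustration only.
    measured  = np.array([2100., 2450., 1980., 2700., 2300., 2550.])
    predicted = np.array([2240., 2380., 2150., 2620., 2490., 2500.])

    bias, lower, upper = limits_of_agreement(predicted, measured)
    print(f"bias = {bias:.0f} mL/min, 95% LOA = [{lower:.0f}, {upper:.0f}] mL/min")
    ```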

  3. Validating the use of 137Cs and 210Pbex measurements to estimate rates of soil loss from cultivated land in southern Italy

    International Nuclear Information System (INIS)

    Porto, Paolo; Walling, Des E.

    2012-01-01

    Soil erosion represents an important threat to the long-term sustainability of agriculture and forestry in many areas of the world, including southern Italy. Numerous models and prediction procedures have been developed to estimate rates of soil loss and soil redistribution, based on the local topography, hydrometeorology, soil type and land management. However, there remains an important need for empirical measurements to provide a basis for validating and calibrating such models and prediction procedures as well as to support specific investigations and experiments. In this context, erosion plots provide useful information on gross rates of soil loss, but are unable to document the efficiency of the onward transfer of the eroded sediment within a field and towards the stream system, and thus net rates of soil loss from larger areas. The use of environmental radionuclides, particularly caesium-137 (137Cs) and excess lead-210 (210Pbex), as a means of estimating rates of soil erosion and deposition has attracted increasing attention in recent years and the approach has now been recognised as possessing several important advantages. In order to provide further confirmation of the validity of the estimates of longer-term erosion and soil redistribution rates provided by 137Cs and 210Pbex measurements, there is a need for studies aimed explicitly at validating the results obtained. In this context, the authors directed attention to the potential offered by a set of small erosion plots located near Reggio Calabria in southern Italy, for validating estimates of soil loss provided by 137Cs and 210Pbex measurements. A preliminary assessment suggested that, notwithstanding the limitations and constraints involved, a worthwhile investigation aimed at validating the use of 137Cs and 210Pbex measurements to estimate rates of soil loss from cultivated land could be undertaken. The results demonstrate a close consistency between the measured rates of soil loss and

  4. A Critical Review of Validation, Blind Testing, and Real-World Use of Alchemical Protein-Ligand Binding Free Energy Calculations.

    Science.gov (United States)

    Abel, Robert; Wang, Lingle; Mobley, David L; Friesner, Richard A

    2017-01-01

    Protein-ligand binding is among the most fundamental phenomena underlying all molecular biology, and a greater ability to more accurately and robustly predict the binding free energy of a small molecule ligand for its cognate protein is expected to have vast consequences for improving the efficiency of pharmaceutical drug discovery. We briefly reviewed a number of scientific and technical advances that have enabled alchemical free energy calculations to recently emerge as a preferred approach, and critically considered proper validation and effective use of these techniques. In particular, we characterized a selection bias effect which may be important in prospective free energy calculations, and introduced a strategy to improve the accuracy of the free energy predictions. Copyright© Bentham Science Publishers; For any queries, please email at epub@benthamscience.org.

  5. Reliability and validity of the Turkish version of the Rapid Estimate of Adult Literacy in Dentistry (TREALD-30).

    Science.gov (United States)

    Peker, Kadriye; Köse, Taha Emre; Güray, Beliz; Uysal, Ömer; Erdem, Tamer Lütfi

    2017-04-01

    To culturally adapt the Turkish version of the Rapid Estimate of Adult Literacy in Dentistry (TREALD-30) for Turkish-speaking adult dental patients and to evaluate its psychometric properties. After translation and cross-cultural adaptation, TREALD-30 was tested in a sample of 127 adult patients who attended a dental school clinic in Istanbul. Data were collected through clinical examinations and self-completed questionnaires, including TREALD-30, the Oral Health Impact Profile (OHIP), the Rapid Estimate of Adult Literacy in Medicine (REALM), two health literacy screening questions, and socio-behavioral characteristics. Psychometric properties were examined using Classical Test Theory (CTT) and Rasch analysis. Internal consistency (Cronbach's Alpha = 0.91) and test-retest reliability (Intraclass correlation coefficient = 0.99) were satisfactory for TREALD-30. It exhibited good convergent and predictive validity. Monthly family income, years of education, dental flossing, health literacy, and health literacy skills were found to be stronger predictors of patients' oral health literacy (OHL). Confirmatory factor analysis (CFA) confirmed a two-factor model. The Rasch model explained 37.9% of the total variance in this dataset. In addition, TREALD-30 had eleven misfitting items, which indicated evidence of multidimensionality. The reliability indices provided in Rasch analysis (person separation reliability = 0.91 and expected-a-posteriori/plausible reliability = 0.94) indicated that TREALD-30 had acceptable reliability. TREALD-30 showed satisfactory psychometric properties. It may be used to identify patients with low OHL. Socio-demographic factors, oral health behaviors and health literacy skills should be taken into account when planning future studies to assess the OHL in both clinical and community settings.

  6. Phytochemical investigation and simultaneous estimation of bioactive lupeol and stigmasterol in Abutilon indicum by validated HPTLC method

    Directory of Open Access Journals (Sweden)

    Md. Sarfaraj Hussain

    2014-05-01

    Full Text Available Objective: To perform a simultaneous quantitative estimation of the biologically active triterpenoid compound lupeol and the steroid compound stigmasterol in Abutilon indicum (A. indicum) using high-performance thin-layer chromatography (HPTLC). Methods: TLC aluminum plates precoated with silica-gel 60 F254 (20 cm × 10 cm) were used with a mobile phase of toluene-methanol-formic acid (7.0:2.7:0.3, v/v/v), and densitometric determination of these compounds was carried out at 530 nm in reflectance/absorbance mode. Results: Compact bands for lupeol and stigmasterol were obtained at Rf 0.52±0.02 and 0.28±0.05. The limit of detection (45 and 18 ng/band), limit of quantification (135 and 54 ng/band), recovery (98.2%-99.7% and 97.2%-99.6%) and precision (≤2.18 and 1.91) were satisfactory for lupeol and stigmasterol, respectively. Linearity ranges for lupeol and stigmasterol were 100-1000 ng/band (r2 = 0.9994) and 50-500 ng/band (r2 = 0.9941), and the contents were estimated as (0.59±0.10)% and (0.83±0.10)% w/w, respectively. The total phenolic, flavonoid, proanthocyanidin, alkaloidal and saponin contents of the methanolic extract of A. indicum were also measured in this work. According to International Conference on Harmonization (ICH) guidelines, the method was validated for linearity, precision, accuracy, recovery, limit of detection, limit of quantification, specificity, and robustness. Conclusions: The HPTLC method was found to be reproducible, accurate, and precise and could detect these two compounds at the nanogram level in A. indicum.

  7. Prediction of glutathionylation sites in proteins using minimal sequence information and their experimental validation.

    Science.gov (United States)

    Pal, Debojyoti; Sharma, Deepak; Kumar, Mukesh; Sandur, Santosh K

    2016-09-01

    S-glutathionylation of proteins plays an important role in various biological processes and is known to be a protective modification during oxidative stress. Since experimental detection of S-glutathionylation is labor intensive and time consuming, a bioinformatics-based approach is a viable alternative. Available methods require relatively long sequence information, which may prevent prediction if sequence information is incomplete. Here, we present a model to predict glutathionylation sites from pentapeptide sequences. It is based upon the differential association of amino acids with glutathionylated and non-glutathionylated cysteines in a database of experimentally verified sequences. These data were used to calculate position-dependent F-scores, which measure how a particular amino acid at a particular position may affect the likelihood of a glutathionylation event. A glutathionylation score (G-score), indicating the propensity of a sequence to undergo glutathionylation, was calculated using the position-dependent F-scores for each amino acid. Cut-off values were used for prediction. Our model returned an accuracy of 58% with a Matthews correlation coefficient (MCC) value of 0.165. On an independent dataset, our model outperformed the currently available model, in spite of needing much less sequence information. Pentapeptide motifs with high abundance among glutathionylated proteins were identified. A list of potential glutathionylation hotspot sequences was obtained by assigning G-scores, and subsequent Protein-BLAST analysis revealed a total of 254 putative glutathionylatable proteins, a number of which were already known to be glutathionylated. Our model predicted glutathionylation sites in 93.93% of experimentally verified glutathionylated proteins. The outcome of this study may assist in discovering novel glutathionylation sites and finding candidate proteins for glutathionylation.
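
    As described, the G-score of a pentapeptide (with the candidate cysteine at the centre) is assembled by summing position-dependent scores for each residue and comparing the total with a cutoff. The values in the sketch below are hypothetical placeholders standing in for the F-score-derived table, shown only to illustrate the scoring and decision step.

    ```python
    # Illustrative G-score: sum position-dependent amino-acid scores over a pentapeptide
    # centred on the candidate cysteine, then compare with a cutoff. All numbers are
    # hypothetical placeholders, not the published F-score table.

    POSITION_SCORES = {
        -2: {"K": 0.4, "R": 0.3, "L": -0.2},
        -1: {"K": 0.5, "E": -0.3, "G": 0.1},
         0: {"C": 0.0},                      # the candidate cysteine itself
         1: {"P": 0.6, "D": -0.4, "S": 0.2},
         2: {"K": 0.3, "F": -0.1, "T": 0.1},
    }
    CUTOFF = 0.5   # hypothetical decision threshold

    def g_score(pentapeptide: str) -> float:
        assert len(pentapeptide) == 5 and pentapeptide[2] == "C"
        return sum(POSITION_SCORES[pos].get(aa, 0.0)
                   for pos, aa in zip(range(-2, 3), pentapeptide))

    for pep in ("KKCPK", "LECDF"):
        score = g_score(pep)
        print(pep, round(score, 2),
              "predicted glutathionylated" if score >= CUTOFF else "not predicted")
    ```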

  8. Clinical utility of spot urine protein-to-creatinine ratio modified by estimated daily creatinine excretion in children.

    Science.gov (United States)

    Yang, Eun Mi; Yoon, Bo Ae; Kim, Soo Wan; Kim, Chan Jong

    2017-06-01

    The spot urine protein-to-creatinine ratio (UPCR) is widely used to predict 24-h urine protein (24-h UP) excretion. In patients with low daily urine creatinine excretion (UCr), however, the UPCR may overestimate 24-h UP. The aim of this study was to predict 24-h UP using UPCR adjusted by estimated 24-h UCr in children. This study included 442 children whose 24-h UP and spot UPCR were measured concomitantly. Estimated 24-h UCr was calculated using three previously existing equations. We estimated the 24-h UP excretion from UPCR by multiplying the estimated UCr. The results were compared with the measured 24-h UP. There was a strong correlation between UPCR and 24-h UP (r = 0.801, P < 0.001), and the correlation improved after multiplying the UPCR by the measured UCr (r = 0.847, P < 0.001). Using the estimated UCr rather than the measured UCr, there was high accuracy and strong correlation between the estimated UPCR weighted by the Cockcroft-Gault equation and 24-h UP. Improvement was also observed in the subgroup (proteinuria vs. non-proteinuria) analysis, particularly in the proteinuria group. The spot UPCR multiplied by the estimated UCr improved the accuracy of prediction of the 24-h UP in children.
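
    The correction described is a simple rescaling: instead of assuming a fixed daily creatinine output, the spot protein-to-creatinine ratio is multiplied by an estimate of the child's actual 24-h creatinine excretion. The sketch below shows that arithmetic; the estimated creatinine excretion is treated as an input from whichever published equation is chosen (the specific paediatric formulas are not reproduced here).

    ```python
    # Estimated 24-h urine protein from a spot urine protein-to-creatinine ratio (UPCR),
    # rescaled by an estimate of daily creatinine excretion.

    def estimated_24h_protein(upcr_mg_per_mg: float, estimated_ucr_g_per_day: float) -> float:
        """Return the estimated 24-h urine protein in mg/day.

        upcr_mg_per_mg: spot urine protein (mg/dL) divided by urine creatinine (mg/dL).
        estimated_ucr_g_per_day: estimated 24-h urinary creatinine excretion (g/day),
        assumed to come from a published equation (e.g. a Cockcroft-Gault-based estimate).
        """
        return upcr_mg_per_mg * estimated_ucr_g_per_day * 1000.0   # g -> mg of creatinine

    # Hypothetical child: UPCR of 1.8 mg/mg and an estimated UCr of 0.40 g/day.
    print(round(estimated_24h_protein(1.8, 0.40)), "mg/day")   # -> 720 mg/day
    ```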

  9. Numerical experiment to estimate the validity of negative ion diagnostic using photo-detachment combined with Langmuir probing

    Energy Technology Data Exchange (ETDEWEB)

    Oudini, N. [Laboratoire des plasmas de décharges, Centre de Développement des Technologies Avancées, Cité du 20 Aout BP 17 Baba Hassen, 16081 Algiers (Algeria); Sirse, N.; Ellingboe, A. R. [Plasma Research Laboratory, School of Physical Sciences and NCPST, Dublin City University, Dublin 9 (Ireland); Benallal, R. [Unité de Recherche Matériaux et Energies Renouvelables, BP 119, Université Abou Bekr Belkaïd, Tlemcen 13000 (Algeria); Taccogna, F. [Istituto di Metodologie Inorganiche e di Plasmi, CNR, via Amendola 122/D, 70126 Bari (Italy); Aanesland, A. [Laboratoire de Physique des Plasmas, (CNRS, Ecole Polytechnique, Sorbonne Universités, UPMC Univ Paris 06, Univ Paris-Sud), École Polytechnique, 91128 Palaiseau Cedex (France); Bendib, A. [Laboratoire d' Electronique Quantique, Faculté de Physique, USTHB, El Alia BP 32, Bab Ezzouar, 16111 Algiers (Algeria)

    2015-07-15

    This paper presents a critical assessment of the theory of the photo-detachment diagnostic method used to probe the negative ion density and electronegativity α = n-/ne. In this method, a laser pulse is used to photo-detach all negative ions located within the electropositive channel (laser spot region). The negative ion density is estimated based on the assumption that the increase of the current collected by an electrostatic probe biased positively to the plasma results only from the creation of photo-detached electrons. In parallel, the background electron density and temperature are considered constant during this diagnostic, whereas the numerical experiments performed here show that the background electron density and temperature increase due to the formation of an electrostatic potential barrier around the electropositive channel. The time scale of the potential barrier rise is about 2 ns, which is comparable to the time required to completely photo-detach the negative ions in the electropositive channel (∼3 ns). We find that neglecting the effect of the potential barrier on the background plasma leads to an erroneous determination of the negative ion density. Moreover, the background electron velocity distribution function within the electropositive channel is not Maxwellian. This is due to the acceleration of these electrons through the electrostatic potential barrier. In this work, the validity of the photo-detachment diagnostic assumptions is questioned and our results illustrate the weakness of these assumptions.

  10. Validity of the tritiated thymidine method for estimating bacterial growth rates: measurement of isotope dilution during DNA synthesis

    International Nuclear Information System (INIS)

    Pollard, P.C.; Moriarty, D.J.W.

    1984-01-01

    The rate of tritiated thymidine incorporation into DNA was used to estimate bacterial growth rates in aquatic environments. To be accurate, the calculation of growth rates has to include a factor for the dilution of isotope before incorporation. The validity of an isotope dilution analysis to determine this factor was verified in the experiments reported here with cultures of a marine bacterium growing in a chemostat. Growth rates calculated from data on chemostat dilution rates and cell density agreed well with rates calculated by tritiated thymidine incorporation into DNA and isotope dilution analysis. With sufficiently high concentrations of exogenous thymidine, de novo synthesis of deoxythymidine monophosphate was inhibited, thereby preventing the endogenous dilution of isotope. The thymidine technique was also shown to be useful for measuring growth rates of mixed suspensions of bacteria growing anaerobically. Thymidine was incorporated into the DNA of a range of marine pseudomonads that were investigated. Three species did not take up thymidine. The common marine cyanobacterium Synechococcus species did not incorporate thymidine into DNA

  11. Discovery and validation of protein abundance differences between follicular thyroid neoplasms.

    NARCIS (Netherlands)

    Netea-Maier, R.T.; Hunsucker, S.W.; Hoevenaars, B.M.; Helmke, S.M.; Slootweg, P.J.; Hermus, A.R.M.M.; Haugen, B.R.; Duncan, M.W.

    2008-01-01

    Distinguishing between benign follicular thyroid adenoma (FTA) and malignant follicular thyroid carcinoma (FTC) by cytologic features alone is not possible. Molecular markers may aid distinguishing FTA from FTC in patients with indeterminate cytology. The aim of this study is to define protein

  12. A high confidence, manually validated human blood plasma protein reference set

    DEFF Research Database (Denmark)

    Schenk, Susann; Schoenhals, Gary J; de Souza, Gustavo

    2008-01-01

    BACKGROUND: The immense diagnostic potential of human plasma has prompted great interest and effort in cataloging its contents, exemplified by the Human Proteome Organization (HUPO) Plasma Proteome Project (PPP) pilot project. Due to challenges in obtaining a reliable blood plasma protein list......-trap-Fourier transform (LTQ-FT) and a linear ion trap-Orbitrap (LTQ-Orbitrap) for mass spectrometry (MS) analysis. Both instruments allow the measurement of peptide masses in the low ppm range. Furthermore, we employed a statistical score that allows database peptide identification searching using the products of two...... consecutive stages of tandem mass spectrometry (MS3). The combination of MS3 with very high mass accuracy in the parent peptide allows peptide identification with orders of magnitude more confidence than that typically achieved. RESULTS: Herein we established a high confidence set of 697 blood plasma proteins...

  13. Are Visceral Proteins Valid Markers for Nutritional Status in the Burn Intensive Care Unit?

    Science.gov (United States)

    2015-05-01

    [Only fragments of the abstract are recoverable from the source PDF:] ... (serum CRP, haptoglobin, and α-1-antitrypsin) were measured weekly. Serum creatinine was measured daily. Urinary urea nitrogen (UUN) was measured weekly ... but its effects on acute-phase reactant and visceral protein metabolism are not known. The insulin infusion may have decreased the urinary nitrogen ...

  14. Validation of a commercially available indirect ELISA using a nucleocapsid recombinant protein for detection of Schmallenberg virus antibodies.

    Directory of Open Access Journals (Sweden)

    Emmanuel Bréard

    Full Text Available A newly developed enzyme-linked immunosorbent assay (ELISA) based on the recombinant nucleocapsid protein (N) of Schmallenberg virus (SBV) was evaluated and validated for the detection of SBV-specific IgG antibodies in ruminant sera by three European Reference Laboratories. Validation data sets derived from sheep, goat and bovine sera collected in France and Germany (n = 1515) in 2011 and 2012 were categorized according to the results of a virus neutralization test (VNT) or an indirect immunofluorescence assay (IFA). The specificity was evaluated with 1364 sheep, goat and bovine sera collected in France and Belgium before 2009. Overall agreement between VNT and ELISA was 98.9%, and 98.3% between VNT and IFA, indicating very good concordance between the different techniques. Although cross-reactions with other orthobunyaviruses of the Simbu serogroup might occur, this is a highly sensitive, specific and robust ELISA validated to detect anti-SBV antibodies. The test can be applied for SBV sero-diagnostics and disease-surveillance studies in ruminant species in Europe.

  15. Validation of the manufacturing process used to produce long-acting recombinant factor IX Fc fusion protein.

    Science.gov (United States)

    McCue, J; Osborne, D; Dumont, J; Peters, R; Mei, B; Pierce, G F; Kobayashi, K; Euwart, D

    2014-07-01

    Recombinant factor IX Fc (rFIXFc) fusion protein is the first of a new class of bioengineered long-acting factors approved for the treatment and prevention of bleeding episodes in haemophilia B. The aim of this work was to describe the manufacturing process for rFIXFc, to assess product quality and to evaluate the capacity of the process to remove impurities and viruses. This manufacturing process utilized a transferable and scalable platform approach established for therapeutic antibody manufacturing and adapted for production of the rFIXFc molecule. rFIXFc was produced using a process free of human- and animal-derived raw materials and a host cell line derived from human embryonic kidney (HEK) 293H cells. The process employed multi-step purification and viral clearance processing, including use of a protein A affinity capture chromatography step, which binds to the Fc portion of the rFIXFc molecule with high affinity and specificity, and a 15 nm pore size virus removal nanofilter. Process validation studies were performed to evaluate identity, purity, activity and safety. The manufacturing process produced rFIXFc with consistent product quality and high purity. Impurity clearance validation studies demonstrated robust and reproducible removal of process-related impurities and adventitious viruses. The rFIXFc manufacturing process produces a highly pure product, free of non-human glycan structures. Validation studies demonstrate that this product is produced with consistent quality and purity. In addition, the scalability and transferability of this process are key attributes to ensure consistent and continuous supply of rFIXFc. © 2014 The Authors. Haemophilia Published by John Wiley & Sons Ltd.

  16. Field-level validation of a CLIMEX model for Cactoblastis cactorum (Lepidoptera: Pyralidae) using estimated larval growth rates.

    Science.gov (United States)

    Legaspi, Benjamin C; Legaspi, Jesusa Crisostomo

    2010-04-01

    Invasive pests, such as the cactus moth, Cactoblastis cactorum (Berg) (Lepidoptera: Pyralidae), have not reached equilibrium distributions and present unique opportunities to validate models by comparing predicted distributions with eventual realized geographic ranges. A CLIMEX model was developed for C. cactorum. Model validation was attempted at the global scale by comparing worldwide distribution against known occurrence records and at the field scale by comparing CLIMEX "growth indices" against field measurements of larval growth. Globally, CLIMEX predicted limited potential distribution in North America (from the Caribbean Islands to Florida, Texas, and Mexico), Africa (South Africa and parts of the eastern coast), southern India, parts of Southeast Asia, and the northeastern coast of Australia. Actual records indicate the moth has been found in the Caribbean (Antigua, Barbuda, Montserrat Saint Kitts and Nevis, Cayman Islands, and U.S. Virgin Islands), Cuba, Bahamas, Puerto Rico, southern Africa, Kenya, Mexico, and Australia. However, the model did not predict that distribution would extend from India to the west into Pakistan. In the United States, comparison of the predicted and actual distribution patterns suggests that the moth may be close to its predicted northern range along the Atlantic coast. Parts of Texas and most of Mexico may be vulnerable to geographic range expansion of C. cactorum. Larval growth rates in the field were estimated by measuring differences in head capsules and body lengths of larval cohorts at weekly intervals. Growth indices plotted against measures of larval growth rates compared poorly when CLIMEX was run using the default historical weather data. CLIMEX predicted a single period conducive to insect development, in contrast to the three generations observed in the field. Only time and more complete records will tell whether C. cactorum will extend its geographical distribution to regions predicted by the CLIMEX model. In terms

  17. Validation of 14-3-3 Protein as a Marker in Sporadic Creutzfeldt-Jakob Disease Diagnostic.

    Science.gov (United States)

    Schmitz, Matthias; Ebert, Elisabeth; Stoeck, Katharina; Karch, André; Collins, Steven; Calero, Miguel; Sklaviadis, Theodor; Laplanche, Jean-Louis; Golanska, Ewa; Baldeiras, Ines; Satoh, Katsuya; Sanchez-Valle, Raquel; Ladogana, Anna; Skinningsrud, Anders; Hammarin, Anna-Lena; Mitrova, Eva; Llorens, Franc; Kim, Yong Sun; Green, Alison; Zerr, Inga

    2016-05-01

    At present, the testing of 14-3-3 protein in cerebrospinal fluid (CSF) is a standard biomarker test in suspected sporadic Creutzfeldt-Jakob disease (sCJD) diagnosis. Increasing numbers of 14-3-3 test referrals to CJD reference laboratories in recent years have created an urgent need to improve established 14-3-3 test methods. The main result of our study was the validation of a commercially available 14-3-3 ELISA, alongside the commonly used Western blot method, as a high-throughput screening test. Hereby, 14-3-3 protein expression was quantitatively analyzed in CSF of 231 sCJD and 2035 control patients. We obtained excellent sensitivity and specificity values of 88 and 96%, comparable to the established Western blot method. Since standard protocols and preanalytical sample handling have become more important in routine diagnostics, we investigated in a further step the reproducibility and stability of 14-3-3 as a biomarker for human prion diseases. Ring trial data from 2009 to 2013 revealed an increase of Fleiss' kappa from 0.51 to 0.68, indicating improving reliability of 14-3-3 protein detection. The stability of 14-3-3 protein under short-term and long-term storage conditions at various temperatures and after repeated freezing/thawing cycles was confirmed. Contamination of CSF samples with blood appears to become an important factor at concentrations above 2500 erythrocytes/μL. Hemolysis of erythrocytes, with significant release of 14-3-3 protein, started after 2 days at room temperature. We define, for the first time, clear standards for sample handling, short- and long-term storage of CSF samples, and the handling of blood-contaminated samples, which may otherwise result in artificially elevated CSF levels of 14-3-3.

  18. Validity of a food frequency questionnaire to estimate long-chain polyunsaturated fatty acid intake among Japanese women in early and late pregnancy.

    Science.gov (United States)

    Kobayashi, Minatsu; Jwa, Seung Chik; Ogawa, Kohei; Morisaki, Naho; Fujiwara, Takeo

    2017-01-01

    The relative validity of food frequency questionnaires for estimating long-chain polyunsaturated fatty acid (LC-PUFA) intake among pregnant Japanese women is currently unclear. The aim of this study was to verify the external validity of a food frequency questionnaire, originally developed for non-pregnant adults, to assess the dietary intake of LC-PUFA using dietary records and serum phospholipid levels among Japanese women in early and late pregnancy. A validation study involving 188 participants in early pregnancy and 169 participants in late pregnancy was conducted. LC-PUFA intake was estimated using a food frequency questionnaire and evaluated against a 3-day dietary record and serum phospholipid concentrations in both early and late pregnancy. The food frequency questionnaire provided estimates of eicosapentaenoic acid (EPA) and docosahexaenoic acid (DHA) intake with higher precision than dietary records in both early and late pregnancy. Significant correlations were observed for LC-PUFA intake estimated using dietary records in both early and late pregnancy, particularly for EPA and DHA (correlation coefficients ranged from 0.34 to 0.40). The food frequency questionnaire, which was originally designed for non-pregnant adults and was evaluated in this study against dietary records and biological markers, has good validity for assessing LC-PUFA intake, especially EPA and DHA intake, among Japanese women in early and late pregnancy. Copyright © 2016 The Authors. Production and hosting by Elsevier B.V. All rights reserved.

  19. Validation of cold plasma treatment for protein inactivation: a surface plasmon resonance-based biosensor study

    International Nuclear Information System (INIS)

    Bernard, C; Leduc, A; Barbeau, J; Saoudi, B; Yahia, L'H; Crescenzo, G De

    2006-01-01

    Gas plasma is being proposed as an interesting and promising tool to achieve sterilization. The efficacy of gas plasma to destroy bacterial spores (the most resistant living microorganisms) has been demonstrated and documented over the last ten years. In addition to causing damage to deoxyribonucleic acid by UV radiation emitted by excited species originating from the plasma, gas plasma has been shown to promote erosion of the microorganism in addition to possible oxidation reactions within the microorganism. In this work, we used lysozyme as a protein model to assess the effect of gas plasma on protein inactivation. Lysozyme samples have been subjected to the flowing afterglow of a gas discharge achieved in a nitrogen-oxygen mixture. The efficiency of this plasma treatment on lysozyme has been tested by two different assays. These are an enzyme-linked immunosorbent assay (ELISA) and a surface plasmon resonance (SPR)-based biosensor assay. The two methods showed that exposure to gas plasma can abrogate lysozyme interactions with lysozyme-specific antibodies, more likely by destroying the epitopes responsible for the interaction. More specifically, two SPR-based assays were developed since our ELISA approach did not allow us to discriminate between background and low, but still intact, quantities of lysozyme epitope after plasma treatment. Our SPR results clearly demonstrated that significant protein destruction or desorption was achieved when amounts of lysozyme less than 12.5 ng had been deposited in polystyrene 96-well ELISA plates. At higher lysozyme amounts, traces of available lysozyme epitopes were detected by SPR through indirect measurements. Finally, we demonstrated that a direct SPR approach in which biosensor-immobilized lysozyme activity is directly measured prior and after plasma treatment is more sensitive, and thus, more appropriate to define plasma treatment efficacy with more certainty

  20. Validation of cold plasma treatment for protein inactivation: a surface plasmon resonance-based biosensor study

    Science.gov (United States)

    Bernard, C.; Leduc, A.; Barbeau, J.; Saoudi, B.; Yahia, L'H.; DeCrescenzo, G.

    2006-08-01

    Gas plasma is being proposed as an interesting and promising tool to achieve sterilization. The efficacy of gas plasma to destroy bacterial spores (the most resistant living microorganisms) has been demonstrated and documented over the last ten years. In addition to causing damage to deoxyribonucleic acid by UV radiation emitted by excited species originating from the plasma, gas plasma has been shown to promote erosion of the microorganism in addition to possible oxidation reactions within the microorganism. In this work, we used lysozyme as a protein model to assess the effect of gas plasma on protein inactivation. Lysozyme samples have been subjected to the flowing afterglow of a gas discharge achieved in a nitrogen-oxygen mixture. The efficiency of this plasma treatment on lysozyme has been tested by two different assays. These are an enzyme-linked immunosorbent assay (ELISA) and a surface plasmon resonance (SPR)-based biosensor assay. The two methods showed that exposure to gas plasma can abrogate lysozyme interactions with lysozyme-specific antibodies, more likely by destroying the epitopes responsible for the interaction. More specifically, two SPR-based assays were developed since our ELISA approach did not allow us to discriminate between background and low, but still intact, quantities of lysozyme epitope after plasma treatment. Our SPR results clearly demonstrated that significant protein destruction or desorption was achieved when amounts of lysozyme less than 12.5 ng had been deposited in polystyrene 96-well ELISA plates. At higher lysozyme amounts, traces of available lysozyme epitopes were detected by SPR through indirect measurements. Finally, we demonstrated that a direct SPR approach in which biosensor-immobilized lysozyme activity is directly measured prior and after plasma treatment is more sensitive, and thus, more appropriate to define plasma treatment efficacy with more certainty.

  1. External validation of a forest inventory and analysis volume equation and comparisons with estimates from multiple stem-profile models

    Science.gov (United States)

    Christopher M. Oswalt; Adam M. Saunders

    2009-01-01

    Sound estimation procedures are a desideratum for generating credible population estimates to evaluate the status and trends in resource conditions. As such, volume estimation is an integral component of the U.S. Department of Agriculture, Forest Service, Forest Inventory and Analysis (FIA) program's reporting. In effect, reliable volume estimation procedures are...

  2. Using Clinical Factors and Mammographic Breast Density to Estimate Breast Cancer Risk: Development and Validation of a New Predictive Model

    Science.gov (United States)

    Tice, Jeffrey A.; Cummings, Steven R.; Smith-Bindman, Rebecca; Ichikawa, Laura; Barlow, William E.; Kerlikowske, Karla

    2009-01-01

    Background Current models for assessing breast cancer risk are complex and do not include breast density, a strong risk factor for breast cancer that is routinely reported with mammography. Objective To develop and validate an easy-to-use breast cancer risk prediction model that includes breast density. Design Empirical model based on Surveillance, Epidemiology, and End Results incidence, and relative hazards from a prospective cohort. Setting Screening mammography sites participating in the Breast Cancer Surveillance Consortium. Patients 1 095 484 women undergoing mammography who had no previous diagnosis of breast cancer. Measurements Self-reported age, race or ethnicity, family history of breast cancer, and history of breast biopsy. Community radiologists rated breast density by using 4 Breast Imaging Reporting and Data System categories. Results During 5.3 years of follow-up, invasive breast cancer was diagnosed in 14 766 women. The breast density model was well calibrated overall (expected–observed ratio, 1.03 [95% CI, 0.99 to 1.06]) and in racial and ethnic subgroups. It had modest discriminatory accuracy (concordance index, 0.66 [CI, 0.65 to 0.67]). Women with low-density mammograms had 5-year risks less than 1.67% unless they had a family history of breast cancer and were older than age 65 years. Limitation The model has only modest ability to discriminate between women who will develop breast cancer and those who will not. Conclusion A breast cancer prediction model that incorporates routinely reported measures of breast density can estimate 5-year risk for invasive breast cancer. Its accuracy needs to be further evaluated in independent populations before it can be recommended for clinical use. PMID:18316752
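
    The concordance index quoted above can be illustrated with a short calculation; the sketch below is a generic c-statistic for a binary outcome, not the BCSC model itself, and the risk scores and outcomes in it are hypothetical.

```python
# Minimal sketch (not the published model): computing a concordance index
# (c-statistic) for a binary outcome from predicted 5-year risks.
# Risk scores and outcomes below are hypothetical illustration data.
import itertools

def concordance_index(risks, events):
    """Fraction of case/non-case pairs in which the case has the higher risk.
    Ties in risk count as 0.5, following the usual c-statistic convention."""
    concordant, ties, pairs = 0, 0, 0
    for (r_i, e_i), (r_j, e_j) in itertools.combinations(zip(risks, events), 2):
        if e_i == e_j:          # only case vs non-case pairs are informative
            continue
        pairs += 1
        case_risk, ctrl_risk = (r_i, r_j) if e_i == 1 else (r_j, r_i)
        if case_risk > ctrl_risk:
            concordant += 1
        elif case_risk == ctrl_risk:
            ties += 1
    return (concordant + 0.5 * ties) / pairs

risks  = [0.021, 0.004, 0.013, 0.008, 0.030, 0.006]   # predicted 5-year risks
events = [1, 0, 1, 0, 1, 0]                            # 1 = invasive cancer diagnosed
print(f"c-index = {concordance_index(risks, events):.2f}")
```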

  3. Systematized water content calculation in cartilage using T1-mapping MR estimations: design and validation of a mathematical model.

    Science.gov (United States)

    Shiguetomi-Medina, J M; Ramirez-Gl, J L; Stødkilde-Jørgensen, H; Møller-Madsen, B

    2017-09-01

    Up to 80 % of cartilage is water; the rest is collagen fibers and proteoglycans. Magnetic resonance (MR) T1-weighted measurements can be employed to calculate the water content of a tissue using T1 mapping. In this study, a method that translates T1 values into water content data was tested statistically. To develop a predictive equation, T1 values were obtained for tissue-mimicking gelatin samples. 1.5 T MRI was performed using inverse angle phase and an inverse sequence at 37 (±0.5) °C. Regions of interest were manually delineated and the mean T1 value was estimated in arbitrary units. Data were collected and modeled using linear regression. To validate the method, articular cartilage from six healthy pigs was used. The experiment was conducted in accordance with the Danish Animal Experiment Committee. Double measurements were performed for each animal. Ex vivo, all water in the tissue was extracted by lyophilization, thus allowing the volume of water to be measured. This was then compared with the predicted water content via Lin's concordance correlation coefficient at the 95 % confidence level. The mathematical model was highly significant when compared to a null model (p < 0.0001). 97.3 % of the variation in water content can be explained by absolute T1 values. Percentage water content could be predicted as (0.476 + 0.000193 × T1 value) × 100 %. We found that there was 98 % concordance between the actual and predicted water contents. The results of this study demonstrate that MR data can be used to predict the percentage water content of cartilage samples. Level of evidence: 3 (case-control study).
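
    As an illustration of the reported conversion, the sketch below applies the published linear relation, read here as (0.476 + 0.000193 × T1) × 100 %; the example T1 values are hypothetical and the grouping of the equation is an assumption based on the abstract.

```python
# Sketch of the reported linear conversion from a mean T1 value (arbitrary
# units, as in the abstract) to percentage water content. The coefficients
# are taken from the abstract; the example T1 values are hypothetical.
def water_content_percent(t1_value):
    """Predicted cartilage water content (%) from a mean ROI T1 value."""
    return (0.476 + 0.000193 * t1_value) * 100.0

for t1 in (1200, 1400, 1600):          # illustrative T1 values
    print(f"T1 = {t1}: predicted water content = {water_content_percent(t1):.1f} %")
```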

  4. Rapid validated HPTLC method for estimation of piperine and piperlongumine in root of Piper longum extract and its commercial formulation

    Directory of Open Access Journals (Sweden)

    Anagha A. Rajopadhye

    2012-12-01

    Piperine and piperlongumine, alkaloids with diverse biological activities, commonly occur in the roots of Piper longum L., Piperaceae, which have high commercial, economic and medicinal value. In the present study, a rapid, validated HPTLC method was established for the determination of piperine and piperlongumine in a methanolic root extract and its commercial formulation 'Mahasudarshan churna®', following ICH guidelines. The use of Accelerated Solvent Extraction (ASE) as an alternative to conventional techniques was explored. The methanol extracts of the root and the formulation, together with both standard solutions, were applied to silica gel F254 HPTLC plates. The plates were developed in a twin chamber using toluene:ethyl acetate (6:4, v/v) as mobile phase and scanned at 342 and 325 nm (λmax of piperine and piperlongumine, respectively) using a Camag TLC Scanner 3 with CATS 4 software. A linear relationship was obtained between response (peak area) and amount of piperine and piperlongumine in the ranges of 20-100 and 30-150 ng/spot, respectively; the correlation coefficients were 0.9957 and 0.9941, respectively. Sharp, symmetrical and well-resolved peaks of the piperine and piperlongumine spots appeared at Rf 0.51 and 0.74, respectively, separated from the other components of the sample extracts. The HPTLC method showed good linearity, recovery and high precision for both markers. Extraction of the plant using ASE combined with the rapid HPTLC method provides a new and powerful approach for estimating piperine and piperlongumine as phytomarkers in the extract as well as in its commercial formulations for routine quality control.
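
    The quantification step described above rests on a linear calibration of peak area against amount applied; the sketch below shows such a calibration fit, where the peak areas and the unknown sample area are hypothetical and only the piperine working range follows the abstract.

```python
# Minimal sketch of the calibration step behind an HPTLC assay: fit a
# straight line to peak area vs. amount applied, then quantify an unknown
# spot. The peak areas below are hypothetical; only the working range
# (20-100 ng/spot for piperine) follows the abstract.
import numpy as np

amount_ng = np.array([20, 40, 60, 80, 100])           # piperine standards, ng/spot
peak_area = np.array([1510, 2980, 4490, 6020, 7450])  # hypothetical densitometric areas

slope, intercept = np.polyfit(amount_ng, peak_area, 1)
r = np.corrcoef(amount_ng, peak_area)[0, 1]
print(f"area = {slope:.1f} * ng + {intercept:.1f},  r = {r:.4f}")

unknown_area = 5300.0                                  # area measured for a sample spot
print(f"estimated piperine: {(unknown_area - intercept) / slope:.1f} ng/spot")
```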

  5. Protein Structure Validation and Refinement Using Chemical Shifts Derived from Quantum Mechanics

    DEFF Research Database (Denmark)

    Bratholm, Lars Andersen

    ...to within 3 Å. Furthermore, a fast quantum mechanics based chemical shift predictor was developed, together with methodology for using chemical shifts in structure simulations. The developed predictor was used for refinement of several protein structures and for reducing the computational cost of quantum mechanics / molecular mechanics (QM/MM) computations of chemical shieldings. Several improvements to the predictor are ongoing, where, among other things, kernel-based machine learning techniques have successfully been used to improve the quantum mechanical level of theory used in the predictions....

  6. Fractionation of whey protein isolate with supercritical carbon dioxide – process modeling and cost estimation

    Science.gov (United States)

    An economical and environmentally friendly whey protein fractionation process was developed using supercritical carbon dioxide (sCO2) as an acid to produce enriched fractions of alpha-lactalbumin (alpha-La) and beta-lactoglobulin (beta-Lg) from a commercial whey protein isolate (WPI) containing 55% ...

  7. Estimation of extractable protein in botanical fractions of legume and grass species

    DEFF Research Database (Denmark)

    Solati, Zeinab; Jørgensen, Uffe; Eriksen, Jørgen

    2018-01-01

    With strong global interest in bio-based products such as fuels and chemicals, leaves could be a feasible source of protein for industry, with positive economic impacts. However, more knowledge is needed on how to improve the content of extractable protein. Grasses and legumes have a ...

  8. Development of Equation Based on Urinary Purine Derivatives to Estimate Rumen Microbial Protein Production in Goats

    International Nuclear Information System (INIS)

    Jetana, Thongsuk; Abdullah, Norhani; Liang, Boo Juan; Syed Salim, Syed Jalaludin; Ho, Wan Yin

    2003-06-01

    Three experiments were conducted at the farm of the Universiti Putra Malaysia, Serdang, Selangor, Malaysia, to establish a model as an index for estimating rumen microbial protein production. In Experiment 1, six Ferral male goats (wt. 40.2±4.6 kg) were used to determine the endogenous purine derivatives (PD) excreted in the urine during fasting. In Experiment 2, four Ferral male goats (wt. 39.6±1.8 kg) were used to measure the proportion of plasma PD excreted in the urine using [14C]-uric acid as a marker at two levels of feed intake (40% and 80% of voluntary intake), in an incomplete 2x4 Latin square design. The feed consisted of 40% oil palm frond and 60% concentrate (OPFC). In Experiment 3, four Ferral male goats fed OPFC were slaughtered and rumen contents were taken for measurements of the purine and total nitrogen contents of mixed rumen microbes. The results showed that the endogenous PD (allantoin, uric acid, xanthine and hypoxanthine) excreted in the urine, obtained from the fasting trial, was 202±17 μmol/kg BW^0.75 per day. The average percentage recovery of plasma PD in the urine, determined with [14C]-uric acid as a marker, was 83±2.0% (CV = 6.88%, range 76.3-91.4%, n = 8). Percentage recovery was not affected by the level of feed intake. The ratio of purine N to total N in the mixed rumen liquid-associated bacteria (LAB) was 0.085. In this study, a preliminary model for goats was established using the recovery of labeled PD ([14C]-uric acid) and the fasting PD excretion. The model obtained was Y = 0.83X + 0.202 x BW^0.75, where Y is the PD excretion in the urine (mmol/d), X is the PD absorbed in the small intestine (mmol/d) and BW^0.75 is the metabolic body weight (kg). Thus the microbial nitrogen supply based on total PD (MNpd) can be calculated as MNpd = (70 x X) / (0.085 x 0.83 x 1000) = 0.992 x X (g/d), where 0.085 is the ratio of purine N to total N in mixed rumen microbes and 0.83 is the average digestibility of microbial purine from published...
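
    The goat model quoted above can be applied directly; the sketch below inverts the excretion equation to recover absorbed purines and then converts them to microbial N, with the daily PD excretion and body weight chosen as hypothetical inputs.

```python
# Sketch of the goat model quoted above: estimate purines absorbed (X) from
# urinary PD excretion (Y) and body weight, then convert to microbial N.
# The example inputs (PD excretion, body weight) are hypothetical.
def purine_absorbed(pd_excreted_mmol_d, body_weight_kg):
    """Invert Y = 0.83*X + 0.202*BW^0.75 to get X (mmol/d of absorbed purines)."""
    endogenous = 0.202 * body_weight_kg ** 0.75
    return (pd_excreted_mmol_d - endogenous) / 0.83

def microbial_n_supply(x_mmol_d):
    """MNpd (g N/d) = 70*X / (0.085 * 0.83 * 1000) = 0.992*X."""
    return 70.0 * x_mmol_d / (0.085 * 0.83 * 1000.0)

pd_excreted = 12.0   # mmol/d, hypothetical urinary PD excretion
bw = 40.0            # kg, close to the goats used in the experiments

x = purine_absorbed(pd_excreted, bw)
print(f"purines absorbed: {x:.2f} mmol/d")
print(f"microbial N supply: {microbial_n_supply(x):.2f} g/d")
```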

  9. Estimating rumen microbial protein supply for indigenous ruminants using nuclear and purine excretion techniques in Indonesia

    International Nuclear Information System (INIS)

    Soejono, M.; Yusiati, L.M.; Budhi, S.P.S.; Widyobroto, B.P.; Bachrudin, Z.

    1999-01-01

    The microbial protein supply to ruminants can be estimated based on the amount of purine derivatives (PD) excreted in the urine. Four experiments were conducted to evaluate the PD excretion method for Bali and Ongole cattle. In the first experiment, six male, two year old Bali cattle (Bos sondaicus) and six Ongole cattle (Bos indicus) of similar sex and age were used to quantify the endogenous contribution to total PD excretion in the urine. In the second experiment, four cattle from each breed were used to examine the response of PD excretion to feed intake. 14C-uric acid was injected in a single dose to define the partitioning ratio of renal:non-renal losses of plasma PD. The third experiment was conducted to examine the ratio of purine N:total N in the mixed rumen microbial population. The fourth experiment measured the enzyme activities of blood, liver and intestinal tissues concerned with PD metabolism. The results of the first experiment showed that endogenous PD excretion was 145 ± 42.0 and 132 ± 20.0 μmol/kg W^0.75/d for Bali and Ongole cattle, respectively. The second experiment indicated that the proportion of plasma PD excreted in the urine of Bali and Ongole cattle was 0.78 and 0.77, respectively. Hence, the prediction of purine absorbed based on PD excretion can be stated as Y = 0.78X + 0.145 W^0.75 and Y = 0.77X + 0.132 W^0.75 for Bali and Ongole cattle, respectively. The third experiment showed that there were no differences in the ratio of purine N:total N in mixed rumen microbes of Bali and Ongole cattle (17% vs 18%). The last experiment showed that intestinal xanthine oxidase activity of Bali cattle was lower than that of Ongole cattle (0.001 vs 0.015 μmol uric acid produced/min/g tissue) but xanthine oxidase activity in the blood and liver of Bali cattle was higher than that of Ongole cattle (3.48 vs 1.34 μmol/min/L plasma and 0.191 vs 0.131 μmol/min/g liver tissue). Thus, there was no difference in PD excretion between these two breeds.

  10. Development and validation of a genotype 3 recombinant protein-based immunoassay for hepatitis E virus serology in swine

    Directory of Open Access Journals (Sweden)

    W.H.M. van der Poel

    2014-04-01

    Hepatitis E virus (HEV) is classified within the family Hepeviridae, genus Hepevirus. HEV genotype 3 (Gt3) infections are endemic in pigs in Western Europe and in North and South America and cause zoonotic infections in humans. Several serological assays to detect HEV antibodies in pigs have been developed, at first mainly based on HEV genotype 1 (Gt1) antigens. To develop a sensitive HEV Gt3 ELISA, a recombinant baculovirus expression product of HEV Gt3 open reading frame 2 was produced and coated onto polystyrene ELISA plates. After incubation of porcine sera, bound HEV antibodies were detected with anti-porcine IgG and IgM conjugates. For a primary estimation of the sensitivity and specificity of the assay, sets of sera from pigs experimentally infected with HEV Gt3 were used. For further validation of the assay and to set the cutoff value, a batch of 1100 pig sera was used. All pig sera were tested using the developed HEV Gt3 assay and two other serologic assays based on HEV Gt1 antigens. Since there is no gold standard available for HEV antibody testing, further validation and the definite setting of the cutoff of the developed HEV Gt3 assay were performed using a statistical approach based on Bayes' theorem. The developed and validated HEV antibody assay showed effective detection of HEV-specific antibodies. This assay can contribute to improved detection of HEV antibodies and enable more reliable estimates of the prevalence of HEV Gt3 in swine in different regions.

  11. Structure based descriptors for the estimation of colloidal interactions and protein aggregation propensities.

    Directory of Open Access Journals (Sweden)

    Michael Brunsteiner

    The control of protein aggregation is an important requirement in the development of bio-pharmaceutical formulations. Here a simple protein model is proposed and used in molecular dynamics simulations to obtain a quantitative assessment of the relative contributions of proteins' net charges, dipole moments, and the size of hydrophobic or charged surface patches to their colloidal interactions. The results demonstrate that the strength of these interactions correlates with net charge and dipole moment. Variation of either descriptor within ranges typical for globular proteins has a comparable effect. By comparison, no clear trends can be observed upon varying the size of hydrophobic or charged patches while keeping the other parameters constant. The results are discussed in the context of experimental literature data on protein aggregation. They provide a clear guideline for the development of improved algorithms for the prediction of aggregation propensities.
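
    Two of the descriptors discussed above, net charge and dipole moment, can be computed from a point-charge representation; the sketch below is a generic calculation, not the authors' simulation model, and the coordinates and partial charges are hypothetical.

```python
# Sketch of two of the descriptors discussed above, computed for a point-
# charge representation of a protein: net charge and dipole moment.
# Coordinates (nm) and partial charges (e) below are hypothetical.
import numpy as np

coords = np.array([[0.0, 0.0, 0.0],
                   [1.2, 0.3, -0.5],
                   [2.1, -0.8, 0.4],
                   [0.5, 1.5, 1.1]])
charges = np.array([+1.0, -1.0, +1.0, -2.0])

net_charge = charges.sum()
center = coords.mean(axis=0)      # reference point; for a charged molecule the
                                  # dipole depends on the chosen reference
dipole = (charges[:, None] * (coords - center)).sum(axis=0)

print(f"net charge: {net_charge:+.1f} e")
print(f"dipole vector (e*nm): {dipole}")
print(f"dipole magnitude: {np.linalg.norm(dipole):.2f} e*nm")
```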

  12. Transferability of Skills: Convergent, Postdictive, Criterion-Related, and Construct Validation of Cross-Job Retraining Time Estimates

    National Research Council Canada - National Science Library

    Kavanagh, Michael

    1997-01-01

    ... (job learning difficulty and cross-AFS differences in aptitude requirements), (b) XJRThs exhibited some postdictive validity when evaluated against Airman Retraining Program Survey retraining ease criteria, (c...

  13. Genome wide gene expression regulation by HIP1 Protein Interactor, HIPPI: Prediction and validation

    Directory of Open Access Journals (Sweden)

    Lahiri Ansuman

    2011-09-01

    Full Text Available Abstract Background HIP1 Protein Interactor (HIPPI is a pro-apoptotic protein that induces Caspase8 mediated apoptosis in cell. We have shown earlier that HIPPI could interact with a specific 9 bp sequence motif, defined as the HIPPI binding site (HBS, present in the upstream promoter of Caspase1 gene and regulate its expression. We also have shown that HIPPI, without any known nuclear localization signal, could be transported to the nucleus by HIP1, a NLS containing nucleo-cytoplasmic shuttling protein. Thus our present work aims at the investigation of the role of HIPPI as a global transcription regulator. Results We carried out genome wide search for the presence of HBS in the upstream sequences of genes. Our result suggests that HBS was predominantly located within 2 Kb upstream from transcription start site. Transcription factors like CREBP1, TBP, OCT1, EVI1 and P53 half site were significantly enriched in the 100 bp vicinity of HBS indicating that they might co-operate with HIPPI for transcription regulation. To illustrate the role of HIPPI on transcriptome, we performed gene expression profiling by microarray. Exogenous expression of HIPPI in HeLa cells resulted in up-regulation of 580 genes (p HIP1 was knocked down. HIPPI-P53 interaction was necessary for HIPPI mediated up-regulation of Caspase1 gene. Finally, we analyzed published microarray data obtained with post mortem brains of Huntington's disease (HD patients to investigate the possible involvement of HIPPI in HD pathogenesis. We observed that along with the transcription factors like CREB, P300, SREBP1, Sp1 etc. which are already known to be involved in HD, HIPPI binding site was also significantly over-represented in the upstream sequences of genes altered in HD. Conclusions Taken together, the results suggest that HIPPI could act as an important transcription regulator in cell regulating a vast array of genes, particularly transcription factors and at least, in part, play a

  14. Variables influencing wearable sensor outcome estimates in individuals with stroke and incomplete spinal cord injury: a pilot investigation validating two research grade sensors.

    Science.gov (United States)

    Jayaraman, Chandrasekaran; Mummidisetty, Chaithanya Krishna; Mannix-Slobig, Alannah; McGee Koch, Lori; Jayaraman, Arun

    2018-03-13

    Monitoring physical activity and leveraging wearable sensor technologies to facilitate active living in individuals with neurological impairment has been shown to yield benefits in terms of health and quality of living. In this context, accurate measurement of physical activity from these sensors is vital. However, wearable sensor manufacturers generally only provide standard proprietary algorithms based on data from healthy individuals to estimate physical activity metrics, which may lead to inaccurate estimates in populations with neurological impairment such as stroke and incomplete spinal cord injury (iSCI). The main objective of this cross-sectional investigation was to evaluate the validity of physical activity estimates provided by standard proprietary algorithms for individuals with stroke and iSCI. Two research-grade wearable sensors used in clinical settings were chosen, and the outcome metrics estimated using standard proprietary algorithms were validated against designated gold standard measures (Cosmed K4B2 for energy expenditure and metabolic equivalent, and manual tallying for step counts). The influence of sensor location, sensor type and activity characteristics was also studied. 28 participants (healthy (n = 10); incomplete SCI (n = 8); stroke (n = 10)) performed a spectrum of activities in a laboratory setting using two wearable sensors (ActiGraph and Metria-IH1) at different body locations. Manufacturer-provided standard proprietary algorithms estimated the step count, energy expenditure (EE) and metabolic equivalent (MET). These estimates were compared with the estimates from the gold standard measures. To verify validity, a series of Kruskal-Wallis ANOVA tests (with Games-Howell multiple comparisons for post-hoc analyses) were conducted to compare the mean rank and absolute agreement of the outcome metrics estimated by each of the devices against the designated gold standard measurements. The sensor type, sensor location
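
    The statistical comparison described above can be sketched as follows; the step-count errors are hypothetical and the post-hoc Games-Howell step is only indicated in a comment.

```python
# Sketch of the kind of comparison described above: a Kruskal-Wallis test on
# the error of sensor step-count estimates (sensor minus manual tally) across
# the three groups. All numbers are hypothetical illustration data.
from scipy import stats

step_error_healthy = [3, -2, 5, 1, 0, 4, -1, 2, 3, 1]
step_error_isci    = [25, 40, 18, 33, 28, 22, 37, 30]
step_error_stroke  = [15, 22, 9, 18, 27, 12, 20, 16, 24, 11]

h_stat, p_value = stats.kruskal(step_error_healthy, step_error_isci, step_error_stroke)
print(f"Kruskal-Wallis H = {h_stat:.2f}, p = {p_value:.4f}")
# A small p-value would motivate post-hoc pairwise comparisons
# (e.g. Games-Howell), as done in the study.
```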

  15. Immunodetection of the serotonin transporter protein is a more valid marker for serotonergic fibers than serotonin

    DEFF Research Database (Denmark)

    Nielsen, Kirsten; Brask, Dorthe; Knudsen, Gitte M.

    2006-01-01

    Tracking serotonergic pathways in the brain through immunodetection of serotonin has widely been used for the anatomical characterization of the serotonergic system. Immunostaining for serotonin is also frequently applied for the visualization of individual serotonin containing fibers...... and quantification of serotonin positive fibers has been widely used to detect changes in the serotonergic innervation. However, particularly in conditions with enhanced serotonin metabolism the detection level of serotonin may lead to an underestimation of the true number of serotonergic fibers. The serotonin...... immunostained for serotonin and SERT protein and colocalization was quantified in several brain areas by confocal microscopy. In comparison with untreated rats, MAO inhibitor treated rats had a significantly higher number (almost 200% increase) of serotonin immunopositive fibers whereas no difference...

  16. Validating the use of 137Cs and 210Pbex measurements to estimate rates of soil loss from cultivated land in southern Italy.

    Science.gov (United States)

    Porto, Paolo; Walling, Des E

    2012-04-01

    Soil erosion represents an important threat to the long-term sustainability of agriculture and forestry in many areas of the world, including southern Italy. Numerous models and prediction procedures have been developed to estimate rates of soil loss and soil redistribution, based on the local topography, hydrometeorology, soil type and land management. However, there remains an important need for empirical measurements to provide a basis for validating and calibrating such models and prediction procedures as well as to support specific investigations and experiments. In this context, erosion plots provide useful information on gross rates of soil loss, but are unable to document the efficiency of the onward transfer of the eroded sediment within a field and towards the stream system, and thus net rates of soil loss from larger areas. The use of environmental radionuclides, particularly caesium-137 ((137)Cs) and excess lead-210 ((210)Pb(ex)), as a means of estimating rates of soil erosion and deposition has attracted increasing attention in recent years and the approach has now been recognised as possessing several important advantages. In order to provide further confirmation of the validity of the estimates of longer-term erosion and soil redistribution rates provided by (137)Cs and (210)Pb(ex) measurements, there is a need for studies aimed explicitly at validating the results obtained. In this context, the authors directed attention to the potential offered by a set of small erosion plots located near Reggio Calabria in southern Italy, for validating estimates of soil loss provided by (137)Cs and (210)Pb(ex) measurements. A preliminary assessment suggested that, notwithstanding the limitations and constraints involved, a worthwhile investigation aimed at validating the use of (137)Cs and (210)Pb(ex) measurements to estimate rates of soil loss from cultivated land could be undertaken. The results demonstrate a close consistency between the measured rates of soil

  17. Using plot experiments to test the validity of mass balance models employed to estimate soil redistribution rates from 137Cs and 210Pbex measurements

    International Nuclear Information System (INIS)

    Porto, Paolo; Walling, Des E.

    2012-01-01

    Information on rates of soil loss from agricultural land is a key requirement for assessing both on-site soil degradation and potential off-site sediment problems. Many models and prediction procedures have been developed to estimate rates of soil loss and soil redistribution as a function of the local topography, hydrometeorology, soil type and land management, but empirical data remain essential for validating and calibrating such models and prediction procedures. Direct measurements using erosion plots are, however, costly and the results obtained relate to a small enclosed area, which may not be representative of the wider landscape. In recent years, the use of fallout radionuclides and more particularly caesium-137 (137Cs) and excess lead-210 (210Pbex) has been shown to provide a very effective means of documenting rates of soil loss and soil and sediment redistribution in the landscape. Several of the assumptions associated with the theoretical conversion models used with such measurements remain essentially unvalidated. This contribution describes the results of a measurement programme involving five experimental plots located in southern Italy, aimed at validating several of the basic assumptions commonly associated with the use of mass balance models for estimating rates of soil redistribution on cultivated land from 137Cs and 210Pbex measurements. Overall, the results confirm the general validity of these assumptions and the importance of taking account of the fate of fresh fallout. However, further work is required to validate the conversion models employed in using fallout radionuclide measurements to document soil redistribution in the landscape and this could usefully direct attention to different environments and to the validation of the final estimates of soil redistribution rate as well as the assumptions of the models employed.

  18. FASTERp: A Feature Array Search Tool for Estimating Resemblance of Protein Sequences

    Energy Technology Data Exchange (ETDEWEB)

    Macklin, Derek; Egan, Rob; Wang, Zhong

    2014-03-14

    Metagenome sequencing efforts have provided a large pool of billions of genes for identifying enzymes with desirable biochemical traits. However, homology search with billions of genes in a rapidly growing database has become increasingly computationally impractical. Here we present our pilot efforts to develop a novel alignment-free algorithm for homology search. Specifically, we represent individual proteins as feature vectors that denote the presence or absence of short kmers in the protein sequence. Similarity between feature vectors is then computed using the Tanimoto score, a distance metric that can be rapidly computed on bit string representations of feature vectors. Preliminary results indicate good correlation with optimal alignment algorithms (Spearman r of 0.87, ~1,000,000 proteins from Pfam), as well as with heuristic algorithms such as BLAST (Spearman r of 0.86, ~1,000,000 proteins). Furthermore, a prototype of FASTERp implemented in Python runs approximately four times faster than BLAST on a small scale dataset (~1000 proteins). We are optimizing and scaling to improve FASTERp to enable rapid homology searches against billion-protein databases, thereby enabling more comprehensive gene annotation efforts.
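
    The alignment-free scoring idea can be sketched in a few lines; the toy sequences below are hypothetical and FASTERp's actual encoding (k-mer length, hashing to bit strings) may differ.

```python
# Sketch of the alignment-free idea described above: represent each protein
# by the set of k-mers it contains and score pairs with the Tanimoto
# coefficient. The toy sequences are hypothetical; FASTERp's actual encoding
# (bit strings, k-mer length, hashing) may differ.
def kmer_set(sequence, k=3):
    return {sequence[i:i + k] for i in range(len(sequence) - k + 1)}

def tanimoto(a, b):
    """|A intersect B| / |A union B| for two k-mer sets (1.0 = identical)."""
    inter = len(a & b)
    return inter / (len(a) + len(b) - inter)

seq1 = "MKTAYIAKQRQISFVKSHFSRQLEERLGLIEVQ"
seq2 = "MKTAYIAKQRQISFVKSHFSRQPEERLGLIEVQ"   # one substitution
seq3 = "GSHMSLYKKAGSENLYFQGAMDPEF"            # unrelated sequence

fp1, fp2, fp3 = kmer_set(seq1), kmer_set(seq2), kmer_set(seq3)
print(f"similar pair:   {tanimoto(fp1, fp2):.2f}")
print(f"unrelated pair: {tanimoto(fp1, fp3):.2f}")
```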

  19. The diagnostic value of c-reactive protein estimation in differentiating bacterial from viral meningitis

    International Nuclear Information System (INIS)

    Sheikh, A.

    2001-01-01

    Objective: To evaluate the efficacy of serum and CSF C-reactive protein (C-rp) in differentiating bacterial from viral meningitis. Design: An observational, respective hospital-based study. Place and duration of study: It was conducted at the Department of Medicine and Department of Pediatrics, Shaikh Zayed Postgraduate Medical Institute Lahore, over a period of one year between March 1999 and March 2000. Subjects and Methods: A randomized group of thirty patients who presented with clinical features suggestive of meningitis was included in the study. C-reactive protein determinations were performed by the latex agglutination method on the serum and cerebrospinal fluid (CSF) of these patients. Results: In the present study, C-reactive protein was found to be a more sensitive test for differentiating bacterial from non-bacterial meningitis on initial examination than the usual conventional methods used to diagnose bacterial meningitis. CSF C-reactive protein had a greater sensitivity (92%) as compared with serum C-reactive protein (71%). Conclusion: C-reactive protein determination in CSF was found to be a useful indicator of bacterial meningitis that can be used to distinguish it from viral meningitis. (author)

  20. COMBINING LIDAR ESTIMATES OF BIOMASS AND LANDSAT ESTIMATES OF STAND AGE FOR SPATIALLY EXTENSIVE VALIDATION OF MODELED FOREST PRODUCTIVITY. (R828309)

    Science.gov (United States)

    Extensive estimates of forest productivity are required to understand the relationships between shifting land use, changing climate and carbon storage and fluxes. Aboveground net primary production of wood (NPPAw) is a major component of total NPP and...

  1. Method for the validation and uncertainty estimation of tocopherol analysis applied to soybean oil with addition of spices and TBHQ

    Directory of Open Access Journals (Sweden)

    da Silva, M. G.

    2013-09-01

    The tocopherol contents of refined soybean oil with the addition of rosemary, oregano, garlic, annatto seeds and TBHQ were evaluated during storage at 25 °C and 35 °C for twelve months, in comparison with a control soybean oil without antioxidant addition. The method proposed to assess the tocopherol content was validated and its measurement uncertainty was estimated. The method presented adequate linearity and precision, accuracy between 93% and 103%, and an expanded uncertainty of 2%. The contents of α-, γ- and δ-tocopherol in all the tested soybean oils remained constant during storage at 25 °C and 35 °C regardless of antioxidant addition, while the β-tocopherol content decreased. The addition of a mixture of rosemary, oregano, garlic and annatto seeds increased the concentrations of γ- and δ-tocopherol. The oil with spices presented behavior similar to that of the oil with added TBHQ.
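
    An expanded uncertainty like the 2% reported above is conventionally obtained by combining component uncertainties in quadrature and applying a coverage factor; the sketch below illustrates this with hypothetical uncertainty components that are not taken from the paper.

```python
# Sketch of how an expanded uncertainty is typically assembled: combine
# individual standard uncertainties in quadrature and multiply by a coverage
# factor k = 2 (~95% confidence). The components below are hypothetical.
import math

components = {          # relative standard uncertainties (%)
    "calibration curve": 0.6,
    "repeatability":     0.5,
    "recovery":          0.4,
    "sample weighing":   0.2,
}

u_combined = math.sqrt(sum(u ** 2 for u in components.values()))
U_expanded = 2 * u_combined          # coverage factor k = 2
print(f"combined standard uncertainty: {u_combined:.2f} %")
print(f"expanded uncertainty (k=2):    {U_expanded:.2f} %")
```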

  2. The Cross-Calibration of Spectral Radiances and Cross-Validation of CO2 Estimates from GOSAT and OCO-2

    Directory of Open Access Journals (Sweden)

    Fumie Kataoka

    2017-11-01

    The Greenhouse gases Observing SATellite (GOSAT), launched in January 2009, has provided radiance spectra with a Fourier Transform Spectrometer for more than eight years. The Orbiting Carbon Observatory 2 (OCO-2), launched in July 2014, collects radiance spectra using an imaging grating spectrometer. Both sensors observe sunlight reflected from Earth's surface and retrieve atmospheric carbon dioxide (CO2) concentrations, but use different spectrometer technologies, observing geometries, and ground track repeat cycles. To demonstrate the effectiveness of satellite remote sensing for CO2 monitoring, the GOSAT and OCO-2 teams have worked together pre- and post-launch to cross-calibrate the instruments and cross-validate their retrieval algorithms and products. In this work, we first compare observed radiance spectra within three narrow bands centered at 0.76, 1.60 and 2.06 µm, at temporally coincident and spatially collocated points from September 2014 to March 2017. We reconciled the differences in observation footprint size, viewing geometry and associated differences in surface bidirectional reflectance distribution function (BRDF). We conclude that the spectral radiances measured by the two instruments agree within 5% for all bands. Second, we estimated the mean bias and standard deviation of column-averaged CO2 dry air mole fraction (XCO2) retrieved from GOSAT and OCO-2 from September 2014 to May 2016. GOSAT retrievals used Build 7.3 (V7.3) of the Atmospheric CO2 Observations from Space (ACOS) algorithm while OCO-2 retrievals used Version 7 of the OCO-2 retrieval algorithm. The mean biases and standard deviations are −0.57 ± 3.33 ppm over land with high gain, −0.17 ± 1.48 ppm over ocean with high gain and −0.19 ± 2.79 ppm over land with medium gain. Finally, our study is complemented with an analysis of error sources: retrieved surface pressure (Psurf), aerosol optical depth (AOD), BRDF and surface albedo inhomogeneity. We found no change in XCO2
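
    The bias statistics quoted above reduce to simple arithmetic on collocated retrievals; the sketch below computes mean bias and standard deviation for a handful of hypothetical XCO2 pairs.

```python
# Sketch of the cross-validation statistic quoted above: mean bias and
# standard deviation of XCO2 differences at temporally coincident,
# spatially collocated soundings. The XCO2 values are hypothetical.
import numpy as np

xco2_gosat = np.array([399.8, 401.2, 400.5, 402.1, 398.9, 400.0])  # ppm
xco2_oco2  = np.array([400.3, 401.0, 401.4, 402.8, 399.6, 400.9])  # ppm, collocated

diff = xco2_gosat - xco2_oco2
print(f"mean bias: {diff.mean():+.2f} ppm")
print(f"standard deviation: {diff.std(ddof=1):.2f} ppm")
```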

  3. Experimental Definition and Validation of Protein Coding Transcripts in Chlamydomonas reinhardtii

    Energy Technology Data Exchange (ETDEWEB)

    Kourosh Salehi-Ashtiani; Jason A. Papin

    2012-01-13

    Algal fuel sources promise unsurpassed yields in a carbon neutral manner that minimizes resource competition between agriculture and fuel crops. Many challenges must be addressed before algal biofuels can be accepted as a component of the fossil fuel replacement strategy. One significant challenge is that the cost of algal fuel production must become competitive with existing fuel alternatives. Algal biofuel production presents the opportunity to fine-tune microbial metabolic machinery for an optimal blend of biomass constituents and desired fuel molecules. Genome-scale model-driven algal metabolic design promises to facilitate both goals by directing the utilization of metabolites in the complex, interconnected metabolic networks to optimize production of the compounds of interest. Using Chlamydomonas reinhardtii as a model, we developed a systems-level methodology bridging metabolic network reconstruction with annotation and experimental verification of enzyme encoding open reading frames. We reconstructed a genome-scale metabolic network for this alga and devised a novel light-modeling approach that enables quantitative growth prediction for a given light source, resolving wavelength and photon flux. We experimentally verified transcripts accounted for in the network and physiologically validated model function through simulation and generation of new experimental growth data, providing high confidence in network contents and predictive applications. The network offers insight into algal metabolism and potential for genetic engineering and efficient light source design, a pioneering resource for studying light-driven metabolism and quantitative systems biology. Our approach to generate a predictive metabolic model integrated with cloned open reading frames, provides a cost-effective platform to generate metabolic engineering resources. While the generated resources are specific to algal systems, the approach that we have developed is not specific to algae and
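
    Growth prediction with a genome-scale reconstruction rests on flux balance analysis; the sketch below is a generic FBA toy problem (not the published Chlamydomonas model) with a hypothetical two-metabolite network.

```python
# Generic flux balance analysis sketch: maximize a "biomass" flux subject to
# steady-state mass balance S·v = 0 and flux bounds, using linear programming.
# The tiny 2-metabolite network below is hypothetical.
import numpy as np
from scipy.optimize import linprog

# columns: uptake (-> A), R1 (A -> B), biomass drain (B ->)
S = np.array([[ 1, -1,  0],   # metabolite A
              [ 0,  1, -1]])  # metabolite B
bounds = [(0, 10), (0, None), (0, None)]   # uptake limited to 10 units

c = np.array([0, 0, -1])      # maximize biomass flux = minimize its negative
res = linprog(c, A_eq=S, b_eq=np.zeros(S.shape[0]), bounds=bounds, method="highs")
print("optimal fluxes:", res.x)
print("max biomass flux:", -res.fun)
```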

  4. ModFOLD6: an accurate web server for the global and local quality estimation of 3D protein models.

    Science.gov (United States)

    Maghrabi, Ali H A; McGuffin, Liam J

    2017-07-03

    Methods that reliably estimate the likely similarity between the predicted and native structures of proteins have become essential for driving the acceptance and adoption of three-dimensional protein models by life scientists. ModFOLD6 is the latest version of our leading resource for Estimates of Model Accuracy (EMA), which uses a pioneering hybrid quasi-single model approach. The ModFOLD6 server integrates scores from three pure-single model methods and three quasi-single model methods using a neural network to estimate local quality scores. Additionally, the server provides three options for producing global score estimates, depending on the requirements of the user: (i) ModFOLD6_rank, which is optimized for ranking/selection, (ii) ModFOLD6_cor, which is optimized for correlations of predicted and observed scores and (iii) ModFOLD6 global for balanced performance. The ModFOLD6 methods rank among the top few for EMA, according to independent blind testing by the CASP12 assessors. The ModFOLD6 server is also continuously automatically evaluated as part of the CAMEO project, where significant performance gains have been observed compared to our previous server and other publicly available servers. The ModFOLD6 server is freely available at: http://www.reading.ac.uk/bioinf/ModFOLD/. © The Author(s) 2017. Published by Oxford University Press on behalf of Nucleic Acids Research.

  5. A mathematical method for verifying the validity of measured information about the flows of energy resources based on the state estimation theory

    Science.gov (United States)

    Pazderin, A. V.; Sof'in, V. V.; Samoylenko, V. O.

    2015-11-01

    Efforts aimed at improving energy efficiency in all branches of the fuel and energy complex should begin with setting up a high-tech automated system for monitoring and accounting energy resources. Malfunctions and failures in the measurement and information parts of this system may distort commercial measurements of energy resources and lead to financial risks for power supplying organizations. In addition, measurement errors may be connected with intentional distortion of measurements aimed at reducing payment for energy resources on the consumer's side, which leads to commercial losses of energy resources. The article presents a universal mathematical method for verifying the validity of measurement information in networks for transporting energy resources, such as electricity, heat, petroleum, gas, etc., based on state estimation theory. The energy resource transportation network is represented by a graph whose nodes correspond to producers and consumers and whose branches stand for transportation mains (power lines, pipelines, and heat network elements). The main idea of state estimation is to obtain calculated analogs of the energy resources for all available measurements. Unlike "raw" measurements, which contain inaccuracies, the calculated flows of energy resources, called estimates, fully satisfy all the state equations describing the energy resource transportation network. The state equations written in terms of the calculated estimates are already free from residuals. The difference between a measurement and its calculated analog (estimate) is called, in estimation theory, an estimation remainder. Large values of the estimation remainders are an indicator of gross errors in particular energy resource measurements. By using the presented method it is possible to improve the validity of energy resource measurements, to estimate the transportation network observability, to eliminate
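
    The core of the method, weighted least squares estimation followed by inspection of the estimation remainders, can be sketched on a minimal network; the three-meter example below, including all flow values and meter accuracies, is hypothetical.

```python
# Minimal weighted-least-squares state estimation sketch in the spirit of
# the method described above: a supply main splits at one node into two
# delivery branches, and all three flows are metered. The state is the pair
# of delivery flows; the supply measurement is redundant, so a large residual
# flags a suspect meter. All measurement values are hypothetical.
import numpy as np

# measurement model z = H x + e, with x = [f_out1, f_out2]
H = np.array([[1.0, 1.0],    # supply meter sees f_out1 + f_out2
              [1.0, 0.0],    # delivery meter 1
              [0.0, 1.0]])   # delivery meter 2
z = np.array([103.0, 60.0, 38.0])        # metered flows (e.g. MWh/day)
sigma = np.array([1.0, 0.5, 0.5])        # meter standard deviations
W = np.diag(1.0 / sigma**2)

# weighted least squares estimate and residuals ("estimation remainders")
x_hat = np.linalg.solve(H.T @ W @ H, H.T @ W @ z)
residuals = z - H @ x_hat
print("estimated delivery flows:", x_hat)
print("residuals per meter:", residuals)
print("suspect meters:", np.abs(residuals) > 3 * sigma)
```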

  6. Protein estimation and palynlogical studies of cannabis sativa l. pollen in relation to respiratory allergies

    International Nuclear Information System (INIS)

    Shinwari, Z.K.; Tanvir, M.; Yusuf, O.

    2015-01-01

    Airborne pollen allergies and asthma are on the rise in the metropolitan city of Islamabad. Knowledge of allergenic pollen is limited in the area. Cannabis sativa L., commonly known as hemp, is a widespread weed in the city. Morphological studies performed via light microscopy and SEM have shown that the pollen of Cannabis sativa is 21 μm in size, with a triporate aperture, spheroidal shape and scabrate exine. Quantitative and qualitative analyses of the pollen proteins were also carried out to recognize allergenic protein bands. Bradford analysis for protein quantification showed that hemp pollen contains 30.69 mg/g protein on a fresh-weight basis, while SDS-PAGE analysis showed 11 bands of various protein sizes ranging from 17 kDa to 150 kDa. The research findings indicate that Cannabis sativa could be a potent allergenic pollen-producing weed that might cause serious health problems in the population of Islamabad. (author)

  7. Estimation of the protein synthesis rates of the whole body of growing broilers

    International Nuclear Information System (INIS)

    Koehler, R.; Pahle, T.; Gruhn, K.; Zander, R.; Jeroch, H.; Gebhardt, G.

    1988-01-01

    The purpose of the investigation was to test whether a method developed for monogastric mammals, based on a 3-compartment model and assuming proportional growth of the total-N pools, is applicable to growing poultry. The tracer, 15N-L-lysine, was given quasi-continuously for four days. During this time and in the following five-day period without tracer intake, the 15N excretion in the urine was measured. The average live weight of the broiler cockerels was 1724 g. The animals were colostomized so that urine could be sampled separately. From the lysine fluxes, the whole-body protein synthesis rate was calculated to be 64.1 g/d, and the protein degradation rate 54.4 g/d. The corresponding fractional rates of protein synthesis and degradation for the whole body (without feathers) were 23.3% and 19.8%, respectively. This clearly shows that the method gives realistic values for the N-metabolism parameters of growing broilers, lying in the range of values for muscle proteins and whole-body proteins of growing poultry published by other authors. (author)

  8. Estimating the wound healing ability of bioactive milk proteins using an optimized cell based assay

    DEFF Research Database (Denmark)

    Nyegaard, Steffen; Andreasen, Trine; Rasmussen, Jan Trige

    Milk contains many different proteins, of which the larger constituents, like the caseins and major whey constituents, are well characterized. We have for some time been studying the structure and function of proteins associated with the milk fat globule membrane, like lactadherin, MUC1/15 and xanthine oxidoreductase, along with minor whey constituents like osteopontin, EPV20, etc. The enterocyte migration rate is a key parameter in maintaining intestinal homeostasis and intestinal repair when recovering from infection or intestinal diseases like Crohn's disease and ulcerative colitis. We developed a novel in vitro wound healing assay to determine the bioactive effects of various milk proteins using human small intestine cells grown on extracellular matrix. Silicone inserts are placed in a 96-well plate and enterocytes seeded around them, creating a monolayer with a cell-free area. In current ongoing experiments, various...

  9. A new method for protein estimation in large seeds using fast-neutron-activation analysis

    International Nuclear Information System (INIS)

    Gupta, U.C.; Misra, S.C.; Rao, U.S.

    1974-01-01

    A new method was developed for the determination of protein content of large seeds, using powders of different N content. The powders were obtained by mixing glucose with amino acids in different proportions and were irradiated with and without the seeds in the MeV neutron flux. The irradiated samples were counted under identical conditions and their activities were used to calculate the protein content of the seeds. The results were compared with those obtained by conventional activation technique and were found to be in good agreement. This new method has the advantage of being non-destructive. (author)

  10. Development of a Reference Data Set (RDS) for dental age estimation (DAE) and testing of this with a separate Validation Set (VS) in a southern Chinese population.

    Science.gov (United States)

    Jayaraman, Jayakumar; Wong, Hai Ming; King, Nigel M; Roberts, Graham J

    2016-10-01

    Many countries have recently experienced a rapid increase in the demand for forensic age estimates of unaccompanied minors. Hong Kong is a major tourist and business center where there has been an increase in the number of people intercepted with false travel documents. An accurate estimation of age is only possible when the reference dataset for age estimation has been derived from the corresponding ethnic population. Thus, the aim of this study was to develop and validate a Reference Data Set (RDS) for dental age estimation for southern Chinese. A total of 2306 subjects were selected from the patient archives of a large dental hospital and the chronological age for each subject was recorded. This age was assigned to each specific stage of dental development for each tooth to create the RDS. To validate this RDS, a further 484 subjects were randomly chosen from the patient archives and their dental age was assessed based on the scores from the RDS. Dental age was estimated using a meta-analysis command corresponding to a random-effects statistical model. Chronological age (CA) and dental age (DA) were compared using the paired t-test. The overall difference between chronological and dental age (CA-DA) was 0.05 years (2.6 weeks) for males and 0.03 years (1.6 weeks) for females. The paired t-test indicated that there was no statistically significant difference between chronological and dental age (p > 0.05). The validated southern Chinese reference dataset based on dental maturation accurately estimated the chronological age. Copyright © 2016 Elsevier Ltd and Faculty of Forensic and Legal Medicine. All rights reserved.
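
    The core validation step described above is a paired comparison of chronological age (CA) and dental age (DA). A minimal Python sketch of that comparison, using invented ages and scipy's paired t-test as a stand-in for whatever statistics package the study used:

        import numpy as np
        from scipy import stats

        # Hypothetical chronological (CA) and estimated dental (DA) ages, in years.
        ca = np.array([8.2, 9.5, 11.1, 12.7, 14.3, 15.8])
        da = np.array([8.3, 9.4, 11.0, 12.9, 14.2, 15.9])

        t_stat, p_value = stats.ttest_rel(ca, da)   # paired t-test, CA vs DA
        mean_diff_years = np.mean(ca - da)          # overall CA-DA bias

        print(f"mean CA-DA: {mean_diff_years:+.2f} years, p = {p_value:.3f}")
        # p > 0.05 would indicate no statistically significant CA-DA difference,
        # as reported for the southern Chinese validation set.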

  11. Integral validation of the effective beta parameter for the MOX reactors and incinerators; Validation integrale des estimations du parametre beta effectif pour les reacteurs Mox et incinerateurs

    Energy Technology Data Exchange (ETDEWEB)

    Zammit-Averlant, V

    1998-11-19

    β_eff, the effective delayed neutron fraction, is an important parameter for the nominal operation of a reactor as well as for studies of its behaviour in accident situations. In order to improve the safety of nuclear reactors, we propose here to validate its calculation by using the ERANOS code with the ERALIB1 library and by taking into account the full physics of the fission process through the energy dependence of ν. To assess the quality of this calculation formalism, uncertainties were calculated as precisely as possible. The experimental values of β_eff, as well as their uncertainties, have also been re-evaluated for consistency, because these 'experimental' values actually contain a calculated component. We therefore obtained an entirely coherent set of calculated and measured β_eff. The comparison of calculated and measured values showed that the JEF2.2 ν_d data are already adequate, because the (E-C)/C values are below 3% on average and lie within their uncertainty bars. The experimental uncertainties, although slightly larger than those previously published, remain smaller than the uncertainties of the calculated values. This allowed us to adjust ν_d against β_eff. The adjustment brought an additional improvement to the recommended average ν_d values, both for the classical scheme (thermal energy, fast energy) and for the new scheme that accounts for the energy dependence of ν_d. β_eff for MOX or UOX fuel assemblies in thermal or fast configurations can therefore be obtained with an uncertainty due to nuclear data of about 2.0%. (author) 110 refs.

  12. PoPMuSiC 2.1: a web server for the estimation of protein stability changes upon mutation and sequence optimality

    Directory of Open Access Journals (Sweden)

    Rooman Marianne

    2011-05-01

    Full Text Available Abstract Background The rational design of modified proteins with controlled stability is of extreme importance in a whole range of applications, notably in the biotechnological and environmental areas, where proteins are used for their catalytic or other functional activities. Future breakthroughs in medical research may also be expected from an improved understanding of the effect of naturally occurring disease-causing mutations at the molecular level. Results PoPMuSiC-2.1 is a web server that predicts the thermodynamic stability changes caused by single site mutations in proteins, using a linear combination of statistical potentials whose coefficients depend on the solvent accessibility of the mutated residue. PoPMuSiC presents good prediction performances (correlation coefficient of 0.8 between predicted and measured stability changes, in cross validation, after exclusion of 10% outliers). It is moreover very fast, allowing the prediction of the stability changes resulting from all possible mutations in a medium size protein in less than a minute. This unique functionality is implemented in PoPMuSiC in a user-friendly way and is particularly easy to exploit. Another new functionality of our server concerns the estimation of the optimality of each amino acid in the sequence, with respect to the stability of the structure. It may be used to detect structural weaknesses, i.e. clusters of non-optimal residues, which represent particularly interesting sites for introducing targeted mutations. This sequence optimality data is also expected to have significant implications in the prediction and the analysis of particular structural or functional protein regions. To illustrate the interest of this new functionality, we apply it to a dataset of known catalytic sites, and show that a much larger than average concentration of structural weaknesses is detected, quantifying how these sites have been optimized for function rather than stability. Conclusion: The
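
    The record describes the predictor as a linear combination of statistical potentials whose coefficients depend on the solvent accessibility of the mutated residue. The sketch below only illustrates that general idea; the potential terms and weight curves are invented and are not PoPMuSiC's actual parameters:

        import numpy as np

        # Illustrative only: ΔΔG predicted as a linear combination of statistical-potential
        # terms, with weights that depend on the relative solvent accessibility (RSA) of the
        # mutated residue. The potentials and weight curves are invented, not PoPMuSiC's.

        def coefficients(rsa: float) -> np.ndarray:
            """Hypothetical smooth dependence of the weights on RSA (0 = buried, 1 = exposed)."""
            buried = np.array([0.9, 0.5, 0.3])
            exposed = np.array([0.2, 0.7, 0.6])
            return (1 - rsa) * buried + rsa * exposed

        def predict_ddg(potential_terms: np.ndarray, rsa: float) -> float:
            """ΔΔG estimate (kcal/mol) for a single point mutation."""
            return float(coefficients(rsa) @ potential_terms)

        # Three made-up statistical-potential differences for one hypothetical mutation.
        print(predict_ddg(np.array([1.2, -0.4, 0.8]), rsa=0.25))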

  13. Density parameter estimation for finding clusters of homologous proteins-tracing actinobacterial pathogenicity lifestyles

    DEFF Research Database (Denmark)

    Röttger, Richard; Kalaghatgi, Prabhav; Sun, Peng

    2013-01-01

    Homology detection is a long-standing challenge in computational biology. To tackle this problem, typically all-versus-all BLAST results are coupled with data partitioning approaches resulting in clusters of putative homologous proteins. One of the main problems, however, has been widely neglecte...

  14. Using plot experiments to test the validity of mass balance models employed to estimate soil redistribution rates from 137Cs and 210Pb(ex) measurements.

    Science.gov (United States)

    Porto, Paolo; Walling, Des E

    2012-10-01

    Information on rates of soil loss from agricultural land is a key requirement for assessing both on-site soil degradation and potential off-site sediment problems. Many models and prediction procedures have been developed to estimate rates of soil loss and soil redistribution as a function of the local topography, hydrometeorology, soil type and land management, but empirical data remain essential for validating and calibrating such models and prediction procedures. Direct measurements using erosion plots are, however, costly and the results obtained relate to a small enclosed area, which may not be representative of the wider landscape. In recent years, the use of fallout radionuclides and more particularly caesium-137 ((137)Cs) and excess lead-210 ((210)Pb(ex)) has been shown to provide a very effective means of documenting rates of soil loss and soil and sediment redistribution in the landscape. Several of the assumptions associated with the theoretical conversion models used with such measurements remain essentially unvalidated. This contribution describes the results of a measurement programme involving five experimental plots located in southern Italy, aimed at validating several of the basic assumptions commonly associated with the use of mass balance models for estimating rates of soil redistribution on cultivated land from (137)Cs and (210)Pb(ex) measurements. Overall, the results confirm the general validity of these assumptions and the importance of taking account of the fate of fresh fallout. However, further work is required to validate the conversion models employed in using fallout radionuclide measurements to document soil redistribution in the landscape and this could usefully direct attention to different environments and to the validation of the final estimates of soil redistribution rate as well as the assumptions of the models employed. Copyright © 2012 Elsevier Ltd. All rights reserved.

  15. Development and Validation of UV-Visible Spectrophotometric Methods for Simultaneous Estimation of Thiocolchicoside and Dexketoprofen in Bulk and Tablet Dosage Form

    OpenAIRE

    M. T. Harde; S. B. Jadhav; D. L. Dharam; P. D. Chaudhari

    2012-01-01

    Development and validation of two simple, accurate, precise and economical UV spectrophotometric methods for simultaneous estimation of Thiocolchicoside and Dexketoprofen in bulk and in tablet dosage form are described. The methods employed were Method-1, an absorbance correction method, and Method-2, a first-order derivative spectroscopic method. In Method-1, absorbance is measured at two wavelengths: 370 nm, at which Dexketoprofen has no absorbance, and 255 nm, at which both drugs have considerable absorbance. In me...
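
    A minimal sketch of the two-wavelength absorbance-correction arithmetic described above; the absorptivity values and path length are hypothetical, and only the logic (Dexketoprofen does not absorb at 370 nm, both drugs absorb at 255 nm) follows the record:

        # Two-component absorbance correction following the record's logic: at 370 nm only
        # Thiocolchicoside (TCS) absorbs, at 255 nm both drugs absorb. The absorptivity
        # values (A 1%, 1 cm) below are placeholders, not the published ones.

        a_tcs_370 = 420.0   # hypothetical absorptivity of TCS at 370 nm
        a_tcs_255 = 310.0   # hypothetical absorptivity of TCS at 255 nm
        a_dkp_255 = 650.0   # hypothetical absorptivity of Dexketoprofen (DKP) at 255 nm

        def concentrations(abs_370: float, abs_255: float) -> tuple[float, float]:
            """Return (TCS, DKP) concentrations in g/100 mL for a 1 cm cell."""
            c_tcs = abs_370 / a_tcs_370                        # only TCS absorbs at 370 nm
            c_dkp = (abs_255 - a_tcs_255 * c_tcs) / a_dkp_255  # correct 255 nm for the TCS contribution
            return c_tcs, c_dkp

        print(concentrations(abs_370=0.21, abs_255=0.58))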

  16. Validation of a novel modified wall motion score for estimation of left ventricular ejection fraction in ischemic and non-ischemic cardiomyopathy

    Energy Technology Data Exchange (ETDEWEB)

    Scholl, David, E-mail: David.Scholl@utoronto.ca [Imaging Research Laboratories, Robarts Research Institute, London, Ontario (Canada); Kim, Han W., E-mail: hanwkim@gmail.com [Duke Cardiovascular Magnetic Resonance Center, Division of Cardiology, Duke University, NC (United States); Shah, Dipan, E-mail: djshah@tmhs.org [The Methodist DeBakey Heart Center, Houston, TX (United States); Fine, Nowell M., E-mail: nowellfine@gmail.com [Division of Cardiology, Department of Medicine, Schulich School of Medicine and Dentistry, University of Western Ontario (Canada); Tandon, Shruti, E-mail: standon4@uwo.ca [Division of Cardiology, Department of Medicine, Schulich School of Medicine and Dentistry, University of Western Ontario (Canada); Thompson, Terry, E-mail: thompson@lawsonimaging.ca [Lawson Health Research Institute, London, Ontario (Canada); Department of Medical Biophysics, University of Western Ontario, London, Ontario (Canada); Drangova, Maria, E-mail: mdrangov@imaging.robarts.ca [Imaging Research Laboratories, Robarts Research Institute, London, Ontario (Canada); Department of Medical Biophysics, University of Western Ontario, London, Ontario (Canada); White, James A., E-mail: jwhite@imaging.robarts.ca [Division of Cardiology, Department of Medicine, Schulich School of Medicine and Dentistry, University of Western Ontario (Canada); Lawson Health Research Institute, London, Ontario (Canada); Imaging Research Laboratories, Robarts Research Institute, London, Ontario (Canada)

    2012-08-15

    Background: Visual determination of left ventricular ejection fraction (LVEF) by segmental scoring may be a practical alternative to volumetric analysis of cine magnetic resonance imaging (MRI). The accuracy and reproducibility of this approach has not been described. The purpose of this study was to validate a novel segmental visual scoring method for LVEF estimation using cine MRI. Methods: 362 patients with known or suspected cardiomyopathy were studied. A modified wall motion score (mWMS) was used to blindly score the wall motion of all cardiac segments from cine MRI images. The same datasets were subjected to blinded volumetric analysis using endocardial contour tracing. The population was then separated into a model cohort (N = 181) and validation cohort (N = 181), with the former used to derive a regression equation of mWMS versus true volumetric LVEF. The validation cohort was then used to test the accuracy of this regression model to estimate the true LVEF from a visually determined mWMS. Reproducibility testing of mWMS scoring was performed upon a randomly selected sample of 20 cases. Results: The regression equation relating mWMS to true LVEF in the model cohort was: LVEF = 54.23 - 0.5761 × mWMS. In the validation cohort this equation produced a strong correlation between mWMS-derived LVEF and true volumetric LVEF (r = 0.89). Bland-Altman analysis showed no systematic bias in the LVEF estimated using the mWMS (-0.3231%, 95% limits of agreement -12.22% to 11.58%). Inter-observer and intra-observer reproducibility was excellent (r = 0.93 and 0.97, respectively). Conclusion: The mWMS is a practical tool for reporting regional wall motion and provides reproducible estimates of LVEF from cine MRI.
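
    A short Python sketch applying the reported regression and summarizing agreement Bland-Altman style; the mWMS scores and volumetric LVEF values are invented:

        import numpy as np

        # Apply the reported regression (LVEF = 54.23 - 0.5761 x mWMS) and summarize
        # agreement Bland-Altman style. The scores and reference LVEF values are invented.

        def lvef_from_mwms(mwms):
            """LVEF (%) estimated from the modified wall motion score, per the record."""
            return 54.23 - 0.5761 * mwms

        mwms = np.array([0, 5, 12, 20, 31, 44])               # hypothetical scores
        lvef_volumetric = np.array([55, 50, 46, 41, 35, 28])  # hypothetical volumetric LVEF (%)

        diff = lvef_from_mwms(mwms) - lvef_volumetric
        bias = diff.mean()
        loa = 1.96 * diff.std(ddof=1)                         # 95% limits of agreement

        print(f"bias = {bias:+.2f}%, limits of agreement = [{bias - loa:.2f}, {bias + loa:.2f}]%")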

  17. Développement et validation de NESSIE: un outil d'estimation de performances multi-critères pour systèmes-sur-puce.

    OpenAIRE

    Richard, Aliénor

    2010-01-01

    The work presented in this thesis aims at validating an original multicriteria performances estimation tool, NESSIE, dedicated to the prediction of performances to accelerate the design of electronic embedded systems. This tool has been developed in a previous thesis to cope with the limitations of existing design tools and offers a new solution to face the growing complexity of the current applications and electronic platforms and the multiple constraints they are subjected to. More precisel...

  18. The Potential of The Synergy of Sunphotometer and Lidar Data to Validate Vertical Profiles of The Aerosol Mass Concentration Estimated by An Air Quality Model

    Directory of Open Access Journals (Sweden)

    Siomos N.

    2016-01-01

    Full Text Available Vertical profiles of the aerosol mass concentration derived by the Lidar/Radiometer Inversion Code (LIRIC, that uses combined sunphotometer and lidar data, were used in order to validate the aerosol mass concentration profiles estimated by the air quality model CAMx. Lidar and CIMEL measurements performed at the Laboratory of Atmospheric Physics of the Aristotle University of Thessaloniki, Greece (40.5N, 22.9E from the period 2013-2014 were used in this study.

  19. Validating fatty acid intake as estimated by an FFQ: how does the 24 h recall perform as reference method compared with the duplicate portion?

    Science.gov (United States)

    Trijsburg, Laura; de Vries, Jeanne Hm; Hollman, Peter Ch; Hulshof, Paul Jm; van 't Veer, Pieter; Boshuizen, Hendriek C; Geelen, Anouk

    2018-05-08

    To compare the performance of the commonly used 24 h recall (24hR) with the more distinct duplicate portion (DP) as reference method for validation of fatty acid intake estimated with an FFQ. Intakes of SFA, MUFA, n-3 fatty acids and linoleic acid (LA) were estimated by chemical analysis of two DP and by on average five 24hR and two FFQ. Plasma n-3 fatty acids and LA were used to objectively compare ranking of individuals based on DP and 24hR. Multivariate measurement error models were used to estimate validity coefficients and attenuation factors for the FFQ with the DP and 24hR as reference methods. Wageningen, the Netherlands. Ninety-two men and 106 women (aged 20-70 years). Validity coefficients for the fatty acid estimates by the FFQ tended to be lower when using the DP as reference method compared with the 24hR. Attenuation factors for the FFQ tended to be slightly higher based on the DP than those based on the 24hR as reference method. Furthermore, when using plasma fatty acids as reference, the DP showed comparable to slightly better ranking of participants according to their intake of n-3 fatty acids (0·33) and n-3:LA (0·34) than the 24hR (0·22 and 0·24, respectively). The 24hR gives only slightly different results compared with the distinctive but less feasible DP, therefore use of the 24hR seems appropriate as the reference method for FFQ validation of fatty acid intake.

  20. Identification and validation of quantitative trait loci for seed yield, oil and protein contents in two recombinant inbred line populations of soybean.

    Science.gov (United States)

    Wang, Xianzhi; Jiang, Guo-Liang; Green, Marci; Scott, Roy A; Song, Qijian; Hyten, David L; Cregan, Perry B

    2014-10-01

    Soybean seeds contain high levels of oil and protein, and are the important sources of vegetable oil and plant protein for human consumption and livestock feed. Increased seed yield, oil and protein contents are the main objectives of soybean breeding. The objectives of this study were to identify and validate quantitative trait loci (QTLs) associated with seed yield, oil and protein contents in two recombinant inbred line populations, and to evaluate the consistency of QTLs across different environments, studies and genetic backgrounds. Both the mapping population (SD02-4-59 × A02-381100) and validation population (SD02-911 × SD00-1501) were phenotyped for the three traits in multiple environments. Genetic analysis indicated that oil and protein contents showed high heritabilities while yield exhibited a lower heritability in both populations. Based on a linkage map constructed previously with the mapping population and using composite interval mapping and/or interval mapping analysis, 12 QTLs for seed yield, 16 QTLs for oil content and 11 QTLs for protein content were consistently detected in multiple environments and/or the average data over all environments. Of the QTLs detected in the mapping population, five QTLs for seed yield, eight QTLs for oil content and five QTLs for protein content were confirmed in the validation population by single marker analysis in at least one environment and the average data and by ANOVA over all environments. Eight of these validated QTLs were newly identified. Compared with the other studies, seven QTLs for seed yield, eight QTLs for oil content and nine QTLs for protein content further verified the previously reported QTLs. These QTLs will be useful for breeding higher yield and better quality cultivars, and help effectively and efficiently improve yield potential and nutritional quality in soybean.

  1. Ecosystem services - from assessements of estimations to quantitative, validated, high-resolution, continental-scale mapping via airborne LIDAR

    Science.gov (United States)

    Zlinszky, András; Pfeifer, Norbert

    2016-04-01

    service potential" which is the ability of the local ecosystem to deliver various functions (water retention, carbon storage etc.), but can't quantify how much of these are actually used by humans or what the estimated monetary value is. Due to its ability to measure both terrain relief and vegetation structure in high resolution, airborne LIDAR supports direct quantification of the properties of an ecosystem that lead to it delivering a given service (such as biomass, water retention, micro-climate regulation or habitat diversity). In addition, its high resolution allows direct calibration with field measurements: routine harvesting-based ecological measurements, local biodiversity indicator surveys or microclimate recordings all take place at the human scale and can be directly linked to the local value of LIDAR-based indicators at meter resolution. Therefore, if some field measurements with standard ecological methods are performed on site, the accuracy of LIDAR-based ecosystem service indicators can be rigorously validated. With this conceptual and technical approach high resolution ecosystem service assessments can be made with well established credibility. These would consolidate the concept of ecosystem services and support both scientific research and evidence-based environmental policy at local and - as data coverage is continually increasing - continental scale.

  2. Validation databases for simulation models: aboveground biomass and net primary productive, (NPP) estimation using eastwide FIA data

    Science.gov (United States)

    Jennifer C. Jenkins; Richard A. Birdsey

    2000-01-01

    As interest grows in the role of forest growth in the carbon cycle, and as simulation models are applied to predict future forest productivity at large spatial scales, the need for reliable and field-based data for evaluation of model estimates is clear. We created estimates of potential forest biomass and annual aboveground production for the Chesapeake Bay watershed...

  3. Estimation of Genetic Parameters for the Protein Profile in Danish Holstein Milk

    DEFF Research Database (Denmark)

    Buitenhuis, Albert Johannes; Poulsen, Nina Aagaard; Larsen, Lotte Bach

    in the univariate and bivariate analysis of the protein traits. Heritabilities ranged from 0.77 for κ-CN% to 0 for αs1-CN% (SE: 0.13-0.21). The genetic correlation between β-CN and κ-CN was low (0.01 ± 0.53) whereas the genetic correlation of αs2-CN with both β-CN and κ-CN was high (0.90). Furthermore α...

  4. Validity of food frequency questionnaire-based estimates of long-term long-chain n-3 polyunsaturated fatty acid intake.

    Science.gov (United States)

    Wallin, Alice; Di Giuseppe, Daniela; Burgaz, Ann; Håkansson, Niclas; Cederholm, Tommy; Michaëlsson, Karl; Wolk, Alicja

    2014-01-01

    To evaluate how long-term dietary intake of long-chain n-3 polyunsaturated fatty acids (LCn-3 PUFAs), estimated by repeated food frequency questionnaires (FFQs) over 15 years, is correlated with LCn-3 PUFAs in adipose tissue (AT). Subcutaneous adipose tissue was obtained in 2003-2004 (AT-03) from 239 randomly selected women, aged 55-75 years, after completion of a 96-item FFQ (FFQ-03). All participants had previously returned an identical FFQ in 1997 (FFQ-97) and a 67-item version in 1987-1990 (FFQ-87). Pearson product-moment correlations were used to evaluate associations between intake of total and individual LCn-3 PUFAs as estimated by the three FFQ assessments and AT-03 content (% of total fatty acids). FFQ-estimated mean relative intake of LCn-3 PUFAs (% of total fat intake) increased between all three assessments (FFQ-87, 0.55 ± 0.34; FFQ-97, 0.74 ± 0.64; FFQ-03, 0.88 ± 0.56). Validity, in terms of Pearson correlations between FFQ-03 estimates and AT-03 content, was 0.41 (95% CI 0.30-0.51) for total LCn-3 PUFA and ranged from 0.29 to 0.48 for individual fatty acids; lower correlation was observed among participants with higher percentage body fat. With regard to long-term intake estimates, past dietary intake was also correlated with AT-03 content, with correlation coefficients in the range of 0.21-0.33 and 0.21-0.34 for FFQ-97 and FFQ-87, respectively. The correlations were improved by using average estimates from two or more FFQ assessments. Exclusion of fish oil supplement users (14%) did not alter the correlations. These data indicate reasonable validity of FFQ-based estimates of long-term (up to 15 years) LCn-3 PUFA intake, justifying their use in studies of diet-disease associations.

  5. Concurrent validity and reliability of torso-worn inertial measurement unit for jump power and height estimation.

    Science.gov (United States)

    Rantalainen, Timo; Gastin, Paul B; Spangler, Rhys; Wundersitz, Daniel

    2018-09-01

    The purpose of the present study was to evaluate the concurrent validity and test-retest repeatability of torso-worn IMU-derived power and jump height in a counter-movement jump test. Twenty-seven healthy recreationally active males (age, 21.9 [SD 2.0] y; height, 1.76 [0.7] m; mass, 73.7 [10.3] kg) wore an IMU and completed three counter-movement jumps a week apart. A force platform and a 3D motion analysis system were used to concurrently measure the jumps and subsequently derive power and jump height (based on take-off velocity and flight time). The IMU significantly overestimated power (mean difference = 7.3 W/kg), and IMU-derived jump heights based on take-off velocity exhibited poorer concurrent validity (ICC = 0.72 to 0.78) and repeatability (ICC = 0.68) than flight-time-derived jump heights, which exhibited excellent validity (ICC = 0.93 to 0.96) and reliability (ICC = 0.91). Since jump height and power are closely related, and flight-time-derived jump height exhibits excellent concurrent validity and reliability, flight-time-derived jump height could provide a more desirable measure compared to power when assessing athletic performance in a counter-movement jump with IMUs.
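
    The flight-time estimate of jump height, which the record found to have excellent validity and reliability, is commonly computed from the projectile relation h = g*t^2/8. The record does not spell the formula out, so the sketch below is a generic illustration rather than the authors' exact processing chain:

        G = 9.81  # m/s^2

        def jump_height_from_flight_time(flight_time_s: float) -> float:
            """Jump height (m) from counter-movement jump flight time, h = g*t^2/8."""
            return G * flight_time_s ** 2 / 8.0

        print(f"{jump_height_from_flight_time(0.55):.3f} m")  # ~0.37 m for a 0.55 s flight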

  6. A validated risk score to estimate mortality risk in patients with dementia and pneumonia: barriers to clinical impact

    NARCIS (Netherlands)

    van der Steen, J.T.; Albers, G.; Strunk, E.; Muller, M.T.; Ribbe, M.W.

    2011-01-01

    Background: The clinical impact of risk score use in end-of-life settings is unknown, with reports limited to technical properties. Methods: We conducted a mixed-methods study to evaluate clinical impact of a validated mortality risk score aimed at informing prognosis and supporting clinicians in

  7. Validation of five minimally obstructive methods to estimate physical activity energy expenditure in young adults in semi-standardized settings

    DEFF Research Database (Denmark)

    Schneller, Mikkel Bo; Pedersen, Mogens Theisen; Gupta, Nidhi

    2015-01-01

    We compared the accuracy of five objective methods, including two newly developed methods combining accelerometry and activity type recognition (Acti4), against indirect calorimetry, to estimate total energy expenditure (EE) of different activities in semi-standardized settings. Fourteen particip...

  8. Reproducibility and relative validity of a food frequency questionnaire to estimate intake of dietary phylloquinone and menaquinones

    Science.gov (United States)

    Background: Several observational studies have investigated the relation of dietary phylloquinone and menaquinone intake with occurrence of chronic diseases. Most of these studies relied on food frequency questionnaires (FFQ) to estimate the intake of phylloquinone and menaquinones. However, none of...

  9. Development, standardization and validation of purine excretion technique for measuring microbial protein supply for Yerli Kara cross-breed cattle

    International Nuclear Information System (INIS)

    Cetinkaya, N.; Ozdemir, H.; Gucus, A.I.; Ozcan, H.; Sogut, A.; Yaman, S.

    2002-01-01

    Three experiments were conducted to evaluate the developed techniques for uric acid, allantoin and creatinine in Yerli Kara cross-breed cattle on farm at different feeding levels of locally available feed resources, to link the observed information to feed intake, and to assess the protein nutrition status of Yerli Kara cross-breed dairy cattle using urinary PD and creatinine excretion. In Experiment I, the response of daily PD excretion to feed intake in Yerli Kara cross-breed cattle on a state farm was measured. Animals were fed a mixed diet containing 30% wheat straw and 70% compounded feed. The diet contained 90% DM; its N and OM contents were 124 and 950 g/kg DM, respectively. In Experiment II, the spot urine sampling technique was applied at the state farm. Four Yerli Kara cross-breed bulls with a mean live weight of 211±41.3 kg were used. Experimental design, feeding and diet were the same as in Experiment I. The treatments were allocated according to a 4x4 Latin Square design. In Experiment III, the spot urine sampling technique was applied at smallholder farms. Compound feed containing 65% barley, 25% bran, 6% sunflower seed meal, 3% manner dust and 1% mineral and vitamin mixture (120 g/kg DM crude protein and 950 g/kg DM organic matter) was offered at a total of 2 to 3 kg in two parts, one in the morning (07:30 h) and one in the afternoon (17:00 h). The compound feed ingredients given to all animals were similar, but Groups I, II and III received 1 to 2 kg/d of straw (30 g CP/kg DM, 930 g OM/kg DM), grass hay (70 g CP/kg DM, 915 g OM/kg DM), or straw and grass hay, respectively. There were significant correlations (R² = 0.99) between PD excretion (mmol/d) and DOMI (kg/d) for YK-C cattle. PD excretion (mmol/L) was plotted against PD:creatinine x W^0.75 to obtain a slope for use as a constant in the estimation of daily PD excretion from spot samples from animals held by smallholders. The equation could be expressed as: PD (mmol/d) = 8.27 + 0.960 (PD:C x W^0.75). The
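
    A minimal Python sketch applying the reported spot-sampling equation; the spot-sample concentrations and live weight used in the example are hypothetical:

        # Daily purine-derivative (PD) excretion from a spot urine sample, using the
        # equation reported above: PD (mmol/d) = 8.27 + 0.960 * (PD:C * W^0.75).
        # The spot-sample concentrations and live weight are hypothetical.

        def daily_pd_excretion(pd_mmol_per_l: float, creatinine_mmol_per_l: float,
                               live_weight_kg: float) -> float:
            """Daily PD excretion (mmol/d) from a spot-sample PD:creatinine ratio."""
            pd_to_creatinine = pd_mmol_per_l / creatinine_mmol_per_l
            return 8.27 + 0.960 * pd_to_creatinine * live_weight_kg ** 0.75

        print(f"{daily_pd_excretion(6.4, 3.1, live_weight_kg=211):.1f} mmol/d")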

  10. Estimation of Selected Milk Protein Genetic Variants by Multi-Component Analysis of Amino Acid Profiles

    OpenAIRE

    Hollar, Carol M.

    1992-01-01

    Cation-exchange fast protein liquid chromatography separated whole casein into β-casein A2, A1, and B, K-casein, αs1-casein, and αs2-casein fractions as well as γ-caseins and several unidentified peaks using a urea-acetate buffer at pH 5 and a NaCl gradient. The whole casein fractions eluted in the following order: breakdown products of β-casein and unidentified peaks; β-casein A2, Al, and B; additional breakdown products of β-casein and unidentified peaks; K-casein; αs1-casein; and αs2-casei...

  11. Estimation of hand hygiene opportunities on an adult medical ward using 24-hour camera surveillance: validation of the HOW2 Benchmark Study.

    Science.gov (United States)

    Diller, Thomas; Kelly, J William; Blackhurst, Dawn; Steed, Connie; Boeker, Sue; McElveen, Danielle C

    2014-06-01

    We previously published a formula to estimate the number of hand hygiene opportunities (HHOs) per patient-day using the World Health Organization's "Five Moments for Hand Hygiene" methodology (HOW2 Benchmark Study). HHOs can be used as a denominator for calculating hand hygiene compliance rates when product utilization data are available. This study validates the previously derived HHO estimate using 24-hour video surveillance of health care worker hand hygiene activity. The validation study utilized 24-hour video surveillance recordings of 26 patients' hospital stays to measure the actual number of HHOs per patient-day on a medicine ward in a large teaching hospital. Statistical methods were used to compare these results to those obtained by episodic observation of patient activity in the original derivation study. Total hours of data collection were 81.3 and 1,510.8, resulting in 1,740 and 4,522 HHOs in the derivation and validation studies, respectively. Comparisons of the mean and median HHOs per 24-hour period did not differ significantly. HHOs were 71.6 (95% confidence interval: 64.9-78.3) and 73.9 (95% confidence interval: 69.1-84.1), respectively. This study validates the HOW2 Benchmark Study and confirms that expected numbers of HHOs can be estimated from the unit's patient census and patient-to-nurse ratio. These data can be used as denominators in calculations of hand hygiene compliance rates from electronic monitoring using the "Five Moments for Hand Hygiene" methodology. Copyright © 2014 Association for Professionals in Infection Control and Epidemiology, Inc. Published by Mosby, Inc. All rights reserved.
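
    A short sketch of how a validated HHO figure can serve as the denominator of a compliance rate, as the record describes; the per-patient-day value is taken from the validation result, while the event count and census figures are hypothetical, and the census/nurse-ratio formula itself is not reproduced here:

        # Hand hygiene compliance with the validated HHO estimate as denominator.
        # The ~74 HHOs per patient-day comes from the study; the event count and the
        # census figures below are hypothetical.

        hho_per_patient_day = 73.9        # validated estimate from the record
        patient_days = 28 * 30            # e.g. a 28-bed unit fully occupied for 30 days
        dispensing_events = 41_000        # events captured by electronic monitoring

        expected_opportunities = hho_per_patient_day * patient_days
        compliance = dispensing_events / expected_opportunities

        print(f"estimated compliance: {compliance:.1%}")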

  12. Relative Validity and Reproducibility of an Interviewer Administered 14-Item FFQ to Estimate Flavonoid Intake Among Older Adults with Mild-Moderate Dementia.

    Science.gov (United States)

    Kent, Katherine; Charlton, Karen

    2017-01-01

    There is a large burden on researchers and participants when attempting to accurately measure dietary flavonoid intake using dietary assessment. Minimizing participant and researcher burden when collecting dietary data may improve the validity of the results, especially in older adults with cognitive impairment. A short 14-item food frequency questionnaire (FFQ) to measure flavonoid intake and flavonoid subclasses (anthocyanins, flavan-3-ols, flavones, flavonols, and flavanones) was developed and assessed for validity and reproducibility against a 24-hour recall. Older adults with mild-moderate dementia (n = 49) attended two interviews 12 weeks apart. With the assistance of a family carer, a 24-h recall was collected at the first interview, and the flavonoid FFQ was interviewer-administered at both time-points. Validity and reproducibility were assessed using the Wilcoxon signed-rank test, Spearman's correlation coefficient, Bland-Altman plots, and Cohen's kappa. Mean flavonoid intake was determined (FFQ1 = 795 ± 492.7 mg/day, 24-h recall = 515.6 ± 384.3 mg/day). Tests of validity indicated the FFQ was better at estimating total flavonoid intake than individual flavonoid subclasses compared with the 24-h recall. There was a significant difference in total flavonoid intake estimates between the FFQ and the 24-h recall (Wilcoxon signed-rank test). For reproducibility between the two FFQ administrations, the Wilcoxon signed-rank test showed no significant difference, Spearman's correlation coefficient indicated excellent reliability (r = 0.75, p < 0.001), Bland-Altman plots visually showed small, nonsignificant bias and wide limits of agreement, and Cohen's kappa indicated fair agreement (κ = 0.429, p < 0.001). A 14-item FFQ developed to easily measure flavonoid intake in older adults with dementia demonstrates fair validity against a 24-h recall and good reproducibility.

  13. Validation of a rapid, non-radioactive method to quantify internalisation of G-protein coupled receptors.

    Science.gov (United States)

    Jongsma, Maikel; Florczyk, Urszula M; Hendriks-Balk, Mariëlle C; Michel, Martin C; Peters, Stephan L M; Alewijnse, Astrid E

    2007-07-01

    Agonist exposure can cause internalisation of G-protein coupled receptors (GPCRs), which may be a part of desensitisation but also of cellular signaling. Previous methods to study internalisation have been tedious or only poorly quantitative. Therefore, we have developed and validated a quantitative method using a sphingosine-1-phosphate (S1P) receptor as a model. Because of a lack of suitable binding studies, it has been difficult to study S1P receptor internalisation. Using a N-terminal HisG-tag, S1P(1) receptors on the cell membrane can be visualised via immunocytochemistry with a specific anti-HisG antibody. S1P-induced internalisation was concentration dependent and was quantified using a microplate reader, detecting either absorbance, a fluorescent or luminescent signal, depending on the antibodies used. Among those, the fluorescence detection method was the most convenient to use. The relative ease of this method makes it suitable to measure a large number of data points, e.g. to compare the potency and efficacy of receptor ligands.

  14. Prediction of CD8+ Epitopes in Leishmania braziliensis Proteins Using EPIBOT: In Silico Search and In Vivo Validation.

    Directory of Open Access Journals (Sweden)

    Angelo Duarte

    Full Text Available Leishmaniasis is caused by intracellular Leishmania parasites that induce a T-cell mediated response associated with recognition of CD4+ and CD8+ T cell epitopes. Identification of CD8+ antigenic determinants is crucial for vaccine and therapy development. Herein, we developed an open-source software tool dedicated to searching and compiling data obtained from currently available online prediction algorithms. We developed a two-phase algorithm, implemented in an open-source software package called EPIBOT, that consolidates the results obtained with single prediction algorithms, generating a final output in which epitopes are ranked. EPIBOT was initially trained using a set of 831 known epitopes from 397 proteins from IEDB. We then screened 63 Leishmania braziliensis vaccine candidates with the trained EPIBOT tool to search for CD8+ T cell epitopes. A proof-of-concept experiment was conducted with the top eight CD8+ epitopes selected by EPIBOT. To do this, the selected peptides were synthesized and validated for their in vivo cytotoxicity. Among the tested epitopes, three were able to induce lysis of pulsed-target cells. Our results show that EPIBOT can successfully search across existing prediction tools, generating a compiled list of candidate CD8+ epitopes. This software is a fast and simple search engine that can be customized to search over different MHC alleles or HLA haplotypes.
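
    The record describes EPIBOT as consolidating the outputs of several single prediction algorithms into one ranked list. The sketch below shows one way such consolidation could look; the tool names, peptides and mean-rank rule are illustrative and are not EPIBOT's actual algorithm:

        from collections import defaultdict

        # Toy consolidation of per-tool epitope rankings into one ranked list.
        # Tool names, peptides and the mean-rank rule are illustrative only.

        predictions = {
            "tool_A": ["KLWESPQNL", "FVDLLKNAY", "GLLDFVRHL"],
            "tool_B": ["FVDLLKNAY", "KLWESPQNL", "AMLDLLKSV"],
            "tool_C": ["KLWESPQNL", "AMLDLLKSV", "GLLDFVRHL"],
        }

        rank_sums = defaultdict(float)
        appearances = defaultdict(int)
        for ranking in predictions.values():
            for rank, peptide in enumerate(ranking, start=1):
                rank_sums[peptide] += rank
                appearances[peptide] += 1

        # Rank peptides by how many tools predict them, then by their average rank.
        consensus = sorted(rank_sums, key=lambda p: (-appearances[p], rank_sums[p] / appearances[p]))
        print(consensus)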

  15. The German version of the Expanded Prostate Cancer Index Composite (EPIC): translation, validation and minimal important difference estimation.

    Science.gov (United States)

    Umbehr, Martin H; Bachmann, Lucas M; Poyet, Cedric; Hammerer, Peter; Steurer, Johann; Puhan, Milo A; Frei, Anja

    2018-02-20

    No official German translation exists for the 50-item Expanded Prostate Cancer Index Composite (EPIC), and no minimal important difference (MID) has been established yet. The aim of the study was to translate and validate a German version of the EPIC with cultural adaptation to the different German speaking countries and to establish the MID. We translated and culturally adapted the EPIC into German. For validation, we included a consecutive subsample of 92 patients with localized prostate cancer undergoing radical prostatectomy who participated the Prostate Cancer Outcomes Cohort. Baseline and follow-up assessments took place before and six weeks after prostatectomy in 2010 and 2011. We assessed the EPIC, EORTC QLQ-PR25, Feeling Thermometer, SF-36 and a global rating of health state change variable. We calculated the internal consistency, test-retest reliability, construct validity, responsiveness and MID. For most EPIC domains and subscales, our a priori defined criteria for reliability were fulfilled (construct reliability: Cronbach's alpha 0.7-0.9; test-retest reliability: intraclass-correlation coefficient ≥ 0.7). Cross-sectional and longitudinal correlations between EPIC and EORTC QLQ-PR25 domains ranged from 0.14-0.79, and 0.06-0.5 and 0.08-0.72 for Feeling Thermometer and SF-36, respectively. We established MID values of 10, 4, 12, and 6 for the urinary, bowel, sexual and hormonal domain. The German version of the EPIC is reliable, responsive and valid to measure HRQL in prostate cancer patients and is now available in German language. With the suggested MID we provide interpretation to what extent changes in HRQL are clinically relevant for patients. Hence, study results are of interest beyond German speaking countries.

  16. Genetic mapping and validation of the loci controlling 7S α' and 11S A-type storage protein subunits in soybean [Glycine max (L.) Merr.].

    Science.gov (United States)

    Boehm, Jeffrey D; Nguyen, Vi; Tashiro, Rebecca M; Anderson, Dale; Shi, Chun; Wu, Xiaoguang; Woodrow, Lorna; Yu, Kangfu; Cui, Yuhai; Li, Zenglu

    2018-03-01

    Four soybean storage protein subunit QTLs were mapped using bulked segregant analysis and an F2 population, and were validated with an F5 RIL population. The storage protein globulins β-conglycinin (7S subunit) and glycinin (11S subunits) can affect the quantity and quality of proteins found in soybean seeds and account for more than 70% of the total soybean protein. Manipulating the storage protein subunits to enhance soymeal nutrition and for desirable tofu manufacturing characteristics are two end-use quality goals in soybean breeding programs. To aid in developing soybean cultivars with desired seed composition, an F2 mapping population (n = 448) and an F5 RIL population (n = 180) were developed by crossing high protein cultivar 'Harovinton' with the breeding line SQ97-0263_3-1a, which lacks the 7S α', 11S A1, 11S A2, 11S A3 and 11S A4 subunits. The storage protein composition of each individual in the F2 and F5 populations was profiled using SDS-PAGE. Based on the presence/absence of the subunits, genomic DNA bulks were formed among the F2 plants to identify genomic regions controlling the 7S α' and 11S protein subunits. By utilizing polymorphic SNPs between the bulks characterized with Illumina SoySNP50K iSelect BeadChips at targeted genomic regions, KASP assays were designed and used to map QTLs causing the loss of the subunits. Soybean storage protein QTLs were identified on Chromosome 3 (11S A1), Chromosome 10 (7S α' and 11S A4), and Chromosome 13 (11S A3), which were also validated in the F5 RIL population. The results of this research could allow for the deployment of marker-assisted selection for desired storage protein subunits by screening breeding populations using the SNPs linked with the subunits of interest.

  17. The validity of anthropometric leg muscle volume estimation across a wide spectrum: from able-bodied adults to individuals with a spinal cord injury.

    Science.gov (United States)

    Layec, Gwenael; Venturelli, Massimo; Jeong, Eun-Kee; Richardson, Russell S

    2014-05-01

    The assessment of muscle volume, and of changes over time, has significant clinical and research-related implications. Methods to assess muscle volume vary from simple and inexpensive to complex and expensive. Therefore this study sought to examine the validity of muscle volume estimated simply by anthropometry compared with the more complex proton magnetic resonance imaging (¹H-MRI) across a wide spectrum of individuals, including those with a spinal cord injury (SCI), a group recognized to exhibit significant muscle atrophy. Accordingly, muscle volume of the thigh and lower leg of eight subjects with a SCI and eight able-bodied subjects (controls) was determined by anthropometry and ¹H-MRI. With either method, muscle volumes were significantly lower in the SCI group compared with the controls. Anthropometric estimates of muscle volume were strongly correlated with the values assessed by ¹H-MRI in both the thigh (r² = 0.89) and the lower leg, but anthropometry systematically overestimated muscle volume compared with ¹H-MRI in both the thigh (mean bias = 2407 cm³) and the lower leg (mean bias = 170 cm³). Thus, with an appropriate correction for this systematic overestimation, muscle volume estimated from anthropometric measurements is a valid approach and provides acceptable accuracy across a spectrum from adults with normal muscle mass to individuals with a SCI and severe muscle atrophy. In practical terms this study provides the formulas that add validity to the already simple and inexpensive anthropometric approach to assessing muscle volume in clinical and research settings.
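
    A minimal sketch of a bias correction based on the mean biases reported above; a simple subtraction is assumed here, since the record does not reproduce the paper's actual correction formulas:

        # Subtract the reported mean biases (thigh: 2407 cm^3, lower leg: 170 cm^3) from
        # anthropometric estimates. The paper's actual correction formulas are not given
        # in the record, so plain bias subtraction is assumed here.

        MEAN_BIAS_CM3 = {"thigh": 2407.0, "lower_leg": 170.0}

        def corrected_volume(anthropometric_cm3: float, segment: str) -> float:
            """Bias-corrected muscle volume estimate in cm^3."""
            return anthropometric_cm3 - MEAN_BIAS_CM3[segment]

        print(corrected_volume(7200.0, "thigh"))      # hypothetical thigh estimate
        print(corrected_volume(1350.0, "lower_leg"))  # hypothetical lower-leg estimate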

  18. Creation of a Human Secretome: A Novel Composite Library of Human Secreted Proteins: Validation Using Ovarian Cancer Gene Expression Data and a Virtual Secretome Array.

    Science.gov (United States)

    Vathipadiekal, Vinod; Wang, Victoria; Wei, Wei; Waldron, Levi; Drapkin, Ronny; Gillette, Michael; Skates, Steven; Birrer, Michael

    2015-11-01

    To generate a comprehensive "Secretome" of proteins potentially found in the blood and derive a virtual Affymetrix array. To validate the utility of this database for the discovery of novel serum-based biomarkers using ovarian cancer transcriptomic data. The secretome was constructed by aggregating the data from databases of known secreted proteins, transmembrane or membrane proteins, signal peptides, G-protein coupled receptors, or proteins existing in the extracellular region, and the virtual array was generated by mapping them to Affymetrix probeset identifiers. Whole-genome microarray data from ovarian cancer, normal ovarian surface epithelium, and fallopian tube epithelium were used to identify transcripts upregulated in ovarian cancer. We established the secretome from eight public databases and a virtual array consisting of 16,521 Affymetrix U133 Plus 2.0 probesets. Using ovarian cancer transcriptomic data, we identified candidate blood-based biomarkers for ovarian cancer and performed bioinformatic validation by demonstrating rediscovery of known biomarkers including CA125 and HE4. Two novel top biomarkers (FGF18 and GPR172A) were validated in serum samples from an independent patient cohort. We present the secretome, comprising the most comprehensive resource available for protein products that are potentially found in the blood. The associated virtual array can be used to translate gene-expression data into cancer biomarker discovery. A list of blood-based biomarkers for ovarian cancer detection is reported and includes CA125 and HE4. FGF18 and GPR172A were identified and validated by ELISA as being differentially expressed in the serum of ovarian cancer patients compared with controls. ©2015 American Association for Cancer Research.
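
    A short sketch of the "virtual array" idea described above: map the secretome onto array probesets and intersect with transcripts upregulated in tumours to nominate candidate blood-based biomarkers. The probeset identifiers below are placeholders:

        # Intersect a secretome-derived probeset list with transcripts upregulated in
        # ovarian cancer to nominate candidate blood-based biomarkers. All identifiers
        # below are placeholders, not entries from the actual resource.

        secretome_probesets = {"201839_s_at", "205862_at", "211548_x_at", "220407_s_at"}
        upregulated_in_tumours = {"205862_at", "220407_s_at", "209875_s_at"}

        candidate_biomarkers = secretome_probesets & upregulated_in_tumours
        print(sorted(candidate_biomarkers))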

  19. Air temperature estimation with MSG-SEVIRI data: Calibration and validation of the TVX algorithm for the Iberian Peninsula

    DEFF Research Database (Denmark)

    Nieto Solana, Hector; Sandholt, Inge; Aguado, Inmaculada

    2011-01-01

    Air temperature can be estimated from remote sensing by combining information in thermal infrared and optical wavelengths. The empirical TVX algorithm is based on an estimated linear relationship between observed Land Surface Temperature (LST) and a Spectral Vegetation Index (NDVI). Air temperature...... variation, land cover, landscape heterogeneity and topography. Results showed that the newly calibrated NDVImax values perform well, with a Mean Absolute Error ranging between 2.8 °C and 4 °C. In addition, vegetation-specific NDVImax values improve the accuracy compared with a single NDVImax....
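
    A minimal sketch of the TVX idea described in the record: within a spatial window, fit a linear LST-NDVI relationship across pixels and extrapolate it to NDVImax, where canopy temperature is assumed to approximate air temperature. The pixel values and NDVImax are synthetic:

        import numpy as np

        # Fit a linear LST-NDVI relationship over the pixels of one moving window and
        # extrapolate it to NDVImax. All values, including NDVImax, are synthetic.

        ndvi = np.array([0.18, 0.25, 0.33, 0.41, 0.52, 0.60])   # window pixels
        lst_c = np.array([41.0, 38.5, 36.2, 33.9, 30.8, 28.7])  # land surface temperature (deg C)

        slope, intercept = np.polyfit(ndvi, lst_c, deg=1)        # LST = slope * NDVI + intercept

        NDVI_MAX = 0.86                                          # hypothetical calibrated NDVImax
        air_temperature_c = slope * NDVI_MAX + intercept
        print(f"estimated air temperature: {air_temperature_c:.1f} deg C")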

  20. The excretion of isotope in urea and ammonia for estimating protein turnover in man with [15N]glycine

    International Nuclear Information System (INIS)

    Fern, E.B.; Garlick, P.J.; McNurlan, M.A.; Waterlow, J.C.

    1981-01-01

    Four normal adults were given [15N]glycine in a single dose either orally or intravenously. Rates of whole-body protein turnover were estimated from the excretion of 15N in ammonia and in urea during the following 9 h. The rate derived from urea took account of the [15N]urea retained in body water. In postabsorptive subjects the rates of protein synthesis given by ammonia were equal to those from urea, when the isotope was given orally, but lower when an intravenous dose was given. In subjects receiving equal portions of food every 2 h rates of synthesis calculated from ammonia were much lower than those from urea whether an oral or intravenous isotope was given. Comparison of rates obtained during the postabsorptive and absorptive periods indicated regulation by food intake primarily of synthesis when measurements were made on urea, but regulation primarily of breakdown when measurements were made on ammonia. These inconsistencies suggest that changes in protein metabolism might be assessed better by correlating results given by different end-products, and it is suggested that the mean value given by urea and ammonia will be useful for this purpose. (author)
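
    The single-dose end-product calculation that underlies this kind of study is commonly written as Q = E_N x d / e_x, i.e. the N flux is estimated from the fraction of the 15N dose recovered in the chosen end-product. The record does not spell the formula out, so the sketch below is a generic illustration, not the authors' exact procedure:

        # Generic single-dose end-product calculation for whole-body protein turnover.
        # Assumed relations: flux Q = urinary N * (15N dose / 15N in the end product),
        # synthesis = Q - N excretion, breakdown = Q - N intake. All inputs hypothetical.

        N_TO_PROTEIN = 6.25  # conventional g protein per g N

        def protein_turnover(dose_15n_mg: float, end_product_15n_mg: float,
                             urinary_n_g: float, n_intake_g: float) -> dict:
            flux_n = urinary_n_g * dose_15n_mg / end_product_15n_mg  # g N over the period
            return {
                "flux_g_protein": flux_n * N_TO_PROTEIN,
                "synthesis_g_protein": (flux_n - urinary_n_g) * N_TO_PROTEIN,
                "breakdown_g_protein": (flux_n - n_intake_g) * N_TO_PROTEIN,
            }

        print(protein_turnover(dose_15n_mg=100, end_product_15n_mg=12,
                               urinary_n_g=4.5, n_intake_g=5.0))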

  1. Excretion of isotope in urea and ammonia for estimating protein turnover in man with [15N]glycine

    Energy Technology Data Exchange (ETDEWEB)

    Fern, E B; Garlick, P J; McNurlan, M A; Waterlow, J C [London School of Hygiene and Tropical Medicine (UK)

    1981-01-01

    Four normal adults were given [15N]glycine in a single dose either orally or intravenously. Rates of whole-body protein turnover were estimated from the excretion of 15N in ammonia and in urea during the following 9 h. The rate derived from urea took account of the [15N]urea retained in body water. In postabsorptive subjects the rates of protein synthesis given by ammonia were equal to those from urea, when the isotope was given orally, but lower when an intravenous dose was given. In subjects receiving equal portions of food every 2 h rates of synthesis calculated from ammonia were much lower than those from urea whether an oral or intravenous isotope was given. Comparison of rates obtained during the postabsorptive and absorptive periods indicated regulation by food intake primarily of synthesis when measurements were made on urea, but regulation primarily of breakdown when measurements were made on ammonia. These inconsistencies suggest that changes in protein metabolism might be assessed better by correlating results given by different end-products, and it is suggested that the mean value given by urea and ammonia will be useful for this purpose.

  2. Improved GRACE regional mass balance estimates of the Greenland ice sheet cross-validated with the input-output method

    NARCIS (Netherlands)

    Xu, Zheng; Schrama, Ernst J. O.; van der Wal, Wouter; van den Broeke, Michiel; Enderlin, Ellyn M.

    2016-01-01

    In this study, we use satellite gravimetry data from the Gravity Recovery and Climate Experiment (GRACE) to estimate regional mass change of the Greenland ice sheet (GrIS) and neighboring glaciated regions using a least squares inversion approach. We also consider results from the input–output

  3. Improved GRACE regional mass balance estimates of the Greenland ice sheet cross-validated with the input–output method

    NARCIS (Netherlands)

    Xu, Z.; Schrama, E.J.O.; van der Wal, W.; van den Broeke, MR; Enderlin, EM

    2016-01-01

    In this study, we use satellite gravimetry data from the Gravity Recovery and Climate Experiment (GRACE) to estimate regional mass change of the Greenland ice sheet (GrIS) and neighboring glaciated regions using a least squares inversion approach. We also consider results from the input–output

  4. Validation of abundance estimates from mark-recapture and removal techniques for rainbow trout captured by electrofishing in small streams

    Science.gov (United States)

    Amanda E. Rosenberger; Jason B. Dunham

    2005-01-01

    Estimation of fish abundance in streams using the removal model or the Lincoln–Peterson mark–recapture model is a common practice in fisheries. These models produce misleading results if their assumptions are violated. We evaluated the assumptions of these two models via electrofishing of rainbow trout Oncorhynchus mykiss in central Idaho streams....
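
    The record names the two estimators but does not give their formulas; the sketch below uses the standard Chapman-modified Lincoln-Peterson form and a simple two-pass removal estimator, with hypothetical catch numbers:

        # Chapman-modified Lincoln-Peterson and two-pass removal estimators; the fish
        # counts below are hypothetical.

        def lincoln_peterson_chapman(marked: int, captured: int, recaptured: int) -> float:
            """Chapman's bias-corrected Lincoln-Peterson abundance estimate."""
            return (marked + 1) * (captured + 1) / (recaptured + 1) - 1

        def two_pass_removal(catch_first: int, catch_second: int) -> float:
            """Two-pass removal estimate (assumes equal capture probability per pass)."""
            return catch_first ** 2 / (catch_first - catch_second)

        print(f"mark-recapture estimate: {lincoln_peterson_chapman(60, 55, 20):.0f} fish")
        print(f"removal estimate:        {two_pass_removal(70, 30):.0f} fish")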

  5. Identification and Validation of Protein Biomarkers of Response to Neoadjuvant Platinum Chemotherapy in Muscle Invasive Urothelial Carcinoma.

    Directory of Open Access Journals (Sweden)

    Alexander S Baras

    of both these protein biomarkers detected by IHC in biopsy specimens along with the relevant clinical parameters resulted in a prediction model able to significantly stratify the likelihood of NAC resistance in our cohort (n = 37) into two well-separated halves (low: 26%, n = 19; high: 89%, n = 18; Fisher's exact p = 0.0002). We illustrate the feasibility of translating a gene expression signature of NAC response from a discovery cohort into immunohistochemical markers readily applicable to MIBC biopsy specimens in our independent cohort. The results from this study are being characterized in additional validation cohorts. Additionally, we anticipate that emerging somatic mutations in MIBC will also be important for NAC response prediction. The relationship of the findings in this study to the current understanding of variant histologic subtypes of MIBC along with the evolving molecular subtypes of MIBC as it relates to NAC response remains to be fully characterized.

  6. A validated calculator to estimate risk of cesarean after an induction of labor with an unfavorable cervix.

    Science.gov (United States)

    Levine, Lisa D; Downes, Katheryne L; Parry, Samuel; Elovitz, Michal A; Sammel, Mary D; Srinivas, Sindhu K

    2018-02-01

    Induction of labor occurs in >20% of pregnancies, which equates to approximately 1 million women undergoing an induction in the United States annually. Regardless of how common inductions are, our ability to predict induction success is limited. Although multiple risk factors for a failed induction have been identified, risk factors alone are not enough to quantify an actual risk of cesarean for an individual woman undergoing an induction. The objective of this study was to derive and validate a prediction model for cesarean after induction with an unfavorable cervix and to create a Web-based calculator to assist in patient counseling. Derivation and validation of a prediction model for cesarean delivery after induction was performed as part of a planned secondary analysis of a large randomized trial. A predictive model for cesarean delivery was derived using multivariable logistic regression from a large randomized trial on induction methods (n = 491) that took place from 2013 through 2015 at an academic institution. Full-term (≥37 weeks) women carrying a singleton gestation with intact membranes and an unfavorable cervix (Bishop score ≤6 and dilation ≤2 cm) undergoing an induction were included in this trial. Both nulliparous and multiparous women were included. Women with a prior cesarean were excluded. Refinement of the prediction model was performed using an observational cohort of women from the same institution who underwent an induction (n = 364) during the trial period. An external validation was performed utilizing a publicly available database (Consortium for Safe Labor) that includes information for >200,000 deliveries from 19 hospitals across the United States from 2002 through 2008. After applying the same inclusion and exclusion criteria utilized in the derivation cohort, a total of 8466 women remained for analysis. The discriminative power of each model was assessed using a bootstrap, bias-corrected area under the curve. The cesarean delivery
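
    A sketch of the general modelling approach described above: fit a multivariable logistic regression for cesarean after induction and assess discrimination with a bootstrapped AUC. The predictors, coefficients and data are synthetic, and the bias-corrected optimism adjustment is only loosely mirrored:

        import numpy as np
        from sklearn.linear_model import LogisticRegression
        from sklearn.metrics import roc_auc_score

        # Synthetic stand-in for the study's data: three made-up predictors and a
        # simulated cesarean outcome.
        rng = np.random.default_rng(0)
        n = 500
        X = np.column_stack([
            rng.normal(30, 5, n),    # e.g. maternal age (hypothetical predictor)
            rng.normal(4, 1.5, n),   # e.g. modified Bishop score (hypothetical predictor)
            rng.integers(0, 2, n),   # e.g. nulliparity indicator (hypothetical predictor)
        ])
        logit = -2.0 + 0.04 * X[:, 0] - 0.5 * X[:, 1] + 1.2 * X[:, 2]
        y = rng.random(n) < 1 / (1 + np.exp(-logit))

        model = LogisticRegression().fit(X, y)

        # Bootstrap the AUC, loosely mirroring the bias-corrected area-under-the-curve
        # assessment mentioned in the record.
        aucs = [roc_auc_score(y[idx], model.predict_proba(X[idx])[:, 1])
                for idx in (rng.integers(0, n, n) for _ in range(200))]

        print(f"apparent AUC: {roc_auc_score(y, model.predict_proba(X)[:, 1]):.2f}")
        print(f"bootstrap AUC: {np.mean(aucs):.2f} "
              f"(2.5-97.5%: {np.percentile(aucs, 2.5):.2f}-{np.percentile(aucs, 97.5):.2f})")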

  7. Estimation of low-dose radiation-responsive proteins in the absence of genomic instability in normal human fibroblast cells.

    Science.gov (United States)

    Yim, Ji-Hye; Yun, Jung Mi; Kim, Ji Young; Nam, Seon Young; Kim, Cha Soon

    2017-11-01

    Low-dose radiation has various biological effects such as adaptive responses, low-dose hypersensitivity, as well as beneficial effects. However, little is known about the particular proteins involved in these effects. Here, we sought to identify low-dose radiation-responsive phosphoproteins in normal fibroblast cells. We assessed genomic instability and proliferation of fibroblast cells after γ-irradiation by γ-H2AX foci and micronucleus formation analyses and BrdU incorporation assay, respectively. We screened fibroblast cells 8 h after low-dose (0.05 Gy) γ-irradiation using Phospho Explorer Antibody Microarray and validated two differentially expressed phosphoproteins using Western blotting. Cell proliferation proceeded normally in the absence of genomic instability after low-dose γ-irradiation. Phospho antibody microarray analysis and Western blotting revealed increased expression of two phosphoproteins, phospho-NFκB (Ser536) and phospho-P70S6K (Ser418), 8 h after low-dose radiation. Our findings suggest that low-dose radiation of normal fibroblast cells activates the expression of phospho-NFκB (Ser536) and phospho-P70S6K (Ser418) in the absence of genomic instability. Therefore, these proteins may be involved in DNA damage repair processes.

  8. Estimation of Daily Proteinuria in Patients with Amyloidosis by Using the Protein-To-Creatinine ratio in Random Urine Samples.

    Science.gov (United States)

    Talamo, Giampaolo; Mir Muhammad, A; Pandey, Manoj K; Zhu, Junjia; Creer, Michael H; Malysz, Jozef

    2015-02-11

    Measurement of daily proteinuria in patients with amyloidosis is recommended at the time of diagnosis for assessing renal involvement, and for monitoring disease activity. Renal involvement is usually defined by proteinuria >500 mg/day. We evaluated the accuracy of the random urine protein-to-creatinine ratio (Pr/Cr) in predicting 24 hour proteinuria in patients with amyloidosis. We compared results of random urine Pr/Cr ratio and concomitant 24-hour urine collections in 44 patients with amyloidosis. We found a strong correlation (Spearman's ρ=0.874) between the Pr/Cr ratio and the 24 hour urine protein excretion. For predicting renal involvement, the optimal cut-off point of the Pr/Cr ratio was 715 mg/g. The sensitivity and specificity for this point were 91.8% and 95.5%, respectively, and the area under the curve value was 97.4%. We conclude that the random urine Pr/Cr ratio could be useful in the screening of renal involvement in patients with amyloidosis. If validated in a prospective study, the random urine Pr/Cr ratio could replace the 24 hour urine collection for the assessment of daily proteinuria and presence of nephrotic syndrome in patients with amyloidosis.
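
    A small Python illustration of the screening logic reported above: classify renal involvement (24-h proteinuria >500 mg/day) from the spot Pr/Cr ratio using the reported 715 mg/g cut-off; the patient values are synthetic:

        import numpy as np
        from scipy import stats

        PR_CR_CUTOFF_MG_PER_G = 715.0   # reported optimal cut-off

        # Synthetic patients: spot Pr/Cr ratio (mg/g) and 24-h proteinuria (mg/day).
        pr_cr = np.array([120, 430, 760, 980, 1500, 300, 820, 2500])
        proteinuria_24h = np.array([90, 380, 820, 1100, 2100, 450, 600, 4200])

        rho, _ = stats.spearmanr(pr_cr, proteinuria_24h)
        predicted_renal = pr_cr > PR_CR_CUTOFF_MG_PER_G
        actual_renal = proteinuria_24h > 500            # renal involvement definition

        sensitivity = (predicted_renal & actual_renal).sum() / actual_renal.sum()
        specificity = (~predicted_renal & ~actual_renal).sum() / (~actual_renal).sum()

        print(f"Spearman rho = {rho:.2f}, sensitivity = {sensitivity:.0%}, specificity = {specificity:.0%}")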

  9. Estimation of daily proteinuria in patients with amyloidosis by using the protein-to-creatinine ratio in random urine sample

    Directory of Open Access Journals (Sweden)

    Giampaolo Talamo

    2015-02-01

    Full Text Available Measurement of daily proteinuria in patients with amyloidosis is recommended at the time of diagnosis for assessing renal involvement, and for monitoring disease activity. Renal involvement is usually defined by proteinuria >500 mg/day. We evaluated the accuracy of the random urine protein-to-creatinine ratio (Pr/Cr) in predicting 24 hour proteinuria in patients with amyloidosis. We compared results of random urine Pr/Cr ratio and concomitant 24-hour urine collections in 44 patients with amyloidosis. We found a strong correlation (Spearman's ρ=0.874) between the Pr/Cr ratio and the 24 hour urine protein excretion. For predicting renal involvement, the optimal cut-off point of the Pr/Cr ratio was 715 mg/g. The sensitivity and specificity for this point were 91.8% and 95.5%, respectively, and the area under the curve value was 97.4%. We conclude that the random urine Pr/Cr ratio could be useful in the screening of renal involvement in patients with amyloidosis. If validated in a prospective study, the random urine Pr/Cr ratio could replace the 24 hour urine collection for the assessment of daily proteinuria and presence of nephrotic syndrome in patients with amyloidosis.

  10. Antibody-validated proteins in inflamed islets of fulminant type 1 diabetes profiled by laser-capture microdissection followed by mass spectrometry.

    Directory of Open Access Journals (Sweden)

    Yoriko Nishida

    Full Text Available There are no reports of proteomic analyses of inflamed islets in type 1 diabetes. Proteins expressed in the islets of enterovirus-associated fulminant type 1 diabetes (FT1DM) with extensive insulitis were identified by laser-capture microdissection mass spectrometry using formalin-fixed paraffin-embedded pancreatic tissues. Thirty-eight proteins were identified solely in FT1DM islets, most of which have not been previously linked to type 1 diabetes. Five protein-protein interacting clusters were identified, and the cellular localization of selected proteins was validated immunohistochemically. Migratory activity-related proteins, including plastin-2 (LCP1), moesin (MSN), lamin-B1 (LMNB1), Ras GTPase-activating-like protein (IQGAP1) and others, were identified in CD8+ T cells and CD68+ macrophages infiltrating inflamed FT1DM islets. Proteins involved in successive signaling in innate/adaptive immunity were identified, including SAM domain and HD domain-containing protein 1 (SAMHD1), Ras GTPase-activating-like protein (IQGAP1), proteasome activator complex subunit 1 (PSME1), HLA class I histocompatibility antigen (HLA-C), and signal transducer and activator of transcription 1-alpha/beta (STAT1). Angiogenic (thymidine phosphorylase, TYMP) and anti-angiogenic (tryptophan-tRNA ligase, WARS) factors were identified in migrating CD8+ T cells and CD68+ macrophages. Proteins related to virus replication and cell proliferation, including probable ATP-dependent RNA helicase DEAD box helicase 5 (DDX5) and heterogeneous nuclear ribonucleoprotein H (HNRNPH1), were identified. The anti-apoptotic protein T-complex protein 1 subunit epsilon (CCT5), the anti-oxidative enzyme 6-phosphogluconate dehydrogenase (PDG), and the anti-viral and anti-apoptotic proteins serpin B6 (SERPINB6) and heat shock 70 kDa protein 1-like (HSPA1L) were identified in FT1DM-affected islet cells. The identified FT1DM-characterizing proteins include those involved in aggressive beta cell destruction through

  11. Validating a mass balance accounting approach to using 7Be measurements to estimate event-based erosion rates over an extended period at the catchment scale

    Science.gov (United States)

    Porto, Paolo; Walling, Des E.; Cogliandro, Vanessa; Callegari, Giovanni

    2016-07-01

    Use of the fallout radionuclides cesium-137 and excess lead-210 offers important advantages over traditional methods of quantifying erosion and soil redistribution rates. However, both radionuclides provide information on longer-term (i.e., 50-100 years) average rates of soil redistribution. Beryllium-7, with its half-life of 53 days, can provide a basis for documenting short-term soil redistribution and it has been successfully employed in several studies. However, the approach commonly used introduces several important constraints related to the timing and duration of the study period. A new approach proposed by the authors that overcomes these constraints has been successfully validated using an erosion plot experiment undertaken in southern Italy. Here, a further validation exercise undertaken in a small (1.38 ha) catchment is reported. The catchment was instrumented to measure event sediment yields and beryllium-7 measurements were employed to document the net soil loss for a series of 13 events that occurred between November 2013 and June 2015. In the absence of significant sediment storage within the catchment's ephemeral channel system and of a significant contribution from channel erosion to the measured sediment yield, the estimates of net soil loss for the individual events could be directly compared with the measured sediment yields to validate the former. The close agreement of the two sets of values is seen as successfully validating the use of beryllium-7 measurements and the new approach to obtain estimates of net soil loss for a sequence of individual events occurring over an extended period at the scale of a small catchment.

  12. Development and validation of a food photography manual, as a tool for estimation of food portion size in epidemiological dietary surveys in Tunisia

    Directory of Open Access Journals (Sweden)

    Mongia Bouchoucha

    2016-08-01

    Full Text Available Background: Estimation of food portion sizes has always been a challenge in dietary studies on free-living individuals. The aim of this work was to develop and validate a food photography manual to improve the accuracy of the estimated size of consumed food portions. Methods: A manual was compiled from digital photos of foods commonly consumed by the Tunisian population. The food was cooked and weighed before taking digital photographs of three portion sizes. The manual was validated by comparing the method of 24-hour recall (using photos) to the reference method [food weighing (FW)]. In both the methods, the comparison focused on food intake amounts as well as nutritional issues. Validity was assessed by Bland–Altman limits of agreement. In total, 31 male and female volunteers aged 9–89 participated in the study. Results: We focused on eight food categories and compared their estimated amounts (using the 24-hour recall method) to those actually consumed (using FW). Animal products and sweets were underestimated, whereas pasta, bread, vegetables, fruits, and dairy products were overestimated. However, the difference between the two methods is not statistically significant except for pasta (p<0.05) and dairy products (p<0.05). The coefficient of correlation between the two methods is highly significant, ranging from 0.876 for pasta to 0.989 for dairy products. Nutrient intake calculated for both methods showed insignificant differences except for fat (p<0.001) and dietary fiber (p<0.05). A highly significant correlation was observed between the two methods for all micronutrients. The test agreement highlights the lack of difference between the two methods. Conclusion: The difference between the 24-hour recall method using digital photos and the weighing method is acceptable. Our findings indicate that the food photography manual can be a useful tool for quantifying food portion sizes in epidemiological dietary surveys.

  13. Development and validation of a food photography manual, as a tool for estimation of food portion size in epidemiological dietary surveys in Tunisia.

    Science.gov (United States)

    Bouchoucha, Mongia; Akrout, Mouna; Bellali, Hédia; Bouchoucha, Rim; Tarhouni, Fadwa; Mansour, Abderraouf Ben; Zouari, Béchir

    2016-01-01

    Estimation of food portion sizes has always been a challenge in dietary studies on free-living individuals. The aim of this work was to develop and validate a food photography manual to improve the accuracy of the estimated size of consumed food portions. A manual was compiled from digital photos of foods commonly consumed by the Tunisian population. The food was cooked and weighed before taking digital photographs of three portion sizes. The manual was validated by comparing the method of 24-hour recall (using photos) to the reference method [food weighing (FW)]. In both the methods, the comparison focused on food intake amounts as well as nutritional issues. Validity was assessed by Bland-Altman limits of agreement. In total, 31 male and female volunteers aged 9-89 participated in the study. We focused on eight food categories and compared their estimated amounts (using the 24-hour recall method) to those actually consumed (using FW). Animal products and sweets were underestimated, whereas pasta, bread, vegetables, fruits, and dairy products were overestimated. However, the difference between the two methods is not statistically significant except for pasta (p<0.05) and dairy products (p<0.05). The coefficient of correlation between the two methods is highly significant, ranging from 0.876 for pasta to 0.989 for dairy products. Nutrient intake calculated for both methods showed insignificant differences except for fat (p<0.001) and dietary fiber (p<0.05). A highly significant correlation was observed between the two methods for all micronutrients. The test agreement highlights the lack of difference between the two methods. The difference between the 24-hour recall method using digital photos and the weighing method is acceptable. Our findings indicate that the food photography manual can be a useful tool for quantifying food portion sizes in epidemiological dietary surveys.
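
    Bland–Altman limits of agreement, used in both versions of this record to compare the photo-based recall with food weighing, can be computed in a few lines; the per-subject values below are placeholders, not data from the Tunisian study.

        import numpy as np

        # Hypothetical per-subject estimates of the same intake (e.g., g/day) by the two methods
        photo_recall = np.array([210.0, 180.0, 330.0, 95.0, 260.0, 150.0])
        food_weighing = np.array([200.0, 195.0, 310.0, 100.0, 240.0, 160.0])

        diff = photo_recall - food_weighing            # per-subject difference between methods
        bias = diff.mean()                             # mean over/underestimation
        sd = diff.std(ddof=1)
        loa = (bias - 1.96 * sd, bias + 1.96 * sd)     # 95% limits of agreement

        print(f"bias = {bias:.1f}, limits of agreement = [{loa[0]:.1f}, {loa[1]:.1f}]")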

  14. Development and validation of a stability-indicating RP–HPLC method for estimation of atazanavir sulfate in bulk

    Directory of Open Access Journals (Sweden)

    S. Dey

    2017-04-01

    Full Text Available A stability-indicating reverse phase–high performance liquid chromatography (RP–HPLC) method was developed and validated for the determination of atazanavir sulfate in tablet dosage forms using a Phenomenex C18 column (250 mm×4.6 mm, 5 μm) with a mobile phase consisting of 900 mL of HPLC-grade methanol and 100 mL of HPLC-grade water. The pH was adjusted to 3.55 with acetic acid. The mobile phase was sonicated for 10 min and filtered through a 0.45 μm membrane filter; the flow rate was 0.5 mL/min. Detection was carried out at 249 nm, and the retention time of atazanavir sulfate was found to be 8.323 min. Linearity was observed from 10 to 90 μg/mL (coefficient of determination R² = 0.999) with the equation y = 23.427x + 37.732. Atazanavir sulfate was subjected to stress conditions including acidic, alkaline, oxidative, photolytic and thermal degradation, and the results showed that it was most sensitive to acidic degradation. The method was validated as per ICH guidelines.
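
    With the reported calibration line (peak area y = 23.427x + 37.732 over 10-90 μg/mL), back-calculating the concentration of an unknown sample is a simple inversion; the peak-area value in the sketch below is purely illustrative.

        slope, intercept = 23.427, 37.732      # reported regression of peak area on concentration

        def concentration_from_area(peak_area):
            # Invert y = slope*x + intercept to recover the concentration in ug/mL
            return (peak_area - intercept) / slope

        print(f"{concentration_from_area(1208.0):.1f} ug/mL")   # hypothetical peak area -> ~50 ug/mL
        # Refitting the line from calibration standards would use, e.g., numpy.polyfit(conc, area, 1)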

  15. Development and validation of reversed-phase HPLC gradient method for the estimation of efavirenz in plasma.

    Directory of Open Access Journals (Sweden)

    Shweta Gupta

    Full Text Available Efavirenz is an anti-viral agent of the non-nucleoside reverse transcriptase inhibitor category used as a part of highly active antiretroviral therapy for the treatment of infections of human immunodeficiency virus type-1. A simple, sensitive and rapid reversed-phase high performance liquid chromatographic gradient method was developed and validated for the determination of efavirenz in plasma. The method was developed with high performance liquid chromatography using a Waters X-Terra Shield RP18 column (50 x 4.6 mm, 3.5 μm) and a mobile phase consisting of phosphate buffer pH 3.5 and acetonitrile. The eluate was monitored with a UV-Visible detector at 260 nm at a flow rate of 1.5 mL/min. Tenofovir disoproxil fumarate was used as the internal standard. The method was validated for linearity, precision, accuracy, specificity and robustness, and the data obtained were statistically analyzed. The calibration curve was found to be linear over the concentration range of 1-300 μg/mL. The retention times of efavirenz and tenofovir disoproxil fumarate (internal standard) were 5.941 min and 4.356 min, respectively. The regression coefficient value was found to be 0.999. The limit of detection and the limit of quantification obtained were 0.03 and 0.1 μg/mL, respectively. The developed HPLC method can be useful for quantitative determination of pharmacokinetic parameters of efavirenz in plasma.

  16. Stability indicating method development and validation of assay method for the estimation of rizatriptan benzoate in tablet

    Directory of Open Access Journals (Sweden)

    Chandrashekhar K. Gadewar

    2017-05-01

    Full Text Available A simple, sensitive, precise and specific high performance liquid chromatography method was developed and validated for the determination of rizatriptan in rizatriptan benzoate tablets. The separation was carried out using a mobile phase consisting of acetonitrile and pH 3.4 phosphate buffer in a ratio of 20:80. The column used was a Zorbax SB CN (250 mm × 4.6 mm, 5 μm) with a flow rate of 1 ml/min and UV detection at 225 nm. The retention times of rizatriptan and benzoic acid were found to be 4.751 and 8.348 min, respectively. A forced degradation study of rizatriptan benzoate in its tablet form was conducted under conditions of hydrolysis, oxidation, thermal stress and photolysis. Rizatriptan was found to be stable in basic buffer, whereas in acidic buffer it degraded (water bath at 60 °C for 15 min). The detector response of rizatriptan is directly proportional to concentration over the range of 30% to 160% of the test concentration, i.e. 15.032 to 80.172 mcg/ml. Results of the analysis were validated statistically and by recovery studies (mean recovery = 99.44%). The results of the study showed that the proposed method is simple, rapid, precise and accurate, and useful for the routine determination of rizatriptan in pharmaceutical dosage forms.

  17. Validation and reliability of the sex estimation of the human os coxae using freely available DSP2 software for bioarchaeology and forensic anthropology.

    Science.gov (United States)

    Brůžek, Jaroslav; Santos, Frédéric; Dutailly, Bruno; Murail, Pascal; Cunha, Eugenia

    2017-10-01

    A new tool for skeletal sex estimation based on measurements of the human os coxae is presented using skeletons from a metapopulation of identified adult individuals from twelve independent population samples. For reliable sex estimation, a posterior probability greater than 0.95 was considered to be the classification threshold: below this value, estimates are considered indeterminate. By providing free software, we aim to develop an even more disseminated method for sex estimation. Ten metric variables collected from 2,040 ossa coxa of adult subjects of known sex were recorded between 1986 and 2002 (reference sample). To test both the validity and reliability, a target sample consisting of two series of adult ossa coxa of known sex (n = 623) was used. The DSP2 software (Diagnose Sexuelle Probabiliste v2) is based on Linear Discriminant Analysis, and the posterior probabilities are calculated using an R script. For the reference sample, any combination of four dimensions provides a correct sex estimate in at least 99% of cases. The percentage of individuals for whom sex can be estimated depends on the number of dimensions; for all ten variables it is higher than 90%. Those results are confirmed in the target sample. Our posterior probability threshold of 0.95 for sex estimate corresponds to the traditional sectioning point used in osteological studies. DSP2 software is replacing the former version that should not be used anymore. DSP2 is a robust and reliable technique for sexing adult os coxae, and is also user friendly. © 2017 Wiley Periodicals, Inc.

  18. Validation of a prefractionation method followed by two-dimensional electrophoresis – Applied to cerebrospinal fluid proteins from frontotemporal dementia patients

    Directory of Open Access Journals (Sweden)

    Sjögren Magnus

    2004-11-01

    Full Text Available Abstract Background The aim of this study was firstly, to improve and validate a cerebrospinal fluid (CSF) prefractionation method followed by two-dimensional electrophoresis (2-DE) and secondly, using this strategy to investigate differences between the CSF proteome of frontotemporal dementia (FTD) patients and controls. From each subject three ml of CSF was prefractionated using liquid phase isoelectric focusing prior to 2-DE. Results With respect to protein recovery and purification potential, ethanol precipitation of the prefractionated CSF sample was found superior, after testing several sample preparation methods. The reproducibility of prefractionated CSF analyzed on 2-D gels was comparable to direct 2-DE analysis of CSF. The protein spots on the prefractionated 2-D gels had an increased intensity, indicating a higher protein concentration, compared to direct 2-D gels. Prefractionated 2-DE analysis of FTD and control CSF showed that 26 protein spots were changed at least two fold. Using mass spectrometry, 13 of these protein spots were identified, including retinol-binding protein, Zn-α-2-glycoprotein, proapolipoprotein A1, β-2-microglobulin, transthyretin, albumin and alloalbumin. Conclusion The results suggest that the prefractionated 2-DE method can be useful for enrichment of CSF proteins and may provide a new tool to investigate the pathology of neurodegenerative diseases. This study confirmed reduced levels of retinol-binding protein and revealed some new biomarker candidates for FTD.

  19. Discovery and Validation of a Six-Marker Serum Protein Signature for the Diagnosis of Active Pulmonary Tuberculosis.

    Science.gov (United States)

    De Groote, Mary A; Sterling, David G; Hraha, Thomas; Russell, Theresa M; Green, Louis S; Wall, Kirsten; Kraemer, Stephan; Ostroff, Rachel; Janjic, Nebojsa; Ochsner, Urs A

    2017-10-01

    New non-sputum biomarker tests for active tuberculosis (TB) diagnostics are of the highest priority for global TB control. We performed in-depth proteomic analysis using the 4,000-plex SOMAscan assay on 1,470 serum samples from seven countries where TB is endemic. All samples were from patients with symptoms and signs suggestive of active pulmonary TB that were systematically confirmed or ruled out for TB by culture and clinical follow-up. HIV coinfection was present in 34% of samples, and 25% were sputum smear negative. Serum protein biomarkers were identified by stability selection using L1-regularized logistic regression and by Kolmogorov-Smirnov (KS) statistics. A naive Bayes classifier using six host response markers (HR6 model), including SYWC, kallistatin, complement C9, gelsolin, testican-2, and aldolase C, performed well in a training set (area under the sensitivity-specificity curve [AUC] of 0.94) and in a blinded verification set (AUC of 0.92) to distinguish TB and non-TB samples. Differential expression was also highly significant for CA6 (carbonic anhydrase 6). Target product profiles (TPPs) for a non-sputum biomarker test to diagnose active TB for treatment initiation (TPP#1) and for a community-based triage or referral test (TPP#2) have been published by the WHO. With 90% sensitivity and 80% specificity, the HR6 model fell short of TPP#1 but reached TPP#2 performance criteria. In conclusion, we identified and validated a six-marker signature for active TB that warrants diagnostic development on a patient-near platform. Copyright © 2017 De Groote et al.
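
    A naive Bayes classifier over a handful of serum markers, as in the HR6 model, can be prototyped with scikit-learn; the synthetic marker levels below merely illustrate the workflow and are not the SOMAscan data or the published model.

        import numpy as np
        from sklearn.naive_bayes import GaussianNB
        from sklearn.metrics import roc_auc_score

        rng = np.random.default_rng(0)

        # Synthetic stand-in for six host-response marker levels (columns) per serum sample (rows)
        X_tb = rng.normal(loc=1.0, scale=1.0, size=(120, 6))     # marker levels shifted upward for TB
        X_non = rng.normal(loc=0.0, scale=1.0, size=(180, 6))
        X = np.vstack([X_tb, X_non])
        y = np.array([1] * 120 + [0] * 180)                      # 1 = active TB, 0 = not TB

        model = GaussianNB().fit(X, y)                           # naive Bayes over the six markers
        scores = model.predict_proba(X)[:, 1]
        print(f"training AUC = {roc_auc_score(y, scores):.2f}")  # a blinded verification set is needed in practice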

  20. Validating the absolute reliability of a fat free mass estimate equation in hemodialysis patients using near-infrared spectroscopy.

    Science.gov (United States)

    Kono, Kenichi; Nishida, Yusuke; Moriyama, Yoshihumi; Taoka, Masahiro; Sato, Takashi

    2015-06-01

    The assessment of nutritional states using fat free mass (FFM) measured with near-infrared spectroscopy (NIRS) is clinically useful. This measurement should incorporate the patient's post-dialysis weight ("dry weight"), in order to exclude the effects of any change in water mass. We therefore used NIRS to investigate the regression, independent variables, and absolute reliability of FFM in dry weight. The study included 47 outpatients from the hemodialysis unit. Body weight was measured before dialysis, and FFM was measured using NIRS before and after dialysis treatment. Multiple regression analysis was used to estimate the FFM in dry weight as the dependent variable. The measured FFM before dialysis treatment (Mw-FFM), and the difference between measured and dry weight (Mw-Dw), were independent variables. We performed Bland-Altman analysis to detect errors between the statistically estimated FFM and the measured FFM after dialysis treatment. The multiple regression equation to estimate the FFM in dry weight was: Dw-FFM = 0.038 + (0.984 × Mw-FFM) + (−0.571 × [Mw-Dw]); R² = 0.99. There was no systematic bias between the estimated and the measured values of FFM in dry weight. Using NIRS, FFM in dry weight can be calculated by an equation including FFM in measured weight and the difference between the measured weight and the dry weight. © 2015 The Authors. Therapeutic Apheresis and Dialysis © 2015 International Society for Apheresis.
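
    The reported regression can be applied directly; the patient values in this sketch are hypothetical.

        def estimated_dry_weight_ffm(mw_ffm_kg, measured_weight_kg, dry_weight_kg):
            # Reported equation: Dw-FFM = 0.038 + 0.984 * Mw-FFM - 0.571 * (Mw - Dw)
            return 0.038 + 0.984 * mw_ffm_kg - 0.571 * (measured_weight_kg - dry_weight_kg)

        # Hypothetical patient: pre-dialysis FFM 45 kg, pre-dialysis weight 62 kg, dry weight 60 kg
        print(f"estimated FFM at dry weight = {estimated_dry_weight_ffm(45.0, 62.0, 60.0):.1f} kg")  # ~43.2 kg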

  1. Molecular Dynamics Simulations and Kinetic Measurements to Estimate and Predict Protein-Ligand Residence Times.

    Science.gov (United States)

    Mollica, Luca; Theret, Isabelle; Antoine, Mathias; Perron-Sierra, Françoise; Charton, Yves; Fourquez, Jean-Marie; Wierzbicki, Michel; Boutin, Jean A; Ferry, Gilles; Decherchi, Sergio; Bottegoni, Giovanni; Ducrot, Pierre; Cavalli, Andrea

    2016-08-11

    Ligand-target residence time is emerging as a key drug discovery parameter because it can reliably predict drug efficacy in vivo. Experimental approaches to binding and unbinding kinetics are nowadays available, but we still lack reliable computational tools for predicting kinetics and residence time. Most attempts have been based on brute-force molecular dynamics (MD) simulations, which are CPU-demanding and not yet particularly accurate. We recently reported a new scaled-MD-based protocol, which showed potential for residence time prediction in drug discovery. Here, we further challenged our procedure's predictive ability by applying our methodology to a series of glucokinase activators that could be useful for treating type 2 diabetes mellitus. We combined scaled MD with experimental kinetics measurements and X-ray crystallography, promptly checking the protocol's reliability by directly comparing computational predictions and experimental measures. The good agreement highlights the potential of our scaled-MD-based approach as an innovative method for computationally estimating and predicting drug residence times.

  2. Validation of attenuation, beam blockage, and calibration estimation methods using two dual polarization X band weather radars

    Science.gov (United States)

    Diederich, M.; Ryzhkov, A.; Simmer, C.; Mühlbauer, K.

    2011-12-01

    The amplitude of a radar wave reflected by meteorological targets can be misjudged due to several factors. At X band wavelength, attenuation of the radar beam by hydrometeors reduces the signal strength enough to be a significant source of error for quantitative precipitation estimation. Depending on the surrounding orography, the radar beam may be partially blocked when scanning at low elevation angles, and knowledge of the exact amount of signal loss through beam blockage becomes necessary. The phase shift between the radar signals at horizontal and vertical polarizations is affected by the hydrometeors that the beam travels through, but remains unaffected by variations in signal strength. This has allowed for several ways of compensating for the attenuation of the signal, and for consistency checks between these variables. In this study, we make use of several weather radars and a gauge network measuring in the same area to examine the effectiveness of several methods of attenuation and beam blockage correction. The methods include consistency checks of radar reflectivity and specific differential phase, calculation of beam blockage using a topography map, estimation of attenuation using differential propagation phase, and the ZPHI method proposed by Testud et al. in 2000. Results show the high effectiveness of differential phase in estimating attenuation, and the potential of the ZPHI method to compensate for attenuation, beam blockage, and calibration errors.

  3. Temperature based validation of the analytical model for the estimation of the amount of heat generated during friction stir welding

    Directory of Open Access Journals (Sweden)

    Milčić Dragan S.

    2012-01-01

    Full Text Available Friction stir welding is a solid-state welding technique that utilizes the thermomechanical influence of a rotating welding tool on the parent material, resulting in a monolithic joint (weld). At the contact between the welding tool and the parent material, significant stirring and deformation of the parent material occurs, and during this process mechanical energy is partially transformed into heat. The generated heat affects the temperature of the welding tool and parent material, so the proposed analytical model for estimating the amount of generated heat can be verified through temperature: analytically determined heat is used for numerical estimation of the temperature of the parent material, and this temperature is compared to the experimentally determined temperature. The numerical solution is obtained using the finite difference method - an explicit scheme with adaptive grid, considering the influence of temperature on the material's conductivity, contact conditions between the welding tool and parent material, material flow around the welding tool, etc. The analytical model shows that 60-100% of the mechanical power delivered to the welding tool is transformed into heat, while the comparison of results shows a maximal relative difference between the analytical and experimental temperatures of about 10%.

  4. Estimation of Resting Energy Expenditure: Validation of Previous and New Predictive Equations in Obese Children and Adolescents.

    Science.gov (United States)

    Acar-Tek, Nilüfer; Ağagündüz, Duygu; Çelik, Bülent; Bozbulut, Rukiye

    2017-08-01

    Accurate estimation of resting energy expenditure (REE) in children and adolescents is important to establish estimated energy requirements. The aim of the present study was to measure REE in obese children and adolescents by indirect calorimetry, compare these values with REE values estimated by equations, and develop the most appropriate equation for this group. One hundred and three obese children and adolescents (57 males, 46 females) between 7 and 17 years (10.6 ± 2.19 years) were recruited for the study. REE measurements of subjects were made with indirect calorimetry (COSMED, FitMatePro, Rome, Italy) and body compositions were analyzed. In females, the percentage of accurate predictions varied from 32.6% (World Health Organization [WHO]) to 43.5% (Molnar and Lazzer). The bias for equations was -0.2% (Kim), 3.7% (Molnar), and 22.6% (Derumeaux-Burel). Kim's (266 kcal/d), Schmelzle's (267 kcal/d), and Henry's (268 kcal/d) equations had the lowest root mean square error (RMSE). The equation with the highest RMSE among female subjects was the Derumeaux-Burel equation (394 kcal/d). In males, while the Institute of Medicine (IOM) equation had the lowest accurate prediction value (12.3%), the highest values were found using Schmelzle's (42.1%), Henry's (43.9%), and Müller's (fat-free mass, FFM; 45.6%) equations. While Kim and Müller had the smallest bias (-0.6% and 9.9%), Schmelzle's equation had the smallest RMSE (331 kcal/d). The new specific equation based on FFM was generated as follows: REE = 451.722 + (23.202 * FFM). According to Bland-Altman plots, the new equation's errors were randomly distributed in both males and females. Previously developed predictive equations mostly provided inaccurate and biased estimates of REE. However, the new predictive equations allow clinicians to estimate REE in obese children and adolescents with sufficient and acceptable accuracy.
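
    The new FFM-based equation and the reported evaluation metrics (bias, RMSE, percentage of accurate predictions) can be applied as below; the measured REE and FFM values are placeholders, and the ±10% window for an "accurate" prediction is a common convention assumed here, not stated in the record.

        import numpy as np

        def ree_new_equation(ffm_kg):
            # Reported FFM-based equation: REE (kcal/d) = 451.722 + 23.202 * FFM
            return 451.722 + 23.202 * ffm_kg

        measured_ree = np.array([1450.0, 1620.0, 1380.0, 1710.0])   # hypothetical indirect calorimetry values
        ffm = np.array([42.0, 50.0, 39.0, 55.0])                    # hypothetical fat-free mass (kg)
        predicted = ree_new_equation(ffm)

        bias_pct = 100.0 * (predicted - measured_ree).mean() / measured_ree.mean()
        rmse = np.sqrt(((predicted - measured_ree) ** 2).mean())
        accurate_pct = 100.0 * (np.abs(predicted - measured_ree) / measured_ree <= 0.10).mean()
        print(f"bias = {bias_pct:.1f}%, RMSE = {rmse:.0f} kcal/d, accurate predictions = {accurate_pct:.0f}%")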

  5. Development and validation of a highly sensitive LC-ESI-MS/MS method for estimation of IIIM-MCD-211, a novel nitrofuranyl methyl piperazine derivative with potential activity against tuberculosis: Application to drug development.

    Science.gov (United States)

    Magotra, Asmita; Sharma, Anjna; Gupta, Ajai Prakash; Wazir, Priya; Sharma, Shweta; Singh, Parvinder Pal; Tikoo, Manoj Kumar; Vishwakarma, Ram A; Singh, Gurdarshan; Nandi, Utpal

    2017-08-15

    In the present study, a simple, sensitive, specific and rapid liquid chromatography (LC) tandem mass spectrometry (MS/MS) method was developed and validated according to the Food and Drug Administration (FDA) guidelines for estimation of IIIM-MCD-211 (a potent oral candidate with promising action against tuberculosis) in mice plasma using carbamazepine as internal standard (IS). The bioanalytical method consisted of one-step protein precipitation for sample preparation followed by quantitation by LC-MS/MS using a positive electrospray ionization (ESI) technique operating in multiple reaction monitoring (MRM) mode. Elution was achieved in gradient mode on a High Resolution Chromolith RP-18e column with a mobile phase comprised of acetonitrile and 0.1% (v/v) formic acid in water at a flow rate of 0.4 mL/min. Precursor to product ion transitions (m/z 344.5/218.4 and m/z 237.3/194.2) were used to measure the analyte and IS, respectively. All validation parameters were well within the limits of the acceptance criteria. The method was successfully applied to assess the pharmacokinetics of the candidate in mice following oral (10 mg/kg) and intravenous (IV; 2.5 mg/kg) administration. It was also effectively used to quantitate metabolic stability of the compound in mouse liver microsomes (MLM) and human liver microsomes (HLM) followed by its in-vitro-in-vivo extrapolation. Copyright © 2017 Elsevier B.V. All rights reserved.

  6. NCI-FDA Interagency Oncology Task Force Workshop Provides Guidance for Analytical Validation of Protein-based Multiplex Assays | Office of Cancer Clinical Proteomics Research

    Science.gov (United States)

    An NCI-FDA Interagency Oncology Task Force (IOTF) Molecular Diagnostics Workshop was held on October 30, 2008 in Cambridge, MA, to discuss requirements for analytical validation of protein-based multiplex technologies in the context of their intended use. This workshop, developed through NCI's Clinical Proteomic Technologies for Cancer initiative and the FDA, focused on technology-specific analytical validation processes to be addressed prior to use in clinical settings. In making this workshop unique, a case study approach was used to discuss issues related to

  7. Simultaneous estimation of cross-validation errors in least squares collocation applied for statistical testing and evaluation of the noise variance components

    Science.gov (United States)

    Behnabian, Behzad; Mashhadi Hossainali, Masoud; Malekzadeh, Ahad

    2018-02-01

    The cross-validation technique is a popular method to assess and improve the quality of prediction by least squares collocation (LSC). We present a formula for direct estimation of the vector of cross-validation errors (CVEs) in LSC which is much faster than element-wise CVE computation. We show that a quadratic form of the CVEs follows a Chi-squared distribution. Furthermore, an a posteriori noise variance factor is derived from the quadratic form of the CVEs. In order to detect blunders in the observations, the estimated standardized CVE is proposed as a test statistic which can be applied when noise variances are known or unknown. We use LSC together with the methods proposed in this research for interpolation of crustal subsidence in the northern coast of the Gulf of Mexico. The results show that after detection and removal of outliers, the root mean square (RMS) of the CVEs and the estimated noise standard deviation are reduced by about 51% and 59%, respectively. In addition, the RMS of the LSC prediction error at data points and the RMS of the estimated noise of observations are decreased by 39% and 67%, respectively. However, the RMS of the LSC prediction error on a regular grid of interpolation points covering the area is only reduced by about 4%, which is a consequence of the sparse distribution of data points for this case study. The influence of gross errors on LSC prediction results is also investigated by lower cutoff CVEs. After elimination of outliers, the RMS of this type of error is also reduced, by 19.5% for a 5 km radius of vicinity. We propose a method using standardized CVEs for classification of the dataset into three groups with presumed different noise variances. The noise variance components for each of the groups are estimated using the restricted maximum-likelihood method via the Fisher scoring technique. Finally, LSC assessment measures were computed for the estimated heterogeneous noise variance model and compared with those of the homogeneous model. The advantage of the proposed method is the
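
    The paper's contribution is a direct formula for the whole vector of CVEs in least squares collocation; the sketch below illustrates the same idea in the simpler setting of ordinary least squares, where the leave-one-out error has the closed form e_cv,i = e_i / (1 - h_ii). The simulated design matrix and noise level are assumptions for illustration only.

        import numpy as np

        rng = np.random.default_rng(1)
        n = 60
        X = np.column_stack([np.ones(n), rng.normal(size=(n, 3))])   # design matrix with intercept
        y = X @ np.array([2.0, 1.0, -0.5, 0.3]) + rng.normal(scale=0.4, size=n)

        # Hat matrix H = X (X'X)^-1 X' and ordinary residuals
        H = X @ np.linalg.inv(X.T @ X) @ X.T
        residuals = y - H @ y
        h = np.diag(H)

        # Whole vector of leave-one-out cross-validation errors in one shot
        cve = residuals / (1.0 - h)
        print(f"RMS of CVEs = {np.sqrt((cve ** 2).mean()):.3f}")

        # Standardized CVEs, useful for flagging blunders/outliers
        sigma2_hat = residuals @ residuals / (n - X.shape[1])
        standardized_cve = residuals / np.sqrt(sigma2_hat * (1.0 - h))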

  8. Initial Validation for the Estimation of Resting-State fMRI Effective Connectivity by a Generalization of the Correlation Approach

    Directory of Open Access Journals (Sweden)

    Nan Xu

    2017-05-01

    Full Text Available Resting-state functional MRI (rs-fMRI) is widely used to noninvasively study human brain networks. Network functional connectivity is often estimated by calculating the timeseries correlation between blood-oxygen-level dependent (BOLD) signals from different regions of interest (ROIs). However, standard correlation cannot characterize the direction of information flow between regions. In this paper, we introduce and test a new concept, prediction correlation, to estimate effective connectivity in functional brain networks from rs-fMRI. In this approach, the correlation between two BOLD signals is replaced by a correlation between one BOLD signal and a prediction of this signal via a causal system driven by another BOLD signal. Three validations are described: (1) prediction correlation performed well on simulated data where the ground truth was known, and outperformed four other methods; (2) on simulated data designed to display the “common driver” problem, prediction correlation did not introduce false connections between non-interacting driven ROIs; (3) on experimental data, prediction correlation recovered the previously identified network organization of the human brain. Prediction correlation scales well to work with hundreds of ROIs, enabling it to assess whole-brain interregional connectivity at the single-subject level. These results provide an initial validation that prediction correlation can capture the direction of information flow and estimate the duration of extended temporal delays in information flow between ROIs based on the BOLD signal. This approach not only maintains the high sensitivity to network connectivity provided by correlation analysis, but also performs well in the estimation of causal information flow in the brain.
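
    The concept of prediction correlation — correlating one signal with a prediction of it driven by another signal — can be illustrated with a simple lagged least-squares predictor; the finite-impulse-response model, lag order, and simulated signals below are assumptions for illustration, not the estimator used in the paper.

        import numpy as np

        def prediction_correlation(x, y, order=3):
            # Correlate y with a prediction of y obtained from lagged values of x (least-squares fit)
            n = len(y)
            lagged = [x[order - k - 1:n - k - 1] for k in range(order)]   # x(t-1) ... x(t-order)
            X = np.column_stack([np.ones(n - order)] + lagged)
            target = y[order:]
            coef, *_ = np.linalg.lstsq(X, target, rcond=None)
            return np.corrcoef(target, X @ coef)[0, 1]

        rng = np.random.default_rng(2)
        x = rng.normal(size=500)
        y = 0.7 * np.roll(x, 2) + 0.3 * rng.normal(size=500)   # y is driven by x with a 2-sample delay
        print(f"x -> y: {prediction_correlation(x, y):.2f}")    # high: lagged x predicts y
        print(f"y -> x: {prediction_correlation(y, x):.2f}")    # much lower: lagged y does not predict x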

  9. Validity of eyeball estimation for range of motion during the cervical flexion rotation test compared to an ultrasound-based movement analysis system.

    Science.gov (United States)

    Schäfer, Axel; Lüdtke, Kerstin; Breuel, Franziska; Gerloff, Nikolas; Knust, Maren; Kollitsch, Christian; Laukart, Alex; Matej, Laura; Müller, Antje; Schöttker-Königer, Thomas; Hall, Toby

    2018-08-01

    Headache is a common and costly health problem. Although the pathogenesis of headache is heterogeneous, one reported contributing factor is dysfunction of the upper cervical spine. The flexion rotation test (FRT) is a commonly used diagnostic test to detect upper cervical movement impairment. The aim of this cross-sectional study was to investigate concurrent validity of detecting high cervical ROM impairment during the FRT by comparing measurements established by an ultrasound-based system (gold standard) with eyeball estimation. A secondary aim was to investigate intra-rater reliability of FRT ROM eyeball estimation. The examiner (6 years of experience) was blinded to the data from the ultrasound-based device and to the symptoms of the patients. The FRT test result (positive or negative) was based on visual estimation of a range of rotation less than 34° to either side. Concurrently, range of rotation was evaluated using the ultrasound-based device. A total of 43 subjects with headache (79% female), mean age of 35.05 years (SD 13.26), were included. According to the International Headache Society Classification, 23 subjects had migraine, 4 tension type headache, and 16 multiple headache forms. Sensitivity and specificity were 0.96 and 0.89 for combined rotation, indicating good concurrent validity. The area under the ROC curve was 0.95 (95% CI 0.91-0.98) for rotation to both sides. Intra-rater reliability for eyeball estimation was excellent, with Fleiss Kappa 0.79 for right rotation and left rotation. The results of this study indicate that the FRT is a valid and reliable test to detect impairment of upper cervical ROM in patients with headache.

  10. Validation of Left Atrial Volume Estimation by Left Atrial Diameter from the Parasternal Long-Axis View.

    Science.gov (United States)

    Canciello, Grazia; de Simone, Giovanni; Izzo, Raffaele; Giamundo, Alessandra; Pacelli, Filomena; Mancusi, Costantino; Galderisi, Maurizio; Trimarco, Bruno; Losi, Maria-Angela

    2017-03-01

    Measurement of left atrial (LA) volume (LAV) is recommended for quantification of LA size. Only LA anteroposterior diameter (LAd) is available in a number of large cohorts, trials, or registries. The aim of this study was to evaluate whether LAV may be reasonably estimated from LAd. One hundred forty consecutive patients referred to our outpatient clinics were prospectively enrolled to measure LAd from the long-axis view on two-dimensional echocardiography. LA orthogonal dimensions were also taken from apical four- and two-chamber views. LAV was measured using the Simpson, area-length, and ellipsoid (LAVe) methods. The first 70 patients were the learning series and the last 70 the testing series (TeS). In the learning series, best-fitting regression analysis of LAV-LAd was run using all LAV methods, and the highest values of F were chosen among the regression equations. In the TeS, the best-fitting regressions were used to estimate LAV from LAd. In the learning series, the best-fitting regression was linear for the Simpson method (r² = 0.62, F = 111.85, P = .0001) and the area-length method (r² = 0.62, F = 112.24, P = .0001), and powered for the LAVe method (r² = 0.81, F = 288.41, P = .0001). In the TeS, the r² value for LAV prediction was substantially better using the LAVe method (r² = 0.89) than the Simpson (r² = 0.72) or area-length (r² = 0.70) method, as was the intraclass correlation (ρ = 0.96 vs ρ = 0.89 and ρ = 0.89, respectively). In the TeS, the sensitivity and specificity of LA dilatation by the estimated LAVe method were 87% and 90%, respectively. LAV can be estimated from LAd using a nonlinear equation with an elliptical model. The proposed method may be used in retrospective analysis of existing data sets in which determination of LAV was not programmed. Copyright © 2016 American Society of Echocardiography. Published by Elsevier Inc. All rights reserved.
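
    The powered (nonlinear) regression of LAV on LAd can be fitted as a straight line in log-log space; the paired values and resulting coefficients below are illustrative placeholders, since the abstract does not report the fitted coefficients.

        import numpy as np

        lad = np.array([3.2, 3.6, 4.0, 4.4, 4.8, 5.2])          # hypothetical LA diameters (cm)
        lav = np.array([38.0, 52.0, 68.0, 88.0, 112.0, 140.0])  # hypothetical ellipsoid LA volumes (mL)

        # Power model LAV = a * LAd**b, fitted as log(LAV) = log(a) + b*log(LAd)
        b, log_a = np.polyfit(np.log(lad), np.log(lav), 1)
        a = np.exp(log_a)

        def estimate_lav(lad_cm):
            return a * lad_cm ** b

        print(f"LAV ~ {a:.2f} * LAd^{b:.2f}; LAd = 4.5 cm -> {estimate_lav(4.5):.0f} mL")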

  11. Membrane-bound conformation of M13 major coat protein : a structure validation through FRET-derived constraints

    NARCIS (Netherlands)

    Vos, W.L.; Koehorst, R.B.M.; Spruijt, R.B.; Hemminga, M.A.

    2005-01-01

    M13 major coat protein, a 50-amino-acid-long protein, was incorporated into DOPC/DOPG (80/20 molar ratio) unilamellar vesicles. Over 60% of all amino acid residues were replaced with cysteine residues, and the single cysteine mutants were labeled with the fluorescent label I-AEDANS. The coat protein

  12. Combining Cystatin C and Creatinine Yields a Reliable Glomerular Filtration Rate Estimation in Older Adults in Contrast to β-Trace Protein and β2-Microglobulin.

    Science.gov (United States)

    Werner, Karin; Pihlsgård, Mats; Elmståhl, Sölve; Legrand, Helen; Nyman, Ulf; Christensson, Anders

    2017-01-01

    The glomerular filtration rate (GFR) is the most important measure of kidney function and chronic kidney disease (CKD). This study aims to validate commonly used equations for estimated GFR (eGFR) based on creatinine (cr), cystatin C (cys), β-trace protein (BTP), and β2-microglobulin (B2M) in older adults. We conducted a validation study with 126 participants aged between 72 and 98 with a mean measured GFR (mGFR) by iohexol clearance of 54 mL/min/1.73 m2. The eGFR equations (CKD-Epidemiology collaboration [CKD-EPI], Berlin Initiative Study [BIS], Full Age Spectrum [FAS], Modification of Diet in Renal Disease [MDRD]cr, Caucasian-Asian-Pediatric-Adult [CAPA]cys, Lund-Malmö Revised [LM-REV]cr, and MEAN-LM-CAPAcr-cys), were assessed in terms of bias (median difference: eGFR-mGFR), precision (interquartile range of the differences), and accuracy (P30: percentage of estimates ±30% of mGFR). The equations were compared to a benchmark equation: CKD-EPIcr-cys. All cystatin C-based equations underestimated the GFR compared to mGFR, whereas bias was mixed for the equations based only on creatinine. Accuracy was the highest for CKD-EPIcr-cys (98%) and lowest for MDRD (82%). Below mGFR 45 mL/min/1.73 m2 only equations incorporating cystatin C reached P30 accuracy >90%. CKD-EPIcr-cys was not significantly more accurate than the other cystatin C-based equations. In contrast, CKD-EPIcr-cys was significantly more accurate than all creatinine-based equations except LM-REVcr. This study confirms that it is reasonable to use equations incorporating cystatin C and creatinine in older patients across a wide spectrum of GFR. However, the results call into question the use of creatinine alone below mGFR 45 mL/min/1.73 m2. B2M and BTP do not demonstrate additional value in eGFR determination in older adults. © 2017 S. Karger AG, Basel.
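
    The three validation metrics used in this record — bias (median difference), precision (IQR of the differences), and P30 accuracy — are straightforward to compute from paired measured and estimated GFR values; the arrays below are illustrative, not the study data.

        import numpy as np

        mgfr = np.array([28.0, 41.0, 55.0, 63.0, 72.0, 38.0, 50.0, 80.0])   # measured GFR (iohexol)
        egfr = np.array([25.0, 45.0, 52.0, 70.0, 69.0, 30.0, 49.0, 86.0])   # estimated GFR (some equation)

        diff = egfr - mgfr
        bias = np.median(diff)                                         # median difference, eGFR - mGFR
        precision = np.percentile(diff, 75) - np.percentile(diff, 25)  # interquartile range of differences
        p30 = 100.0 * (np.abs(diff) / mgfr <= 0.30).mean()             # % of estimates within +/-30% of mGFR

        print(f"bias = {bias:.1f}, precision (IQR) = {precision:.1f}, P30 = {p30:.0f}%")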

  13. Estimation of leaf area index using ground-based remote sensed NDVI measurements: validation and comparison with two indirect techniques

    International Nuclear Information System (INIS)

    Pontailler, J.-Y.; Hymus, G.J.; Drake, B.G.

    2003-01-01

    This study took place in an evergreen scrub oak ecosystem in Florida. Vegetation reflectance was measured in situ with a laboratory-made sensor in the red (640-665 nm) and near-infrared (750-950 nm) bands to calculate the normalized difference vegetation index (NDVI) and derive the leaf area index (LAI). LAI estimates from this technique were compared with two other nondestructive techniques, intercepted photosynthetically active radiation (PAR) and hemispherical photographs, in four contrasting 4 m² plots in February 2000 and two 4 m² plots in June 2000. We used Beer's law to derive LAI from PAR interception and gap fraction distribution to derive LAI from photographs. The plots were harvested manually after the measurements to determine a 'true' LAI value and to calculate a light extinction coefficient (k). The technique based on Beer's law was affected by a large variation of the extinction coefficient, owing to the larger impact of branches in winter when LAI was low. Hemispherical photographs provided satisfactory estimates, slightly overestimated in winter because of the impact of branches or underestimated in summer because of foliage clumping. NDVI provided the best fit, showing only saturation in the densest plot (LAI = 3.5). We conclude that in situ measurement of NDVI is an accurate and simple technique to nondestructively assess LAI in experimental plots or in crops if saturation remains acceptable. (author)
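
    Deriving LAI from PAR interception via Beer's law amounts to inverting an exponential attenuation model; the PAR readings and the extinction coefficient k in the sketch are placeholders (the study calibrated k against harvested LAI).

        import numpy as np

        def lai_from_par(par_below, par_above, k=0.5):
            # Beer's law: PAR_below / PAR_above = exp(-k * LAI)  =>  LAI = -ln(tau) / k
            tau = par_below / par_above
            return -np.log(tau) / k

        # Hypothetical PAR readings above and below the canopy (umol m-2 s-1)
        print(f"LAI = {lai_from_par(par_below=320.0, par_above=1600.0, k=0.5):.2f}")   # ~3.2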

  14. Estimation of leaf area index using ground-based remote sensed NDVI measurements: validation and comparison with two indirect techniques

    Energy Technology Data Exchange (ETDEWEB)

    Pontailler, J.-Y. [Univ. Paris-Sud XI, Dept. d' Ecophysiologie Vegetale, Orsay Cedex (France); Hymus, G.J.; Drake, B.G. [Smithsonian Environmental Research Center, Kennedy Space Center, Florida (United States)

    2003-06-01

    This study took place in an evergreen scrub oak ecosystem in Florida. Vegetation reflectance was measured in situ with a laboratory-made sensor in the red (640-665 nm) and near-infrared (750-950 nm) bands to calculate the normalized difference vegetation index (NDVI) and derive the leaf area index (LAI). LAI estimates from this technique were compared with two other nondestructive techniques, intercepted photosynthetically active radiation (PAR) and hemispherical photographs, in four contrasting 4 m² plots in February 2000 and two 4 m² plots in June 2000. We used Beer's law to derive LAI from PAR interception and gap fraction distribution to derive LAI from photographs. The plots were harvested manually after the measurements to determine a 'true' LAI value and to calculate a light extinction coefficient (k). The technique based on Beer's law was affected by a large variation of the extinction coefficient, owing to the larger impact of branches in winter when LAI was low. Hemispherical photographs provided satisfactory estimates, slightly overestimated in winter because of the impact of branches or underestimated in summer because of foliage clumping. NDVI provided the best fit, showing only saturation in the densest plot (LAI = 3.5). We conclude that in situ measurement of NDVI is an accurate and simple technique to nondestructively assess LAI in experimental plots or in crops if saturation remains acceptable. (author)

  15. Use of Shark Dental Protein to Estimate Trophic Position via Amino Acid Compound-Specific Isotope Analysis

    Science.gov (United States)

    Hayes, M.; Herbert, G.; Ellis, G.

    2017-12-01

    The diets of apex predators such as sharks are expected to change in response to overfishing of their mesopredator prey, but pre-anthropogenic baselines necessary to test for such changes are lacking. Stable isotope analysis (SIA) of soft tissues is commonly used to study diets in animals based on the bioaccumulation of heavier isotopes of carbon and nitrogen with increasing trophic level. In specimens representing pre-anthropogenic baselines, however, a modified SIA approach is needed to deal with taphonomic challenges, such as loss of soft tissues or selective loss of less stable amino acids (AAs) in other sources of organic compounds (e.g., teeth or bone) which can alter bulk isotope values. These challenges can be overcome with a compound-specific isotope analysis of individual AAs (AA-CSIA), but this first requires a thorough understanding of trophic enrichment factors for individual AAs within biomineralized tissues. In this study, we compare dental and muscle proteins of individual sharks via AA-CSIA to determine how trophic position is recorded within teeth and whether that information differs from that obtained from soft tissues. If skeletal organics reliably record information about shark ecology, then archaeological and perhaps paleontological specimens can be used to investigate pre-anthropogenic ecosystems. Preliminary experiments show that the commonly used glutamic acid/phenylalanine AA pairing may not be useful for establishing trophic position from dental proteins, but that estimated trophic position determined from alternate AA pairs are comparable to those from muscle tissue within the same species.

  16. PROCOV: maximum likelihood estimation of protein phylogeny under covarion models and site-specific covarion pattern analysis

    Directory of Open Access Journals (Sweden)

    Wang Huai-Chun

    2009-09-01

    Full Text Available Abstract Background The covarion hypothesis of molecular evolution holds that selective pressures on a given amino acid or nucleotide site are dependent on the identity of other sites in the molecule that change throughout time, resulting in changes of evolutionary rates of sites along the branches of a phylogenetic tree. At the sequence level, covarion-like evolution at a site manifests as conservation of nucleotide or amino acid states among some homologs where the states are not conserved in other homologs (or groups of homologs). Covarion-like evolution has been shown to relate to changes in functions at sites in different clades, and, if ignored, can adversely affect the accuracy of phylogenetic inference. Results PROCOV (protein covarion analysis) is a software tool that implements a number of previously proposed covarion models of protein evolution for phylogenetic inference in a maximum likelihood framework. Several algorithmic and implementation improvements in this tool over previous versions make computationally expensive tree searches with covarion models more efficient and analyses of large phylogenomic data sets tractable. PROCOV can be used to identify covarion sites by comparing the site likelihoods under the covarion process to the corresponding site likelihoods under a rates-across-sites (RAS) process. Those sites with the greatest log-likelihood difference between a 'covarion' and an RAS process were found to be of functional or structural significance in a dataset of bacterial and eukaryotic elongation factors. Conclusion Covarion models implemented in PROCOV may be especially useful for phylogenetic estimation when ancient divergences between sequences have occurred and rates of evolution at sites are likely to have changed over the tree. It can also be used to study lineage-specific functional shifts in protein families that result in changes in the patterns of site variability among subtrees.

  17. The effects of protein intake on albuminuria in different estimated glomerular filtration rate: A population-based study.

    Science.gov (United States)

    Liu, Yan; Tan, Rong-Shao; Zhou, Dao-Yuan; Xiao, Xiao; Ran, Jian-Min; Qin, Dan-Ping; Zhong, Xiao-Shi; Hu, Jian-Guang; Liu, Yun; Zheng, Yuan-Yuan

    2018-02-01

    Chronic kidney disease (CKD) is a serious condition associated with early mortality, decreased quality of life, and increased health-care expenditures. Data from the National Health and Nutrition Examination Survey (NHANES) collected from 1999 to 2012 were used. Subjects were divided into 4 estimated glomerular filtration rate (eGFR) categories: stage 1: eGFR ≥90 mL/min/1.73 m2; stage 2: eGFR 60-89; stage 3: eGFR 30-59; and stage 4/5: eGFR <30. The associations between protein intake and albuminuria were determined. A total of 45,259 subjects were included. Despite decreasing protein intake, there was a significant increase in the prevalence of albuminuria with decreasing levels of eGFR. Multivariable analysis showed that albuminuria was associated with daily protein intake in patients ≥65 years old with stage 1 disease, and that diabetes was associated with albuminuria in patients ≥65 years old with stage 2 and 3 disease. Overall, albuminuria in patients with stage 1 disease was associated with hours of sitting per day and blood glucose level. Albuminuria was associated with daily protein intake in patients 45-64 years old with stage 1 CKD, and was associated with hours of sitting per day and blood glucose level. These data further support the importance of lifestyle changes in the management of CKD, especially in patients with early-stage disease. Copyright © 2017 European Federation of Internal Medicine. Published by Elsevier B.V. All rights reserved.

  18. Development and validation of a new technique for estimating a minimum postmortem interval using adult blow fly (Diptera: Calliphoridae) carcass attendance.

    Science.gov (United States)

    Mohr, Rachel M; Tomberlin, Jeffery K

    2015-07-01

    Understanding the onset and duration of adult blow fly activity is critical to accurately estimating the period of insect activity or minimum postmortem interval (minPMI). Few, if any, reliable techniques have been developed and consequently validated for using adult fly activity to determine a minPMI. In this study, adult blow flies (Diptera: Calliphoridae) of Cochliomyia macellaria and Chrysomya rufifacies were collected from swine carcasses in rural central Texas, USA, during summer 2008, and Phormia regina and Calliphora vicina were collected in the winters of 2009 and 2010. Carcass attendance patterns of blow flies were related to species, sex, and oocyte development. Summer-active flies were found to arrive 4-12 h after initial carcass exposure, with both C. macellaria and C. rufifacies arriving within 2 h of one another. Winter-active flies arrived within 48 h of one another. There was a significant difference in the degree of oocyte development on each of the first 3 days postmortem. These frequency differences allowed a minPMI to be calculated using a binomial analysis. When validated with seven tests using domestic and feral swine and human remains, the technique correctly estimated time of placement in six trials.

  19. Validation of a novel protocol for calculating estimated energy requirements and average daily physical activity ratio for the US population: 2005-2006.

    Science.gov (United States)

    Archer, Edward; Hand, Gregory A; Hébert, James R; Lau, Erica Y; Wang, Xuewen; Shook, Robin P; Fayad, Raja; Lavie, Carl J; Blair, Steven N

    2013-12-01

    To validate the PAR protocol, a novel method for calculating population-level estimated energy requirements (EERs) and average physical activity ratio (APAR), in a nationally representative sample of US adults. Estimates of EER and APAR values were calculated via a factorial equation from a nationally representative sample of 2597 adults aged 20 to 74 years (US National Health and Nutrition Examination Survey; data collected between January 1, 2005, and December 31, 2006). Validation of the PAR protocol-derived EER (EER(PAR)) values was performed via comparison with values from the Institute of Medicine EER equations (EER(IOM)). The correlation between EER(PAR) and EER(IOM) was high (0.98), with differences ranging up to 148 kcal/d (5.7% higher) in obese women. The 2005-2006 EERs for the US population were 2940 kcal/d for men and 2275 kcal/d for women, and ranged from 3230 kcal/d in obese (BMI ≥30) men to 2026 kcal/d in normal-weight (BMI <25) women. There were significant inverse relationships between APAR and both obesity and age. For men and women, the APAR values were 1.53 and 1.52, respectively. Obese men and women had lower APAR values than normal-weight individuals (P=.023 and P=.015, respectively), and younger individuals had higher APAR values than older individuals. These population-level estimates of EER and APAR are relevant to research on physical activity and health. Copyright © 2013 Mayo Foundation for Medical Education and Research. Published by Elsevier Inc. All rights reserved.
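
    A generic factorial calculation of EER and APAR weights each activity's physical activity ratio (PAR) by the time spent on it, averages over 24 h, and scales basal metabolic rate; the activity list, PAR values, and BMR below are illustrative placeholders, not the protocol's actual inputs.

        # (hours per day, PAR = activity energy cost relative to BMR); hours must sum to 24
        activities = [
            (8.0, 1.0),   # sleep
            (8.0, 1.5),   # mostly sedentary work
            (6.0, 1.8),   # light household and standing activities
            (2.0, 3.0),   # walking and other moderate activity
        ]

        bmr_kcal_per_day = 1500.0                                  # hypothetical basal metabolic rate
        apar = sum(h * par for h, par in activities) / 24.0        # average physical activity ratio
        eer = bmr_kcal_per_day * apar                              # estimated energy requirement
        print(f"APAR = {apar:.2f}, EER = {eer:.0f} kcal/d")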

  20. Convergent validity between a discrete choice experiment and a direct, open-ended method: comparison of preferred attribute levels and willingness to pay estimates.

    Science.gov (United States)

    Marjon van der Pol; Shiell, Alan; Au, Flora; Johnston, David; Tough, Suzanne

    2008-12-01

    The Discrete Choice Experiment (DCE) has become increasingly popular as a method for eliciting patient or population preferences. If DCE estimates are to inform health policy, it is crucial that the answers they provide are valid. Convergent validity is tested in this paper by comparing the results of a DCE exercise with the answers obtained from direct, open-ended questions. The two methods are compared in terms of preferred attribute levels and willingness to pay (WTP) values. Face-to-face interviews were held with 292 women in Calgary, Canada. Similar values were found between the two methods with respect to preferred levels for two out of three of the attributes examined. The DCE predicted less well for levels outside the range than for levels inside the range reaffirming the importance of extensive piloting to ensure appropriate level range in DCEs. The mean WTP derived from the open-ended question was substantially lower than the mean derived from the DCE. However, the two sets of willingness to pay estimates were consistent with each other in that individuals who were willing to pay more in the open-ended question were also willing to pay more in the DCE. The difference in mean WTP values between the two approaches (direct versus DCE) demonstrates the importance of continuing research into the different biases present across elicitation methods.

  1. Reproducibility and validity of the food frequency questionnaire for estimating habitual dietary intake in children and adolescents

    Science.gov (United States)

    2011-01-01

    Background A previous study reported the development of a 75-item food frequency questionnaire for Japanese children (CFFQ). The first aim was to examine the reproducibility and validity of the CFFQ in order to assess dietary intake among two groups: 3-11 year old children (YC group) and 12-16 year old children (AD group). The second aim was to use the CFFQ and the FFQ for adults (AFFQ), and to determine which was better suited for assessing the intake of children in each group. Methods A total of 103 children participated in this study. The interval between the first CFFQ and AFFQ and the second CFFQ and AFFQ was one month. Four weighted dietary records (WDRs) were conducted once a week. Pearson's correlation coefficients between the first and second FFQs were calculated to test the reproducibility of each FFQ. Pearson's correlation coefficients between WDRs and the second FFQ were calculated for the unadjusted value and sex-, age-, and energy-adjusted values to determine the validity of each FFQ. Results The final number of subjects participating in the analysis was 89. The median correlation coefficients between the first and second CFFQs and AFFQs were 0.76 and 0.73, respectively. There was some over/underestimation of nutrients in the CFFQ of the YC group and in the AFFQ of the AD group. The medians of the sex-, age-, and energy-adjusted correlation coefficients were not different between the YC and AD groups for each FFQ. The sex-, age-, and energy-adjusted correlation coefficients revealed that the largest number of subjects with high (0.50 or more) values was obtained by the CFFQ in the YC group. Conclusions This study indicated that the CFFQ might be a useful tool for assessing habitual dietary intake of children in the YC group. Although the CFFQ agreed moderately with habitual intake, it was found to underestimate intake in the AD group. However, for the AFFQ, the ability to rank habitual intake was low. Therefore, it is necessary to develop a new

  2. Validation of an extraction paper chromatography (EPC) technique for estimation of trace levels of 90Sr in 90Y solutions obtained from 90Sr/90Y generator systems

    International Nuclear Information System (INIS)

    Usha Pandey; Yogendra Kumar; Ashutosh Dash

    2014-01-01

    While the extraction paper chromatography (EPC) technique constitutes a novel paradigm for the determination of a few becquerels of 90Sr in MBq quantities of 90Y obtained from a 90Sr/90Y generator, validation of the technique is essential to ensure its usefulness as a real-time analytical tool. With a view to exploring the relevance and applicability of the EPC technique as a real-time quality control (QC) technique for the routine estimation of the 90Sr content in generator-produced 90Y, a systematic validation study was carried out, not only to establish its worthiness but also to broaden its horizon. The ability of the EPC technique to separate trace amounts of Sr2+ in the presence of large amounts of Y3+ was verified. The specificity of the technique for Y3+ was demonstrated with 90Y obtained by neutron irradiation. The method was validated under real experimental conditions and compared with a QC method described in the US Pharmacopeia for the detection of 90Sr levels in 90Y radiopharmaceuticals. (author)

  3. Genetic parameters estimate for milk and mozzarella cheese yield, fat and protein percentage in dairy buffaloes in Brazil

    Directory of Open Access Journals (Sweden)

    H. Tonhati

    2010-02-01

    Full Text Available The aim of this study was to analyze the (co)variance components and the genetic and phenotypic relationships among the following traits: accumulated milk yield at 270 days (MY270), observed until 305 days of lactation; accumulated milk yield at 270 days (MY270/A) and at 305 days (MY305), observed until 335 days of lactation; mozzarella cheese yield (MCY); and fat (FP) and protein (PP) percentages, observed until 335 days of lactation. The (co)variance components were estimated by Restricted Maximum Likelihood in single-, two- and three-trait analyses using animal models. Heritability estimates for MY270, MY270/A, MY305, MCY, FP and PP were 0.22, 0.24, 0.25, 0.14, 0.29 and 0.40, respectively. The genetic correlations between MCY and MY270, MY270/A, MY305, PP and FP were 0.85, 1.00, 0.89, 0.14 and 0.06, respectively. Thus, selection for milk yield over a long lactation period should increase MCY. However, to obtain animals that produce high-quality milk, the genetic parameters suggest that a selection index combining these traits should be constructed.
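    Once the (co)variance components have been estimated (here by REML), heritabilities and genetic correlations follow from simple ratios of those components. A sketch with made-up variance components, not the study's estimates:

```python
import math

# Hypothetical additive-genetic (A) and residual (E) variances for two traits,
# and their additive-genetic covariance (all values illustrative).
var_a_milk, var_e_milk = 0.35, 1.20      # milk yield
var_a_mcy,  var_e_mcy  = 0.10, 0.60      # mozzarella cheese yield
cov_a_milk_mcy = 0.16

h2_milk = var_a_milk / (var_a_milk + var_e_milk)            # narrow-sense heritability
h2_mcy  = var_a_mcy  / (var_a_mcy  + var_e_mcy)
r_g = cov_a_milk_mcy / math.sqrt(var_a_milk * var_a_mcy)    # genetic correlation

print(f"h2(milk) = {h2_milk:.2f}, h2(MCY) = {h2_mcy:.2f}, r_g = {r_g:.2f}")
```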

  4. Response of insect relative growth rate to temperature and host-plant phenology: estimation and validation from field data.

    Directory of Open Access Journals (Sweden)

    Mamadou Ciss

    Full Text Available Between 1975 and 2011, aphid Relative Growth Rates (RGR) were modelled as a function of mean outdoor temperature and host-plant phenology. The model was applied to the grain aphid Sitobion avenae using data on aphid counts in winter wheat from two different climate regions in France (oceanic climate, Rennes, western France; continental climate, Paris). Mean observed aphid RGR was higher in the Paris region than in the Rennes region. RGR increased with mean temperature, which is explained by aphid reproduction, growth and development being dependent on ambient temperature. From the stem-extension to the heading stage in wheat, there was either a plateau in RGR values (Rennes) or an increase with a maximum at heading (Paris), due to high intrinsic rates of increase in aphids and also to aphid immigration. From the wheat flowering to the ripening stage, RGR decreased in both regions due to the low intrinsic rate of increase in aphids and a high emigration rate linked to reduced nutrient quality in maturing wheat. The model validation process showed that the fitted models have more predictive power in the Paris region than in the Rennes region.

  5. Validating automated kidney stone volumetry in computed tomography and mathematical correlation with estimated stone volume based on diameter.

    Science.gov (United States)

    Wilhelm, Konrad; Miernik, Arkadiusz; Hein, Simon; Schlager, Daniel; Adams, Fabian; Benndorf, Matthias; Fritz, Benjamin; Langer, Mathias; Hesse, Albrecht; Schoenthaler, Martin; Neubauer, Jakob

    2018-06-02

    To validate the AutoMated UroLithiasis Evaluation Tool (AMULET) software for kidney stone volumetry and compare its performance to standard clinical practice. Maximum diameter and volume of 96 urinary stones were measured as the reference standard by three independent urologists. The same stones were positioned in an anthropomorphic phantom and CT scans were acquired with standard settings. Three independent radiologists blinded to the reference values took manual measurements of the maximum diameter and automatic measurements of maximum diameter and volume. An "expected volume" was calculated from the manual diameter measurements using the formula V = (4/3)πr³. 96 stones were analyzed in the study; we had initially aimed to assess 100, but nine were replaced during data acquisition due to crumbling and 4 had to be excluded because the automated measurement did not work. Mean reference maximum diameter was 13.3 mm (5.2-32.1 mm). Correlation coefficients among all measured outcomes were compared. The correlation between the manual and automatic diameter measurements and the reference was 0.98 and 0.91, respectively. Automated volumetry is possible and significantly more accurate than diameter-based volumetric calculations. To avoid bias in clinical trials, size should be measured as volume. However, automated diameter measurements are not as accurate as manual measurements.
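    The "expected volume" above treats the stone as a sphere with the manually measured maximum diameter, V = (4/3)πr³. A short sketch of that approximation compared with a hypothetical automated volume (the measurement values are invented, not taken from the study):

```python
import math

def sphere_volume_from_diameter(diameter_mm: float) -> float:
    """Volume of a sphere (mm^3) with the given diameter: V = 4/3 * pi * r^3."""
    r = diameter_mm / 2.0
    return 4.0 / 3.0 * math.pi * r ** 3

# Hypothetical measurements for one stone (illustrative only).
manual_diameter_mm = 13.3          # manual maximum-diameter reading
automated_volume_mm3 = 760.0       # volume reported by the automated software

expected = sphere_volume_from_diameter(manual_diameter_mm)
print(f"expected (sphere) volume: {expected:.0f} mm^3")
print(f"automated volume:         {automated_volume_mm3:.0f} mm^3")
# Irregular stones are rarely spherical, which is why diameter-based
# volumes tend to deviate from directly measured volumes.
```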

  6. Arterial stiffness estimation in healthy subjects: a validation of oscillometric (Arteriograph) and tonometric (SphygmoCor) techniques.

    Science.gov (United States)

    Ring, Margareta; Eriksson, Maria Jolanta; Zierath, Juleen Rae; Caidahl, Kenneth

    2014-11-01

    Arterial stiffness is an important cardiovascular risk marker that can be measured noninvasively with different techniques. To validate such techniques in healthy subjects, we compared the recently introduced oscillometric Arteriograph (AG) technique with the tonometric SphygmoCor (SC) method and examined their associations with carotid ultrasound measures and traditional risk indicators. Sixty-three healthy subjects aged 20-69 (mean 48 ± 15) years were included. We measured aortic pulse wave velocity (PWVao) and augmentation index (AIx) by AG and SC, and with SC also the PWVao standardized to 80% of the direct distance between the carotid and femoral sites (St-PWVaoSC). The carotid strain, stiffness index and intima-media thickness (cIMTmean) were evaluated by ultrasound. PWVaoAG (8.00 ± 2.16 m s⁻¹) was higher than the corresponding SC value. Stiffness indices by AG and SC correlate with vascular risk markers in healthy subjects. AIxao results by AG and SC are closely interrelated, but higher values are obtained by AG. In the lower range, PWVao values by AG and SC are similar, but they differ for higher values. Our results imply the necessity of applying one and the same technique for repeated studies.
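    Both devices report the same two quantities: pulse wave velocity (travel distance divided by pulse transit time) and the augmentation index (augmentation pressure as a percentage of pulse pressure). A sketch of these definitions with hypothetical single-subject readings (all values illustrative):

```python
# Hypothetical readings for one subject (illustrative values only).
path_length_m  = 0.50     # carotid-femoral distance used by the device
transit_time_s = 0.065    # pulse transit time

pwv = path_length_m / transit_time_s          # pulse wave velocity, m/s
print(f"PWV = {pwv:.1f} m/s")

augmentation_pressure_mmhg = 9.0              # P2 - P1 of the central waveform
pulse_pressure_mmhg        = 42.0             # systolic - diastolic
aix = 100.0 * augmentation_pressure_mmhg / pulse_pressure_mmhg
print(f"AIx = {aix:.0f} %")
```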

  7. Validation of Satellite Estimates (Tropical Rainfall Measuring Mission, TRMM) for Rainfall Variability over the Pacific Slope and Coast of Ecuador

    Directory of Open Access Journals (Sweden)

    Bolívar Erazo

    2018-02-01

    Full Text Available A dense rain-gauge network within continental Ecuador was used to evaluate the quality of various rainfall products over the Pacific slope and coast of Ecuador (EPSC). A cokriging interpolation method was applied to the rain-gauge data, yielding a gridded product at 5-km resolution covering the period 1965–2015. This product is compared with the Global Precipitation Climatology Centre (GPCC) dataset, the Climatic Research Unit–University of East Anglia (CRU) dataset, the Tropical Rainfall Measuring Mission (TRMM/TMPA) 3B43 Version 7 dataset and the ERA-Interim Reanalysis. The analysis reveals that the TRMM data show the most realistic features. The relative bias index (Rbias) indicates that the TRMM data are closer to the observations, mainly over the lowlands (mean Rbias of 7%), but have more limitations in reproducing the rainfall variability over the Andes (mean Rbias of −28%). The average RMSE and Rbias of 68.7 and −2.8% for TRMM are comparable with those of the GPCC (69.8 and 5.7%) and CRU (102.3 and −2.3%) products. This study also focuses on the rainfall inter-annual variability over the study region, which experiences floods that have caused high economic losses during extreme El Niño events. Finally, our analysis evaluates the ability of the TRMM data to reproduce rainfall events during El Niño years over the study area and the large basins of the Esmeraldas and Guayas rivers. The results show that the TRMM estimates report reasonable levels of heavy-rainfall detection over the EPSC (for the extreme 1998 El Niño event), specifically towards the centre-south of the EPSC (Guayas basin), but present underestimations for the moderate El Niño of 2002–2003 and the weak 2009–2010 event. Generally, the seasonal features, quantity and long-term climatology patterns of rainfall are relatively well estimated by TRMM.
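    RMSE and the relative bias (Rbias) used above can be computed directly from paired satellite and gauge values. A sketch with hypothetical monthly rainfall totals; a common definition of Rbias (the total difference as a percentage of the gauge total) is assumed here, and the paper's exact formulation may differ:

```python
import numpy as np

# Hypothetical monthly rainfall (mm): gridded gauge reference vs. TRMM estimate.
gauge = np.array([120.0, 80.0, 45.0, 200.0, 150.0, 60.0])
trmm  = np.array([110.0, 90.0, 40.0, 215.0, 140.0, 70.0])

rmse  = np.sqrt(np.mean((trmm - gauge) ** 2))
rbias = 100.0 * (trmm.sum() - gauge.sum()) / gauge.sum()   # percent

print(f"RMSE = {rmse:.1f} mm, Rbias = {rbias:+.1f} %")
```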

  8. Enhancing the Simplified Surface Energy Balance (SSEB) Approach for Estimating Landscape ET: Validation with the METRIC model

    Science.gov (United States)

    Senay, Gabriel B.; Budde, Michael E.; Verdin, James P.

    2011-01-01

    Evapotranspiration (ET) can be derived from satellite data using surface energy balance principles. METRIC (Mapping EvapoTranspiration at high Resolution with Internalized Calibration) is one of the most widely used models available in the literature to estimate ET from satellite imagery. The Simplified Surface Energy Balance (SSEB) model is much easier and less expensive to implement. The main purpose of this research was to present an enhanced version of the SSEB model and to evaluate its performance against the established METRIC model. In this study, SSEB and METRIC ET fractions were compared using 7 Landsat images acquired for south-central Idaho during the 2003 growing season. The enhanced SSEB model compared well with the METRIC model output, with an r² improvement from 0.83 to 0.90 in less complex topography (elevation less than 2000 m) and from 0.27 to 0.38 in more complex (mountainous) areas with elevation greater than 2000 m. Independent evaluation showed that both models exhibited higher variation in complex topographic regions, although more so with SSEB than with METRIC. The higher ET-fraction variation in the complex mountainous regions highlighted the difficulty of capturing the radiation and heat-transfer physics on steep slopes of variable aspect with a simple index model, and the need for further research. However, the temporal consistency of the results suggests that the SSEB model can be used over a wide range of elevations (more successfully up to 2000 m) to detect anomalies in space and time for water resources management and monitoring, such as drought early warning systems in data-scarce regions. SSEB has potential for operational agro-hydrologic applications to estimate ET with inputs of surface temperature, NDVI, DEM and reference ET.
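    In the SSEB approach, each pixel's ET fraction is obtained by scaling its land-surface temperature between "hot" (dry, bare) and "cold" (well-watered) reference pixels, and actual ET is that fraction times reference ET. A minimal sketch of that scaling, assuming illustrative temperatures rather than the study's calibration:

```python
import numpy as np

def sseb_et_fraction(ts, t_hot, t_cold):
    """ET fraction = (T_hot - T_s) / (T_hot - T_cold), clipped to [0, 1]."""
    etf = (t_hot - ts) / (t_hot - t_cold)
    return np.clip(etf, 0.0, 1.0)

# Hypothetical land-surface temperatures (K) for a few pixels.
ts = np.array([318.0, 305.0, 298.0, 312.0])
t_hot, t_cold = 320.0, 295.0          # reference pixels chosen from the scene
eto = 7.0                             # reference ET for the day (mm)

etf = sseb_et_fraction(ts, t_hot, t_cold)
eta = etf * eto                       # actual ET (mm/day)
print("ET fraction:", np.round(etf, 2), " ET (mm/day):", np.round(eta, 2))
```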

  9. Validity of mid arm circumference to detect protein energy malnutrition among 8-11 months old infants in a rural medical college of West Bengal.

    Science.gov (United States)

    Sadhukhan, Sanjoy Kr; Chatterjee, Chitra; Shrivastava, Prabha; Sardar, Jadav Chandra; Joardar, Gautam Kr; Lahiri, Saibendu

    2010-09-01

    This institution-based cross-sectional observational validation study was conducted in the immunisation clinic of North Bengal Medical College and Hospital, Sushrutanagar. The objective was to identify the validity characteristics of mid-arm circumference for detecting protein energy malnutrition among 8-11-month-old infants and to find a suitable cut-off value, if any. Study variables were age, sex, body weight and mid-arm circumference. Mid-arm circumference was validated against the weight-for-age criterion (gold standard) of malnutrition. The mean mid-arm circumference of the infants was found to be almost constant, with only about a 2.22% change over 4 months, signifying that a single cut-off point can be used to detect protein energy malnutrition. Mid-arm circumference values from 12.5 to 13.0 cm were found to have the highest accuracy for detecting protein energy malnutrition (about 86%). The cut-off values of 12.5 and 12.6 cm had a sensitivity of about 52% and a specificity of about 96%, i.e. a false-negative rate of 48% but a false-positive rate of only 4%. The receiver operating characteristic curve identified 12.5 (or 12.6) cm as the best diagnostic cut-off point, which can detect more than 50% of malnourished babies with very little false positivity/misdiagnosis (only 4%). A simple measuring tape, with some reorientation of the health workers, can detect the beginning of childhood malnutrition.
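    The validity characteristics quoted above (sensitivity, specificity, accuracy) come from cross-tabulating the MUAC cut-off against the weight-for-age gold standard. A sketch with a hypothetical 2 × 2 table (the counts are invented for illustration, only the arithmetic mirrors the study):

```python
# Hypothetical 2x2 table for a 12.5 cm MUAC cut-off vs. weight-for-age (gold standard).
tp, fn = 52, 48     # malnourished infants flagged / missed by MUAC
fp, tn = 4, 96      # well-nourished infants wrongly flagged / correctly passed

sensitivity = tp / (tp + fn)
specificity = tn / (tn + fp)
accuracy    = (tp + tn) / (tp + fn + fp + tn)

print(f"sensitivity = {sensitivity:.0%}, specificity = {specificity:.0%}, "
      f"accuracy = {accuracy:.0%}")
```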

  10. Development and validation of risk prediction equations to estimate future risk of blindness and lower limb amputation in patients with diabetes: cohort study.

    Science.gov (United States)

    Hippisley-Cox, Julia; Coupland, Carol

    2015-11-11

    Is it possible to develop and externally validate risk prediction equations to estimate the 10 year risk of blindness and lower limb amputation in patients with diabetes aged 25-84 years? This was a prospective cohort study using routinely collected data from general practices in England contributing to the QResearch and Clinical Practice Research Datalink (CPRD) databases during the study period 1998-2014. The equations were developed using 763 QResearch practices (n=454,575 patients with diabetes) and validated in 254 different QResearch practices (n=142,419) and 357 CPRD practices (n=206,050). Cox proportional hazards models were used to derive separate risk equations for blindness and amputation in men and women that could be evaluated at 10 years. Measures of calibration and discrimination were calculated in the two validation cohorts. Risk prediction equations to quantify absolute risk of blindness and amputation in men and women with diabetes have been developed and externally validated. In the QResearch derivation cohort, 4822 new cases of lower limb amputation and 8063 new cases of blindness occurred during follow-up. The risk equations were well calibrated in both validation cohorts. Discrimination was good in men in the external CPRD cohort for amputation (D statistic 1.69, Harrell's C statistic 0.77) and blindness (D statistic 1.40, Harrell's C statistic 0.73), with similar results in women and in the QResearch validation cohort. The algorithms are based on variables that patients are likely to know or that are routinely recorded in general practice computer systems. They can be used to identify patients at high risk for prevention or further assessment. Limitations include lack of formally adjudicated outcomes, information bias, and missing data. Patients with type 1 or type 2 diabetes are at increased risk of blindness and amputation but generally do not have accurate assessments of the magnitude of their individual risks. The new algorithms calculate
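    Risk equations of this kind are typically applied by combining a patient's covariates into a linear predictor and converting it to an absolute risk with the 10-year baseline survival, risk = 1 − S₀(10)^exp(lp). A sketch with invented coefficients and baseline survival; the published coefficients are not reproduced here, and real equations usually centre covariates on cohort means:

```python
import math

# Hypothetical Cox model: coefficients and 10-year baseline survival (illustrative).
coefficients = {"age_per_10y": 0.30, "smoker": 0.45, "hba1c_per_unit": 0.20}
baseline_survival_10y = 0.98                      # S0(10), illustrative

def ten_year_risk(covariates: dict) -> float:
    lp = sum(coefficients[k] * covariates[k] for k in coefficients)
    return 1.0 - baseline_survival_10y ** math.exp(lp)

patient = {"age_per_10y": 6.0, "smoker": 1.0, "hba1c_per_unit": 2.0}
print(f"10-year predicted risk: {ten_year_risk(patient):.1%}")
```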

  11. Development and validation of an HPTLC method for the simultaneous estimation of Clonazepam and Paroxetine hydrochloride using a DOE approach

    Directory of Open Access Journals (Sweden)

    Purvi Shah

    2017-01-01

    Full Text Available The present study examines simultaneous multiple response optimization using Derringer's desirability function for the development of an HPTLC method to detect Clonazepam and Paroxetine hydrochloride in pharmaceutical dosage form. Central composite design (CCD was used to optimize the chromatographic conditions for HPTLC. The independent variables used for the optimization were the n-butanol content in the mobile phase, the chamber saturation time and the distance travelled. HPTLC separation was performed on aluminium plates pre-coated with silica gel 60 F254 as the stationary phase using n-butanol:glacial acetic acid:water (9:2:0.5% v/v/v as the mobile phase. Quantification was achieved based on a densitometric analysis of Clonazepam and Paroxetine hydrochloride over the concentration range of 40–240 ng/band and 300–1800 ng/band, respectively, at 288 nm. The method yielded compact and well-resolved bands at Rf of 0.77 ± 0.02 and 0.34 ± 0.02 for Clonazepam and Paroxetine hydrochloride, respectively. The linear regression analysis for the calibration plots produced r2 = 0.9958 and r2 = 0.9989 for Clonazepam and Paroxetine hydrochloride, respectively. The precision, accuracy, robustness, specificity, limit of detection and limit of quantitation of the method were validated according to the ICH guidelines. The factors evaluated in the robustness test were determined to have an insignificant effect on the selected responses. The results indicate that the method is suitable for the routine quality control testing of marketed tablet formulations.
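    Derringer's desirability approach converts each chromatographic response into a 0-1 desirability and combines them through a geometric mean, which the CCD optimization then maximizes. A minimal sketch assuming hypothetical responses and target ranges (not the study's actual responses or limits):

```python
import math

def desirability_larger_is_better(y, low, high, s=1.0):
    """0 below `low`, 1 above `high`, power-scaled in between."""
    if y <= low:
        return 0.0
    if y >= high:
        return 1.0
    return ((y - low) / (high - low)) ** s

# Hypothetical responses for one set of chromatographic conditions.
resolution    = desirability_larger_is_better(2.6, low=1.5, high=3.0)
peak_symmetry = desirability_larger_is_better(0.92, low=0.8, high=1.0)

overall = math.sqrt(resolution * peak_symmetry)   # geometric mean of 2 desirabilities
print(f"overall desirability D = {overall:.2f}")
```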

  12. Development and validation of spectrophotometric methods for simultaneous estimation of citicoline and piracetam in tablet dosage form

    Directory of Open Access Journals (Sweden)

    Akhila Sivadas

    2013-01-01

    Full Text Available Context: Citicoline (CN) and piracetam (PM) in combination have recently been introduced to the market in a tablet formulation. It is necessary to develop suitable quality control methods for the rapid and accurate determination of these drugs. Aim: The study aimed to develop methods for the simultaneous determination of CN and PM in the combined dosage form. Materials and Methods: The first method was developed by forming and solving simultaneous equations using 280.3 and 264.1 nm as the two analytical wavelengths. The second method was the absorbance (Q) ratio method, in which the wavelengths selected were 256.6 nm (an isoabsorptive point) and 280.3 nm (the λmax of CN). In accordance with International Conference on Harmonization (ICH) norms, the parameters linearity, precision and accuracy were studied. The methods were validated statistically and by recovery studies. Results: Both drugs obeyed Beer-Lambert's law at the selected wavelengths in the concentration ranges of 5-13 μg/ml for CN and 10-22 μg/ml for PM. By the simultaneous equation method, the percentages of CN and PM in the marketed tablet formulation were found to be 99.006 ± 0.173 and 99.257 ± 0.613, respectively. By the Q-absorbance ratio method, the percentages of CN and PM were found to be 99.078 ± 0.158 and 99.708 ± 0.838, respectively. Conclusions: The proposed methods were simple, reproducible, precise and robust, and can be successfully applied for routine analysis of tablets.
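    The simultaneous-equation (Vierordt) method solves a 2 × 2 linear system built from the absorptivities of the two drugs at the two analytical wavelengths. A sketch with hypothetical absorptivity and absorbance values (the study's actual values are not reported in the abstract):

```python
import numpy as np

# A(lambda) = a_CN(lambda) * C_CN + a_PM(lambda) * C_PM   (1 cm path length)
# Rows: 280.3 nm and 264.1 nm; columns: CN and PM absorptivities (hypothetical).
A_matrix = np.array([[0.065, 0.012],
                     [0.030, 0.041]])
absorbances = np.array([0.70, 0.90])      # mixture absorbances at the two wavelengths

concentrations = np.linalg.solve(A_matrix, absorbances)   # [C_CN, C_PM] in ug/mL
print(f"CN = {concentrations[0]:.1f} ug/mL, PM = {concentrations[1]:.1f} ug/mL")
```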

  13. Validating a High Performance Liquid Chromatography-Ion Chromatography (HPLC-IC) Method with Conductivity Detection After Chemical Suppression for Water Fluoride Estimation.

    Science.gov (United States)

    Bondu, Joseph Dian; Selvakumar, R; Fleming, Jude Joseph

    2018-01-01

    A variety of methods, including the ion-selective electrode (ISE), have been used for the estimation of fluoride levels in drinking water. However, because these methods suffer from several drawbacks, the newer ion chromatography (IC) method has replaced many of them. The study aimed (1) to validate IC for the estimation of fluoride levels in drinking water and (2) to assess the drinking-water fluoride levels of villages in and around Vellore district using IC. Forty-nine paired drinking-water samples were measured using the ISE and IC methods (Metrohm). Water samples from 165 randomly selected villages in and around Vellore district were collected for fluoride estimation over 1 year. Standardization of the IC method showed good within-run precision, linearity and coefficient of variation, with a correlation coefficient R² = 0.998. The limit of detection was 0.027 ppm and the limit of quantification was 0.083 ppm. Among the 165 villages, 46.1% recorded water fluoride levels >1.00 ppm, of which 19.4% had levels ranging from 1 to 1.5 ppm, 10.9% had levels of 1.5-2 ppm and about 12.7% had levels of 2.0-3.0 ppm. Three percent of villages had more than 3.0 ppm fluoride in the water tested. Most (44.42%) of these villages belonged to Jolarpet taluk, with moderate to high (0.86-3.56 ppm) water fluoride levels. The IC method has been validated and is therefore a reliable method for the assessment of fluoride levels in drinking water. Residents of Jolarpet taluk (Vellore district) were found to be at high risk of developing dental and skeletal fluorosis.
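    The abstract does not state how the detection and quantification limits were obtained; a common ICH-style approach is to use the residual standard deviation of the calibration line (σ) and its slope (S): LOD = 3.3σ/S and LOQ = 10σ/S. A sketch under that assumption, with an invented calibration:

```python
import numpy as np

# Hypothetical fluoride calibration: concentration (ppm) vs. peak area.
conc = np.array([0.1, 0.5, 1.0, 2.0, 3.0])
area = np.array([1.05, 5.20, 10.10, 20.30, 29.90])

slope, intercept = np.polyfit(conc, area, 1)
residuals = area - (slope * conc + intercept)
sigma = residuals.std(ddof=2)          # residual standard deviation of the fit

lod = 3.3 * sigma / slope              # ICH-style limit of detection
loq = 10.0 * sigma / slope             # ICH-style limit of quantification
print(f"LOD = {lod:.3f} ppm, LOQ = {loq:.3f} ppm")
```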

  14. CSSI-PRO: a method for secondary structure type editing, assignment and estimation in proteins using linear combination of backbone chemical shifts

    International Nuclear Information System (INIS)

    Swain, Monalisa; Atreya, Hanudatta S.

    2009-01-01

    Estimation of secondary structure in polypeptides is important for studying their structure, folding and dynamics. In NMR spectroscopy, such information is generally obtained after sequence-specific resonance assignments are completed. We present here a new methodology for the assignment of secondary structure type to spin systems in proteins directly from NMR spectra, without prior knowledge of resonance assignments. The methodology, named Combination of Shifts for Secondary Structure Identification in Proteins (CSSI-PRO), involves detection of a specific linear combination of backbone ¹Hα and ¹³C′ chemical shifts in a two-dimensional (2D) NMR experiment based on G-matrix Fourier transform (GFT) NMR spectroscopy. Such linear combinations of shifts facilitate editing of residues belonging to α-helical/β-strand regions into distinct spectral regions nearly independently of the amino acid type, thereby allowing the estimation of the overall secondary structure content of the protein. Comparison of the predicted secondary structure content with that estimated from the respective 3D structures and/or the Chemical Shift Index method for 237 proteins gives a correlation of more than 90% and an overall rmsd of 7.0%, which is comparable to other biophysical techniques used for the structural characterization of proteins. Taken together, this methodology has a wide range of applications in NMR spectroscopy, such as rapid protein structure determination, monitoring conformational changes in protein-folding/ligand-binding studies and automated resonance assignment.

  15. A New Model of the Mean Albedo of the Earth: Estimation and Validation from the GRACE Mission and SLR Satellites.

    Science.gov (United States)

    Deleflie, F.; Sammuneh, M. A.; Coulot, D.; Pollet, A.; Biancale, R.; Marty, J. C.

    2017-12-01

    This talk provides new results of a study that we began last year and that was the subject of a poster by the same authors presented during AGU FM 2016, entitled « Mean Effect of the Albedo of the Earth on Artificial Satellite Trajectories: an Update Over 2000-2015 ». The emissivity of the Earth, split into a part in the visible domain (albedo) and a part in the infrared domain (thermal emissivity), gives rise to non-gravitational perturbations of artificial satellite trajectories. The amplitudes and periods of these perturbations can be investigated if precise orbits can be computed, and they reveal some characteristics of the space environment in which the satellite is orbiting. Analyzing the perturbations is, hence, a way to characterize how the energy from the Sun is re-emitted by the Earth. When carried out over a long period of time, such an approach makes it possible to quantify the variations of the global radiation budget of the Earth. In addition to the preliminary results presented last year, we assess the validity of the mean model based on the orbits of the GRACE mission and, to a certain extent, on some of the SLR satellite orbits. The accelerometric data of the GRACE satellites are used to evaluate the accuracy of the models accounting for non-gravitational forces, in particular those induced by the albedo and the thermal emissivity. Three data sets are used to investigate the mean effects on the orbit perturbations: Stephens tables (Stephens, 1980), ECMWF (European Centre for Medium-Range Weather Forecasts) data sets and CERES (Clouds and the Earth's Radiant Energy System) data sets (publicly available). From the trajectography point of view, based on a post-fit residual analysis, we analyze which data set leads to the lowest residual level, in order to define which data set appears to be the most suitable one for deriving a new « mean albedo model » from the accelerometric data sets of the GRACE mission. The period of investigation covers the full GRACE

  16. Estimation of coolant void reactivity for CANDU-NG lattice using DRAGON and validation using MCNP5 and TRIPOLI-4.3

    International Nuclear Information System (INIS)

    Karthikeyan, R.; Tellier, R. L.; Hebert, A.

    2006-01-01

    The coolant void reactivity (CVR) is an important safety parameter that needs to be estimated at the design stage of a nuclear reactor. It provides a priori knowledge of the behavior of the system during a transient initiated by a loss of coolant. In the present paper, we have attempted to estimate the CVR for a CANDU New Generation (CANDU-NG) lattice, as proposed at an early stage of the Advanced CANDU Reactor (ACR) development. The CVR was estimated with a development version of the code DRAGON, using the method of characteristics. DRAGON has several advanced self-shielding models incorporated in it, each of them compatible with the method of characteristics. This study brings into focus the performance of these self-shielding models, especially when such a tight lattice is voided. We have also performed assembly calculations in a 2 × 2 pattern for the CANDU-NG fuel, with special emphasis on checkerboard voiding. The results obtained have been validated against the Monte Carlo codes MCNP5 and TRIPOLI-4.3. (authors)
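    Coolant void reactivity is the reactivity difference between the voided and nominal (cooled) lattice states, commonly quoted in mk. With multiplication factors from a lattice code such as DRAGON (or the Monte Carlo references), it follows as in this sketch with hypothetical k-effective values, not the study's results:

```python
# Hypothetical lattice multiplication factors (illustrative only).
k_cooled = 1.1200   # nominal state, coolant present
k_voided = 1.1150   # coolant voided

# Reactivity rho = (k - 1) / k; CVR is the difference, expressed in mk.
rho_cooled = (k_cooled - 1.0) / k_cooled
rho_voided = (k_voided - 1.0) / k_voided
cvr_mk = (rho_voided - rho_cooled) * 1000.0

print(f"CVR = {cvr_mk:+.2f} mk")
```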

  17. External validation of equations to estimate resting energy expenditure in 14952 adults with overweight and obesity and 1948 adults with normal weight from Italy.

    Science.gov (United States)

    Bedogni, Giorgio; Bertoli, Simona; Leone, Alessandro; De Amicis, Ramona; Lucchetti, Elisa; Agosti, Fiorenza; Marazzi, Nicoletta; Battezzati, Alberto; Sartorio, Alessandro

    2017-11-24

    We cross-validated 28 equations to estimate resting energy expenditure (REE) in a very large sample of adults with overweight or obesity. 14952 Caucasian men and women with overweight or obesity and 1498 with normal weight were studied. REE was measured using indirect calorimetry and estimated using two meta-regression equations and 26 other equations. The correct classification fraction (CCF) was defined as the fraction of subjects whose estimated REE was within 10% of measured REE. The highest CCF was 79%, 80%, 72%, 64%, and 63% in subjects with normal weight, overweight, class 1 obesity, class 2 obesity, and class 3 obesity, respectively. The Henry weight and height and Mifflin equations performed equally well with CCFs of 77% vs. 77% for subjects with normal weight, 80% vs. 80% for those with overweight, 72% vs. 72% for those with class 1 obesity, 64% vs. 63% for those with class 2 obesity, and 61% vs. 60% for those with class 3 obesity. The Sabounchi meta-regression equations offered an improvement over the above equations only for class 3 obesity (63%). The accuracy of REE equations decreases with increasing values of body mass index. The Henry weight & height and Mifflin equations are similarly accurate and the Sabounchi equations offer an improvement only in subjects with class 3 obesity. Copyright © 2017 Elsevier Ltd and European Society for Clinical Nutrition and Metabolism. All rights reserved.
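    The correct classification fraction used above is simply the share of subjects whose estimated REE falls within ±10% of the measured value. A sketch with hypothetical measured and estimated REE values (illustrative only):

```python
import numpy as np

# Hypothetical measured vs. equation-estimated REE (kcal/day).
measured  = np.array([1650.0, 1820.0, 2100.0, 1500.0, 1950.0])
estimated = np.array([1700.0, 1600.0, 2050.0, 1680.0, 1900.0])

within_10pct = np.abs(estimated - measured) / measured <= 0.10
ccf = within_10pct.mean()
print(f"CCF = {ccf:.0%}")   # fraction of subjects predicted within +/-10%
```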

  18. Hunger and thirst numeric rating scales are not valid estimates for gastric content volumes: a prospective investigation in healthy children.

    Science.gov (United States)

    Buehrer, Sabin; Hanke, Ursula; Klaghofer, Richard; Fruehauf, Melanie; Weiss, Markus; Schmitz, Achim

    2014-03-01

    A rating scale for thirst and hunger was evaluated as a noninvasive, simple and commonly available tool to estimate preanesthetic gastric volume, a surrogate parameter for the risk of perioperative pulmonary aspiration, in healthy volunteer school-age children. Numeric scales with scores from 0 to 10, combined with smileys, were used to rate thirst and hunger; the ratings were analyzed and compared with residual gastric volumes as measured by magnetic resonance imaging and with fasting times in three settings: before and for 2 h after drinking clear fluid (group A, 7 ml/kg), before and for 4 vs 6