WorldWideScience

Sample records for protocols sample analyses

  1. Discrepancies in sample size calculations and data analyses reported in randomised trials: comparison of publications with protocols

    DEFF Research Database (Denmark)

    Chan, A.W.; Hrobjartsson, A.; Jorgensen, K.J.

    2008-01-01

    OBJECTIVE: To evaluate how often sample size calculations and methods of statistical analysis are pre-specified or changed in randomised trials. DESIGN: Retrospective cohort study. DATA SOURCE: Protocols and journal publications of published randomised parallel group trials initially approved...... in 1994-5 by the scientific-ethics committees for Copenhagen and Frederiksberg, Denmark (n=70). MAIN OUTCOME MEASURE: Proportion of protocols and publications that did not provide key information about sample size calculations and statistical methods; proportion of trials with discrepancies between...... of handling missing data was described in 16 protocols and 49 publications. 39/49 protocols and 42/43 publications reported the statistical test used to analyse primary outcome measures. Unacknowledged discrepancies between protocols and publications were found for sample size calculations (18/34 trials...
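
    The record above concerns pre-specified sample size calculations. As an illustrative aside (a sketch of the standard normal-approximation formula for comparing two proportions, not the method of any trial in the cohort):

```python
import math
from statistics import NormalDist

def n_per_group(p1, p2, alpha=0.05, power=0.80):
    """Per-group sample size for a two-sided comparison of two proportions,
    using the standard normal-approximation formula. Illustrative only."""
    z_a = NormalDist().inv_cdf(1 - alpha / 2)   # critical value for alpha
    z_b = NormalDist().inv_cdf(power)           # critical value for power
    p_bar = (p1 + p2) / 2
    numerator = (z_a * math.sqrt(2 * p_bar * (1 - p_bar))
                 + z_b * math.sqrt(p1 * (1 - p1) + p2 * (1 - p2))) ** 2
    return math.ceil(numerator / (p1 - p2) ** 2)
```

    Changing any input (expected proportions, alpha, or power) after protocol approval changes the required n, which is exactly the kind of undocumented discrepancy the study looked for.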

  2. MPLEx: a Robust and Universal Protocol for Single-Sample Integrative Proteomic, Metabolomic, and Lipidomic Analyses

    Energy Technology Data Exchange (ETDEWEB)

    Nakayasu, Ernesto S.; Nicora, Carrie D.; Sims, Amy C.; Burnum-Johnson, Kristin E.; Kim, Young-Mo; Kyle, Jennifer E.; Matzke, Melissa M.; Shukla, Anil K.; Chu, Rosalie K.; Schepmoes, Athena A.; Jacobs, Jon M.; Baric, Ralph S.; Webb-Robertson, Bobbie-Jo; Smith, Richard D.; Metz, Thomas O.; Chia, Nicholas

    2016-05-03

    ABSTRACT

    Integrative multi-omics analyses can empower more effective investigation and complete understanding of complex biological systems. Despite recent advances in a range of omics analyses, multi-omic measurements of the same sample are still challenging and current methods have not been well evaluated in terms of reproducibility and broad applicability. Here we adapted a solvent-based method, widely applied for extracting lipids and metabolites, to add proteomics to mass spectrometry-based multi-omics measurements. The metabolite, protein, and lipid extraction (MPLEx) protocol proved to be robust and applicable to a diverse set of sample types, including cell cultures, microbial communities, and tissues. To illustrate the utility of this protocol, an integrative multi-omics analysis was performed using a lung epithelial cell line infected with Middle East respiratory syndrome coronavirus, which showed the impact of this virus on the host glycolytic pathway and also suggested a role for lipids during infection. The MPLEx method is a simple, fast, and robust protocol that can be applied for integrative multi-omic measurements from diverse sample types (e.g., environmental, in vitro, and clinical).

    IMPORTANCE In systems biology studies, the integration of multiple omics measurements (i.e., genomics, transcriptomics, proteomics, metabolomics, and lipidomics) has been shown to provide a more complete and informative view of biological pathways. Thus, the prospect of extracting different types of molecules (e.g., DNAs, RNAs, proteins, and metabolites) and performing multiple omics measurements on single samples is very attractive, but such studies are challenging due to the fact that the extraction conditions differ according to the molecule type. Here, we adapted an organic solvent-based extraction method that demonstrated...

  3. Samples and Sampling Protocols for Scientific Investigations | Joel ...

    African Journals Online (AJOL)

    ... from sampling, through sample preparation and calibration, to final measurement and reporting. This paper therefore offers practical guidance on sampling protocols in line with best practice and international standards. Keywords: Sampling, sampling protocols, chain of custody, analysis, documentation ...

  4. A modified FASP protocol for high-throughput preparation of protein samples for mass spectrometry.

    Directory of Open Access Journals (Sweden)

    Jeremy Potriquet

    Full Text Available To facilitate high-throughput proteomic analyses we have developed a modified FASP protocol which improves the rate at which protein samples can be processed prior to mass spectrometry. Adapting the original FASP protocol to a 96-well format necessitates extended spin times for buffer exchange due to the low centrifugation speeds tolerated by these devices. However, by using 96-well plates with a more robust polyethersulfone molecular weight cutoff membrane, instead of the cellulose membranes typically used in these devices, we could use isopropanol as a wetting agent, decreasing spin times required for buffer exchange from an hour to 30 minutes. In a typical work flow used in our laboratory this equates to a reduction of 3 hours per plate, providing processing times similar to FASP for the processing of up to 96 samples per plate. To test whether our modified protocol produced similar results to FASP and other FASP-like protocols we compared the performance of our modified protocol to the original FASP and the more recently described eFASP and MStern-blot. We show that all FASP-like methods, including our modified protocol, display similar performance in terms of proteins identified and reproducibility. Our results show that our modified FASP protocol is an efficient method for the high-throughput processing of protein samples for mass spectral analysis.

  5. A protocol for analysing mathematics teacher educators' practices

    OpenAIRE

    Kuzle , Ana; Biehler , Rolf

    2015-01-01

    Studying practices in a teaching-learning environment, such as professional development programmes, is a complex and multi-faceted endeavour. While several frameworks exist to help researchers analyse teaching practices, none exist to analyse practices of those who organize professional development programmes, namely mathematics teacher educators. In this paper, based on theoretical as well as empirical results, we present a protocol for capturing different aspects of ...

  6. Protocol for Microplastics Sampling on the Sea Surface and Sample Analysis

    Science.gov (United States)

    Kovač Viršek, Manca; Palatinus, Andreja; Koren, Špela; Peterlin, Monika; Horvat, Petra; Kržan, Andrej

    2016-01-01

    Microplastic pollution in the marine environment is a scientific topic that has received increasing attention over the last decade. The majority of scientific publications address microplastic pollution of the sea surface. The protocol below describes the methodology for sampling, sample preparation, separation and chemical identification of microplastic particles. A manta net fixed on an »A frame« attached to the side of the vessel was used for sampling. Microplastic particles caught in the cod end of the net were separated from samples by visual identification and use of stereomicroscopes. Particles were analyzed for their size using an image analysis program and for their chemical structure using ATR-FTIR and micro FTIR spectroscopy. The described protocol is in line with recommendations for microplastics monitoring published by the Marine Strategy Framework Directive (MSFD) Technical Subgroup on Marine Litter. This written protocol with video guide will support the work of researchers that deal with microplastics monitoring all over the world. PMID:28060297

  7. A new and standardized method to sample and analyse vitreous samples by the Cellient automated cell block system.

    Science.gov (United States)

    Van Ginderdeuren, Rita; Van Calster, Joachim; Stalmans, Peter; Van den Oord, Joost

    2014-08-01

    In this prospective study, a universal protocol for sampling and analysing vitreous material was investigated. Vitreous biopsies are difficult to handle because of the paucity of cells and the gelatinous structure of the vitreous. Histopathological analysis of the vitreous is useful in difficult uveitis cases to differentiate uveitis from lymphoma or infection and to define the type of cellular reaction. One hundred consecutive vitreous samples were analysed with the Cellient tissue processor (Hologic), a fully automated processor that takes cells from a specified container of PreservCyt (fixative fluid) through to paraffin. Cytology was compared between the fixatives Cytolyt (which contains a mucolytic) and PreservCyt. Routine histochemical and immunohistochemical stains were evaluated. In 92% of the cases, sufficient material was found for diagnosis. In 14%, a Cytolyt wash was necessary to prevent clotting of the tubes in the Cellient due to the viscosity of the sample. In 23%, the diagnosis was acute inflammation (presence of granulocytes); in 33%, chronic active inflammation (presence of T lymphocytes); in 33%, low-grade inflammation (presence of CD68 cells without T lymphocytes); and in 3%, a malignant process. A standardized protocol for sampling and handling vitreous biopsies, fixing in PreservCyt and processing by the Cellient gives satisfactory results in morphology, number of cells and the possibility of immunohistochemical staining. The diagnosis can be established or confirmed in more than 90% of cases. © 2014 Acta Ophthalmologica Scandinavica Foundation. Published by John Wiley & Sons Ltd.

  8. Subgroup analyses in randomised controlled trials: cohort study on trial protocols and journal publications.

    Science.gov (United States)

    Kasenda, Benjamin; Schandelmaier, Stefan; Sun, Xin; von Elm, Erik; You, John; Blümle, Anette; Tomonaga, Yuki; Saccilotto, Ramon; Amstutz, Alain; Bengough, Theresa; Meerpohl, Joerg J; Stegert, Mihaela; Olu, Kelechi K; Tikkinen, Kari A O; Neumann, Ignacio; Carrasco-Labra, Alonso; Faulhaber, Markus; Mulla, Sohail M; Mertz, Dominik; Akl, Elie A; Bassler, Dirk; Busse, Jason W; Ferreira-González, Ignacio; Lamontagne, Francois; Nordmann, Alain; Gloy, Viktoria; Raatz, Heike; Moja, Lorenzo; Rosenthal, Rachel; Ebrahim, Shanil; Vandvik, Per O; Johnston, Bradley C; Walter, Martin A; Burnand, Bernard; Schwenkglenks, Matthias; Hemkens, Lars G; Bucher, Heiner C; Guyatt, Gordon H; Briel, Matthias

    2014-07-16

    To investigate the planning of subgroup analyses in protocols of randomised controlled trials and the agreement with corresponding full journal publications. Cohort of protocols of randomised controlled trial and subsequent full journal publications. Six research ethics committees in Switzerland, Germany, and Canada. 894 protocols of randomised controlled trial involving patients approved by participating research ethics committees between 2000 and 2003 and 515 subsequent full journal publications. Of 894 protocols of randomised controlled trials, 252 (28.2%) included one or more planned subgroup analyses. Of those, 17 (6.7%) provided a clear hypothesis for at least one subgroup analysis, 10 (4.0%) anticipated the direction of a subgroup effect, and 87 (34.5%) planned a statistical test for interaction. Industry sponsored trials more often planned subgroup analyses compared with investigator sponsored trials (195/551 (35.4%) v 57/343 (16.6%), P<0.001). Of 515 identified journal publications, 246 (47.8%) reported at least one subgroup analysis. In 81 (32.9%) of the 246 publications reporting subgroup analyses, authors stated that subgroup analyses were prespecified, but this was not supported by 28 (34.6%) corresponding protocols. In 86 publications, authors claimed a subgroup effect, but only 36 (41.9%) corresponding protocols reported a planned subgroup analysis. Subgroup analyses are insufficiently described in the protocols of randomised controlled trials submitted to research ethics committees, and investigators rarely specify the anticipated direction of subgroup effects. More than one third of statements in publications of randomised controlled trials about subgroup prespecification had no documentation in the corresponding protocols. Definitive judgments regarding credibility of claimed subgroup effects are not possible without access to protocols and analysis plans of randomised controlled trials. © The DISCO study group 2014.
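
    The "statistical test for interaction" planned in some of these protocols compares a treatment effect between subgroups. A minimal sketch (illustrative only, not the DISCO group's analysis code) of the usual z-test on two independent subgroup estimates:

```python
import math
from statistics import NormalDist

def interaction_z_test(effect1, se1, effect2, se2):
    """Two-sided z-test for the difference between two independent subgroup
    effect estimates (e.g. log odds ratios with their standard errors).
    Returns (z, p). Illustrative sketch only."""
    z = (effect1 - effect2) / math.sqrt(se1 ** 2 + se2 ** 2)
    p = 2.0 * (1.0 - NormalDist().cdf(abs(z)))
    return z, p
```

    A claimed subgroup effect is credible only if such a test was prespecified and reaches significance, which is why the study checked protocols for planned interaction tests.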

  9. Sample preparation in foodomic analyses.

    Science.gov (United States)

    Martinović, Tamara; Šrajer Gajdošik, Martina; Josić, Djuro

    2018-04-16

    Representative sampling and adequate sample preparation are key factors for successful performance of further steps in foodomic analyses, as well as for correct data interpretation. Incorrect sampling and improper sample preparation can be sources of severe bias in foodomic analyses. It is well known that errors introduced by incorrect sampling and sample treatment cannot be corrected at later stages. These facts, frequently neglected in the past, are now taken into consideration, and the progress in sampling and sample preparation in foodomics is reviewed here. We report the use of highly sophisticated instruments for both high-performance and high-throughput analyses, as well as miniaturization and the use of laboratory robotics in metabolomics, proteomics, peptidomics and genomics. This article is protected by copyright. All rights reserved.

  10. Analysing Protocol Stacks for Services

    DEFF Research Database (Denmark)

    Gao, Han; Nielson, Flemming; Nielson, Hanne Riis

    2011-01-01

    We show an approach, CaPiTo, to model service-oriented applications using process algebras such that, on the one hand, we can achieve a certain level of abstraction without being overwhelmed by the underlying implementation details and, on the other hand, we respect the concrete industrial...... standards used for implementing the service-oriented applications. By doing so, we will be able to not only reason about applications at different levels of abstractions, but also to build a bridge between the views of researchers on formal methods and developers in industry. We apply our approach...... to the financial case study taken from Chapter 0-3. Finally, we develop a static analysis to analyse the security properties as they emerge at the level of concrete industrial protocols....

  11. Evaluation of sample preparation protocols for spider venom profiling by MALDI-TOF MS.

    Science.gov (United States)

    Bočánek, Ondřej; Šedo, Ondrej; Pekár, Stano; Zdráhal, Zbyněk

    2017-07-01

    Spider venoms are highly complex mixtures containing biologically active substances with potential for use in biotechnology or pharmacology. Fingerprinting of venoms by Matrix-Assisted Laser Desorption/Ionization-Time of Flight Mass Spectrometry (MALDI-TOF MS) is a thriving technology, enabling the rapid detection of peptide/protein components that can provide comparative information. In this study, we evaluated the effects of sample preparation procedures on MALDI-TOF mass spectral quality to establish a protocol providing the most reliable analytical outputs. We adopted initial sample preparation conditions from studies already published in this field. Three different MALDI matrices, three matrix solvents, two sample deposition methods, and different acid concentrations were tested. As a model sample, venom from Brachypelma albopilosa was used. The mass spectra were evaluated on the basis of absolute and relative signal intensities, and signal resolution. By conducting three series of analyses at three weekly intervals, the reproducibility of the mass spectra was assessed as a crucial factor in the selection of optimum conditions. A sample preparation protocol based on the use of an HCCA matrix dissolved in 50% acetonitrile with 2.5% TFA deposited onto the target by the dried-droplet method was found to provide the best results in terms of information yield and repeatability. We propose that this protocol should be followed as a standard procedure, enabling the comparative assessment of MALDI-TOF MS spider venom fingerprints. Copyright © 2017 Elsevier Ltd. All rights reserved.

  12. Bioinspired Security Analysis of Wireless Protocols

    DEFF Research Database (Denmark)

    Petrocchi, Marinella; Spognardi, Angelo; Santi, Paolo

    2016-01-01

    work, this paper investigates the feasibility of adopting fraglets as a model for specifying security protocols and analysing their properties. In particular, we give concrete sample analyses over a secure RFID protocol, showing the evolution of the protocol run as chemical dynamics and simulating an adversary...

  13. Pilot studies for the North American Soil Geochemical Landscapes Project - Site selection, sampling protocols, analytical methods, and quality control protocols

    Science.gov (United States)

    Smith, D.B.; Woodruff, L.G.; O'Leary, R. M.; Cannon, W.F.; Garrett, R.G.; Kilburn, J.E.; Goldhaber, M.B.

    2009-01-01

    In 2004, the US Geological Survey (USGS) and the Geological Survey of Canada sampled and chemically analyzed soils along two transects across Canada and the USA in preparation for a planned soil geochemical survey of North America. This effort was a pilot study to test and refine sampling protocols, analytical methods, quality control protocols, and field logistics for the continental survey. A total of 220 sample sites were selected at approximately 40-km intervals along the two transects. The ideal sampling protocol at each site called for a sample from a depth of 0-5 cm and a composite of each of the O, A, and C horizons. Samples were analyzed for Ca, Fe, K, Mg, Na, S, Ti, Ag, As, Ba, Be, Bi, Cd, Ce, Co, Cr, Cs, Cu, Ga, In, La, Li, Mn, Mo, Nb, Ni, P, Pb, Rb, Sb, Sc, Sn, Sr, Te, Th, Tl, U, V, W, Y, and Zn by inductively coupled plasma-mass spectrometry and inductively coupled plasma-atomic emission spectrometry following a near-total digestion in a mixture of HCl, HNO3, HClO4, and HF. Separate methods were used for Hg, Se, total C, and carbonate-C on this same size fraction. Only Ag, In, and Te had a large percentage of concentrations below the detection limit. Quality control (QC) of the analyses was monitored at three levels: the laboratory performing the analysis, the USGS QC officer, and the principal investigator for the study. This level of review resulted in an average of one QC sample for every 20 field samples, which proved to be minimally adequate for such a large-scale survey. Additional QC samples should be added to monitor within-batch quality to the extent that no more than 10 samples are analyzed between a QC sample. Only Cr (77%), Y (82%), and Sb (80%) fell outside the acceptable limits of accuracy (% recovery between 85 and 115%) because of likely residence in mineral phases resistant to the acid digestion. A separate sample of 0-5-cm material was collected at each site for determination of organic compounds. 
A subset of 73 of these samples was analyzed for a suite of
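
    The QC logic described in this record (percent recovery within 85-115%, roughly one QC sample per 20 field samples) can be sketched as follows; the function names are hypothetical, not USGS software:

```python
def percent_recovery(measured, certified):
    """Percent recovery of an element in a QC reference sample."""
    return 100.0 * measured / certified

def recovery_acceptable(measured, certified, low=85.0, high=115.0):
    """Apply the 85-115% acceptance window used in the pilot study."""
    return low <= percent_recovery(measured, certified) <= high

def qc_positions(n_field, batch=20):
    """Indices at which to insert a QC sample: one per `batch` field
    samples (the pilot averaged about 1 per 20; the authors recommend
    no more than 10 field samples between QC samples)."""
    return list(range(batch, n_field + 1, batch))
```

    Under these rules a Cr recovery of 77% fails the window, matching the Cr result reported above.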

  14. Development of bull trout sampling protocols

    Science.gov (United States)

    R. F. Thurow; J. T. Peterson; J. W. Guzevich

    2001-01-01

    This report describes results of research conducted in Washington in 2000 through Interagency Agreement #134100H002 between the U.S. Fish and Wildlife Service (USFWS) and the U.S. Forest Service Rocky Mountain Research Station (RMRS). The purpose of this agreement is to develop a bull trout (Salvelinus confluentus) sampling protocol by integrating...

  15. Zoonoses action plan Salmonella monitoring programme: an investigation of the sampling protocol.

    Science.gov (United States)

    Snary, E L; Munday, D K; Arnold, M E; Cook, A J C

    2010-03-01

    The Zoonoses Action Plan (ZAP) Salmonella Programme was established by the British Pig Executive to monitor Salmonella prevalence in quality-assured British pigs at slaughter by testing a sample of pigs with a meat juice enzyme-linked immunosorbent assay for antibodies against group B and C(1) Salmonella. Farms were assigned a ZAP level (1 to 3) depending on the monitored prevalence, and ZAP 2 or 3 farms were required to act to reduce the prevalence. The ultimate goal was to reduce the risk of human salmonellosis attributable to British pork. A mathematical model has been developed to describe the ZAP sampling protocol. Results show that the probability of assigning a farm the correct ZAP level was high, except for farms that had a seroprevalence close to the cutoff points between different ZAP levels. Sensitivity analyses identified that the probability of assigning a farm to the correct ZAP level was dependent on the sensitivity and specificity of the test, the number of batches taken to slaughter each quarter, and the number of samples taken per batch. The variability of the predicted seroprevalence was reduced as the number of batches or samples increased and, away from the cutoff points, the probability of being assigned the correct ZAP level increased as the number of batches or samples increased. In summary, the model described here provided invaluable insight into the ZAP sampling protocol. Further work is required to understand the impact of the program for Salmonella infection in British pig farms and therefore on human health.
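
    A sketch of the kind of calculation such a classification model involves: with an imperfect ELISA, the observed seroprevalence is binomial around an "apparent" prevalence, and a farm near a cutoff is easily misclassified. All parameter values below are illustrative, not the actual ZAP cutoffs or test characteristics:

```python
import math

def apparent_prevalence(true_prev, sensitivity, specificity):
    """Seroprevalence expected from an imperfect serological test."""
    return true_prev * sensitivity + (1.0 - true_prev) * (1.0 - specificity)

def prob_below_cutoff(true_prev, sensitivity, specificity, n_samples, cutoff):
    """Probability that the observed seroprevalence in n_samples meat-juice
    tests stays below `cutoff`, i.e. that a low-prevalence farm is assigned
    the lower level. Binomial tail; illustrative parameters only."""
    p = apparent_prevalence(true_prev, sensitivity, specificity)
    k_max = math.ceil(cutoff * n_samples) - 1  # highest count still below cutoff
    return sum(math.comb(n_samples, k) * p ** k * (1.0 - p) ** (n_samples - k)
               for k in range(k_max + 1))
```

    Far below the cutoff this probability approaches 1; when the apparent prevalence sits at the cutoff it hovers near 0.5, which is the misclassification risk the sensitivity analyses above describe.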

  16. Tissue Sampling Guides for Porcine Biomedical Models.

    Science.gov (United States)

    Albl, Barbara; Haesner, Serena; Braun-Reichhart, Christina; Streckel, Elisabeth; Renner, Simone; Seeliger, Frank; Wolf, Eckhard; Wanke, Rüdiger; Blutke, Andreas

    2016-04-01

    This article provides guidelines for organ and tissue sampling adapted to porcine animal models in translational medical research. Detailed protocols for the determination of sampling locations and numbers as well as recommendations on the orientation, size, and trimming direction of samples from ∼50 different porcine organs and tissues are provided in the Supplementary Material. The proposed sampling protocols include the generation of samples suitable for subsequent qualitative and quantitative analyses, including cryohistology, paraffin, and plastic histology; immunohistochemistry; in situ hybridization; electron microscopy; and quantitative stereology as well as molecular analyses of DNA, RNA, proteins, metabolites, and electrolytes. With regard to the planned extent of sampling efforts, time, and personnel expenses, and dependent upon the scheduled analyses, different protocols are provided. These protocols are adjusted for (I) routine screenings, as used in general toxicity studies or in analyses of gene expression patterns or histopathological organ alterations, (II) advanced analyses of single organs/tissues, and (III) large-scale sampling procedures to be applied in biobank projects. Providing a robust reference for studies of porcine models, the described protocols will ensure the efficiency of sampling, the systematic recovery of high-quality samples representing the entire organ or tissue as well as the intra-/interstudy comparability and reproducibility of results. © The Author(s) 2016.

  17. One Sample, One Shot - Evaluation of sample preparation protocols for the mass spectrometric proteome analysis of human bile fluid without extensive fractionation.

    Science.gov (United States)

    Megger, Dominik A; Padden, Juliet; Rosowski, Kristin; Uszkoreit, Julian; Bracht, Thilo; Eisenacher, Martin; Gerges, Christian; Neuhaus, Horst; Schumacher, Brigitte; Schlaak, Jörg F; Sitek, Barbara

    2017-02-10

    The proteome analysis of bile fluid represents a promising strategy to identify biomarker candidates for various diseases of the hepatobiliary system. However, to obtain substantive results in biomarker discovery studies large patient cohorts necessarily need to be analyzed. Consequently, this would lead to an unmanageable number of samples to be analyzed if sample preparation protocols with extensive fractionation methods are applied. Hence, the performance of simple workflows allowing for "one sample, one shot" experiments has been evaluated in this study. In detail, sixteen different protocols involving modifications at the stages of desalting, delipidation, deglycosylation and tryptic digestion have been examined. Each method has been individually evaluated regarding various performance criteria and comparative analyses have been conducted to uncover possible complementarities. Here, the best performance in terms of proteome coverage has been assessed for a combination of acetone precipitation with in-gel digestion. Finally, a mapping of all obtained protein identifications with putative biomarkers for hepatocellular carcinoma (HCC) and cholangiocellular carcinoma (CCC) revealed several proteins easily detectable in bile fluid. These results can build the basis for future studies with large and well-defined patient cohorts in a more disease-related context. Human bile fluid is a proximal body fluid and is supposed to be a potential source of disease markers. However, due to its biochemical composition, the proteome analysis of bile fluid still represents a challenging task and is therefore mostly conducted using extensive fractionation procedures. This in turn leads to a high number of mass spectrometric measurements for one biological sample. Considering the fact that in order to overcome the biological variability a high number of biological samples needs to be analyzed in biomarker discovery studies, this leads to the dilemma of an unmanageable number of

  18. Evaluation of storage and filtration protocols for alpine/subalpine lake water quality samples

    Science.gov (United States)

    John L. Korfmacher; Robert C. Musselman

    2007-01-01

    Many government agencies and other organizations sample natural alpine and subalpine surface waters using varying protocols for sample storage and filtration. Simplification of protocols would be beneficial if it could be shown that sample quality is unaffected. In this study, samples collected from low ionic strength waters in alpine and subalpine lake inlets...

  19. Protocols for 16S rDNA Array Analyses of Microbial Communities by Sequence-Specific Labeling of DNA Probes

    Directory of Open Access Journals (Sweden)

    Knut Rudi

    2003-01-01

    Full Text Available Analyses of complex microbial communities are becoming increasingly important. Bottlenecks in these analyses, however, are the tools to actually describe the biodiversity. Novel protocols for DNA array-based analyses of microbial communities are presented. In these protocols, the specificity obtained by sequence-specific labeling of DNA probes is combined with the possibility of detecting several different probes simultaneously by DNA array hybridization. The gene encoding 16S ribosomal RNA was chosen as the target in these analyses. This gene contains both universally conserved regions and regions with relatively high variability. The universally conserved regions are used for PCR amplification primers, while the variable regions are used for the specific probes. Protocols are presented for DNA purification, probe construction, probe labeling, and DNA array hybridizations.
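
    The conserved-primer/variable-probe logic described above can be illustrated with a toy alignment scan. This is a sketch assuming a gapless, equal-length alignment, not the authors' software:

```python
from collections import Counter

def column_conservation(aligned_seqs):
    """Fraction of sequences sharing the most common base at each column
    of a gapless, equal-length multiple alignment."""
    n = len(aligned_seqs)
    return [Counter(col).most_common(1)[0][1] / n
            for col in zip(*aligned_seqs)]

def classify_columns(aligned_seqs, conserved_min=1.0):
    """Label each column as a primer candidate ('conserved') or a
    probe candidate ('variable')."""
    return ['conserved' if score >= conserved_min else 'variable'
            for score in column_conservation(aligned_seqs)]
```

    Universally conserved columns are where PCR primers can bind across taxa; the variable columns are where sequence-specific probes discriminate community members.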

  20. A multigear protocol for sampling crayfish assemblages in Gulf of Mexico coastal streams

    Science.gov (United States)

    William R. Budnick; William E. Kelso; Susan B. Adams; Michael D. Kaller

    2018-01-01

    Identifying an effective protocol for sampling crayfish in streams that vary in habitat and physical/chemical characteristics has proven problematic. We evaluated an active, combined-gear (backpack electrofishing and dipnetting) sampling protocol in 20 Coastal Plain streams in Louisiana. Using generalized linear models and rarefaction curves, we evaluated environmental...

  1. The perils of straying from protocol: sampling bias and interviewer effects.

    Directory of Open Access Journals (Sweden)

    Carrie J Ngongo

    Full Text Available Fidelity to research protocol is critical. In a contingent valuation study in an informal urban settlement in Nairobi, Kenya, participants responded differently to the three trained interviewers. Interviewer effects were present during the survey pilot, then magnified at the start of the main survey after a seemingly slight adaptation of the survey sampling protocol allowed interviewers to speak with the "closest neighbor" in the event that no one was home at a selected household. This slight degree of interviewer choice led to inferred sampling bias. Multinomial logistic regression and post-estimation tests revealed that the three interviewers' samples differed significantly from one another according to six demographic characteristics. The two female interviewers were 2.8 and 7.7 times less likely to talk with respondents of low socio-economic status than the male interviewer. Systematic error renders it impossible to determine which of the survey responses might be "correct." This experience demonstrates why researchers must take care to strictly follow sampling protocols, consistently train interviewers, and monitor responses by interviewer to ensure similarity between interviewers' groups and produce unbiased estimates of the parameters of interest.

  2. Lead Sampling Protocols: Why So Many and What Do They Tell You?

    Science.gov (United States)

    Sampling protocols can be broadly categorized based on their intended purpose of 1) Pb regulatory compliance/corrosion control efficacy, 2) Pb plumbing source determination or Pb type identification, and 3) Pb exposure assessment. Choosing the appropriate protocol is crucial to p...

  3. The Gas Sampling Interval Effect on V˙O2peak Is Independent of Exercise Protocol.

    Science.gov (United States)

    Scheadler, Cory M; Garver, Matthew J; Hanson, Nicholas J

    2017-09-01

    There is a plethora of gas sampling intervals available during cardiopulmonary exercise testing to measure peak oxygen consumption (V˙O2peak). Different intervals can lead to altered V˙O2peak. Whether differences are affected by the exercise protocol or subject sample is not clear. The purpose of this investigation was to determine whether V˙O2peak differed because of the manipulation of sampling intervals and whether differences were independent of the protocol and subject sample. The first subject sample (24 ± 3 yr; V˙O2peak via 15-breath moving averages: 56.2 ± 6.8 mL·kg-1·min-1) completed the Bruce and the self-paced V˙O2max protocols. The second subject sample (21.9 ± 2.7 yr; V˙O2peak via 15-breath moving averages: 54.2 ± 8.0 mL·kg-1·min-1) completed the Bruce and the modified Astrand protocols. V˙O2peak was identified using five sampling intervals: 15-s block averages, 30-s block averages, 15-breath block averages, 15-breath moving averages, and 30-s block averages aligned to the end of exercise. Differences in V˙O2peak between intervals were determined using repeated-measures ANOVAs. The influence of subject sample on the sampling effect was determined using independent t-tests. There was a significant main effect of sampling interval on V˙O2peak in both subject samples and for all protocols (Bruce and self-paced V˙O2max in the first sample; Bruce and modified Astrand in the second). V˙O2peak across sampling intervals followed a similar pattern for each protocol and subject sample, with the 15-breath moving average presenting the highest V˙O2peak. The effect of manipulating gas sampling intervals on V˙O2peak appears to be protocol and sample independent. These findings highlight our recommendation that the clinical and scientific community request and report the sampling interval whenever metabolic data are presented. The standardization of reporting would assist in the comparison of V˙O2peak.
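
    The sampling intervals compared in this record differ only in how breath-by-breath values are aggregated. A sketch of the two aggregation schemes (window sizes and data below are hypothetical):

```python
def block_averages(values, size):
    """Non-overlapping block averages (e.g. 15-breath block averages)."""
    return [sum(values[i:i + size]) / size
            for i in range(0, len(values) - size + 1, size)]

def moving_averages(values, size):
    """Overlapping moving averages (e.g. 15-breath moving averages)."""
    return [sum(values[i:i + size]) / size
            for i in range(len(values) - size + 1)]

def peak(values, size, moving=True):
    """Peak of the averaged series, i.e. V˙O2peak for that interval."""
    series = moving_averages(values, size) if moving else block_averages(values, size)
    return max(series)
```

    Because every block window is also one of the moving windows, the moving-average peak can never be lower than the block-average peak, which is consistent with the 15-breath moving average presenting the highest V˙O2peak.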

  4. Database communication protocol analyses and security detection

    International Nuclear Information System (INIS)

    Luo Qun; Liu Qiushi

    2003-01-01

    In this paper we introduce an analysis of the TDS protocol used for client-server communication in SYBASE and MICROSOFT SQL SERVER, and test some bugs that exist in the protocol. (authors)

  5. A protocol for analysing thermal stress in insects using infrared thermography.

    Science.gov (United States)

    Gallego, Belén; Verdú, José R; Carrascal, Luis M; Lobo, Jorge M

    2016-02-01

    The study of insect responses to thermal stress has involved a variety of protocols and methodologies that hamper the ability to compare results between studies. For that reason, the development of a protocol to standardize thermal assays is necessary. Infrared thermography solves some of these problems by allowing us to take continuous temperature measurements without handling the individuals, an important consideration in cold-blooded organisms like insects. Here, we present a working protocol based on infrared thermography to estimate both cold and heat thermal stress in insects. We analyse both the change in the body temperature of individuals and their behavioural response. In addition, we used partial least squares regression for the statistical analysis of our data, a technique that solves the problem of having a large number of variables and few individuals, allowing us to work with rare or endemic species. To test our protocol, we chose two species of congeneric, narrowly distributed dung beetles that are endemic to the southeastern part of the Iberian Peninsula. With our protocol we obtained five variables in the response to cold and twelve in the response to heat. With this methodology we discriminated between the two flightless species of Jekelius through their thermal responses. In response to cold, Jekelius hernandezi showed a higher rate of cooling and reached higher stupor and haemolymph freezing temperatures than Jekelius punctatolineatus. Both species displayed similar thermoregulation ranges before reaching lethal body temperature under heat stress. Overall, we have demonstrated that infrared thermography is a suitable method to assess insect thermal responses with a high degree of sensitivity, allowing for discrimination between closely related species.

  6. A Robust PCR Protocol for HIV Drug Resistance Testing on Low-Level Viremia Samples

    Directory of Open Access Journals (Sweden)

    Shivani Gupta

    2017-01-01

    The prevalence of drug resistance (DR) mutations in people with HIV-1 infection, particularly those with low-level viremia (LLV), supports the need to improve the sensitivity of amplification methods for HIV DR genotyping in order to optimize antiretroviral regimens and facilitate HIV-1 DR surveillance and relevant research. Here we report on a fully validated PCR-based protocol that achieves consistent amplification of the protease (PR) and reverse transcriptase (RT) regions of the HIV-1 pol gene across many HIV-1 subtypes from LLV plasma samples. HIV-spiked plasma samples from the External Quality Assurance Program Oversight Laboratory (EQAPOL), covering various HIV-1 subtypes, as well as clinical specimens, were used to optimize and validate the protocol. Our results demonstrate that this protocol has broad HIV-1 subtype coverage and viral load span with high sensitivity and reproducibility. Moreover, the protocol is robust even when plasma sample volumes are limited, the HIV viral load is unknown, and/or the HIV subtype is undetermined. Thus, the protocol is applicable for the initial amplification of the HIV-1 PR and RT genes required for subsequent genotypic DR assays.

  7. Micromechanical photothermal analyser of microfluidic samples

    DEFF Research Database (Denmark)

    2014-01-01

    The present invention relates to a micromechanical photothermal analyser of microfluidic samples comprising an oblong micro-channel extending longitudinally from a support element, the micro-channel is made from at least two materials with different thermal expansion coefficients, wherein...

  8. A simplified field protocol for genetic sampling of birds using buccal swabs

    Science.gov (United States)

    Vilstrup, Julia T.; Mullins, Thomas D.; Miller, Mark P.; McDearman, Will; Walters, Jeffrey R.; Haig, Susan M.

    2018-01-01

    DNA sampling is an essential prerequisite for conducting population genetic studies. For many years, blood sampling has been the preferred method for obtaining DNA in birds because of their nucleated red blood cells. Nonetheless, use of buccal swabs has been gaining favor because they are less invasive yet still yield adequate amounts of DNA for amplifying mitochondrial and nuclear markers; however, buccal swab protocols often include steps (e.g., extended air-drying and storage under frozen conditions) not easily adapted to field settings. Furthermore, commercial extraction kits and swabs for buccal sampling can be expensive for large population studies. We therefore developed an efficient, cost-effective, and field-friendly protocol for sampling wild birds after comparing DNA yield among 3 inexpensive buccal swab types (2 with foam tips and 1 with a cotton tip). Extraction and amplification success was high (100% and 97.2% respectively) using inexpensive generic swabs. We found foam-tipped swabs provided higher DNA yields than cotton-tipped swabs. We further determined that omitting a drying step and storing swabs in Longmire buffer increased efficiency in the field while still yielding sufficient amounts of DNA for detailed population genetic studies using mitochondrial and nuclear markers. This new field protocol allows time- and cost-effective DNA sampling of juveniles or small-bodied birds for which drawing blood may cause excessive stress to birds and technicians alike.

  9. Urine sample collection protocols for bioassay samples

    Energy Technology Data Exchange (ETDEWEB)

    MacLellan, J.A.; McFadden, K.M.

    1992-11-01

    In vitro radiobioassay analyses are used to measure the amount of radioactive material excreted by personnel exposed to the potential intake of radioactive material. The analytical results are then used with various metabolic models to estimate the amount of radioactive material in the subject's body and the original intake of radioactive material. Proper application of these metabolic models requires knowledge of the excretion period. It is normal practice to design the bioassay program based on a 24-hour excretion sample. The Hanford bioassay program simulates a total 24-hour urine excretion sample with urine collection periods lasting from one-half hour before retiring to one-half hour after rising on two consecutive days. Urine passed during the specified periods is collected in three 1-L bottles. Because the daily excretion volume given in Publication 23 of the International Commission on Radiological Protection (ICRP 1975, p. 354) for Reference Man is 1.4 L, it was proposed to use only two 1-L bottles as a cost-saving measure. This raised the broader question of what should be the design capacity of a 24-hour urine sample kit.

  11. Toxoplasma gondii and pre-treatment protocols for polymerase chain reaction analysis of milk samples: a field trial in sheep from Southern Italy

    Directory of Open Access Journals (Sweden)

    Alice Vismarra

    2017-02-01

    Toxoplasmosis is a zoonotic disease caused by the protozoan Toxoplasma gondii. Ingestion of raw milk has been suggested as a risk factor for transmission to humans. Here the authors evaluated pre-treatment protocols for DNA extraction on T. gondii tachyzoite-spiked sheep milk with the aim of identifying the method that resulted in the most rapid and reliable polymerase chain reaction (PCR) positivity. This protocol was then used to analyse milk samples from sheep on three different farms in Southern Italy, including real-time PCR for DNA quantification and PCR-restriction fragment length polymorphism for genotyping. The pre-treatment protocol using ethylenediaminetetraacetic acid and Tris-HCl to remove casein gave the best results in the least amount of time compared to the others on spiked milk samples. One of 21 samples collected from the sheep farms was positive on one-step PCR and real-time PCR, and presented a Type I genotype at one locus (SAG3). Milk usually contains a low number of tachyzoites, and this could be a limiting factor for molecular identification. Our preliminary data establish a rapid, cost-effective and sensitive protocol to treat milk before DNA extraction. The results of the present study also confirm the possibility of T. gondii transmission through consumption of raw milk and its unpasteurised derivatives.

  12. Field sampling, preparation procedure and plutonium analyses of large freshwater samples

    International Nuclear Information System (INIS)

    Straelberg, E.; Bjerk, T.O.; Oestmo, K.; Brittain, J.E.

    2002-01-01

    This work is part of an investigation of the mobility of plutonium in freshwater systems containing humic substances. A well-defined bog-stream system located in the catchment area of a subalpine lake, Oevre Heimdalsvatn, Norway, is being studied. During the summer of 1999, six water samples were collected from the tributary stream Lektorbekken and the lake itself. However, the analyses showed that the plutonium concentration was below the detection limit in all the samples. Therefore renewed sampling at the same sites was carried out in August 2000. The results so far are in agreement with previous analyses from the Heimdalen area. However, 100 times higher concentrations are found in the lowlands in the eastern part of Norway. The reason for this is not understood, but may be caused by differences in the concentrations of humic substances and/or the fact that the mountain areas are covered with snow for a longer period of time every year. (LN)

  13. Fast filtration sampling protocol for mammalian suspension cells tailored for phosphometabolome profiling by capillary ion chromatography - tandem mass spectrometry.

    Science.gov (United States)

    Kvitvang, Hans F N; Bruheim, Per

    2015-08-15

    Capillary ion chromatography (capIC) is the premium separation technology for low molecular weight phosphometabolites and nucleotides in biological extracts. Removal of excessive amounts of salt during sample preparation is a prerequisite for high quality capIC separation in combination with reproducible and sensitive MS detection. Existing sampling protocols for mammalian cells used for GC-MS and LC-MS metabolic profiling can therefore not be directly applied to capIC separations. Here, the development of a fast filtration sampling protocol for mammalian suspension cells tailored for quantitative profiling of the phosphometabolome by capIC-MS/MS is presented. The whole procedure, from sampling the culture to transfer of the filter to quenching and extraction solution, takes less than 10 s. To prevent leakage it is critical that a low vacuum pressure is applied, and satisfactory reproducibility was only obtained by use of a vacuum pressure controlling device. A vacuum of 60 mbar was optimal for filtration of multiple myeloma JJN-3 cell cultures through 5 μm polyvinylidene difluoride (PVDF) filters. A quick deionized water (DI-water) rinse step prior to extraction was tested, and significantly higher metabolite yields were obtained during capIC-MS/MS analyses in this extract compared to extracts prepared with saline and reduced saline (25%) washing steps only. In addition, chromatographic performance was dramatically improved. Thus, it was verified that a quick DI-water rinse is tolerated by the cells and can be included as the final stage during filtration. Over 30 metabolites were quantified in JJN-3 cell extracts using the optimized sampling protocol with subsequent capIC-MS/MS analysis, and up to 2 million cells can be used in a single filtration step for the chosen filter and vacuum pressure. The technical set-up is also highly advantageous for microbial metabolome filtration protocols after optimization of vacuum pressure and washing solutions, and the reduced salt

  14. Integration of GC-MSD and ER-Calux® assay into a single protocol for determining steroid estrogens in environmental samples.

    Science.gov (United States)

    Avberšek, Miha; Žegura, Bojana; Filipič, Metka; Heath, Ester

    2011-11-01

    There are many published studies that use either chemical or biological methods to investigate steroid estrogens in the aquatic environment, but rarer are those that combine both. In this study, gas chromatography with mass selective detection (GC-MSD) and the ER-Calux® estrogenicity assay were integrated into a single protocol for simultaneous determination of the concentrations of natural (estrone, E1; 17β-estradiol, E2; estriol, E3) and synthetic (17α-ethinylestradiol, EE2) steroid estrogens and the total estrogenic potential of environmental samples. For integration purposes, several solvents were investigated, and the dimethyl sulphoxide (DMSO) commonly used in the ER-Calux® assay was replaced by ethyl acetate, which is more compatible with gas chromatography and enables the same sample to be analysed by both GC-MSD and the ER-Calux® assay. The integrated protocol was initially tested using a standard mixture of estrogens. The results for pure standards showed that the estrogenicity calculated on the basis of GC-MSD and the ER-Calux® assay exhibited good correlation (r² = 0.96; α = 0.94). The result remained the same when spiked waste water extracts were tested (r² = 0.92; α = 1.02). When applied to real waste water influent and effluent samples, the results confirmed (r² = 0.93; α = 0.99) the applicability of the protocol. The main advantages of this newly developed protocol are simple sample handling for both methods and reduced material consumption and labour. In addition, it can be applied as either a complete or a sequential analysis where the ER-Calux® assay is used as a pre-screening method prior to the chemical analysis.
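
Comparisons like this typically express the chemical results as 17β-estradiol equivalents (EEQ), summing each measured concentration weighted by a relative estrogenic potency, so that the total can be set against the bioassay response. A minimal sketch; the potency values and the sample are illustrative assumptions, not taken from the study:

```python
# Sketch of an estradiol-equivalent (EEQ) calculation used to compare
# chemical concentrations against a bioassay result. The relative potency
# values below are illustrative assumptions, not the study's values.

RELATIVE_POTENCY = {   # potency relative to 17beta-estradiol (E2 = 1.0)
    "E1": 0.2,    # estrone (assumed)
    "E2": 1.0,    # 17beta-estradiol (by definition)
    "E3": 0.01,   # estriol (assumed)
    "EE2": 1.2,   # 17alpha-ethinylestradiol (assumed)
}

def chemical_eeq(concentrations_ng_per_l):
    """Sum of concentration x relative potency, in ng E2-equivalents per L."""
    return sum(RELATIVE_POTENCY[name] * conc
               for name, conc in concentrations_ng_per_l.items())

effluent = {"E1": 5.0, "E2": 1.0, "E3": 10.0, "EE2": 0.5}  # hypothetical sample
print(round(chemical_eeq(effluent), 2))  # 0.2*5 + 1.0 + 0.01*10 + 1.2*0.5 = 2.7
```

The resulting EEQ is what gets regressed against the assay-derived estrogenicity to obtain correlations like the r² and slope values quoted above.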

  15. Centrifugation protocols: tests to determine optimal lithium heparin and citrate plasma sample quality.

    Science.gov (United States)

    Dimeski, Goce; Solano, Connie; Petroff, Mark K; Hynd, Matthew

    2011-05-01

    Currently, no clear guidelines exist for the most appropriate tests to determine sample quality from centrifugation protocols for plasma sample types, both lithium heparin in gel barrier tubes for biochemistry testing and citrate tubes for coagulation testing. Blood was collected from 14 participants in four lithium heparin and one serum tube with gel barrier. The plasma tubes were centrifuged at four different centrifuge settings and analysed for potassium (K⁺), lactate dehydrogenase (LD), glucose and phosphorus (Pi) at time zero, after storage for six hours at 21°C, and after six days at 2-8°C. At the same time, three citrate tubes were collected, centrifuged at three different centrifuge settings and analysed immediately for prothrombin time/international normalized ratio, activated partial thromboplastin time, derived fibrinogen and surface-activated clotting time (SACT). The biochemistry analytes indicate that plasma is less stable than serum. Plasma sample quality is higher with longer centrifugation time and much higher g-force. Blood cells present in the plasma lyse with time or are damaged when transferred into the reaction vessels, causing an increase in K⁺, LD and Pi above the outlined limits. The cells remain active and consume glucose even in cold storage. The SACT was the only coagulation parameter affected by platelet counts >10 × 10⁹/L in the citrate plasma. In addition to the platelet count, a limited but sensitive set of assays (K⁺, LD, glucose and Pi for biochemistry, and SACT for coagulation) can be used to determine appropriate centrifuge settings to consistently obtain the highest quality lithium heparin and citrate plasma samples. The findings will aid laboratories in balancing the need to provide the most accurate results with the best turnaround time.
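
The quality assessment described above amounts to comparing each analyte's change between time zero and re-analysis against an acceptance limit, flagging cell-leakage markers that rise and glucose that falls. A minimal sketch with purely illustrative limits and values, not the study's criteria:

```python
# Sketch of the implied sample-quality check: flag a plasma tube when
# cell-leakage analytes change beyond allowed limits between time zero
# and re-analysis. Limits and measurements are illustrative assumptions.

LIMITS = {"K+": 0.3, "LD": 20.0, "glucose": -0.4, "Pi": 0.1}  # allowed change

def quality_flags(baseline, stored):
    """Return the analytes whose change exceeds the allowed limit."""
    flags = []
    for analyte, limit in LIMITS.items():
        delta = stored[analyte] - baseline[analyte]
        # Glucose falls as cells consume it, so its limit is a floor;
        # the leakage markers (K+, LD, Pi) rise, so theirs are ceilings.
        bad = delta < limit if limit < 0 else delta > limit
        if bad:
            flags.append(analyte)
    return flags

t0 = {"K+": 4.0, "LD": 180.0, "glucose": 5.0, "Pi": 1.10}   # time zero
t6h = {"K+": 4.5, "LD": 195.0, "glucose": 4.5, "Pi": 1.15}  # after storage
print(quality_flags(t0, t6h))  # K+ and glucose exceed their limits here
```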

  16. Robowell: An automated process for monitoring ground water quality using established sampling protocols

    Science.gov (United States)

    Granato, G.E.; Smith, K.P.

    1999-01-01

    Robowell is an automated process for monitoring selected ground water quality properties and constituents by pumping a well or multilevel sampler. Robowell was developed and tested to provide a cost-effective monitoring system that meets the protocols expected for manual sampling. The process uses commercially available electronics, instrumentation, and hardware, so it can be configured to monitor ground water quality using the equipment, purge protocol, and monitoring well design most appropriate for the monitoring site and the contaminants of interest. A Robowell prototype was installed on a sewage treatment plant infiltration bed that overlies a well-studied unconfined sand and gravel aquifer at the Massachusetts Military Reservation, Cape Cod, Massachusetts, during a time when two distinct plumes of constituents were released. The prototype was operated from May 10 to November 13, 1996, and quality-assurance/quality-control measurements demonstrated that the data obtained by the automated method were equivalent to data obtained by manual sampling methods using the same sampling protocols. Water level, specific conductance, pH, water temperature, dissolved oxygen, and dissolved ammonium were monitored by the prototype as the wells were purged according to U.S. Geological Survey (USGS) ground water sampling protocols. Remote access to the data record, via phone modem communications, indicated the arrival of each plume over a few days and the subsequent geochemical reactions over the following weeks. Real-time availability of the monitoring record provided the information needed to initiate manual sampling efforts in response to changes in measured ground water quality, which proved the method and characterized the screened portion of the plume in detail through time. The methods and the case study described are presented to document the process for future use.

  17. Reducing the sampling periods required in protocols for establishing ammonia emissions from pig fattening buildings using measurements and modelling

    NARCIS (Netherlands)

    Mosquera Losada, J.; Ogink, N.W.M.

    2011-01-01

    Ammonia (NH3) emission factors for animal housing systems in the Netherlands are based on measurements using standardised measurement protocols. Both the original Green Label (GL) protocol and the newly developed multi-site sampling protocol are based on year-round sampling periods. The objective

  18. A rapid and efficient DNA extraction protocol from fresh and frozen human blood samples.

    Science.gov (United States)

    Guha, Pokhraj; Das, Avishek; Dutta, Somit; Chaudhuri, Tapas Kumar

    2018-01-01

    Different methods available for the extraction of human genomic DNA suffer from one or more drawbacks, including low yield, compromised quality, cost, time consumption, use of toxic organic solvents, and many more. Herein, we aimed to develop a method to extract DNA from 500 μL of fresh or frozen human blood. Five hundred microliters of fresh and frozen human blood samples were used for standardization of the extraction procedure. The absorbance ratio at 260 and 280 nm (A260/A280) was estimated to check the quality and quantity of the extracted DNA. Qualitative assessment of the extracted DNA was performed by polymerase chain reaction and double digestion of the DNA sample. Our protocol resulted in average yields of 22 ± 2.97 μg and 20.5 ± 3.97 μg from 500 μL of fresh and frozen blood, respectively, which are comparable to many reference protocols and kits. Besides yielding a substantial amount of DNA, our protocol is rapid, economical, and avoids toxic organic solvents such as phenol. Because its quality is unaffected, the DNA is suitable for downstream applications. The protocol may also be useful for pursuing basic molecular research in laboratories with limited funds.

  19. Cooling tower wood sampling and analyses: A case study

    International Nuclear Information System (INIS)

    Haymore, J.L.

    1985-01-01

    Extensive wood sampling and analyses programs were initiated on crossflow and counterflow cooling towers that have been in service since 1951 and 1955, respectively. Wood samples were taken from all areas of the towers and were subjected to biological, chemical and physical tests. The tests and results for the analyses are discussed. The results indicate the degree of wood deterioration, and areas of the towers which experience the most advanced degree of degradation

  20. Development of a protocol for sampling and analysis of ballast water in Jamaica

    Directory of Open Access Journals (Sweden)

    Achsah A Mitchell

    2014-09-01

    The transfer of ballast water by the international shipping industry has negatively impacted the environment, and a sampling and analysis protocol is needed for the region. To design such a protocol, the ballast water tanks of seven bulk cargo vessels entering a Jamaican port were sampled between January 28, 2010 and August 17, 2010. Vessels originated from five ports and used three main routes, some of which conducted ballast water exchange. Twenty-six preserved and 22 live replicate zooplankton samples were obtained. Abundance and richness were higher than at temperate ports. Exchange did not alter the biotic composition but reduced the abundance. Two of the live sample replicates, containing 31.67 and 16.75 viable individuals m⁻³, were non-compliant with the International Convention for the Control and Management of Ships' Ballast Water and Sediments. Approximately 12% of the species identified in the ballast water were present in the waters nearest the port in 1995, and 11% were present in the entire bay in 2005. The protocol designed in this study can be used to aid the establishment of a ballast water management system in the Caribbean or as a foundation for the development of further protocols.
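
The compliance judgement quoted above can be sketched as a density check against the Convention's numeric limit; the D-2 performance standard allows fewer than 10 viable organisms per cubic metre in the ≥50 μm size class (stated here as background on the standard, not taken from the abstract):

```python
# Sketch of a ballast water D-2 compliance check. The limit of fewer than
# 10 viable organisms per m^3 (>= 50 um size class) is stated here as an
# assumption about the applicable performance standard.

D2_LIMIT_PER_M3 = 10.0

def viable_density(count, volume_m3):
    """Viable organisms per cubic metre from a count over a sampled volume."""
    return count / volume_m3

def is_compliant(density_per_m3, limit=D2_LIMIT_PER_M3):
    """True when the density is strictly below the limit."""
    return density_per_m3 < limit

# The two non-compliant replicate densities reported in the study:
for density in (31.67, 16.75):
    print(density, is_compliant(density))  # both exceed the limit
```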

  1. LOD score exclusion analyses for candidate genes using random population samples.

    Science.gov (United States)

    Deng, H W; Li, J; Recker, R R

    2001-05-01

    While extensive analyses have been conducted to test for the importance of candidate genes with random population samples, no formal analyses have been conducted to test against it. We develop a LOD score approach for exclusion analyses of candidate genes with random population samples. Under this approach, specific genetic effects and inheritance models at candidate genes can be analysed, and if a LOD score is ≤ −2.0, the locus can be excluded from having an effect larger than that specified. Computer simulations show that, with sample sizes often employed in association studies, this approach has high power to exclude a gene from having moderate genetic effects. In contrast to regular association analyses, population admixture will not affect the robustness of our analyses; in fact, it renders our analyses more conservative, and thus any significant exclusion result is robust. Our exclusion analysis complements association analysis for candidate genes in random population samples and parallels the exclusion mapping analyses that may be conducted in linkage analyses with pedigrees or relative pairs. The usefulness of the approach is demonstrated by an application testing the importance of the vitamin D receptor and estrogen receptor genes in underlying differential risk of osteoporotic fractures.
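
The exclusion rule is a base-10 log-likelihood-ratio threshold: a locus is excluded from having the specified effect when the LOD score falls at or below −2.0 (i.e. the data are at least 100 times more likely under the null model). A minimal sketch with hypothetical likelihood values:

```python
# Sketch of the LOD-score exclusion rule: LOD is the base-10 logarithm of
# the likelihood ratio of the specified genetic model vs the null model,
# and LOD <= -2.0 excludes the specified effect. Likelihoods are hypothetical.
import math

def lod_score(likelihood_specified, likelihood_null):
    """Base-10 log of the likelihood ratio: specified model over null."""
    return math.log10(likelihood_specified / likelihood_null)

def excluded(lod, threshold=-2.0):
    """A locus is excluded from the specified effect when LOD <= threshold."""
    return lod <= threshold

# Hypothetical likelihoods: the data fit the specified effect 500x worse
# than the no-effect model, so the locus can be excluded.
lod = lod_score(1e-7, 5e-5)
print(round(lod, 2), excluded(lod))
```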

  2. How to take environmental samples for stable isotope analyses

    International Nuclear Information System (INIS)

    Rogers, K.M.

    2009-01-01

    It is possible to analyse a diverse range of samples for environmental investigations. The main types are soil/sediments, vegetation, fauna, shellfish, waste and water. Each type of samples requires different storage and collection methods. Outlined here are the preferred methods of collection to ensure maximum sample integrity and reliability. (author).

  3. How to take environmental samples for stable isotope analyses

    International Nuclear Information System (INIS)

    Rogers, K.M.

    2013-01-01

    It is possible to analyse a diverse range of samples for environmental investigations. The main types are soil/sediments, vegetation, fauna, shellfish, waste and water. Each type of samples requires different storage and collection methods. Outlined here are the preferred methods of collection to ensure maximum sample integrity and reliability. (author).

  4. How to take environmental samples for stable isotope analyses

    International Nuclear Information System (INIS)

    Rogers, K.M.

    2012-01-01

    It is possible to analyse a diverse range of samples for environmental investigations. The main types are soil/sediments, vegetation, fauna, shellfish, waste and water. Each type of samples requires different storage and collection methods. Outlined here are the preferred methods of collection to ensure maximum sample integrity and reliability. (author).

  6. Gamma spectrometric analyses of environmental samples at PINSTECH

    International Nuclear Information System (INIS)

    Faruq, M.U.; Parveen, N.; Ahmed, B.; Aziz, A.

    1979-01-01

    Gamma spectrometric analyses of air and other environmental samples from PINSTECH were carried out. Air particulate samples were analyzed by a Ge(Li) detector on a computer-based multichannel analyzer. Other environmental samples were analyzed by a NaI(Tl) scintillation detector spectrometer and a multichannel analyzer with manual analysis. Concentrations of radionuclides in the media were determined and the sources of their production identified. The age of the fallout was estimated from the ratios of the fission products. (authors)

  7. Analysing Password Protocol Security Against Off-line Dictionary Attacks

    NARCIS (Netherlands)

    Corin, R.J.; Doumen, J.M.; Etalle, Sandro; Busi, Nadia; Gorrieri, Roberto; Martinelli, Fabio

    We study the security of password protocols against off-line dictionary attacks. In addition to the standard adversary abilities, we also consider further cryptographic advantages given to the adversary when considering the password protocol being instantiated with particular encryption schemes. We

  8. Optimization of a sample processing protocol for recovery of Bacillus anthracis spores from soil

    Science.gov (United States)

    Silvestri, Erin E.; Feldhake, David; Griffin, Dale; Lisle, John T.; Nichols, Tonya L.; Shah, Sanjiv; Pemberton, A; Schaefer III, Frank W

    2016-01-01

    Following a release of Bacillus anthracis spores into the environment, there is a potential for lasting environmental contamination in soils, and detection protocols for B. anthracis in environmental matrices are needed. However, identification of B. anthracis within a soil is a difficult task. Processing soil samples helps to remove debris, chemical components, and biological impurities that can interfere with microbiological detection. This study aimed to optimize a previously used indirect processing protocol, which included a series of washing and centrifugation steps. Optimization of the protocol included identifying an ideal extraction diluent and varying the number of wash steps, the initial centrifugation speed, and the sonication and shaking mechanisms. The optimized protocol was demonstrated at two laboratories in order to evaluate the recovery of spores from loamy and sandy soils. The new protocol demonstrated an improved limit of detection for loamy and sandy soils over the non-optimized protocol, with an approximate matrix limit of detection of 14 spores/g of soil. There were no significant differences overall between the two laboratories for either soil type, suggesting that the processing protocol is robust enough to use at multiple laboratories while achieving comparable recoveries.

  9. Optimisation of recovery protocols for double-base smokeless powder residues analysed by total vaporisation (TV) SPME/GC-MS.

    Science.gov (United States)

    Sauzier, Georgina; Bors, Dana; Ash, Jordan; Goodpaster, John V; Lewis, Simon W

    2016-09-01

    The investigation of explosive events requires appropriate evidential protocols to recover and preserve residues from the scene. In this study, a central composite design was used to determine statistically validated optimum recovery parameters for double-base smokeless powder residues on steel, analysed using total vaporisation (TV) SPME/GC-MS. It was found that maximum recovery was obtained using isopropanol-wetted swabs stored under refrigerated conditions, then extracted for 15 min into acetone on the same day as sample collection. These parameters were applied to the recovery of post-blast residues deposited on steel witness surfaces following a PVC pipe bomb detonation, resulting in detection of all target components across the majority of samples. Higher overall recoveries were obtained from plates facing the sides of the device, consistent with the point of first failure occurring in the pipe body, as observed in previous studies. The methodology employed here may be readily applied to a variety of other explosive compounds and thus assist in establishing 'best practice' procedures for explosive investigations.
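
A central composite design of the kind used above combines a two-level factorial core with axial (star) points and centre points. A minimal sketch that generates the coded design points for k factors; the rotatable axial distance alpha = (2^k)^(1/4) is a standard textbook choice, not necessarily the one used in the study:

```python
# Sketch of coded central composite design (CCD) generation: a 2^k factorial
# core, 2k axial points at distance alpha, and centre points. The rotatable
# alpha below is a standard choice, assumed rather than taken from the study.
from itertools import product

def central_composite(k, alpha=None, center_points=1):
    """Return the coded points of a central composite design for k factors."""
    if alpha is None:
        alpha = (2 ** k) ** 0.25  # rotatable axial distance
    factorial = [list(p) for p in product((-1.0, 1.0), repeat=k)]
    axial = []
    for i in range(k):
        for a in (-alpha, alpha):
            pt = [0.0] * k
            pt[i] = a
            axial.append(pt)
    center = [[0.0] * k for _ in range(center_points)]
    return factorial + axial + center

design = central_composite(3)
print(len(design))  # 2^3 factorial + 2*3 axial + 1 centre = 15 runs
```

Each coded point is then mapped onto real factor ranges (e.g. extraction time, storage condition) before the experiments are run and a response-surface model is fitted.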

  10. Chromosomal evolution and phylogenetic analyses in Tayassu ...

    Indian Academy of Sciences (India)

    Chromosome preparation and karyotype description. The material analysed consists of chromosome preparations of the tayassuid species T. pecari (three individuals) and P. tajacu (four individuals), made from short-term lymphocyte cultures of whole blood samples using standard protocols (Chaves et al. 2002).

  11. Hanford analytical sample projections FY 1998 - FY 2002

    International Nuclear Information System (INIS)

    Joyce, S.M.

    1997-01-01

    Sample projections are compiled for the Hanford site based on inputs from the major programs for the years 1998 through 2002. Sample projections are categorized by radiation level, protocol, sample matrix, and program. Analysis requirements are also presented.

  12. Minimal sampling protocol for accurate estimation of urea production: a study with oral [13C]urea in fed and fasted piglets

    NARCIS (Netherlands)

    Oosterveld, Michiel J. S.; Gemke, Reinoud J. B. J.; Dainty, Jack R.; Kulik, Willem; Jakobs, Cornelis; de Meer, Kees

    2005-01-01

    An oral [13C]urea protocol may provide a simple method for measurement of urea production. The validity of single pool calculations in relation to a reduced sampling protocol was assessed. In eight fed and five fasted piglets, plasma urea enrichments from a 10 h sampling protocol were measured

  13. Effects of GPS sampling intensity on home range analyses

    Science.gov (United States)

    Jeffrey J. Kolodzinski; Lawrence V. Tannenbaum; David A. Osborn; Mark C. Conner; W. Mark Ford; Karl V. Miller

    2010-01-01

    The two most common methods for determining home ranges, minimum convex polygon (MCP) and kernel analyses, can be affected by sampling intensity. Despite prior research, it remains unclear how high-intensity sampling regimes affect home range estimations. We used datasets from 14 GPS-collared, white-tailed deer (Odocoileus virginianus) to describe...

  14. Use of reference samples for more accurate RBS analyses

    International Nuclear Information System (INIS)

    Lanford, W.A.; Pelicon, P.; Zorko, B.; Budnar, M.

    2002-01-01

    While one of the primary assets of RBS analysis is that it is quantitative without use of reference samples, for certain types of analyses the precision of the method can be improved by measuring RBS spectra of unknowns relative to the RBS spectra of a similar known sample. The advantage of such an approach is that one can reduce (or eliminate) the uncertainties that arise from error in the detector solid angle, beam current integration efficiency, scattering cross-section, and stopping powers. We have used this approach extensively to determine the composition (x) of homogeneous thin films of TaNx using films of pure Ta as reference samples. Our approach is to measure R = (Ta count in unknown)/(Ta count in standard) and use RUMP to determine the function x(R). Once the function x(R) has been determined, this approach makes it easy to analyze many samples quickly. Other analyses for which this approach has proved useful are determination of the composition (x) of WNx, SiOxHy and SiNxHy, using W, SiO2 and amorphous Si as reference samples, respectively
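
    Once the calibration function x(R) is in hand, routine analysis reduces to measuring the count ratio R and inverting the curve. A minimal sketch of that inversion follows; the calibration points here are made up for illustration (a real curve would come from RUMP simulations of known compositions).

```python
def composition_from_ratio(r, r_cal, x_cal):
    """Linearly interpolate the composition x from the count ratio R.

    r_cal must be sorted ascending; x_cal holds the matching compositions.
    """
    if not (r_cal[0] <= r <= r_cal[-1]):
        raise ValueError("ratio outside calibrated range")
    for (r0, x0), (r1, x1) in zip(zip(r_cal, x_cal), zip(r_cal[1:], x_cal[1:])):
        if r0 <= r <= r1:
            return x0 + (r - r0) * (x1 - x0) / (r1 - r0)

# Hypothetical x(R) calibration for TaNx films (values invented for the sketch):
# the Ta count ratio falls as nitrogen dilutes the film.
r_cal = [0.74, 0.79, 0.85, 0.92, 1.00]
x_cal = [1.00, 0.75, 0.50, 0.25, 0.00]

x = composition_from_ratio(0.88, r_cal, x_cal)  # composition for a measured ratio
```

With a dense enough calibration table, linear interpolation is usually adequate; a smooth fit (spline or polynomial) would serve the same role.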

  15. Thermogravimetric and x-ray diffraction analyses of Luna-24 regolith samples

    International Nuclear Information System (INIS)

    Deshpande, V.V.; Dharwadkar, S.R.; Jakkal, V.S.

    1979-01-01

    Two samples of Luna-24 were analysed by X-ray diffraction and thermogravimetric (TG) techniques. The sample 24123.12 shows a weight loss of nearly 0.85 percent between 230 and 440 deg C, followed by a 1.16 percent weight gain from 500 to 800 deg C. The sample 23190.13 showed only a weight gain of about 1.5 percent from 500 deg C to 900 deg C. X-ray diffraction analyses show the presence of olivine, plagioclase, pigeonite, enstatite, and native iron in both virgin samples. In the heated samples, however, only the native iron was oxidized to iron oxide; the other constituents remain unaltered. (auth.)

  16. A two-hypothesis approach to establishing a life detection/biohazard protocol for planetary samples

    Science.gov (United States)

    Conley, Catharine; Steele, Andrew

    2016-07-01

    The COSPAR policy on performing a biohazard assessment on samples brought from Mars to Earth is framed in the context of a concern for false-positive results. However, as noted during the 2012 Workshop for Life Detection in Samples from Mars (ref. Kminek et al., 2014), a more significant concern for planetary samples brought to Earth is false-negative results, because an undetected biohazard could increase risk to the Earth. This is the reason that stringent contamination control must be a high priority for all Category V Restricted Earth Return missions. A useful conceptual framework for addressing these concerns involves two complementary 'null' hypotheses: testing both of them, together, would allow statistical and community confidence to be developed regarding one or the other conclusion. As noted above, false negatives are of primary concern for safety of the Earth, so the 'Earth Safety null hypothesis' -- that must be disproved to assure low risk to the Earth from samples introduced by Category V Restricted Earth Return missions -- is 'There is native life in these samples.' False positives are of primary concern for Astrobiology, so the 'Astrobiology null hypothesis' -- that must be disproved in order to demonstrate the existence of extraterrestrial life -- is 'There is no life in these samples.' The presence of Earth contamination would render both of these hypotheses more difficult to disprove. Both hypotheses can be tested following a strict science protocol: analyse, interpret, test the hypotheses, and repeat. The science measurements are then undertaken in an iterative fashion that responds to discovery, with both hypotheses testable from interpretation of the scientific data. This is a robust, community-involved activity that ensures maximum science return with minimal sample use.

  17. Modification of ion chromatograph for analyses of radioactive samples

    International Nuclear Information System (INIS)

    Curfman, L.L.; Johnson, S.J.

    1979-01-01

    In ion chromatographic analysis, the sample is injected through a sample loop onto an analytical column where separation occurs. The sample then passes through a suppressor column to remove or neutralize background ions. A flow-through conductivity cell is used as a detector. Depending upon column and eluent selection, ion chromatography can be used for anion or cation analyses. Ion chromatography has proven to be a versatile analytical tool for the analysis of anions in Hanford waste samples. These radioactive samples range from caustic high salt solutions to hydrochloric acid dissolutions of insoluble sludges. Instrument modifications which provide safe and convenient handling of these samples without lengthening analysis time or altering instrument performance are described

  18. PROTOCOL FOR EXAMINATION OF THE INNER CAN CLOSURE WELD REGION FOR 3013 DE CONTAINERS

    Energy Technology Data Exchange (ETDEWEB)

    Mickalonis, J.

    2014-09-16

    The protocol for the examination of the inner can closure weld region (ICCWR) for 3013 DE containers is presented within this report. The protocol includes sectioning of the inner can lid section, documenting the surface condition, measuring corrosion parameters, and storing of samples. This protocol may change as the investigation develops since findings may necessitate additional steps be taken. Details of the previous analyses, which formed the basis for this protocol, are also presented.

  19. Protocols for the analytical characterization of therapeutic monoclonal antibodies. II - Enzymatic and chemical sample preparation.

    Science.gov (United States)

    Bobaly, Balazs; D'Atri, Valentina; Goyon, Alexandre; Colas, Olivier; Beck, Alain; Fekete, Szabolcs; Guillarme, Davy

    2017-08-15

    The analytical characterization of therapeutic monoclonal antibodies and related proteins usually incorporates various sample preparation methodologies. Indeed, quantitative and qualitative information can be enhanced by simplifying the sample, thanks to the removal of sources of heterogeneity (e.g. N-glycans) and/or by decreasing the molecular size of the tested protein by enzymatic or chemical fragmentation. These approaches make the sample more suitable for chromatographic and mass spectrometric analysis. Structural elucidation and quality control (QC) analysis of biopharmaceuticals are usually performed at the intact, subunit and peptide levels. In this paper, general sample preparation approaches used to attain peptide-, subunit- and glycan-level analysis are reviewed. Protocols are described to perform tryptic proteolysis, IdeS and papain digestion, and reduction, as well as deglycosylation by PNGase F and EndoS2 enzymes. Both historical and modern sample preparation methods were compared and evaluated using rituximab and trastuzumab, two reference therapeutic mAb products approved by the Food and Drug Administration (FDA) and European Medicines Agency (EMA). The described protocols may help analysts to develop sample preparation methods in the field of therapeutic protein analysis. Copyright © 2017 Elsevier B.V. All rights reserved.

  20. Protocol for sampling and analysis of bone specimens

    International Nuclear Information System (INIS)

    Aras, N.K.

    2000-01-01

    The iliac crest of hip bone was chosen as the most suitable sampling site for several reasons: Local variation in the elemental concentration along the iliac crest is minimal; Iliac crest biopsies are commonly taken clinically on patients; The cortical part of the sample is small (∼2 mm) and can be separated easily from the trabecular bone; The use of the trabecular part of the iliac crest for trace element analysis has the advantage of rapidly reflecting changes in the composition of bone due to external parameters, including medication. Biopsy studies, although in some ways more difficult than autopsy studies because of the need to obtain the informed consent of the subjects, are potentially more useful than autopsy studies. Thereby many problems of postmortem migration of elements can be avoided and reliable dietary and other data can be collected simultaneously. Select the subjects among the patients undergoing orthopedic surgery for any reason other than osteoporosis. Follow an established protocol to obtain bone biopsies. Patients undergoing surgery should fill in the 'Osteoporosis Project Questionnaire Form', including information on lifestyle variables, dietary intakes, the reason for surgery, etc. If possible, measure the bone mineral density (BMD) prior to removal of the biopsy sample. However, it may not be possible to have BMD results on all the subjects because of the difficulty of DEXA measurement after an accident

  1. Heavy water standards. Qualitative analyses, sample treating, stocking and manipulation

    International Nuclear Information System (INIS)

    Pavelescu, M.; Steflea, D.; Mihancea, I.; Varlam, M.; Irimescu, R.

    1995-01-01

    This paper presents methods and procedures for measuring heavy water concentration, as well as for sampling, stocking and handling of samples to be analysed. The main concentration analysis methods are: mass spectrometry, for concentrations less than 1%; densitometry, for concentrations within the range 1%-99%; and infrared spectrometry, for concentrations above 99%. Procedures of sampling, processing and purification appropriate to these measuring methods were established. 1 tab

  2. Including Below Detection Limit Samples in Post Decommissioning Soil Sample Analyses

    Energy Technology Data Exchange (ETDEWEB)

    Kim, Jung Hwan; Yim, Man Sung [KAIST, Daejeon (Korea, Republic of)

    2016-05-15

    To meet the required standards, the site owner has to show that the soil at the facility has been sufficiently cleaned up. To do this, one must know the contamination of the soil at the site prior to clean up. This involves sampling the soil to identify the degree of contamination. However, there is a technical difficulty in determining how much decontamination should be done. The problem arises when measured samples are below the detection limit. Regulatory guidelines for site reuse after decommissioning are commonly challenged because the majority of the activity in the soil is at or below the limit of detection. Using additional statistical analyses of contaminated soil after decommissioning is expected to have the following advantages: a better and more reliable probabilistic exposure assessment, better economics (lower project costs) and improved communication with the public. This research will develop an approach that defines an acceptable method for demonstrating compliance of decommissioned NPP sites and validates that compliance. Soil samples from NPPs often contain censored data. Conventional methods for dealing with censored data sets are statistically biased and limited in their usefulness.
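
    One standard alternative to biased substitution rules (e.g. replacing every non-detect with LOD/2) is maximum likelihood, which treats each non-detect as "somewhere below the LOD" via the model's CDF. The sketch below assumes a normal activity model with made-up numbers; it illustrates the general technique, not the authors' specific method.

```python
import math

def phi(z):
    """Standard normal CDF."""
    return 0.5 * (1.0 + math.erf(z / math.sqrt(2.0)))

def censored_loglik(mu, sigma, detects, n_censored, lod):
    """Log-likelihood: density for detects, CDF mass below LOD for non-detects."""
    if sigma <= 0:
        return float("-inf")
    ll = n_censored * math.log(max(phi((lod - mu) / sigma), 1e-300))
    for x in detects:
        z = (x - mu) / sigma
        ll += -0.5 * z * z - math.log(sigma * math.sqrt(2.0 * math.pi))
    return ll

detects = [1.2, 1.5, 2.1, 1.1, 1.8]   # measured activities (arbitrary units)
n_censored, lod = 5, 1.0              # five results reported as "< 1.0"

# Biased shortcut: substitute LOD/2 for every non-detect
naive_mean = (sum(detects) + n_censored * lod / 2.0) / (len(detects) + n_censored)

# Crude grid-search MLE; a real analysis would use a proper optimiser
_, mle_mu, mle_sigma = max(
    (censored_loglik(m / 100.0, s / 100.0, detects, n_censored, lod),
     m / 100.0, s / 100.0)
    for m in range(0, 301) for s in range(5, 201)
)
```

The MLE uses only the fact that censored values lie below the LOD, so it avoids the systematic bias that substitution introduces when a large fraction of results are non-detects.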

  3. Protocol converter for serial communication between digital rectifier controllers and a power plant SCADA system

    Directory of Open Access Journals (Sweden)

    Vukić Vladimir Đ.

    2016-01-01

    The paper describes the protocol converter INT-485-MBRTU, developed for serial communication between the thyristor rectifier (based on the proprietary protocol "INT-CPD-05", according to standard RS-485) and the SCADA system (based on the protocol "Modbus RTU", of the same standard) in the thermal power plant "Nikola Tesla B1". Elementary data on industrial communication protocols and communication gateways are provided. The basic technical characteristics of the "Omron" CJ-series programmable logic controller are described, as well as the developed device INT-485-MBRTU. Protocol converters with two versions of communication software were tested, differing only in one control word intended for a forced successive change of communication sequences, as opposed to an automatic sequence change. The device INT-485-MBRTU, with the program for forced successive change of communication sequences, demonstrated a data-transfer reliability of 100% on a sample of approximately 480 messages. For nearly the same sample, the same protocol converter, with a version of the program without any type of message identifier, transferred less than 60% of the foreseen data. During multiple sixty-hour tests, a data-transfer reliability of at least 99.9979% was recorded, in 100% of the analysed cases, for a sample of nearly 96,000 pairs of sent and received messages. We analysed the results and estimated additional possibilities for application of the INT-485-MBRTU protocol converter.
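
    On the Modbus RTU side of such a gateway, every frame carries a CRC-16 checksum that the converter must compute on transmit and verify on receive. The sketch below implements the standard Modbus CRC (the published algorithm, not the INT-485-MBRTU firmware); the example request bytes are illustrative.

```python
def crc16_modbus(frame: bytes) -> int:
    """CRC-16/MODBUS: init 0xFFFF, reflected polynomial 0xA001, no final XOR."""
    crc = 0xFFFF
    for byte in frame:
        crc ^= byte
        for _ in range(8):
            if crc & 1:
                crc = (crc >> 1) ^ 0xA001
            else:
                crc >>= 1
    return crc

# Example: read-holding-registers request to slave 1 (function 0x03, 2 registers).
# The CRC is appended low byte first, per the Modbus RTU framing rules.
pdu = bytes([0x01, 0x03, 0x00, 0x00, 0x00, 0x02])
crc = crc16_modbus(pdu)
frame = pdu + bytes([crc & 0xFF, crc >> 8])
```

A convenient property of this CRC is that running it over a complete frame, CRC included, yields zero, which is how a receiver (or a gateway such as the one described) validates incoming messages.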

  4. Comparison of Different Sample Preparation Protocols Reveals Lysis Buffer-Specific Extraction Biases in Gram-Negative Bacteria and Human Cells.

    Science.gov (United States)

    Glatter, Timo; Ahrné, Erik; Schmidt, Alexander

    2015-11-06

    We evaluated different in-solution and FASP-based sample preparation strategies for absolute protein quantification. Label-free quantification (LFQ) was employed to compare different sample preparation strategies in the bacterium Pseudomonas aeruginosa and human embryonic kidney cells (HEK), and organismal-specific differences in general performance and enrichment of specific protein classes were noted. The original FASP protocol globally enriched for most proteins in the bacterial sample, whereas the sodium deoxycholate in-solution strategy was more efficient with HEK cells. Although detergents were found to be highly suited for global proteome analysis, higher intensities were obtained for high-abundant nucleic acid-associated protein complexes, like the ribosome and histone proteins, using guanidine hydrochloride. Importantly, we show for the first time that the observable total proteome mass of a sample strongly depends on the sample preparation protocol, with some protocols resulting in a significant underestimation of protein mass due to incomplete protein extraction of biased protein groups. Furthermore, we demonstrate that some of the observed abundance biases can be overcome by incorporating a nuclease treatment step or, alternatively, a correction factor for complementary sample preparation approaches.

  5. The UK Biobank sample handling and storage protocol for the collection, processing and archiving of human blood and urine.

    Science.gov (United States)

    Elliott, Paul; Peakman, Tim C

    2008-04-01

    UK Biobank is a large prospective study in the UK to investigate the role of genetic factors, environmental exposures and lifestyle in the causes of major diseases of late and middle age. Extensive data and biological samples are being collected from 500,000 participants aged between 40 and 69 years. The biological samples that are collected and how they are processed and stored will have a major impact on the future scientific usefulness of the UK Biobank resource. The aim of the UK Biobank sample handling and storage protocol is to specify methods for the collection and storage of participant samples that give maximum scientific return within the available budget. Processing or storage methods that, as far as can be predicted, will preclude current or future assays have been avoided. The protocol was developed through a review of the literature on sample handling and processing, wide consultation within the academic community and peer review. Protocol development addressed which samples should be collected, how and when they should be processed and how the processed samples should be stored to ensure their long-term integrity. The recommended protocol was extensively tested in a series of validation studies. UK Biobank collects about 45 ml blood and 9 ml of urine with minimal local processing from each participant using the vacutainer system. A variety of preservatives, anti-coagulants and clot accelerators is used appropriate to the expected end use of the samples. Collection of other material (hair, nails, saliva and faeces) was also considered but rejected for the full cohort. Blood and urine samples from participants are transported overnight by commercial courier to a central laboratory where they are processed and aliquots of urine, plasma, serum, white cells and red cells stored in ultra-low temperature archives. Aliquots of whole blood are also stored for potential future production of immortalized cell lines. A standard panel of haematology assays is

  6. Estimating the Effective Sample Size of Tree Topologies from Bayesian Phylogenetic Analyses

    Science.gov (United States)

    Lanfear, Robert; Hua, Xia; Warren, Dan L.

    2016-01-01

    Bayesian phylogenetic analyses estimate posterior distributions of phylogenetic tree topologies and other parameters using Markov chain Monte Carlo (MCMC) methods. Before making inferences from these distributions, it is important to assess their adequacy. To this end, the effective sample size (ESS) estimates how many truly independent samples of a given parameter the output of the MCMC represents. The ESS of a parameter is frequently much lower than the number of samples taken from the MCMC because sequential samples from the chain can be non-independent due to autocorrelation. Typically, phylogeneticists use a rule of thumb that the ESS of all parameters should be greater than 200. However, we have no method to calculate an ESS of tree topology samples, despite the fact that the tree topology is often the parameter of primary interest and is almost always central to the estimation of other parameters. That is, we lack a method to determine whether we have adequately sampled one of the most important parameters in our analyses. In this study, we address this problem by developing methods to estimate the ESS for tree topologies. We combine these methods with two new diagnostic plots for assessing posterior samples of tree topologies, and compare their performance on simulated and empirical data sets. Combined, the methods we present provide new ways to assess the mixing and convergence of phylogenetic tree topologies in Bayesian MCMC analyses. PMID:27435794
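
    For a scalar parameter, the ESS discussed above is conventionally estimated from the chain's autocorrelation as ESS = N / (1 + 2·Σ ρ_k), truncating the sum once the autocorrelation dies out. The sketch below uses a simple truncate-at-first-nonpositive rule; it illustrates the scalar case only, not the tree-topology methods the paper develops.

```python
def effective_sample_size(chain):
    """ESS = n / (1 + 2 * sum of positive-lag autocorrelations)."""
    n = len(chain)
    mean = sum(chain) / n
    var = sum((x - mean) ** 2 for x in chain) / n
    if var == 0.0:
        return float(n)  # constant chain: autocorrelation undefined
    tau = 1.0  # integrated autocorrelation time
    for lag in range(1, n):
        rho = sum((chain[i] - mean) * (chain[i + lag] - mean)
                  for i in range(n - lag)) / (n * var)
        if rho <= 0.0:  # truncate once autocorrelation is exhausted
            break
        tau += 2.0 * rho
    return n / tau

# A sticky chain (long runs of repeats) yields far fewer effective samples:
print(effective_sample_size([0, 0, 0, 0, 1, 1, 1, 1]))  # about 2.9, not 8
```

The "rule of thumb that the ESS of all parameters should be greater than 200" from the abstract would then be checked against this kind of estimate for each continuous parameter.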

  7. Advanced Curation Protocols for Mars Returned Sample Handling

    Science.gov (United States)

    Bell, M.; Mickelson, E.; Lindstrom, D.; Allton, J.

    Introduction: Johnson Space Center has over 30 years of experience handling precious samples, including Lunar rocks and Antarctic meteorites. However, we recognize that future curation of samples from such missions as Genesis, Stardust, and Mars Sample Return will require a high degree of biosafety combined with extremely low levels of inorganic, organic, and biological contamination. To satisfy these requirements, research in the JSC Advanced Curation Lab is currently focused on two major areas: preliminary examination techniques and cleaning and verification techniques. Preliminary Examination Techniques: In order to minimize the number of paths for contamination we are exploring the synergy between human & robotic sample handling in a controlled environment to help determine the limits of clean curation. Within the Advanced Curation Laboratory is a prototype, next-generation glovebox, which contains a robotic micromanipulator. The remotely operated manipulator has six degrees of freedom and can be programmed to perform repetitive sample handling tasks. Protocols are being tested and developed to perform curation tasks such as rock splitting, weighing, imaging, and storing. Techniques for sample transfer enabling more detailed remote examination without compromising the integrity of sample science are also being developed. The glovebox is equipped with a rapid transfer port through which samples can be passed without exposure. The transfer is accomplished by using a unique seal and engagement system which allows passage between containers while maintaining a first seal to the outside environment and a second seal to prevent the outside of the container cover and port door from becoming contaminated by the material being transferred. Cleaning and Verification Techniques: As part of the contamination control effort, innovative cleaning techniques are being identified and evaluated in conjunction with sensitive cleanliness verification methods.
Towards this

  8. An On-Target Desalting and Concentration Sample Preparation Protocol for MALDI-MS and MS/MS Analysis

    DEFF Research Database (Denmark)

    Zhang, Xumin; Wang, Quanhui; Lou, Xiaomin

    2012-01-01

    2DE coupled with MALDI-MS is one of the most widely used and powerful analytic technologies in proteomics study. The MALDI sample preparation method has been developed and optimized towards the combination of simplicity, sample-cleaning, and sample concentration since its introduction. Here we present a protocol of the so-called Sample loading, Matrix loading, and on-target Wash (SMW) method which fulfills the three criteria by taking advantage of the AnchorChip™ targets. Our method is extremely simple and no pre-desalting or concentration is needed when dealing with samples prepared from 2DE...

  9. Robotic sample preparation for radiochemical plutonium and americium analyses

    International Nuclear Information System (INIS)

    Stalnaker, N.; Beugelsdijk, T.; Thurston, A.; Quintana, J.

    1985-01-01

    A Zymate robotic system has been assembled and programmed to prepare samples for plutonium and americium analyses by radioactivity counting. The system performs two procedures: a simple dilution procedure and a TTA (xylene) extraction of plutonium. To perform the procedures, the robotic system executes 11 unit operations such as weighing, pipetting, mixing, etc. Approximately 150 programs, which require 64 kilobytes of memory, control the system. The system is now being tested with high-purity plutonium metal and plutonium oxide samples. Our studies indicate that the system can give results that agree within 5% at the 95% confidence level with determinations performed manually. 1 ref., 1 fig., 1 tab

  10. [Sampling and measurement methods of the protocol design of the China Nine-Province Survey for blindness, visual impairment and cataract surgery].

    Science.gov (United States)

    Zhao, Jia-liang; Wang, Yu; Gao, Xue-cheng; Ellwein, Leon B; Liu, Hu

    2011-09-01

    To design the protocol of the China nine-province survey for blindness, visual impairment and cataract surgery, in order to evaluate the prevalence and main causes of blindness and visual impairment, and the prevalence and outcomes of cataract surgery. Protocol design began after the task of the national survey for blindness, visual impairment and cataract surgery was accepted from the Department of Medicine, Ministry of Health, China, in November 2005. The protocols of the Beijing Shunyi Eye Study in 1996 and the Guangdong Doumen County Eye Study in 1997, both supported by the World Health Organization, were taken as the basis for the protocol design. Relevant experts were invited to discuss and review the draft protocol. An international advisory committee was established to examine and approve the draft protocol. Finally, the survey protocol was checked and approved by the Department of Medicine, Ministry of Health, China, and the Prevention of Blindness and Deafness Program, WHO. The survey protocol was designed according to the characteristics and the scale of the survey. The contents of the protocol included determination of the target population and survey sites, calculation of the sample size, design of the random sampling, composition and organization of the survey teams, determination of the examinees, the flowchart of the field work, survey items and methods, diagnostic criteria for blindness and moderate and severe visual impairment, measures for quality control, and methods of data management. The designed protocol became the standard and practical protocol for the survey to evaluate the prevalence and main causes of blindness and visual impairment, and the prevalence and outcomes of cataract surgery.

  11. LOD score exclusion analyses for candidate QTLs using random population samples.

    Science.gov (United States)

    Deng, Hong-Wen

    2003-11-01

    While extensive analyses have been conducted to test for, no formal analyses have been conducted to test against, the importance of candidate genes as putative QTLs using random population samples. Previously, we developed an LOD score exclusion mapping approach for candidate genes for complex diseases. Here, we extend this LOD score approach for exclusion analyses of candidate genes for quantitative traits. Under this approach, specific genetic effects (as reflected by heritability) and inheritance models at candidate QTLs can be analyzed and if an LOD score is < or = -2.0, the locus can be excluded from having a heritability larger than that specified. Simulations show that this approach has high power to exclude a candidate gene from having moderate genetic effects if it is not a QTL and is robust to population admixture. Our exclusion analysis complements association analysis for candidate genes as putative QTLs in random population samples. The approach is applied to test the importance of Vitamin D receptor (VDR) gene as a potential QTL underlying the variation of bone mass, an important determinant of osteoporosis.
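
    The LOD score underlying this exclusion rule is simply the base-10 logarithm of a likelihood ratio: a candidate locus is excluded for a specified effect size when LOD <= -2.0, i.e. when the data are at least 100-fold less likely under that model than under the null. A minimal generic illustration (not the paper's genetic model; the log-likelihood values are invented):

```python
import math

def lod_score(loglik_model, loglik_null):
    """Base-10 log of the likelihood ratio between two hypotheses."""
    return (loglik_model - loglik_null) / math.log(10.0)

def excluded(lod, threshold=-2.0):
    """Exclusion rule from the abstract: reject the modelled effect when the
    data are at least 100-fold less likely under it than under the null."""
    return lod <= threshold

# Data 1000x less likely under the candidate-QTL model than under the null:
lod = lod_score(loglik_model=-10.0, loglik_null=-10.0 + math.log(1000.0))
print(excluded(lod))  # True: LOD = -3
```

The same machinery gives the familiar evidence-for threshold as well (e.g. LOD >= 3 corresponding to a 1000-fold likelihood advantage).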

  12. Lessons learned from radioactive/mixed waste analyses at EG&G Idaho, Inc

    International Nuclear Information System (INIS)

    Murphy, R.J.; Sailer, S.J.; Bennett, J.T.; Arvizu, J.S.

    1990-01-01

    For the past 30 years extensive chemical characterizations of environmental and waste samples have been performed by numerous academic, commercial, and government analytical chemistry laboratories for the purposes of research, monitoring, and compliance with regulations. The vast majority of these analyses, however, have been conducted on samples containing natural concentrations of radioactive constituents. It is only within the last decade that a small number of laboratories have been conducting extensive chemical characterizations of highly radioactive samples and consequently have begun to identify many special requirements for the safe and accurate conduct of such analyses. Experience gained from chemical analyses of radioactively contaminated samples has indicated special requirements and actions needed in the following three general areas: Sample collection and preservation; chemical analysis protocols; disposal of waste from chemical analyses. In this paper we will summarize the experience and findings acquired from four years of radioactive sample analyses by the Environmental Chemistry Unit, an analytical chemistry laboratory of EG&G Idaho, Inc. at the Idaho National Engineering Laboratory. 6 tabs

  13. Cancer risk of anti-TNF-α at recommended doses in adult rheumatoid arthritis: a meta-analysis with intention to treat and per protocol analyses.

    Directory of Open Access Journals (Sweden)

    Guillaume Moulis

    BACKGROUND: The risk of malignancies on TNF-α antagonists is controversial. The aim of this survey was to assess cancer risk on TNF-α antagonists in adult rheumatoid arthritis patients, including the five marketed drugs (infliximab, etanercept, adalimumab, golimumab and certolizumab) used in line with the New Drug Application. Furthermore, the relative interest of modified intention-to-treat or per protocol analyses to assess such sparse events remains unknown. METHODOLOGY/PRINCIPAL FINDINGS: Data sources were MEDLINE, CENTRAL, ISI Web of Science, ACR and EULAR meeting abstracts, scientific evaluation of the drugs leading to their marketing approval, and clinicaltrials.gov, until 31 December 2012. We selected double-blind randomized controlled trials in adult rheumatoid arthritis patients, including at least one treatment arm in line with the New Drug Application. We performed random-effect meta-analysis, with modified intention-to-treat and per protocol analyses. Thirty-three trials were included. There was no excess risk of malignancies on anti-TNF-α administered in line with the New Drug Application in the per protocol model (OR 0.93, 95% CI [0.59-1.44]), or in the modified intention-to-treat model (OR 1.27, 95% CI [0.82-1.98]). There was a non-significant tendency towards an excess non-melanoma skin cancer risk in both models (respectively, 1.37 [0.71-2.66] and 1.90 [0.98-3.67]). With a fixed-effect Peto model restricted to trials lasting at least 52 weeks, the overall cancer risk was respectively 1.60 [0.97-2.64] and 1.22 [0.72-2.08]. Whatever the model, modified intention-to-treat analysis led to higher estimates than per protocol analysis. The latter may underestimate the treatment effect when assessing very sparse events and when many patients drop out in placebo arms. In metaregression, there was no differential risk among the five drugs. CONCLUSIONS/SIGNIFICANCE: This study did not find any evidence for an excess cancer risk on TNF
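
    The fixed-effect Peto method used in the sensitivity analysis above pools rare events as OR = exp(Σ(O-E)/ΣV), where O, E and V are the observed events, expected events and hypergeometric variance per trial. A minimal sketch with made-up trial counts (not the study's data):

```python
import math

def peto_odds_ratio(trials):
    """Peto fixed-effect pooled OR for a list of 2x2 tables.

    trials: list of (events_treated, n_treated, events_control, n_control).
    Returns (OR, 95% CI lower, 95% CI upper).
    """
    sum_o_minus_e, sum_v = 0.0, 0.0
    for a, n1, c, n2 in trials:
        n = n1 + n2
        m1 = a + c                                        # total events
        e = n1 * m1 / n                                   # expected events, treated arm
        v = n1 * n2 * m1 * (n - m1) / (n * n * (n - 1))   # hypergeometric variance
        sum_o_minus_e += a - e
        sum_v += v
    log_or = sum_o_minus_e / sum_v
    se = 1.0 / math.sqrt(sum_v)
    return (math.exp(log_or),
            math.exp(log_or - 1.96 * se),
            math.exp(log_or + 1.96 * se))

# Hypothetical trials: (cancers on anti-TNF, N on anti-TNF, cancers on placebo, N on placebo)
or_, lo, hi = peto_odds_ratio([(2, 300, 1, 150), (1, 200, 2, 200), (3, 400, 1, 200)])
```

Peto's method avoids the zero-cell corrections that plague other estimators when events are this sparse, which is why it is often preferred for rare-harm meta-analyses.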

  14. Establishment of a protocol for the gene expression analysis of laser microdissected rat kidney samples with affymetrix genechips

    International Nuclear Information System (INIS)

    Stemmer, Kerstin; Ellinger-Ziegelbauer, Heidrun; Lotz, Kerstin; Ahr, Hans-J.; Dietrich, Daniel R.

    2006-01-01

    Laser microdissection in conjunction with microarray technology allows selective isolation and analysis of specific cell populations, e.g., preneoplastic renal lesions. To date, only limited information is available on sample preparation and preservation techniques that result in both optimal histomorphological preservation of sections and high-quality RNA for microarray analysis. Furthermore, amplification of minute amounts of RNA from microdissected renal samples allowing analysis with genechips has only scantily been addressed to date. The objective of this study was therefore to establish a reliable and reproducible protocol for laser microdissection in conjunction with microarray technology using kidney tissue from Eker rats p.o. treated for 7 days and 6 months with 10 and 1 mg Aristolochic acid/kg bw, respectively. Kidney tissues were preserved in RNAlater or snap frozen. Cryosections were cut and stained with either H and E or cresyl violet for subsequent morphological and RNA quality assessment and laser microdissection. RNA quality was comparable in snap frozen and RNAlater-preserved samples, however, the histomorphological preservation of renal sections was much better following cryopreservation. Moreover, the different staining techniques in combination with sample processing time at room temperature can have an influence on RNA quality. Different RNA amplification protocols were shown to have an impact on gene expression profiles as demonstrated with Affymetrix Rat Genome 230 2.0 arrays. Considering all the parameters analyzed in this study, a protocol for RNA isolation from laser microdissected samples with subsequent Affymetrix chip hybridization was established that was also successfully applied to preneoplastic lesions laser microdissected from Aristolochic acid-treated rats

  15. Interim Basis for PCB Sampling and Analyses

    International Nuclear Information System (INIS)

    BANNING, D.L.

    2001-01-01

This document was developed as an interim basis for sampling and analysis of polychlorinated biphenyls (PCBs) and will be used until a formal data quality objective (DQO) document is prepared and approved. On August 31, 2000, the Framework Agreement for Management of Polychlorinated Biphenyls (PCBs) in Hanford Tank Waste was signed by the U.S. Department of Energy (DOE), the Environmental Protection Agency (EPA), and the Washington State Department of Ecology (Ecology) (Ecology et al. 2000). This agreement outlines the management of double shell tank (DST) waste as Toxic Substance Control Act (TSCA) PCB remediation waste based on a risk-based disposal approval option per Title 40 of the Code of Federal Regulations 761.61 (c). The agreement calls for ''Quantification of PCBs in DSTs, single shell tanks (SSTs), and incoming waste to ensure that the vitrification plant and other ancillary facilities PCB waste acceptance limits and the requirements of the anticipated risk-based disposal approval are met.'' Waste samples will be analyzed for PCBs to satisfy this requirement. This document describes the DQO process undertaken to assure appropriate data will be collected to support management of PCBs and is presented in a DQO format. The DQO process was implemented in accordance with the U.S. Environmental Protection Agency EPA QA/G-4, Guidance for the Data Quality Objectives Process (EPA 1994) and the Data Quality Objectives for Sampling and Analyses, HNF-IP-0842/Rev.1 A, Vol. IV, Section 4.16 (Banning 1999).

  16. A protocol for better design, application, and communication of population viability analyses.

    Science.gov (United States)

    Pe'er, Guy; Matsinos, Yiannis G; Johst, Karin; Franz, Kamila W; Turlure, Camille; Radchuk, Viktoriia; Malinowska, Agnieszka H; Curtis, Janelle M R; Naujokaitis-Lewis, Ilona; Wintle, Brendan A; Henle, Klaus

    2013-08-01

    Population viability analyses (PVAs) contribute to conservation theory, policy, and management. Most PVAs focus on single species within a given landscape and address a specific problem. This specificity often is reflected in the organization of published PVA descriptions. Many lack structure, making them difficult to understand, assess, repeat, or use for drawing generalizations across PVA studies. In an assessment comparing published PVAs and existing guidelines, we found that model selection was rarely justified; important parameters remained neglected or their implementation was described vaguely; limited details were given on parameter ranges, sensitivity analysis, and scenarios; and results were often reported too inconsistently to enable repeatability and comparability. Although many guidelines exist on how to design and implement reliable PVAs and standards exist for documenting and communicating ecological models in general, there is a lack of organized guidelines for designing, applying, and communicating PVAs that account for their diversity of structures and contents. To fill this gap, we integrated published guidelines and recommendations for PVA design and application, protocols for documenting ecological models in general and individual-based models in particular, and our collective experience in developing, applying, and reviewing PVAs. We devised a comprehensive protocol for the design, application, and communication of PVAs (DAC-PVA), which has 3 primary elements. The first defines what a useful PVA is; the second element provides a workflow for the design and application of a useful PVA and highlights important aspects that need to be considered during these processes; and the third element focuses on communication of PVAs to ensure clarity, comprehensiveness, repeatability, and comparability. Thereby, DAC-PVA should strengthen the credibility and relevance of PVAs for policy and management, and improve the capacity to generalize PVA findings

  17. Comparative analysis of five DNA isolation protocols and three drying methods for leaf samples of Nectandra megapotamica (Spreng.) Mez

    Directory of Open Access Journals (Sweden)

    Leonardo Severo da Costa

    2016-06-01

    Full Text Available The aim of the study was to establish a DNA isolation protocol for Nectandra megapotamica (Spreng.) Mez. able to obtain samples of high yield and quality for use in genomic analysis. A commercial kit and four classical methods of DNA extraction were tested, including three cetyltrimethylammonium bromide (CTAB)-based methods and one sodium dodecyl sulfate (SDS)-based method. Three drying methods for leaf samples were also evaluated: drying at room temperature (RT), in an oven at 40 ºC (S40), and in a microwave oven (FMO). The DNA solutions obtained from the different types of leaf samples using the five protocols were assessed in terms of cost, execution time, and quality and yield of extracted DNA. The commercial kit did not extract DNA with sufficient quantity or quality for successful PCR reactions. Among the classical methods, only the protocols of Dellaporta and of Khanuja yielded DNA extractions for all three types of foliar samples that resulted in successful PCR reactions and subsequent enzyme restriction assays. Based on the evaluated variables, the most appropriate DNA extraction method for Nectandra megapotamica (Spreng.) Mez. was that of Dellaporta, regardless of the method used to dry the samples. The selected method has a relatively low cost and total execution time. Moreover, the quality and quantity of DNA extracted using this method was sufficient for DNA sequence amplification using PCR reactions and to obtain restriction fragments.

  18. Sampling and analyses of SRP high-level waste sludges

    International Nuclear Information System (INIS)

    Stone, J.A.; Kelley, J.A.; McMillan, T.S.

    1976-08-01

    Twelve 3-liter samples of high-heat waste sludges were collected from four Savannah River Plant waste tanks with a hydraulically operated sample collector of unique design. Ten of these samples were processed in Savannah River Laboratory shielded cell facilities, yielding 5.3 kg of washed, dried sludge products for waste solidification studies. After initial drying, each batch was washed by settling and decantation to remove the bulk of soluble salts and then was redried. Additional washes were by filtration, followed by final drying. Conclusions from analyses of samples taken during the processing steps were: (a) the raw sludges contained approximately 80 wt percent soluble salts, most of which were removed by the washes; (b) 90Sr and 238,239Pu remained in the sludges, but most of the 137Cs was removed by washing; (c) small amounts of sodium, sulfate, and 137Cs remained in the sludges after thorough washing; (d) no significant differences were found in sludge samples taken from different risers of one waste tank. Chemical and radiometric compositions of the sludge product from each tank were determined. The sludges had diverse compositions, but iron, manganese, aluminum, and uranium were principal elements in each sludge. 90Sr was the predominant radionuclide in each sludge product.

  19. Development of a new protocol for rapid bacterial identification and susceptibility testing directly from urine samples.

    Science.gov (United States)

    Zboromyrska, Y; Rubio, E; Alejo, I; Vergara, A; Mons, A; Campo, I; Bosch, J; Marco, F; Vila, J

    2016-06-01

    The current gold standard method for the diagnosis of urinary tract infections (UTI) is urine culture, which requires 18-48 h for the identification of the causative microorganisms and an additional 24 h until the results of antimicrobial susceptibility testing (AST) are available. The aim of this study was to shorten the time of urine sample processing by a combination of flow cytometry for screening and matrix-assisted laser desorption/ionization time-of-flight mass spectrometry (MALDI-TOF-MS) for bacterial identification followed by AST directly from urine. The study was divided into two parts. During the first part, 675 urine samples were processed by a flow cytometry device and a cut-off value of bacterial count was determined to select samples for direct identification by MALDI-TOF-MS at ≥5 × 10^6 bacteria/mL. During the second part, 163 of 1029 processed samples reached the cut-off value. The sample preparation protocol for direct identification included two centrifugation and two washing steps. Direct AST was performed by the disc diffusion method if a reliable direct identification was obtained. Direct MALDI-TOF-MS identification was performed in 140 urine samples; 125 of the samples were positive by urine culture, 12 were contaminated and 3 were negative. Reliable direct identification was obtained in 108 (86.4%) of the 125 positive samples. AST was performed in 102 identified samples, and the results were fully concordant with the routine method among 83 monomicrobial infections. In conclusion, the turnaround time of the protocol described to diagnose UTI was about 1 h for microbial identification and 18-24 h for AST. Copyright © 2016 European Society of Clinical Microbiology and Infectious Diseases. Published by Elsevier Ltd. All rights reserved.
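
The screening step described above reduces to a simple threshold rule. A minimal sketch (function and route names are hypothetical; only the ≥5 × 10^6 bacteria/mL cut-off comes from the study):

```python
# Triage rule for urine samples based on a flow-cytometry bacterial count,
# as described in the abstract above. Names are illustrative, not from the paper.
CUTOFF = 5e6  # bacteria/mL, the cut-off determined in the study's first phase

def route_sample(bacteria_per_ml: float) -> str:
    """Return the processing route for a urine sample."""
    if bacteria_per_ml >= CUTOFF:
        return "direct MALDI-TOF identification + disc-diffusion AST"
    return "conventional urine culture only"

print(route_sample(8e6))
print(route_sample(1e5))
```

Samples below the cut-off simply continue through conventional culture, so the rule trades one screening measurement for a roughly one-hour identification on high-count samples.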

  20. Multidimensional chromatography coupled to mass spectrometry in analysing complex proteomics samples

    NARCIS (Netherlands)

    Horvatovich, Peter; Hoekman, Berend; Govorukhina, Natalia; Bischoff, Rainer

    Multidimensional chromatography coupled to mass spectrometry (LC(n)-MS) provides greater separation power and an extended dynamic concentration range for analysing complex proteomics samples than one-dimensional liquid chromatography coupled to mass spectrometry (1D-LC-MS). This review gives an

  1. On-farm comparisons of different cleaning protocols in broiler houses.

    Science.gov (United States)

    Luyckx, K Y; Van Weyenberg, S; Dewulf, J; Herman, L; Zoons, J; Vervaet, E; Heyndrickx, M; De Reu, K

    2015-08-01

    The present study evaluated the effectiveness of 4 cleaning protocols designed to reduce the bacteriological infection pressure on broiler farms and prevent food-borne zoonoses. Additionally, difficult-to-clean locations and possible sources of infection were identified. Cleaning and disinfection rounds were evaluated in 12 broiler houses on 5 farms through microbiological analyses and adenosine triphosphate hygiene monitoring. Samples were taken at 3 different times: before cleaning, after cleaning, and after disinfection. At each sampling time, swabs were taken from various locations for enumeration of the total aerobic flora and Enterococcus spp. In addition, before cleaning and after disinfection, testing for Escherichia coli and Salmonella was carried out. Finally, adenosine triphosphate swabs and agar contact plates for total aerobic flora counts were taken after cleaning and disinfection, respectively. Total aerobic flora and Enterococcus spp. counts on the swab samples showed that cleaning protocols preceded by an overnight soaking with water caused a higher bacterial reduction than protocols without a preceding soaking step. Moreover, soaking of broiler houses leads to less water consumption and reduced working time during high-pressure cleaning. No differences were found between protocols using cold or warm water during cleaning. Drinking cups, drain holes, and floor cracks were identified as critical locations for cleaning and disinfection in broiler houses. © 2015 Poultry Science Association Inc.

  2. Premature death of adult adoptees: analyses of a case-cohort sample.

    Science.gov (United States)

    Petersen, Liselotte; Andersen, Per Kragh; Sørensen, Thorkild I A

    2005-05-01

    Genetic and environmental influences on the risk of premature death in adulthood were investigated by estimating the associations in total and cause-specific mortality of adult Danish adoptees and their biological and adoptive parents. Among all 14,425 non-familial adoptions formally granted in Denmark during the period 1924 through 1947, we selected the study population according to a case-cohort sampling design. Like the case-control design, the case-cohort design has the advantage of economical data collection and little loss in statistical efficiency, but the case-cohort sample has the additional advantages that rate ratio estimates may be obtained and that re-use of the cohort sample in future studies of other outcomes is possible. Analyses were performed using Kalbfleisch and Lawless's estimator for the hazard ratio, and robust estimation for variances. In the main analyses, the sample was restricted to adoptees born in 1924 or later who were transferred to their adoptive parents before the age of 7 years, and age at death was restricted to 16 to 70 years. The results showed a higher mortality among adoptees whose biological parents died in the age range of 16 to 70 years; this was significant for deaths from natural causes, vascular causes and all causes. No influence was seen from early death of adoptive parents, regardless of cause of death. (c) 2005 Wiley-Liss, Inc.
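
To illustrate the kind of quantity such a study estimates, here is a crude person-time rate-ratio sketch with a log-scale Wald interval. This is not the Kalbfleisch and Lawless case-cohort estimator used in the study, and all counts are hypothetical:

```python
import math

# Crude person-time mortality rate ratio with a 95% Wald CI on the log scale.
# Illustrative only; the actual study used a case-cohort estimator with
# robust variances, which this sketch does not reproduce.
def rate_ratio(d1, t1, d0, t0, z=1.96):
    """d = deaths, t = person-years; group 1 = exposed, group 0 = unexposed."""
    rr = (d1 / t1) / (d0 / t0)
    se = math.sqrt(1 / d1 + 1 / d0)  # standard error of log(rate ratio)
    lo, hi = rr * math.exp(-z * se), rr * math.exp(z * se)
    return rr, lo, hi

# Made-up numbers: 30 deaths over 10,000 person-years among adoptees whose
# biological parents died early, vs 15 deaths over 10,000 person-years otherwise.
rr, lo, hi = rate_ratio(30, 10_000, 15, 10_000)
print(f"RR = {rr:.2f} (95% CI {lo:.2f}-{hi:.2f})")
```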

  3. Game-theoretic perspective of Ping-Pong protocol

    Science.gov (United States)

    Kaur, Hargeet; Kumar, Atul

    2018-01-01

    We analyse the Ping-Pong protocol from the point of view of a game. The analysis helps us understand the different strategies of a sender and an eavesdropper for gaining the maximum payoff in the game. The study presented here characterizes strategies that lead to different Nash equilibria. We further demonstrate the condition for Pareto optimality depending on the parameters used in the game. Moreover, we also analysed the LM05 protocol and compared it with the PP protocol from the point of view of a generic two-way QKD game with and without entanglement. Our results provide a deeper understanding of general two-way QKD protocols in terms of the security and payoffs of different stakeholders in the protocol.
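
As a toy illustration of the game-theoretic framing (payoff values are invented, not those of the Ping-Pong analysis), pure-strategy Nash equilibria of a 2×2 game can be found by checking mutual best responses:

```python
# Find pure-strategy Nash equilibria of a 2x2 game by best-response checks.
# Payoff matrices are hypothetical placeholders for a sender/eavesdropper game.
def pure_nash(payoff_a, payoff_b):
    """payoff_x[i][j]: payoff to player x when A plays strategy i and B plays j."""
    eqs = []
    for i in range(2):
        for j in range(2):
            # A cannot improve by deviating given B plays j, and vice versa
            a_best = payoff_a[i][j] >= max(payoff_a[k][j] for k in range(2))
            b_best = payoff_b[i][j] >= max(payoff_b[i][k] for k in range(2))
            if a_best and b_best:
                eqs.append((i, j))
    return eqs

A = [[3, 1], [2, 0]]  # sender payoffs (made up)
B = [[1, 2], [0, 3]]  # eavesdropper payoffs (made up)
print(pure_nash(A, B))
```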

  4. Self-sampling with HPV mRNA analyses from vagina and urine compared with cervical samples.

    Science.gov (United States)

    Asciutto, Katrin Christine; Ernstson, Avalon; Forslund, Ola; Borgfeldt, Christer

    2018-04-01

    In order to increase coverage in the organized cervical screening program, self-sampling with HPV analyses has been suggested. The aim was to compare human papillomavirus (HPV) mRNA detection in vaginal and urine self-collected samples with clinician-taken cervical samples and the corresponding clinician-taken histological specimens. Self-collected vaginal, urine and clinician-taken cervical samples were analyzed from 209 women with the Aptima mRNA assay (Hologic Inc, MA, USA). Cervical cytology, colposcopy, biopsy and/or the loop electrosurgical excision procedure (LEEP) were performed in every examination. The sensitivity of the HPV mRNA test in detecting high-grade squamous intraepithelial lesions (HSIL)/adenocarcinoma in situ (AIS)/cancer cases was as follows: for the vaginal self-samples 85.5% (95% CI; 75.0-92.8), the urinary samples 44.8% (95% CI; 32.6-57.4), and for routine cytology 81.7% (95% CI; 70.7-89.9). For the clinician-taken cervical HPV samples the sensitivity of the HPV mRNA test in detecting HSIL/AIS/cancer was 100.0% (95% CI; 94.9-100.0). The specificity of the HPV mRNA was similar for the clinician-taken cervical HPV samples and the self-samples: 49.0% vs. 48.1%. The urinary HPV samples had a specificity of 61.9% and cytology had a specificity of 93.3%. The sensitivity of the Aptima HPV mRNA test in detecting HSIL/AIS/cancer from vaginal self-samples was similar to that of routine cytology. The Aptima HPV mRNA vaginal self-sampling analysis may serve as a complement in screening programs. Copyright © 2018 Elsevier B.V. All rights reserved.
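
Sensitivities of this kind are binomial proportions, so confidence intervals like those quoted can be computed with a Wilson score interval. The counts below are hypothetical, chosen only to land near the vaginal self-sample figure:

```python
import math

# Sensitivity (TP / (TP + FN)) with a 95% Wilson score interval.
# The counts are invented for illustration, not taken from the study.
def sensitivity_wilson(tp, fn, z=1.96):
    n = tp + fn
    p = tp / n
    denom = 1 + z**2 / n
    centre = (p + z**2 / (2 * n)) / denom
    half = z * math.sqrt(p * (1 - p) / n + z**2 / (4 * n**2)) / denom
    return p, centre - half, centre + half

p, lo, hi = sensitivity_wilson(59, 10)  # hypothetical: 59 detected of 69 cases
print(f"sensitivity = {p:.1%} (95% CI {lo:.1%}-{hi:.1%})")
```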

  5. Analyses and characterization of double shell tank

    Energy Technology Data Exchange (ETDEWEB)

    1994-10-04

    Evaporator candidate feed from tank 241-AP-108 (108-AP) was sampled under prescribed protocol. Physical, inorganic, and radiochemical analyses were performed on tank 108-AP. Characterization of evaporator feed tank waste is needed primarily for an evaluation of its suitability to be safely processed through the evaporator. Such analyses should provide sufficient information regarding the waste composition to confidently determine whether constituent concentrations are within not only safe operating limits, but should also be relevant to functional limits for operation of the evaporator. Characterization of tank constituent concentrations should provide data which enable a prediction of where the types and amounts of environmentally hazardous waste are likely to occur in the evaporator product streams.

  6. Analyses and characterization of double shell tank

    International Nuclear Information System (INIS)

    1994-01-01

    Evaporator candidate feed from tank 241-AP-108 (108-AP) was sampled under prescribed protocol. Physical, inorganic, and radiochemical analyses were performed on tank 108-AP. Characterization of evaporator feed tank waste is needed primarily for an evaluation of its suitability to be safely processed through the evaporator. Such analyses should provide sufficient information regarding the waste composition to confidently determine whether constituent concentrations are within not only safe operating limits, but should also be relevant to functional limits for operation of the evaporator. Characterization of tank constituent concentrations should provide data which enable a prediction of where the types and amounts of environmentally hazardous waste are likely to occur in the evaporator product streams

  7. Sampling and sensitivity analyses tools (SaSAT for computational modelling

    Directory of Open Access Journals (Sweden)

    Wilson David P

    2008-02-01

    Full Text Available SaSAT (Sampling and Sensitivity Analysis Tools) is a user-friendly software package for applying uncertainty and sensitivity analyses to mathematical and computational models of arbitrary complexity and context. The toolbox is built in Matlab®, a numerical mathematical software package, and utilises algorithms contained in the Matlab® Statistics Toolbox. However, Matlab® is not required to use SaSAT as the software package is provided as an executable file with all the necessary supplementary files. The SaSAT package is also designed to work seamlessly with Microsoft Excel, but no functionality is forfeited if that software is not available. A comprehensive suite of tools is provided to enable the following tasks to be easily performed: efficient and equitable sampling of parameter space by various methodologies; calculation of correlation coefficients; regression analysis; factor prioritisation; and graphical output of results, including response surfaces, tornado plots, and scatterplots. Use of SaSAT is exemplified by application to a simple epidemic model. To our knowledge, a number of the methods available in SaSAT for performing sensitivity analyses have not previously been used in epidemiological modelling and their usefulness in this context is demonstrated.
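
SaSAT itself is a Matlab® toolbox, but the stratified sampling step it automates can be sketched in a few lines. The following Latin hypercube sampler in NumPy is an illustrative re-implementation, not SaSAT code:

```python
import numpy as np

# Latin hypercube sampling: each parameter's range [0, 1) is split into
# n_samples equal strata, and exactly one point is drawn per stratum,
# with strata independently permuted across parameters.
def latin_hypercube(n_samples, n_params, rng=None):
    rng = np.random.default_rng(rng)
    # one random permutation of strata per parameter (column)
    strata = np.array([rng.permutation(n_samples) for _ in range(n_params)]).T
    # jitter each point uniformly within its stratum
    return (strata + rng.random((n_samples, n_params))) / n_samples

samples = latin_hypercube(10, 2, rng=42)
print(samples.shape)  # (10, 2)
```

Model parameters on other ranges are obtained by mapping these unit-interval samples through the desired inverse distribution functions, after which correlation or regression-based sensitivity measures can be computed.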

  8. New Zealand guidelines for the collection of groundwater samples for chemical and isotopic analyses

    International Nuclear Information System (INIS)

    Rosen, M.R.; Cameron, S.G.; Reeves, R.R.; Taylor, C.B.

    1999-01-01

    Chemical and isotopic analyses of groundwater are important tools for differentiating between the natural composition and human-induced contaminants of groundwater. A comprehensive suite of inorganic water chemical analyses is necessary to characterise waters. The geology of New Zealand is diverse, so it is impractical to characterise a ''typical'' groundwater chemical composition. Each aquifer system should be evaluated individually because the major dissolved species contain useful information about the pathways of water through the soil zone into the aquifer. Analyses of major ions such as chloride, nitrate, potassium and sulphate often give an indication of septic-system and agricultural contamination. The minor ions, while most are not considered contaminants, are often indicators of human activity. Iron and manganese are good indicators of Eh potential, which is an important control on the mobility of many heavy metals. The inexpensive inorganic chemical analytical suite should be used as a guide to the advisability of more expensive contaminant testing. The purpose of this manual is to provide consistent groundwater sampling guidelines for use throughout New Zealand. Sinton's (1998) guide to groundwater sampling techniques provided a sound basis for the accurate collection of groundwater samples. However, Sinton did not address sampling materials and techniques for the collection of samples for ultra-trace component analysis or the collection of environmental isotope samples. These important aspects of groundwater sampling have been included in this updated manual. (author). 30 refs., 12 figs., 5 tabs., 1 appendix

  9. Inter-laboratory variation in DNA damage using a standard comet assay protocol

    DEFF Research Database (Denmark)

    Forchhammer, Lykke; Ersson, Clara; Loft, Steffen

    2012-01-01

    determined the baseline level of DNA strand breaks (SBs)/alkaline labile sites and formamidopyrimidine DNA glycosylase (FPG)-sensitive sites in coded samples of mononuclear blood cells (MNBCs) from healthy volunteers. There were technical problems in seven laboratories in adopting the standard protocol...... analysed by the standard protocol. The SBs and FPG-sensitive sites were measured in the same experiment, indicating that the large spread in the latter lesions was the main reason for the reduced inter-laboratory variation. However, it remains worrying that half of the participating laboratories obtained...

  10. The use of secondary ion mass spectrometry in forensic analyses of ultra-small samples

    Science.gov (United States)

    Cliff, John

    2010-05-01

    It is becoming increasingly important in forensic science to perform chemical and isotopic analyses on very small sample sizes. Moreover, in some instances the signature of interest may be incorporated in a vast background, making analyses by bulk methods impossible. Recent advances in instrumentation make secondary ion mass spectrometry (SIMS) a powerful tool to apply to these problems. As an introduction, we present three types of forensic analyses in which SIMS may be useful. The causal organism of anthrax (Bacillus anthracis) chelates Ca and other metals during spore formation. Thus, the spores contain a trace element signature related to the growth medium that produced the organisms. Although other techniques have been shown to be useful in analyzing these signatures, the sample size requirements are generally relatively large. We have shown that time-of-flight SIMS (TOF-SIMS), combined with multivariate analysis, can clearly separate Bacillus sp. cultures prepared in different growth media using analytical spot sizes containing approximately one nanogram of spores. An important emerging field in forensic analysis is the provenance of fecal pollution. The strategy of choice for these analyses, developing host-specific nucleic acid probes, has met with considerable difficulty due to lack of specificity of the probes. One potentially fruitful strategy is to combine in situ nucleic acid probing with high-precision isotopic analyses. Bulk analyses of human and bovine fecal bacteria, for example, indicate a relative difference in δ13C content of about 4 per mil. We have shown that sample sizes of several nanograms can be analyzed with the IMS 1280 with precisions capable of separating two per mil differences in δ13C. The NanoSIMS 50 is capable of much better spatial resolution than the IMS 1280, albeit at a cost of analytical precision. Nevertheless, we have documented precision capable of separating five per mil differences in δ13C using analytical spots containing

  11. Optimized IMAC-IMAC protocol for phosphopeptide recovery from complex biological samples

    DEFF Research Database (Denmark)

    Ye, Juanying; Zhang, Xumin; Young, Clifford

    2010-01-01

    using Fe(III)-NTA IMAC resin and it proved to be highly selective in the phosphopeptide enrichment of a highly diluted standard sample (1:1000) prior to MALDI MS analysis. We also observed that a higher iron purity led to an increased IMAC enrichment efficiency. The optimized method was then adapted...... to phosphoproteome analyses of cell lysates of high protein complexity. From either 20 microg of mouse sample or 50 microg of Drosophila melanogaster sample, more than 1000 phosphorylation sites were identified in each study using IMAC-IMAC and LC-MS/MS. We demonstrate efficient separation of multiply phosphorylated...... characterization of phosphoproteins in functional phosphoproteomics research projects....

  12. From human monocytes to genome-wide binding sites--a protocol for small amounts of blood: monocyte isolation/ChIP-protocol/library amplification/genome wide computational data analysis.

    Directory of Open Access Journals (Sweden)

    Sebastian Weiterer

    Full Text Available Chromatin immunoprecipitation in combination with a genome-wide analysis via high-throughput sequencing is the state-of-the-art method to gain genome-wide representation of histone modification or transcription factor binding profiles. However, chromatin immunoprecipitation analysis in the context of human experimental samples is limited, especially in the case of blood cells. The typically extremely low yields of precipitated DNA are usually not compatible with library amplification for next generation sequencing. We developed a highly reproducible protocol to present a guideline from the first step of isolating monocytes from a blood sample to analysing the distribution of histone modifications in a genome-wide manner. The protocol describes the whole workflow from isolating monocytes from human blood samples followed by a high-sensitivity and small-scale chromatin immunoprecipitation assay, with guidance for generating libraries compatible with next generation sequencing from small amounts of immunoprecipitated DNA.

  13. Establishing a protocol for element determination in human nail clippings by neutron activation analysis

    International Nuclear Information System (INIS)

    Sanches, Thalita Pinheiro; Saiki, Mitiko

    2011-01-01

    Human nail samples have been analyzed to evaluate occupational exposure, nutritional status and to diagnose certain diseases. However, sampling and washing protocols for nail analyses vary from study to study, not allowing comparisons between studies. One of the difficulties in analyzing nail samples is to eliminate only surface contamination without removing elements of interest in this tissue. In the present study, a protocol was defined in order to obtain reliable results of element concentrations in human nail clippings. Nail clippings collected from all 10 fingers or toes were first pre-cleaned using an ethyl alcohol solution to eliminate microbes. Then, the clippings were cut in small pieces and submitted to different reagents for washing by shaking. Neutron activation analysis (NAA) was applied for nail sample analysis, which consisted of irradiating aliquots of samples together with synthetic elemental standards in the IEA-R1 nuclear research reactor followed by gamma-ray spectrometry. Comparisons made between the results obtained for nails submitted to different reagents for cleaning indicated that the procedure using acetone and Triton X-100 solution is more effective than that of nitric acid solution. Analyses in triplicate of a nail sample gave results with relative standard deviations lower than 15% for most elements, showing the homogeneity of the prepared sample. Qualitative analyses of different nail polishes showed that the presence of the elements determined in the present study is negligible in these products. Quality control of the analytical results indicated that the applied NAA procedure is adequate for human nail analysis. (author)
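
The triplicate homogeneity check mentioned above relies on the relative standard deviation (RSD) of replicate results. A minimal sketch with made-up concentration values:

```python
import statistics

# Relative standard deviation (RSD, %) of replicate analytical results,
# the homogeneity criterion quoted in the abstract (RSD < 15%).
def rsd_percent(values):
    return 100 * statistics.stdev(values) / statistics.mean(values)

# Hypothetical triplicate Zn concentrations in mg/kg (not data from the study)
zn_replicates = [104.0, 98.5, 101.2]
print(f"RSD = {rsd_percent(zn_replicates):.1f}%")
```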

  14. Comparison of three sampling protocols for water quality assessment using macroinvertebrates; Comparacion de tres protocolos de muestreo de macroinvertebrados para determinar la calidad del agua

    Energy Technology Data Exchange (ETDEWEB)

    Puertolas Domenech, L.; Rieradevall Sant, M.; Prat Fornells, N.

    2007-07-01

    The implementation of the Water Framework Directive (WFD, Directive 2000/60/EC) requires the establishment of standardized sampling protocols for the assessment of benthic fauna. In this paper, a comparative study was carried out of several sampling protocols currently used in Spain and Europe (AQEM, EPA and Guadalmed). When the three protocols were evaluated against a list of 12 criteria, Guadalmed satisfied most of them; it therefore appears to be an efficient tool for determining Ecological Status. (Author)

  15. The Impact of Including Below Detection Limit Samples in Post Decommissioning Soil Sample Analyses

    Energy Technology Data Exchange (ETDEWEB)

    Kim, Jung Hwan; Yim, Man-Sung [KAIST, Daejeon (Korea, Republic of)

    2016-10-15

    To meet the required standards, the site owner has to show that the soil at the facility has been sufficiently cleaned up. To do this, one must know the contamination of the soil at the site prior to cleanup. This involves sampling the soil to identify the degree of contamination. However, there is a technical difficulty in determining how much decontamination should be done: the problem arises when measured samples are below the detection limit. Regulatory guidelines for site reuse after decommissioning are commonly challenged because the majority of the activity in the soil is at or below the limit of detection. Using additional statistical analyses of contaminated soil after decommissioning is expected to have the following advantages: a better and more reliable probabilistic exposure assessment, better economics (lower project costs), and improved communication with the public. This research will develop an approach that defines an acceptable method for demonstrating compliance of decommissioned NPP sites and validates that compliance. Soil samples from NPPs often contain censored data. Conventional methods for dealing with censored data sets are statistically biased and limited in their usefulness. In this research, additional methods are applied using real data from a monazite manufacturing factory.
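
One family of "additional statistical analyses" for below-detection-limit data is maximum-likelihood fitting of a left-censored distribution, which can be compared with the common DL/2 substitution. The sketch below uses synthetic lognormal data and an invented detection limit; nothing in it comes from the cited study:

```python
import numpy as np
from scipy import optimize, stats

# Synthetic soil-activity data: lognormal(mu=0, sigma=1) with an invented
# detection limit. Values below the limit are reported only as "censored".
rng = np.random.default_rng(0)
true_mu, true_sigma, dl = 0.0, 1.0, 1.0  # dl: detection limit (made up)
x = rng.lognormal(true_mu, true_sigma, 500)
detected = x >= dl
obs, n_cens = x[detected], np.count_nonzero(~detected)

def neg_loglik(theta):
    """Left-censored lognormal likelihood: pdf for detects, CDF mass for non-detects."""
    mu, log_sigma = theta
    sigma = np.exp(log_sigma)  # keeps sigma positive during optimization
    ll = stats.lognorm.logpdf(obs, s=sigma, scale=np.exp(mu)).sum()
    ll += n_cens * stats.lognorm.logcdf(dl, s=sigma, scale=np.exp(mu))
    return -ll

res = optimize.minimize(neg_loglik, x0=[0.5, 0.0], method="Nelder-Mead")
mu_hat, sigma_hat = res.x[0], np.exp(res.x[1])

# Common alternative: substitute DL/2 for every non-detect, then average
subst_mean = np.where(detected, x, dl / 2).mean()
print(f"MLE mu ~ {mu_hat:.2f}, sigma ~ {sigma_hat:.2f}; DL/2-substitution mean = {subst_mean:.2f}")
```

The censored MLE recovers the underlying distribution parameters, from which any summary (mean, percentiles, exceedance probabilities) can be derived, whereas substitution fixes the non-detects at an arbitrary value.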

  16. Ion Chromatographic Analyses of Sea Waters, Brines and Related Samples

    OpenAIRE

    Nataša Gros

    2013-01-01

    This review focuses on the ion chromatographic methods for the analyses of natural waters with high ionic strength. At the beginning a natural diversity in ionic composition of waters is highlighted and terminology clarified. In continuation a brief overview of other review articles of potential interest is given. A review of ion chromatographic methods is organized in four sections. The first section comprises articles focused on the determination of ionic composition of water samples as com...

  17. Deep-sequencing protocols influence the results obtained in small-RNA sequencing.

    Directory of Open Access Journals (Sweden)

    Joern Toedling

    Full Text Available Second-generation sequencing is a powerful method for identifying and quantifying small-RNA components of cells. However, little attention has been paid to the effects of the choice of sequencing platform and library preparation protocol on the results obtained. We present a thorough comparison of small-RNA sequencing libraries generated from the same embryonic stem cell lines, using different sequencing platforms, which represent the three major second-generation sequencing technologies, and protocols. We have analysed and compared the expression of microRNAs, as well as populations of small RNAs derived from repetitive elements. Despite the fact that different libraries display a good correlation between sequencing platforms, qualitative and quantitative variations in the results were found, depending on the protocol used. Thus, when comparing libraries from different biological samples, it is strongly recommended to use the same sequencing platform and protocol in order to ensure the biological relevance of the comparisons.

  18. Modelling and Verification of Web Services Business Activity Protocol

    DEFF Research Database (Denmark)

    Ravn, Anders Peter; Srba, Jiri; Vighio, Saleem

    2011-01-01

    WS-Business Activity specification defines two coordination protocols in order to ensure a consistent agreement on the outcome of long-running distributed applications. We use the model checker Uppaal to analyse the Business Agreement with Coordination Completion protocol type. Our analyses show ...

  19. Use of a low-background and anti-Compton HpGe gamma-spectrometer in analyses of environmental samples

    Energy Technology Data Exchange (ETDEWEB)

    Qiong, Su; Yamin, Gao [Ministry of Public Health, Beijing, BJ (China). Lab. of Industrial Hygiene

    1989-12-01

    The results of applying an HpGe gamma-spectrometer to the analyses of environmental samples are reported. The spectrometer has a very low background and good Compton suppression. Comparison of the gamma-spectra obtained with and without the anti-coincidence shield for the same samples made the advantage in analysing environmental samples apparent. In the analyses of carp samples, the ratio of the specific activities of {sup 226}Ra in the flesh and bone of the carp was 1:35, which is basically in agreement with the ratio of the accumulation factors, 1:37, as reported in the literature. Thus the spectrometer should play an important role in research on the transfer of low-activity radionuclides in the environment. The method of gamma-ray data processing is also described.

  20. Use of a low-background and anti-Compton HpGe gamma-spectrometer in analyses of environmental samples

    International Nuclear Information System (INIS)

    Su Qiong; Gao Yamin

    1989-01-01

    The results of applying an HpGe gamma-spectrometer to the analyses of environmental samples are reported. The spectrometer has a very low background and good Compton suppression. Comparison of the gamma-spectra obtained with and without the anti-coincidence shield for the same samples made the advantage in analysing environmental samples apparent. In the analyses of carp samples, the ratio of the specific activities of 226Ra in the flesh and bone of the carp was 1:35, which is basically in agreement with the ratio of the accumulation factors, 1:37, as reported in the literature. Thus the spectrometer should play an important role in research on the transfer of low-activity radionuclides in the environment. The method of gamma-ray data processing is also described.

  1. Simultaneous PIXE and PIGE analyses of aerosol samples collected in urban areas

    International Nuclear Information System (INIS)

    Boni, C.; Caruso, E.; Cereda, E.; Marcazzan, G.M.; Redaelli, P.; Bacci, P.

    1988-01-01

    The paper concerns the simultaneous PIXE (Particle Induced X-ray Emission) and PIGE (Proton Induced Gamma-ray Emission) analyses of aerosol samples collected in urban areas. The results show that PIGE can detect Li, F, Na, Al, and Si in fly ashes and F, Na, Al and Si in atmospheric aerosol. The PIXE-PIGE technique has also been applied to 80 samples of atmospheric particulate matter collected above Milan during the winter and summer months of 1986/7, and the average values of concentrations and enrichment factors are given for the detected elements. (U.K.)

  2. Data validation summary report for the 100-BC-5 Operable Unit Round 9 Groundwater Sampling. Revision 0

    International Nuclear Information System (INIS)

    Kearney, A.T.

    1996-03-01

    The information provided in this validation summary report includes chemical analyses of samples from the 100-BC-5 Operable Unit Round 9 Groundwater sampling event. Data from this sampling event and their related quality assurance (QA) samples were reviewed and validated in accordance with Westinghouse Hanford Company (WHC) guidelines at the requested level. Sample analyses included metals, general chemistry, and radiochemistry. Sixty metals samples were analyzed by Quanterra Environmental Services (QES) and Lockheed Analytical Services (LAS). The metals samples were validated using WHC protocols specified in Data Validation Procedures for Chemical Analyses. All qualifiers assigned to the metals data were based on this guidance. Table 1.1 lists the metals sample delivery groups (SDGs) that were validated for this sampling event.

  3. Using the OSL single-aliquot regenerative-dose protocol with quartz extracted from building materials in retrospective dosimetry

    DEFF Research Database (Denmark)

    Bøtter-Jensen, L.; Solongo, S.; Murray, A.S.

    2000-01-01

    We report on the application of the single-aliquot regenerative-dose (SAR) protocol to the optically stimulated luminescence signal from quartz extracted from fired bricks and unfired mortar in retrospective dosimetry. The samples came from a radioactive materials storage facility, with ambient dose rates of about 0.1 mGy/h. A detailed dose-depth profile was analysed from one brick, and compared with dose records from area TL dosemeters. Small-aliquot dose-distributions were analysed from the mortar samples; one associated with the exposed brick, and one from a remote site exposed only...

  4. Passive sampling and analyses of common dissolved fixed gases in groundwater

    International Nuclear Information System (INIS)

    Spalding, Brian Patrick; Watson, David B.

    2008-01-01

    An in situ passive sampler and gas chromatographic protocol for analysis of the major and several minor fixed gases in groundwater was developed. A gas-tight syringe, mated to a short length of silicone tubing, was equilibrated with dissolved gases in groundwater by immersion in monitoring wells and was used to transport and to inject a 0.5 mL gas sample into a gas chromatograph. Using Ar carrier gas, a HaySep DB porous polymer phase, and sequential thermal conductivity and reductive gas detectors allowed good sensitivity for He, Ne, H2, N2, O2, CO, CH4, CO2, and N2O. Within 4 days of immersion in groundwater, samplers initially filled with either He or air attained the same and constant gas composition at an Oak Ridge, Tennessee, site heavily impacted by uranium, acidity, and nitrate. Between June 2006 and July 2007, 12 permanent groundwater wells were used to test the passive samplers in groundwater contaminated by a group of four closed radioactive wastewater seepage ponds; over a thousand passive gas samples from these wells averaged 56% CO2, 32.4% N2, 2.5% O2, 2.5% N2O, 0.20% CH4, 0.096% H2, and 0.023% CO, with an average recovery of 95 ± 14% of the injected gas volume.

  5. Visual Sample Plan (VSP) Software: Designs and Data Analyses for Sampling Contaminated Buildings

    International Nuclear Information System (INIS)

    Pulsipher, Brent A.; Wilson, John E.; Gilbert, Richard O.; Nuffer, Lisa L.; Hassig, Nancy L.

    2005-01-01

    A new module of the Visual Sample Plan (VSP) software has been developed to provide sampling designs and data analyses for potentially contaminated buildings. An important application is assessing levels of contamination in buildings after a terrorist attack. This new module, funded by DHS through the Combating Terrorism Technology Support Office, Technical Support Working Group, was developed to provide a tailored, user-friendly and visually-orientated buildings module within the existing VSP software toolkit, the latest version of which can be downloaded from http://dqo.pnl.gov/vsp. In case of, or when planning against, a chemical, biological, or radionuclide release within a building, the VSP module can be used to quickly and easily develop and visualize technically defensible sampling schemes for walls, floors, ceilings, and other surfaces to statistically determine if contamination is present, its magnitude and extent throughout the building, and if decontamination has been effective. This paper demonstrates the features of this new VSP buildings module, which include: the ability to import building floor plans or to easily draw, manipulate, and view rooms in several ways; being able to insert doors, windows and annotations into a room; 3-D graphic room views with surfaces labeled and floor plans that show building zones that have separate air handling units. The paper also discusses the statistical design and data analysis options available in the buildings module. Design objectives supported include comparing an average to a threshold when the data distribution is normal or unknown, and comparing measurements to a threshold to detect hotspots or to ensure most of the area is uncontaminated when the data distribution is normal or unknown.
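
    The "compare an average to a threshold" objective mentioned above is typically implemented as an upper confidence limit (UCL) on the mean compared against an action level. The following sketch shows the normal-data case with made-up surface readings and a made-up action level; it illustrates the statistical idea only and is not VSP's own implementation.

```python
import numpy as np
from scipy import stats

# Hypothetical wipe-sample readings from one building zone, plus an action level.
rng = np.random.default_rng(1)
readings = rng.normal(loc=8.0, scale=2.0, size=30)
action_level = 10.0

# 95% upper confidence limit on the mean (Student's t).
# The zone is declared compliant if the UCL falls below the action level.
n = len(readings)
ucl95 = readings.mean() + stats.t.ppf(0.95, n - 1) * readings.std(ddof=1) / np.sqrt(n)
compliant = bool(ucl95 < action_level)
```

    Using the UCL rather than the raw sample mean builds the sampling uncertainty into the compliance decision, which is why designs of this kind also drive the number of samples required.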

  6. Protocol: A simple phenol-based method for 96-well extraction of high quality RNA from Arabidopsis

    Directory of Open Access Journals (Sweden)

    Coustham Vincent

    2011-03-01

    Full Text Available Abstract Background Many experiments in modern plant molecular biology require the processing of large numbers of samples for a variety of applications from mutant screens to the analysis of natural variants. A severe bottleneck to many such analyses is the acquisition of good yields of high quality RNA suitable for use in sensitive downstream applications such as real time quantitative reverse-transcription-polymerase chain reaction (real time qRT-PCR). Although several commercial kits are available for high-throughput RNA extraction in 96-well format, only one non-kit method has been described in the literature using the commercial reagent TRIZOL. Results We describe an unusual phenomenon when using TRIZOL reagent with young Arabidopsis seedlings. This prompted us to develop a high-throughput RNA extraction protocol (HTP96) adapted from a well established phenol:chloroform-LiCl method (P:C-L) that is cheap, reliable and requires no specialist equipment. With this protocol 192 high quality RNA samples can be prepared in 96-well format in three hours (less than 1 minute per sample) with less than 1% loss of samples. We demonstrate that the RNA derived from this protocol is of high quality and suitable for use in real time qRT-PCR assays. Conclusion The development of the HTP96 protocol has vastly increased our sample throughput, allowing us to fully exploit the large sample capacity of modern real time qRT-PCR thermocyclers, now commonplace in many labs, and develop an effective high-throughput gene expression platform. We propose that the HTP96 protocol will significantly benefit any plant scientist with the task of obtaining hundreds of high quality RNA extractions.

  7. Assessment of the influence of different sample processing and cold storage duration on plant free proline content analyses.

    Science.gov (United States)

    Teklić, Tihana; Spoljarević, Marija; Stanisavljević, Aleksandar; Lisjak, Miroslav; Vinković, Tomislav; Parađiković, Nada; Andrić, Luka; Hancock, John T

    2010-01-01

    A method which is widely accepted for the analysis of free proline content in plant tissues is based on the use of 3% sulfosalicylic acid as an extractant, followed by spectrophotometric quantification of a proline-ninhydrin complex in toluene. However, sample preparation and storage may influence the proline actually measured, which may give misleading or difficult-to-compare data. To evaluate free proline levels, fresh and frozen strawberry (Fragaria × ananassa Duch.) leaves and soybean [Glycine max (L.) Merr.] hypocotyl tissues were used. These were ground with or without liquid nitrogen and proline extracted with sulfosalicylic acid. A particular focus was the influence of plant sample cold storage duration (1, 4 and 12 weeks at -20°C) on the tissue proline levels measured. The free proline content analyses, carried out in leaves of Fragaria × ananassa Duch. as well as in hypocotyls of Glycine max (L.) Merr., showed a significant influence of the sample preparation method and cold storage period. Long-term storage of up to 12 weeks at -20°C led to a significant increase in the measured proline in all samples analysed. The observed changes in proline content in plant tissue samples stored at -20°C indicate the likelihood of over-estimating the proline content if the proline analyses are delayed. Plant sample processing and cold storage duration thus have an important influence on the results of proline analyses, and it is therefore recommended that samples be ground fresh and analysed immediately. Copyright © 2010 John Wiley & Sons, Ltd.

  8. Evaluation of a lateral flow-based technology card for blood typing using a simplified protocol in a model of extreme blood sampling conditions.

    Science.gov (United States)

    Clavier, Benoît; Pouget, Thomas; Sailliol, Anne

    2018-02-01

    Life-threatening situations requiring blood transfusion under extreme conditions or in remote and austere locations, such as the battlefield or in traffic accidents, would benefit from reliable blood typing practices that are easily understood by a nonscientist or nonlaboratory technician and provide quick results. A simplified protocol was developed for the lateral flow-based device MDmulticard ABO-D-Rh subgroups-K. Its performance was compared to a reference method (PK7300, Beckman Coulter) in native blood samples from donors. The method was tested on blood samples stressed in vitro as a model of hemorrhage cases (through hemodilution using physiologic serum) and dehydration (through hemoconcentration by removing an aliquot of plasma after centrifugation), respectively. A total of 146 tests were performed on 52 samples; 126 in the hemodilution group (42 for each native, diluted 1/2, and diluted 1/4 samples) and 20 in the hemoconcentration group (10 for each native and 10% concentrated samples). Hematocrit in the tested samples ranged from 9.8% to 57.6% while hemoglobin levels ranged from 3.2 to 20.1 g/dL. The phenotype profile detected with the MDmulticard using the simplified protocol resulted in 22 A, seven B, 20 O, and three AB, of which nine were D- and five were Kell positive. No discrepancies were found with respect to the results obtained with the reference method. The simplified protocol for MDmulticard use could be considered a reliable method for blood typing in extreme environment or emergency situations, worsened by red blood cell dilution or concentration. © 2017 AABB.

  9. The Autism Simplex Collection : an international, expertly phenotyped autism sample for genetic and phenotypic analyses

    OpenAIRE

    Buxbaum, Joseph D.; Bolshakova, Nadia; Brownfeld, Jessica M.; Anney, Richard J. L.; Bender, Patrick; Bernier, Raphael; Cook, Edwin H.; Coon, Hilary; Cuccaro, Michael L.; Freitag, Christine M.; Hallmayer, Joachim; Geschwind, Daniel H.; Klauck, Sabine M.; Nurnberger, John I.; Oliveira, Guiomar

    2014-01-01

    Background: There is an urgent need for expanding and enhancing autism spectrum disorder (ASD) samples, in order to better understand causes of ASD. Methods: In a unique public-private partnership, 13 sites with extensive experience in both the assessment and diagnosis of ASD embarked on an ambitious, 2-year program to collect samples for genetic and phenotypic research and begin analyses on these samples. The program was called The Autism Simplex Collection (TASC). TASC sample collection ...

  10. Recent activity on the post-irradiation analyses of nuclear fuels and actinide samples at JAERI

    International Nuclear Information System (INIS)

    Shinohara, Nobuo; Nakahara, Yoshinori; Kohno, Nobuaki; Tsujimoto, Kazufumi

    2003-01-01

    Radiochemical analyses of spent fuels have been carried out at JAERI for contributing to the development of nuclear technologies, where several samples from research reactors and nuclear power plants were analyzed to obtain isotopic compositions and burnups. The history and procedures of the radiochemical analyses are depicted and some recent results are given in this paper. (author)

  11. Computational fragment-based screening using RosettaLigand: the SAMPL3 challenge

    Science.gov (United States)

    Kumar, Ashutosh; Zhang, Kam Y. J.

    2012-05-01

    The SAMPL3 fragment-based virtual screening challenge provides a valuable opportunity for researchers to test their programs, methods and screening protocols in a blind testing environment. We participated in the SAMPL3 challenge and evaluated our virtual fragment screening protocol, which involves RosettaLigand as the core component, by screening a 500-fragment Maybridge library against bovine pancreatic trypsin. Our study reaffirmed that the real test for any virtual screening approach is a blind testing environment. The analyses presented in this paper also showed that virtual screening performance can be improved if a set of known active compounds is available and parameters and methods that yield better enrichment are selected. Our study also highlighted that to achieve accurate orientation and conformation of ligands within a binding site, selecting an appropriate method to calculate partial charges is important. Another finding is that using multiple receptor ensembles in docking does not always yield better enrichment than individual receptors. On the basis of our results and retrospective analyses from the SAMPL3 fragment screening challenge, we anticipate that the chances of success in a fragment screening process could be increased significantly with careful selection of receptor structures, protein flexibility, sufficient conformational sampling within the binding pocket, and accurate assignment of ligand and protein partial charges.
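
    The enrichment the authors refer to is commonly summarized as an enrichment factor: the fraction of known actives recovered in the top-ranked slice of the screened library, divided by the fraction expected from random selection. A minimal sketch with made-up scores and activity labels (RosettaLigand itself is not invoked here):

```python
def enrichment_factor(scores, labels, fraction=0.1):
    """EF at the given top fraction of a ranked list; lower score = better
    (as with docking energies). labels: 1 for active, 0 for inactive."""
    order = sorted(range(len(scores)), key=lambda i: scores[i])
    n_top = max(1, int(len(scores) * fraction))
    top_actives = sum(labels[i] for i in order[:n_top])
    total_actives = sum(labels)
    return (top_actives / total_actives) / fraction

# Toy ranked library: 3 actives, 2 of them in the top 20% of the ranking.
scores = [0.1, 0.2, 0.3, 0.4, 0.5, 0.6, 0.7, 0.8, 0.9, 1.0]
labels = [1, 1, 0, 0, 0, 0, 0, 0, 0, 1]
ef20 = enrichment_factor(scores, labels, fraction=0.2)  # (2/3)/0.2 ≈ 3.33
```

    An EF of 1 means the screen does no better than random selection of the same number of compounds; the comparisons in the abstract (charge models, receptor ensembles) amount to asking which settings push this number higher.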

  12. Surface analyses of electropolished niobium samples for superconducting radio frequency cavity

    International Nuclear Information System (INIS)

    Tyagi, P. V.; Nishiwaki, M.; Saeki, T.; Sawabe, M.; Hayano, H.; Noguchi, T.; Kato, S.

    2010-01-01

    The performance of superconducting radio frequency niobium cavities is sometimes limited by contamination present on the cavity surface. In recent years extensive research has been done to enhance cavity performance by applying improved surface treatments such as mechanical grinding, electropolishing (EP), chemical polishing, tumbling, etc., followed by various rinsing methods such as ultrasonic pure water rinse, alcoholic rinse, high pressure water rinse, hydrogen peroxide rinse, etc. Although good cavity performance has been obtained lately by various post-EP cleaning methods, the detailed nature of the surface contaminants is still not fully characterized, and further efforts in this area are desired. Prior x-ray photoelectron spectroscopy (XPS) analyses of EPed niobium samples treated with fresh EP acid demonstrated that the surfaces were covered mainly with niobium oxide (Nb2O5) along with carbon; in addition, small quantities of sulfur and fluorine were found in secondary ion mass spectroscopy (SIMS) analysis. In this article, the authors present analyses of surface contamination for a series of EPed niobium samples located at various positions of a single-cell niobium cavity followed by ultrapure water rinsing, as well as their endeavor to understand the aging effect of the EP acid solution in terms of the contamination present on the inner surface of the cavity, with the help of surface analytical tools at KEK such as XPS, SIMS, and scanning electron microscopy.

  13. Generalized routing protocols for multihop relay networks

    KAUST Repository

    Khan, Fahd Ahmed

    2011-07-01

    The performance of multihop cooperative networks depends on the routing protocols employed. In this paper we propose the last-n-hop selection protocol, the dual path protocol, the forward-backward last-n-hop selection protocol and the forward-backward dual path protocol for routing data through multihop relay networks. The average symbol error probability performance of the schemes is analysed by simulations. It is shown that close-to-optimal performance can be achieved by using the last-n-hop selection protocol and its forward-backward variant. Furthermore, we compute the complexity of the protocols in terms of the amount of channel state information required and the number of comparisons required for routing the signal through the network. © 2011 IEEE.

  14. Neighborhood sampling: how many streets must an auditor walk?

    Science.gov (United States)

    McMillan, Tracy E; Cubbin, Catherine; Parmenter, Barbara; Medina, Ashley V; Lee, Rebecca E

    2010-03-12

    This study tested the representativeness of four street segment sampling protocols using the Pedestrian Environment Data Scan (PEDS) in eleven neighborhoods surrounding public housing developments in Houston, TX. The four protocols were: (1) all segments, both residential and arterial, contained within the 400 meter radius buffer from the center point of the housing development (the core) were compared with all segments contained between the 400 meter radius buffer and the 800 meter radius buffer (the ring); all residential segments in the core were compared with (2) 75%, (3) 50% and (4) 25% samples of randomly selected residential street segments in the core. Analyses were conducted on five key variables: sidewalk presence; ratings of attractiveness and safety for walking; connectivity; and number of traffic lanes. Some differences were found when comparing all street segments, both residential and arterial, in the core to the ring. Findings suggested that sampling 25% of residential street segments within the 400 m radius of a residence sufficiently represents the pedestrian built environment. Conclusions support more cost effective environmental data collection for physical activity research.
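
    Drawing the 25% residential-segment sample that the study found sufficient is a plain simple-random-sampling step. A sketch with hypothetical segment IDs (the PEDS instrument itself defines what is then measured on each chosen segment):

```python
import random

def sample_segments(segment_ids, fraction=0.25, seed=42):
    """Randomly select a fraction of residential street segments for audit,
    without replacement; the seed makes the audit list reproducible."""
    k = max(1, round(len(segment_ids) * fraction))
    rng = random.Random(seed)
    return sorted(rng.sample(segment_ids, k))

segments = list(range(120))        # hypothetical core residential segment IDs
audit = sample_segments(segments)  # 30 segments for the auditor to walk
```

    Fixing the seed lets a second auditor regenerate exactly the same segment list, which matters for inter-rater reliability checks.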

  15. Protocol-based care: the standardisation of decision-making?

    Science.gov (United States)

    Rycroft-Malone, Jo; Fontenla, Marina; Seers, Kate; Bick, Debra

    2009-05-01

    To explore how protocol-based care affects clinical decision-making. In the context of evidence-based practice, protocol-based care is a mechanism for facilitating the standardisation of care and streamlining decision-making through rationalising the information with which to make judgements and ultimately decisions. However, whether protocol-based care does, in the reality of practice, standardise decision-making is unknown. This paper reports on a study that explored the impact of protocol-based care on nurses' decision-making. Theoretically informed by realistic evaluation and the Promoting Action on Research Implementation in Health Services framework, a case study design using ethnographic methods was used. Two sites were purposively sampled: a diabetes and endocrine unit and a cardiac medical unit. Within each site, data collection included observation, postobservation semi-structured interviews with staff and patients, field notes, feedback sessions and document review. Data were inductively and thematically analysed. Decisions made by nurses in both sites varied according to many different and interacting factors. While several standardised care approaches were available for use, in reality a variety of information sources informed decision-making. The primary approach to knowledge exchange and acquisition was person-to-person; decision-making was a social activity. Rarely were standardised care approaches obviously referred to; nurses described following a mental flowchart, not necessarily linked to a particular guideline or protocol. When standardised care approaches were used, it was reported that they were used flexibly and particularised. While the logic of protocol-based care is algorithmic, in the reality of clinical practice other sources of information supported nurses' decision-making process. This has significant implications for the political goal of standardisation. The successful implementation and judicious use of tools such as

  16. Global post-Kyoto scenario analyses at PSI

    Energy Technology Data Exchange (ETDEWEB)

    Kypreos, S [Paul Scherrer Inst. (PSI), Villigen (Switzerland)

    1999-08-01

    Scenario analyses are described here using the Global MARKAL-Macro Trade (GMMT) model to study the economic implications of the Kyoto Protocol to the UN Framework Convention on Climate Change. Some conclusions are derived in terms of efficient implementations of the post-Kyoto extensions of the Protocol. (author) 2 figs., 5 refs.

  17. Global post-Kyoto scenario analyses at PSI

    International Nuclear Information System (INIS)

    Kypreos, S.

    1999-01-01

    Scenario analyses are described here using the Global MARKAL-Macro Trade (GMMT) model to study the economic implications of the Kyoto Protocol to the UN Framework Convention on Climate Change. Some conclusions are derived in terms of efficient implementations of the post-Kyoto extensions of the Protocol. (author) 2 figs., 5 refs.

  18. Multisite tumor sampling enhances the detection of intratumor heterogeneity at all different temporal stages of tumor evolution.

    Science.gov (United States)

    Erramuzpe, Asier; Cortés, Jesús M; López, José I

    2018-02-01

    Intratumor heterogeneity (ITH) is an inherent process of tumor development that has received much attention in recent years, as it has become a major obstacle to the success of targeted therapies. ITH is also temporally unpredictable across tumor evolution, which makes its precise characterization even more problematic, since detection success depends on the temporal snapshot at which ITH is analyzed. New and more efficient strategies for tumor sampling are needed to overcome these difficulties, which currently rely entirely on the pathologist's interpretation. Recently, we showed that a new strategy, multisite tumor sampling, works better than the routine sampling protocol for ITH detection when tumor time evolution was not taken into consideration. Here, we extend this work and compare the ITH detections of multisite tumor sampling and routine sampling protocols across tumor time evolution; in particular, we provide in silico analyses of both strategies at early and late temporal stages for four different models of tumor evolution (linear, branched, neutral, and punctuated). Our results indicate that multisite tumor sampling outperforms routine protocols in detecting ITH at all different temporal stages of tumor evolution. We conclude that multisite tumor sampling is more advantageous than routine protocols in detecting intratumor heterogeneity.

  19. Sample preparation techniques for (p, X) spectrometry

    International Nuclear Information System (INIS)

    Whitehead, N.E.

    1985-01-01

    Samples are ashed at low temperature using an oxygen plasma; a rotary evaporator and freeze drying speeded up the ashing. The newly designed apparatus was only 10 W but was as efficient as a 200 W commercial machine; a circuit diagram is included. Samples of hair and biopsy samples of skin were analysed by the technique. A wool standard was prepared for interlaboratory comparison exercises. It was based on New Zealand merino sheep wool and was 2.9 kg in weight. A washing protocol was developed which preserves most of the trace element content. The wool was ground in liquid nitrogen using a plastic pestle and beaker, driven by a rotary drill press. (author)

  20. Ion Chromatographic Analyses of Sea Waters, Brines and Related Samples

    Directory of Open Access Journals (Sweden)

    Nataša Gros

    2013-06-01

    Full Text Available This review focuses on ion chromatographic methods for the analyses of natural waters with high ionic strength. At the beginning, the natural diversity in the ionic composition of waters is highlighted and the terminology clarified. A brief overview of other review articles of potential interest is then given. The review of ion chromatographic methods is organized in four sections. The first section comprises articles focused on determining the ionic composition of water samples as completely as possible. The sections Selected Anions, Selected Cations, and Metals follow. The most essential experimental conditions used in the different methods are summarized in tables for rapid comparison. Techniques encountered in the reviewed articles comprise: direct determination of ions in untreated samples with ion- or ion-exclusion chromatography, or electrostatic ion chromatography; matrix elimination with column-switching; pre-concentration with chelation ion chromatography; and purge-and-trap pre-concentration. Different detection methods were used: non-suppressed or suppressed conductometric, direct spectrometric or spectrometric after post-column derivatization, and inductively coupled plasma in combination with optical emission or mass spectrometry.

  1. Development of sampling systems and special analyses for pressurized gasification processes; Paineistettujen kaasutusprosessien naeytteenottomenetelmien ja erityisanalytiikan kehittaeminen

    Energy Technology Data Exchange (ETDEWEB)

    Staahlberg, P.; Oesch, P.; Leppaemaeki, E.; Moilanen, A.; Nieminen, M.; Korhonen, J. [VTT Energy, Espoo (Finland)

    1996-12-01

    The reliability of sampling methods used for measuring impurities contained in gasification gas was studied, and new methods were developed for sampling and sample analyses. The aim of the method development was to improve the representativeness of the samples and to speed up the analysis of gas composition. The study focused on tar, nitrogen and sulphur compounds contained in the gasification gas. In the study of sampling reliability, the effects on gas samples drawn from the process of probe and sampling line materials suitable for high temperatures, and of the solids deposited in the sampling devices, were studied. Measurements were carried out in the temperature range of 250 - 850 deg C both in real conditions and in conditions simulating gasification gas. The stability of samples during storage was also studied. The other main aim of the study was to increase the number of quickly measurable gas components by developing on-line analytical methods based on GC, FTIR and FI (flow injection) techniques for the measurement of nitrogen and sulphur compounds in gasification gas. As these methods are suitable only for gases that do not contain condensing components disturbing the operation of the analysers (heavy tar compounds, water), a sampling system operating on the dilution principle was developed. The system operates at high pressures and temperatures and is suitable for gasification gases containing heavy tar compounds. The capability of analysing heavy tar compounds (molar mass >200 g/mol) was improved by increasing the number of compounds identified and calibrated with model substances, and by developing analytical methods based on high-temperature GC analysis and the thermogravimetric method. (author)

  2. Surface analyses of electropolished niobium samples for superconducting radio frequency cavity

    Energy Technology Data Exchange (ETDEWEB)

    Tyagi, P. V.; Nishiwaki, M.; Saeki, T.; Sawabe, M.; Hayano, H.; Noguchi, T.; Kato, S. [GUAS, Tsukuba, Ibaraki 305-0801 (Japan); KEK, Tsukuba, Ibaraki 305-0801 (Japan); KAKEN Inc., Hokota, Ibaraki 311-1416 (Japan); GUAS, Tsukuba, Ibaraki 305-0801 (Japan) and KEK, Tsukuba, Ibaraki 305-0801 (Japan)

    2010-07-15

    The performance of superconducting radio frequency niobium cavities is sometimes limited by contamination present on the cavity surface. In recent years extensive research has been done to enhance cavity performance by applying improved surface treatments such as mechanical grinding, electropolishing (EP), chemical polishing, tumbling, etc., followed by various rinsing methods such as ultrasonic pure water rinse, alcoholic rinse, high pressure water rinse, hydrogen peroxide rinse, etc. Although good cavity performance has been obtained lately by various post-EP cleaning methods, the detailed nature of the surface contaminants is still not fully characterized, and further efforts in this area are desired. Prior x-ray photoelectron spectroscopy (XPS) analyses of EPed niobium samples treated with fresh EP acid demonstrated that the surfaces were covered mainly with niobium oxide (Nb{sub 2}O{sub 5}) along with carbon; in addition, small quantities of sulfur and fluorine were found in secondary ion mass spectroscopy (SIMS) analysis. In this article, the authors present analyses of surface contamination for a series of EPed niobium samples located at various positions of a single-cell niobium cavity followed by ultrapure water rinsing, as well as their endeavor to understand the aging effect of the EP acid solution in terms of the contamination present on the inner surface of the cavity, with the help of surface analytical tools at KEK such as XPS, SIMS, and scanning electron microscopy.

  3. Pierre Gy's sampling theory and sampling practice heterogeneity, sampling correctness, and statistical process control

    CERN Document Server

    Pitard, Francis F

    1993-01-01

    Pierre Gy's Sampling Theory and Sampling Practice, Second Edition is a concise, step-by-step guide for process variability management and methods. Updated and expanded, this new edition provides a comprehensive study of heterogeneity, covering the basic principles of sampling theory and its various applications. It presents many practical examples to allow readers to select appropriate sampling protocols and assess the validity of sampling protocols from others. The variability of dynamic process streams using variography is discussed to help bridge sampling theory with statistical process control. Many descriptions of good sampling devices, as well as descriptions of poor ones, are featured to educate readers on what to look for when purchasing sampling systems. The book uses its accessible, tutorial style to focus on professional selection and use of methods. The book will be a valuable guide for mineral processing engineers; metallurgists; geologists; miners; chemists; environmental scientists; and practit...

  4. ICP/AES radioactive sample analyses at Pacific Northwest Laboratory

    International Nuclear Information System (INIS)

    Matsuzaki, C.L.; Hara, F.T.

    1986-03-01

    Inductively coupled argon plasma atomic emission spectroscopy (ICP/AES) analyses of radioactive materials at Pacific Northwest Laboratory (PNL) began about three years ago upon completion of the installation of a modified Applied Research Laboratory (ARL) 3560. Funding for the purchase and installation of the ICP/AES was provided by the Nuclear Waste Materials Characterization Center (MCC) established at PNL by the Department of Energy in 1979. MCC's objective is to ensure that qualified materials data are available on waste materials. This paper is divided into the following topics: (1) Instrument selection considerations; (2) initial installation of the simultaneous system with the source stand enclosed in a 1/2'' lead-shielded glove box; (3) retrofit installation of the sequential spectrometer; and (4) a brief discussion on several types of samples analyzed. 1 ref., 7 figs., 1 tab

  5. Investigations of the post-IR IRSL protocol applied to single K-feldspar grains from fluvial sediment samples

    International Nuclear Information System (INIS)

    Nian, Xiaomei; Bailey, Richard M.; Zhou, Liping

    2012-01-01

    The post-IR IRSL protocol was applied to single K-feldspar grains from three samples taken from a fluvial sedimentary sequence at the archaeological site of the Dali Man, Shaanxi Province, China. K-feldspar coarse grains were extracted for measurement. Approximately 30–40% of the grains were sufficiently bright to measure, and after application of rejection criteria based on signal strength, recuperation, recycling ratio and saturation dose, ∼10–15% of the grains were used for De calculation. The relationships of the signal decay rate and the form of De(t) with the recovery dose were investigated. The dose recovery ratios of the samples after initial bleaching with four different light sources were within uncertainties of unity. No anomalous fading was observed. The over-dispersion of the recovered dose and De values were similar, suggesting that neither incomplete resetting of the post-IR IRSL signals nor spatially heterogeneous dose rates significantly affected the natural dose estimates. The De values obtained with the single-grain K-feldspar post-IR IRSL protocol were in the range ∼400–490 Gy. Combining all of the measured single-grain signals for each of the individual samples (into a ‘synthetic single aliquot’) increased the De estimates to the range ∼700–900 Gy, suggesting that the grains screened out by the rejection criteria may have the potential to cause palaeodose over-estimation, although this finding requires a more extensive investigation. Thermally transferred signals were found in the single-grain K-feldspar post-IR IRSL protocol, and the proportion of thermally transferred signal to test-dose OSL signal (stimulation at 290 °C) from the natural dose was higher than from regenerative doses; the proportion was grain- and dose-dependent. As such, TT-post-IR IRSL signals at 290 °C have the potential to cause dose underestimation, although this may be reduced by using larger test-dose irradiations. Our study demonstrates

  6. Variability of protein level and phosphorylation status caused by biopsy protocol design in human skeletal muscle analyses

    Directory of Open Access Journals (Sweden)

    Caron Marc-André

    2011-11-01

    Background: Bergström needle biopsy is widely used to sample skeletal muscle in order to study cell signaling directly in human tissue. The consequences of biopsy protocol design on muscle protein quantity and quality remain unclear. The aim of the present study was to assess the impact of different events surrounding the biopsy protocol on the stability of the Western blot signal of eukaryotic translation initiation factor 4E binding protein 1 (4E-BP1), Akt, glycogen synthase kinase-3β (GSK-3β), muscle RING finger protein 1 (MuRF1) and p70 S6 kinase (p70 S6K). Six healthy subjects underwent four biopsies of the vastus lateralis, distributed over two distinct visits spaced by 48 hrs. At visit 1, a basal biopsy of the right leg was performed in the morning (R1), followed by a second of the left leg in the afternoon (AF). At visit 2, a second basal biopsy (R2) was collected from the right leg. Low-intensity mobilization (3 × 20 right leg extensions) was performed and a final biopsy (Mob) was collected using the same incision site as R2. Results: Akt and p70 S6K phosphorylation levels were increased by 83% when the AF biopsy was compared to R1. The Mob condition induced substantial phosphorylation of p70 S6K when compared to R2. Comparison of the R1 and R2 biopsies revealed relative stability of the signal for both total and phosphorylated proteins. Conclusions: This study highlights the importance of standardizing muscle biopsy protocols in order to minimize method-induced variation when analyzing Western blot signals.

  7. Securing statically-verified communications protocols against timing attacks

    DEFF Research Database (Denmark)

    Buchholtz, Mikael; Gilmore, Stephen; Hillston, Jane

    2004-01-01

    We present a federated analysis of communication protocols which considers both security properties and timing. These are not entirely independent observations of a protocol; by using timing observations of an executing protocol it is possible to deduce derived information about the nature of the communication even in the presence of unbreakable encryption. Our analysis is based on expressing the protocol as a process algebra model and deriving from this process models analysable by the Imperial PEPA Compiler and the LySatool.

  8. Network protocols and sockets

    OpenAIRE

    BALEJ, Marek

    2010-01-01

    My work deals with network protocols and sockets and their use in the C# programming language. It therefore addresses the programming of network applications on Microsoft's .NET platform and the instruments that C# provides for this purpose. It describes the tools and methods for programming network applications, and presents sample applications that work with sockets and application protocols.

  9. Past and Future of the Kyoto Protocol. Final report

    International Nuclear Information System (INIS)

    Wijen, F.; Zoeteman, K.

    2004-01-01

    The present report reflects findings from a study on the realization of and prospects for the Kyoto Protocol. The purpose of the study was (1) to obtain insights into the factors that enabled the realization of the Kyoto Protocol, in particular the interactions among the major parties involved; and (2) to assess the future opportunities and threats of the Kyoto Protocol, in particular against the backdrop of an increasingly globalised world. The study was conducted from February to December 2003 by (a) reviewing the literature, especially publications on the negotiation history of the Kyoto process, the social interactions enabling the realization of the Protocol, analyses of strengths and weaknesses, and future climate regimes; (b) conducting a series of interviews with representatives from government, academia, non-governmental organisations and business who have been directly or indirectly involved in the Kyoto process; and (c) internal discussions, brainstorming, and analysis of the Protocol's strengths and weaknesses, possible future scenarios (including policy options), and the management of a possible failure of the Kyoto Protocol. The present report reflects and integrates these different sources. The first section deals with the past and the present: it discusses how the Kyoto Protocol could be realized despite divergent interests, reflects on its architecture, and analyses major strengths and weaknesses. In the second section, we present possible future scenarios, exploring how different combinations of domestic and international commitment yield possible realities that national governments may face when crafting climate policy. The third section provides an in-depth analysis of the possible event that the Kyoto Protocol fails, discussing its definition and policy implications. The final section is reserved for overall conclusions and policy recommendations.

  10. Using the OSL single-aliquot regenerative-dose protocol with quartz extracted from building materials in retrospective dosimetry

    International Nuclear Information System (INIS)

    Boetter-Jensen, L.; Solongo, S.; Murray, A.S.; Banerjee, D.; Jungner, H.

    2000-01-01

    We report on the application of the single-aliquot regenerative-dose (SAR) protocol to the optically stimulated luminescence signal from quartz extracted from fired bricks and unfired mortar in retrospective dosimetry. The samples came from a radioactive materials storage facility, with ambient dose rates of about 0.1 mGy/h. A detailed dose-depth profile was analysed from one brick, and compared with dose records from area TL dosemeters. Small-aliquot dose-distributions were analysed from the mortar samples; one associated with the exposed brick, and one from a remote site exposed only to background radiation. We conclude that unfired materials have considerable potential in retrospective dosimetry

  11. Computational analyses of ancient pathogen DNA from herbarium samples: challenges and prospects.

    Science.gov (United States)

    Yoshida, Kentaro; Sasaki, Eriko; Kamoun, Sophien

    2015-01-01

    The application of DNA sequencing technology to the study of ancient DNA has enabled the reconstruction of past epidemics from genomes of historically important plant-associated microbes. Recently, the genome sequences of the potato late blight pathogen Phytophthora infestans were analyzed from 19th century herbarium specimens. These herbarium samples originated from infected potatoes collected during and after the Irish potato famine. Herbaria have therefore great potential to help elucidate past epidemics of crops, date the emergence of pathogens, and inform about past pathogen population dynamics. DNA preservation in herbarium samples was unexpectedly good, raising the possibility of a whole new research area in plant and microbial genomics. However, the recovered DNA can be extremely fragmented resulting in specific challenges in reconstructing genome sequences. Here we review some of the challenges in computational analyses of ancient DNA from herbarium samples. We also applied the recently developed linkage method to haplotype reconstruction of diploid or polyploid genomes from fragmented ancient DNA.

  12. Correlated Amino Acid and Mineralogical Analyses of Milligram and Submilligram Samples of Carbonaceous Chondrite Lonewolf Nunataks 94101

    Science.gov (United States)

    Burton, S.; Berger, E. L.; Locke, D. R.; Lewis, E. K.

    2018-01-01

    Amino acids, the building blocks of proteins, have been found to be indigenous in the eight carbonaceous chondrite groups. The abundances and the structural, enantiomeric and isotopic compositions of amino acids differ significantly among meteorites of different groups and petrologic types. These results suggest that parent-body conditions (thermal or aqueous alteration), mineralogy, and the preservation of amino acids are linked. Previously, elucidating specific relationships between amino acids and mineralogy was not possible because the samples analyzed for amino acids were much larger than the scale at which petrologic heterogeneity is observed (sub-mm-scale differences corresponding to sub-mg samples); for example, Pizzarello and coworkers measured amino acid abundances and performed X-ray diffraction (XRD) on several samples of the Murchison meteorite, but these analyses were performed on bulk samples of 500 mg or larger. Advances in the sensitivity of amino acid measurements by liquid chromatography with fluorescence detection/time-of-flight mass spectrometry (LC-FD/TOF-MS), and the application of techniques such as high-resolution X-ray diffraction (HR-XRD) and scanning electron microscopy (SEM) with energy dispersive spectroscopy (EDS) for mineralogical characterization, have now enabled coordinated analyses at the scale at which mineral heterogeneity is observed. In this work, we have analyzed samples of the Lonewolf Nunataks (LON) 94101 CM2 carbonaceous chondrite. We are investigating the link(s) between parent-body processes, mineralogical context, and amino acid composition in bulk samples (approx. 20 mg) and mineral separates (≤3 mg) from several spatial locations within our allocated samples. Preliminary results of these analyses are presented here.

  13. Finite-key analysis for quantum key distribution with weak coherent pulses based on Bernoulli sampling

    Science.gov (United States)

    Kawakami, Shun; Sasaki, Toshihiko; Koashi, Masato

    2017-07-01

    An essential step in quantum key distribution is the estimation of parameters related to the leaked amount of information, which is usually done by sampling of the communication data. When the data size is finite, the final key rate depends on how the estimation process handles statistical fluctuations. Many of the present security analyses are based on the method with simple random sampling, where hypergeometric distribution or its known bounds are used for the estimation. Here we propose a concise method based on Bernoulli sampling, which is related to binomial distribution. Our method is suitable for the Bennett-Brassard 1984 (BB84) protocol with weak coherent pulses [C. H. Bennett and G. Brassard, Proceedings of the IEEE Conference on Computers, Systems and Signal Processing (IEEE, New York, 1984), Vol. 175], reducing the number of estimated parameters to achieve a higher key generation rate compared to the method with simple random sampling. We also apply the method to prove the security of the differential-quadrature-phase-shift (DQPS) protocol in the finite-key regime. The result indicates that the advantage of the DQPS protocol over the phase-encoding BB84 protocol in terms of the key rate, which was previously confirmed in the asymptotic regime, persists in the finite-key regime.
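The contrast drawn above between simple random sampling (hypergeometric statistics) and Bernoulli sampling (binomial statistics) can be sketched in a few lines. The toy below is illustrative only, not the paper's estimation procedure: each raw-key bit is independently assigned to the test set with probability p, and a one-sided Hoeffding bound on the binomial fluctuation upper-bounds the underlying error rate; the 3% error rate, key length, and failure probability are all invented for the example.

```python
import math
import random

def bernoulli_sample(bits, p, rng):
    """Bernoulli sampling: assign each position to the test set independently
    with probability p, rather than drawing a fixed-size random subset."""
    test, keep = [], []
    for b in bits:
        (test if rng.random() < p else keep).append(b)
    return test, keep

def hoeffding_upper_bound(observed_rate, n, eps):
    """One-sided Hoeffding bound: upper-bound the true Bernoulli parameter
    from n observations, valid except with probability eps."""
    if n == 0:
        return 1.0
    return min(1.0, observed_rate + math.sqrt(math.log(1.0 / eps) / (2.0 * n)))

rng = random.Random(0)
# Hypothetical raw key: 1 marks a bit error (true error rate ~3%)
bits = [1 if rng.random() < 0.03 else 0 for _ in range(20000)]
test, keep = bernoulli_sample(bits, 0.5, rng)
obs = sum(test) / len(test)
bound = hoeffding_upper_bound(obs, len(test), 1e-10)
```

Because every bit is sampled independently, the number of errors in the test set is exactly binomial, which is what makes concise bounds of this kind applicable.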

  14. Intraosseous blood samples for point-of-care analysis: agreement between intraosseous and arterial analyses.

    Science.gov (United States)

    Jousi, Milla; Saikko, Simo; Nurmi, Jouni

    2017-09-11

    Point-of-care (POC) testing is highly useful when treating critically ill patients. In case of difficult vascular access, the intraosseous (IO) route is commonly used, and blood is aspirated to confirm the correct position of the IO-needle. Thus, IO blood samples could be easily accessed for POC analyses in emergency situations. The aim of this study was to determine whether IO values agree sufficiently with arterial values to be used for clinical decision making. Two samples of IO blood were drawn from 31 healthy volunteers and compared with arterial samples. The samples were analysed for sodium, potassium, ionized calcium, glucose, haemoglobin, haematocrit, pH, blood gases, base excess, bicarbonate, and lactate using the i-STAT® POC device. Agreement and reliability were estimated by using the Bland-Altman method and intraclass correlation coefficient calculations. Good agreement was evident between the IO and arterial samples for pH, glucose, and lactate. Potassium levels were clearly higher in the IO samples than those from arterial blood. Base excess and bicarbonate were slightly higher, and sodium and ionised calcium values were slightly lower, in the IO samples compared with the arterial values. The blood gases in the IO samples were between arterial and venous values. Haemoglobin and haematocrit showed remarkable variation in agreement. POC diagnostics of IO blood can be a useful tool to guide treatment in critical emergency care. Seeking out the reversible causes of cardiac arrest or assessing the severity of shock are examples of situations in which obtaining vascular access and blood samples can be difficult, though information about the electrolytes, acid-base balance, and lactate could guide clinical decision making. 
The analysis of IO samples should, however, be limited to situations in which no other option is available, and the results should be interpreted with caution, because there is not yet enough scientific evidence regarding the agreement of IO
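The Bland-Altman agreement method used above reduces to computing the per-pair differences, their mean (the bias), and bias ± 1.96 SD as the 95% limits of agreement. A minimal sketch follows; the paired potassium values are invented for illustration (loosely mimicking the finding that potassium runs higher in IO samples) and are not the study's data.

```python
import math

def bland_altman(a, b):
    """Bland-Altman agreement between two paired measurement methods:
    returns (mean bias, lower 95% limit, upper 95% limit)."""
    assert len(a) == len(b)
    diffs = [x - y for x, y in zip(a, b)]
    n = len(diffs)
    bias = sum(diffs) / n
    sd = math.sqrt(sum((d - bias) ** 2 for d in diffs) / (n - 1))
    return bias, bias - 1.96 * sd, bias + 1.96 * sd

# Hypothetical paired potassium values (mmol/L): IO vs arterial
io  = [4.9, 5.1, 5.4, 5.0, 5.2, 4.8]
art = [3.9, 4.1, 4.2, 4.0, 4.1, 3.8]
bias, lo, hi = bland_altman(io, art)
```

A wide interval between the limits of agreement, as seen here for haemoglobin and haematocrit in the study, signals that the two methods cannot be used interchangeably even when the mean bias is small.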

  15. Evaluation protocol for amusia: Portuguese sample.

    Science.gov (United States)

    Peixoto, Maria Conceição; Martins, Jorge; Teixeira, Pedro; Alves, Marisa; Bastos, José; Ribeiro, Carlos

    2012-12-01

    Amusia is a disorder that affects the processing of music. Part of this processing happens in the primary auditory cortex. The study of this condition allows us to evaluate the central auditory pathways. The objective was to explore the diagnostic evaluation tests for amusia. The authors propose an evaluation protocol for patients with suspected amusia (after brain injury or complaints of poor musical perception), in parallel with the assessment of central auditory processing already implemented in the department. The Montreal Battery of Evaluation of Amusia was the basis for the selection of the tests. From this comprehensive battery we selected some of the musical examples to evaluate different musical aspects, including memory and perception of music and the ability to recognize and discriminate music. In terms of memory, there is a test for assessing delayed memory, adapted to Portuguese culture. Prospective study. Although still experimental, with the possibility of adjustments in the assessment, we believe that this assessment, combined with the study of central auditory processing, will allow us to understand some central lesions and congenital or acquired limitations of auditory perception.

  16. Comparative multielement analyses of airborne particulate samples collected in various areas

    International Nuclear Information System (INIS)

    Mamuro, Tetsuo; Matsuda, Yatsuka; Mizohata, Akira

    1973-01-01

    In order to grasp the characteristic features of air pollution by particulates in various areas in Japan, multielement analyses by instrumental neutron activation analysis and radioisotope energy-dispersive X-ray fluorescence analysis were applied to 31 airborne particulate samples collected in 15 different areas, and the analytical results were compared with one another. All the samples were collected by so-called "10 micron cut" samplers, whose collection efficiency is considered to be 50% at 8μ and nearly zero beyond 10μ. Among the areas in question there are clean seaside areas, heavily industrialized areas, small cities along the Inland Sea or the Pacific Ocean around which industrialization is progressing, a small city having only a big iron works, an area famous for its ceramic industry, and so on. The atmospheres over them were found to be quite different, not only in the extent of pollution but also in its pattern. (auth.)

  17. Terrorist fraud resistance of distance bounding protocols employing physical unclonable functions

    NARCIS (Netherlands)

    Kleber, Stephan; van der Heijden, Rens W.; Kopp, Henning; Kargl, Frank

    Distance bounding protocols (DBPs) are security protocols that aim to limit the maximum possible distance between two partners in a wireless communication. This enables to ensure locality of interaction between two devices. Despite numerous proposed protocols, recent analyses of DBPs have shown the

  18. Development of a protocol to measure iron-55 in solid matrices in the environment

    International Nuclear Information System (INIS)

    Augeray, Céline; Magalie, Mouton; Nathalie, Broustet; Marie-France, Perdereau; Chloé, Laconici; Jeanne, Loyen; Corinne, Fayolle; Jean-Louis, Picolo

    2015-01-01

    The metrology of iron-55 at low radioactivity levels in environmental solid matrices was developed for conducting radioecological studies. A protocol was developed by adapting existing methods for the purification of iron-55 with a selective chromatographic resin, followed by liquid scintillation measurement. The losses associated with the chemical treatment steps were quantified with elemental iron by Inductively Coupled Plasma Atomic Emission Spectrometry (ICP-AES). The tests were used to define the iron retention capacity of the selective chromatographic resin, a key element of the chemical treatment, and the test sample size needed to reach the detection limit of 30 Bq kg⁻¹ dry. Solid samples were analysed with the developed protocol; the iron-55 activities obtained were below the detection limit of 30 Bq kg⁻¹ dry. - Highlights: • To reach the desired detection limit in environmental solid matrices, an appropriate method was chosen. • A protocol was developed with our resources to achieve a 30 Bq kg⁻¹ dry detection limit. • The optimisation of the operating conditions is described and the activities obtained are presented

  19. Search and nonsearch protocols for radiographic consultation

    International Nuclear Information System (INIS)

    Swensson, R.G.; Theodore, G.H.

    1989-01-01

    Six radiologists, acting as film reviewers, used two different consultation protocols to differentiate among 292 ambiguous findings on chest films: 120 simulated nodules and 172 normal findings (previous readers' false-positive reports of nodules). The non-search protocol identified each finding (by location), and reviewers rated its likelihood of being a nodule. The search protocol, which asked reviewers to report and rate all locations regarded as possible nodules on each film, assigned a default negative rating to any unreported finding (nodule or normal). Receiver operating characteristic analyses demonstrated significantly higher accuracy for each reviewer's search-protocol discriminations between these nodules and confusing normal findings.
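For rating data like this, the area under the receiver operating characteristic curve equals the Mann-Whitney statistic: the probability that a randomly chosen true nodule is rated higher than a randomly chosen normal finding, with ties counted as one half. A minimal sketch with invented 5-point ratings (not the study's data):

```python
def roc_auc(pos_ratings, neg_ratings):
    """AUC via the Mann-Whitney statistic: probability that a random
    positive case outranks a random negative case (ties count 1/2)."""
    wins = 0.0
    for p in pos_ratings:
        for n in neg_ratings:
            if p > n:
                wins += 1.0
            elif p == n:
                wins += 0.5
    return wins / (len(pos_ratings) * len(neg_ratings))

# Hypothetical 5-point likelihood ratings: simulated nodules vs normal findings
nodules = [5, 4, 4, 3, 5, 2]
normals = [1, 2, 3, 1, 2, 4]
auc = roc_auc(nodules, normals)
```

An AUC of 0.5 corresponds to chance discrimination and 1.0 to perfect separation, so comparing per-reviewer AUCs between the two protocols quantifies the accuracy difference reported above.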

  20. Analysing designed experiments in distance sampling

    Science.gov (United States)

    Stephen T. Buckland; Robin E. Russell; Brett G. Dickson; Victoria A. Saab; Donal N. Gorman; William M. Block

    2009-01-01

    Distance sampling is a survey technique for estimating the abundance or density of wild animal populations. Detection probabilities of animals inherently differ by species, age class, habitats, or sex. By incorporating the change in an observer's ability to detect a particular class of animals as a function of distance, distance sampling leads to density estimates...
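The core line-transect calculation behind the description above can be sketched under a simplifying assumption: a half-normal detection function g(x) = exp(-x²/(2σ²)) with σ taken as known (in practice σ is fitted to the observed perpendicular distances). Density is then D = n / (2μL), where μ = σ√(π/2) is the effective strip half-width. All numbers below are hypothetical.

```python
import math

def effective_strip_half_width(sigma):
    """Effective half-width for a half-normal detection function
    g(x) = exp(-x^2 / (2 sigma^2)): the integral of g from 0 to infinity."""
    return sigma * math.sqrt(math.pi / 2.0)

def line_transect_density(n_detections, total_line_length, sigma):
    """Conventional line-transect estimator D = n / (2 * mu * L)."""
    mu = effective_strip_half_width(sigma)
    return n_detections / (2.0 * mu * total_line_length)

# Hypothetical survey: 60 detections over 20 km of transect, sigma = 25 m
D = line_transect_density(60, 20000.0, 25.0)   # animals per m^2
density_per_km2 = D * 1e6
```

Class-specific detection probabilities (by species, age class, habitat, or sex) are handled by fitting separate, or covariate-dependent, detection functions rather than a single σ.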

  1. Investigating the Geological History of Asteroid 101955 Bennu Through Remote Sensing and Returned Sample Analyses

    Science.gov (United States)

    Messenger, S.; Connolly, H. C., Jr.; Lauretta, D. S.; Bottke, W. F.

    2014-01-01

    The NASA New Frontiers mission OSIRIS-REx will return surface regolith samples from near-Earth asteroid 101955 Bennu in September 2023. This target is classified as a B-type asteroid and is spectrally similar to CI and CM chondrite meteorites [1]. The returned samples are thus expected to contain primitive ancient Solar System materials that formed in planetary, nebular, interstellar, and circumstellar environments. Laboratory studies of primitive astromaterials have yielded detailed constraints on the origins, properties, and evolutionary histories of a wide range of Solar System bodies. Yet the parent bodies of meteorites and cosmic dust are generally unknown, genetic and evolutionary relationships among asteroids and comets are unsettled, and links between laboratory and remote observations remain tenuous. The OSIRIS-REx mission will offer the opportunity to coordinate detailed laboratory analyses of asteroidal materials with known and well-characterized geological context from which the samples originated. A primary goal of the OSIRIS-REx mission will be to provide detailed constraints on the origin and the geological and dynamical history of Bennu through coordinated analytical studies of the returned samples. These microanalytical studies will be placed in geological context through an extensive orbital remote sensing campaign that will characterize the global geological features and chemical diversity of Bennu. The first views of the asteroid surface and of the returned samples will undoubtedly bring remarkable surprises. However, a wealth of laboratory studies of meteorites and spacecraft encounters with primitive bodies provides a useful framework to formulate priority scientific questions and effective analytical approaches well before the samples are returned. Here we summarize our approach to unraveling the geological history of Bennu through returned sample analyses.

  2. Reduction and technical simplification of testing protocol for walking based on repeatability analyses: An Interreg IVa pilot study

    Directory of Open Access Journals (Sweden)

    Nejc Sarabon

    2010-12-01

    The aim of this study was to define the most appropriate gait measurement protocols to be used in our future studies in the Mobility in Ageing project. A group of young healthy volunteers took part in the study. Each subject carried out a 10-metre walking test at five different speeds (preferred, very slow, very fast, slow, and fast). Each walking speed was repeated three times, making a total of 15 trials, which were carried out in random order. Each trial was simultaneously analysed by three observers using three different technical approaches: a stopwatch, photocells, and an electronic kinematic dress. In analysing the repeatability of the trials, the results showed that three of the five self-selected walking speeds (preferred, very fast, and very slow) had significantly higher repeatability of average walking velocity, step length and cadence than the other two speeds. Additionally, the data showed that one of the three technical methods for gait assessment has better metric characteristics than the other two. In conclusion, based on repeatability and on technical and organizational simplicity, this study helped us to successfully define a simple and reliable walking test to be used in the main study of the project.

  3. Results of initial analyses of the salt (macro) batch 11 Tank 21H qualification samples

    Energy Technology Data Exchange (ETDEWEB)

    Peters, T. B. [Savannah River Site (SRS), Aiken, SC (United States). Savannah River National Lab. (SRNL)

    2017-10-23

    Savannah River National Laboratory (SRNL) analyzed samples from Tank 21H in support of qualification of Interim Salt Disposition Project (ISDP) Salt (Macro) Batch 11 for processing through the Actinide Removal Process (ARP) and the Modular Caustic-Side Solvent Extraction Unit (MCU). This document reports the initial results of the analyses of samples of Tank 21H. Analysis of the Tank 21H Salt (Macro) Batch 11 composite sample indicates that the material does not display any unusual characteristics or observations, such as floating solids, the presence of large amounts of solids, or unusual colors. Further sample results will be reported in a future document. This memo satisfies part of Deliverable 3 of the Technical Task Request (TTR).

  4. Chemical and geotechnical analyses of soil samples from Olkiluoto for studies on sorption in soils

    International Nuclear Information System (INIS)

    Lusa, M.; Aemmaelae, K.; Hakanen, M.; Lehto, J.; Lahdenperae, A.-M.

    2009-05-01

    The safety assessment of disposal of spent nuclear fuel will include an estimate of the behaviour of nuclear waste nuclides in the biosphere. As a part of this estimate, the transfer of nuclear waste nuclides in soil and sediments is also to be considered. In this study, soil samples were collected from three excavator pits in Olkiluoto, and the geotechnical and chemical characteristics of the samples were determined. At a later stage, these results will be used in sorption tests, the aim of which is to determine the Kd values for Cs, Tc and I, and later for Mo, Nb and Cl. The results of these sorption tests will be reported later. The geotechnical characteristics studied included dry weight and organic matter content as well as grain size distribution and mineralogy. Selective extractions were carried out to study the sorption of cations onto different mineral types. The extractions included five steps, in which the cations bound to the exchangeable, carbonate, Fe and Mn oxide, organic matter and residual fractions were determined. ICP-MS analyses were carried out for all fractions; in these analyses Li, Na, Mg, K, Ca, Cr, Mn, Fe, Co, Ni, Cu, Zn, As, Se, Sr, Mo, Cd, Cs and Pb were determined. In addition, six profiles were taken from the surroundings of two excavator pits for 137Cs determination. Besides the samples taken for soil characterization, supplementary samples were taken from the same layers for the separation of soil water. From the soil water, pH, DOC, anions (F, Cl, NO3, SO4) and cations (Na, Mg, K, Ca, Al, Cr, Mn, Fe, Ni, Cu, Zn, As, S, Cd, Cs, Pb, U) were determined. (orig.)

  5. Aging of monolithic zirconia dental prostheses: Protocol for a 5-year prospective clinical study using ex vivo analyses.

    Science.gov (United States)

    Koenig, Vinciane; Wulfman, Claudine P; Derbanne, Mathieu A; Dupont, Nathalie M; Le Goff, Stéphane O; Tang, Mie-Leng; Seidel, Laurence; Dewael, Thibaut Y; Vanheusden, Alain J; Mainjot, Amélie K

    2016-12-15

    The recent introduction of computer-aided design/computer-aided manufacturing (CAD/CAM) monolithic zirconia dental prostheses raises the issue of low thermal degradation (LTD) of the material, a well-known problem with zirconia hip prostheses. This phenomenon could be accentuated by masticatory mechanical stress. Until now, the zirconia LTD process has only been studied in vitro. This work introduces an original protocol to evaluate the LTD process of monolithic zirconia prostheses in the oral environment and to study their general clinical behavior, notably in terms of wear. 101 posterior monolithic zirconia tooth elements (molars and premolars) are included in a 5-year prospective clinical trial. On each element, several areas of between 1 and 2 mm² (6 on molars, 4 on premolars) are defined on the restoration surface: areas submitted or not submitted to masticatory mechanical stress, glazed or non-glazed. Before prosthesis placement, ex vivo analyses of LTD and wear are performed using Raman spectroscopy, SEM imagery and 3D laser profilometry. After placement, restorations are clinically evaluated following the criteria of the World Dental Federation (FDI), complemented by an analysis of clinical risk factors for fracture. Two independent examiners perform the evaluations. Clinical evaluation and ex vivo analyses are carried out after 6 months and then each year for up to 5 years. For clinicians and patients, the results of this trial will justify the use of monolithic zirconia restorations in dental practice. For researchers, the originality of a clinical study including ex vivo analyses of material aging will provide important data regarding zirconia properties. Trial registration: ClinicalTrials.gov Identifier: NCT02150226.

  6. Assessment of ethylene vinyl-acetate copolymer samples exposed to γ-rays via linearity analyses

    Energy Technology Data Exchange (ETDEWEB)

    Oliveira, Lucas N. de; Nascimento, Eriberto O. do; Schimidt, Fernando [Instituto Federal de Educação, Ciência e Tecnologia de Goiás (IFG), Goiânia, GO (Brazil); Antonio, Patrícia L.; Caldas, Linda V.E., E-mail: lcaldas@ipen.br [Instituto de Pesquisas Energéticas e Nucleares (IPEN/CNEN-SP), São Paulo, SP (Brazil)

    2017-07-01

Materials with the potential to become dosimeters are of interest in radiation physics. In this research, the materials were analyzed and compared in relation to their linearity ranges. Samples of ethylene vinyl-acetate copolymer (EVA) were irradiated with doses from 10 Gy to 10 kGy using a ⁶⁰Co Gamma-Cell 220 system and evaluated with the FTIR technique. The linearity analyses were applied through two methodologies, searching for linear regions in the response. The results show that both analyses indicate linear regions within defined dose intervals. EVA radiation detectors can thus be useful for radiation dosimetry at intermediate and high doses. (author)
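The linearity analysis described above — searching a dose-response curve for regions of linear behaviour — can be sketched as follows. The dose grid, the saturating synthetic response, and the R² threshold are illustrative assumptions, not the authors' FTIR data or their two actual methodologies:

```python
import math

def linear_fit_r2(xs, ys):
    """Ordinary least-squares fit; returns (slope, intercept, R^2)."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    sxx = sum((x - mx) ** 2 for x in xs)
    sxy = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    syy = sum((y - my) ** 2 for y in ys)
    slope = sxy / sxx
    intercept = my - slope * mx
    r2 = (sxy ** 2) / (sxx * syy) if syy > 0 else 1.0
    return slope, intercept, r2

def widest_linear_region(doses, responses, r2_min=0.999):
    """Scan all contiguous dose windows (>= 3 points) and return the
    widest one whose log-log fit exceeds the R^2 threshold."""
    logd = [math.log10(d) for d in doses]
    logr = [math.log10(r) for r in responses]
    best = None
    for i in range(len(doses)):
        for j in range(i + 3, len(doses) + 1):
            _, _, r2 = linear_fit_r2(logd[i:j], logr[i:j])
            if r2 >= r2_min and (best is None or j - i > best[1] - best[0]):
                best = (i, j)
    return (doses[best[0]], doses[best[1] - 1]) if best else None

# Synthetic response: proportional up to ~1 kGy, then saturating.
doses = [10, 30, 100, 300, 1000, 3000, 10000]
resp = [d if d <= 1000 else 1000 + 0.1 * (d - 1000) for d in doses]
print(widest_linear_region(doses, resp))  # → (10, 1000)
```

On this synthetic data the method recovers the proportional region from 10 Gy to 1 kGy and excludes the saturated high-dose points.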

  7. Re-verification of a Lip Synchronization Protocol using Robust Reachability

    Directory of Open Access Journals (Sweden)

    Piotr Kordy

    2010-03-01

Full Text Available The timed automata formalism is an important model for specifying and analysing real-time systems. Robustness is the correctness of the model in the presence of small drifts on clocks or imprecision in testing guards. A symbolic algorithm for the analysis of the robustness of timed automata has been implemented. In this paper, we re-analyse an industrial case study, a lip synchronization protocol, using the new robust reachability algorithm. This lip synchronization protocol is an interesting case because timing aspects are crucial for the correctness of the protocol. Several versions of the model are considered: with an ideal video stream, with anchored jitter, and with non-anchored jitter.

  8. Radon in large buildings: The development of a protocol

    International Nuclear Information System (INIS)

    Wilson, D.L.; Dudney, C.S.; Gammage, R.B.

    1993-01-01

Over the past several years, considerable research has been devoted by the US Environmental Protection Agency (USEPA) and others to developing radon sampling protocols for single-family residences and schools. However, very little research has been performed on measuring radon in the workplace. To evaluate possible sampling protocols, 833 buildings throughout the United States were selected for extensive radon testing. The buildings tested (warehouses, production plants, and office buildings) were representative of commercial buildings across the country in design, size, and use. Based on the results, preliminary radon sampling protocols for the workplace have been developed.

  9. Hanford analytical sample projections FY 1998 - FY 2002

    International Nuclear Information System (INIS)

    Joyce, S.M.

    1998-01-01

Analytical Services projections are compiled for the Hanford site based on inputs from the major programs for the years 1998 through 2002. Projections are categorized by radiation level, protocol, sample matrix, and program. Analysis requirements are also presented. This document summarizes the Hanford sample projections for fiscal years 1998 to 2002. Sample projections are based on inputs submitted to Analytical Services covering Environmental Restoration, Tank Waste Remediation Systems (TWRS), Solid Waste, Liquid Effluents, Spent Nuclear Fuels, Transition Projects, Site Monitoring, Industrial Hygiene, Analytical Services and miscellaneous Hanford support activities. In addition, details on laboratory scale technology (development) work, Sample Management, and Data Management activities are included. This information will be used by Hanford Analytical Services (HAS) and the Sample Management Working Group (SMWG) to assure that laboratories and resources are available and effectively utilized to meet these documented needs.

  10. Hanford analytical sample projections FY 1998--FY 2002

    Energy Technology Data Exchange (ETDEWEB)

    Joyce, S.M.

    1998-02-12

    Analytical Services projections are compiled for the Hanford site based on inputs from the major programs for the years 1998 through 2002. Projections are categorized by radiation level, protocol, sample matrix and program. Analyses requirements are also presented. This document summarizes the Hanford sample projections for fiscal years 1998 to 2002. Sample projections are based on inputs submitted to Analytical Services covering Environmental Restoration, Tank Waste Remediation Systems (TWRS), Solid Waste, Liquid Effluents, Spent Nuclear Fuels, Transition Projects, Site Monitoring, Industrial Hygiene, Analytical Services and miscellaneous Hanford support activities. In addition, details on laboratory scale technology (development) work, Sample Management, and Data Management activities are included. This information will be used by Hanford Analytical Services (HAS) and the Sample Management Working Group (SMWG) to assure that laboratories and resources are available and effectively utilized to meet these documented needs.

  11. Active SAmpling Protocol (ASAP) to Optimize Individual Neurocognitive Hypothesis Testing: A BCI-Inspired Dynamic Experimental Design.

    Science.gov (United States)

    Sanchez, Gaëtan; Lecaignard, Françoise; Otman, Anatole; Maby, Emmanuel; Mattout, Jérémie

    2016-01-01

    The relatively young field of Brain-Computer Interfaces has promoted the use of electrophysiology and neuroimaging in real-time. In the meantime, cognitive neuroscience studies, which make extensive use of functional exploration techniques, have evolved toward model-based experiments and fine hypothesis testing protocols. Although these two developments are mostly unrelated, we argue that, brought together, they may trigger an important shift in the way experimental paradigms are being designed, which should prove fruitful to both endeavors. This change simply consists in using real-time neuroimaging in order to optimize advanced neurocognitive hypothesis testing. We refer to this new approach as the instantiation of an Active SAmpling Protocol (ASAP). As opposed to classical (static) experimental protocols, ASAP implements online model comparison, enabling the optimization of design parameters (e.g., stimuli) during the course of data acquisition. This follows the well-known principle of sequential hypothesis testing. What is radically new, however, is our ability to perform online processing of the huge amount of complex data that brain imaging techniques provide. This is all the more relevant at a time when physiological and psychological processes are beginning to be approached using more realistic, generative models which may be difficult to tease apart empirically. Based upon Bayesian inference, ASAP proposes a generic and principled way to optimize experimental design adaptively. In this perspective paper, we summarize the main steps in ASAP. Using synthetic data we illustrate its superiority in selecting the right perceptual model compared to a classical design. Finally, we briefly discuss its future potential for basic and clinical neuroscience as well as some remaining challenges.
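A minimal sketch of the ASAP idea — selecting, online, the stimulus that best disambiguates two candidate models and accumulating evidence sequentially — assuming two toy, fully specified psychometric models (the models, stimulus grid, and Bernoulli responses are illustrative, not the authors' implementation):

```python
import math, random

random.seed(0)

# Two hypothetical perceptual models: probability of detecting a
# stimulus of intensity x. Both are fully specified for simplicity;
# in ASAP they would be parametric models updated during acquisition.
def model_a(x):  # shallow psychometric curve
    return 1 / (1 + math.exp(-(x - 5)))

def model_b(x):  # steep psychometric curve
    return 1 / (1 + math.exp(-2 * (x - 5)))

def asap_like_run(true_model, stimuli, n_trials=100):
    """Toy sequential design: at each trial, present the stimulus whose
    predicted detection probabilities differ most between the two
    models, observe a Bernoulli response, and update the log posterior
    odds. (With fully specified models the most informative stimulus
    is fixed; with parametric models it would change as posteriors
    are updated.)"""
    log_odds = 0.0  # log P(A)/P(B), flat prior
    for _ in range(n_trials):
        x = max(stimuli, key=lambda s: abs(model_a(s) - model_b(s)))
        y = 1 if random.random() < true_model(x) else 0
        pa, pb = model_a(x), model_b(x)
        log_odds += math.log((pa if y else 1 - pa) / (pb if y else 1 - pb))
    return log_odds

stimuli = [i * 0.5 for i in range(21)]  # intensities 0..10
# Log odds accumulated over trials; negative values favour model B.
print(asap_like_run(model_b, stimuli))
```

Repeatedly presenting the maximally discriminating stimulus drives the evidence toward the data-generating model much faster than a fixed, uniform stimulus schedule would.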

  12. CO2 isotope analyses using large air samples collected on intercontinental flights by the CARIBIC Boeing 767

    NARCIS (Netherlands)

    Assonov, S.S.; Brenninkmeijer, C.A.M.; Koeppel, C.; Röckmann, T.

    2009-01-01

    Analytical details for 13C and 18O isotope analyses of atmospheric CO2 in large air samples are given. The large air samples of nominally 300 L were collected during the passenger aircraft-based atmospheric chemistry research project CARIBIC and analyzed for a large number of trace gases and

  13. Symbolic Analysis of Cryptographic Protocols

    DEFF Research Database (Denmark)

    Dahl, Morten

We present our work on using abstract models for formally analysing cryptographic protocols: First, we present an efficient method for verifying trace-based authenticity properties of protocols using nonces, symmetric encryption, and asymmetric encryption. The method is based on a type system...... of Gordon et al., which we modify to support fully-automated type inference. Tests conducted via an implementation of our algorithm found it to be very efficient. Second, we show how privacy may be captured in a symbolic model using an equivalence-based property and give a formal definition. We formalise...

  14. Inorganic analyses of Martian surface samples at the Viking landing sites

    Science.gov (United States)

    Clark, B. C.; Castro, A. J.; Rowe, C. D.; Baird, A. K.; Evans, P. H.; Rose, H. J., Jr.; Toulmin, P., III; Keil, K.; Kelliher, W. C.

    1976-01-01

Elemental analyses of fines in the Martian regolith at two widely separated landing sites, Chryse Planitia and Utopia Planitia, produced remarkably similar results. At both sites, the uppermost regolith contains abundant Si and Fe, with significant concentrations of Mg, Al, S, Ca, and Ti. The S concentration is one to two orders of magnitude higher, and K (less than 0.25% by weight) is at least 5 times lower, than the average for Earth's crust. The trace elements Sr, Y, and possibly Zr have been detected at concentrations near or below 100 parts per million. Pebble-sized fragments sampled at Chryse contain more S than the bulk fines and are thought to be pieces of a sulfate-cemented duricrust.

  15. Development of a Competent and Trouble Free DNA Isolation Protocol for Downstream Genetic Analyses in Glycine Species

    Directory of Open Access Journals (Sweden)

    Muhammad Amjad Nawaz

    2016-08-01

Full Text Available Extraction of deoxyribonucleic acid (DNA) from plants is a preliminary step in molecular biology. Fast and cost-effective genomic DNA isolation from Glycine species for downstream applications is a major bottleneck. Here we report a high-throughput and trouble-free method for genomic DNA extraction from leaves and seeds of Glycine species with high quality and quantity. The protocol reports optimization by employing different concentrations of CTAB and PVP in the extraction buffer. The efficiency of the optimized protocol was compared with frequently used DNA extraction methods. The wide adoptability and utility of this protocol were confirmed by DNA extraction from leaves as well as seeds of G. max, G. soja, G. tomentella and G. latifolia. The extracted DNA was successfully subjected to PCR amplification of five microsatellite markers and four putative glycosyltransferase genes. The DNA extraction protocol is reproducible, trouble-free, rapid, and can be adopted for plant molecular biology applications.

  16. Diagnostic accuracy of serological diagnosis of hepatitis C and B using dried blood spot samples (DBS): two systematic reviews and meta-analyses.

    Science.gov (United States)

    Lange, Berit; Cohn, Jennifer; Roberts, Teri; Camp, Johannes; Chauffour, Jeanne; Gummadi, Nina; Ishizaki, Azumi; Nagarathnam, Anupriya; Tuaillon, Edouard; van de Perre, Philippe; Pichler, Christine; Easterbrook, Philippa; Denkinger, Claudia M

    2017-11-01

Dried blood spots (DBS) are a convenient tool to enable diagnostic testing for viral diseases due to transport, handling and logistical advantages over conventional venous blood sampling. A better understanding of the performance of serological testing for hepatitis C (HCV) and hepatitis B virus (HBV) from DBS is important to enable more widespread use of this sampling approach in resource-limited settings, and to inform the 2017 World Health Organization (WHO) guidance on testing for HBV/HCV. We conducted two systematic reviews and meta-analyses on the diagnostic accuracy of HCV antibody (HCV-Ab) and HBV surface antigen (HBsAg) testing from DBS samples compared to venous blood samples. MEDLINE, EMBASE, Global Health and the Cochrane Library were searched for studies that assessed diagnostic accuracy with DBS and agreement between DBS and venous sampling. Heterogeneity of results was assessed and, where possible, a pooled analysis of sensitivity and specificity was performed using a bivariate analysis with maximum likelihood estimation and 95% confidence intervals (95%CI). We conducted a narrative review on the impact of varying storage conditions or limits of detection in subsets of samples. The QUADAS-2 tool was used to assess risk of bias. For the diagnostic accuracy of HBsAg from DBS compared to venous blood, 19 studies were included in a quantitative meta-analysis, and 23 in a narrative review. Pooled sensitivity and specificity were 98% (95%CI: 95-99%) and 100% (95%CI: 99-100%), respectively. For the diagnostic accuracy of HCV-Ab from DBS, 19 studies were included in a pooled quantitative meta-analysis, and 23 studies were included in a narrative review. Pooled estimates of sensitivity and specificity were 98% (95%CI: 95-99%) and 99% (95%CI: 98-100%), respectively. Overall quality of studies and heterogeneity were rated as moderate in both systematic reviews. HCV-Ab and HBsAg testing using DBS compared to venous blood sampling was associated with excellent diagnostic accuracy
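As a rough illustration of pooling diagnostic accuracy across studies, a naive count-pooling sketch is shown below with hypothetical 2×2 counts. Note that the reviews themselves used a bivariate maximum-likelihood model, which this simple pooling does not reproduce:

```python
import math

def pooled_accuracy(studies):
    """Naive pooled sensitivity/specificity from per-study 2x2 counts
    (tp, fn, tn, fp). Simple count pooling only, NOT the bivariate
    maximum-likelihood model used in the reviews."""
    tp = sum(s[0] for s in studies)
    fn = sum(s[1] for s in studies)
    tn = sum(s[2] for s in studies)
    fp = sum(s[3] for s in studies)
    return tp / (tp + fn), tn / (tn + fp)

def wald_ci(p, n, z=1.96):
    """95% Wald interval for a proportion (adequate away from 0 and 1)."""
    half = z * math.sqrt(p * (1 - p) / n)
    return max(0.0, p - half), min(1.0, p + half)

# Hypothetical counts for three DBS-vs-venous studies: (tp, fn, tn, fp)
studies = [(95, 2, 180, 1), (40, 1, 60, 0), (120, 3, 210, 2)]
sens, spec = pooled_accuracy(studies)
print(round(sens, 3), round(spec, 3))  # → 0.977 0.993
```

A bivariate model additionally accounts for between-study heterogeneity and the correlation between sensitivity and specificity, which naive pooling ignores.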

  17. Aleatoric and epistemic uncertainties in sampling based nuclear data uncertainty and sensitivity analyses

    International Nuclear Information System (INIS)

    Zwermann, W.; Krzykacz-Hausmann, B.; Gallner, L.; Klein, M.; Pautz, A.; Velkov, K.

    2012-01-01

Sampling-based uncertainty and sensitivity analyses due to epistemic input uncertainties, i.e. due to incomplete knowledge of uncertain input parameters, can be performed with arbitrary application programs to solve the physical problem under consideration. For the description of steady-state particle transport, direct simulations of the microscopic processes with Monte Carlo codes are often used. This introduces an additional source of uncertainty, the aleatoric sampling uncertainty, which is due to the randomness of the simulation process performed by sampling, and which adds to the total combined output sampling uncertainty. So far, this aleatoric part of the uncertainty has been minimized by running a sufficiently large number of Monte Carlo histories for each sample calculation, thus making its impact negligible compared with the impact of sampling the epistemic uncertainties. Obviously, this process may cause high computational costs. The present paper shows that in many applications reliable epistemic uncertainty results can also be obtained with substantially lower computational effort by performing and analyzing two appropriately generated series of samples, each with a much smaller number of Monte Carlo histories. The method is applied along with the nuclear data uncertainty and sensitivity code package XSUSA in combination with the Monte Carlo transport code KENO-Va to various critical assemblies and a full-scale reactor calculation. It is shown that the proposed method yields output uncertainties and sensitivities equivalent to the traditional approach, while reducing computing time by factors on the order of 100. (authors)
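The two-series idea can be illustrated with a toy stand-in for the Monte Carlo transport code, where the epistemic input uncertainty and the per-history noise level are known by construction (all numbers here are illustrative assumptions, not XSUSA/KENO-Va results):

```python
import random, statistics

random.seed(1)

def mc_run(theta, histories):
    """Stand-in for one Monte Carlo transport calculation: the exact
    response is theta, blurred by aleatoric statistical noise whose
    standard deviation shrinks as 1/sqrt(histories)."""
    return theta + random.gauss(0.0, 1.0 / histories ** 0.5)

def cov(xs, ys):
    """Sample covariance of two equally long series."""
    mx, my = sum(xs) / len(xs), sum(ys) / len(ys)
    return sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / (len(xs) - 1)

def two_series_decomposition(n_samples=2000, histories=100):
    """For each epistemic sample (a random input parameter), run TWO
    independent short MC calculations. Their cross-covariance estimates
    the epistemic variance, because the independent aleatoric noise
    averages out of the covariance; subtracting it from the total
    variance of one series leaves the aleatoric part."""
    a, b = [], []
    for _ in range(n_samples):
        theta = random.gauss(10.0, 0.5)  # epistemic input uncertainty
        a.append(mc_run(theta, histories))
        b.append(mc_run(theta, histories))
    epistemic = cov(a, b)
    aleatoric = statistics.variance(a) - epistemic
    return epistemic, aleatoric

ep, al = two_series_decomposition()
print(round(ep, 3), round(al, 3))  # expect roughly 0.25 and 0.01
```

With 100 histories per run instead of the very large counts a single-series approach would need, the decomposition still recovers the epistemic variance (0.5² = 0.25) built into the toy model.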

  18. Augmented Quadruple-Phase Contrast Media Administration and Triphasic Scan Protocol Increases Image Quality at Reduced Radiation Dose During Computed Tomography Urography.

    Science.gov (United States)

    Saade, Charbel; Mohamad, May; Kerek, Racha; Hamieh, Nadine; Alsheikh Deeb, Ibrahim; El-Achkar, Bassam; Tamim, Hani; Abdul Razzak, Farah; Haddad, Maurice; Abi-Ghanem, Alain S; El-Merhi, Fadi

The aim of this article was to investigate the opacification of the renal vasculature and the urogenital system during computed tomography urography by using quadruple-phase contrast media in a triphasic scan protocol. A total of 200 patients with possible urinary tract abnormalities were equally divided between 2 protocols. Protocol A used the conventional single-bolus and quadruple-phase scan protocol (pre, arterial, venous, and delayed), retrospectively. Protocol B included a quadruple-phase contrast media injection with a triphasic scan protocol (pre, arterial and combined venous, and delayed), prospectively. Each protocol used 100 mL of contrast and saline at a flow rate of 4.5 mL/s. Attenuation profiles and contrast-to-noise ratios of the renal arteries, veins, and urogenital tract were measured. Effective radiation dose calculation, data analysis by independent-sample t test, receiver operating characteristic, and visual grading characteristic analyses were performed. In arterial circulation, only the inferior interlobular arteries showed statistical significance between protocols. Protocol B showed a higher contrast-to-noise ratio than protocol A (protocol B: 22.68 ± 13.72; protocol A: 14.75 ± 5.76). Quadruple-phase contrast media and triphasic scan protocol usage increases image quality at a reduced radiation dose.
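Contrast-to-noise ratio, the image-quality metric compared between the two protocols, is commonly computed from ROI statistics roughly as below; the exact definition used in the paper is not given in the abstract, and the Hounsfield-unit values are hypothetical:

```python
def contrast_to_noise(roi_mean, background_mean, background_sd):
    """One common CNR definition in CT studies: attenuation difference
    between the vessel ROI and background, divided by background noise.
    Several variants exist; the paper's exact formula is not stated."""
    return (roi_mean - background_mean) / background_sd

# Hypothetical Hounsfield-unit measurements for a renal artery ROI:
# ROI mean 320 HU, background mean 45 HU, background SD 12 HU.
print(round(contrast_to_noise(320.0, 45.0, 12.0), 2))  # → 22.92
```

A higher CNR at the same or lower dose, as reported for protocol B, means the vessel stands out more clearly against image noise.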

  19. Military construction program economic analysis manual: Sample economic analyses: Hazardous Waste Remedial Actions Program

    International Nuclear Information System (INIS)

    1987-12-01

    This manual enables the US Air Force to comprehensively and systematically analyze alternative approaches to meeting its military construction requirements. The manual includes step-by-step procedures for completing economic analyses for military construction projects, beginning with determining if an analysis is necessary. Instructions and a checklist of the tasks involved for each step are provided; and examples of calculations and illustrations of completed forms are included. The manual explains the major tasks of an economic analysis, including identifying the problem, selecting realistic alternatives for solving it, formulating appropriate assumptions, determining the costs and benefits of the alternatives, comparing the alternatives, testing the sensitivity of major uncertainties, and ranking the alternatives. Appendixes are included that contain data, indexes, and worksheets to aid in performing the economic analyses. For reference, Volume 2 contains sample economic analyses that illustrate how each form is filled out and that include a complete example of the documentation required

  20. First formate, acetate and methanesulfonate analyses in firn samples from Grenzgletscher (Monte Rosa, 4200 m a.s.l.)

    Energy Technology Data Exchange (ETDEWEB)

    Grund, A.; Schwikowski, M.; Bruetsch, S.; Gaeggeler, H.W. [Paul Scherrer Inst. (PSI), Villigen (Switzerland)

    1997-09-01

    In order to determine trace concentrations of acetate, formate and methanesulfonate in ice samples by ion chromatography precautions have to be taken to avoid contaminations. We investigated sources of contamination and analysed first samples from Grenzgletscher (Monte Rosa massif). (author) 1 fig., 3 refs.

  1. First formate, acetate and methanesulfonate analyses in firn samples from Grenzgletscher (Monte Rosa, 4200 m a.s.l.)

    International Nuclear Information System (INIS)

    Grund, A.; Schwikowski, M.; Bruetsch, S.; Gaeggeler, H.W.

    1997-01-01

    In order to determine trace concentrations of acetate, formate and methanesulfonate in ice samples by ion chromatography precautions have to be taken to avoid contaminations. We investigated sources of contamination and analysed first samples from Grenzgletscher (Monte Rosa massif). (author) 1 fig., 3 refs

  2. Protocol compliance and time management in blunt trauma resuscitation.

    Science.gov (United States)

    Spanjersberg, W R; Bergs, E A; Mushkudiani, N; Klimek, M; Schipper, I B

    2009-01-01

To study advanced trauma life support (ATLS) protocol adherence prospectively in trauma resuscitation and to analyse time management of daily multidisciplinary trauma resuscitation at a level 1 trauma centre, for both moderately and severely injured patients. All victims of severe blunt trauma were consecutively included. Patients with a revised trauma score (RTS) of 12 were resuscitated by a "minor trauma" team and patients with an RTS of less than 12 were resuscitated by a "severe trauma" team. Digital video recordings were used to analyse protocol compliance and time management during initial assessment. From 1 May to 1 September 2003, 193 resuscitations were included. The "minor trauma" team assessed 119 patients, with a mean injury severity score (ISS) of 7 (range 1-45). Overall protocol compliance was 42%, ranging from 0% for thoracic percussion to 93% for thoracic auscultation. The median resuscitation time was 45.9 minutes (range 39.7-55.9). The "severe trauma" team assessed 74 patients, with a mean ISS of 22 (range 1-59). Overall protocol compliance was 53%, ranging from 4% for thoracic percussion to 95% for thoracic auscultation. The median resuscitation time was 34.8 minutes (range 21.6-44.1). The results showed the current trauma resuscitation to be ATLS-like, with sometimes very low protocol compliance rates. Timing of the secondary survey and radiology, and thus time efficiency, remains a challenge in all trauma patients. To assess the effect of trauma resuscitation protocols on outcome, protocol adherence needs to be improved.

  3. Effectiveness of oxaliplatin desensitization protocols.

    Science.gov (United States)

    Cortijo-Cascajares, Susana; Nacle-López, Inmaculada; García-Escobar, Ignacio; Aguilella-Vizcaíno, María José; Herreros-de-Tejada, Alberto; Cortés-Funes Castro, Hernán; Calleja-Hernández, Miguel-Ángel

    2013-03-01

    Hypersensitivity reaction (HSR) to antineoplastic drugs can force doctors to stop treatment and seek other alternatives. These alternatives may be less effective, not as well tolerated and/or more expensive. Another option is to use desensitization protocols that induce a temporary state of tolerance by gradually administering small quantities of the antineoplastic drug until the therapeutic dosage is reached. The aim of this study is to assess the effectiveness of oxaliplatin desensitization protocols. A retrospective observational study was carried out between January 2006 and May 2011. The inclusion criteria were patients undergoing chemotherapy treatment with oxaliplatin who had developed an HSR to the drug and who were candidates for continuing the treatment using a desensitization protocol. The patients' clinical records were reviewed and variables were gathered relating to the patient, the treatment, the HSR, and the desensitization protocol administered. The data were analysed using version 18.0 of the statistics program SPSS. A total of 53 desensitization protocols were administered to 21 patients. In 89 % of these cases, no new reactions occurred while the drug was being administered. New reactions of mild severity only occurred in 11 % of cases, and none of these reactions were severe enough for treatment to be stopped. All patients were able to complete the desensitization protocol. This study confirms that oxaliplatin desensitization protocols are safe and effective and allow patients to continue with the treatment that initially caused an HSR.
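The gradual-administration principle behind desensitization can be sketched numerically as a doubling schedule that terminates at the full therapeutic dose. This is a generic illustration only (the starting fraction and doubling factor are assumptions), not the clinical protocol used in the study:

```python
def escalation_steps(target_dose_mg, start_fraction=1 / 1000, factor=2.0):
    """Generic doubling schedule: begin at a small fraction of the full
    dose and double each step, sizing the final step so the cumulative
    dose equals the full therapeutic dose. Illustrative only; actual
    desensitization regimens are defined clinically."""
    steps, given = [], 0.0
    dose = target_dose_mg * start_fraction
    while given + dose < target_dose_mg:
        steps.append(round(dose, 3))
        given += dose
        dose *= factor
    steps.append(round(target_dose_mg - given, 3))  # final top-up step
    return steps

steps = escalation_steps(150.0)  # e.g. a hypothetical 150 mg total dose
print(len(steps), round(sum(steps), 2))  # → 10 150.0
```

Starting at 1/1000 of the dose means the earliest steps expose the patient to only fractions of a milligram, which is the point of inducing temporary tolerance before the therapeutic dose is reached.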

  4. Comparison of methods for the quantification of the different carbon fractions in atmospheric aerosol samples

    Science.gov (United States)

    Nunes, Teresa; Mirante, Fátima; Almeida, Elza; Pio, Casimiro

    2010-05-01

Atmospheric carbon consists of organic carbon (OC, including various organic compounds), elemental carbon (EC, or black carbon [BC]/soot, a non-volatile, light-absorbing carbon), and a small quantity of carbonate carbon. Thermal/optical methods (TOM) have been widely used for quantifying total carbon (TC), OC, and EC in ambient and source particulate samples. Unfortunately, the different thermal evolution protocols in use can result in wide variation in the measured elemental carbon-to-total carbon ratio. The temperature evolution in thermal carbon analysis is critical to the allocation of carbon fractions. Another critical point in OC and EC quantification by TOM is interference from carbonate carbon (CC) that may be present in particulate samples, mainly in the coarse fraction of atmospheric aerosol. One method used to minimize this interference is a sample pre-treatment with acid to eliminate CC prior to thermal analysis (Chow et al., 2001; Pio et al., 1994). In Europe, there is currently no standard procedure for determining the carbonaceous aerosol fraction, which implies that data from different laboratories at various sites are of unknown accuracy and cannot be considered comparable. In the framework of the EU project EUSAAR, a comprehensive study has been carried out to identify the causes of differences in the EC measured using different thermal evolution protocols. From this study an optimised protocol, the EUSAAR-2 protocol, was defined (Cavalli et al., 2009). During the last two decades thousands of aerosol samples have been collected on quartz filters at urban, industrial, rural and background sites, and also from forest fire plumes and biomass burning in a closed domestic stove. These samples were analysed for OC and EC by a TOM similar to that used in the IMPROVE network (Pio et al., 2007). More recently we reduced the number of steps in the thermal evolution protocols, without significant repercussions on the OC/EC quantifications.
In order

  5. Analyses and Comparison of Bulk and Coil Surface Samples from the DWPF Slurry Mix Evaporator

    International Nuclear Information System (INIS)

    Hay, M.; Nash, C.; Stone, M.

    2012-01-01

Sludge samples from the DWPF Slurry Mix Evaporator (SME) heating coil frame and coil surface were characterized to identify differences that might help identify heat transfer fouling materials. The SME steam coils have seen increased fouling leading to lower boil-up rates. Samples of the sludge were taken from the coil frame somewhat distant from the coil (bulk tank material) and from the coil surface (coil surface sample). The results of the analysis indicate that the compositions of the two SME samples are very similar, with the exception that the coil surface sample shows ∼5-10X higher mercury concentration than the bulk tank sample. Elemental analyses and X-ray diffraction results did not indicate notable differences between the two samples. The ICP-MS and Cs-137 data indicate no significant differences in the radionuclide composition of the two SME samples. Semi-volatile organic analysis revealed numerous organic molecules; these likely result from antifoaming additives. The compositions of the two SME samples also match well with the analyzed composition of the SME batch, with the exception of significantly higher silicon, lithium, and boron content in the batch sample, indicating the coil samples are deficient in frit relative to the SME batch composition.

  6. An Improved 6LoWPAN Hierarchical Routing Protocol

    Directory of Open Access Journals (Sweden)

    Xue Li

    2015-10-01

Full Text Available The IETF 6LoWPAN working group is engaged in research on an IPv6 protocol stack based on the IEEE 802.15.4 standard. Within this working group, the routing protocol is one of the important research topics. In 6LoWPAN, HiLow is a well-known hierarchical routing protocol. This paper puts forward an improved hierarchical routing protocol, GHiLow, which improves HiLow's parent node selection and path recovery strategy. GHiLow improves parent node selection by increasing the choice of parameters. Simultaneously, it also improves path recovery by analysing different situations in which the path must be recovered. Therefore, GHiLow contributes to the enhancement of network performance and the decrease of network energy consumption.
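Multi-parameter parent selection of the kind GHiLow performs can be sketched as a weighted scoring over candidate parents. The abstract does not specify which parameters or weights GHiLow uses, so the ones below (residual energy, link quality, hop depth) and their weights are illustrative assumptions:

```python
def choose_parent(candidates, weights=(0.5, 0.3, 0.2)):
    """Pick the best parent by a weighted score over several parameters:
    residual energy (higher is better), link quality (higher is better),
    and hop depth (lower is better). The parameters and weights are
    illustrative, not GHiLow's actual formula."""
    w_energy, w_quality, w_depth = weights

    def score(c):
        return (w_energy * c["energy"]
                + w_quality * c["lqi"]
                - w_depth * c["depth"])

    return max(candidates, key=score)

# Hypothetical neighbour table of a joining node.
candidates = [
    {"id": "A", "energy": 0.9, "lqi": 0.6, "depth": 3},
    {"id": "B", "energy": 0.5, "lqi": 0.9, "depth": 2},
    {"id": "C", "energy": 0.8, "lqi": 0.8, "depth": 4},
]
print(choose_parent(candidates)["id"])  # → B
```

Scoring several parameters at once, rather than depth alone as in plain hierarchical routing, is what lets such a scheme trade a slightly longer path for a better or less depleted link.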

  7. Evaluation of Two Surface Sampling Methods for Microbiological and Chemical Analyses To Assess the Presence of Biofilms in Food Companies.

    Science.gov (United States)

    Maes, Sharon; Huu, Son Nguyen; Heyndrickx, Marc; Weyenberg, Stephanie van; Steenackers, Hans; Verplaetse, Alex; Vackier, Thijs; Sampers, Imca; Raes, Katleen; Reu, Koen De

    2017-12-01

Biofilms are an important source of contamination in food companies, yet the composition of biofilms in practice is still mostly unknown. The chemical and microbiological characterization of surface samples taken after cleaning and disinfection is very important to distinguish free-living bacteria from the attached bacteria in biofilms. In this study, sampling methods that are potentially useful for both chemical and microbiological analyses of surface samples were evaluated. In the manufacturing facilities of eight Belgian food companies, surfaces were sampled after cleaning and disinfection using two sampling methods: the scraper-flocked swab method and the sponge stick method. Microbiological and chemical analyses were performed on these samples to evaluate the suitability of the sampling methods for the quantification of extracellular polymeric substance components and microorganisms originating from biofilms in these facilities. The scraper-flocked swab method was most suitable for chemical analyses of the samples because the material in these swabs did not interfere with determination of the chemical components. For microbiological enumerations, the sponge stick method was slightly but not significantly more effective than the scraper-flocked swab method. In all but one of the facilities, at least 20% of the sampled surfaces had more than 10² CFU/100 cm². Proteins were found in 20% of the chemically analyzed surface samples, and carbohydrates and uronic acids were found in 15 and 8% of the samples, respectively. When chemical and microbiological results were combined, 17% of the sampled surfaces were contaminated with both microorganisms and at least one of the analyzed chemical components; thus, these surfaces were characterized as carrying biofilm. Overall, microbiological contamination in the food industry is highly variable by food sector and even within a facility at various sampling points and sampling times.

  8. Fully Automated Trimethylsilyl (TMS) Derivatisation Protocol for Metabolite Profiling by GC-MS.

    Science.gov (United States)

    Zarate, Erica; Boyle, Veronica; Rupprecht, Udo; Green, Saras; Villas-Boas, Silas G; Baker, Philip; Pinu, Farhana R

    2016-12-29

Gas Chromatography-Mass Spectrometry (GC-MS) has long been used for metabolite profiling of a wide range of biological samples. Many derivatisation protocols are already available and, among these, trimethylsilyl (TMS) derivatisation is one of the most widely used in metabolomics. However, most TMS methods rely on off-line derivatisation prior to GC-MS analysis. In the case of manual off-line TMS derivatisation, the derivative created is unstable, so recoveries decline over time and derivatisation must be carried out in small batches. Here, we present a fully automated TMS derivatisation protocol using robotic autosamplers, and we also evaluate commercial software, Maestro, available from Gerstel GmbH. Because of the automation, there was no waiting time for derivatised samples on the autosamplers, reducing degradation of unstable metabolites. Moreover, this method allowed us to overlap samples and improved throughput. We compared data obtained from both manual and automated TMS methods performed on three different matrices: a standard mix, wine, and plasma samples. The automated TMS method showed better reproducibility and higher peak intensity for most of the identified metabolites than the manual derivatisation method. We also validated the automated method using 114 quality control plasma samples. Additionally, we showed that this online method was highly reproducible for most of the metabolites detected and identified. The automated TMS method has also been applied to analyse a large number of complex plasma samples. Furthermore, we found that this method was highly applicable for routine metabolite profiling (both targeted and untargeted) in any metabolomics laboratory.

  9. Strategies for achieving high sequencing accuracy for low diversity samples and avoiding sample bleeding using illumina platform.

    Science.gov (United States)

    Mitra, Abhishek; Skrzypczak, Magdalena; Ginalski, Krzysztof; Rowicka, Maga

    2015-01-01

    Sequencing microRNA, reduced representation sequencing, Hi-C technology and any method requiring the use of in-house barcodes result in sequencing libraries with low initial sequence diversity. Sequencing such data on the Illumina platform typically produces low quality data due to the limitations of the Illumina cluster calling algorithm. Moreover, even in the case of diverse samples, these limitations cause substantial inaccuracies in multiplexed sample assignment (sample bleeding). Such inaccuracies are unacceptable in clinical applications, and in some other fields (e.g. detection of rare variants). Here, we discuss how both problems with quality of low-diversity samples and sample bleeding are caused by incorrect detection of clusters on the flowcell during initial sequencing cycles. We propose simple software modifications (Long Template Protocol) that overcome this problem. We present experimental results showing that our Long Template Protocol remarkably increases data quality for low diversity samples, as compared with the standard analysis protocol; it also substantially reduces sample bleeding for all samples. For comprehensiveness, we also discuss and compare experimental results from alternative approaches to sequencing low diversity samples. First, we discuss how the low diversity problem, if caused by barcodes, can be avoided altogether at the barcode design stage. Second and third, we present modified guidelines, which are more stringent than the manufacturer's, for mixing low diversity samples with diverse samples and lowering cluster density, which in our experience consistently produces high quality data from low diversity samples. Fourth and fifth, we present rescue strategies that can be applied when sequencing results in low quality data and when there is no more biological material available. In such cases, we propose that the flowcell be re-hybridized and sequenced again using our Long Template Protocol. Alternatively, we discuss how …
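
The barcode-design remedy mentioned above amounts to ensuring that every sequencing cycle (barcode position) sees several different bases across the pooled barcodes. A minimal sketch; the balance criterion (at least two distinct bases per position) is our assumption for illustration, not Illumina's specification:

```python
from collections import Counter

def positions_lacking_diversity(barcodes, min_bases=2):
    """Return the 0-based barcode positions where fewer than `min_bases`
    distinct bases occur across the pooled barcode set; such positions
    produce low-diversity sequencing cycles."""
    length = len(barcodes[0])
    bad = []
    for i in range(length):
        bases = Counter(b[i] for b in barcodes)
        if len(bases) < min_bases:
            bad.append(i)
    return bad

# Position 0 is 'A' in every barcode below, so it is flagged.
flagged = positions_lacking_diversity(["AACG", "ATGC", "AGCT", "ACTA"])
```

A flagged position tells the designer to swap at least one barcode before pooling.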

  10. Strategies for achieving high sequencing accuracy for low diversity samples and avoiding sample bleeding using illumina platform.

    Directory of Open Access Journals (Sweden)

    Abhishek Mitra

    Full Text Available Sequencing microRNA, reduced representation sequencing, Hi-C technology and any method requiring the use of in-house barcodes result in sequencing libraries with low initial sequence diversity. Sequencing such data on the Illumina platform typically produces low quality data due to the limitations of the Illumina cluster calling algorithm. Moreover, even in the case of diverse samples, these limitations cause substantial inaccuracies in multiplexed sample assignment (sample bleeding). Such inaccuracies are unacceptable in clinical applications, and in some other fields (e.g. detection of rare variants). Here, we discuss how both problems with quality of low-diversity samples and sample bleeding are caused by incorrect detection of clusters on the flowcell during initial sequencing cycles. We propose simple software modifications (Long Template Protocol) that overcome this problem. We present experimental results showing that our Long Template Protocol remarkably increases data quality for low diversity samples, as compared with the standard analysis protocol; it also substantially reduces sample bleeding for all samples. For comprehensiveness, we also discuss and compare experimental results from alternative approaches to sequencing low diversity samples. First, we discuss how the low diversity problem, if caused by barcodes, can be avoided altogether at the barcode design stage. Second and third, we present modified guidelines, which are more stringent than the manufacturer's, for mixing low diversity samples with diverse samples and lowering cluster density, which in our experience consistently produces high quality data from low diversity samples. Fourth and fifth, we present rescue strategies that can be applied when sequencing results in low quality data and when there is no more biological material available. In such cases, we propose that the flowcell be re-hybridized and sequenced again using our Long Template Protocol. Alternatively …

  11. Effect of variable rates of daily sampling of fly larvae on decomposition and carrion insect community assembly: implications for forensic entomology field study protocols.

    Science.gov (United States)

    Michaud, Jean-Philippe; Moreau, Gaétan

    2013-07-01

    Experimental protocols in forensic entomology successional field studies generally involve daily sampling of insects to document temporal changes in species composition on animal carcasses. One challenge with that method has been to adjust the sampling intensity to obtain the best representation of the community present without affecting the said community. To this date, little is known about how such investigator perturbations affect decomposition-related processes. Here, we investigated how different levels of daily sampling of fly eggs and fly larvae affected, over time, carcass decomposition rate and the carrion insect community. Results indicated that a daily sampling of … forensic entomology successional field studies.

  12. Paranoid Personality Has a Dimensional Latent Structure: Taxometric Analyses of Community and Clinical Samples

    OpenAIRE

    Edens, John F.; Marcus, David K.; Morey, Leslie C.

    2009-01-01

    Although paranoid personality is one of the most commonly diagnosed personality disorders and is associated with numerous negative life consequences, relatively little is known about the structural properties of this condition. This study examines whether paranoid personality traits represent a latent dimension or a discrete class (i.e., taxon). In study 1, we conducted taxometric analyses of paranoid personality disorder criteria in a sample of 731 patients participating in the Collaborative...

  13. Evaluating the effect of sample type on American alligator (Alligator mississippiensis) analyte values in a point-of-care blood analyser.

    Science.gov (United States)

    Hamilton, Matthew T; Finger, John W; Winzeler, Megan E; Tuberville, Tracey D

    2016-01-01

    The assessment of wildlife health has been enhanced by the ability of point-of-care (POC) blood analysers to provide biochemical analyses of non-domesticated animals in the field. However, environmental limitations (e.g. temperature, atmospheric humidity and rain) and lack of reference values may inhibit researchers from using such a device with certain wildlife species. Evaluating the use of alternative sample types, such as plasma, in a POC device may afford researchers the opportunity to delay sample analysis and the ability to use banked samples. In this study, we examined fresh whole blood, fresh plasma and frozen plasma (sample type) pH, partial pressure of carbon dioxide (PCO₂), bicarbonate (HCO₃⁻), total carbon dioxide (TCO₂), base excess (BE), partial pressure of oxygen (PO₂), oxygen saturation (sO₂) and lactate concentrations in 23 juvenile American alligators (Alligator mississippiensis) using an i-STAT CG4+ cartridge. Our results indicate that sample type had no effect on lactate concentration values (F2,65 = 0.37, P = 0.963), suggesting that the i-STAT analyser can be used reliably to quantify lactate concentrations in fresh and frozen plasma samples. In contrast, the other seven blood parameters measured by the CG4+ cartridge were significantly affected by sample type. Lastly, we were able to collect blood samples from all alligators within 2 min of capture to establish preliminary reference ranges for juvenile alligators based on values obtained using fresh whole blood.
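
The F-statistic quoted above comes from a one-way ANOVA testing for a sample-type effect. A stdlib sketch of that computation; the three lactate groups below are invented for illustration, not the study's data:

```python
from statistics import mean

def one_way_anova_F(groups):
    """Return (F, df_between, df_within) for a one-way ANOVA over a list
    of groups of observations."""
    all_obs = [x for g in groups for x in g]
    grand = mean(all_obs)
    k, n = len(groups), len(all_obs)
    ss_between = sum(len(g) * (mean(g) - grand) ** 2 for g in groups)
    ss_within = sum((x - mean(g)) ** 2 for g in groups for x in g)
    df_b, df_w = k - 1, n - k
    F = (ss_between / df_b) / (ss_within / df_w)
    return F, df_b, df_w

whole  = [1.9, 2.1, 2.0, 2.2]  # hypothetical lactate values, mmol/L
fresh  = [2.0, 1.9, 2.1, 2.0]
frozen = [2.1, 2.0, 1.9, 2.1]
F, df_b, df_w = one_way_anova_F([whole, fresh, frozen])
# A small F relative to the critical value at (df_b, df_w) is consistent
# with "no effect of sample type", the study's finding for lactate.
```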

  14. Evaluation of a transposase protocol for rapid generation of shotgun high-throughput sequencing libraries from nanogram quantities of DNA.

    Science.gov (United States)

    Marine, Rachel; Polson, Shawn W; Ravel, Jacques; Hatfull, Graham; Russell, Daniel; Sullivan, Matthew; Syed, Fraz; Dumas, Michael; Wommack, K Eric

    2011-11-01

    Construction of DNA fragment libraries for next-generation sequencing can prove challenging, especially for samples with low DNA yield. Protocols devised to circumvent the problems associated with low starting quantities of DNA can result in amplification biases that skew the distribution of genomes in metagenomic data. Moreover, sample throughput can be slow, as current library construction techniques are time-consuming. This study evaluated Nextera, a new transposon-based method that is designed for quick production of DNA fragment libraries from a small quantity of DNA. The sequence read distribution across nine phage genomes in a mock viral assemblage met predictions for six of the least-abundant phages; however, the rank order of the most abundant phages differed slightly from predictions. De novo genome assemblies from Nextera libraries provided long contigs spanning over half of the phage genome; in four cases where full-length genome sequences were available for comparison, consensus sequences were found to match over 99% of the genome with near-perfect identity. Analysis of areas of low and high sequence coverage within phage genomes indicated that GC content may influence coverage of sequences from Nextera libraries. Comparisons of phage genomes prepared using both Nextera and a standard 454 FLX Titanium library preparation protocol suggested that the coverage biases according to GC content observed within the Nextera libraries were largely attributable to bias in the Nextera protocol rather than to the 454 sequencing technology. Nevertheless, given suitable sequence coverage, the Nextera protocol produced high-quality data for genomic studies. For metagenomics analyses, effects of GC amplification bias would need to be considered; however, the library preparation standardization that Nextera provides should benefit comparative metagenomic analyses.
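
The GC-content/coverage association noted above is typically probed by correlating per-window GC fraction with read depth. A minimal stdlib sketch; the helper names are ours and no real sequencing data is used:

```python
from statistics import mean

def gc_fraction(seq: str) -> float:
    """Fraction of G and C bases in a sequence window."""
    s = seq.upper()
    return (s.count("G") + s.count("C")) / len(s)

def pearson_r(xs, ys):
    """Pearson correlation coefficient between two equal-length series,
    e.g. per-window GC fraction vs per-window read depth."""
    mx, my = mean(xs), mean(ys)
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    vx = sum((x - mx) ** 2 for x in xs) ** 0.5
    vy = sum((y - my) ** 2 for y in ys) ** 0.5
    return cov / (vx * vy)
```

A strongly negative (or positive) r between GC fraction and depth across windows would indicate the kind of library-preparation bias the study attributes to the Nextera protocol.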

  15. Sampling and analyses report for June 1992 semiannual postburn sampling at the RM1 UCG site, Hanna, Wyoming

    International Nuclear Information System (INIS)

    Lindblom, S.R.

    1992-08-01

    The Rocky Mountain 1 (RM1) underground coal gasification (UCG) test was conducted from November 16, 1987 through February 26, 1988 (United Engineers and Constructors 1989) at a site approximately one mile south of Hanna, Wyoming. The test consisted of dual module operation to evaluate the controlled retracting injection point (CRIP) technology, the elongated linked well (ELW) technology, and the interaction of closely spaced modules operating simultaneously. The test caused two cavities to be formed in the Hanna No. 1 coal seam and associated overburden. The Hanna No. 1 coal seam is approximately 30 ft thick and lies at depths between 350 ft and 365 ft below the surface in the test area. The coal seam is overlain by sandstones, siltstones and claystones deposited by various fluvial environments. The groundwater monitoring was designed to satisfy the requirements of the Wyoming Department of Environmental Quality (WDEQ) in addition to providing research data toward the development of UCG technology that minimizes environmental impacts. The June 1992 semiannual groundwater sampling took place from June 10 through June 13, 1992. This event occurred nearly 34 months after the second groundwater restoration at the RM1 site and was the fifteenth sampling event since UCG operations ceased. Samples were collected for analyses of a limited suite of parameters as listed in Table 1. With a few exceptions, the groundwater is near baseline conditions. Data from the field measurements and analysis of samples are presented. Benzene concentrations in the groundwater were below analytical detection limits.

  16. Dysphonia risk screening protocol

    Science.gov (United States)

    Nemr, Katia; Simões-Zenari, Marcia; da Trindade Duarte, João Marcos; Lobrigate, Karen Elena; Bagatini, Flavia Alves

    2016-01-01

    OBJECTIVE: To propose and test the applicability of a dysphonia risk screening protocol with score calculation in individuals with and without dysphonia. METHOD: This descriptive cross-sectional study included 365 individuals (41 children, 142 adult women, 91 adult men and 91 seniors) divided into a dysphonic group and a non-dysphonic group. The protocol consisted of 18 questions and a score was calculated using a 10-cm visual analog scale. The measured value on the visual analog scale was added to the overall score, along with other partial scores. Speech samples allowed for analysis/assessment of the overall degree of vocal deviation and initial definition of the respective groups and after six months, the separation of the groups was confirmed using an acoustic analysis. RESULTS: The mean total scores were different between the groups in all samples. Values ranged between 37.0 and 57.85 in the dysphonic group and between 12.95 and 19.28 in the non-dysphonic group, with overall means of 46.09 and 15.55, respectively. High sensitivity and specificity were demonstrated when discriminating between the groups with the following cut-off points: 22.50 (children), 29.25 (adult women), 22.75 (adult men), and 27.10 (seniors). CONCLUSION: The protocol demonstrated high sensitivity and specificity in differentiating groups of individuals with and without dysphonia in different sample groups and is thus an effective instrument for use in voice clinics. PMID:27074171
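
The group-specific cut-off points reported above lend themselves to a one-line screening rule. A sketch; the function and group labels are ours, while the cut-off values are those reported in the abstract:

```python
# Cut-off points from the abstract, per demographic group.
CUTOFFS = {"child": 22.50, "adult_woman": 29.25, "adult_man": 22.75, "senior": 27.10}

def at_risk(total_score: float, group: str) -> bool:
    """Flag dysphonia risk when the protocol's total score exceeds the
    cut-off for the subject's group."""
    return total_score > CUTOFFS[group]
```

With the overall group means reported above, a dysphonic-group mean of 46.09 is flagged for any group, while a non-dysphonic mean of 15.55 is not.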

  17. Dysphonia risk screening protocol

    Directory of Open Access Journals (Sweden)

    Katia Nemr

    2016-03-01

    Full Text Available OBJECTIVE: To propose and test the applicability of a dysphonia risk screening protocol with score calculation in individuals with and without dysphonia. METHOD: This descriptive cross-sectional study included 365 individuals (41 children, 142 adult women, 91 adult men and 91 seniors) divided into a dysphonic group and a non-dysphonic group. The protocol consisted of 18 questions and a score was calculated using a 10-cm visual analog scale. The measured value on the visual analog scale was added to the overall score, along with other partial scores. Speech samples allowed for analysis/assessment of the overall degree of vocal deviation and initial definition of the respective groups, and after six months, the separation of the groups was confirmed using an acoustic analysis. RESULTS: The mean total scores were different between the groups in all samples. Values ranged between 37.0 and 57.85 in the dysphonic group and between 12.95 and 19.28 in the non-dysphonic group, with overall means of 46.09 and 15.55, respectively. High sensitivity and specificity were demonstrated when discriminating between the groups with the following cut-off points: 22.50 (children), 29.25 (adult women), 22.75 (adult men), and 27.10 (seniors). CONCLUSION: The protocol demonstrated high sensitivity and specificity in differentiating groups of individuals with and without dysphonia in different sample groups and is thus an effective instrument for use in voice clinics.

  18. 21 CFR 660.6 - Samples; protocols; official release.

    Science.gov (United States)

    2010-04-01

    ... product iodinated with 125I means a sample from each lot of diagnostic test kits in a finished package... manufacturer has satisfactorily completed all tests on the samples: (i) One sample until written notification... of this section, a sample of product not iodinated with 125I means a sample from each filling of each...

  19. Lead isotope analyses of standard rock samples

    International Nuclear Information System (INIS)

    Koide, Yoshiyuki; Nakamura, Eizo

    1990-01-01

    New results on lead isotope compositions of standard rock samples and the analytical procedures used are reported. A bromide-form anion exchange chromatography technique was adopted for the chemical separation of lead from rock samples. The lead contamination during the whole analytical procedure was low enough to determine the lead isotope compositions of common natural rocks. The silica-gel activator method was applied for emission of lead ions in the mass spectrometer. Using the 'unfractionated ratios' data reduction, we obtained good reproducibility, precision and accuracy for the lead isotope compositions of NBS SRMs. Here we present new reliable lead isotope compositions of GSJ standard rock samples and the USGS standard rock BCR-1. (author)

  20. Multiple surveys employing a new sample-processing protocol reveal the genetic diversity of placozoans in Japan.

    Science.gov (United States)

    Miyazawa, Hideyuki; Nakano, Hiroaki

    2018-03-01

    Placozoans, flat free-living marine invertebrates, possess an extremely simple bauplan lacking neurons and muscle cells and represent one of the earliest-branching metazoan phyla. They are widely distributed from temperate to tropical oceans. Based on mitochondrial 16S rRNA sequences, 19 haplotypes forming seven distinct clades have been reported in placozoans to date. In Japan, placozoans have been found at nine locations, but 16S genotyping has been performed at only two of these locations. Here, we propose a new processing protocol, "ethanol-treated substrate sampling," for collecting placozoans from natural environments. We also report the collection of placozoans from three new locations, the islands of Shikine-jima, Chichi-jima, and Haha-jima, and we present the distribution of the 16S haplotypes of placozoans in Japan. Multiple surveys conducted at multiple locations yielded five haplotypes that were not reported previously, revealing high genetic diversity in Japan, especially at Shimoda and Shikine-jima Island. The observed geographic distribution patterns were different among haplotypes; some were widely distributed, while others were sampled only from a single location. However, samplings conducted on different dates at the same sites yielded different haplotypes, suggesting that placozoans of a given haplotype do not inhabit the same site constantly throughout the year. Continued sampling efforts conducted during all seasons at multiple locations worldwide and the development of molecular markers within the haplotypes are needed to reveal the geographic distribution pattern and dispersal history of placozoans in greater detail.

  1. Bayesian adaptive survey protocols for resource management

    Science.gov (United States)

    Halstead, Brian J.; Wylie, Glenn D.; Coates, Peter S.; Casazza, Michael L.

    2011-01-01

    Transparency in resource management decisions requires a proper accounting of uncertainty at multiple stages of the decision-making process. As information becomes available, periodic review and updating of resource management protocols reduces uncertainty and improves management decisions. One of the most basic steps to mitigating anthropogenic effects on populations is determining if a population of a species occurs in an area that will be affected by human activity. Species are rarely detected with certainty, however, and falsely declaring a species absent can cause improper conservation decisions or even extirpation of populations. We propose a method to design survey protocols for imperfectly detected species that accounts for multiple sources of uncertainty in the detection process, is capable of quantitatively incorporating expert opinion into the decision-making process, allows periodic updates to the protocol, and permits resource managers to weigh the severity of consequences if the species is falsely declared absent. We developed our method using the giant gartersnake (Thamnophis gigas), a threatened species precinctive to the Central Valley of California, as a case study. Survey date was negatively related to the probability of detecting the giant gartersnake, and water temperature was positively related to the probability of detecting the giant gartersnake at a sampled location. Reporting sampling effort, timing and duration of surveys, and water temperatures would allow resource managers to evaluate the probability that the giant gartersnake occurs at sampled sites where it is not detected. This information would also allow periodic updates and quantitative evaluation of changes to the giant gartersnake survey protocol. Because it naturally allows multiple sources of information and is predicated upon the idea of updating information, Bayesian analysis is well-suited to solving the problem of developing efficient sampling protocols for species of conservation concern.
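
The core calculation behind such a protocol is the Bayesian update of occupancy after repeated non-detections. A minimal sketch with a constant per-survey detection probability p; this is a simplification of the study, which models detection as a function of survey date and water temperature:

```python
def prob_present_given_no_detection(prior: float, p: float, n: int) -> float:
    """Bayes' rule for occupancy: posterior probability the species is
    present at a site given n surveys with no detection, where `prior` is
    the prior occupancy probability and `p` the per-survey detection
    probability."""
    miss = (1 - p) ** n          # P(n non-detections | present)
    return prior * miss / (prior * miss + (1 - prior))
```

With a 50% prior and p = 0.3, ten empty surveys leave roughly a 2.7% probability of presence, quantifying how confidently a manager may declare the site unoccupied.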

  2. Analysis of Security Protocols by Annotations

    DEFF Research Database (Denmark)

    Gao, Han

    The trend in Information Technology is that distributed systems and networks are becoming increasingly important, as most of the services and opportunities that characterise the modern society are based on these technologies. Communication among agents over networks has therefore acquired a great deal of research interest. In order to provide effective and reliable means of communication, more and more communication protocols are invented, and for most of them, security is a significant goal. It has long been a challenge to determine conclusively whether a given protocol is secure or not. The development of formal techniques, e.g. control flow analyses, that can check various security properties, is an important tool to meet this challenge. This dissertation contributes to the development of such techniques. In this dissertation, security protocols are modelled in the process calculus LYSA …

  3. Comparison of sampling procedures and microbiological and non-microbiological parameters to evaluate cleaning and disinfection in broiler houses.

    Science.gov (United States)

    Luyckx, K; Dewulf, J; Van Weyenberg, S; Herman, L; Zoons, J; Vervaet, E; Heyndrickx, M; De Reu, K

    2015-04-01

    Cleaning and disinfection of the broiler stable environment is an essential part of farm hygiene management. Adequate cleaning and disinfection is essential for prevention and control of animal diseases and zoonoses. The goal of this study was to shed light on the dynamics of microbiological and non-microbiological parameters during the successive steps of cleaning and disinfection and to select the most suitable sampling methods and parameters to evaluate cleaning and disinfection in broiler houses. The effectiveness of cleaning and disinfection protocols was measured in six broiler houses on two farms through visual inspection, adenosine triphosphate hygiene monitoring and microbiological analyses. Samples were taken at three time points: 1) before cleaning, 2) after cleaning, and 3) after disinfection. Before cleaning and after disinfection, air samples were taken in addition to agar contact plates and swab samples taken from various sampling points for enumeration of total aerobic flora, Enterococcus spp., and Escherichia coli and the detection of E. coli and Salmonella. After cleaning, air samples, swab samples, and adenosine triphosphate swabs were taken and a visual score was also assigned for each sampling point. The mean total aerobic flora determined by swab samples decreased from 7.7±1.4 to 5.7±1.2 log CFU/625 cm² after cleaning and to 4.2±1.6 log CFU/625 cm² after disinfection. Agar contact plates were used as the standard for evaluating cleaning and disinfection, but in this study they were found to be less suitable than swabs for enumeration. In addition to measuring total aerobic flora, Enterococcus spp. seemed to be a better hygiene indicator to evaluate cleaning and disinfection protocols than E. coli. All stables were Salmonella negative, but the detection of its indicator organism E. coli provided additional information for evaluating cleaning and disinfection protocols. Adenosine triphosphate analyses gave additional information about the …
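
The effect of each hygiene step is conventionally expressed as a log10 reduction. A small sketch reproducing the arithmetic implied by the means reported above (counts reconstructed from the reported log values):

```python
import math

def log10_reduction(cfu_before: float, cfu_after: float) -> float:
    """Log10 reduction factor between two CFU counts."""
    return math.log10(cfu_before / cfu_after)

# Reported means were 7.7 -> 5.7 -> 4.2 log CFU/625 cm^2.
cleaning = log10_reduction(10 ** 7.7, 10 ** 5.7)      # reduction due to cleaning
disinfection = log10_reduction(10 ** 5.7, 10 ** 4.2)  # additional reduction due to disinfection
```

Cleaning thus accounts for a 2.0-log reduction and disinfection for a further 1.5 logs.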

  4. Rescue and Preservation of Sample Data from the Apollo Missions to the Moon

    Science.gov (United States)

    Todd, Nancy S.; Zeigler, Ryan A.; Evans, Cindy A.; Lehnert, Kerstin

    2016-01-01

    Six Apollo missions landed on the Moon from 1969-72, returning to Earth 382 kg of lunar rock, soil, and core samples. These samples are among the best documented and preserved samples on Earth that have supported a robust research program for 45 years. From mission planning through sample collection, preliminary examination, and subsequent research, strict protocols and procedures are followed for handling and allocating Apollo subsamples, resulting in the production of vast amounts of documentation. Even today, hundreds of samples are allocated for research each year, building on the science foundation laid down by the early Apollo sample studies and combining new data from today's instrumentation, lunar remote sensing missions and lunar meteorites. Much sample information is available to researchers at curator.jsc.nasa.gov. Decades of analyses on lunar samples are published in LPSC proceedings volumes and other peer-reviewed journals, and tabulated in lunar sample compendia entries. However, for much of the 1969-1995 period, the processing documentation, individual and consortia analyses, and unpublished results exist only in analog forms or primitive digital formats that are either inaccessible or at risk of being lost forever because critical data from early investigators remain unpublished.

  5. Experimental Protocol to Determine the Chloride Threshold Value for Corrosion in Samples Taken from Reinforced Concrete Structures.

    Science.gov (United States)

    Angst, Ueli M; Boschmann, Carolina; Wagner, Matthias; Elsener, Bernhard

    2017-08-31

    The aging of reinforced concrete infrastructure in developed countries imposes an urgent need for methods to reliably assess the condition of these structures. Corrosion of the embedded reinforcing steel is the most frequent cause for degradation. While it is well known that the ability of a structure to withstand corrosion depends strongly on factors such as the materials used or the age, it is common practice to rely on threshold values stipulated in standards or textbooks. These threshold values for corrosion initiation (Ccrit) are independent of the actual properties of a certain structure, which clearly limits the accuracy of condition assessments and service life predictions. The practice of using tabulated values can be traced to the lack of reliable methods to determine Ccrit on-site and in the laboratory. Here, an experimental protocol to determine Ccrit for individual engineering structures or structural members is presented. A number of reinforced concrete samples are taken from structures and laboratory corrosion testing is performed. The main advantage of this method is that it ensures real conditions concerning parameters that are well known to greatly influence Ccrit, such as the steel-concrete interface, which cannot be representatively mimicked in laboratory-produced samples. At the same time, the accelerated corrosion test in the laboratory permits the reliable determination of Ccrit prior to corrosion initiation on the tested structure; this is a major advantage over all common condition assessment methods that only permit estimating the conditions for corrosion after initiation, i.e., when the structure is already damaged. The protocol yields the statistical distribution of Ccrit for the tested structure. This serves as a basis for probabilistic prediction models for the remaining time to corrosion, which is needed for maintenance planning. This method can potentially be used in material testing of civil infrastructures, similar to established …
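
The statistical distribution of Ccrit obtained from the tested samples can feed a simple empirical estimate of initiation probability: the fraction of the Ccrit sample at or below the chloride content measured at the rebar depth. A sketch with invented values (% chloride by cement weight); not the authors' probabilistic model:

```python
def prob_initiation(ccrit_samples, chloride_measured):
    """Empirical P(corrosion initiation) = P(Ccrit <= measured chloride),
    estimated from the per-sample critical chloride contents."""
    below = sum(1 for c in ccrit_samples if c <= chloride_measured)
    return below / len(ccrit_samples)

ccrit = [0.4, 0.6, 0.8, 1.0, 1.2, 1.5]  # hypothetical per-sample thresholds
p = prob_initiation(ccrit, 0.9)          # measured chloride content at rebar depth
```

With these invented numbers, half of the sampled thresholds lie below the measured chloride content, so the empirical initiation probability is 0.5.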

  6. Using semantics for representing experimental protocols.

    Science.gov (United States)

    Giraldo, Olga; García, Alexander; López, Federico; Corcho, Oscar

    2017-11-13

    An experimental protocol is a sequence of tasks and operations executed to perform experimental research in biological and biomedical areas, e.g. biology, genetics, immunology, neurosciences, virology. Protocols often include references to equipment, reagents, descriptions of critical steps, troubleshooting and tips, as well as any other information that researchers deem important for facilitating the reusability of the protocol. Although experimental protocols are central to reproducibility, the descriptions are often cursory. There is a need for a unified framework with respect to the syntactic structure and the semantics for representing experimental protocols. In this paper we present the "SMART Protocols ontology", an ontology for representing experimental protocols. Our ontology represents the protocol as a workflow with domain specific knowledge embedded within a document. We also present the Sample Instrument Reagent Objective (SIRO) model, which represents the minimal common information shared across experimental protocols. SIRO was conceived in the same realm as the Patient Intervention Comparison Outcome (PICO) model that supports search, retrieval and classification purposes in evidence based medicine. We evaluate our approach against a set of competency questions modeled as SPARQL queries and processed against a set of published and unpublished protocols modeled with the SP Ontology and the SIRO model. Our approach makes it possible to answer queries such as "Which protocols use tumor tissue as a sample?". Improving reporting structures for experimental protocols requires collective efforts from authors, peer reviewers, editors and funding bodies. The SP Ontology is a contribution towards this goal. We build upon previous experiences, bringing together the views of researchers who manage protocols in their laboratory work. Website: https://smartprotocols.github.io/.
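
The SIRO model (Sample, Instrument, Reagent, Objective) can be pictured as a minimal record structure supporting competency questions like the one quoted above. A plain-Python stand-in for illustration only; the real system uses an OWL ontology queried with SPARQL, and the records below are invented:

```python
from dataclasses import dataclass

@dataclass
class SIRO:
    """Minimal common information shared across experimental protocols."""
    protocol: str
    sample: str
    instrument: str
    reagent: str
    objective: str

def protocols_using_sample(records, sample):
    """Answer the competency question 'Which protocols use <sample>?'."""
    return [r.protocol for r in records if r.sample == sample]

records = [
    SIRO("P1", "tumor tissue", "microtome", "formalin", "histology"),
    SIRO("P2", "blood", "centrifuge", "EDTA", "plasma isolation"),
]
hits = protocols_using_sample(records, "tumor tissue")
```

In the ontology, the same question is a SPARQL query over protocol individuals typed against the SIRO classes.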

  7. FAO/IAEA model protocol for the determination of bound residues in soil

    International Nuclear Information System (INIS)

    1986-01-01

    A protocol for determining bound pesticide residue content in soils was developed and collaboratively tested by 11 members of the FAO/IAEA Research Co-ordination Committee. The method assumes prior incubation of soil with a radioactive pesticide or related organic compound. The major process steps of the protocol include: (a) Soxhlet extraction of air-dry soil with methanol for 24 h; (b) determination of radioactivity in unextracted soil, in methanol-extracted soil (yielding bound residue content), and in the methanol extract (yielding extractable residue content); and (c) use of triplicate samples per analysis. The participants received lysimeter soils treated six to seven years earlier with ¹⁴C-allyl alcohol (Soil A) or ¹⁴C-hexachlorobenzene (Soil H). The inter-laboratory results first indicated non-homogeneity of Soil A sub-samples, since the initial and bound radioactivity for four laboratories was about half of that found by the remaining seven laboratories. Intra-laboratory (in one laboratory) analyses of sub-subsamples from six 'high-group' laboratories, two 'low-group' laboratories and two additional laboratories confirmed the homogeneity of Soil A and implicated error in the combustion methods at 'low-group' laboratories. The intra- and inter-laboratory coefficients of variation for initial ¹⁴C content were 4.7% and 7.0%, respectively. Of the residual ¹⁴C in Soil A, 95% was bound; in contrast, only 15% of ¹⁴C in Soil H was bound. The coefficients of variation among ten laboratories, for Soil H, were 8.4% and 18.1% for percentage extractable residue and percentage bound residue, respectively. Some limited testing of alternative protocols, using other solvents or batch extraction, confirmed that the IAEA protocol was most efficient in the extraction of non-bound radioactivity; pre-wetting Soil A may, however, improve extraction. (author)
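
Step (b) of the protocol reduces to simple bookkeeping of measured radioactivity. A sketch of the percentage calculation; activity units are arbitrary and the example values are invented (chosen to mirror the ~95% bound fraction reported for Soil A):

```python
def residue_percentages(activity_unextracted, activity_extracted_soil,
                        activity_methanol_extract):
    """Return (% bound residue, % extractable residue) relative to the
    radioactivity of the unextracted soil, per protocol step (b)."""
    bound = 100 * activity_extracted_soil / activity_unextracted
    extractable = 100 * activity_methanol_extract / activity_unextracted
    return bound, extractable

bound, extractable = residue_percentages(1000, 950, 50)
```

For well-behaved samples the two percentages should sum to roughly 100; a larger shortfall points to losses during extraction or counting.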

  8. [Sampling, storage and transport of biological materials collected from living and deceased subjects for determination of concentration levels of ethyl alcohol and similarly acting substances. A proposal of updating the blood and urine sampling protocol].

    Science.gov (United States)

    Wiergowski, Marek; Reguła, Krystyna; Pieśniak, Dorota; Galer-Tatarowicz, Katarzyna; Szpiech, Beata; Jankowski, Zbigniew

    2007-01-01

    The present paper emphasizes the most common mistakes committed at the beginning of an analytical procedure. To shorten the time and decrease the cost of determining substances with activity similar to that of alcohol, it is postulated to introduce mass-scale screening analysis of saliva collected from a living subject at the site of the event, with all positive results confirmed in blood or urine samples. If no saliva sample is collected for toxicology, a urine sample, allowing for a fast screening analysis, and a blood sample, to confirm the result, should be ensured. Inappropriate storage of a blood sample in a tube without a preservative can cause sample spoilage and its irretrievable loss. The authors propose updating the "Blood/urine sampling protocol", with the updated version to be introduced into practice following consultations and revisions.

  9. Time clustered sampling can inflate the inferred substitution rate in foot-and-mouth disease virus analyses

    DEFF Research Database (Denmark)

    Pedersen, Casper-Emil Tingskov; Frandsen, Peter; Wekesa, Sabenzia N.

    2015-01-01

    abundance of sequence data sampled under widely different schemes, an effort to keep results consistent and comparable is needed. This study emphasizes commonly disregarded problems in the inference of evolutionary rates in viral sequence data when sampling is unevenly distributed on a temporal scale...... through a study of the foot-and-mouth disease (FMD) virus serotypes SAT 1 and SAT 2. Our study shows that clustered temporal sampling in phylogenetic analyses of FMD viruses will strongly bias the inferences of substitution rates and tMRCA because the inferred rates in such data sets reflect a rate closer...... to the mutation rate rather than the substitution rate. Estimating evolutionary parameters from viral sequences should be performed with due consideration of the differences in short-term and longer-term evolutionary processes occurring within sets of temporally sampled viruses, and studies should carefully......

  10. Utilisation de l'assistant grammatical Antidote dans le cadre d'activités de révision - Analyse exploratoire de protocoles d'observation Using Antidote in Revision Tasks: An Exploratory Study of Grammar Checker Usage Through Verbal Protocol Analysis

    Directory of Open Access Journals (Sweden)

    Patrick Durel

    2006-06-01

    Full Text Available This study explores learners' cognitive processes when using the grammar assistant software Antidote during the revision phase of their composition. Analysing data obtained using a hybrid form of verbal protocols, it shows how computer-assisted revision activity can lead learners to manipulate, build or reinforce declarative and procedural knowledge. We argue that using grammar assistant software can be conducive to learning. The results of the analysis provide an insight into how such a tool can be integrated into classroom activities, contributing to writing quality and the acquisition of revision strategies.

  11. Use of 10^12 ohm current amplifiers in Sr and Nd isotope analyses by TIMS for application to sub-nanogram samples

    NARCIS (Netherlands)

    Koornneef, J.M.; Bouman, C.; Schwieters, J.B.; Davies, G.R.

    2013-01-01

    We have investigated the use of current amplifiers equipped with 10^12 ohm feedback resistors in thermal ionisation mass spectrometry (TIMS) analyses of sub-nanogram sample aliquots for Nd and Sr isotope ratios. The results of analyses using the 10^12 ohm resistors were compared to those obtained

  12. Verbal protocols as methodological resources: research evidence

    Directory of Open Access Journals (Sweden)

    Alessandra Baldo

    2012-01-01

    Full Text Available This article aims at reflecting on the use of verbal protocols as a methodological resource in qualitative research, more specifically on the aspect regarded as the main limitation of a study about lexical inferencing in L2 (BALDO; VELASQUES, 2010): its subjective trait. The article begins with a brief literature review on protocols, followed by a description of the study in which they were employed as methodological resources. Based on that, protocol subjectivity is illustrated through samples of unparalleled data classification, carried out independently by two researchers. In the final section, the path followed to minimize the problem is presented, intending to contribute to improving efficiency in the use of verbal protocols in future research.

  13. Quantum cryptography: individual eavesdropping with the knowledge of the error-correcting protocol

    International Nuclear Information System (INIS)

    Horoshko, D B

    2007-01-01

    The quantum key distribution protocol BB84 combined with the repetition protocol for error correction is analysed from the point of view of its security against individual eavesdropping relying on quantum memory. It is shown that the mere knowledge of the error-correcting protocol changes the optimal attack and provides the eavesdropper with additional information on the distributed key. (Fifth seminar in memory of D.N. Klyshko)

  14. Technical Note: Comparison of storage strategies of sea surface microlayer samples

    Directory of Open Access Journals (Sweden)

    K. Schneider-Zapp

    2013-07-01

    Full Text Available The sea surface microlayer (SML is an important biogeochemical system whose physico-chemical analysis often necessitates some degree of sample storage. However, many SML components degrade with time so the development of optimal storage protocols is paramount. We here briefly review some commonly used treatment and storage protocols. Using freshwater and saline SML samples from a river estuary, we investigated temporal changes in surfactant activity (SA and the absorbance and fluorescence of chromophoric dissolved organic matter (CDOM over four weeks, following selected sample treatment and storage protocols. Some variability in the effectiveness of individual protocols most likely reflects sample provenance. None of the various protocols examined performed any better than dark storage at 4 °C without pre-treatment. We therefore recommend storing samples refrigerated in the dark.

  15. Sample preparation guidelines for two-dimensional electrophoresis.

    Science.gov (United States)

    Posch, Anton

    2014-12-01

    Sample preparation is one of the key technologies for successful two-dimensional electrophoresis (2DE). Due to the great diversity of protein sample types and sources, no single sample preparation method works with all proteins; for any sample the optimum procedure must be determined empirically. This review is meant to provide a broad overview of the most important principles of sample preparation in order to avoid a multitude of possible pitfalls. Sample preparation protocols from experts in the field were screened and evaluated. On the basis of these protocols and my own comprehensive practical experience, important guidelines are given in this review. The presented guidelines will facilitate straightforward protocol development for researchers new to gel-based proteomics. In addition, the available choices are rationalized in order to successfully prepare a protein sample for 2DE separations. The strategies described here are not limited to 2DE and can also be applied to other protein separation techniques.

  16. Analysis of agreement between cardiac risk stratification protocols applied to participants of a center for cardiac rehabilitation

    Directory of Open Access Journals (Sweden)

    Ana A. S. Santos

    2016-01-01

    Full Text Available Background: Cardiac risk stratification is related to the risk of the occurrence of events induced by exercise. Despite the existence of several protocols to calculate risk stratification, studies indicating that there is similarity between these protocols are still unknown. Objective: To evaluate the agreement between the existing protocols on cardiac risk rating in cardiac patients. Method: The records of 50 patients from a cardiac rehabilitation program were analyzed, from which the following information was extracted: age, sex, weight, height, clinical diagnosis, medical history, risk factors, associated diseases, and the results from the most recent laboratory and complementary tests performed. This information was used for risk stratification of the patients in the protocols of the American College of Sports Medicine, the Brazilian Society of Cardiology, the American Heart Association, the protocol designed by Frederic J. Pashkow, the American Association of Cardiovascular and Pulmonary Rehabilitation, the Société Française de Cardiologie, and the Sociedad Española de Cardiología. Descriptive statistics were used to characterize the sample and the analysis of agreement between the protocols was calculated using the Kappa coefficient. Differences were considered with a significance level of 5%. Results: Of the 21 analyses of agreement, 12 were considered significant between the protocols used for risk classification, with nine classified as moderate and three as low. No agreements were classified as excellent. Different proportions were observed in each risk category, with significant differences between the protocols for all risk categories. Conclusion: The agreements between the protocols were considered low and moderate and the risk proportions differed between protocols.
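The agreement statistic used in this study is Cohen's kappa. A minimal sketch of the unweighted version for two protocols rating the same patients, with hypothetical risk labels (the study's actual classifications are not reproduced here):

```python
# Unweighted Cohen's kappa between two raters. The label lists below are
# hypothetical examples, not data from the study.
from collections import Counter

def cohens_kappa(ratings_a, ratings_b):
    """Chance-corrected agreement between two equal-length rating lists."""
    assert len(ratings_a) == len(ratings_b) and ratings_a
    n = len(ratings_a)
    observed = sum(a == b for a, b in zip(ratings_a, ratings_b)) / n
    freq_a, freq_b = Counter(ratings_a), Counter(ratings_b)
    expected = sum(freq_a[c] * freq_b[c] for c in freq_a) / (n * n)
    return (observed - expected) / (1 - expected)

protocol_1 = ["low", "low", "moderate", "high", "high", "moderate"]
protocol_2 = ["low", "moderate", "moderate", "high", "high", "high"]
print(round(cohens_kappa(protocol_1, protocol_2), 3))  # prints 0.5
```

A kappa of 1.0 indicates perfect agreement; values around 0.41-0.60 are conventionally read as "moderate", matching the wording used in the abstract.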

  17. A Calculus for Control Flow Analysis of Security Protocols

    DEFF Research Database (Denmark)

    Buchholtz, Mikael; Nielson, Hanne Riis; Nielson, Flemming

    2004-01-01

    The design of a process calculus for analysing security protocols is governed by three factors: how to express the security protocol in a precise and faithful manner, how to accommodate the variety of attack scenarios, and how to utilise the strengths (and limit the weaknesses) of the underlying...... analysis methodology. We pursue an analysis methodology based on control flow analysis in flow logic style and we have previously shown its ability to analyse a variety of security protocols. This paper develops a calculus, LysaNS, that allows for much greater control and clarity in the description......

  18. Diplomacy and Diplomatic Protocol

    Directory of Open Access Journals (Sweden)

    Lect. Ph.D Oana Iucu

    2008-12-01

    Full Text Available The present study aims to observe the relationships and determining factors between diplomacy and diplomatic protocol as outlined by historical and contextual analyses. The approach is very dynamic, provided that concepts are able to show their richness, antiquity and polyvalence at the level of connotations, semantics, grammatical and social syntax. The fact that this information is up to date determines an attitude of appreciation and a state of positive contamination.

  19. Protocol voor meting van lachgasemissie uit huisvestingssystemen in de veehouderij 2010 = Measurement protocol for nitrous oxide emission from housing systems in livestock production 2010

    NARCIS (Netherlands)

    Mosquera Losada, J.; Groenestein, C.M.; Ogink, N.W.M.

    2011-01-01

    This report describes a measurement protocol for nitrous oxide emissions from animal housing systems. The protocol is based on sampling periods of 24 hours spread over one year and can be applied in specified animal categories.

  20. Data validation report for the 100-HR-3 Operable Unit first quarter 1994 groundwater sampling data

    Energy Technology Data Exchange (ETDEWEB)

    Biggerstaff, R.L.

    1994-06-24

    Westinghouse-Hanford has requested that a minimum of 20% of the total number of Sample Delivery Groups be validated for the 100-HR-3 Operable Unit First Quarter 1994 Groundwater Sampling Investigation. Therefore, the data from the chemical analysis of twenty-four samples from this sampling event and their related quality assurance samples were reviewed and validated to verify that reported sample results were of sufficient quality to support decisions regarding remedial actions performed at this site. The samples were analyzed by Thermo-Analytic Laboratories (TMA) and Roy F. Weston Laboratories (WESTON) using US Environmental Protection Agency (EPA) CLP protocols. Sample analyses included: inorganics; and general chemical parameters. Forty-two samples were validated for radiochemical parameters by TMA and Teledyne.

  1. Data validation report for the 100-HR-3 Operable Unit first quarter 1994 groundwater sampling data

    International Nuclear Information System (INIS)

    Biggerstaff, R.L.

    1994-01-01

    Westinghouse-Hanford has requested that a minimum of 20% of the total number of Sample Delivery Groups be validated for the 100-HR-3 Operable Unit First Quarter 1994 Groundwater Sampling Investigation. Therefore, the data from the chemical analysis of twenty-four samples from this sampling event and their related quality assurance samples were reviewed and validated to verify that reported sample results were of sufficient quality to support decisions regarding remedial actions performed at this site. The samples were analyzed by Thermo-Analytic Laboratories (TMA) and Roy F. Weston Laboratories (WESTON) using US Environmental Protection Agency (EPA) CLP protocols. Sample analyses included: inorganics; and general chemical parameters. Forty-two samples were validated for radiochemical parameters by TMA and Teledyne

  2. Decellularization of placentas: establishing a protocol

    Directory of Open Access Journals (Sweden)

    L.C.P.C. Leonel

    2017-11-01

    Full Text Available Biological biomaterials for tissue engineering purposes can be produced through tissue and/or organ decellularization. The remaining extracellular matrix (ECM) must be acellular and preserve its proteins and physical features. Placentas are organs of great interest because they are discarded after birth and present large amounts of ECM. Protocols for decellularization are tissue-specific and have not yet been established for canine placentas. This study aimed at analyzing a favorable method for decellularization of maternal and fetal portions of canine placentas. Canine placentas were subjected to ten preliminary tests to analyze the efficacy of parameters such as the type of detergents, freezing temperatures and perfusion. Two protocols were chosen for further analyses using histology, scanning electron microscopy, immunofluorescence and DNA quantification. Sodium dodecyl sulfate (SDS) was the most effective detergent for cell removal. Freezing placentas before decellularization required longer periods of incubation in different detergents. Both perfusion and immersion methods were capable of removing cells. Placentas decellularized using Protocol I (1% SDS, 5 mM EDTA, 50 mM TRIS, and 0.5% antibiotic) preserved the ECM structure better, but were less efficient at removing cells and DNA content from the ECM than those processed with Protocol II (1% SDS, 5 mM EDTA, 0.05% trypsin, and 0.5% antibiotic).

  3. Privacy-Preserving Verifiability: A Case for an Electronic Exam Protocol

    DEFF Research Database (Denmark)

    Giustolisi, Rosario; Iovino, Vincenzo; Lenzini, Gabriele

    2017-01-01

    We introduce the notion of privacy-preserving verifiability for security protocols. It holds when a protocol admits a verifiability test that does not reveal, to the verifier that runs it, more pieces of information about the protocol’s execution than those required to run the test. Our definition...... of privacy-preserving verifiability is general and applies to cryptographic protocols as well as to human security protocols. In this paper we exemplify it in the domain of e-exams. We prove that the notion is meaningful by studying an existing exam protocol that is verifiable but whose verifiability tests...... are not privacy-preserving. We prove that the notion is applicable: we review the protocol using functional encryption so that it admits a verifiability test that preserves privacy according to our definition. We analyse, in ProVerif, that the verifiability holds despite malicious parties and that the new...

  4. Continuous sampling from distributed streams

    DEFF Research Database (Denmark)

    Graham, Cormode; Muthukrishnan, S.; Yi, Ke

    2012-01-01

    A fundamental problem in data management is to draw and maintain a sample of a large data set, for approximate query answering, selectivity estimation, and query planning. With large, streaming data sets, this problem becomes particularly difficult when the data is shared across multiple distributed sites. The main challenge is to ensure that a sample is drawn uniformly across the union of the data while minimizing the communication needed to run the protocol on the evolving data. At the same time, it is also necessary to make the protocol lightweight, by keeping the space and time costs low for each participant. In this article, we present communication-efficient protocols for continuously maintaining a sample (both with and without replacement) from k distributed streams. These apply to the case when we want a sample from the full streams, and to the sliding window cases of only the W most...
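The distributed protocols summarised here build on the classic single-stream reservoir sample. As background, a minimal sketch of that uniform-sampling primitive (without replacement, fixed sample size k); this is the textbook building block, not the authors' distributed algorithm:

```python
# Classic reservoir sampling: maintain a uniform sample of size k over a
# stream of unknown length, one pass, O(k) space. Here k is the sample
# size (the record above uses k for the number of streams).
import random

def reservoir_sample(stream, k, rng=random):
    """Return a uniform without-replacement sample of k items from the stream."""
    sample = []
    for i, item in enumerate(stream):
        if i < k:
            sample.append(item)           # fill the reservoir first
        else:
            j = rng.randrange(i + 1)      # keep item with probability k/(i+1)
            if j < k:
                sample[j] = item
    return sample

print(reservoir_sample(range(1000), 5))
```

Each arriving item displaces a random reservoir slot with probability k/(i+1), which keeps every prefix item equally likely to survive; the distributed setting adds the harder problem of coordinating such samples across sites with low communication.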

  5. A simplified and cost-effective enrichment protocol for the isolation of Campylobacter spp. from retail broiler meat without microaerobic incubation

    LENUS (Irish Health Repository)

    Zhou, Ping

    2011-08-03

    Background: To simplify the methodology for the isolation of Campylobacter spp. from retail broiler meat, we evaluated 108 samples (breasts and thighs) using an unpaired sample design. The enrichment broths were incubated under aerobic conditions (subsamples A) and, for comparison, under microaerobic conditions (subsamples M) as recommended by current reference protocols. Sensors were used to measure the dissolved oxygen (DO) in the broth and the percentage of oxygen (O2) in the head space of the bags used for enrichment. Campylobacter isolates were identified with multiplex PCR assays and typed using pulsed-field gel electrophoresis (PFGE). Ribosomal intergenic spacer analyses (RISA) and denaturing gradient gel electrophoresis (DGGE) were used to study the bacterial communities of subsamples M and A after 48 h enrichment. Results: The number of Campylobacter positive subsamples was similar for A and M when all samples were combined (P = 0.81) and when samples were analyzed by product (breast: P = 0.75; thigh: P = 1.00). Oxygen sensors showed that DO values in the broth were around 6 ppm and O2 values in the head space were 14-16% throughout incubation. PFGE demonstrated high genomic similarity of isolates in the majority of the samples in which isolates were obtained from subsamples A and M. RISA and DGGE results showed a large variability in the bacterial populations that could be attributed to sample-to-sample variations and not enrichment conditions (aerobic or microaerobic). These data also suggested that current sampling protocols are not optimized to determine the true number of Campylobacter positive samples in retail broiler meat. Conclusions: Decreased DO in enrichment broths is naturally achieved. This simplified, cost-effective enrichment protocol with aerobic incubation could be incorporated into reference methods for the isolation of Campylobacter spp. from retail broiler meat.

  6. Comparison of dose and image quality in abdominal CT protocols using high and low kVp

    International Nuclear Information System (INIS)

    Mas Munoz, I.; Alejo Luque, L.; Corredoira Silva, E.; Sanchez Munoz, F. J.; Serrada Hierro, A.

    2013-01-01

    This paper quantitatively compares a low-kVp abdominal protocol with the conventional abdominal protocol, analysing image quality with objective physical parameters and calculating the corresponding phantom dose reduction. (Author)

  7. Fully Automated Trimethylsilyl (TMS) Derivatisation Protocol for Metabolite Profiling by GC-MS

    Directory of Open Access Journals (Sweden)

    Erica Zarate

    2016-12-01

    Full Text Available Gas Chromatography-Mass Spectrometry (GC-MS) has long been used for metabolite profiling of a wide range of biological samples. Many derivatisation protocols are already available and among these, trimethylsilyl (TMS) derivatisation is one of the most widely used in metabolomics. However, most TMS methods rely on off-line derivatisation prior to GC-MS analysis. In the case of manual off-line TMS derivatisation, the derivative created is unstable, so reduction in recoveries occurs over time. Thus, derivatisation is carried out in small batches. Here, we present a fully automated TMS derivatisation protocol using robotic autosamplers and we also evaluate a commercial software, Maestro, available from Gerstel GmbH. Because of automation, there was no waiting time of derivatised samples on the autosamplers, thus reducing degradation of unstable metabolites. Moreover, this method allowed us to overlap samples and improved throughput. We compared data obtained from both manual and automated TMS methods performed on three different matrices, including a standard mix, wine, and plasma samples. The automated TMS method showed better reproducibility and higher peak intensity for most of the identified metabolites than the manual derivatisation method. We also validated the automated method using 114 quality control plasma samples. Additionally, we showed that this online method was highly reproducible for most of the metabolites detected and identified (RSD < 20%) and specifically achieved excellent results for sugars, sugar alcohols, and some organic acids. To the very best of our knowledge, this is the first time that the automated TMS method has been applied to analyse a large number of complex plasma samples. Furthermore, we found that this method was highly applicable for routine metabolite profiling (both targeted and untargeted) in any metabolomics laboratory.
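The reproducibility criterion quoted in this record (RSD below 20%) is a relative standard deviation across repeated quality-control measurements. A minimal sketch with hypothetical peak areas, not data from the study:

```python
# Relative standard deviation (RSD), the standard metabolomics QC metric.
# The peak areas below are hypothetical.
from statistics import mean, stdev

def relative_std_dev(values):
    """RSD in percent: 100 * sample standard deviation / mean."""
    return 100.0 * stdev(values) / mean(values)

peak_areas = [1.02e6, 0.98e6, 1.05e6, 0.97e6, 1.01e6]  # repeated QC injections
rsd = relative_std_dev(peak_areas)
print(round(rsd, 1), rsd < 20.0)  # prints 3.2 True
```

A metabolite whose QC injections give an RSD under the 20% threshold would count as reproducibly measured under the criterion the authors apply.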

  8. A distance limited method for sampling downed coarse woody debris

    Science.gov (United States)

    Jeffrey H. Gove; Mark J. Ducey; Harry T. Valentine; Michael S. Williams

    2012-01-01

    A new sampling method for downed coarse woody debris is proposed based on limiting the perpendicular distance from individual pieces to a randomly chosen sample point. Two approaches are presented that allow different protocols to be used to determine field measurements; estimators for each protocol are also developed. Both protocols are compared via simulation against...

  9. Soil Gas Sample Handling: Evaluation of Water Removal and Sample Ganging

    Energy Technology Data Exchange (ETDEWEB)

    Fritz, Brad G. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Abrecht, David G. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Hayes, James C. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Mendoza, Donaldo P. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States)

    2016-10-31

    Soil gas sampling is currently conducted in support of Nuclear Test Ban treaty verification. Soil gas samples are collected and analyzed for isotopes of interest. Some issues that can impact sampling and analysis of these samples are excess moisture and sample processing time. Here we discuss three potential improvements to the current sampling protocol: a desiccant for water removal, use of molecular sieve to remove CO2 from the sample during collection, and a ganging manifold to allow composite analysis of multiple samples.

  10. ISS protocol for EPR tooth dosimetry

    International Nuclear Information System (INIS)

    Onori, S.; Aragno, D.; Fattibene, P.; Petetti, E.; Pressello, M.C.

    2000-01-01

    The accuracy of Electron Paramagnetic Resonance (EPR) dose reconstruction with tooth enamel is affected by sample preparation, dosimetric signal amplitude evaluation and unknown dose estimation. Worldwide efforts in the field of EPR dose reconstruction with tooth enamel are focused on the optimization of the three mentioned steps in dose assessment. In the present work, the protocol implemented at ISS in the framework of the European Community Nuclear Fission Safety project 'Dose Reconstruction' is presented. A combined mechanical-chemical procedure for ground enamel sample preparation is used. The signal intensity evaluation is carried out with a powder spectra simulation program. Finally, the unknown dose is evaluated individually for each sample with the additive dose method. The unknown dose is obtained by subtracting a mean native dose from the back-extrapolated dose. As an example of the capability of the ISS protocol in unknown dose evaluation, the results obtained in the framework of the 2nd International Intercomparison on EPR tooth enamel dosimetry are reported

  11. Protocol for Measuring the Thermal Properties of a Supercooled Synthetic Sand-water-gas-methane Hydrate Sample.

    Science.gov (United States)

    Muraoka, Michihiro; Susuki, Naoko; Yamaguchi, Hiroko; Tsuji, Tomoya; Yamamoto, Yoshitaka

    2016-03-21

    Methane hydrates (MHs) are present in large amounts in the ocean floor and permafrost regions. Methane and hydrogen hydrates are being studied as future energy resources and energy storage media. To develop a method for gas production from natural MH-bearing sediments and hydrate-based technologies, it is imperative to understand the thermal properties of gas hydrates. The thermal properties' measurements of samples comprising sand, water, methane, and MH are difficult because the melting heat of MH may affect the measurements. To solve this problem, we performed thermal properties' measurements at supercooled conditions during MH formation. The measurement protocol, calculation method of the saturation change, and tips for thermal constants' analysis of the sample using transient plane source techniques are described here. The effect of the formation heat of MH on measurement is very small because the gas hydrate formation rate is very slow. This measurement method can be applied to the thermal properties of the gas hydrate-water-guest gas system, which contains hydrogen, CO2, and ozone hydrates, because the characteristic low formation rate of gas hydrate is not unique to MH. The key point of this method is the low rate of phase transition of the target material. Hence, this method may be applied to other materials having low phase-transition rates.

  12. Comparison of PIXE and XRF analysis of airborne particulate matter samples collected on Teflon and quartz fibre filters

    Science.gov (United States)

    Chiari, M.; Yubero, E.; Calzolai, G.; Lucarelli, F.; Crespo, J.; Galindo, N.; Nicolás, J. F.; Giannoni, M.; Nava, S.

    2018-02-01

    Within the framework of research projects focusing on the sampling and analysis of airborne particulate matter, Particle Induced X-ray Emission (PIXE) and Energy Dispersive X-ray Fluorescence (ED-XRF) techniques are routinely used in many laboratories throughout the world to determine the elemental concentration of the particulate matter samples. In this work an inter-laboratory comparison of the results obtained from analysing several samples (collected on both Teflon and quartz fibre filters) using both techniques is presented. The samples were analysed by PIXE (in Florence, at the 3 MV Tandetron accelerator of INFN-LABEC laboratory) and by XRF (in Elche, using the ARL Quant'X EDXRF spectrometer with specific conditions optimized for specific groups of elements). The results from the two sets of measurements are in good agreement for all the analysed samples, thus validating the use of the ARL Quant'X EDXRF spectrometer and the selected measurement protocol for the analysis of aerosol samples. Moreover, thanks to the comparison of PIXE and XRF results on Teflon and quartz fibre filters, possible self-absorption effects due to the penetration of the aerosol particles inside the quartz fibre-filters were quantified.

  13. Sample Preparation Strategies for the Effective Quantitation of Hydrophilic Metabolites in Serum by Multi-Targeted HILIC-MS/MS

    Directory of Open Access Journals (Sweden)

    Elisavet Tsakelidou

    2017-03-01

    Full Text Available The effect of endogenous interferences of serum in multi-targeted metabolite profiling HILIC-MS/MS analysis was investigated by studying different sample preparation procedures. A modified QuEChERS dispersive SPE protocol, a HybridSPE protocol, and a combination of liquid extraction with protein precipitation were compared to a simple protein precipitation. Evaluation of extraction efficiency and sample clean-up was performed for all methods. SPE sorbent materials tested were found to retain hydrophilic analytes together with endogenous interferences, thus additional elution steps were needed. Liquid extraction was not shown to minimise matrix effects. In general, it was observed that a balance should be reached in terms of recovery, efficient clean-up, and sample treatment time when a wide range of metabolites are analysed. A quick step for removing phospholipids prior to the determination of hydrophilic endogenous metabolites is required, however, based on the results from the applied methods, further studies are needed to achieve high recoveries for all metabolites.

  14. EMS Adherence to a Pre-hospital Cervical Spine Clearance Protocol

    Directory of Open Access Journals (Sweden)

    Johnson, David

    2001-10-01

    Full Text Available Purpose: To determine the degree of adherence to a cervical spine (c-spine) clearance protocol by pre-hospital Emergency Medical Services (EMS) personnel by both self-assessment and receiving hospital assessment, to describe deviations from the protocol, and to determine if the rate of compliance by paramedic self-assessment differed from receiving hospital assessment. Methods: A retrospective sample of pre-hospital (consecutive series) and receiving hospital (convenience sample) assessments of the compliance with and appropriateness of c-spine immobilization. The c-spine clearance protocol was implemented for Orange County EMS just prior to the April-November 1999 data collection period. Results: We collected 396 pre-hospital and 162 receiving hospital data forms. From the pre-hospital data sheets, the percentage deviation from the protocol was 4.0% (16/396). Only one out of 16 cases that did not comply with the protocol was due to over-immobilization (0.2%). The remaining 15 cases were under-immobilized, according to protocol. Nine of the under-immobilized cases (66%) that should have been placed in c-spine precautions met physical assessment criteria in the protocol, while the other five cases met mechanism of injury criteria. The rate of deviations from protocol did not differ over time. The receiving hospital identified 8.0% (13/162; 6/13 over-immobilized, 7/13 under-immobilized) of patients with deviations from the protocol; none was determined to have actual c-spine injury. Conclusion: The implementation of a pre-hospital c-spine clearance protocol in Orange County was associated with a moderate overall adherence rate (96% from the pre-hospital perspective and 92% from the hospital perspective, p=.08 for the two evaluation methods). Most patients who deviated from protocol were under-immobilized, but no c-spine injuries were missed. The rate of over-immobilization was better than previously reported, implying a saving of resources.
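One plausible reading of the p=.08 comparison between the two adherence rates (96% of 396 pre-hospital vs. 92% of 162 hospital assessments) is a two-proportion test. The sketch below reconstructs hypothetical counts from the reported percentages (380/396 vs. 149/162) and applies a continuity-corrected pooled z-test; both the counts and the choice of test are assumptions, and the resulting p-value only approximates the reported figure.

```python
# Two-sided pooled two-proportion z-test with a Yates-style continuity
# correction. Counts reconstructed from the reported percentages; the
# paper does not state which test it used, so this is illustrative only.
from math import sqrt, erf

def two_proportion_p(x1, n1, x2, n2):
    """Approximate two-sided p-value for H0: the two proportions are equal."""
    p1, p2 = x1 / n1, x2 / n2
    pooled = (x1 + x2) / (n1 + n2)
    se = sqrt(pooled * (1 - pooled) * (1 / n1 + 1 / n2))
    cc = 0.5 * (1 / n1 + 1 / n2)            # continuity correction
    z = max(abs(p1 - p2) - cc, 0.0) / se
    return 2 * (1 - 0.5 * (1 + erf(z / sqrt(2))))

print(two_proportion_p(380, 396, 149, 162))  # near, but not exactly, .08
```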

  15. An optimised protocol for molecular identification of Eimeria from chickens

    Science.gov (United States)

    Kumar, Saroj; Garg, Rajat; Moftah, Abdalgader; Clark, Emily L.; Macdonald, Sarah E.; Chaudhry, Abdul S.; Sparagano, Olivier; Banerjee, Partha S.; Kundu, Krishnendu; Tomley, Fiona M.; Blake, Damer P.

    2014-01-01

    Molecular approaches supporting identification of Eimeria parasites infecting chickens have been available for more than 20 years, although they have largely failed to replace traditional measures such as microscopy and pathology. Limitations of microscopy-led diagnostics, including a requirement for specialist parasitological expertise and low sample throughput, are yet to be outweighed by the difficulties associated with accessing genomic DNA from environmental Eimeria samples. A key step towards the use of Eimeria species-specific PCR as a sensitive and reproducible discriminatory tool for use in the field is the production of a standardised protocol that includes sample collection and DNA template preparation, as well as primer selection from the numerous PCR assays now published. Such a protocol will facilitate development of valuable epidemiological datasets which may be easily compared between studies and laboratories. The outcome of an optimisation process undertaken in laboratories in India and the UK is described here, identifying four steps. First, samples were collected into a 2% (w/v) potassium dichromate solution. Second, oocysts were enriched by flotation in saturated saline. Third, genomic DNA was extracted using a QIAamp DNA Stool mini kit protocol including a mechanical homogenisation step. Finally, nested PCR was carried out using previously published primers targeting the internal transcribed spacer region 1 (ITS-1). Alternative methods tested included sample processing in the presence of faecal material, DNA extraction using a traditional phenol/chloroform protocol, the use of SCAR multiplex PCR (one tube and two tube versions) and speciation using the morphometric tool COCCIMORPH for the first time with field samples. PMID:24138724

  16. Time Clustered Sampling Can Inflate the Inferred Substitution Rate in Foot-And-Mouth Disease Virus Analyses.

    Science.gov (United States)

    Pedersen, Casper-Emil T; Frandsen, Peter; Wekesa, Sabenzia N; Heller, Rasmus; Sangula, Abraham K; Wadsworth, Jemma; Knowles, Nick J; Muwanika, Vincent B; Siegismund, Hans R

    2015-01-01

    With the emergence of analytical software for the inference of viral evolution, a number of studies have focused on estimating important parameters such as the substitution rate and the time to the most recent common ancestor (tMRCA) for rapidly evolving viruses. Coupled with an increasing abundance of sequence data sampled under widely different schemes, an effort to keep results consistent and comparable is needed. This study emphasizes commonly disregarded problems in the inference of evolutionary rates in viral sequence data when sampling is unevenly distributed on a temporal scale, through a study of the foot-and-mouth disease (FMD) virus serotypes SAT 1 and SAT 2. Our study shows that clustered temporal sampling in phylogenetic analyses of FMD viruses will strongly bias the inferences of substitution rates and tMRCA because the inferred rates in such data sets reflect a rate closer to the mutation rate rather than the substitution rate. Estimating evolutionary parameters from viral sequences should be performed with due consideration of the differences in short-term and longer-term evolutionary processes occurring within sets of temporally sampled viruses, and studies should carefully consider how samples are combined.

  17. Methodological effects in Fourier transform infrared (FTIR) spectroscopy: Implications for structural analyses of biomacromolecular samples

    Science.gov (United States)

    Kamnev, Alexander A.; Tugarova, Anna V.; Dyatlova, Yulia A.; Tarantilis, Petros A.; Grigoryeva, Olga P.; Fainleib, Alexander M.; De Luca, Stefania

    2018-03-01

    A set of experimental data obtained by Fourier transform infrared (FTIR) spectroscopy (involving the use of samples ground and pressed with KBr, i.e. in a polar halide matrix) and by matrix-free transmission FTIR or diffuse reflectance infrared Fourier transform (DRIFT) spectroscopic methodologies (involving measurements of thin films or pure powdered samples, respectively) were compared for several different biomacromolecular substances. The samples under study included poly-3-hydroxybutyrate (PHB) isolated from cell biomass of the rhizobacterium Azospirillum brasilense; dry PHB-containing A. brasilense biomass; pectin (natural carboxylated heteropolysaccharide of plant origin; obtained from apple peel) as well as its chemically modified derivatives obtained by partial esterification of its galacturonide-chain hydroxyl moieties with palmitic, oleic and linoleic acids. Significant shifts of some FTIR vibrational bands related to polar functional groups of all the biomacromolecules under study, induced by the halide matrix used for preparing the samples for spectroscopic measurements, were shown and discussed. A polar halide matrix used for preparing samples for FTIR measurements was shown to be likely to affect band positions not only per se, by affecting band energies or via ion exchange (e.g., with carboxylate moieties), but also by inducing crystallisation of metastable amorphous biopolymers (e.g., PHB of microbial origin). The results obtained have important implications for correct structural analyses of polar, H-bonded and/or amphiphilic biomacromolecular systems using different methodologies of FTIR spectroscopy.

  18. RNA extraction from decaying wood for (meta)transcriptomic analyses.

    Science.gov (United States)

    Adamo, Martino; Voyron, Samuele; Girlanda, Mariangela; Marmeisse, Roland

    2017-10-01

    Wood decomposition is a key step of the terrestrial carbon cycle and is of economic importance. It is essentially a microbiological process performed by fungi and, to an unknown extent, by bacteria. To gain access to the genes expressed by the diverse microbial communities participating in wood decay, we developed an RNA extraction protocol for this recalcitrant material rich in polysaccharides and phenolic compounds. This protocol was implemented on 22 wood samples representing as many tree species from 11 plant families in the Angiosperms and Gymnosperms. RNA was successfully extracted from all samples and converted into cDNAs, from which both fungal and bacterial protein-coding genes were amplified, including genes encoding hydrolytic enzymes participating in lignocellulose hydrolysis. This protocol, applicable to a wide range of decomposing wood types, represents a first step towards a metatranscriptomic analysis of wood degradation under natural conditions.

  19. Ad-Hoc vs. Standardized and Optimized Arthropod Diversity Sampling

    Directory of Open Access Journals (Sweden)

    Pedro Cardoso

    2009-09-01

    The use of standardized and optimized protocols has recently been advocated for different arthropod taxa instead of ad-hoc sampling or sampling with protocols defined on a case-by-case basis. We present a comparison of both sampling approaches applied to spiders in a natural area of Portugal. Tests were made of their efficiency, over-collection of common species, singleton proportions, species abundance distributions, average specimen size, average taxonomic distinctness and behavior of richness estimators. The standardized protocol revealed three main advantages: (1) higher efficiency; (2) more reliable estimations of true richness; and (3) meaningful comparisons between undersampled areas.
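The abstract mentions the behavior of richness estimators; Chao1 is one widely used nonparametric richness estimator (the abstract does not name the specific estimators used, so this is an illustrative sketch, not the study's method):

```python
def chao1(abundances):
    """Chao1 nonparametric species-richness estimate from per-species counts.

    S_chao1 = S_obs + F1^2 / (2 * F2), where F1 and F2 are the numbers of
    species observed exactly once (singletons) and twice (doubletons).
    When F2 == 0 the bias-corrected form S_obs + F1*(F1 - 1)/2 is used.
    """
    s_obs = sum(1 for a in abundances if a > 0)
    f1 = sum(1 for a in abundances if a == 1)  # singletons
    f2 = sum(1 for a in abundances if a == 2)  # doubletons
    if f2 > 0:
        return s_obs + f1 * f1 / (2 * f2)
    return s_obs + f1 * (f1 - 1) / 2

# Hypothetical spider counts: 5 species observed, 2 singletons, 1 doubleton.
print(chao1([4, 1, 1, 2, 7]))  # 7.0
```

The estimate grows with the singleton fraction, which is why over-collection of common species and singleton proportions (both compared in the study) directly affect how such estimators behave.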

  20. Preparation of peat samples for inorganic geochemistry used as palaeoenvironmental proxies

    Directory of Open Access Journals (Sweden)

    G. Le Roux

    2010-07-01

    This article provides a brief review of protocols used in peat inorganic geochemistry. We emphasise the key issues that could lead to inter-comparison problems. For each section (drying, grinding, non-destructive analyses, acid digestions and destructive analyses), recommendations are provided to guide the reader through an idealised protocol, which is the only workable approach for studies incorporating long-term comparisons.

  1. Teaching Integrity in Empirical Research: A Protocol for Documenting Data Management and Analysis

    Science.gov (United States)

    Ball, Richard; Medeiros, Norm

    2012-01-01

    This article describes a protocol the authors developed for teaching undergraduates to document their statistical analyses for empirical research projects so that their results are completely reproducible and verifiable. The protocol is guided by the principle that the documentation prepared to accompany an empirical research project should be…

  2. Sample heterogeneity in unipolar depression as assessed by functional connectivity analyses is dominated by general disease effects.

    Science.gov (United States)

    Feder, Stephan; Sundermann, Benedikt; Wersching, Heike; Teuber, Anja; Kugel, Harald; Teismann, Henning; Heindel, Walter; Berger, Klaus; Pfleiderer, Bettina

    2017-11-01

    Combinations of resting-state fMRI and machine-learning techniques are increasingly employed to develop diagnostic models for mental disorders. However, little is known about the neurobiological heterogeneity of depression and diagnostic machine learning has mainly been tested in homogeneous samples. Our main objective was to explore the inherent structure of a diverse unipolar depression sample. The secondary objective was to assess, if such information can improve diagnostic classification. We analyzed data from 360 patients with unipolar depression and 360 non-depressed population controls, who were subdivided into two independent subsets. Cluster analyses (unsupervised learning) of functional connectivity were used to generate hypotheses about potential patient subgroups from the first subset. The relationship of clusters with demographic and clinical measures was assessed. Subsequently, diagnostic classifiers (supervised learning), which incorporated information about these putative depression subgroups, were trained. Exploratory cluster analyses revealed two weakly separable subgroups of depressed patients. These subgroups differed in the average duration of depression and in the proportion of patients with concurrently severe depression and anxiety symptoms. The diagnostic classification models performed at chance level. It remains unresolved, if subgroups represent distinct biological subtypes, variability of continuous clinical variables or in part an overfitting of sparsely structured data. Functional connectivity in unipolar depression is associated with general disease effects. Cluster analyses provide hypotheses about potential depression subtypes. Diagnostic models did not benefit from this additional information regarding heterogeneity. Copyright © 2017 Elsevier B.V. All rights reserved.
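The first, unsupervised stage of the design described above can be sketched with a toy k-means step; the synthetic "connectivity features" and the choice of plain k-means are assumptions for illustration, not the study's actual pipeline:

```python
import numpy as np

rng = np.random.default_rng(0)
# Synthetic stand-in for functional-connectivity features of two putative
# patient subgroups (toy data; the study used resting-state fMRI measures).
X = np.vstack([rng.normal(0.0, 1.0, (40, 5)),
               rng.normal(2.0, 1.0, (40, 5))])

def kmeans(X, k=2, iters=50):
    """Plain k-means with deterministic spread initialisation: assign each
    point to its nearest centroid, then recompute centroids as cluster means."""
    idx = np.linspace(0, len(X) - 1, k).astype(int)
    centroids = X[idx].copy()
    for _ in range(iters):
        dists = np.linalg.norm(X[:, None, :] - centroids[None, :, :], axis=2)
        labels = dists.argmin(axis=1)
        for j in range(k):
            if np.any(labels == j):
                centroids[j] = X[labels == j].mean(axis=0)
    return labels, centroids

labels, centroids = kmeans(X)  # cluster ids could then feed a classifier
```

In the study's second stage, the resulting cluster memberships were added as information to supervised diagnostic classifiers; as the abstract reports, that extra information did not improve classification.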

  3. Formal Security Analysis of the MaCAN Protocol

    DEFF Research Database (Denmark)

    Bruni, Alessandro; Sojka, Michal; Nielson, Flemming

    2014-01-01

    analysis identifies two flaws in the original protocol: one creates unavailability concerns during key establishment, and the other allows re-using authenticated signals for different purposes. We propose and analyse a modification that improves its behaviour while fitting the constraints of CAN bus...

  4. Protocols for pressure ulcer prevention: are they evidence-based?

    Science.gov (United States)

    Chaves, Lidice M; Grypdonck, Mieke H F; Defloor, Tom

    2010-03-01

    This paper reports a study to determine the quality of protocols for pressure ulcer prevention in home care in the Netherlands. If pressure ulcer prevention protocols are evidence-based and practitioners use them correctly in practice, this will result in a reduction in pressure ulcers. Very little is known about the evidence-based content and quality of pressure ulcer prevention protocols. In 2008, current pressure ulcer prevention protocols from 24 home-care agencies in the Netherlands were evaluated. A checklist developed and validated by two pressure ulcer prevention experts was used to assess the quality of the protocols, and weighted and unweighted quality scores were computed and analysed using descriptive statistics. The 24 pressure ulcer prevention protocols had a mean weighted quality score of 63.38 points out of a maximum of 100 (sd 5). The importance of observing the skin at the pressure points at least once a day was emphasized in 75% of the protocols. Only 42% correctly warned against the use of materials that were 'less effective or that could potentially cause harm'. Pressure ulcer prevention commands a reasonable amount of attention in home care, but the incidence of pressure ulcers and the lack of a consistent, standardized document for use in actual practice indicate a need for systematic implementation of national pressure ulcer prevention standards in the Netherlands to ensure adherence to the established protocols.

  5. Tailored two-photon correlation and fair-sampling: a cautionary tale

    Science.gov (United States)

    Romero, J.; Giovannini, D.; Tasca, D. S.; Barnett, S. M.; Padgett, M. J.

    2013-08-01

    We demonstrate an experimental test of the Clauser-Horne-Shimony-Holt (CHSH) Bell inequality which seemingly exhibits correlations beyond the limits imposed by quantum mechanics. Inspired by the idea of Fourier synthesis, we design analysers that measure specific superpositions of orbital angular momentum (OAM) states, such that when one analyser is rotated with respect to the other, the resulting coincidence curves are similar to a square-wave. Calculating the CHSH Bell parameter, S, from these curves results in values beyond the Tsirelson bound of S_QM = 2√2. We obtain S = 3.99 ± 0.02, implying almost perfect nonlocal Popescu-Rohrlich correlations. This 'super-quantum' value of S is only possible because our experiment, subtly, does not comply with fair-sampling. The way our Bell test fails fair-sampling is not immediately obvious and requires knowledge of the states being measured. Our experiment highlights the caution needed in Bell-type experiments based on measurements within high-dimensional state spaces such as that of OAM, especially in the advent of device-independent quantum protocols.
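The arithmetic behind the 'super-quantum' S can be reproduced directly: with a sinusoidal coincidence curve at the standard CHSH analyser settings, S reaches the Tsirelson bound 2√2, whereas a square-wave curve of the kind engineered in this experiment pushes S to the Popescu-Rohrlich value of 4. A minimal sketch (the cos(2θ) dependence on the analyser angle difference is an assumption carried over from the polarization analogue):

```python
import math

def chsh_S(E, a=0.0, a2=math.pi / 4, b=math.pi / 8, b2=3 * math.pi / 8):
    """CHSH parameter S = E(a,b) - E(a,b') + E(a',b) + E(a',b') for a
    correlation function E depending only on the analyser angle difference."""
    return E(a - b) - E(a - b2) + E(a2 - b) + E(a2 - b2)

quantum = lambda t: math.cos(2 * t)                       # sinusoidal curve
square = lambda t: 1.0 if math.cos(2 * t) >= 0 else -1.0  # square-wave curve

S_qm = chsh_S(quantum)  # 2*sqrt(2) ~ 2.83, the Tsirelson bound
S_sq = chsh_S(square)   # 4.0, PR-box-like correlations
```

The measured S = 3.99 ± 0.02 sits essentially at the square-wave limit, which is exactly what a fair-sampling violation of this kind permits.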

  6. Zamak samples analyses using EDXRF

    Energy Technology Data Exchange (ETDEWEB)

    Assis, J.T. de; Lima, I.; Monin, V., E-mail: joaquim@iprj.uerj.b, E-mail: inaya@iprj.uerj.b, E-mail: monin@iprj.uerj.b [Universidade do Estado do Rio de Janeiro (UERJ), Nova Friburgo, RJ (Brazil). Inst. Politecnico. Dept. de Engenharia Mecanica e Energia; Anjos, M. dos; Lopes, R.T., E-mail: ricardo@lin.ufrj.b [Coordenacao dos Programas de Pos-graduacao de Engenharia (COPPE/UFRJ), Rio de Janeiro, RJ (Brazil). Lab. de Instrumentacao Nuclear; Alves, H., E-mail: marcelin@uerj.b, E-mail: haimon.dlafis@gmail.co [Universidade do Estado do Rio de Janeiro (UERJ), Rio de Janeiro, RJ (Brazil). Inst. de Fisica. Dept. de Fisica Aplicada e Termodinamica

    2009-07-01

    Zamak is a family of alloys with a zinc base and aluminium, magnesium and copper as alloying elements. Among non-ferrous metal alloys, Zamak is one of the most widely applied, owing to its physical and mechanical properties and its suitability for electrodeposition. It has good resistance to corrosion, traction, shock and wear. Its low melting point (approximately 400 deg C) allows greater durability of the mold, permitting larger production runs of cast parts. Zamak is used in many areas, for example to produce residential and industrial locks, construction and carpentry components, refrigerator hinges and so on. In some cases, however, the quality of these products is not very good; the problem may lie in the quality of the Zamak alloy purchased by the industries. One technique that can be used to investigate the quality of these alloys is energy dispersive X-ray fluorescence. In this paper we present results for eight samples of Zamak alloy obtained by this technique; it was possible to classify the Zamak alloys and to detect some irregularities in them. (author)

  7. Zamak samples analyses using EDXRF

    International Nuclear Information System (INIS)

    Assis, J.T. de; Lima, I.; Monin, V.; Anjos, M. dos; Lopes, R.T.; Alves, H.

    2009-01-01

    Zamak is a family of alloys with a zinc base and aluminium, magnesium and copper as alloying elements. Among non-ferrous metal alloys, Zamak is one of the most widely applied, owing to its physical and mechanical properties and its suitability for electrodeposition. It has good resistance to corrosion, traction, shock and wear. Its low melting point (approximately 400 deg C) allows greater durability of the mold, permitting larger production runs of cast parts. Zamak is used in many areas, for example to produce residential and industrial locks, construction and carpentry components, refrigerator hinges and so on. In some cases, however, the quality of these products is not very good; the problem may lie in the quality of the Zamak alloy purchased by the industries. One technique that can be used to investigate the quality of these alloys is energy dispersive X-ray fluorescence. In this paper we present results for eight samples of Zamak alloy obtained by this technique; it was possible to classify the Zamak alloys and to detect some irregularities in them. (author)

  8. Isolation of cancer cells by "in situ" microfluidic biofunctionalization protocols

    KAUST Repository

    De Vitis, Stefania; Matarise, Giuseppina; Pardeo, Francesca; Catalano, Rossella; Malara, Natalia Maria; Trunzo, Valentina; Tallerico, Rossana; Gentile, Francesco T.; Candeloro, Patrizio; Coluccio, Maria Laura; Massaro, Alessandro S.; Viglietto, Giuseppe; Carbone, Ennio; Kutter, Jö rg Peter; Perozziello, Gerardo; Di Fabrizio, Enzo M.

    2014-01-01

    The aim of this work is the development of a microfluidic immunosensor for the immobilization of cancer cells and their separation from healthy cells by using "in situ" microfluidic biofunctionalization protocols. These protocols allow antibodies to be linked to microfluidic device surfaces and can be used to study the interaction between cell membranes and biomolecules. Moreover, they allow analyses to be performed with high processing speed, small quantities of reagents and samples, short reaction times and low production costs. In this work the developed protocols were used in microfluidic devices for the isolation of cancer cells from heterogeneous blood samples by exploiting the binding of a specific antibody to an adhesion protein (EpCAM) overexpressed on tumor cell membranes. The presented biofunctionalization protocols can be performed right before running the experiment: this allows a flexible platform in which biomolecules of interest can be linked to the device surface according to the user's needs. © 2014 Elsevier B.V. All rights reserved.

  9. Isolation of cancer cells by "in situ" microfluidic biofunctionalization protocols

    KAUST Repository

    De Vitis, Stefania

    2014-07-01

    The aim of this work is the development of a microfluidic immunosensor for the immobilization of cancer cells and their separation from healthy cells by using "in situ" microfluidic biofunctionalization protocols. These protocols allow antibodies to be linked to microfluidic device surfaces and can be used to study the interaction between cell membranes and biomolecules. Moreover, they allow analyses to be performed with high processing speed, small quantities of reagents and samples, short reaction times and low production costs. In this work the developed protocols were used in microfluidic devices for the isolation of cancer cells from heterogeneous blood samples by exploiting the binding of a specific antibody to an adhesion protein (EpCAM) overexpressed on tumor cell membranes. The presented biofunctionalization protocols can be performed right before running the experiment: this allows a flexible platform in which biomolecules of interest can be linked to the device surface according to the user's needs. © 2014 Elsevier B.V. All rights reserved.

  10. The challenge of multi-parameter hydrochemical, gas-physical, and isotopic analyses of in-situ clay pore water and samples from in-situ clay experiments

    International Nuclear Information System (INIS)

    Eichinger, L.; Lorenz, G.D.; Eichinger, F.; Wechner, S.; Voropaev, A.

    2012-01-01

    Document available in extended abstract form only. Within the research framework of natural clay rocks used as barriers for radioactive waste confinement, comprehensive analyses are mandatory to determine the chemical and isotopic composition of natural pore water and the gases dissolved therein, as well as of samples from distinct in-situ and lab experiments. Owing to the natural conditions, pore waters from low-permeability argillaceous rocks can be sampled only in small amounts over long time periods. Often those samples are primarily influenced by processes of the exploration and exploitation, such as contamination by drilling fluid and disinfection fluid or cement-water interactions. Sophisticated equipment for circulation experiments allows the sampling of gas and water in the original state in steel and PEEK cells. The challenge, though, is to optimise the lab equipment and measurement techniques in a way that the physical-chemical conditions of the water can be analysed in the original state. The development of special micro measuring cells enables the analysis of physical parameters like redox potential under very slow through-flow conditions. Additional analyses can follow subsequently without wasting any drop of the precious pore water. The gas composition is measured in equilibrated gas phases above water phases after emptying a defined volume by inert gas or through manual pressure. The analytical challenge is to obtain an extensive set of parameters which is considered representative of the in-situ conditions using only a few millilitres of water. The parameter analysis includes the determination of the composition of the water, the isotopic compositions of the water and the dissolved constituents, as well as their gas concentrations and isotopic signatures. So far the smallest sample volume needed for an analysis of a full set of parameters including the gas composition was 9 ml of water. 
Obviously, the analysis requires a highly sophisticated infrastructure and

  11. Publication trends of study protocols in rehabilitation.

    Science.gov (United States)

    Jesus, Tiago S; Colquhoun, Heather L

    2017-09-04

    Growing evidence points to the need to publish study protocols in the health field. The objective was to observe whether the growing interest in publishing study protocols in the broader health field has translated into increased publication of rehabilitation study protocols. Observational study using publication data and their indexing in PubMed. PubMed was searched with appropriate combinations of Medical Subject Headings up to December 2014. The effective presence of study protocols was manually screened. Regression models analyzed the yearly growth of publications. Two-sample z-tests analyzed whether the proportions of systematic reviews (SRs) and randomized controlled trials (RCTs) among study protocols differed from those of the same designs in the broader rehabilitation research. Up to December 2014, 746 publications of rehabilitation study protocols were identified, with exponential growth since 2005 (r2=0.981; p<0.001). RCT protocols were the most common among rehabilitation study protocols (83%), and RCTs were significantly more prevalent among study protocols than in the broader rehabilitation research (83% vs. 35.8%; p<0.001). For SRs, the picture was reversed: significantly less common among study protocols (2.8% vs. 9.3%; p<0.001). Funding was more often reported by rehabilitation study protocols than by the broader rehabilitation research (90% vs. 53.1%; p<0.001). Rehabilitation journals published a significantly lower share of rehabilitation study protocols than of the broader rehabilitation research (1.8% vs. 16.7%; p<0.001). Identifying the reasons for these discrepancies and reverting unwarranted disparities (e.g. the low rate of publication for rehabilitation SR protocols) are likely new avenues for rehabilitation research and its publication. 
SRs, particularly those aggregating RCT results, are considered the best standard of evidence to guide rehabilitation clinical practice; however, that standard can be improved
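The two-sample z-test of proportions named in the abstract can be sketched as follows; the abstract does not report the denominator for the broader rehabilitation research, so the 10,000 used below is a placeholder, and 619 (~83% of 746) is an inferred count:

```python
import math

def two_proportion_ztest(x1, n1, x2, n2):
    """Classic two-sample z-test for equality of two proportions,
    using the pooled estimate for the standard error."""
    p1, p2 = x1 / n1, x2 / n2
    pooled = (x1 + x2) / (n1 + n2)
    se = math.sqrt(pooled * (1 - pooled) * (1 / n1 + 1 / n2))
    z = (p1 - p2) / se
    # Two-sided p-value from the standard normal CDF (via math.erf).
    p_two_sided = 2 * (1 - 0.5 * (1 + math.erf(abs(z) / math.sqrt(2))))
    return z, p_two_sided

# RCTs among study protocols (~83% of 746) vs. among the broader
# rehabilitation research (35.8%; comparison n is a placeholder).
z, p = two_proportion_ztest(619, 746, 3580, 10000)
```

With a gap of roughly 47 percentage points between the two proportions, any plausible comparison denominator yields p far below 0.001, consistent with the abstract.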

  12. Total reflection x-ray fluorescence spectroscopy as a tool for evaluation of iron concentration in ferrofluids and yeast samples

    Energy Technology Data Exchange (ETDEWEB)

    Kulesh, N.A., E-mail: nikita.kulesh@urfu.ru [Ural Federal University, Mira 19, 620002 Ekaterinburg (Russian Federation); Novoselova, I.P. [Ural Federal University, Mira 19, 620002 Ekaterinburg (Russian Federation); Immanuel Kant Baltic Federal University, 236041 Kaliningrad (Russian Federation); Safronov, A.P. [Ural Federal University, Mira 19, 620002 Ekaterinburg (Russian Federation); Institute of Electrophysics UD RAS, Amundsen 106, 620016 Ekaterinburg (Russian Federation); Beketov, I.V.; Samatov, O.M. [Institute of Electrophysics UD RAS, Amundsen 106, 620016 Ekaterinburg (Russian Federation); Kurlyandskaya, G.V. [Ural Federal University, Mira 19, 620002 Ekaterinburg (Russian Federation); University of the Basque Country UPV-EHU, 48940 Leioa (Spain); Morozova, M. [Ural Federal University, Mira 19, 620002 Ekaterinburg (Russian Federation); Denisova, T.P. [Irkutsk State University, Karl Marks 1, 664003 Irkutsk (Russian Federation)

    2016-10-01

    In this study, total reflection x-ray fluorescence (TXRF) spectrometry was applied for the evaluation of iron concentration in ferrofluids and biological samples containing iron oxide magnetic nanoparticles obtained by the laser target evaporation technique. Suspensions of maghemite nanoparticles of different concentrations were used to estimate the limitations of the method for the evaluation of nanoparticle concentration in the range of 1–5000 ppm in the absence of an organic matrix. Samples of single-cell yeasts grown in nutrient media containing maghemite nanoparticles were used to study the nanoparticle absorption mechanism. The obtained results were analyzed in terms of the applicability of TXRF for quantitative analysis over a wide range of iron oxide nanoparticle concentrations for biological samples and ferrofluids, with a simple established protocol of specimen preparation. - Highlights: • Ferrofluid and yeast samples were analysed by TXRF spectroscopy. • A simple protocol for iron quantification by means of TXRF was proposed. • Results were combined with magnetic, structural, and morphological characterization. • A preliminary conclusion on the nanoparticle uptake mechanism was made.
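TXRF quantification is commonly performed against a spiked internal standard; a sketch of that calculation follows (the element choices, count numbers and sensitivity factor are hypothetical, and the study's exact protocol may differ):

```python
def txrf_concentration(n_analyte, n_internal, rel_sensitivity, c_internal):
    """Internal-standard TXRF quantification: the analyte concentration is
    the count ratio to the internal standard, divided by the analyte's
    relative sensitivity factor, scaled by the standard's concentration."""
    return (n_analyte / n_internal) / rel_sensitivity * c_internal

# Hypothetical Fe determination against a Ga internal standard at 10 ppm,
# with an assumed relative sensitivity of 0.9 for Fe vs. Ga.
c_fe = txrf_concentration(12000, 8000, 0.9, 10.0)
```

Because both analyte and standard sit in the same thin specimen film, matrix effects largely cancel in the ratio, which is what makes a simple specimen-preparation protocol like the one proposed here workable across a wide concentration range.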

  13. Total reflection x-ray fluorescence spectroscopy as a tool for evaluation of iron concentration in ferrofluids and yeast samples

    International Nuclear Information System (INIS)

    Kulesh, N.A.; Novoselova, I.P.; Safronov, A.P.; Beketov, I.V.; Samatov, O.M.; Kurlyandskaya, G.V.; Morozova, M.; Denisova, T.P.

    2016-01-01

    In this study, total reflection x-ray fluorescence (TXRF) spectrometry was applied for the evaluation of iron concentration in ferrofluids and biological samples containing iron oxide magnetic nanoparticles obtained by the laser target evaporation technique. Suspensions of maghemite nanoparticles of different concentrations were used to estimate the limitations of the method for the evaluation of nanoparticle concentration in the range of 1–5000 ppm in the absence of an organic matrix. Samples of single-cell yeasts grown in nutrient media containing maghemite nanoparticles were used to study the nanoparticle absorption mechanism. The obtained results were analyzed in terms of the applicability of TXRF for quantitative analysis over a wide range of iron oxide nanoparticle concentrations for biological samples and ferrofluids, with a simple established protocol of specimen preparation. - Highlights: • Ferrofluid and yeast samples were analysed by TXRF spectroscopy. • A simple protocol for iron quantification by means of TXRF was proposed. • Results were combined with magnetic, structural, and morphological characterization. • A preliminary conclusion on the nanoparticle uptake mechanism was made.

  14. Protocol for Cohesionless Sample Preparation for Physical Experimentation

    Science.gov (United States)

    2016-05-01

    Reference: Standard test method for consolidated drained triaxial compression test for soils. In Annual Book of ASTM Standards. West Conshohocken, PA: ASTM. ... derived wherein uncertainties and laboratory scatter associated with soil fabric-behavior variance during sample preparation are mitigated. Samples of ... wherein comparable analysis between different laboratory tests' results can be made by ensuring a comparable soil fabric prior to laboratory testing

  15. Provable Fair Document Exchange Protocol with Transaction Privacy for E-Commerce

    Directory of Open Access Journals (Sweden)

    Ren-Junn Hwang

    2015-04-01

    Transaction privacy has attracted a lot of attention in e-commerce. This study proposes an efficient and provable fair document exchange protocol with transaction privacy. Using the proposed protocol, any untrusted parties can fairly exchange documents without the assistance of online trusted third parties. Moreover, a notary only notarizes each document once. The authorized document owner can exchange a notarized document with different parties repeatedly without disclosing the origin of the document or the identities of the transaction participants. Security and performance analyses indicate that the proposed protocol not only provides strong fairness, non-repudiation of origin, non-repudiation of receipt, and message confidentiality, but also enhances forward secrecy, transaction privacy, and authorized exchange. The proposed protocol is more efficient than other works.

  16. Radioisotope dilution analyses of geological samples using 236U and 229Th

    International Nuclear Information System (INIS)

    Rosholt, J.N.

    1984-01-01

    The use of 236U and 229Th in alpha spectrometric measurements has some advantages over the use of other tracers and measurement techniques in isotope dilution analyses of most geological samples. The advantages are: 1) these isotopes do not occur in terrestrial rocks, 2) they have negligible decay losses because of their long half lives, 3) they cause minimal recoil contamination to surface-barrier detectors, 4) they allow for simultaneous determination of the concentration and isotopic composition of uranium and thorium in a variety of sample types, and 5) they allow for simple and constant corrections for spectral interferences: 0.5% of the 238U activity is subtracted for the contribution of 235U in the 236U peak, and 1% of the 229Th activity is subtracted from the 230Th activity. Disadvantages in using 236U and 229Th are: 1) individual separates of uranium and thorium must be prepared as very thin sources for alpha spectrometry, 2) good resolution in the spectrometer system is required for thorium isotopic measurements where measurement times may extend to 300 h, and 3) separate calibrations of the 236U and 229Th spike solution with both uranium and thorium standards are required. The use of these tracers in applications of uranium-series disequilibrium studies has simplified the measurements required for the determination of the isotopic composition of uranium and thorium because of the minimal corrections needed for alpha spectral interferences. (orig.)
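The two constant interference corrections described in the abstract amount to simple subtractions; a sketch (the activity values used below are hypothetical, in arbitrary units):

```python
def corrected_activities(u236_peak, u238_activity, th230_peak, th229_activity):
    """Constant spectral-interference corrections for 236U/229Th tracer work:
    0.5% of the 238U activity (the 235U contribution) is subtracted from the
    236U peak, and 1% of the 229Th activity from the 230Th activity."""
    u236 = u236_peak - 0.005 * u238_activity
    th230 = th230_peak - 0.01 * th229_activity
    return u236, th230

# Hypothetical measured activities (arbitrary units).
u236, th230 = corrected_activities(100.0, 200.0, 50.0, 80.0)
```

Because the correction factors are fixed fractions of directly measured peaks, no case-by-case spectral deconvolution is needed, which is the simplification the abstract emphasises.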

  17. Radioisotope dilution analyses of geological samples using 236U and 229Th

    Science.gov (United States)

    Rosholt, J.N.

    1984-01-01

    The use of 236U and 229Th in alpha spectrometric measurements has some advantages over the use of other tracers and measurement techniques in isotope dilution analyses of most geological samples. The advantages are: (1) these isotopes do not occur in terrestrial rocks, (2) they have negligible decay losses because of their long half lives, (3) they cause minimal recoil contamination to surface-barrier detectors, (4) they allow for simultaneous determination of the concentration and isotopic composition of uranium and thorium in a variety of sample types, and (5) they allow for simple and constant corrections for spectral interferences: 0.5% of the 238U activity is subtracted for the contribution of 235U in the 236U peak and 1% of the 229Th activity is subtracted from the 230Th activity. Disadvantages in using 236U and 229Th are: (1) individual separates of uranium and thorium must be prepared as very thin sources for alpha spectrometry, (2) good resolution in the spectrometer system is required for thorium isotopic measurements where measurement times may extend to 300 h, and (3) separate calibrations of the 236U and 229Th spike solution with both uranium and thorium standards are required. The use of these tracers in applications of uranium-series disequilibrium studies has simplified the measurements required for the determination of the isotopic composition of uranium and thorium because of the minimal corrections needed for alpha spectral interferences. ?? 1984.

  18. Pseudogenes and DNA-based diet analyses: A cautionary tale from a relatively well sampled predator-prey system

    DEFF Research Database (Denmark)

    Dunshea, G.; Barros, N. B.; Wells, R. S.

    2008-01-01

    Mitochondrial ribosomal DNA is commonly used in DNA-based dietary analyses. In such studies, these sequences are generally assumed to be the only version present in DNA of the organism of interest. However, nuclear pseudogenes that display variable similarity to the mitochondrial versions...... are common in many taxa. The presence of nuclear pseudogenes that co-amplify with their mitochondrial paralogues can lead to several possible confounding interpretations when applied to estimating animal diet. Here, we investigate the occurrence of nuclear pseudogenes in fecal samples taken from bottlenose...... dolphins (Tursiops truncatus) that were assayed for prey DNA with a universal primer technique. We found pseudogenes in 13 of 15 samples and 1-5 pseudogene haplotypes per sample representing 5-100% of all amplicons produced. The proportion of amplicons that were pseudogenes and the diversity of prey DNA...

  19. Identification of a research protocol to study orthodontic tooth movement

    Directory of Open Access Journals (Sweden)

    Annalisa Dichicco

    2014-06-01

    Full Text Available Aim: The orthodontic movement is associated with a process of tissue remodeling together with the release of several chemical mediators in periodontal tissues. Each mediator is a potential marker of tooth movement and reflects biological processes such as tissue inflammation and bone remodeling. Different amounts of each mediator are present in several tissues and fluids of the oral cavity; therefore, different sampling methods exist, with several degrees of invasiveness. The chemical mediators are also substances of different molecular nature, detectable by multiple kinds of analysis methods. The purpose of this study was to draft the best research protocol for an optimal study on orthodontic movement efficiency. Methods: An analysis of the international literature was carried out to identify the gold standard for each aspect of the protocol: type of mediator, source and method of sampling, and analysis method. Results: From the analysis of the international literature, an original research protocol was created for the study and assessment of orthodontic movement using biomarkers of tooth movement. Conclusions: The protocol created is based on the choice of the gold standard for every aspect already analyzed in the literature and in existing protocols for monitoring orthodontic tooth movement through markers of tooth movement. Clinical trials are required for the evaluation and validation of the protocol created.

  20. Performance of Differential-Phase-Shift Keying Protocol Applying 1310 nm Up-Conversion Single-Photon Detector

    International Nuclear Information System (INIS)

    Chen-Xu, Feng; Rong-Zhen, Jiao; Wen-Han, Zhang

    2008-01-01

    The performance of the differential-phase-shift keying (DPSK) protocol applying a 1310 nm up-conversion single-photon detector is analysed. The error rate and the communication rate as a function of distance for three quantum key distribution protocols, the Bennett–Brassard 1984, the Bennett–Brassard–Mermin 1992, and the DPSK, are presented. Then we compare the performance of these three protocols using the 1310 nm up-conversion detector. We conclude that the DPSK protocol applying the detector has a significant advantage over the other two protocols: longer transmission distance and a lower error rate can be achieved. (general)
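    As a rough illustration of how error rate grows with distance in such a link, one can use a toy model in which signal detections decay with fiber loss while dark counts stay constant. All parameter values below (mean photon number, detector efficiency, fiber loss, dark-count probability, baseline error) are hypothetical, not figures from this study:

```python
def dpsk_qber(distance_km, mu=0.2, eta_det=0.07, alpha_db_km=0.35,
              p_dark=1e-5, e_base=0.01):
    """Toy model: signal detection probability falls off with fiber loss
    (alpha_db_km in dB/km); dark counts add clicks that are wrong half
    the time, so QBER rises toward 0.5 at long distance."""
    p_signal = mu * eta_det * 10 ** (-alpha_db_km * distance_km / 10)
    p_click = p_signal + p_dark
    return (e_base * p_signal + 0.5 * p_dark) / p_click

# error rate grows with distance as dark counts start to dominate
qber_10 = dpsk_qber(10)
qber_100 = dpsk_qber(100)
```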

  1. Protocol compliance and time management in blunt trauma resuscitation.

    NARCIS (Netherlands)

    Spanjersberg, W.R.; Bergs, E.A.; Mushkudiani, N.; Klimek, M.; Schipper, I.B.

    2009-01-01

    OBJECTIVES: To study advanced trauma life support (ATLS) protocol adherence prospectively in trauma resuscitation and to analyse time management of daily multidisciplinary trauma resuscitation at a level 1 trauma centre, for both moderately and severely injured patients. PATIENTS AND METHODS: All

  2. Radiological analyses of Marshall Islands environmental samples, 1974--1976

    International Nuclear Information System (INIS)

    Greenhouse, N.A.; Miltenberger, R.P.; Cua, F.T.

    1977-01-01

    Results are reported from the radiological analysis of environmental samples collected in the Marshall Islands during 1974 through 1976. Most of the samples were collected on or near the Bikini Atoll and included plants, soil, fish, catchment water, and sediments, with emphasis on local marine and terrestrial food items. Data are presented from γ spectral analysis and the content of 90 Sr and transuranic elements in the samples

  3. Evaluating the effect of sample type on American alligator (Alligator mississippiensis) analyte values in a point-of-care blood analyser

    OpenAIRE

    Hamilton, Matthew T.; Finger, John W.; Winzeler, Megan E.; Tuberville, Tracey D.

    2016-01-01

    The assessment of wildlife health has been enhanced by the ability of point-of-care (POC) blood analysers to provide biochemical analyses of non-domesticated animals in the field. However, environmental limitations (e.g. temperature, atmospheric humidity and rain) and lack of reference values may inhibit researchers from using such a device with certain wildlife species. Evaluating the use of alternative sample types, such as plasma, in a POC device may afford researchers the opportunity to d...

  4. Detection and identification of Leishmania spp.: application of two hsp70-based PCR-RFLP protocols to clinical samples from the New World.

    Science.gov (United States)

    Montalvo, Ana M; Fraga, Jorge; Tirado, Dídier; Blandón, Gustavo; Alba, Annia; Van der Auwera, Gert; Vélez, Iván Darío; Muskus, Carlos

    2017-07-01

    Leishmaniasis is highly prevalent in New World countries, where several methods are available for detection and identification of Leishmania spp. Two hsp70-based PCR protocols (PCR-N and PCR-F) and their corresponding restriction fragment length polymorphism (RFLP) analyses were applied for detection and identification of Leishmania spp. in clinical samples collected in Colombia, Guatemala, and Honduras. A total of 93 cases were studied. The samples were classified as positive or suspected of leishmaniasis according to parasitological criteria. Molecular amplification of two different hsp70 gene fragments and further RFLP analysis for identification of Leishmania species was done. The detection rate in parasitologically positive samples was higher using PCR-N than PCR-F. In the total of samples studied, the main species identified were Leishmania panamensis, Leishmania braziliensis, and Leishmania infantum (chagasi). Although RFLP-N was more efficient for the identification, RFLP-F is necessary for discrimination between L. panamensis and Leishmania guyanensis, which is of great importance in Colombia. Unexpectedly, one sample from this country revealed an RFLP pattern corresponding to Leishmania naiffi. Both molecular variants are applicable for the study of clinical samples originating from Colombia, Honduras, and Guatemala. Choosing the better tool for each setting depends on the species circulating. More studies are needed to confirm the presence of L. naiffi in Colombian territory.
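    The RFLP step of such protocols can be mimicked in silico: cut an amplicon at every occurrence of a restriction-enzyme recognition site and compare the resulting fragment-length patterns. The sequences and the EcoRI-like site below are purely illustrative, not actual hsp70 amplicons:

```python
def rflp_fragments(seq, site):
    """Cut a sequence at every occurrence of a recognition site (cut at
    the site start) and return the resulting fragment lengths."""
    cuts, i = [], seq.find(site)
    while i != -1:
        cuts.append(i)
        i = seq.find(site, i + 1)
    bounds = [0] + cuts + [len(seq)]
    return [b - a for a, b in zip(bounds, bounds[1:]) if b > a]

# two hypothetical amplicons distinguished by their cut patterns
amp_a = "ATGGAATTCCGTATGGAATTCAAA"   # two EcoRI-like sites
amp_b = "ATGGCATGCCGTATGGCATGCAAA"   # no EcoRI-like site
pattern_a = rflp_fragments(amp_a, "GAATTC")
pattern_b = rflp_fragments(amp_b, "GAATTC")
```

Species whose amplicons share length but differ at a site would yield distinct banding patterns (`pattern_a` vs. `pattern_b`) on a gel.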

  5. Total CMB analysis of streaker aerosol samples by PIXE, PIGE, beta- and optical-absorption analyses

    International Nuclear Information System (INIS)

    Annegarn, H.J.; Przybylowicz, W.J.

    1993-01-01

    Multielemental analyses of aerosol samples are widely used in air pollution receptor modelling. Specifically, the chemical mass balance (CMB) model has become a powerful tool in urban air quality studies. Input data required for the CMB includes not only the traditional X-ray fluorescence (and hence PIXE) detected elements, but also total mass, organic and inorganic carbon, and other light elements including Mg, Na and F. The circular streaker sampler, in combination with PIXE analysis, has developed into a powerful tool for obtaining time-resolved, multielemental aerosol data. However, application in CMB modelling has been limited by the absence of total mass and complementary light element data. This study reports on progress in using techniques complementary to PIXE to obtain additional data from circular streaker samples, maintaining the nondestructive, instrumental approach inherent in PIXE: Beta-gauging using a 147 Pm source for total mass; optical absorption for inorganic carbon; and PIGE to measure the lighter elements. (orig.)
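    The CMB receptor model underlying this work expresses each measured species concentration as a linear combination of source profiles. A minimal least-squares sketch with hypothetical two-source profiles (a full CMB fit would add uncertainty weighting and non-negativity constraints):

```python
import numpy as np

# Chemical mass balance sketch: measured species concentrations c are
# modeled as F @ s, where each column of F is a source profile (mass
# fraction of each species per unit source mass) and s holds the unknown
# source contributions. All numbers below are hypothetical.

F = np.array([[0.30, 0.02],    # species 1, e.g. inorganic carbon
              [0.05, 0.40],    # species 2, e.g. Fe
              [0.10, 0.10]])   # species 3, e.g. S
s_true = np.array([5.0, 2.0])  # ug/m3 contributed by each source
c = F @ s_true                 # synthetic ambient measurement

# ordinary least squares recovers the contributions from the measurement
s_est, *_ = np.linalg.lstsq(F, c, rcond=None)
```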

  6. EDXRF applied to the chemical element determination of small invertebrate samples

    International Nuclear Information System (INIS)

    Magalhaes, Marcelo L.R.; Santos, Mariana L.O.; Cantinha, Rebeca S.; Souza, Thomas Marques de; Franca, Elvis J. de

    2015-01-01

    Energy Dispersive X-Ray Fluorescence (EDXRF) is a fast analytical technique of easy operation; however, it demands reliable analytical curves due to the intrinsic matrix dependence and interference during the analysis. By using biological materials of diverse matrices, multielemental analytical protocols can be implemented, and a group of chemical elements can be determined in diverse biological matrices depending on the chemical element concentration. Particularly for invertebrates, EDXRF presents some advantages associated with the possibility of analyzing small samples, in which a collimator can be used to direct the incidence of X-rays onto a small surface of the analyzed sample. In this work, EDXRF was applied to determine Cl, Fe, P, S and Zn in invertebrate samples using collimators of 3 mm and 10 mm. For the assessment of the analytical protocol, SRM 2976 Trace Elements in Mollusk and SRM 8415 Whole Egg Powder, produced by the National Institute of Standards and Technology (NIST), were also analyzed. After sampling by using pitfall traps, invertebrates were lyophilized, milled and transferred to polyethylene vials covered by XRF polyethylene film. Analyses were performed at a pressure lower than 30 Pa, varying voltage and electric current according to the chemical element to be analyzed. For comparison, Zn in the invertebrate material was also quantified by graphite furnace atomic absorption spectrometry (GFAAS) after acid treatment (a mixture of nitric acid and hydrogen peroxide) of the samples. Compared to the 10 mm collimator, the SRM 2976 and SRM 8415 results obtained with the 3 mm collimator agreed well at the 95% confidence level, since the En numbers were in the range of -1 to 1. Results from GFAAS were in accordance with the EDXRF values for composite samples. Therefore, the determination of some chemical elements by EDXRF can be recommended for very small invertebrate samples (lower than 100 mg), with the advantage of preserving the samples. (author)
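    The En number used above to compare a laboratory result against a certified reference value can be computed directly from the two values and their expanded uncertainties; results agree at roughly the 95% level when |En| <= 1. A small sketch with hypothetical numbers:

```python
import math

def en_number(lab_value, ref_value, u_lab, u_ref, k=2):
    """En score used in proficiency testing: the difference between the
    lab and reference values divided by the root-sum-square of the two
    expanded uncertainties U = k*u (k = 2 for ~95% coverage)."""
    return (lab_value - ref_value) / math.hypot(k * u_lab, k * u_ref)

# hypothetical Zn result (mg/kg) against a certified reference value
en = en_number(lab_value=137.0, ref_value=134.0, u_lab=2.0, u_ref=1.5)
agrees = abs(en) <= 1   # within -1..1: acceptable agreement
```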

  7. A Bayesian approach to assess data from radionuclide activity analyses in environmental samples

    International Nuclear Information System (INIS)

    Barrera, Manuel; Lourdes Romero, M.; Nunez-Lagos, Rafael; Bernardo, Jose M.

    2007-01-01

    A Bayesian statistical approach is introduced to assess experimental data from the analyses of radionuclide activity concentration in environmental samples (low activities). A theoretical model has been developed that allows the use of known prior information about the value of the measurand (activity), together with the experimental value determined through the measurement. The model has been applied to data of the Inter-laboratory Proficiency Test organised periodically among Spanish environmental radioactivity laboratories that are producing the radiochemical results for the Spanish radioactive monitoring network. A global improvement of laboratories performance is produced when this prior information is taken into account. The prior information used in this methodology is an interval within which the activity is known to be contained, but it could be extended to any other experimental quantity with a different type of prior information available
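    The core idea, a flat prior on a known interval combined with a Gaussian measurement, can be sketched numerically; the posterior is then a truncated normal centered on the measured value. The interval and measurement below are hypothetical illustrative numbers, not data from the inter-laboratory test:

```python
import numpy as np

def posterior_activity(x, u, lo, hi, n=20001):
    """Flat prior on [lo, hi] times a Gaussian likelihood N(x; theta, u),
    evaluated on a grid; returns the posterior mean and standard
    deviation of the activity theta."""
    theta = np.linspace(lo, hi, n)
    w = np.exp(-0.5 * ((x - theta) / u) ** 2)   # unnormalized posterior
    w /= w.sum()
    mean = float((w * theta).sum())
    std = float(np.sqrt((w * (theta - mean) ** 2).sum()))
    return mean, std

# a low-activity measurement falling just below the known interval:
# the prior pulls the estimate back inside and shrinks its uncertainty
mean, std = posterior_activity(x=0.8, u=0.5, lo=1.0, hi=3.0)
```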

  8. Evaluation of Four Automated Protocols for Extraction of DNA from FTA Cards

    DEFF Research Database (Denmark)

    Stangegaard, Michael; Børsting, Claus; Ferrero-Miliani, Laura

    2013-01-01

    protocols on three different instruments. The extraction processes were repeated up to six times with the same pieces of FTA cards. The sample material on the FTA cards was either blood or buccal cells. With the QIAamp DNA Investigator and QIAsymphony DNA Investigator kits, it was possible to extract DNA...... from the FTA cards in all six rounds of extractions in sufficient amount and quality to obtain complete short tandem repeat (STR) profiles on a QIAcube and a QIAsymphony SP. With the PrepFiler Express kit, almost all the extractable DNA was extracted in the first two rounds of extractions. Furthermore......, we demonstrated that it was possible to successfully extract sufficient DNA for STR profiling from previously processed FTA card pieces that had been stored at 4 °C for up to 1 year. This showed that rare or precious FTA card samples may be saved for future analyses even though some DNA was already...

  9. Purifying, Separating, and Concentrating Cells From a Sample Low in Biomass

    Science.gov (United States)

    Benardini, James N.; LaDuc, Myron T.; Diamond, Rochelle

    2012-01-01

    validated using varying ratios and mixtures of cells to ensure homogenous staining compared to that of individual cells, and were utilized for flow analyzer and FACS labeling. This technology focuses on the purification and concentration of cells from low-biomass spacecraft assembly facility samples. Currently, purification and concentration of low-biomass samples plague planetary protection downstream analyses. Having a capability to use flow cytometry to concentrate cells out of low-biomass, high-volume spacecraft/ facility sample extracts will be of extreme benefit to the fields of planetary protection and astrobiology. Successful research and development of this novel methodology will significantly increase the knowledge base for designing more effective cleaning protocols, and ultimately lead to a more empirical and true account of the microbial diversity present on spacecraft surfaces. Refined cleaning and an enhanced ability to resolve microbial diversity may decrease the overall cost of spacecraft assembly and/or provide a means to begin to assess challenging planetary protection missions.

  10. Improved protocol and data analysis for accelerated shelf-life estimation of solid dosage forms.

    Science.gov (United States)

    Waterman, Kenneth C; Carella, Anthony J; Gumkowski, Michael J; Lukulay, Patrick; MacDonald, Bruce C; Roy, Michael C; Shamblin, Sheri L

    2007-04-01

    To propose and test a new accelerated aging protocol for solid-state, small molecule pharmaceuticals which provides faster predictions for drug substance and drug product shelf-life. The concept of an isoconversion paradigm, where times in different temperature and humidity-controlled stability chambers are set to provide a critical degradant level, is introduced for solid-state pharmaceuticals. Reliable estimates for temperature and relative humidity effects are handled using a humidity-corrected Arrhenius equation, where temperature and relative humidity are assumed to be orthogonal. Imprecision is incorporated into a Monte-Carlo simulation to propagate the variations inherent in the experiment. In early development phases, greater imprecision in predictions is tolerated to allow faster screening with reduced sampling. Early development data are then used to design appropriate test conditions for more reliable later stability estimations. Examples are reported showing that predicted shelf-life values for lower temperatures and different relative humidities are consistent with the measured shelf-life values at those conditions. The new protocols and analyses provide accurate and precise shelf-life estimations in a reduced time from current state of the art.
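    The humidity-corrected Arrhenius model described above can be turned into an isoconversion-time estimate: pick the time at which the degradant reaches its critical level under each condition. The sketch below assumes zero-order degradant growth and uses entirely hypothetical kinetic parameters:

```python
import math

R = 8.314  # gas constant, J/(mol K)

def iso_time_days(temp_c, rh, ln_a=29.0, ea=100e3, b=0.05, d_crit=0.5):
    """Days to reach the critical degradant level d_crit (in %), assuming
    zero-order degradant growth at the humidity-corrected Arrhenius rate
    ln k = ln A - Ea/(R*T) + B*RH. All kinetic parameters here are
    hypothetical placeholders, not fitted values from the study."""
    t_kelvin = temp_c + 273.15
    k = math.exp(ln_a - ea / (R * t_kelvin) + b * rh)  # % per day
    return d_crit / k

shelf_25 = iso_time_days(25, 60)   # long-term storage condition
shelf_50 = iso_time_days(50, 75)   # accelerated condition
```

Fitting `ln_a`, `ea` and `b` from short accelerated runs, then extrapolating to the storage condition, is the essence of the protocol; the Monte-Carlo step propagates the imprecision of those fitted parameters.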

  11. Molecular phylogeny and divergence times of Malagasy tenrecs: Influence of data partitioning and taxon sampling on dating analyses

    Directory of Open Access Journals (Sweden)

    Glos Julian

    2008-03-01

    Full Text Available Abstract Background Malagasy tenrecs belong to the Afrotherian clade of placental mammals and comprise three subfamilies divided into eight genera (Tenrecinae: Tenrec, Echinops, Setifer and Hemicentetes; Oryzorictinae: Oryzorictes, Limnogale and Microgale; Geogalinae: Geogale). The diversity of their morphology and incomplete taxon sampling made it difficult until now to resolve phylogenies based on either morphology or molecular data for this group. Therefore, in order to delineate the evolutionary history of this family, phylogenetic and dating analyses were performed on a four-nuclear-gene dataset (ADRA2B, AR, GHR and vWF) including all Malagasy tenrec genera. Moreover, the influence of both taxon sampling and data partitioning on the accuracy of the estimated ages was assessed. Results Within Afrotheria the vast majority of the nodes received high support, including the grouping of hyrax with sea cow and the monophyly of both Afroinsectivora (Macroscelidea + Afrosoricida) and Afroinsectiphillia (Tubulidentata + Afroinsectivora). Strongly supported relationships were also recovered among all tenrec genera, allowing us to firmly establish the grouping of Geogale with Oryzorictinae, and to confirm the previously hypothesized nesting of Limnogale within the genus Microgale. The timeline of Malagasy tenrec diversification does not reflect a fast adaptive radiation after the arrival on Madagascar, indicating that morphological specializations have appeared over the whole evolutionary history of the family, and not just in a short period after colonization. In our analysis, age estimates at the root of a clade became older with increased taxon sampling of that clade. Moreover, an augmentation of data partitions resulted in older age estimates as well, whereas standard deviations increased when more extreme partition schemes were used.
Conclusion Our results provide as yet the best resolved gene tree comprising all Malagasy tenrec genera, and may lead

  12. Quality control and conduct of genome-wide association meta-analyses

    DEFF Research Database (Denmark)

    Winkler, Thomas W; Day, Felix R; Croteau-Chonka, Damien C

    2014-01-01

    Rigorous organization and quality control (QC) are necessary to facilitate successful genome-wide association meta-analyses (GWAMAs) of statistics aggregated across multiple genome-wide association studies. This protocol provides guidelines for (i) organizational aspects of GWAMAs, and for (ii) QC...

  13. The Protocol of Choice for Treatment of Snake Bite

    Directory of Open Access Journals (Sweden)

    Afshin Mohammad Alizadeh

    2016-01-01

    Full Text Available The aim of the current study is to compare three different methods of treatment of snake bite to determine the most efficient one. To unify the protocol of snake bite treatment in our center, we retrospectively reviewed files of the snake-bitten patients who had been referred to us between 2010 and 2014. They were contacted for follow-up using phone calls. Demographic and on-arrival characteristics, protocol used for treatment (WHO/Haddad/GF, and outcome/complications were evaluated. Patients were entered into one of the protocol groups and compared. Of a total of 63 patients, 56 (89% were males. Five, 19, and 28 patients were managed by Haddad, WHO, or GF protocols, respectively. Eleven patients had fallen into both GF and WHO protocols and were excluded. Serum sickness was significantly more common when WHO protocol was used while 100% of the compartment syndromes and 71% of deformities had been reported after GF protocol. The most important complications were considered to be deformity, compartment syndrome, and amputation and were more frequent after the use of WHO and GF protocols (23.1% versus 76.9%; none in Haddad; P = NS). Haddad protocol seems to be the best for treatment of snake-bitten patients in our region. However, this cannot be strictly concluded because of the limited sample size and nonsignificant P values.

  14. ANALYSING ACCEPTANCE SAMPLING PLANS BY MARKOV CHAINS

    Directory of Open Access Journals (Sweden)

    Mohammad Mirabi

    2012-01-01

    Full Text Available

    ENGLISH ABSTRACT: In this research, a Markov analysis of acceptance sampling plans in a single stage and in two stages is proposed, based on the quality of the items inspected. In a stage of this policy, if the number of defective items in a sample of inspected items is more than the upper threshold, the batch is rejected. However, the batch is accepted if the number of defective items is less than the lower threshold. Nonetheless, when the number of defective items falls between the upper and lower thresholds, the decision-making process continues to inspect the items and collect further samples. The primary objective is to determine the optimal values of the upper and lower thresholds using a Markov process to minimise the total cost associated with a batch acceptance policy. A solution method is presented, along with a numerical demonstration of the application of the proposed methodology.

    AFRIKAANSE OPSOMMING (English translation): In this research, a Markov analysis is performed of acceptance sampling plans carried out in a single stage or in two stages, depending on the quality of the items inspected. If the first sample shows that the number of defective items exceeds an upper limit, the lot is rejected. If the first sample shows that the number of defective items is less than a lower limit, the lot is accepted. If the first sample shows that the number of defective items lies in the region between the upper and lower limits, the decision-making process continues and further samples are taken. The primary aim is to determine the optimal values of the upper and lower limits using a Markov process so that the total cost associated with the process can be minimised. A solution is then presented, together with a numerical example of the application of the proposed solution.
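    The single-stage policy with continuation can be viewed as a Markov chain with two absorbing states (accept, reject) and one transient state (sample again), so the overall acceptance probability is p_accept / (1 - p_continue). A sketch with hypothetical plan parameters, not the optimal thresholds from the paper:

```python
from math import comb

def stage_probs(n, p, c_lo, c_hi):
    """One inspection stage with sample size n and defect rate p:
    accept if defects <= c_lo, reject if defects >= c_hi, otherwise
    keep sampling (defect counts are binomial)."""
    pmf = [comb(n, d) * p**d * (1 - p)**(n - d) for d in range(n + 1)]
    p_acc = sum(pmf[:c_lo + 1])
    p_rej = sum(pmf[c_hi:])
    return p_acc, p_rej, 1.0 - p_acc - p_rej

def batch_acceptance(n, p, c_lo, c_hi):
    """Accept/reject are absorbing states of the chain and 'sample
    again' is transient, so the chain is eventually absorbed into
    'accept' with probability p_acc / (1 - p_continue)."""
    p_acc, _p_rej, p_cont = stage_probs(n, p, c_lo, c_hi)
    return p_acc / (1.0 - p_cont)

prob = batch_acceptance(n=20, p=0.05, c_lo=1, c_hi=4)
```

Sweeping `c_lo` and `c_hi` and attaching inspection, acceptance, and rejection costs to the chain's expected visit counts gives the cost-minimising thresholds the paper optimises.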

  15. Log sampling methods and software for stand and landscape analyses.

    Science.gov (United States)

    Lisa J. Bate; Torolf R. Torgersen; Michael J. Wisdom; Edward O. Garton; Shawn C. Clabough

    2008-01-01

    We describe methods for efficient, accurate sampling of logs at landscape and stand scales to estimate density, total length, cover, volume, and weight. Our methods focus on optimizing the sampling effort by choosing an appropriate sampling method and transect length for specific forest conditions and objectives. Sampling methods include the line-intersect method and...

  16. Vipie: web pipeline for parallel characterization of viral populations from multiple NGS samples.

    Science.gov (United States)

    Lin, Jake; Kramna, Lenka; Autio, Reija; Hyöty, Heikki; Nykter, Matti; Cinek, Ondrej

    2017-05-15

    Next generation sequencing (NGS) technology allows laboratories to investigate virome composition in clinical and environmental samples in a culture-independent way. There is a need for bioinformatic tools capable of parallel processing of virome sequencing data by exactly identical methods: this is especially important in studies of multifactorial diseases, or in parallel comparison of laboratory protocols. We have developed a web-based application allowing direct upload of sequences from multiple virome samples using custom parameters. The samples are then processed in parallel using an identical protocol, and can be easily reanalyzed. The pipeline performs de-novo assembly, taxonomic classification of viruses as well as sample analyses based on user-defined grouping categories. Tables of virus abundance are produced from cross-validation by remapping the sequencing reads to a union of all observed reference viruses. In addition, read sets and reports are created after processing unmapped reads against known human and bacterial ribosome references. Secured interactive results are dynamically plotted with population and diversity charts, clustered heatmaps and a sortable and searchable abundance table. The Vipie web application is a unique tool for multi-sample metagenomic analysis of viral data, producing searchable hits tables, interactive population maps, alpha diversity measures and clustered heatmaps that are grouped in applicable custom sample categories. Known references such as human genome and bacterial ribosomal genes are optionally removed from unmapped ('dark matter') reads. Secured results are accessible and shareable on modern browsers. Vipie is a freely available web-based tool whose code is open source.

  17. Protein blotting protocol for beginners.

    Science.gov (United States)

    Petrasovits, Lars A

    2014-01-01

    The transfer and immobilization of biological macromolecules onto solid nitrocellulose or nylon (polyvinylidene difluoride (PVDF)) membranes subsequently followed by specific detection is referred to as blotting. DNA blots are called Southerns after the inventor of the technique, Edwin Southern. By analogy, RNA blots are referred to as northerns and protein blots as westerns (Burnette, Anal Biochem 112:195-203, 1981). With few exceptions, western blotting involves five steps, namely, sample collection, preparation, separation, immobilization, and detection. In this chapter, protocols for the entire process from sample collection to detection are described.

  18. Simulating quantum correlations as a distributed sampling problem

    International Nuclear Information System (INIS)

    Degorre, Julien; Laplante, Sophie; Roland, Jeremie

    2005-01-01

    It is known that quantum correlations exhibited by a maximally entangled qubit pair can be simulated with the help of shared randomness, supplemented with additional resources, such as communication, postselection or nonlocal boxes. For instance, in the case of projective measurements, it is possible to solve this problem with protocols using one bit of communication or making one use of a nonlocal box. We show that this problem reduces to a distributed sampling problem. We give a new method to obtain samples from a biased distribution, starting with shared random variables following a uniform distribution, and use it to build distributed sampling protocols. This approach allows us to derive, in a simpler and unified way, many existing protocols for projective measurements, and extend them to positive operator-valued measurements. Moreover, this approach naturally leads to a local hidden variable model for Werner states
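    A well-known one-bit protocol of the kind mentioned above is the Toner-Bacon construction for projective measurements on a singlet. The Monte-Carlo sketch below is our reading of that construction (not code from this paper) and checks that the output correlation approaches -a.b:

```python
import numpy as np

rng = np.random.default_rng(0)

def simulate_singlet(a, b, n=400_000):
    """One-bit classical simulation of singlet correlations (Toner-Bacon
    style): Alice and Bob share random unit vectors l1, l2; Alice
    outputs -sgn(a.l1) and sends the bit c = sgn(a.l1)*sgn(a.l2);
    Bob outputs sgn(b.(l1 + c*l2)). Returns the output correlation."""
    l1 = rng.normal(size=(n, 3))
    l1 /= np.linalg.norm(l1, axis=1, keepdims=True)
    l2 = rng.normal(size=(n, 3))
    l2 /= np.linalg.norm(l2, axis=1, keepdims=True)
    s1 = np.sign(l1 @ a)
    c = s1 * np.sign(l2 @ a)
    alice = -s1
    bob = np.sign((l1 + c[:, None] * l2) @ b)
    return float(np.mean(alice * bob))

# measurement axes 60 degrees apart: singlet predicts E = -a.b = -0.5
a = np.array([0.0, 0.0, 1.0])
b = np.array([np.sin(np.pi / 3), 0.0, np.cos(np.pi / 3)])
corr = simulate_singlet(a, b)
```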

  19. High-Precision In Situ 87Sr/86Sr Analyses through Microsampling on Solid Samples: Applications to Earth and Life Sciences

    Directory of Open Access Journals (Sweden)

    Sara Di Salvo

    2018-01-01

    Full Text Available An analytical protocol for high-precision, in situ microscale isotopic investigations is presented here, which combines the use of a high-performing mechanical microsampling device and high-precision TIMS measurements on micro-Sr samples, allowing for excellent results both in accuracy and precision. The present paper is a detailed methodological description of the whole analytical procedure from sampling to elemental purification and Sr-isotope measurements. The method offers the potential to attain isotope data at the microscale on a wide range of solid materials with the use of minimally invasive sampling. In addition, we present three significant case studies for geological and life sciences, as examples of the various applications of microscale 87Sr/86Sr isotope ratios, concerning (i) the pre-eruptive mechanisms triggering recent eruptions at Nisyros volcano (Greece), (ii) the dynamics involved in the initial magma ascent during the 2010 eruption of Eyjafjallajökull volcano (Iceland), which are usually related to the precursory signals of the eruption, and (iii) the environmental context of a MIS 3 cave bear, Ursus spelaeus. The studied cases show the robustness of the methods, which can also be applied in other areas, such as cultural heritage, archaeology, petrology, and forensic sciences.

  20. An electronic specimen collection protocol schema (eSCPS). Document architecture for specimen management and the exchange of specimen collection protocols between biobanking information systems.

    Science.gov (United States)

    Eminaga, O; Semjonow, A; Oezguer, E; Herden, J; Akbarov, I; Tok, A; Engelmann, U; Wille, S

    2014-01-01

    The integrity of collection protocols in biobanking is essential for a high-quality sample preparation process. However, there is not currently a well-defined universal method for integrating collection protocols in the biobanking information system (BIMS). Therefore, an electronic schema of the collection protocol that is based on Extensible Markup Language (XML) is required to maintain the integrity and enable the exchange of collection protocols. The development and implementation of an electronic specimen collection protocol schema (eSCPS) was performed at two institutions (Muenster and Cologne) in three stages. First, we analyzed the infrastructure that was already established at both the biorepository and the hospital information systems of these institutions and determined the requirements for the sufficient preparation of specimens and documentation. Second, we designed an eSCPS according to these requirements. Finally, a prospective study was conducted to implement and evaluate the novel schema in the current BIMS. We designed an eSCPS that provides all of the relevant information about collection protocols. Ten electronic collection protocols were generated using the supplementary Protocol Editor tool, and these protocols were successfully implemented in the existing BIMS. Moreover, an electronic list of collection protocols for the current studies being performed at each institution was included, new collection protocols were added, and the existing protocols were redesigned to be modifiable. The documentation time was significantly reduced after implementing the eSCPS (5 ± 2 min vs. 7 ± 3 min; p = 0.0002). The eSCPS improves the integrity and facilitates the exchange of specimen collection protocols in the existing open-source BIMS.
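    What such an XML collection-protocol document might look like can be sketched with the standard library; the element and attribute names below are illustrative guesses, not the published eSCPS schema:

```python
import xml.etree.ElementTree as ET

# Hypothetical sketch of an XML-based specimen collection protocol;
# element names are illustrative, not the actual eSCPS schema.
proto = ET.Element("collectionProtocol", id="CP-001", version="1.0")
ET.SubElement(proto, "specimenType").text = "serum"
step = ET.SubElement(proto, "step", order="1")
ET.SubElement(step, "action").text = "centrifuge"
ET.SubElement(step, "parameter", name="speed", unit="g").text = "2000"

# serialize for exchange between biobanking information systems ...
xml_text = ET.tostring(proto, encoding="unicode")

# ... and round-trip: the receiving system parses the protocol back
parsed = ET.fromstring(xml_text)
```

The point of such a schema is exactly this round-trip: a protocol authored in one BIMS can be validated and imported unchanged by another.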

  1. Meta-analyses of the 5-HTTLPR polymorphisms and post-traumatic stress disorder.

    Directory of Open Access Journals (Sweden)

    Fernando Navarro-Mateu

    OBJECTIVE: To conduct a meta-analysis of all published genetic association studies of 5-HTTLPR polymorphisms performed in PTSD cases. METHODS: DATA SOURCES: Potential studies were identified through PubMed/MEDLINE, EMBASE, Web of Science databases (Web of Knowledge, WoK), PsychINFO, PsychArticles and HuGeNet (Human Genome Epidemiology Network) up until December 2011. STUDY SELECTION: Published observational studies reporting genotype or allele frequencies of this genetic factor in PTSD cases and in non-PTSD controls were all considered eligible for inclusion in this systematic review. DATA EXTRACTION: Two reviewers selected studies for possible inclusion and extracted data independently following a standardized protocol. STATISTICAL ANALYSIS: A biallelic and a triallelic meta-analysis, including the total S and S' frequencies, the dominant (S+/LL and S'+/L'L') and the recessive model (SS/L+ and S'S'/L'+), was performed with a random-effect model to calculate the pooled OR and its corresponding 95% CI. Forest plots and Cochran's Q-Statistic and I(2) index were calculated to check for heterogeneity. Subgroup analyses and meta-regression were carried out to analyze potential moderators. Publication bias and quality of reporting were also analyzed. RESULTS: 13 studies met our inclusion criteria, providing a total sample of 1874 patients with PTSD and 7785 controls in the biallelic meta-analyses and 627 and 3524, respectively, in the triallelic. None of the meta-analyses showed evidence of an association between 5-HTTLPR and PTSD, but several characteristics (exposure to the same principal stressor for PTSD cases and controls, adjustment for potential confounding variables, blind assessment, study design, type of PTSD, ethnic distribution and Total Quality Score) influenced the results in subgroup analyses and meta-regression. There was no evidence of potential publication bias. CONCLUSIONS: Current evidence does not support a direct effect of 5-HTTLPR polymorphisms on PTSD.

  2. Meta-analyses of the 5-HTTLPR polymorphisms and post-traumatic stress disorder.

    Science.gov (United States)

    Navarro-Mateu, Fernando; Escámez, Teresa; Koenen, Karestan C; Alonso, Jordi; Sánchez-Meca, Julio

    2013-01-01

    To conduct a meta-analysis of all published genetic association studies of 5-HTTLPR polymorphisms performed in PTSD cases. Potential studies were identified through PubMed/MEDLINE, EMBASE, Web of Science databases (Web of Knowledge, WoK), PsychINFO, PsychArticles and HuGeNet (Human Genome Epidemiology Network) up until December 2011. Published observational studies reporting genotype or allele frequencies of this genetic factor in PTSD cases and in non-PTSD controls were all considered eligible for inclusion in this systematic review. Two reviewers selected studies for possible inclusion and extracted data independently following a standardized protocol. A biallelic and a triallelic meta-analysis, including the total S and S' frequencies, the dominant (S+/LL and S'+/L'L') and the recessive model (SS/L+ and S'S'/L'+), was performed with a random-effect model to calculate the pooled OR and its corresponding 95% CI. Forest plots and Cochran's Q-Statistic and I(2) index were calculated to check for heterogeneity. Subgroup analyses and meta-regression were carried out to analyze potential moderators. Publication bias and quality of reporting were also analyzed. 13 studies met our inclusion criteria, providing a total sample of 1874 patients with PTSD and 7785 controls in the biallelic meta-analyses and 627 and 3524, respectively, in the triallelic. None of the meta-analyses showed evidence of an association between 5-HTTLPR and PTSD but several characteristics (exposure to the same principal stressor for PTSD cases and controls, adjustment for potential confounding variables, blind assessment, study design, type of PTSD, ethnic distribution and Total Quality Score) influenced the results in subgroup analyses and meta-regression. There was no evidence of potential publication bias. Current evidence does not support a direct effect of 5-HTTLPR polymorphisms on PTSD. Further analyses of gene-environment interactions, epigenetic modulation and new studies with large samples
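
The random-effects pooling described in the abstract (pooled OR with 95% CI, Cochran's Q and the I(2) index) can be sketched as follows, here using the DerSimonian-Laird estimator; the study data below are invented for illustration and are not taken from the meta-analysis.

```python
import math

def random_effects_pool(or_list, ci_list):
    """DerSimonian-Laird random-effects pooling of odds ratios.

    or_list: per-study odds ratios; ci_list: (lower, upper) 95% CIs.
    Within-study standard errors are back-calculated from the CIs.
    """
    y = [math.log(o) for o in or_list]                       # log-ORs
    se = [(math.log(hi) - math.log(lo)) / (2 * 1.96) for lo, hi in ci_list]
    w = [1 / s ** 2 for s in se]                             # inverse-variance weights
    # Fixed-effect estimate, needed for Cochran's Q
    y_fixed = sum(wi * yi for wi, yi in zip(w, y)) / sum(w)
    q = sum(wi * (yi - y_fixed) ** 2 for wi, yi in zip(w, y))
    df = len(y) - 1
    i2 = max(0.0, (q - df) / q * 100) if q > 0 else 0.0      # I^2 heterogeneity index
    # Between-study variance tau^2 (DerSimonian-Laird)
    c = sum(w) - sum(wi ** 2 for wi in w) / sum(w)
    tau2 = max(0.0, (q - df) / c)
    w_star = [1 / (s ** 2 + tau2) for s in se]
    y_re = sum(wi * yi for wi, yi in zip(w_star, y)) / sum(w_star)
    se_re = math.sqrt(1 / sum(w_star))
    pooled_or = math.exp(y_re)
    ci = (math.exp(y_re - 1.96 * se_re), math.exp(y_re + 1.96 * se_re))
    return pooled_or, ci, q, i2

# Hypothetical studies: ORs with their 95% CIs
pooled, ci95, q, i2 = random_effects_pool(
    [1.20, 0.85, 1.05], [(0.90, 1.60), (0.60, 1.21), (0.70, 1.58)]
)
print(f"pooled OR={pooled:.2f}, 95% CI=({ci95[0]:.2f}, {ci95[1]:.2f}), "
      f"Q={q:.2f}, I2={i2:.1f}%")
```

A pooled CI that straddles 1.0, as with these toy inputs, corresponds to the abstract's "no evidence of an association" conclusion.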

  3. Mac protocols for wireless sensor network (wsn): a comparative study

    International Nuclear Information System (INIS)

    Arshad, J.; Akram, Q.; Saleem, Y.

    2014-01-01

    Data communication between nodes is carried out under a Medium Access Control (MAC) protocol, which is defined at the data link layer. MAC protocols are responsible for communication and coordination between nodes according to the standards defined for WSNs (Wireless Sensor Networks). The design of a MAC protocol should also address the issues of energy efficiency and transmission efficiency. A number of MAC protocols for WSNs have been proposed in the literature. In this paper, nine MAC protocols for WSN, namely S-MAC, T-MAC, Wise-MAC, Mu-MAC, Z-MAC, A-MAC, D-MAC, B-MAC and B-MAC+, are explored, studied and analyzed. These nine protocols are classified into contention-based and hybrid (a combination of contention-based and schedule-based) MAC protocols. The goal of this comparative study is to provide a basis for MAC protocols and to highlight the different mechanisms used, with respect to parameters for the evaluation of energy and transmission efficiency in WSN. This study also aims to give the reader a better understanding of the concepts, processes and flow of information used in these MAC protocols for WSN. A comparison with respect to energy reservation scheme, idle-listening avoidance, latency, fairness, data synchronization, and throughput maximization is presented. The analysis shows that contention-based MAC protocols are less energy efficient than hybrid MAC protocols. From the analysis of contention-based MAC protocols in terms of energy consumption, it was observed that protocols based on preamble sampling consume less energy than protocols based on a static or dynamic sleep schedule. (author)

  4. Comparison of Different Dosing Protocols of Anti-Snake Venom (ASV) in Snake Bite Cases.

    Science.gov (United States)

    Daswani, B R; Chandanwale, A S; Kadam, D B; Ghongane, B B; Ghorpade, V S; Manu, H C

    2017-09-01

    Considering the cost of Anti-Snake Venom (ASV) and irregularity in its supply, there is often a need to curtail doses of ASV, despite guidelines for the management of snake bite. During June 2013 to September 2013, when ASV was in short supply, our institutional committee reviewed the overall hospital statistics of snake bite cases as well as the scientific literature and formulated a working modified protocol that used a low dose of ASV in snake bite cases. To retrospectively analyse and compare the modified ASV protocol versus the conventional ASV protocol with respect to outcome, number of ASV vials required, duration of stay in the hospital/ICU, and additional supportive interventions needed. This was a retrospective study conducted at a tertiary care teaching hospital, Maharashtra, India. Hospital records of inpatients admitted for snake bite during June 2013 to September 2013 (since the introduction of the modified protocol) as well as during June 2012 to September 2012 (when patients received the conventional protocol; historical controls) were retrospectively analysed to assess the number of ASV vials received by the patients during the stay, need for supportive therapy, duration of stay and outcome of the patients. There was a significant reduction in the average number of ASV vials per patient required under the modified protocol compared with the historical controls (10.74±0.95 vs 28.17±2.75), and the cost of management of each patient was reduced by approximately 11,974.41 INR per treated patient, based on the requirement of ASV. The modified ASV protocol used in this study is more cost effective than the conventional protocol, deserves prospective evaluation and may be followed at least during times of scarcity of ASV.

  5. Two mini-preparation protocols to DNA extraction from plants with ...

    African Journals Online (AJOL)

    AJB SERVER

    2006-10-16

    Oct 16, 2006 ... samples to process and it is also a non expensive protocol. This method also ... because many of those chemicals inhibit PCR reactions. (Pandey et al., 1996) ... Spin at 15,000 rpm for 15 min and wash the DNA pellet with 1.2 ml ... Protocol: To 200 mg frozen and ground tissue plant material, add 900 µl of.

  6. Early Versus Delayed Motion After Rotator Cuff Repair: A Systematic Review of Overlapping Meta-analyses.

    Science.gov (United States)

    Houck, Darby A; Kraeutler, Matthew J; Schuette, Hayden B; McCarty, Eric C; Bravman, Jonathan T

    2017-10-01

    Previous meta-analyses have been conducted to compare outcomes of early versus delayed motion after rotator cuff repair. To conduct a systematic review of overlapping meta-analyses comparing early versus delayed motion rehabilitation protocols after rotator cuff repair to determine which meta-analyses provide the best available evidence. Systematic review. A systematic review was performed by searching PubMed and Cochrane Library databases. Search terms included "rotator cuff repair," "early passive motion," "immobilization," "rehabilitation protocol," and "meta-analysis." Results were reviewed to determine study eligibility. Patient outcomes and structural healing were extracted from these meta-analyses. Meta-analysis quality was assessed using the Oxman-Guyatt and Quality of Reporting of Meta-analyses (QUOROM) systems. The Jadad decision algorithm was then used to determine which meta-analyses provided the best level of evidence. Seven meta-analyses containing a total of 5896 patients met the eligibility criteria (1 Level I evidence, 4 Level II evidence, 2 Level III evidence). None of these meta-analyses found immobilization to be superior to early motion; however, most studies suggested that early motion would increase range of motion (ROM), thereby reducing time of recovery. Three of these studies suggested that tear size contributed to the choice of rehabilitation to ensure proper healing of the shoulder. A study by Chan et al. in 2014 received the highest QUOROM and Oxman-Guyatt scores, and therefore this meta-analysis appeared to have the highest level of evidence. Additionally, a study by Riboh and Garrigues in 2014 was selected as the highest quality study in this systematic review according to the Jadad decision algorithm. The current, best available evidence suggests that early motion improves ROM after rotator cuff repair but increases the risk of rotator cuff retear. Lower-quality meta-analyses indicate that tear size may provide a better strategy in

  7. Chiral analyses of dextromethorphan/levomethorphan and their metabolites in rat and human samples using LC-MS/MS.

    Science.gov (United States)

    Kikura-Hanajiri, Ruri; Kawamura, Maiko; Miyajima, Atsuko; Sunouchi, Momoko; Goda, Yukihiro

    2011-04-01

    In order to develop an analytical method for the discrimination of dextromethorphan (an antitussive medicine) from its enantiomer, levomethorphan (a narcotic), in biological samples, chiral analyses of these drugs and their O-demethyl and/or N-demethyl metabolites in rat plasma, urine, and hair were carried out using LC-MS/MS. After the i.p. administration of dextromethorphan or levomethorphan to pigmented hairy male DA rats (5 mg/kg/day, 10 days), the parent compounds and their three metabolites in plasma, urine and hair were determined using LC-MS/MS. Complete chiral separation was achieved in 12 min on a Chiral CD-Ph column in 0.1% formic acid-acetonitrile by a linear gradient program. Most of the metabolites were detected as the corresponding O-demethyl and N,O-didemethyl metabolites in the rat plasma and urine after the hydrolysis of O-glucuronides, although obvious differences in the amounts of these metabolites were found between the dextro and levo forms. No racemization was observed through O- and/or N-demethylation. In the rat hair samples collected 4 weeks after the first administration, those differences were more clearly detected, and the concentrations of the parent compounds, their O-demethyl, N-demethyl, and N,O-didemethyl metabolites were 63.4, 2.7, 25.1, and 0.7 ng/mg for the dextro forms and 24.5, 24.6, 2.6, and 0.5 ng/mg for the levo forms, respectively. To further investigate the differences in metabolic properties between dextromethorphan and levomethorphan, DA rat and human liver microsomes were studied. The results suggested that there might be an enantioselective metabolism of levomethorphan, especially with regard to the O-demethylation, not only in DA rat but also in human liver microsomes. The proposed chiral analyses might be applied to human samples and could be useful for discriminating dextromethorphan use from levomethorphan use in the field of forensic toxicology, although further studies should be carried out

  8. Simple DNA extraction of urine samples: Effects of storage temperature and storage time.

    Science.gov (United States)

    Ng, Huey Hian; Ang, Hwee Chen; Hoe, See Ying; Lim, Mae-Lynn; Tai, Hua Eng; Soh, Richard Choon Hock; Syn, Christopher Kiu-Choong

    2018-06-01

    Urine samples are commonly analysed in cases with suspected illicit drug consumption. In events of alleged sample mishandling, urine sample source identification may be necessary. A simple DNA extraction procedure suitable for STR typing of urine samples was established on the Promega Maxwell® 16 paramagnetic silica bead platform. A small sample volume of 1.7 mL was used. Samples were stored at room temperature, 4°C and -20°C for 100 days to investigate the influence of storage temperature and time on extracted DNA quantity and success rate of STR typing. Samples stored at room temperature exhibited a faster decline in DNA yield with time and lower typing success rates as compared to those at 4°C and -20°C. This trend can likely be attributed to DNA degradation. In conclusion, this study presents a quick and effective DNA extraction protocol from a small urine volume stored for up to 100 days at 4°C and -20°C. Copyright © 2018 Elsevier B.V. All rights reserved.

  9. The use of chemical analyses of borehole samples in sub-surface mapping: an example of the Delmas-Bapsfontein area is given

    International Nuclear Information System (INIS)

    Rudman, R.A.

    1989-01-01

    The East Rand Dolomites around Delmas-Bapsfontein underlie an area of about 1000 sq. km. but do not outcrop. An intensive ground water exploration programme had been carried out, and the percussion samples obtained were analysed in an attempt to differentiate the sub-surface geology based on the chemistry of the samples. The gross chemistry of the various rock types has been well defined, and various computer-aided graphical methods were used to highlight changes in the chemistry. Samples were analysed by means of X-ray fluorescence. The chert-rich dolomite formations near surface have been leached to the extent that all of the carbonate minerals have been removed, leaving a chert residuum commonly 80 m thick. The carbonates in this area can be regarded as 'pure' dolomites. There are, however, two discretely different CaO/MgO ratios present in the study area. Intrusives with up to 16% Na2O are noted. The effect of de-dolomitization at the contacts of the intrusives is clearly illustrated. 4 refs., 7 figs., 1 tab

  10. Defining robustness protocols: a method to include and evaluate robustness in clinical plans

    International Nuclear Information System (INIS)

    McGowan, S E; Albertini, F; Lomax, A J; Thomas, S J

    2015-01-01

    We aim to define a site-specific robustness protocol to be used during the clinical plan evaluation process. The plan robustness of 16 skull base IMPT plans to systematic range and random set-up errors has been retrospectively and systematically analysed. This was determined by calculating the error-bar dose distribution (ebDD) for all the plans and by defining metrics used to define protocols aiding the plan assessment. Additionally, an example of how to clinically use the defined robustness database is given, whereby a plan with sub-optimal brainstem robustness was identified. The advantage of using different beam arrangements to improve the plan robustness was analysed. Using the ebDD it was found that range errors had a smaller effect on dose distribution than the corresponding set-up error in a single fraction, and that organs at risk were most robust to the range errors, whereas the target was more robust to set-up errors. A database was created to aid planners in terms of plan robustness aims in these volumes. This resulted in the definition of site-specific robustness protocols. The use of robustness constraints allowed for the identification of a specific patient that may have benefited from a treatment of greater individuality. A new beam arrangement was shown to be preferable when balancing conformality and robustness for this case. The ebDD and error-bar volume histogram proved effective in analysing plan robustness. The process of retrospective analysis could be used to establish site-specific robustness planning protocols in proton therapy. These protocols allow the planner to identify plans that, although delivering a dosimetrically adequate dose distribution, have sub-optimal robustness to these uncertainties. For these cases the use of different beam start conditions may improve the plan robustness to set-up and range uncertainties. (paper)

  11. Defining robustness protocols: a method to include and evaluate robustness in clinical plans

    Science.gov (United States)

    McGowan, S. E.; Albertini, F.; Thomas, S. J.; Lomax, A. J.

    2015-04-01

    We aim to define a site-specific robustness protocol to be used during the clinical plan evaluation process. The plan robustness of 16 skull base IMPT plans to systematic range and random set-up errors has been retrospectively and systematically analysed. This was determined by calculating the error-bar dose distribution (ebDD) for all the plans and by defining metrics used to define protocols aiding the plan assessment. Additionally, an example of how to clinically use the defined robustness database is given, whereby a plan with sub-optimal brainstem robustness was identified. The advantage of using different beam arrangements to improve the plan robustness was analysed. Using the ebDD it was found that range errors had a smaller effect on dose distribution than the corresponding set-up error in a single fraction, and that organs at risk were most robust to the range errors, whereas the target was more robust to set-up errors. A database was created to aid planners in terms of plan robustness aims in these volumes. This resulted in the definition of site-specific robustness protocols. The use of robustness constraints allowed for the identification of a specific patient that may have benefited from a treatment of greater individuality. A new beam arrangement was shown to be preferable when balancing conformality and robustness for this case. The ebDD and error-bar volume histogram proved effective in analysing plan robustness. The process of retrospective analysis could be used to establish site-specific robustness planning protocols in proton therapy. These protocols allow the planner to identify plans that, although delivering a dosimetrically adequate dose distribution, have sub-optimal robustness to these uncertainties. For these cases the use of different beam start conditions may improve the plan robustness to set-up and range uncertainties.

  12. Statistical analyses to support guidelines for marine avian sampling. Final report

    Science.gov (United States)

    Kinlan, Brian P.; Zipkin, Elise; O'Connell, Allan F.; Caldow, Chris

    2012-01-01

    distribution to describe counts of a given species in a particular region and season. 4. Using a large database of historical at-sea seabird survey data, we applied this technique to identify appropriate statistical distributions for modeling a variety of species, allowing the distribution to vary by season. For each species and season, we used the selected distribution to calculate and map retrospective statistical power to detect hotspots and coldspots, and map p-values from Monte Carlo significance tests of hotspots and coldspots, in discrete lease blocks designated by the U.S. Department of the Interior, Bureau of Ocean Energy Management (BOEM). 5. Because our definition of hotspots and coldspots does not explicitly include variability over time, we examine the relationship between the temporal scale of sampling and the proportion of variance captured in time series of key environmental correlates of marine bird abundance, as well as available marine bird abundance time series, and use these analyses to develop recommendations for the temporal distribution of sampling to adequately represent both short-term and long-term variability. We conclude by presenting a schematic “decision tree” showing how this power analysis approach would fit in a general framework for avian survey design, and discuss implications of model assumptions and results. We discuss avenues for future development of this work, and recommendations for practical implementation in the context of siting and wildlife assessment for offshore renewable energy development projects.
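
The Monte Carlo significance testing of hotspots mentioned above can be illustrated with a simplified resampling sketch: compare a lease block's mean count against means of same-sized samples drawn from the region-wide counts. The real analysis fits a parametric count distribution (e.g. negative binomial) per species and season; the counts below are invented.

```python
import random

def hotspot_p_value(block_counts, all_counts, n_sim=10000, seed=1):
    """One-sided Monte Carlo test: is this block's mean count
    unusually high relative to resamples from the whole region?"""
    rng = random.Random(seed)
    observed = sum(block_counts) / len(block_counts)
    exceed = 0
    for _ in range(n_sim):
        # Draw a same-sized sample of counts from the region
        sample = [rng.choice(all_counts) for _ in block_counts]
        if sum(sample) / len(sample) >= observed:
            exceed += 1
    return (exceed + 1) / (n_sim + 1)  # add-one keeps p > 0

# Hypothetical survey counts across the region, plus one lease block
region = [0, 0, 1, 0, 2, 0, 3, 1, 0, 0, 5, 0, 1, 0, 0, 2, 0, 0, 1, 0]
block = [4, 6, 3, 5]  # unusually high counts in one lease block
p = hotspot_p_value(block, region)
print(f"one-sided hotspot p-value: {p:.4f}")
```

A small p-value flags the block as a candidate hotspot; the same machinery with the inequality reversed tests for coldspots.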

  13. Differences in reporting of analyses in internal company documents versus published trial reports: comparisons in industry-sponsored trials in off-label uses of gabapentin.

    Directory of Open Access Journals (Sweden)

    S Swaroop Vedula

    BACKGROUND: Details about the type of analysis (e.g., intent to treat [ITT]) and definitions (i.e., criteria for including participants in the analysis) are necessary for interpreting a clinical trial's findings. Our objective was to compare the description of types of analyses and criteria for including participants in the publication (i.e., what was reported) with descriptions in the corresponding internal company documents (i.e., what was planned and what was done). Trials were for off-label uses of gabapentin sponsored by Pfizer and Parke-Davis, and documents were obtained through litigation. METHODS AND FINDINGS: For each trial, we compared internal company documents (protocols, statistical analysis plans, and research reports, all unpublished) with publications. One author extracted data and another verified, with a third person verifying discordant items and a sample of the rest. Extracted data included the number of participants randomized and analyzed for efficacy, and types of analyses for efficacy and safety and their definitions (i.e., criteria for including participants in each type of analysis). We identified 21 trials, 11 of which were published randomized controlled trials, that provided the documents needed for planned comparisons. For three trials, there was disagreement on the number of randomized participants between the research report and publication. Seven types of efficacy analyses were described in the protocols, statistical analysis plans, and publications, including ITT and six others. The protocol or publication described ITT using six different definitions, resulting in frequent disagreements between the two documents (i.e., different numbers of participants were included in the analyses). CONCLUSIONS: Descriptions of analyses conducted did not agree between internal company documents and what was publicly reported. Internal company documents provide extensive documentation of methods planned and used, and trial

  14. National protocol framework for the inventory and monitoring of bees

    Science.gov (United States)

    Droege, Sam; Engler, Joseph D.; Sellers, Elizabeth A.; O'Brien, Lee

    2016-01-01

    This national protocol framework is a standardized tool for the inventory and monitoring of the approximately 4,200 species of native and non-native bee species that may be found within the National Wildlife Refuge System (NWRS) administered by the U.S. Fish and Wildlife Service (USFWS). However, this protocol framework may also be used by other organizations and individuals to monitor bees in any given habitat or location. Our goal is to provide USFWS stations within the NWRS (NWRS stations are land units managed by the USFWS such as national wildlife refuges, national fish hatcheries, wetland management districts, conservation areas, leased lands, etc.) with techniques for developing an initial baseline inventory of what bee species are present on their lands and to provide an inexpensive, simple technique for monitoring bees continuously and for monitoring and evaluating long-term population trends and management impacts. The latter long-term monitoring technique requires a minimal time burden for the individual station, yet can provide a good statistical sample of changing populations that can be investigated at the station, regional, and national levels within the USFWS’ jurisdiction, and compared to other sites within the United States and Canada. This protocol framework was developed in cooperation with the United States Geological Survey (USGS), the USFWS, and a worldwide network of bee researchers who have investigated the techniques and methods for capturing bees and tracking population changes. The protocol framework evolved from field and lab-based investigations at the USGS Bee Inventory and Monitoring Laboratory at the Patuxent Wildlife Research Center in Beltsville, Maryland starting in 2002 and was refined by a large number of USFWS, academic, and state groups. It includes a Protocol Introduction and a set of 8 Standard Operating Procedures or SOPs and adheres to national standards of protocol content and organization. The Protocol Narrative

  15. The determination of radionuclides in grass ecosystem samples

    International Nuclear Information System (INIS)

    LaBrecque, J.J.; Schelenz, R.; Perkins, R.W.

    1987-07-01

    The radioactive debris cloud from the Chernobyl reactor accident resulted in some deposition over essentially all of the Northern Hemisphere. Shortly after the accident invitations were sent out by the IAEA to Member States to collect grass samples according to specific instructions so that the ratio of the various radionuclides in the fallout debris could be established over a wide area of Europe. In response to this request, 20 grass samples were provided by Member States. To establish a protocol for analysis of these valuable samples and to recommend a protocol for future sample collection, a Consultants Meeting was called by the IAEA for 23-25 September 1986. This document contains the considerations and recommendations of the consultants

  16. Determination of hydrazine in drinking water: Development and multivariate optimization of a rapid and simple solid phase microextraction-gas chromatography-triple quadrupole mass spectrometry protocol.

    Science.gov (United States)

    Gionfriddo, Emanuela; Naccarato, Attilio; Sindona, Giovanni; Tagarelli, Antonio

    2014-07-04

    In this work, the capabilities of solid phase microextraction were exploited in a fully optimized SPME-GC-QqQ-MS analytical approach for hydrazine assay. A rapid and easy method was obtained by a simple derivatization reaction with propyl chloroformate and pyridine carried out directly in water samples, followed by automated SPME analysis in the same vial without further sample handling. The affinity of the different derivatized compounds obtained towards five commercially available SPME coatings was evaluated in order to achieve the best extraction efficiency. GC analyses were carried out using a GC-QqQ-MS instrument in selected reaction monitoring (SRM) acquisition mode, which allowed high specificity to be achieved by selecting appropriate precursor-product ion couples, improving analyte identification. The multivariate approach of experimental design was crucial in order to optimize the derivatization reaction, the SPME process and the tandem mass spectrometry parameters. Accuracy of the proposed protocol, tested at 60, 200 and 800 ng L(-1), provided satisfactory values (114.2%, 83.6% and 98.6%, respectively), whereas precision (RSD%) at the same concentration levels was 10.9%, 7.9% and 7.7%, respectively. Limits of detection and quantification of 4.4 and 8.3 ng L(-1), respectively, were obtained. The reliable application of the proposed protocol to real drinking water samples confirmed its capability to be used as an analytical tool for routine analyses. Copyright © 2014 Elsevier B.V. All rights reserved.

  17. Medium-term cryopreservation of rabies virus samples

    Directory of Open Access Journals (Sweden)

    Tereza D'avila de Freitas Aguiar

    2013-12-01

    Introduction: The cryopreservation of rabies virus has been described in detail in the literature. To date, little information is available on the use of cryoprotective agents for cold preservation of this virus, and the available data focus only on short-term virus preservation. In this study, we investigated the medium-term cryopreservation of samples of rabies virus using different cryopreservation protocols. Methods: The cryopreservation protocols for the rabies virus samples were performed at -20°C and were divided according to the variables of time and cryoprotectant type used. The laboratory tests (intracerebral inoculation of mice, viral titration and direct immunofluorescence) were performed at regular intervals (360 and 720 days) to assess the viability of the viral samples according to the different preservation techniques used. Results: After 1 year of cryopreservation, the fluorescence intensity of intracellular corpuscles of the rabies virus and the median survival time of the mice differed between the positive controls and the treatments with the cryoprotectants. After 2 years, most of the samples subjected to the cryopreservation protocols (including the controls) did not produce fluorescence. However, the virus samples exposed to the cryoprotectant sucrose (68% solution) responded positively in the direct immunofluorescence assay and in the intracerebral inoculation of the mice. Conclusions: Medium-term cryopreservation of the rabies virus inactivates the viral sample. However, the cryoprotectant agent sucrose (68%) produces a preservative effect in cryopreserved rabies virus samples.

  18. From protocol to published report

    DEFF Research Database (Denmark)

    Berendt, Louise; Callréus, Torbjörn; Petersen, Lene Grejs

    2016-01-01

    and published reports of academic clinical drug trials. METHODS: A comparison was made between study protocols and their corresponding published reports. We assessed the overall consistency, which was defined as the absence of discrepancy regarding study type (categorized as either exploratory or confirmatory...... in 1999, 2001, and 2003, 95 of which fulfilled the eligibility criteria and had at least one corresponding published report reporting data on trial subjects. Overall consistency was observed in 39% of the trials (95% CI: 29 to 49%). Randomized controlled trials (RCTs) constituted 72% (95% CI: 63 to 81......%) of the sample, and 87% (95% CI: 80 to 94%) of the trials were hospital based. CONCLUSIONS: Overall consistency between protocols and their corresponding published reports was low. Motivators for the inconsistencies are unknown but do not seem restricted to economic incentives....

  19. Relationships of Functional Tests Following ACL Reconstruction: Exploratory Factor Analyses of the Lower Extremity Assessment Protocol.

    Science.gov (United States)

    DiFabio, Melissa; Slater, Lindsay V; Norte, Grant; Goetschius, John; Hart, Joseph M; Hertel, Jay

    2018-03-01

    After ACL reconstruction (ACLR), deficits are often assessed using a variety of functional tests, which can be time consuming. It is unknown whether these tests provide redundant or unique information. To explore relationships between components of a battery of functional tests, the Lower Extremity Assessment Protocol (LEAP) was created to aid in developing the most informative, concise battery of tests for evaluating ACLR patients. Descriptive, cross-sectional. Laboratory. 76 ACLR patients (6.86±3.07 months postoperative) and 54 healthy participants. Isokinetic knee flexion and extension at 90 and 180 degrees/second, maximal voluntary isometric contraction for knee extension and flexion, single leg balance, 4 hopping tasks (single, triple, crossover, and 6-meter timed hop), and a bilateral drop vertical jump that was scored with the Landing Error Scoring System (LESS). Peak torque, average torque, average power, total work, fatigue indices, center of pressure area and velocity, hop distance and time, and LESS score. A series of factor analyses were conducted to assess grouping of functional tests on the LEAP for each limb in the ACLR and healthy groups and limb symmetry indices (LSI) for both groups. Correlations were run between measures that loaded on retained factors. Isokinetic and isometric strength tests for knee flexion and extension, hopping, balance, and fatigue index were identified as unique factors for all limbs. The LESS score loaded with various factors across the different limbs. The healthy group LSI analysis produced more factors than the ACLR LSI analysis. Individual measures within each factor had moderate to strong correlations. Isokinetic and isometric strength, hopping, balance, and fatigue index provided unique information. Within each category of measures, not all tests may need to be included for a comprehensive functional assessment of ACLR patients due to the high amount of shared variance between them.
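Limb symmetry indices (LSI) compare the involved limb with the uninvolved limb. A common convention, assumed here since the abstract does not define the formula, expresses the involved-limb score as a percentage of the uninvolved limb's score:

```python
def limb_symmetry_index(involved, uninvolved):
    """LSI as a percentage of the uninvolved limb's score.

    Convention assumed (not taken from the LEAP protocol itself);
    90% is an often-cited return-to-sport threshold.
    """
    if uninvolved <= 0:
        raise ValueError("uninvolved-limb score must be positive")
    return 100.0 * involved / uninvolved

# e.g., single-leg hop distances of 135 cm vs 150 cm (hypothetical values)
lsi = limb_symmetry_index(135.0, 150.0)
```

The same ratio applies to any of the battery's measures (peak torque, hop distance, etc.), which is why LSI values could be factor-analyzed alongside the raw limb scores.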

  20. Protocol Implementation Generator

    DEFF Research Database (Denmark)

    Carvalho Quaresma, Jose Nuno; Probst, Christian W.

    2010-01-01

    Users expect communication systems to guarantee, amongst others, privacy and integrity of their data. These can be ensured by using well-established protocols; the best protocol, however, is useless if not all parties involved in a communication have a correct implementation of the protocol and a...... necessary tools. In this paper, we present the Protocol Implementation Generator (PiG), a framework that can be used to add protocol generation to protocol negotiation, or to easily share and implement new protocols throughout a network. PiG enables the sharing, verification, and translation...... Generator framework based on the LySatool and a translator from the LySa language into C or Java.

  1. Perspectives on land snails - sampling strategies for isotopic analyses

    Science.gov (United States)

    Kwiecien, Ola; Kalinowski, Annika; Kamp, Jessica; Pellmann, Anna

    2017-04-01

    Since the seminal works of Goodfriend (1992), several substantial studies confirmed a relation between the isotopic composition of land snail shells (d18O, d13C) and environmental parameters like precipitation amount, moisture source, temperature and vegetation type. This relation, however, is not straightforward and is site dependent. The choice of sampling strategy (discrete or bulk sampling) and cleaning procedure (several methods can be used, but a comparison of their effects on an individual shell has not yet been achieved) further complicate the shell analysis. The advantage of using snail shells as an environmental archive lies in the snails' limited mobility, and therefore an intrinsic aptitude for recording local and site-specific conditions. Also, snail shells are often found at dated archaeological sites. An obvious drawback is that shell assemblages rarely make up a continuous record, and a single shell is only a snapshot of the environmental setting at a given time. Shells from archaeological sites might represent a dietary component, and cooking would presumably alter the isotopic signature of the aragonite material. Consequently, a proper sampling strategy is of great importance and should be adjusted to the scientific question. Here, we compare and contrast different sampling approaches using modern shells collected in Morocco, Spain and Germany. The bulk shell approach (fine-ground material) yields information on mean environmental parameters within the life span of the analyzed individuals. However, despite homogenization, replicate measurements of bulk shell material returned results with a variability greater than analytical precision (up to 2‰ for d18O, and up to 1‰ for d13C), calling for caution when analyzing only single individuals. Horizontal high-resolution sampling (single drill holes along growth lines) provides insights into the amplitude of seasonal variability, while vertical high-resolution sampling (multiple drill holes along the same growth line

  2. Optimizing Urine Processing Protocols for Protein and Metabolite Detection.

    Science.gov (United States)

    Siddiqui, Nazema Y; DuBois, Laura G; St John-Williams, Lisa; Will, Thompson J; Grenier, Carole; Burke, Emily; Fraser, Matthew O; Amundsen, Cindy L; Murphy, Susan K

    In urine, factors such as timing of voids and duration at room temperature (RT) may affect the quality of recovered protein and metabolite data. Additives may aid with detection, but can add more complexity in sample collection or analysis. We aimed to identify the optimal urine processing protocol for clinically-obtained urine samples that allows for the highest protein and metabolite yields with minimal degradation. Healthy women provided multiple urine samples during the same day. Women collected their first morning (1st AM) void and another "random void". Random voids were aliquoted with: 1) no additive; 2) boric acid (BA); 3) protease inhibitor (PI); or 4) both BA + PI. Of these aliquots, some were immediately stored at 4°C, and some were left at RT for 4 hours. Proteins and individual metabolites were quantified, normalized to creatinine concentrations, and compared across processing conditions. Sample pools corresponding to each processing condition were analyzed using mass spectrometry to assess protein degradation. Ten Caucasian women between 35-65 years of age provided paired 1st morning and random voided urine samples. Normalized protein concentrations were slightly higher in 1st AM voids compared to random "spot" voids. The addition of BA did not significantly change proteins, while PI significantly improved normalized protein concentrations, regardless of whether samples were immediately cooled or left at RT for 4 hours. In pooled samples, there were minimal differences in protein degradation under the various conditions we tested. In metabolite analyses, there were significant differences in individual amino acids based on the timing of the void. For comparative translational research using urine, information about void timing should be collected and standardized. For urine samples processed in the same day, BA does not appear to be necessary, while the addition of PI enhances protein yields, regardless of 4°C or RT storage temperature.
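Normalizing urinary analytes to creatinine, as done above for the protein concentrations, corrects for urine dilution with a simple ratio. A minimal sketch with hypothetical units and values:

```python
def creatinine_normalized(analyte_conc, creatinine_conc):
    """Express a urinary analyte relative to creatinine to correct for
    urine dilution. Units are hypothetical; any consistent pair works,
    e.g. mg/dL over mg/dL gives a dimensionless ratio."""
    if creatinine_conc <= 0:
        raise ValueError("creatinine concentration must be positive")
    return analyte_conc / creatinine_conc

# e.g., 10 mg/dL protein in a void with 100 mg/dL creatinine
ratio = creatinine_normalized(10.0, 100.0)
```

The ratio makes a dilute random void comparable to a concentrated first-morning void, which is what allows the comparison across processing conditions described above.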

  3. Multiple-specimen absolute paleointensity determination with the MSP-DSC protocol: Advantages and drawbacks.

    Science.gov (United States)

    Camps, P.; Fanjat, G.; Poidras, T.; Carvallo, C.; Nicol, P.

    2012-04-01

    The MSP-DSC protocol (Dekkers & Bohnel, 2006, EPSL; Fabian & Leonhardt, 2010, EPSL) is a recent development in the methodology for documenting the intensity of the ancient Earth magnetic field. Applicable to both rocks and archaeological artifacts, it allows us to use samples that until now were not measured because their magnetic properties do not meet the selection criteria required by conventional methods. However, this new experimental protocol requires that samples be heated and cooled under a field parallel to their natural remanent magnetization (NRM). Currently, standard paleointensity furnaces do not match this constraint precisely. Yet, such a new measurement protocol seems very promising, since it could possibly double the number of available data. We are developing in Montpellier (France) a very fast infrared heating oven dedicated to this protocol. Two key points determine its characteristics. The first is to heat a rock sample of a standard 10-cc volume uniformly and as fast as possible. The second is to apply to the sample, during heating and cooling, a precise magnetic induction field, perfectly controlled in 3D. We tested and calibrated a preliminary version of this oven along with the MSP-DSC protocol on 3 historical lava flows, 2 from Reunion Island (erupted in 2002 and 2007) and one from Etna (erupted in 1983). These lava flows were selected because they have different magnetic behaviors: Reunion 2002 is rather SD-PSD-like, while Reunion 2007 is PSD-MD-like, and Etna 1983 is MD-like. The paleointensity determinations obtained with the original protocol of Dekkers and Bohnel (2006, EPSL) are within ±1 μT of the known field for the three lava flows. The same precision is obtained when we applied the fraction correction (MSP-FC protocol). However, we systematically observed a loss in the linearity of the MSP-FC plots. In addition, like Muxworthy and Taylor (2011, GJI), we found that the Domain State Correction is difficult to apply since alpha

  4. Preston and Park-Sanders protocols adapted for semi-quantitative isolation of thermotolerant Campylobacter from chicken rinse

    DEFF Research Database (Denmark)

    Josefsen, Mathilde Hartmann; Lübeck, Peter Stephensen; Aalbaek, B.

    2003-01-01

    Human campylobacteriosis has become the major cause of foodborne gastrointestinal diseases in several European countries. In order to implement effective control measures in the primary production, and as a tool in risk assessment studies, it is necessary to have sensitive and quantitative...... detection methods. Thus, semi-quantitative detection of thermophilic Campylobacter spp. in 20 naturally contaminated chicken rinse samples was carried out using the two most common standard protocols: Preston and Park-Sanders, as proposed by Nordic Committee on Food Analysis (NMKL) and International...... Standard Organization (ISO), respectively. For both protocols, the chicken rinse samples were prepared in 500 ml buffered peptone water, as recommended in the ISO protocol no. 6887-2. The results indicated that the Preston protocol was superior to the Park-Sanders protocol in supporting growth...

  5. Dynamic Anthropometry – Defining Protocols for Automatic Body Measurement

    Directory of Open Access Journals (Sweden)

    Slavenka Petrak

    2017-12-01

    Full Text Available The paper presents the research on possibilities of protocol development for automatic computer-based determination of measurements on a 3D body model in defined dynamic positions. Initially, two dynamic body positions were defined for the research on dimensional changes of targeted body lengths and surface segments during body movement from basic static position into a selected dynamic body position. The assumption was that during body movement, specific length and surface dimensions would change significantly from the aspect of clothing construction and functionality of a garment model. 3D body scanning of a female test sample was performed in basic static and two defined dynamic positions. 3D body models were processed and measurement points were defined as a starting point for the determination of characteristic body measurements. The protocol for automatic computer measurement was defined for every dynamic body position by the systematic set of activities based on determined measurement points. The verification of developed protocols was performed by automatic determination of defined measurements on the test sample and by comparing the results with the conventional manual measurement.

  6. Analysis of Security Protocols in Embedded Systems

    DEFF Research Database (Denmark)

    Bruni, Alessandro

    Embedded real-time systems have been adopted in a wide range of safety-critical applications—including automotive, avionics, and train control systems—where the focus has long been on safety (i.e., protecting the external world from the potential damage caused by the system) rather than security (i.e., protecting the system from the external world). With increased connectivity of these systems to external networks, the attack surface has grown, and consequently there is a need for securing the system from external attacks. Introducing security protocols in safety critical systems requires careful...... in this direction is to extend saturation-based techniques so that enough state information can be modelled and analysed. Finally, we present a methodology for proving the same security properties in the computational model, by means of typing protocol implementations....

  7. Three-Stage Quantum Cryptography Protocol under Collective-Rotation Noise

    Directory of Open Access Journals (Sweden)

    Linsen Wu

    2015-05-01

    Full Text Available Information security is increasingly important as society migrates to the information age. Classical cryptography widely used nowadays is based on computational complexity, which means that it assumes that solving some particular mathematical problems is hard on a classical computer. With the development of supercomputers and, potentially, quantum computers, classical cryptography has more and more potential risks. Quantum cryptography provides a solution which is based on the Heisenberg uncertainty principle and no-cloning theorem. While BB84-based quantum protocols are only secure when a single photon is used in communication, the three-stage quantum protocol is multi-photon tolerant. However, existing analyses assume perfect noiseless channels. In this paper, a multi-photon analysis is performed for the three-stage quantum protocol under the collective-rotation noise model. The analysis provides insights into the impact of the noise level on a three-stage quantum cryptography system.
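The multi-photon tolerance of the three-stage protocol rests on commuting rotation transformations: Alice and Bob each apply, and later undo, a secret rotation, and because rotations about the same axis commute, the state is recovered regardless of order. A minimal classical simulation of this commutation idea (a sketch only, not a full quantum or noise model):

```python
import numpy as np

def R(theta):
    """2-D rotation matrix; rotations about one axis commute."""
    c, s = np.cos(theta), np.sin(theta)
    return np.array([[c, -s], [s, c]])

rng = np.random.default_rng(0)
psi = np.array([1.0, 0.0])            # state encoding the secret bit
a = rng.uniform(0, 2 * np.pi)         # Alice's secret rotation angle
b = rng.uniform(0, 2 * np.pi)         # Bob's secret rotation angle

stage1 = R(a) @ psi                   # Alice -> Bob, hidden by R(a)
stage2 = R(b) @ stage1                # Bob -> Alice, adds R(b)
stage3 = R(-a) @ stage2               # Alice removes R(a), sends again
recovered = R(-b) @ stage3            # Bob removes R(b) and measures
```

Under the collective-rotation noise model analyzed in the paper, an additional rotation would act on each channel transit; since it commutes with the legitimate rotations, its net effect is a single accumulated rotation of the recovered state, which is what the paper's error analysis quantifies.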

  8. Evaluation of six sample preparation procedures for qualitative and quantitative proteomics analysis of milk fat globule membrane.

    Science.gov (United States)

    Yang, Yongxin; Anderson, Elizabeth; Zhang, Sheng

    2018-04-12

    Proteomic analysis of membrane proteins is challenged by the proteins' solubility and detergent incompatibility with MS analysis. No single protocol can comprehensively characterize the proteome of a membrane fraction. Here, we used cow milk fat globule membrane (MFGM) proteome analysis to assess six sample preparation procedures, one in-gel and five in-solution digestion approaches, prior to LC-MS/MS analysis. The largest numbers of MFGM proteins were identified by the suspension trapping (S-Trap) and filter-aided sample preparation (FASP) methods, followed by the acetone precipitation method without clean-up of tryptic peptides. The highest average protein coverage was achieved by the Chloroform/MeOH, in-gel and S-Trap methods. The most distinct proteins were identified by the FASP method, followed by S-Trap. Analyses by Venn diagram, principal-component analysis, hierarchical clustering and the abundance ranking of quantified proteins highlight differences in the MFGM fraction across all sample preparation procedures. These results reveal the biased protein/peptide losses that occurred in each protocol. In this study, we found several novel proteins that were not observed previously by in-depth proteomics characterization of the MFGM fraction in milk. Thus, a combination of multiple procedures with orthogonal sample preparation properties was demonstrated to improve the protein sequence coverage and expression level accuracy of membrane samples. © 2018 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
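The Venn-diagram comparison of protein identifications across preparation methods boils down to set overlap. A minimal sketch with hypothetical protein accession lists (the IDs below are illustrative, not taken from the study):

```python
def overlap_counts(ids_a, ids_b):
    """Counts for a two-set Venn comparison of protein identifications."""
    a, b = set(ids_a), set(ids_b)
    return {"only_a": len(a - b), "only_b": len(b - a), "shared": len(a & b)}

# hypothetical identifications from two preparation methods
strap_ids = ["P02754", "P79345", "Q95114", "P18892"]
fasp_ids = ["P02754", "Q95114", "P11151"]
counts = overlap_counts(strap_ids, fasp_ids)
```

Pairwise overlaps like this, computed for all six procedures, are exactly the inputs to the Venn diagrams and to similarity measures used in the clustering described above.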

  9. Vertical Protocol Composition

    DEFF Research Database (Denmark)

    Groß, Thomas; Mödersheim, Sebastian Alexander

    2011-01-01

    The security of key exchange and secure channel protocols, such as TLS, has been studied intensively. However, only few works have considered what happens when the established keys are actually used—to run some protocol securely over the established “channel”. We call this a vertical protocol composition...... i.e., that the combination cannot introduce attacks that the individual protocols in isolation do not have. In this work, we prove a composability result in the symbolic model that allows for arbitrary vertical composition (including self-composition). It holds for protocols from any suite of channel and application...

  10. Using the MMPI-2-RF to discriminate psychometrically identified schizotypic college students from a matched comparison sample.

    Science.gov (United States)

    Hunter, Helen K; Bolinskey, P Kevin; Novi, Jonathan H; Hudak, Daniel V; James, Alison V; Myers, Kevin R; Schuder, Kelly M

    2014-01-01

    This study investigates the extent to which the Minnesota Multiphasic Personality Inventory-2 Restructured Form (MMPI-2-RF) profiles of 52 individuals making up a psychometrically identified schizotype (SZT) sample could be successfully discriminated from the protocols of 52 individuals in a matched comparison (MC) sample. Replication analyses were performed with an additional 53 pairs of SZT and MC participants. Results showed significant differences in mean T-score values between these 2 groups across a variety of MMPI-2-RF scales. Results from discriminant function analyses indicate that schizotypy can be predicted effectively using 4 MMPI-2-RF scales and that this method of classification held up on replication. Additional results demonstrated that these MMPI-2-RF scales nominally outperformed MMPI-2 scales suggested by previous research as being indicative of schizophrenia liability. Directions for future research with the MMPI-2-RF are suggested.

  11. OUTPACE long duration stations: physical variability, context of biogeochemical sampling, and evaluation of sampling strategy

    Directory of Open Access Journals (Sweden)

    A. de Verneil

    2018-04-01

    Full Text Available Research cruises to quantify biogeochemical fluxes in the ocean require taking measurements at stations lasting at least several days. A popular experimental design is the quasi-Lagrangian drifter, often mounted with in situ incubations or sediment traps that follow the flow of water over time. After initial drifter deployment, the ship tracks the drifter for continuing measurements that are supposed to represent the same water environment. An outstanding question is how to best determine whether this is true. During the Oligotrophy to UlTra-oligotrophy PACific Experiment (OUTPACE) cruise, from 18 February to 3 April 2015 in the western tropical South Pacific, three separate stations of long duration (five days) over the upper 500 m were conducted in this quasi-Lagrangian sampling scheme. Here we present physical data to provide context for these three stations and to assess whether the sampling strategy worked, i.e., that a single body of water was sampled. After analyzing tracer variability and local water circulation at each station, we identify water layers and times where the drifter risks encountering another body of water. While almost no realization of this sampling scheme will be truly Lagrangian, due to the presence of vertical shear, the depth-resolved observations during the three stations show that most layers sampled sufficiently homogeneous physical environments during OUTPACE. By directly addressing the concerns raised by these quasi-Lagrangian sampling platforms, a protocol of best practices can begin to be formulated so that future research campaigns include the complementary datasets and analyses presented here to verify the appropriate use of the drifter platform.

  12. Sampling designs for contaminant temporal trend analyses using sedentary species exemplified by the snails Bellamya aeruginosa and Viviparus viviparus.

    Science.gov (United States)

    Yin, Ge; Danielsson, Sara; Dahlberg, Anna-Karin; Zhou, Yihui; Qiu, Yanling; Nyberg, Elisabeth; Bignert, Anders

    2017-10-01

    Environmental monitoring typically assumes samples and sampling activities to be representative of the population being studied. Given a limited budget, an appropriate sampling strategy is essential to support detecting temporal trends of contaminants. In the present study, based on real chemical analysis data on polybrominated diphenyl ethers in snails collected from five subsites in Tianmu Lake, computer simulation is performed to evaluate three sampling strategies by estimating the sample size required to detect an annual change of 5% with a statistical power of 80% and 90% at a significance level of 5%. The results showed that sampling from an arbitrarily selected sampling spot is the worst strategy, requiring many more individual analyses to achieve the above mentioned criteria compared with the other two approaches. A fixed sampling site requires the lowest sample size but may not be representative of the intended study object, e.g. a lake, and is also sensitive to changes of that particular sampling site. In contrast, sampling at multiple sites along the shore each year, and using pooled samples when the cost to collect and prepare individual specimens is much lower than the cost of chemical analysis, would be the most robust and cost efficient strategy in the long run. Using statistical power as the criterion, the results demonstrated quantitatively the consequences of various sampling strategies, and could guide users with respect to the sample sizes required under each sampling design for long term monitoring programs. Copyright © 2017 Elsevier Ltd. All rights reserved.
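The simulation logic described, estimating the power to detect a 5% annual change at the 5% significance level, can be sketched as a Monte Carlo power calculation. A minimal version assuming a lognormal concentration model (hypothetical; the paper's exact simulation setup is not given in the abstract):

```python
import numpy as np

def trend_power(n_per_year, years=10, annual_change=-0.05, cv=0.5,
                n_sim=400, seed=1):
    """Monte Carlo power of a log-linear regression trend test.

    Assumes lognormal concentrations with a given coefficient of
    variation (CV); both the model and the defaults are illustrative.
    """
    rng = np.random.default_rng(seed)
    t = np.repeat(np.arange(years, dtype=float), n_per_year)
    slope = np.log(1.0 + annual_change)        # true trend on the log scale
    sigma = np.sqrt(np.log(1.0 + cv ** 2))     # lognormal sd from the CV
    sxx = np.sum((t - t.mean()) ** 2)
    hits = 0
    for _ in range(n_sim):
        y = slope * t + rng.normal(0.0, sigma, t.size)     # log-concentrations
        b = np.sum((t - t.mean()) * (y - y.mean())) / sxx  # OLS slope
        resid = y - y.mean() - b * (t - t.mean())
        se = np.sqrt(resid @ resid / (t.size - 2) / sxx)
        if abs(b / se) > 1.96:   # two-sided 5% level, normal approximation
            hits += 1
    return hits / n_sim
```

Comparing `trend_power(2)` with `trend_power(20)` reproduces the qualitative point of the abstract: more individual analyses per year raise the chance of detecting a 5% annual change, and the required sample size is the smallest `n_per_year` that reaches the target power.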

  13. A single lysis solution for the analysis of tissue samples by different proteomic technologies

    DEFF Research Database (Denmark)

    Gromov, P.; Celis, J.E.; Gromova, I.

    2008-01-01

    -based proteomics (reverse-phase lysate arrays or direct antibody arrays), allowing the direct comparison of qualitative and quantitative data yielded by these technologies when applied to the same samples. The usefulness of the CLB1 solution for gel-based proteomics was further established by 2D PAGE analysis...... disease, is driving scientists to increasingly use clinically relevant samples for biomarker and target discovery. Tissues are heterogeneous and as a result optimization of sample preparation is critical for generating accurate, representative, and highly reproducible quantitative data. Although a large...... number of protocols for preparation of tissue lysates has been published, so far no single recipe is able to provide a "one-size fits all" solubilization procedure that can be used to analyse the same lysate using different proteomics technologies. Here we present evidence showing that cell lysis buffer...

  14. Redactions in protocols for drug trials: what industry sponsors concealed.

    Science.gov (United States)

    Marquardsen, Mikkel; Ogden, Michelle; Gøtzsche, Peter C

    2018-04-01

    Objective To describe the redactions in contemporary protocols for industry-sponsored randomised drug trials with patient relevant outcomes and to evaluate whether there was a legitimate rationale for the redactions. Design Cohort study. Under the Freedom of Information Act, we requested access to trial protocols approved by a research ethics committee in Denmark from October 2012 to March 2013. We received 17 consecutive protocols, which had been redacted before we got them, and nine protocols without redactions. In five additional cases, the companies refused to let the committees give us access, and in three other cases, documents were missing. Participants Not applicable. Setting Not applicable. Main outcome measure Amount and nature of redactions in 22 predefined key protocol variables. Results The redactions were most widespread in those sections of the protocol where there is empirical evidence of substantial problems with the trustworthiness of published drug trials: data analysis, handling of missing data, detection and analysis of adverse events, definition of the outcomes, interim analyses and premature termination of the study, the sponsor's access to incoming data while the study is running, ownership of the data, and investigators' publication rights. The parts of the text that were redacted differed widely, both between companies and within the same company. Conclusions We could not identify any legitimate rationale for the redactions. The current mistrust in industry-sponsored drug trials can only change if the industry offers unconditional access to its trial protocols and other relevant documents and data.

  15. Multi-element analyses of Vietnamese environmental samples for radiation protection

    International Nuclear Information System (INIS)

    Mai, T.H.; Nguyen, T.B.; Nguyen, T.N.; Yoshida, S.

    2005-01-01

    The Inductively Coupled Plasma-Atomic Emission Spectrometry (ICP-AES) and Inductively Coupled Plasma-Mass Spectrometry (ICP-MS) techniques were used for measuring 8 major and 27 trace elements in food and soil samples collected in Vietnam. Elemental concentrations in food samples differed among sampling locations and among food groups. Depth profiles of the elements are discussed for the soil samples. (author)

  16. Advanced CUBIC protocols for whole-brain and whole-body clearing and imaging.

    Science.gov (United States)

    Susaki, Etsuo A; Tainaka, Kazuki; Perrin, Dimitri; Yukinaga, Hiroko; Kuno, Akihiro; Ueda, Hiroki R

    2015-11-01

    Here we describe a protocol for advanced CUBIC (Clear, Unobstructed Brain/Body Imaging Cocktails and Computational analysis). The CUBIC protocol enables simple and efficient organ clearing, rapid imaging by light-sheet microscopy and quantitative imaging analysis of multiple samples. The organ or body is cleared by immersion for 1-14 d, with the exact time required dependent on the sample type and the experimental purposes. A single imaging set can be completed in 30-60 min. Image processing and analysis were then used to capture whole-brain neural activities at single-cell resolution using Arc-dVenus transgenic (Tg) mice. CUBIC informatics calculated the Venus signal subtraction, comparing different brains at a whole-organ scale. These protocols provide a platform for organism-level systems biology by comprehensively detecting cells in a whole organ or body.

  17. Flow cytometry protocol to evaluate ionizing radiation effects on P-glycoprotein activity

    International Nuclear Information System (INIS)

    Santos, Neyliane Goncalves dos; Amaral, Ademir; Cavalcanti, Mariana Brayner; Neves, Maria Amelia Batista; Machado, Cintia Gonsalves de Faria

    2008-01-01

    The aim of this work was to establish a protocol to evaluate ionizing radiation effects on P-glycoprotein (P-gp) activity. For this, human peripheral blood samples were irradiated in vitro with different doses and P-gp activity was analyzed for CD4 and CD8 T lymphocytes through rhodamine123-efflux assay by flow cytometry. By simultaneous employment of percentage and mean fluorescence index parameters, subject-by-subject analysis pointed out changes in P-gp activity for some individuals and irradiated samples. Based on this work, the proposed protocol was considered adequate for evaluating P-gp activity on cells after radioactive stress. Besides, this research suggests that P-gp activity could be an important factor to define patient-specific protocols in combined chemo- and radiotherapy, particularly when radiation exposure precedes chemical treatment. (author)

  18. Flow cytometry protocol to evaluate ionizing radiation effects on P-glycoprotein activity

    Energy Technology Data Exchange (ETDEWEB)

    Santos, Neyliane Goncalves dos; Amaral, Ademir; Cavalcanti, Mariana Brayner [Universidade Federal de Pernambuco (UFPE), Recife, PE (Brazil). Dept. de Energia Nuclear]. E-mail: neylisantos@yahoo.com.br; Neves, Maria Amelia Batista; Machado, Cintia Gonsalves de Faria [Fundacao de Hematologia e Hemoterapia de Pernambuco, Recife, PE (Brazil). Unidade de Laboratorios Especializados. Lab. de Imunofenotipagem

    2008-12-15

    The aim of this work was to establish a protocol to evaluate ionizing radiation effects on P-glycoprotein (P-gp) activity. For this, human peripheral blood samples were irradiated in vitro with different doses and P-gp activity was analyzed for CD4 and CD8 T lymphocytes through rhodamine123-efflux assay by flow cytometry. By simultaneous employment of percentage and mean fluorescence index parameters, subject-by-subject analysis pointed out changes in P-gp activity for some individuals and irradiated samples. Based on this work, the proposed protocol was considered adequate for evaluating P-gp activity on cells after radioactive stress. Besides, this research suggests that P-gp activity could be an important factor to define patient-specific protocols in combined chemo- and radiotherapy, particularly when radiation exposure precedes chemical treatment. (author)

  19. Prenatal diagnosis and prevention of toxoplasmosis in pregnant women in Northern Vietnam: study protocol.

    Science.gov (United States)

    Smit, G Suzanne A; Vu, Thi Lam Binh; Do, Trung Dung; Speybroeck, Niko; Devleesschauwer, Brecht; Padalko, Elizaveta; Roets, Ellen; Dorny, Pierre

    2017-05-25

    In Vietnam, no systematic prenatal toxoplasmosis screening is in place, and only few studies have assessed the prevalence and importance of this zoonotic parasite infection. In addition, no studies have been conducted to assess the risk factors associated with toxoplasmosis. This study protocol was developed to determine the seroprevalence of toxoplasmosis in pregnant women in Hanoi and Thai Binh, Northern Vietnam, and to evaluate the association with risk factors and congenital toxoplasmosis. The protocol was developed in a way that it could potentially evolve into a countrywide prenatal diagnosis and prevention program, with the main focus on primary prevention. The collaborating gynaecologists will invite eligible pregnant women attending antenatal care for the first time to participate in the study. At first consult, information about toxoplasmosis and its prevention will be provided. All participants will be asked to fill in a questionnaire, which is designed to analyse socio-demographic and biologically plausible risk factors associated with toxoplasmosis, and blood samples will be collected to determine the seroprevalence of toxoplasmosis in pregnant women. In case there is suspicion of a primary infection during pregnancy, the concerned women will be followed-up by the gynaecologists according to a predefined protocol. Every participant will be informed on her serological status, risk factors and prevention measures and is offered appropriate medical information and medical follow-up if required. The hypothesis is that congenital toxoplasmosis is an important but currently under-diagnosed public health problem in Vietnam. This study can strengthen sustainable control of toxoplasmosis in Vietnam, provide a protocol for prenatal diagnosis, boost overall awareness, improve the knowledge about toxoplasmosis prevention and can be essential for evidence-based health policy.

  20. Space Network Time Distribution and Synchronization Protocol Development for Mars Proximity Link

    Science.gov (United States)

    Woo, Simon S.; Gao, Jay L.; Mills, David

    2010-01-01

    Time distribution and synchronization in deep space networks are challenging due to long propagation delays, spacecraft movements, and relativistic effects. Further, the Network Time Protocol (NTP) designed for terrestrial networks may not work properly in space. In this work, we consider a time distribution protocol based on time message exchanges similar to NTP. We present the Proximity-1 Space Link Interleaved Time Synchronization (PITS) algorithm that can work with the CCSDS Proximity-1 Space Data Link Protocol. The PITS algorithm provides faster time synchronization via two-way time transfer over proximity links, improves scalability as the number of spacecraft increases, lowers the storage space requirement for collecting time samples, and is robust against packet loss and duplication, which underlying protocol mechanisms provide.
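The two-way time transfer underlying both NTP and the PITS exchange computes a clock offset and round-trip delay from four timestamps. The standard NTP formulas are shown here as a generic sketch (not the PITS algorithm itself, whose details the abstract does not give):

```python
def two_way_time_transfer(t1, t2, t3, t4):
    """Classic NTP-style offset/delay estimate from one timestamp exchange.

    t1: client transmit, t2: server receive,
    t3: server transmit, t4: client receive
    (t1 and t4 on the client clock, t2 and t3 on the server clock).
    """
    offset = ((t2 - t1) + (t3 - t4)) / 2.0   # server clock minus client clock
    delay = (t4 - t1) - (t3 - t2)            # round-trip path delay
    return offset, delay

# server 5.0 s ahead, symmetric 0.1 s one-way propagation (illustrative values)
offset, delay = two_way_time_transfer(0.0, 5.1, 5.2, 0.3)
```

The offset estimate is exact only for symmetric paths; with the long, asymmetric delays of proximity links, path asymmetry maps directly into offset error, which is one reason a space-adapted algorithm such as PITS is needed.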

  1. Defining standardized protocols for determining the efficacy of a postmilking teat disinfectant following experimental exposure of teats to mastitis pathogens.

    Science.gov (United States)

    Schukken, Y H; Rauch, B J; Morelli, J

    2013-04-01

    The objective of this paper was to define standardized protocols for determining the efficacy of a postmilking teat disinfectant following experimental exposure of teats to both Staphylococcus aureus and Streptococcus agalactiae. The standardized protocols describe the selection of cows and herds and define the critical points in performing experimental exposure, performing bacterial culture, evaluating the culture results, and finally performing statistical analyses and reporting of the results. The protocols define both negative control and positive control trials. For negative control trials, the protocol states that an efficacy of reducing new intramammary infections (IMI) of at least 40% is required for a teat disinfectant to be considered effective. For positive control trials, noninferiority to a control disinfectant with a published efficacy of reducing new IMI of at least 70% is required. Sample sizes for both negative and positive control trials are calculated. Positive control trials are expected to require a large trial size. Statistical analysis methods are defined and, in the proposed methods, the rate of IMI may be analyzed using generalized linear mixed models. The efficacy of the test product can be evaluated while controlling for important covariates and confounders in the trial. Finally, standards for reporting are defined and reporting considerations are discussed. The use of the defined protocol is shown through presentation of the results of a recent trial of a test product against a negative control. Copyright © 2013 American Dairy Science Association. Published by Elsevier Inc. All rights reserved.
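
The protocol's exact sample-size formulas are not given in this abstract; a standard normal-approximation sketch for comparing new-IMI proportions between two trial arms looks like the following. All rates, alpha, and power below are assumed for illustration only:

```python
import math
from statistics import NormalDist

def n_per_group(p1, p2, alpha=0.05, power=0.80):
    """Approximate sample size per trial arm for detecting a difference
    between two proportions (normal approximation, two-sided test)."""
    z_a = NormalDist().inv_cdf(1 - alpha / 2)
    z_b = NormalDist().inv_cdf(power)
    p_bar = (p1 + p2) / 2
    num = (z_a * math.sqrt(2 * p_bar * (1 - p_bar))
           + z_b * math.sqrt(p1 * (1 - p1) + p2 * (1 - p2))) ** 2
    return math.ceil(num / (p1 - p2) ** 2)

# Illustrative only: a 0.10 new-IMI rate in the untreated arm vs. 0.06 under
# a disinfectant achieving the protocol's minimum 40% reduction in new IMI.
print(n_per_group(0.10, 0.06))
```

The small effect size in this invented scenario illustrates why the abstract expects positive control (noninferiority) trials, with even smaller detectable differences, to require large trial sizes.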

  2. Automated Sample Preparation for Radiogenic and Non-Traditional Metal Isotopes: Removing an Analytical Barrier for High Sample Throughput

    Science.gov (United States)

    Field, M. Paul; Romaniello, Stephen; Gordon, Gwyneth W.; Anbar, Ariel D.; Herrmann, Achim; Martinez-Boti, Miguel A.; Anagnostou, Eleni; Foster, Gavin L.

    2014-05-01

    MC-ICP-MS has dramatically improved the analytical throughput for high-precision radiogenic and non-traditional isotope ratio measurements compared to TIMS. The generation of large data sets, however, remains hampered by the tedious manual drip chromatography required for sample purification. A new, automated chromatography system reduces this laboratory bottleneck and expands the utility of high-precision isotope analyses in applications where large data sets are required: geochemistry, forensic anthropology, nuclear forensics, medical research and food authentication. We have developed protocols to automate ion exchange purification for several isotopic systems (B, Ca, Fe, Cu, Zn, Sr, Cd, Pb and U) using the new prepFAST-MC™ (ESI, Omaha, Nebraska). The system is not only inert (all-fluoropolymer flow paths) but also very flexible, easily accommodating different resins, samples, and reagent types. When programmed, precise and accurate user-defined volumes and flow rates are implemented to automatically load samples, wash the column, condition the column and elute fractions. Unattended, the automated, low-pressure ion exchange chromatography system can process up to 60 samples overnight. Excellent reproducibility, reliability, and recovery, with low blanks and carryover, have been demonstrated for samples in a variety of matrices, giving accurate and precise isotopic ratios within analytical error for several isotopic systems (B, Ca, Fe, Cu, Zn, Sr, Cd, Pb and U). This illustrates the potential of the new prepFAST-MC™ as a powerful tool in radiogenic and non-traditional isotope research.

  3. Efficient sample preparation from complex biological samples using a sliding lid for immobilized droplet extractions.

    Science.gov (United States)

    Casavant, Benjamin P; Guckenberger, David J; Beebe, David J; Berry, Scott M

    2014-07-01

    Sample preparation is a major bottleneck in many biological processes. Paramagnetic particles (PMPs) are a ubiquitous method for isolating analytes of interest from biological samples and are used for their ability to thoroughly sample a solution and be easily collected with a magnet. There are three main methods by which PMPs are used for sample preparation: (1) removal of fluid from the analyte-bound PMPs, (2) removal of analyte-bound PMPs from the solution, and (3) removal of the substrate (with immobilized analyte-bound PMPs). In this paper, we explore the third and least studied method for PMP-based sample preparation using a platform termed Sliding Lid for Immobilized Droplet Extractions (SLIDE). SLIDE leverages principles of surface tension and patterned hydrophobicity to create a simple-to-operate platform for sample isolation (cells, DNA, RNA, protein) and preparation (cell staining) without the need for time-intensive wash steps, immiscible fluids, or precise pinning geometries. Compared to other standard isolation protocols using PMPs, SLIDE performs rapid sample preparation with low (0.6%) carryover of contaminants from the original sample. The natural recirculation occurring within the pinned droplets of SLIDE makes multistep cell staining protocols possible simply by resting the lid over the various sample droplets. SLIDE is thus a simple, easy-to-use platform for sample preparation on a range of complex biological samples.

  4. Inter-comparison of NIOSH and IMPROVE protocols for OC and EC determination: implications for inter-protocol data conversion

    Science.gov (United States)

    Wu, Cheng; Huang, X. H. Hilda; Ng, Wai Man; Griffith, Stephen M.; Zhen Yu, Jian

    2016-09-01

    Organic carbon (OC) and elemental carbon (EC) are operationally defined by the analytical methods used to measure them. As a result, OC and EC measurements are protocol dependent, leading to uncertainties in their quantification. In this study, more than 1300 Hong Kong samples were analyzed using both the National Institute for Occupational Safety and Health (NIOSH) thermal optical transmittance (TOT) and Interagency Monitoring of Protected Visual Environments (IMPROVE) thermal optical reflectance (TOR) protocols to explore the cause of EC disagreement between the two protocols. The EC discrepancy mainly (83 %) arises from a difference in peak inert-mode temperature, which determines the allocation of OC4_NSH, while the rest (17 %) is attributed to a difference in the optical method (transmittance vs. reflectance) applied for the charring correction. Evidence shows that the magnitude of the EC discrepancy is positively correlated with the intensity of the biomass burning signal, whereby biomass burning increases the fraction of OC4_NSH and widens the disagreement in the inter-protocol EC determination. It is also found that the EC discrepancy is positively correlated with the abundance of metal oxides in the samples. Two approaches (M1 and M2) that translate NIOSH TOT OC and EC data into IMPROVE TOR OC and EC data are proposed. M1 uses a direct relationship between EC_NSH_TOT and EC_IMP_TOR for reconstruction: M1: EC_IMP_TOR = a × EC_NSH_TOT + b; while M2 deconstructs EC_IMP_TOR into several terms based on analysis principles and applies regression only to the unknown terms: M2: EC_IMP_TOR = AEC_NSH + OC4_NSH - (a × PC_NSH_TOR + b), where AEC_NSH, the apparent EC by the NIOSH protocol, is the carbon that evolves in the He-O2 analysis stage, OC4_NSH is the carbon that evolves at the fourth temperature step of the pure-helium analysis stage of NIOSH, and PC_NSH_TOR is the pyrolyzed carbon as determined by the NIOSH protocol. The implementation of M1 to all urban site data (without considering seasonal specificity
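
The two conversion models stated in the abstract translate directly into code. The regression coefficients a and b and the carbon fractions below are placeholders for illustration, not the fitted values from the paper:

```python
def m1_ec_imp_tor(ec_nsh_tot, a, b):
    """M1: EC_IMP_TOR = a * EC_NSH_TOT + b (a, b fitted by regression)."""
    return a * ec_nsh_tot + b

def m2_ec_imp_tor(aec_nsh, oc4_nsh, pc_nsh_tor, a, b):
    """M2: EC_IMP_TOR = AEC_NSH + OC4_NSH - (a * PC_NSH_TOR + b),
    regressing only the unknown pyrolyzed-carbon term."""
    return aec_nsh + oc4_nsh - (a * pc_nsh_tor + b)

# Placeholder coefficients and carbon fractions (ug C per m3), not fitted values.
print(m1_ec_imp_tor(2.0, a=1.1, b=0.05))
print(m2_ec_imp_tor(aec_nsh=1.5, oc4_nsh=0.8, pc_nsh_tor=0.6, a=1.1, b=0.05))
```

The structural difference matters: M1 regresses total EC against total EC, while M2 keeps the directly measured terms (AEC_NSH, OC4_NSH) fixed and fits only the charring-correction term.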

  5. Aldefluor protocol to sort keratinocyte stem cells from skin

    OpenAIRE

    Noronha, Samuel Marcos Ribeiro; Gragnani, Alfredo; Pereira, Thiago Antônio Calado; Correa, Silvana Aparecida Alves; Bonucci, Jessica; Ferreira, Lydia Masako

    2017-01-01

    Abstract Purpose: To investigate the use of Aldefluor® and N,N-dimethylaminobenzaldehyde (DEAB) to design a protocol to sort keratinocyte stem cells from cultured keratinocytes from burned patients. Methods: Activated Aldefluor® aliquots were prepared and maintained at a temperature between 2 and 8°C, or stored at -20°C. Next, the cells were collected following the standard protocol of sample preparation. Results: Best results were obtained with Aldefluor® 1.5µl and DEAB 15 µl for 1 x 10^6 c...

  6. Composite sampling of a Bacillus anthracis surrogate with cellulose sponge surface samplers from a nonporous surface.

    Directory of Open Access Journals (Sweden)

    Jenia A M Tufts

    Full Text Available A series of experiments was conducted to explore the utility of composite-based collection of surface samples for the detection of a Bacillus anthracis surrogate using cellulose sponge samplers on a nonporous stainless steel surface. Two composite-based collection approaches were evaluated over a surface area of 3716 cm2 (four separate 929 cm2 areas), larger than the 645 cm2 prescribed by the standard Centers for Disease Control and Prevention (CDC) cellulose sponge sampling protocol for use on nonporous surfaces. The CDC method was also compared to a modified protocol in which only one surface of the sponge sampler was used for each of the four areas composited. Differences in collection efficiency compared to positive controls and the potential for contaminant transfer for each protocol were assessed. The impact of the loss of wetting buffer from the sponge sampler onto additional surface areas sampled was evaluated. Statistical tests of the results using ANOVA indicate that the collection of composite samples using the modified sampling protocol is comparable to the collection of composite samples using the standard CDC protocol (p = 0.261). Most of the surface-bound spores are collected on the first sampling pass, suggesting that multiple passes with the sponge sampler over the same surface may be unnecessary. The effect of moisture loss from the sponge sampler on collection efficiency was not significant (p = 0.720) for both methods. Contaminant transfer occurs with both sampling protocols, but the magnitude of transfer is significantly greater when using the standard protocol than when the modified protocol is used (p<0.001). The results of this study suggest that composite surface sampling, by either method presented here, could successfully be used to increase the surface area sampled per sponge sampler, resulting in reduced sampling times in the field and decreased laboratory processing cost and turn-around times.

  7. EDXRF applied to the chemical element determination of small invertebrate samples

    Energy Technology Data Exchange (ETDEWEB)

    Magalhaes, Marcelo L.R.; Santos, Mariana L.O.; Cantinha, Rebeca S.; Souza, Thomas Marques de; Franca, Elvis J. de, E-mail: marcelo_rlm@hotmail.com, E-mail: marianasantos_ufpe@hotmail.com, E-mail: rebecanuclear@gmail.com, E-mail: thomasmarques@live.com.pt, E-mail: ejfranca@cnen.gov.br [Centro Regional de Ciencias Nucleares do Nordeste (CRCN-NE/CNEN-PE), Recife, PE (Brazil)

    2015-07-01

    Energy Dispersive X-Ray Fluorescence (EDXRF) is a fast analytical technique of easy operation; however, it demands reliable analytical curves due to the intrinsic matrix dependence and interference during the analysis. By using biological materials of diverse matrices, multielemental analytical protocols can be implemented and a group of chemical elements can be determined in diverse biological matrices depending on the chemical element concentration. Particularly for invertebrates, EDXRF presents some advantages associated with the possibility of analysing small samples, in which a collimator can be used to direct the incident X-rays onto a small surface of the analyzed sample. In this work, EDXRF was applied to determine Cl, Fe, P, S and Zn in invertebrate samples using collimators of 3 mm and 10 mm. For the assessment of the analytical protocol, the SRM 2976 Trace Elements in Mollusk and SRM 8415 Whole Egg Powder, produced by the National Institute of Standards and Technology (NIST), were also analyzed. After sampling by using pitfall traps, invertebrates were lyophilized, milled and transferred to polyethylene vials covered by XRF polyethylene. Analyses were performed at a pressure lower than 30 Pa, varying voltage and electric current according to the chemical element to be analyzed. For comparison, Zn in the invertebrate material was also quantified by graphite furnace atomic absorption spectrometry (GFAAS) after acid treatment (a mixture of nitric acid and hydrogen peroxide) of the samples. Compared to the 10 mm collimator, the SRM 2976 and SRM 8415 results obtained with the 3 mm collimator agreed well at the 95% confidence level, since the E_n numbers were in the range of -1 to 1. Results from GFAAS were in accordance with the EDXRF values for composite samples. Therefore, determination of some chemical elements by EDXRF can be recommended for very small invertebrate samples (lower than 100 mg), with the advantage of preserving the samples. (author)
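
The E_n number used above to judge agreement with the reference material values follows the standard proficiency-testing definition; a minimal sketch (the measured value and uncertainties below are invented):

```python
import math

def e_n(x_lab, x_ref, u_lab, u_ref):
    """E_n number: (x_lab - x_ref) / sqrt(U_lab**2 + U_ref**2), where U_lab and
    U_ref are expanded uncertainties; |E_n| <= 1 indicates agreement."""
    return (x_lab - x_ref) / math.sqrt(u_lab ** 2 + u_ref ** 2)

# Invented values: measured 102 mg/kg vs. certified 100 mg/kg.
print(e_n(102.0, 100.0, 3.0, 4.0))  # 0.4 -> within [-1, 1], acceptable
```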

  8. A critical analysis of a locally agreed protocol for clinical practice

    International Nuclear Information System (INIS)

    Owen, A.; Hogg, P.; Nightingale, J.

    2004-01-01

    Within the traditional scope of radiographic practice (including advanced practice) there is a need to demonstrate effective patient care and management. Such practice should be set within a context of appropriate evidence and should also reflect peer practice. In order to achieve such practice, the use of protocols is encouraged. Effective protocols can maximise care and management by minimising inter- and intra-professional variation; they can also allow detailed procedural records to be kept in case of legal claims. However, whilst literature exists to encourage the use of protocols, there is little published material available to indicate how to create, manage and archive them. This article uses an analytical approach to propose a suitable method for protocol creation and archival; it also offers suggestions on the scope and content of a protocol. To achieve this, an existing clinical protocol for radiographer reporting of barium enemas is analysed to draw out the general issues. Proposals for protocol creation, management and archival were identified. The clinical practice described or inferred in the protocol should be drawn from evidence; such evidence could include peer-reviewed material, national standards and peer practice. The protocol should include an explanation of how to proceed when radiographers reach the limit of their ability. It should refer to the initial training required to undertake the clinical duties, as well as the ongoing continual professional updating required to maintain competence. Audit of practice should be indicated, including the preferred audit methodology, and associated with this should be a clear statement about standards and what to do if standards are not adequately met. Protocols should be archived, in a paper-based form, for lengthy periods in case of legal claims. On the archived protocol, the date it was in clinical use should be included.

  9. Study protocol for the translating research in elder care (TREC): building context – an organizational monitoring program in long-term care (project one)

    Directory of Open Access Journals (Sweden)

    Cummings Greta G

    2009-08-01

    Full Text Available Abstract Background While there is a growing awareness of the importance of organizational context (or the work environment/setting) to successful knowledge translation, and of successful knowledge translation to better patient, provider (staff), and system outcomes, little empirical evidence supports these assumptions. Further, little is known about the factors that enhance knowledge translation and better outcomes in residential long-term care facilities, where care has been shown to be suboptimal. The project described in this protocol is one of the two main projects of the larger five-year Translating Research in Elder Care (TREC) program. Aims The purpose of this project is to establish the magnitude of the effect of organizational context on knowledge translation, and subsequently on resident, staff (unregulated, regulated, and managerial), and system outcomes in long-term care facilities in the three Canadian Prairie Provinces (Alberta, Saskatchewan, Manitoba). Methods/Design This study protocol describes the details of a multi-level – including provinces, regions, facilities, units within facilities, and individuals who receive care (residents) or work (staff) in facilities – and longitudinal (five-year) research project. A stratified random sample of 36 residential long-term care facilities (30 urban and 6 rural) from the Canadian Prairie Provinces will comprise the sample. Caregivers and care managers within these facilities will be asked to complete the TREC survey – a suite of survey instruments designed to assess organizational context and related factors hypothesized to be important to successful knowledge translation and to achieving better resident, staff, and system outcomes. Facility and unit level data will be collected using standardized data collection forms, and resident outcomes using the Resident Assessment Instrument-Minimum Data Set version 2.0 instrument. A variety of analytic techniques will be employed including descriptive

  10. Preferred reporting items for systematic review and meta-analysis protocols (PRISMA-P) 2015 statement

    Directory of Open Access Journals (Sweden)

    Mireia Estarli

    2016-02-01

    Full Text Available Systematic reviews should build on a protocol that describes the rationale, hypothesis, and planned methods of the review; few reviews report whether a protocol exists. Detailed, well-described protocols can facilitate the understanding and appraisal of the review methods, as well as the detection of modifications to methods and selective reporting in completed reviews. We describe the development of a reporting guideline, the Preferred Reporting Items for Systematic reviews and Meta-Analyses for Protocols 2015 (PRISMA-P 2015). PRISMA-P consists of a 17-item checklist intended to facilitate the preparation and reporting of a robust protocol for a systematic review. Funders and those commissioning reviews might consider mandating the use of the checklist to facilitate the submission of relevant protocol information in funding applications. Similarly, peer reviewers and editors can use the guidance to gauge the completeness and transparency of a systematic review protocol submitted for publication in a journal or other medium. Translated with permission of the authors. The original authors have not revised and verified the Spanish translation, and they do not necessarily endorse it.

  11. Modified Scoring, Traditional Item Analysis, and Sato's Caution Index Used To Investigate the Reading Recall Protocol.

    Science.gov (United States)

    Deville, Craig W.; Chalhoub-Deville, Micheline

    A study demonstrated the utility of item analyses to investigate which items function well or poorly in a second language reading recall protocol instrument. Data were drawn from a larger study of 56 learners of German as a second language at various proficiency levels. Pausal units of scored recall protocols were analyzed using both classical…

  12. Protocol for quantitative tracing of surface water with synthetic DNA

    Science.gov (United States)

    Foppen, J. W.; Bogaard, T. A.

    2012-04-01

    Based on experiments we carried out in 2010 with various synthetic single-stranded DNA markers 80 nucleotides in length (ssDNA; Foppen et al., 2011), we concluded that ssDNA can be used to carry out spatially distributed multi-tracer experiments in the environment. Its main advantages are an essentially unlimited number of tracers, environmental friendliness, and tracer recovery at very high dilution rates (the detection limit is very low). However, when ssDNA was injected into headwater streams, we found that at selected downstream locations the total mass recovery was less than 100%. The exact reason for the low mass recovery was unknown. In order to identify the cause of this loss of mass in surface waters, and to increase our knowledge of the behaviour of synthetic ssDNA in the environment, we examined the effect of laboratory and field protocols for working with artificial DNA by performing numerous batch experiments. We then carried out several field tests in different headwater streams in the Netherlands and in Luxembourg. The laboratory experiments consisted of a batch of water in a vessel, with on the order of 10^10 ssDNA molecules injected into the batch. The total duration of each experiment was 10 hours, and, at regular time intervals, 100 µl samples were collected in 1.5 ml Eppendorf vials for qPCR analyses. The waters we used ranged from milliQ water to river water with an electrical conductivity of around 400 μS/cm. The batch experiments were performed in different vessel types: polyethylene bottles, polypropylene copolymer bottles, and glass bottles. In addition, two filter types were tested: 1 µm pore-size glass fibre filters and 0.2 µm pore-size cellulose acetate filters. Lastly, stream bed sediment was added to the batch experiments to quantify the interaction of the DNA with sediment. For each field experiment, around 10^15 ssDNA molecules were injected, and water samples were collected 100 - 600 m downstream of the point of injection. Additionally
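
Tracer mass recovery at a downstream station is typically computed by integrating the breakthrough curve and comparing with the injected amount; a minimal sketch assuming a constant stream discharge (all numbers below are synthetic, not from these experiments):

```python
def mass_recovery(times_s, concs, discharge_l_per_s, molecules_injected):
    """Fraction of injected tracer recovered at a downstream station:
    trapezoidal integration of the breakthrough curve C(t) (molecules/L),
    multiplied by a constant discharge Q (L/s), divided by injected amount."""
    area = 0.0
    for i in range(1, len(times_s)):
        area += 0.5 * (concs[i] + concs[i - 1]) * (times_s[i] - times_s[i - 1])
    return discharge_l_per_s * area / molecules_injected

# Synthetic breakthrough curve sampled at 60 s intervals.
r = mass_recovery([0, 60, 120, 180], [0.0, 2e9, 1e9, 0.0], 10.0, 2e12)
print(r)  # 0.9, i.e. 90% recovery in this synthetic example
```

A recovery below 1 computed this way is what the abstract refers to as "total mass recovery less than 100%".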

  13. Analysing Microbial Community Composition through Amplicon Sequencing: From Sampling to Hypothesis Testing

    Directory of Open Access Journals (Sweden)

    Luisa W. Hugerth

    2017-09-01

    Full Text Available Microbial ecology as a scientific field is fundamentally driven by technological advances. The past decade's revolution in DNA sequencing cost and throughput has made it possible for most research groups to map microbial community composition in environments of interest. However, the computational and statistical methodology required to analyse this kind of data is often not part of a biologist's training. In this review, we give a historical perspective on the use of sequencing data in microbial ecology and restate the current need for this method, but also highlight the major caveats of standard practices for handling these data, from sample collection and library preparation to statistical analysis. Further, we outline the main new analytical tools that have been developed in the past few years to bypass these caveats, as well as highlight the major requirements of common statistical practices and the extent to which they are applicable to microbial data. Besides delving into the meaning of select alpha- and beta-diversity measures, we give special consideration to techniques for finding the main drivers of community dissimilarity and for interaction network construction. While every project design has specific needs, this review should serve as a starting point for considering what options are available.
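
As a concrete illustration of the alpha- and beta-diversity measures the review discusses, here are minimal implementations of the Shannon index and Bray-Curtis dissimilarity (standard textbook definitions, not tied to this review's pipeline; counts are invented):

```python
import math

def shannon(counts):
    """Shannon alpha-diversity index H' = -sum(p_i * ln p_i)
    over taxa with non-zero counts."""
    total = sum(counts)
    return -sum((c / total) * math.log(c / total) for c in counts if c > 0)

def bray_curtis(x, y):
    """Bray-Curtis dissimilarity between two count vectors:
    0 = identical composition, 1 = no shared taxa."""
    num = sum(abs(a - b) for a, b in zip(x, y))
    den = sum(a + b for a, b in zip(x, y))
    return num / den

print(shannon([10, 10, 10, 10]))   # maximal evenness over 4 taxa: ln(4)
print(bray_curtis([10, 0], [0, 10]))  # disjoint communities: 1.0
```

In practice such measures are computed on rarefied or otherwise normalized counts, which is one of the caveats the review highlights.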

  14. A protocol for measuring spatial variables in soft-sediment tide pools

    Directory of Open Access Journals (Sweden)

    Marina R. Brenha-Nunes

    2016-01-01

    Full Text Available ABSTRACT We present a protocol for measuring spatial variables in large (>50 m2) soft-sediment tide pools. Secondarily, we present the fish capture efficiency of a sampling protocol that uses such spatial variables to calculate relative abundances. The area of the pool is estimated by summing the areas of basic geometric forms; the depth, by taking representative measurements of the depth variability of each pool sector, previously determined according to its perimeter; and the volume, by treating the pool as a prism. These procedures were a trade-off between the acquisition of reliable estimates and the minimization of both operating cost and time spent in the field. The fish sampling protocol is based on two consecutive stages: (1) two people search for fishes under structures (e.g., rocks and litter) in the pool and capture them with hand seines; (2) these structures are removed and a beach seine is then hauled over the whole pool. Our method is cheaper than others and fast to operate given the limited time during low tides. The fish sampling method is quite efficient, resulting in a capture efficiency of 89%.
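
The geometric procedure described (area as a sum of basic forms, volume as a prism with the mean of representative depths) can be sketched as follows; the shape encoding and all dimensions below are our own illustrative convention, not part of the protocol:

```python
import math

def pool_area(shapes):
    """Pool area (m2) as a sum of basic geometric forms.
    Each shape is ('rect', width, length), ('circle', radius)
    or ('triangle', base, height) -- an invented encoding."""
    area = 0.0
    for shape in shapes:
        if shape[0] == 'rect':
            area += shape[1] * shape[2]
        elif shape[0] == 'circle':
            area += math.pi * shape[1] ** 2
        elif shape[0] == 'triangle':
            area += 0.5 * shape[1] * shape[2]
    return area

def pool_volume(area_m2, sector_depths_m):
    """Volume of the pool treated as a prism: area times the mean of
    representative depth measurements taken in each sector."""
    return area_m2 * (sum(sector_depths_m) / len(sector_depths_m))

area = pool_area([('rect', 4.0, 5.0), ('triangle', 2.0, 3.0)])  # 23.0 m2
print(pool_volume(area, [0.2, 0.4, 0.3]))
```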

  15. Measurement assurance program for LSC analyses of tritium samples

    International Nuclear Information System (INIS)

    Levi, G.D. Jr.; Clark, J.P.

    1997-01-01

    Liquid Scintillation Counting (LSC) for tritium is done on 600 to 800 samples daily as part of a contamination control program at the Savannah River Site's Tritium Facilities. The tritium results from the LSCs are used: to release items as radiologically clean; to establish radiological control measures for workers; and to characterize waste. The sample matrices analyzed for tritium include: filter paper smears, aqueous, oil, oily rags, ethylene glycol, ethyl alcohol, freon and mercury. Routine and special causes of variation in standards, counting equipment, environment, operators, counting times, samples, activity levels, etc. produce uncertainty in the LSC measurements. A comprehensive analytical process measurement assurance program such as JTIPMAP™ has been implemented. The process measurement assurance program is being used to quantify and control many of the sources of variation and to provide accurate estimates of the overall measurement uncertainty associated with the LSC measurements. The paper describes LSC operations, process improvements, and the quality control and quality assurance programs, along with future improvements associated with the implementation of the process measurement assurance program.

  16. Glycan characterization of the NIST RM monoclonal antibody using a total analytical solution: From sample preparation to data analysis.

    Science.gov (United States)

    Hilliard, Mark; Alley, William R; McManus, Ciara A; Yu, Ying Qing; Hallinan, Sinead; Gebler, John; Rudd, Pauline M

    Glycosylation is an important attribute of biopharmaceutical products to monitor from development through production. However, glycosylation analysis has traditionally been a time-consuming process with long sample preparation protocols and manual interpretation of the data. To address the challenges associated with glycan analysis, we developed a streamlined analytical solution that covers the entire process from sample preparation to data analysis. In this communication, we describe the complete analytical solution that begins with a simplified and fast N-linked glycan sample preparation protocol that can be completed in less than 1 hr. The sample preparation includes labelling with RapiFluor-MS tag to improve both fluorescence (FLR) and mass spectral (MS) sensitivities. Following HILIC-UPLC/FLR/MS analyses, the data are processed and a library search based on glucose units has been included to expedite the task of structural assignment. We then applied this total analytical solution to characterize the glycosylation of the NIST Reference Material mAb 8761. For this glycoprotein, we confidently identified 35 N-linked glycans and all three major classes, high mannose, complex, and hybrid, were present. The majority of the glycans were neutral and fucosylated; glycans featuring N-glycolylneuraminic acid and those with two galactoses connected via an α1,3-linkage were also identified.

  17. An improved ATAC-seq protocol reduces background and enables interrogation of frozen tissues.

    Science.gov (United States)

    Corces, M Ryan; Trevino, Alexandro E; Hamilton, Emily G; Greenside, Peyton G; Sinnott-Armstrong, Nicholas A; Vesuna, Sam; Satpathy, Ansuman T; Rubin, Adam J; Montine, Kathleen S; Wu, Beijing; Kathiria, Arwa; Cho, Seung Woo; Mumbach, Maxwell R; Carter, Ava C; Kasowski, Maya; Orloff, Lisa A; Risca, Viviana I; Kundaje, Anshul; Khavari, Paul A; Montine, Thomas J; Greenleaf, William J; Chang, Howard Y

    2017-10-01

    We present Omni-ATAC, an improved ATAC-seq protocol for chromatin accessibility profiling that works across multiple applications with substantial improvement of signal-to-background ratio and information content. The Omni-ATAC protocol generates chromatin accessibility profiles from archival frozen tissue samples and 50-μm sections, revealing the activities of disease-associated DNA elements in distinct human brain structures. The Omni-ATAC protocol enables the interrogation of personal regulomes in tissue context and translational studies.

  18. Human Factors Risk Analyses of a Doffing Protocol for Ebola-Level Personal Protective Equipment: Mapping Errors to Contamination.

    Science.gov (United States)

    Mumma, Joel M; Durso, Francis T; Ferguson, Ashley N; Gipson, Christina L; Casanova, Lisa; Erukunuakpor, Kimberly; Kraft, Colleen S; Walsh, Victoria L; Zimring, Craig; DuBose, Jennifer; Jacob, Jesse T

    2018-03-05

    Doffing protocols for personal protective equipment (PPE) are critical for keeping healthcare workers (HCWs) safe during care of patients with Ebola virus disease. We assessed the relationship between errors and self-contamination during doffing. Eleven HCWs experienced with doffing Ebola-level PPE participated in simulations in which HCWs donned PPE marked with surrogate viruses (ɸ6 and MS2), completed a clinical task, and were assessed for contamination after doffing. Simulations were video recorded, and a failure modes and effects analysis and fault tree analyses were performed to identify errors during doffing, quantify their risk (risk index), and predict contamination data. Fifty-one types of errors were identified, many having the potential to spread contamination. Hand hygiene and removing the powered air purifying respirator (PAPR) hood had the highest total risk indexes (111 and 70, respectively) and numbers of error types (9 and 13, respectively). ɸ6 was detected on 10% of scrubs, and the fault tree predicted a 10.4% contamination rate, likely occurring when the PAPR hood inadvertently contacted scrubs during removal. MS2 was detected on 10% of hands, 20% of scrubs, and 70% of inner gloves, and the predicted rates were 7.3%, 19.4%, and 73.4%, respectively. Fault trees for MS2 and ɸ6 contamination suggested similar pathways. Ebola-level PPE can both protect and put HCWs at risk for self-contamination throughout the doffing process, even among experienced HCWs doffing with a trained observer. Human factors methodologies can identify error-prone steps, delineate the relationship between errors and self-contamination, and suggest remediation strategies.
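
The study's exact risk-index scoring is not given in the abstract; the classic FMEA risk priority number, which this kind of analysis resembles, can be sketched as follows (all step names and scores below are invented for illustration):

```python
def risk_priority_number(severity, occurrence, detection):
    """Classic FMEA risk priority number: S x O x D, each scored on a fixed
    ordinal scale (e.g. 1-10). The study's own risk index may differ."""
    return severity * occurrence * detection

def rank_failure_modes(modes):
    """modes: name -> (severity, occurrence, detection) tuple.
    Returns mode names sorted from highest to lowest risk."""
    return sorted(modes, key=lambda m: risk_priority_number(*modes[m]),
                  reverse=True)

# Invented scores for three doffing steps.
steps = {'hand hygiene': (8, 7, 5),
         'PAPR hood removal': (7, 5, 4),
         'outer glove removal': (4, 3, 2)}
print(rank_failure_modes(steps))
```

Ranking failure modes this way is what lets an FMEA flag the error-prone steps (here, analogous to the high total risk indexes reported for hand hygiene and PAPR hood removal).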

  19. Direct data access protocols benchmarking on DPM

    Science.gov (United States)

    Furano, Fabrizio; Devresse, Adrien; Keeble, Oliver; Mancinelli, Valentina

    2015-12-01

    The Disk Pool Manager (DPM) is an example of a multi-protocol, multi-VO system for data access on the Grid that went through considerable technical evolution in recent years. Among other features, its architecture offers the opportunity to test its different data access frontends under exactly the same conditions, including hardware and backend software. This characteristic inspired the idea of collecting monitoring information from various testbeds in order to benchmark the behaviour of the HTTP and Xrootd protocols for the use case of data analysis, batch or interactive. One source of information is the set of continuous tests run against the worldwide endpoints belonging to the DPM Collaboration, which accumulated relevant statistics in its first year of activity. On top of that, the DPM releases are based on multiple levels of automated testing that include performance benchmarks of various kinds, executed regularly every day. At the same time, recent releases of DPM can report monitoring information about any data access protocol to the same monitoring infrastructure that is used to monitor the Xrootd deployments. Our goal is to evaluate under which circumstances the HTTP-based protocols can be good enough for batch or interactive data access. In this contribution we show and discuss the results that our test systems have collected under circumstances that include ROOT analyses using TTreeCache and stress tests on the metadata performance.

  20. Gene expression differences between PAXgene and Tempus blood RNA tubes are highly reproducible between independent samples and biobanks.

    Science.gov (United States)

    Skogholt, Anne Heidi; Ryeng, Einar; Erlandsen, Sten Even; Skorpen, Frank; Schønberg, Svanhild A; Sætrom, Pål

    2017-03-23

    Gene expression profiling from blood is sensitive to technology choices. For example, the two main blood RNA collection systems, the PAXgene and Tempus tubes, differently influence RNA expression signatures. The aim of this study was to establish a common RNA isolation protocol for these two systems and investigate whether it could reduce the differences in gene expression between them. We collected identical blood samples on the PAXgene and Tempus systems and retrieved blood samples from two independent biobanks, NOWAC and HUNT3, which are based on PAXgene and Tempus, respectively. High-quality RNA was extracted from both sampling systems using their original protocols and our common modified protocol, and was profiled on Illumina microarrays. Regardless of the protocol used, we found most of the measured transcripts to be differently affected by the two sampling systems. However, our modified protocol reduced the number of transcripts that were significantly differentially expressed between PAXgene and Tempus by approximately 50%. Expression differences between PAXgene and Tempus were highly reproducible both between protocols and between different independent sample sets (Pearson correlation 0.563-0.854 across 47,323 probes). Moreover, the modified protocol increased the microRNA output of the system with the lowest microRNA yield, the PAXgene system. Most transcripts are affected by the choice of sampling system, but these effects are highly reproducible between independent samples. We propose that by running a control experiment with samples on both systems in parallel with biologically relevant samples, researchers may adjust for technical differences between the sampling systems.
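
    The parallel-control adjustment proposed above can be sketched as follows: estimate a per-probe offset from paired control samples run on both tube systems, then subtract it from measurements taken on one system. The probe IDs and log2 intensity values below are invented for illustration.

```python
# Hedged sketch of the parallel-control idea: per-probe Tempus-minus-PAXgene
# offsets estimated from paired control samples, then applied to adjust
# Tempus-collected study samples (toy log2 values, hypothetical probe names).
def probe_offsets(paxgene_ctrl, tempus_ctrl):
    """Mean Tempus-minus-PAXgene difference per probe, from paired controls."""
    offsets = {}
    for probe in paxgene_ctrl:
        diffs = [t - p for p, t in zip(paxgene_ctrl[probe], tempus_ctrl[probe])]
        offsets[probe] = sum(diffs) / len(diffs)
    return offsets

pax_ctrl = {"ILMN_0001": [8.0, 8.2], "ILMN_0002": [5.1, 5.0]}
tmp_ctrl = {"ILMN_0001": [8.9, 9.1], "ILMN_0002": [4.6, 4.7]}
offsets = probe_offsets(pax_ctrl, tmp_ctrl)

tempus_study = {"ILMN_0001": 9.4, "ILMN_0002": 4.9}
adjusted = {p: v - offsets[p] for p, v in tempus_study.items()}
print(adjusted)
```

    A real analysis would do this on the full probe set with replicated controls and would check that the offsets are stable across batches, as the reported between-biobank reproducibility suggests they are.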

  1. Tritium sample analyses in the Savannah River and associated waterways following the K-reactor release of December 1991

    International Nuclear Information System (INIS)

    Beals, D.M.; Dunn, D.L.; Hall, G.; Kantelo, M.V.

    1992-01-01

    An unplanned release of tritiated water occurred at the K reactor on SRS between 22 and 25 December 1991. This water moved down through the effluent canal, Pen Branch, and Steel Creek, and finally to the Savannah River. Samples were collected in the Savannah River and associated waterways over a period of a month. The Environmental Technology Section (ETS) of the Savannah River Laboratory performed liquid scintillation analyses to monitor the passage of the tritiated water from SRS to the Atlantic Ocean.

  2. Energy and fossil fuels as a topic in WTO accession protocols

    NARCIS (Netherlands)

    Marhold, Anna; Weiss, Friedl; Bungenberg, M; Krajewski, M; Tams, C; Terhechte, JP; Ziegler, AR

    2018-01-01

    This article seeks to analyse and compare WTO Accession Protocols, particularly the interpretations given to relevant commitments made in them regarding energy and fossil fuels. Much has changed in global trade relations since the launch of the Doha Round of multilateral trade negotiations in November

  3. Efficient DNP NMR of Membrane Proteins: Sample Preparation Protocols, Sensitivity, and Radical Location

    Science.gov (United States)

    Liao, Shu Y.; Lee, Myungwoon; Wang, Tuo; Sergeyev, Ivan V.; Hong, Mei

    2016-01-01

    Although dynamic nuclear polarization (DNP) has dramatically enhanced solid-state NMR spectral sensitivities of many synthetic materials and some biological macromolecules, recent studies of membrane-protein DNP using exogenously doped paramagnetic radicals as polarizing agents have reported varied and sometimes surprisingly limited enhancement factors. This motivated us to carry out a systematic evaluation of sample preparation protocols for optimizing the sensitivity of DNP NMR spectra of membrane-bound peptides and proteins at cryogenic temperatures of ~110 K. We show that mixing the radical with the membrane by direct titration instead of centrifugation gives a significant boost to DNP enhancement. We quantify the relative sensitivity enhancement between AMUPol and TOTAPOL, two commonly used radicals, and between deuterated and protonated lipid membranes. AMUPol shows ~4 fold higher sensitivity enhancement than TOTAPOL, while deuterated lipid membrane does not give net higher sensitivity for the membrane peptides than protonated membrane. Overall, a ~100 fold enhancement between the microwave-on and microwave-off spectra can be achieved on lipid-rich membranes containing conformationally disordered peptides, and absolute sensitivity gains of 105–160 can be obtained between low-temperature DNP spectra and high-temperature non-DNP spectra. We also measured the paramagnetic relaxation enhancement of lipid signals by TOTAPOL and AMUPol, to determine the depths of these two radicals in the lipid bilayer. Our data indicate a bimodal distribution of both radicals, a surface-bound fraction and a membrane-bound fraction where the nitroxides lie at ~10 Å from the membrane surface. TOTAPOL appears to have a higher membrane-embedded fraction than AMUPol. These results should be useful for membrane-protein solid-state NMR studies under DNP conditions and provide insights into how biradicals interact with phospholipid membranes. PMID:26873390

  4. Efficient DNP NMR of membrane proteins: sample preparation protocols, sensitivity, and radical location

    Energy Technology Data Exchange (ETDEWEB)

    Liao, Shu Y.; Lee, Myungwoon; Wang, Tuo [Massachusetts Institute of Technology, Department of Chemistry (United States); Sergeyev, Ivan V. [Bruker Biospin (United States); Hong, Mei, E-mail: meihong@mit.edu [Massachusetts Institute of Technology, Department of Chemistry (United States)

    2016-03-15

    Although dynamic nuclear polarization (DNP) has dramatically enhanced solid-state NMR spectral sensitivities of many synthetic materials and some biological macromolecules, recent studies of membrane-protein DNP using exogenously doped paramagnetic radicals as polarizing agents have reported varied and sometimes surprisingly limited enhancement factors. This motivated us to carry out a systematic evaluation of sample preparation protocols for optimizing the sensitivity of DNP NMR spectra of membrane-bound peptides and proteins at cryogenic temperatures of ~110 K. We show that mixing the radical with the membrane by direct titration instead of centrifugation gives a significant boost to DNP enhancement. We quantify the relative sensitivity enhancement between AMUPol and TOTAPOL, two commonly used radicals, and between deuterated and protonated lipid membranes. AMUPol shows ~fourfold higher sensitivity enhancement than TOTAPOL, while deuterated lipid membrane does not give net higher sensitivity for the membrane peptides than protonated membrane. Overall, a ~100 fold enhancement between the microwave-on and microwave-off spectra can be achieved on lipid-rich membranes containing conformationally disordered peptides, and absolute sensitivity gains of 105–160 can be obtained between low-temperature DNP spectra and high-temperature non-DNP spectra. We also measured the paramagnetic relaxation enhancement of lipid signals by TOTAPOL and AMUPol, to determine the depths of these two radicals in the lipid bilayer. Our data indicate a bimodal distribution of both radicals, a surface-bound fraction and a membrane-bound fraction where the nitroxides lie at ~10 Å from the membrane surface. TOTAPOL appears to have a higher membrane-embedded fraction than AMUPol. These results should be useful for membrane-protein solid-state NMR studies under DNP conditions and provide insights into how biradicals interact with phospholipid membranes.

  5. Analyses of Digman's child-personality data: derivation of Big-Five factor scores from each of six samples.

    Science.gov (United States)

    Goldberg, L R

    2001-10-01

    One of the world's richest collections of teacher descriptions of elementary-school children was obtained by John M. Digman from 1959 to 1967 in schools on two Hawaiian islands. In six phases of data collection, 88 teachers described 2,572 of their students, using one of five different sets of personality variables. The present report provides findings from new analyses of these important data, which have never before been analyzed in a comprehensive manner. When factors developed from carefully selected markers of the Big-Five factor structure were compared to those based on the total set of variables in each sample, the congruence between both types of factors was quite high. Attempts to extend the structure to 6 and 7 factors revealed no other broad factors beyond the Big Five in any of the 6 samples. These robust findings provide significant new evidence for the structure of teacher-based assessments of child personality attributes.
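
    Factor congruence across samples, as reported above, is conventionally quantified with Tucker's congruence coefficient. A minimal sketch follows; the loading vectors are hypothetical, not from Digman's data.

```python
# Minimal sketch of Tucker's congruence coefficient, the usual index for
# comparing factor loadings across samples (illustrative loadings only).
from math import sqrt

def congruence(x, y):
    """Tucker's phi: cosine of the angle between two loading vectors."""
    num = sum(a * b for a, b in zip(x, y))
    den = sqrt(sum(a * a for a in x)) * sqrt(sum(b * b for b in y))
    return num / den

marker_factor   = [0.71, 0.65, 0.58, -0.12, 0.05]   # hypothetical loadings
full_set_factor = [0.69, 0.70, 0.55, -0.08, 0.02]
phi = congruence(marker_factor, full_set_factor)
print(f"congruence = {phi:.3f}")
```

    Values of phi above roughly 0.95 are commonly read as indicating factor equivalence across samples.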

  6. Mapping of Schistosomiasis and Soil-Transmitted Helminths in Namibia: The First Large-Scale Protocol to Formally Include Rapid Diagnostic Tests.

    Directory of Open Access Journals (Sweden)

    José Carlos Sousa-Figueiredo

    Full Text Available Namibia is now ready to begin mass drug administration of praziquantel and albendazole against schistosomiasis and soil-transmitted helminths, respectively. Although historical data identifies areas of transmission of these neglected tropical diseases (NTDs, there is a need to update epidemiological data. For this reason, Namibia adopted a new protocol for mapping of schistosomiasis and geohelminths, formally integrating rapid diagnostic tests (RDTs for infections and morbidity. In this article, we explain the protocol in detail, and introduce the concept of 'mapping resolution', as well as present results and treatment recommendations for northern Namibia.This new protocol allowed a large sample to be surveyed (N = 17,896 children from 299 schools at relatively low cost (7 USD per person mapped and very quickly (28 working days. All children were analysed by RDTs, but only a sub-sample was also diagnosed by light microscopy. Overall prevalence of schistosomiasis in the surveyed areas was 9.0%, highly associated with poorer access to potable water (OR = 1.5, P<0.001 and defective (OR = 1.2, P<0.001 or absent sanitation infrastructure (OR = 2.0, P<0.001. Overall prevalence of geohelminths, more particularly hookworm infection, was 12.2%, highly associated with presence of faecal occult blood (OR = 1.9, P<0.001. Prevalence maps were produced and hot spots identified to better guide the national programme in drug administration, as well as targeted improvements in water, sanitation and hygiene. The RDTs employed (circulating cathodic antigen and microhaematuria for Schistosoma mansoni and S. haematobium, respectively performed well, with sensitivities above 80% and specificities above 95%.This protocol is cost-effective and sensitive to budget limitations and the potential economic and logistical strains placed on the national Ministries of Health. Here we present a high resolution map of disease prevalence levels, and treatment regimens are

  7. Experimental development of a new protocol for extraction and characterization of microplastics in fish tissues: First observations in commercial species from Adriatic Sea.

    Science.gov (United States)

    Avio, Carlo Giacomo; Gorbi, Stefania; Regoli, Francesco

    2015-10-01

    The presence of microplastics in the marine environment has raised scientific interest during the last decade. Several organisms can ingest microplastics, with potentially adverse effects on the digestive tract, respiratory system and locomotory appendages. However, clear evidence of tissue accumulation and transfer of such microparticles in wild organisms is still lacking, partially hampered by technical difficulties in isolation and characterization protocols for biological samples. In this work, we compared the efficacy of some existing approaches and optimized a new protocol allowing an extraction yield of microplastics from fish tissues ranging between 78% and 98%, depending on the polymer size. FT-IR analyses confirmed that the extraction procedure did not affect the particles' characteristics. The method was further validated on the mullet, Mugil cephalus, exposed under laboratory conditions to polystyrene and polyethylene; the particles were isolated and quantified in stomach and liver, and their presence in the hepatic tissue was confirmed also by histological analyses. A preliminary characterization revealed the presence and distribution of microplastics in various fish species collected along the Adriatic Sea. FT-IR analyses indicated polyethylene as the predominant polymer (65%) in the stomach of fish. The overall results confirmed the newly developed method as a reliable approach to detect and quantify microplastics in the marine biota. Copyright © 2015 Elsevier Ltd. All rights reserved.

  8. Testing the efficiency of rover science protocols for robotic sample selection: A GeoHeuristic Operational Strategies Test

    Science.gov (United States)

    Yingst, R. A.; Bartley, J. K.; Chidsey, T. C.; Cohen, B. A.; Gilleaudeau, G. J.; Hynek, B. M.; Kah, L. C.; Minitti, M. E.; Williams, R. M. E.; Black, S.; Gemperline, J.; Schaufler, R.; Thomas, R. J.

    2018-05-01

    The GHOST field tests are designed to isolate and test science-driven rover operations protocols to determine best practices. During a recent field test at a potential Mars 2020 landing site analog, we tested two Mars Science Laboratory data-acquisition and decision-making methods to assess the resulting science return and sample quality: a linear method, where sites of interest are studied in the order encountered, and a "walkabout-first" method, where sites of interest are examined remotely before down-selecting to a subset of sites that are interrogated with more resource-intensive instruments. The walkabout method consumed less time and fewer resources while increasing confidence in interpretations. Contextual data critical to evaluating site geology were acquired earlier than in the linear method and given a higher priority, which resulted in development of more mature hypotheses earlier in the analysis process. Combined, this saved time and energy in the collection of data with more limited spatial coverage. Based on these results, we suggest that the walkabout method be used where doing so would provide early context and time for the science team to develop hypotheses and critical tests, and that in gathering context, coverage may be more important than higher resolution.

  9. Efficacy of protocol-based pharmacotherapy management on anticoagulation with warfarin for patients with cardiovascular surgery.

    Science.gov (United States)

    Katada, Y; Nakagawa, S; Minakata, K; Odaka, M; Taue, H; Sato, Y; Yonezawa, A; Kayano, Y; Yano, I; Nakatsu, T; Sakamoto, K; Uehara, K; Sakaguchi, H; Yamazaki, K; Minatoya, K; Sakata, R; Matsubara, K

    2017-10-01

    Anticoagulation therapy with warfarin requires periodic monitoring of the prothrombin time-international normalized ratio (PT-INR) and adequate dose adjustments based on the data to minimize the risk of bleeding and thromboembolic events. In our hospital, we have developed protocol-based pharmaceutical care, which we call protocol-based pharmacotherapy management (PBPM), for warfarin therapy. The protocol requires pharmacists to manage the timing of blood sampling for measuring PT-INR and to determine warfarin dosage based on an algorithm. This study evaluated the efficacy of PBPM in warfarin therapy by comparison with conventional pharmaceutical care. From October 2013 to June 2015, a total of 134 hospitalized patients who underwent cardiovascular surgeries received post-operative warfarin therapy. The early series of patients received warfarin therapy as conventional care (control group, n=77), whereas the latter received warfarin therapy based on the PBPM (PBPM group, n=68). These patients formed the cohort of the present study and were retrospectively analysed. The indications for warfarin included aortic valve replacement (n=56), mitral valve replacement (n=4), mitral valve plasty (n=22) and atrial fibrillation (n=29). There were no differences in patients' characteristics between the groups. The percentage time in therapeutic range in the first 10 days was significantly higher in the PBPM group (47.1%) than in the control group (34.4%). Warfarin therapy based on our novel PBPM was clinically safe and resulted in significantly better anticoagulation control compared to conventional care. © 2017 John Wiley & Sons Ltd.

  10. Preanalytical Blood Sampling Errors in Clinical Settings

    International Nuclear Information System (INIS)

    Zehra, N.; Malik, A. H.; Arshad, Q.; Sarwar, S.; Aslam, S.

    2016-01-01

    Background: Blood sampling is one of the common procedures done in every ward for disease diagnosis and prognosis. Hundreds of samples are collected daily from different wards, but a lack of appropriate knowledge of blood sampling among paramedical staff and accidental errors make samples inappropriate for testing. Thus the need to avoid these errors for better results remains. We carried out this research with the aim of determining the common errors during blood sampling, finding the factors responsible, and proposing ways to reduce these errors. Methods: A cross-sectional descriptive study was carried out at the Military and Combined Military Hospital Rawalpindi during February and March 2014. A Venous Blood Sampling questionnaire (VBSQ) was filled by the staff on a voluntary basis in front of the researchers. The staff were briefed on the purpose of the survey before filling the questionnaire. The sample size was 228. Results were analysed using SPSS-21. Results: Around 61.6 percent of the paramedical staff stated that they cleaned the vein by moving the alcohol swab from inward to outward, while 20.8 percent reported that they felt the vein after disinfection. Contrary to WHO guidelines, 89.6 percent reported a habit of placing blood in the test tube by holding it in the other hand, whereas the tube should be filled after inserting it into the stand. Conclusion: Pre-analytical blood sampling errors are common in our setup. Although 86 percent of participants thought that they had adequate knowledge regarding blood sampling, most of them were not adhering to standard protocols. There is a need for continued education and refresher courses. (author)

  11. Non-destructive analyses of cometary nucleus samples using synchrotron radiation

    International Nuclear Information System (INIS)

    Flynn, G.J.; Sutton, S.R.; Rivers, M.L.

    1989-01-01

    Trace element abundances and abundance patterns in meteorites have proven to be diagnostic indicators of nebular and parent body fractionations, formation temperature, thermal metamorphism, and co-genesis. If comets are more primitive samples of the solar nebula than the meteorites, then trace element abundances in returned comet nucleus samples should be better indicators of primitive solar nebula conditions than those of meteorites. Comet nucleus samples are likely to consist of a mixture of ices and mineral grains. To provide a complete picture of the elemental distributions, trace element abundance data on the bulk material, as well as on separated mineral grains and ices, will be required. This paper discusses present and future analytical capabilities. 22 refs., 2 figs.

  12. Protocol for collecting eDNA samples from streams [Version 2.3

    Science.gov (United States)

    K. J. Carim; T. Wilcox; M. K. Young; K. S. McKelvey; M. K. Schwartz

    2015-01-01

    Throughout the 2014 field season, we had over two dozen biologists across the western US collect over 300 samples for eDNA analysis with paired controls. Control samples were collected by filtering 0.5 L of distilled water. No samples had any evidence of field contamination. This method of sampling verifies the cleanliness of the field equipment, as well as the...

  13. Water-quality assessment of south-central Texas : comparison of water quality in surface-water samples collected manually and by automated samplers

    Science.gov (United States)

    Ging, Patricia B.

    1999-01-01

    Surface-water sampling protocols of the U.S. Geological Survey National Water-Quality Assessment (NAWQA) Program specify samples for most properties and constituents to be collected manually in equal-width increments across a stream channel and composited for analysis. Single-point sampling with an automated sampler (autosampler) during storms was proposed in the upper part of the South-Central Texas NAWQA study unit, raising the question of whether property and constituent concentrations from automatically collected samples differ significantly from those in samples collected manually. Statistical (Wilcoxon signed-rank test) analyses of 3 to 16 paired concentrations for each of 26 properties and constituents from water samples collected using both methods at eight sites in the upper part of the study unit indicated that there were no significant differences in concentrations for dissolved constituents, other than calcium and organic carbon.
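
    The comparison above rests on the Wilcoxon signed-rank test applied to paired manual/autosampler concentrations. A minimal pure-Python sketch of the test statistic follows, with invented concentration pairs; for real work a statistics package (e.g. scipy.stats.wilcoxon) also supplies the p-value.

```python
# Minimal sketch of the Wilcoxon signed-rank statistic for paired samples
# (invented concentrations; not the USGS data).
def signed_rank_W(pairs):
    diffs = [m - a for m, a in pairs if m != a]       # drop zero differences
    ranked = sorted((abs(d), d) for d in diffs)
    W_plus = W_minus = 0.0
    i = 0
    while i < len(ranked):
        j = i
        while j < len(ranked) and ranked[j][0] == ranked[i][0]:
            j += 1                                    # find tie group end
        avg_rank = (i + 1 + j) / 2.0                  # mean of ranks i+1..j
        for k in range(i, j):
            if ranked[k][1] > 0:
                W_plus += avg_rank
            else:
                W_minus += avg_rank
        i = j
    return min(W_plus, W_minus)                       # test statistic

manual      = [41.0, 3.2, 18.5, 7.7, 120.0, 0.9]      # hypothetical mg/L
autosampler = [39.5, 3.4, 18.5, 8.1, 118.0, 1.0]
print("W =", signed_rank_W(list(zip(manual, autosampler))))
```

    A small W relative to its null distribution indicates a systematic difference between the two collection methods; here the statistic alone is shown.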

  14. A novel porcine cell culture based protocol for the propagation of hepatitis E virus

    Directory of Open Access Journals (Sweden)

    Walter Chingwaru

    2016-08-01

    Full Text Available Objective: To present a comprehensive protocol for the processing of hepatitis E virus (HEV)-infected samples and propagation of the virus in primary cell cultures. Methods: HEV was extracted from porcine liver and faecal samples following standard protocols. The virus was then allowed to attach, in the presence of trypsin, to primary cells that included porcine and bovine intestinal epithelial cells and macrophages, over a period of up to 3 h. The virus was propagated by rotational passaging through the cell cultures. Propagation was confirmed by immunoblotting. Results: We developed a comprehensive protocol to propagate HEV in a porcine cell model that includes (i) rotational culturing of the virus between porcine cell types, (ii) pre-incubation of infected cells for 210 min, (iii) use of a semi-complete cell culture medium supplemented with trypsin (0.33 µg/mL) and (iv) the use of a simple immunoblot technique to detect the amplified virus based on the open reading frame 2/3. Conclusions: This protocol opens doors towards systematic analysis of the mechanisms that underlie the pathogenesis of HEV in vitro. Using our protocol, one can complete the propagation process within 6 to 9 d.

  15. Exact Throughput Analyses of Energy-Harvesting Cooperation Scheme with Best Relay Selections Under I/Q Imbalance

    Directory of Open Access Journals (Sweden)

    Tan Phuoc Huynh

    2017-01-01

    Full Text Available In this paper, we propose an energy-harvesting cooperation scheme in which relays suffer from in-phase and quadrature-phase imbalance (IQI) and harvest energy from a wireless transmit source. The best relay is selected based on end-to-end signal-to-interference-plus-noise ratios (SINRs) in both Amplify-and-Forward (the EHAF protocol) and Decode-and-Forward (the EHDF protocol) cooperation methods. We analyze and evaluate the system performance in terms of exact closed-form throughputs over Rayleigh fading channels. Simulation and analysis results yield the following contributions. Firstly, the throughput performance of the proposed EHAF and EHDF protocols is improved compared with that of a non-selection cooperation scheme. Secondly, the EHDF protocol is more efficient than the EHAF protocol. Finally, the theoretical analyses are validated by Monte Carlo simulations.
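
    A stripped-down Monte Carlo sketch of best-relay selection for a DF scheme over Rayleigh fading follows (per-hop SNRs drawn as exponentials). All parameters are illustrative, and the paper's I/Q-imbalance and energy-harvesting terms are deliberately omitted; this only shows why selection among more relays raises throughput.

```python
# Monte Carlo sketch: best-relay selection, DF relaying, Rayleigh fading
# (exponential per-hop SNRs). Illustrative only; ignores IQI and harvesting.
import random

def df_throughput(n_relays, mean_snr, rate, threshold, trials=20000, seed=1):
    random.seed(seed)
    ok = 0
    for _ in range(trials):
        best = 0.0
        for _ in range(n_relays):
            # DF end-to-end SNR is limited by the weaker of the two hops
            e2e = min(random.expovariate(1 / mean_snr),
                      random.expovariate(1 / mean_snr))
            best = max(best, e2e)
        if best > threshold:
            ok += 1
    return rate * ok / trials        # average delivered rate (bits/s/Hz)

for n in (1, 2, 4):
    print(n, "relays:", round(df_throughput(n, mean_snr=10.0, rate=1.0,
                                            threshold=3.0), 3))
```

    With a mean per-hop SNR of 10 and an outage threshold of 3, throughput rises with the number of candidate relays, which is the non-selection-versus-selection comparison the abstract reports.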

  16. Improved reproducibility in genome-wide DNA methylation analysis for PAXgene® fixed samples compared to restored FFPE DNA

    DEFF Research Database (Denmark)

    Andersen, Gitte Brinch; Hager, Henrik; Hansen, Lise Lotte

    2014-01-01

    Formalin fixation has been the standard method for conservation of clinical specimens for decades. However, a major drawback is the high degradation of nucleic acids, which complicates its use in genome-wide analyses. Unbiased identification of biomarkers, however, requires genome-wide studies, precluding the use of the valuable archives of specimens with long-term follow-up data. Therefore, restoration protocols for DNA from formalin-fixed and paraffin-embedded (FFPE) samples have been developed, although they are cost-intensive and time-consuming. An alternative to FFPE and snap-freezing is PAXgene fixation. Quantitative DNA methylation analysis demonstrated that the methylation profile in PAXgene-fixed tissues showed, in comparison with restored FFPE samples, a higher concordance with the profile detected in frozen samples. We demonstrate, for the first time, that DNA from PAXgene-conserved tissue performs better...

  17. Monitoring well utility in a heterogeneous DNAPL source zone area: Insights from proximal multilevel sampler wells and sampling capture-zone modelling.

    Science.gov (United States)

    McMillan, Lindsay A; Rivett, Michael O; Wealthall, Gary P; Zeeb, Peter; Dumble, Peter

    2018-03-01

    Groundwater-quality assessment at contaminated sites often involves the use of short-screen (1.5 to 3 m) monitoring wells. However, even over these intervals considerable variation may occur in contaminant concentrations in groundwater adjacent to the well screen. This is especially true in heterogeneous dense non-aqueous phase liquid (DNAPL) source zones, where cm-scale contamination variability may call into question the effectiveness of monitoring wells to deliver representative data. The utility of monitoring wells in such settings is evaluated by reference to high-resolution multilevel sampler (MLS) wells located proximally to short-screen wells, together with sampling capture-zone modelling to explore controls upon well sample provenance and sensitivity to monitoring protocols. Field data are analysed from the highly instrumented SABRE research site, which contained an old trichloroethene source zone within a shallow alluvial aquifer at a UK industrial facility. With increased purging, monitoring-well samples tend to a flow-weighted average concentration but may exhibit sensitivity to the implemented protocol and degree of purging. Formation heterogeneity adjacent to the well screen in particular, alongside pump-intake position and water level, influences this sensitivity. Purging of low volumes is vulnerable to poor reproducibility arising from the concentration variability predicted over the initial 1 to 2 screen volumes purged. Marked heterogeneity may also result in limited long-term sample concentration stabilization. Development of bespoke monitoring protocols that consider screen volumes purged, alongside water-quality indicator parameter stabilization, is recommended to validate and reduce uncertainty when interpreting monitoring-well data within source zone areas. Generalised recommendations on monitoring-well-based protocols are also developed.
A key monitoring well utility is their proportionately greater sample draw from permeable horizons constituting
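
    The "screen volumes purged" bookkeeping recommended above amounts to simple geometry: one screen volume is the standing water in the screened interval. A back-of-envelope sketch, with illustrative well dimensions and pump rate:

```python
# Back-of-envelope sketch of screen-volume purge accounting
# (illustrative dimensions, not the SABRE site's wells).
import math

def screen_volume_litres(diameter_m, screen_length_m):
    """Standing water in the screened interval: pi * r^2 * L, in litres."""
    r = diameter_m / 2.0
    return math.pi * r**2 * screen_length_m * 1000.0   # m^3 -> L

def volumes_purged(pump_rate_l_per_min, minutes, diameter_m, screen_length_m):
    return (pump_rate_l_per_min * minutes
            / screen_volume_litres(diameter_m, screen_length_m))

# hypothetical: 50 mm well, 2 m screen, purged at 1 L/min for 10 minutes
v = screen_volume_litres(0.05, 2.0)
n = volumes_purged(1.0, 10, 0.05, 2.0)
print(f"screen volume = {v:.2f} L, purged = {n:.1f} screen volumes")
```

    The paper's caution is that concentrations can still be varying over the first 1 to 2 such volumes, so a purge log in these units, alongside indicator-parameter stabilization, helps judge when a sample is representative.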

  19. Stability of purgeable VOCs in water samples during pre-analytical holding. Part 2: Analyses by an EPA regional laboratory

    Energy Technology Data Exchange (ETDEWEB)

    West, O.R.; Bayne, C.K.; Siegrist, R.L.; Holden, W.L. [Oak Ridge National Lab., TN (United States); Bottrell, D.W. [Dept. of Energy, Germantown, MD (United States)

    1997-03-01

    This study was undertaken to examine the hypothesis that prevalent and priority purgeable VOCs in properly preserved water samples are stable for at least 28 days. For the purposes of this study, VOCs were considered functionally stable if concentrations measured after 28 days did not change by more than 10% from the initial values. An extensive stability experiment was performed on freshly collected surface water spiked with a suite of 44 purgeable VOCs. The spiked water was then distributed into multiple 40-mL VOC vials with 0.010-in Teflon-lined silicone septum caps prefilled with 250 mg of NaHSO4 (resulting pH of the water ~2). The samples were sent to a commercial laboratory [Analytical Resources, Inc. (ARI)] and an EPA (Region IV) laboratory, where they were stored at 4°C. On 1, 8, 15, 22, 29, 36, and 71 days after sample preparation, analysts from ARI took 4 replicate samples out of storage and analyzed them for purgeable VOCs following EPA/SW846 8260A. A similar analysis schedule was followed by analysts at the EPA laboratory. This document contains the results from the EPA analyses; the ARI results are described in a separate report.
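
    The "functionally stable" criterion above (day-28 value within ±10% of the initial value) is easy to express directly. A sketch with invented concentrations follows; the analyte names are real VOCs but the numbers are illustrative, not the study's results.

```python
# Sketch of the functional-stability criterion described above: a compound
# passes if its day-28 concentration is within +/-10% of the initial value
# (invented concentrations in ug/L).
def is_stable(initial, day28, tolerance=0.10):
    return abs(day28 - initial) / initial <= tolerance

analytes = {
    "trichloroethene": (20.0, 19.1),
    "benzene":         (20.0, 21.4),
    "styrene":         (20.0, 16.9),   # >10% loss -> not stable
}
for name, (c0, c28) in analytes.items():
    verdict = "stable" if is_stable(c0, c28) else "NOT stable"
    print(f"{name:16s} {verdict}")
```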

  20. Analyzing the effect of routing protocols on media access control protocols in radio networks

    Energy Technology Data Exchange (ETDEWEB)

    Barrett, C. L. (Christopher L.); Drozda, M. (Martin); Marathe, A. (Achla); Marathe, M. V. (Madhav V.)

    2002-01-01

    We study the effect of routing protocols on the performance of media access control (MAC) protocols in wireless radio networks. Three well-known MAC protocols are considered: 802.11, CSMA, and MACA. Similarly, three recently proposed routing protocols are considered: AODV, DSR and LAR scheme 1. The experimental analysis was carried out using GloMoSim, a tool for simulating wireless networks. The main focus of our experiments was to study how the routing protocols affect the performance of the MAC protocols when the underlying network and traffic parameters are varied. The performance of the protocols was measured w.r.t. five important parameters: (i) number of received packets, (ii) average latency of each packet, (iii) throughput, (iv) long-term fairness and (v) number of control packets at the MAC layer level. Our results show that combinations of routing and MAC protocols yield varying performance under varying network topologies and traffic situations. This result has an important implication: no combination of routing protocol and MAC protocol is the best over all situations. Also, the performance of protocols at a given level in the protocol stack needs to be studied not locally in isolation but as part of the complete protocol stack. A novel aspect of our work is the use of a statistical technique, ANOVA (Analysis of Variance), to characterize the effect of routing protocols on MAC protocols. This technique is of independent interest and can be utilized in several other simulation and empirical studies.
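
    The ANOVA idea mentioned above, asking whether the choice of routing protocol explains variance in a MAC-layer metric, can be sketched with a one-way F statistic. The latency samples per protocol below are made up; a real analysis would use a statistics package and the authors' full factorial design.

```python
# Minimal one-way ANOVA F statistic: does the routing protocol affect a
# MAC-layer metric? (made-up latency samples, illustrative only)
def one_way_anova_F(groups):
    k = len(groups)
    n = sum(len(g) for g in groups)
    grand = sum(sum(g) for g in groups) / n
    ss_between = sum(len(g) * (sum(g) / len(g) - grand) ** 2 for g in groups)
    ss_within = sum(sum((x - sum(g) / len(g)) ** 2 for x in g) for g in groups)
    return (ss_between / (k - 1)) / (ss_within / (n - k))

latency_ms = {
    "AODV": [12.1, 13.4, 11.8, 12.9],
    "DSR":  [14.0, 15.2, 13.7, 14.4],
    "LAR1": [12.5, 12.8, 13.1, 12.2],
}
F = one_way_anova_F(list(latency_ms.values()))
dof1 = len(latency_ms) - 1
dof2 = sum(map(len, latency_ms.values())) - len(latency_ms)
print(f"F({dof1}, {dof2}) = {F:.2f}")
```

    A large F relative to the F(k-1, n-k) distribution indicates the factor (here, routing protocol) explains a significant share of the variance in the metric.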

  1. An Enhanced Biometric Based Authentication with Key-Agreement Protocol for Multi-Server Architecture Based on Elliptic Curve Cryptography.

    Science.gov (United States)

    Reddy, Alavalapati Goutham; Das, Ashok Kumar; Odelu, Vanga; Yoo, Kee-Young

    2016-01-01

    Biometric based authentication protocols for multi-server architectures have gained momentum in recent times due to advancements in wireless technologies and associated constraints. Lu et al. recently proposed a robust biometric based authentication with key agreement protocol for a multi-server environment using smart cards. They claimed that their protocol is efficient and resistant to prominent security attacks. The careful investigation of this paper proves that Lu et al.'s protocol does not provide user anonymity or perfect forward secrecy and is susceptible to server and user impersonation attacks, man-in-the-middle attacks and clock synchronization problems. In addition, this paper proposes an enhanced biometric based authentication with key-agreement protocol for multi-server architecture based on elliptic curve cryptography using smartcards. We proved that the proposed protocol achieves mutual authentication using Burrows-Abadi-Needham (BAN) logic. The formal security of the proposed protocol is verified using the AVISPA (Automated Validation of Internet Security Protocols and Applications) tool to show that our protocol can withstand active and passive attacks. The formal and informal security analyses and the performance analysis demonstrate that the proposed protocol is robust and efficient compared to Lu et al.'s protocol and existing similar protocols.
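
    The abstract does not give the protocol's construction, but the elliptic-curve key agreement at its core can be illustrated with a textbook Diffie-Hellman exchange. The curve below (y^2 = x^3 + 2x + 2 over F_17) is a tiny teaching example, far too small to be secure, and the private keys are arbitrary; nothing here is Reddy et al.'s actual scheme.

```python
P_MOD = 17          # toy field prime (textbook example; NOT secure)
A, B = 2, 2         # curve y^2 = x^3 + 2x + 2 over F_17
G = (5, 1)          # generator of a subgroup of order 19

def point_add(P, Q):
    """Add two points on the toy curve; None is the point at infinity."""
    if P is None:
        return Q
    if Q is None:
        return P
    (x1, y1), (x2, y2) = P, Q
    if x1 == x2 and (y1 + y2) % P_MOD == 0:
        return None                                   # P + (-P) = identity
    if P == Q:
        m = (3 * x1 * x1 + A) * pow(2 * y1, -1, P_MOD) % P_MOD
    else:
        m = (y2 - y1) * pow(x2 - x1, -1, P_MOD) % P_MOD
    x3 = (m * m - x1 - x2) % P_MOD
    return (x3, (m * (x1 - x3) - y1) % P_MOD)

def scalar_mult(k, P):
    """Double-and-add scalar multiplication k*P."""
    R = None
    while k:
        if k & 1:
            R = point_add(R, P)
        P = point_add(P, P)
        k >>= 1
    return R

# ECDH sketch: each party picks a private scalar and exchanges public points
alice_priv, bob_priv = 3, 7
alice_pub = scalar_mult(alice_priv, G)
bob_pub = scalar_mult(bob_priv, G)
shared_a = scalar_mult(alice_priv, bob_pub)
shared_b = scalar_mult(bob_priv, alice_pub)
assert shared_a == shared_b                           # both derive the same point
```

Real deployments use standardized curves (e.g. P-256) via a vetted library; authenticated key-agreement protocols such as the one proposed additionally bind these exchanges to smartcard secrets and biometric verification.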

  2. Cow-specific diet digestibility predictions based on near-infrared reflectance spectroscopy scans of faecal samples.

    Science.gov (United States)

    Mehtiö, T; Rinne, M; Nyholm, L; Mäntysaari, P; Sairanen, A; Mäntysaari, E A; Pitkänen, T; Lidauer, M H

    2016-04-01

    This study was designed to obtain information on prediction of diet digestibility from near-infrared reflectance spectroscopy (NIRS) scans of faecal spot samples from dairy cows at different stages of lactation and to develop a faecal sampling protocol. NIRS was used to predict diet organic matter digestibility (OMD) and indigestible neutral detergent fibre content (iNDF) from faecal samples, and dry matter digestibility (DMD) using iNDF in feed and faecal samples as an internal marker. Acid-insoluble ash (AIA) as an internal digestibility marker was used as a reference method to evaluate the reliability of NIRS predictions. Feed and composite faecal samples were collected from 44 cows at approximately 50, 150 and 250 days in milk (DIM). The estimated standard deviation for cow-specific organic matter digestibility analysed by AIA was 12.3 g/kg, which is small considering that the average was 724 g/kg. The phenotypic correlation between direct faecal OMD prediction by NIRS and OMD by AIA over the lactation was 0.51. The low repeatability and small variability estimates for direct OMD predictions by NIRS were not accurate enough to quantify small differences in OMD between cows. In contrast to OMD, the repeatability estimates for DMD by iNDF and especially for direct faecal iNDF predictions were 0.32 and 0.46, respectively, indicating that development of NIRS predictions for cow-specific digestibility is possible. A data subset of 20 cows with daily individual faecal samples was used to develop an on-farm sampling protocol. Based on the assessment of correlations between individual sample combinations and composite samples as well as repeatability estimates for individual sample combinations, we found that collecting up to three individual samples yields a representative composite sample. Collection of samples from all the cows of a herd every third month might be a good choice, because it would yield better accuracy. © 2015 Blackwell Verlag GmbH.
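
    Repeatability of the kind estimated above (e.g. 0.32 and 0.46) is an intraclass correlation: the share of total variance attributable to between-cow differences. A minimal sketch, assuming balanced repeated measurements per cow and using invented numbers, not the study's data:

```python
def repeatability(groups):
    """Intraclass correlation (repeatability) from balanced repeated
    measurements per individual, via one-way ANOVA variance components."""
    k = len(groups)                       # number of individuals (cows)
    n0 = len(groups[0])                   # measurements per individual
    grand = sum(x for g in groups for x in g) / (k * n0)
    means = [sum(g) / n0 for g in groups]
    ms_between = n0 * sum((m - grand) ** 2 for m in means) / (k - 1)
    ms_within = sum((x - m) ** 2 for g, m in zip(groups, means) for x in g) / (k * (n0 - 1))
    var_between = max((ms_between - ms_within) / n0, 0.0)
    return var_between / (var_between + ms_within)

# Hypothetical faecal iNDF (g/kg DM) per cow at ~50, 150 and 250 DIM
cows = [
    [61.0, 63.5, 62.0],
    [70.2, 68.9, 71.5],
    [55.4, 57.0, 54.1],
]
r = repeatability(cows)                   # fraction in [0, 1]
```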

  3. A Novel Inspection Protocol to Detect Volatile Compounds in Breast Surgery Electrocautery Smoke

    Directory of Open Access Journals (Sweden)

    Yu-Wen Lin

    2010-07-01

    Conclusion: The sampling protocol enabled acquisition of smoke samples near the source without interrupting surgery. The findings suggest that type of surgery, patient body mass index and duration of electrocautery are factors that can alter production of chemicals.

  4. An attempt to validate serum and plasma as sample matrices for analyses of polychlorobiphenylols

    Energy Technology Data Exchange (ETDEWEB)

    Weiss, J.; Bergman, Aa. [Stockholm Univ. (Sweden). Dept. of Environmental Chemistry; Bignert, A. [Museum of Natural History (Sweden)

    2004-09-15

    Polychlorinated biphenyls (PCBs) form hydroxylated metabolites (OH-PCBs), as reported both from wildlife and from experimental animal studies as early as the 1970s. However, interest in OH-PCBs increased from the mid-1990s following the discovery that some OH-PCB congeners are strongly retained in the blood of birds, fish and mammals, including humans. The interest is linked to the fact that OH-PCBs are strongly, but reversibly, bound to the blood protein transthyretin (TTR). It is reasonable to believe that the strong TTR binding may have toxicological impact, probably related to endocrine-type effects. Importantly, OH-PCBs are present in blood at far higher concentrations than in any other compartment of the body, which is a consequence of the physico-chemical characteristics of the phenols. Analyses of OH-PCBs have thus been concentrated on whole blood, plasma or serum. Still, there is no comparison between the three sample types, even though it is clear that whole blood is not optimal: the large proportion of haemoglobin in the sample makes the clean-up more difficult than if plasma or serum is selected for analysis. In the present study we have addressed two questions. First, we have looked at potential differences in the analytical results for OH-PCBs when using serum and plasma for extraction and clean-up. Second, the serum and plasma applied in the validation had been, respectively, unfrozen, frozen (at -20 C) for two months, and frozen for twenty months.

  5. Settleability assessment protocol for anaerobic granular sludge and ...

    African Journals Online (AJOL)

    The results revealed that the protocol was sufficiently sensitive to define the settleability of the sludge samples and to accurately determine their allowable upflow velocities, resultant organic loading rates, and recycling ratios according to the settleability of the granular bed. Also, a series of graphical procedures with settling tests which ...

  6. Influence of Sampling Season and Sampling Protocol on Detection of Legionella Pneumophila Contamination in Hot Water / Paraugu Ņemšanas Sezonalitātes Un Paraugu Ņemšanas Metodes Ietekme Uz Legionella Pneumophila Kontaminācijas Noteikšanu Karstajš Ūdenī

    Directory of Open Access Journals (Sweden)

    Pūle Daina

    2016-08-01

    Full Text Available Legionella pneumophila is an environmental pathogen of engineered water systems that can cause different forms of legionellosis - from mild fever to potentially lethal pneumonia. Low concentrations of legionellae in natural habitats can increase markedly in engineered hot water systems where water temperatures are below 55 °C. In the current study, we aimed to investigate the influence of sampling season, hot water temperature and sampling protocol on occurrence of L. pneumophila. A total of 120 hot water samples from 20 apartment buildings were collected in two sampling periods - winter 2014 (n = 60) and summer 2015 (n = 60). Significantly higher occurrence of L. pneumophila was observed in summer 2015. Significant differences in temperature for negative and positive samples were not observed, which can be explained by low water temperatures at the point of water consumption. Temperature above 55 °C was observed only once; for all other sampling events it ranged from 14 °C to 53 °C.

  7. Composition analyses of size-resolved aerosol samples taken from aircraft downwind of Kuwait, Spring 1991

    Energy Technology Data Exchange (ETDEWEB)

    Cahill, T.A.; Wilkinson, K. [Univ. of California, Davis, CA (United States); Schnell, R. [National Center for Atmospheric Research, Boulder, CO (United States)

    1992-09-20

    Analyses are reported for eight aerosol samples taken from the National Center for Atmospheric Research Electra typically 200 to 250 km downwind of Kuwait between May 19 and June 1, 1991. Aerosols were separated into fine (D{sub p} < 2.5 {mu}m) and coarse (2.5 {mu}m < D{sub p} < 10 {mu}m) particles for optical, gravimetric, X-ray and nuclear analyses, yielding information on the morphology, mass, and composition of aerosols downwind of Kuwait. The mass of coarse aerosols ranged between 60 and 1971 {mu}g/m{sup 3} and, while dominated by soil derived aerosols, contained considerable content of sulfates and salt (NaCl) and soot in the form of fluffy agglomerates. The mass of fine aerosols varied between 70 and 785 {mu}g/m{sup 3}, of which about 70% was accounted for via compositional analyses performed in vacuum. While most components varied greatly from flight to flight, organic matter and fine soils each accounted for about 1/4 of the fine mass, while salt and sulfates contributed about 10% and 7%, respectively. The Cl/S ratios were remarkably constant, 2.4 {+-} 1.2 for coarse particles and 2.0 {+-} 0.2 for fine particles, with one flight deleted in each case. Vanadium, when observed, ranged from 9 to 27 ng/m{sup 3}, while nickel ranged from 5 to 25 ng/m{sup 3}. In fact, fine sulfates, vanadium, and nickel occurred in levels typical of Los Angeles, California, during summer 1986. The V/Ni ratio, 1.7 {+-} 0.4, was very similar to the ratios measured in fine particles from combusted Kuwaiti oil, 1.4 {+-} 0.9. Bromine, copper, zinc, and arsenic/lead were also observed at levels between 2 and 190 ng/m{sup 3}. The presence of massive amounts of fine, typically alkaline soils in the Kuwaiti smoke plumes significantly modified their behavior and probably mitigated their impacts, locally and globally. 16 refs., 1 fig., 3 tabs.

  8. Current developments in forensic interpretation of mixed DNA samples (Review)

    Science.gov (United States)

    HU, NA; CONG, BIN; LI, SHUJIN; MA, CHUNLING; FU, LIHONG; ZHANG, XIAOJING

    2014-01-01

    A number of recent improvements have provided contemporary forensic investigations with a variety of tools to improve the analysis of mixed DNA samples in criminal investigations, producing notable improvements in the analysis of complex trace samples in cases of sexual assault and homicide. Mixed DNA contains DNA from two or more contributors, compounding DNA analysis by combining DNA from one or more major contributors with small amounts of DNA from potentially numerous minor contributors. These samples are characterized by a high probability of drop-out or drop-in combined with elevated stutter, significantly increasing analysis complexity. At some loci, minor contributor alleles may be completely obscured due to amplification bias or over-amplification, creating the illusion of additional contributors. Thus, estimating the number of contributors and separating contributor genotypes at a given locus is significantly more difficult in mixed DNA samples, requiring the application of specialized protocols that have only recently been widely commercialized and standardized. Over the last decade, the accuracy and repeatability of mixed DNA analyses available to conventional forensic laboratories has greatly advanced in terms of laboratory technology, mathematical models and biostatistical software, generating more accurate, rapid and readily available data for legal proceedings and criminal cases. PMID:24748965

  9. Ocean Optics Protocols for Satellite Ocean Color Sensor Validation. Revised

    Science.gov (United States)

    Fargion, Giulietta S.; Mueller, James L.

    2000-01-01

    The document stipulates protocols for measuring bio-optical and radiometric data for the Sensor Intercomparison and Merger for Biological and Interdisciplinary Oceanic Studies (SIMBIOS) Project activities and algorithm development. This document supersedes the earlier version (Mueller and Austin 1995) published as Volume 25 in the SeaWiFS Technical Report Series. This document marks a significant departure from, and improvement on, the format and content of Mueller and Austin (1995). The authorship of the protocols has been greatly broadened to include experts specializing in some key areas. New chapters have been added to provide detailed and comprehensive protocols for stability monitoring of radiometers using portable sources, above-water measurements of remote-sensing reflectance, spectral absorption measurements for discrete water samples, HPLC pigment analysis and fluorometric pigment analysis. Protocols were included in Mueller and Austin (1995) for each of these areas, but the new treatment makes significant advances in each topic area. There are also new chapters prescribing protocols for calibration of sun photometers and sky radiance sensors, sun photometer and sky radiance measurements and analysis, and data archival. These topic areas were barely mentioned in Mueller and Austin (1995).

  10. Assessing respiratory pathogen communities in bighorn sheep populations: Sampling realities, challenges, and improvements.

    Directory of Open Access Journals (Sweden)

    Carson J Butler

    Full Text Available Respiratory disease has been a persistent problem for the recovery of bighorn sheep (Ovis canadensis), but has uncertain etiology. The disease has been attributed to several bacterial pathogens including Mycoplasma ovipneumoniae and Pasteurellaceae pathogens belonging to the Mannheimia, Bibersteinia, and Pasteurella genera. We estimated detection probability for these pathogens using protocols with diagnostic tests offered by a fee-for-service laboratory and not offered by a fee-for-service laboratory. We conducted 2861 diagnostic tests on swab samples collected from 476 bighorn sheep captured across Montana and Wyoming to gain inferences regarding detection probability, pathogen prevalence, and the power of different sampling methodologies to detect pathogens in bighorn sheep populations. Estimated detection probability using fee-for-service protocols was less than 0.50 for all Pasteurellaceae and 0.73 for Mycoplasma ovipneumoniae. Non-fee-for-service Pasteurellaceae protocols had higher detection probabilities, but no single protocol increased detection probability of all Pasteurellaceae pathogens to greater than 0.50. At least one protocol resulted in an estimated detection probability of 0.80 for each pathogen except Mannheimia haemolytica, for which the highest detection probability was 0.45. In general, the power to detect Pasteurellaceae pathogens at low prevalence in populations was low unless many animals were sampled or replicate samples were collected per animal. Imperfect detection also resulted in low precision when estimating prevalence for any pathogen. Low and variable detection probabilities for respiratory pathogens using live-sampling protocols may lead to inaccurate conclusions regarding pathogen community dynamics and causes of bighorn sheep respiratory disease epizootics. 
We recommend that agencies collect multiple samples per animal for Pasteurellaceae detection, and one sample for Mycoplasma ovipneumoniae detection from
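
    The sampling-power trade-off described above can be made concrete. Assuming independent animals and independent replicate swabs (a simplification), the probability of detecting a pathogen in at least one sample is sketched below; the 10% prevalence and herd size are hypothetical, while 0.45 is the paper's lowest reported detection probability (Mannheimia haemolytica).

```python
def detection_power(prevalence, p_detect, n_animals, n_swabs=1):
    """Probability that at least one sampled animal tests positive, given
    pathogen prevalence, per-swab detection probability, number of animals
    sampled, and replicate swabs per animal (independence assumed)."""
    p_animal_detected = 1.0 - (1.0 - p_detect) ** n_swabs   # >=1 swab positive
    p_signal = prevalence * p_animal_detected               # random animal signals
    return 1.0 - (1.0 - p_signal) ** n_animals

# Replicate swabs raise power substantially at low detection probability
single = detection_power(0.10, 0.45, n_animals=30, n_swabs=1)
triple = detection_power(0.10, 0.45, n_animals=30, n_swabs=3)
```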

  11. Marine sediment sample pre-processing for macroinvertebrates metabarcoding: mechanical enrichment and homogenization

    Directory of Open Access Journals (Sweden)

    Eva Aylagas

    2016-10-01

    Full Text Available Metabarcoding is an accurate and cost-effective technique that allows for simultaneous taxonomic identification of multiple environmental samples. Application of this technique to marine benthic macroinvertebrate biodiversity assessment for biomonitoring purposes requires standardization of laboratory and data analysis procedures. In this context, protocols for creation and sequencing of amplicon libraries and their related bioinformatics analysis have been recently published. However, a standardized protocol describing all previous steps (i.e. processing and manipulation of environmental samples for macroinvertebrate community characterization) is lacking. Here, we provide detailed procedures for benthic environmental sample collection, processing, enrichment for macroinvertebrates, homogenization, and subsequent DNA extraction for metabarcoding analysis. Since this is the first protocol of this kind, it should be of use to any researcher in this field, having the potential for improvement.

  12. Consensus of heterogeneous multi-agent systems based on sampled data with a small sampling delay

    International Nuclear Information System (INIS)

    Wang Na; Wu Zhi-Hai; Peng Li

    2014-01-01

    In this paper, consensus problems of heterogeneous multi-agent systems based on sampled data with a small sampling delay are considered. First, a consensus protocol based on sampled data with a small sampling delay for heterogeneous multi-agent systems is proposed. Then, algebraic graph theory, the matrix method, the stability theory of linear systems, and some other techniques are employed to derive the necessary and sufficient conditions guaranteeing that heterogeneous multi-agent systems asymptotically achieve the stationary consensus. Finally, simulations are performed to demonstrate the correctness of the theoretical results. (interdisciplinary physics and related areas of science and technology)
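
    The flavor of a sampled-data consensus protocol can be seen in a minimal delay-free simulation; this is a generic illustration, not the paper's heterogeneous protocol or its delay analysis. Each agent periodically moves its state toward its neighbours' sampled states.

```python
def consensus_step(states, adjacency, h):
    """One sampled-data update: x_i <- x_i + h * sum_j a_ij * (x_j - x_i)."""
    n = len(states)
    return [states[i] + h * sum(adjacency[i][j] * (states[j] - states[i])
                                for j in range(n))
            for i in range(n)]

# Four agents on an undirected ring, sampling period h = 0.2
A = [[0, 1, 0, 1],
     [1, 0, 1, 0],
     [0, 1, 0, 1],
     [1, 0, 1, 0]]
x = [1.0, 5.0, -2.0, 8.0]
for _ in range(200):
    x = consensus_step(x, A, 0.2)
# On this balanced graph the states converge to the initial average (3.0)
```

Stability requires the sampling period to be small relative to the graph's largest Laplacian eigenvalue (here h*lambda_max = 0.8 < 2); too large a period, like too large a sampling delay, destroys consensus.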

  13. Novel methodology to isolate microplastics from vegetal-rich samples.

    Science.gov (United States)

    Herrera, Alicia; Garrido-Amador, Paloma; Martínez, Ico; Samper, María Dolores; López-Martínez, Juan; Gómez, May; Packard, Theodore T

    2018-04-01

    Microplastics are small plastic particles, globally distributed throughout the oceans. To properly study them, all the methodologies for their sampling, extraction, and measurement should be standardized. For heterogeneous samples containing sediments, animal tissues and zooplankton, several procedures have been described. However, definitive methodologies for samples, rich in algae and plant material, have not yet been developed. The aim of this study was to find the best extraction protocol for vegetal-rich samples by comparing the efficacies of five previously described digestion methods, and a novel density separation method. A protocol using 96% ethanol for density separation was better than the five digestion methods tested, even better than using H 2 O 2 digestion. As it was the most efficient, simple, safe and inexpensive method for isolating microplastics from vegetal rich samples, we recommend it as a standard separation method. Copyright © 2018 Elsevier Ltd. All rights reserved.

  14. Comparison of two cooling protocols for llama semen: with and without collagenase and seminal plasma in the medium.

    Science.gov (United States)

    Carretero, M I; Giuliano, S M; Arraztoa, C C; Santa Cruz, R C; Fumuso, F G; Neild, D M

    2017-08-01

    Seminal plasma (SP) of South American camelids could interfere with the interaction of spermatozoa with the extenders; therefore it becomes necessary to improve semen management using enzymatic treatment. Our objective was to compare two cooling protocols for llama semen. Twelve ejaculates were incubated in 0.1% collagenase and then divided into two aliquots. One was extended in lactose and egg yolk (LEY) (Protocol A: collagenase and SP present). The other aliquot was centrifuged, and the pellet was resuspended in LEY (Protocol B: collagenase and SP absent). Both samples were maintained at 5°C for 24 hr. Routine and DNA evaluations were carried out on raw and cooled semen. Both cooling protocols maintained sperm viability, membrane function and DNA fragmentation levels, with Protocol A showing significantly lower total and progressive motility (p < 0.05); removal of SP from llama semen samples prior to either cooling or freeze-thawing therefore appears advisable. © 2016 Blackwell Verlag GmbH.

  15. Municipal solid waste composition: Sampling methodology, statistical analyses, and case study evaluation

    International Nuclear Information System (INIS)

    Edjabou, Maklawe Essonanawe; Jensen, Morten Bang; Götze, Ramona; Pivnenko, Kostyantyn; Petersen, Claus; Scheutz, Charlotte; Astrup, Thomas Fruergaard

    2015-01-01

    Highlights: • Tiered approach to waste sorting ensures flexibility and facilitates comparison of solid waste composition data. • Food and miscellaneous wastes are the main fractions contributing to the residual household waste. • Separation of food packaging from food leftovers during sorting is not critical for determination of the solid waste composition. - Abstract: Sound waste management and optimisation of resource recovery require reliable data on solid waste generation and composition. In the absence of standardised and commonly accepted waste characterisation methodologies, various approaches have been reported in literature. This limits both comparability and applicability of the results. In this study, a waste sampling and sorting methodology for efficient and statistically robust characterisation of solid waste was introduced. The methodology was applied to residual waste collected from 1442 households distributed among 10 individual sub-areas in three Danish municipalities (both single and multi-family house areas). In total 17 tonnes of waste were sorted into 10–50 waste fractions, organised according to a three-level (tiered) approach facilitating comparison of the waste data between individual sub-areas with different fractionation (waste from one municipality was sorted at “Level III”, i.e. detailed, while the two others were sorted only at “Level I”). The results showed that residual household waste mainly contained food waste (42 ± 5%, mass per wet basis) and miscellaneous combustibles (18 ± 3%, mass per wet basis). The residual household waste generation rate in the study areas was 3–4 kg per person per week. Statistical analyses revealed that the waste composition was independent of variations in the waste generation rate. Both waste composition and waste generation rates were statistically similar for each of the three municipalities. While the waste generation rates were similar for each of the two housing types (single

  16. Municipal solid waste composition: Sampling methodology, statistical analyses, and case study evaluation

    Energy Technology Data Exchange (ETDEWEB)

    Edjabou, Maklawe Essonanawe, E-mail: vine@env.dtu.dk [Department of Environmental Engineering, Technical University of Denmark, 2800 Kgs. Lyngby (Denmark); Jensen, Morten Bang; Götze, Ramona; Pivnenko, Kostyantyn [Department of Environmental Engineering, Technical University of Denmark, 2800 Kgs. Lyngby (Denmark); Petersen, Claus [Econet AS, Omøgade 8, 2.sal, 2100 Copenhagen (Denmark); Scheutz, Charlotte; Astrup, Thomas Fruergaard [Department of Environmental Engineering, Technical University of Denmark, 2800 Kgs. Lyngby (Denmark)

    2015-02-15

    Highlights: • Tiered approach to waste sorting ensures flexibility and facilitates comparison of solid waste composition data. • Food and miscellaneous wastes are the main fractions contributing to the residual household waste. • Separation of food packaging from food leftovers during sorting is not critical for determination of the solid waste composition. - Abstract: Sound waste management and optimisation of resource recovery require reliable data on solid waste generation and composition. In the absence of standardised and commonly accepted waste characterisation methodologies, various approaches have been reported in literature. This limits both comparability and applicability of the results. In this study, a waste sampling and sorting methodology for efficient and statistically robust characterisation of solid waste was introduced. The methodology was applied to residual waste collected from 1442 households distributed among 10 individual sub-areas in three Danish municipalities (both single and multi-family house areas). In total 17 tonnes of waste were sorted into 10–50 waste fractions, organised according to a three-level (tiered) approach facilitating comparison of the waste data between individual sub-areas with different fractionation (waste from one municipality was sorted at “Level III”, i.e. detailed, while the two others were sorted only at “Level I”). The results showed that residual household waste mainly contained food waste (42 ± 5%, mass per wet basis) and miscellaneous combustibles (18 ± 3%, mass per wet basis). The residual household waste generation rate in the study areas was 3–4 kg per person per week. Statistical analyses revealed that the waste composition was independent of variations in the waste generation rate. Both waste composition and waste generation rates were statistically similar for each of the three municipalities. While the waste generation rates were similar for each of the two housing types (single
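
    The tiered fraction catalogue can be represented as a simple mapping, so that detailed (Level III) sorting results roll up to the coarser levels for cross-area comparison. The fraction names and groupings below are invented placeholders, not the study's actual catalogue:

```python
# Hypothetical three-tier catalogue: each detailed (Level III) fraction
# maps to a (Level II group, Level I category) pair.
LEVEL_MAP = {
    "vegetable food waste": ("food waste", "biodegradable"),
    "animal food waste":    ("food waste", "biodegradable"),
    "dirty paper":          ("miscellaneous combustibles", "combustible"),
    "dirty textiles":       ("miscellaneous combustibles", "combustible"),
}

def roll_up(masses_level3, level=1):
    """Aggregate Level III sorted masses (kg, wet basis) to Level II or
    Level I so that areas sorted at different detail can be compared."""
    totals = {}
    for fraction, mass in masses_level3.items():
        key = LEVEL_MAP[fraction][0] if level == 2 else LEVEL_MAP[fraction][1]
        totals[key] = totals.get(key, 0.0) + mass
    return totals
```

A sub-area sorted only at Level I can then be compared directly with a detailed sub-area after calling `roll_up` on the latter's data.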

  17. Contamination Levels and Identification of Bacteria in Milk Sampled from Three Regions of Tanzania: Evidence from Literature and Laboratory Analyses

    Directory of Open Access Journals (Sweden)

    G. Msalya

    2017-01-01

    Full Text Available Milk in Tanzania has been reported to be contaminated with large numbers of bacteria. This is because (1) milk is obtained from animals with unknown health status, (2) good milking and handling practices are to a large extent not observed, and (3) marketing and distribution are done in informal channels. These factors are potential causes of milk-borne diseases and milk quality loss. The aim of this study was to assess nutritional risks in milk as reported in literature over a period of 20 years and through analyses of samples collected during the present study. The issues highlighted in literature were high bacteria and coliform counts exceeding standard levels in East Africa, prevalence of bacteria and drug residues in milk, and adulteration. Based on the performed analyses, a total bacterial count of 1.0×10^7 colony forming units per millilitre (cfu/ml) and a total coliform count of 1.1×10^7 cfu/ml, both greater than the recommended levels, were found. Ten bacteria types were isolated from the milk samples; five of them (Pseudomonas aeruginosa, Listeria monocytogenes, Listeria innocua, Listeria ivanovii, and Klebsiella spp.) are reported in Tanzania for the first time. Residues of two drugs, tetracycline and sulphur, were detected. Therefore, it is worth noting that integrated research is needed to evaluate the situation and address these challenges.

  18. A high-throughput sample preparation method for cellular proteomics using 96-well filter plates.

    Science.gov (United States)

    Switzar, Linda; van Angeren, Jordy; Pinkse, Martijn; Kool, Jeroen; Niessen, Wilfried M A

    2013-10-01

    A high-throughput sample preparation protocol based on the use of 96-well molecular weight cutoff (MWCO) filter plates was developed for shotgun proteomics of cell lysates. All sample preparation steps, including cell lysis, buffer exchange, protein denaturation, reduction, alkylation and proteolytic digestion are performed in a 96-well plate format, making the platform extremely well suited for processing large numbers of samples and directly compatible with functional assays for cellular proteomics. In addition, the usage of a single plate for all sample preparation steps following cell lysis reduces potential sample losses and allows for automation. The MWCO filter also enables sample concentration, thereby increasing the overall sensitivity, and implementation of washing steps involving organic solvents, for example, to remove cell membrane constituents. The optimized protocol allowed for higher throughput with improved sensitivity in terms of the number of identified cellular proteins when compared to an established protocol employing gel-filtration columns. © 2013 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  19. Microplastics in seafood: Benchmark protocol for their extraction and characterization.

    Science.gov (United States)

    Dehaut, Alexandre; Cassone, Anne-Laure; Frère, Laura; Hermabessiere, Ludovic; Himber, Charlotte; Rinnert, Emmanuel; Rivière, Gilles; Lambert, Christophe; Soudant, Philippe; Huvet, Arnaud; Duflos, Guillaume; Paul-Pont, Ika

    2016-08-01

    Pollution of the oceans by microplastics (<5 mm) is a growing concern, and a limited number of studies have investigated the level of contamination of marine organisms collected in situ. For extraction and characterization of microplastics in biological samples, the crucial step is the identification of solvent(s) or chemical(s) that efficiently dissolve organic matter without degrading plastic polymers, so that the polymers can be identified in a time- and cost-effective way. Most published papers, as well as OSPAR recommendations for the development of a common monitoring protocol for plastic particles in fish and shellfish at the European level, use protocols containing nitric acid to digest the biological tissues, despite reports of polyamide degradation with this chemical. In the present study, six existing approaches were tested and their effects were compared on up to 15 different plastic polymers, as well as their efficiency in digesting biological matrices. Plastic integrity was evaluated through microscopic inspection, weighing, pyrolysis coupled with gas chromatography and mass spectrometry, and Raman spectrometry before and after digestion. Tissues from mussels, crabs and fish were digested before being filtered on glass fibre filters. Digestion efficiency was evaluated through microscopic inspection of the filters and determination of the relative removal of organic matter content after digestion. Five out of the six tested protocols led to significant degradation of plastic particles and/or insufficient tissue digestion. The protocol using a 10% KOH solution and incubation at 60 °C for 24 h led to efficient digestion of biological tissues with no significant degradation of any tested polymer except cellulose acetate. This protocol appeared to be the best compromise for extraction and later identification of microplastics in biological samples and should be implemented in further monitoring studies to ensure relevance and comparison of environmental and seafood product quality studies.

  20. Shelf life and preservation of groundwater samples for inorganic analyses [Houdbaarheid en conservering van grondwatermonsters voor anorganische analyses]

    NARCIS (Netherlands)

    Cleven RFMJ; Gast LFL; Boshuis-Hilverdink ME; LAC

    1995-01-01

    The storage life of groundwater samples intended for inorganic analyses, and the possibilities for their preservation, have been investigated. Groundwater samples, with and without preservation with acid, from four locations in the Netherlands were analysed ten times over a period of three months on six

  1. A Comparison Between Inter-Asterisk eXchange Protocol and Jingle Protocol: Session Time

    Directory of Open Access Journals (Sweden)

    H. S. Haj Aliwi

    2016-08-01

    Full Text Available Over the last few years, many multimedia conferencing and Voice over Internet Protocol (VoIP) applications have been developed due to the use of signaling protocols in providing video, audio and text chatting services between at least two participants. This paper compares two widely used signaling protocols, the Inter-Asterisk eXchange Protocol (IAX) and the Jingle extension of the eXtensible Messaging and Presence Protocol (XMPP), in terms of delay time during call setup, call teardown, and media sessions.

  2. An Enhanced Biometric Based Authentication with Key-Agreement Protocol for Multi-Server Architecture Based on Elliptic Curve Cryptography

    Science.gov (United States)

    Reddy, Alavalapati Goutham; Das, Ashok Kumar; Odelu, Vanga; Yoo, Kee-Young

    2016-01-01

    Biometric based authentication protocols for multi-server architectures have gained momentum in recent times due to advancements in wireless technologies and associated constraints. Lu et al. recently proposed a robust biometric based authentication with key agreement protocol for a multi-server environment using smart cards. They claimed that their protocol is efficient and resistant to prominent security attacks. Careful investigation in this paper proves that Lu et al.'s protocol does not provide user anonymity or perfect forward secrecy, and is susceptible to server and user impersonation attacks, man-in-the-middle attacks and clock synchronization problems. In addition, this paper proposes an enhanced biometric based authentication with key-agreement protocol for multi-server architecture based on elliptic curve cryptography using smartcards. We proved that the proposed protocol achieves mutual authentication using Burrows-Abadi-Needham (BAN) logic. The formal security of the proposed protocol is verified using the AVISPA (Automated Validation of Internet Security Protocols and Applications) tool to show that our protocol can withstand active and passive attacks. The formal and informal security analyses and the performance analysis demonstrate that the proposed protocol is robust and efficient compared to Lu et al.'s protocol and existing similar protocols. PMID:27163786
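The core primitive behind the protocol above, elliptic-curve scalar multiplication used for key agreement, can be illustrated with a toy Diffie-Hellman exchange. The curve, generator and private keys below are textbook-sized values chosen for readability, not parameters from the paper; real deployments use standardized curves (e.g. P-256) and authenticated variants of this exchange.

```python
# Toy Diffie-Hellman key agreement on a small elliptic curve
# (illustrative only; production code uses vetted crypto libraries).
P_MOD = 17          # field prime
A_COEF = 2          # curve: y^2 = x^3 + 2x + 2 (mod 17)
G = (5, 1)          # generator point

def inv(n):
    # Modular inverse via Fermat's little theorem (P_MOD is prime).
    return pow(n, P_MOD - 2, P_MOD)

def add(p, q):
    # Group law on the curve; None represents the point at infinity.
    if p is None: return q
    if q is None: return p
    (x1, y1), (x2, y2) = p, q
    if x1 == x2 and (y1 + y2) % P_MOD == 0:
        return None
    if p == q:
        s = (3 * x1 * x1 + A_COEF) * inv(2 * y1) % P_MOD
    else:
        s = (y2 - y1) * inv(x2 - x1) % P_MOD
    x3 = (s * s - x1 - x2) % P_MOD
    return (x3, (s * (x1 - x3) - y1) % P_MOD)

def mul(k, p):
    # Double-and-add scalar multiplication.
    r = None
    while k:
        if k & 1:
            r = add(r, p)
        p = add(p, p)
        k >>= 1
    return r

alice_priv, bob_priv = 3, 7
alice_pub = mul(alice_priv, G)
bob_pub = mul(bob_priv, G)
shared_a = mul(alice_priv, bob_pub)   # Alice's view of the shared point
shared_b = mul(bob_priv, alice_pub)   # Bob's view of the shared point
```

Both parties arrive at the same curve point without ever transmitting a private key, which is the property the paper's key-agreement phase builds on.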

  3. An Enhanced Biometric Based Authentication with Key-Agreement Protocol for Multi-Server Architecture Based on Elliptic Curve Cryptography.

    Directory of Open Access Journals (Sweden)

    Alavalapati Goutham Reddy

    Full Text Available Biometric based authentication protocols for multi-server architectures have gained momentum in recent times due to advancements in wireless technologies and associated constraints. Lu et al. recently proposed a robust biometric based authentication with key agreement protocol for a multi-server environment using smart cards. They claimed that their protocol is efficient and resistant to prominent security attacks. Careful investigation in this paper proves that Lu et al.'s protocol does not provide user anonymity or perfect forward secrecy, and is susceptible to server and user impersonation attacks, man-in-the-middle attacks and clock synchronization problems. In addition, this paper proposes an enhanced biometric based authentication with key-agreement protocol for multi-server architecture based on elliptic curve cryptography using smartcards. We proved that the proposed protocol achieves mutual authentication using Burrows-Abadi-Needham (BAN) logic. The formal security of the proposed protocol is verified using the AVISPA (Automated Validation of Internet Security Protocols and Applications) tool to show that our protocol can withstand active and passive attacks. The formal and informal security analyses and the performance analysis demonstrate that the proposed protocol is robust and efficient compared to Lu et al.'s protocol and existing similar protocols.

  4. Transgenic mouse - Methods and protocols, 2nd edition

    Directory of Open Access Journals (Sweden)

    Carlo Alberto Redi

    2011-09-01

    Full Text Available Marten H. Hofner (from the Dept. of Pathology of the Groningen University) and Jan M. van Deursen (from the Mayo College of Medicine at Rochester, MN, USA) provided us with the valuable second edition of Transgenic mouse: in fact, even though we are in the -omics era and already equipped with state-of-the-art techniques in whatsoever field, we still need gene(s) functional analysis data to understand common and complex diseases. Transgenesis is still an irreplaceable method, and protocols to perform it well are more than welcome. Here, how to obtain genetically modified mice (the quintessential model of so many human diseases, considering how many human genes are conserved in the mouse and the great blocks of genic synteny existing between the two genomes) is analysed in depth and presented in clearly detailed step-by-step protocols....

  5. No effects of functional exercise therapy on walking biomechanics in patients with knee osteoarthritis: exploratory outcome analyses from a randomised trial.

    Science.gov (United States)

    Henriksen, Marius; Klokker, Louise; Bartholdy, Cecilie; Schjoedt-Jorgensen, Tanja; Bandak, Elisabeth; Bliddal, Henning

    2016-01-01

    To assess the effects of a functional and individualised exercise programme on gait biomechanics during walking in people with knee OA. Sixty participants were randomised to 12 weeks of facility-based functional and individualised neuromuscular exercise therapy (ET), with 3 sessions per week supervised by trained physical therapists, or to a no-attention control group (CG). Three-dimensional gait analyses were used, from which a comprehensive list of conventional gait variables was extracted (52 kinematic, kinetic and spatiotemporal variables in total). According to the protocol, the analyses were based on the 'Per-Protocol' population (defined as participants following the protocol with complete and valid gait analyses). Analysis of covariance adjusting for the baseline level was used to determine differences between groups (95% CIs) in the changes from baseline at follow-up. The per-protocol population included 46 participants (24 ET/22 CG). There were no group differences in the analysed gait variables, except for significant group differences in the second peak knee flexor moment and the second peak vertical ground reaction force. While plausible, we have limited confidence in these findings because of the multiple statistical tests and the lack of a biomechanical rationale. We therefore conclude that a 12-week supervised individualised neuromuscular exercise programme has no effects on gait biomechanics. Future studies should focus on exercise programmes specifically designed to alter gait patterns, or include other measures of mobility, such as walking on stairs or inclined surfaces. ClinicalTrials.gov: NCT01545258.
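The trial's headline statistical method, analysis of covariance adjusting for the baseline level, amounts to an ordinary least-squares fit with a group indicator; the coefficient on the indicator is the baseline-adjusted between-group difference. The sketch below uses invented, noiseless data (not trial data) so the known effect is recovered exactly.

```python
# Minimal ANCOVA sketch: follow-up modelled as
#   y = b0 + b1*baseline + b2*group   (group: 0 = control, 1 = exercise)
# b2 is the baseline-adjusted group effect. Data are illustrative.

def solve3(m, v):
    # Gauss-Jordan elimination for a 3x3 linear system.
    a = [row[:] + [v[i]] for i, row in enumerate(m)]
    for col in range(3):
        piv = max(range(col, 3), key=lambda r: abs(a[r][col]))
        a[col], a[piv] = a[piv], a[col]
        for r in range(3):
            if r != col:
                f = a[r][col] / a[col][col]
                a[r] = [x - f * y for x, y in zip(a[r], a[col])]
    return [a[i][3] / a[i][i] for i in range(3)]

def ancova(baseline, follow_up, group):
    # Ordinary least squares via the normal equations X'X b = X'y.
    n = len(baseline)
    X = [[1.0, baseline[i], float(group[i])] for i in range(n)]
    xtx = [[sum(X[k][i] * X[k][j] for k in range(n)) for j in range(3)]
           for i in range(3)]
    xty = [sum(X[k][i] * follow_up[k] for k in range(n)) for i in range(3)]
    return solve3(xtx, xty)   # [intercept, baseline slope, group effect]

# Synthetic example with a true adjusted group effect of -2.0
baseline = [10, 12, 14, 16, 11, 13, 15, 17]
group    = [0,  0,  0,  0,  1,  1,  1,  1]
follow   = [0.5 * b + 5 + (-2.0 if g else 0.0)
            for b, g in zip(baseline, group)]
coefs = ancova(baseline, follow, group)
```

With real, noisy data one would also compute standard errors and confidence intervals for `coefs[2]`, which is what the trial reports as the 95% CI.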

  6. Adaptive control of theophylline therapy: importance of blood sampling times.

    Science.gov (United States)

    D'Argenio, D Z; Khakmahd, K

    1983-10-01

    A two-observation protocol for estimating theophylline clearance during a constant-rate intravenous infusion is used to examine the importance of blood sampling schedules with regard to the information content of resulting concentration data. Guided by a theory for calculating maximally informative sample times, population simulations are used to assess the effect of specific sampling times on the precision of resulting clearance estimates and subsequent predictions of theophylline plasma concentrations. The simulations incorporated noise terms for intersubject variability, dosing errors, sample collection errors, and assay error. Clearance was estimated using Chiou's method, least squares, and a Bayesian estimation procedure. The results of these simulations suggest that clinically significant estimation and prediction errors may result when using the above two-point protocol for estimating theophylline clearance if the time separating the two blood samples is less than one population mean elimination half-life.
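The estimation task described above can be illustrated with the standard one-compartment constant-infusion model, C(t) = (R0/CL)(1 − e^(−(CL/V)·t)). The sketch below fits clearance with a simple grid-based least-squares search rather than Chiou's method or Bayesian estimation, and all parameter values and sampling times are illustrative assumptions, not values from the study.

```python
# Clearance (CL) estimation from two concentrations drawn during a
# constant-rate IV infusion. V is treated as known; CL is fitted by
# minimizing squared error over a grid. Values are illustrative.
import math

R0 = 40.0      # infusion rate, mg/h (assumed)
V = 35.0       # volume of distribution, L (assumed)

def conc(cl, t):
    # One-compartment model concentration at time t (h) for clearance cl (L/h).
    return (R0 / cl) * (1.0 - math.exp(-(cl / V) * t))

true_cl = 2.8                    # L/h, "true" value used to simulate data
t1, t2 = 2.0, 12.0               # sampling times separated by more than one
                                 # elimination half-life (~8.7 h here)
obs = [conc(true_cl, t1), conc(true_cl, t2)]

def fit_cl(times, concs):
    # Brute-force least-squares over a clearance grid.
    best, best_err = None, float("inf")
    cl = 0.5
    while cl <= 10.0:
        err = sum((conc(cl, t) - c) ** 2 for t, c in zip(times, concs))
        if err < best_err:
            best, best_err = cl, err
        cl += 0.001
    return best

est = fit_cl([t1, t2], obs)
```

Repeating the fit with t1 and t2 only an hour apart (and noise added to `obs`) illustrates the abstract's point: closely spaced samples carry little information and make the clearance estimate unstable.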

  7. MANET Performance for Source and Destination Moving Scenarios Considering OLSR and AODV protocols

    Directory of Open Access Journals (Sweden)

    Elis Kulla

    2010-01-01

    Full Text Available Recently, great interest has been shown in the potential usage and applications of MANETs in several fields such as military activities, rescue operations and time-critical applications. In this work, we implement and analyse a MANET testbed considering the AODV and OLSR protocols for wireless multi-hop networking. We investigate the effect of mobility and topology changes in the MANET and evaluate the performance of the network through experiments in a real environment. The performance assessment of our testbed considers throughput, number of dropped packets and delay. We designed four scenarios: Static, Source Moving, Destination Moving and Source-Destination Moving. From our experimental results, we concluded that when the communicating nodes are moving and the routes change quickly, OLSR (as a proactive protocol) performs better than AODV, which is a reactive protocol.

  8. Designing protocols for the human teeth biobank of the Universidad Nacional de Colombia

    Directory of Open Access Journals (Sweden)

    Lina Constanza Gonzáles-Pita

    2014-07-01

    Full Text Available Protocols in a tooth bank are essential to assure smooth operation, reproducibility and standardization; they minimize cross contamination, maintain the original characteristics and physicochemical properties of teeth, fulfil ethical and legal regulations, and ensure proper disposal of residues. Objective: to propose the disinfection, storing and transportation protocols for the UNTB. Methods: A literature search was conducted using the words "teeth, human, tooth bank, disinfection, sterilization, storage, organization, biosecurity, biobank, protocol, prevention" in the Pubmed, Science Direct and Scielo databases. 37 papers ranging from 1988 up to 2014 were selected. International and Colombian ethical and legal regulations for organ donation, handling and investigation were taken into account, as well as laboratory observations and basic chemical principles gained through several undergraduate and graduate theses. All this input was carefully studied, analysed and critically modified to establish the recommended processes for the conversion of donated teeth into organs suitable for research. Results: Collection, transportation, cleaning/disinfection and storing protocols were planned and elaborated. Conclusions: Based on scientific literature, national and international regulations and experimental experience, several protocols for the UNTB were presented.

  9. Association between organisational and workplace cultures, and patient outcomes: systematic review protocol.

    Science.gov (United States)

    Braithwaite, J; Herkes, J; Ludlow, K; Lamprell, G; Testa, L

    2016-12-01

    Despite widespread interest in the topic, no current synthesis of research is available analysing the linkages between organisational or workplace cultures on the one hand, and patient outcomes on the other. This protocol proposes a systematic review to analyse and synthesise the literature to date on this topic. The resulting review will discuss characteristics of included studies in terms of the type of healthcare settings researched, the measurements of organisational and workplace culture, patient outcomes measured and the influence of these cultures on patient outcomes. A systematic review will be conducted aiming to examine the associations between organisational and workplace cultures, and patient outcomes, guided by the Preferred Reporting Items for Systematic review and Meta-Analysis Protocols (PRISMA-P) statement. An English language search of abstracts will be executed using the following academic databases: CINAHL, EMBASE, Ovid MEDLINE, Web of Science and PsycINFO. The review will include relevant peer-reviewed articles from randomised controlled trials (RCTs), non-RCTs, controlled before and after studies, interrupted time series studies, cross-sectional analyses, qualitative studies and mixed-method studies. Multiple researchers will be involved in assessing the quality of articles for inclusion in the review. This protocol documents a detailed search strategy, including terms and inclusion criteria, which will form the basis of the subsequent systematic review. Ethics approval is not required as no primary data will be collected. Results will be disseminated through a peer-reviewed publication and conference presentations. Published by the BMJ Publishing Group Limited. For permission to use (where not already granted under a licence) please go to http://www.bmj.com/company/products-services/rights-and-licensing/.

  10. Systematic review of economic analyses in patient safety: a protocol designed to measure development in the scope and quality of evidence.

    Science.gov (United States)

    Carter, Alexander W; Mandavia, Rishi; Mayer, Erik; Marti, Joachim; Mossialos, Elias; Darzi, Ara

    2017-08-18

    Recent avoidable failures in patient care highlight the ongoing need for evidence to support improvements in patient safety. According to the most recent reviews, there is a dearth of economic evidence related to patient safety. These reviews characterise an evidence gap in terms of the scope and quality of evidence available to support resource allocation decisions. This protocol is designed to update and improve on the reviews previously conducted to determine the extent of methodological progress in economic analyses in patient safety. A broad search strategy with two core themes for original research (excluding opinion pieces and systematic reviews) in 'patient safety' and 'economic analyses' has been developed. Medline, Econlit and National Health Service Economic Evaluation Database bibliographic databases will be searched from January 2007 using a combination of medical subject headings terms and research-derived search terms (see table 1). The method is informed by previous reviews on this topic, published in 2012. Screening, risk of bias assessment (using the Cochrane collaboration tool) and economic evaluation quality assessment (using the Drummond checklist) will be conducted by two independent reviewers, with arbitration by a third reviewer as needed. Studies with a low risk of bias will be assessed using the Drummond checklist. High-quality economic evaluations are those that score >20/35. A qualitative synthesis of evidence will be performed using a data collection tool to capture the study design(s) employed, population(s), setting(s), disease area(s), intervention(s) and outcome(s) studied. Methodological quality scores will be compared with previous reviews where possible. Effect size(s) and estimate uncertainty will be captured and used in a quantitative synthesis of high-quality evidence, where possible. Formal ethical approval is not required as primary data will not be collected. The results will be disseminated through a peer-reviewed publication.

  11. A novel method for sample preparation of fresh lung cancer tissue for proteomics analysis by tumor cell enrichment and removal of blood contaminants

    Directory of Open Access Journals (Sweden)

    Orre Lotta

    2010-02-01

    Full Text Available Abstract Background In-depth proteomics analyses of tumors are frequently biased by the presence of blood components and stromal contamination, which leads to large experimental variation and decreases the proteome coverage. We have established a reproducible method to prepare freshly collected lung tumors for proteomics analysis, aiming at tumor cell enrichment and reduction of plasma protein contamination. We obtained enriched tumor-cell suspensions (ETS) from six lung cancer cases (two adenocarcinomas, two squamous-cell carcinomas, two large-cell carcinomas) and from two normal lung samples. The cell content of the resulting ETS was evaluated with immunocytological stainings and compared with the histologic pattern of the original specimens. By means of a quantitative mass spectrometry-based method we evaluated the reproducibility of the sample preparation protocol and assessed the proteome coverage by comparing lysates from ETS samples with the direct lysate of corresponding fresh-frozen (FF) samples. Results Cytological analyses on cytospin specimens showed that the percentage of tumoral cells in the ETS samples ranged from 20% to 70%. In the normal lung samples the percentage of epithelial cells was less than 10%. The reproducibility of the sample preparation protocol was very good, with coefficients of variation at the peptide level and at the protein level of 13% and 7%, respectively. Proteomics analysis led to the identification of a significantly higher number of proteins in the ETS samples than in the FF samples (244 vs 109, respectively). Albumin and hemoglobin were among the top 5 most abundant proteins identified in the FF samples, showing a high contamination with blood and plasma proteins, whereas ubiquitin and the mitochondrial ATP synthase 5A1 were among the top 5 most abundant proteins in the ETS samples. Conclusion The method is feasible and reproducible. We could obtain a fair enrichment of cells but the major benefit of the method

  12. Comparisons of PGA and INAA in the analyses of meteorite samples

    International Nuclear Information System (INIS)

    Wee Boon Siong; Ebihara, M.; Abdul Khalik Wood

    2010-01-01

    Prompt gamma-ray analysis (PGA) and instrumental neutron activation analysis (INAA) are suitable methods for multi-elemental determinations in various samples. The two methods are complementary: PGA is capable of analyzing most major and minor elements in rock samples, whereas INAA is superior in determining minor and trace elements. Both PGA and INAA are essential for the study of rare samples such as meteorites because they are non-destructive and relatively free from contamination. Samples used for PGA can be reused for INAA, which helps to reduce sample usage. This project aims to utilize the PGA and INAA techniques for a comparative study and to apply them to meteorites. In this study, 11 meteorite samples received from the Meteorite Working Group of NASA were analyzed. The Allende meteorite powder was included as a quality control material. Results from PGA and INAA for Allende showed good agreement with literature values, signifying the reliability of these two methods. The elements Al, Ca, Mg, Mn, Na and Ti were determined by both methods and their results are compared. Comparison of PGA and INAA data using linear regression analysis showed correlation coefficients r² > 0.90 for Al, Ca, Mn and Ti, 0.85 for Mg, and 0.38 for Na. The PGA results for Na using the 472 keV peak were less accurate due to interference from the broad B peak; therefore, Na results from the INAA method are preferred. For the other elements (Al, Ca, Mg, Mn and Ti), PGA and INAA results can be used as cross-references for consistency. The PGA and INAA techniques have been applied to meteorite samples and the results are comparable to literature values compiled from previously analyzed meteorites. In summary, both PGA and INAA give reasonably good agreement and are indispensable in the study of meteorites. (author)
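The cross-method check described above reduces to computing a squared correlation coefficient between paired determinations of the same element by the two techniques. The sketch below uses made-up paired values, not data from the study.

```python
# Pearson r-squared between paired PGA and INAA results for one element.
def r_squared(xs, ys):
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    sxy = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sxx = sum((x - mx) ** 2 for x in xs)
    syy = sum((y - my) ** 2 for y in ys)
    return sxy * sxy / (sxx * syy)

# Hypothetical paired determinations (e.g. wt% Al) by the two methods:
pga  = [8.6, 1.2, 12.5, 0.09, 0.55]
inaa = [8.4, 1.3, 12.1, 0.10, 0.52]
r2 = r_squared(pga, inaa)
```

A value near 1 (as here, by construction) indicates the two methods can serve as cross-references, which is how the study interprets r² > 0.90 for Al, Ca, Mn and Ti.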

  13. The influence of uncertainties of measurements in laboratory performance evaluation by intercomparison program in radionuclide analyses of environmental samples

    International Nuclear Information System (INIS)

    Tauhata, L.; Vianna, M.E.; Oliveira, A.E. de; Clain, A.F.; Ferreira, A.C.M.; Bernardes, E.M.

    2000-01-01

    The accuracy and precision of radionuclide analyses in environmental samples are widely demanded internationally because of their consequences for the decision processes coupled to the evaluation of environmental pollution, impact, and internal and external population exposure. These measurement characteristics of the laboratories can be shown clearly using intercomparison data, owing to the existence of a reference value and the requirement of three determinations for each analysis. In intercomparison studies, accuracy in radionuclide assays of low-level environmental samples has usually been the main focus of performance evaluation; it can be estimated by taking into account the deviation between the experimental laboratory mean value and the reference value. The laboratory repeatability of measurements, or their standard deviation, is seldom included in performance evaluation. In order to show the influence of the uncertainties on the performance evaluation of the laboratories, data from 22 intercomparison runs, which distributed 790 spiked environmental samples to 20 Brazilian participant laboratories, were compared using the U.S. EPA 'Normalised Standard Deviation' as the statistical criterion for performance evaluation. This criterion mainly takes into account laboratory accuracy; the evaluation was then repeated with the same data classified by a normalised standard deviation modified by a weight factor that includes the individual laboratory uncertainty. The results show a relative decrease in laboratory performance in each radionuclide assay: 1.8% for 65Zn, 2.8% for 40K, 3.4% for 60Co, 3.7% for 134Cs, 4.0% for 137Cs, 4.4% for Th and natural U, 4.5% for 3H, 6.3% for 133Ba, 8.6% for 90Sr, 10.6% for gross alpha, 10.9% for 106Ru, 11.1% for 226Ra, 11.5% for gross beta and 13.6% for 228Ra. The changes in the parameters of the statistical distribution function were negligible and the distribution remained Gaussian for all radionuclides analysed. Data analyses in terms of
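A minimal sketch of a normalised-deviation style criterion follows, assuming the score is the deviation of the laboratory mean (from triplicate determinations) from the reference value, expressed in units of an expected uncertainty. The acceptance threshold and data below are illustrative assumptions, not the exact U.S. EPA formulation.

```python
# Normalised-deviation style performance score for one radionuclide assay.
def normalised_deviation(lab_results, reference, expected_sigma):
    mean = sum(lab_results) / len(lab_results)
    return (mean - reference) / expected_sigma

triplicate = [9.4, 10.1, 9.8]    # three determinations (Bq/L, made up)
score = normalised_deviation(triplicate, reference=10.0, expected_sigma=0.5)
acceptable = abs(score) <= 3.0   # illustrative acceptance band
```

Weighting the score by the laboratory's own reported uncertainty, as the study does, penalises labs whose accuracy looks good only because their repeatability is poor.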

  14. CaPiTo: protocol stacks for services

    DEFF Research Database (Denmark)

    Gao, Han; Nielson, Flemming; Nielson, Hanne Riis

    2011-01-01

    CaPiTo allows the modelling of service-oriented applications using process algebras at three levels of abstraction. The abstract level focuses on the key functionality of the services; the plug-in level shows how to obtain security using standardised protocol stacks; finally, the concrete level...... allows one to consider how security is obtained using asymmetric and symmetric cryptographic primitives. The CaPiTo approach therefore caters for a variety of developers that need to cooperate on designing and implementing service-oriented applications. We show how to formally analyse CaPiTo specifications

  15. Independent assessment of matrix-assisted laser desorption/ionization mass spectrometry (MALDI-MS) sample preparation quality: Effect of sample preparation on MALDI-MS of synthetic polymers.

    Science.gov (United States)

    Kooijman, Pieter C; Kok, Sander; Honing, Maarten

    2017-02-28

    Matrix-assisted laser desorption/ionization mass spectrometry (MALDI-MS) provides detailed and in-depth information about the molecular characteristics of synthetic polymers. To obtain the most accurate results, the sample preparation parameters should be chosen to suit the sample and the aim of the experiment. Because the underlying principles of MALDI are still not fully known, a priori determination of optimal sample preparation protocols is often not possible. Employing an automated sample preparation quality assessment method recently presented by us, we quantified the sample preparation quality obtained using various sample preparation protocols. Six conventional matrices with and without added potassium as a cationization agent and six ionic liquid matrices (ILMs) were assessed using poly(ethylene glycol) (PEG), polytetrahydrofuran (PTHF) and poly(methyl methacrylate) (PMMA) as samples. All sample preparation protocols were scored and ranked based on predefined quality parameters and spot-to-spot repeatability. Clearly distinctive preferences were observed in matrix identity and cationization agent for PEG, PTHF and PMMA: the addition of an excess of potassium cationization agent resulted in an increased score for PMMA and a contrasting matrix-dependent effect for PTHF and PEG. The addition of excess cationization agent to sample mixtures dissipates any overrepresentation of high molecular weight polymer species. Our results show reduced ionization efficiency and similar sample deposit homogeneity for all tested ILMs, compared with well-performing conventional MALDI matrices. The results published here represent a start in the unsupervised quantification of sample preparation quality for MALDI samples. This method can select the best sample preparation parameters for any synthetic polymer sample, and the results can be used to formulate hypotheses on MALDI principles. Copyright © 2016 John Wiley & Sons, Ltd.

  16. Estimating the dim light melatonin onset of adolescents within a 6-h sampling window: the impact of sampling rate and threshold method.

    Science.gov (United States)

    Crowley, Stephanie J; Suh, Christina; Molina, Thomas A; Fogg, Louis F; Sharkey, Katherine M; Carskadon, Mary A

    2016-04-01

    Circadian rhythm sleep-wake disorders (CRSWDs) often manifest during the adolescent years. Measurement of circadian phase, such as the dim light melatonin onset (DLMO), improves diagnosis and treatment of these disorders, but financial and time costs limit the use of DLMO phase assessments in clinic. The current analysis aims to inform a cost-effective and efficient protocol to measure the DLMO in older adolescents by reducing the number of samples and the total sampling duration. A total of 66 healthy adolescents (26 males) aged 14.8-17.8 years participated in the study; they were required to sleep on a fixed baseline schedule for a week, after which they visited the laboratory for saliva collection in dim light (<20 lux). Two partial 6-h salivary melatonin profiles were derived for each participant. Both profiles began 5 h before bedtime and ended 1 h after bedtime, but one profile was derived from samples taken every 30 min (13 samples) and the other from samples taken every 60 min (seven samples). Three standard thresholds (first three melatonin values mean + 2 SDs, 3 pg/mL, and 4 pg/mL) were used to compute the DLMO. Agreement between DLMOs derived from the 30-min and 60-min sampling rates was determined using Bland-Altman analysis; agreement between the sampling rate DLMOs was defined as ± 1 h. Within a 6-h sampling window, 60-min sampling provided DLMO estimates within ± 1 h of the DLMO from 30-min sampling, but only when an absolute threshold (3 or 4 pg/mL) was used to compute the DLMO. Future analyses should be extended to include adolescents with CRSWDs. Copyright © 2016 Elsevier B.V. All rights reserved.
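A threshold-based DLMO, as used above, is simply the interpolated clock time at which the melatonin profile first crosses a fixed concentration. The sketch below applies the 4 pg/mL absolute threshold to an invented hourly profile; times and values are illustrative only.

```python
# Threshold-crossing DLMO with linear interpolation between samples.
def dlmo(times_h, melatonin_pg_ml, threshold=4.0):
    for i in range(1, len(times_h)):
        lo, hi = melatonin_pg_ml[i - 1], melatonin_pg_ml[i]
        if lo < threshold <= hi:
            frac = (threshold - lo) / (hi - lo)
            return times_h[i - 1] + frac * (times_h[i] - times_h[i - 1])
    return None   # threshold never crossed in the sampling window

# Hourly samples from 18:00 to 24:00 (decimal clock hours), made up:
times = [18, 19, 20, 21, 22, 23, 24]
values = [0.5, 1.0, 2.0, 3.0, 6.0, 12.0, 20.0]
onset = dlmo(times, values)   # crosses 4 pg/mL between 21:00 and 22:00
```

Running the same function on every second sample (a 2-h rate) illustrates the study's question: how coarse can sampling be before the interpolated onset drifts outside the ± 1 h agreement band.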

  17. An improved two-way continuous-variable quantum key distribution protocol with added noise in homodyne detection

    International Nuclear Information System (INIS)

    Sun Maozhu; Peng Xiang; Guo Hong

    2013-01-01

    We propose an improved two-way continuous-variable quantum key distribution (CV QKD) protocol by adding proper random noise on the receiver’s homodyne detection, the security of which is analysed against general collective attacks. The simulation result under the collective entangling cloner attack indicates that despite the correlation between two-way channels decreasing the secret key rate relative to the uncorrelated channels slightly, the performance of the two-way protocol is still far beyond that of the one-way protocols. Importantly, the added noise in detection is beneficial for the secret key rate and the tolerable excess noise of this two-way protocol. With the reasonable reconciliation efficiency of 90%, the two-way CV QKD with added noise allows the distribution of secret keys over 60 km fibre distance. (paper)

  18. The French dosimetry protocol

    International Nuclear Information System (INIS)

    Dutreix, A.

    1985-01-01

    After a general introduction the protocol is divided in five sections dealing with: determination of the quality of X-ray, γ-ray and electron beams; the measuring instrument; calibration of the reference instrument; determination of the reference absorbed dose in the user's beams; determination of the absorbed dose in water at other points, in other conditions. The French protocol is not essentially different from the Nordic protocol and it is based on the experience gained in using both the American and the Nordic protocols. Therefore, only the main difference with the published protocols are discussed. (Auth.)

  19. Performances of different protocols for exocellular polysaccharides extraction from milk acid gels: Application to yogurt.

    Science.gov (United States)

    Nguyen, An Thi-Binh; Nigen, Michaël; Jimenez, Luciana; Ait-Abderrahim, Hassina; Marchesseau, Sylvie; Picart-Palmade, Laetitia

    2018-01-15

    Dextran or xanthan were used as model exocellular polysaccharides (EPS) to compare the efficiency of EPS extraction from skim milk acid gels using three different protocols. Extraction yields, residual protein concentrations and the macromolecular properties of the extracted EPS were determined. For both model EPS, the highest extraction yield (∼80%) was obtained when samples were heated in acidic conditions in the first step of extraction (Protocol 1). Protocols that contained steps of acid/ethanol precipitation without heating (Protocols 2 and 3) showed lower extraction yields (∼55%) but allowed better preservation of the EPS macromolecular properties. Adjusting the pH of acid gels up to 7 before extraction (Protocol 3) improved the extraction yield of anionic EPS without affecting the macromolecular properties of the EPS. Protocol 1 was then applied for the quantification of EPS produced during yogurt fermentation, while Protocol 3 was dedicated to their macromolecular characterization. Copyright © 2017 Elsevier Ltd. All rights reserved.

  20. Delineating sampling procedures: Pedagogical significance of analysing sampling descriptions and their justifications in TESL experimental research reports

    Directory of Open Access Journals (Sweden)

    Jason Miin-Hwa Lim

    2011-04-01

    Full Text Available Teaching second language learners how to write research reports constitutes a crucial component in programmes on English for Specific Purposes (ESP) in institutions of higher learning. One of the rhetorical segments in research reports that merits attention has to do with the descriptions and justifications of sampling procedures. This genre-based study looks into sampling delineations in the Method-related sections of research articles on the teaching of English as a second language (TESL) written by expert writers and published in eight reputed international refereed journals. Using Swales's (1990, 2004) framework, I conducted a quantitative analysis of the rhetorical steps and a qualitative investigation into the language resources employed in delineating sampling procedures. This investigation has considerable relevance to ESP students and instructors, as it has yielded pertinent findings on how samples can be appropriately described to meet the expectations of dissertation examiners, reviewers, and supervisors. The findings of this study furnish insights into how supervisors and instructors can teach novice writers ways of using specific linguistic mechanisms to lucidly describe and convincingly justify the sampling procedures in the Method sections of experimental research reports.

  1. PROTOCOL FOR GAS SAMPLING AND ANALYSIS IN STRANDED MARINE MAMMALS

    OpenAIRE

    sprotocols

    2015-01-01

    Authors: Yara Bernaldo de Quirós, Óscar González-Díaz, Manuel Arbelo, Marisa Andrada & Antonio Fernández ### Abstract Gas sampling in stranded marine mammals can now be performed in situ using the appropriate vacuum tubes, insulin syringes and an aspirometer. Glass vacuum tubes are used for extraction of gas from cavities such as the intestine, pterygoid air sacs, pneumothorax or subcapsular emphysema, as well as for storage of the gas sample at room temperature and pressure. Insulin s...

  2. Biological Sampling and Analysis in Sinclair and Dyes Inlets, Washington: Chemical Analyses for 2007 Puget Sound Biota Study

    Energy Technology Data Exchange (ETDEWEB)

    Brandenberger, Jill M.; Suslick, Carolynn R.; Johnston, Robert K.

    2008-10-09

    Evaluating spatial and temporal trends in contaminant residues in Puget Sound fish and macroinvertebrates is an objective of the Puget Sound Ambient Monitoring Program (PSAMP). In a cooperative effort between the ENVironmental inVESTment group (ENVVEST) and the Washington State Department of Fish and Wildlife, additional biota samples were collected during the 2007 PSAMP biota survey and analyzed for chemical residues and stable isotopes of carbon (δ13C) and nitrogen (δ15N). Approximately three specimens of each species collected from Sinclair Inlet, Georgia Basin, and reference locations in Puget Sound were selected for whole-body chemical analysis. The muscle tissue of specimens selected for chemical analyses was also analyzed for δ13C and δ15N to provide information on relative trophic level and food sources. This data report summarizes the chemical residues for the 2007 PSAMP fish and macroinvertebrate samples. In addition, six Spiny Dogfish (Squalus acanthias) samples were necropsied to evaluate chemical residues in various parts of the fish (digestive tract, liver, embryo, muscle tissue), as well as a weight-proportional whole-body composite (WBWC). Whole organisms were homogenized and analyzed for silver, arsenic, cadmium, chromium, copper, nickel, lead, zinc, mercury, 19 polychlorinated biphenyl (PCB) congeners, PCB homologues, percent moisture, percent lipids, δ13C, and δ15N.
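The report notes that δ15N provides information on relative trophic level. A common literature convention (an assumption here, not a method stated in this report) converts δ15N to a trophic-level estimate using a fixed per-step enrichment of about 3.4‰ above a baseline organism:

```python
# Relative trophic level from nitrogen stable-isotope ratios:
#   TL = base_TL + (d15N_consumer - d15N_base) / enrichment
# The baseline TL of 2.0 and enrichment of 3.4 per mil are conventional
# literature assumptions; the input values are made up.
def trophic_level(d15n_consumer, d15n_base, base_tl=2.0, enrichment=3.4):
    return base_tl + (d15n_consumer - d15n_base) / enrichment

tl_fish = trophic_level(d15n_consumer=14.8, d15n_base=8.0)
```

With these made-up values the fish sits two trophic steps above the baseline consumer, i.e. an estimated trophic level of about 4.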

  3. Field protocols for the genomic era

    Directory of Open Access Journals (Sweden)

    N Bulatova

    2009-08-01

    Full Text Available For many decades karyotype was the only source of overall genomic information obtained from species of mammal. However, approaches have been developed in recent years to obtain molecular and ultimately genomic information directly from the extracted DNA of an organism. Molecular data have accumulated hugely for mammalian taxa. The growing volume of studies should motivate field researchers to collect suitable samples for molecular analysis from various species across all their ranges. This is the reason why we here include a molecular sampling procedure within a field work protocol, which also includes more traditional (including cytogenetic) techniques. In this way we hope to foster the development of molecular and genomic studies in non-standard wild mammals.

  4. Applying Data Mining Techniques to Chemical Analyses of Pre-drill Groundwater Samples within the Marcellus Formation Shale Play in Bradford County, Pennsylvania

    Science.gov (United States)

    Wen, T.; Niu, X.; Gonzales, M. S.; Li, Z.; Brantley, S.

    2017-12-01

    Groundwater samples are collected for chemical analyses by shale gas industry consultants in the vicinity of proposed gas wells in Pennsylvania. These data sets are archived so that the chemistry of water from homeowner wells can be compared to chemistry after gas-well drilling. Improved public awareness of groundwater quality issues will contribute to designing strategies for both water resource management and hydrocarbon exploration. We have received water analyses for 11,000 groundwater samples from PA Department of Environmental Protection (PA DEP) in the Marcellus Shale footprint in Bradford County, PA for the years ranging from 2010 to 2016. The PA DEP has investigated these analyses to determine whether gas well drilling or other activities affected water quality. We are currently investigating these analyses to look for patterns in chemistry throughout the study area (related or unrelated to gas drilling activities) and to look for evidence of analytes that may be present at concentrations higher than the advised standards for drinking water. Our preliminary results reveal that dissolved methane concentrations tend to be higher along fault lines in Bradford County [1]. Lead (Pb), arsenic (As), and barium (Ba) are sometimes present at levels above the EPA maximum contaminant level (MCL). Iron (Fe) and manganese (Mn) more frequently violate the EPA standard. We find that concentrations of some chemical analytes (e.g., Ba and Mn) are dependent on bedrock formations (i.e., Catskill vs. Lock Haven) while concentrations of other analytes (e.g., Pb) are not statistically significantly distinct between different bedrock formations. Our investigations are also focused on looking for correlations that might explain water quality patterns with respect to human activities such as gas drilling. However, percentages of water samples failing EPA MCL with respect to Pb, As, and Ba have decreased from previous USGS and PSU studies in the 1990s and 2000s. Public access to
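The MCL screening described in this abstract amounts to a per-analyte threshold check. A minimal sketch, where the EPA limits are quoted from memory (the Pb figure of 0.015 mg/L is formally an action level rather than an MCL) and the sample values are invented, not taken from the PA DEP dataset:

```python
# Threshold check of groundwater analyses against EPA drinking-water
# limits. Limits quoted from memory (Pb is formally an action level,
# not an MCL); sample values are invented for illustration.
LIMITS_MG_L = {"Pb": 0.015, "As": 0.010, "Ba": 2.0}

samples = [
    {"well": "A", "Pb": 0.002, "As": 0.004, "Ba": 0.9},
    {"well": "B", "Pb": 0.021, "As": 0.012, "Ba": 2.4},
    {"well": "C", "Pb": 0.001, "As": 0.009, "Ba": 1.1},
]

def exceedances(sample):
    """Analytes in one sample that exceed their limit (mg/L)."""
    return [a for a, limit in LIMITS_MG_L.items() if sample[a] > limit]

# Share of samples failing at least one limit
pct_failing = 100 * sum(1 for s in samples if exceedances(s)) / len(samples)
for s in samples:
    print(s["well"], exceedances(s))
```

The same check, grouped by bedrock formation, would reproduce the formation-dependence comparison the authors describe.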

  5. Technical protocol for laboratory tests of transformation of veterinary medicinal products and biocides in liquid manures. Version 1.0

    Energy Technology Data Exchange (ETDEWEB)

    Kreuzig, Robert [Technische Univ. Braunschweig (Germany). Inst. fuer Oekologische Chemie und Abfallanalytik

    2010-07-15

    The technical protocol under consideration describes a laboratory test method to evaluate the transformation of chemicals in liquid bovine and pig manures under anaerobic conditions and is primarily designed for veterinary medicinal products and biocides. The environmentally relevant entry routes into liquid manures are via urine and feces of cattle and pigs in stable housings after excretion of veterinary medicinal products as parent compounds or metabolites, and after the application of biocides in animal housings. Further entry routes, such as solid dung application and direct dung pat deposition by production animals on pasture, are not considered by this technical protocol. Thus, this technical protocol focuses on the sampling of excrements from cattle and pigs kept in stables and fed under standard nutrition conditions. This approach additionally ensures that excrement samples are operationally free of any contamination by veterinary medicinal products and biocides. After the matrix characterization, reference-manure samples are prepared from the excrement samples by adding tap water to adjust defined dry-substance contents typical for bovine or pig manures. This technical protocol comprises a tiered experimental design in two parts: (a) sampling of excrements and preparation of reference bovine and pig manures; (b) testing of anaerobic transformation of chemicals in reference manures.

  6. Guidance for air sampling at nuclear facilities

    International Nuclear Information System (INIS)

    Breslin, A.J.

    1976-11-01

    The principal uses of air sampling at nuclear facilities are to monitor general levels of radioactive air contamination, identify sources of air contamination, evaluate the effectiveness of contaminant control equipment, determine exposures of individual workers, and provide automatic warning of hazardous concentrations of radioactivity. These applications of air sampling are discussed with respect to standards of occupational exposure, instrumentation, sample analysis, sampling protocol, and statistical treatment of concentration data. Emphasis is given to the influence of spatial and temporal variations of radionuclide concentration on the location, duration, and frequency of air sampling.

  7. Digitally programmable microfluidic automaton for multiscale combinatorial mixing and sample processing†

    Science.gov (United States)

    Jensen, Erik C.; Stockton, Amanda M.; Chiesl, Thomas N.; Kim, Jungkyu; Bera, Abhisek; Mathies, Richard A.

    2013-01-01

    A digitally programmable microfluidic Automaton consisting of a 2-dimensional array of pneumatically actuated microvalves is programmed to perform new multiscale mixing and sample processing operations. Large (µL-scale) volume processing operations are enabled by precise metering of multiple reagents within individual nL-scale valves followed by serial repetitive transfer to programmed locations in the array. A novel process exploiting new combining valve concepts is developed for continuous rapid and complete mixing of reagents in less than 800 ms. Mixing, transfer, storage, and rinsing operations are implemented combinatorially to achieve complex assay automation protocols. The practical utility of this technology is demonstrated by performing automated serial dilution for quantitative analysis as well as the first demonstration of on-chip fluorescent derivatization of biomarker targets (carboxylic acids) for microchip capillary electrophoresis on the Mars Organic Analyzer. A language is developed to describe how unit operations are combined to form a microfluidic program. Finally, this technology is used to develop a novel microfluidic 6-sample processor for combinatorial mixing of large sets (>26 unique combinations) of reagents. The digitally programmable microfluidic Automaton is a versatile programmable sample processor for a wide range of process volumes, for multiple samples, and for different types of analyses. PMID:23172232

  8. Optimizing the MAC Protocol in Localization Systems Based on IEEE 802.15.4 Networks.

    Science.gov (United States)

    Pérez-Solano, Juan J; Claver, Jose M; Ezpeleta, Santiago

    2017-07-06

    Radio frequency signals are commonly used in the development of indoor localization systems. The infrastructure of these systems includes some beacons placed at known positions that exchange radio packets with users to be located. When the system is implemented using wireless sensor networks, the wireless transceivers integrated in the network motes are usually based on the IEEE 802.15.4 standard. But, the CSMA-CA, which is the basis for the medium access protocols in this category of communication systems, is not suitable when several users want to exchange bursts of radio packets with the same beacon to acquire the radio signal strength indicator (RSSI) values needed in the location process. Therefore, new protocols are necessary to avoid the packet collisions that appear when multiple users try to communicate with the same beacons. On the other hand, the RSSI sampling process should be carried out very quickly because some systems cannot tolerate a large delay in the location process. This is even more important when the RSSI sampling process includes measures with different signal power levels or frequency channels. The principal objective of this work is to speed up the RSSI sampling process in indoor localization systems. To achieve this objective, the main contribution is the proposal of a new MAC protocol that eliminates the medium access contention periods and decreases the number of packet collisions to accelerate the RSSI collection process. Moreover, the protocol increases the overall network throughput taking advantage of the frequency channel diversity. The presented results show the suitability of this protocol for reducing the RSSI gathering delay and increasing the network throughput in simulated and real environments.
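The RSSI values such a protocol gathers are typically converted to distance estimates with a log-distance path-loss model. A minimal sketch, where the 1 m reference power and path-loss exponent are illustrative assumptions that real deployments calibrate per site:

```python
# Log-distance path-loss model for turning gathered RSSI values into
# distance estimates. rssi_at_1m and the exponent n are illustrative
# assumptions; real deployments calibrate both per site.
def rssi_to_distance(rssi_dbm, rssi_at_1m=-45.0, n=2.5):
    """Estimate distance in metres from a single RSSI reading."""
    return 10 ** ((rssi_at_1m - rssi_dbm) / (10 * n))

# Averaging a burst of packets damps fast fading, which is why the
# protocol tries to collect many RSSI samples per beacon quickly.
burst = [-60, -62, -59, -61]
mean_rssi = sum(burst) / len(burst)
print(round(rssi_to_distance(mean_rssi), 2))
```

The need for many readings per beacon (across power levels and channels) is exactly what makes contention-free, multi-channel MAC scheduling worthwhile.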

  9. Optimizing the MAC Protocol in Localization Systems Based on IEEE 802.15.4 Networks

    Directory of Open Access Journals (Sweden)

    Juan J. Pérez-Solano

    2017-07-01

    Full Text Available Radio frequency signals are commonly used in the development of indoor localization systems. The infrastructure of these systems includes some beacons placed at known positions that exchange radio packets with users to be located. When the system is implemented using wireless sensor networks, the wireless transceivers integrated in the network motes are usually based on the IEEE 802.15.4 standard. But, the CSMA-CA, which is the basis for the medium access protocols in this category of communication systems, is not suitable when several users want to exchange bursts of radio packets with the same beacon to acquire the radio signal strength indicator (RSSI) values needed in the location process. Therefore, new protocols are necessary to avoid the packet collisions that appear when multiple users try to communicate with the same beacons. On the other hand, the RSSI sampling process should be carried out very quickly because some systems cannot tolerate a large delay in the location process. This is even more important when the RSSI sampling process includes measures with different signal power levels or frequency channels. The principal objective of this work is to speed up the RSSI sampling process in indoor localization systems. To achieve this objective, the main contribution is the proposal of a new MAC protocol that eliminates the medium access contention periods and decreases the number of packet collisions to accelerate the RSSI collection process. Moreover, the protocol increases the overall network throughput taking advantage of the frequency channel diversity. The presented results show the suitability of this protocol for reducing the RSSI gathering delay and increasing the network throughput in simulated and real environments.

  10. Human blood RNA stabilization in samples collected and transported for a large biobank

    Science.gov (United States)

    2012-01-01

    Background The Norwegian Mother and Child Cohort Study (MoBa) is a nation-wide population-based pregnancy cohort initiated in 1999, comprising more than 108,000 pregnancies recruited between 1999 and 2008. In this study we evaluated the feasibility of integrating RNA analyses into existing MoBa protocols. We compared two different blood RNA collection tube systems – the PAXgene™ Blood RNA system and the Tempus™ Blood RNA system – and assessed the effects of suboptimal blood volumes in collection tubes and of transportation of blood samples by standard mail. Endpoints to characterize the samples were RNA quality and yield, and the RNA transcript stability of selected genes. Findings High-quality RNA could be extracted from blood samples stabilized with both PAXgene and Tempus tubes. The RNA yields obtained from the blood samples collected in Tempus tubes were consistently higher than from PAXgene tubes. Higher RNA yields were obtained from cord blood (3 – 4 times) compared to adult blood with both types of tubes. Transportation of samples by standard mail had moderate effects on RNA quality and RNA transcript stability; the overall RNA quality of the transported samples was high. Some unexplained changes in gene expression were noted, which seemed to correlate with suboptimal blood volumes collected in the tubes. Temperature variations during transportation may also be of some importance. Conclusions Our results strongly suggest that special collection tubes are necessary for RNA stabilization and they should be used for establishing new biobanks. We also show that the 50,000 samples collected in the MoBa biobank provide RNA of high quality and in sufficient amounts to allow gene expression analyses for studying the association of disease with altered patterns of gene expression. PMID:22988904

  11. Agricultural Soil Spectral Response and Properties Assessment: Effects of Measurement Protocol and Data Mining Technique

    Directory of Open Access Journals (Sweden)

    Asa Gholizadeh

    2017-10-01

    Full Text Available Soil spectroscopy has been shown to be a fast, cost-effective, environmentally friendly, non-destructive, reproducible and repeatable analytical technique. Soil components, as well as types of instruments, protocols, sampling methods, sample preparation, spectral acquisition techniques and analytical algorithms have a combined influence on the final performance. Therefore, it is important to characterize these differences and to introduce an effective approach in order to minimize the technical factors that alter reflectance spectra and consequent prediction. To quantify this alteration, a joint project between Czech University of Life Sciences Prague (CULS) and Tel-Aviv University (TAU) was conducted to estimate Cox, pH-H2O, pH-KCl and selected forms of Fe and Mn. Two different soil spectral measurement protocols and two data mining techniques were used to examine seventy-eight soil samples from five agricultural areas in different parts of the Czech Republic. Spectral measurements at both laboratories were made using different ASD spectroradiometers. The CULS protocol was based on employing a contact probe (CP) spectral measurement scheme, while the TAU protocol was carried out using a CP measurement method, accompanied with the internal soil standard (ISS) procedure. Two spectral datasets, acquired from different protocols, were both analyzed using the partial least square regression (PLSR) technique as well as the PARACUDA II®, a new data mining engine for optimizing PLSR models. The results showed that spectra based on the CULS setup (non-ISS) demonstrated significantly higher albedo intensity and reflectance values relative to the TAU setup with ISS. However, the majority of statistics using the TAU protocol was not noticeably better than the CULS spectra. The paper also highlighted that under both measurement protocols, the PARACUDA II® engine proved to be a powerful tool for providing better results than PLSR. Such initiative is not only a way to

  12. SPIDIA-RNA: second external quality assessment for the pre-analytical phase of blood samples used for RNA based analyses.

    Directory of Open Access Journals (Sweden)

    Francesca Malentacchi

    Full Text Available One purpose of the EC funded project, SPIDIA, is to develop evidence-based quality guidelines for the pre-analytical handling of blood samples for RNA molecular testing. To this end, two pan-European External Quality Assessments (EQAs) were implemented. Here we report the results of the second SPIDIA-RNA EQA. This second study included modifications in the protocol related to the blood collection process, the shipping conditions and pre-analytical specimen handling for participants. Participating laboratories received two identical proficiency blood specimens collected in tubes with or without an RNA stabilizer. For pre-defined specimen storage times and temperatures, laboratories were asked to perform RNA extraction from whole blood according to their usual procedure and to return extracted RNA to the SPIDIA facility for further analysis. These RNA samples were evaluated for purity, yield, integrity, stability, presence of interfering substances, and gene expression levels for the validated markers of RNA stability: FOS, IL1B, IL8, GAPDH, FOSB and TNFRSF10c. Analysis of the gene expression results of FOS, IL8, FOSB, and TNFRSF10c, however, indicated that the levels of these transcripts were significantly affected by blood collection tube type and storage temperature. These results demonstrated that only blood collection tubes containing a cellular RNA stabilizer allowed reliable gene expression analysis within 48 h from blood collection for all the genes investigated. The results of these two EQAs have been proposed for use in the development of a Technical Specification by the European Committee for Standardization.

  13. The HPA photon protocol and proposed electron protocol

    International Nuclear Information System (INIS)

    Pitchford, W.G.

    1985-01-01

    The Hospital Physicists' Association (HPA) photon dosimetry protocol was published in 1983. Revised values of some components of C_λ and refinements introduced into the theory in the last few years have enabled new C_λ values to be produced. The proposed HPA electron protocol is at present in draft form and will be published shortly. Both protocols are discussed. (Auth.)

  14. Toward Synthesis, Analysis, and Certification of Security Protocols

    Science.gov (United States)

    Schumann, Johann

    2004-01-01

    : multiple tries with invalid passwords caused the expected error message (too many retries), but let the user nevertheless pass. Finally, security can be compromised by silly implementation bugs or design decisions. In a commercial VPN software, all calls to the encryption routines were accidentally replaced by stubs, probably during factory testing. The product worked nicely, and the error (an open VPN) would have gone undetected if a team member had not inspected the low-level traffic out of curiosity. Also, the use of secret proprietary encryption routines can backfire, because such algorithms often exhibit weaknesses that can be exploited easily (see, e.g., DVD encoding). Summarizing, there is a large number of ways to make errors that can compromise the security of a protocol. In today's world, with short time-to-market and the use of security protocols in open and hostile networks for safety-critical applications (e.g., power or air-traffic control), such slips could lead to catastrophic situations. Thus, formal methods and automatic reasoning techniques should not be used just for the formal proof of absence of an attack, but they ought to be used to provide an end-to-end, tool-supported framework for security software. With such an approach, all required artifacts (code, documentation, test cases), formal analyses, and reliable certification will be generated automatically, given a single, high-level specification. By a combination of program synthesis, formal protocol analysis, certification, and proof-carrying code, this goal is within practical reach, since all the important technologies for such an approach actually exist and only need to be assembled in the right way.

  15. Differences in handgrip strength protocols to identify sarcopenia and frailty - a systematic review.

    Science.gov (United States)

    Sousa-Santos, A R; Amaral, T F

    2017-10-16

    Hand grip strength (HGS) is used for the diagnosis of sarcopenia and frailty. Several factors have been shown to influence HGS values during measurement. Therefore, variations in the protocols used to assess HGS, as part of the diagnosis of sarcopenia and frailty, may lead to the identification of different individuals with low HGS, introducing bias. The aim of this systematic review is to gather all the relevant studies that measured HGS to diagnose sarcopenia and frailty and to identify the differences between the protocols used. A systematic review was carried out following the recommendations of the Preferred Reporting Items for Systematic Reviews and Meta-Analyses (PRISMA) Statement. PubMed and Web of Science were systematically searched, until August 16, 2016. The evidence regarding HGS measurement protocols used to diagnose sarcopenia and frailty was summarised and the most recent protocols regarding the procedure were compared. The described search identified 4393 articles. Seventy-two studies were included in this systematic review, of which 37 were sarcopenia articles, 33 were frailty articles and two evaluated both conditions. Most studies presented limited information regarding the protocols used. The majority of the studies included did not describe a complete procedure of HGS measurement. The high heterogeneity among the protocols used in sarcopenia and frailty studies creates enormous difficulty in drawing comparative conclusions among them.

  16. Satellite Communications Using Commercial Protocols

    Science.gov (United States)

    Ivancic, William D.; Griner, James H.; Dimond, Robert; Frantz, Brian D.; Kachmar, Brian; Shell, Dan

    2000-01-01

    NASA Glenn Research Center has been working with industry, academia, and other government agencies in assessing commercial communications protocols for satellite and space-based applications. In addition, NASA Glenn has been developing and advocating new satellite-friendly modifications to existing communications protocol standards. This paper summarizes recent research into the applicability of various commercial standard protocols for use over satellite and space- based communications networks as well as expectations for future protocol development. It serves as a reference point from which the detailed work can be readily accessed. Areas that will be addressed include asynchronous-transfer-mode quality of service; completed and ongoing work of the Internet Engineering Task Force; data-link-layer protocol development for unidirectional link routing; and protocols for aeronautical applications, including mobile Internet protocol routing for wireless/mobile hosts and the aeronautical telecommunications network protocol.

  17. Security Protocols in a Nutshell

    OpenAIRE

    Toorani, Mohsen

    2016-01-01

    Security protocols are building blocks in secure communications. They deploy some security mechanisms to provide certain security services. Security protocols are considered abstract when analyzed, but they can have extra vulnerabilities when implemented. This manuscript provides a holistic study on security protocols. It reviews foundations of security protocols, taxonomy of attacks on security protocols and their implementations, and different methods and models for security analysis of protocols.

  18. [Analysis of palliative sedation in hospitalised elderly patients: Effectiveness of a protocol].

    Science.gov (United States)

    Mateos-Nozal, Jesús; García-Cabrera, Lorena; Montero Errasquín, Beatriz; Cruz-Jentoft, Alfonso José; Rexach Cano, Lourdes

    2016-01-01

    To measure changes in the practice of palliative sedation during agony in hospitalised elderly patients before and after the implementation of a palliative sedation protocol. A retrospective before-after study was performed in hospitalised patients over 65 years old who received midazolam during hospital admission and died in the hospital in two 3-month periods, before and after the implementation of the protocol. Non-sedative uses of midazolam and patients in intensive care were excluded. Patient and admission characteristics, the consent process, withdrawal of life-sustaining treatments, and the sedation process (refractory symptom treated, drug doses, assessment and use of other drugs) were recorded. Association was analysed using the chi-squared and Student t tests. A total of 143 patients were included, with no significant differences between groups in demographic characteristics or symptoms. Do not resuscitate (DNR) orders were recorded in approximately 70% of the subjects of each group, and informed consent for sedation was recorded in 91% before vs. 84% after the protocol. Induction and maintenance doses of midazolam followed protocol recommendations in 1.3% before vs. 10.4% after the protocol was implemented (P=.02), and adequate rescue doses were used in 1.3% vs. 11.9%, respectively (P=.01). Midazolam doses were significantly lower after the protocol (9.86 mg vs. 18.67 mg). A sedation score was used in 8% vs. 12% of cases, and the Palliative Care Team was involved in 35.5% and 16.4% of the cases (P=.008) before and after the protocol, respectively. Use of midazolam slightly improved after the implementation of a hospital protocol on palliative sedation. The percentage of adequate sedations and the general process of sedation were mostly unchanged by the protocol. More education and further assessment are needed to gauge the effect of these measures in the future. Copyright © 2015 SEGG. Published by Elsevier Espana. All rights reserved.

  19. Aging of monolithic zirconia dental prostheses: Protocol for a 5-year prospective clinical study using ex vivo analyses

    OpenAIRE

    Koenig, Vinciane; Wulfman, Claudine P.; Derbanne, Mathieu A.; Dupont, Nathalie M.; Le Goff, Stéphane O.; Tang, Mie-Leng; Seidel, Laurence; Dewael, Thibaut Y.; Vanheusden, Alain J.; Mainjot, Amélie K.

    2016-01-01

    Background: Recent introduction of computer-aided design/computer-aided manufacturing (CAD/CAM) monolithic zirconia dental prostheses raises the issue of material low thermal degradation (LTD), a well-known problem with zirconia hip prostheses. This phenomenon could be accentuated by masticatory mechanical stress. Until now zirconia LTD process has only been studied in vitro. This work introduces an original protocol to evaluate LTD process of monolithic zirconia prostheses in the oral enviro...

  20. An integrated sampling and analysis approach for improved biodiversity monitoring

    Science.gov (United States)

    DeWan, Amielle A.; Zipkin, Elise F.

    2010-01-01

    Successful biodiversity conservation requires high quality monitoring data and analyses to ensure scientifically defensible policy, legislation, and management. Although monitoring is a critical component in assessing population status and trends, many governmental and non-governmental organizations struggle to develop and implement effective sampling protocols and statistical analyses because of the magnitude and diversity of species of conservation concern. In this article we describe a practical and sophisticated data collection and analysis framework for developing a comprehensive wildlife monitoring program that includes multi-species inventory techniques and community-level hierarchical modeling. Compared to monitoring many species individually, the multi-species approach allows for improved estimates of individual species occurrences, including rare species, and an increased understanding of the aggregated response of a community to landscape and habitat heterogeneity. We demonstrate the benefits and practicality of this approach to address challenges associated with monitoring in the context of US state agencies that are legislatively required to monitor and protect species of greatest conservation need. We believe this approach will be useful to regional, national, and international organizations interested in assessing the status of both common and rare species.
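Why detection-corrected occupancy estimates beat naive site counts can be shown with a toy simulation. The true occupancy, detection probability, and survey design below are invented, and the correction shown is the single-species closed form, a deliberate simplification of the community-level hierarchical models the authors describe:

```python
import random

# Toy simulation: naive occupancy (share of sites with >= 1 detection)
# is biased low when detection is imperfect; dividing by the
# probability of detecting an occupied site recovers the truth.
# psi, p, and the survey design are invented for illustration.
random.seed(1)
psi, p, n_sites, n_visits = 0.6, 0.3, 2000, 4

occupied = [random.random() < psi for _ in range(n_sites)]
detected = [occ and any(random.random() < p for _ in range(n_visits))
            for occ in occupied]

naive = sum(detected) / n_sites                 # biased below psi
p_detect = 1 - (1 - p) ** n_visits              # P(seen | occupied)
corrected = naive / p_detect                    # approx. psi

print(round(naive, 3), round(corrected, 3))
```

For rare species with low p, the naive estimate can miss most occupied sites, which is the gap the hierarchical multi-species models are designed to close by sharing information across the community.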

  1. Soft chelating irrigation protocol optimizes bonding quality of Resilon/Epiphany root fillings.

    Science.gov (United States)

    De-Deus, Gustavo; Namen, Fátima; Galan, João; Zehnder, Matthias

    2008-06-01

    This study was designed to test the impact of either a strong (MTAD) or a soft (1-hydroxyethylidene-1,1-bisphosphonate [HEBP]) chelating solution on the bond strength of Resilon/Epiphany root fillings. Both 17% EDTA and the omission of a chelator in the irrigation protocol were used as reference treatments. Forty extracted human upper lateral incisors were prepared using different irrigation protocols (n = 10): G1: NaOCl, G2: NaOCl + 17% EDTA, G3: NaOCl + BioPure MTAD (Dentsply/Tulsa, Tulsa, OK), and G4: NaOCl + 18% HEBP. The teeth were obturated and then prepared for micropush-out assessment using root slices of 1 mm thickness. Loading was performed on a universal testing machine at a speed of 0.5 mm/min. One-way analysis of variance and Tukey multiple comparisons were used to compare the results among the experimental groups. EDTA- and MTAD-treated samples revealed intermediate bond strength (0.3-3.6 MPa). The lowest bond strengths were achieved in NaOCl-treated samples (0.3-1.2 MPa, p < 0.05). The highest bond strength was reached in the HEBP-treated samples (3.1-6.1 MPa, p < 0.05). Under the present in vitro conditions, the soft chelating irrigation protocol (18% HEBP) optimized the bonding quality of Resilon/Epiphany (Resilon Research LLC, Madison, CT) root fillings.
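The one-way analysis of variance used to compare the irrigation groups can be sketched in plain Python. The MPa values below are illustrative, not the study's measurements:

```python
# One-way ANOVA F statistic for k independent groups, written in
# plain Python. The bond-strength values (MPa) are invented to mimic
# the ordering reported in the abstract, not the study's data.
def one_way_anova_f(groups):
    """Return the F statistic for a list of samples (one per group)."""
    k = len(groups)
    n = sum(len(g) for g in groups)
    grand_mean = sum(sum(g) for g in groups) / n
    means = [sum(g) / len(g) for g in groups]
    ss_between = sum(len(g) * (m - grand_mean) ** 2
                     for g, m in zip(groups, means))
    ss_within = sum(sum((x - m) ** 2 for x in g)
                    for g, m in zip(groups, means))
    return (ss_between / (k - 1)) / (ss_within / (n - k))

naocl = [0.5, 0.8, 1.1]   # hypothetical NaOCl-only slices
edta = [1.9, 2.4, 3.0]    # hypothetical NaOCl + EDTA slices
hebp = [3.5, 4.8, 5.6]    # hypothetical NaOCl + HEBP slices
print(round(one_way_anova_f([naocl, edta, hebp]), 2))
```

A large F relative to the F distribution's critical value justifies the follow-up Tukey pairwise comparisons mentioned in the abstract.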

  2. From Protocols to Publications: A Study in Selective Reporting of Outcomes in Randomized Trials in Oncology

    Science.gov (United States)

    Raghav, Kanwal Pratap Singh; Mahajan, Sminil; Yao, James C.; Hobbs, Brian P.; Berry, Donald A.; Pentz, Rebecca D.; Tam, Alda; Hong, Waun K.; Ellis, Lee M.; Abbruzzese, James; Overman, Michael J.

    2015-01-01

    Purpose The decision by journals to append protocols to published reports of randomized trials was a landmark event in clinical trial reporting. However, limited information is available on how this initiative affected transparency and selective reporting of clinical trial data. Methods We analyzed 74 oncology-based randomized trials published in Journal of Clinical Oncology, the New England Journal of Medicine, and The Lancet in 2012. To ascertain integrity of reporting, we compared published reports with their respective appended protocols with regard to primary end points, nonprimary end points, unplanned end points, and unplanned analyses. Results A total of 86 primary end points were reported in 74 randomized trials; nine trials had greater than one primary end point. Nine trials (12.2%) had some discrepancy between their planned and published primary end points. A total of 579 nonprimary end points (median, seven per trial) were planned, of which 373 (64.4%; median, five per trial) were reported. A significant positive correlation was found between the number of planned and nonreported nonprimary end points (Spearman r = 0.66; P < .001). Twenty-eight studies (37.8%) reported a total of 65 unplanned end points, 52 (80.0%) of which were not identified as unplanned. Thirty-one (41.9%) and 19 (25.7%) of 74 trials reported a total of 52 unplanned analyses involving primary end points and 33 unplanned analyses involving nonprimary end points, respectively. Studies reported positive unplanned end points and unplanned analyses more frequently than negative outcomes in abstracts (unplanned end points odds ratio, 6.8; P = .002; unplanned analyses odds ratio, 8.4; P = .007). Conclusion Despite public and reviewer access to protocols, selective outcome reporting persists and is a major concern in the reporting of randomized clinical trials. To foster credible evidence-based medicine, additional initiatives are needed to minimize selective reporting. PMID:26304898

  3. Cost-effectiveness analysis of a non-contrast screening MRI protocol for vestibular schwannoma in patients with asymmetric sensorineural hearing loss

    International Nuclear Information System (INIS)

    Crowson, Matthew G.; Rocke, Daniel J.; Kaylie, David M.; Hoang, Jenny K.; Weissman, Jane L.

    2017-01-01

    We aimed to determine whether a non-contrast screening MRI is cost-effective compared to a full MRI protocol with contrast for the evaluation of vestibular schwannomas. A decision tree was constructed to evaluate full MRI and screening MRI strategies for patients with asymmetric sensorineural hearing loss. If a patient had a positive screening MRI, s/he received a full MRI. Vestibular schwannoma prevalence, MRI specificity and sensitivity, and gadolinium anaphylaxis incidence were obtained through literature review. Institutional charge data were obtained using representative patient cohorts. One-way and probabilistic sensitivity analyses were completed to determine cost-effectiveness (CE) model threshold points for MRI performance characteristics and charges. The mean charge for a full MRI with contrast was significantly higher than for a screening MRI ($4089 ± 1086 versus $2872 ± 741; p < 0.05). The screening MRI protocol was more cost-effective than a full MRI protocol at willingness-to-pay thresholds from $0 to $20,000 USD. Sensitivity analyses determined that the screening protocol dominated when the screening MRI charge was less than $4678 and the imaging specificity exceeded 78.2%. The screening MRI protocol also dominated when vestibular schwannoma prevalence was varied between 0 and 1000 per 10,000 people. A screening MRI protocol is more cost-effective than a full MRI with contrast in the diagnostic evaluation of a vestibular schwannoma. A screening MRI likely also confers the benefits of shorter exam time and no contrast use. Further investigation is needed to confirm the relative performance of screening protocols for vestibular schwannomas. (orig.)
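The decision-tree comparison described above can be sketched as an expected-charge calculation: in the screening arm, every patient pays for the screening scan and only screen-positives (true and false) proceed to the full contrast study. Only the two mean charges come from the abstract; the prevalence, sensitivity and specificity values below are invented placeholders, not the study's model inputs.

```python
# Hedged sketch of the screening-vs-full-MRI decision tree. Only the two
# mean charges are taken from the abstract; prevalence, sensitivity and
# specificity are assumed values for illustration.

def expected_cost_full(charge_full):
    # Full-protocol arm: every patient receives the contrast MRI.
    return charge_full

def expected_cost_screening(charge_screen, charge_full, prevalence,
                            sensitivity, specificity):
    # Screening arm: everyone gets the screening MRI; positives (true
    # positives plus false positives) go on to the full contrast protocol.
    p_positive = prevalence * sensitivity + (1 - prevalence) * (1 - specificity)
    return charge_screen + p_positive * charge_full

full = expected_cost_full(4089.0)                 # mean charge from abstract
screen = expected_cost_screening(2872.0, 4089.0,  # mean charges from abstract
                                 prevalence=0.05,   # assumed
                                 sensitivity=0.98,  # assumed
                                 specificity=0.90)  # assumed
print(f"full protocol: ${full:.0f}, screening-first: ${screen:.0f}")
```

Under these assumed inputs the screening-first arm is cheaper per patient, mirroring the dominance result reported above; a one-way sensitivity analysis amounts to sweeping one parameter and re-evaluating both arms.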

  4. Cost-effectiveness analysis of a non-contrast screening MRI protocol for vestibular schwannoma in patients with asymmetric sensorineural hearing loss

    Energy Technology Data Exchange (ETDEWEB)

    Crowson, Matthew G.; Rocke, Daniel J.; Kaylie, David M. [Duke University Medical Center, Division of Otolaryngology-Head and Neck Surgery, Durham, NC (United States); Hoang, Jenny K. [Duke University Medical Center, Department of Radiology, Durham, NC (United States); Weissman, Jane L. [Oregon Health Sciences University, Professor Emerita of Diagnostic Radiology, Portland, OR (United States)

    2017-08-15

    We aimed to determine if a non-contrast screening MRI is cost-effective compared to a full MRI protocol with contrast for the evaluation of vestibular schwannomas. A decision tree was constructed to evaluate full MRI and screening MRI strategies for patients with asymmetric sensorineural hearing loss. If a patient were to have a positive screening MRI, s/he received a full MRI. Vestibular schwannoma prevalence, MRI specificity and sensitivity, and gadolinium anaphylaxis incidence were obtained through literature review. Institutional charge data were obtained using representative patient cohorts. One-way and probabilistic sensitivity analyses were completed to determine CE model threshold points for MRI performance characteristics and charges. The mean charge for a full MRI with contrast was significantly higher than a screening MRI ($4089 ± 1086 versus $2872 ± 741; p < 0.05). The screening MRI protocol was more cost-effective than a full MRI protocol with a willingness-to-pay from $0 to 20,000 USD. Sensitivity analyses determined that the screening protocol dominated when the screening MRI charge was less than $4678, and the imaging specificity exceeded 78.2%. The screening MRI protocol also dominated when vestibular schwannoma prevalence was varied between 0 and 1000 in 10,000 people. A screening MRI protocol is more cost-effective than a full MRI with contrast in the diagnostic evaluation of a vestibular schwannoma. A screening MRI likely also confers benefits of shorter exam time and no contrast use. Further investigation is needed to confirm the relative performance of screening protocols for vestibular schwannomas. (orig.)

  5. A Novel Morphometry-Based Protocol of Automated Video-Image Analysis for Species Recognition and Activity Rhythms Monitoring in Deep-Sea Fauna

    Directory of Open Access Journals (Sweden)

    Paolo Menesatti

    2009-10-01

    Full Text Available The understanding of ecosystem dynamics in deep-sea areas is to date limited by technical constraints on sampling repetition. We have elaborated a morphometry-based protocol for automated video-image analysis in which animal movement tracking (by frame subtraction) is accompanied by species identification from animals’ outlines by Fourier Descriptors and Standard K-Nearest Neighbours methods. One-week footage from a permanent video-station located at 1,100 m depth in Sagami Bay (Central Japan) was analysed. Out of 150,000 frames (1 per 4 s), a subset of 10,000 was analysed by a trained operator to increase the efficiency of the automated procedure. Error estimation for the automated and trained-operator procedures was computed as a measure of protocol performance. Three displacing species were identified as the most recurrent: Zoarcid fishes (eelpouts), red crabs (Paralomis multispina), and snails (Buccinum soyomaruae). Species identification with KNN thresholding produced better results in automated motion detection. Results were discussed assuming that this technological bottleneck is at present severely constraining the exploration of the deep sea.
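The frame-subtraction stage of such a protocol can be illustrated with a minimal, dependency-free sketch. The 8x8 toy frames and the change threshold below are invented; the published pipeline would additionally extract outlines and classify them with Fourier Descriptors and K-Nearest Neighbours.

```python
# Minimal sketch of frame-subtraction motion detection, the first stage of
# the video-analysis protocol described above. Frames and threshold are toy
# values for illustration only.

def moving_pixels(prev, curr, threshold=30):
    """Boolean mask of pixels whose greyscale value changed between frames."""
    return [[abs(c - p) > threshold for p, c in zip(prow, crow)]
            for prow, crow in zip(prev, curr)]

# Toy 8x8 frames: a 2x2 block of bright pixels shifts one column to the right.
prev = [[0] * 8 for _ in range(8)]
curr = [[0] * 8 for _ in range(8)]
for r in (3, 4):
    for c in (2, 3):
        prev[r][c] = 200
    for c in (3, 4):
        curr[r][c] = 200

mask = moving_pixels(prev, curr)
n_moving = sum(sum(row) for row in mask)
print(n_moving)  # pixels flagged as motion: the trailing and leading edges
```

The flagged pixels are the vacated and newly occupied columns; the unchanged overlap column is not flagged, which is why frame subtraction detects movement rather than mere presence.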

  6. Business protocol in integrated Europe

    OpenAIRE

    Pavelová, Nina

    2009-01-01

    The first chapter is devoted to definitions of basic terms such as protocol or business protocol, to differences between protocol and etiquette, and between social etiquette and business etiquette. The second chapter focuses on the factors influencing the European business protocol. The third chapter is devoted to the etiquette of business protocol in the European countries. It touches on topics such as punctuality and planning of business appointments, greeting, business cards, dress and appear...

  7. Evaluation of four automated protocols for extraction of DNA from FTA cards.

    Science.gov (United States)

    Stangegaard, Michael; Børsting, Claus; Ferrero-Miliani, Laura; Frank-Hansen, Rune; Poulsen, Lena; Hansen, Anders J; Morling, Niels

    2013-10-01

    Extraction of DNA using magnetic bead-based techniques on automated DNA extraction instruments provides a fast, reliable, and reproducible method for DNA extraction from various matrices. Here, we have compared the yield and quality of DNA extracted from FTA cards using four automated extraction protocols on three different instruments. The extraction processes were repeated up to six times with the same pieces of FTA cards. The sample material on the FTA cards was either blood or buccal cells. With the QIAamp DNA Investigator and QIAsymphony DNA Investigator kits, it was possible to extract DNA from the FTA cards in all six rounds of extractions in sufficient amount and quality to obtain complete short tandem repeat (STR) profiles on a QIAcube and a QIAsymphony SP. With the PrepFiler Express kit, almost all the extractable DNA was extracted in the first two rounds of extractions. Furthermore, we demonstrated that it was possible to successfully extract sufficient DNA for STR profiling from previously processed FTA card pieces that had been stored at 4 °C for up to 1 year. This showed that rare or precious FTA card samples may be saved for future analyses even though some DNA was already extracted from the FTA cards.

  8. Evolved Gas Analyses of the Murray Formation in Gale Crater, Mars: Results of the Curiosity Rover's Sample Analysis at Mars (SAM) Instrument

    Science.gov (United States)

    Sutter, B.; McAdam, A. C.; Rampe, E. B.; Thompson, L. M.; Ming, D. W.; Mahaffy, P. R.; Navarro-Gonzalez, R.; Stern, J. C.; Eigenbrode, J. L.; Archer, P. D.

    2017-01-01

    The Sample Analysis at Mars (SAM) instrument aboard the Mars Science Laboratory rover has analyzed 13 samples from Gale Crater. All SAM-evolved gas analyses have yielded a multitude of volatiles (e.g., H2O, SO2, H2S, CO2, CO, NO, O2, HCl) [1- 6]. The objectives of this work are to 1) Characterize recent evolved SO2, CO2, O2, and NO gas traces of the Murray formation mudstone, 2) Constrain sediment mineralogy/composition based on SAM evolved gas analysis (SAM-EGA), and 3) Discuss the implications of these results relative to understanding the geological history of Gale Crater.

  9. Rapid Fractionation and Isolation of Whole Blood Components in Samples Obtained from a Community-based Setting.

    Science.gov (United States)

    Weckle, Amy; Aiello, Allison E; Uddin, Monica; Galea, Sandro; Coulborn, Rebecca M; Soliven, Richelo; Meier, Helen; Wildman, Derek E

    2015-11-30

    Collection and processing of whole blood samples in a non-clinical setting offers a unique opportunity to evaluate community-dwelling individuals both with and without preexisting conditions. Rapid processing of these samples is essential to avoid degradation of key cellular components. Included here are methods for simultaneous peripheral blood mononuclear cell (PBMC), DNA, RNA and serum isolation from a single blood draw performed in the homes of consenting participants across a metropolitan area, with processing initiated within 2 hr of collection. We have used these techniques to process over 1,600 blood specimens yielding consistent, high quality material, which has subsequently been used in successful DNA methylation, genotyping, gene expression and flow cytometry analyses. Some of the methods employed are standard; however, when combined in the described manner, they enable efficient processing of samples from participants of population- and/or community-based studies who would not normally be evaluated in a clinical setting. Therefore, this protocol has the potential to obtain samples (and subsequently data) that are more representative of the general population.

  10. A robust ECC based mutual authentication protocol with anonymity for session initiation protocol.

    Science.gov (United States)

    Mehmood, Zahid; Chen, Gongliang; Li, Jianhua; Li, Linsen; Alzahrani, Bander

    2017-01-01

    Over the past few years, the Session Initiation Protocol (SIP) has emerged as a substantial application-layer protocol for multimedia services. It is extensively used for managing, altering, terminating and distributing multimedia sessions. Authentication plays a pivotal role in the SIP environment. Recently, Lu et al. presented an authentication protocol for SIP and professed that the newly proposed protocol is protected against all familiar attacks. However, detailed analysis shows that Lu et al.'s protocol is exposed to server masquerading and user masquerading attacks. Moreover, it also fails to protect the user's identity, and it possesses an incorrect login and authentication phase. In order to establish a suitable and efficient protocol able to overcome all these discrepancies, a robust ECC-based novel mutual authentication mechanism with anonymity for SIP is presented in this manuscript. The improved protocol contains an explicit parameter for the user to cope with the issues of security and correctness, and it is found to be more secure and relatively effective in protecting the user's privacy and resisting user and server masquerading, as verified through comprehensive formal and informal security analysis.
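As a rough illustration of the primitive such ECC-based schemes build on (this is not the manuscript's actual protocol), the sketch below runs an ECDH-style exchange on the tiny textbook curve y^2 = x^3 + 2x + 2 over GF(17), with generator (5, 1) of order 19, and derives hash-based authenticators on both sides. The curve is far too small for real security and is used only to show the mechanics.

```python
import hashlib

# Toy sketch of an ECC-based mutual authentication handshake in the spirit
# of the SIP scheme described above (NOT the paper's exact protocol).
# Tiny textbook curve y^2 = x^3 + 2x + 2 over GF(17), generator (5, 1)
# of order 19 -- far too small for real security.

P, A = 17, 2
G = (5, 1)

def inv(x):
    return pow(x, P - 2, P)          # modular inverse via Fermat

def add(p1, p2):
    """Elliptic-curve point addition; None is the point at infinity."""
    if p1 is None:
        return p2
    if p2 is None:
        return p1
    (x1, y1), (x2, y2) = p1, p2
    if x1 == x2 and (y1 + y2) % P == 0:
        return None                  # inverse points sum to infinity
    if p1 == p2:
        lam = (3 * x1 * x1 + A) * inv(2 * y1) % P
    else:
        lam = (y2 - y1) * inv((x2 - x1) % P) % P
    x3 = (lam * lam - x1 - x2) % P
    return x3, (lam * (x1 - x3) - y1) % P

def mul(k, pt):
    """Scalar multiplication by double-and-add."""
    out = None
    while k:
        if k & 1:
            out = add(out, pt)
        pt = add(pt, pt)
        k >>= 1
    return out

def h(*parts):
    return hashlib.sha256("|".join(map(str, parts)).encode()).hexdigest()

# User and server each pick an ephemeral secret and exchange public points.
u_sec, s_sec = 7, 11
u_pub, s_pub = mul(u_sec, G), mul(s_sec, G)

# Both sides derive the same shared point and hash it into authenticators.
k_user = mul(u_sec, s_pub)
k_server = mul(s_sec, u_pub)
auth_user = h("user", k_user)        # proof sent to the server
auth_server = h("server", k_server)  # proof sent to the user
print(auth_user == h("user", k_server))  # server's verification succeeds
```

Mutual authentication in real schemes additionally binds identities, nonces and pre-shared verifiers into the hashed transcripts; the point here is only that both sides can derive the same secret from exchanged public points.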

  11. A robust ECC based mutual authentication protocol with anonymity for session initiation protocol.

    Directory of Open Access Journals (Sweden)

    Zahid Mehmood

    Full Text Available Over the past few years, the Session Initiation Protocol (SIP) has emerged as a substantial application-layer protocol for multimedia services. It is extensively used for managing, altering, terminating and distributing multimedia sessions. Authentication plays a pivotal role in the SIP environment. Recently, Lu et al. presented an authentication protocol for SIP and professed that the newly proposed protocol is protected against all familiar attacks. However, detailed analysis shows that Lu et al.'s protocol is exposed to server masquerading and user masquerading attacks. Moreover, it also fails to protect the user's identity, and it possesses an incorrect login and authentication phase. In order to establish a suitable and efficient protocol able to overcome all these discrepancies, a robust ECC-based novel mutual authentication mechanism with anonymity for SIP is presented in this manuscript. The improved protocol contains an explicit parameter for the user to cope with the issues of security and correctness, and it is found to be more secure and relatively effective in protecting the user's privacy and resisting user and server masquerading, as verified through comprehensive formal and informal security analysis.

  12. Usefulness of an injectable anaesthetic protocol for semen collection through urethral catheterisation in domestic cats.

    Science.gov (United States)

    Pisu, Maria Carmela; Ponzio, Patrizia; Rovella, Chiara; Baravalle, Michela; Veronesi, Maria Cristina

    2017-10-01

    Objectives Although less often requested in comparison with dogs, the collection of semen in cats can be necessary for artificial insemination, for semen evaluation in tom cats used for breeding and for semen storage. Urethral catheterisation after pharmacological induction with medetomidine has proved to be useful for the collection of semen in domestic cats. However, most of the previously used protocols require the administration of high doses of medetomidine that can increase the risk of side effects, especially on the cardiovascular system. In routine clinical practice, one safe and useful injectable anaesthetic protocol for short-term clinical investigations or surgery in cats involves premedication with low intramuscular doses of dexmedetomidine with methadone, followed by intravenous propofol bolus injection. We aimed to assess the usefulness of this injectable anaesthetic protocol for semen collection, via urethral catheterisation, in domestic cats. Methods The study was performed on 38 purebred, adult cats, during the breeding season, and semen was collected via urethral catheterisation using an injectable anaesthesia protocol with methadone (0.2 mg/kg) and dexmedetomidine (5 µg/kg) premedication, followed by induction with propofol. Results The anaesthetic protocol used in the present study allowed the collection of large-volume semen samples, characterised by good parameters and without side effects. Conclusions and relevance The results from the present study suggest that the injectable anaesthetic protocol using methadone and dexmedetomidine premedication, followed by induction with propofol, could be suitable and safe for the collection of a good-quality semen sample, via urethral catheterisation, in domestic cats. It can therefore be used as an alternative to previous medetomidine-based sedation protocols.

  13. Validation of a standard forensic anthropology examination protocol by measurement of applicability and reliability on exhumed and archive samples of known biological attribution.

    Science.gov (United States)

    Francisco, Raffaela Arrabaça; Evison, Martin Paul; Costa Junior, Moacyr Lobo da; Silveira, Teresa Cristina Pantozzi; Secchieri, José Marcelo; Guimarães, Marco Aurelio

    2017-10-01

    Forensic anthropology makes an important contribution to human identification and assessment of the causes and mechanisms of death and body disposal in criminal and civil investigations, including those related to atrocity, disaster and trafficking victim identification. The methods used are comparative, relying on assignment of questioned material to categories observed in standard reference material of known attribution. Reference collections typically originate in Europe and North America, and are not necessarily representative of contemporary global populations. Methods based on them must be validated when applied to novel populations. This study describes the validation of a standardized forensic anthropology examination protocol by application to two contemporary Brazilian skeletal samples of known attribution. One sample (n=90) was collected from exhumations following 7-35 years of burial and the second (n=30) was collected following successful investigations in routine casework. The study presents measurement of (1) the applicability of each of the methods used and (2) the reliability with which the biographic parameters were assigned in each case. The results are discussed with reference to published assessments of methodological reliability regarding sex, age and, in particular, ancestry estimation. Copyright © 2017 Elsevier B.V. All rights reserved.

  14. Hair cortisol detection in dairy cattle by using EIA: protocol validation and correlation with faecal cortisol metabolites.

    Science.gov (United States)

    Tallo-Parra, O; Manteca, X; Sabes-Alsina, M; Carbajal, A; Lopez-Bejar, M

    2015-06-01

    Hair may be a useful matrix for detecting cumulative cortisol concentrations in studies of animal welfare and chronic stress. The aim of this study was to validate a protocol for cortisol detection in hair from dairy cattle by enzyme immunoassay (EIA). Seventeen adult Holstein-Friesian dairy cows were used during the milking period. Hair cortisol concentration was assessed in 25-day-old hair samples taken from the frontal region of the head, analysing black and white coloured hair separately. Concentrations of cortisol metabolites were determined in faeces collected twice a week during the same period of time. There was a high correlation between cortisol values in faeces and cortisol in white hair samples, but the correlation was not significant for black hair samples. The intra- and inter-assay coefficients of variation were 4.9% and 10.6%, respectively. The linearity test showed R² = 0.98 and a mean percentage error of -10.8 ± 1.55%. The extraction efficiency was 89.0 ± 23.52% and the parallelism test showed similar slopes. Cortisol detection in hair by EIA therefore appears to be a valid method for representing long-term circulating cortisol levels in dairy cattle.
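The intra-assay coefficient of variation quoted above is typically computed from duplicate wells: CV = SD / mean per sample, averaged across samples and expressed as a percentage. A small sketch with made-up duplicate readings:

```python
from statistics import mean, stdev

# Sketch of intra-assay CV computation for an EIA validation. The duplicate
# readings below are invented, not data from the study.

def intra_assay_cv(duplicates):
    """Mean of per-sample CVs (SD/mean) across duplicate pairs, in percent."""
    cvs = [stdev(pair) / mean(pair) for pair in duplicates]
    return 100 * mean(cvs)

readings = [(10.2, 10.9), (4.1, 4.4), (7.8, 7.5)]  # hypothetical pg/mg pairs
print(round(intra_assay_cv(readings), 1))
```

The inter-assay CV is computed the same way, but across the means of the same control sample measured on different assay plates or days.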

  15. Assessing Exhaustiveness of Stochastic Sampling for Integrative Modeling of Macromolecular Structures.

    Science.gov (United States)

    Viswanath, Shruthi; Chemmama, Ilan E; Cimermancic, Peter; Sali, Andrej

    2017-12-05

    Modeling of macromolecular structures involves structural sampling guided by a scoring function, resulting in an ensemble of good-scoring models. By necessity, the sampling is often stochastic, and must be exhaustive at a precision sufficient for accurate modeling and assessment of model uncertainty. Therefore, the very first step in analyzing the ensemble is an estimation of the highest precision at which the sampling is exhaustive. Here, we present an objective and automated method for this task. As a proxy for sampling exhaustiveness, we evaluate whether two independently and stochastically generated sets of models are sufficiently similar. The protocol includes testing 1) convergence of the model score, 2) whether model scores for the two samples were drawn from the same parent distribution, 3) whether each structural cluster includes models from each sample proportionally to its size, and 4) whether there is sufficient structural similarity between the two model samples in each cluster. The evaluation also provides the sampling precision, defined as the smallest clustering threshold that satisfies the third, most stringent test. We validate the protocol with the aid of enumerated good-scoring models for five illustrative cases of binary protein complexes. Passing the proposed four tests is necessary, but not sufficient for thorough sampling. The protocol is general in nature and can be applied to the stochastic sampling of any set of models, not just structural models. In addition, the tests can be used to stop stochastic sampling as soon as exhaustiveness at desired precision is reached, thereby improving sampling efficiency; they may also help in selecting a model representation that is sufficiently detailed to be informative, yet also sufficiently coarse for sampling to be exhaustive. Copyright © 2017 Biophysical Society. Published by Elsevier Inc. All rights reserved.
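Test 2 of the protocol, whether the two sets of model scores could have been drawn from the same parent distribution, can be illustrated with a plain two-sample Kolmogorov-Smirnov statistic (stdlib only). The score lists below are invented, and the published protocol's statistical machinery is more involved than this sketch.

```python
# Sketch of a two-sample Kolmogorov-Smirnov statistic, in the spirit of
# comparing model scores from two independent stochastic sampling runs.
# Scores are invented for illustration.

def ks_statistic(a, b):
    """Maximum distance between the empirical CDFs of two score samples."""
    a, b = sorted(a), sorted(b)
    values = sorted(set(a) | set(b))

    def ecdf(sample, x):
        return sum(v <= x for v in sample) / len(sample)

    return max(abs(ecdf(a, x) - ecdf(b, x)) for x in values)

scores_run1 = [1.0, 1.2, 1.3, 1.5, 1.7]
scores_run2 = [1.1, 1.2, 1.4, 1.5, 1.6]
print(ks_statistic(scores_run1, scores_run2))  # small D suggests same parent
```

In practice the D statistic is converted to a p-value (or a library routine such as a two-sample KS test is used); a large D between the two runs would indicate that sampling has not converged.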

  16. Sampling Strategies and Processing of Biobank Tissue Samples from Porcine Biomedical Models.

    Science.gov (United States)

    Blutke, Andreas; Wanke, Rüdiger

    2018-03-06

    In translational medical research, porcine models have steadily become more popular. Considering the high value of individual animals, particularly of genetically modified pig models, and the often-limited number of available animals of these models, establishment of (biobank) collections of adequately processed tissue samples suited for a broad spectrum of subsequent analysis methods, including analyses not specified at the time point of sampling, represents a meaningful approach to take full advantage of the translational value of the model. With respect to the peculiarities of porcine anatomy, comprehensive guidelines have recently been established for standardized generation of representative, high-quality samples from different porcine organs and tissues. These guidelines are essential prerequisites for the reproducibility of results and their comparability between different studies and investigators. The recording of basic data, such as organ weights and volumes, the determination of the sampling locations and of the numbers of tissue samples to be generated, as well as their orientation, size, processing and trimming directions, are relevant factors determining the generalizability and usability of the specimen for molecular, qualitative, and quantitative morphological analyses. Here, an illustrative, practical, step-by-step demonstration of the most important techniques for generation of representative, multi-purpose biobank specimen from porcine tissues is presented. The methods described here include determination of organ/tissue volumes and densities, the application of a volume-weighted systematic random sampling procedure for parenchymal organs by point-counting, determination of the extent of tissue shrinkage related to histological embedding of samples, and generation of randomly oriented samples for quantitative stereological analyses, such as isotropic uniform random (IUR) sections generated by the "Orientator" and "Isector" methods, and vertical
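The volume determination and point-counting steps mentioned above are commonly based on Cavalieri's principle: volume is estimated from equidistant slabs as V = t * a_p * sum(P_i), with slab thickness t, grid area per point a_p, and P_i the number of grid points hitting tissue on slab i. A sketch with invented numbers (the slab counts below are not from the protocol):

```python
# Sketch of a Cavalieri point-counting volume estimate, as used in
# stereological sampling of parenchymal organs. All numbers are invented.

def cavalieri_volume(slab_thickness_cm, area_per_point_cm2, points_per_slab):
    """V = t * a_p * sum(P_i) over equidistant slabs through the organ."""
    return slab_thickness_cm * area_per_point_cm2 * sum(points_per_slab)

# e.g. 1 cm slabs, 0.5 cm^2 of tissue represented per grid point,
# point counts from five consecutive slabs:
volume = cavalieri_volume(1.0, 0.5, [12, 20, 25, 18, 9])
print(volume, "cm^3")
```

The same counts, multiplied by a measured tissue density, also yield the organ mass; randomizing the position of the first slab keeps the estimator unbiased.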

  17. Power Saving MAC Protocols for WSNs and Optimization of S-MAC Protocol

    Directory of Open Access Journals (Sweden)

    Simarpreet Kaur

    2012-11-01

    Full Text Available Low-power MAC protocols have received a lot of consideration in the last few years because of their influence on the lifetime of wireless sensor networks, since sensors typically operate on batteries whose replacement is often difficult. A lot of work has been done to minimize energy expenditure and prolong the sensor lifetime through energy-efficient designs across layers. Meanwhile, the sensor network should be able to maintain a certain throughput in order to fulfill the QoS requirements of the end user and to ensure the stability of the network. This paper introduces different types of MAC protocols used for WSNs and describes S-MAC, a Medium-Access Control protocol designed for wireless sensor networks. S-MAC uses a few innovative techniques to reduce energy consumption and support self-configuration. A new protocol is suggested to improve the energy efficiency, latency and throughput of the existing MAC protocol for WSNs. A modification of the protocol is then proposed to eliminate the need for some nodes to stay awake longer than the other nodes, which improves the energy efficiency, latency and throughput and hence increases the life span of a wireless sensor network.
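The core energy argument behind S-MAC's periodic listen/sleep cycle can be put in numbers: average radio power is the duty-cycle-weighted mix of listen-mode and sleep-mode power. The power figures below are typical-radio assumptions for illustration, not values from the paper.

```python
# Back-of-the-envelope sketch of duty-cycled MAC energy use, illustrating
# why S-MAC's periodic listen/sleep schedule extends node lifetime.
# Power draws are assumed typical-radio values, not measurements.

def average_power_mw(listen_ms, sleep_ms, p_listen_mw=60.0, p_sleep_mw=0.03):
    """Duty-cycle-weighted average power over one listen/sleep cycle."""
    cycle = listen_ms + sleep_ms
    return (listen_ms * p_listen_mw + sleep_ms * p_sleep_mw) / cycle

always_on = average_power_mw(1000, 0)    # no duty cycling: radio always listens
smac_10pct = average_power_mw(100, 900)  # 10% duty cycle, as in S-MAC schedules
print(f"{always_on:.1f} mW always-on vs {smac_10pct:.2f} mW at 10% duty cycle")
```

The trade-off the abstract alludes to is visible here: shrinking the listen window cuts average power roughly in proportion to the duty cycle, but it also reduces the time available for transmission, which is what hurts latency and throughput.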

  18. Massage in children with cancer: effectiveness of a protocol

    Directory of Open Access Journals (Sweden)

    Luís Manuel da Cunha Batalha

    2013-11-01

    Conclusions: despite the small sample size, massage therapy appears to be a useful intervention in reducing pain in children with cancer. However, there are still questions regarding the effectiveness of this massage protocol. The authors recommend its use due to its contribution to the promotion of the child's well-being and quality of life.

  19. The effect of DNA degradation bias in passive sampling devices on metabarcoding studies of arthropod communities and their associated microbiota.

    Science.gov (United States)

    Krehenwinkel, Henrik; Fong, Marisa; Kennedy, Susan; Huang, Edward Greg; Noriyuki, Suzuki; Cayetano, Luis; Gillespie, Rosemary

    2018-01-01

    PCR amplification bias is a well-known problem in metagenomic analysis of arthropod communities. In contrast, variation in DNA degradation rates is a largely neglected source of bias. Differential degradation of DNA molecules could cause underrepresentation of taxa in a community sequencing sample. Arthropods are often collected by passive sampling devices, such as Malaise traps. Specimens in such a trap are exposed to varying periods of suboptimal storage and possibly different rates of DNA degradation. Degradation bias could thus be a significant issue, skewing diversity estimates. Here, we estimate the effect of differential DNA degradation on the recovery of community diversity of Hawaiian arthropods and their associated microbiota. We use a simple DNA size selection protocol to test for degradation bias in mock communities, as well as passively collected samples from actual Malaise traps. We compare the effect of DNA degradation to that of varying PCR conditions, including primer choice, annealing temperature and cycle number. Our results show that DNA degradation does indeed bias community analyses. However, the effect of this bias is of minor importance compared to that induced by changes in PCR conditions. Analyses of the macro- and microbiome from passively collected arthropod samples are thus well worth pursuing.

  20. An optimised protocol for isolation of RNA from small sections of laser-capture microdissected FFPE tissue amenable for next-generation sequencing.

    Science.gov (United States)

    Amini, Parisa; Ettlin, Julia; Opitz, Lennart; Clementi, Elena; Malbon, Alexandra; Markkanen, Enni

    2017-08-23

    Formalin-fixed paraffin embedded (FFPE) tissue constitutes a vast treasury of samples for biomedical research. Thus far however, extraction of RNA from FFPE tissue has proved challenging due to chemical RNA-protein crosslinking and RNA fragmentation, both of which heavily impact on RNA quantity and quality for downstream analysis. With very small sample sizes, e.g. when performing Laser-capture microdissection (LCM) to isolate specific subpopulations of cells, recovery of sufficient RNA for analysis with reverse-transcription quantitative PCR (RT-qPCR) or next-generation sequencing (NGS) becomes very cumbersome and difficult. We excised matched cancer-associated stroma (CAS) and normal stroma from clinical specimen of FFPE canine mammary tumours using LCM, and compared the commonly used protease-based RNA isolation procedure with an adapted novel technique that additionally incorporates a focused ultrasonication step. We successfully adapted a protocol that uses focused ultrasonication to isolate RNA from small amounts of deparaffinised, stained, clinical LCM samples. Using this approach, we found that total RNA yields could be increased by 8- to 12-fold compared to a commonly used protease-based extraction technique. Surprisingly, RNA extracted using this new approach was qualitatively at least equal if not superior compared to the old approach, as Cq values in RT-qPCR were on average 2.3-fold lower using the new method. Finally, we demonstrate that RNA extracted using the new method performs comparably in NGS as well. We present a successful isolation protocol for extraction of RNA from difficult and limiting FFPE tissue samples that enables successful analysis of small sections of clinically relevant specimen. 
The possibility to study gene expression signatures in specific small sections of archival FFPE tissue, which often entail large amounts of highly relevant clinical follow-up data, unlocks a new dimension of hitherto difficult-to-analyse samples which now
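Because qPCR amplification is exponential, a Cq improvement like the one reported above translates into template abundance as roughly 2^d for a difference of d cycles. A small worked example with illustrative Cq values (not the study's measurements):

```python
# Sketch of interpreting RT-qPCR Cq differences: each cycle of earlier
# quantification corresponds to roughly a doubling of starting template.
# The Cq values below are illustrative, not from the study.

def fold_difference(cq_old, cq_new):
    """Approximate fold-change in template implied by a Cq shift,
    assuming ~100% amplification efficiency."""
    return 2 ** (cq_old - cq_new)

# e.g. the improved extraction quantifies at Cq 26.8 where the old
# protocol gave Cq 28.0 for the same sample:
print(round(fold_difference(28.0, 26.8), 2))
```

Real assays rarely run at exactly 100% efficiency, so the base 2 is itself an approximation; efficiency-corrected methods replace it with the measured per-cycle amplification factor.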

  1. Soil analyses by ICP-MS (Review)

    International Nuclear Information System (INIS)

    Yamasaki, Shin-ichi

    2000-01-01

    Soil analyses by inductively coupled plasma mass spectrometry (ICP-MS) are reviewed. The first half of the paper is devoted to the development of techniques applicable to soil analyses, where diverse analytical parameters are carefully evaluated. However, the choice of soil samples is somewhat arbitrary, and only a limited number of samples (mostly reference materials) are examined. In the second half, efforts are mostly concentrated on the introduction of reports, where a large number of samples and/or very precious samples have been analyzed. Although the analytical techniques used in these reports are not necessarily novel, valuable information concerning such topics as background levels of elements in soils, chemical forms of elements in soils and behavior of elements in soil ecosystems and the environment can be obtained. The major topics discussed are total elemental analysis, analysis of radionuclides with long half-lives, speciation, leaching techniques, and isotope ratio measurements. (author)

  2. Improvement in quality of life and sexual functioning in a comorbid sample after the unified protocol transdiagnostic group treatment.

    Science.gov (United States)

    de Ornelas Maia, Ana Claudia Corrêa; Sanford, Jenny; Boettcher, Hannah; Nardi, Antonio E; Barlow, David

    2017-10-01

    Patients with multiple mental disorders often experience sexual dysfunction and reduced quality of life. The unified protocol (UP) is a transdiagnostic treatment for emotional disorders that has the potential to improve quality of life and sexual functioning via improved emotion management. The present study evaluates changes in quality of life and sexual functioning in a highly comorbid sample treated with the UP in a group format. Forty-eight patients were randomly assigned to either a UP active-treatment group or a medication-only control group. Treatment was delivered in 14 sessions over the course of 4 months. Symptoms of anxiety and depression were assessed using the Beck Anxiety Inventory and Beck Depression Inventory. Sexual functioning was assessed by the Arizona Sexual Experience Scale (ASEX), and quality of life was assessed by the World Health Organization Quality of Life-BREF scale (WHOQOL-BREF). Quality of life, anxiety and depression all significantly improved among participants treated with the UP. Some improvement in sexual functioning was also noted. The results support the efficacy of the UP in improving quality of life and sexual functioning in comorbid patients. Copyright © 2017 Elsevier Ltd. All rights reserved.

  3. Self-collected versus clinician-collected sampling for sexually transmitted infections: a systematic review and meta-analysis protocol.

    Science.gov (United States)

    Taylor, Darlene; Lunny, Carole; Wong, Tom; Gilbert, Mark; Li, Neville; Lester, Richard; Krajden, Mel; Hoang, Linda; Ogilvie, Gina

    2013-10-10

    Three meta-analyses and one systematic review have been conducted on the question of whether self-collected specimens are as accurate as clinician-collected specimens for STI screening. However, these reviews predate 2007 and did not analyze rectal or pharyngeal collection sites. Currently, there is no consensus on which sampling method is the most effective for the diagnosis of genital chlamydia (CT), gonorrhea (GC) or human papillomavirus (HPV) infection. Our meta-analysis aims to be comprehensive in that it will examine the evidence of whether self-collected vaginal, urine, pharyngeal and rectal specimens provide as accurate a clinical diagnosis as clinician-collected samples (reference standard). Eligible studies include both randomized and non-randomized controlled trials, pre- and post-test designs, and controlled observational studies. The databases that will be searched include the Cochrane Database of Systematic Reviews, Web of Science, Database of Abstracts of Reviews of Effects (DARE), EMBASE and PubMed/Medline. Data will be abstracted independently by two reviewers using a standardized pre-tested data abstraction form. Heterogeneity will be assessed using the Q2 test. Sensitivity and specificity estimates with 95% confidence intervals as well as negative and positive likelihood ratios will be pooled and weighted using random effects meta-analysis, if appropriate. A hierarchical summary receiver operating characteristics curve for self-collected specimens will be generated. This synthesis involves a meta-analysis of self-collected samples (urine, vaginal, pharyngeal and rectal swabs) versus clinician-collected samples for the diagnosis of CT, GC and HPV, the most prevalent STIs. Our systematic review will allow patients, clinicians and researchers to determine the diagnostic accuracy of specimens collected by patients compared to those collected by clinicians in the detection of chlamydia, gonorrhea and HPV.
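The per-study accuracy measures this review protocol plans to pool can be derived from a 2x2 table of self-collected results against the clinician-collected reference standard. The counts below are invented for illustration.

```python
# Sketch of per-study diagnostic accuracy measures computed before pooling:
# sensitivity, specificity, and positive/negative likelihood ratios from a
# 2x2 table. Counts are invented, not data from any included study.

def diagnostic_measures(tp, fp, fn, tn):
    """Accuracy of an index test (self-collection) against a reference
    standard (clinician collection)."""
    sens = tp / (tp + fn)
    spec = tn / (tn + fp)
    return {
        "sensitivity": sens,
        "specificity": spec,
        "LR+": sens / (1 - spec),   # how much a positive raises the odds
        "LR-": (1 - sens) / spec,   # how much a negative lowers the odds
    }

m = diagnostic_measures(tp=90, fp=5, fn=10, tn=95)
print({k: round(v, 2) for k, v in m.items()})
```

Random-effects pooling then combines these per-study estimates weighted by their precision, with the confidence intervals and hierarchical summary ROC curve described in the protocol built on top of the same 2x2 inputs.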

  4. A modular method for the extraction of DNA and RNA, and the separation of DNA pools from diverse environmental sample types

    DEFF Research Database (Denmark)

    Lever, Mark; Torti, Andrea; Eickenbusch, Philip

    2015-01-01

    tests, in which permutations of all nucleic acid extraction steps were compared. The final modular protocol is suitable for extractions from igneous rock, air, water, and sediments. Sediments range from high-biomass, organic rich coastal samples to samples from the most oligotrophic region of the world...... DNA pools without cell lysis from intracellular and particle-complexed DNA pools may enable new insights into the cycling and preservation of DNA in environmental samples in the future. A general protocol is outlined, along with recommendations for optimizing this general protocol for specific sample...

  5. How to Make Nothing Out of Something: Analyses of the Impact of Study Sampling and Statistical Interpretation in Misleading Meta-Analytic Conclusions

    Directory of Open Access Journals (Sweden)

    Michael Robert Cunningham

    2016-10-01

Full Text Available The limited resource model states that self-control is governed by a relatively finite set of inner resources on which people draw when exerting willpower. Once self-control resources have been used up or depleted, they are less available for other self-control tasks, leading to a decrement in subsequent self-control success. The depletion effect has been studied for over 20 years, tested or extended in more than 600 studies, and supported in an independent meta-analysis (Hagger, Wood, Stiff, and Chatzisarantis, 2010). Meta-analyses are supposed to reduce bias in literature reviews. Carter, Kofler, Forster, and McCullough's (2015) meta-analysis, by contrast, included a series of questionable decisions involving sampling, methods, and data analysis. We provide quantitative analyses of key sampling issues: exclusion of many of the best depletion studies based on idiosyncratic criteria and the emphasis on mini meta-analyses with low statistical power as opposed to the overall depletion effect. We discuss two key methodological issues: failure to code for research quality, and the quantitative impact of weak studies by novice researchers. We discuss two key data analysis issues: questionable interpretation of the results of trim and fill and funnel plot asymmetry test procedures, and the use and misinterpretation of the untested Precision Effect Test (PET) and Precision Effect Estimate with Standard Error (PEESE) procedures. Despite these serious problems, the Carter et al. meta-analysis results actually indicate that there is a real depletion effect – contrary to their title.

  6. Cross-Layer Protocol as a Better Option in Wireless Mesh Network with Respect to Layered-Protocol

    OpenAIRE

    Ahmed Abdulwahab Al-Ahdal; Dr. V. P. Pawar; G. N. Shinde

    2014-01-01

The optimal way to improve Wireless Mesh Network (WMN) performance is to use a better network protocol, but whether a layered-protocol design or a cross-layer design is the better option to optimize protocol performance in WMNs is still an on-going research topic. In this paper, we focus on cross-layer protocols as a better option with respect to layered protocols. The layered protocol architecture (OSI) model divides networking tasks into layers and defines a set of services for each layer to b...

  7. A pilot audit of a protocol for ambulatory investigation of predicted low-risk patients with possible pulmonary embolism.

    Science.gov (United States)

    McDonald, A H; Murphy, R

    2011-09-01

Patients with possible pulmonary embolism (PE) commonly present to acute medical services. Research has led to the identification of low-risk patients suitable for ambulatory management. We report on a protocol designed to select low-risk patients for ambulatory investigation if confirmatory imaging is not available that day. The protocol was piloted in the Emergency Department and Medical Assessment Area at the Royal Infirmary of Edinburgh. We retrospectively analysed electronic patient records in an open observational audit of all patients managed in the ambulatory arm over five months of use. We analysed 45 patients' records. Of these, 91.1% required imaging to confirm or refute PE, and 62.2% received a computed tomography pulmonary angiogram (CTPA). PE was confirmed in 25% of patients, with musculoskeletal pain (22.7%) and respiratory tract infection (15.9%) the next most prevalent diagnoses. An alternative diagnosis was provided by CTPA in 32% of cases. We identified no adverse events or readmissions, but individualised follow-up was not attempted. The data from this audit suggest that this protocol can be applied to select and manage low-risk patients suitable for ambulatory investigation of possible PE. A larger prospective comparative study would be required to accurately define the safety and effectiveness of this protocol.

  8. Interlaboratory test comparison among Environmental Radioactivity Laboratories using the ISO/IUPAC/AOAC Protocol

    International Nuclear Information System (INIS)

    Romero, L.; Ramos, L.; Salas, R.

    1998-01-01

World-wide acceptance of results from radiochemical analyses requires measurements that are reliable, traceable to SI units, and comparable, particularly when data sets generated by laboratories are to contribute to the evaluation of data from environmental pollution research and monitoring programmes. The Spanish Nuclear Safety Council (CSN), in collaboration with CIEMAT, organizes periodical interlaboratory test comparisons for environmental radioactivity laboratories, aiming to provide them with the necessary means to assess the quality of their results. This paper presents data from the most recent exercise which, for the first time, was evaluated following the procedure recommended in the ISO/IUPAC/AOAC Harmonized Protocol for the proficiency testing of analytical laboratories (1). The test sample was a Reference Material provided by the IAEA-AQCS, a lake sediment containing the following radionuclides: K-40, Ra-226, Ac-228, Cs-137, Sr-90, Pu-(239+240). The results of the proficiency test were computed for the 28 participating laboratories using the z-score approach; the evaluation of the exercise is presented in the paper. The z-score classification has been demonstrated to provide laboratories with a more objective means of assessing and demonstrating the reliability of the data they produce. Analytical proficiency of the participating laboratories was found to be satisfactory in 57 to 100 percent of cases. (1) The international harmonized protocol for the proficiency testing of (chemical) analytical laboratories. Pure and Appl. Chem., Vol. 65, No. 9, pp. 2123-2144, 1993, IUPAC. GB (Author) 3 refs
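
    The z-score evaluation used in this exercise is straightforward to reproduce. The sketch below applies the Harmonized Protocol's standard performance bands (|z| ≤ 2 satisfactory, 2 < |z| ≤ 3 questionable, |z| > 3 unsatisfactory); the assigned value, target standard deviation, and laboratory results are invented for illustration:

    ```python
    def z_score(result, assigned_value, target_sd):
        """z = (x - X) / sigma, per the ISO/IUPAC/AOAC Harmonized Protocol."""
        return (result - assigned_value) / target_sd

    def classify(z):
        """Standard Harmonized Protocol performance bands."""
        az = abs(z)
        if az <= 2.0:
            return "satisfactory"
        if az <= 3.0:
            return "questionable"
        return "unsatisfactory"

    # Hypothetical lab results for one radionuclide (Bq/kg)
    assigned, sigma = 120.0, 6.0
    for lab_result in (118.0, 134.0, 141.0):
        z = z_score(lab_result, assigned, sigma)
        print(f"{lab_result:6.1f}  z = {z:+.2f}  {classify(z)}")
    ```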

  9. A quick DNA extraction protocol: Without liquid nitrogen in ambient ...

    African Journals Online (AJOL)

    Marker assisted selection is an effective technique for quality traits selection in breeding program which are impossible by visual observation. Marker assisted selection in early generation requires rapid DNA extraction protocol for large number of samples in a low cost approach. A rapid and inexpensive DNA extraction ...

  10. Standard operating procedures for collection of soil and sediment samples for the Sediment-bound Contaminant Resiliency and Response (SCoRR) strategy pilot study

    Science.gov (United States)

    Fisher, Shawn C.; Reilly, Timothy J.; Jones, Daniel K.; Benzel, William M.; Griffin, Dale W.; Loftin, Keith A.; Iwanowicz, Luke R.; Cohl, Jonathan A.

    2015-12-17

    An understanding of the effects on human and ecological health brought by major coastal storms or flooding events is typically limited because of a lack of regionally consistent baseline and trends data in locations proximal to potential contaminant sources and mitigation activities, sensitive ecosystems, and recreational facilities where exposures are probable. In an attempt to close this gap, the U.S. Geological Survey (USGS) has implemented the Sediment-bound Contaminant Resiliency and Response (SCoRR) strategy pilot study to collect regional sediment-quality data prior to and in response to future coastal storms. The standard operating procedure (SOP) detailed in this document serves as the sample-collection protocol for the SCoRR strategy by providing step-by-step instructions for site preparation, sample collection and processing, and shipping of soil and surficial sediment (for example, bed sediment, marsh sediment, or beach material). The objectives of the SCoRR strategy pilot study are (1) to create a baseline of soil-, sand-, marsh sediment-, and bed-sediment-quality data from sites located in the coastal counties from Maine to Virginia based on their potential risk of being contaminated in the event of a major coastal storm or flooding (defined as Resiliency mode); and (2) respond to major coastal storms and flooding by reoccupying select baseline sites and sampling within days of the event (defined as Response mode). For both modes, samples are collected in a consistent manner to minimize bias and maximize quality control by ensuring that all sampling personnel across the region collect, document, and process soil and sediment samples following the procedures outlined in this SOP. Samples are analyzed using four USGS-developed screening methods—inorganic geochemistry, organic geochemistry, pathogens, and biological assays—which are also outlined in this SOP. Because the SCoRR strategy employs a multi-metric approach for sample analyses, this

  11. Evaluating Protocol Lifecycle Time Intervals in HIV/AIDS Clinical Trials

    Science.gov (United States)

    Schouten, Jeffrey T.; Dixon, Dennis; Varghese, Suresh; Cope, Marie T.; Marci, Joe; Kagan, Jonathan M.

    2014-01-01

    present for a specific study phase may have been masked by combining protocols into phase groupings. Presence of informative censoring, such as withdrawal of some protocols from development if they began showing signs of lost interest among investigators, complicates interpretation of Kaplan-Meier estimates. Because this study constitutes a retrospective examination over an extended period of time, it does not allow for the precise identification of relative factors impacting timing. Conclusions Delays not only increase the time and cost to complete clinical trials, but they also diminish their usefulness by failing to answer research questions in time. We believe that research analyzing the time spent traversing defined intervals across the clinical trial protocol development and implementation continuum can stimulate business process analyses and reengineering efforts that could lead to reductions in the time from clinical trial concept to results, thereby accelerating progress in clinical research. PMID:24980279
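
    The Kaplan-Meier estimates discussed above come from the standard product-limit construction, in which protocols that leave observation (e.g. are withdrawn from development) are right-censored. A minimal sketch with invented interval durations:

    ```python
    def kaplan_meier(times, events):
        """Product-limit survival estimate; events[i] is False for
        right-censored observations (e.g. protocols withdrawn)."""
        data = sorted(zip(times, events))
        n_at_risk = len(data)
        s, curve = 1.0, []
        i = 0
        while i < len(data):
            t = data[i][0]
            d = sum(1 for tt, e in data if tt == t and e)  # events at t
            c = sum(1 for tt, e in data if tt == t)        # all leaving at t
            if d:
                s *= 1.0 - d / n_at_risk
                curve.append((t, s))
            n_at_risk -= c
            i += c
        return curve

    # Hypothetical days from concept to protocol approval; False = censored
    days = [120, 200, 200, 340, 400, 510]
    events = [True, True, False, True, False, True]
    curve = kaplan_meier(days, events)
    ```

    As the abstract notes, this estimator is only unbiased if censoring is uninformative; protocols withdrawn because interest was waning would violate that assumption.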

  12. Metabolic demands and replenishment of muscle glycogen after a rugby league match simulation protocol.

    Science.gov (United States)

    Bradley, Warren J; Hannon, Marcus P; Benford, Victoria; Morehen, James C; Twist, Craig; Shepherd, Sam; Cocks, Matthew; Impey, Samuel G; Cooper, Robert G; Morton, James P; Close, Graeme L

    2017-09-01

The metabolic requirements of a rugby league match simulation protocol and the timing of carbohydrate provision on glycogen re-synthesis in damaged muscle were examined. Fifteen (mean±SD: age 20.9±2.9 years, body mass 87.3±14.1 kg, height 177.4±6.0 cm) rugby league (RL) players consumed a 6 g·kg⁻¹·day⁻¹ CHO diet for 7 days, completed a time to exhaustion test (TTE) and a glycogen depletion protocol on day 3, a RL simulated-match protocol (RLMSP) on day 5 and a TTE on day 7. Players were prescribed an immediate or delayed (2-h post) re-feed post-simulation. Muscle biopsies and blood samples were obtained post-depletion, before and after simulated match-play, and 48 h after match-play, with PlayerLoad and heart-rate collected throughout the simulation. Data were analysed using effect sizes ±90% CI and magnitude-based inferences. PlayerLoad (8.0±0.7 AU·min⁻¹) and %HRpeak (83±4.9%) during the simulation were similar to values reported for RL match-play. Muscle glycogen very likely increased from immediately after to 48 h post-simulation (272±97 cf. 416±162 mmol·kg⁻¹ d.w.; ES ±90% CI) after the immediate re-feed, but changes were unclear (283±68 cf. 361±144 mmol·kg⁻¹ d.w.; ES ±90% CI) after the delayed re-feed. CK almost certainly increased by 77.9±25.4% (0.75±0.19) post-simulation for all players. The RLMSP presents a replication of the internal loads associated with professional RL match-play, although difficulties in replicating the collision reduced the metabolic demands and glycogen utilisation. Further, it is possible to replete muscle glycogen in damaged muscle employing an immediate re-feed strategy. Copyright © 2017 Sports Medicine Australia. Published by Elsevier Ltd. All rights reserved.

  13. Procedures for field chemical analyses of water samples

    International Nuclear Information System (INIS)

    Korte, N.; Ealey, D.

    1983-12-01

    A successful water-quality monitoring program requires a clear understanding of appropriate measurement procedures in order to obtain reliable field data. It is imperative that the responsible personnel have a thorough knowledge of the limitations of the techniques being used. Unfortunately, there is a belief that field analyses are simple and straightforward. Yet, significant controversy as well as misuse of common measurement techniques abounds. This document describes procedures for field measurements of pH, carbonate and bicarbonate, specific conductance, dissolved oxygen, nitrate, Eh, and uranium. Each procedure section includes an extensive discussion regarding the limitations of the method as well as brief discussions of calibration procedures and available equipment. A key feature of these procedures is the consideration given to the ultimate use of the data. For example, if the data are to be used for geochemical modeling, more precautions are needed. In contrast, routine monitoring conducted merely to recognize gross changes can be accomplished with less effort. Finally, quality assurance documentation for each measurement is addressed in detail. Particular attention is given to recording sufficient information such that decisions concerning the quality of the data can be easily made. Application of the procedures and recommendations presented in this document should result in a uniform and credible water-quality monitoring program. 22 references, 4 figures, 3 tables
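
    One of the corrections such field procedures routinely prescribe is normalizing specific conductance to 25 °C. A common linear compensation is sketched below; the ~2 %/°C coefficient is an assumption that should match the meter and procedure actually in use:

    ```python
    def conductance_at_25c(measured_us_cm, temp_c, coeff=0.02):
        """Linear temperature compensation for field specific-conductance
        readings (uS/cm). The default coefficient (~2 %/deg C) is a common
        assumption, not a universal constant."""
        return measured_us_cm / (1.0 + coeff * (temp_c - 25.0))

    # A hypothetical 450 uS/cm reading taken at 18 degrees C, normalized to 25 C
    ec25 = conductance_at_25c(450.0, 18.0)
    ```

    As the document stresses, the precision demanded of such corrections depends on the ultimate use of the data: geochemical modeling warrants more care than gross-change monitoring.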

  14. [Multidisciplinary protocol for computed tomography imaging and angiographic embolization of splenic injury due to trauma: assessment of pre-protocol and post-protocol outcomes].

    Science.gov (United States)

    Koo, M; Sabaté, A; Magalló, P; García, M A; Domínguez, J; de Lama, M E; López, S

    2011-11-01

    To assess conservative treatment of splenic injury due to trauma, following a protocol for computed tomography (CT) and angiographic embolization. To quantify the predictive value of CT for detecting bleeding and need for embolization. The care protocol developed by the multidisciplinary team consisted of angiography with embolization of lesions revealed by contrast extravasation under CT as well as embolization of grade III-V injuries observed, or grade I-II injuries causing hemodynamic instability and/or need for blood transfusion. We collected data on demographic variables, injury severity score (ISS), angiographic findings, and injuries revealed by CT. Pre-protocol and post-protocol outcomes were compared. The sensitivity and specificity of CT findings were calculated for all patients who required angiographic embolization. Forty-four and 30 angiographies were performed in the pre- and post-protocol periods, respectively. The mean (SD) ISSs in the two periods were 25 (11) and 26 (12), respectively. A total of 24 (54%) embolizations were performed in the pre-protocol period and 28 (98%) after implementation of the protocol. Two and 7 embolizations involved the spleen in the 2 periods, respectively; abdominal laparotomies numbered 32 and 25, respectively, and 10 (31%) vs 4 (16%) splenectomies were performed. The specificity and sensitivity values for contrast extravasation found on CT and followed by embolization were 77.7% and 79.5%. The implementation of this multidisciplinary protocol using CT imaging and angiographic embolization led to a decrease in the number of splenectomies. The protocol allows us to take a more conservative treatment approach.
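
    The reported sensitivity and specificity of CT contrast extravasation come from an ordinary 2×2 calculation against the reference outcome (need for embolization). A minimal sketch, with hypothetical counts chosen only so the proportions land near the reported 79.5%/77.7%:

    ```python
    def diagnostic_accuracy(tp, fp, fn, tn):
        """Sensitivity and specificity of a screening finding (here,
        contrast extravasation on CT) against a reference outcome
        (need for angiographic embolization)."""
        sensitivity = tp / (tp + fn)
        specificity = tn / (tn + fp)
        return sensitivity, specificity

    # Hypothetical 2x2 counts, for illustration only
    sens, spec = diagnostic_accuracy(tp=35, fp=6, fn=9, tn=21)
    ```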

  15. Reliability of the k{sub 0}-standardization method using geological sample analysed in a proficiency test

    Energy Technology Data Exchange (ETDEWEB)

    Pelaes, Ana Clara O.; Menezes, Maria Ângela de B.C., E-mail: anacpelaes@gmail.com, E-mail: menezes@cdtn.br [Centro de Desenvolvimento da Tecnologia Nuclear (CDTN/CNEN-MG), Belo Horizonte, MG (Brazil)

    2017-11-01

    The Neutron Activation Analysis (NAA) is an analytical technique to determine the elemental chemical composition in samples of several matrices, that has been applied by the Laboratory for Neutron Activation Analysis, located at Centro de Desenvolvimento da Tecnologia Nuclear /Comissão Nacional de Energia Nuclear (Nuclear Technology Development Center/Brazilian Commission for Nuclear Energy), CDTN/CNEN, since the starting up of the TRIGA MARK I IPR-R1 reactor, in 1960. Among the methods of application of the technique, the k{sub 0}-standardization method, which was established at CDTN in 1995, is the most efficient and in 2003 it was reestablished and optimized. In order to verify the reproducibility of the results generated by the application of the k{sub 0}-standardization method at CDTN, aliquots of a geological sample sent by WEPAL (Wageningen Evaluating Programs for Analytical Laboratories) were analysed and its results were compared with the results obtained through the Intercomparison of Results organized by the International Atomic Energy Agency in 2015. WEPAL is an accredited institution for the organisation of interlaboratory studies, preparing and organizing proficiency testing schemes all over the world. Therefore, the comparison with the results provided aims to contribute to the continuous improvement of the quality of the results obtained by the CDTN. The objective of this study was to verify the reliability of the method applied two years after the intercomparison round. (author)

  16. Cryptographic Protocols:

    DEFF Research Database (Denmark)

    Geisler, Martin Joakim Bittel

    cryptography was thus concerned with message confidentiality and integrity. Modern cryptography cover a much wider range of subjects including the area of secure multiparty computation, which will be the main topic of this dissertation. Our first contribution is a new protocol for secure comparison, presented...... implemented the comparison protocol in Java and benchmarks show that is it highly competitive and practical. The biggest contribution of this dissertation is a general framework for secure multiparty computation. Instead of making new ad hoc implementations for each protocol, we want a single and extensible...... in Chapter 2. Comparisons play a key role in many systems such as online auctions and benchmarks — it is not unreasonable to say that when parties come together for a multiparty computation, it is because they want to make decisions that depend on private information. Decisions depend on comparisons. We have...

  17. Low incidence of clonality in cold water corals revealed through the novel use of a standardized protocol adapted to deep sea sampling

    Science.gov (United States)

    Becheler, Ronan; Cassone, Anne-Laure; Noël, Philippe; Mouchel, Olivier; Morrison, Cheryl L.; Arnaud-Haond, Sophie

    2017-11-01

    Sampling in the deep sea is a technical challenge, which has hindered the acquisition of robust datasets that are necessary to determine the fine-grained biological patterns and processes that may shape genetic diversity. Estimates of the extent of clonality in deep-sea species, despite the importance of clonality in shaping the local dynamics and evolutionary trajectories, have been largely obscured by such limitations. Cold-water coral reefs along European margins are formed mainly by two reef-building species, Lophelia pertusa and Madrepora oculata. Here we present a fine-grained analysis of the genotypic and genetic composition of reefs occurring in the Bay of Biscay, based on an innovative deep-sea sampling protocol. This strategy was designed to be standardized, random, and allowed the georeferencing of all sampled colonies. Clonal lineages discriminated through their Multi-Locus Genotypes (MLG) at 6-7 microsatellite markers could thus be mapped to assess the level of clonality and the spatial spread of clonal lineages. High values of clonal richness were observed for both species across all sites suggesting a limited occurrence of clonality, which likely originated through fragmentation. Additionally, spatial autocorrelation analysis underlined the possible occurrence of fine-grained genetic structure in several populations of both L. pertusa and M. oculata. The two cold-water coral species examined had contrasting patterns of connectivity among canyons, with among-canyon genetic structuring detected in M. oculata, whereas L. pertusa was panmictic at the canyon scale. This study exemplifies that a standardized, random and georeferenced sampling strategy, while challenging, can be applied in the deep sea, and associated benefits outlined here include improved estimates of fine grained patterns of clonality and dispersal that are comparable across sites and among species.
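
    The "clonal richness" reported for each site is conventionally computed as R = (G − 1)/(N − 1) from the multi-locus genotypes; values near 1 indicate little clonality, as observed here. A minimal sketch with invented MLGs (real analyses would use 6-7 microsatellite loci per colony):

    ```python
    def clonal_richness(mlgs):
        """Clonal richness R = (G - 1) / (N - 1), where N is the number of
        sampled colonies and G the number of distinct multi-locus genotypes
        (R = 1: every colony unique; R = 0: a single clone)."""
        n = len(mlgs)
        g = len(set(mlgs))
        return (g - 1) / (n - 1)

    # Hypothetical two-locus MLGs (tuples of alleles) for ten colonies
    colonies = [
        ("A", "B"), ("A", "B"),  # one clonal pair, e.g. from fragmentation
        ("A", "C"), ("B", "B"), ("C", "C"),
        ("A", "D"), ("B", "D"), ("C", "D"), ("D", "D"), ("B", "C"),
    ]
    r = clonal_richness(colonies)
    ```

    Mapping each genotype to its georeferenced position, as the sampling protocol allows, then gives the spatial spread of each clonal lineage.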

  18. Evolution of the Lunar Receiving Laboratory to the Astromaterial Sample Curation Facility: Technical Tensions Between Containment and Cleanliness, Between Particulate and Organic Cleanliness

    Science.gov (United States)

    Allton, J. H.; Zeigler, R. A.; Calaway, M. J.

    2016-01-01

The Lunar Receiving Laboratory (LRL) was planned and constructed in the 1960s to support the Apollo program in the context of landing on the Moon and safely returning humans. The enduring science return from that effort is a result of careful curation of planetary materials. Technical decisions for the first facility included sample handling environment (vacuum vs inert gas), and instruments for making basic sample assessment, but the most difficult decision, and most visible, was stringent biosafety vs ultra-clean sample handling. Biosafety required handling of samples in negative pressure gloveboxes and rooms for containment and use of sterilizing protocols and animal/plant models for hazard assessment. Ultra-clean sample handling worked best in positive pressure nitrogen environment gloveboxes in positive pressure rooms, using cleanable tools of tightly controlled composition. The requirements for these two objectives were so different, that the solution was to design and build a new facility for specific purpose of preserving the scientific integrity of the samples. The resulting Lunar Curatorial Facility was designed and constructed, from 1972-1979, with advice and oversight by a very active committee comprised of lunar sample scientists. The high precision analyses required for planetary science are enabled by stringent contamination control of trace elements in the materials and protocols of construction (e.g., trace element screening for paint and flooring materials) and the equipment used in sample handling and storage. As other astromaterials, especially small particles and atoms, were added to the collections curated, the technical tension between particulate cleanliness and organic cleanliness was addressed in more detail. Techniques for minimizing particulate contamination in sample handling environments use high efficiency air filtering techniques typically requiring organic sealants which offgas. Protocols for reducing adventitious carbon on sample

  19. Thermal/optical methods for elemental carbon quantification in soils and urban dusts: equivalence of different analysis protocols.

    Directory of Open Access Journals (Sweden)

    Yongming Han

Full Text Available Quantifying elemental carbon (EC) content in geological samples is challenging due to interferences of crustal, salt, and organic material. Thermal/optical analysis, combined with acid pretreatment, represents a feasible approach. However, the consistency of various thermal/optical analysis protocols for this type of samples has never been examined. In this study, urban street dust and soil samples from Baoji, China were pretreated with acids and analyzed with four thermal/optical protocols to investigate how analytical conditions and optical correction affect EC measurement. The EC values measured with reflectance correction (ECR) were found always higher and less sensitive to temperature program than the EC values measured with transmittance correction (ECT). A high-temperature method with extended heating times (STN120) showed the highest ECT/ECR ratio (0.86) while a low-temperature protocol (IMPROVE-550), with heating time adjusted for sample loading, showed the lowest (0.53). STN ECT was higher than IMPROVE ECT, in contrast to results from aerosol samples. A higher peak inert-mode temperature and extended heating times can elevate ECT/ECR ratios for pretreated geological samples by promoting pyrolyzed organic carbon (PyOC) removal over EC under trace levels of oxygen. Considering that PyOC within filter increases ECR while decreases ECT from the actual EC levels, simultaneous ECR and ECT measurements would constrain the range of EC loading and provide information on method performance. Further testing with standard reference materials of common environmental matrices supports the findings. Char and soot fractions of EC can be further separated using the IMPROVE protocol. The char/soot ratio was lower in street dusts (2.2 on average) than in soils (5.2 on average), most likely reflecting motor vehicle emissions. The soot concentrations agreed with EC from CTO-375, a pure thermal method.

  20. Thermal/optical methods for elemental carbon quantification in soils and urban dusts: equivalence of different analysis protocols.

    Science.gov (United States)

    Han, Yongming; Chen, Antony; Cao, Junji; Fung, Kochy; Ho, Fai; Yan, Beizhan; Zhan, Changlin; Liu, Suixin; Wei, Chong; An, Zhisheng

    2013-01-01

    Quantifying elemental carbon (EC) content in geological samples is challenging due to interferences of crustal, salt, and organic material. Thermal/optical analysis, combined with acid pretreatment, represents a feasible approach. However, the consistency of various thermal/optical analysis protocols for this type of samples has never been examined. In this study, urban street dust and soil samples from Baoji, China were pretreated with acids and analyzed with four thermal/optical protocols to investigate how analytical conditions and optical correction affect EC measurement. The EC values measured with reflectance correction (ECR) were found always higher and less sensitive to temperature program than the EC values measured with transmittance correction (ECT). A high-temperature method with extended heating times (STN120) showed the highest ECT/ECR ratio (0.86) while a low-temperature protocol (IMPROVE-550), with heating time adjusted for sample loading, showed the lowest (0.53). STN ECT was higher than IMPROVE ECT, in contrast to results from aerosol samples. A higher peak inert-mode temperature and extended heating times can elevate ECT/ECR ratios for pretreated geological samples by promoting pyrolyzed organic carbon (PyOC) removal over EC under trace levels of oxygen. Considering that PyOC within filter increases ECR while decreases ECT from the actual EC levels, simultaneous ECR and ECT measurements would constrain the range of EC loading and provide information on method performance. Further testing with standard reference materials of common environmental matrices supports the findings. Char and soot fractions of EC can be further separated using the IMPROVE protocol. The char/soot ratio was lower in street dusts (2.2 on average) than in soils (5.2 on average), most likely reflecting motor vehicle emissions. The soot concentrations agreed with EC from CTO-375, a pure thermal method.
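
    The char/soot separation mentioned above is, in this literature, commonly defined from the IMPROVE thermal fractions as char-EC = EC1 − OP (pyrolysis-corrected low-temperature EC) and soot-EC = EC2 + EC3; that convention, and the fraction values below, are assumptions for illustration:

    ```python
    def char_soot_split(ec1, ec2, ec3, op):
        """Char/soot apportionment from IMPROVE thermal fractions,
        assuming the common convention char-EC = EC1 - OP and
        soot-EC = EC2 + EC3 (all values in ug C)."""
        char = ec1 - op
        soot = ec2 + ec3
        ratio = char / soot if soot else float("inf")
        return char, soot, ratio

    # Hypothetical thermal fractions for a street-dust sample
    char, soot, ratio = char_soot_split(ec1=8.4, ec2=2.1, ec3=0.6, op=2.3)
    ```

    On these invented numbers the char/soot ratio comes out near the ~2.2 average the study reports for street dusts, versus ~5.2 for soils.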

  1. A METHOD FOR PREPARING A SUBSTRATE BY APPLYING A SAMPLE TO BE ANALYSED

    DEFF Research Database (Denmark)

    2017-01-01

    The invention relates to a method for preparing a substrate (105a) comprising a sample reception area (110) and a sensing area (111). The method comprises the steps of: 1) applying a sample on the sample reception area; 2) rotating the substrate around a predetermined axis; 3) during rotation......, at least part of the liquid travels from the sample reception area to the sensing area due to capillary forces acting between the liquid and the substrate; and 4) removing the wave of particles and liquid formed at one end of the substrate. The sensing area is closer to the predetermined axis than...... the sample reception area. The sample comprises a liquid part and particles suspended therein....

  2. Sampling and analyses report for December 1992 semiannual postburn sampling at the RMI UCG Site, Hanna, Wyoming

    International Nuclear Information System (INIS)

    Lindblom, S.R.

    1993-03-01

During December 1992, groundwater was sampled at the site of the November 1987--February 1988 Rocky Mountain 1 underground coal gasification test near Hanna, Wyoming. The groundwater was in near-baseline condition. Data from the field measurements and analyses of samples are presented. Benzene concentrations in the groundwater are below analytical detection limits (<0.01 mg/L) for all wells, except for concentrations of 0.016 mg/L and 0.013 mg/L in coal seam wells EMW-3 and EMW-1, respectively

  3. Stream Control Transmission Protocol as a Transport for SIP: a case study

    Directory of Open Access Journals (Sweden)

    Giuseppe De Marco

    2004-06-01

Full Text Available The dominant signalling protocol in both future wireless and wired networks will be the Session Initiation Protocol (SIP), as pointed out in the 3G IP-based mobile network specifications, entailing a fully Internet-integrated network. The use of SIP in the IP Multimedia Subsystem (IMS) of Release 5 involves the development of servers capable of handling a large number of call requests. The signaling traffic associated with such requests could explode if an intelligent congestion control were not introduced. The Stream Control Transmission Protocol (SCTP) was born to support transport of SS7 signaling messages. However, many of the SCTP features are also useful for transport of SIP messages, such as its congestion control mechanism, good separation among independent messages, and multihoming. Indeed, adoption of SCTP as a transport for SIP signaling might prove useful in situations where the usual transport protocols, like TCP and UDP, suffer performance degradation. In this paper, we analyse the general framework wherein SIP operates and discuss the benefits of using SCTP as a transport for SIP, toward fair sharing of network resources. This study is carried out in the context of the implementation of a high-performance SIP Proxy Server. We also present some preliminary results of an implementation of SIP over SCTP/UDP in a real LAN environment.

  4. Fast multichannel analyser

    Energy Technology Data Exchange (ETDEWEB)

    Berry, A; Przybylski, M M; Sumner, I [Science Research Council, Daresbury (UK). Daresbury Lab.

    1982-10-01

A fast multichannel analyser (MCA) capable of sampling at a rate of 10⁷ s⁻¹ has been developed. The instrument is based on an 8-bit parallel-encoding analogue-to-digital converter (ADC) reading into a fast histogramming random access memory (RAM) system, giving 256 channels of 64 k count capacity. The prototype unit is in CAMAC format.
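
    The histogramming-RAM scheme described above amounts to a read-modify-write of the count stored at the address given by each ADC code. A software sketch of that accumulation (the sample values are invented):

    ```python
    def histogram_adc(samples, bits=8):
        """Accumulate ADC codes into a 2**bits-channel histogram, mimicking
        a fast MCA's histogramming RAM: each code addresses one channel
        whose count is incremented."""
        channels = [0] * (1 << bits)
        mask = (1 << bits) - 1
        for code in samples:
            channels[code & mask] += 1
        return channels

    # Hypothetical pulse-height codes from an 8-bit parallel ADC
    spectrum = histogram_adc([17, 17, 200, 17, 255, 0])
    ```

    In the hardware instrument this increment happens at the 10⁷ s⁻¹ sample rate, which is what demands a fast RAM rather than a processor in the loop.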

  5. Disagreements in meta-analyses using outcomes measured on continuous or rating scales: observer agreement study

    DEFF Research Database (Denmark)

    Tendal, Britta; Higgins, Julian P T; Jüni, Peter

    2009-01-01

    difference (SMD), the protocols for the reviews and the trial reports (n=45) were retrieved. DATA EXTRACTION: Five experienced methodologists and five PhD students independently extracted data from the trial reports for calculation of the first SMD result in each review. The observers did not have access...... to the reviews but to the protocols, where the relevant outcome was highlighted. The agreement was analysed at both trial and meta-analysis level, pairing the observers in all possible ways (45 pairs, yielding 2025 pairs of trials and 450 pairs of meta-analyses). Agreement was defined as SMDs that differed less...... than 0.1 in their point estimates or confidence intervals. RESULTS: The agreement was 53% at trial level and 31% at meta-analysis level. Including all pairs, the median disagreement was SMD=0.22 (interquartile range 0.07-0.61). The experts agreed somewhat more than the PhD students at trial level (61...

  6. The importance of cooling of urine samples for doping analysis

    NARCIS (Netherlands)

    Kuenen, J. Gijs; Konings, Wil N.

    Storing and transporting of urine samples for doping analysis, as performed by the anti-doping organizations associated with the World Anti-Doping Agency, does not include a specific protocol for cooled transport from the place of urine sampling to the doping laboratory, although low cost cooling

  7. Multimode Communication Protocols Enabling Reconfigurable Radios

    Directory of Open Access Journals (Sweden)

    Berlemann Lars

    2005-01-01

Full Text Available This paper focuses on the realization and application of a generic protocol stack for reconfigurable wireless communication systems. This focus extends the field of software-defined radios, which usually concentrates on the physical layer. The generic protocol stack comprises common protocol functionality and behavior which are extended through specific parts of the targeted radio access technology. This paper considers parameterizable modules of basic protocol functions residing in the data link layer of the ISO/OSI model. System-specific functionality of the protocol software is realized through adequate parameterization and composition of the generic modules. The generic protocol stack allows an efficient realization of reconfigurable protocol software and enables a completely reconfigurable wireless communication system. It is a first step from preinstalled modes realized side by side in a terminal towards a dynamically reconfigurable any-mode terminal. The presented modules of the generic protocol stack can also be regarded as a toolbox for the accelerated and cost-efficient development of future communication protocols.
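The idea of composing parameterized generic modules into a system-specific stack can be illustrated with a toy sketch (all class names and the header-based framing are hypothetical, chosen only to show the parameterization-and-composition pattern, not the paper's actual module set):

```python
class Framer:
    """Generic framing module; the header format is a parameter."""
    def __init__(self, header: bytes):
        self.header = header

    def frame(self, payload: bytes) -> bytes:
        return self.header + payload

    def deframe(self, pdu: bytes) -> bytes:
        assert pdu.startswith(self.header), "malformed PDU"
        return pdu[len(self.header):]

class Stack:
    """Composes parameterized generic modules into one mode-specific stack."""
    def __init__(self, *layers):
        self.layers = layers

    def send(self, sdu: bytes) -> bytes:
        for layer in self.layers:       # wrap top-down
            sdu = layer.frame(sdu)
        return sdu

    def receive(self, pdu: bytes) -> bytes:
        for layer in reversed(self.layers):  # unwrap bottom-up
            pdu = layer.deframe(pdu)
        return pdu

# "Reconfiguration" amounts to re-instantiating the same generic
# modules with new parameters for another radio access technology.
wlan_like = Stack(Framer(b"LLC:"), Framer(b"MAC:"))
pdu = wlan_like.send(b"hello")
print(pdu)                      # b'MAC:LLC:hello'
print(wlan_like.receive(pdu))   # b'hello'
```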

  8. Ancestors protocol for scalable key management

    Directory of Open Access Journals (Sweden)

    Dieter Gollmann

    2010-06-01

Full Text Available Group key management is an important functional building block for any secure multicast architecture and has therefore been extensively studied in the literature. The main proposed protocol is Adaptive Clustering for Scalable Group Key Management (ASGK). According to the ASGK protocol, the multicast group is divided into clusters, where each cluster consists of areas of members. Each cluster uses its own Traffic Encryption Key (TEK). These clusters are updated periodically depending on the dynamism of the members during the secure session. A modified protocol has been proposed based on ASGK, with changes that balance the number of affected members and the encryption/decryption overhead for any number of areas when a member joins or leaves the group. This modified protocol is called the Ancestors protocol. According to the Ancestors protocol, every area receives the dynamism of the members from its parents. The main objective of the modified protocol is to reduce the number of members affected when members leave or join, thereby reducing the 1-affects-n overhead. A comparative study between the ASGK protocol and the modified protocol found that the modified protocol consistently outperforms ASGK.
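The benefit of cluster-local TEKs, namely that a member leaving rekeys only its own cluster rather than the whole group, can be sketched as follows (a deliberately simplified illustration with invented names; it omits the area hierarchy and parent propagation of the actual Ancestors protocol):

```python
import secrets

class Cluster:
    """One cluster with its own Traffic Encryption Key (TEK)."""
    def __init__(self, members):
        self.members = set(members)
        self.tek = secrets.token_hex(16)

    def rekey(self):
        self.tek = secrets.token_hex(16)
        return len(self.members)  # members that must install the new TEK

class Group:
    """Clustered group keying: a leave rekeys one cluster, not the group."""
    def __init__(self, clusters):
        self.clusters = clusters

    def leave(self, member):
        for cluster in self.clusters:
            if member in cluster.members:
                cluster.members.remove(member)
                return cluster.rekey()
        raise KeyError(member)

g = Group([Cluster(["a", "b", "c"]), Cluster(["d", "e"])])
affected = g.leave("b")
print(affected)  # 2 — only the leaver's cluster, versus 4 for a flat group rekey
```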

  9. Security and SCADA protocols

    International Nuclear Information System (INIS)

    Igure, V. M.; Williams, R. D.

    2006-01-01

    Supervisory control and data acquisition (SCADA) networks have replaced discrete wiring for many industrial processes, and the efficiency of the network alternative suggests a trend toward more SCADA networks in the future. This paper broadly considers SCADA to include distributed control systems (DCS) and digital control systems. These networks offer many advantages, but they also introduce potential vulnerabilities that can be exploited by adversaries. Inter-connectivity exposes SCADA networks to many of the same threats that face the public internet and many of the established defenses therefore show promise if adapted to the SCADA differences. This paper provides an overview of security issues in SCADA networks and ongoing efforts to improve the security of these networks. Initially, a few samples from the range of threats to SCADA network security are offered. Next, attention is focused on security assessment of SCADA communication protocols. Three challenges must be addressed to strengthen SCADA networks. Access control mechanisms need to be introduced or strengthened, improvements are needed inside of the network to enhance security and network monitoring, and SCADA security management improvements and policies are needed. This paper discusses each of these challenges. This paper uses the Profibus protocol as an example to illustrate some of the vulnerabilities that arise within SCADA networks. The example Profibus security assessment establishes a network model and an attacker model before proceeding to a list of example attacks. (authors)

  10. Research-Grade 3D Virtual Astromaterials Samples: Novel Visualization of NASA's Apollo Lunar Samples and Antarctic Meteorite Samples to Benefit Curation, Research, and Education

    Science.gov (United States)

    Blumenfeld, E. H.; Evans, C. A.; Oshel, E. R.; Liddle, D. A.; Beaulieu, K. R.; Zeigler, R. A.; Righter, K.; Hanna, R. D.; Ketcham, R. A.

    2017-01-01

    NASA's vast and growing collections of astromaterials are both scientifically and culturally significant, requiring unique preservation strategies that need to be recurrently updated to contemporary technological capabilities and increasing accessibility demands. New technologies have made it possible to advance documentation and visualization practices that can enhance conservation and curation protocols for NASA's Astromaterials Collections. Our interdisciplinary team has developed a method to create 3D Virtual Astromaterials Samples (VAS) of the existing collections of Apollo Lunar Samples and Antarctic Meteorites. Research-grade 3D VAS will virtually put these samples in the hands of researchers and educators worldwide, increasing accessibility and visibility of these significant collections. With new sample return missions on the horizon, it is of primary importance to develop advanced curation standards for documentation and visualization methodologies.

  11. Evaluating Psychometric Characteristics of Detection Protocol of Malingering Stuttering

    Directory of Open Access Journals (Sweden)

    Arsia Thaghva

    2017-07-01

    Conclusion According to the results, the detection protocol of malingering stuttering is of good internal consistency and concurrent validity. However, considering that the sample population was not large in the present study, it can be said that this study is a preliminary evaluation to find the psychometric features of the instruments, with the aim of laying the groundwork for further studies.

  12. 33 CFR 159.317 - Sampling and reporting.

    Science.gov (United States)

    2010-07-01

    ... Section 159.317 Navigation and Navigable Waters COAST GUARD, DEPARTMENT OF HOMELAND SECURITY (CONTINUED... Operations § 159.317 Sampling and reporting. (a) The owner, operator, master or other person in charge of a... protocols, including chain of custody; (2) Laboratory analytical information including methods used...

  13. Evaluation of Extraction Protocols for Simultaneous Polar and Non-Polar Yeast Metabolite Analysis Using Multivariate Projection Methods

    Directory of Open Access Journals (Sweden)

    Nicolas P. Tambellini

    2013-07-01

Full Text Available Metabolomic and lipidomic approaches aim to measure metabolites or lipids in the cell. Metabolite extraction is a key step in obtaining useful and reliable data for successful metabolite studies. Significant efforts have been made to identify the optimal extraction protocol for various platforms and biological systems, for both polar and non-polar metabolites. Here we report an approach utilizing chemoinformatics for the systematic comparison of protocols that extract both classes from a single sample of the model yeast organism Saccharomyces cerevisiae. Three chloroform/methanol/water partitioning-based extraction protocols found in the literature were evaluated for their effectiveness at reproducibly extracting both polar and non-polar metabolites. Fatty acid methyl esters and methoxyamine/trimethylsilyl-derivatized aqueous compounds were analyzed by gas chromatography mass spectrometry to evaluate non-polar or polar metabolite analysis. The comparative breadth and amount of recovered metabolites was evaluated using multivariate projection methods. This approach identified an optimal protocol that recovered 64 identified polar metabolites from 105 ion hits and 12 fatty acids, and will potentially attenuate the error and variation associated with combining metabolite profiles from different samples for untargeted analysis with both polar and non-polar analytes. It also confirmed the value of using multivariate projection methods to compare established extraction protocols.
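Multivariate projection of recovery data, as used above to compare protocols, can be sketched with a small principal component analysis via SVD (toy numbers, not the paper's data; numpy is assumed to be available):

```python
import numpy as np

def pca_scores(X, n_components=2):
    """Project samples onto principal components via SVD of the
    mean-centred data matrix (rows = extraction replicates,
    columns = metabolite recoveries)."""
    Xc = X - X.mean(axis=0)
    U, S, Vt = np.linalg.svd(Xc, full_matrices=False)
    return Xc @ Vt[:n_components].T

# Toy data: replicates of two protocols measured on four metabolites.
X = np.array([
    [1.0, 0.9, 0.2, 0.1],   # protocol A, replicate 1
    [1.1, 1.0, 0.1, 0.2],   # protocol A, replicate 2
    [0.2, 0.1, 1.0, 0.9],   # protocol B, replicate 1
    [0.1, 0.2, 0.9, 1.0],   # protocol B, replicate 2
])
scores = pca_scores(X)
# Replicates of the same protocol cluster on PC1, different protocols separate:
print(np.sign(scores[0, 0]) == np.sign(scores[1, 0]))  # True
print(np.sign(scores[0, 0]) != np.sign(scores[2, 0]))  # True
```

In the study itself, projection scores like these are what reveal which protocol gives reproducible, well-separated metabolite recoveries.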

  14. Recommendations for sampling for prevention of hazards in civil defense. On analytics of chemical, biological and radioactive contaminations. Brief instruction for the CBRN (chemical, biological, radioactive, nuclear) sampling

    International Nuclear Information System (INIS)

    Bachmann, Udo; Biederbick, Walter; Derakshani, Nahid

    2010-01-01

    The recommendation for sampling for prevention of hazards in civil defense is describing the analytics of chemical, biological and radioactive contaminations and includes detail information on the sampling, protocol preparation and documentation procedures. The volume includes a separate brief instruction for the CBRN (chemical, biological, radioactive, nuclear) sampling.

  15. Comparative biochemical analyses of venous blood and peritoneal fluid from horses with colic using a portable analyser and an in-house analyser.

    Science.gov (United States)

    Saulez, M N; Cebra, C K; Dailey, M

    2005-08-20

    Fifty-six horses with colic were examined over a period of three months. The concentrations of glucose, lactate, sodium, potassium and chloride, and the pH of samples of blood and peritoneal fluid, were determined with a portable clinical analyser and with an in-house analyser and the results were compared. Compared with the in-house analyser, the portable analyser gave higher pH values for blood and peritoneal fluid with greater variability in the alkaline range, and lower pH values in the acidic range, lower concentrations of glucose in the range below 8.3 mmol/l, and lower concentrations of lactate in venous blood in the range below 5 mmol/l and in peritoneal fluid in the range below 2 mmol/l, with less variability. On average, the portable analyser underestimated the concentrations of lactate and glucose in peritoneal fluid in comparison with the in-house analyser. Its measurements of the concentrations of sodium and chloride in peritoneal fluid had a higher bias and were more variable than the measurements in venous blood, and its measurements of potassium in venous blood and peritoneal fluid had a smaller bias and less variability than the measurements made with the in-house analyser.
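Method-comparison studies of this kind typically quantify the bias and variability between two instruments; a common approach is Bland–Altman analysis, sketched here on hypothetical paired lactate readings (an illustration of the statistic only, not the authors' exact method or data):

```python
from statistics import mean, stdev

def bland_altman(portable, inhouse):
    """Mean bias and 95% limits of agreement between two analysers
    measuring the same paired samples."""
    diffs = [p - i for p, i in zip(portable, inhouse)]
    bias = mean(diffs)
    spread = stdev(diffs)
    return bias, (bias - 1.96 * spread, bias + 1.96 * spread)

# Hypothetical paired lactate readings (mmol/l) on the same samples.
portable = [1.1, 2.0, 3.2, 4.1, 4.8]
inhouse  = [1.3, 2.2, 3.5, 4.4, 5.2]
bias, (lo, hi) = bland_altman(portable, inhouse)
print(round(bias, 2))  # -0.28 — the portable unit reads lower on average
```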

  16. Epistemic Protocols for Distributed Gossiping

    Directory of Open Access Journals (Sweden)

    Krzysztof R. Apt

    2016-06-01

    Full Text Available Gossip protocols aim at arriving, by means of point-to-point or group communications, at a situation in which all the agents know each other's secrets. We consider distributed gossip protocols which are expressed by means of epistemic logic. We provide an operational semantics of such protocols and set up an appropriate framework to argue about their correctness. Then we analyze specific protocols for complete graphs and for directed rings.
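The core goal of a gossip protocol, that after a sequence of point-to-point calls every agent knows every other agent's secret, can be checked with a short simulation (a sketch of the call semantics only; the epistemic-logic machinery of the paper is not modelled):

```python
def gossip(n, calls):
    """Point-to-point gossip: each call merges the two agents' secret sets.
    Returns True when every agent ends up knowing every secret."""
    known = [{i} for i in range(n)]   # agent i starts knowing only secret i
    for a, b in calls:
        merged = known[a] | known[b]  # both parties exchange everything
        known[a] = known[b] = merged
    return all(k == set(range(n)) for k in known)

# The classic schedule needing 2n - 4 calls for n = 4: four calls suffice.
print(gossip(4, [(0, 1), (2, 3), (0, 2), (1, 3)]))  # True
print(gossip(4, [(0, 1), (2, 3), (0, 2)]))          # False — 1 and 3 are behind
```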

  17. Design unbiased estimation in line intersect sampling using segmented transects

    Science.gov (United States)

    David L.R. Affleck; Timothy G. Gregoire; Harry T. Valentine; Harry T. Valentine

    2005-01-01

In many applications of line intersect sampling, transects consist of multiple, connected segments in a prescribed configuration. The relationship between the transect configuration and the selection probability of a population element is illustrated, and a consistent sampling protocol, applicable to populations composed of arbitrarily shaped elements, is proposed. It...

  18. Detecting Organic Compounds Released from Iron Oxidizing Bacteria using Sample Analysis at Mars (SAM) Like Instrument Protocols

    Science.gov (United States)

    Glavin, D. P.; Popa, R.; Martin, M. G.; Freissinet, C.; Fisk, M. R.; Dworkin, J. P.; Mahaffy, P. R.

    2012-01-01

Mars is a planet of great interest for Astrobiology since its past environmental conditions are thought to have been favourable for the emergence of life. At present, the Red Planet is extremely cold and dry and the surface is exposed to intense UV and ionizing radiation, conditions generally considered to be incompatible with life as we know it on Earth. It was proposed that the shallow subsurface of Mars, where temperatures can be above freezing and liquid water can exist on rock surfaces, could harbor chemolithoautotrophic bacteria such as the iron-oxidizing microorganism Pseudomonas sp. HerB. The Mars Science Laboratory (MSL) mission will provide the next opportunity to carry out in situ measurements for organic compounds of possible biological origin on Mars. One instrument onboard MSL, the Sample Analysis at Mars (SAM) instrument suite, will carry out a broad and sensitive search for organic compounds in surface samples using either high temperature pyrolysis or chemical extraction followed by gas chromatography mass spectrometry. We present gas chromatograph mass spectrometer (GC/MS) data on crushed olivine rock powders that have been inoculated with Pseudomonas sp. HerB at concentrations ranging from approximately 10^2 to 10^7 cells per gram. The inoculated olivine samples were heated under helium carrier gas flow at 500 °C and the pyrolysis products concentrated using a SAM-like hydrocarbon trap set at -20 °C, followed by trap heating and analysis by GC/MS. In addition, the samples were also extracted using a low temperature "one-pot" chemical extraction technique using N-methyl, N-(tert-butyldimethylsilyl) trifluoroacetamide (MTBSTFA) as the silylating agent prior to GC/MS analysis. We identified several aldehydes, thiols, and alkene nitriles after pyrolysis GC/MS analysis of the bacteria that were not found in the olivine control samples that had not been inoculated with bacteria. The distribution of pyrolysis products extracted from the

  19. Rapid Sampling of Hydrogen Bond Networks for Computational Protein Design.

    Science.gov (United States)

    Maguire, Jack B; Boyken, Scott E; Baker, David; Kuhlman, Brian

    2018-05-08

    Hydrogen bond networks play a critical role in determining the stability and specificity of biomolecular complexes, and the ability to design such networks is important for engineering novel structures, interactions, and enzymes. One key feature of hydrogen bond networks that makes them difficult to rationally engineer is that they are highly cooperative and are not energetically favorable until the hydrogen bonding potential has been satisfied for all buried polar groups in the network. Existing computational methods for protein design are ill-equipped for creating these highly cooperative networks because they rely on energy functions and sampling strategies that are focused on pairwise interactions. To enable the design of complex hydrogen bond networks, we have developed a new sampling protocol in the molecular modeling program Rosetta that explicitly searches for sets of amino acid mutations that can form self-contained hydrogen bond networks. For a given set of designable residues, the protocol often identifies many alternative sets of mutations/networks, and we show that it can readily be applied to large sets of residues at protein-protein interfaces or in the interior of proteins. The protocol builds on a recently developed method in Rosetta for designing hydrogen bond networks that has been experimentally validated for small symmetric systems but was not extensible to many larger protein structures and complexes. The sampling protocol we describe here not only recapitulates previously validated designs with performance improvements but also yields viable hydrogen bond networks for cases where the previous method fails, such as the design of large, asymmetric interfaces relevant to engineering protein-based therapeutics.

  20. Gamma-H2AX biodosimetry for use in large scale radiation incidents: comparison of a rapid ‘96 well lyse/fix’ protocol with a routine method

    Directory of Open Access Journals (Sweden)

    Jayne Moquet

    2014-03-01

Full Text Available Following a radiation incident, preliminary dose estimates made by γ-H2AX foci analysis can supplement the early triage of casualties based on clinical symptoms. Sample processing time is important when many individuals need to be rapidly assessed. A protocol was therefore developed for high sample throughput that requires less than 0.1 ml blood, thus potentially enabling finger prick sampling. The technique combines red blood cell lysis and leukocyte fixation in one step on a 96 well plate, in contrast to the routine protocol, where lymphocytes in larger blood volumes are typically separated by Ficoll density gradient centrifugation with subsequent washing and fixation steps. The rapid '96 well lyse/fix' method reduced the estimated sample processing time for 96 samples to about 4 h, compared to 15 h using the routine protocol. However, scoring 20 cells in 96 samples prepared by the rapid protocol took longer than for the routine method (3.1 versus 1.5 h at zero dose; 7.0 versus 6.1 h for irradiated samples). Similar foci yields were scored for both protocols and consistent dose estimates were obtained for samples exposed to 0, 0.2, 0.6, 1.1, 1.2, 2.1 and 4.3 Gy of 250 kVp X-rays at 0.5 Gy/min and incubated for 2 h. Linear regression coefficients were 0.87 ± 0.06 (R² = 97.6%) and 0.85 ± 0.05 (R² = 98.3%) for estimated versus actual doses for the routine and lyse/fix methods, respectively. The lyse/fix protocol can therefore facilitate high throughput processing for γ-H2AX biodosimetry for use in large scale radiation incidents, at the cost of somewhat longer foci scoring times.
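Dose estimation from foci counts rests on a linear calibration of foci yield against known dose, which is then inverted for an unknown sample. A minimal sketch with invented calibration numbers (not the study's data):

```python
def linear_fit(xs, ys):
    """Ordinary least-squares slope and intercept (calibration curve)."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    slope = (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
             / sum((x - mx) ** 2 for x in xs))
    return slope, my - slope * mx

# Hypothetical calibration: mean foci per cell at known doses (Gy).
doses = [0, 0.2, 0.6, 1.1, 2.1, 4.3]
foci  = [0.1, 1.9, 5.5, 10.0, 19.2, 38.9]
slope, intercept = linear_fit(doses, foci)

def estimate_dose(mean_foci):
    """Invert the calibration curve to estimate dose from a foci count."""
    return (mean_foci - intercept) / slope

print(round(estimate_dose(9.2), 1))  # 1.0 — dose in Gy for 9.2 foci/cell
```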

  1. Improving the quality of perinatal mental health: a health visitor-led protocol.

    Science.gov (United States)

    Lewis, Anne; Ilot, Irene; Lekka, Chrysanthi; Oluboyede, Yemi

    2011-02-01

    The mental health of mothers is of significant concern to community practitioners. This paper reports on a case study exploring the success factors of a well established, health visitor-led protocol to identify and treat women with mild to moderate depression. Data were collected through interviews with a purposive sample of 12 community practitioners, a focus group of four health visitors and observation of a multidisciplinary steering group meeting. The protocol was described as an evidence-based tool and safety net that could be used flexibly to support clinical judgments and tailored to individual needs. Success factors included frontline clinician engagement and ownership, continuity of leadership to drive development and maintain momentum, comprehensive and on-going staff training, and strategic support for the protocol as a quality indicator at a time of organisational change. Quality and clinical leadership are continuing policy priorities. The protocol enabled frontline staff to lead a service innovation, providing a standardised multiprofessional approach to women's mental health needs through effective support, advice and treatment that can be measured and quality assured.

  2. Soil sampling and analytical strategies for mapping fallout in nuclear emergencies based on the Fukushima Dai-ichi Nuclear Power Plant accident

    International Nuclear Information System (INIS)

    Onda, Yuichi; Kato, Hiroaki; Hoshi, Masaharu; Takahashi, Yoshio; Nguyen, Minh-Long

    2015-01-01

The Fukushima Dai-ichi Nuclear Power Plant (FDNPP) accident resulted in extensive radioactive contamination of the environment via deposited radionuclides such as radiocesium and 131I. Evaluating the extent and level of environmental contamination is critical to protecting citizens in affected areas and to planning decontamination efforts. However, a standardized soil sampling protocol is needed in such emergencies to facilitate the collection of large, tractable samples for measuring gamma-emitting radionuclides. In this study, we developed an emergency soil sampling protocol based on preliminary sampling from the FDNPP accident-affected area. We also present the results of a preliminary experiment aimed at evaluating the influence of various procedures (e.g., mixing, number of samples) on measured radioactivity. Results show that sample mixing strongly affects measured radioactivity in soil samples. Furthermore, for homogenization, shaking the plastic sample container at least 150 times or disaggregating soil by hand-rolling in a disposable plastic bag is required. Finally, we determined that five soil samples within a 3 m × 3 m area are the minimum number required for reducing measurement uncertainty in the emergency soil sampling protocol proposed here. - Highlights: • Emergency soil sampling protocol was proposed for nuclear hazards. • Various sampling procedures were tested and evaluated in the Fukushima area. • Soil sample mixing procedure was of key importance for measured radioactivity. • Minimum number of samples was determined to reduce measurement uncertainty

  3. Method for spiking soil samples with organic compounds

    DEFF Research Database (Denmark)

    Brinch, Ulla C; Ekelund, Flemming; Jacobsen, Carsten S

    2002-01-01

    We examined the harmful side effects on indigenous soil microorganisms of two organic solvents, acetone and dichloromethane, that are normally used for spiking of soil with polycyclic aromatic hydrocarbons for experimental purposes. The solvents were applied in two contamination protocols to either...... higher than in control soil, probably due mainly to release of predation from indigenous protozoa. In order to minimize solvent effects on indigenous soil microorganisms when spiking native soil samples with compounds having a low water solubility, we propose a common protocol in which the contaminant...... tagged with luxAB::Tn5. For both solvents, application to the whole sample resulted in severe side effects on both indigenous protozoa and bacteria. Application of dichloromethane to the whole soil volume immediately reduced the number of protozoa to below the detection limit. In one of the soils...

  4. Blockchain protocols in clinical trials: Transparency and traceability of consent

    Science.gov (United States)

    Benchoufi, Mehdi; Porcher, Raphael; Ravaud, Philippe

    2018-01-01

    Clinical trial consent for protocols and their revisions should be transparent for patients and traceable for stakeholders. Our goal is to implement a process allowing for collection of patients’ informed consent, which is bound to protocol revisions, storing and tracking the consent in a secure, unfalsifiable and publicly verifiable way, and enabling the sharing of this information in real time. For that, we build a consent workflow using a trending technology called Blockchain. This is a distributed technology that brings a built-in layer of transparency and traceability. From a more general and prospective point of view, we believe Blockchain technology brings a paradigmatical shift to the entire clinical research field. We designed a Proof-of-Concept protocol consisting of time-stamping each step of the patient’s consent collection using Blockchain, thus archiving and historicising the consent through cryptographic validation in a securely unfalsifiable and transparent way. For each protocol revision, consent was sought again.  We obtained a single document, in an open format, that accounted for the whole consent collection process: a time-stamped consent status regarding each version of the protocol. This document cannot be corrupted and can be checked on any dedicated public website. It should be considered a robust proof of data. However, in a live clinical trial, the authentication system should be strengthened to remove the need for third parties, here trial stakeholders, and give participative control to the peer users. In the future, the complex data flow of a clinical trial could be tracked by using Blockchain, which core functionality, named Smart Contract, could help prevent clinical trial events not occurring in the correct chronological order, for example including patients before they consented or analysing case report form data before freezing the database. Globally, Blockchain could help with reliability, security, transparency and could be
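The time-stamping idea described above, where each consent event is cryptographically chained to its predecessor so that history cannot be silently rewritten, can be sketched as a hash chain (a single-machine illustration of the core mechanism; a real Blockchain adds distribution and consensus, and all names here are invented):

```python
import hashlib
import json
import time

def add_consent(chain, patient_id, protocol_version, timestamp=None):
    """Append a time-stamped consent event, chained to the previous
    entry's hash so any later tampering is detectable."""
    prev_hash = chain[-1]["hash"] if chain else "0" * 64
    entry = {
        "patient": patient_id,
        "protocol": protocol_version,
        "time": timestamp if timestamp is not None else time.time(),
        "prev": prev_hash,
    }
    entry["hash"] = hashlib.sha256(
        json.dumps(entry, sort_keys=True).encode()).hexdigest()
    chain.append(entry)
    return chain

def verify(chain):
    """Recompute every hash and link; any edit anywhere breaks the chain."""
    for i, entry in enumerate(chain):
        body = {k: v for k, v in entry.items() if k != "hash"}
        expected = hashlib.sha256(
            json.dumps(body, sort_keys=True).encode()).hexdigest()
        if entry["hash"] != expected:
            return False
        if entry["prev"] != (chain[i - 1]["hash"] if i else "0" * 64):
            return False
    return True

chain = []
add_consent(chain, "P001", "v1", timestamp=1)
add_consent(chain, "P001", "v2", timestamp=2)  # consent re-sought on revision
print(verify(chain))           # True
chain[0]["protocol"] = "v9"    # tamper with history...
print(verify(chain))           # False — the corruption is detectable
```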

  5. Blockchain protocols in clinical trials: Transparency and traceability of consent.

    Science.gov (United States)

    Benchoufi, Mehdi; Porcher, Raphael; Ravaud, Philippe

    2017-01-01

    Clinical trial consent for protocols and their revisions should be transparent for patients and traceable for stakeholders. Our goal is to implement a process allowing for collection of patients' informed consent, which is bound to protocol revisions, storing and tracking the consent in a secure, unfalsifiable and publicly verifiable way, and enabling the sharing of this information in real time. For that, we build a consent workflow using a trending technology called Blockchain. This is a distributed technology that brings a built-in layer of transparency and traceability. From a more general and prospective point of view, we believe Blockchain technology brings a paradigmatical shift to the entire clinical research field. We designed a Proof-of-Concept protocol consisting of time-stamping each step of the patient's consent collection using Blockchain, thus archiving and historicising the consent through cryptographic validation in a securely unfalsifiable and transparent way. For each protocol revision, consent was sought again.  We obtained a single document, in an open format, that accounted for the whole consent collection process: a time-stamped consent status regarding each version of the protocol. This document cannot be corrupted and can be checked on any dedicated public website. It should be considered a robust proof of data. However, in a live clinical trial, the authentication system should be strengthened to remove the need for third parties, here trial stakeholders, and give participative control to the peer users. In the future, the complex data flow of a clinical trial could be tracked by using Blockchain, which core functionality, named Smart Contract, could help prevent clinical trial events not occurring in the correct chronological order, for example including patients before they consented or analysing case report form data before freezing the database. Globally, Blockchain could help with reliability, security, transparency and could be a

  6. Development of a fully automated open-column chemical-separation system—COLUMNSPIDER—and its application to Sr-Nd-Pb isotope analyses of igneous rock samples

    Science.gov (United States)

    Miyazaki, Takashi; Vaglarov, Bogdan Stefanov; Takei, Masakazu; Suzuki, Masahiro; Suzuki, Hiroaki; Ohsawa, Kouzou; Chang, Qing; Takahashi, Toshiro; Hirahara, Yuka; Hanyu, Takeshi; Kimura, Jun-Ichi; Tatsumi, Yoshiyuki

    A fully automated open-column resin-bed chemical-separation system, named COLUMNSPIDER, has been developed. The system consists of a programmable micropipetting robot that dispenses chemical reagents and sample solutions into an open-column resin bed for elemental separation. After the initial set up of resin columns, chemical reagents, and beakers for the separated chemical components, all separation procedures are automated. As many as ten samples can be eluted in parallel in a single automated run. Many separation procedures, such as radiogenic isotope ratio analyses for Sr and Nd, involve the use of multiple column separations with different resin columns, chemical reagents, and beakers of various volumes. COLUMNSPIDER completes these separations using multiple runs. Programmable functions, including the positioning of the micropipetter, reagent volume, and elution time, enable flexible operation. Optimized movements for solution take-up and high-efficiency column flushing allow the system to perform as precisely as when carried out manually by a skilled operator. Procedural blanks, examined for COLUMNSPIDER separations of Sr, Nd, and Pb, are low and negligible. The measured Sr, Nd, and Pb isotope ratios for JB-2 and Nd isotope ratios for JB-3 and BCR-2 rock standards all fall within the ranges reported previously in high-accuracy analyses. COLUMNSPIDER is a versatile tool for the efficient elemental separation of igneous rock samples, a process that is both labor intensive and time consuming.

  7. Integrated sampling and analysis plan for samples measuring >10 mrem/hour

    International Nuclear Information System (INIS)

    Haller, C.S.

    1992-03-01

    This integrated sampling and analysis plan was prepared to assist in planning and scheduling of Hanford Site sampling and analytical activities for all waste characterization samples that measure greater than 10 mrem/hour. This report also satisfies the requirements of the renegotiated Interim Milestone M-10-05 of the Hanford Federal Facility Agreement and Consent Order (the Tri-Party Agreement). For purposes of comparing the various analytical needs with the Hanford Site laboratory capabilities, the analytical requirements of the various programs were normalized by converting required laboratory effort for each type of sample to a common unit of work, the standard analytical equivalency unit (AEU). The AEU approximates the amount of laboratory resources required to perform an extensive suite of analyses on five core segments individually plus one additional suite of analyses on a composite sample derived from a mixture of the five core segments and prepare a validated RCRA-type data package

  8. Understanding protocol performance: impact of test performance.

    Science.gov (United States)

    Turner, Robert G

    2013-01-01

This is the second of two articles that examine the factors that determine protocol performance. The objective of these articles is to provide a general understanding of protocol performance that can be used to estimate performance, establish limits on performance, decide if a protocol is justified, and ultimately select a protocol. The first article was concerned with protocol criterion and test correlation. It demonstrated the advantages and disadvantages of different criteria when all tests had the same performance. It also examined the impact of increasing test correlation on protocol performance and the characteristics of the different criteria. Here, the impact on protocol performance is examined when the individual tests in a protocol differ in performance; this is evaluated for different criteria and test correlations. The results of the two articles are combined and summarized. A mathematical model is used to calculate protocol performance for different protocol criteria and test correlations when there are small to large variations in the performance of the individual tests in the protocol. The performance of the individual tests that make up a protocol has a significant impact on the performance of the protocol. As expected, the better the performance of the individual tests, the better the performance of the protocol. Many of the characteristics of the different criteria are relatively independent of the variation in the performance of the individual tests. However, increasing test variation degrades some criteria advantages and causes a new disadvantage to appear. This negative impact increases as test variation increases and as more tests are added to the protocol. Best protocol performance is obtained when individual tests are uncorrelated and have the same performance. In general, the greater the variation in the performance of tests in the protocol, the more detrimental this variation is to protocol performance. Since this negative impact is increased as
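For independent tests, the effect of the protocol criterion can be computed directly: a strict (all tests positive) criterion multiplies sensitivities, while a loose (any test positive) criterion multiplies specificities. A sketch with illustrative numbers (the article's model also handles correlated tests, which this simplification does not):

```python
def protocol_performance(sensitivities, specificities, criterion="strict"):
    """Hit and false-alarm rates of a battery of independent tests.
    'strict': refer only if ALL tests are positive.
    'loose' : refer if ANY test is positive."""
    if criterion == "strict":
        hit = fa = 1.0
        for se, sp in zip(sensitivities, specificities):
            hit *= se          # must detect on every test
            fa *= (1 - sp)     # must false-alarm on every test
    else:
        miss = cr = 1.0
        for se, sp in zip(sensitivities, specificities):
            miss *= (1 - se)   # missed only if every test misses
            cr *= sp           # correctly rejected only if every test rejects
        hit, fa = 1 - miss, 1 - cr
    return hit, fa

hit, fa = protocol_performance([0.9, 0.9], [0.8, 0.8])
print(round(hit, 2), round(fa, 2))  # 0.81 0.04 — strict trades hits for fewer false alarms
hit, fa = protocol_performance([0.9, 0.9], [0.8, 0.8], "loose")
print(round(hit, 2), round(fa, 2))  # 0.99 0.36
hit, fa = protocol_performance([0.99, 0.81], [0.8, 0.8])
print(round(hit, 2))  # 0.8 — a weaker test drags the strict criterion down
```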

  9. Surface Sampling-Based Decontamination Studies and Protocol for Determining Sporicidal Efficacy of Gaseous Fumigants on Military-Relevant Surfaces

    Science.gov (United States)

    2008-09-01

    non-porous surfaces is vital to decon protocol development. Spore density (spore number per unit area) can result in layering and clustering over a...1999, 281, 1735-1745. 9. AOAC International Method 966.04; Official Methods of Analysis, 21st ed.; Chapter 6: AOAC International: Gaithersburg, MD

  10. Surface and subsurface cleanup protocol for radionuclides, Gunnison, Colorado, UMTRA project processing site

    International Nuclear Information System (INIS)

    1993-09-01

    Surface and subsurface soil cleanup protocols for the Gunnison, Colorado, processing site are summarized as follows: In accordance with EPA-promulgated land cleanup standards (40 CFR 192), in situ Ra-226 is to be cleaned up based on bulk concentrations not exceeding 5 and 15 pCi/g in 15-cm surface and subsurface depth increments, averaged over 100-m² grid blocks, where the parent Ra-226 concentrations are greater than, or in secular equilibrium with, the Th-230 parent. A bulk interpretation of these EPA standards has been accepted by the Nuclear Regulatory Commission (NRC), and while the finer-sized soil fraction passing a No. 4 mesh sieve contains the higher concentration of radioactivity, the bulk approach in effect integrates the total sample radioactivity over the entire sample mass. In locations where Th-230 has differentially migrated in subsoil relative to Ra-226, a Th-230 cleanup protocol has been developed in accordance with the Supplemental Standard provisions of 40 CFR 192 for NRC/Colorado Department of Health (CDH) approval for timely implementation. Detailed elements of the protocol are contained in Appendix A, Generic Protocol for Thorium-230 Cleanup/Verification at UMTRA Project Processing Sites. The cleanup of other radionuclides or nonradiological hazards that pose a significant threat to the public and the environment will be determined and implemented in accordance with pathway analysis to assess impacts and the implications of ALARA specified in 40 CFR 192 relative to supplemental standards.

  11. A new testing protocol for zirconia dental implants.

    Science.gov (United States)

    Sanon, Clarisse; Chevalier, Jérôme; Douillard, Thierry; Cattani-Lorente, Maria; Scherrer, Susanne S; Gremillard, Laurent

    2015-01-01

    Based on the current lack of standards concerning zirconia dental implants, we aim at developing a protocol to validate their functionality and safety prior to their clinical use. The protocol is designed to account for the specific brittle nature of ceramics and the specific behavior of zirconia in terms of phase transformation. Several types of zirconia dental implants with different surface textures (porous, alveolar, rough) were assessed. The implants were first characterized in their as-received state by Scanning Electron Microscopy (SEM), Focused Ion Beam (FIB), and X-Ray Diffraction (XRD). Fracture tests following a method adapted from ISO 14801 were conducted to evaluate their initial mechanical properties. Accelerated aging was performed on the implants, and the XRD monoclinic content was measured directly at their surface instead of on polished samples as in ISO 13356. The implants were then characterized again after aging. Implants with an alveolar surface presented large defects. The protocol shows that such defects compromise the long-term mechanical properties. Implants with a porous surface exhibited sufficient strength but a significant sensitivity to aging. Even though aging was associated with microcracking, clearly observed by FIB, it did not decrease the mechanical strength of the implants. As each dental implant company has its own process, all zirconia implants may behave differently, even if the starting powder is the same. In particular, surface modifications have a large influence on strength and aging resistance, which is not taken into account by the current standards. Protocols adapted from this work could be useful. Copyright © 2014 Academy of Dental Materials. Published by Elsevier Ltd. All rights reserved.

  12. Static Validation of Security Protocols

    DEFF Research Database (Denmark)

    Bodei, Chiara; Buchholtz, Mikael; Degano, P.

    2005-01-01

    We methodically expand protocol narrations into terms of a process algebra in order to specify some of the checks that need to be made in a protocol. We then apply static analysis technology to develop an automatic validation procedure for protocols. Finally, we demonstrate that these techniques … suffice to identify several authentication flaws in symmetric and asymmetric key protocols such as Needham-Schroeder symmetric key, Otway-Rees, Yahalom, Andrew secure RPC, Needham-Schroeder asymmetric key, and Beller-Chang-Yacobi MSR...

  13. Insight into the effects of different ageing protocols on Rh/Al{sub 2}O{sub 3} catalyst

    Energy Technology Data Exchange (ETDEWEB)

    Zhao, Baohuai [Key Laboratory of Advanced Materials (MOE), School of Materials Science and Engineering, Tsinghua University, 100084 Beijing (China); Ran, Rui, E-mail: ranr@tsinghua.edu.cn [Key Laboratory of Advanced Materials (MOE), School of Materials Science and Engineering, Tsinghua University, 100084 Beijing (China); Cao, Yidan; Wu, Xiaodong; Weng, Duan [Key Laboratory of Advanced Materials (MOE), School of Materials Science and Engineering, Tsinghua University, 100084 Beijing (China); Fan, Jun [The Administrative Center for China's Agenda 21, 100038 Beijing (China); Wu, Xueyuan [Key Laboratory of Advanced Materials (MOE), School of Materials Science and Engineering, Tsinghua University, 100084 Beijing (China)

    2014-07-01

    In this work, a catalyst of Rh loaded on Al{sub 2}O{sub 3} was prepared by an impregnation method using aqueous rhodium nitrate solution as the Rh precursor. The catalyst was aged under different protocols (lean, rich, inert and cyclic) to obtain several aged samples. All the Rh/Al{sub 2}O{sub 3} samples were characterized by X-ray diffraction (XRD), the Brunauer-Emmett-Teller (BET) method, CO chemisorption, H{sub 2}-temperature programmed reduction (H{sub 2}-TPR), transmission electron microscopy (TEM) and X-ray photoelectron spectroscopy (XPS). It was found that a specific ageing treatment could strongly affect the catalytic activity. The N{sub 2}-aged and H{sub 2}-aged samples had better catalytic activity for the CO + NO reaction than the fresh sample, while the air-aged and cyclic-aged samples exhibited much worse activity. Higher surface Rh content, better reducibility, and an appropriate Rh particle size were obtained in the N{sub 2}- and H{sub 2}-aged samples, all of which favoured the catalytic reaction. However, the air and cyclic ageing protocols induced a strong interaction between the Rh species and the Al{sub 2}O{sub 3} support, which resulted in severe sintering of the Rh species and loss of active sites. A structural evolution scheme for the catalysts aged under the different protocols is also established in this paper.

  14. Peak oxygen uptake in a sprint interval testing protocol vs. maximal oxygen uptake in an incremental testing protocol and their relationship with cross-country mountain biking performance.

    Science.gov (United States)

    Hebisz, Rafał; Hebisz, Paulina; Zatoń, Marek; Michalik, Kamil

    2017-04-01

    In the literature, the exercise capacity of cyclists is typically assessed using incremental and endurance exercise tests. The aim of the present study was to confirm whether peak oxygen uptake (V̇O2peak) attained in a sprint interval testing protocol correlates with cycling performance, and whether it corresponds to maximal oxygen uptake (V̇O2max) determined by an incremental testing protocol. A sample of 28 trained mountain bike cyclists executed 3 performance tests: (i) incremental testing protocol (ITP) in which the participant cycled to volitional exhaustion, (ii) sprint interval testing protocol (SITP) composed of four 30 s maximal intensity cycling bouts interspersed with 90 s recovery periods, (iii) competition in a simulated mountain biking race. Oxygen uptake, pulmonary ventilation, work, and power output were measured during the ITP and SITP with postexercise blood lactate and hydrogen ion concentrations collected. Race times were recorded. No significant inter-individual differences were observed with regard to any of the ITP-associated variables. However, 9 individuals presented significantly increased oxygen uptake, pulmonary ventilation, and work output in the SITP compared with the remaining cyclists. In addition, in this group of 9 cyclists, oxygen uptake in SITP was significantly higher than in ITP. After the simulated race, this group of 9 cyclists achieved significantly better competition times (99.5 ± 5.2 min) than the other cyclists (110.5 ± 6.7 min). We conclude that mountain bike cyclists who demonstrate higher peak oxygen uptake in a sprint interval testing protocol than maximal oxygen uptake attained in an incremental testing protocol demonstrate superior competitive performance.

  15. Establishment and optimization of NMR-based cell metabonomics study protocols for neonatal Sprague-Dawley rat cardiomyocytes.

    Science.gov (United States)

    Zhang, Ming; Sun, Bo; Zhang, Qi; Gao, Rong; Liu, Qiao; Dong, Fangting; Fang, Haiqin; Peng, Shuangqing; Li, Famei; Yan, Xianzhong

    2017-01-15

    A quenching, harvesting, and extraction protocol was optimized in this study for NMR metabonomics analysis of cardiomyocytes. Trypsin treatment and directly scraping cells into acetonitrile were compared for sample harvesting. The results showed that trypsin treatment caused an increase in the normalized concentration of phosphocholine and leakage of metabolites, owing to trypsin-induced membrane damage and the lengthy harvesting procedure. The intracellular metabolite extraction efficiencies of methanol and acetonitrile were then compared. As a result, washing twice with phosphate buffer, directly scraping the cells, and extracting with acetonitrile were chosen to prepare cardiomyocyte extract samples for metabonomics studies. This optimized protocol is rapid, effective, and exhibits greater metabolite retention. Copyright © 2016 Elsevier Inc. All rights reserved.

  16. On the use of personalization to enhance compliance in experience sampling

    NARCIS (Netherlands)

    Markopoulos, P.; Batalas, N.; Timmermans, A.

    2015-01-01

    This paper argues that allowing personalization can increase respondent adherence in experience sampling studies. We report a one week long field experiment (N=36), which compared response rates when respondents select the times at which they are prompted to report in an experience sampling protocol

  17. Analysing change in music therapy interactions of children with communication difficulties.

    Science.gov (United States)

    Spiro, Neta; Himberg, Tommi

    2016-05-05

    Music therapy has been found to improve communicative behaviours and joint attention in children with autism, but it is unclear what in the music therapy sessions drives those changes. We developed an annotation protocol and tools to accumulate large datasets of music therapy, for analysis of interaction dynamics. Analysis of video recordings of improvisational music therapy sessions focused on simple, unambiguous individual and shared behaviours: movement and facing behaviours, rhythmic activity and musical structures and the relationships between them. To test the feasibility of the protocol, early and late sessions of five client-therapist pairs were annotated and analysed to track changes in behaviours. To assess the reliability and validity of the protocol, inter-rater reliability of the annotation tiers was calculated, and the therapists provided feedback about the relevance of the analyses and results. This small-scale study suggests that there are both similarities and differences in the profiles of client-therapist sessions. For example, all therapists faced the clients most of the time, while the clients did not face back so often. Conversely, only two pairs had an increase in regular pulse from early to late sessions. More broadly, similarity across pairs at a general level is complemented by variation in the details. This perhaps goes some way to reconciling client- and context-specificity on one hand and generalizability on the other. Behavioural characteristics seem to influence each other. For instance, shared rhythmic pulse alternated with mutual facing and the occurrence of shared pulse was found to relate to the musical structure. These observations point towards a framework for looking at change in music therapy that focuses on networks of variables or broader categories. The results suggest that even when starting with simple behaviours, we can trace aspects of interaction and change in music therapy, which are seen as relevant by therapists.
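Inter-rater reliability of categorical annotation tiers, as computed above, is most commonly quantified for two annotators with Cohen's kappa; the abstract does not name the statistic used, so the sketch below (with invented facing/away labels) is only an illustration of the usual calculation, not the authors' exact method.

```python
# Sketch: Cohen's kappa for two annotators labelling the same events,
# e.g. a "facing" behaviour tier. Kappa corrects raw agreement for the
# agreement expected by chance from each rater's label frequencies.
from collections import Counter

def cohens_kappa(rater_a, rater_b):
    assert len(rater_a) == len(rater_b) and rater_a
    n = len(rater_a)
    observed = sum(a == b for a, b in zip(rater_a, rater_b)) / n
    counts_a, counts_b = Counter(rater_a), Counter(rater_b)
    labels = set(rater_a) | set(rater_b)
    expected = sum(counts_a[l] * counts_b[l] for l in labels) / n**2
    return (observed - expected) / (1 - expected)

# Invented example annotations:
a = ["facing", "facing", "away", "facing", "away", "away"]
b = ["facing", "away",   "away", "facing", "away", "facing"]
print(round(cohens_kappa(a, b), 3))  # 0.333
```

Values near 1 indicate strong agreement; values near 0 indicate chance-level agreement.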

  18. Grey literature in meta-analyses.

    Science.gov (United States)

    Conn, Vicki S; Valentine, Jeffrey C; Cooper, Harris M; Rantz, Marilyn J

    2003-01-01

    In meta-analysis, researchers combine the results of individual studies to arrive at cumulative conclusions. Meta-analysts sometimes include "grey literature" in their evidential base, which includes unpublished studies and studies published outside widely available journals. Because grey literature is a source of data that might not employ peer review, critics have questioned the validity of its data and the results of meta-analyses that include it. To examine evidence regarding whether grey literature should be included in meta-analyses and strategies to manage grey literature in quantitative synthesis. This article reviews evidence on whether the results of studies published in peer-reviewed journals are representative of results from broader samplings of research on a topic as a rationale for inclusion of grey literature. Strategies to enhance access to grey literature are addressed. The most consistent and robust difference between published and grey literature is that published research is more likely to contain results that are statistically significant. Effect size estimates of published research are about one-third larger than those of unpublished studies. Unfunded and small sample studies are less likely to be published. Yet, importantly, methodological rigor does not differ between published and grey literature. Meta-analyses that exclude grey literature likely (a) over-represent studies with statistically significant findings, (b) inflate effect size estimates, and (c) provide less precise effect size estimates than meta-analyses including grey literature. Meta-analyses should include grey literature to fully reflect the existing evidential base and should assess the impact of methodological variations through moderator analysis.

  19. Aespoe Hard Rock Laboratory. Prototype Repository. Analyses of microorganisms, gases and water chemistry in buffer and backfill, 2009

    Energy Technology Data Exchange (ETDEWEB)

    Lydmark, Sara (Microbial Analytics Sweden AB (Sweden))

    2010-09-15

    chemistry. The sampling and analysis protocols, improved in 2007, worked very well. Also, the molecular methods that were tested for the first time in the Prototype showed promising potential. IPR 08-01 revealed that many of the hydrochemical sampling points differ quite remarkably from each other. The 16 sampling points were therefore divided into seven sampling groups with similar properties. The properties of one sampling group (i.e., KBU10002+8) resembled those of the groundwater, while others (i.e., KBU10004+6, KBU10005, and KFA01-04) differed, for example, in microbial composition, salinity, sulphate content, pH, and the concentrations of calcium, potassium, magnesium, sodium, and many dissolved metals, actinides, and lanthanides. One sampling group contained sampling points that seemed to be in contact with tunnel air (KBU10003+7). Another sampling group contained sampling points near the canisters in the buffer (KB513-614) with very little pore water with high pH and a high salt content. One sampling point in the backfill, which had not been reached by the groundwater as of May 2007 (KBU10001), now consisted of pore water with properties resembling those of groundwater. The gas composition in the sampling groups was uniform in that the proportion of nitrogen in the extracted gas was increasing and the oxygen content decreasing with time. In most sampling groups, the oxygen content in the pore water had decreased from 3-7% as of May 2007 to 0.6-4% in 2009. This can also be compared with the proportion of oxygen in the gas phase in 2005, which was 10-18%. Hydrogen, methane, helium, and carbon dioxide concentrations varied, especially in the sampling groups with extractable pore water. ATP analyses demonstrated that the biomass in the Prototype repository is high compared to the surrounding groundwater. The microbiological results indicated that aerobic microbes, such as MOB and CHAB, thrived in the aerobic Prototype environment.

  20. Aespoe Hard Rock Laboratory. Prototype Repository. Analyses of microorganisms, gases and water chemistry in buffer and backfill, 2009

    International Nuclear Information System (INIS)

    Lydmark, Sara

    2010-09-01

    The sampling and analysis protocols, improved in 2007, worked very well. Also, the molecular methods that were tested for the first time in the Prototype showed promising potential. IPR 08-01 revealed that many of the hydrochemical sampling points differ quite remarkably from each other. The 16 sampling points were therefore divided into seven sampling groups with similar properties. The properties of one sampling group (i.e., KBU10002+8) resembled those of the groundwater, while others (i.e., KBU10004+6, KBU10005, and KFA01-04) differed, for example, in microbial composition, salinity, sulphate content, pH, and the concentrations of calcium, potassium, magnesium, sodium, and many dissolved metals, actinides, and lanthanides. One sampling group contained sampling points that seemed to be in contact with tunnel air (KBU10003+7). Another sampling group contained sampling points near the canisters in the buffer (KB513-614) with very little pore water with high pH and a high salt content. One sampling point in the backfill, which had not been reached by the groundwater as of May 2007 (KBU10001), now consisted of pore water with properties resembling those of groundwater. The gas composition in the sampling groups was uniform in that the proportion of nitrogen in the extracted gas was increasing and the oxygen content decreasing with time. In most sampling groups, the oxygen content in the pore water had decreased from 3-7% as of May 2007 to 0.6-4% in 2009. This can also be compared with the proportion of oxygen in the gas phase in 2005, which was 10-18%. Hydrogen, methane, helium, and carbon dioxide concentrations varied, especially in the sampling groups with extractable pore water. ATP analyses demonstrated that the biomass in the Prototype repository is high compared to the surrounding groundwater. The microbiological results indicated that aerobic microbes, such as MOB and CHAB, thrived in the aerobic Prototype environment.

  1. A class-chest for deriving transport protocols

    Energy Technology Data Exchange (ETDEWEB)

    Strayer, W.T.

    1996-10-01

    Development of new transport protocols or protocol algorithms suffers from the complexity of the environment in which they are intended to run. Modeling techniques attempt to avoid this by simulating the environment. Another approach to promoting rapid prototyping of protocols and protocol algorithms is to provide a pre-built infrastructure that is common to transport protocols, so that the focus is placed on the protocol-specific aspects. The Meta-Transport Library is a library of C++ base classes that implement or abstract out the mundane functions of a protocol; new protocol implementations are derived from these base classes. The result is a fully viable user-level transport protocol implementation, with emphasis on modularity. The collection of base classes forms a "class-chest" of tools from which protocols can be developed and studied with as little change to a normal UNIX environment as possible.
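The base-class idea behind such a "class-chest" can be sketched as follows. The Meta-Transport Library itself is C++ and its actual API is not shown in the abstract; the class names and the toy stop-and-wait protocol below are invented for illustration, and Python is used only for brevity.

```python
# Sketch of the class-chest pattern: a base class supplies the mundane,
# protocol-independent machinery, and a concrete transport protocol
# derives from it, overriding only the protocol-specific parts.
# Names are hypothetical, not the MTL API.
class TransportBase:
    """Pre-built infrastructure common to transport protocols."""
    def __init__(self):
        self._send_queue = []

    def submit(self, payload):
        # Mundane function: outbound queue management.
        self._send_queue.append(payload)

    def flush(self):
        # Drives the protocol-specific hook for each queued payload.
        delivered = [self.encapsulate(p) for p in self._send_queue]
        self._send_queue.clear()
        return delivered

    def encapsulate(self, payload):
        # Protocol-specific aspect, supplied by the derived class.
        raise NotImplementedError

class StopAndWait(TransportBase):
    """Derived toy protocol: adds an alternating sequence bit."""
    def __init__(self):
        super().__init__()
        self._seq = 0

    def encapsulate(self, payload):
        frame = (self._seq, payload)
        self._seq ^= 1
        return frame

proto = StopAndWait()
proto.submit(b"hello")
proto.submit(b"world")
print(proto.flush())  # [(0, b'hello'), (1, b'world')]
```

The derived class touches only `encapsulate`, which is the point of the pattern: queueing, delivery, and the rest of the environment stay in the reusable base.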

  2. Analysis of the new code stroke protocol in Asturias after one year. Experience at one hospital.

    Science.gov (United States)

    García-Cabo, C; Benavente, L; Martínez-Ramos, J; Pérez-Álvarez, Á; Trigo, A; Calleja, S

    2018-03-01

    Prehospital code stroke (CS) systems have proven effective at improving access to specialised medical care in acute stroke cases. They also improve the prognosis of this disease, which is one of the leading causes of death and disability in our setting. The aim of this study is to analyse results one year after implementation of the new code stroke protocol at one hospital in Asturias. We prospectively included patients who were admitted to our tertiary care centre as per the code stroke protocol over a period of one year. We analysed 363 patients. Mean age was 69 years and 54% of the cases were men. During the same period in the previous year, there were 236 non-hospital CS activations. One hundred forty-seven recanalisation treatments were performed (66 fibrinolysis and 81 mechanical thrombectomies or combined treatments), representing a 25% increase over the previous year. Recent advances in the management of acute stroke call for coordinated code stroke protocols that are adapted to the needs of each specific region. This may result in an increased number of patients receiving early care, as well as revascularisation treatments. Copyright © 2016 Sociedad Española de Neurología. Publicado por Elsevier España, S.L.U. All rights reserved.

  3. Triangulation based inclusion probabilities: a design-unbiased sampling approach

    OpenAIRE

    Fehrmann, Lutz; Gregoire, Timothy; Kleinn, Christoph

    2011-01-01

    A probabilistic sampling approach for design-unbiased estimation of area-related quantitative characteristics of spatially dispersed population units is proposed. The developed field protocol includes a fixed number of 3 units per sampling location and is based on partial triangulations over their natural neighbors to derive the individual inclusion probabilities. The performance of the proposed design is tested in comparison to fixed area sample plots in a simulation with two forest stands. ...
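Given the per-unit inclusion probabilities that the triangulation-based field protocol derives, a design-unbiased estimate of a population total follows from the standard Horvitz-Thompson estimator. The sketch below shows only that generic estimation step with invented numbers, not the paper's triangulation geometry or its data.

```python
# Sketch: Horvitz-Thompson estimator of a population total.
# Each sampled unit's value y_i is weighted by the inverse of its
# inclusion probability pi_i; with design-derived pi_i (e.g. from
# partial triangulations over natural neighbours), the estimator is
# design-unbiased.
def horvitz_thompson_total(values, inclusion_probs):
    return sum(y / pi for y, pi in zip(values, inclusion_probs))

# Illustrative only: basal area (m^2) of 3 sampled trees at one
# sampling location, with hypothetical inclusion probabilities.
values = [0.12, 0.30, 0.05]
probs = [0.04, 0.10, 0.02]
print(horvitz_thompson_total(values, probs))  # total ≈ 8.5
```

The entire design question in the paper is how to obtain valid `pi_i` values for a fixed number of 3 units per location; once they are known, the estimation itself is this one line.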

  4. Determination of 7Be in soil samples by gamma spectrometry for erosion research

    International Nuclear Information System (INIS)

    Esquivel, Alexander D.; Kastner, Geraldo F.; Amaral, Angela M.; Monteiro, Roberto Pellacani G.; Moreira, Rubens M.

    2015-01-01

    Cosmogenic 7Be is a natural radiotracer produced in the stratosphere and troposphere that reaches the Earth's surface via wet and dry fallout; hence its measurement is very significant for research on soil erosion. Analysis of 7Be by gamma spectrometry has been routine methodology for decades, and although it is the reference procedure, it is not free of analytical interference. 7Be is a β-γ emitting radionuclide (Eγ = 477.59 keV, T½ = 53.12 d) and, depending on the chemical profile of the soil, its determination is susceptible to interference from 228Ac (Eγ = 478.40 keV, T½ = 6.15 h). The aim of this work was to establish an analytical protocol for the determination of 7Be in soil samples from the Juatuba-MG region, collected in different sampling periods of the dry and rainy seasons for erosion studies, and to establish methodologies for evaluating and correcting the level of 228Ac interference in 7Be activity measurements by gamma spectrometry. (author)
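One generic way to correct a 7Be peak for 228Ac interference (not necessarily the methodology established in this work) is to estimate the 228Ac contribution under the 477.6 keV region from an interference-free 228Ac line such as 911.2 keV, scaled by the ratio of emission probabilities and detection efficiencies. Every numerical input below is a placeholder; real emission probabilities must come from nuclear data tables and real efficiencies from the detector calibration.

```python
# Sketch: subtracting the estimated 228Ac contribution from the 7Be
# peak area. net_477 / net_911 are net peak counts; p_ac_* are 228Ac
# gamma emission probabilities at 478.4 and 911.2 keV; eff_* are
# detection efficiencies at those energies. All values here are
# illustrative placeholders, NOT nuclear data.
def corrected_be7_counts(net_477, net_911,
                         p_ac_478, p_ac_911,
                         eff_478, eff_911):
    # Expected 228Ac counts under the 7Be peak, inferred from the
    # clean 911.2 keV line.
    ac_contribution = net_911 * (p_ac_478 * eff_478) / (p_ac_911 * eff_911)
    return net_477 - ac_contribution

# Illustrative numbers only:
print(corrected_be7_counts(net_477=1500.0, net_911=800.0,
                           p_ac_478=0.002, p_ac_911=0.258,
                           eff_478=0.05, eff_911=0.03))
```

In practice the correction would also propagate counting uncertainties and account for decay during counting, which this sketch omits.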

  5. Multi-saline sample distillation apparatus for hydrogen isotope analyses: design and accuracy

    Science.gov (United States)

    Hassan, Afifa Afifi

    1981-01-01

    A distillation apparatus for saline water samples was designed and tested. Six samples may be distilled simultaneously. The temperature was maintained at 400 °C to ensure complete dehydration of the precipitating salts. Consequently, the error in the measured ratio of stable hydrogen isotopes resulting from incomplete dehydration of hydrated salts during distillation was eliminated. (USGS)

  6. New method development in prehistoric stone tool research: evaluating use duration and data analysis protocols.

    Science.gov (United States)

    Evans, Adrian A; Macdonald, Danielle A; Giusca, Claudiu L; Leach, Richard K

    2014-10-01

    Lithic microwear is a research field of prehistoric stone tool (lithic) analysis that has been developed with the aim of identifying how stone tools were used. It has been shown that laser scanning confocal microscopy has the potential to be a useful quantitative tool in the study of prehistoric stone tool function. In this paper, two important lines of inquiry are investigated: (1) whether the texture of worn surfaces is constant under varying durations of tool use, and (2) the development of rapid objective data analysis protocols. This study reports on an attempt to develop these areas further and provides a better understanding of the complexities underlying the development of flexible analytical algorithms for surface analysis. The results show that when sampling is optimised, surface texture may be linked to contact material type, independent of use duration. Further research is needed to validate this finding and test an expanded range of contact materials. The use of automated analytical protocols has shown promise but is only reliable if sampling location and scale are defined. Results suggest that the sampling protocol reports on the degree of worn surface invasiveness, complicating the ability to investigate duration-related textural characterisation. Copyright © 2014. Published by Elsevier Ltd.

  7. Effective dose comparison between stitched protocols and usual protocols in dental cone beam CT for the complete arcade

    International Nuclear Information System (INIS)

    Soares, M. R.; Maia, A. F.; Batista, W. O. G.; Lara, P. A.

    2014-08-01

    To visualize the complete dental arcade, dental radiology currently offers two separate options: [1] protocols with a field of view whose diameter encompasses the entire arch (single) or [2] protocols with multiple fields of view (Fov) which together encompass the entire arch (stitched Fovs). The objective of this study is to evaluate effective dose values in examination protocols for the whole dental arcade available on different units with these two options. For this, a female anthropomorphic phantom manufactured by Radiology Support Devices was used, with twenty-six thermoluminescent dosimeters inserted at relevant organs and positions, and the phantom was irradiated under clinical conditions. The following protocols were evaluated and compared: [a] 14.0 cm x 8.5 cm and [b] 8.5 cm x 8.5 cm (Gendex GXCB 500 tomograph), [c] stitched protocol for the jaw combining three volumes of 5.0 cm x 3.7 cm (Kodak 9000 3D scanner), [d] stitched-Fov protocol of 5.0 cm x 8.0 cm (Planmeca Pro Max 3D), and [e] single-Fov technique of 14 cm x 8 cm (i-CAT Classical). Our results for the effective dose ranged between 43.1 and 111.1 μSv for the single-Fov technique and between 44.5 and 236.2 μSv for the stitched-Fov technique. Protocol [d] presented the highest estimated effective dose, while protocol [a] registered the lowest. These results demonstrate that the stitched-Fov protocol generated on the Kodak 9000 3D machine applied to the upper dental arch yields an effective dose practically equal to that obtained with the extended-diameter protocol [a], which evaluates the upper and lower arcades in a single image. They also demonstrate that protocol [d] gives an estimate five times higher than protocol [a]. Thus, we conclude that in practical terms the stitched-Fov protocol [c] presents no dosimetric advantages over the other protocols. (Author)

  8. Effective dose comparison between stitched protocols and usual protocols in dental cone beam CT for the complete arcade

    Energy Technology Data Exchange (ETDEWEB)

    Soares, M. R.; Maia, A. F. [Universidade Federal de Sergipe, Departamento de Fisica, Cidade Universitaria Prof. Jose Aloisio de Campos, Marechal Rondon s/n, Jardim Rosa Elze, 49-100000 Sao Cristovao, Sergipe (Brazil); Batista, W. O. G. [Instituto Federal da Bahia, Rua Emidio dos Santos s/n, Barbalho, Salvador, 40301015 Bahia (Brazil); Lara, P. A., E-mail: wilsonottobatista@gmail.com [Instituto de Pesquisas Energeticas e Nucleares / CNEN, Av. Lineu Prestes 2242, Cidade Universitaria, 05508-000 Sao Paulo (Brazil)

    2014-08-15

    To visualize the complete dental arcade, dental radiology currently offers two separate options: [1] protocols with a field of view whose diameter encompasses the entire arch (single) or [2] protocols with multiple fields of view (Fov) which together encompass the entire arch (stitched Fovs). The objective of this study is to evaluate effective dose values in examination protocols for the whole dental arcade available on different units with these two options. For this, a female anthropomorphic phantom manufactured by Radiology Support Devices was used, with twenty-six thermoluminescent dosimeters inserted at relevant organs and positions, and the phantom was irradiated under clinical conditions. The following protocols were evaluated and compared: [a] 14.0 cm x 8.5 cm and [b] 8.5 cm x 8.5 cm (Gendex GXCB 500 tomograph), [c] stitched protocol for the jaw combining three volumes of 5.0 cm x 3.7 cm (Kodak 9000 3D scanner), [d] stitched-Fov protocol of 5.0 cm x 8.0 cm (Planmeca Pro Max 3D), and [e] single-Fov technique of 14 cm x 8 cm (i-CAT Classical). Our results for the effective dose ranged between 43.1 and 111.1 μSv for the single-Fov technique and between 44.5 and 236.2 μSv for the stitched-Fov technique. Protocol [d] presented the highest estimated effective dose, while protocol [a] registered the lowest. These results demonstrate that the stitched-Fov protocol generated on the Kodak 9000 3D machine applied to the upper dental arch yields an effective dose practically equal to that obtained with the extended-diameter protocol [a], which evaluates the upper and lower arcades in a single image. They also demonstrate that protocol [d] gives an estimate five times higher than protocol [a]. Thus, we conclude that in practical terms the stitched-Fov protocol [c] presents no dosimetric advantages over the other protocols. (Author)

  9. Comparison of blood RNA isolation methods from samples stabilized in Tempus tubes and stored at a large human biobank.

    Science.gov (United States)

    Aarem, Jeanette; Brunborg, Gunnar; Aas, Kaja K; Harbak, Kari; Taipale, Miia M; Magnus, Per; Knudsen, Gun Peggy; Duale, Nur

    2016-09-01

    More than 50,000 adult and cord blood samples were collected in Tempus tubes and stored at the Norwegian Institute of Public Health Biobank for future use. In this study, we systematically evaluated and compared five blood-RNA isolation protocols: three blood-RNA isolation protocols optimized for simultaneous isolation of all blood-RNA species (MagMAX RNA Isolation Kit, both manual and semi-automated protocols; and Norgen Preserved Blood RNA kit I); and two protocols optimized for large RNAs only (Tempus Spin RNA, and Tempus 6-port isolation kit). We estimated the following parameters: RNA quality, RNA yield, processing time, cost per sample, and RNA transcript stability of six selected mRNAs and 13 miRNAs using real-time qPCR. Whole blood samples from adults (n = 59 tubes) and umbilical cord blood (n = 18 tubes) samples collected in Tempus tubes were analyzed. High-quality blood RNAs with average RIN values above seven were extracted using all five RNA isolation protocols. The transcript levels of the six selected genes showed minimal variation between the five protocols. Unexplained differences in the transcript levels of the 13 miRNAs were observed; however, the 13 miRNAs had similar expression direction and they were within the same order of magnitude. Some differences in RNA processing time and cost were noted. Sufficient amounts of high-quality RNA were obtained using all five protocols, and the Tempus blood RNA system therefore seems not to be dependent on one specific RNA isolation method.

  10. An Innovative Approach to Functionality Testing of Analysers in the Clinical Laboratory

    OpenAIRE

    Stockmann, Wolfgang; Engeldinger, Werner; Kunst, Albert; McGovern, Margaret

    2008-01-01

    The established protocols for evaluating new analytical systems produce indispensable information with regard to quality characteristics, but in general they fail to analyse the system performance under routine-like conditions. We describe a model which allows the testing of a new analytical system under conditions close to the routine in a controlled and systematic manner by using an appropriate software tool. Performing routine simulation experiments, either reflecting imprecision or method...

  11. Automatic Validation of Protocol Narration

    DEFF Research Database (Denmark)

    Bodei, Chiara; Buchholtz, Mikael; Degano, Pierpaolo

    2003-01-01

    We perform a systematic expansion of protocol narrations into terms of a process algebra in order to make precise some of the detailed checks that need to be made in a protocol. We then apply static analysis technology to develop an automatic validation procedure for protocols. Finally, we...
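As a toy illustration of the expansion idea, a narration line such as `A -> B : M` abbreviates at least two distinct local actions, a send at A and a receive-and-check at B, which is exactly the kind of detail a systematic expansion makes explicit. The sketch below is an informal illustration, not the paper's process-algebra translation:

```python
# Expand each protocol narration line "A -> B : M" into the distinct
# local actions it abbreviates: A sends M, B receives and checks M.
# This is an informal toy version of the expansion step.
def expand(narration):
    actions = []
    for line in narration:
        sender, rest = line.split("->")
        receiver, msg = rest.split(":", 1)
        sender, receiver, msg = sender.strip(), receiver.strip(), msg.strip()
        actions.append(f"{sender}: send {msg} to {receiver}")
        actions.append(f"{receiver}: receive x from {sender}; check x matches {msg}")
    return actions

for action in expand(["A -> B : {K}pk(B)", "B -> A : {N}K"]):
    print(action)
```

The real expansion additionally decides which message components a receiver can actually decrypt and verify, which is where static analysis comes in.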

  12. Validity of the reduced-sample insulin modified frequently-sampled intravenous glucose tolerance test using the nonlinear regression approach.

    Science.gov (United States)

    Sumner, Anne E; Luercio, Marcella F; Frempong, Barbara A; Ricks, Madia; Sen, Sabyasachi; Kushner, Harvey; Tulloch-Reid, Marshall K

    2009-02-01

    The disposition index, the product of the insulin sensitivity index (S(I)) and the acute insulin response to glucose, is linked in African Americans to chromosome 11q. This link was determined with S(I) calculated with the nonlinear regression approach to the minimal model and data from the reduced-sample insulin-modified frequently-sampled intravenous glucose tolerance test (Reduced-Sample-IM-FSIGT). However, the application of the nonlinear regression approach to calculate S(I) using data from the Reduced-Sample-IM-FSIGT has been challenged as being not only inaccurate but also having a high failure rate in insulin-resistant subjects. Our goal was to determine the accuracy and failure rate of the Reduced-Sample-IM-FSIGT using the nonlinear regression approach to the minimal model. With S(I) from the Full-Sample-IM-FSIGT considered the standard and using the nonlinear regression approach to the minimal model, we compared the agreement between S(I) from the Full- and Reduced-Sample-IM-FSIGT protocols. One hundred African Americans (body mass index, 31.3 +/- 7.6 kg/m(2) [mean +/- SD]; range, 19.0-56.9 kg/m(2)) had FSIGTs. Glucose (0.3 g/kg) was given at baseline. Insulin was infused from 20 to 25 minutes (total insulin dose, 0.02 U/kg). For the Full-Sample-IM-FSIGT, S(I) was calculated based on the glucose and insulin samples taken at -1, 1, 2, 3, 4, 5, 6, 7, 8, 10, 12, 14, 16, 19, 22, 23, 24, 25, 27, 30, 40, 50, 60, 70, 80, 90, 100, 120, 150, and 180 minutes. For the Reduced-Sample-IM-FSIGT, S(I) was calculated based on a subset of these time points. Agreement was determined by Spearman correlation, concordance, and the Bland-Altman method. In addition, for both protocols, the population was divided into tertiles of S(I). Insulin resistance was defined by the lowest tertile of S(I) from the Full-Sample-IM-FSIGT. The distribution of subjects across tertiles was compared by rank order and kappa statistic.
We found that the rate of failure of resolution of S(I) by
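One of the agreement measures named above, the Bland-Altman method, reduces to computing the bias (mean difference) between paired S(I) values and the limits of agreement at plus or minus 1.96 SD of the differences. A sketch with synthetic paired values (not the study's data):

```python
import statistics

# Bland-Altman limits of agreement between two measurements of the same
# quantity: bias = mean difference; limits = bias +/- 1.96 * SD(differences).
# The paired S(I) values below are synthetic, not the study's data.
def bland_altman(full, reduced):
    diffs = [f - r for f, r in zip(full, reduced)]
    bias = statistics.mean(diffs)
    sd = statistics.stdev(diffs)
    return bias, (bias - 1.96 * sd, bias + 1.96 * sd)

full_si    = [2.1, 3.4, 1.8, 4.0, 2.9]   # S(I) from the full protocol
reduced_si = [2.0, 3.6, 1.7, 3.8, 3.0]   # S(I) from the reduced protocol
bias, (lo, hi) = bland_altman(full_si, reduced_si)
print(round(bias, 3), round(lo, 3), round(hi, 3))
```

Good agreement corresponds to a bias near zero and narrow limits relative to the clinically meaningful range of S(I).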

  13. Efficient secure two-party protocols

    CERN Document Server

    Hazay, Carmit

    2010-01-01

    The authors present a comprehensive study of efficient protocols and techniques for secure two-party computation -- both general constructions that can be used to securely compute any functionality, and protocols for specific problems of interest. The book focuses on techniques for constructing efficient protocols and proving them secure. In addition, the authors study different definitional paradigms and compare the efficiency of protocols achieved under these different definitions. The book opens with a general introduction to secure computation and then presents definitions of security for a

  14. Efficacy of 2 finishing protocols in the quality of orthodontic treatment outcome.

    Science.gov (United States)

    Stock, Gregory J; McNamara, James A; Baccetti, Tiziano

    2011-11-01

    The objectives of this prospective clinical study were to evaluate the quality of treatment outcomes achieved with a complex orthodontic finishing protocol involving serpentine wires and a tooth positioner, and to compare it with the outcomes of a standard finishing protocol involving archwire bends used to detail the occlusion near the end of active treatment. The complex finishing protocol sample consisted of 34 consecutively treated patients; 1 week before debonding, their molar bands were removed, and serpentine wires were placed; this was followed by active wear of a tooth positioner for up to 1 month after debonding. The standard finishing protocol group consisted of 34 patients; their dental arches were detailed with archwire bends and vertical elastics. The objective grading system of the American Board of Orthodontics was used to quantify the quality of the finish at each time point. The Wilcoxon signed rank test was used to compare changes in the complex finishing protocol; the Mann-Whitney U test was used to compare changes between groups. The complex finishing protocol group experienced a clinically significant improvement in objective grading system scores after treatment with the positioner. Mild improvement in posterior space closure was noted after molar band removal, but no improvement in the occlusion was observed after placement of the serpentine wires. Patients managed with the complex finishing protocol also had a lower objective grading system score (14.7) at the end of active treatment than did patients undergoing the standard finishing protocol (23.0). Tooth positioners caused a clinically significant improvement in interocclusal contacts, interproximal contacts, and net objective grading system score; mild improvement in posterior band space was noted after molar band removal 1 week before debonding. Copyright © 2011 American Association of Orthodontists. Published by Mosby, Inc. All rights reserved.
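The between-group comparison above relies on the Mann-Whitney U test, whose statistic simply counts, over all cross-group pairs, how often one group's score beats the other's (ties count one half). A sketch with hypothetical objective grading system scores (lower is better):

```python
# Mann-Whitney U statistic for comparing objective grading system scores
# between two independent finishing-protocol groups. The scores below
# are hypothetical, not the study's data.
def mann_whitney_u(a, b):
    """U for group a: count of (a_i, b_j) pairs with a_i < b_j; ties count 0.5."""
    u = 0.0
    for x in a:
        for y in b:
            if x < y:
                u += 1.0
            elif x == y:
                u += 0.5
    return u

complex_scores  = [12, 15, 14, 16, 13]   # hypothetical OGS scores
standard_scores = [22, 25, 21, 24, 23]
print(mann_whitney_u(complex_scores, standard_scores))  # -> 25.0, every pair favors the complex group
```

In practice the U statistic is then referred to its null distribution (or a normal approximation) to obtain a p value.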

  15. High-performance liquid chromatographic determination of histamine in biological samples: the cerebrospinal fluid challenge--a review.

    Science.gov (United States)

    Wang, Zhaopin; Wu, Juanli; Wu, Shihua; Bao, Aimin

    2013-04-24

    Histamine, a neurotransmitter crucially involved in a number of basic physiological functions, undergoes changes in neuropsychiatric disorders. Detection of histamine in biological samples such as cerebrospinal fluid (CSF) is thus of clinical importance. The most commonly used method for measuring histamine levels is high-performance liquid chromatography (HPLC). However, factors such as very low levels of histamine, the even lower CSF-histamine and CSF-histamine metabolite levels, especially in certain neuropsychiatric diseases, rapid formation of histamine metabolites, and other confounding elements during sample collection make analysis of CSF-histamine and CSF-histamine metabolites a challenging task. Nonetheless, this challenge can be met, not only with respect to the HPLC separation column, derivatization reagent, and detector, but also in terms of optimizing CSF sample collection. This review aims to provide a general insight into the quantitative analyses of histamine in biological samples, with an emphasis on HPLC instruments, methods, and hyphenated techniques, in order to promote the development of an optimal and practical protocol for the determination of CSF-histamine and/or CSF-histamine metabolites. Copyright © 2013 Elsevier B.V. All rights reserved.
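Whatever column, derivatization reagent, and detector are chosen, HPLC quantitation ultimately rests on an external-standard calibration curve: fit peak area against standard concentration by least squares, then invert the fit for the unknown. A generic sketch with illustrative numbers (not values from the review):

```python
# External-standard calibration as commonly used in HPLC quantitation:
# fit peak area vs. concentration, then invert to quantify an unknown.
# The standard concentrations and peak areas below are illustrative only.
def linear_fit(x, y):
    """Ordinary least-squares slope and intercept."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    slope = (sum((xi - mx) * (yi - my) for xi, yi in zip(x, y))
             / sum((xi - mx) ** 2 for xi in x))
    return slope, my - slope * mx

conc = [1.0, 2.0, 5.0, 10.0]       # histamine standards, ng/mL (hypothetical)
area = [10.5, 20.8, 51.2, 101.9]   # detector peak areas (hypothetical)
slope, intercept = linear_fit(conc, area)

unknown_area = 34.0
print(round((unknown_area - intercept) / slope, 2))  # estimated ng/mL in the sample
```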

  16. Using Ovsynch protocol versus Cosynch protocol in dairy cows

    Directory of Open Access Journals (Sweden)

    Ion Valeriu Caraba

    2013-10-01

    Full Text Available As research on the reproductive physiology and endocrinology surrounding the estrous cycle in dairy cattle has been compiled, several estrous synchronization programs have been developed for use with dairy cows. These include several programs that facilitate the mass breeding of all animals at a predetermined time (timed-AI rather than the detection of estrus. We studied 15 dairy cows that were synchronized by the Ovsynch and Cosynch programs. The estrus response for cows in the Ovsynch protocol was 63%. Pregnancy per insemination at 60 days was 25%. The estrus response for cows in the Cosynch protocol was 57%. Pregnancy per insemination at 60 days was 57%. Synchronization of ovulation using Ovsynch protocols can provide an effective way to manage reproduction in lactating dairy cows by eliminating the need for estrus detection. These are really efficient management programs for TAI of dairy cows that are able to reduce both the labour costs and the extra handling associated with daily estrus detection and AI.
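The reported pregnancy-per-insemination rates (25% vs. 57%) could be compared with a standard two-proportion z-test. The counts below are chosen only to reproduce roughly those percentages at a small illustrative n; they are not the study's raw data:

```python
import math

# Two-proportion z-test sketch for comparing pregnancy-per-AI rates
# between two synchronization protocols. The counts are hypothetical.
def two_prop_z(x1, n1, x2, n2):
    p1, p2 = x1 / n1, x2 / n2
    p = (x1 + x2) / (n1 + n2)                        # pooled proportion
    se = math.sqrt(p * (1 - p) * (1 / n1 + 1 / n2))  # pooled standard error
    return (p1 - p2) / se

z = two_prop_z(2, 8, 4, 7)   # 25% vs ~57% at a very small n
print(round(z, 2))
```

With samples this small the normal approximation is poor and an exact test would be preferred; the sketch only shows the arithmetic.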

  17. Taxonomy and Analysis of IP Micro-Mobility Protocols in Single and Simultaneous Movements Scenarios

    Directory of Open Access Journals (Sweden)

    G. De Marco

    2007-01-01

    Full Text Available The micro-mobility is an important aspect in mobile communications, where the applications are anywhere and used anytime. One of the problems of micro-mobility is the hand-off latency. In this paper, we analyse two solutions for IP micro-mobility by means of a general taxonomy. The first one is based on the Stream Control Transmission Protocol (SCTP, which allows the dynamic address configuration of an association. The second one is based on the Session Initiation Protocol (SIP, which is the most popular protocol for multimedia communications over IP networks. We show that for the SCTP solution, there is room for further optimisations of the hand-off latency by adding slight changes to the protocol. However, as a full end-to-end solution, SCTP is not able to handle simultaneous movement of hosts, whose probability in general cannot be neglected. On the other hand, the SIP can handle both single and simultaneous movements cases, although the hand-off latency can increase with respect to the SCTP solution. We show that for a correct and fast hand-off, the SIP server should be stateful.
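The hand-off latency trade-off described above can be caricatured as a sum of per-step delays: SCTP updates the association with a single ASCONF/ASCONF-ACK exchange, while a SIP-based hand-off adds a re-REGISTER plus re-INVITE round trip. The components and timings below are hypothetical assumptions, not measurements from the paper:

```python
# Back-of-the-envelope hand-off latency model for the two approaches the
# paper contrasts. All component timings are hypothetical assumptions.
def sctp_handoff_ms(addr_config_ms, asconf_rtt_ms):
    # SCTP mobile endpoint: configure the new address, then one
    # ASCONF/ASCONF-ACK exchange with the peer updates the association.
    return addr_config_ms + asconf_rtt_ms

def sip_handoff_ms(addr_config_ms, register_rtt_ms, reinvite_rtt_ms):
    # SIP-based mobility: re-REGISTER with the server, then re-INVITE to
    # redirect the media session -- one extra round trip vs. SCTP.
    return addr_config_ms + register_rtt_ms + reinvite_rtt_ms

print(sctp_handoff_ms(50, 80))     # e.g. 130 ms
print(sip_handoff_ms(50, 80, 80))  # e.g. 210 ms
```

The extra round trip through the SIP server is the price paid for the ability to rendezvous when both endpoints move simultaneously.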

  18. Comparison of Greenhouse Gas Offset Quantification Protocols for Nitrogen Management in Dryland Wheat Cropping Systems of the Pacific Northwest

    Directory of Open Access Journals (Sweden)

    Tabitha T. Brown

    2017-11-01

    Full Text Available In the carbon market, greenhouse gas (GHG offset protocols need to ensure that emission reductions are of high quality, quantifiable, and real. Lack of consistency across protocols for quantifying emission reductions compromises the credibility of the offsets generated. Thus, protocol quantification methodologies need to be periodically reviewed to ensure emission offsets are credited accurately and updated to support practical climate policy solutions. Current GHG emission offset credits generated by agricultural nitrogen (N management activities are based on reducing the annual N fertilizer application rate for a given crop without reducing yield. We performed a “road test” of agricultural N management protocols to evaluate differences among protocol components and quantify nitrous oxide (N2O emission reductions under sample projects relevant to N management in dryland, wheat-based cropping systems of the inland Pacific Northwest (iPNW. We evaluated five agricultural N management offset protocols applicable to North America: two methodologies of American Carbon Registry (ACR1 and ACR2, Verified Carbon Standard (VCS, Climate Action Reserve (CAR, and Alberta Offset Credit System (Alberta. We found that only two protocols, ACR2 and VCS, were suitable for this study, in which four sample projects were developed representing feasible N fertilizer rate reduction activities. The ACR2 and VCS protocols had identical baseline and project emission quantification methodologies, resulting in identical emission reduction values. Reducing the N fertilizer application rate by switching to variable rate N (sample projects 1–3 or split N application (sample project 4 management resulted in a N2O emission reduction ranging from 0.07 to 0.16, and 0.26 Mg CO2e ha−1, respectively. Across the range of C prices considered ($5, $10, and $50 per metric ton of CO2 equivalent, we concluded that the N2O emission offset payment alone ($0.35–$13.0 ha−1 was unlikely to
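The offset payments quoted above are a direct product of the per-hectare emission reduction and the carbon price; the abstract's own numbers bracket the $0.35 to $13.0 per hectare range. Reproducing that arithmetic:

```python
# Offset payment = N2O emission reduction (Mg CO2e/ha) x carbon price
# ($ per Mg CO2e). Reductions and prices are taken from the abstract;
# the extreme combinations reproduce its $0.35-$13.0/ha payment range.
reductions = [0.07, 0.16, 0.26]   # Mg CO2e/ha, sample projects
prices = [5, 10, 50]              # $ per Mg CO2e

payments = {(r, p): round(r * p, 2) for r in reductions for p in prices}
print(payments[(0.07, 5)], payments[(0.26, 50)])  # -> 0.35 13.0
```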

  19. Molecular analyses of two bacterial sampling methods in ligature-induced periodontitis in rats.

    Science.gov (United States)

    Fontana, Carla Raquel; Grecco, Clovis; Bagnato, Vanderlei Salvador; de Freitas, Laura Marise; Boussios, Constantinos I; Soukos, Nikolaos S

    2018-02-01

    The prevalence profile of periodontal pathogens in dental plaque can vary as a function of the detection method; however, the sampling technique may also play a role in determining dental plaque microbial profiles. We sought to determine the bacterial composition obtained with two sampling methods, a well-established one and a new one proposed here. In this study, a ligature-induced periodontitis model was used in 30 rats. Twenty-seven days later, ligatures were removed and microbiological samples were obtained directly from the ligatures as well as from the periodontal pockets using absorbent paper points. Microbial analysis was performed using DNA probes to a panel of 40 periodontal species in the checkerboard assay. The bacterial composition patterns were similar for both sampling methods. However, detection levels for all species were markedly higher for ligatures compared with paper points. Ligature samples provided higher bacterial counts than paper points, suggesting that the technique for induction of periodontitis could also be applied for sampling in rats. Our findings may be helpful in designing studies of induced periodontal disease-associated microbiota.

  20. Sampling and sample handling procedures for priority pollutants in surface coal mining wastewaters. [Detailed list to be analyzed for

    Energy Technology Data Exchange (ETDEWEB)

    Hayden, R. S.; Johnson, D. O.; Henricks, J. D.

    1979-03-01

    The report describes the procedures used by Argonne National Laboratory to sample surface coal mine effluents in order to obtain field and laboratory data on 110 organic compounds or classes of compounds and 14 metals and minerals that are known as priority pollutants, plus 5-day biochemical oxygen demand (BOD/sub 5/), total organic carbon (TOC), chemical oxygen demand (COD), total dissolved solids (TDS), and total suspended solids (TSS). Included are directions for preparation of sampling containers and equipment, methods of sampling and sample preservation, and field and laboratory protocols, including chain-of-custody procedures. Actual analytical procedures are not described, but their sources are referenced.