WorldWideScience

Sample records for standard analytical protocol

  1. Analytical validation of a standardized scoring protocol for Ki67: phase 3 of an international multicenter collaboration

    Science.gov (United States)

    Leung, Samuel C Y; Nielsen, Torsten O; Zabaglo, Lila; Arun, Indu; Badve, Sunil S; Bane, Anita L; Bartlett, John M S; Borgquist, Signe; Chang, Martin C; Dodson, Andrew; Enos, Rebecca A; Fineberg, Susan; Focke, Cornelia M; Gao, Dongxia; Gown, Allen M; Grabau, Dorthe; Gutierrez, Carolina; Hugh, Judith C; Kos, Zuzana; Lænkholm, Anne-Vibeke; Lin, Ming-Gang; Mastropasqua, Mauro G; Moriya, Takuya; Nofech-Mozes, Sharon; Osborne, C Kent; Penault-Llorca, Frédérique M; Piper, Tammy; Sakatani, Takashi; Salgado, Roberto; Starczynski, Jane; Viale, Giuseppe; Hayes, Daniel F; McShane, Lisa M; Dowsett, Mitch

    2016-01-01

    Pathological analysis of the nuclear proliferation biomarker Ki67 has multiple potential roles in breast and other cancers. However, clinical utility of the immunohistochemical (IHC) assay for Ki67 has been hampered by unacceptable between-laboratory analytical variability. The International Ki67 Working Group has conducted a series of studies aiming to decrease this variability and improve the evaluation of Ki67. This study assessed whether acceptable performance can be achieved on prestained core-cut biopsies using a standardized scoring method. Sections from 30 primary ER+ breast cancer core biopsies were centrally stained for Ki67 and circulated among 22 laboratories in 11 countries. Each laboratory scored Ki67 using three methods: (1) global (4 fields of 100 cells each); (2) weighted global (same as global but weighted by estimated percentages of total area); and (3) hot-spot (single field of 500 cells). The intraclass correlation coefficient (ICC), a measure of interlaboratory agreement, for the unweighted global method (0.87; 95% credible interval (CI): 0.81–0.93) met the prespecified success criterion for scoring reproducibility, whereas that for the weighted global (0.87; 95% CI: 0.80–0.93) and hot-spot methods (0.84; 95% CI: 0.77–0.92) marginally failed to do so. The unweighted global assessment of Ki67 IHC analysis on core biopsies met the prespecified criterion of success for scoring reproducibility. A few cases still showed large scoring discrepancies. Establishment of external quality assessment schemes is likely to improve the agreement between laboratories further. Additional evaluations are needed to assess staining variability and clinical validity in appropriate cohorts of samples. PMID:28721378
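    The unweighted and weighted global scores described above reduce to a simple mean and an area-weighted mean of per-field positivity percentages. A minimal sketch, using hypothetical field counts and area weights (all values illustrative, not from the study):

```python
# Hypothetical data: Ki67-positive cells counted in 4 fields of 100 cells each.
field_positive = [12, 25, 8, 15]        # positive cells per 100-cell field (= %)
area_weights = [0.4, 0.3, 0.2, 0.1]     # estimated fraction of tumour area per field

# Unweighted global score: simple mean of per-field percentages.
global_score = sum(field_positive) / len(field_positive)

# Weighted global score: per-field percentages weighted by estimated area.
weighted_score = sum(p * w for p, w in zip(field_positive, area_weights))

print(global_score, weighted_score)  # 15.0 15.4
```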

  2. Analytical protocols for characterisation of sulphur-free lignin

    NARCIS (Netherlands)

    Gosselink, R.J.A.; Abächerli, A.; Semke, H.; Malherbe, R.; Käuper, P.; Nadif, A.; Dam, van J.E.G.

    2004-01-01

    Interlaboratory tests for chemical characterisation of sulphur-free lignins were performed by five laboratories to develop useful analytical protocols, which are lacking, and identify quality-related properties. Protocols have been established for reproducible determination of the chemical

  3. Reactor Section standard analytical methods. Part 1

    Energy Technology Data Exchange (ETDEWEB)

    Sowden, D.

    1954-07-01

    The Standard Analytical Methods manual was prepared for the purpose of consolidating and standardizing all current analytical methods and procedures used in the Reactor Section for routine chemical analyses. All procedures are established in accordance with accepted practice and the general analytical methods specified by the Engineering Department. These procedures are specifically adapted to the requirements of the water treatment process and related operations. The methods included in this manual are organized alphabetically within the following five sections, which correspond to the various phases of the analytical control program in which these analyses are to be used: water analyses, essential material analyses, cotton plug analyses, boiler water analyses, and miscellaneous control analyses.

  4. Standardized North American marsh bird monitoring protocol

    Science.gov (United States)

    Conway, Courtney J.

    2011-01-01

    Little is known about the population status of many marsh-dependent birds in North America but recent efforts have focused on collecting more reliable information and estimates of population trends. As part of that effort, a standardized survey protocol was developed in 1999 that provided guidance for conducting marsh bird surveys throughout North America such that data would be consistent among locations. The original survey protocol has been revised to provide greater clarification on many issues as the number of individuals using the protocol has grown. The Standardized North American Marsh Bird Monitoring Protocol instructs surveyors to conduct an initial 5-minute passive point-count survey followed by a series of 1-minute segments during which marsh bird calls are broadcast into the marsh following a standardized approach. Surveyors are instructed to record each individual bird from the suite of 26 focal species that are present in their local area on separate lines of a datasheet and estimate the distance to each bird. Also, surveyors are required to record whether each individual bird was detected within each 1-minute subsegment of the survey. These data allow analysts to use several different approaches for estimating detection probability. The Standardized North American Marsh Bird Monitoring Protocol provides detailed instructions that explain the field methods used to monitor marsh birds in North America.
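    The value of recording detections per 1-minute subsegment can be illustrated with a toy calculation: if a present bird has some fixed, independent probability of being detected in any given minute (an assumption for illustration only, not part of the protocol), the chance of detecting it at least once during the 5-minute passive count follows directly:

```python
# Assumed per-minute probability that a present bird calls and is detected.
p_minute = 0.3  # hypothetical value

# Probability of at least one detection during the 5-minute passive
# point count, assuming independent detection in each 1-minute subsegment.
p_passive = 1 - (1 - p_minute) ** 5

print(round(p_passive, 3))  # 0.832
```

    Per-subsegment records let analysts estimate p_minute from the data itself rather than assume it, which is why the protocol requires them.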

  5. Estimation of radiation exposure for brain perfusion CT: standard protocol compared with deviations in protocol.

    Science.gov (United States)

    Hoang, Jenny K; Wang, Chu; Frush, Donald P; Enterline, David S; Samei, Ehsan; Toncheva, Greta; Lowry, Carolyn; Yoshizumi, Terry T

    2013-11-01

    The purpose of this study was to measure the organ doses and estimate the effective dose for the standard brain perfusion CT protocol and erroneous protocols. An anthropomorphic phantom with metal oxide semiconductor field effect transistor (MOSFET) detectors was scanned on a 64-MDCT scanner. Protocol 1 used a standard brain perfusion protocol with 80 kVp and fixed tube current of 200 mA. Protocol 2 used 120 kVp and fixed tube current of 200 mA. Protocol 3 used 120 kVp with automatic tube current modulation (noise index, 2.4; minimum, 100 mA; maximum, 520 mA). Compared with protocol 1, the effective dose was 2.8 times higher with protocol 2 and 7.8 times higher with protocol 3. For all protocols, the peak dose was highest in the skin, followed by the brain and calvarial marrow. Compared with protocol 1, the peak skin dose was 2.6 times higher with protocol 2 and 6.7 times higher with protocol 3. The peak skin dose for protocol 3 exceeded 3 Gy. The ocular lens received significant scatter radiation: 177 mGy for protocol 2 and 435 mGy for protocol 3, which were 4.6 and 11.3 times the dose for protocol 1, respectively. Compared with the standard protocol, an erroneous protocol that increases the tube potential from 80 kVp to 120 kVp will lead to a three- to fivefold increase in organ doses, and concurrent use of high peak kilovoltage with incorrectly programmed tube current modulation can increase dose to organs by 7- to 11-fold. Tube current modulation with a low noise index can lead to doses to the skin and ocular lens that are close to thresholds for tissue reactions.
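    The fold increases quoted above follow directly from the measured organ doses. A sketch using the reported ocular-lens values; the protocol 1 dose is back-calculated from the stated 4.6x ratio, so it is an assumption rather than a reported figure:

```python
# Ocular lens doses (mGy). Protocol 2 and 3 values are reported in the abstract;
# the protocol 1 value is back-calculated from the stated 4.6x ratio (hypothetical).
lens_dose_mgy = {"protocol_1": 38.5, "protocol_2": 177.0, "protocol_3": 435.0}

# Fold increase of each protocol relative to the standard (protocol 1).
fold_increase = {p: d / lens_dose_mgy["protocol_1"] for p, d in lens_dose_mgy.items()}
print({p: round(f, 1) for p, f in fold_increase.items()})
```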

  6. The development of standard operating protocols for paediatric radiology

    International Nuclear Information System (INIS)

    Hardwick, J.; Mencik, C.; McLaren, C.; Young, C.; Scadden, S.; Mashford, P.; McHugh, K.; Beckett, M.; Calvert, M.; Marsden, P.J.

    2001-01-01

    This paper describes how the requirement for operating protocols for standard radiological practice was expanded to provide a comprehensive aid for the operator conducting a medical exposure. The protocols adopted now include justification criteria, patient preparation, radiographic technique, standard exposure charts, diagnostic reference levels and image quality criteria. Overall, the protocols have been welcomed as a tool for ensuring that medical exposures are properly optimised. (author)

  7. SPIRIT 2013 Statement: defining standard protocol items for clinical trials

    Directory of Open Access Journals (Sweden)

    An-Wen Chan

    The protocol of a clinical trial serves as the foundation for study planning, conduct, reporting, and appraisal. However, trial protocols and existing protocol guidelines vary greatly in content and quality. This article describes the systematic development and scope of SPIRIT (Standard Protocol Items: Recommendations for Interventional Trials) 2013, a guideline for the minimum content of a clinical trial protocol. The 33-item SPIRIT checklist applies to protocols for all clinical trials and focuses on content rather than format. The checklist recommends a full description of what is planned; it does not prescribe how to design or conduct a trial. By providing guidance for key content, the SPIRIT recommendations aim to facilitate the drafting of high-quality protocols. Adherence to SPIRIT would also enhance the transparency and completeness of trial protocols for the benefit of investigators, trial participants, patients, sponsors, funders, research ethics committees or institutional review boards, peer reviewers, journals, trial registries, policymakers, regulators, and other key stakeholders.

  8. Modifications to the Standard Sit-and-Reach Flexibility Protocol.

    Science.gov (United States)

    Holt, Laurence E.; Burke, Darren G.; Pelham, Thomas W.

    1999-01-01

    Describes several modifications of the standard sit-and-reach flexibility protocol using a new device called the multitest flexometer (MTF). Using the MTF, researchers could take six flexibility measures beyond the standard sit-and-reach test. The modified protocol allowed the indirect assessment of the influence of the four major muscle groups that…

  9. Applying standards to systematize learning analytics in serious games

    NARCIS (Netherlands)

    Serrano-Laguna, Angel; Martinez-Ortiz, Ivan; Haag, Jason; Regan, Damon; Johnson, Andy; Fernandez-Manjon, Baltasar

    2016-01-01

    Learning Analytics is an emerging field focused on analyzing learners’ interactions with educational content. One of the key open issues in learning analytics is the standardization of the data collected. This is a particularly challenging issue in serious games, which generate a diverse range of
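    The record is truncated, but one widely adopted standard for serializing learner interactions is ADL's Experience API (xAPI), which encodes each event as an actor/verb/object statement. A minimal statement for a serious-game interaction might look like the following; this is an illustration of the statement shape, not necessarily the paper's approach, and all identifiers are hypothetical:

```python
import json

# Minimal xAPI-style statement (actor/verb/object); example URIs and IDs are
# hypothetical, chosen only to show the standard statement structure.
statement = {
    "actor": {"mbox": "mailto:player@example.org", "name": "Player One"},
    "verb": {"id": "http://adlnet.gov/expapi/verbs/completed",
             "display": {"en-US": "completed"}},
    "object": {"id": "http://example.org/games/chem-lab/level-3",
               "definition": {"name": {"en-US": "Chem Lab, Level 3"}}},
    "result": {"score": {"scaled": 0.85}, "success": True},
}
print(json.dumps(statement, indent=2))
```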

  10. Analytical standards for accountability of uranium hexafluoride - 1972

    International Nuclear Information System (INIS)

    Anon.

    1976-01-01

    An analytical standard for the accountability of uranium hexafluoride is presented that includes procedures for subsampling, determination of uranium, determination of metallic impurities and isotopic analysis by gas and thermal ionization mass spectrometry.
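    Isotopic analysis by mass spectrometry typically reports isotope ratios relative to 238U, which are then converted to atom-percent abundances. A minimal sketch of that conversion, with hypothetical ratios roughly corresponding to natural uranium:

```python
# Hypothetical measured atom ratios relative to U-238 (roughly natural uranium).
ratios_to_238 = {"U234": 0.000055, "U235": 0.00725, "U236": 0.0}

# Atom percent of each isotope: ratio / (1 + sum of all ratios to U-238).
denominator = 1.0 + sum(ratios_to_238.values())
atom_percent = {iso: 100.0 * r / denominator for iso, r in ratios_to_238.items()}
atom_percent["U238"] = 100.0 / denominator

print(round(atom_percent["U235"], 4))
```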

  11. Synergistic relationships between Analytical Chemistry and written standards

    International Nuclear Information System (INIS)

    Valcárcel, Miguel; Lucena, Rafael

    2013-01-01

    Highlights: •Analytical Chemistry is influenced by international written standards. •Different relationships can be established between them. •Synergies can be generated when these standards are conveniently managed. Abstract: This paper describes the mutual impact of Analytical Chemistry and several international written standards (norms and guides) related to knowledge management (CEN-CWA 14924:2004), social responsibility (ISO 26000:2010), management of occupational health and safety (OHSAS 18001/2), environmental management (ISO 14001:2004), quality management systems (ISO 9001:2008) and requirements of the competence of testing and calibration laboratories (ISO 17025:2004). The intensity of this impact, based on a two-way influence, is quite different depending on the standard considered. In any case, a new and fruitful approach to Analytical Chemistry based on these relationships can be derived.

  12. Synergistic relationships between Analytical Chemistry and written standards

    Energy Technology Data Exchange (ETDEWEB)

    Valcárcel, Miguel, E-mail: qa1vacam@uco.es; Lucena, Rafael

    2013-07-25

    Highlights: •Analytical Chemistry is influenced by international written standards. •Different relationships can be established between them. •Synergies can be generated when these standards are conveniently managed. Abstract: This paper describes the mutual impact of Analytical Chemistry and several international written standards (norms and guides) related to knowledge management (CEN-CWA 14924:2004), social responsibility (ISO 26000:2010), management of occupational health and safety (OHSAS 18001/2), environmental management (ISO 14001:2004), quality management systems (ISO 9001:2008) and requirements of the competence of testing and calibration laboratories (ISO 17025:2004). The intensity of this impact, based on a two-way influence, is quite different depending on the standard considered. In any case, a new and fruitful approach to Analytical Chemistry based on these relationships can be derived.

  13. PCB transformer decontamination - standards and protocols

    International Nuclear Information System (INIS)

    1995-12-01

    The proper management and disposal of transformers to reduce human and environmental exposure to PCBs, is the subject of this document issued by the Canadian Council of Ministers of the Environment (CCME). The Council is committed to a policy of phasing out the use of all polychlorinated biphenyls (PCBs) in Canada. Since there are no national standards in Canada for the decontamination of PCB transformers, this handbook was issued to provide guidelines to promote uniform practices and to set national approaches for resource recovery and technological developments in this area. The guide describes methods for decontamination of transformers, and safe approaches for re-use, recycling and landfilling of electrical transformers that contain PCBs and their components. 16 refs., 1 tab., 5 figs

  14. Synthetic salt cake standards for analytical laboratory quality control

    International Nuclear Information System (INIS)

    Schilling, A.E.; Miller, A.G.

    1980-01-01

    The validation of analytical results in the characterization of Hanford Nuclear Defense Waste requires the preparation of synthetic waste for standard reference materials. Two independent synthetic salt cake standards have been prepared to monitor laboratory quality control for the chemical characterization of high-level salt cake and sludge waste in support of Rockwell Hanford Operations' High-Level Waste Management Program. Each synthetic salt cake standard contains 15 characterized chemical species and was subjected to an extensive verification/characterization program in two phases. Phase I consisted of an initial verification of each analyte in salt cake form in order to determine the current analytical capability for chemical analysis. Phase II consisted of a final characterization of those chemical species in solution form where conflicting verification data were observed. The 95 percent confidence interval on the mean for the following analytes within each standard is provided: sodium, nitrate, nitrite, phosphate, carbonate, sulfate, hydroxide, chromate, chloride, fluoride, aluminum, plutonium-239/240, strontium-90, cesium-137, and water.
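    The 95 percent confidence interval on the mean for each analyte is computed from replicate determinations in the usual way. A sketch with hypothetical replicate data (values and analyte are illustrative, not from the report):

```python
import math
import statistics

# Hypothetical replicate determinations of one analyte (e.g. nitrate, wt%).
replicates = [22.1, 21.8, 22.4, 22.0, 21.9]

n = len(replicates)
mean = statistics.mean(replicates)
s = statistics.stdev(replicates)   # sample standard deviation
t_975 = 2.776                      # Student's t for 95% CI, df = n - 1 = 4
half_width = t_975 * s / math.sqrt(n)

# 95% CI on the mean: mean +/- half_width
print(round(mean, 2), round(half_width, 2))  # 22.04 0.29
```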

  15. The Virtual Insect Brain protocol: creating and comparing standardized neuroanatomy

    Directory of Open Access Journals (Sweden)

    Schindelin Johannes E

    2006-12-01

    Background: In the fly Drosophila melanogaster, new genetic, physiological, molecular and behavioral techniques for the functional analysis of the brain are rapidly accumulating. These diverse investigations on the function of the insect brain use gene expression patterns that can be visualized and provide the means for manipulating groups of neurons as a common ground. To take advantage of these patterns one needs to know their typical anatomy. Results: This paper describes the Virtual Insect Brain (VIB) protocol, a script suite for the quantitative assessment, comparison, and presentation of neuroanatomical data. It is based on the 3D-reconstruction and visualization software Amira, version 3.x (Mercury Inc.). Besides its backbone, a standardization procedure which aligns individual 3D images (series of virtual sections obtained by confocal microscopy) to a common coordinate system and computes average intensities for each voxel (volume pixel), the VIB protocol provides an elaborate data management system for data administration. The VIB protocol facilitates direct comparison of gene expression patterns and describes their interindividual variability. It provides volumetry of brain regions and helps to characterize the phenotypes of brain structure mutants. Using the VIB protocol does not require any programming skills since all operations are carried out at an intuitively usable graphical user interface. Although the VIB protocol has been developed for the standardization of Drosophila neuroanatomy, the program structure can be used for the standardization of other 3D structures as well. Conclusion: Standardizing brains and gene expression patterns is a new approach to biological shape and its variability. The VIB protocol provides a first set of tools supporting this endeavor in Drosophila. The script suite is freely available at http://www.neurofly.de
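    The core averaging step (computing a mean intensity for each voxel across aligned individual images) is simple once the stacks share a coordinate system. A minimal pure-Python sketch with tiny, flattened, hypothetical stacks:

```python
# Three already-aligned image stacks, flattened to 1-D voxel lists (hypothetical
# intensities; real stacks would be full 3-D confocal volumes).
stacks = [
    [10, 40, 0, 255],
    [20, 50, 0, 245],
    [30, 60, 0, 250],
]

# Average intensity for each voxel position across individuals.
average_stack = [sum(voxels) / len(stacks) for voxels in zip(*stacks)]
print(average_stack)  # [20.0, 50.0, 0.0, 250.0]
```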

  16. Estimations of isoprenoid emission capacity from enclosure studies: measurements, data processing, quality and standardized measurement protocols

    Science.gov (United States)

    Niinemets, Ü.; Kuhn, U.; Harley, P. C.; Staudt, M.; Arneth, A.; Cescatti, A.; Ciccioli, P.; Copolovici, L.; Geron, C.; Guenther, A.; Kesselmeier, J.; Lerdau, M. T.; Monson, R. K.; Peñuelas, J.

    2011-08-01

    The capacity for volatile isoprenoid production under standardized environmental conditions at a certain time (ES, the emission factor) is a key characteristic in constructing isoprenoid emission inventories. However, there is large variation in published ES estimates for any given species partly driven by dynamic modifications in ES due to acclimation and stress responses. Here we review additional sources of variation in ES estimates that are due to measurement and analytical techniques and calculation and averaging procedures, and demonstrate that estimations of ES critically depend on applied experimental protocols and on data processing and reporting. A great variety of experimental setups has been used in the past, contributing to study-to-study variations in ES estimates. We suggest that past experimental data should be distributed into broad quality classes depending on whether the data can or cannot be considered quantitative based on rigorous experimental standards. Apart from analytical issues, the accuracy of ES values is strongly driven by extrapolation and integration errors introduced during data processing. Additional sources of error, especially in meta-database construction, can further arise from inconsistent use of units and expression bases of ES. We propose a standardized experimental protocol for BVOC estimations and highlight basic meta-information that we strongly recommend to report with any ES measurement. We conclude that standardization of experimental and calculation protocols and critical examination of past reports is essential for development of accurate emission factor databases.
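    Normalizing a measured emission rate to standard conditions (to obtain ES) is commonly done with light and temperature activity factors such as those of the Guenther et al. (1993) isoprene algorithm. A sketch under that assumption; the coefficient values are the commonly used defaults, and the measured rate is hypothetical:

```python
import math

# Guenther et al. (1993)-style activity factors; default coefficients assumed.
ALPHA, C_L1 = 0.0027, 1.066
C_T1, C_T2 = 95000.0, 230000.0   # J mol-1
T_S, T_M, R = 303.0, 314.0, 8.314  # standard temp (K), optimum temp (K), gas const.

def c_l(ppfd):
    """Light activity factor (ppfd in umol m-2 s-1)."""
    return ALPHA * C_L1 * ppfd / math.sqrt(1 + ALPHA**2 * ppfd**2)

def c_t(t_leaf):
    """Temperature activity factor (t_leaf in K)."""
    num = math.exp(C_T1 * (t_leaf - T_S) / (R * T_S * t_leaf))
    den = 1 + math.exp(C_T2 * (t_leaf - T_M) / (R * T_S * t_leaf))
    return num / den

# Back-calculate ES from an emission measured away from standard conditions.
measured_rate = 8.0  # ug g-1 h-1, hypothetical enclosure measurement
es = measured_rate / (c_l(800.0) * c_t(298.0))
```

    Differences in how studies apply (or omit) exactly this kind of normalization are one of the extrapolation errors the review flags.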

  17. Biocoder: A programming language for standardizing and automating biology protocols.

    Science.gov (United States)

    Ananthanarayanan, Vaishnavi; Thies, William

    2010-11-08

    Published descriptions of biology protocols are often ambiguous and incomplete, making them difficult to replicate in other laboratories. However, there is increasing benefit to formalizing the descriptions of protocols, as laboratory automation systems (such as microfluidic chips) are becoming increasingly capable of executing them. Our goal in this paper is to improve both the reproducibility and automation of biology experiments by using a programming language to express the precise series of steps taken. We have developed BioCoder, a C++ library that enables biologists to express the exact steps needed to execute a protocol. In addition to being suitable for automation, BioCoder converts the code into a readable, English-language description for use by biologists. We have implemented over 65 protocols in BioCoder; the most complex of these was successfully executed by a biologist in the laboratory using BioCoder as the only reference. We argue that BioCoder exposes and resolves ambiguities in existing protocols, and could provide the software foundations for future automation platforms. BioCoder is freely available for download at http://research.microsoft.com/en-us/um/india/projects/biocoder/. BioCoder represents the first practical programming system for standardizing and automating biology protocols. Our vision is to change the way that experimental methods are communicated: rather than publishing a written account of the protocols used, researchers will simply publish the code. Our experience suggests that this practice is tractable and offers many benefits. We invite other researchers to leverage BioCoder to improve the precision and completeness of their protocols, and also to adapt and extend BioCoder to new domains.
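    The core idea (expressing a wet-lab protocol as code that can also render readable English) can be illustrated with a small Python sketch. This is not BioCoder's actual C++ API, just an illustrative mini-DSL with hypothetical step names:

```python
class Protocol:
    """Toy protocol builder: accumulates steps, renders an English description."""

    def __init__(self, name):
        self.name = name
        self.steps = []

    def add(self, action, **params):
        self.steps.append((action, params))
        return self  # allow chaining

    def to_text(self):
        lines = [f"Protocol: {self.name}"]
        for i, (action, params) in enumerate(self.steps, 1):
            detail = ", ".join(f"{k}={v}" for k, v in params.items())
            lines.append(f"  {i}. {action} ({detail})" if detail else f"  {i}. {action}")
        return "\n".join(lines)

# Hypothetical fragment of a plasmid miniprep, expressed as code.
prep = (Protocol("Miniprep (fragment)")
        .add("add", reagent="Buffer P1", volume="250 uL")
        .add("vortex", duration="30 s")
        .add("centrifuge", speed="13000 rpm", duration="10 min"))
print(prep.to_text())
```

    The same step list could, in principle, drive an automation backend instead of a text renderer, which is the dual use BioCoder targets.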

  18. Protocol Standards for Reporting Video Data in Academic Journals.

    Science.gov (United States)

    Rowland, Pamela A; Ignacio, Romeo C; de Moya, Marc A

    2016-04-01

    Editors of biomedical journals have estimated that a majority (40%-90%) of studies published in scientific journals cannot be replicated, even though an inherent principle of publication is that others should be able to replicate and build on published claims. Each journal sets its own protocols for establishing "quality" in articles, yet over the past 50 years, few journals in any field, especially medical education, have specified protocols for reporting the use of video data in research. The authors found that technical and industry-driven aspects of video recording, as well as a lack of standardization and reporting requirements by research journals, have led to major limitations in the ability to assess or reproduce video data used in research. Specific variables in the videotaping process (e.g., camera angle), which can be changed or modified, affect the quality of recorded data, leading to major reporting errors and, in turn, unreliable conclusions. As more data are now in the form of digital videos, the historical lack of reporting standards makes it increasingly difficult to accurately replicate medical educational studies. Reproducibility is especially important as the medical education community considers setting national high-stakes standards in medicine and surgery based on video data. The authors of this Perspective provide basic protocol standards for investigators and journals using video data in research publications so as to allow for reproducibility.
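    One practical way to operationalize such reporting standards is to attach structured metadata covering the variables the authors flag (camera angle, resolution, and so on) to every clip, and check for completeness before submission. The field names below are illustrative, not a published standard:

```python
# Illustrative per-clip metadata record; field names are hypothetical, chosen
# to cover videotaping variables (e.g., camera angle) that affect data quality.
video_metadata = {
    "clip_id": "case-017-cam2",
    "camera_angle_deg": 45,
    "resolution": "1920x1080",
    "frame_rate_fps": 30,
    "codec": "H.264",
    "lighting": "overhead surgical",
    "editing_applied": False,
}

# Reject clips whose reporting is incomplete.
required = {"clip_id", "camera_angle_deg", "resolution", "frame_rate_fps"}
missing = required - video_metadata.keys()
assert not missing, f"unreported variables: {missing}"
```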

  19. Synergistic relationships between Analytical Chemistry and written standards.

    Science.gov (United States)

    Valcárcel, Miguel; Lucena, Rafael

    2013-07-25

    This paper describes the mutual impact of Analytical Chemistry and several international written standards (norms and guides) related to knowledge management (CEN-CWA 14924:2004), social responsibility (ISO 26000:2010), management of occupational health and safety (OHSAS 18001/2), environmental management (ISO 14001:2004), quality management systems (ISO 9001:2008) and requirements of the competence of testing and calibration laboratories (ISO 17025:2004). The intensity of this impact, based on a two-way influence, is quite different depending on the standard considered. In any case, a new and fruitful approach to Analytical Chemistry based on these relationships can be derived. Copyright © 2013 Elsevier B.V. All rights reserved.

  20. Industrial wireless sensor networks applications, protocols, and standards

    CERN Document Server

    Güngör, V Çagri

    2013-01-01

    The collaborative nature of industrial wireless sensor networks (IWSNs) brings several advantages over traditional wired industrial monitoring and control systems, including self-organization, rapid deployment, flexibility, and inherent intelligent processing. In this regard, IWSNs play a vital role in creating more reliable, efficient, and productive industrial systems, thus improving companies' competitiveness in the marketplace. Industrial Wireless Sensor Networks: Applications, Protocols, and Standards examines the current state of the art in industrial wireless sensor networks and outline

  1. Validating Analytical Protocols to Determine Selected Pesticides and PCBs Using Routine Samples

    Directory of Open Access Journals (Sweden)

    Oscar Pindado Jiménez

    2017-01-01

    This study aims at providing recommendations concerning the validation of analytical protocols by using routine samples. It is intended to provide a case study on how to validate analytical methods in different environmental matrices. In order to analyze the selected compounds (pesticides and polychlorinated biphenyls) in two different environmental matrices, the current work has performed and validated two analytical procedures by GC-MS. A description is given of the validation of the two protocols by the analysis of more than 30 samples of water and sediments collected over nine months. The present work also estimates the uncertainty associated with both analytical protocols. In detail, the uncertainty for the water samples was estimated through a conventional approach; for the sediment matrices, the estimation of proportional/constant bias is also included because of their inhomogeneity. Results for the sediment matrix are reliable, showing 25–35% analytical variability associated with intermediate conditions. The analytical methodology for the water matrix determines the selected compounds with acceptable recoveries, and the combined uncertainty ranges between 20 and 30%. Analysis of routine samples is rarely used to assess the trueness of novel analytical methods, and until now this approach had not been applied to organochlorine compounds in environmental matrices.
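    A combined uncertainty in the 20-30% range is typically obtained by root-sum-of-squares combination of the relative standard uncertainty components, in the GUM style. A sketch with hypothetical component values (the component names and magnitudes are illustrative, not from the study):

```python
import math

# Hypothetical relative standard uncertainties for one pesticide in water.
components = {
    "intermediate_precision": 0.12,
    "recovery_bias": 0.15,
    "calibration": 0.08,
}

# Combined standard uncertainty: root sum of squares of the components.
u_combined = math.sqrt(sum(u * u for u in components.values()))
u_expanded = 2 * u_combined  # expanded uncertainty, coverage factor k = 2

print(round(u_combined, 3), round(u_expanded, 3))  # 0.208 0.416
```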

  2. A Standardized and Reproducible Urine Preparation Protocol for Cancer Biomarkers Discovery

    Directory of Open Access Journals (Sweden)

    Julia Beretov

    2014-01-01

    A suitable and standardized protein purification technique is essential to maintain consistency and to allow data comparison between proteomic studies for urine biomarker discovery. Ultimately, efforts should be made to standardize urine preparation protocols. The aim of this study was to develop an optimal analytical protocol to achieve maximal protein yield and to ensure that this method was applicable to examine urine protein patterns that distinguish disease and disease-free states. In this pilot study, we compared seven different urine sample preparation methods to remove salts, and to precipitate and isolate urinary proteins. Sodium dodecyl sulfate-polyacrylamide gel electrophoresis (SDS-PAGE) profiles showed that the sequential preparation of urinary proteins by combining acetone and trichloroacetic acid (TCA) alongside high speed centrifugation (HSC) provided the best separation, and retained the most urinary proteins. Therefore, this approach is the preferred method for all further urine protein analysis.

  3. Standardized food images: A photographing protocol and image database.

    Science.gov (United States)

    Charbonnier, Lisette; van Meer, Floor; van der Laan, Laura N; Viergever, Max A; Smeets, Paul A M

    2016-01-01

    The regulation of food intake has gained much research interest because of the current obesity epidemic. For research purposes, food images are a good and convenient alternative for real food because many dietary decisions are made based on the sight of foods. Food pictures are assumed to elicit anticipatory responses similar to real foods because of learned associations between visual food characteristics and post-ingestive consequences. In contemporary food science, a wide variety of images are used which introduces between-study variability and hampers comparison and meta-analysis of results. Therefore, we created an easy-to-use photographing protocol which enables researchers to generate high resolution food images appropriate for their study objective and population. In addition, we provide a high quality standardized picture set which was characterized in seven European countries. With the use of this photographing protocol a large number of food images were created. Of these images, 80 were selected based on their recognizability in Scotland, Greece and The Netherlands. We collected image characteristics such as liking, perceived calories and/or perceived healthiness ratings from 449 adults and 191 children. The majority of the foods were recognized and liked at all sites. The differences in liking ratings, perceived calories and perceived healthiness between sites were minimal. Furthermore, perceived caloric content and healthiness ratings correlated strongly (r ≥ 0.8) with actual caloric content in both adults and children. The photographing protocol as well as the images and the data are freely available for research use on http://nutritionalneuroscience.eu/. By providing the research community with standardized images and the tools to create their own, comparability between studies will be improved and a head-start is made for a world-wide standardized food image database. Copyright © 2015 Elsevier Ltd. All rights reserved.
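    The strong agreement reported above (r >= 0.8 between perceived and actual caloric content) is a Pearson correlation. A self-contained sketch with hypothetical food items and ratings (values illustrative, not from the database):

```python
import math

def pearson(x, y):
    """Pearson correlation coefficient of two equal-length sequences."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = math.sqrt(sum((a - mx) ** 2 for a in x))
    sy = math.sqrt(sum((b - my) ** 2 for b in y))
    return cov / (sx * sy)

actual_kcal = [52, 95, 250, 540, 30]      # hypothetical foods, actual kcal
perceived_kcal = [60, 110, 220, 500, 45]  # hypothetical mean perceived kcal

r = pearson(actual_kcal, perceived_kcal)
```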

  4. Estimations of isoprenoid emission capacity from enclosure studies: measurements, data processing, quality and standardized measurement protocols

    Directory of Open Access Journals (Sweden)

    Ü. Niinemets

    2011-08-01

    The capacity for volatile isoprenoid production under standardized environmental conditions at a certain time (ES, the emission factor) is a key characteristic in constructing isoprenoid emission inventories. However, there is large variation in published ES estimates for any given species partly driven by dynamic modifications in ES due to acclimation and stress responses. Here we review additional sources of variation in ES estimates that are due to measurement and analytical techniques and calculation and averaging procedures, and demonstrate that estimations of ES critically depend on applied experimental protocols and on data processing and reporting. A great variety of experimental setups has been used in the past, contributing to study-to-study variations in ES estimates. We suggest that past experimental data should be distributed into broad quality classes depending on whether the data can or cannot be considered quantitative based on rigorous experimental standards. Apart from analytical issues, the accuracy of ES values is strongly driven by extrapolation and integration errors introduced during data processing. Additional sources of error, especially in meta-database construction, can further arise from inconsistent use of units and expression bases of ES. We propose a standardized experimental protocol for BVOC estimations and highlight basic meta-information that we strongly recommend to report with any ES measurement. We conclude that standardization of experimental and calculation protocols and critical examination of past reports is essential for development of accurate emission factor databases.

  5. Standards-Based Wireless Sensor Networking Protocols for Spaceflight Applications

    Science.gov (United States)

    Wagner, Raymond S.

    2010-01-01

    Wireless sensor networks (WSNs) have the capacity to revolutionize data gathering in both spaceflight and terrestrial applications. WSNs provide a huge advantage over traditional, wired instrumentation since they do not require wiring trunks to connect sensors to a central hub. This allows for easy sensor installation in hard-to-reach locations, easy expansion of the number of sensors or sensing modalities, and reduction in both system cost and weight. While this technology offers unprecedented flexibility and adaptability, implementing it in practice is not without its difficulties. Recent advances in standards-based WSN protocols for industrial control applications have gone a long way toward solving many of the challenges facing practical WSN deployments. In this paper, we overview two of the more promising candidates - WirelessHART from the HART Communication Foundation and ISA100.11a from the International Society of Automation - and present the architecture for a new standards-based sensor node for networking and applications research.

  6. Pelvic Muscle Rehabilitation: A Standardized Protocol for Pelvic Floor Dysfunction

    Directory of Open Access Journals (Sweden)

    Rodrigo Pedraza

    2014-01-01

    Full Text Available Introduction. Pelvic floor dysfunction syndromes present with voiding, sexual, and anorectal disturbances, which may be associated with one another, resulting in complex presentation. Thus, an integrated diagnosis and management approach may be required. Pelvic muscle rehabilitation (PMR) is a noninvasive modality involving cognitive reeducation, modification, and retraining of the pelvic floor and associated musculature. We describe our standardized PMR protocol for the management of pelvic floor dysfunction syndromes. Pelvic Muscle Rehabilitation Program. The diagnostic assessment includes electromyography and manometry analyzed in 4 phases: (1) initial baseline phase; (2) rapid contraction phase; (3) tonic contraction and endurance phase; and (4) late baseline phase. This evaluation is performed at the onset of every session. PMR management consists of 6 possible therapeutic modalities, employed depending on the diagnostic evaluation: (1) down-training; (2) accessory muscle isolation; (3) discrimination training; (4) muscle strengthening; (5) endurance training; and (6) electrical stimulation. Eight to ten sessions are performed at one-week intervals with integration of home exercises and lifestyle modifications. Conclusions. The PMR protocol offers a standardized approach to diagnose and manage pelvic floor dysfunction syndromes with potential advantages over traditional biofeedback, involving additional interventions and a continuous pelvic floor assessment with management modifications over the clinical course.

  7. Navigating the Benford Labyrinth: A big-data analytic protocol illustrated using the academic library context

    Directory of Open Access Journals (Sweden)

    Michael Halperin

    2016-03-01

    Full Text Available Objective: Big Data Analytics is a panoply of techniques the principal intention of which is to ferret out dimensions or factors from certain data streamed or available over the WWW. We offer a subset or “second” stage protocol of Big Data Analytics (BDA) that uses these dimensional datasets as benchmarks for profiling related data. We call this Specific Context Benchmarking (SCB). Method: In effecting this benchmarking objective, we have elected to use a Digital Frequency Profiling (DFP) technique based upon the work of Newcomb and Benford, who developed a profiling benchmark based upon the Log10 function. We illustrate the various stages of the SCB protocol using the data produced by the Academic Research Libraries to enhance insights regarding the details of the operational benchmarking context and so offer generalizations needed to encourage adoption of SCB across other functional domains. Results: An illustration of the SCB protocol is offered using the recently developed Benford Practical Profile as the Conformity Benchmarking Measure. ShareWare: We have developed a Decision Support System called SpecificContextAnalytics (SCA:DSS) to create the various information sets presented in this paper. The SCA:DSS, programmed in Excel VBA, is available from the corresponding author as a free download without restriction on its use. Conclusions: We note that SCB effected using the DFPs is an enhancement to, not a replacement for, the usual statistical and analytic techniques, and fits very well in the BDA milieu.
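
    The Newcomb-Benford profiling underlying the DFP step rests on the expected first-digit distribution P(d) = log10(1 + 1/d). The sketch below computes a digital frequency profile and a simple conformity measure; it is a generic illustration, not the authors' SCA:DSS implementation.

```python
import math
from collections import Counter

def benford_expected():
    """Expected first-digit proportions under the Newcomb-Benford law."""
    return {d: math.log10(1 + 1 / d) for d in range(1, 10)}

def first_digit_profile(values):
    """Observed first-digit proportions of a stream of nonzero numbers."""
    digits = [int(str(abs(v)).lstrip('0.')[0]) for v in values if v]
    counts = Counter(digits)
    n = len(digits)
    return {d: counts.get(d, 0) / n for d in range(1, 10)}

def mad(observed, expected):
    """Mean absolute deviation: a simple digit-by-digit conformity measure."""
    return sum(abs(observed[d] - expected[d]) for d in range(1, 10)) / 9
```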

  8. Standardized protocol for artery-only fingertip replantation.

    Science.gov (United States)

    Buntic, Rudolf F; Brooks, Darrell

    2010-09-01

    Artery-only fingertip replantation can be reliable if low-resistance flow through the replant is maintained until venous outflow is restored naturally. Injuring the tip of the replant to promote ongoing bleeding, augmented with anticoagulation, usually accomplishes this; however, such management results in prolonged hospitalization. In this study, we analyzed the outcomes of artery-only fingertip replantation using a standardized postoperative protocol consisting of dextran-40, heparin, and leech therapy. Between 2001 and 2008, we performed 19 artery-only fingertip replants for 17 patients. All patients had the replanted nail plate removed and received intravenous dextran-40, heparin, and aspirin to promote fingertip bleeding and vascular outflow. Anticoagulation was titrated to promote a controlled bleed until physiologic venous outflow was restored by neovascularization. We used medicinal leeches and mechanical heparin scrubbing for acute decongestion. By postoperative day 6, bleeding was no longer promoted. We initiated fluorescent dye perfusion studies to assess circulatory competence and to direct further anticoagulant intervention if necessary. The absence of bleeding, associated with an initial rise followed by an appropriate fall in fluorescent dye concentration, would trigger a weaning of anticoagulation. All 19 replants survived. The average length of hospital stay was 9 days (range, 7-17 d). Eleven patients received blood transfusions. The average transfusion was 1.8 units (range, 0-9 units). All patients were happy with the decision to replant and with the cosmetic result. A protocol that promotes temporary, controlled bleeding from the fingertip is protective of artery-only replants distal to the distal interphalangeal joint until physiologic venous outflow is restored. The protocol described is both safe and reliable.
The patient should be informed that such replant attempts may result in the need for transfusions and extended hospital stays, factors that

  9. Two RFID standard-based security protocols for healthcare environments.

    Science.gov (United States)

    Picazo-Sanchez, Pablo; Bagheri, Nasour; Peris-Lopez, Pedro; Tapiador, Juan E

    2013-10-01

    Radio Frequency Identification (RFID) systems are widely used in access control, transportation, real-time inventory and asset management, automated payment systems, etc. Nevertheless, the use of this technology is almost unexplored in healthcare environments, where potential applications include patient monitoring, asset traceability and drug administration systems, to mention just a few. RFID technology can offer more intelligent systems and applications, but privacy and security issues have to be addressed before its adoption. This is even more critical in healthcare applications, where very sensitive information is at stake and patient safety is paramount. Wu et al. (J. Med. Syst. 37:19, 43) recently proposed a new RFID authentication protocol for healthcare environments. In this paper we show that this protocol puts the location privacy of tag holders at risk, which is a matter of grave concern and ruins the security of this proposal. To facilitate the implementation of secure RFID-based solutions in the medical sector, we suggest two new applications (authentication and secure messaging) and propose solutions that, in contrast to previous proposals in this field, are fully based on ISO Standards and NIST Security Recommendations.

  10. Standardization of fertilization protocols for the European eel, Anguilla anguilla

    DEFF Research Database (Denmark)

    Butts, Ian; Sørensen, Sune Riis; Politis, Sebastian Nikitas

    2014-01-01

    Standardization of artificial fertilization protocols for the European eel, Anguilla anguilla, is a prerequisite for optimizing the use of available gametes in hatchery facilities and for conserving sperm from high-quality males, which is either cryopreserved or in living gene banks. The objectives...... of this research were to provide a rapid, accurate and precise method to quantify sperm density by examining the relationship between sperm density and absorbance by use of a spectrophotometer, to determine the optimal number of sperm required to fertilize eggs in a controlled setting, and to explore how long eggs...... are receptive to fertilization post-stripping. Mean sperm density and absorbance at 350 nm were 1.54e+10±4.95e+9 sperm/mL and 1.91±0.22, respectively. Regression analysis demonstrated a highly significant positive relationship between sperm density and absorbance using a spectrophotometer at 350 nm (R2=0.94, p
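
    The density-absorbance relationship described here is an ordinary least-squares calibration. A minimal sketch follows; the calibration pairs are hypothetical, not the study's measurements.

```python
def fit_line(xs, ys):
    """Ordinary least-squares fit of y = a + b*x; returns (a, b, r_squared)."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    sxx = sum((x - mx) ** 2 for x in xs)
    sxy = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    syy = sum((y - my) ** 2 for y in ys)
    b = sxy / sxx
    a = my - b * mx
    return a, b, sxy ** 2 / (sxx * syy)

# Hypothetical calibration: absorbance at 350 nm vs. counted density (sperm/mL).
absorbance = [0.5, 1.0, 1.5, 2.0, 2.5]
density = [4.1e9, 8.0e9, 1.22e10, 1.59e10, 2.01e10]
intercept, slope, r2 = fit_line(absorbance, density)

def density_from_absorbance(a350):
    """Predict sperm density from a new absorbance reading."""
    return intercept + slope * a350
```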

  11. Standard of Nuclear Medicine Protocols. The 3rd revision, 1994

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    1994-09-01

    This document is designed to provide the 'Standard of Nuclear Medicine Protocols' revised in 1994. The present revision is aimed at providing new imaging agents (e.g., Tc-99m-HMPAO, Tc-99m-ECD, Tc-99m-technegas, Tc-99m-MIBI, Tc-99m-tetrofosmin, I-123-BMIPP, and I-123-MIBG) and at introducing new applications of conventional imaging agents. The descriptions are given under two sections: (I) standardization of imaging and (II) protocols for imaging of various organs. Section I covers static imaging, dynamic imaging, radioisotope angiography, SPECT, and collimators. In Section II, various imaging procedures are described as follows: (1) brain imaging, (2) brain perfusion imaging, (3) cistern imaging, (4) thyroid imaging, (5) imaging for metastatic foci from thyroid carcinoma, (6) parathyroid imaging, (7) lung perfusion imaging, (8) lung inhalation imaging, (9) lung ventilation imaging, (10) myocardial imaging, (11) myocardial metabolism imaging, (12) myocardial sympathetic functional imaging, (13) imaging for acute myocardial infarction, (14) blood pool imaging of the cardiac aorta, (15) RI venography, (16) hepato-splenic imaging, (17) hepato-biliary imaging, (18) imaging for the hepatic receptor, (19) hepatic RI angiography, (20) measurement of the portal-systemic shunt, (21) splenic imaging, (22) renal imaging, (23) vesicoureteral reflux imaging, (24) adrenal cortex imaging, (25) adrenal medulla imaging, (26) bone imaging, (27) bone joint imaging, (28) bone marrow imaging, (29) tumor imaging, (30) inflammatory imaging, (31) indium-labeled blood platelet imaging, (32) salivary imaging, (33) imaging of the Meckel's diverticulum and ectopic gastric mucosa, (34) lymph node imaging, (35) lymphatic imaging, (36) imaging of gastrointestinal hemorrhage, (37) testicular imaging, (38) abdominal imaging, and (39) gastrointestinal movement imaging. (N.K.).

  12. 77 FR 24427 - Standards for Business Practices and Communication Protocols for Public Utilities

    Science.gov (United States)

    2012-04-24

    ...] Standards for Business Practices and Communication Protocols for Public Utilities AGENCY: Federal Energy... Standards for Business Practices and Communication Protocols for Public Utilities, Order No. 676, FERC Stats... Business Practices and Communication Protocols for Public Utilities, Order No. 676-F, FERC Stats. & Regs...

  13. Development of Standardized Material Testing Protocols for Prosthetic Liners.

    Science.gov (United States)

    Cagle, John C; Reinhall, Per G; Hafner, Brian J; Sanders, Joan E

    2017-04-01

    A set of protocols was created to characterize prosthetic liners across six clinically relevant material properties. Properties included compressive elasticity, shear elasticity, tensile elasticity, volumetric elasticity, coefficient of friction (CoF), and thermal conductivity. Eighteen prosthetic liners representing the diverse range of commercial products were evaluated to create test procedures that maximized repeatability, minimized error, and provided clinically meaningful results. Shear and tensile elasticity test designs were augmented with finite element analysis (FEA) to optimize specimen geometries. Results showed that because of the wide range of available liner products, the compressive elasticity and tensile elasticity tests required two test maxima; samples were tested until they met either a strain-based or a stress-based maximum, whichever was reached first. The shear and tensile elasticity tests required that no cyclic conditioning be conducted because of limited endurance of the mounting adhesive with some liner materials. The coefficient of friction test was based on dynamic coefficient of friction, as it proved to be a more reliable measurement than static coefficient of friction. The volumetric elasticity test required that air be released beneath samples in the test chamber before testing. The thermal conductivity test best reflected the clinical environment when thermal grease was omitted and when liner samples were placed under pressure consistent with load bearing conditions. The developed procedures provide a standardized approach for evaluating liner products in the prosthetics industry. Test results can be used to improve clinical selection of liners for individual patients and guide development of new liner products.
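
    The dual-maximum rule for the compressive and tensile tests (stop at a strain-based or a stress-based limit, whichever is reached first) can be sketched as follows. The limit values and the secant-modulus summary below are illustrative assumptions, not the published protocol's parameters.

```python
def truncate_at_maxima(strain, stress, strain_max=0.45, stress_max=350.0):
    """Keep stress-strain points until either the strain-based or the
    stress-based maximum is reached, whichever comes first.
    Limits are illustrative (strain fraction, kPa)."""
    kept = []
    for e, s in zip(strain, stress):
        if e > strain_max or s > stress_max:
            break
        kept.append((e, s))
    return kept

def secant_modulus(points):
    """Secant compressive modulus (kPa) taken at the last retained point."""
    e, s = points[-1]
    return s / e
```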

  14. Data distribution architecture based on standard real time protocol

    International Nuclear Information System (INIS)

    Castro, R.; Vega, J.; Pereira, A.; Portas, A.

    2009-01-01

    Data distribution architecture (DDAR) has been designed to conform to new requirements, taking into account the type of data that will be generated by experiments in the International Thermonuclear Experimental Reactor (ITER). The main goal of this architecture is to implement a system that is able to manage online all data being generated by an experiment, supporting its distribution for processing, storing, analysing or visualizing. The first objective is to have a distribution architecture that supports long-pulse experiments (even hours). The described system is able to distribute, using the real-time protocol (RTP), stored data or live data generated while the experiment is running. It enables researchers to access data online instead of waiting for the end of the experiment. Another important objective is scalability, so the presented architecture can easily grow based on actual needs, simplifying estimation and design tasks. A third important objective is security. In this sense, the architecture is based on standards, so complete security mechanisms can be applied, from secure transmission solutions to elaborate access control policies, and it is fully compatible with multi-organization federation systems such as PAPI or Shibboleth.
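
    The architecture builds on the standard real-time protocol (RTP, RFC 3550). As a reference point, the fixed 12-byte RTP header can be packed as shown below; payload type 96 is an arbitrary dynamic type chosen only for illustration.

```python
import struct

def rtp_header(seq, timestamp, ssrc, payload_type=96, marker=0):
    """Pack the 12-byte fixed RTP header of RFC 3550
    (version 2, no padding, no extension, no CSRC entries)."""
    v_p_x_cc = 2 << 6                                 # version=2, P=0, X=0, CC=0
    m_pt = ((marker & 0x1) << 7) | (payload_type & 0x7F)
    return struct.pack('!BBHII', v_p_x_cc, m_pt,
                       seq & 0xFFFF, timestamp & 0xFFFFFFFF, ssrc & 0xFFFFFFFF)
```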

  15. Pilot studies for the North American Soil Geochemical Landscapes Project - Site selection, sampling protocols, analytical methods, and quality control protocols

    Science.gov (United States)

    Smith, D.B.; Woodruff, L.G.; O'Leary, R. M.; Cannon, W.F.; Garrett, R.G.; Kilburn, J.E.; Goldhaber, M.B.

    2009-01-01

    In 2004, the US Geological Survey (USGS) and the Geological Survey of Canada sampled and chemically analyzed soils along two transects across Canada and the USA in preparation for a planned soil geochemical survey of North America. This effort was a pilot study to test and refine sampling protocols, analytical methods, quality control protocols, and field logistics for the continental survey. A total of 220 sample sites were selected at approximately 40-km intervals along the two transects. The ideal sampling protocol at each site called for a sample from a depth of 0-5 cm and a composite of each of the O, A, and C horizons. Samples were sieved to a standard size fraction and analyzed for a suite of major and trace elements following a multi-acid digestion that included HF. Separate methods were used for Hg, Se, total C, and carbonate-C on this same size fraction. Only Ag, In, and Te had a large percentage of concentrations below the detection limit. Quality control (QC) of the analyses was monitored at three levels: the laboratory performing the analysis, the USGS QC officer, and the principal investigator for the study. This level of review resulted in an average of one QC sample for every 20 field samples, which proved to be minimally adequate for such a large-scale survey. Additional QC samples should be added to monitor within-batch quality, to the extent that no more than 10 samples are analyzed between QC samples. Only Cr (77%), Y (82%), and Sb (80%) fell outside the acceptable limits of accuracy (% recovery between 85 and 115%), likely because of residence in mineral phases resistant to the acid digestion. A separate sample of 0-5-cm material was collected at each site for determination of organic compounds. A subset of 73 of these samples was analyzed for a suite of 19 organochlorine pesticides by gas chromatography. Only three of these samples had detectable pesticide concentrations. A separate sample of A-horizon soil was collected for microbial characterization by phospholipid fatty acid analysis (PLFA), soil enzyme assays, and determination of selected human and agricultural pathogens.
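
    The accuracy criterion (percent recovery between 85 and 115%) and the recommended QC frequency (no more than 10 field samples between QC samples) translate directly into simple checks. A sketch, with hypothetical values:

```python
def recovery_ok(measured, certified, low=85.0, high=115.0):
    """Check whether a QC result's percent recovery falls inside the
    accepted limits; returns (is_acceptable, percent_recovery)."""
    pct = 100.0 * measured / certified
    return low <= pct <= high, pct

def qc_positions(n_samples, batch=10):
    """Positions at which to insert a QC sample so that no more than
    `batch` field samples are analyzed between consecutive QC samples."""
    return list(range(batch, n_samples + 1, batch))
```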

  16. An analytical protocol for the determination of total mercury concentrations in solid peat samples

    DEFF Research Database (Denmark)

    Roos-Barraclough, F; Givelet, N; Martinez-Cortizas, A

    2002-01-01

    variation on measured Hg concentrations are investigated. Slight increases in mercury concentrations were observed in samples dried at room temperature and at 30 degrees C (6.7 and 2.48 ng kg(-1) h(-1), respectively), and slight decreases were observed in samples dried at 60, 90 and 105 degrees C (2.36, 3...... AMA 254, capable of determining mercury concentrations in solid samples. Finally, an analytical protocol for the determination of Hg concentrations in solid peat samples is proposed. This method allows correction for variation in factors such as vegetation type, bulk density, water content and Hg...

  17. 75 FR 20901 - Standards for Business Practices and Communication Protocols for Public Utilities

    Science.gov (United States)

    2010-04-22

    ...; Order No. 676-F] Standards for Business Practices and Communication Protocols for Public Utilities... practices and electronic communications for public utilities)\\1\\ to incorporate by reference business... members. \\4\\ See Standards for Business Practices and Communication Protocols for Public Utilities, Order...

  18. Analytical approach to cross-layer protocol optimization in wireless sensor networks

    Science.gov (United States)

    Hortos, William S.

    2008-04-01

    In the distributed operations of route discovery and maintenance, strong interaction occurs across mobile ad hoc network (MANET) protocol layers. Quality of service (QoS) requirements of multimedia service classes must be satisfied by the cross-layer protocol, along with minimization of the distributed power consumption at nodes and along routes to battery-limited energy constraints. In previous work by the author, cross-layer interactions in the MANET protocol are modeled in terms of a set of concatenated design parameters and associated resource levels by multivariate point processes (MVPPs). Determination of the "best" cross-layer design is carried out using the optimal control of martingale representations of the MVPPs. In contrast to the competitive interaction among nodes in a MANET for multimedia services using limited resources, the interaction among the nodes of a wireless sensor network (WSN) is distributed and collaborative, based on the processing of data from a variety of sensors at nodes to satisfy common mission objectives. Sensor data originates at the nodes at the periphery of the WSN, is successively transported to other nodes for aggregation based on information-theoretic measures of correlation and ultimately sent as information to one or more destination (decision) nodes. The "multimedia services" in the MANET model are replaced by multiple types of sensors, e.g., audio, seismic, imaging, thermal, etc., at the nodes; the QoS metrics associated with MANETs become those associated with the quality of fused information flow, i.e., throughput, delay, packet error rate, data correlation, etc. Significantly, the essential analytical approach to MANET cross-layer optimization, now based on the MVPPs for discrete random events occurring in the WSN, can be applied to develop the stochastic characteristics and optimality conditions for cross-layer designs of sensor network protocols. Functional dependencies of WSN performance metrics are described in

  19. The evaluation of an analytical protocol for the determination of substances in waste for hazard classification.

    Science.gov (United States)

    Hennebert, Pierre; Papin, Arnaud; Padox, Jean-Marie; Hasebrouck, Benoît

    2013-07-01

    The classification of waste as hazardous could soon be assessed in Europe largely using the hazard properties of its constituents, according to the Classification, Labelling and Packaging (CLP) regulation. Comprehensive knowledge of the component constituents of a given waste will therefore be necessary. An analytical protocol for determining waste composition is proposed, which includes using inductively coupled plasma (ICP) screening methods to identify major elements and gas chromatography/mass spectrometry (GC-MS) screening techniques to measure organic compounds. The method includes a gross or indicator measure of 'pools' of higher-molecular-weight organic substances that are taken to be less bioactive and less hazardous, and of unresolved 'mass' during the chromatography of volatile and semi-volatile compounds. The concentrations of some elements and specific compounds that are linked to specific hazard properties and are subject to specific regulation (examples include heavy metals, chromium(VI), cyanides, organo-halogens, and PCBs) are determined by classical quantitative analysis. To check the consistency of the analysis, the sum of the concentrations (including unresolved 'pools') should give a mass balance between 90% and 110%. Thirty-two laboratory samples comprising different industrial wastes (liquids and solids) were tested by two routine service laboratories, giving circa 7000 parameter results. Despite discrepancies in some parameters, a satisfactory sum of estimated or measured concentrations (analytical balance) of 90% was reached for 20 samples (63% of the overall total) during this first test exercise, with identified reasons for most of the unsatisfactory results.
Regular use of this protocol (which is now included in the French legislation) has enabled service laboratories to reach a 90% mass balance for nearly all the solid samples tested, and most of liquid samples (difficulties were caused in some samples from polymers in solution and
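
    The consistency check described above (summed constituent concentrations, including unresolved 'pools', falling between 90% and 110% of the sample mass) can be sketched as:

```python
def mass_balance(concentrations_mg_per_kg):
    """Sum measured/estimated constituent concentrations (mg/kg dry matter,
    including unresolved 'pools') and express them as percent of the sample."""
    return sum(concentrations_mg_per_kg) / 1e6 * 100.0   # 1e6 mg = 1 kg

def balance_ok(concentrations_mg_per_kg, low=90.0, high=110.0):
    """True when the analytical mass balance falls inside the accepted window."""
    return low <= mass_balance(concentrations_mg_per_kg) <= high
```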

  20. [Neonatal circumcision with local anesthesia. Results of a standardized protocol].

    Science.gov (United States)

    Ovalle, Alejandra; López, Pedro-Jose; Guelfand, Miguel; Zubieta, Ricardo

    2016-01-01

    Neonatal circumcision is a common procedure in the US and other countries, with low rates of complications in trained hands. However, it has only recently been incorporated into the clinical environment in Chile. Our goal was to establish a local standardised protocol for neonatal circumcision under local anaesthesia, and to evaluate the results and possible complications. A standardised prospective protocol was used on patients who underwent neonatal circumcision. The inclusion criteria restricted the procedure to selected neonates; the technique consisted of local anaesthesia with penile block, attrition of the redundant prepuce and mucosa with a Mogen® clamp, and section with a scalpel. The protocol was used and evaluated from November 2005 to October 2014 by a paediatric surgeon and/or paediatric urologist trained in the technique. Complications and outcomes until final discharge were analysed. The protocol was applied to 108 patients over a 9-year period. The mean age at the procedure was 9 days (1-52). One patient (0.9%) had immediate bleeding, requiring further surgery. All patients were discharged from further medical checks at 1 month, without any other complications. The procedure was performed by parental request in 100% of the cases, and always for sociocultural reasons. Neonatal circumcision under local anaesthesia is a simple procedure with excellent results in selected patients and no major complications. With proper training, and by adapting the initial protocol, it can be performed on an outpatient basis, without exposing neonates to the risks of general anaesthesia. Copyright © 2015 Sociedad Chilena de Pediatría. Published by Elsevier España, S.L.U. All rights reserved.

  1. Extended Standard Hough Transform for Analytical Line Recognition

    OpenAIRE

    Abdoulaye SERE; Oumarou SIE; Eric ANDRES

    2013-01-01

    This paper presents a new method which extends the Standard Hough Transform for the recognition of naive or standard lines in a noisy picture. The proposed idea preserves the strengths of the Standard Hough Transform, particularly the limited size of the parameter space and the recognition of vertical lines. The dual of a segment and the dual of a pixel have been proposed to lead to a new definition of the preimage. Many alternatives of approximation could be established for the sinusoid curves of t...
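
    For reference, the Standard Hough Transform that the paper extends votes each foreground pixel into a discretized (theta, rho) space via rho = x·cos(theta) + y·sin(theta). A minimal sketch (generic, not the paper's extension):

```python
import math

def hough_lines(points, width, height, n_theta=180, n_rho=None):
    """Standard Hough Transform: accumulate votes in (theta, rho) space for
    each foreground pixel; rho = x*cos(theta) + y*sin(theta)."""
    diag = math.hypot(width, height)
    if n_rho is None:
        n_rho = 2 * int(diag) + 1
    acc = [[0] * n_rho for _ in range(n_theta)]
    for x, y in points:
        for t in range(n_theta):
            theta = math.pi * t / n_theta
            rho = x * math.cos(theta) + y * math.sin(theta)
            r = int(round(rho + diag))       # shift so the index is non-negative
            if 0 <= r < n_rho:
                acc[t][r] += 1
    return acc

def best_line(acc):
    """Return (theta_index, rho_index, votes) of the strongest accumulator cell."""
    t_best, r_best, v_best = 0, 0, -1
    for t, row in enumerate(acc):
        for r, v in enumerate(row):
            if v > v_best:
                t_best, r_best, v_best = t, r, v
    return t_best, r_best, v_best
```

A vertical line (a weak point of slope-intercept parameterizations, handled naturally here) maps to theta = 0 with rho equal to its x coordinate.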

  2. Preparation of standard hair material and development of analytical methodology

    International Nuclear Information System (INIS)

    Gangadharan, S.; Walvekar, A.P.; Ali, M.M.; Thantry, S.S.; Verma, R.; Devi, R.

    1995-01-01

    The concept of using human scalp hair as a first-level indicator of exposure to inorganic pollutants was established by us earlier. Efforts towards the preparation of a hair reference material are described. The analytical approaches for the determination of total mercury by cold vapour AAS and INAA, and of methylmercury by extraction combined with gas chromatography coupled to an ECD, are summarized, with results for some of the samples analyzed, including the stability of values over time in storage. (author)

  3. A Standard Mutual Authentication Protocol for Cloud Computing Based Health Care System.

    Science.gov (United States)

    Mohit, Prerna; Amin, Ruhul; Karati, Arijit; Biswas, G P; Khan, Muhammad Khurram

    2017-04-01

    Telecare Medical Information System (TMIS) supports a standard platform for the patient to obtain necessary medical treatment from doctor(s) via Internet communication. Security protection is important for the medical records (data) of patients because they contain very sensitive information. Besides, patient anonymity is another important property, which must be protected. Most recently, Chiou et al. suggested an authentication protocol for TMIS utilizing the concept of a cloud environment. They claimed that their protocol is patient-anonymous and well protected. We reviewed their protocol and found that it fails to provide patient anonymity. Further, the same protocol is not protected against stolen mobile device attacks. To improve the security level and reduce complexity, we design a lightweight authentication protocol for the same environment. Our security analysis shows resilience against all the considered attacks. The performance of our protocol is comparable with that of related previous research.
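
    As background only (and not the authors' construction), mutual authentication of this kind typically reduces to a nonce-based challenge-response over a shared secret. A minimal HMAC sketch, with an illustrative message layout:

```python
import hashlib
import hmac
import os

def make_challenge():
    """Verifier sends a fresh random nonce as the challenge."""
    return os.urandom(16)

def respond(key, challenge):
    """Prover returns its own nonce plus a MAC over both nonces,
    proving possession of the pre-shared key."""
    nonce = os.urandom(16)
    tag = hmac.new(key, challenge + nonce, hashlib.sha256).digest()
    return nonce, tag

def verify(key, challenge, nonce, tag):
    """Verifier recomputes the MAC; compare_digest avoids timing leaks."""
    expected = hmac.new(key, challenge + nonce, hashlib.sha256).digest()
    return hmac.compare_digest(expected, tag)
```

Fresh nonces on both sides are what defeat replay; a real TMIS protocol additionally has to hide the patient identity inside the exchange to preserve anonymity.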

  4. Protocols for the analytical characterization of therapeutic monoclonal antibodies. II - Enzymatic and chemical sample preparation.

    Science.gov (United States)

    Bobaly, Balazs; D'Atri, Valentina; Goyon, Alexandre; Colas, Olivier; Beck, Alain; Fekete, Szabolcs; Guillarme, Davy

    2017-08-15

    The analytical characterization of therapeutic monoclonal antibodies and related proteins usually incorporates various sample preparation methodologies. Indeed, quantitative and qualitative information can be enhanced by simplifying the sample, thanks to the removal of sources of heterogeneity (e.g. N-glycans) and/or by decreasing the molecular size of the tested protein by enzymatic or chemical fragmentation. These approaches make the sample more suitable for chromatographic and mass spectrometric analysis. Structural elucidation and quality control (QC) analysis of biopharmaceuticals are usually performed at the intact, subunit and peptide levels. In this paper, general sample preparation approaches used to attain peptide-, subunit- and glycan-level analysis are overviewed. Protocols are described to perform tryptic proteolysis, IdeS and papain digestion, and reduction, as well as deglycosylation by PNGase F and EndoS2 enzymes. Both historical and modern sample preparation methods were compared and evaluated using rituximab and trastuzumab, two reference therapeutic mAb products approved by the Food and Drug Administration (FDA) and the European Medicines Agency (EMA). The described protocols may help analysts to develop sample preparation methods in the field of therapeutic protein analysis. Copyright © 2017 Elsevier B.V. All rights reserved.

  5. Telemetry Transmission over Internet Protocol (TMoIP) Standard

    Science.gov (United States)

    2010-10-01

    Gigabit Interface Converter (GBIC)/Small Form-factor Pluggable (SFP) connector interfaces be included in TMoIP equipment that implements fiber optic... Description Protocol: a format for describing streaming media parameters. SEQ NUMBER Sequence Number (field name reference) SFP Small Form-factor... Ethernet over twisted pair at 1000 Mbit/sec 802.3ab Optional Notes: a. To provide user flexibility, it is recommended that support for GBIC/SFP

  6. Epidemiological cut-off values for Flavobacterium psychrophilum MIC data generated by a standard test protocol

    DEFF Research Database (Denmark)

    Smith, P.; Endris, R.; Kronvall, G.

    2016-01-01

    Epidemiological cut-off values were developed for application to antibiotic susceptibility data for Flavobacterium psychrophilum generated by standard CLSI test protocols. The MIC values for ten antibiotic agents against Flavobacterium psychrophilum were determined in two laboratories. For five a...

  7. Standardizing Physiologic Assessment Data to Enable Big Data Analytics.

    Science.gov (United States)

    Matney, Susan A; Settergren, Theresa Tess; Carrington, Jane M; Richesson, Rachel L; Sheide, Amy; Westra, Bonnie L

    2016-07-18

    Disparate data must be represented in a common format to enable comparison across multiple institutions and facilitate big data science. Nursing assessments represent a rich source of information. However, a lack of agreement regarding essential concepts and standardized terminology prevents their use for big data science in the current state. The purpose of this study was to align a minimum set of physiological nursing assessment data elements with national standardized coding systems. Six institutions shared their 100 most common electronic health record nursing assessment data elements. From these, a set of distinct elements was mapped to nationally recognized Logical Observation Identifiers Names and Codes (LOINC®) and Systematized Nomenclature of Medicine-Clinical Terms (SNOMED CT®) standards. We identified 137 observation names (55% new to LOINC) and 348 observation values (20% new to SNOMED CT) organized into 16 panels (72% new to LOINC). This reference set can support the exchange of nursing information, facilitate multi-site research, and provide a framework for nursing data analysis. © The Author(s) 2016.

  8. Shoulder muscle endurance: the development of a standardized and reliable protocol

    Directory of Open Access Journals (Sweden)

    Roy Jean-Sébastien

    2011-01-01

    Full Text Available Abstract Background Shoulder muscle fatigue has been proposed as a possible link to explain the association between repetitive arm use and the development of rotator cuff disorders. To our knowledge, no standardized clinical endurance protocol has been developed to evaluate the effects of muscle fatigue on shoulder function. Such a test could improve clinical examination of individuals with shoulder disorders. Therefore, the purpose of this study was to establish a reliable protocol for objective assessment of shoulder muscle endurance. Methods An endurance protocol was developed on a stationary dynamometer (Biodex System 3). The endurance protocol was performed in isotonic mode with the resistance set at 50% of each subject's peak torque as measured for shoulder external rotation (ER) and internal rotation (IR). Each subject performed 60 continuous repetitions of IR/ER rotation. The endurance protocol was performed by 36 healthy individuals on two separate occasions at least two days apart. Maximal isometric shoulder strength tests were performed before and after the fatigue protocol to evaluate the effects of the endurance protocol and its reliability. Paired t-tests were used to evaluate the reduction in shoulder strength due to the protocol, while intraclass correlation coefficients (ICC) and minimal detectable change (MDC) were used to evaluate its reliability. Results Maximal isometric strength was significantly decreased after the endurance protocol (P < 0.001), and the protocol showed good test-retest reliability (ICC > 0.84). Conclusions Changes in muscular performance observed during and after the muscular endurance protocol suggest that the protocol did result in muscular fatigue. Furthermore, this study established that the resultant effects of fatigue of the proposed isotonic protocol were reproducible over time. The protocol was performed without difficulty by all volunteers and took less than 10 minutes to perform, suggesting that it might be feasible for clinical practice. This protocol could be used to induce
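
    The reliability statistics named in this record, ICC and MDC, can be computed for a two-session test-retest design. The sketch below uses invented torque values and the one-way random-effects form ICC(1,1); the record does not say which ICC form the authors used, so this is an assumption for illustration:

```python
import math

# Illustrative (invented) test-retest data: peak torque (Nm) for n subjects
# measured in two sessions of the same endurance protocol.
session1 = [50.0, 62.0, 45.0, 70.0, 55.0, 66.0]
session2 = [52.0, 60.0, 47.0, 69.0, 57.0, 64.0]

def icc_1_1(trial_a, trial_b):
    """One-way random-effects ICC(1,1) for k = 2 repeated measurements."""
    n, k = len(trial_a), 2
    pairs = list(zip(trial_a, trial_b))
    subject_means = [(a + b) / k for a, b in pairs]
    grand_mean = sum(subject_means) / n
    # Between-subjects and within-subject mean squares
    msb = k * sum((m - grand_mean) ** 2 for m in subject_means) / (n - 1)
    msw = sum((x - m) ** 2 for (a, b), m in zip(pairs, subject_means)
              for x in (a, b)) / (n * (k - 1))
    return (msb - msw) / (msb + (k - 1) * msw)

icc = icc_1_1(session1, session2)

# Standard error of measurement and minimal detectable change at 95% confidence
all_values = session1 + session2
mean_all = sum(all_values) / len(all_values)
sd = math.sqrt(sum((x - mean_all) ** 2 for x in all_values)
               / (len(all_values) - 1))
sem = sd * math.sqrt(1 - icc)
mdc95 = 1.96 * math.sqrt(2) * sem

print(f"ICC(1,1) = {icc:.2f}, MDC95 = {mdc95:.1f} Nm")
```

    The MDC gives the smallest change in torque that exceeds measurement noise, which is what makes the protocol usable for tracking individual patients.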

  9. Microsystems for liquid-liquid extraction of radionuclides in the analytical protocols

    International Nuclear Information System (INIS)

    Helle, Gwendolyne

    2014-01-01

    Radiochemical analyses are necessary at numerous steps of nuclear waste management and environmental monitoring. An analytical protocol generally includes several chemical separation steps which are lengthy, manual, and complicated to implement because of their confinement in glove boxes and because of the hostile chemical and radiochemical media. It is therefore of great importance to propose innovative and robust solutions to automate these steps and to reduce the volumes of radioactive and chemical waste at the end of the analytical cycle. One solution is the miniaturization of the analyses through the use of lab-on-chip devices. The objective of this thesis work was to propose a rational approach to the design of separative microsystems for the liquid-liquid extraction of radionuclides. To achieve this, the hydrodynamic behavior as well as the extraction performance were investigated in one chip for three different chemical systems: Eu(III)-HNO3/DMDBTDMA, Eu(III)-AcO(H,Na)-HNO3/HDEHP, and U(VI)-HCl/Aliquat336. A methodology was developed for the implementation of liquid-liquid extraction in a microsystem for each chemical system. The influence of various geometric parameters such as channel length or specific interfacial area was studied, and the comparison of liquid-liquid extraction performance highlighted the influence of the phase viscosity ratio on the flows. Through modeling of both hydrodynamics and mass transfer in the microsystem, the criteria related to the physical and kinetic properties of the chemical systems were distinguished to propose a rational design of tailor-made chips. Finally, several examples of liquid-liquid extraction implementation in microsystems are described for analytical applications in the nuclear field: U/Co separation by Aliquat336, Eu/Sm separation by DMDBTDMA, or even the coupling between a liquid-liquid extraction chip and the system of
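
    The figure of merit behind extraction performance comparisons like these is the single-stage extraction efficiency, E = D / (D + V_aq/V_org), where D is the distribution ratio of the solute between the phases. A minimal sketch with an invented D value for Eu(III):

```python
# Extraction efficiency of a single-stage liquid-liquid contact, from the
# distribution ratio D and the aqueous/organic phase volume ratio.
def extraction_efficiency(D, v_aq, v_org):
    """Fraction of solute transferred to the organic phase at equilibrium."""
    return D / (D + v_aq / v_org)

# Hypothetical distribution ratio for Eu(III) and equal phase volumes
D_eu = 12.0
eff = extraction_efficiency(D_eu, v_aq=1.0, v_org=1.0)
print(f"E = {eff:.1%}")
```

    In a microchip the same equilibrium limit applies; what the chip geometry changes is how quickly the flowing phases approach it.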

  10. Impact of standard test protocols on sporicidal efficacy.

    Science.gov (United States)

    Wesgate, R; Rauwel, G; Criquelion, J; Maillard, J-Y

    2016-07-01

    There has been an increase in the availability of commercial sporicidal formulations. Any comparison of sporicidal data from the literature is hampered by the number of different standard tests available and the use of diverse test conditions, including bacterial strains and endospore preparation. To evaluate the effect of sporicidal standard tests on the apparent activity of eight biocides against Clostridium difficile and Bacillus subtilis, the activity of eight biocidal formulations, including two oxidizing agents, two aldehydes, three didecyldimethylammonium chloride (DDAC) and amine formulations, and sodium hypochlorite, was evaluated using four standard sporicidal tests (BS EN 14347, BS EN 13704, ASTM E2197-11, and AOAC MB-15-03) against B. subtilis (ATCC 19659) and C. difficile (NCTC 11209) spores. C. difficile spores were more susceptible to the sporicides than were B. subtilis spores, regardless of the method used. There were differences in sporicidal activity between methods at 5 min but not at 60 min exposure. DDAC and amine-based products were not sporicidal when neutralized appropriately. Neutralization validation was confirmed for these biocides using the reporting format described in the BS EN standard tests, although the raw data appear to indicate that neutralization failed. The different methods, whether based on suspension or carrier tests, produced similar sporicidal inactivation data. This study suggests that detailed neutralization validation data should be reported to ensure that neutralization of active spores is effective. Failure to do so may lead to erroneous sporicidal claims. Copyright © 2016 The Healthcare Infection Society. Published by Elsevier Ltd. All rights reserved.
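
    Sporicidal tests of this kind report activity as a log10 reduction in viable spore counts. A minimal sketch, with invented CFU/mL values and a 4-log pass threshold stated here purely as an assumption (the exact criterion differs between the standards named above):

```python
import math

# Log10 reduction computed from viable spore counts before and after exposure.
# Illustrative CFU/mL values; the 4-log pass threshold is an assumption for
# this sketch, not a quotation from any specific standard test.
initial_count = 2.5e6
surviving_count = 180.0

log_reduction = math.log10(initial_count / surviving_count)
passes_4log = log_reduction >= 4.0

print(f"log10 reduction = {log_reduction:.2f}, "
      f"passes 4-log criterion: {passes_4log}")
```

    Inadequate neutralization inflates this figure, because the biocide keeps killing spores after the timed exposure ends, which is why the study stresses reporting neutralization validation data.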

  11. Assessing impacts of roads: application of a standard assessment protocol

    Science.gov (United States)

    Duniway, Michael C.; Herrick, Jeffrey E.

    2013-01-01

    Adaptive management of road networks depends on timely data that accurately reflect the impacts those systems are having on ecosystem processes and associated services. In the absence of reliable data, land managers are left with little more than observations and perceptions to support management decisions of road-associated disturbances. Roads can negatively impact the soil, hydrologic, plant, and animal processes on which virtually all ecosystem services depend. The Interpreting Indicators of Rangeland Health (IIRH) protocol is a qualitative method that has been demonstrated to be effective in characterizing impacts of roads. The goals of this study were to develop, describe, and test an approach for using IIRH to systematically evaluate road impacts across large, diverse arid and semiarid landscapes. We developed a stratified random sampling approach to plot selection based on ecological potential, road inventory data, and image interpretation of road impacts. The test application on a semiarid landscape in southern New Mexico, United States, demonstrates that the approach developed is sensitive to road impacts across a broad range of ecological sites but that not all the types of stratification were useful. Ecological site and road inventory strata accounted for significant variability in the functioning of ecological processes, but stratification based on apparent impact did not. Analysis of the repeatability of IIRH applied to road plots indicates that the method is repeatable, but consensus evaluations based on multiple observers should be used to minimize risk of bias. Landscape-scale analysis of impacts by roads of contrasting designs (maintained dirt or gravel roads vs. non- or infrequently maintained roads) suggests that future travel management plans for the study area should consider concentrating traffic on fewer roads that are well designed and maintained. Application of the approach by land managers will likely provide important insights into
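
    The stratified random plot selection described above can be sketched in a few lines: group candidate plots by stratum (here, ecological site crossed with road inventory class, with invented stratum labels and plot counts), then draw a fixed random sample within each stratum:

```python
import random
from collections import defaultdict

random.seed(42)  # reproducible draw for this illustration

# Hypothetical candidate plots: (plot_id, ecological_site, road_class)
plots = [(i, site, road)
         for i, (site, road) in enumerate(
             (s, r) for s in ("sandy", "gravelly", "loamy")
                    for r in ("maintained", "unmaintained")
                    for _ in range(20))]

# Group plots by stratum (ecological site x road inventory class)
strata = defaultdict(list)
for plot_id, site, road in plots:
    strata[(site, road)].append(plot_id)

# Draw a fixed number of evaluation plots per stratum
n_per_stratum = 3
sample = {stratum: random.sample(ids, n_per_stratum)
          for stratum, ids in strata.items()}

for stratum, ids in sorted(sample.items()):
    print(stratum, ids)
```

    Sampling within strata guarantees that every combination of ecological potential and road class is represented, which is what lets the strata be tested for explanatory power afterward.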

  12. 78 FR 14654 - Standards for Business Practices and Communication Protocols for Public Utilities

    Science.gov (United States)

    2013-03-07

    ...; Order No. 676-G] Standards for Business Practices and Communication Protocols for Public Utilities... practices and electronic communications for public utilities to incorporate by reference updated business... regulations at 18 CFR 38.2(a) (which establish standards for business practices and electronic communications...

  13. Using Learning Analytics to Enhance Student Learning in Online Courses Based on Quality Matters Standards

    Science.gov (United States)

    Martin, Florence; Ndoye, Abdou; Wilkins, Patricia

    2016-01-01

    Quality Matters is recognized as a rigorous set of standards that guide the designer or instructor to design quality online courses. We explore how Quality Matters standards guide the identification and analysis of learning analytics data to monitor and improve online learning. Descriptive data were collected for frequency of use, time spent, and…

  14. Standardized cardiovascular magnetic resonance imaging (CMR) protocols, Society for Cardiovascular Magnetic Resonance: Board of Trustees Task Force on Standardized Protocols

    Directory of Open Access Journals (Sweden)

    Kim Raymond J

    2008-07-01

    Full Text Available Index:
    1. General techniques
       1.1. Stress and safety equipment
       1.2. Left ventricular (LV) structure and function module
       1.3. Right ventricular (RV) structure and function module
       1.4. Gadolinium dosing module
       1.5. First pass perfusion
       1.6. Late gadolinium enhancement (LGE)
    2. Disease specific protocols
       2.1. Ischemic heart disease
            2.1.1. Acute myocardial infarction (MI)
            2.1.2. Chronic ischemic heart disease and viability
            2.1.3. Dobutamine stress
            2.1.4. Adenosine stress perfusion
       2.2. Angiography
            2.2.1. Peripheral magnetic resonance angiography (MRA)
            2.2.2. Thoracic MRA
            2.2.3. Anomalous coronary arteries
            2.2.4. Pulmonary vein evaluation
       2.3. Other
            2.3.1. Non-ischemic cardiomyopathy
            2.3.2. Arrhythmogenic right ventricular cardiomyopathy (ARVC)
            2.3.3. Congenital heart disease
            2.3.4. Valvular heart disease
            2.3.5. Pericardial disease
            2.3.6. Masses

  15. Protocol Standardization Reveals MV Correlation to Healthy Donor BMI

    Directory of Open Access Journals (Sweden)

    Philip Hexley

    2014-01-01

    Full Text Available Microvesicles (MVs) are cell-derived vesicles which are of interest in a clinical setting, as they may be predictive of early signs of disease and/or of treatment progression. However, there are growing concerns about using conventional flow cytometry (cFCM) for the detection and quantification of microvesicles. These concerns range from error sources in collection through to the physical limitations of detection. Here we present a standardized method for collection and analysis which shows that the MV numbers detected by cFCM correlate to donor Body Mass Index (BMI). Although unlikely to be comprehensive, we also demonstrate how cFCM is a useful and valid tool in the analysis of MVs.
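
    The correlation claim in this record reduces to a Pearson coefficient between donor BMI and MV counts. A minimal sketch with invented donor data (the record reports no raw values):

```python
import math

# Hypothetical donor data, for illustration only: BMI and microvesicle counts
bmi = [19.5, 22.0, 24.5, 27.0, 29.5, 33.0, 36.0]
mv_counts = [180, 210, 260, 300, 340, 420, 470]

def pearson_r(xs, ys):
    """Sample Pearson correlation coefficient."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = math.sqrt(sum((x - mx) ** 2 for x in xs))
    sy = math.sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

r = pearson_r(bmi, mv_counts)
print(f"Pearson r = {r:.3f}")
```

    The point of the standardized collection protocol is that without it, pre-analytical variation would swamp a biological correlation like this one.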

  16. Asynchronous transfer mode and Local Area Network emulation standards, protocols, and security implications

    OpenAIRE

    Kirwin, John P.

    1999-01-01

    A complex networking technology called Asynchronous Transfer Mode (ATM) and a networking protocol called Local Area Network Emulation (LANE) are being integrated into many naval networks without any security-driven naval configuration guidelines. No single publication is available that describes security issues of data delivery and signaling relating to the transition of Ethernet to LANE and ATM. The thesis' focus is to provide: (1) an overview and security analysis of standardized protocols ...

  17. Analytical standards production for the analysis of pomegranate anthocyanins by HPLC

    Directory of Open Access Journals (Sweden)

    Manuela Cristina Pessanha de Araújo Santiago

    2014-03-01

    Full Text Available Pomegranate (Punica granatum L.) is a fruit with a long medicinal history, especially due to its phenolic compound content, such as the anthocyanins, which are reported as one of the most important natural antioxidants. The analysis of the anthocyanins by high performance liquid chromatography (HPLC) can be considered an important tool to evaluate the quality of pomegranate juice. For research laboratories, the major challenge in using HPLC for quantitative analyses is the acquisition of high-purity analytical standards, since these are expensive and in some cases not even commercially available. The aim of this study was to obtain analytical standards for the qualitative and quantitative analysis of the anthocyanins from pomegranate. Five vegetable matrices (pomegranate flower and jambolan, jabuticaba, blackberry, and strawberry fruits) were used to isolate each of the six anthocyanins present in pomegranate fruit, using analytical-scale HPLC with non-destructive detection, so that they could subsequently be used as analytical standards. Furthermore, their identities were confirmed by high-resolution mass spectrometry. The proposed procedure showed that it is possible to obtain analytical standards of anthocyanins with a high purity grade (98.0 to 99.9%) from natural sources, which proved to be an economic strategy for the production of standards by laboratories according to their research requirements.

  18. A framework for the definition of standardized protocols for measuring upper-extremity kinematics.

    Science.gov (United States)

    Kontaxis, A; Cutti, A G; Johnson, G R; Veeger, H E J

    2009-03-01

    Increasing interest in upper extremity biomechanics has led to closer investigations of both segment movements and detailed joint motion. Unfortunately, conceptual and practical differences in the motion analysis protocols used to date reduce compatibility for data pooling and cross-validation analyses, and so weaken the body of knowledge. This difficulty highlights a need for standardised protocols, each addressing a set of questions of comparable content. The aim of this work is therefore to open a discussion and propose a flexible framework to support: (1) the definition of standardised protocols, (2) a standardised description of these protocols, and (3) the formulation of general recommendations. The framework is composed of two nested flowcharts. The first defines what a motion analysis protocol is by pointing out its role in a motion analysis study. The second flowchart describes the steps to build a protocol, which requires decisions on the joints or segments to be investigated and the description of their mechanical equivalent model, the definition of the anatomical or functional coordinate frames, the choice of marker or sensor configuration and the validity of their use, the definition of the activities to be measured, and the refinements that can be applied to the final measurements. Finally, general recommendations are proposed for each of the steps based on the current literature, and open issues are highlighted for future investigation and standardisation. Standardisation of motion analysis protocols is urgent. The proposed framework can guide this process through the rationalisation of the approach.

  19. Standardization guide for construction and use of MORT-type analytic trees

    Energy Technology Data Exchange (ETDEWEB)

    Buys, J.R.

    1992-02-01

    Since the introduction of MORT (Management Oversight and Risk Tree) technology as a tool for evaluating the success or failure of safety management systems, there has been a proliferation of analytic trees throughout the US Department of Energy (DOE) and its contractor organizations. Standard "fault tree" symbols have generally been used in logic diagram or tree construction, but new or revised symbols have also been adopted by various analysts. Additionally, a variety of numbering systems have been used for event identification. The consequent lack of standardization has caused some difficulties in interpreting the trees and following their logic. This guide seeks to correct this problem by providing a standardized system for construction and use of analytic trees. Future publications of the DOE System Safety Development Center (SSDC) will adhere to this guide. It is recommended that other DOE organizations and contractors also adopt this system to achieve intra-DOE uniformity in analytic tree construction.

  20. Emotional assistance in thalassaemia: pilot implementation of a standard protocol

    Directory of Open Access Journals (Sweden)

    M.T. Veit

    2011-12-01

    Full Text Available This study aims to describe the creation process of standard procedures to make possible multicentre studies related to emotional aspects of thalassaemic patients, their families and caregivers, and the pilot phase of the routine implementation. The objectives defined to achieve this goal are: (i) develop routines to assess and manage/treat emotional issues; (ii) adjust the ABRASTA (Brazilian Association of Thalassaemia) computer system to the input of collected data and its compilation; (iii) conduct a pilot implementation of the routines; (iv) discuss the whole process and propose next steps. Forty patients were assisted following the above-mentioned routines of psychological evaluation, follow-up assistance and management of specific emotional issues. The conclusions are that the routines are adequate to enable multicentre research to compare findings and develop specific interventions for thalassaemia patients, their families and caregivers; information gathered through them is an important means of supporting medical doctors and other members of the professional team, both in therapeutic planning and in the communication process with patients and families; finally, considering the nature of the information, psychologists and psychiatrists are the most indicated professionals to perform the assessment and the interventions related to emotional issues, due to their professional background, training and specific skills that allow free and candid communication with the patients and their families.

  1. Solution standards for quality control of nuclear-material analytical measurements

    International Nuclear Information System (INIS)

    Clark, J.P.

    1981-01-01

    Analytical chemistry measurement control depends upon reliable solution standards. At the Savannah River Plant Control Laboratory, over a thousand analytical measurements are made daily for process control, product specification, accountability, and nuclear safety. Large quantities of solution standards are required for a measurement quality control program covering the many different analytical chemistry methods. Savannah River Plant-produced uranium, plutonium, neptunium, and americium metals or oxides are dissolved to prepare stock solutions for working or Quality Control Standards (QCS). Because extensive analytical effort is required to characterize or confirm these solutions, they are prepared in large quantities. These stock solutions are diluted and blended with different chemicals and/or each other to synthesize QCS that match the matrices of different process streams. The target uncertainty of a standard's reference value is 10% of the limit of error of the methods used for routine measurements. Standard Reference Materials from NBS are used according to special procedures to calibrate the methods used in measuring the uranium and plutonium standards so traceability can be established. Special precautions are required to minimize the effects of temperature, radiolysis, and evaporation. Standard reference values are periodically corrected to eliminate systematic errors caused by evaporation or decay products. Measurement control is achieved by requiring analysts to analyze a blind QCS each shift a measurement system is used on plant samples. Computer evaluation determines whether or not a measurement is within the ±3 sigma control limits. Monthly evaluations of the QCS measurements are made to determine current bias correction factors for accountability measurements and to detect significant changes in the bias and precision statistics. The evaluations are also used to plan activities for improving the reliability of the analytical chemistry measurements.
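
    The blind-QCS control check described above is a Shewhart-style limit test: estimate the mean and standard deviation from the QCS history, then flag any new result outside three standard deviations. A minimal sketch with invented measurement values:

```python
import statistics

# Hypothetical history of blind QCS measurements for one method
# (illustrative values; units are arbitrary)
qcs_history = [10.02, 9.98, 10.05, 9.97, 10.01, 10.03, 9.99, 10.00,
               10.04, 9.96, 10.02, 9.98, 10.01, 10.00, 9.99, 10.03]

mean = statistics.mean(qcs_history)
sigma = statistics.stdev(qcs_history)
lower, upper = mean - 3 * sigma, mean + 3 * sigma

def in_control(measurement):
    """Flag a new blind-QCS result against the +/-3 sigma control limits."""
    return lower <= measurement <= upper

print(f"limits: [{lower:.3f}, {upper:.3f}]")
print(in_control(10.02), in_control(10.30))
```

    A result outside the limits triggers investigation of the measurement system before further plant samples are accepted; the monthly evaluations mentioned in the record then refine the mean and sigma estimates themselves.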

  2. Gauging megadiversity with optimized and standardized sampling protocols: A case for tropical forest spiders.

    Science.gov (United States)

    Malumbres-Olarte, Jagoba; Scharff, Nikolaj; Pape, Thomas; Coddington, Jonathan A; Cardoso, Pedro

    2017-01-01

    Characterizing and monitoring biodiversity and assessing its drivers require accurate and comparable data on species assemblages, which, in turn, should rely on efficient and standardized field collection. Unfortunately, protocols that follow such criteria remain scarce and it is unclear whether they can be applied to megadiverse communities, whose study can be particularly challenging. Here, we develop and evaluate the first optimized and standardized sampling protocol for megadiverse communities, using tropical forest spiders as a model taxon. We designed the protocol COBRA-TF (Conservation Oriented Biodiversity Rapid Assessment for Tropical Forests) using a large dataset of semiquantitative field data from different continents. This protocol combines samples of different collecting methods to obtain as many species as possible with minimum effort (optimized) and widest applicability and comparability (standardized). We ran sampling simulations to assess the efficiency of COBRA-TF (optimized, non-site-specific) and its reliability for estimating taxonomic, phylogenetic, and functional diversity, and community structure by comparing it with (1) commonly used expert-based ad hoc protocols (nonoptimized, site-specific) and (2) optimal protocols (optimized, site-specific). We then tested the performance and feasibility of COBRA-TF in the field. COBRA-TF yielded similar results as ad hoc protocols for species (observed and estimated) and family richness, phylogenetic and functional diversity, and species abundance distribution. Optimal protocols detected more species than COBRA-TF. Data from the field test showed high sampling completeness and yielded low numbers of singletons and doubletons. Optimized and standardized protocols can be as effective in sampling and studying megadiverse communities as traditional sampling, while allowing data comparison. Although our target taxa are spiders, COBRA-TF can be modified to apply to any highly diverse taxon and habitat as
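
    The record judges field performance by sampling completeness and by counts of singletons and doubletons. One common way to turn those counts into a completeness figure is the Chao1 richness estimator; the record does not state which estimator the authors used, so Chao1 is an assumption here, and the specimen list is invented:

```python
from collections import Counter

# Hypothetical specimen list: one entry per collected individual (species id)
specimens = (["sp%02d" % i for i in range(1, 21) for _ in range(5)]  # 20 common species
             + ["rare%02d" % i for i in range(1, 7)]                 # 6 singletons
             + ["uncommon%02d" % i for i in range(1, 4)] * 2)        # 3 doubletons

abundances = Counter(specimens)
s_obs = len(abundances)
f1 = sum(1 for n in abundances.values() if n == 1)   # singletons
f2 = sum(1 for n in abundances.values() if n == 2)   # doubletons

# Chao1 estimate of true richness and the resulting sampling completeness
chao1 = s_obs + f1 ** 2 / (2 * f2) if f2 > 0 else s_obs + f1 * (f1 - 1) / 2
completeness = s_obs / chao1

print(f"S_obs={s_obs}, F1={f1}, F2={f2}, Chao1={chao1:.1f}, "
      f"completeness={completeness:.0%}")
```

    Low numbers of singletons and doubletons push the Chao1 estimate toward the observed richness, which is why the record cites them as evidence of high completeness.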

  3. Analytical method for the identification and assay of 12 phthalates in cosmetic products: application of the ISO 12787 international standard "Cosmetics-Analytical methods-Validation criteria for analytical results using chromatographic techniques".

    Science.gov (United States)

    Gimeno, Pascal; Maggio, Annie-Françoise; Bousquet, Claudine; Quoirez, Audrey; Civade, Corinne; Bonnet, Pierre-Antoine

    2012-08-31

    Esters of phthalic acid, more commonly named phthalates, may be present in cosmetic products as ingredients or contaminants. Their presence as contaminants can be due to the manufacturing process, to raw materials used, or to the migration of phthalates from packaging when plastic (polyvinyl chloride, PVC) is used. Eight phthalates (DBP, DEHP, BBP, DMEP, DnPP, DiPP, DPP, and DiBP), classified H360 or H361, are forbidden in cosmetics according to the European regulation on cosmetics 1223/2009. A GC/MS method was developed for the assay of 12 phthalates in cosmetics, including the 8 regulated phthalates. Analyses are carried out on a GC/MS system in electron impact ionization mode (EI). The separation of phthalates is obtained on a cross-linked 5%-phenyl/95%-dimethylpolysiloxane capillary column, 30 m × 0.25 mm (i.d.) × 0.25 μm film thickness, using a temperature gradient. Phthalate quantification is performed by external calibration using an internal standard. Validation elements obtained on standard solutions highlight a satisfactory system conformity (resolution > 1.5), a common quantification limit of 0.25 ng injected, an acceptable linearity between 0.5 μg mL⁻¹ and 5.0 μg mL⁻¹, as well as a precision and an accuracy in agreement with in-house specifications. Cosmetic samples ready for analytical injection are analyzed after dilution in ethanol, whereas more complex cosmetic matrices, like milks and creams, are assayed after a liquid/liquid extraction using tert-butyl methyl ether (TBME). Depending on the type of cosmetics analyzed, the common limits of quantification for the 12 phthalates were set at 0.5 or 2.5 μg g⁻¹. All samples were assayed using the analytical approach described in the ISO 12787 international standard "Cosmetics-Analytical methods-Validation criteria for analytical results using chromatographic techniques". This analytical protocol is particularly adapted when it is not possible to make reconstituted sample matrices. Copyright © 2012
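
    External calibration with an internal standard, as used for quantification here, means fitting a line through analyte/internal-standard peak-area ratios measured for standards of known concentration, then inverting it for unknowns. A minimal sketch with invented calibration points spanning the linear range reported in the record (0.5 to 5.0 μg/mL):

```python
# Hypothetical calibration: analyte/internal-standard peak-area ratios for
# standard solutions of known concentration (ug/mL), linear range 0.5-5.0.
conc = [0.5, 1.0, 2.0, 3.0, 5.0]
area_ratio = [0.26, 0.51, 1.02, 1.49, 2.53]

# Ordinary least-squares line: ratio = slope * conc + intercept
n = len(conc)
mx, my = sum(conc) / n, sum(area_ratio) / n
slope = (sum((x - mx) * (y - my) for x, y in zip(conc, area_ratio))
         / sum((x - mx) ** 2 for x in conc))
intercept = my - slope * mx

def quantify(sample_ratio, dilution_factor=1.0):
    """Concentration of an unknown from its analyte/IS area ratio."""
    return (sample_ratio - intercept) / slope * dilution_factor

print(f"slope={slope:.3f}, intercept={intercept:.3f}")
print(f"unknown ~= {quantify(0.80):.2f} ug/mL")
```

    Ratioing to the internal standard cancels injection-volume and instrument-drift variation, which is what makes the calibration transferable across runs.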

  4. ACR/NEMA Digital Image Interface Standard (An Illustrated Protocol Overview)

    Science.gov (United States)

    Lawrence, G. Robert

    1985-09-01

    The American College of Radiology (ACR) and the National Electrical Manufacturers Association (NEMA) have sponsored a joint standards committee mandated to develop a universal interface standard for the transfer of radiology images among a variety of PACS imaging devices. The resulting standard interface conforms to the ISO/OSI standard reference model for network protocol layering. The standard interface specifies the lower layers of the reference model (Physical, Data Link, Transport and Session) and implies a requirement for the Network layer should a network be required. The message content has been considered and a flexible message and image format specified. The following imaging equipment modalities are supported by the standard interface:

        CT  Computed Tomography
        DS  Digital Subtraction
        NM  Nuclear Medicine
        US  Ultrasound
        MR  Magnetic Resonance
        DR  Digital Radiology

    The following data types are standardized over the transmission interface media: image data, digitized voice, header data, raw data, text reports, graphics, and others. This paper consists of text supporting the illustrated protocol data flow. Each layer will be individually treated. Particular emphasis will be given to the Data Link layer (frames) and the Transport layer (packets). The discussion utilizes a finite state sequential machine model for the protocol layers.

  5. Telemetry Standards, IRIG Standard 106-17, Chapter 22, Network Based Protocol Suite

    Science.gov (United States)

    2017-07-01

    structure, field definitions, and media access control (MAC) conventions specified in IEEE 802.3-2012, Section 1, Clauses 2, 3, and 4. Data link...support for ICMP broadcast pings. 22.3.1.2 Internet Group Management Protocol (IGMP) Network Nodes that consume or forward dynamically configured IPv4...selection of the exact SSL and TLS versions to use. Certificate generation and exchanges shall be in accordance with the profile identified in RFC

  6. Nanometrology, Standardization and Regulation of Nanomaterials in Brazil: A Proposal for an Analytical-Prospective Model

    Directory of Open Access Journals (Sweden)

    Ana Rusmerg Giménez Ledesma

    2013-05-01

    Full Text Available The main objective of this paper is to propose an analytical-prospective model as a tool to support decision-making processes concerning metrology, standardization and regulation of nanomaterials in Brazil, based on international references and ongoing initiatives in the world. In the context of nanotechnology development in Brazil, the motivation for carrying out this research was to identify potential benefits of metrology, standardization and regulation of nanomaterials production, from the perspective of future adoption of the model by the main stakeholders in the development of these areas in Brazil. The main results can be summarized as follows: (i) an overview of international studies on metrology, standardization and regulation of nanomaterials, and nanoparticles in particular; (ii) the analytical-prospective model; and (iii) the survey questionnaire and the roadmapping tool for metrology, standardization and regulation of nanomaterials in Brazil, based on international references and ongoing initiatives in the world.

  7. Inter-laboratory variation in DNA damage using a standard comet assay protocol

    DEFF Research Database (Denmark)

    Forchhammer, Lykke; Ersson, Clara; Loft, Steffen

    2012-01-01

    There are substantial inter-laboratory variations in the levels of DNA damage measured by the comet assay. The aim of this study was to investigate whether adherence to a standard comet assay protocol would reduce inter-laboratory variation in reported values of DNA damage. Fourteen laboratories ...

  8. Standard Protocol and Quality Assessment of Soil Phosphorus Speciation by P K-Edge XANES Spectroscopy.

    Science.gov (United States)

    Werner, Florian; Prietzel, Jörg

    2015-09-01

    Phosphorus (P) in soils is most often bound as phosphate to one or more of the following four elements or compounds: calcium, aluminum, iron, and soil organic matter. A promising method for direct P speciation in soils is synchrotron-based X-ray absorption near edge structure (XANES) spectroscopy at the K-edge of P. However, the quality of this method remains controversial, partly because a standard protocol for reproducible spectrum deconvolution is lacking and minor modifications of the applied deconvolution procedure can lead to considerable changes in the P speciation results. On the basis of the observation that appropriate baseline correction and edge-step normalization are crucial for correct linear combination (LC) fitting results, we established a standard protocol for the deconvolution and LC fitting of P K-edge XANES spectra. We evaluated the quality of LC fits obtained according to this standard protocol with 16 defined dilute (2 mg P g(-1)) ternary mixtures of aluminum phosphate, iron phosphate, hydroxyapatite, and phytic acid in a quartz matrix. The LC fitting results were compared with the contribution of the different P compounds to total P in the various mixtures. Compared to using a traditional LC fitting procedure, our standard protocol reduced the fitting error by 6% (absolute). However, P portions smaller than 5% should be confirmed with other methods or excluded from the P speciation results. A publicly available database of P K-edge XANES reference spectra was initiated.
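
    Linear combination (LC) fitting models a measured, normalized spectrum as a weighted sum of reference spectra. The two-reference sketch below uses synthetic spectra and a closed-form least-squares weight under the sum-to-one constraint; real analyses fit several references over a measured energy grid with non-negativity constraints, so this is only the core idea:

```python
# Minimal sketch of linear-combination (LC) fitting with two normalized
# reference spectra (synthetic values): mixed = w*ref_a + (1-w)*ref_b.
ref_a = [0.0, 0.1, 0.4, 0.9, 1.2, 1.0, 0.9, 0.85, 0.8, 0.8]    # e.g. AlPO4
ref_b = [0.0, 0.05, 0.2, 0.6, 1.4, 1.1, 0.95, 0.9, 0.85, 0.8]  # e.g. FePO4

# Synthetic "measured" spectrum: 70% ref_a + 30% ref_b
mixed = [0.7 * a + 0.3 * b for a, b in zip(ref_a, ref_b)]

# Closed-form least-squares weight under the sum-to-one constraint:
# minimize sum_i (mixed_i - w*a_i - (1-w)*b_i)^2 over w
num = sum((m - b) * (a - b) for m, a, b in zip(mixed, ref_a, ref_b))
den = sum((a - b) ** 2 for a, b in zip(ref_a, ref_b))
w = num / den

print(f"fitted fraction ref_a = {w:.2f}, ref_b = {1 - w:.2f}")
```

    Because the fitted weights depend directly on how the spectra were baseline-corrected and edge-step normalized, small changes in preprocessing shift the recovered fractions, which is exactly the reproducibility problem the record's standard protocol addresses.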

  9. A standard protocol for describing individual-based and agent-based models

    Science.gov (United States)

    Grimm, Volker; Berger, Uta; Bastiansen, Finn; Eliassen, Sigrunn; Ginot, Vincent; Giske, Jarl; Goss-Custard, John; Grand, Tamara; Heinz, Simone K.; Huse, Geir; Huth, Andreas; Jepsen, Jane U.; Jorgensen, Christian; Mooij, Wolf M.; Muller, Birgit; Pe'er, Guy; Piou, Cyril; Railsback, Steven F.; Robbins, Andrew M.; Robbins, Martha M.; Rossmanith, Eva; Ruger, Nadja; Strand, Espen; Souissi, Sami; Stillman, Richard A.; Vabo, Rune; Visser, Ute; DeAngelis, Donald L.

    2006-01-01

    Simulation models that describe autonomous individual organisms (individual based models, IBM) or agents (agent-based models, ABM) have become a widely used tool, not only in ecology, but also in many other disciplines dealing with complex systems made up of autonomous entities. However, there is no standard protocol for describing such simulation models, which can make them difficult to understand and to duplicate. This paper presents a proposed standard protocol, ODD, for describing IBMs and ABMs, developed and tested by 28 modellers who cover a wide range of fields within ecology. This protocol consists of three blocks (Overview, Design concepts, and Details), which are subdivided into seven elements: Purpose, State variables and scales, Process overview and scheduling, Design concepts, Initialization, Input, and Submodels. We explain which aspects of a model should be described in each element, and we present an example to illustrate the protocol in use. In addition, 19 examples are available in an Online Appendix. We consider ODD as a first step for establishing a more detailed common format of the description of IBMs and ABMs. Once initiated, the protocol will hopefully evolve as it becomes used by a sufficiently large proportion of modellers.
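
    The three blocks and seven elements of ODD listed in this record map naturally onto a checklist that a model description can be validated against. A minimal sketch (the validator function and draft description are illustrative, not part of the protocol itself):

```python
# The ODD protocol's three blocks and seven elements, as a simple checklist
# structure a model description could be validated against (sketch).
ODD_STRUCTURE = {
    "Overview": ["Purpose", "State variables and scales",
                 "Process overview and scheduling"],
    "Design concepts": ["Design concepts"],
    "Details": ["Initialization", "Input", "Submodels"],
}

def missing_elements(description):
    """Return ODD elements absent from a description (dict of element -> text)."""
    required = [e for elems in ODD_STRUCTURE.values() for e in elems]
    return [e for e in required if not description.get(e)]

draft = {"Purpose": "Explore forest gap dynamics", "Submodels": "Growth, mortality"}
print(missing_elements(draft))
```

    Treating the protocol as a checklist is one way a journal or modeller could verify that a published IBM/ABM description is complete enough to be duplicated.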

  10. Security analysis of standards-driven communication protocols for healthcare scenarios.

    Science.gov (United States)

    Masi, Massimiliano; Pugliese, Rosario; Tiezzi, Francesco

    2012-12-01

    The importance of the Electronic Health Record (EHR), which stores all healthcare-related data belonging to a patient, has been recognised in recent years by governments, institutions and industry. Initiatives like Integrating the Healthcare Enterprise (IHE) have been developed for the definition of standard methodologies for secure and interoperable EHR exchanges among clinics and hospitals. Using the requisites specified by these initiatives, many large-scale projects have been set up to enable healthcare professionals to handle patients' EHRs. The success of applications developed in these contexts crucially depends on ensuring such security properties as confidentiality, authentication, and authorization. In this paper, we first propose a communication protocol, based on the IHE specifications, for authenticating healthcare professionals and assuring patients' safety. By means of a formal analysis carried out using the specification language COWS and the model checker CMC, we reveal a security flaw in the protocol, thus demonstrating that simply adopting international standards does not guarantee the absence of such flaws. We then propose how to emend the IHE specifications and modify the protocol accordingly. Finally, we show how to tailor our protocol for application to more critical scenarios with no assumptions on the communication channels. To demonstrate the feasibility and effectiveness of our protocols, we have fully implemented them.

  11. Improving treatment times for patients with in-hospital stroke using a standardized protocol.

    Science.gov (United States)

    Koge, Junpei; Matsumoto, Shoji; Nakahara, Ichiro; Ishii, Akira; Hatano, Taketo; Sadamasa, Nobutake; Kai, Yasutoshi; Ando, Mitsushige; Saka, Makoto; Chihara, Hideo; Takita, Wataru; Tokunaga, Keisuke; Kamata, Takahiko; Nishi, Hidehisa; Hashimoto, Tetsuya; Tsujimoto, Atsushi; Kira, Jun-Ichi; Nagata, Izumi

    2017-10-15

    Previous reports have shown significant delays in the treatment of in-hospital stroke (IHS). We developed and implemented our IHS alert protocol in April 2014. We aimed to determine the influence of the implementation of our IHS alert protocol. Our implementation process comprised four main steps: IHS protocol development, workshops for hospital staff to learn about the protocol, preparation of standardized IHS treatment kits, and obtaining feedback in a monthly hospital staff conference. We retrospectively compared protocol metrics and clinical outcomes of patients with IHS treated with intravenous thrombolysis and/or endovascular therapy before (January 2008-March 2014) and after implementation (April 2014-December 2016). Fifty-five patients were included (pre, 25; post, 30). After the implementation, significant reductions occurred in the median time from stroke recognition to evaluation by a neurologist and in subsequent treatment intervals (30 vs. 13.5 min, p < […]; […] vs. 26.5 min, p < […]; […] vs. 16 min, p = 0.02). The median time from first neuroimaging to endovascular therapy tended to decrease (75 vs. 53 min, p = 0.08). There were no differences in favorable outcomes (modified Rankin scale score of 0-2) at discharge or in the incidence of symptomatic intracranial hemorrhage between the two periods. Our IHS alert protocol implementation saved time in treating patients with IHS without compromising safety. Copyright © 2017 Elsevier B.V. All rights reserved.

  12. [Protocol for the study of bone tumours and standardization of pathology reports].

    Science.gov (United States)

    Machado, Isidro; Pozo, José Juan; Marcilla, David; Cruz, Julia; Tardío, Juan C; Astudillo, Aurora; Bagué, Sílvia

    Primary bone neoplasms represent a rare and heterogeneous group of mesenchymal tumours. The prevalence of benign and malignant tumours varies; the latter (sarcomas) account for less than 0.2% of all malignant tumours. Primary bone neoplasms are usually diagnosed and classified according to the criteria established and published by the World Health Organization (WHO 2013). These criteria are a result of advances in molecular pathology, which complements the histopathological diagnosis. Bone tumours should be diagnosed and treated in referral centers by a multidisciplinary team including pathologists, radiologists, orthopedic surgeons and oncologists. We analyzed different national and international protocols in order to provide a guide of recommendations for the improvement of pathological evaluation and management of bone tumours. We include specific recommendations for the pre-analytical, analytical, and post-analytical phases, as well as protocols for gross and microscopic pathology. Copyright © 2016 Sociedad Española de Anatomía Patológica. Publicado por Elsevier España, S.L.U. All rights reserved.

  13. Normalization of cortical thickness measurements across different T1 magnetic resonance imaging protocols by novel W-Score standardization.

    Science.gov (United States)

    Chung, Jinyong; Yoo, Kwangsun; Lee, Peter; Kim, Chan Mi; Roh, Jee Hoon; Park, Ji Eun; Kim, Sang Joon; Seo, Sang Won; Shin, Jeong-Hyeon; Seong, Joon-Kyung; Jeong, Yong

    2017-10-01

    The use of different 3D T1-weighted magnetic resonance (T1 MR) imaging protocols induces image incompatibility across multicenter studies, negating the many advantages of multicenter studies. A few methods have been developed to address this problem, but significant image incompatibility still remains. Thus, we developed a novel and convenient method to improve image compatibility. W-score standardization creates quality reference values by using a healthy group to obtain normalized disease values. We developed a protocol-specific w-score standardization to control for the protocol effect, which is applied to each protocol separately. We used three datasets. In dataset 1, brain T1 MR images of normal controls (NC) and patients with Alzheimer's disease (AD) from two centers, acquired with different T1 MR protocols (Protocol 1 and 2, n = 45/group), were used. In dataset 2, data from six subjects who underwent MRI with the two protocols, with different repetition times, echo times, and slice thicknesses, were used. In dataset 3, T1 MR images from a large number of healthy normal controls (Protocol 1: n = 148, Protocol 2: n = 343) were collected for w-score standardization. The protocol effect and disease effect on subjects' cortical thickness were analyzed before and after application of protocol-specific w-score standardization. As expected, different protocols resulted in differing cortical thickness measurements in both NC and AD subjects, and different measurements were obtained for the same subject when imaged with different protocols. A multivariate pattern difference between measurements was observed between the protocols. Classification accuracy between the two protocols was nearly 90%. After applying protocol-specific w-score standardization, the differences between the protocols substantially decreased. Most importantly, protocol-specific w-score standardization reduced both univariate and multivariate differences in the images.
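
    The core of w-score standardization can be sketched as follows: each measurement is standardized against healthy controls measured with the same protocol, so constant protocol offsets cancel. All values below are invented, and the sketch omits the covariate adjustment (e.g. for age) that a full w-score model would include:

```python
from statistics import mean, stdev

def w_scores(values, controls):
    """Standardize measurements against healthy controls acquired with the
    SAME protocol (sketch only; the full method also regresses out
    covariates such as age before standardizing)."""
    m, s = mean(controls), stdev(controls)
    return [(v - m) / s for v in values]

# Invented cortical-thickness values (mm).  Protocol 2 reads the same brains
# ~0.2 mm thicker than protocol 1, mimicking a protocol effect.
controls_p1 = [2.50, 2.55, 2.45, 2.60, 2.40]
controls_p2 = [2.70, 2.75, 2.65, 2.80, 2.60]

# One patient scanned under each protocol: raw values differ, w-scores agree.
wa = w_scores([2.30], controls_p1)[0]
wb = w_scores([2.50], controls_p2)[0]
print(round(wa, 2), round(wb, 2))  # → -2.53 -2.53
```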

  14. Proposed standardization of assessment protocols for plant molluscicides for use in self-help control programmes.

    Science.gov (United States)

    Brackenbury, T D

    1998-10-01

    Several candidate plant molluscicides have been identified for possible incorporation into self-help control programmes against schistosomiasis, but their full potential has yet to be realised. This has been, for the most part, due to the absence of standardized assessment and toxicity protocols, and consequently the inability to register crude plant products in their country of origin or internationally. In an attempt to solve this dilemma, a series of protocols has been designed which will provide a useful standardized guideline for investigations into plant molluscicides, as well as precipitate moves towards the establishment of internationally accepted guidelines for the assessment of various categories of plant biopesticides. Ultimately, being able to register a crude plant extract will permit health organizations such as the World Health Organization to promote the use of such material, especially in self-help control programmes, thereby improving the health standards of rural communities.

  15. Comparison of a new whole-body continuous-table-movement protocol versus a standard whole-body MR protocol for the assessment of multiple myeloma.

    Science.gov (United States)

    Weckbach, S; Michaely, H J; Stemmer, A; Schoenberg, S O; Dinter, D J

    2010-12-01

    To evaluate a whole body (WB) continuous-table-movement (CTM) MR protocol for the assessment of multiple myeloma (MM) in comparison to a step-by-step WB protocol. Eighteen patients with MM were examined at 1.5T using a WB CTM protocol (axial T2-w fs BLADE, T1-w GRE sequence) and a step-by-step WB protocol including coronal/sagittal T1-w SE and STIR sequences as reference. Protocol time was assessed. Image quality, artefacts, liver/spleen assessability, and the ability to depict bone marrow lesions less than or greater than 1 cm as well as diffuse infiltration and soft tissue lesions were rated. Potential changes in the Durie and Salmon Plus stage and the detectability of complications were assessed. Mean protocol time was 6:38 min (CTM) compared to 24:32 min (standard). Image quality was comparable. Artefacts were more prominent using the CTM protocol (P = 0.0039). Organ assessability was better using the CTM protocol (P < 0.001). Depiction of bone marrow and soft tissue lesions was identical without a staging shift. Vertebral fractures were not detected using the CTM protocol. The new protocol allows a higher patient throughput and facilitates the depiction of extramedullary lesions. However, as long as vertebral fractures are not detectable, the protocol cannot be safely used for clinical routine without the acquisition of an additional sagittal sequence.

  16. Comparison of a new whole-body continuous-table-movement protocol versus a standard whole-body MR protocol for the assessment of multiple myeloma

    International Nuclear Information System (INIS)

    Weckbach, S.; Michaely, H.J.; Schoenberg, S.O.; Dinter, D.J.; Stemmer, A.

    2010-01-01

    To evaluate a whole body (WB) continuous-table-movement (CTM) MR protocol for the assessment of multiple myeloma (MM) in comparison to a step-by-step WB protocol. Eighteen patients with MM were examined at 1.5T using a WB CTM protocol (axial T2-w fs BLADE, T1-w GRE sequence) and a step-by-step WB protocol including coronal/sagittal T1-w SE and STIR sequences as reference. Protocol time was assessed. Image quality, artefacts, liver/spleen assessability, and the ability to depict bone marrow lesions less than or greater than 1 cm as well as diffuse infiltration and soft tissue lesions were rated. Potential changes in the Durie and Salmon Plus stage and the detectability of complications were assessed. Mean protocol time was 6:38 min (CTM) compared to 24:32 min (standard). Image quality was comparable. Artefacts were more prominent using the CTM protocol (P = 0.0039). Organ assessability was better using the CTM protocol (P < 0.001). Depiction of bone marrow and soft tissue lesions was identical without a staging shift. Vertebral fractures were not detected using the CTM protocol. The new protocol allows a higher patient throughput and facilitates the depiction of extramedullary lesions. However, as long as vertebral fractures are not detectable, the protocol cannot be safely used for clinical routine without the acquisition of an additional sagittal sequence. (orig.)

  17. Improving post-stroke dysphagia outcomes through a standardized and multidisciplinary protocol: an exploratory cohort study.

    Science.gov (United States)

    Gandolfi, Marialuisa; Smania, Nicola; Bisoffi, Giulia; Squaquara, Teresa; Zuccher, Paola; Mazzucco, Sara

    2014-12-01

    Stroke is a major cause of dysphagia. Few studies to date have reported on standardized multidisciplinary protocolized approaches to the management of post-stroke dysphagia. The aim of this retrospective cohort study was to evaluate the impact of a standardized multidisciplinary protocol on clinical outcomes in patients with post-stroke dysphagia. We performed retrospective chart reviews of patients with post-stroke dysphagia admitted to the neurological ward of Verona University Hospital from 2004 to 2008. Outcomes after usual treatment for dysphagia (T- group) were compared versus outcomes after treatment under a standardized diagnostic and rehabilitative multidisciplinary protocol (T+ group). Outcome measures were death, pneumonia on X-ray, need for respiratory support, and proportion of patients on tube feeding at discharge. Of the 378 patients admitted with stroke, 84 had dysphagia and were enrolled in the study. A significantly lower risk of in-hospital death (odds ratio [OR] 0.20 [0.53-0.78]), pneumonia (OR 0.33 [0.10-1.03]), need for respiratory support (OR 0.48 [0.14-1.66]), and tube feeding at discharge (OR 0.30 [0.09-0.91]) was recorded for the T+ group (N = 39) as compared to the T- group (N = 45). The adjusted OR showed no difference between the two groups for in-hospital death and tube feeding at discharge. Use of a standardized multidisciplinary protocolized approach to the management of post-stroke dysphagia may significantly reduce rates of aspiration pneumonia, in-hospital mortality, and tube feeding in dysphagic stroke survivors. Consistent with the study's exploratory purposes, our findings suggest that the multidisciplinary protocol applied in this study offers an effective model of management of post-stroke dysphagia.
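
    The unadjusted odds ratios above come from 2x2 tables; as a reminder of the arithmetic, here is a minimal sketch (the counts are hypothetical and chosen only to illustrate the calculation, not taken from the study):

```python
def odds_ratio(events_a, n_a, events_b, n_b):
    """Unadjusted odds ratio for an outcome in group A vs. group B,
    computed from a 2x2 table."""
    odds_a = events_a / (n_a - events_a)
    odds_b = events_b / (n_b - events_b)
    return odds_a / odds_b

# Hypothetical counts (NOT the study's data): 4/39 deaths in the T+ group
# vs. 13/45 in the T- group.
print(round(odds_ratio(4, 39, 13, 45), 2))  # → 0.28
```

An OR below 1 means the outcome was less likely in the first group; confidence intervals and covariate adjustment (as in the study's adjusted ORs) require additional modeling.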

  18. Evaluation of Dogs with Border Collie Collapse, Including Response to Two Standardized Strenuous Exercise Protocols.

    Science.gov (United States)

    Taylor, Susan; Shmon, Cindy; Su, Lillian; Epp, Tasha; Minor, Katie; Mickelson, James; Patterson, Edward; Shelton, G Diane

    2016-01-01

    Clinical and metabolic variables were evaluated in 13 dogs with border collie collapse (BCC) before, during, and following completion of standardized strenuous exercise protocols. Six dogs participated in a ball-retrieving protocol, and seven dogs participated in a sheep-herding protocol. Findings were compared with 16 normal border collies participating in the same exercise protocols (11 retrieving, five herding). Twelve dogs with BCC developed abnormal mentation and/or an abnormal gait during evaluation. All dogs had post-exercise elevations in rectal temperature, pulse rate, arterial blood pH, PaO2, and lactate, and decreased PaCO2 and bicarbonate, as expected with strenuous exercise, but there were no significant differences between BCC dogs and normal dogs. Electrocardiography demonstrated sinus tachycardia in all dogs following exercise. Needle electromyography was normal, and evaluation of muscle biopsy cryosections using a standard panel of histochemical stains and reactions did not reveal a reason for collapse in 10 dogs with BCC in which these tests were performed. Genetic testing excluded the dynamin-1 related exercise-induced collapse mutation and the V547A malignant hyperthermia mutation as the cause of BCC. Common reasons for exercise intolerance were eliminated. Although a genetic basis is suspected, the cause of collapse in BCC was not determined.

  19. National Cancer Institute Biospecimen Evidence-Based Practices: a novel approach to pre-analytical standardization.

    Science.gov (United States)

    Engel, Kelly B; Vaught, Jim; Moore, Helen M

    2014-04-01

    Variable biospecimen collection, processing, and storage practices may introduce variability in biospecimen quality and analytical results. This risk can be minimized within a facility through the use of standardized procedures; however, analysis of biospecimens from different facilities may be confounded by differences in procedures and inferred biospecimen quality. Thus, a global approach to standardization of biospecimen handling procedures and their validation is needed. Here we present the first in a series of procedural guidelines that were developed and annotated with published findings in the field of human biospecimen science. The series of documents will be known as NCI Biospecimen Evidence-Based Practices, or BEBPs. Pertinent literature was identified via the National Cancer Institute (NCI) Biospecimen Research Database ( brd.nci.nih.gov ) and findings were organized by specific biospecimen pre-analytical factors and analytes of interest (DNA, RNA, protein, morphology). Meta-analysis results were presented as annotated summaries, which highlight concordant and discordant findings and the threshold and magnitude of effects when applicable. The detailed and adaptable format of the document is intended to support the development and execution of evidence-based standard operating procedures (SOPs) for human biospecimen collection, processing, and storage operations.

  20. Proposal for the standardization of flow cytometry protocols to detect minimal residual disease in acute lymphoblastic leukemia

    Science.gov (United States)

    Ikoma, Maura Rosane Valério; Beltrame, Miriam Perlingeiro; Ferreira, Silvia Inês Alejandra Cordoba Pires; Souto, Elizabeth Xisto; Malvezzi, Mariester; Yamamoto, Mihoko

    2015-01-01

    Minimal residual disease is the most powerful predictor of outcome in acute leukemia and is useful in therapeutic stratification for acute lymphoblastic leukemia protocols. Nowadays, the most reliable methods for studying minimal residual disease in acute lymphoblastic leukemia are multiparametric flow cytometry and polymerase chain reaction. Both provide similar results at a minimal residual disease level of 0.01% of normal cells, that is, detection of one leukemic cell in up to 10,000 normal nucleated cells. Currently, therapeutic protocols establish the minimal residual disease threshold value at the most informative time points according to the appropriate methodology employed. The expertise of the laboratory in a cancer center or a cooperative group could be the most important factor in determining which method should be used. In Brazil, multiparametric flow cytometry laboratories are available in most leukemia treatment centers, but multiparametric flow cytometry processes must be standardized for minimal residual disease investigations in order to offer reliable and reproducible results that ensure quality in the clinical application of the method. The Minimal Residual Disease Working Group of the Brazilian Society of Bone Marrow Transplantation (SBTMO) was created with that aim. This paper presents recommendations for the detection of minimal residual disease in acute lymphoblastic leukemia based on the literature and expertise of the laboratories who participated in this consensus, including pre-analytical and analytical methods. This paper also recommends that both multiparametric flow cytometry and polymerase chain reaction are complementary methods, and so more laboratories with expertise in immunoglobulin/T cell receptor (Ig/TCR) gene assays are necessary in Brazil. PMID:26670404
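
    One practical consequence of the 0.01% sensitivity level is the number of events that must be acquired per sample. Assuming the widely used flow-cytometry rule of thumb that the target population should contain at least about 20 clustered cells (an assumption here, not a figure from this consensus):

```python
import math

def events_required(target_freq, min_cluster=20):
    """Events to acquire so that a population at `target_freq` yields at
    least `min_cluster` cells (a common rule of thumb, assumed here)."""
    return math.ceil(min_cluster / target_freq)

# MRD at 0.01% = 1 leukemic cell per 10,000 nucleated cells:
print(events_required(0.0001))  # → 200000
```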

  1. The impact of a new standard labor protocol on maternal and neonatal outcomes.

    Science.gov (United States)

    Wang, Dingran; Ye, Shenglong; Tao, Liyuan; Wang, Yongqing

    2017-12-01

    To analyze the clinical outcomes following the implementation of a new standard labor procedure. This was a retrospective analysis that included a study group consisting of patients managed based on a new standard labor protocol and a control group comprising patients managed under an old standard labor protocol. The following maternal and perinatal outcomes were compared in the two groups: the indications for a cesarean section and the incidence of cesarean section, postpartum hemorrhage, fetal distress, neonatal asphyxia and pediatric intervention. We also compared the average number of days spent in the hospital, the incidence of medical disputes and hospitalization expenses. The cesarean section rates for the study and control groups were 19.29% (401/2079) and 33.53% (753/2246), respectively (P < […]). The indications for cesarean section included […] labor, fetal distress and intrapartum fever; the percentages of each indication were significantly different from those of the control group (P < […]). The new standard labor protocol reduced the cesarean section rate without negatively impacting maternal and neonatal outcomes. In practice, bed turnover and the hospital utilization rate should be better controlled, patient-doctor communication should be strengthened and the quality of obstetrical service should be improved.

  2. Standardized Physician-Administered Patient-Centered Discharge Protocol Improves Patients' Comprehension.

    Science.gov (United States)

    Caceres, Jennifer W; Alter, Scott M; Shih, Richard D; Fernandez, Jimmy D; Williams, Frederick K; Paley, Richard; Benda, William; Clayton, Lisa M

    2017-05-01

    Patients are 30% less likely to be readmitted or visit the emergency department if they have a clear understanding of their discharge instructions. A standardized approach to a hospital discharge plan has not been universally implemented, however. Our goal was to increase patients' comprehension of discharge instructions by implementing a standardized patient-centered discharge planning protocol that uses a physician team member to explain these plans. This was a prospective study that included all of the patients discharged from an inpatient medical teaching service in a community-based hospital during the study period. We used two 4-week periods separated by 4 months in which training and practice with the study intervention took place. Patients' understanding of discharge instructions was assessed via a follow-up telephone call from a physician co-investigator within 1 week of each patient's discharge. Differences in patients' understanding between groups were analyzed. A total of 181 patients were enrolled, with 9 lost to follow-up. After implementation of the discharge planning protocol, a statistically significant improvement in patients' understanding was found in study subjects' knowledge of their diagnosis, the adverse effects of their medications, whom to call after discharge, and follow-up appointments. Institution of a standardized patient-centered discharge planning protocol can improve patients' understanding of several key components of their discharge process, which may lead to improved compliance with instructions and outcomes.

  3. Standard protocol for evaluation of environmental transfer factors around NPP sites

    International Nuclear Information System (INIS)

    Hegde, A.G.; Verma, P.C.; Rao, D.D.

    2009-01-01

    This document presents the standard procedures for evaluation of site-specific environmental transfer factors around NPP sites. The scope of this document is to provide a standard protocol to be followed for evaluation of environmental transfer factors around NPP sites. Studies on transfer factors are being carried out at various NPP sites under DAE-BRNS projects for evaluation of site-specific transfer factors for radionuclides released from power plants. This document contains a common methodology in terms of sampling, processing, measurements and analysis of elements/radionuclides, while keeping the site-specific requirements also in place. (author)

  4. Protocol for Usability Testing and Validation of the ISO Draft International Standard 19223 for Lung Ventilators

    Science.gov (United States)

    2017-01-01

    Background Clinicians, such as respiratory therapists and physicians, are often required to set up pieces of medical equipment that use inconsistent terminology. Current lung ventilator terminology that is used by different manufacturers contributes to the risk of usage errors, and in turn the risk of ventilator-associated lung injuries and other conditions. Human factors and communication issues are often associated with ventilator-related sentinel events, and inconsistent ventilator terminology compounds these issues. This paper describes our proposed protocol, which will be implemented at the University of Waterloo, Canada when this project is externally funded. Objective We propose to determine whether a standardized vocabulary improves the ease of use, safety, and utility as it relates to the usability of medical devices, compared to legacy medical devices from multiple manufacturers, which use different terms. Methods We hypothesize that usage errors by clinicians will be lower when standardization is consistently applied by all manufacturers. The proposed study will experimentally examine the impact of standardized nomenclature on performance declines in the use of an unfamiliar ventilator product in clinically relevant scenarios. Participants will be respiratory therapy practitioners and trainees, and we propose studying approximately 60 participants. Results The work reported here is in the proposal phase. Once the protocol is implemented, we will report the results in a follow-up paper. Conclusions The proposed study will help us better understand the effects of standardization on medical device usability. The study will also help identify any terms in the International Organization for Standardization (ISO) Draft International Standard (DIS) 19223 that may be associated with recurrent errors. Amendments to the standard will be proposed if recurrent errors are identified. This report contributes a protocol that can be used to assess the effect of

  5. Standard protocol for conducting pre-operational environmental surveillance around nuclear facilities

    International Nuclear Information System (INIS)

    Hegde, A.G.; Verma, P.C.; Rajan, M.P.

    2009-02-01

    This document presents the standard procedures for conducting pre-operational environmental surveillance around nuclear facilities. The scope of this document is to provide a standard protocol to be followed for such surveillance, which has been proposed to be carried out by university professionals under DAE-BRNS projects. This document contains a common methodology in terms of sampling, processing, measurements and analysis of elements/radionuclides, while keeping the site-specific requirements also in place. (author)

  6. Why standard brain-computer interface (BCI) training protocols should be changed: an experimental study

    Science.gov (United States)

    Jeunet, Camille; Jahanpour, Emilie; Lotte, Fabien

    2016-06-01

    Objective. While promising, electroencephalography-based brain-computer interfaces (BCIs) are barely used due to their lack of reliability: 15% to 30% of users are unable to control a BCI. Standard training protocols may be partly responsible, as they do not satisfy recommendations from psychology. Our main objective was to determine in practice to what extent standard training protocols impact users' motor imagery based BCI (MI-BCI) control performance. Approach. We performed two experiments. The first consisted in evaluating the efficiency of a standard BCI training protocol for the acquisition of non-BCI related skills in a BCI-free context, which enabled us to rule out the possible impact of BCIs on the training outcome. Thus, participants (N = 54) were asked to perform simple motor tasks. The second experiment was aimed at measuring the correlations between motor tasks and MI-BCI performance. The ten best and ten worst performers of the first study were recruited for an MI-BCI experiment during which they had to learn to perform two MI tasks. We also assessed users' spatial ability and pre-training μ rhythm amplitude, as both have been related to MI-BCI performance in the literature. Main results. Around 17% of the participants were unable to learn to perform the motor tasks, which is close to the BCI illiteracy rate. This suggests that standard training protocols are suboptimal for skill teaching. No correlation was found between motor tasks and MI-BCI performance. However, spatial ability played an important role in MI-BCI performance. In addition, once the spatial ability covariable had been controlled for, using an ANCOVA, it appeared that participants who faced difficulty during the first experiment improved during the second while the others did not. Significance. These studies suggest that (1) standard MI-BCI training protocols are suboptimal for skill teaching, (2) spatial ability is confirmed as impacting on MI-BCI performance, and (3) when faced

  7. An analysis of the Token Ring protocol as specified in ANSI/IEEE Standard 802.5-1985

    OpenAIRE

    Ayik, Nejdet

    1989-01-01

    Approved for public release; distribution is unlimited. This thesis discusses the formal specification techniques for communication protocols and the ANSI/IEEE Standard 802.5 Token Ring Access Method and Physical Layer Specification. Background information on formal protocol specification and a review of the targeted standard are provided. The ambiguities that were found in the standard, and solutions to some of them, are presented. The study concludes that there is a growing need to find...

  8. Multicenter validation of the analytical accuracy of Salmonella PCR: towards an international standard

    DEFF Research Database (Denmark)

    Malorny, B.; Hoorfar, Jeffrey; Bunge, C.

    2003-01-01

    As part of a major international project for the validation and standardization of PCR for detection of five major food-borne pathogens, four primer sets specific for Salmonella species were evaluated in-house for their analytical accuracy (selectivity and detection limit) in identifying 43 Salmonella strains. Evaluation of selectivity by using 364 strains showed that the inclusivity was 99.6% and the exclusivity was 100% for the invA primer set. To indicate possible PCR inhibitors derived from the sample DNA, an internal amplification control (IAC), which was coamplified with the invA target gene, was constructed. The method is proposed as an international standard. This study addresses the increasing demand of quality assurance laboratories for standard diagnostic methods and presents findings that can facilitate the international comparison and exchange of epidemiological data.

  9. Moving standard deviation and moving sum of outliers as quality tools for monitoring analytical precision.

    Science.gov (United States)

    Liu, Jiakai; Tan, Chin Hon; Badrick, Tony; Loh, Tze Ping

    2018-02-01

    An increase in analytical imprecision (expressed as CVa) can introduce additional variability (i.e. noise) into patient results, which poses a challenge to the optimal management of patients. Relatively little work has been done to address the need for continuous monitoring of analytical imprecision. Through numerical simulations, we describe the use of the moving standard deviation (movSD) and the recently described moving sum of outliers (movSO) of patient results as means for detecting increased analytical imprecision, and compare their performance against internal quality control (QC) and average of normal (AoN) approaches. The power to detect an increase in CVa is suboptimal under routine internal QC procedures. The AoN technique almost always had the highest average number of patient results affected before error detection (ANPed), indicating that it generally had the worst capability for detecting an increased CVa. On the other hand, the movSD and movSO approaches were able to detect an increased CVa at significantly lower ANPed, particularly for measurands that display a relatively small ratio of biological variation to CVa. Conclusion: the movSD and movSO approaches are effective in detecting an increase in CVa for high-risk measurands with small biological variation. Their performance is relatively poor when the biological variation is large. However, the clinical risk of an increase in analytical imprecision is attenuated for these measurands, as the increase adds only marginally to the total variation and is less likely to impact clinical care. Copyright © 2017 The Canadian Society of Clinical Chemists. Published by Elsevier Inc. All rights reserved.
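
    A minimal sketch of the movSD monitor, assuming a fixed window sliding over the stream of patient results (the published approach additionally tunes window size, truncation limits, and control limits by simulation):

```python
import random
from collections import deque
from statistics import stdev

def moving_sd(results, window=100):
    """Moving standard deviation over a stream of patient results
    (simplified movSD sketch)."""
    buf, out = deque(maxlen=window), []
    for x in results:
        buf.append(x)
        if len(buf) == window:
            out.append(stdev(buf))
    return out

# Simulated analyte stream: stable imprecision, then the analytical SD doubles.
random.seed(1)
stable = [random.gauss(100, 2) for _ in range(300)]
noisy = [random.gauss(100, 4) for _ in range(300)]
sds = moving_sd(stable + noisy, window=100)
print(round(sds[0], 1), round(sds[-1], 1))  # movSD rises once noise enters the window
```

In practice an alarm limit would be set on the movSD (e.g. from its distribution under stable operation), flagging sustained increases rather than single spikes.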

  10. Age and gender leucocytes variances and references values generated using the standardized ONE-Study protocol

    DEFF Research Database (Denmark)

    Kverneland, Anders H.; Streitz, Mathias; Geissler, Edward

    2016-01-01

Flow cytometry is now accepted as an ideal technology to reveal changes in immune cell composition and function. However, it is also an error-prone and variable technology, which makes it difficult to reproduce findings across laboratories. We have recently developed a strategy to standardize whole blood flow cytometry. The performance of our protocols was challenged here by profiling samples from healthy volunteers to reveal age- and gender-dependent differences and to establish a standardized reference cohort for use in clinical trials. Whole blood samples from two different cohorts were analyzed. Males and females showed different dynamics in age-dependent T cell activation and differentiation, indicating faster immunosenescence in males. Importantly, although both cohorts consisted of a small sample size, our standardized approach enabled validation of age-dependent changes with the second cohort.

  11. Preoperative vestibular assessment protocol of cochlear implant surgery: an analytical descriptive study

    Directory of Open Access Journals (Sweden)

    Roseli Saraiva Moreira Bittar

Introduction: Cochlear implants are undeniably an effective method for the recovery of hearing function in patients with hearing loss. Objective: To describe the preoperative vestibular assessment protocol in subjects who will be submitted to cochlear implants. Methods: Our institutional protocol provides the vestibular diagnosis through six simple tests: the Romberg and Fukuda tests, assessment for spontaneous nystagmus, the Head Impulse Test, evaluation for head-shaking nystagmus, and the caloric test. Results: Twenty-one patients were evaluated, with a mean age of 42.75 ± 14.38 years. Only 28% of the sample had all normal test results. The presence of asymmetric vestibular information was documented through the caloric test in 32% of the sample, and spontaneous nystagmus was an important clue for the diagnosis. Bilateral vestibular areflexia was present in four subjects, unilateral areflexia in three, and bilateral hyporeflexia in two. The Head Impulse Test was a significant indicator for the diagnosis of areflexia in the tested ear (p = 0.0001). The sensitized Romberg test using a foam pad was able to diagnose severe vestibular function impairment (p = 0.003). Conclusion: The six clinical tests were able to identify the presence or absence of vestibular function and function asymmetry between the ears of the same individual.

  12. Quantification of theobromine and caffeine in saliva, plasma and urine via liquid chromatography-tandem mass spectrometry: a single analytical protocol applicable to cocoa intervention studies.

    Science.gov (United States)

    Ptolemy, Adam S; Tzioumis, Emma; Thomke, Arjun; Rifai, Sami; Kellogg, Mark

    2010-02-01

Targeted analyses of clinically relevant metabolites in human biofluids often require extensive sample preparation (e.g., desalting, protein removal and/or preconcentration) prior to quantitation. In this report, a single ultracentrifugation-based sample pretreatment combined with a designed liquid chromatography-tandem mass spectrometry (LC-MS/MS) protocol provides selective quantification of 3,7-dimethylxanthine (theobromine) and 1,3,7-trimethylxanthine (caffeine) in human saliva, plasma and urine samples. The optimized chromatography permitted elution of both analytes within 1.3 min of the applied gradient. Positive-mode electrospray ionization and a triple quadrupole MS/MS instrument operated in multiple reaction monitoring mode were used for detection. (13)C(3) isotopically labeled caffeine was included as an internal standard to improve accuracy and precision. Implementing a 20-fold dilution of the isolated low-molecular-weight biofluid fraction prior to injection effectively minimized the deleterious contributions of all three matrices to quantitation. The assay was linear over a 160-fold concentration range, from 2.5 to 400 micromol L(-1), for both theobromine (average R(2) 0.9968) and caffeine (average R(2) 0.9997). Analyte peak area variations for 2.5 micromol L(-1) caffeine and theobromine in saliva, plasma and urine ranged from 5 and 10% (intra-day, N=10) to 9 and 13% (inter-day, N=25), respectively. The intra- and inter-day precision of the theobromine and caffeine elution times was within 3%, and recoveries ranged from 114 to 118% and 99 to 105% at concentration levels of 10 and 300 micromol L(-1). This validated protocol also permitted the relative saliva, plasma and urine distribution of both theobromine and caffeine to be quantified following a cocoa intervention.
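The internal-standard quantitation described above, fitting the ratio of analyte to labeled-standard peak area against concentration, amounts to an ordinary least-squares calibration. The sketch below is a generic illustration of that technique, not the paper's validated procedure; the function name and example figures are invented.

```python
def internal_standard_calibration(concs, analyte_areas, is_areas):
    """Least-squares calibration of the analyte / internal-standard
    peak-area ratio against concentration.

    Generic sketch of isotope-labeled internal-standard quantitation;
    returns (slope, intercept, r_squared) of the calibration line.
    """
    ratios = [a / b for a, b in zip(analyte_areas, is_areas)]
    n = len(concs)
    mx = sum(concs) / n
    my = sum(ratios) / n
    sxx = sum((x - mx) ** 2 for x in concs)
    sxy = sum((x - mx) * (y - my) for x, y in zip(concs, ratios))
    slope = sxy / sxx
    intercept = my - slope * mx
    # Coefficient of determination (R^2) of the fitted line
    ss_res = sum((y - (slope * x + intercept)) ** 2
                 for x, y in zip(concs, ratios))
    ss_tot = sum((y - my) ** 2 for y in ratios)
    return slope, intercept, 1 - ss_res / ss_tot
```

Ratioing against a co-eluting isotopically labeled standard cancels matrix-dependent ionization effects, which is why the abstract reports high linearity across three very different biofluids.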

  13. Low-dose versus standard-dose CT protocol in patients with clinically suspected renal colic.

    Science.gov (United States)

    Poletti, Pierre-Alexandre; Platon, Alexandra; Rutschmann, Olivier T; Schmidlin, Franz R; Iselin, Christophe E; Becker, Christoph D

    2007-04-01

The purpose of our study was to compare a low-dose abdominal CT protocol, delivering a radiation dose close to that of abdominal radiography, with standard-dose unenhanced CT in patients with suspected renal colic. One hundred twenty-five patients (87 men, 38 women; mean age, 45 years) who were admitted with suspected renal colic underwent both abdominal low-dose CT (30 mAs) and standard-dose CT (180 mAs). Low-dose CT and standard-dose CT were independently reviewed, in a delayed fashion, by two radiologists for the characterization of renal and ureteral calculi (location, size) and for indirect signs of renal colic (renal enlargement, pyeloureteral dilatation, periureteral or renal stranding). Results reported for low-dose CT, with regard to the patients' body mass indexes (BMIs), were compared with those obtained with standard-dose CT (reference standard). The presence of non-urinary tract-related disorders was also assessed. Informed consent was obtained from all patients. In patients with a BMI < 30, the sensitivity and specificity of low-dose CT for ureteral calculi > 3 mm were close to those of standard-dose CT. Low-dose CT was 100% sensitive and specific for depicting non-urinary tract-related disorders (n = 6). Low-dose CT achieves sensitivities and specificities close to those of standard-dose CT in assessing the diagnosis of renal colic, depicting ureteral calculi > 3 mm in patients with a BMI < 30, and correctly identifying alternative diagnoses.

  14. Creatinine Assay Attainment of Analytical Performance Goals Following Implementation of IDMS Standardization

    Directory of Open Access Journals (Sweden)

    Elizabeth Sunmin Lee

    2017-02-01

Background: The international initiative to standardize creatinine (Cr) assays, by tracing reference materials to Isotope Dilution Mass Spectrometry (IDMS)-assigned values, was implemented to reduce interlaboratory variability and improve assay accuracy. Objective: The aims of this study were to examine whether IDMS standardization has improved Cr assay accuracy (bias), interlaboratory variability (precision), total error (TE), and attainment of recommended analytical performance goals. Methods: External Quality Assessment (EQA) data (n = 66 challenge vials) from Ontario, Canada, were analyzed. The bias, precision, TE, and the number of EQA challenge vials meeting performance goals were determined by assay manufacturer before (n = 32) and after (n = 34) IDMS implementation. Results: The challenge vials with the worst bias and precision were those spiked with known common interfering substances (glucose and bilirubin). IDMS standardization improved assay bias (from 10.4% to 1.6%, P < .001), but precision remained unchanged (5.0% vs 4.7%, P = .5), with performance goals not consistently being met. Precision and TE goals based on biologic variation were attained by only 29% to 69% and 32% to 62% of challenge vials, respectively. Conclusions: While IDMS standardization has improved Cr assay accuracy and thus reduced TE, significant interlaboratory variability remains. Contemporary Cr assays do not currently meet the standards required to allow for accurate and consistent estimated glomerular filtration rate assessment and chronic kidney disease diagnosis across laboratories. Further improvements in Cr assay performance are needed.
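The metrics in this record (percent bias, percent CV, total error) can be computed directly from EQA challenge results. A minimal sketch follows, using the common convention TE = |bias%| + z·CV%; the coefficient z = 1.65 and the function itself are illustrative assumptions, since the article's exact TE formula is not given here.

```python
import statistics

def assay_performance(measured, target, z=1.65):
    """Summarize EQA results for one assay: % bias, % CV, and total error.

    TE = |bias%| + z * CV% is one common convention; z = 1.65 is an
    assumption, not the study's stated formula.
    """
    mean = statistics.mean(measured)
    bias_pct = 100 * (mean - target) / target       # accuracy vs. target value
    cv_pct = 100 * statistics.stdev(measured) / mean  # interlab precision
    te_pct = abs(bias_pct) + z * cv_pct
    return bias_pct, cv_pct, te_pct
```

Run per challenge vial and per manufacturer, this is the kind of summary that lets bias improve (post-IDMS) while precision, and hence much of TE, stays unchanged.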

  15. Does leaf chemistry differentially affect breakdown in tropical vs temperate streams? Importance of standardized analytical techniques to measure leaf chemistry

    Science.gov (United States)

Marcelo Ardón; Catherine M. Pringle; Susan L. Eggert

    2009-01-01

Comparisons of the effects of leaf litter chemistry on leaf breakdown rates in tropical vs temperate streams are hindered by the incompatibility, among studies and across sites, of the analytical methods used to measure leaf chemistry. We used standardized analytical techniques to measure the chemistry and breakdown rate of leaves from common riparian tree species at 2 sites, 1...

  16. Antioxidant Phytochemicals in Fresh Produce: Exploitation of Genotype Variation and Advancements in Analytical Protocols

    Directory of Open Access Journals (Sweden)

    George A. Manganaris

    2018-02-01

Horticultural commodities (fruit and vegetables) are the major dietary source of several bioactive compounds of high nutraceutical value for humans, including polyphenols, carotenoids and vitamins. The aim of the current review was twofold. Firstly, toward the eventual enhancement of horticultural crops with bio-functional compounds, the natural genetic variation in antioxidants found in different species and cultivars/genotypes is underlined. Notably, some landraces and/or traditional cultivars are characterized by substantially higher phytochemical content; for example, the small tomato of Santorini island (cv. "Tomataki Santorinis") possesses appreciably high amounts of ascorbic acid (AsA). The systematic screening of key bioactive compounds in a wide range of germplasm for the identification of promising genotypes, and the restoration of key gene fractions from wild species and landraces, may help in reducing the loss of agro-biodiversity, creating a healthier "gene pool" as the basis of future adaptation. Toward this direction, large-scale comparative studies of different cultivars/genotypes of a given species provide useful insights about the ones of higher nutritional value. Secondly, advancements in the employment of analytical techniques to determine antioxidant potential in a convenient, easy and fast way are outlined. Such analytical techniques include electron paramagnetic resonance (EPR) and infrared (IR) spectroscopy, electrochemical and chemometric methods, flow injection analysis (FIA), optical sensors, and high resolution screening (HRS). Taking into consideration that fruits and vegetables are complex mixtures of water- and lipid-soluble antioxidants, the exploitation of chemometrics to develop "omics" platforms (i.e., metabolomics, foodomics) is a promising tool for researchers to decode and/or predict the antioxidant activity of fresh produce. For industry, the use of optical sensors and IR spectroscopy is recommended to...

  17. Standardized communication protocol for BAS (IEIEJ/p); BAS hyojun interface shiyo (IEIEJ/p)

    Energy Technology Data Exchange (ETDEWEB)

    Toyoda, T. [Hitachi Building System Co. Ltd., Tokyo (Japan)

    2000-10-05

For the BEMS user, constructing a BEMS under a multi-vendor environment is very beneficial, because the user can choose the most appropriate vendor for each subsystem, from the viewpoint of technique and cost, at any time. An effective tool that makes a BEMS under a multi-vendor environment possible is the BACnet protocol, developed and standardized by ANSI/ASHRAE in the U.S. The Institute of Electrical Installation Engineers of Japan (IEIEJ) offers IEIEJ/p, based on BACnet, as the IEIEJ standard; it adds autonomous decentralized control to enhance BEMS reliability and to fit the Japanese multi-vendor environment. In this paper I present an outline of the specification and features of IEIEJ/p. (author)

  18. Multiparametric multidetector computed tomography scanning on suspicion of hyperacute ischemic stroke: validating a standardized protocol

    Directory of Open Access Journals (Sweden)

    Felipe Torres Pacheco

    2013-06-01

Multidetector computed tomography (MDCT) scanning has enabled the early diagnosis of hyperacute brain ischemia. We aimed to validate a standardized protocol for reading and reporting MDCT techniques in a series of adult patients. The inter-observer agreement among the trained examiners was tested, and their results were compared with a standard reading. No false positives were observed, and an almost perfect agreement (kappa > 0.81) was documented when the CT angiography (CTA) and cerebral perfusion CT (CPCT) map data were added to the noncontrast CT (NCCT) analysis. The inter-observer agreement was higher for highly trained readers, corroborating the need for specific training to interpret these modern techniques. The authors recommend adding CTA and CPCT to the NCCT analysis in order to clarify the global analysis of structural and hemodynamic brain abnormalities. Our structured report is suitable as a script for the reproducible analysis of the MDCT of patients on suspicion of ischemic stroke.
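The "almost perfect agreement (kappa > 0.81)" cited above refers to Cohen's kappa, which corrects raw inter-observer agreement for the agreement expected by chance. The function below is a generic sketch of that statistic for two readers' categorical calls, not the study's own computation.

```python
from collections import Counter

def cohens_kappa(rater_a, rater_b):
    """Cohen's kappa for two readers' categorical calls
    (e.g. ischemia present/absent per scan).

    kappa = (observed agreement - chance agreement) / (1 - chance agreement)
    """
    assert len(rater_a) == len(rater_b)
    n = len(rater_a)
    observed = sum(a == b for a, b in zip(rater_a, rater_b)) / n
    # Chance agreement from each rater's marginal category frequencies
    ca, cb = Counter(rater_a), Counter(rater_b)
    expected = sum(ca[k] * cb[k] for k in ca) / (n * n)
    return (observed - expected) / (1 - expected)
```

On the conventional Landis–Koch scale, kappa above 0.81 (as reported here) is read as "almost perfect" agreement.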

  19. Effectiveness of various irrigation protocols for the removal of calcium hydroxide from artificial standardized grooves

    Directory of Open Access Journals (Sweden)

    Hakan GOKTURK

Objective: The aim of this study was to investigate the ability of laser-activated irrigation (LAI), XP-endo Finisher, CanalBrush, Vibringe, passive ultrasonic irrigation (PUI), and conventional syringe irrigation systems to remove calcium hydroxide (CH) from simulated root canal irregularities. Material and Methods: The root canals of one hundred and five extracted single-rooted teeth were instrumented using Reciproc rotary files up to size R40. The teeth were split longitudinally. Two of three standardized grooves were created in the coronal and apical sections of one segment, and the third in the middle part of the other segment. The grooves were filled with CH and the root halves were reassembled. After 14 days, the specimens were randomly divided into 7 experimental groups (n=15/group). CH was removed as follows: Group 1: beveled needle irrigation; Group 2: double side-vented needle irrigation; Group 3: CanalBrush; Group 4: XP-endo Finisher; Group 5: Vibringe; Group 6: PUI; Group 7: LAI. The amount of remaining CH in the grooves was scored under a stereomicroscope at 20× magnification. Statistical evaluation was performed using Kruskal-Wallis and Bonferroni-corrected Mann-Whitney U tests. Results: Groups 1 and 2 were the least efficient in eliminating CH from the grooves. Groups 6 and 7 eliminated more CH than the other protocols; however, no significant difference was found between these two groups (P > .05). Conclusions: None of the investigated protocols was able to completely remove CH from all three root regions, although LAI and PUI left less residual CH in the artificial grooves than the other protocols.

  20. A Standardized Shift Handover Protocol: Improving Nurses’ Safe Practice in Intensive Care Units

    Directory of Open Access Journals (Sweden)

    Javad Malekzadeh

    2013-08-01

Introduction: For maintaining the continuity of care and improving the quality of care, effective inter-shift information communication is necessary. Any handover error can endanger patient safety. Despite the importance of shift handover, there is no standard handover protocol in our healthcare settings. Methods: In this one-group pretest-posttest quasi-experimental study, conducted in the spring and summer of 2011, we recruited a convenience sample of 56 ICU nurses. The Nurses' Safe Practice Evaluation Checklist was used for data collection. The Content Validity Index and the inter-rater correlation coefficient of the checklist were 0.92 and 0.89, respectively. We employed SPSS 11.5 and the McNemar and paired-samples t tests for data analysis. Results: Study findings revealed that nurses' mean score on the Safe Practice Evaluation Checklist increased significantly from 11.6 (2.7) to 17.0 (1.8) (P < 0.001). Conclusion: Using a standard handover protocol for communicating the patient's needs and information improves nurses' safe practice in the area of basic nursing care.

  1. Effect of a Standardized Protocol of Antibiotic Therapy on Surgical Site Infection after Laparoscopic Surgery for Complicated Appendicitis.

    Science.gov (United States)

    Park, Hyoung-Chul; Kim, Min Jeong; Lee, Bong Hwa

Although it is accepted that complicated appendicitis requires antibiotic therapy to prevent post-operative surgical infections, consensus protocols on the duration and regimens of treatment are not well established. This study aimed to compare the outcome of post-operative infectious complications in patients receiving the old non-standardized and the new standardized antibiotic protocols, involving 10 or 5 days of treatment, respectively. We enrolled 1,343 patients who underwent laparoscopic surgery for complicated appendicitis between January 2009 and December 2014. With the introduction of the new protocol, the patients fell into two groups: 10 days of various antibiotic regimens (January 2009 to June 2012, the non-standardized protocol; n = 730) and 5 days of a cefuroxime and metronidazole regimen (July 2012 to December 2014, the standardized protocol; n = 613). We compared the clinical outcomes, including surgical site infection (SSI) (superficial and deep organ/space infections), in the two groups. The standardized protocol group had a slightly shorter operative time (67 vs. 69 min), a shorter hospital stay (5 vs. 5.4 d), and lower medical cost (US$1,564 vs. US$1,654). Otherwise, there was no difference between the groups. No differences were found between the non-standardized and standardized protocol groups with regard to the rate of superficial infection (10.3% vs. 12.7%; p = 0.488) or deep organ/space infection (2.3% vs. 2.1%; p = 0.797). In patients undergoing laparoscopic surgery for complicated appendicitis, five days of cefuroxime and metronidazole did not lead to more SSIs, and it decreased the medical costs compared with non-standardized antibiotic regimens.

  2. Comparison of a fast 5-min knee MRI protocol with a standard knee MRI protocol. A multi-institutional multi-reader study

    International Nuclear Information System (INIS)

    FitzGerald Alaia, Erin; Beltran, Luis S.; Garwood, Elisabeth; Burke, Christopher J.; Gyftopoulos, Soterios; Benedick, Alex; Obuchowski, Nancy A.; Polster, Joshua M.; Schils, Jean; Subhas, Naveen; Chang, I. Yuan Joseph

    2018-01-01

To compare the diagnostic performance of a 5-min knee MRI protocol with that of a standard knee MRI. One hundred 3-T (100 patients, mean age 38.8 years) and fifty 1.5-T (46 patients, mean age 46.4 years) MRIs, consisting of five fast 2D multiplanar fast-spin-echo (FSE) sequences and five standard multiplanar FSE sequences, from two academic centers (1/2015-1/2016), were retrospectively reviewed by four musculoskeletal radiologists. Agreement between fast and standard readings (interprotocol agreement) and between standard readings (intraprotocol agreement) for meniscal, ligamentous, chondral, and bone pathology was compared for interchangeability. The frequency of major findings, sensitivity, and specificity was also tested for each protocol. Interprotocol agreement using fast MRI was similar to intraprotocol agreement with standard MRI (83.0-99.5%), with no excess disagreement (≤ 1.2; 95% CI, -4.2 to 3.8%), across all structures. The frequency of major findings (1.1-22.4% across structures) on fast and standard MRI was not significantly different (p ≥ 0.215), except for more ACL tears on fast MRI (p = 0.021) and more cartilage defects on standard MRI (p < 0.001). Sensitivities (59-100%) and specificities (73-99%) of fast and standard MRI were not significantly different for meniscal and ligament tears (95% CI for difference, -0.08 to 0.08). For cartilage defects, fast MRI was slightly less sensitive (95% CI for difference, -0.125 to -0.01) but slightly more specific (95% CI for difference, 0.01-0.5) than standard MRI. A fast 5-min MRI protocol is interchangeable with, and has similar accuracy to, a standard knee MRI for evaluating internal derangement of the knee. (orig.)

  3. Comparison of a fast 5-min knee MRI protocol with a standard knee MRI protocol. A multi-institutional multi-reader study

    Energy Technology Data Exchange (ETDEWEB)

    FitzGerald Alaia, Erin; Beltran, Luis S.; Garwood, Elisabeth; Burke, Christopher J.; Gyftopoulos, Soterios [NYU Langone Medical Center, Department of Radiology, Musculoskeletal Division, New York, NY (United States); Benedick, Alex [Case Western Reserve University, School of Medicine, Cleveland, OH (United States); Obuchowski, Nancy A. [Cleveland Clinic, Department of Quantitative Health Sciences, Cleveland, OH (United States); Polster, Joshua M.; Schils, Jean; Subhas, Naveen [Cleveland Clinic, Department of Radiology, Musculoskeletal Division, Cleveland, OH (United States); Chang, I. Yuan Joseph [Texas Scottish Rite Hospital for Children, Dallas, TX (United States)

    2018-01-15

To compare the diagnostic performance of a 5-min knee MRI protocol with that of a standard knee MRI. One hundred 3-T (100 patients, mean age 38.8 years) and fifty 1.5-T (46 patients, mean age 46.4 years) MRIs, consisting of five fast 2D multiplanar fast-spin-echo (FSE) sequences and five standard multiplanar FSE sequences, from two academic centers (1/2015-1/2016), were retrospectively reviewed by four musculoskeletal radiologists. Agreement between fast and standard readings (interprotocol agreement) and between standard readings (intraprotocol agreement) for meniscal, ligamentous, chondral, and bone pathology was compared for interchangeability. The frequency of major findings, sensitivity, and specificity was also tested for each protocol. Interprotocol agreement using fast MRI was similar to intraprotocol agreement with standard MRI (83.0-99.5%), with no excess disagreement (≤ 1.2; 95% CI, -4.2 to 3.8%), across all structures. The frequency of major findings (1.1-22.4% across structures) on fast and standard MRI was not significantly different (p ≥ 0.215), except for more ACL tears on fast MRI (p = 0.021) and more cartilage defects on standard MRI (p < 0.001). Sensitivities (59-100%) and specificities (73-99%) of fast and standard MRI were not significantly different for meniscal and ligament tears (95% CI for difference, -0.08 to 0.08). For cartilage defects, fast MRI was slightly less sensitive (95% CI for difference, -0.125 to -0.01) but slightly more specific (95% CI for difference, 0.01-0.5) than standard MRI. A fast 5-min MRI protocol is interchangeable with, and has similar accuracy to, a standard knee MRI for evaluating internal derangement of the knee. (orig.)

  4. Determination of perfluorinated compounds in human plasma and serum standard reference materials using independent analytical methods

    Energy Technology Data Exchange (ETDEWEB)

    Reiner, Jessica L. [National Institute of Standards and Technology, Analytical Chemistry Division, Gaithersburg, MD (United States); National Institute of Standards and Technology, Analytical Chemistry Division, Hollings Marine Laboratory, Charleston, SC (United States); Phinney, Karen W. [National Institute of Standards and Technology, Analytical Chemistry Division, Gaithersburg, MD (United States); Keller, Jennifer M. [National Institute of Standards and Technology, Analytical Chemistry Division, Hollings Marine Laboratory, Charleston, SC (United States)

    2011-11-15

Perfluorinated compounds (PFCs) were measured in three National Institute of Standards and Technology (NIST) Standard Reference Materials (SRMs) (SRM 1950 Metabolites in Human Plasma, SRM 1957 Organic Contaminants in Non-fortified Human Serum, and SRM 1958 Organic Contaminants in Fortified Human Serum) using two analytical approaches. The methods offer some independence, with two extraction types and two liquid chromatographic separation methods. The first extraction method investigated the acidification of the sample followed by solid-phase extraction (SPE) using a weak anion exchange cartridge. The second method used an acetonitrile extraction followed by SPE using a graphitized non-porous carbon cartridge. The extracts were separated using a reversed-phase C{sub 8} stationary phase and a pentafluorophenyl (PFP) stationary phase. Measured values from both methods for the two human serum SRMs, 1957 and 1958, agreed with reference values on the Certificates of Analysis. Perfluorooctane sulfonate (PFOS) values were obtained for the first time in human plasma SRM 1950 with good reproducibility among the methods (below 5% relative standard deviation). The nominal mass interference from taurodeoxycholic acid, which has caused overestimation of the amount of PFOS in biological samples, was separated from PFOS using the PFP stationary phase. Other PFCs were also detected in SRM 1950 and are reported. SRM 1950 can be used as a control material for human biomonitoring studies and as an aid to develop new measurement methods. (orig.)

  5. Use of the Multi-Agency Radiological Laboratory Analytical Protocols Manual (MARLAP) for site cleanup activities

    International Nuclear Information System (INIS)

    Griggs, J.

    1999-01-01

MARLAP is being developed as a multi-agency guidance manual for project managers and radioanalytical laboratories. The document uses a performance-based approach and will provide guidance and a framework to assure that laboratory radioanalytical data meet the specific project or program needs and requirements. MARLAP supports a wide range of data collection activities, including site characterization and compliance demonstration activities. Current participants include: US Environmental Protection Agency (EPA), US Department of Energy (DOE), US Nuclear Regulatory Commission (NRC), US Department of Defense (DoD), US National Institute of Standards and Technology (NIST), US Geological Survey (USGS), US Food and Drug Administration (FDA), Commonwealth of Kentucky, and the State of California. MARLAP is the radioanalytical laboratory counterpart to the Multi-Agency Radiological Survey and Site Investigation Manual (MARSSIM). MARLAP is currently in a preliminary draft stage. (author)

  6. From basic survival analytic theory to a non-standard application

    CERN Document Server

    Zimmermann, Georg

    2017-01-01

Georg Zimmermann provides a mathematically rigorous treatment of basic survival analytic methods. His emphasis is also placed on various questions and problems, especially with regard to life expectancy calculations, arising from a particular real-life dataset on patients with epilepsy. The author shows both the step-by-step analyses of that dataset and the theory the analyses are based on. He demonstrates that one may face serious and sometimes unexpected problems, even when conducting very basic analyses. Moreover, the reader learns that a practically relevant research question may look rather simple at first sight; nevertheless, compared to standard textbooks, a more detailed account of the theory underlying life expectancy calculations is needed in order to provide a mathematically rigorous framework. Contents: Regression Models for Survival Data; Model Checking Procedures; Life Expectancy. Target Groups: Researchers, lecturers, and students in the fields of mathematics and statistics; academics and experts work...

  7. Variation in institutional review board responses to a standard, observational, pediatric research protocol.

    Science.gov (United States)

    Mansbach, Jonathan; Acholonu, Uchechi; Clark, Sunday; Camargo, Carlos A

    2007-04-01

    Multicenter studies are becoming more common, and variability in local institutional review board (IRB) assessments can be problematic. To investigate the variability of IRB responses to a multicenter observational study of children presenting to emergency departments. The authors collected the original IRB applications, subsequent correspondence, and a survey assessing submission timing and response and the nature of IRB queries. The study was conducted as part of the Emergency Medicine Network (http://www.emnet-usa.org). Of 37 sites initiating the IRB process, 34 (92%) participated in this IRB-approved study. Institutional review boards returned initial applications in a median of 19 days (IQR, 11-34 d), and 91% considered the protocol to be minimal risk. Of 34 submissions, 13 required no changes, 18 received conditional approvals, and 3 were deferred. The median time from initial submission to final approval was 42 days (IQR, 27-61 d). Seven sites did not participate in patient recruitment: two had institutional issues, one obtained IRB approval too late for participation, and four sites (12%) reported that IRB hurdles contributed to their lack of participation. Nonetheless, 68% of sites that recruited patients reported that the overall experience made them more likely to participate in future multicenter research. There was substantial variation in IRB assessment of a standard protocol in this study. The burden of the application process contributed to some investigators not participating, but the majority of investigators remain enthusiastic about multicenter research. A national IRB may streamline the review process and facilitate multicenter clinical research.

  8. The effect of a standardized protocol for iron supplementation to blood donors low in hemoglobin concentration.

    Science.gov (United States)

    Magnussen, Karin; Bork, Nanna; Asmussen, Lisa

    2008-04-01

Iron deficiency leading to low hemoglobin concentration (cHb) is a common problem for blood donors as well as for blood banks. A standardized protocol offering iron supplementation based on P-ferritin determination may help to reduce the problem and retain donors. This was a prospective study in which 879 blood donors presenting with cHb at or below the limit of acceptance for donation were included. The predonation cHb result was read after donation. The donors received 50 iron tablets (JernC or Ferrochel; 100 or 25 mg elemental iron, respectively), and samples for P-ferritin, mean corpuscular volume, and control of cHb were secured. Based on a P-ferritin level of less than 60 microg per L, 20 iron tablets were offered after all following donations. Mean cHb was 7.6 mmol per L (122 g/L) and 8.2 mmol per L (132 g/L) in women and men, respectively. In 80 percent of the women and 48 percent of the men, iron stores were low (P-ferritin < 60 microg per L). A standardized protocol offering iron supplementation and simple oral and written advice based on P-ferritin measurements is effective in normalizing cHb and retaining donors presenting with cHb at or below the limit of acceptance for donation.

  9. VALIDATION OF ANALYTICAL METHODS AND INSTRUMENTATION FOR BERYLLIUM MEASUREMENT: REVIEW AND SUMMARY OF AVAILABLE GUIDES, PROCEDURES, AND PROTOCOLS

    Energy Technology Data Exchange (ETDEWEB)

    Ekechukwu, A.

    2008-12-17

This document provides a listing of available sources that can be used to validate analytical methods and/or instrumentation for beryllium determination. A literature review was conducted of available standard methods and publications used for method validation and/or quality control. A comprehensive listing of the articles, papers, and books reviewed is given in Appendix 1. Available validation documents and guides are listed in the appendix; each has a brief description of application and use. In the referenced sources, there are varying approaches to validation and varying descriptions of validation at different stages in method development. This discussion focuses on validation and verification of fully developed methods and instrumentation that have been offered for use or approval by other laboratories or official consensus bodies such as ASTM International, the International Standards Organization (ISO), and the Association of Official Analytical Chemists (AOAC). This review was conducted as part of a collaborative effort to investigate and improve the state of validation for measuring beryllium in the workplace and the environment. Documents and publications from the United States and Europe are included. Unless otherwise specified, all documents were published in English.

  10. VALIDATION OF ANALYTICAL METHODS AND INSTRUMENTATION FOR BERYLLIUM MEASUREMENT: REVIEW AND SUMMARY OF AVAILABLE GUIDES, PROCEDURES, AND PROTOCOLS

    Energy Technology Data Exchange (ETDEWEB)

    Ekechukwu, A

    2009-05-27

    Method validation is the process of evaluating whether an analytical method is acceptable for its intended purpose. For pharmaceutical methods, guidelines from the United States Pharmacopeia (USP), International Conference on Harmonisation (ICH), and the United States Food and Drug Administration (USFDA) provide a framework for performing such validations. In general, methods for regulatory compliance must include studies on specificity, linearity, accuracy, precision, range, detection limit, quantitation limit, and robustness. Elements of these guidelines are readily adapted to the issue of validation for beryllium sampling and analysis. This document provides a listing of available sources which can be used to validate analytical methods and/or instrumentation for beryllium determination. A literature review was conducted of available standard methods and publications used for method validation and/or quality control. A comprehensive listing of the articles, papers and books reviewed is given in the Appendix. Available validation documents and guides are listed therein; each has a brief description of application and use. In the referenced sources, there are varying approaches to validation and varying descriptions of the validation process at different stages in method development. This discussion focuses on validation and verification of fully developed methods and instrumentation that have been offered up for use or approval by other laboratories or official consensus bodies such as ASTM International, the International Standards Organization (ISO) and the Association of Official Analytical Chemists (AOAC). This review was conducted as part of a collaborative effort to investigate and improve the state of validation for measuring beryllium in the workplace and the environment. Documents and publications from the United States and Europe are included. Unless otherwise specified, all referenced documents were published in English.
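
    Several of the validation elements listed (linearity, detection limit) reduce to simple calculations on calibration data. A hedged illustration in Python, using hypothetical beryllium calibration numbers rather than values from any cited method; the 3σ/slope detection-limit estimate shown is one common convention:

    ```python
    # Least-squares calibration line and a 3-sigma detection limit, two of the
    # validation elements named in the USP/ICH-style guidelines above.
    # All numbers below are hypothetical illustration values.

    def linear_fit(x, y):
        """Ordinary least squares: returns slope, intercept, r_squared."""
        n = len(x)
        mx, my = sum(x) / n, sum(y) / n
        sxx = sum((xi - mx) ** 2 for xi in x)
        sxy = sum((xi - mx) * (yi - my) for xi, yi in zip(x, y))
        slope = sxy / sxx
        intercept = my - slope * mx
        ss_res = sum((yi - (slope * xi + intercept)) ** 2 for xi, yi in zip(x, y))
        ss_tot = sum((yi - my) ** 2 for yi in y)
        r_squared = 1 - ss_res / ss_tot
        return slope, intercept, r_squared

    # Hypothetical calibration: Be concentration (ug/L) vs. instrument response
    conc = [0.0, 0.5, 1.0, 2.0, 5.0]
    resp = [0.01, 0.52, 1.03, 2.01, 5.05]
    slope, intercept, r2 = linear_fit(conc, resp)

    # Detection limit estimated as 3 * (sd of blank replicates) / slope
    blanks = [0.010, 0.012, 0.009, 0.011]
    mb = sum(blanks) / len(blanks)
    sd_blank = (sum((b - mb) ** 2 for b in blanks) / (len(blanks) - 1)) ** 0.5
    lod = 3 * sd_blank / slope
    ```

    A validation report would pair these figures with the acceptance criteria of the governing guideline (e.g. a minimum correlation for linearity over the claimed range).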

  11. A standardized conjugation protocol to assess antibiotic resistance transfer between lactococcal species

    DEFF Research Database (Denmark)

    Lampkowska, Joanna; Feld, Louise; Monaghan, Aine

    2008-01-01

    Optimal conditions and a standardized method for conjugation between two model lactococcal strains, Lactococcus lactis SH4174 (pAM beta 1-containing, erythromycin-resistant donor) and L. lactis Bu2-60 (plasmid-free, erythromycin-sensitive recipient), were developed and tested in inter-laboratory experiments involving five laboratories from different countries. The ultimate goal of the study was to assess the microbial potential of antibiotic resistance transfer among lactic acid bacteria (LAB). The influence of culture age (various OD values) and ratios of donor and recipient cultures, as well ... This is the first study of this kind, in which a standardized protocol of conjugal mating for testing antibiotic resistance dissemination among LAB was established and validated. (C) 2008 Elsevier B.V. All rights reserved.

  12. New multicast authentication protocol for entrusted members using advanced encryption standard

    Directory of Open Access Journals (Sweden)

    Reham Abdellatif Abouhogail

    2011-12-01

    Today there is a widening use of digital technologies and an increase in new multimedia services such as pay-per-view TV, interactive simulations, and teleconferencing, so there is an increasing demand for multicast communication. There are a number of security issues in multicast communication directly related to the specific nature of multicast. In this paper, we propose a new scheme for authenticating streamed data delivered in real-time over an insecure network, and we concentrate on the multicast authentication problem. Important requirements of multicast communication protocols are: to perform authentication in real-time, to resist packet loss, and to have low communication and computation overheads. The proposed scheme is suitable for real-time applications. It uses the advanced encryption standard algorithm to solve the problem of entrusted members. This scheme uses the idea of a new index number each time a member sends a certain block of packets in the multicast group.
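
    The per-block index-number idea described in this abstract can be illustrated in a few lines. Since the paper's construction uses AES and its details are not given here, the sketch below substitutes stdlib HMAC-SHA256 as the keyed primitive, and the key name and message layout are invented purely for illustration:

    ```python
    # Sketch of per-block authentication with a fresh index number, in the
    # spirit of the scheme described above. The abstract specifies AES; this
    # dependency-free sketch uses HMAC-SHA256 as a stand-in keyed primitive.
    import hashlib
    import hmac

    GROUP_KEY = b"shared-group-key"  # hypothetical pre-shared group key

    def tag_block(index: int, packets: list) -> bytes:
        """Sender: MAC a block of packets together with its index number,
        so replayed or reordered blocks fail verification."""
        mac = hmac.new(GROUP_KEY, index.to_bytes(8, "big"), hashlib.sha256)
        for p in packets:
            mac.update(p)
        return mac.digest()

    def verify_block(index: int, packets: list, tag: bytes,
                     last_seen_index: int) -> bool:
        """Receiver: accept only strictly increasing indices with a valid MAC."""
        if index <= last_seen_index:
            return False  # replay or out-of-order block
        return hmac.compare_digest(tag_block(index, packets), tag)

    block = [b"pkt1", b"pkt2", b"pkt3"]
    t = tag_block(1, block)
    ```

    Binding the monotonically increasing index into the MAC is what lets a receiver reject replays without per-packet state, which is the property the abstract attributes to the index-number idea.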

  13. Optimizing Care With a Standardized Management Protocol for Patients With Infantile Spasms.

    Science.gov (United States)

    Fedak, Erin M; Patel, Anup D; Heyer, Geoffrey L; Wood, Eric G; Mytinger, John R

    2015-09-01

    The primary aim of this quality improvement initiative was to increase the number of patients receiving first-line therapy (adrenocorticotropic hormone, corticosteroids, vigabatrin) as the initial treatment for infantile spasms. We implemented a standardized management protocol for infantile spasms based on the best available data and expert consensus. To assess the impact of this intervention, we compared the 3-month remission rates between prestandardization (January 2009 to August 2012) and poststandardization (September 2012 to May 2014) cohorts. We found that the percentage of patients receiving first-line therapy as the initial treatment was 57% (31/54) in the prestandardization cohort and 100% (35/35) in the poststandardization cohort. The rate of infantile spasms remission was also significantly higher poststandardization than prestandardization (78.8% vs 30.6%). A standardized management protocol thus improved infantile spasms remission 3 months after diagnosis. © The Author(s) 2014.

  14. Cranial palpation pressures used by osteopathy students: effects of standardized protocol training.

    Science.gov (United States)

    Zegarra-Parodi, Rafael; de Chauvigny de Blot, Pierre; Rickards, Luke D; Renard, Edouard-Olivier

    2009-02-01

    Descriptions of subtle palpatory perceptions in osteopathic cranial palpation can be misperceived by students. Thus, adequate dissemination and replication of cranial palpatory techniques is challenging for osteopathy students. To evaluate the effects of standardized protocol training on cranial palpation of the frontomalar suture. Fourth-year osteopathy students from the European Center for Osteopathic Higher Education in Paris, France, were recruited and randomly divided into three groups. Students in the study group received instruction in a standardized protocol for palpatory assessment of the frontomalar suture; students in the control group did not receive instruction; and the remaining students acted as subjects. A specialized force sensor was placed on the skin covering the left frontomalar suture of each subject. Student practitioners were instructed to palpate subjects' left frontomalar suture using the customary pressure described for evaluation and treatment of somatic dysfunction of the cranium. Pressure measurements were exported to a laptop computer. Twelve students were in each group. Student practitioners' palpation pressures ranged from 0.19 to 1.12 N/cm², while mean palpation pressures for each test ranged from 0.27 to 0.98 N/cm². The mean (SD) palpation pressure in the study group and control group was 0.55 N/cm² (0.16 N/cm²) and 0.53 N/cm² (0.15 N/cm²), respectively. There was no statistically significant difference in mean palpation pressures used by the two groups. Substantial variation in test performance was noted in both groups. Palpatory training was ineffective in improving student practitioners' precision of cranial palpation performance. Quantitative feedback of palpation pressures during training may improve outcomes. To our knowledge, data on palpation pressures used during osteopathic cranial manipulation have not been reported previously in the medical literature.

  15. Comparing Short Dental Implants to Standard Dental Implants: Protocol for a Systematic Review.

    Science.gov (United States)

    Rokn, Amir Reza; Keshtkar, Abbasali; Monzavi, Abbas; Hashemi, Kazem; Bitaraf, Tahereh

    2018-01-18

    Short dental implants have been proposed as a simpler, cheaper, and faster alternative for the rehabilitation of atrophic edentulous areas to avoid the disadvantages of surgical techniques for increasing bone volume. This review will compare short implants (4 to 8 mm) to standard implants (longer than 8 mm) in edentulous jaws, evaluating on the basis of marginal bone loss (MBL), survival rate, complications, and prosthesis failure. We will electronically search for randomized controlled trials comparing short dental implants to standard dental implants in the following databases: PubMed, Web of Science, EMBASE, Scopus, the Cochrane Central Register of Controlled Trials, and ClinicalTrials.gov, with English language restrictions. We will manually search the reference lists of relevant reviews and of the articles included in this review. The following journals will also be searched: European Journal of Oral Implantology, Clinical Oral Implants Research, and Clinical Implant Dentistry and Related Research. Two reviewers will independently perform the study selection, data extraction, and quality assessment (using the Cochrane Collaboration tool) of included studies. All meta-analysis procedures, including appropriate effect size combination, sub-group analysis, meta-regression, and assessment of publication or reporting bias, will be performed using Stata (StataCorp, Texas) version 12.1. Short implant effectiveness will be assessed using the mean difference of MBL in terms of weighted mean difference (WMD) and standardized mean difference (SMD) using Cohen's method. The combined effect size measures, in addition to the related 95% confidence intervals, will be estimated by a fixed effect model. The heterogeneity of the related effect sizes will be assessed using a Cochran's Q test and the I² measure. The MBL will be presented as a standardized mean difference with a 95% confidence interval. The survival rate of implants, prosthesis failures, and complications will be reported using a risk
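
    The fixed-effect pooling, Cochran's Q test, and I² statistic named in this protocol reduce to a few lines of arithmetic. A minimal sketch in Python with hypothetical study values (not data from this review):

    ```python
    # Fixed-effect (inverse-variance) combination of mean differences with
    # Cochran's Q and I^2, as outlined in the protocol above.
    import math

    def fixed_effect(effects, variances):
        """Returns (pooled effect, 95% CI, Cochran's Q, I^2)."""
        w = [1.0 / v for v in variances]          # inverse-variance weights
        pooled = sum(wi * e for wi, e in zip(w, effects)) / sum(w)
        se = math.sqrt(1.0 / sum(w))              # standard error of pooled
        ci = (pooled - 1.96 * se, pooled + 1.96 * se)
        q = sum(wi * (e - pooled) ** 2 for wi, e in zip(w, effects))
        df = len(effects) - 1
        i2 = max(0.0, (q - df) / q) if q > 0 else 0.0
        return pooled, ci, q, i2

    # Hypothetical MBL mean differences (mm) and variances from three trials
    pooled, ci, q, i2 = fixed_effect([0.10, 0.25, 0.15], [0.01, 0.02, 0.015])
    ```

    With I² near zero, as in this toy data, the fixed-effect model the protocol specifies is defensible; large I² would instead suggest a random-effects model.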

  16. Molecular detection of Borrelia burgdorferi sensu lato – An analytical comparison of real-time PCR protocols from five different Scandinavian laboratories

    Science.gov (United States)

    Faller, Maximilian; Wilhelmsson, Peter; Kjelland, Vivian; Andreassen, Åshild; Dargis, Rimtas; Quarsten, Hanne; Dessau, Ram; Fingerle, Volker; Margos, Gabriele; Noraas, Sølvi; Ornstein, Katharina; Petersson, Ann-Cathrine; Matussek, Andreas; Lindgren, Per-Eric; Henningsson, Anna J.

    2017-01-01

    Introduction Lyme borreliosis (LB) is the most common tick-transmitted disease in Europe. The diagnosis of LB today is based on the patient's medical history, clinical presentation, and laboratory findings. The laboratory diagnostics are mainly based on antibody detection, but in certain conditions molecular detection by polymerase chain reaction (PCR) may serve as a complement. Aim The purpose of this study was to evaluate the analytical sensitivity, analytical specificity, and concordance of eight different real-time PCR methods at five laboratories in Sweden, Norway and Denmark. Method Each participating laboratory was asked to analyse three different sets of blinded samples (reference panels): (i) cDNA extracted and transcribed from water spiked with cultured Borrelia strains, (ii) cerebrospinal fluid spiked with cultured Borrelia strains, and (iii) DNA dilution series extracted from cultured Borrelia and relapsing fever strains. The results and the method descriptions of each laboratory were systematically evaluated. Results and conclusions The analytical sensitivities and the concordance between the eight protocols were in general high. The concordance was especially high between the protocols using 16S rRNA as the target gene; however, this concordance was mainly related to cDNA as the type of template. When comparing cDNA and DNA as the type of template, the analytical sensitivity was in general higher for the protocols using DNA as template, regardless of the target gene used. The analytical specificity of all eight protocols was high; however, some protocols were not able to detect Borrelia spielmanii, Borrelia lusitaniae, or Borrelia japonica. PMID:28937997

  17. Examination of fast reactor fuels, FBR analytical quality assurance standards and methods, and analytical methods development: irradiation tests. Progress report, April 1--June 30, 1976, and FY 1976

    International Nuclear Information System (INIS)

    Baker, R.D.

    1976-08-01

    Characterization of unirradiated and irradiated LMFBR fuels by analytical chemistry methods will continue, and additional methods will be modified and mechanized for hot cell application. Macro- and microexaminations will be made on fuel and cladding using the shielded electron microprobe, emission spectrograph, radiochemistry, gamma scanner, mass spectrometers, and other analytical facilities. New capabilities will be developed in gamma scanning, analyses to assess spatial distributions of fuel and fission products, mass spectrometric measurements of burnup and fission gas constituents, and other chemical analyses. Microstructural analyses of unirradiated and irradiated materials will continue using optical and electron microscopy and autoradiographic and x-ray techniques. Analytical quality assurance standards tasks are designed to assure the quality of the chemical characterizations necessary to evaluate reactor components relative to specifications. Tasks include: (1) the preparation and distribution of calibration materials and quality control samples for use in quality assurance surveillance programs, (2) the development of and the guidance in the use of quality assurance programs for sampling and analysis, (3) the development of improved methods of analysis, and (4) the preparation of continuously updated analytical method manuals. Reliable analytical methods development for the measurement of burnup, oxygen-to-metal (O/M) ratio, and various gases in irradiated fuels is described.

  18. [Process control in acute pain management. An analysis of the degree of organization of applied standard protocols].

    Science.gov (United States)

    Erlenwein, J; Emons, M I; Hecke, A; Nestler, N; Przemeck, M; Bauer, M; Meißner, W; Petzke, F

    2014-10-01

    The aim of this study was to analyze the degree of organization of different standard protocols for acute pain management, as well as the derivation and definition of typical but structurally different models. A total of 85 hospitals provided their written standardized protocols for analysis. Protocols for defined target processes from 76 hospitals and another protocol used by more than one hospital were included in the analysis. The suggested courses of action were theoretically simulated to identify and characterize process types in a multistage evaluation process. The analysis included 148 standards. Four differentiated process types were defined ("standardized order", "analgesic ladder", "algorithm", "therapy path"), each with an increasing level of organization. These four types had the following distribution: 27 % (n = 40) "standardized order", 47 % (n = 70) "analgesic ladder", 22 % (n = 33) "algorithm", 4 % (n = 5) "therapy path". Models with a higher degree of organization included more control elements, such as action and intervention triggers or safety and supervisory elements, and were also associated with formally better access to medication. For models with a lower degree of organization, immediate courses of action were more dependent on individual decisions. Although not quantifiable, this was particularly evident when simulating downstream courses of action. Interfaces between areas of hospital activity and cross-departmental validity were considered in only a fraction of the protocols. Concepts from clinics with a certificate in (acute) pain management were more strongly process-oriented. For children, there were proportionately more simple concepts with a lower degree of organization and fewer controlling elements. This is the first analysis of a large sample of standardized protocols for acute pain management focusing on the degree of organization and its possible influence on courses of action.

  19. Methodological Study to Develop Standard Operational Protocol on Oral Drug Administration for Children.

    Science.gov (United States)

    Bijarania, Sunil Kumar; Saini, Sushma Kumari; Verma, Sanjay; Kaur, Sukhwinder

    2017-05-01

    To develop a standard operational protocol (SOP) on oral drug administration and a checklist to assess the implementation of the developed SOP. In this prospective methodological study, SOPs were developed in five phases. In the first phase, the preliminary draft of the SOPs and checklists was prepared based on a literature review, assessment of current practices, and a focus group discussion (FGD) with bedside nurses. In the second phase, content validity was checked using the Delphi technique (12 experts). In total, four drafts were prepared in stages, and necessary modifications were made as per suggestions after each Delphi round. The fourth Delphi round was performed after conducting a pilot study. In the fourth phase, all bedside nurses were trained as per the SOPs and asked to practice accordingly, and thirty oral drug administrations in children were observed to check the reliability of the checklists for implementation of the SOPs. In Phase V, 7 FGDs were conducted with bedside nurses to assess the effectiveness of the SOPs. The Content Validity Index (CVI) of the SOP and checklists was 99.77%. The overall standardized Cronbach's alpha was calculated as 0.94. All the nurses felt that the SOP was useful. A valid and feasible SOP for drug administration to children through the oral route, along with a valid and reliable checklist, was developed. It is recommended to use this document for drug administration to children.
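
    The reliability figure reported above (standardized Cronbach's alpha of 0.94) comes from a standard formula, α = k/(k−1) × (1 − Σ item variances / total-score variance). A minimal sketch with hypothetical checklist scores, not the study's data:

    ```python
    # Cronbach's alpha for a multi-item checklist, the internal-consistency
    # statistic reported in the study above. Scores below are hypothetical.
    def cronbach_alpha(items):
        """items: one inner list of scores per checklist item,
        all rated over the same set of observations."""
        k = len(items)                 # number of items
        n = len(items[0])              # number of observations

        def var(xs):                   # sample variance
            m = sum(xs) / len(xs)
            return sum((x - m) ** 2 for x in xs) / (len(xs) - 1)

        item_var_sum = sum(var(it) for it in items)
        totals = [sum(it[j] for it in items) for j in range(n)]
        return k / (k - 1) * (1 - item_var_sum / var(totals))

    # Hypothetical ratings: 3 checklist items scored over 5 observations
    alpha = cronbach_alpha([[4, 5, 3, 4, 5],
                            [4, 4, 3, 5, 5],
                            [5, 5, 3, 4, 4]])
    ```

    Values approaching 1 indicate that the items vary together across observations; a threshold such as 0.7 is a common (though debated) acceptability cutoff.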

  20. Using standard treatment protocols to manage costs and quality of hospital services.

    Science.gov (United States)

    Meyer, J W; Feingold, M G

    1993-06-01

    The current health care environment has made it critically important that hospital costs and quality be managed in an integrated fashion. Promised health care reforms are expected to make cost reduction and quality enhancement only more important. Traditional methods of hospital cost and quality control have largely been replaced by such approaches as practice parameters, outcomes measurement, clinical indicators, clinical paths, benchmarking, patient-centered care, and a focus on patient selection criteria. This Special Report describes an integrated process for strategically managing costs and quality simultaneously, incorporating key elements of many important new quality and cost control tools. By using a multidisciplinary group process to develop standard treatment protocols, hospitals and their medical staffs address the most important services provided within major product lines. Using both clinical and financial data, groups of physicians, nurses, department managers, financial analysts, and administrators redesign key patterns of care within their hospital, incorporating the best practices of their own and other institutions. The outcome of this process is a new, standardized set of clinical guidelines that reduce unnecessary variation in care, eliminate redundant interventions, establish clear lines of communication for all caregivers, and reduce the cost of each stay. The hospital, medical staff, and patients benefit from the improved opportunities for managed care contracting, more efficient hospital systems, consensus-based quality measures, and reductions in the cost of care. STPs offer a workable and worthwhile approach to positioning the hospital of the 1990s for operational efficiency and cost and quality competitiveness.

  1. Protocol of the COSMIN study: COnsensus-based Standards for the selection of health Measurement INstruments

    Directory of Open Access Journals (Sweden)

    Patrick DL

    2006-01-01

    Background Choosing an adequate measurement instrument depends on the proposed use of the instrument, the concept to be measured, the measurement properties (e.g. internal consistency, reproducibility, content and construct validity, responsiveness, and interpretability), the requirements, the burden for subjects, and the costs of the available instruments. As far as measurement properties are concerned, there are no sufficiently specific standards for the evaluation of measurement properties of instruments to measure health status, and no explicit criteria for what constitutes good measurement properties. In this paper we describe the protocol for the COSMIN study, the objective of which is to develop a checklist that contains COnsensus-based Standards for the selection of health Measurement INstruments, including explicit criteria for satisfying these standards. We will focus on evaluative health-related patient-reported outcomes (HR-PROs), i.e. patient-reported health measurement instruments used in a longitudinal design as an outcome measure, excluding health-care-related PROs such as satisfaction with care or adherence. The COSMIN standards will be made available in the form of an easily applicable checklist. Method An international Delphi study will be performed to reach consensus on which measurement properties should be assessed and how, and on criteria for good measurement properties. Two sources of input will be used for the Delphi study: (1) a systematic review of properties, standards, and criteria of measurement properties found in systematic reviews of measurement instruments, and (2) an additional literature search of methodological articles presenting a comprehensive checklist of standards and criteria. The Delphi study will consist of four (written) Delphi rounds, with approximately 30 expert panel members with different backgrounds in clinical medicine, biostatistics, psychology, and epidemiology.

  2. Using GLOBE Plant Phenology Protocols To Meet the "National Science Education Standards."

    Science.gov (United States)

    Bombaugh, Ruth; Sparrow, Elena; Mal, Tarun

    2003-01-01

    Describes how high school biology teachers can use the Global Learning and Observations to Benefit the Environment (GLOBE) program protocols and data in their classrooms. Includes background information on plant phenology, an overview of GLOBE phenology protocols and materials, and implications for protocols with both deciduous trees and grasses…

  3. Is a New Protocol for Acute Lymphoblastic Leukemia Research or Standard Therapy?

    NARCIS (Netherlands)

    Dekking, SAS; van der Graaf, R; de Vries, Martine; Bierings, MB; van Delden, JJM; Kodish, Eric; Lantos, John

    2015-01-01

    In the United States, doctors generally develop new cancer chemotherapy for children by testing innovative chemotherapy protocols against existing protocols in prospective randomized trials. In the Netherlands, children with leukemia are treated by protocols that are agreed upon by the Dutch

  4. Assessment of image quality of a standard and two dose-reducing protocols in paediatric pelvic CT

    Energy Technology Data Exchange (ETDEWEB)

    Ratcliffe, John; Frawley, Kieran; Coakley, Kerry; Cloake, John [Department of Radiology, The Royal Children's Hospital, Brisbane, Queensland (Australia); Swanson, Cheryl E. [Department of Surgery, University of Queensland (Australia); Hafiz, Niru [Department of Child Health, The Royal Children's Hospital, Brisbane (Australia)

    2003-03-01

    Concerns exist regarding the effect of radiation dose from paediatric pelvic CT scans and the potential later risk of radiation-induced neoplasm and teratogenic outcomes in these patients. To assess the diagnostic quality of CT images of the paediatric pelvis using either reduced mAs or increased pitch compared with standard settings. A prospective study of pelvic CT scans of 105 paediatric patients was performed using one of three protocols: (1) 31 at a standard protocol of 200 mA with rotation time of 0.75 s at 120 kVp and a pitch factor approximating 1.4; (2) 31 at increased pitch factor approaching 2 and 200 mA; and (3) 43 at a reduced setting of 100 mA and a pitch factor of 1.4. All other settings remained the same in all three groups. Image quality was assessed by radiologists blinded to the protocol used in each scan. No significant difference was found between the quality of images acquired at standard settings and those acquired at half the standard mAs. The use of increased pitch factor resulted in a higher proportion of poor images. Images acquired at 120 kVp using 75 mAs are equivalent in diagnostic quality to those acquired at 150 mAs. Reduced settings can provide useful imaging of the paediatric pelvis and should be considered as a standard protocol in these situations. (orig.)
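
    The dose relationship among the three protocols can be made explicit: for a helical scan, dose scales approximately with effective mAs, i.e. (mA × rotation time) / pitch. A small sketch using the settings quoted above, taking the "pitch approaching 2" of protocol 2 as exactly 2.0 (an assumption):

    ```python
    # Relative dose comparison of the three paediatric pelvic CT protocols
    # described above. CT dose scales approximately with effective mAs =
    # (mA * rotation time) / pitch; protocol 2's pitch "approaching 2" is
    # taken here as exactly 2.0 for illustration.
    def effective_mas(ma, rotation_s, pitch):
        return ma * rotation_s / pitch

    standard = effective_mas(200, 0.75, 1.4)    # protocol 1: 200 mA, pitch 1.4
    high_pitch = effective_mas(200, 0.75, 2.0)  # protocol 2: 200 mA, pitch 2
    low_ma = effective_mas(100, 0.75, 1.4)      # protocol 3: 100 mA, pitch 1.4

    # Dose relative to the standard protocol
    rel_high_pitch = high_pitch / standard  # pitch 1.4 -> 2.0 cuts dose ~30%
    rel_low_ma = low_ma / standard          # halving mA halves dose
    ```

    This makes the study's finding concrete: the half-mA protocol delivers roughly half the dose of the standard protocol while, per the blinded review, preserving diagnostic quality.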

  5. Investigation of preparation techniques for δ2H analysis of keratin materials and a proposed analytical protocol

    Science.gov (United States)

    Qi, H.; Coplen, T.B.

    2011-01-01

    Accurate hydrogen isotopic measurements of keratin materials have been a challenge due to exchangeable hydrogen in the sample matrix and the paucity of appropriate isotopic reference materials for calibration. We found that the most reproducible δ2HVSMOW-SLAP and mole fraction of exchangeable hydrogen, x(H)ex, of keratin materials were measured with equilibration at ambient temperature using two desiccators and two different equilibration waters with two sets of the keratin materials for 6 days. Following equilibration, drying the keratin materials in a vacuum oven for 4 days at 60 °C was most critical. The δ2H analysis protocol also includes interspersing isotopic reference waters in silver tubes among samples in the carousel of a thermal conversion elemental analyzer (TC/EA) reduction unit. Using this analytical protocol, δ2HVSMOW-SLAP values of the non-exchangeable fractions of USGS42 and USGS43 human-hair isotopic reference materials were determined to be –78.5 ± 2.3 ‰ and –50.3 ± 2.8 ‰, respectively. The measured x(H)ex values of keratin materials analyzed with steam equilibration and N2 drying were substantially higher than those previously published, and dry N2 purging was unable to remove absorbed moisture completely, even with overnight purging. The δ2H values of keratin materials measured with steam equilibration were about 10 ‰ lower than values determined with equilibration in desiccators at ambient temperatures when on-line evacuation was used to dry samples. With steam equilibrations the x(H)ex of commercial keratin powder was as high as 28 %. Using human-hair isotopic reference materials to calibrate other keratin materials, such as hoof or horn, can introduce bias in δ2H measurements because the amount of absorbed water and the x(H)ex values may differ from those of unknown samples. Correct δ2HVSMOW-SLAP values of the non-exchangeable fractions of unknown human-hair samples can be determined with atmospheric moisture
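
    The role of the two equilibration waters can be illustrated with the underlying two-point calculation: measuring the same keratin against two isotopically distinct waters gives two equations in the two unknowns, x(H)ex and the δ2H of the non-exchangeable fraction. The sketch below simplifies by assuming the exchangeable hydrogen takes on each water's δ2H exactly (a fractionation factor of 1), which the published protocol does not assume; all numbers are hypothetical:

    ```python
    # Back-calculating the exchangeable-hydrogen mole fraction x(H)ex and the
    # delta2H of the non-exchangeable fraction from two equilibrations with
    # isotopically distinct waters. Simplified sketch: exchangeable hydrogen
    # is assumed to equal each water's delta2H (fractionation factor of 1).
    # All delta values are in per mil.
    def solve_exchange(d_meas1, d_water1, d_meas2, d_water2):
        # d_meas_i = x_ex * d_water_i + (1 - x_ex) * d_nonexch, i = 1, 2
        x_ex = (d_meas1 - d_meas2) / (d_water1 - d_water2)
        d_nonexch = (d_meas1 - x_ex * d_water1) / (1 - x_ex)
        return x_ex, d_nonexch

    # Hypothetical paired measurements of one keratin sample
    x_ex, d_n = solve_exchange(-51.725, 100.0, -96.725, -200.0)
    ```

    The same algebra shows why incomplete drying biases results: any residual adsorbed water shifts both measured values, inflating the apparent x(H)ex just as the abstract reports for steam equilibration with N2 drying.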

  6. Multicriteria evaluation of power plants impact on the living standard using the analytic hierarchy process

    International Nuclear Information System (INIS)

    Chatzimouratidis, Athanasios I.; Pilavachi, Petros A.

    2008-01-01

    The purpose of this study is to evaluate 10 types of power plants available at present, including fossil fuel, nuclear, and renewable-energy-based power plants, with regard to their overall impact on the living standard of local communities. Both positive and negative impacts of power plant operation are considered using the analytic hierarchy process (AHP). The current study covers the set of criteria weights considered typical for many local communities in many developed countries. The results presented here are illustrative only, and user-defined weighting is required to make this study valuable for a specific group of users. A sensitivity analysis examines the most important weight variations, thus giving an overall view of the problem evaluation to every decision maker. Regardless of criteria weight variations, the five types of renewable energy power plants rank in the first five positions. Nuclear plants are in sixth position when priority is given to quality of life, and last when socioeconomic aspects are considered more important. Natural gas, oil, and coal/lignite power plants rank between sixth and tenth position, with slightly better rankings when priority is given to socioeconomic aspects.
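
    The core AHP computation behind such a ranking can be sketched briefly: criterion weights are the principal eigenvector of a reciprocal pairwise-comparison matrix, obtainable by power iteration. A minimal Python sketch with a hypothetical 3×3 judgment matrix (not the study's actual criteria or weights):

    ```python
    # Deriving criterion weights from a pairwise-comparison matrix by power
    # iteration: the core computation of the analytic hierarchy process (AHP)
    # used in the study above. The 3x3 matrix below is a hypothetical example
    # (e.g. quality of life vs. socioeconomic vs. environmental criteria).
    def ahp_weights(matrix, iters=100):
        n = len(matrix)
        w = [1.0 / n] * n
        for _ in range(iters):
            # multiply matrix by current weight vector, then renormalize
            w_new = [sum(matrix[i][j] * w[j] for j in range(n)) for i in range(n)]
            s = sum(w_new)
            w = [x / s for x in w_new]
        return w

    # Reciprocal pairwise judgments on Saaty's 1-9 scale (hypothetical):
    # matrix[i][j] = how strongly criterion i is preferred over criterion j
    pairwise = [[1.0, 3.0, 5.0],
                [1 / 3, 1.0, 2.0],
                [1 / 5, 1 / 2, 1.0]]
    weights = ahp_weights(pairwise)
    ```

    The study's sensitivity analysis then amounts to perturbing these weights and re-scoring the ten plant types, which is how it can conclude the renewable plants stay in the top five under all examined weightings.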

  7. Standardizing care for high-risk patients in spine surgery: the Northwestern high-risk spine protocol.

    Science.gov (United States)

    Halpin, Ryan J; Sugrue, Patrick A; Gould, Robert W; Kallas, Peter G; Schafer, Michael F; Ondra, Stephen L; Koski, Tyler R

    2010-12-01

    Review article of current literature on the preoperative evaluation and postoperative management of patients undergoing high-risk spine operations and a presentation of a multidisciplinary protocol for patients undergoing high-risk spine operation. To provide an evidence-based outline of modifiable risk factors and give an example of a multidisciplinary protocol with the goal of improving outcomes. Protocol-based care has been shown to improve outcomes in many areas of medicine. A protocol to evaluate patients undergoing high-risk procedures may ultimately improve patient outcomes. The English language literature to date was reviewed on modifiable risk factors for spine surgery. A multidisciplinary team including hospitalists, critical care physicians, anesthesiologists, and spine surgeons from neurosurgery and orthopedics established an institutional protocol to provide comprehensive care in the pre-, peri-, and postoperative periods for patients undergoing high-risk spine operations. An example of a comprehensive pre-, peri-, and postoperative high-risk spine protocol is provided, with focus on the preoperative assessment of patients undergoing high-risk spine operations and modifiable risk factors. Standardizing preoperative risk assessment may lead to better outcomes after major spine operations. A high-risk spine protocol may help patients by having dedicated physicians in multiple specialties focusing on all aspects of a patient's care in the pre-, intra-, and postoperative phases.

  8. An Advanced Encryption Standard Powered Mutual Authentication Protocol Based on Elliptic Curve Cryptography for RFID, Proven on WISP

    Directory of Open Access Journals (Sweden)

    Alaauldin Ibrahim

    2017-01-01

    Information in patients' medical histories is subject to various security and privacy concerns. Meanwhile, any modification or error in a patient's medical data may cause serious or even fatal harm. To protect and transfer this valuable and sensitive information in a secure manner, radio-frequency identification (RFID) technology has been widely adopted in healthcare systems and is being deployed in many hospitals. In this paper, we propose a mutual authentication protocol for RFID tags based on elliptic curve cryptography and the advanced encryption standard. Unlike existing authentication protocols, which only send the tag ID securely, the proposed protocol can also send the valuable data stored in the tag in an encrypted form. The proposed protocol is not simply a theoretical construct; it has been coded and tested on an experimental RFID tag. The proposed scheme achieves mutual authentication in just two steps and satisfies all the essential security requirements of RFID-based healthcare systems.
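
    The two-step mutual authentication flow described above can be sketched structurally. The real protocol relies on elliptic curve cryptography and AES on a WISP tag; the stdlib sketch below substitutes HMAC-SHA256 and a throwaway XOR keystream solely to show the message flow, with all key and ID names hypothetical (this is not secure and not the paper's construction):

    ```python
    # Structural sketch of a two-step mutual authentication exchange like the
    # one described above. HMAC-SHA256 and XOR stand in for the protocol's
    # ECC/AES primitives purely to illustrate the message flow. Demo only.
    import hashlib
    import hmac
    import os

    KEY = b"tag-and-reader-shared-secret"  # hypothetical shared key

    def prf(*parts: bytes) -> bytes:
        return hmac.new(KEY, b"".join(parts), hashlib.sha256).digest()

    # Step 1: reader sends a nonce; tag answers with a proof plus its
    # stored data, encrypted rather than sent in the clear.
    def tag_respond(reader_nonce: bytes, tag_id: bytes, data: bytes):
        tag_nonce = os.urandom(16)
        proof = prf(b"tag", reader_nonce, tag_nonce, tag_id)
        keystream = prf(b"enc", reader_nonce, tag_nonce)
        cipher = bytes(a ^ b for a, b in zip(data, keystream))  # demo only
        return tag_nonce, proof, cipher

    # Step 2: reader verifies the tag and proves itself back.
    def reader_finish(reader_nonce, tag_nonce, tag_id, proof):
        if not hmac.compare_digest(proof, prf(b"tag", reader_nonce, tag_nonce, tag_id)):
            return None                  # tag rejected
        return prf(b"reader", tag_nonce)  # tag verifies this to finish

    rn = os.urandom(16)
    tn, pr, ct = tag_respond(rn, b"TAG-01", b"patient-record-snippet")
    ack = reader_finish(rn, tn, b"TAG-01", pr)
    ```

    Two messages suffice because each side's fresh nonce is bound into the other side's keyed proof, which mirrors the two-step property claimed in the abstract.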

  9. Comparing Short Dental Implants to Standard Dental Implants: Protocol for a Systematic Review

    Science.gov (United States)

    Rokn, Amir Reza; Keshtkar, Abbasali; Monzavi, Abbas; Hashemi, Kazem

    2018-01-01

    Background Short dental implants have been proposed as a simpler, cheaper, and faster alternative for the rehabilitation of atrophic edentulous areas, avoiding the disadvantages of surgical techniques for increasing bone volume. Objective This review will compare short implants (4 to 8 mm) to standard implants (larger than 8 mm) in edentulous jaws, evaluating them on the basis of marginal bone loss (MBL), survival rate, complications, and prosthesis failure. Methods We will electronically search for randomized controlled trials comparing short dental implants to standard dental implants in the following databases: PubMed, Web of Science, EMBASE, Scopus, the Cochrane Central Register of Controlled Trials, and ClinicalTrials.gov, with English-language restrictions. We will manually search the reference lists of relevant reviews and of the articles included in this review. The following journals will also be searched: European Journal of Oral Implantology, Clinical Oral Implants Research, and Clinical Implant Dentistry and Related Research. Two reviewers will independently perform the study selection, data extraction, and quality assessment (using the Cochrane Collaboration tool) of included studies. All meta-analysis procedures, including appropriate effect-size combination, subgroup analysis, meta-regression, and assessment of publication or reporting bias, will be performed using Stata (StataCorp, Texas) version 12.1. Results Short implant effectiveness will be assessed using the mean difference of MBL in terms of weighted mean difference (WMD) and standardized mean difference (SMD) using Cohen’s method. The combined effect-size measures, in addition to the related 95% confidence intervals, will be estimated by a fixed-effect model. The heterogeneity of the related effect sizes will be assessed using Cochran's Q test and the I2 measure. The MBL will be presented as a standardized mean difference with a 95% confidence interval. The survival rate of implants, prostheses failures, and
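The planned analysis (inverse-variance pooling under a fixed-effect model, with Cochran's Q and I² for heterogeneity) can be sketched as follows; the per-study MBL mean differences and variances are hypothetical placeholders, not data from the review.

```python
import math

def fixed_effect_meta(effects, variances):
    """Inverse-variance fixed-effect pooling with Cochran's Q and I^2 (%)."""
    w = [1.0 / v for v in variances]
    pooled = sum(wi * yi for wi, yi in zip(w, effects)) / sum(w)
    se = math.sqrt(1.0 / sum(w))
    ci = (pooled - 1.96 * se, pooled + 1.96 * se)  # 95% CI
    q = sum(wi * (yi - pooled) ** 2 for wi, yi in zip(w, effects))
    df = len(effects) - 1
    i2 = max(0.0, (q - df) / q) * 100 if q > 0 else 0.0
    return pooled, ci, q, i2

# Hypothetical per-study MBL mean differences (mm) and their variances
pooled, ci, q, i2 = fixed_effect_meta([0.12, 0.05, 0.20], [0.01, 0.02, 0.015])
```

The same function would accept WMD or SMD inputs; only the effect-size computation upstream differs.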

  10. Wireless networking for the dental office: current wireless standards and security protocols.

    Science.gov (United States)

    Mupparapu, Muralidhar; Arora, Sarika

    2004-11-15

    Digital radiography has gained immense popularity in dentistry today despite the profession's early difficulty in embracing the technology. The transition from film to digital has been happening at a faster pace in the fields of Orthodontics, Oral Surgery, Endodontics, Periodontics, and other specialties where the radiographic images (periapical, bitewing, panoramic, cephalometric, and skull radiographs) are being acquired digitally, stored within a server locally, and eventually accessed for diagnostic purposes, along with the rest of the patient data, via the patient management software (PMS). A review of the literature shows the diagnostic performance of digital radiography is at least comparable to, or even better than, that of conventional radiography. Similarly, other digital diagnostic tools like caries detectors, cephalometric analysis software, and digital scanners have been used for many years for diagnosis and treatment planning purposes. The introduction of wireless charge-coupled device (CCD) sensors in early 2004 (Schick Technologies, Long Island City, NY) has moved digital radiography a step further into the wireless era. As with any emerging technology, there are concerns that should be looked into before adopting a wireless environment. Foremost is the network security involved in the installation and usage of these wireless networks. This article deals with the existing standards and choices in wireless technologies that are available for implementation within a contemporary dental office. The network security protocols that protect patient data and boost the efficiency of modern-day dental clinics are enumerated.

  11. Implementing Istanbul Protocol standards for forensic evidence of torture in Kyrgyzstan.

    Science.gov (United States)

    Moreno, Alejandro; Crosby, Sondra; Xenakis, Stephen; Iacopino, Vincent

    2015-02-01

    The Kyrgyz government declared a policy of "zero tolerance" for torture and began reforms to stop such practice, a regular occurrence in the country's daily life. This study presents the results of 10 forensic evaluations of individuals alleging torture; they represent 35% of all criminal investigations into torture for the January 2011-July 2012 period. All individuals evaluated were male with an average age of 34 years. Police officers were implicated as perpetrators in all cases. All individuals reported being subjected to threats and blunt force trauma from punches, kicks, and blows with objects such as police batons. The most common conditions documented during the evaluations were traumatic brain injury and chronic seizures. Psychological sequelae included post-traumatic stress disorder and major depressive disorder, which was diagnosed in seven individuals. In all cases, the physical and psychological evidence was highly consistent with individual allegations of abuse. These forensic evaluations, which represent the first ever to be conducted in Kyrgyzstan in accordance with Istanbul Protocol standards, provide critical insight into torture practices in the country. The evaluations indicate a pattern of brutal torture practices and inadequate governmental and nongovernmental forensic evaluations. Copyright © 2014 Elsevier Ltd and Faculty of Forensic and Legal Medicine. All rights reserved.

  12. Standard errors and confidence intervals for correlations corrected for indirect range restriction: A simulation study comparing analytic and bootstrap methods.

    Science.gov (United States)

    Kennet-Cohen, Tamar; Kleper, Dvir; Turvall, Elliot

    2018-02-01

    A frequent topic of psychological research is the estimation of the correlation between two variables from a sample that underwent a selection process based on a third variable. Due to indirect range restriction, the sample correlation is a biased estimator of the population correlation, and a correction formula is used. In the past, bootstrap standard error and confidence intervals for the corrected correlations were examined with normal data. The present study proposes a large-sample estimate (an analytic method) for the standard error, and a corresponding confidence interval for the corrected correlation. Monte Carlo simulation studies involving both normal and non-normal data were conducted to examine the empirical performance of the bootstrap and analytic methods. Results indicated that with both normal and non-normal data, the bootstrap standard error and confidence interval were generally accurate across simulation conditions (restricted sample size, selection ratio, and population correlations) and outperformed estimates of the analytic method. However, with certain combinations of distribution type and model conditions, the analytic method has an advantage, offering reasonable estimates of the standard error and confidence interval without resorting to the bootstrap procedure's computer-intensive approach. We provide SAS code for the simulation studies. © 2017 The British Psychological Society.
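The bootstrap arm of such a comparison can be sketched with the standard percentile method. This is generic resampling machinery, not the authors' SAS code; the indirect-range-restriction correction step itself is omitted, and the example data below are synthetic.

```python
import random
import statistics

def pearson(x, y):
    """Pearson correlation coefficient, computed directly."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    sxy = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sxx = sum((a - mx) ** 2 for a in x)
    syy = sum((b - my) ** 2 for b in y)
    return sxy / (sxx * syy) ** 0.5

def bootstrap_corr(x, y, n_boot=2000, seed=0):
    """Bootstrap standard error and percentile 95% CI for Pearson's r.
    In the study's setting, the range-restriction correction would be
    applied to each resampled correlation before summarizing."""
    rng = random.Random(seed)
    n = len(x)
    rs = []
    for _ in range(n_boot):
        idx = [rng.randrange(n) for _ in range(n)]
        rs.append(pearson([x[i] for i in idx], [y[i] for i in idx]))
    rs.sort()
    se = statistics.stdev(rs)
    ci = (rs[int(0.025 * n_boot)], rs[int(0.975 * n_boot)])
    return se, ci
```

The analytic method the paper proposes replaces this loop with a closed-form large-sample standard error, trading the computer-intensive resampling for a formula.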

  13. BioWes-from design of experiment, through protocol to repository, control, standardization and back-tracking.

    Science.gov (United States)

    Cisar, Petr; Soloviov, Dmytro; Barta, Antonin; Urban, Jan; Stys, Dalibor

    2016-07-15

    One of the main challenges in modern science is the amount of data produced by experimental work; it is difficult to store, organize, and share the scientific data and to extract the wealth of knowledge. Experimental method descriptions in scientific publications are often incomplete, which complicates experimental reproducibility. The proposed system was created in order to address these issues. It provides a solution for the management of experimental data and metadata to support reproducibility. The system is implemented as a repository for experiment descriptions and experimental data. It has three main entry points: a desktop application for protocol design and data processing, a web interface dedicated to protocol and data management, and a web-based interface for mobile devices suitable for field experiments. The functionality of the desktop client can be extended using custom plug-ins for data extraction and data processing. The system provides several methods to support experimental reproducibility: standardized terminology support, data and metadata at a single location, standardized protocol design, and protocol evolution. The system was tested in the framework of the international infrastructure project AQUAEXCEL, with five pilot installations at different institutes. General testing in a certified tissue-culture laboratory, the Institute of Complex Systems, and IFREMER verified its usability under different research infrastructures. Specific testing focused on the data-processing modules and plug-ins demonstrated the modularity of the system under specific conditions. The BioWes system represents experimental data as a black box and therefore can handle any data type, so as to provide broad usability for a variety of experiments and provide the data management infrastructure to improve reproducibility and data sharing. The proposed system provides the tools for standard data management operations and extends the support by the standardization

  14. Dual-energy CT workflow: multi-institutional consensus on standardization of abdominopelvic MDCT protocols.

    Science.gov (United States)

    Patel, Bhavik N; Alexander, Lauren; Allen, Brian; Berland, Lincoln; Borhani, Amir; Mileto, Achille; Moreno, Courtney; Morgan, Desiree; Sahani, Dushyant; Shuman, William; Tamm, Eric; Tublin, Mitchell; Yeh, Benjamin; Marin, Daniele

    2017-03-01

    To standardize workflow for dual-energy computed tomography (DECT) involving common abdominopelvic exam protocols. Nine institutions (4 rsDECT, 1 dsDECT, 4 both), represented by 32 participants [average years (range) in practice and in DECT experience, 12.3 (1-35) and 4.6 (1-14), respectively], each completed a single survey (n = 9). A five-point agreement scale (0 = contraindicated, 1 = not indicated, 2 = mildly indicated, 3 = moderately indicated, 4 = strongly indicated) and a utilization scale (0 = not performing and shouldn't; 1 = performing but not clinically useful; 2 = performing but not sure if clinically useful; 3 = not performing but would like to; 4 = performing and clinically useful) were used. Consensus was defined as a score of ≥2.5. Survey results were discussed over three separate live webinar sessions. 5/9 (56%) institutions exclude large patients from DECT: 2 (40%) use weight, 2 (40%) use transverse dimension, and 1 (20%) uses both. 7/9 (78%) use 50 keV for low and 70 keV for medium monochromatic reconstructed images. DECT is indicated for dual liver [agreement score (AS) 3.78; utilization score (US) 3.22] and dual pancreas exams in the arterial phase (AS 3.78; US 3.11), mesenteric ischemia/gastrointestinal bleeding in both the arterial and venous phases (AS 2.89; US 2.79), RCC exams in the arterial phase (AS 3.33; US 2.78), and CT urography in the nephrographic phase (AS 3.11; US 2.89). DECT for renal stone and certain single-phase exams is indicated (AS 3.00). DECT is indicated during the arterial phase for multiphasic abdominal exams, the nephrographic phase for CTU, and for certain single-phase and renal stone exams.
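The consensus rule (mean panel score on the 0-4 scale, threshold ≥ 2.5) reduces to a one-line computation; the panel ratings below are hypothetical, not the survey's raw responses.

```python
def consensus(scores, threshold=2.5):
    """Mean panel score on the 0-4 scale; consensus if the mean meets the threshold."""
    mean = sum(scores) / len(scores)
    return round(mean, 2), mean >= threshold

# Hypothetical nine-institution ratings for one exam/phase combination
score, agreed = consensus([4, 4, 3, 4, 4, 3, 4, 4, 4])
```

A score of (3.78, True) would correspond to a strongly indicated, consensus-level result in the survey's terms.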

  15. Network Coding to Enhance Standard Routing Protocols in Wireless Mesh Networks

    DEFF Research Database (Denmark)

    Pahlevani, Peyman; Roetter, Daniel Enrique Lucani; Fitzek, Frank

    2013-01-01

    This paper introduces the design and simulation of a locally optimized network coding protocol, called PlayNCool, for wireless mesh networks. PlayNCool is easy to implement and compatible with existing routing protocols and devices. This allows the system to gain from network coding capabilities ... linear network coding to increase the usefulness of each transmission from the helpers. This paper focuses on the design details needed to make the system operate in reality and on evaluating performance using ns-3 in multi-hop topologies. Our results show that the PlayNCool protocol increases the end...
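The random-linear-coding idea that makes helper transmissions useful can be illustrated over GF(2), where coding is plain XOR; real RLNC implementations (including PlayNCool's) typically operate over larger Galois fields, so this is a deliberately simplified sketch.

```python
import random

def gf2_encode(packets, rng):
    """Produce one coded packet: XOR of a random nonempty subset (RLNC over GF(2))."""
    n = len(packets)
    coeffs = [rng.randrange(2) for _ in range(n)]
    if not any(coeffs):
        coeffs[rng.randrange(n)] = 1  # avoid the useless all-zero combination
    payload = bytes(len(packets[0]))
    for c, p in zip(coeffs, packets):
        if c:
            payload = bytes(a ^ b for a, b in zip(payload, p))
    return coeffs, payload

def gf2_decode(coded, n):
    """Recover the n originals by Gauss-Jordan elimination over GF(2)."""
    rows = [(list(c), bytearray(p)) for c, p in coded]
    for col in range(n):
        pivot = next((r for r in range(col, len(rows)) if rows[r][0][col]), None)
        if pivot is None:
            raise ValueError("not enough independent coded packets")
        rows[col], rows[pivot] = rows[pivot], rows[col]
        for r in range(len(rows)):
            if r != col and rows[r][0][col]:
                rows[r][0][:] = [a ^ b for a, b in zip(rows[r][0], rows[col][0])]
                rows[r][1][:] = bytes(a ^ b for a, b in zip(rows[r][1], rows[col][1]))
    return [bytes(rows[i][1]) for i in range(n)]
```

Because any sufficiently independent set of coded packets decodes, a helper can recode and forward without knowing which specific packets the destination is missing, which is the property PlayNCool exploits.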

  16. Standardizing data exchange for clinical research protocols and case report forms: An assessment of the suitability of the Clinical Data Interchange Standards Consortium (CDISC) Operational Data Model (ODM)

    Science.gov (United States)

    Huser, Vojtech; Sastry, Chandan; Breymaier, Matthew; Idriss, Asma; Cimino, James J.

    2015-01-01

    Efficient communication of a clinical study protocol and case report forms during all stages of a human clinical study is important for many stakeholders. An electronic and structured study representation format that can be used throughout the whole study life-span can improve such communication and potentially lower total study costs. The most relevant standard for representing clinical study data, applicable to unregulated as well as regulated studies, is the Operational Data Model (ODM) in development since 1999 by the Clinical Data Interchange Standards Consortium (CDISC). ODM's initial objective was exchange of case report forms data but it is increasingly utilized in other contexts. An ODM extension called Study Design Model, introduced in 2011, provides additional protocol representation elements. Using a case study approach, we evaluated ODM's ability to capture all necessary protocol elements during a complete clinical study lifecycle in the Intramural Research Program of the National Institutes of Health. ODM offers the advantage of a single format for institutions that deal with hundreds or thousands of concurrent clinical studies and maintain a data warehouse for these studies. For each study stage, we present a list of gaps in the ODM standard and identify necessary vendor or institutional extensions that can compensate for such gaps. The current version of ODM (1.3.2) has only partial support for study protocol and study registration data mainly because it is outside the original development goal. ODM provides comprehensive support for representation of case report forms (in both the design stage and with patient level data). Inclusion of requirements of observational, non-regulated or investigator-initiated studies (outside Food and Drug Administration (FDA) regulation) can further improve future revisions of the standard. PMID:26188274

  17. Standardizing data exchange for clinical research protocols and case report forms: An assessment of the suitability of the Clinical Data Interchange Standards Consortium (CDISC) Operational Data Model (ODM).

    Science.gov (United States)

    Huser, Vojtech; Sastry, Chandan; Breymaier, Matthew; Idriss, Asma; Cimino, James J

    2015-10-01

    Efficient communication of a clinical study protocol and case report forms during all stages of a human clinical study is important for many stakeholders. An electronic and structured study representation format that can be used throughout the whole study life-span can improve such communication and potentially lower total study costs. The most relevant standard for representing clinical study data, applicable to unregulated as well as regulated studies, is the Operational Data Model (ODM) in development since 1999 by the Clinical Data Interchange Standards Consortium (CDISC). ODM's initial objective was exchange of case report forms data but it is increasingly utilized in other contexts. An ODM extension called Study Design Model, introduced in 2011, provides additional protocol representation elements. Using a case study approach, we evaluated ODM's ability to capture all necessary protocol elements during a complete clinical study lifecycle in the Intramural Research Program of the National Institutes of Health. ODM offers the advantage of a single format for institutions that deal with hundreds or thousands of concurrent clinical studies and maintain a data warehouse for these studies. For each study stage, we present a list of gaps in the ODM standard and identify necessary vendor or institutional extensions that can compensate for such gaps. The current version of ODM (1.3.2) has only partial support for study protocol and study registration data mainly because it is outside the original development goal. ODM provides comprehensive support for representation of case report forms (in both the design stage and with patient level data). Inclusion of requirements of observational, non-regulated or investigator-initiated studies (outside Food and Drug Administration (FDA) regulation) can further improve future revisions of the standard. Published by Elsevier Inc.
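A minimal, schema-incomplete sketch of an ODM document with one study and one case report form definition can be built with the standard library. Element names follow ODM 1.3.x usage described above; the OIDs and attribute values are hypothetical.

```python
import xml.etree.ElementTree as ET

def build_odm(study_oid: str, form_name: str) -> str:
    """Serialize a tiny ODM-like document: Study -> MetaDataVersion -> FormDef.
    (Not schema-complete; namespaces and required metadata are omitted.)"""
    odm = ET.Element("ODM", FileOID="F.1", FileType="Snapshot", ODMVersion="1.3.2")
    study = ET.SubElement(odm, "Study", OID=study_oid)
    gv = ET.SubElement(study, "GlobalVariables")
    ET.SubElement(gv, "StudyName").text = form_name + " demo study"
    mdv = ET.SubElement(study, "MetaDataVersion", OID="MDV.1", Name="v1")
    ET.SubElement(mdv, "FormDef", OID="FD.1", Name=form_name, Repeating="No")
    return ET.tostring(odm, encoding="unicode")

def form_names(odm_xml: str):
    """Read back the case report form names from an ODM-like document."""
    root = ET.fromstring(odm_xml)
    return [fd.get("Name") for fd in root.iter("FormDef")]
```

A real export would also carry ItemGroupDef/ItemDef metadata and, for patient-level data, ClinicalData elements; the round trip above only shows the single-format idea the paper evaluates.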

  18. Shortened screening method for phosphorus fractionation in sediments A complementary approach to the standards, measurements and testing harmonised protocol

    International Nuclear Information System (INIS)

    Pardo, Patricia; Rauret, Gemma; Lopez-Sanchez, Jose Fermin

    2004-01-01

    The SMT protocol, a sediment phosphorus fractionation method harmonised and validated in the frame of the standards, measurements and testing (SMT) programme (European Commission), establishes five fractions of phosphorus according to their extractability. The determination of phosphate extracted is carried out spectrophotometrically. This protocol has been applied to 11 sediments of different origin and characteristics and the phosphorus extracted in each fraction was determined not only by UV-Vis spectrophotometry, but also by inductively coupled plasma-atomic emission spectrometry. The use of these two determination techniques allowed the differentiation between phosphorus that was present in the extracts as soluble reactive phosphorus and as total phosphorus. From the comparison of data obtained with both determination techniques a shortened screening method, for a quick evaluation of the magnitude and importance of the fractions given by the SMT protocol, is proposed and validated using two certified reference materials
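The arithmetic behind comparing the two detectors, where spectrophotometry captures soluble reactive P and ICP-AES captures total P in the same extract, can be captured in a small helper; the example concentrations are hypothetical.

```python
def nonreactive_p(total_p_icp: float, reactive_p_spec: float) -> float:
    """Non-reactive phosphorus in an extract, as total P (ICP-AES) minus
    soluble reactive P (spectrophotometry). Small negative differences
    within measurement noise are clipped to zero."""
    return max(0.0, total_p_icp - reactive_p_spec)

# Hypothetical extract: 410 mg/kg total P vs. 355 mg/kg reactive P
delta = nonreactive_p(410.0, 355.0)
```

A large difference in a fraction flags it for full SMT work-up; a near-zero difference is what lets the shortened screening method skip steps.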

  19. Photographing Injuries in the Acute Care Setting: Development and Evaluation of a Standardized Protocol for Research, Forensics, and Clinical Practice.

    Science.gov (United States)

    Bloemen, Elizabeth M; Rosen, Tony; Cline Schiroo, Justina A; Clark, Sunday; Mulcare, Mary R; Stern, Michael E; Mysliwiec, Regina; Flomenbaum, Neal E; Lachs, Mark S; Hargarten, Stephen

    2016-05-01

    Photographing injuries in the acute setting allows for improved documentation as well as assessment by clinicians and others who have not personally examined a patient. This tool is important, particularly for telemedicine, tracking of wound healing, the evaluation of potential abuse, and injury research. Despite this, protocols to ensure standardization of photography in clinical practice, forensics, or research have not been published. In preparation for a study of injury patterns in elder abuse and geriatric falls, our goal was to develop and evaluate a protocol for standardized photography of injuries that may be broadly applied. We conducted a literature review for techniques and standards in medical, forensic, and legal photography. We developed a novel protocol describing types of photographs and body positioning for eight body regions, including instructional diagrams. We revised it iteratively in consultation with experts in medical photography; forensics; and elder, child, and domestic abuse. The resulting protocol requires a minimum of four photos of each injury at multiple distances with and without a ruler/color guide. To evaluate the protocol's efficacy, multiple research assistants without previous photography experience photographed injuries from a convenience sample of elderly patients presenting to a single large, urban, academic emergency department. A selection of these patients' images was then evaluated in a blinded fashion by four nontreating emergency medicine physicians, and the inter-rater reliability between these physicians was calculated. Among the 131 injuries from 53 patients photographed by 18 photographers using this protocol, photographs of 25 injuries (10 bruises, seven lacerations, and eight abrasions) were used to assess characterization of the injury. Physicians' characterizations of the injuries were reliable for the size of the injury (κ = 0.91, 95% confidence interval [CI] = 0.77 to 1.00), side of the body (κ = 0.97, 95
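Inter-rater agreement of the kind reported above is computed with kappa statistics. A minimal Cohen's kappa for a single rater pair is sketched below; the study's four-physician comparison would use a multi-rater variant, and the example ratings are hypothetical.

```python
from collections import Counter

def cohens_kappa(rater_a, rater_b):
    """Cohen's kappa for two raters over the same items (nominal categories).
    Undefined (division by zero) when expected agreement is exactly 1."""
    assert len(rater_a) == len(rater_b)
    n = len(rater_a)
    po = sum(a == b for a, b in zip(rater_a, rater_b)) / n  # observed agreement
    ca, cb = Counter(rater_a), Counter(rater_b)
    pe = sum(ca[k] * cb[k] for k in ca) / (n * n)           # chance agreement
    return (po - pe) / (1 - pe)
```

With values near 0.91-0.97, as reported for injury size and body side, agreement is conventionally read as almost perfect.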

  20. Orthogonal analytical methods for botanical standardization: Determination of green tea catechins by qNMR and LC-MS/MS

    OpenAIRE

    Napolitano, José G.; Gödecke, Tanja; Lankin, David C.; Jaki, Birgit U.; McAlpine, James B.; Chen, Shao-Nong; Pauli, Guido F.

    2013-01-01

    The development of analytical methods for parallel characterization of multiple phytoconstituents is essential to advance the quality control of herbal products. While chemical standardization is commonly carried out by targeted analysis using gas or liquid chromatography-based methods, more universal approaches based on quantitative 1H NMR (qHNMR) measurements are being used increasingly in the multi-targeted assessment of these complex mixtures. The present study describes the development o...

  1. Summary Report Panel 1: The Need for Protocols and Standards in Research on Underwater Noise Impacts on Marine Life.

    Science.gov (United States)

    Erbe, Christine; Ainslie, Michael A; de Jong, Christ A F; Racca, Roberto; Stocker, Michael

    2016-01-01

    As concern about anthropogenic noise and its impacts on marine fauna is increasing around the globe, data are being compared across populations, species, noise sources, geographic regions, and time. However, much of the raw and processed data are not comparable due to differences in measurement methodology, analysis and reporting, and a lack of metadata. Common protocols and more formal, international standards are needed to ensure the effectiveness of research, conservation, regulation and practice, and unambiguous communication of information and ideas. Developing standards takes time and effort, is largely driven by a few expert volunteers, and would benefit from stakeholders' contribution and support.

  2. A Web Resource for Standardized Benchmark Datasets, Metrics, and Rosetta Protocols for Macromolecular Modeling and Design.

    Directory of Open Access Journals (Sweden)

    Shane Ó Conchúir

    Full Text Available The development and validation of computational macromolecular modeling and design methods depend on suitable benchmark datasets and informative metrics for comparing protocols. In addition, if a method is intended to be adopted broadly in diverse biological applications, there needs to be information on appropriate parameters for each protocol, as well as metrics describing the expected accuracy compared to experimental data. In certain disciplines, there exist established benchmarks and public resources where experts in a particular methodology are encouraged to supply their most efficient implementation of each particular benchmark. We aim to provide such a resource for protocols in macromolecular modeling and design. We present a freely accessible web resource (https://kortemmelab.ucsf.edu/benchmarks) to guide the development of protocols for protein modeling and design. The site provides benchmark datasets and metrics to compare the performance of a variety of modeling protocols using different computational sampling methods and energy functions, providing a "best practice" set of parameters for each method. Each benchmark has an associated downloadable benchmark capture archive containing the input files, analysis scripts, and tutorials for running the benchmark. The captures may be run with any suitable modeling method; we supply command lines for running the benchmarks using the Rosetta software suite. We have compiled initial benchmarks for the resource spanning three key areas: prediction of energetic effects of mutations, protein design, and protein structure prediction, each with associated state-of-the-art modeling protocols. With the help of the wider macromolecular modeling community, we hope to expand the variety of benchmarks included on the website and continue to evaluate new iterations of current methods as they become available.

  3. Building America House Simulation Protocols

    Energy Technology Data Exchange (ETDEWEB)

    Hendron, Robert [National Renewable Energy Lab. (NREL), Golden, CO (United States); Engebrecht, Cheryn [National Renewable Energy Lab. (NREL), Golden, CO (United States)

    2010-09-01

    The House Simulation Protocol document was developed to track and manage progress toward Building America's multi-year, average whole-building energy reduction research goals for new construction and existing homes, using a consistent analytical reference point. This report summarizes the guidelines for developing and reporting these analytical results in a consistent and meaningful manner for all home energy uses using standard operating conditions.

  4. The effect of the introduction of a standard monitoring protocol on ...

    African Journals Online (AJOL)

    2011-03-07

    Background: A comprehensive approach to the control of type 2 diabetes is required to reduce mortality and morbidity. To improve ... protocol on the investigations performed on the metabolic control of type 2 diabetes at ... In 2007, following a routine notes audit, it was noted that investigations were not ...

  5. A framework for the definition of standardized protocols for measuring upper-extremity kinematics

    NARCIS (Netherlands)

    Kontaxis, A.; Cutti, A.G.; Johnson, G.R.; Veeger, H.E.J.

    2009-01-01

    Background: Increasing interest in upper extremity biomechanics has led to closer investigations of both segment movements and detailed joint motion. Unfortunately, conceptual and practical differences in the motion analysis protocols used up to date reduce compatibility for post data and cross

  6. Psychometric Properties of a Standardized Observation Protocol to Quantify Pediatric Physical Therapy Actions

    NARCIS (Netherlands)

    Sonderer, Patrizia; Ziegler, Schirin Akhbari; Oertle, Barbara Gressbach; Meichtry, Andre; Hadders-Algra, Mijna

    Purpose: Pediatric physical therapy (PPT) is characterized by heterogeneity. This blurs the evaluation of effective components of PPT. The Groningen Observation Protocol (GOP) was developed to quantify contents of PPT. This study assesses the reliability and completeness of the GOP. Methods: Sixty

  7. Standard guide for establishing a quality assurance program for analytical chemistry laboratories within the nuclear industry

    CERN Document Server

    American Society for Testing and Materials. Philadelphia

    2006-01-01

    1.1 This guide covers the establishment of a quality assurance (QA) program for analytical chemistry laboratories within the nuclear industry. Reference to key elements of ANSI/ISO/ASQC Q9001, Quality Systems, provides guidance to the functional aspects of analytical laboratory operation. When implemented as recommended, the practices presented in this guide will provide a comprehensive QA program for the laboratory. The practices are grouped by functions, which constitute the basic elements of a laboratory QA program. 1.2 The essential, basic elements of a laboratory QA program appear in the following order: Organization (Section 5); Quality Assurance Program (Section 6); Training and Qualification (Section 7); Procedures (Section 8); Laboratory Records (Section 9); Control of Records (Section 10); Control of Procurement (Section 11); Control of Measuring Equipment and Materials (Section 12); Control of Measurements (Section 13); Deficiencies and Corrective Actions (Section 14).

  8. A Meta-Analytic Assessment of Empirical Differences in Standard Setting Procedures.

    Science.gov (United States)

    Bontempo, Brian D.; Marks, Casimer M.; Karabatsos, George

    Using meta-analysis, this research examines studies included in a meta-analysis by R. Jaeger (1989) that compared the cut score set by one standard-setting method with that set by another. This meta-analysis looks beyond Jaeger's studies to select 10 from the research literature, each comparing at least two types of standard-setting method.

  9. Analysis of 855 upper extremity fistulas created using a standard protocol: the role of graft extension to achieve functional status.

    Science.gov (United States)

    Allan, Bassan J; Perez, Enrique R; Tabbara, Marwan

    2013-06-01

    The Fistula First Breakthrough Initiative (FFBI) has been one of the most important national programs to help achieve considerable improvements in the care of patients on chronic hemodialysis. FFBI has helped place guidelines to push practitioners to reduce the use of tunneled central venous catheters and to increase the rate of arteriovenous fistula use in patients requiring chronic hemodialysis access. However, despite current guidelines, no specific protocols exist for the creation and management of autogenous arteriovenous fistulas and outcomes at most centers are below national benchmarks. In this study, we examine the effectiveness of a standard protocol used at our institution for the creation of autogenous upper extremity fistulas for hemodialysis access in achieving early cannulation and early removal of tunneled dialysis catheters. Our review encompasses 855 consecutive autogenous fistulas created over a 10-year period. Our findings suggest that the use of a standard protocol for creation and management of autogenous fistulas can help increase the rate of functional accesses over national benchmarks. Additionally, extension/conversion of malfunctioning fistulas to grafts appears to be an excellent method to expedite removal of a tunneled dialysis catheter with concomitant preservation of a fistula.

  10. [Standard protocol of ALK fusion gene assessment by fluorescent in situ hybridization in non-small cell lung cancer].

    Science.gov (United States)

    Guo, Lei; Zheng, Shan; Xie, Yong-qiang

    2013-08-01

    To investigate a standard protocol for anaplastic lymphoma kinase (ALK) fusion gene assessment by fluorescence in situ hybridization (FISH) in non-small cell lung cancer (NSCLC). Tissue specimens of NSCLC cases were retrospectively collected from January 2011 to July 2012. The ALK fusion gene was examined by FISH using break-apart ALK gene probes (Vysis). The presence of the ALK fusion gene was determined from fluorescent signals under a fluorescence microscope. One hundred and forty-six eligible NSCLC tumor specimens were tested for the ALK fusion gene by FISH. The specimens included 110 cases (75.4%) of surgically removed tissues, 11 cases (7.5%) of biopsies, 19 cases (13.0%) of lymph nodes, and 6 cases (4.1%) of other metastatic tissues. The positivity rate of the ALK fusion gene was 8.9% (13/146). The assessment of the ALK fusion gene by FISH using a standard protocol for formalin-fixed, paraffin-embedded (FFPE) tissue is feasible. The protocol can be used to test surgically removed tissues, biopsies, metastatic lymph nodes, and other metastatic specimens.

  11. A standardized education protocol significantly reduces traumatic injuries and syncope recurrence: an observational study in 316 patients with vasovagal syncope.

    Science.gov (United States)

    Aydin, M Ali; Mortensen, Kai; Salukhe, Tushar V; Wilke, Iris; Ortak, Michelle; Drewitz, Imke; Hoffmann, Boris; Müllerleile, Kai; Sultan, Arian; Servatius, Helge; Steven, Daniel; von Kodolitsch, Yskert; Meinertz, Thomas; Ventura, Rodolfo; Willems, Stephan

    2012-03-01

    The aim of this study was to assess the role of a non-pharmacological approach on the frequency of traumatic injuries and syncope recurrence in patients with vasovagal syncope and normal hearts. We report the experience in our syncope centre with a standardized education and teaching protocol for patients with vasovagal syncope. The treatment of vasovagal syncope is often complex and discouraging. Besides medical options, behaviour modification is a main component of therapy but has no statistical evidence to support its use. Between January 1999 and September 2006, we prospectively enrolled all patients with vasovagal syncope. The patients were counselled about the benign nature of their disease. Specific recommendations were made according to a standardized education protocol established at our syncope centre. A pre-/post-study was conducted to investigate the effectiveness of our approach, with syncope recurrence and frequency of injury as the study endpoints. Complete follow-up data were available from 85% of the study population (316 of 371) after a mean time of 710 ± 286 days (mean age 50 ± 18 years; 160 female). Eighty-seven patients (27.5%) had a syncope recurrence, with 22 suffering an injury during syncope. During the follow-up period, the syncope burden per month was significantly reduced from 0.35 ± 0.03 at initial presentation to 0.08 ± 0.02. The frequency of injury during syncope was significantly lower at the time of recurrence compared with the initial presentation (25 vs. 42%; McNemar's test P = 0.02). A standardized education protocol significantly reduces traumatic injuries and syncope recurrence in patients with vasovagal syncope.
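The paired pre/post comparison reported via McNemar's test can be sketched from discordant-pair counts alone. The counts below are hypothetical, not the study's data, and a continuity-corrected chi-square approximation is used rather than an exact binomial test.

```python
import math

def mcnemar(b: int, c: int) -> float:
    """Approximate p-value of McNemar's test (1 df, continuity-corrected).
    b and c are the two discordant-pair counts, e.g. patients injured
    before but not after the intervention, and vice versa."""
    if b + c == 0:
        return 1.0
    chi2 = (abs(b - c) - 1) ** 2 / (b + c)
    # chi-square survival function with 1 df: P(X > chi2) = erfc(sqrt(chi2 / 2))
    return math.erfc(math.sqrt(chi2 / 2))
```

With a strongly asymmetric discordant split (e.g. 10 vs. 2) the test is significant; a balanced split (5 vs. 5) is not, which is the logic behind the paper's P = 0.02 result.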

  12. [Preparation of sub-standard samples and XRF analytical method of powder non-metallic minerals].

    Science.gov (United States)

    Kong, Qin; Chen, Lei; Wang, Ling

    2012-05-01

    To address the problem that available standard samples of non-metallic minerals are unsatisfactory for practical X-ray fluorescence (XRF) analysis of pressed powder pellets, a method was developed for preparing sub-standard samples from standard samples of non-metallic minerals and adapting them to the analysis of mineral powder samples, taking the K-feldspar ore in Ebian-Wudu, Sichuan as an example. Based on characterization of the K-feldspar ore and the standard samples by X-ray diffraction (XRD) and chemical methods, and guided by the principle that sub-standard samples should be the same as or similar to the unknown samples, the experiment established the preparation method: both kinds of samples should contain the same minerals and similar chemical components, be amenable to mineral processing, and facilitate construction of the working curve. Under the optimum experimental conditions, a method for the determination of SiO2, Al2O3, Fe2O3, TiO2, CaO, MgO, K2O and Na2O in K-feldspar ore by XRF was established. The determination results are in good agreement with those of classical chemical methods, which indicates that the method is accurate.

  13. A protocol using coho salmon to monitor Tongass National Forest Land and Resource Management Plan standards and guidelines for fish habitat.

    Science.gov (United States)

    M.D. Bryant; Trent McDonald; R. Aho; B.E. Wright; Michelle Bourassa. Stahl

    2008-01-01

    We describe a protocol to monitor the effectiveness of the Tongass Land Management Plan (TLMP) management standards for maintaining fish habitat. The protocol uses juvenile coho salmon (Oncorhynchus kisutch) in small tributary streams in forested watersheds. We used a 3-year pilot study to develop detailed methods to estimate juvenile salmonid...

  14. Establishment and intra-/inter-laboratory validation of a standard protocol of reactive oxygen species assay for chemical photosafety evaluation.

    Science.gov (United States)

    Onoue, Satomi; Hosoi, Kazuhiro; Wakuri, Shinobu; Iwase, Yumiko; Yamamoto, Toshinobu; Matsuoka, Naoko; Nakamura, Kazuichi; Toda, Tsuguto; Takagi, Hironori; Osaki, Naoto; Matsumoto, Yasuhiro; Kawakami, Satoru; Seto, Yoshiki; Kato, Masashi; Yamada, Shizuo; Ohno, Yasuo; Kojima, Hajime

    2013-11-01

    A reactive oxygen species (ROS) assay was previously developed for photosafety evaluation of pharmaceuticals, and the present multi-center study aimed to establish and validate a standard protocol for the ROS assay. In three participating laboratories, two standards and 42 coded chemicals, including 23 phototoxins and 19 nonphototoxic drugs/chemicals, were assessed by the ROS assay according to the standardized protocol. Most phototoxins tended to generate singlet oxygen and/or superoxide under UV-vis exposure, but nonphototoxic chemicals were less photoreactive. In the ROS assay on quinine (200 µM), a typical phototoxic drug, the intra- and inter-day precisions (coefficient of variation; CV) were found to be 1.5-7.4% and 1.7-9.3%, respectively. The inter-laboratory CV for quinine averaged 15.4% for singlet oxygen and 17.0% for superoxide. The ROS assay on 42 coded chemicals (200 µM) yielded no false-negative predictions under the previously defined criteria, as compared with the in vitro/in vivo phototoxicity, although several false positives appeared. Outcomes from the validation study were indicative of satisfactory transferability, intra- and inter-laboratory variability, and predictive capacity of the ROS assay. Copyright © 2012 John Wiley & Sons, Ltd.

  15. Additionality and permanence standards in California's Forest Offset Protocol: A review of project and program level implications.

    Science.gov (United States)

    Ruseva, T; Marland, E; Szymanski, C; Hoyle, J; Marland, G; Kowalczyk, T

    2017-08-01

    A key component of California's cap-and-trade program is the use of carbon offsets as compliance instruments for reducing statewide GHG emissions. Under this program, offsets are tradable credits representing real, verifiable, quantifiable, enforceable, permanent, and additional reductions or removals of GHG emissions. This paper focuses on the permanence and additionality standards for offset credits as defined and operationalized in California's Compliance Offset Protocol for U.S. Forest Projects. Drawing on a review of the protocol, interviews, current offset projects, and existing literature, we discuss how additionality and permanence standards relate to project participation and overall program effectiveness. Specifically, we provide an overview of offset credits as compliance instruments in California's cap-and-trade program, the timeline for a forest offset project, and the factors shaping participation in offset projects. We then discuss the implications of permanence and additionality at both the project and program levels. Largely consistent with previous work, we find that stringent standards for permanent and additional project activities can present barriers to participation, but also that there may be a trade-off between project quality and quantity (i.e., levels of participation) when considering overall program effectiveness. We summarize what this implies for California's forest offset program and provide suggestions for improvements in light of potential program diffusion and policy learning. Copyright © 2017 Elsevier Ltd. All rights reserved.

  16. Development of Taiwanese government’s climate policy after the Kyoto protocol: Applying policy network theory as an analytical framework

    International Nuclear Information System (INIS)

    Shyu, Chian-Woei

    2014-01-01

    Given its limited involvement in and recognition by international organizations, Taiwan is not presently a signatory to the United Nations Framework Convention on Climate Change (UNFCCC) or the Kyoto Protocol. The objective of this study is to analyze how, and the extent to which, changes in an exogenous factor, namely the Kyoto Protocol and post-Kyoto climate negotiations, affect and ultimately lead to the formulation of and changes in the Taiwanese government's climate policy. This study applies policy network theory to examine the development of and changes in the Taiwanese government's climate policy. The results demonstrate that international climate agreements and negotiations play a key role in the development, change, and transformation of Taiwan's climate policy. This study found little evidence that domestic or internal factors affect climate change policy. Despite its lack of participation in the UNFCCC and the Kyoto Protocol, Taiwan has adopted national climate change strategies, action plans, and programs to reduce greenhouse gas emissions. However, these climate policies and measures are fairly passive and aim only to conform to the minimal requirements for developing countries under international climate agreements and negotiations. This process results in inconsistent and variable climate policies, targets, and regulations. - Highlights: • Taiwan is not a signatory to the UNFCCC or its Kyoto Protocol. • International climate agreements strongly affected Taiwan's climate policy. • Little evidence was found that domestic factors affect Taiwan's climate policy. • New climate policies, regulations, and laws are formulated and implemented. • Climate policies, targets, and regulations change frequently and are inconsistent.

  17. A standard operating protocol (SOP) and minimum data set (MDS) for nursing and medical handover: considerations for flexible standardization in developing electronic tools.

    Science.gov (United States)

    Turner, Paul; Wong, Ming Chao; Yee, Kwang Chien

    2009-01-01

    As part of Australia's participation in the World Health Organization, the Australian Commission on Safety and Quality in Health Care (ACSQHC) is the leading federal government technical agency involved in the area of clinical handover improvement. The ACSQHC has funded a range of handover improvement projects in Australia, including one at the Royal Hobart Hospital (RHH), Tasmania. The RHH project aims to investigate the potential for generalizable and transferable clinical handover solutions throughout the medical and nursing disciplines. More specifically, this project produced an over-arching minimum data set (MDS) and an over-arching standardized operating protocol (SOP) based on research work on nursing and medical shift-to-shift clinical handover in general medicine, general surgery and emergency medicine. The over-arching MDS consists of five headings: situational awareness, patient identification, history and information, responsibility and tasks, and accountability. The over-arching SOP has five phases: preparation, design, implementation, evaluation, and maintenance. This paper provides an overview of the project and the approach taken. It considers the implications of these standardized operating protocols and minimum data sets for developing electronic clinical handover support tools. Significantly, the paper highlights a human-centred design approach that actively involves medical and nursing staff in data collection, analysis, interpretation, and systems design. This approach reveals the dangers of info-centrism when considering electronic tools, as information emerges as only one factor amongst many others that influence the efficiency and effectiveness of clinical handover.
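    In an electronic handover tool, the five MDS headings could map onto a simple record type. A purely hypothetical sketch (field names paraphrase the headings above; this is not an ACSQHC schema, and all example values are invented):

    ```python
    from dataclasses import dataclass

    # Hypothetical encoding of the five over-arching MDS headings as a
    # handover record. Field names paraphrase the abstract's headings;
    # this is an illustration, not an official ACSQHC data model.
    @dataclass
    class HandoverRecord:
        situational_awareness: str
        patient_identification: str
        history_and_information: str
        responsibility_and_tasks: str
        accountability: str

    # Invented example values:
    record = HandoverRecord(
        situational_awareness="Ward at capacity; two unstable patients",
        patient_identification="Bed 12, J. Doe",
        history_and_information="Admitted with pneumonia, day 3 of antibiotics",
        responsibility_and_tasks="Repeat chest X-ray; review bloods at 18:00",
        accountability="Handed over from Dr A to Dr B",
    )
    print(record.patient_identification)
    ```

    Making each heading a required field is one way such a tool could enforce MDS completeness at the point of data entry.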

  18. Are Randomized Controlled Trials the (G)old Standard? From Clinical Intelligence to Prescriptive Analytics

    Science.gov (United States)

    2016-01-01

    Despite the accelerating pace of scientific discovery, the current clinical research enterprise does not sufficiently address pressing clinical questions. Given the constraints on clinical trials, for a majority of clinical questions, the only relevant data available to aid in decision making are based on observation and experience. Our purpose here is 3-fold. First, we describe the classic context of medical research guided by Popper's scientific epistemology of “falsificationism.” Second, we discuss challenges and shortcomings of randomized controlled trials and present the potential of observational studies based on big data. Third, we cover several obstacles related to the use of observational (retrospective) data in clinical studies. We conclude that randomized controlled trials are not at risk for extinction, but innovations in statistics, machine learning, and big data analytics may generate a completely new ecosystem for exploration and validation. PMID:27383622

  19. [Food Security in Europe: comparison between the "Hygiene Package" and the British Retail Consortium (BRC) & International Food Standard (IFS) protocols].

    Science.gov (United States)

    Stilo, A; Parisi, S; Delia, S; Anastasi, F; Bruno, G; Laganà, P

    2009-01-01

    The advent of the Hygiene Package and of Reg. CE No. 2073/2005 signalled a change in the Italian food production field. In Italy this process started in 1997 with legislative decree No. 155 on self-control, but it was in fact anticipated in the UK in 1990 by the Food Safety Act, a legal act shaped by basic rules corresponding to the application of HACCP standards. Since 1990 the British chains of distribution (Retailers) have extended this type of responsibility to all aspects of the food line. Out of this growing awareness of the need for greater regulation, a protocol edited by the British Retail Consortium was created in 1998. This protocol acted as a "stamp" of approval for food products and is now known as the BRC Global Food Standard; in July 2008 its fifth version became effective. After the birth of the BRC standard, French and German Retailers likewise established a practically equivalent standard, perhaps more pertinent to food safety: the International Food Standard (IFS). The new approach is specific to the food field and strictly applies criteria intended to ensure the "safety, quality and legality" of food products, similarly to ISO 22000:2005 (largely based on past BRC and IFS experience). The new standards aim to create a sort of green list of fully "proper and fit" Suppliers only, in response to the understandable requirements of Retailers. It is expected, as we have shown, that Auditor authorities responsible for ensuring that inspections are carried out under the Hygiene Package will find these new standards useful. The advantage of streamlining this system is that it allows enterprises to diligently enforce food safety practices without fear of upset or legal consequence, to improve the quality (HACCP) of the management and traceability system, and to restrict waste, reprocessing and withdrawal of products. However some discordances remain about the interpretation of certain sub-field norms (e.g., water

  20. Ficus deltoidea Standardization: Analytical Methods for Bioactive Markers in Deltozide Tablet 200 MG

    International Nuclear Information System (INIS)

    Hazlina Ahmad Hassali; Zainah Adam; Rosniza Razali

    2016-01-01

    Standardization of herbal materials based on their chemical and biological profiles is an important prerequisite for the development of herbal products. The phytopharmaceutical product developed by the Medical Technology Division, Malaysian Nuclear Agency is DELTOZIDE TABLET 200 MG, containing 200 mg of spray-dried aqueous extract of Ficus deltoidea var. kunstleri leaf as the active ingredient. Ficus deltoidea Jack, locally known as Mas Cotek, is a South East Asian native plant traditionally used to treat several diseases. Pharmacological data show that this plant exhibits good antioxidant, anti-diabetic and anti-inflammatory properties. Because the plant is widely used in traditional medicines, it is important to establish its chemical profiles and determine its phytochemical content. Thus, the present study reports a comprehensive evaluation of bioactive markers from this extract for the development of DELTOZIDE TABLET 200 MG. Characterization of the extract using an LC-MS/MS TripleTOF system showed the presence of major constituents representing vitexin, isovitexin, gallic acid, catechin, apigenin, epicatechin and caffeoylquinic acid, along with other minor constituents. The extract was standardized by ultra-high performance liquid chromatography (UHPLC) using two pharmacologically active markers, vitexin and isovitexin. Furthermore, qualitative determination of phytochemicals showed the presence of important phytoconstituents, namely anthraquinones, terpenoids, flavonoids, tannins, phlobatannins, alkaloids, saponins, cardiac glycosides, steroids and phenols, in the aqueous extract of Ficus deltoidea. Quantitative determination revealed a total phenolic content (TPC; gallic acid as standard) of 126.67±3.98 mg GAE/g extract and a total flavonoid content (TFC; quercetin as standard) of 9.08±0.36 mg QE/g extract. The generated data provides some explanation for its wide usage in

  1. A tiered analytical protocol for the characterization of heavy oil residues at petroleum-contaminated hazardous waste sites

    International Nuclear Information System (INIS)

    Pollard, S.J.T.; Kenefick, S.L.; Hrudey, S.E.; Fuhr, B.J.; Holloway, L.R.; Rawluk, M.

    1994-01-01

    The analysis of hydrocarbon-contaminated soils from abandoned refinery sites in Alberta, Canada, is used to illustrate a tiered analytical approach to the characterization of complex hydrocarbon wastes. Soil extracts isolated from heavy oil- and creosote-contaminated sites were characterized by thin layer chromatography with flame ionization detection (TLC-FID), ultraviolet fluorescence, simulated distillation (GC-SIMDIS) and chemical ionization GC-MS analysis. The combined screening and detailed analytical methods provided information essential to remedial technology selection, including the extent of contamination, the class composition of soil extracts, the distillation profile of component classes and the distribution of individual class components within various waste fractions. Residual contamination was characteristic of heavy, degraded oils, consistent with documented site operations and the length of hydrocarbon exposure at the soil surface.

  2. Analytical and Numerical Tooth Contact Analysis (TCA) of Standard and Modified Involute Profile Spur Gear

    Directory of Open Access Journals (Sweden)

    Nassear Rasheid Hmoad

    2016-03-01

    Among the common mechanical transmission elements, gears still play the dominant role, especially in heavy-duty applications, offering extraordinary performance under extreme conditions; this is why extensive research concentrates on enhancing their durability. The contact stress distribution within the tooth domain is considered one of the most influential parameters characterizing gear life, performance, efficiency, and application; it has been well studied for standard gear profiles and has received considerable attention for modified tooth shapes. The aim of this work is to investigate the effect of pressure angle, speed ratio, and correction factor on the maximum contact and bending stress values and on the principal stress distribution for symmetric and asymmetric spur gears. The analytical investigation adopted the Hertz equations to find the contact stress value, its distribution, and the contact zone width, while the numerical part relies on Ansys version 15 as the FE solver, with Lagrange and penalty contact algorithms. The key findings are that increasing the pressure angle and the speed ratio tends to reduce all the induced stresses for classical gears, and that asymmetric teeth with a larger pressure angle on the loaded side than on the unloaded side perform better than symmetric teeth with respect to stress reduction.
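    The analytical side of such a study rests on the Hertzian line-contact formula, which at the pitch point treats the mating flanks as two cylinders. A minimal numerical sketch, assuming both bodies are the same steel and using invented load and geometry values (this is the classical textbook estimate, not the paper's Ansys model):

    ```python
    import math

    # Hertzian line-contact estimate of maximum contact pressure at the
    # pitch point of a spur gear pair. Both bodies assumed identical
    # steel; load and radii below are illustrative, not from the paper.
    def hertz_pmax(w, r1, r2, E=210e9, nu=0.3):
        """w: normal load per unit face width [N/m];
        r1, r2: equivalent cylinder radii at the contact [m]."""
        R = 1.0 / (1.0 / r1 + 1.0 / r2)              # effective radius
        E_star = E / (2.0 * (1.0 - nu ** 2))          # contact modulus, same steel both sides
        return math.sqrt(w * E_star / (math.pi * R))  # max Hertzian pressure [Pa]

    # At the pitch point the equivalent radii are r_pitch * sin(alpha).
    # Invented example: pitch radii 30/60 mm, 20 deg pressure angle:
    alpha = math.radians(20.0)
    p = hertz_pmax(w=2.0e5, r1=0.030 * math.sin(alpha), r2=0.060 * math.sin(alpha))
    print(f"{p / 1e6:.0f} MPa")  # on the order of 1000 MPa for these inputs
    ```

    The formula makes the abstract's trend plausible: a larger pressure angle increases both equivalent radii, which raises the effective radius R and lowers the peak pressure for the same transmitted load.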

  3. Psychometric Properties of a Standardized Observation Protocol to Quantify Pediatric Physical Therapy Actions.

    Science.gov (United States)

    Sonderer, Patrizia; Akhbari Ziegler, Schirin; Gressbach Oertle, Barbara; Meichtry, André; Hadders-Algra, Mijna

    2017-07-01

    Pediatric physical therapy (PPT) is characterized by heterogeneity, which blurs the evaluation of the effective components of PPT. The Groningen Observation Protocol (GOP) was developed to quantify the contents of PPT. This study assesses the reliability and completeness of the GOP. Sixty infant PPT sessions were video-taped. Two random samples of 10 videos were used to determine interrater and intrarater reliability using intraclass correlation coefficients (ICCs) with 95% confidence intervals. Completeness of GOP 2.0 was based on all 60 videos. Interrater reliability of quantifying PPT actions was excellent (ICC, 0.75-1.0) for 71% and sufficient to good (ICC, 0.4-0.74) for 24% of PPT actions. Intrarater reliability was excellent for 94% and sufficient to good for 6% of PPT actions. Completeness was good for greater than 90% of PPT actions. GOP 2.0 has good reliability and completeness. After appropriate training, it is a useful tool to quantify PPT for children with developmental disorders.
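    The ICC cutoffs quoted above (0.4-0.74 sufficient to good, 0.75-1.0 excellent) are applied to an agreement statistic computed from an ANOVA decomposition. A minimal sketch of the one-way form, ICC(1,1), with invented rater scores (the study may well have used a two-way model, which differs in how rater effects are handled):

    ```python
    # One-way intraclass correlation coefficient, ICC(1,1): ratio of
    # between-subject variance to total variance. Scores are invented;
    # this is the simplest ICC form, shown for illustration only.
    def icc_oneway(scores):
        """scores: list of per-subject score lists, same k raters per subject."""
        n, k = len(scores), len(scores[0])
        grand = sum(sum(row) for row in scores) / (n * k)
        means = [sum(row) / k for row in scores]
        msb = k * sum((m - grand) ** 2 for m in means) / (n - 1)   # between-subject mean square
        msw = sum((x - m) ** 2                                      # within-subject mean square
                  for row, m in zip(scores, means) for x in row) / (n * (k - 1))
        return (msb - msw) / (msb + (k - 1) * msw)

    # Two raters scoring five videos (illustrative numbers):
    data = [[8, 9], [4, 5], [7, 7], [2, 3], [6, 6]]
    print(round(icc_oneway(data), 2))  # → 0.95, "excellent" by the cutoffs above
    ```

    Raters who disagree only by a constant offset still score high on this form; distinguishing consistency from absolute agreement is exactly why the two-way ICC variants exist.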

  4. Reproducibility of microbial mutagenicity assays. I. Tests with Salmonella typhimurium and Escherichia coli using a standardized protocol

    International Nuclear Information System (INIS)

    Dunkel, V.C.; Zeiger, E.; Brusick, D.; McCoy, E.; McGregor, D.; Mortelmans, K.; Rosenkranz, H.S.; Simmon, V.F.

    1984-01-01

    The Salmonella/microsome test developed by Ames and his coworkers has been widely used in the evaluation of chemicals for genotoxic potential. Although the value of this assay is well recognized, there have been no comprehensive studies of the interlaboratory reproducibility of the method using a standardized protocol. A program was therefore initiated to compare the results obtained in four laboratories from testing a series of coded mutagens and nonmutagens using a standardized protocol. Additional objectives of this study were to compare male Fischer 344 rat, B6C3F1 mouse, and Syrian hamster liver S-9 preparations for the activation of chemicals; to compare Aroclor 1254-induced liver S-9 from all three species with the corresponding non-induced liver S-9s; and to compare the response of Escherichia coli WP-2 uvrA with the Salmonella typhimurium tester strains recommended by Ames. Since a primary use of in vitro microbial mutagenesis tests is the identification of potential carcinogens by their mutagenicity, the authors decided to compare the animal species and strains used by the National Cancer Institute/National Toxicology Program (NCI/NTP) for animal carcinogenicity studies.

  5. Analytical Expressions of the Efficiency of Standard and High Contact Ratio Involute Spur Gears

    Directory of Open Access Journals (Sweden)

    Miguel Pleguezuelos

    2013-01-01

    Simple, traditional methods for computing the efficiency of spur gears are based on the hypotheses of a constant friction coefficient and uniform load sharing along the path of contact. However, neither hypothesis is accurate. The friction coefficient varies along the path of contact, though average values can often be used for preliminary calculations. Moreover, the nonuniform load sharing produced by the changing rigidity of the pair of teeth has a significant influence on the friction losses, owing to the different relative sliding at each contact point. In previous works, the authors obtained a nonuniform model of load distribution based on the minimum elastic potential criterion, which was applied to compute the efficiency of standard gears. In this work, this load-sharing model is applied to study the efficiency of both standard and high contact ratio involute spur gears (with contact ratios between 1 and 2, and greater than 2, respectively). Approximate expressions for the friction power losses and for the efficiency are presented, assuming the friction coefficient to be constant along the path of contact. A study of the influence of some transmission parameters (such as the gear ratio and pressure angle) on the efficiency is also presented.

  6. Replacing Amalgam Restorations: A Standardized Protocol Based on Analyzing Tissue Physicochemical Modifications.

    Science.gov (United States)

    Decup, Franck; Epaillard, Alexandre; Chemla, Florence

    2015-12-01

    Almost 60% of operative dentistry is devoted to replacing restorations. When practitioners have to replace an amalgam restoration, they tend to opt for an adhesive restoration, as it is conservative of tooth tissues and mimics the natural appearance of teeth. Based on a literature review, the aim of this article is to determine the best tissue approach when replacing an old amalgam by a new adhesive restoration. After analyzing and understanding tissue alterations due to the amalgam corrosion process, the authors propose an analytical approach to managing the situation. Both tissue orientated and specific mechanical approaches are developed and should be implemented to carry out the optimal clinical procedure and achieve the most conservative and durable treatment.

  7. 78 FR 45096 - Standards for Business Practices and Communication Protocols for Public Utilities

    Science.gov (United States)

    2013-07-26

    .... ] 31,309 (2010). \\13\\ Order No. 676-G, see supra n.8. 8. In Order No. 890, the Commission revisited the... the Business Practice Standards. Accordingly, NAESB set up a work project to review the existing... case, the end entity and the service provider operating the system, to authenticate the identities of...

  8. A standardized conjugation protocol to assess antibiotic resistance transfer between lactococcal species

    NARCIS (Netherlands)

    Lampkowska, J.; Feld, L.; Monaghan, A.; Toomey, N.; Schjørring, S.; Jacobsen, B.; Voet, van der H.; Andersen, S.R.; Bolton, D.; Aarts, H.J.M.; Krogfelt, K.A.; Wilcks, A.; Bardowski, J.K.

    2008-01-01

    Optimal conditions and a standardized method for conjugation between two model lactococcal strains, Lactococcus lactis SH4174 (pAMbeta1-containing, erythromycin resistant donor) and L. lactis Bu2-60 (plasmid-free, erythromycin sensitive recipient), were developed and tested in an inter-laboratory

  9. Protocol - realist and meta-narrative evidence synthesis: Evolving Standards (RAMESES)

    Directory of Open Access Journals (Sweden)

    Westhorp Gill

    2011-08-01

    Background: There is growing interest in theory-driven, qualitative and mixed-method approaches to systematic review as an alternative to (or to extend and supplement) conventional Cochrane-style reviews. These approaches offer the potential to expand the knowledge base in policy-relevant areas - for example, by explaining the success, failure or mixed fortunes of complex interventions. However, the quality of such reviews can be difficult to assess. This study aims to produce methodological guidance, publication standards and training resources for those seeking to use the realist and/or meta-narrative approach to systematic review. Methods/design: We will: [a] collate and summarise existing literature on the principles of good practice in realist and meta-narrative systematic review; [b] consider the extent to which these principles have been followed by published and in-progress reviews, thereby identifying how rigour may be lost and how existing methods could be improved; [c] using an online Delphi method with an interdisciplinary panel of experts from academia and policy, produce a draft set of methodological steps and publication standards; [d] produce training materials with learning outcomes linked to these steps; [e] pilot these standards and training materials prospectively on real reviews-in-progress, capturing methodological and other challenges as they arise; [f] synthesise expert input, evidence review and real-time problem analysis into more definitive guidance and standards; [g] disseminate outputs to audiences in academia and policy. The outputs of the study will be threefold: 1. Quality standards and methodological guidance for realist and meta-narrative reviews for use by researchers, research sponsors, students and supervisors. 2. A 'RAMESES' (Realist And Meta-narrative Evidence Syntheses: Evolving Standards) statement (comparable to CONSORT or PRISMA) of publication standards for such reviews, published in an open

  10. Study on bird's & insect's wing aerodynamics and comparison of its analytical value with standard airfoil

    Science.gov (United States)

    Ali, Md. Nesar; Alam, Mahbubul; Hossain, Md. Abed; Ahmed, Md. Imteaz

    2017-06-01

    by several species of birds. Hovering, which generates only lift through flapping alone rather than as a product of thrust, demands a lot of energy. For practical knowledge, we also fabricated various bird, insect, and fighter-jet wings using arbitrary parameter values and tested those airfoils in a wind tunnel. Finally, for comparison and analytical insight, we also tested those airfoil models in simulation software.

  11. Echinacea standardization: analytical methods for phenolic compounds and typical levels in medicinal species.

    Science.gov (United States)

    Perry, N B; Burgess, E J; Glennie, V L

    2001-04-01

    A proposed standard extraction and HPLC analysis method has been used to measure typical levels of various phenolic compounds in the medicinally used Echinacea species. Chicoric acid was the main phenolic in E. purpurea roots (mean 2.27% summer, 1.68% autumn) and tops (2.02% summer, 0.52% autumn), and echinacoside was the main phenolic in E. angustifolia (1.04%) and E. pallida roots (0.34%). Caftaric acid was the other main phenolic compound in E. purpurea roots (0.40% summer, 0.35% autumn) and tops (0.82% summer, 0.18% autumn), and cynarin was a characteristic component of E. angustifolia roots (0.12%). Enzymatic browning during extraction could reduce the measured levels of phenolic compounds by >50%. Colorimetric analyses for total phenolics correlated well with the HPLC results for E. purpurea and E. angustifolia, but the colorimetric method gave higher values.
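    The colorimetric total-phenolics results mentioned above are obtained by reading sample absorbances against a gallic acid standard curve. A hedged sketch of that calculation with invented calibration and sample values (the concentrations, volumes and masses below are illustrative, not the paper's data):

    ```python
    # Converting colorimetric absorbance readings to gallic acid
    # equivalents (GAE) via a linear standard curve. All numbers are
    # invented for illustration; the paper's own values differ.
    def fit_line(xs, ys):
        """Ordinary least-squares line; returns (slope, intercept)."""
        n = len(xs)
        mx, my = sum(xs) / n, sum(ys) / n
        slope = (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
                 / sum((x - mx) ** 2 for x in xs))
        return slope, my - slope * mx

    # Gallic acid standards: concentration [mg/L] vs absorbance.
    conc = [0, 50, 100, 150, 200]
    absb = [0.02, 0.27, 0.52, 0.77, 1.02]
    m, b = fit_line(conc, absb)

    # Sample extract: absorbance 0.60, in 10 mL prepared from 0.050 g material.
    c_sample = (0.60 - b) / m          # mg GAE per litre of assay solution
    tpc = c_sample * 0.010 / 0.050     # mg GAE per g of material
    print(round(tpc, 1))               # → 23.2
    ```

    The enzymatic-browning caveat in the abstract maps directly onto this arithmetic: any loss of phenolics before the absorbance reading lowers `c_sample`, and hence the reported mg GAE/g, proportionally.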

  12. The network formation assay: a spatially standardized neurite outgrowth analytical display for neurotoxicity screening.

    Science.gov (United States)

    Frimat, Jean-Philippe; Sisnaiske, Julia; Subbiah, Subanatarajan; Menne, Heike; Godoy, Patricio; Lampen, Peter; Leist, Marcel; Franzke, Joachim; Hengstler, Jan G; van Thriel, Christoph; West, Jonathan

    2010-03-21

    We present a rapid, reproducible and sensitive neurotoxicity testing platform that combines the benefits of neurite outgrowth analysis with cell patterning. This approach involves patterning neuronal cells within a hexagonal array to standardize the distance between neighbouring cellular nodes, and thereby standardize the length of the neurite interconnections. This feature coupled with defined assay coordinates provides a streamlined display for rapid and sensitive analysis. We have termed this the network formation assay (NFA). To demonstrate the assay we have used a novel cell patterning technique involving thin film poly(dimethylsiloxane) (PDMS) microcontact printing. Differentiated human SH-SY5Y neuroblastoma cells colonized the array with high efficiency, reliably producing pattern occupancies above 70%. The neuronal array surface supported neurite outgrowth, resulting in the formation of an interconnected neuronal network. Exposure to acrylamide, a neurotoxic reference compound, inhibited network formation. A dose-response curve from the NFA was used to determine a 20% network inhibition (NI(20)) value of 260 microM. This concentration was approximately 10-fold lower than the value produced by a routine cell viability assay, and demonstrates that the NFA can distinguish network formation inhibitory effects from gross cytotoxic effects. Inhibition of the mitogen-activated protein kinase (MAPK) ERK1/2 and phosphoinositide-3-kinase (PI-3K) signaling pathways also produced a dose-dependent reduction in network formation at non-cytotoxic concentrations. To further refine the assay a simulation was developed to manage the impact of pattern occupancy variations on network formation probability. Together these developments and demonstrations highlight the potential of the NFA to meet the demands of high-throughput applications in neurotoxicology and neurodevelopmental biology.
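    An NI(20) value like the 260 µM reported above is read off a dose-response curve at the 20% inhibition level. A minimal sketch that log-linearly interpolates an invented data set (the study fit a full dose-response curve; the doses and inhibition values here are made up):

    ```python
    import math

    # Estimate the concentration producing a given % network inhibition
    # by log-linear interpolation between bracketing dose-response
    # points. Data below are invented, not the study's measurements.
    def ni_level(doses, inhibition, level=20.0):
        """doses ascending (µM), inhibition in %; interpolate on log10 dose."""
        points = list(zip(doses, inhibition))
        for (d0, i0), (d1, i1) in zip(points, points[1:]):
            if i0 <= level <= i1:
                f = (level - i0) / (i1 - i0)
                return 10 ** (math.log10(d0) + f * (math.log10(d1) - math.log10(d0)))
        raise ValueError("inhibition level not bracketed by the data")

    doses = [10, 30, 100, 300, 1000]   # µM (invented)
    inhib = [2, 6, 12, 24, 55]         # % network formation inhibited (invented)
    print(round(ni_level(doses, inhib)))  # lands between the 100 and 300 µM points
    ```

    Comparing such an NI(20) against the corresponding cytotoxicity threshold is what lets the assay separate specific network-formation effects from gross cell death, as the abstract describes.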

  13. Critical sources of bacterial contamination and adoption of standard sanitary protocol during semen collection and processing in Semen Station

    Directory of Open Access Journals (Sweden)

    Chandrahas Sannat

    2015-05-01

    Aim: The present investigation was conducted to locate the critical sources of bacterial contamination and to evaluate a standard sanitation protocol intended to improve hygienic conditions during the collection, evaluation, and processing of bull semen in the Semen Station. Materials and Methods: The study compared two different hygienic procedures during the collection, evaluation and processing of semen at the Central Semen Station, Anjora, Durg. Routinely used materials, including the artificial vagina (AV) inner liner, cone, semen collection tube, buffer, extender/diluter and straws, together with the laboratory environment, including the processing lab, pass box and laminar air flow (LAF) cabinets of the extender preparation lab, processing lab, sealing-filling machine and bacteriological lab, were subjected to bacteriological examination in two phases of the study using two different sanitary protocols. Bacterial load in the above items/environments was measured using the standard plate count method and expressed as colony forming units (CFU). Results: Bacterial loads in the laboratory environment and AV equipment under the two different sanitary protocols differed highly significantly (p<0.001). Potential sources of bacterial contamination during semen collection and processing included the laboratory environment (processing lab, pass box, and LAF cabinets) and AV equipment (AV liner and cone). Bacterial load was reduced highly significantly (p<0.001) in the AV liner (from 2.33±0.67 to 0.50±0.52), cone (from 4.16±1.20 to 1.91±0.55), and extender (from 1.33±0.38 to 0) after application of improved practices of packaging, handling, and sterilization in Phase II of the study. Glassware, buffers, and straws showed no bacterial contamination in either phase of the study. With a slight modification of the fumigation protocol (formalin @600 ml/1000 ft3), bacterial load was significantly decreased (p<0.001) to 0-6 CFU in the processing lab (from 6.43±1.34 to 2.86±0.59), pass box (from 12
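    The standard plate count behind CFU figures like those above is a one-line conversion from colonies counted on a dilution plate back to the original sample. A sketch with illustrative numbers (not the station's data):

    ```python
    # Standard plate count: convert colonies counted on a dilution
    # plate to CFU per mL of the original sample. Counts below are
    # illustrative, not the semen-station measurements.
    def cfu_per_ml(colonies, dilution_factor, plated_ml):
        """colonies: count on the plate;
        dilution_factor: reciprocal of the dilution, e.g. 1e3 for 10^-3;
        plated_ml: volume spread on the plate [mL]."""
        return colonies * dilution_factor / plated_ml

    # 42 colonies on a 10^-3 dilution plate, 0.1 mL plated:
    print(cfu_per_ml(42, 1e3, 0.1))  # ≈ 4.2e5 CFU/mL
    ```

    In practice only plates with a countable colony range (commonly 25-250) are used for the calculation, which is why serial dilutions are plated in the first place.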

  14. Recommended volumetric capacity definitions and protocols for accurate, standardized and unambiguous metrics for hydrogen storage materials

    Science.gov (United States)

    Parilla, Philip A.; Gross, Karl; Hurst, Katherine; Gennett, Thomas

    2016-03-01

    The ultimate goal of the hydrogen economy is the development of hydrogen storage systems that meet or exceed the US DOE's goals for onboard storage in hydrogen-powered vehicles. In order to develop new materials to meet these goals, it is critical to measure materials' properties relevant to the specific goals accurately, uniformly and precisely. Without this assurance, such measurements are not reliable and therefore provide no benefit to the work at hand. In particular, capacity measurements for hydrogen storage materials must be based on valid and accurate results to ensure proper identification of promising materials for further development. Volumetric capacity determinations are becoming increasingly important for identifying promising materials, yet controversy exists over how such determinations are made and whether they are valid, owing to differing methodologies for counting the hydrogen content. These issues are discussed herein, and we show mathematically that capacity determinations can be made rigorously and unambiguously if the constituent volumes are well defined and measurable in practice. It is widely accepted that this is the case for excess capacity determinations, and we show here that it can also hold for total capacity determinations. Because the adsorption volume is undefined, the absolute capacity determination remains imprecise. Furthermore, we show that there is a direct relationship between determining the respective capacities and the calibration constants used for the manometric and gravimetric techniques. Several suggested volumetric capacity figures of merit are defined and discussed, and reporting requirements are recommended. Finally, an example is provided to illustrate these protocols and concepts.
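The excess/total distinction in this abstract can be sketched numerically. The following minimal Python sketch is illustrative only; the function names, the numbers, and the simple "total = excess + gas in void" decomposition are assumptions for demonstration, not the authors' exact definitions:

```python
def total_capacity_g(excess_g: float, gas_density_g_cm3: float, void_volume_cm3: float) -> float:
    """Total stored H2 = excess adsorption + compressed gas residing in the void volume.

    This decomposition is only unambiguous when the constituent volumes
    (skeletal, void) are well defined and measurable, as the abstract argues.
    """
    return excess_g + gas_density_g_cm3 * void_volume_cm3

def volumetric_capacity_g_per_l(total_g: float, system_volume_cm3: float) -> float:
    """Volumetric figure of merit: grams of H2 per litre of the chosen reference volume."""
    return 1000.0 * total_g / system_volume_cm3

# Illustrative numbers: 1.0 g excess, gas density 0.02 g/cm3 at the measurement
# pressure and temperature, 50 cm3 void, 100 cm3 reference volume:
total = total_capacity_g(1.0, 0.02, 50.0)          # 2.0 g
print(volumetric_capacity_g_per_l(total, 100.0))   # 20.0 g/L
```

Note how the figure of merit changes with the choice of reference volume, which is exactly why the abstract calls for standardized definitions and reporting requirements.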

  15. A Review of Offset Programs: Trading Systems, Funds, Protocols, Standards and Retailers

    Energy Technology Data Exchange (ETDEWEB)

    Kollmuss, Anja; Lazarus, Michael; Lee, Carrie; Polycarp, Clifford

    2008-11-15

    Carbon or greenhouse gas (GHG) offsets have long been promoted as an important element of a comprehensive climate policy approach. Offset programs can reduce the overall cost of achieving a given emission goal by enabling emission reductions to occur where costs are lower. Furthermore, offsets have the potential to deliver sustainability co-benefits, spurred through technology development and transfer, and to develop human and institutional capacity for reducing emissions in sectors and locations not included in a cap-and-trade system or a mandatory government policy. However, offsets can pose a risk to the environmental integrity of climate actions, especially if issues surrounding additionality, permanence, leakage, quantification and verification are not adequately addressed. The challenge for policymakers is clear: to design offset programs and policies that maximize their potential benefits while minimizing their potential risks. The goal of this review is to provide an up-to-date analysis and synthesis of the most influential offset programs and activities, to reflect on lessons learned, and thus to inform participants and designers of current and future offset programs. Our intention is to periodically update this review to stay abreast of ongoing developments, and to develop a website portal to make this information more accessible. This version targets programs that meet one or more of the following criteria: a significant volume of credit transactions occurring or anticipated; an established set of rules or protocols; path-breaking, novel or otherwise notable initiatives or important lessons learned.

  16. Standard protocol for demographic and epidemiological survey to be carried out for nuclear facilities

    International Nuclear Information System (INIS)

    Joshi, M.L.; Datta, D.; Singh, Jitendra; Sardhi, I.V.; Verma, P.C.

    2007-11-01

    This document presents the standard procedures for conducting demographic and epidemiological studies for nuclear facilities. These studies are required in order to establish baseline data and to assess the impact of the facility and the risk factors for the population residing in its vicinity. This document covers the basic elements of these types of surveys, their methodology, and the statistical analysis of the data collected during demographic and epidemiological surveillance. (author)

  17. Medical ethical standards in dermatology: an analytical study of knowledge, attitudes and practices.

    Science.gov (United States)

    Mostafa, W Z; Abdel Hay, R M; El Lawindi, M I

    2015-01-01

    Dermatology practice has not been ethically justified at all times. The objectives of the study were to find out dermatologists' knowledge of medical ethics, their attitudes towards regulatory measures and their practices, and to study the different factors influencing the knowledge, attitudes and practices of dermatologists. This is a cross-sectional comparative study conducted among 214 dermatologists, from five academic universities and from participants in two conferences. A 54-item structured anonymous questionnaire was designed to describe the demographic characteristics of the study group as well as their knowledge, attitudes and practices regarding medical ethics standards in clinical and research settings. Five scoring indices were estimated regarding knowledge, attitude and practice. Inferential statistics were used to test differences between groups as indicated. The Student's t-test and analysis of variance were carried out for quantitative variables; the chi-squared test was conducted for qualitative variables. The results were considered statistically significant at P < 0.05. Analysis of the possible factors having an impact on the overall scores revealed that the highest knowledge scores were among dermatologists who practice in an academic setting plus an additional place; however, this difference was statistically non-significant (P = 0.060). Female dermatologists showed a higher attitude score compared to males (P = 0.028). The highest significant attitude score (P = 0.019) regarding clinical practice was recorded among those practicing cosmetic dermatology. The different studied groups of dermatologists revealed a significant impact on the attitude score (P = 0.049), and the evidence-practice score (P dermatology research.

  18. Standard operating procedures for pre-analytical handling of blood and urine for metabolomic studies and biobanks

    International Nuclear Information System (INIS)

    Bernini, Patrizia; Bertini, Ivano; Luchinat, Claudio; Nincheri, Paola; Staderini, Samuele; Turano, Paola

    2011-01-01

    ¹H NMR metabolic profiling of urine, serum and plasma has been used to monitor the impact of the pre-analytical steps on sample quality and stability, in order to propose standard operating procedures (SOPs) for deposition in biobanks. We analyzed the quality of serum and plasma samples as a function of the elapsed time (t = 0-4 h) between blood collection and processing and of the time from processing to freezing (up to 24 h). The stability of the urine metabolic profile over time (up to 24 h) at various storage temperatures was monitored as a function of different pre-analytical treatments such as pre-storage centrifugation, filtration, and addition of the bacteriostatic preservative sodium azide. Appreciable changes in the profiles, reflecting changes in the concentration of a number of metabolites, were detected and are discussed in terms of chemical and enzymatic reactions for both blood and urine samples. Appropriate procedures for blood-derivative collection and urine preservation/storage, which preserve the original metabolic profile of the fresh samples as far as possible, emerge from this analysis and are proposed as SOPs for biobanking.

  19. Standard operating procedures for pre-analytical handling of blood and urine for metabolomic studies and biobanks

    Energy Technology Data Exchange (ETDEWEB)

    Bernini, Patrizia; Bertini, Ivano, E-mail: bertini@cerm.unifi.it; Luchinat, Claudio [University of Florence, Magnetic Resonance Center (CERM) (Italy); Nincheri, Paola; Staderini, Samuele [FiorGen Foundation (Italy); Turano, Paola [University of Florence, Magnetic Resonance Center (CERM) (Italy)

    2011-04-15

    ¹H NMR metabolic profiling of urine, serum and plasma has been used to monitor the impact of the pre-analytical steps on sample quality and stability, in order to propose standard operating procedures (SOPs) for deposition in biobanks. We analyzed the quality of serum and plasma samples as a function of the elapsed time (t = 0-4 h) between blood collection and processing and of the time from processing to freezing (up to 24 h). The stability of the urine metabolic profile over time (up to 24 h) at various storage temperatures was monitored as a function of different pre-analytical treatments such as pre-storage centrifugation, filtration, and addition of the bacteriostatic preservative sodium azide. Appreciable changes in the profiles, reflecting changes in the concentration of a number of metabolites, were detected and are discussed in terms of chemical and enzymatic reactions for both blood and urine samples. Appropriate procedures for blood-derivative collection and urine preservation/storage, which preserve the original metabolic profile of the fresh samples as far as possible, emerge from this analysis and are proposed as SOPs for biobanking.

  20. Yarn supplier selection using analytical hierarchy process (AHP) and standardized unitless rating (SUR) method on textile industry

    Science.gov (United States)

    Erfaisalsyah, M. H.; Mansur, A.; Khasanah, A. U.

    2017-11-01

    For a company engaged in the textile industry, selecting suppliers of raw materials for production is an important part of supply chain management and can affect the company's business processes. This study aims to identify the best suppliers of yarn for PC. PKBI based on several criteria. In this study, an integration of the Analytical Hierarchy Process (AHP) and the Standardized Unitless Rating (SUR) method is used to assess the performance of the suppliers. AHP yields the relative weight of each criterion, while SUR ranks the suppliers by performance score. The resulting supplier ranking can be used to identify the strengths and weaknesses of each supplier with respect to the performance criteria. From the final result, it can be determined which suppliers should improve their performance in order to build long-term cooperation with the company.
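The AHP step described in this record, deriving criterion weights from a pairwise comparison matrix and checking their consistency, can be sketched as follows. This is a generic illustration, not the paper's actual matrices; the criteria and judgments are invented:

```python
import numpy as np

# Saaty's random consistency index for matrix sizes 1..9 (standard values)
RI = {1: 0.0, 2: 0.0, 3: 0.58, 4: 0.90, 5: 1.12, 6: 1.24, 7: 1.32, 8: 1.41, 9: 1.45}

def ahp_weights(pairwise: np.ndarray):
    """Return (weights, consistency ratio) for a reciprocal pairwise comparison matrix.

    Weights are the principal eigenvector, normalised to sum to 1; a consistency
    ratio below 0.1 is conventionally considered acceptable.
    """
    n = pairwise.shape[0]
    eigvals, eigvecs = np.linalg.eig(pairwise)
    k = int(np.argmax(eigvals.real))
    weights = np.abs(eigvecs[:, k].real)
    weights /= weights.sum()
    lam_max = eigvals[k].real
    ci = (lam_max - n) / (n - 1)           # consistency index
    cr = ci / RI[n] if RI[n] > 0 else 0.0  # consistency ratio
    return weights, cr

# Hypothetical 3-criterion judgment matrix (e.g. price vs quality vs delivery):
A = np.array([[1.0, 3.0, 5.0],
              [1/3, 1.0, 3.0],
              [1/5, 1/3, 1.0]])
w, cr = ahp_weights(A)  # w ~ [0.64, 0.26, 0.10], cr ~ 0.03 (consistent)
```

A SUR-style step would then score each supplier on each criterion in standardized, unitless terms and combine the scores with these weights.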

  1. Methodological Study to Develop Standard Operational Protocol on Intramuscular (IM, Intradermal (ID and Subcutaneous Drug Administration for Children

    Directory of Open Access Journals (Sweden)

    Sunil Kumar Bijarania

    2017-10-01

    Introduction: Medicine administration is a major role played by registered nurses. Medicines are prescribed by the physician and dispensed by the pharmacist, but responsibility for meticulous administration rests with the registered nurse. It becomes even more important when drugs are administered to children. Drug administration via the Intramuscular (IM), Intradermal (ID) and Subcutaneous routes is a complex process, and errors are associated with medicine administration. Aim: The objective of this study was to develop Standard Operational Protocols (SOPs) for IM, ID and subcutaneous drug administration and checklists to assess the implementation of the developed SOPs. Materials and Methods: A methodological research design was adopted to develop standard operational protocols for IM, ID and subcutaneous drug administration for children admitted to the Advanced Paediatric Centre, Post Graduate Institute of Medical Education and Research, Chandigarh, India. The study included 58 bedside nurses and 90 observations of the medicine administration procedure. Results: The Content Validity Index (CVI) was computed to assess the validity of the content (items) of the SOPs and checklists, and overall Cronbach's alpha values were calculated to assess the internal consistency of the items in the SOPs and checklists. The CVIs of the SOPs and checklists were 98.51%, 97.83% and 99.03%, and the overall Cronbach's alpha values were 0.96, 0.82 and 0.95. All the nurses felt that the SOPs are useful. Conclusion: Valid and feasible SOPs for drug administration in children, along with valid and reliable checklists, were developed. The use of this document is recommended to prevent possible errors during drug administration to children.
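The internal-consistency statistic reported above, Cronbach's alpha, can be computed from raw item scores as below. This is a minimal sketch with invented toy data, not the study's questionnaire data:

```python
def cronbach_alpha(items: list[list[float]]) -> float:
    """Cronbach's alpha: k/(k-1) * (1 - sum of item variances / variance of totals).

    `items` holds one inner list per questionnaire item, scored over the same
    respondents in the same order.
    """
    k = len(items)
    n = len(items[0])

    def pvar(xs):  # population variance
        m = sum(xs) / len(xs)
        return sum((x - m) ** 2 for x in xs) / len(xs)

    totals = [sum(item[j] for item in items) for j in range(n)]
    return k / (k - 1) * (1 - sum(pvar(it) for it in items) / pvar(totals))

# Two perfectly correlated items yield alpha = 1.0:
print(cronbach_alpha([[1, 2, 3, 4], [1, 2, 3, 4]]))  # 1.0
```

Values around 0.8-0.96, as reported in the study, indicate good to excellent internal consistency.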

  2. Assessing the service quality of Iran military hospitals: Joint Commission International standards and Analytic Hierarchy Process (AHP) technique.

    Science.gov (United States)

    Bahadori, Mohammadkarim; Ravangard, Ramin; Yaghoubi, Maryam; Alimohammadzadeh, Khalil

    2014-01-01

    Military hospitals are responsible for preserving, restoring and improving the health not only of armed forces personnel, but also of other people. In line with the military organizations' strategy of being a leader and pioneer in all areas, providing quality health services is one of the main goals of military health care organizations. This study aimed to evaluate the service quality of selected military hospitals in Iran based on the Joint Commission International (JCI) standards, and to compare and rank these hospitals using the analytic hierarchy process (AHP) technique, in 2013. This was a cross-sectional and descriptive study conducted on five military hospitals, selected using the purposive sampling method, in 2013. Required data were collected using checklists of accreditation standards and the nominal group technique. The AHP technique was used for prioritizing, and Expert Choice 11.0 was used to analyze the collected data. Among the JCI standards, the standards of access to care and continuity of care (weight = 0.122), quality improvement and patient safety (weight = 0.121) and leadership and management (weight = 0.117) had the greatest importance, respectively. Furthermore, in the overall ranking, BGT (weight = 0.369), IHM (0.238), SAU (0.202), IHK (weight = 0.125) and SAB (weight = 0.066) ranked first to fifth, respectively. AHP is an appropriate technique for measuring the overall performance of hospitals and their quality of services. It is a holistic approach that takes all hospital processes into consideration. The results of the present study can be used to improve hospital performance by identifying areas in need of focus for quality improvement and by selecting strategies to improve service quality.

  3. [A standardized protocol for detection of ALK protein expression and gene fusion in lung adenocarcinoma cytologic specimens].

    Science.gov (United States)

    Wang, Zheng; Wu, Xiaonan; Shi, Yuankai; Han, Xiaohong; Cheng, Gang; Li, Lin; Mu, Xinlin; Zhang, Yuhui; Cui, Di; Zhang, Li; Fan, Zaiwen; Zhu, Guangqing; Ma, Lingyun; Yang, Li; Di, Jing; Liu, Dongge

    2015-10-01

    The aim of this study was to establish a standardized protocol for the detection of ALK protein expression and gene fusion in cytologic specimens. Lung adenocarcinoma cytologic specimens were collected from seven hospitals in Beijing. A detection protocol for ALK protein expression and gene fusion was designed according to the results of comparative experiments. Ventana immunohistochemistry (IHC) with the ALK (D5F3) assay was performed on 203 prepared formalin-fixed paraffin-embedded (FFPE) cell blocks to detect ALK protein expression. ALK gene fusion in 98 EGFR wild-type cytologic specimens and in 4 bronchoalveolar lavage fluid (BL) samples was detected by quantitative reverse transcription polymerase chain reaction (qRT-PCR). ALK gene fusion in the Ventana IHC ALK (D5F3)-positive samples was further tested by fluorescence in situ hybridization (FISH). Six patients with ALK IHC-positive results were followed up to analyze their responses to crizotinib therapy. Comparative experiments: (1) the effect of fixation in 4% neutral buffered formalin for different times (24 h, 48 h, 72 h) on Ventana IHC ALK (D5F3) staining was assessed in two cases of IHC ALK-positive FFPE cell blocks; (2) qRT-PCR results for ALK fusion were compared between FFPE cell blocks and cytospin-prepared slides in 10 cases of lung adenocarcinoma cytologic specimens. Among the specimens examined using the standardized protocol recommended by this study, 229 cases of cytologic specimens met the diagnostic criteria of lung adenocarcinoma. Among them, 207 cases obtained ALK gene test results (by at least one method), an ALK test ratio of 90.4% (207/229). FFPE cell blocks were successfully prepared in 203 cases, Ventana IHC ALK (D5F3) was successfully performed in all 203 FFPE cell blocks (100%), and the ALK protein positive detection rate was 10.3% (21/203). 
ALK fusion was tested in 98 FFPE cytologic samples of EGFR wild types by qRT-PCR, and 96 out of 98 (97.96%) cytologic samples were

  4. Orthogonal analytical methods for botanical standardization: determination of green tea catechins by qNMR and LC-MS/MS.

    Science.gov (United States)

    Napolitano, José G; Gödecke, Tanja; Lankin, David C; Jaki, Birgit U; McAlpine, James B; Chen, Shao-Nong; Pauli, Guido F

    2014-05-01

    The development of analytical methods for the parallel characterization of multiple phytoconstituents is essential to advance the quality control of herbal products. While chemical standardization is commonly carried out by targeted analysis using gas or liquid chromatography-based methods, more universal approaches based on quantitative ¹H NMR (qHNMR) measurements are being used increasingly in the multi-targeted assessment of these complex mixtures. The present study describes the development of a 1D qHNMR-based method for simultaneous identification and quantification of green tea constituents. This approach utilizes computer-assisted ¹H iterative Full Spin Analysis (HiFSA) and enables rapid profiling of seven catechins in commercial green tea extracts. The qHNMR results were cross-validated against quantitative profiles obtained with an orthogonal LC-MS/MS method. The relative strengths and weaknesses of both approaches are discussed, with special emphasis on the role of identical reference standards in qualitative and quantitative analyses.
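Quantification by internal-standard qHNMR rests on the proportionality of integrated signal area to the number of contributing protons. A hedged sketch of that standard relation follows; the function name and the worked numbers are illustrative assumptions, not values from this study:

```python
def qhnmr_mass_mg(i_analyte: float, i_std: float,
                  nh_analyte: int, nh_std: int,
                  mw_analyte: float, mw_std: float,
                  mass_std_mg: float) -> float:
    """m_analyte = (I_a / I_std) * (N_std / N_a) * (M_a / M_std) * m_std

    I: integrated signal area; N: number of protons giving rise to the signal;
    M: molar mass; m_std: weighed mass of the internal standard.
    """
    return (i_analyte / i_std) * (nh_std / nh_analyte) \
           * (mw_analyte / mw_std) * mass_std_mg

# Toy numbers: analyte signal has twice the standard's area, arises from twice
# as many protons, and the analyte has double the molar mass; 10 mg standard:
print(qhnmr_mass_mg(2.0, 1.0, 2, 1, 100.0, 50.0, 10.0))  # 20.0
```

In practice, well-resolved signals (or HiFSA-resolved profiles, as in this study) are needed so that each integral can be attributed to a single constituent.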

  5. The international protocol for the dosimetry of external radiotherapy beams based on standards of absorbed dose to water

    International Nuclear Information System (INIS)

    Andreo, P.

    2001-01-01

    An International Code of Practice (CoP, or dosimetry protocol) for external beam radiotherapy dosimetry based on standards of absorbed dose to water has been published by the IAEA on behalf of the IAEA, WHO, PAHO and ESTRO. The CoP provides a systematic and internationally unified approach for the determination of the absorbed dose to water under reference conditions with radiotherapy beams. The development of absorbed-dose-to-water standards for high-energy photons and electrons offers the possibility of reducing the uncertainty in the dosimetry of radiotherapy beams. Many laboratories already provide calibrations at the radiation quality of 60Co gamma-rays, and some have extended calibrations to high-energy photon and electron beams. The dosimetry of kilovoltage x-rays, as well as that of proton and ion beams, can also be based on these standards. Thus, a coherent dosimetry system based on the same formalism is achieved for practically all radiotherapy beams. The practical use of the CoP is simple. The document comprises a set of individual CoPs for each radiation type, which include detailed procedures and worksheets. All CoPs are based on ND,w chamber calibrations at a reference beam quality Qo, together with beam quality correction factors kQ, preferably measured directly for the user's chamber in a standards laboratory. Calculated values of kQ are provided together with their uncertainty estimates. Beam quality specifiers are 60Co and TPR20,10 (high-energy photons), R50 (electrons), HVL and kV (x-rays), and Rres (protons and ions)
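The ND,w formalism described above reduces the reference dose determination to a single product of the corrected chamber reading, the calibration coefficient and the beam quality correction factor. A minimal sketch, with purely illustrative numbers that are not taken from the document:

```python
def absorbed_dose_to_water(m_q: float, n_d_w: float, k_q: float) -> float:
    """Dw,Q = M_Q * N_D,w * k_Q

    m_q: fully corrected chamber reading at user beam quality Q (C);
    n_d_w: absorbed-dose-to-water calibration coefficient at the reference
    quality Qo (Gy/C); k_q: beam quality correction factor from Qo to Q.
    """
    return m_q * n_d_w * k_q

# Illustrative values: 20 nC reading, N_D,w = 5.4e7 Gy/C, kQ = 0.99:
print(absorbed_dose_to_water(20e-9, 5.4e7, 0.99))  # ~1.07 Gy
```

When kQ is not measured directly for the user's chamber, the tabulated calculated values (with their stated uncertainties) are used instead.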

  6. Thromboelastometry versus standard coagulation tests versus restrictive protocol to guide blood transfusion prior to central venous catheterization in cirrhosis: study protocol for a randomized controlled trial.

    Science.gov (United States)

    Rocha, Leonardo Lima; Pessoa, Camila Menezes Souza; Neto, Ary Serpa; do Prado, Rogerio Ruscitto; Silva, Eliezer; de Almeida, Marcio Dias; Correa, Thiago Domingos

    2017-02-27

    Liver failure patients have traditionally been empirically transfused prior to invasive procedures. Blood transfusion is associated with immunologic and nonimmunologic reactions, an increased risk of adverse outcomes and high costs. Scientific evidence supporting empirical transfusion is lacking, and the best approach to blood transfusion prior to invasive procedures in cirrhotic patients has not been established so far. The aim of this study is to compare three transfusion strategies (routine coagulation test-guided, ordinary or restrictive, or thromboelastometry-guided) prior to central venous catheterization in critically ill patients with cirrhosis. Design and setting: a double-blinded, parallel-group, single-center, randomized controlled clinical trial in a tertiary private hospital in São Paulo, Brazil. Participants: adults (aged 18 years or older) admitted to the intensive care unit with cirrhosis and an indication for central venous line insertion. Patients will be randomly assigned to one of three groups for blood transfusion strategy prior to central venous catheterization: standard coagulation test-based, thromboelastometry-based, or restrictive. The primary efficacy endpoint will be the proportion of patients transfused with any blood product prior to central venous catheterization. The primary safety endpoint will be the incidence of major bleeding. Secondary endpoints will be the proportion of transfusion of fresh frozen plasma, platelets and cryoprecipitate; infused volume of blood products; hemoglobin and hematocrit before and after the procedure; intensive care unit and hospital length of stay; 28-day and hospital mortality; incidence of minor bleeding; transfusion-related adverse reactions; and cost analysis. This study will evaluate three strategies to guide blood transfusion prior to central venous line placement in severely ill patients with cirrhosis. We hypothesized that thromboelastometry-based and/or restrictive protocols are safe and would significantly

  7. Analytical protocol to study the food safety of (multiple-)recycled high-density polyethylene (HDPE) and polypropylene (PP) crates: Influence of recycling on the migration and formation of degradation products

    NARCIS (Netherlands)

    Coulier, L.; Orbons, H.G.M.; Rijk, R.

    2007-01-01

    An analytical protocol was set up and successfully applied to study the food safety of recycled HDPE and PP crates. A worst-case scenario was applied that focused not only on overall migration and specific migration of accepted starting materials but also on migratable degradation products of

  8. Comparison of mRNA splicing assay protocols across multiple laboratories: recommendations for best practice in standardized clinical testing.

    Science.gov (United States)

    Whiley, Phillip J; de la Hoya, Miguel; Thomassen, Mads; Becker, Alexandra; Brandão, Rita; Pedersen, Inge Sokilde; Montagna, Marco; Menéndez, Mireia; Quiles, Francisco; Gutiérrez-Enríquez, Sara; De Leeneer, Kim; Tenés, Anna; Montalban, Gemma; Tserpelis, Demis; Yoshimatsu, Toshio; Tirapo, Carole; Raponi, Michela; Caldes, Trinidad; Blanco, Ana; Santamariña, Marta; Guidugli, Lucia; de Garibay, Gorka Ruiz; Wong, Ming; Tancredi, Mariella; Fachal, Laura; Ding, Yuan Chun; Kruse, Torben; Lattimore, Vanessa; Kwong, Ava; Chan, Tsun Leung; Colombo, Mara; De Vecchi, Giovanni; Caligo, Maria; Baralle, Diana; Lázaro, Conxi; Couch, Fergus; Radice, Paolo; Southey, Melissa C; Neuhausen, Susan; Houdayer, Claude; Fackenthal, Jim; Hansen, Thomas Van Overeem; Vega, Ana; Diez, Orland; Blok, Rien; Claes, Kathleen; Wappenschmidt, Barbara; Walker, Logan; Spurdle, Amanda B; Brown, Melissa A

    2014-02-01

    Accurate evaluation of unclassified sequence variants in cancer predisposition genes is essential for clinical management and depends on a multifactorial analysis of clinical, genetic, pathologic, and bioinformatic variables and assays of transcript length and abundance. The integrity of assay data in turn relies on appropriate assay design, interpretation, and reporting. We conducted a multicenter investigation to compare mRNA splicing assay protocols used by members of the ENIGMA (Evidence-Based Network for the Interpretation of Germline Mutant Alleles) consortium. We compared similarities and differences in results derived from analysis of a panel of breast cancer 1, early onset (BRCA1) and breast cancer 2, early onset (BRCA2) gene variants known to alter splicing (BRCA1: c.135-1G>T, c.591C>T, c.594-2A>C, c.671-2A>G, and c.5467+5G>C and BRCA2: c.426-12_8delGTTTT, c.7988A>T, c.8632+1G>A, and c.9501+3A>T). Differences in protocols were then assessed to determine which elements were critical in reliable assay design. PCR primer design strategies, PCR conditions, and product detection methods, combined with a prior knowledge of expected alternative transcripts, were the key factors for accurate splicing assay results. For example, because of the position of primers and PCR extension times, several isoforms associated with BRCA1, c.594-2A>C and c.671-2A>G, were not detected by many sites. Variation was most evident for the detection of low-abundance transcripts (e.g., BRCA2 c.8632+1G>A Δ19,20 and BRCA1 c.135-1G>T Δ5q and Δ3). Detection of low-abundance transcripts was sometimes addressed by using more analytically sensitive detection methods (e.g., BRCA2 c.426-12_8delGTTTT ins18bp). We provide recommendations for best practice and raise key issues to consider when designing mRNA assays for evaluation of unclassified sequence variants.

  9. Building America House Simulation Protocols (Revised)

    Energy Technology Data Exchange (ETDEWEB)

    Hendron, R.; Engebrecht, C.

    2010-10-01

    The House Simulation Protocol document was developed to track and manage progress toward Building America's multi-year, average whole-building energy reduction research goals for new construction and existing homes, using a consistent analytical reference point. This report summarizes the guidelines for developing and reporting these analytical results in a consistent and meaningful manner for all home energy uses using standard operating conditions.

  10. Evolutionary cognitive therapy versus standard cognitive therapy for depression: a protocol for a blinded, randomized, superiority clinical trial.

    Science.gov (United States)

    Giosan, Cezar; Cobeanu, Oana; Mogoase, Cristina; Muresan, Vlad; Malta, Loretta S; Wyka, Katarzyna; Szentagotai, Aurora

    2014-03-19

    Depression is estimated to become the leading cause of disease burden globally by 2030. Despite existing efficacious treatments (both medical and psychotherapeutic), a large proportion of patients do not respond to therapy. Recent insights from evolutionary psychology suggest that, in addition to targeting the proximal causes of depression (for example, targeting dysfunctional beliefs by cognitive behavioral therapy), the distal or evolutionary causes (for example, inclusive fitness) should also be addressed. A randomized superiority trial is conducted to develop and test an evolutionary-driven cognitive therapy protocol for depression, and to compare its efficacy against standard cognitive therapy for depression. Romanian-speaking adults (18 years or older) with elevated Beck Depression Inventory (BDI) scores (>13), a current diagnosis of major depressive disorder or major depressive episode (MDD or MDE), or MDD with comorbid dysthymia, as evaluated by the Structured Clinical Interview for DSM-IV (SCID), are included in the study. Participants are randomized to one of two conditions: 1) evolutionary-driven cognitive therapy (ED-CT) or 2) cognitive therapy (CT). Both groups undergo 12 psychotherapy sessions, and data are collected at baseline, mid-treatment, post-treatment, and the 3-month follow-up. Primary outcomes are depressive symptomatology and a categorical diagnosis of depression post-treatment. This randomized trial compares the newly proposed ED-CT with a classic CT protocol for depression. To our knowledge, this is the first attempt to integrate insights from evolutionary theories of depression into the treatment of this condition in a controlled manner. This study can thus add substantially to the body of knowledge on validated treatments for depression. Current Controlled Trials ISRCTN64664414. The trial was registered in June 2013. The first participant was enrolled on October 3, 2012.

  11. Normalisation of conventional analytical methods (XRF And icp) on NAA using different kinds Of geological standard reference material

    International Nuclear Information System (INIS)

    Bounakhla, M.; Embarch, K.; Zahry, F.; Gaudry, A.; Piccot, D.; Gruffat, J.J.; Moutte, J.; Bilal, E.

    2001-01-01

    The aim of this work is a comparison between the INAA (Instrumental Neutron Activation Analysis), XRF (X-ray Fluorescence) and ICP-AES (Inductively Coupled Plasma Atomic Emission Spectrometry) methods for geological samples. The study was carried out on 15 standard reference materials (SRMs) covering different kinds of rocks of different origins. For INAA, the irradiation was performed in the Pierre-Sue Laboratory in France using two reactors: OSIRIS for irradiation with epithermal neutrons and ORPHEE for thermal neutron irradiation. Two irradiation protocols were used, the first under cadmium to eliminate thermal neutrons and the second with a high ratio of thermal to epithermal neutron flux (f = 2000). For XRF, two techniques were used: energy-dispersive XRF and wavelength-dispersive XRF. The ICP-AES spectrometer used was a sequential instrument. In this comparison study, we first normalized the INAA results to the certified values. The measurements of the conventional methods (XRF and ICP-AES) were normalized to both the INAA results and the certified values. The function used to fit the measured and certified values was a Gaussian. It was concluded that the conventional methods complement the INAA results in many cases, but their main disadvantage is poor sensitivity (especially for XRF) in the determination of trace elements, mainly rare elements. However, the conventional methods are necessary in rock characterization through major element determination

  12. A Protocol, a standard and a (PULI) database for quantitative micro-FTIR measurements of water in nominally anhydrous minerals: an update

    Science.gov (United States)

    Kovacs, Istvan; Udvardi, Beatrix; Pintér, Zsanett; Hidas, Károly; Kutassy, Lászlóné; Falus, György; Lendvay, Pál; István, Török; Zelei, Tamás; Fancsik, Tamás; Gál, Tamás; Mihály, Judith; Németh, Csaba; Ingrin, Jannick; Xia, Qunke; Hermann, Jörg; Stalder, Roland; Perucchi, Andrea; Kamarás, Katalin; Szekrényes, Zsolt

    2014-05-01

    'Water' (H2O, OH and H+) in the nominally anhydrous minerals (NAMs) of the upper mantle plays a key role in determining its geochemical and geophysical properties. Both the concentration and the substitution mechanism of water are important in formulating its effect on material properties. Fourier-transform infrared (FTIR) spectrometry can provide both qualitative and quantitative information on the substitution of water into NAMs and is therefore a widely used analytical technique. The quantitative evaluation of micro-FTIR mineral spectra, however, still seems rather ambiguous. This is because there are several different - sometimes controversial - ways to measure or estimate the total polarized integrated absorbance (Atot). Furthermore, there are mineral-, substitution mechanism- and wavenumber-dependent calibration factors available to convert Atot to the absolute concentration of water (usually given in ppm wt.%). No wonder that very different absolute water concentrations may be obtained from the very same IR spectrum. Thus, there is certainly a need for an evaluation protocol that would reduce these uncertainties by giving clear instructions on how Atot should be obtained and which calibration factors should be used. Such a protocol is introduced in our study. Inter-laboratory differences were monitored by analysing some unoriented grains of the Pakistani olivine standard using different brands of infrared microscopes in several different countries and laboratories worldwide. During these measurements, optimal settings for the IR analysis of NAMs were constrained. The results show that the inter-laboratory deviations are typically less than 10%. Numerous infrared spectra have been reported; to keep up with this rising number of infrared spectra, an electronic spectral database is desirable. This was the motivation for constructing the Pannon Uniform Lithospheric Infrared (PULI) spectral database ( puli
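Converting a total polarized integrated absorbance Atot to a water concentration follows a Beer-Lambert-type relation. The sketch below is a generic illustration only; the function name and the calibration value are assumptions, and real integral absorption coefficients are mineral-, substitution mechanism- and wavenumber-dependent, which is exactly the ambiguity the abstract addresses:

```python
def water_content_ppm(a_tot: float, thickness_cm: float, i_calib: float) -> float:
    """c [ppm wt] = (Atot / thickness) / I

    a_tot: total polarized integrated absorbance (cm^-2, before normalisation);
    thickness_cm: sample thickness; i_calib: integral absorption coefficient
    in ppm^-1 cm^-2, which must come from a published, mineral-specific
    calibration (the value used below is purely hypothetical).
    """
    a_norm = a_tot / thickness_cm  # normalise to 1 cm path length
    return a_norm / i_calib

# Toy numbers only: Atot = 10 cm^-2 over a 0.5 cm thick grain, I = 2.0:
print(water_content_ppm(10.0, 0.5, 2.0))  # 10.0
```

Because the result scales inversely with i_calib, choosing a different published calibration factor changes the reported concentration proportionally, which is why a standardized protocol matters.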

  13. Analytical Evaluation of Preliminary Drop Tests Performed to Develop a Robust Design for the Standardized DOE Spent Nuclear Fuel Canister

    International Nuclear Information System (INIS)

    Ware, A.G.; Morton, D.K.; Smith, N.L.; Snow, S.D.; Rahl, T.E.

    1999-01-01

    The Department of Energy (DOE) has developed a design concept for a set of standard canisters for the handling, interim storage, transportation, and disposal in the national repository of DOE spent nuclear fuel (SNF). The standardized DOE SNF canister has to be capable of handling virtually all of the DOE SNF in a variety of potential storage and transportation systems. It must also be acceptable to the repository, based on current and anticipated future requirements. This expected usage mandates a robust design. The canister design has four unique geometries, with lengths of approximately 10 feet or 15 feet, and an outside nominal diameter of 18 inches or 24 inches. The canister has been developed to withstand a drop from 30 feet onto a rigid (flat) surface, sustaining only minor damage - but no rupture - to the pressure (containment) boundary. The majority of the end drop-induced damage is confined to the skirt and lifting/stiffening ring components, which can be removed if desired after an accidental drop. A canister with its skirt and stiffening ring removed after an accidental drop can continue to be used in service with appropriate operational steps being taken. Features of the design concept have been proven through drop testing and finite element analyses of smaller test specimens. Finite element analyses also validated the canister design for drops onto a rigid (flat) surface for a variety of canister orientations at impact, from vertical to 45 degrees off vertical. Actual 30-foot drop testing has also been performed to verify the final design, though limited to just two full-scale test canister drops. In each case, the analytical models accurately predicted the canister response

  14. Matrix-assisted laser desorption/ionization-time of flight mass spectrometry: protocol standardization and database expansion for rapid identification of clinically important molds.

    Science.gov (United States)

    Paul, Saikat; Singh, Pankaj; Rudramurthy, Shivaprakash M; Chakrabarti, Arunaloke; Ghosh, Anup K

    2017-12-01

    To standardize matrix-assisted laser desorption ionization-time of flight mass spectrometry protocols and to expand the existing Bruker Biotyper database for mold identification. Four different sample preparation methods (protocols A, B, C and D) were evaluated. On analyzing each protein extraction method, reliable identification and the best log scores were achieved with protocol D. The same protocol was used to identify 153 clinical isolates. Of these 153, 123 (80.3%) were accurately identified using the existing database, and the remaining 30 (19.7%) were not identified because they were absent from the database. On inclusion of the missing main spectrum profiles in the existing database, all 153 isolates were identified. Matrix-assisted laser desorption ionization-time of flight mass spectrometry can be used for routine identification of clinically important molds.
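    The Biotyper reports a log score for each spectrum match, and identification is read off fixed cut-offs. The thresholds below are the commonly cited Biotyper interpretive levels, not values stated in this abstract; individual protocols may adjust them:

```python
def interpret_biotyper_score(score: float) -> str:
    """Map a MALDI Biotyper log score (0-3) to an identification level,
    using commonly cited cut-offs (assumed here, not taken from the study)."""
    if score >= 2.0:
        return "secure species-level identification"
    if score >= 1.7:
        return "probable genus-level identification"
    return "no reliable identification"

for s in (2.35, 1.85, 1.2):
    print(s, interpret_biotyper_score(s))
```

    Under this scheme an isolate missing from the database tends to score below the lowest cut-off, which is why adding the missing main spectrum profiles recovered the remaining 19.7% of identifications.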

  15. Development and validation of simple RP-HPLC-PDA analytical protocol for zileuton assisted with Design of Experiments for robustness determination

    Directory of Open Access Journals (Sweden)

    Saurabh B. Ganorkar

    2017-02-01

    Full Text Available A simple, rapid, sensitive, robust, stability-indicating RP-HPLC-PDA analytical protocol was developed and validated for the analysis of zileuton racemate in bulk and in tablet formulation. Method development and resolution of degradation products from forced hydrolytic (acidic, basic, neutral), oxidative, photolytic (acidic, basic, neutral, solid state) and thermal (dry heat) degradation were achieved on an LC-GC Qualisil BDS C18 column (250 mm × 4.6 mm, 5 μm) in isocratic mode at ambient temperature, employing a mobile phase of methanol and orthophosphoric acid (0.2%, v/v) in a ratio of 80:20 (v/v) at a flow rate of 1.0 mL min−1 with detection at 260 nm. 'Design of Experiments' (DOE) employing 'Central Composite Design' (CCD) and 'Response Surface Methodology' (RSM) was applied as an advancement over the traditional 'One Variable at a Time' (OVAT) approach to evaluate the effects of variations in selected factors (methanol content, flow rate, concentration of orthophosphoric acid) on robustness; graphical interpretation was supported by statistical interpretation with Multiple Linear Regression (MLR) and ANOVA. The method met all validation parameters: linearity, precision, accuracy, limit of detection, limit of quantitation, and robustness. The method was applied effectively to the analysis of in-house zileuton tablets.
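    As a sketch of that robustness workflow, the snippet below builds a three-factor face-centred central composite design in coded units (the study's factors were methanol content, flow rate, and acid concentration) and fits a quadratic response model by least squares. The simulated response and its coefficients are illustrative assumptions, not the study's data:

```python
import itertools
import numpy as np

def ccd_face_centered(k: int = 3) -> np.ndarray:
    """Face-centred CCD in coded units: 2^k corners + 2k axial points + center."""
    corners = np.array(list(itertools.product([-1.0, 1.0], repeat=k)))
    axial = np.vstack([v * np.eye(k)[i] for i in range(k) for v in (-1.0, 1.0)])
    center = np.zeros((1, k))
    return np.vstack([corners, axial, center])

def model_matrix(X: np.ndarray) -> np.ndarray:
    """Full quadratic model: intercept, linear, two-way interaction, square terms."""
    k = X.shape[1]
    cols = [np.ones(len(X))]
    cols += [X[:, i] for i in range(k)]
    cols += [X[:, i] * X[:, j] for i in range(k) for j in range(i + 1, k)]
    cols += [X[:, i] ** 2 for i in range(k)]
    return np.column_stack(cols)

X = ccd_face_centered()  # 8 + 6 + 1 = 15 runs for 3 factors
rng = np.random.default_rng(0)
# Fake response: factor 0 (say, methanol content) dominates, factor 1 opposes it
y = 10 + 1.5 * X[:, 0] - 0.8 * X[:, 1] + rng.normal(0, 0.05, len(X))
beta, *_ = np.linalg.lstsq(model_matrix(X), y, rcond=None)
print(beta[:3])  # intercept-adjacent estimates; beta[1] should be near 1.5
```

    The fitted coefficients play the role of the MLR/ANOVA interpretation in the paper: a large linear coefficient flags a factor to which the method is not robust.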

  16. Clinical Outcome of a Prospective Case Series of Patients With Ketamine Cystitis Who Underwent Standardized Treatment Protocol.

    Science.gov (United States)

    Yee, Chi-hang; Lai, Pui-tak; Lee, Wai-man; Tam, Yuk-him; Ng, Chi-fai

    2015-08-01

    To assess the outcome of a prospective cohort of patients with ketamine-associated uropathy after standardized treatment. This is a prospective case series of patients with ketamine-related urologic problems. Management of the patients follows a 4-tier approach, namely anti-inflammatory or anti-cholinergic drugs, opioid analgesics or pregabalin, intravesical hyaluronic acid, and finally surgical intervention, including hydrodistension and augmentation cystoplasty. Outcome was assessed with functional bladder capacity, the pelvic pain and urgency or frequency (PUF) symptom scale, and the EuroQol visual analog scale. Between December 2011 and June 2014, 463 patients presented with ketamine-associated uropathy. All were managed by the same standardized protocol. Among these patients, 319 came back for follow-up assessment. The overall mean follow-up duration was 10.7 ± 8.5 months. For those patients who received first-line treatment (290 patients), there was a significant improvement in PUF scores, the EuroQol visual analog scale, and functional bladder capacity. Both abstinence from ketamine usage and the amount of ketamine consumed were factors predicting the improvement in PUF scores. For those patients who required second-line oral therapy (62 patients), 42 patients (67.7%) reported improvement in symptoms. Eight patients completed intravesical therapy. There was a significant improvement in voided volume for these patients after treatment. The study demonstrated the efficacy of managing ketamine-associated uropathy using a 4-tier approach. Both anti-inflammatory drugs and analgesics could effectively alleviate symptoms. Abstinence from ketamine and the amount of ketamine consumed have a bearing on treatment response. Copyright © 2015 Elsevier Inc. All rights reserved.

  17. Changes in Body Mass Index and Lipid Profile in Psoriatic Patients After Treatment With Standard Protocol of Infliximab.

    Science.gov (United States)

    Ehsani, Amir Houshang; Mortazavi, Hossein; Balighi, Kamran; Hosseini, Mahboubeh Sadat; Azizpour, Arghavan; Hejazi, Seyyedeh Pardis; Goodarzi, Azadeh; Darvari, Seyyedeh Bahareh

    2016-09-01

    Psoriasis is a chronic inflammatory dermatologic disease. Psoriasis may predispose to cardiovascular disease and diabetes. However, the role of tumor necrosis factor (TNF) inhibitors in mediating this risk is controversial. Given the frequent use of infliximab in psoriasis, and the hypothesis that anti-TNF-α treatment may increase body mass index (BMI) and alter the lipid profile in these patients, the aim of this study was to assess changes in BMI, lipid profile, and leptin level in psoriatic patients under treatment with a standard protocol of infliximab over a 24-week period. This was a before-after study. Twenty-seven psoriatic patients were included, and standard infliximab therapy was applied. All patients underwent blood collection three times; in each session LDL, HDL, total cholesterol, triglycerides, and leptin were measured, together with the PASI score, at the start of the study and at the 12th and 24th weeks of follow-up. Twenty-five patients, 18 (72%) male and 7 (28%) female, were evaluated. The mean age of the patients was 36.91±13.31 years. The PASI score demonstrated a significant decrease after 24 weeks, whereas BMI, HDL, and leptin showed a significant increase during treatment. A significant negative correlation was seen between leptin and PASI score changes (r=0.331, P=0.042). HDL and BMI had the strongest correlations with leptin (positive correlation) and PASI score (negative correlation). The results demonstrated a dramatic decrease in PASI and increases in BMI, HDL, and leptin, somewhat correlated with each other. These results suggest that patients taking infliximab should take more care of their weight and lipid profile while on treatment.

  18. An umbrella protocol for standardized data collection (SDC) in rectal cancer: a prospective uniform naming and procedure convention to support personalized medicine.

    Science.gov (United States)

    Meldolesi, Elisa; van Soest, Johan; Dinapoli, Nicola; Dekker, Andre; Damiani, Andrea; Gambacorta, Maria Antonietta; Valentini, Vincenzo

    2014-07-01

    Predictive models allow treating physicians to deliver tailored treatment moving from prescription by consensus to prescription by numbers. The main features of an umbrella protocol for standardizing data and procedures to create a consistent dataset useful to obtain a trustful analysis for a Decision Support System for rectal cancer are reported. Copyright © 2014 Elsevier Ireland Ltd. All rights reserved.

  19. [Studies on the identification of psychotropic substances. VIII. Preparation and various analytical data of reference standard of some stimulants, amfepramone, cathinone, N-ethylamphetamine, fenethylline, fenproporex and mefenorex].

    Science.gov (United States)

    Shimamine, M; Takahashi, K; Nakahara, Y

    1992-01-01

    The Reference Standards for amfepramone, cathinone, N-ethylamphetamine, fenethylline, fenproporex and mefenorex were prepared. Their purities, determined by HPLC, were more than 99.5%. For the identification and determination of these six drugs, their analytical data were measured and discussed using TLC, UV, IR, HPLC, GC/MS and NMR.

  20. Accelerated rehabilitation compared with a standard protocol after distal radial fractures treated with volar open reduction and internal fixation: a prospective, randomized, controlled study.

    Science.gov (United States)

    Brehmer, Jess L; Husband, Jeffrey B

    2014-10-01

    There are relatively few studies in the literature that specifically evaluate accelerated rehabilitation protocols for distal radial fractures treated with open reduction and internal fixation (ORIF). The purpose of this study was to compare the early postoperative outcomes (at zero to twelve weeks postoperatively) of patients enrolled in an accelerated rehabilitation protocol with those of patients enrolled in a standard rehabilitation protocol following ORIF for a distal radial fracture. We hypothesized that patients with accelerated rehabilitation after volar ORIF for a distal radial fracture would have an earlier return to function compared with patients who followed a standard protocol. From November 2007 to November 2010, eighty-one patients with an unstable distal radial fracture were prospectively randomized to follow either an accelerated or a standard rehabilitation protocol after undergoing ORIF with a volar plate for a distal radial fracture. Both groups began with gentle active range of motion at three to five days postoperatively. At two weeks, the accelerated group initiated wrist/forearm passive range of motion and strengthening exercises, whereas the standard group initiated passive range of motion and strengthening at six weeks postoperatively. Patients were assessed at three to five days, two weeks, three weeks, four weeks, six weeks, eight weeks, twelve weeks, and six months postoperatively. Outcomes included Disabilities of the Arm, Shoulder and Hand (DASH) scores (primary outcome) and measurements of wrist flexion/extension, supination, pronation, grip strength, and palmar pinch. The patients in the accelerated group had better mobility, strength, and DASH scores at the early postoperative time points (zero to eight weeks postoperatively) compared with the patients in the standard rehabilitation group. The difference between the groups was both clinically relevant and statistically significant. 
Patients who follow an accelerated rehabilitation

  1. Does leaf chemistry differentially affect breakdown in tropical versus temperate streams? Importance of standardized analytical techniques to measure leaf chemistry

    Science.gov (United States)

    Marcelo Ardon; Catherine M. Pringle; Susan L. Eggert

    2009-01-01

    Comparisons of the effects of leaf litter chemistry on leaf breakdown rates in tropical vs temperate streams are hindered by incompatibility among studies and across sites of analytical methods used to...

  2. LHCb: FPGA based data-flow injection module at 10 Gbit/s reading data from network exported storage and using standard protocols

    CERN Multimedia

    Lemouzy, B; Garnier, J-C

    2010-01-01

    The goal of the LHCb readout upgrade is to speed up the DAQ to 40 MHz. Such a DAQ system will certainly employ 10 Gigabit or similar technologies and might also need new networking protocols such as a customized, lightweight TCP or more specialised protocols. A test module is being implemented which integrates into the existing LHCb infrastructure. It is a multiple 10-Gigabit traffic generator, driven by a Stratix IV FPGA, which is flexible enough to either generate LHCb's raw data packets internally or read them from external storage via the network. For reading the data we have implemented a lightweight industry-standard protocol, ATA over Ethernet (AoE), and we present an outlook on using a filesystem on these network-exported disk drives.
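    AoE is indeed lightweight: it is a thin shelf/slot addressing layer carried directly in Ethernet frames (EtherType 0x88A2) with ATA commands as payload. To illustrate how small the protocol is, this sketch packs the common 10-byte AoE header; the field layout follows the published AoE specification, while the example field values are arbitrary:

```python
import struct

AOE_ETHERTYPE = 0x88A2  # AoE frames are carried directly over Ethernet

def aoe_header(major: int, minor: int, command: int, tag: int,
               version: int = 1, flags: int = 0, error: int = 0) -> bytes:
    """Pack the 10-byte AoE common header that follows the Ethernet header:
    version/flags, error, shelf (major), slot (minor), command, tag.
    Big-endian, per the AoE specification; values here are arbitrary."""
    ver_flags = ((version & 0x0F) << 4) | (flags & 0x0F)
    return struct.pack("!BBHBBI", ver_flags, error, major, minor, command, tag)

# Command 0 = issue ATA command to shelf 1, slot 2; tag echoes back in the reply
hdr = aoe_header(major=1, minor=2, command=0, tag=0x00C0FFEE)
print(len(hdr), hdr.hex())
```

    With no TCP/IP stack underneath, an FPGA traffic generator only has to parse this fixed header plus the ATA payload, which is what makes AoE attractive for reading network-exported drives in hardware.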

  3. Performance Comparison of Wireless Sensor Network Standard Protocols in an Aerospace Environment: ISA100.11a and ZigBee Pro

    Science.gov (United States)

    Wagner, Raymond S.; Barton, Richard J.

    2011-01-01

    Standards-based wireless sensor network (WSN) protocols are promising candidates for spacecraft avionics systems, offering unprecedented instrumentation flexibility and expandability. Ensuring reliable data transport is key, however, when migrating from wired to wireless data gathering systems. In this paper, we conduct a rigorous laboratory analysis of the relative performances of the ZigBee Pro and ISA100.11a protocols in a representative crewed aerospace environment. Since both operate in the 2.4 GHz radio frequency (RF) band shared by systems such as Wi-Fi, they are subject at times to potentially debilitating RF interference. We compare goodput (application-level throughput) achievable by both under varying levels of 802.11g Wi-Fi traffic. We conclude that while the simpler, more inexpensive ZigBee Pro protocol performs well under moderate levels of interference, the more complex and costly ISA100.11a protocol is needed to ensure reliable data delivery under heavier interference. This paper represents the first published, rigorous analysis of WSN protocols in an aerospace environment that we are aware of and the first published head-to-head comparison of ZigBee Pro and ISA100.11a.
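    Goodput, the headline metric of this comparison, is simply application payload successfully delivered per unit time, excluding retransmissions and protocol overhead. A minimal helper, with hypothetical inputs:

```python
def goodput_kbps(app_bytes_delivered: int, elapsed_s: float) -> float:
    """Application-level throughput (goodput) in kbit/s: payload bytes that
    actually reached the sink, excluding retransmissions and headers."""
    if elapsed_s <= 0:
        raise ValueError("elapsed time must be positive")
    return app_bytes_delivered * 8 / 1000 / elapsed_s

# Hypothetical run: 125 kB of sensor readings delivered in 10 s
print(goodput_kbps(125_000, 10.0))  # -> 100.0 kbit/s
```

    Measuring at this level, rather than radio-level throughput, is what exposes the effect of Wi-Fi interference on the two protocols' delivery guarantees.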

  4. Development of standardized bioassay protocols for the toxicity assessment of waste, manufactured products, and effluents in Latin America: Venezuela, a Case Study

    International Nuclear Information System (INIS)

    Rodriquez-Grau, J.

    1993-01-01

    The present status of the toxicity assessment of industrial products in Latin America is well below North American/EC standards. As an example, most Latin American regulatory laws regarding effluent discharge are still based upon concentration limits of certain major pollutants and BOD/COD measurements; no reference is made to the necessity of aquatic bioassay toxicity data. Aware of this imperative need, the Venezuelan Petroleum Industry (PDVSA), through its corporate R&D branch (INTEVEP), gave priority to the development of standardized acute/sublethal toxicity test protocols as a sound means of evaluating its products and wastes. Throughout this presentation, the Venezuelan case will be studied, showing the strategies undertaken to accelerate protocol development. Results will show the assessment of 14 different protocols encompassing a variety of species of aquatic/terrestrial organisms, and a series of toxicity test endpoints including mortality, reproductive, biological and immunological measurements, most of which are currently in use or being developed. These protocols have already yielded useful results in numerous cases where toxicity assessment was required, including evaluations of effluents, oil dispersants, drilling fluids, toxic wastes, fossil fuels and newly developed products. The Venezuelan case demonstrates that the integration of Industry, Academia and Government, which is an essential part of SETAC's philosophy, is absolutely necessary for the successful advancement of environmental scientific/regulatory issues

  5. Radiographic assessment of alignment following TKA: outline of a standardized protocol and assessment of a newly devised trigonometric method of analysis.

    Science.gov (United States)

    Balakrishnan, Vikram; De Steiger, Richard; Lowe, Adrian

    2010-05-01

    An important determinant of long-term outcomes following total knee arthroplasty (TKA) is post-operative alignment as measured on radiographs. Thus far, radiographs have been measured using the goniometer method (GM) and no standard protocol has been followed. The aims of this prospective study were to: (i) outline a protocol for radiographic measurement following TKA; and (ii) compare the accuracy of the traditional GM with a new trigonometric method (TM) of radiographic analysis. A protocol for the measurement of alignment on radiographs following TKA was outlined in detail with step-by-step instructions. A new TM of angle measurement was also delineated. Alignment was measured on 51 post-operative TKA radiographs. A single angle was chosen and measured by two observers using both the GM and the TM. The TM had a precision of 1.06 degrees compared with 1.5 degrees for the GM. The standard deviation of the TM was significantly smaller than that of the GM (P = 0.033). The intra-class correlation coefficient of the TM was 0.94 versus 0.90 for the GM. The study detailed a protocol for the measurement of axial alignment of the limbs and components following TKA, and provided evidence that the new TM of angle measurement was superior in precision and intra-rater reliability to the traditional method.
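    The abstract does not reproduce the trigonometric method itself, so the sketch below shows only the generic idea: deriving an inter-axis angle from digitized landmark coordinates rather than reading it off a goniometer overlay. The landmark coordinates are hypothetical:

```python
import math

def axis_angle_deg(p1, p2, q1, q2):
    """Acute angle in degrees between the line p1->p2 and the line q1->q2,
    each defined by two digitized 2D landmark points."""
    a1 = math.atan2(p2[1] - p1[1], p2[0] - p1[0])
    a2 = math.atan2(q2[1] - q1[1], q2[0] - q1[0])
    ang = abs(math.degrees(a1 - a2)) % 180.0
    return min(ang, 180.0 - ang)  # fold to the acute angle between the lines

# Hypothetical landmarks on a radiograph (pixel coordinates):
# a vertical reference axis versus a slightly tilted component axis
print(round(axis_angle_deg((0, 0), (0, 100), (0, 0), (10, 100)), 2))  # -> 5.71
```

    Computing the angle from coordinates removes the reading step of a physical goniometer, which is one plausible source of the improved precision reported for the TM.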

  6. Standard echocardiography versus handheld echocardiography for the detection of subclinical rheumatic heart disease: protocol for a systematic review.

    Science.gov (United States)

    Telford, Lisa H; Abdullahi, Leila H; Ochodo, Eleanor A; Zühlke, Liesl J; Engel, Mark E

    2018-02-10

    Rheumatic heart disease (RHD) is a preventable and treatable chronic condition which persists in many developing countries largely affecting impoverished populations. Handheld echocardiography presents an opportunity to address the need for more cost-effective methods of diagnosing RHD in developing countries, where the disease continues to carry high rates of morbidity and mortality. Preliminary studies have demonstrated moderate sensitivity as well as high specificity and diagnostic odds for detecting RHD in asymptomatic patients. We describe a protocol for a systematic review on the diagnostic performance of handheld echocardiography compared to standard echocardiography using the 2012 World Heart Federation criteria for diagnosing subclinical RHD. Electronic databases including PubMed, Scopus, Web of Science and EBSCOhost as well as reference lists and citations of relevant articles will be searched from 2012 to date using a predefined strategy incorporating a combination of Medical Subject Heading terms and keywords. The methodological validity and quality of studies deemed eligible for inclusion will be assessed against review specific Quality Assessment of Diagnostic Accuracy Studies 2 criteria and information on metrics of diagnostic accuracy and demographics extracted. Forest plots of sensitivity and specificity as well as scatter plots in receiver operating characteristic (ROC) space will be used to investigate heterogeneity. If possible, a meta-analysis will be conducted to produce summary results of sensitivity and specificity using the Hierarchical Summary ROC method. In addition, a sensitivity analysis will be conducted to investigate the effect of studies with a high risk of bias. Ethics approval is not required for this systematic review of previously published literature. The planned review will provide a summary of the diagnostic accuracy of handheld echocardiography. 
Results may feed into evidence-based guidelines and should the findings of this
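    The accuracy metrics the review will pool (sensitivity, specificity, diagnostic odds) all derive from the 2x2 table of the index test (handheld echo) against the reference standard (standard echo). A minimal sketch with made-up counts, not data from any included study:

```python
def diagnostic_accuracy(tp: int, fp: int, fn: int, tn: int):
    """Sensitivity, specificity and diagnostic odds ratio from the 2x2 table
    of an index test against a reference standard."""
    sens = tp / (tp + fn)          # true positives among all diseased
    spec = tn / (tn + fp)          # true negatives among all non-diseased
    dor = (tp * tn) / (fp * fn) if fp and fn else float("inf")
    return sens, spec, dor

# Made-up counts for illustration
print(diagnostic_accuracy(tp=80, fp=10, fn=20, tn=90))  # -> (0.8, 0.9, 36.0)
```

    Each included study contributes one such (sensitivity, specificity) pair, which is what the planned forest plots and HSROC meta-analysis then summarize.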

  7. Standardization of a protocol to obtain genomic DNA for the quantification of 5mC in epicormics buds of Tectona grandis L.

    OpenAIRE

    Elisa Quiala; Luis Valledor; Rodrigo Hazbun; Raúl Barbón; Manuel de Feria; Maité Chávez

    2008-01-01

    The present investigation was carried out with the objective of defining an extraction and purification method that provided deoxyribonucleic acid (DNA) suitable for determining the percentage of 5mC in the genomic DNA of epicormic buds of Tectona grandis L. During the standardization of the protocol four methods were compared: 1 - classic, based on saline shock solution with CTAB (hexadecyltrimethylammonium bromide), 2 - DNA extraction kit for plants DNeasy Plant Mini Kit (QIAGEN) accor...

  8. Effective reduction of fluoroscopy duration by using an advanced electroanatomic-mapping system and a standardized procedural protocol for ablation of atrial fibrillation: 'the unleaded study'.

    Science.gov (United States)

    Knecht, Sven; Sticherling, Christian; Reichlin, Tobias; Pavlovic, Nikola; Mühl, Aline; Schaer, Beat; Osswald, Stefan; Kühne, Michael

    2015-11-01

    It is recommended to keep exposure to ionizing radiation as low as reasonably achievable. The aim of this study was to determine whether fluoroscopy-free mapping and ablation using a standardized procedural protocol is feasible in patients undergoing pulmonary vein isolation (PVI). Sixty consecutive patients were analysed: thirty consecutive patients undergoing PVI using Carto3 were treated using a standardized procedural fluoroscopy protocol with X-ray being disabled after transseptal puncture (Group 1) and compared with a set of 30 previous consecutive patients undergoing PVI without a specific recommendation regarding the use of fluoroscopy (Group 2). The main outcome measures were the feasibility of fluoroscopy-free mapping and ablation, total fluoroscopy time, total dose area product (DAP), and procedure time. Sixty patients (age 60 ± 10 years, 73% male, ejection fraction 0.55 ± 0.09, left atrium 42 ± 8 mm) were included. In Group 1, total fluoroscopy time was 4.2 (2.6-5.6) min, and mapping and ablation during PVI without using fluoroscopy was feasible in 29 of 30 patients (97%). In Group 2, total fluoroscopy time was 9.3 (6.4-13.9) min (P …). Mapping and ablation without fluoroscopy after transseptal puncture using a standardized procedural protocol is feasible in almost all patients and is associated with markedly decreased total fluoroscopy duration and DAP. Published on behalf of the European Society of Cardiology. All rights reserved. © The Author 2015. For permissions please email: journals.permissions@oup.com.

  9. Analytical performance evaluation of a high-volume hematology laboratory utilizing sigma metrics as standard of excellence.

    Science.gov (United States)

    Shaikh, M S; Moiz, B

    2016-04-01

    Around two-thirds of important clinical decisions about the management of patients are based on laboratory test results. Clinical laboratories are required to adopt quality control (QC) measures to ensure the provision of accurate and precise results. Six sigma is a statistical tool which provides an opportunity to assess performance at the highest level of excellence. The purpose of this study was to assess the performance of our hematological parameters on the sigma scale in order to identify gaps, and hence areas of improvement, in patient care. The twelve analytes included in the study were hemoglobin (Hb), hematocrit (Hct), red blood cell count (RBC), mean corpuscular volume (MCV), red cell distribution width (RDW), total leukocyte count (TLC) with percentages of neutrophils (Neutr%) and lymphocytes (Lymph%), platelet count (Plt), mean platelet volume (MPV), prothrombin time (PT), and fibrinogen (Fbg). Internal quality control data and external quality assurance survey results were utilized for the calculation of sigma metrics for each analyte. An acceptable sigma value of ≥3 was obtained for the majority of the analytes included in the analysis. MCV, Plt, and Fbg performed poorly on both level 1 and level 2 controls, with sigma values of <3. Despite acceptable conventional QC tools, the application of sigma metrics can identify analytical deficits and hence prospects for improvement in clinical laboratories. © 2016 John Wiley & Sons Ltd.
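    The sigma metric behind these judgements is conventionally computed from the allowable total error, the bias (from external quality assurance), and the imprecision (CV, from internal QC). The formula is standard; the example numbers below are illustrative assumptions, not the study's data:

```python
def sigma_metric(tea_pct: float, bias_pct: float, cv_pct: float) -> float:
    """Conventional laboratory six-sigma metric:
    (allowable total error - |bias|) / imprecision (CV), all in percent."""
    if cv_pct <= 0:
        raise ValueError("CV must be positive")
    return (tea_pct - abs(bias_pct)) / cv_pct

# Illustrative numbers only: TEa 7%, bias 1%, CV 1.5%
print(sigma_metric(7.0, 1.0, 1.5))  # -> 4.0
```

    On this scale a value of 6 is world-class and 3 is the conventional minimum for an acceptable process, which is why the study flags analytes below 3 as needing attention.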

  10. Automated analytical standard production with supercritical fluid chromatography for the quantification of bioactive C17-polyacetylenes: a case study on food processing waste.

    Science.gov (United States)

    Bijttebier, Sebastiaan; D'Hondt, Els; Noten, Bart; Hermans, Nina; Apers, Sandra; Exarchou, Vassiliki; Voorspoels, Stefan

    2014-12-15

    Food processing enterprises produce enormous amounts of organic waste that contains valuable phytochemicals (e.g. C17-polyacetylenes). Knowledge of the phytochemical content is a first step towards valorisation. Quantification of C17-polyacetylenes is, however, often hampered by the lack of commercially available standards or by tedious multistep in-house standard production procedures. In the current study, a new and straightforward supercritical fluid chromatography purification procedure is described for the simultaneous production of two analytical C17-polyacetylene standards. Five and 6 mg of falcarinol and falcarindiol, respectively, were purified in 17 h on an analytical scale. After confirming their identity and quality (97% purity) by Nuclear Magnetic Resonance, accurate-mass Mass Spectrometry (am-MS) and Photo Diode Array (PDA) detection, the C17-polyacetylene standards were used for the analysis of industrial vegetable waste with Liquid Chromatography coupled to PDA and am-MS detection. Measurements showed varying concentrations of C17-polyacetylenes in the organic waste depending on its nature and origin. Copyright © 2014 Elsevier Ltd. All rights reserved.

  11. Effectiveness of individualized physiotherapy on pain and functioning compared to a standard exercise protocol in patients presenting with clinical signs of subacromial impingement syndrome. A randomized controlled trial.

    Science.gov (United States)

    Kromer, Thilo O; de Bie, Rob A; Bastiaenen, Caroline H G

    2010-06-09

    Shoulder impingement syndrome is a common musculoskeletal complaint leading to significant reduction of health and disability. Physiotherapy is often the first choice of treatment although its effectiveness is still under debate. Systematic reviews in this field highlight the need for more high quality trials to investigate the effectiveness of physiotherapy interventions in patients with subacromial impingement syndrome. This randomized controlled trial will investigate the effectiveness of individualized physiotherapy in patients presenting with clinical signs and symptoms of subacromial impingement, involving 90 participants aged 18-75. Participants are recruited from outpatient physiotherapy clinics, general practitioners, and orthopaedic surgeons in Germany. Eligible participants will be randomly allocated to either individualized physiotherapy or to a standard exercise protocol using central randomization. The control group will perform the standard exercise protocol aiming to restore muscular deficits in strength, mobility, and coordination of the rotator cuff and the shoulder girdle muscles to unload the subacromial space during active movements. Participants of the intervention group will perform the standard exercise protocol as a home program, and will additionally be treated with individualized physiotherapy based on clinical examination results, and guided by a decision tree. After the intervention phase both groups will continue their home program for another 7 weeks. Outcome will be measured at 5 weeks and at 3 and 12 months after inclusion using the shoulder pain and disability index and patients' global impression of change, the generic patient-specific scale, the average weekly pain score, and patient satisfaction with treatment. Additionally, the fear avoidance beliefs questionnaire, the pain catastrophizing scale, and patients' expectancies of treatment effect are assessed. Participants' adherence to the protocol, use of additional treatments

  12. Effectiveness of individualized physiotherapy on pain and functioning compared to a standard exercise protocol in patients presenting with clinical signs of subacromial impingement syndrome. A randomized controlled trial

    Directory of Open Access Journals (Sweden)

    de Bie Rob A

    2010-06-01

    Full Text Available Abstract Background Shoulder impingement syndrome is a common musculoskeletal complaint leading to significant reduction of health and disability. Physiotherapy is often the first choice of treatment although its effectiveness is still under debate. Systematic reviews in this field highlight the need for more high quality trials to investigate the effectiveness of physiotherapy interventions in patients with subacromial impingement syndrome. Methods/Design This randomized controlled trial will investigate the effectiveness of individualized physiotherapy in patients presenting with clinical signs and symptoms of subacromial impingement, involving 90 participants aged 18-75. Participants are recruited from outpatient physiotherapy clinics, general practitioners, and orthopaedic surgeons in Germany. Eligible participants will be randomly allocated to either individualized physiotherapy or to a standard exercise protocol using central randomization. The control group will perform the standard exercise protocol aiming to restore muscular deficits in strength, mobility, and coordination of the rotator cuff and the shoulder girdle muscles to unload the subacromial space during active movements. Participants of the intervention group will perform the standard exercise protocol as a home program, and will additionally be treated with individualized physiotherapy based on clinical examination results, and guided by a decision tree. After the intervention phase both groups will continue their home program for another 7 weeks. Outcome will be measured at 5 weeks and at 3 and 12 months after inclusion using the shoulder pain and disability index and patients' global impression of change, the generic patient-specific scale, the average weekly pain score, and patient satisfaction with treatment. Additionally, the fear avoidance beliefs questionnaire, the pain catastrophizing scale, and patients' expectancies of treatment effect are assessed. Participants

  13. Intra-rater and inter-rater reliability of the standardized ultrasound protocol for assessing subacromial structures

    DEFF Research Database (Denmark)

    Hougs Kjær, Birgitte; Ellegaard, Karen; Wieland, Ina

    2017-01-01

    BACKGROUND: US-examinations related to shoulder impingement (SI) often vary due to methodological differences, examiner positions, transducers, and recording parameters. Reliable US protocols for examination of different structures related to shoulder impingement are therefore needed. OBJECTIVES...... of the supraspinatus tendon (SUPRA) and subacromial subdeltoid (SASD) bursa in two imaging positions, and the acromial humeral distance (AHD) in one position. Additionally, agreement on dynamic impingement (DI) examination was performed. The intra- and inter-rater reliability was carried out on the same day...

  14. Effects of standard training in the use of closed-circuit televisions in visually impaired adults: design of a training protocol and a randomized controlled trial

    Science.gov (United States)

    2010-01-01

    Background Reading problems are frequently reported by visually impaired persons. A closed-circuit television (CCTV) can be helpful to maintain reading ability; however, it is difficult to learn how to use this device. In the Netherlands, an evidence-based rehabilitation program in the use of CCTVs was lacking. Therefore, a standard training protocol needed to be developed and tested in a randomized controlled trial (RCT) to provide an evidence-based training program in the use of this device. Methods/Design To develop a standard training program, information was collected by studying literature, observing training in the use of CCTVs, discussing the content of the training program with professionals and organizing focus and discussion groups. The effectiveness of the program was evaluated in an RCT, to obtain an evidence-based training program. Dutch patients (n = 122) were randomized into a treatment group: normal instructions from the supplier combined with training in the use of CCTVs, or into a control group: instructions from the supplier only. The effect of the training program was evaluated in terms of: change in reading ability (reading speed and reading comprehension), patients' skills to operate the CCTV, perceived (vision-related) quality of life and tasks performed in daily living. Discussion The development of the CCTV training protocol and the design of the RCT in the present study may serve as an example to obtain an evidence-based training program. The training program was adjusted to the needs and learning abilities of individual patients; however, for scientific reasons it might have been preferable to standardize the protocol further, in order to gain more comparable results. Trial registration http://www.trialregister.nl, identifier: NTR1031 PMID:20219120

  15. Analytical quadrics

    CERN Document Server

    Spain, Barry; Ulam, S; Stark, M

    1960-01-01

    Analytical Quadrics focuses on the analytical geometry of three dimensions. The book first discusses the theory of the plane, sphere, cone, cylinder, straight line, and central quadrics in their standard forms. The idea of the plane at infinity is introduced through the homogenous Cartesian coordinates and applied to the nature of the intersection of three planes and to the circular sections of quadrics. The text also focuses on paraboloid, including polar properties, center of a section, axes of plane section, and generators of hyperbolic paraboloid. The book also touches on homogenous coordi

  16. Reference standard space hippocampus labels according to the European Alzheimer's Disease Consortium-Alzheimer's Disease Neuroimaging Initiative harmonized protocol: Utility in automated volumetry.

    Science.gov (United States)

    Wolf, Dominik; Bocchetta, Martina; Preboske, Gregory M; Boccardi, Marina; Grothe, Michel J

    2017-08-01

    A harmonized protocol (HarP) for manual hippocampal segmentation on magnetic resonance imaging (MRI) has recently been developed by an international European Alzheimer's Disease Consortium-Alzheimer's Disease Neuroimaging Initiative project. We aimed at providing consensual certified HarP hippocampal labels in Montreal Neurological Institute (MNI) standard space to serve as reference in automated image analyses. Manual HarP tracings on the high-resolution MNI152 standard space template of four expert certified HarP tracers were combined to obtain consensual bilateral hippocampus labels. Utility and validity of these reference labels is demonstrated in a simple atlas-based morphometry approach for automated calculation of HarP-compliant hippocampal volumes within SPM software. Individual tracings showed very high agreement among the four expert tracers (pairwise Jaccard indices 0.82-0.87). Automatically calculated hippocampal volumes were highly correlated (r = 0.89/0.91 for left/right) with gold standard volumes in the HarP benchmark data set (N = 135 MRIs), with a mean volume difference of 9% (standard deviation 7%). The consensual HarP hippocampus labels in the MNI152 template can serve as a reference standard for automated image analyses involving MNI standard space normalization. Copyright © 2017 the Alzheimer's Association. Published by Elsevier Inc. All rights reserved.
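
The pairwise Jaccard indices quoted above (0.82-0.87) quantify voxel overlap between two raters' binary labels, |A ∩ B| / |A ∪ B|. A minimal sketch on toy one-dimensional masks (the rater masks below are invented for illustration):

```python
import numpy as np

def jaccard_index(mask_a, mask_b):
    """Jaccard index |A intersect B| / |A union B| between two binary masks."""
    a = np.asarray(mask_a, dtype=bool)
    b = np.asarray(mask_b, dtype=bool)
    union = np.logical_or(a, b).sum()
    if union == 0:
        return 1.0  # both masks empty: define agreement as perfect
    return np.logical_and(a, b).sum() / union

# Toy "tracings" by two raters over the same voxel grid
rater1 = [0, 1, 1, 1, 0, 0]
rater2 = [0, 1, 1, 0, 0, 0]
print(jaccard_index(rater1, rater2))  # 2 voxels overlap out of 3 labeled
```

The same computation applies unchanged to 3-D MRI label volumes, since the masks are flattened by the logical operations.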

  17. Standardization of a Protocol for Obtaining Platelet Rich Plasma from blood Donors; a Tool for Tissue Regeneration Procedures.

    Science.gov (United States)

    Gómez, Lina Andrea; Escobar, Magally; Peñuela, Oscar

    2015-01-01

    To develop a protocol for obtaining autologous platelet rich plasma in healthy individuals and to determine the concentration of five major growth factors before platelet activation. This protocol could be integrated into the guidelines of good clinical practice and research in regenerative medicine. Platelet rich plasma was isolated by centrifugation from 38 healthy men and 42 women ranging from 18 to 59 years old. The platelet count and quantification of growth factors were analyzed in eighty samples, stratified for age and gender of the donor. Analyses were performed using the parametric t-test or Pearson's analysis for non-parametric distributions. The protocol increased platelet counts from 1.6 to 4.9 times (mean = 2.8). There was no correlation between platelet concentration and the level of the following growth factors: VEGF-D (r = 0.009, p = 0.4105), VEGF-A (r = 0.0068, p = 0.953), PDGF subunit AA (p = 0.3618; r = 0.1047), PDGF-BB (p = 0.5936; r = 0.6095). In the same way, there was no correlation between donor gender and growth factor concentrations. Only TGF-β concentration was correlated to platelet concentration (r = 0.3163, p = 0.0175). The procedure used allowed us to make preparations rich in platelets, low in leukocytes and red blood cells, and sterile. Our results showed biological variations in content of growth factors in PRP. The factors influencing these results should be further studied.
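
The r and p values above are Pearson product-moment correlations between platelet counts and growth-factor levels. A self-contained sketch of the coefficient itself (the platelet and TGF-β values below are invented for illustration, not the study's data):

```python
import math

def pearson_r(x, y):
    """Pearson correlation coefficient between two equal-length samples."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    sxy = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sxx = sum((a - mx) ** 2 for a in x)
    syy = sum((b - my) ** 2 for b in y)
    return sxy / math.sqrt(sxx * syy)

# Hypothetical platelet counts (x10^3/uL) vs. TGF-beta levels (ng/mL)
platelets = [450, 620, 800, 950, 1100]
tgf_beta = [38, 41, 55, 60, 71]
print(round(pearson_r(platelets, tgf_beta), 3))
```

In practice the associated p-value would come from a t-distribution with n - 2 degrees of freedom, which a statistics library handles directly.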

  18. Toward an international standard for PCR-based detection of food-borne thermotolerant Campylobacters: Assay development and analytical validation

    DEFF Research Database (Denmark)

    Lübeck, Peter Stephensen; Wolffs, P.; On, Stephen L.W.

    2003-01-01

    As part of a European research project (FOOD-PCR), we developed a standardized and robust PCR detection assay specific for the three most frequently reported food-borne pathogenic Campylobacter species, C. jejuni, C. coli, and C. lari. Fifteen published and unpublished PCR primers targeting the 16S rRNA gene were tested in all possible pairwise combinations, as well as two published primers targeting the 23S rRNA gene. The inclusivity and exclusivity were 100 and 97%, respectively. In an attempt to find a thermostable DNA polymerase more resistant than Taq to PCR inhibitors present in chicken samples, three DNA polymerases were evaluated. The DNA polymerase Tth was not inhibited at a concentration of 2% (vol/vol) chicken carcass rinse, unlike both Taq DNA polymerase and DyNAzyme. Based on these results, Tth was selected as the most suitable enzyme for the assay. The standardized PCR test described shows potential for use in large-scale screening programs for food-borne Campylobacter species under the assay conditions.

  19. Toward an International Standard for PCR-Based Detection of Food-Borne Thermotolerant Campylobacters: Assay Development and Analytical Validation

    OpenAIRE

    Lübeck, P. S.; Wolffs, P.; On, S. L. W.; Ahrens, P.; Rådström, P.; Hoorfar, J.

    2003-01-01

    As part of a European research project (FOOD-PCR), we developed a standardized and robust PCR detection assay specific for the three most frequently reported food-borne pathogenic Campylobacter species, C. jejuni, C. coli, and C. lari. Fifteen published and unpublished PCR primers targeting the 16S rRNA gene were tested in all possible pairwise combinations, as well as two published primers targeting the 23S rRNA gene. A panel of 150 strains including target an...

  20. The modified 2VO ischemia protocol causes cognitive impairment similar to that induced by the standard method, but with a better survival rate

    Directory of Open Access Journals (Sweden)

    F. Cechetti

    2010-12-01

    Full Text Available Permanent bilateral occlusion of the common carotid arteries (2VO) in the rat has been established as a valid experimental model to investigate the effects of chronic cerebral hypoperfusion on cognitive function and neurodegenerative processes. Our aim was to compare the cognitive and morphological outcomes following the standard 2VO procedure, in which there is concomitant artery ligation, with those of a modified protocol, with a 1-week interval between artery occlusions to avoid an abrupt reduction of cerebral blood flow, as assessed by animal performance in the water maze and damage extension to the hippocampus and striatum. Male Wistar rats (N = 47) aged 3 months were subjected to chronic hypoperfusion by permanent bilateral ligation of the common carotid arteries using either the standard or the modified protocol, with the right carotid being the first to be occluded. Three months after the surgical procedure, rat performance in the water maze was assessed to investigate long-term effects on spatial learning and memory and their brains were processed in order to estimate hippocampal volume and striatal area. Both groups of hypoperfused rats showed deficits in reference (F(8,172) = 7.0951, P < 0.00001) and working spatial memory [2nd (F(2,44) = 7.6884, P < 0.001), 3rd (F(2,44) = 21.481, P < 0.00001) and 4th trials (F(2,44) = 28.620, P < 0.0001)]; however, no evidence of tissue atrophy was found in the brain structures studied. Despite similar behavioral and morphological outcomes, the rats submitted to the modified protocol showed a significant increase in survival rate during the 3 months of the experiment (P < 0.02).

  1. DXA in the assessment of subchondral bone mineral density in knee osteoarthritis--A semi-standardized protocol after systematic review.

    Science.gov (United States)

    Sepriano, Alexandre; Roman-Blas, Jorge A; Little, Robert D; Pimentel-Santos, Fernando; Arribas, Jose María; Largo, Raquel; Branco, Jaime C; Herrero-Beaumont, Gabriel

    2015-12-01

    Subchondral bone mineral density (sBMD) contributes to the initiation and progression of knee osteoarthritis (OA). Reliable methods to assess sBMD status may predict the response of specific OA phenotypes to targeted therapies. While dual-energy X-ray absorptiometry (DXA) of the knee can determine sBMD, no consensus exists regarding its methodology. Construct a semi-standardized protocol for knee DXA to measure sBMD in patients with OA of the knee by evaluating the varying methodologies present in existing literature. We performed a systematic review of original papers published in PubMed and Web of Science from their inception to July 2014 using the following search terms: subchondral bone, osteoarthritis, and bone mineral density. DXA of the knee can be performed with similar reproducibility values to those proposed by the International Society for Clinical Densitometry for the hip and spine. We identified acquisition view, hip rotation, knee positioning and stabilization, ROI location and definition, and the type of analysis software as important sources of variation. A proposed knee DXA protocol was constructed taking into consideration the results of the review. DXA of the knee can be reliably performed in patients with knee OA. Nevertheless, we found substantial methodological variation across previous studies. Methodological standardization may provide a foundation from which to establish DXA of the knee as a valid tool for identification of SB changes and as an outcome measure in clinical trials of disease modifying osteoarthritic drugs. Copyright © 2015 Elsevier Inc. All rights reserved.

  2. Standardizing serum 25-hydroxyvitamin D data from four Nordic population samples using the Vitamin D Standardization Program protocols: Shedding new light on vitamin D status in Nordic individuals

    DEFF Research Database (Denmark)

    Cashman, Kevin D; Dowling, Kirsten G; Škrabáková, Zuzana

    2015-01-01

    Knowledge about the distributions of serum 25-hydroxyvitamin D (25(OH)D) concentrations in representative population samples is critical for the quantification of vitamin D deficiency as well as for setting dietary reference values and food-based strategies for its prevention. Such data for the European Union are of variable quality, making it difficult to estimate the prevalence of vitamin D deficiency across member states. As a consequence of the widespread, method-related differences in measurements of serum 25(OH)D concentrations, the Vitamin D Standardization Program (VDSP) developed protocols for standardization. In conclusion, standardization of serum 25(OH)D concentrations is absolutely necessary in order to compare serum 25(OH)D concentrations across different study populations, which is needed to quantify and prevent vitamin D deficiency.

  3. Analytical chemistry

    International Nuclear Information System (INIS)

    Anon.

    1985-01-01

    The division for Analytical Chemistry continued its efforts to develop an accurate method for the separation of trace amounts from mixtures which contain various other elements. Ion exchange chromatography is of special importance in this regard. New separation techniques were tried on certain trace amounts in South African standard rock materials and special ceramics. Methods were also tested for the separation of carrier-free radioisotopes from irradiated cyclotron discs.

  4. Towards an international standard for PCR-based detection of food-borne thermotolerant Campylobacters: assay development and analytical validation.

    OpenAIRE

    Lübeck, P S; Wolffs, Petra; On, S L; Ahrens, P; Rådström, Peter; Hoorfar, J

    2003-01-01

    As part of a European research project (FOOD-PCR), we developed a standardized and robust PCR detection assay specific for the three most frequently reported food-borne pathogenic Campylobacter species, C. jejuni, C. coli, and C. lari. Fifteen published and unpublished PCR primers targeting the 16S rRNA gene were tested in all possible pairwise combinations, as well as two published primers targeting the 23S rRNA gene. A panel of 150 strains including target and nontarget strains was used in ...

  5. Individual versus standard dose of rFSH in a mild stimulation protocol for intrauterine insemination: a randomized study

    DEFF Research Database (Denmark)

    la Cour Freiesleben, N; Lossl, K; Bogstad, J

    2009-01-01

    recombinant follicle-stimulating hormone (rFSH) dosage nomogram. The nomogram has now been tested. METHODS: Multicentre randomized controlled trial (RCT) including 228 ovulatory patients scheduled for COS and IUI. Patients were randomized to 'individual' (50-100 IU rFSH/day, n = 113) or 'standard' (75 IU rFSH/day, n = 115) dose......' group and 21/115 (18%) in the 'standard' group and the rate of multiple gestations was 1/113 (1%) versus 5/115 (4%), P = 0.21. CONCLUSIONS: This RCT is the first to clinically test a dosage nomogram in ovulatory IUI patients' first rFSH treatment cycle. Dosing according to the nomogram was superior...

  6. Rapid Quantification of Melamine in Different Brands/Types of Milk Powders Using Standard Addition Net Analyte Signal and Near-Infrared Spectroscopy

    Directory of Open Access Journals (Sweden)

    Bang-Cheng Tang

    2016-01-01

    Full Text Available Multivariate calibration (MVC) and near-infrared (NIR) spectroscopy have demonstrated potential for rapid analysis of melamine in various dairy products. However, the practical application of ordinary MVC can be largely restricted because the prediction of a new sample from an uncalibrated batch would be subject to a significant bias due to matrix effects. In this study, the feasibility of using NIR spectroscopy and the standard addition (SA) net analyte signal (NAS) method (SANAS) for rapid quantification of melamine in different brands/types of milk powders was investigated. In SANAS, the NAS vector of melamine in an unknown sample as well as in a series of samples spiked with melamine standards was calculated, and then the Euclidean norms of the series standards were used to build a straightforward univariate regression model. The analysis results of 10 different brands/types of milk powders with melamine levels 0~0.12% (w/w) indicate that SANAS obtained accurate results, with root mean squared error of prediction (RMSEP) values ranging from 0.0012 to 0.0029. An additional advantage of NAS is the ability to visualize and control possible unwanted variations during standard addition. The proposed method will provide a practically useful tool for rapid and nondestructive quantification of melamine in different brands/types of milk powders.
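
The standard-addition idea underlying SANAS can be illustrated in its simplest univariate form: regress the measured signal against the spiked amounts and extrapolate to recover the unknown concentration. The numbers below are synthetic, and the full NAS computation on spectra is omitted:

```python
import numpy as np

# Standard-addition sketch: a signal is measured on the sample alone and after
# spiking known melamine amounts; the unknown concentration is estimated as
# intercept/slope of the fitted line (the line's x-intercept magnitude).
added = np.array([0.00, 0.02, 0.04, 0.06])   # % (w/w) melamine added (synthetic)
signal = np.array([0.50, 0.70, 0.90, 1.10])  # e.g. NAS norm of each spectrum

slope, intercept = np.polyfit(added, signal, 1)
estimated = intercept / slope
print(f"estimated melamine: {estimated:.3f} % (w/w)")
```

Because each spiked measurement shares the sample's matrix, the matrix effect cancels out of the slope, which is the usual motivation for standard addition over an external calibration curve.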

  7. Randomised social-skills training and parental training plus standard treatment versus standard treatment of children with attention deficit hyperactivity disorder - The SOSTRA trial protocol

    Directory of Open Access Journals (Sweden)

    Thomsen Per

    2011-01-01

    Full Text Available Abstract Background Children with attention deficit hyperactivity disorder (ADHD) are hyperactive and impulsive, cannot maintain attention, and have difficulties with social interactions. Medical treatment may alleviate symptoms of ADHD, but seldom solves difficulties with social interactions. Social-skills training may benefit ADHD children in their social interactions. We want to examine the effects of social-skills training on difficulties related to the children's ADHD symptoms and social interactions. Methods/Design The design is a randomised two-armed, parallel group, assessor-blinded trial. Children aged 8-12 years with a diagnosis of ADHD are randomised to social-skills training and parental training plus standard treatment versus standard treatment alone. A sample size calculation estimated that at least 52 children must be included to show a 4-point difference in the primary outcome on the Conners 3rd Edition subscale for 'hyperactivity-impulsivity' between the intervention group and the control group. The outcomes will be assessed 3 and 6 months after randomisation. The primary outcome measure is ADHD symptoms. The secondary outcome is social skills. Tertiary outcomes include the relationship between social skills and symptoms of ADHD, the ability to form attachment, and parents' ADHD symptoms. Discussion We hope that the results from this trial will show that the social-skills training together with medication may have a greater general effect on ADHD symptoms and social and emotional competencies than medication alone. Trial registration ClinicalTrials (NCT): NCT00937469

  8. Randomised social-skills training and parental training plus standard treatment versus standard treatment of children with attention deficit hyperactivity disorder - the SOSTRA trial protocol.

    Science.gov (United States)

    Storebø, Ole Jakob; Pedersen, Jesper; Skoog, Maria; Thomsen, Per Hove; Winkel, Per; Gluud, Christian; Simonsen, Erik

    2011-01-21

    Children with attention deficit hyperactivity disorder (ADHD) are hyperactive and impulsive, cannot maintain attention, and have difficulties with social interactions. Medical treatment may alleviate symptoms of ADHD, but seldom solves difficulties with social interactions. Social-skills training may benefit ADHD children in their social interactions. We want to examine the effects of social-skills training on difficulties related to the children's ADHD symptoms and social interactions. The design is a randomised two-armed, parallel group, assessor-blinded trial. Children aged 8-12 years with a diagnosis of ADHD are randomised to social-skills training and parental training plus standard treatment versus standard treatment alone. A sample size calculation estimated that at least 52 children must be included to show a 4-point difference in the primary outcome on the Conners 3rd Edition subscale for 'hyperactivity-impulsivity' between the intervention group and the control group. The outcomes will be assessed 3 and 6 months after randomisation. The primary outcome measure is ADHD symptoms. The secondary outcome is social skills. Tertiary outcomes include the relationship between social skills and symptoms of ADHD, the ability to form attachment, and parents' ADHD symptoms. We hope that the results from this trial will show that the social-skills training together with medication may have a greater general effect on ADHD symptoms and social and emotional competencies than medication alone. ClinicalTrials (NCT): NCT00937469.
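
The sample-size statement above (at least 52 children, roughly 26 per arm, to detect a 4-point difference) is consistent with the usual two-sample normal approximation. A sketch, with a standard deviation of 5 points assumed purely for illustration (the trial's actual SD assumption is not stated here):

```python
from statistics import NormalDist
import math

def n_per_group(delta, sd, alpha=0.05, power=0.80):
    """Normal-approximation sample size per arm for a two-sample comparison:
    n = 2 * (z_{1-alpha/2} + z_{power})^2 * (sd / delta)^2."""
    z_a = NormalDist().inv_cdf(1 - alpha / 2)  # about 1.96 for alpha = 0.05
    z_b = NormalDist().inv_cdf(power)          # about 0.84 for 80% power
    return math.ceil(2 * (z_a + z_b) ** 2 * (sd / delta) ** 2)

# 4-point difference on the Conners subscale; SD of 5 is an assumed value
print(n_per_group(delta=4, sd=5))
```

With these assumed inputs the formula gives about 25 children per arm before any allowance for dropout, in the neighbourhood of the reported total of 52.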

  9. Analysis of the Implementation of Standardized Clinical Protocol «Diabetes Mellitus Type 2» by Quality Indicators in Institutions of Kyiv Region

    Directory of Open Access Journals (Sweden)

    V.I. Tkachenko

    2014-10-01

    Full Text Available In Ukraine, a standardized clinical protocol (SCP) for the provision of medical care in diabetes mellitus type 2 (order of the Ministry of Healthcare of Ukraine dated 21.12.2012 № 1118), which identifies 4 quality indicators, is being implemented. The objective of the research was to analyze the implementation of the SCP based on monitoring of quality indicators in the institutions of the Kyiv region. Materials and Methods. A technique for assessing the quality of diabetes care, one element of which is the monitoring of the quality indicators specified in the SCP, has been developed and applied. Collection and analysis of information was carried out using forms of primary records № 025/030 and 030/o and forms of statistical reporting № 12 and 20. Statistical analysis was performed using Excel 2007 and SPSS. Results. Today, primary health care institutions in the Kyiv region have developed local protocols, which confirms the implementation of the first quality indicator in accordance with the desired indicator value in the SCP. The second indicator, the percentage of patients whose glycated hemoglobin level was measured in the reporting period, amounted to 12.2 %, which is higher than in 2012 (8.84 %) but remains low. The third quality indicator, the percentage of patients who were admitted to hospital for diabetes mellitus and its complications during the reporting period, amounted to 15.01 %, while in 2012 it stood at 8.66 %. For comparison, this figure in 2007 was 9.37 %. Conclusions. The quality of care at this early stage of implementation is not sufficient, partly due to physicians' limited awareness of the major provisions of the protocol, lack of equipment, the need for patients to pay for medical services specified in the protocol, physicians' limited understanding of the characteristics of different types of medical and technological documents, and difficulties in the development and implementation of local protocols in particular. The obtained results are

  10. Performance Comparison of Wireless Sensor Network Standard Protocols in an Aerospace Environment: ISA100.11a and ZigBee

    Science.gov (United States)

    Wagner, Raymond S.; Barton, Richard J.

    2011-01-01

    Wireless Sensor Networks (WSNs) can provide a substantial benefit in spacecraft systems, reducing launch weight and providing unprecedented flexibility by allowing instrumentation capabilities to grow and change over time. Achieving data transport reliability on par with that of wired systems, however, can prove extremely challenging in practice. Fortunately, much progress has been made in developing standard WSN radio protocols for applications from non-critical home automation to mission-critical industrial process control. The relative performances of candidate protocols must be compared in representative aerospace environments, however, to determine their suitability for spaceflight applications. In this paper, we will present the results of a rigorous laboratory analysis of the performance of two standards-based, low power, low data rate WSN protocols: ZigBee Pro and ISA100.11a. Both are based on IEEE 802.15.4 and augment that standard's specifications to build complete, multi-hop networking stacks. ZigBee Pro targets primarily the home and office automation markets, providing an ad-hoc protocol that is computationally lightweight and easy to implement in inexpensive system-on-a-chip components. As a result of this simplicity, however, ZigBee Pro can be susceptible to radio frequency (RF) interference. ISA100.11a, on the other hand, targets the industrial process control market, providing a robust, centrally-managed protocol capable of tolerating a significant amount of RF interference. To achieve these gains, a coordinated channel hopping mechanism is employed, which entails a greater computational complexity than ZigBee and requires more sophisticated and costly hardware. To guide future aerospace deployments, we must understand how well these standards relatively perform in analog environments under expected operating conditions. Specifically, we are interested in evaluating goodput (application-level throughput) in a representative crewed environment
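
Goodput as defined above counts only application payload bytes that actually arrive, excluding protocol headers and retransmissions. A minimal sketch of the metric (the packet counts and sizes below are invented for illustration):

```python
def goodput_kbps(payload_bytes_delivered, elapsed_s):
    """Application-level throughput in kilobits per second: only payload
    bytes that reached the sink count, not headers or retransmissions."""
    return payload_bytes_delivered * 8 / elapsed_s / 1000.0

# Hypothetical run: 90 of 100 packets with 64-byte payloads delivered in 60 s
delivered = 90 * 64
print(round(goodput_kbps(delivered, 60.0), 2))  # kilobits per second
```

Comparing this figure against the raw channel rate (250 kbps at the IEEE 802.15.4 physical layer) shows how much capacity the networking stack and the RF environment consume.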

  11. Role of sediment-trace element chemistry in water-quality monitoring and the need for standard analytical methods

    Science.gov (United States)

    Horowitz, Arthur J.

    1991-01-01

    Multiple linear regression models calculated from readily obtainable chemical and physical parameters can explain a high percentage (70% or greater) of observed sediment trace-element variance for Cu, Zn, Pb, Cr, Ni, Co, As, Sb, Se, and Hg. Almost all the factors used in the various models fall into the category of operational definitions (e.g., grain size, surface area, and geochemical substrates such as amorphous iron and manganese oxides). Thus, the concentrations and distributions used in the various models are operationally defined, and are subject to substantial change depending on the method used to determine them. Without standardized procedures, data from different sources are not comparable, and the utility and applicability of the various models would be questionable.
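
Models of the kind described (explaining 70% or more of trace-element variance from operationally defined parameters) are ordinary least-squares multiple regressions. A sketch with synthetic data, reporting the fraction of variance explained; the predictors, coefficients, and noise level are invented, not taken from the paper:

```python
import numpy as np

# Least-squares fit of a trace-element concentration on operationally defined
# predictors (e.g. a grain-size fraction and an Fe-oxide content); synthetic.
rng = np.random.default_rng(0)
X = rng.random((30, 2))                     # two predictors, 30 samples
y = 3.0 * X[:, 0] + 1.5 * X[:, 1] + rng.normal(0, 0.1, 30)

A = np.column_stack([np.ones(len(X)), X])   # add an intercept column
coef, *_ = np.linalg.lstsq(A, y, rcond=None)

resid = y - A @ coef
r2 = 1 - resid.var() / y.var()              # fraction of variance explained
print(round(r2, 3))
```

The paper's central caveat applies directly here: if the predictors are measured by non-standardized procedures, the fitted coefficients are tied to those procedures and are not transferable between laboratories.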

  12. Development and validation of an internationally-standardized, high-resolution capillary gel-based electrophoresis PCR-ribotyping protocol for Clostridium difficile.

    Directory of Open Access Journals (Sweden)

    Warren N Fawley

    Full Text Available PCR-ribotyping has been adopted in many laboratories as the method of choice for C. difficile typing and surveillance. However, issues with the conventional agarose gel-based technique, including inter-laboratory variation and interpretation of banding patterns, have impeded progress. The method has recently been adapted to incorporate high-resolution capillary gel-based electrophoresis (CE-ribotyping), improving discrimination, accuracy and reproducibility. However, reports to date have all represented single-centre studies, and inter-laboratory variability has not been formally measured or assessed. Here, in a multi-centre setting, we achieved a high level of reproducibility, accuracy and portability with a consensus CE-ribotyping protocol. Local databases were built at four participating laboratories using a distributed set of 70 known PCR-ribotypes. A panel of 50 isolates and 60 electronic profiles (blinded and randomized) was distributed to each testing centre for PCR-ribotype identification based on local databases generated using the standard set of 70 PCR-ribotypes, and the performance of the consensus protocol was assessed. A maximum standard deviation of only ±3.8 bp was recorded in individual fragment sizes, and PCR-ribotypes from 98.2% of anonymised strains were successfully discriminated across four ribotyping centres spanning Europe and North America (98.8% after analysing discrepancies). Consensus CE-ribotyping increases comparability of typing data between centres and thereby facilitates the rapid and accurate transfer of standardized typing data to support future national and international C. difficile surveillance programs.
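
The ±3.8 bp maximum deviation in fragment sizes suggests a simple tolerance-based matching rule. The sketch below is a hypothetical illustration of such a rule, not the consensus protocol's actual matching algorithm, and the fragment sizes are invented:

```python
def profiles_match(profile_a, profile_b, tol_bp=3.8):
    """Hypothetical matcher: two CE-ribotype profiles (lists of fragment
    sizes in base pairs) agree if they have the same number of fragments and
    every size-ordered pair differs by no more than the tolerance."""
    if len(profile_a) != len(profile_b):
        return False
    return all(abs(a - b) <= tol_bp
               for a, b in zip(sorted(profile_a), sorted(profile_b)))

reference = [250.0, 310.5, 402.2, 498.9]   # invented reference profile (bp)
observed = [251.3, 309.0, 405.1, 500.0]    # invented observed profile (bp)
print(profiles_match(reference, observed))  # all diffs <= 3.8 bp -> True
```

A production matcher would also need to handle missing or extra peaks and score partial matches, which this sketch deliberately leaves out.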

  13. R2SM: a package for the analytic computation of the R₂ Rational terms in the Standard Model of the Electroweak interactions

    Energy Technology Data Exchange (ETDEWEB)

    Garzelli, M.V. [INFN (Italy); Granada Univ. (Spain). Dept. de Fisica Teorica y del Cosmos y CAFPE; Malamos, I. [Radboud Universiteit Nijmegen, Department of Theoretical High Energy Physics, Institute for Mathematics, Astrophysics and Particle Physics, Nijmegen (Netherlands)

    2011-03-15

    The analytical package written in FORM presented in this paper allows the computation of the complete set of Feynman Rules producing the Rational terms of kind R₂ contributing to the virtual part of NLO corrections in the Standard Model of the Electroweak interactions. Building block topologies filled by means of generic scalars, vectors and fermions, allowing one to build these Feynman Rules in terms of specific elementary particles, are explicitly given in the R_ξ gauge class, together with the automatic dressing procedure to obtain the Feynman Rules from them. The results in more specific gauges, like the 't Hooft-Feynman one, follow as particular cases, in both the HV and the FDH dimensional regularization schemes. As a check on our formulas, the gauge independence of the total Rational contribution (R₁+R₂) to renormalized S-matrix elements is verified by considering the specific example of the H → γγ decay process at 1-loop. This package can be of interest for people aiming at a better understanding of the nature of the Rational terms. It is organized in a modular way, allowing further use of some of its files even in different contexts. Furthermore, it can be considered as a first seed in the effort towards a complete automation of the analytical calculation of the R₂ effective vertices, given the Lagrangian of a generic gauge theory of particle interactions. (orig.)

  14. Comparability of the expanded WMS-III standardization protocol to the published WMS-III among right and left temporal lobectomy patients.

    Science.gov (United States)

    Doss, R C; Chelune, G J; Naugle, R I

    2000-11-01

    We examined whether differences between the expanded standardization protocol (SP), used to derive norms for the Wechsler Memory Scale - Third Edition (WMS-III; Wechsler, 1997a), and the final published version (PB) would result in differences on the Primary Indexes in a neurologic sample. Specifically, we examined the comparability of the performances of 63 patients with temporal lobectomy (TL) who were administered either the expanded SP protocol (n = 33: 22 left TL and 11 right TL) or the PB battery (n = 30: 11 left TL and 19 right TL). Patients who were administered the SP or PB were comparable in terms of age, sex, education, seizure duration, postsurgical seizure status, and Full Scale IQ. Postoperative intervals were significantly longer for the SP group, although correlational analyses demonstrated no significant relationship between postoperative follow-up interval and WMS-III performance. A series of t tests revealed no significant differences on any of the eight Primary Index scores between patients taking the two versions of the WMS-III for either left or right TL groups. Furthermore, repeated measures analyses of variance failed to show significant differences on modality-specific memory scores between the SP and PB for the left and right TL groups. The current study indicates that temporal lobectomy patients obtained comparable scores on the two versions of the WMS-III.

  15. Clinical-pathological findings of otitis media and media-interna in calves and (clinical) evaluation of a standardized therapeutic protocol.

    Science.gov (United States)

    Bertone, I; Bellino, C; Alborali, G L; Cagnasso, A; Cagnotti, G; Dappiano, E; Lizzi, M; Miciletta, M; Ramacciotti, A; Gianella, P; D'Angelo, A

    2015-12-03

    The aims of this field trial were to describe the clinical-pathological findings in calves with otitis media (OM) and media-interna (OMI), to evaluate, through the development of a scoring system, the effectiveness of a standardized therapeutic protocol, and to identify the causative pathogens and their possible correlation with concurrent respiratory disease. All animals underwent physical and neurological examinations at three experimental time points: at diagnosis/beginning of treatment (T0), and 1 week (T1) and 2 weeks (T2) after therapy was started. Follow-up telephone interviews with animal owners were conducted 1 month later. The therapeutic protocol consisted of tulathromycin (Draxxin®; Zoetis), oxytetracycline hydrochloride (Terramicina 100®; Zoetis), and carprofen (Rimadyl®; Zoetis). Twenty-two calves were enrolled. Physical and otoscopic examination at T0 revealed unilateral and bilateral otorrhea in 16 and 6 calves, respectively, with peripheral vestibular system involvement in calves presenting with neurological signs (n = 17; 77 %). A significant improvement of clinical and neurological scores was observed in 20 (90 %) calves, with a full recovery in only 1 (5 %). One calf worsened between T0 and T1 and was removed from the study. None of the other animals showed a worsening of clinical condition or required further treatment at the 1-month follow-up. Mycoplasma bovis was isolated from 89 % of the affected ears, either alone or together with P. multocida (n = 5), Streptococcus spp. (n = 1), Staphylococcus spp. (n = 1), and Pseudomonas spp. (n = 1). M. bovis, either alone or together with these bacteria, was also isolated from the upper and/or lower respiratory tract in 19 (86 %) calves. This is the first prospective study to evaluate the effectiveness of a standardized therapeutic protocol for the treatment of OM/OMI in calves. The therapy led to clinical improvement in the majority of the calves. Persistence of mild clinical

  16. [Developing and standardizing experimental protocols using human iPS-derived cells to predict adverse drug reactions in pre-clinical safety studies].

    Science.gov (United States)

    Sekino, Yuko; Sato, Kaoru; Kanda, Yasunari; Ishida, Seiichi

    2013-01-01

    In this study, we have standardized experimental protocols to evaluate the possibility of using cells differentiated from human induced pluripotent stem cells (hiPSCs) in pre-clinical studies for the drug approval process. Cells differentiated from hiPSCs, especially cardiomyocytes, neurons and hepatocytes, are expected to be used as new pharmacological and toxicological assay tools. Current pre-clinical test methods have limitations for predicting clinical adverse drug reactions because of the so-called problem of species difference. Drug-induced arrhythmia, cognitive impairment and hepatotoxicity, which cannot be predicted in pre-clinical studies, are major causes of the high attrition rate of new-drug candidates in clinical studies and of withdrawal of products from the market. The development of new pre-clinical test methods using cells differentiated from hiPSCs would resolve these problems, in addition to addressing the issue of "the replacement, refinement and reduction (3Rs)" of animal experiments. From 2010 to 2011, we surveyed companies belonging to the Japan Pharmaceutical Manufacturers Association (JPMA) and academic researchers about the usage of differentiated cells in their laboratories. We found that studies were performed using differentiated cells from different hiPSC lines with laboratory-specific differentiation methods. The cells were cultured under various conditions and their activities were measured using different methods, resulting in a variety of pharmacological responses. It is therefore impossible to compare reproducibility and ensure reliability of experiments using these cells. To utilize the cells in the drug approval process, we need robust, standardized test methods that can be accurately reproduced in all laboratories; we will then be able to compare and analyze the obtained results. Based on the survey, the Ministry of Health, Labour and Welfare funded our study. In our study, we standardize

  17. Validation of a standard forensic anthropology examination protocol by measurement of applicability and reliability on exhumed and archive samples of known biological attribution.

    Science.gov (United States)

    Francisco, Raffaela Arrabaça; Evison, Martin Paul; Costa Junior, Moacyr Lobo da; Silveira, Teresa Cristina Pantozzi; Secchieri, José Marcelo; Guimarães, Marco Aurelio

    2017-10-01

    Forensic anthropology makes an important contribution to human identification and to assessment of the causes and mechanisms of death and body disposal in criminal and civil investigations, including those related to atrocity, disaster and trafficking victim identification. The methods used are comparative, relying on assignment of questioned material to categories observed in standard reference material of known attribution. Reference collections typically originate in Europe and North America, and are not necessarily representative of contemporary global populations. Methods based on them must be validated when applied to novel populations. This study describes the validation of a standardized forensic anthropology examination protocol by application to two contemporary Brazilian skeletal samples of known attribution. One sample (n=90) was collected from exhumations following 7-35 years of burial, and the second (n=30) was collected following successful routine casework investigations. The study presents measurement of (1) the applicability of each of the methods used and (2) the reliability with which the biographic parameters were assigned in each case. The results are discussed with reference to published assessments of methodological reliability regarding sex, age and, in particular, ancestry estimation. Copyright © 2017 Elsevier B.V. All rights reserved.

  18. Progressing towards more quantitative analytical pyrolysis of soil organic matter using molecular beam mass spectroscopy of whole soils and added standards

    Energy Technology Data Exchange (ETDEWEB)

    Haddix, Michelle L.; Magrini-Bair, Kim; Evans, Robert J.; Conant, Richard T.; Wallenstein, Matthew D.; Morris, Sherri J.; Calderón, Francisco; Paul, Eldor A.

    2016-12-01

    .8 g C kg-1) and low % clay (5.4%) having the least interference and the Colorado soil with low C (14.6 g C kg-1) and a moderate smectite clay content of 14% having the greatest soil interference. Due to soil interference from clay type and content and varying optimum temperatures of pyrolysis for different compounds, it is unlikely that analytical pyrolysis can be quantitative for all types of compounds. Select compound categories such as carbohydrates have the potential to be quantified in soil with analytical pyrolysis because they: (1) were almost fully pyrolyzed, (2) were represented by a limited number of m/z, and (3) had a strong relationship between the amount added and the total ion intensity produced. The three different soils utilized in this study had similar proportions of C pyrolyzed in the whole soil (54-57%) despite differences in %C and %clay between the soils. Mid-infrared spectroscopic analyses of the soil before and after pyrolysis showed that pyrolysis resulted in reductions in the 3400, 2930-2870, 1660 and 1430 cm-1 bands. These bands are primarily representative of O-H and N-H bonds, C-H stretch, and δ(CH2) in polysaccharides/lipids, and are associated with mineralizable SOM. The incorporation of standards into routine analytical pyrolysis allowed us to assess the quantitative potential of py-MBMS along with the effect of the mineral matrix, which we believe is applicable to all forms of analytical pyrolysis.

  19. Communications standards

    CERN Document Server

    Stokes, A V

    1986-01-01

    Communications Standards deals with the standardization of computer communication networks. This book examines the types of local area networks (LANs) that have been developed and looks at some of the relevant protocols in more detail. The work of Project 802 is briefly discussed, along with a protocol which has developed from one of the LAN standards and is now a de facto standard in one particular area, namely the Manufacturing Automation Protocol (MAP). Factors that affect the usage of networks, such as network management and security, are also considered. This book is divided into three se

  20. Incremental yield of dysplasia detection in Barrett's esophagus using volumetric laser endomicroscopy with and without laser marking compared with a standardized random biopsy protocol.

    Science.gov (United States)

    Alshelleh, Mohammad; Inamdar, Sumant; McKinley, Matthew; Stewart, Molly; Novak, Jeffrey S; Greenberg, Ronald E; Sultan, Keith; Devito, Bethany; Cheung, Mary; Cerulli, Maurice A; Miller, Larry S; Sejpal, Divyesh V; Vegesna, Anil K; Trindade, Arvind J

    2018-02-02

    Volumetric laser endomicroscopy (VLE) is a new wide-field advanced imaging technology for Barrett's esophagus (BE). No data exist on incremental yield of dysplasia detection. Our aim is to report the incremental yield of dysplasia detection in BE using VLE. This is a retrospective study from a prospectively maintained database from 2011 to 2017 comparing the dysplasia yield of 4 different surveillance strategies in an academic BE tertiary care referral center. The groups were (1) random biopsies (RB), (2) Seattle protocol random biopsies (SP), (3) VLE without laser marking (VLE), and (4) VLE with laser marking (VLEL). A total of 448 consecutive patients (79 RB, 95 SP, 168 VLE, and 106 VLEL) met the inclusion criteria. After adjusting for visible lesions, the total dysplasia yield was 5.7%, 19.6%, 24.8%, and 33.7%, respectively. When compared with just the SP group, the VLEL group had statistically higher rates of overall dysplasia yield (19.6% vs 33.7%, P = .03; odds ratio, 2.1, P = .03). Both the VLEL and VLE groups had statistically significant differences in neoplasia (high-grade dysplasia and intramucosal cancer) detection compared with the SP group (14% vs 1%, P = .001 and 11% vs 1%, P = .003). A surveillance strategy involving VLEL led to a statistically significant higher yield of dysplasia and neoplasia detection compared with a standard random biopsy protocol. These results support the use of VLEL for surveillance in BE in academic centers. Copyright © 2018 American Society for Gastrointestinal Endoscopy. Published by Elsevier Inc. All rights reserved.
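The odds ratio reported above can be recovered from the published yields alone. A minimal sketch, using only the two proportions quoted in the abstract (the underlying per-group counts are not given there):

```python
def odds_ratio(p1, p2):
    """Odds ratio for two proportions: odds(p1) / odds(p2)."""
    return (p1 / (1 - p1)) / (p2 / (1 - p2))

# Dysplasia yields from the abstract: VLE with laser marking (33.7%)
# versus Seattle protocol random biopsies (19.6%).
or_vlel_vs_sp = odds_ratio(0.337, 0.196)
print(round(or_vlel_vs_sp, 1))  # → 2.1, matching the reported odds ratio
```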

  1. Determination and Standardization of Analytical Conditions for Dissolved Boron in Coastal Waters of East Sea in Korea by ICP-OES

    Science.gov (United States)

    Yoon, H.; Shin, M.; Yoon, C.; Lee, J.

    2005-12-01

    The analysis of metals in seawater has been an important subject for many years. Low detection limits must be achieved and strong matrix effects overcome, especially when the elements of interest are present in various chemical forms. Boron is one of the most widely distributed elements in nature, with concentrations of about 10 ppm in the Earth's crust and about 4.5 ppm in seawater, where it occurs as borates. In seawater, boron concentration exhibits a linear relationship to the amount of chloride ion present. Boron has been considered one of the valuable elements to recover from seawater for commercial use. We recently launched a research team in Korea for the production of commercially usable valuable metals from seawater; several metals, including boron, are already under serious study. In this study we aim to prepare standardized operational procedures for the analysis of boron during a pilot study of boron recovery. Inductively coupled plasma optical emission spectrometry (ICP-OES) is the preferred method for the analysis of the low levels of boron found in environmental samples such as seawater. In order to develop a test method for the determination of dissolved boron in East Sea seawater in Korea, all soluble boron present in the seawater was tested, and the accuracy of measurement was checked from the sampling step onward. The analysis of boron in seawater presents many difficult problems, including ionization interference from the alkali and alkaline earth metals; problems also arise in handling nebulizers and injector tubes with highly saline solutions. The scope of this study was to determine boron in samples containing up to 35 psu of dissolved salt. The work also included comparing various analytical methods to obtain more accurate results under several solution conditions.
    Dilution, standard addition, and matrix-matching calibration methods were thoroughly tested, and detailed operating conditions for using auxiliary
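The standard-addition calibration mentioned above determines the analyte concentration from the x-intercept of a signal-versus-spike regression, which cancels matrix effects such as the saline interference described here. A minimal sketch with hypothetical ICP-OES readings (the study's actual data are not given in the abstract):

```python
import numpy as np

# Hypothetical intensities for a seawater sample spiked with known
# boron additions (ppm). Signal is assumed linear in total boron.
added = np.array([0.0, 2.0, 4.0, 6.0])              # ppm B added
signal = np.array([900.0, 1300.0, 1700.0, 2100.0])  # arbitrary intensity units

slope, intercept = np.polyfit(added, signal, 1)
# Extrapolating to zero signal, the unspiked concentration is the
# magnitude of the x-intercept: intercept / slope.
c_sample = intercept / slope
print(round(c_sample, 2))  # → 4.5, near the typical seawater boron level (ppm)
```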

  2. Establishment of a Standard Analytical Model of Distribution Network with Distributed Generators and Development of Multi Evaluation Method for Network Configuration Candidates

    Science.gov (United States)

    Hayashi, Yasuhiro; Kawasaki, Shoji; Matsuki, Junya; Matsuda, Hiroaki; Sakai, Shigekazu; Miyazaki, Teru; Kobayashi, Naoki

    Since a distribution network has many sectionalizing switches, the number of radial network configuration candidates, determined by the states (open or closed) of those switches, is huge. Recently, the number of distributed generators, such as photovoltaic and wind turbine generation systems, connected to the distribution network has increased drastically. A distribution network with distributed generators must be operated while maintaining reliability of power supply and power quality. Therefore, the many candidate configurations of such a network must be evaluated from multiple viewpoints, such as distribution loss, total harmonic distortion, voltage imbalance and so on. In this paper, the authors propose a multi-evaluation method to assess the distribution network configuration candidates that satisfy voltage and line-current limits from three viewpoints: (1) distribution loss, (2) total harmonic distortion and (3) voltage imbalance. After establishing a standard analytical model of a three-sectionalized, three-connected distribution network configuration with distributed generators based on practical data, the multi-evaluation of the established model is carried out using the proposed method based on EMTP (Electro-Magnetic Transients Program).
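The combinatorial growth of radial candidates described above can be illustrated by brute force: enumerate switch states and keep only configurations whose closed switches form a spanning tree (connected, loop-free). A toy sketch on a hypothetical 4-bus ring feeder, not the paper's actual model:

```python
from itertools import combinations

def is_radial(n_nodes, closed_edges):
    """Radial iff the closed switches form a spanning tree:
    exactly n-1 edges, all nodes connected, no loops (union-find check)."""
    if len(closed_edges) != n_nodes - 1:
        return False
    parent = list(range(n_nodes))
    def find(x):
        while parent[x] != x:
            parent[x] = parent[parent[x]]  # path compression
            x = parent[x]
        return x
    for u, v in closed_edges:
        ru, rv = find(u), find(v)
        if ru == rv:
            return False  # closing this switch would create a loop
        parent[ru] = rv
    return True

# Hypothetical feeder: 4 buses in a ring, one sectionalizing switch per line.
edges = [(0, 1), (1, 2), (2, 3), (3, 0)]
radial = [c for k in range(len(edges) + 1)
          for c in combinations(edges, k) if is_radial(4, c)]
print(len(radial))  # → 4: open any one switch of the ring
```

In a real feeder with dozens of switches the candidate count explodes, which is why each surviving configuration must then be screened against the voltage/current constraints and ranked on loss, distortion and imbalance.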

  3. Web Analytics

    Science.gov (United States)

    EPA’s Web Analytics Program collects, analyzes, and provides reports on traffic, quality assurance, and customer satisfaction metrics for EPA’s website. The program uses a variety of analytics tools, including Google Analytics and CrazyEgg.

  4. The standard treatment protocol for paracetamol poisoning may be inadequate following overdose with modified release formulation: a pharmacokinetic and clinical analysis of 53 cases.

    Science.gov (United States)

    Salmonson, Heléne; Sjöberg, Gunilla; Brogren, Jacob

    2018-01-01

    The use of the standard procedure for managing overdoses with immediate release (IR) paracetamol is questionable when applied to overdoses with modified release (MR) formulations. This study describes the pharmacokinetics of paracetamol and the clinical outcomes following overdoses with an MR formulation. Medical records, including laboratory analyses, concerning overdoses of MR paracetamol from 2009 to 2015 were collected retrospectively. Inclusion criteria were ingestion of a toxic dose, known time of intake and documented measurements of serum paracetamol and liver function tests. Graphical analysis, descriptive statistics and population pharmacokinetic modelling were used to describe the data. Fifty-three cases were identified. Median age was 26 years (range 13-68), median dose was 20 g (range 10-166) and 74% of patients were female. The pharmacokinetic analysis showed a complex, dose-dependent serum concentration versus time profile, with prolonged absorption and delayed serum peak concentrations with increasing dose. Ten patients had persistently high serum levels for 24 h or more; six of them had a second peak 8-19 h after ingestion. Seven of 34 patients receiving N-acetylcysteine (NAC) within 8 h had alanine aminotransferase (ALT) above the reference range, and three of them developed hepatotoxicity (ALT >1000 IU/l). The pharmacokinetic and clinical analysis showed that the standard treatment protocol, including risk assessment and NAC regimen, used for IR paracetamol poisoning does not appear suitable for the MR formulation. Individual, tailored treatment may be valuable, but further studies are warranted to determine the optimal management of overdoses with MR formulations.
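The delayed peaks described above are qualitatively what a one-compartment model predicts when the first-order absorption rate slows: the time of peak concentration is t_max = ln(ka/ke)/(ka − ke). A hedged sketch; the rate constants are illustrative round numbers, not fitted values from this study:

```python
import math

def t_max(ka, ke):
    """Peak time for first-order absorption (ka) and elimination (ke), in h."""
    return math.log(ka / ke) / (ka - ke)

ke = 0.17     # 1/h elimination, roughly a 4 h paracetamol half-life (illustrative)
ka_fast = 1.5 # 1/h, fast absorption as with an immediate-release dose (assumed)
ka_slow = 0.3 # 1/h, prolonged absorption as with a large MR dose (assumed)

print(round(t_max(ka_fast, ke), 1))  # early peak with fast absorption
print(round(t_max(ka_slow, ke), 1))  # markedly later peak when absorption slows
```

The same mechanism explains why a nomogram keyed to a single early sample can misclassify MR overdoses: the concentration may still be rising when an IR case would already be past its peak.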

  5. Low incidence of clonality in cold water corals revealed through the novel use of a standardized protocol adapted to deep sea sampling

    Science.gov (United States)

    Becheler, Ronan; Cassone, Anne-Laure; Noël, Philippe; Mouchel, Olivier; Morrison, Cheryl L.; Arnaud-Haond, Sophie

    2017-11-01

    Sampling in the deep sea is a technical challenge, which has hindered the acquisition of robust datasets that are necessary to determine the fine-grained biological patterns and processes that may shape genetic diversity. Estimates of the extent of clonality in deep-sea species, despite the importance of clonality in shaping the local dynamics and evolutionary trajectories, have been largely obscured by such limitations. Cold-water coral reefs along European margins are formed mainly by two reef-building species, Lophelia pertusa and Madrepora oculata. Here we present a fine-grained analysis of the genotypic and genetic composition of reefs occurring in the Bay of Biscay, based on an innovative deep-sea sampling protocol. This strategy was designed to be standardized, random, and allowed the georeferencing of all sampled colonies. Clonal lineages discriminated through their Multi-Locus Genotypes (MLG) at 6-7 microsatellite markers could thus be mapped to assess the level of clonality and the spatial spread of clonal lineages. High values of clonal richness were observed for both species across all sites suggesting a limited occurrence of clonality, which likely originated through fragmentation. Additionally, spatial autocorrelation analysis underlined the possible occurrence of fine-grained genetic structure in several populations of both L. pertusa and M. oculata. The two cold-water coral species examined had contrasting patterns of connectivity among canyons, with among-canyon genetic structuring detected in M. oculata, whereas L. pertusa was panmictic at the canyon scale. This study exemplifies that a standardized, random and georeferenced sampling strategy, while challenging, can be applied in the deep sea, and associated benefits outlined here include improved estimates of fine grained patterns of clonality and dispersal that are comparable across sites and among species.
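Clonal richness in studies like this one is commonly computed as R = (G − 1)/(N − 1), where G is the number of distinct multi-locus genotypes among N sampled colonies. A minimal sketch with invented genotypes (the study's microsatellite data are not reproduced here):

```python
def clonal_richness(mlgs):
    """R = (G - 1) / (N - 1): 0 when all samples share one genotype
    (fully clonal), 1 when every sampled colony is genetically distinct."""
    n = len(mlgs)
    g = len(set(mlgs))
    return (g - 1) / (n - 1)

# Hypothetical multi-locus genotypes (tuples of microsatellite alleles)
# for six georeferenced colonies; two colonies share a clonal lineage.
colonies = [("a1", "b2"), ("a1", "b2"), ("a2", "b1"),
            ("a3", "b2"), ("a2", "b3"), ("a1", "b3")]
print(clonal_richness(colonies))  # → 0.8: high richness, little clonality
```

High R values across sites, as reported above, indicate that most sampled colonies arose from sexual recruitment rather than fragmentation.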

  6. Standardized radiologic protocol for the study of common coccygodynia and characteristics of the lesions observed in the sitting position. Clinical elements differentiating luxation, hypermobility, and normal mobility.

    Science.gov (United States)

    Maigne, J Y; Tamalet, B

    1996-11-15

    Ninety-one patients with common coccygodynia and 47 control subjects prospectively underwent dynamic radiographic imaging. The objectives were to standardize the radiologic protocol so as to better define normal and abnormal mobility of the coccyx, and to study clinical parameters useful in classifying and differentiating the lesions. In a previous study, comparison of films taken in the sitting and standing positions allowed us to distinguish two distinct coccygeal lesions: luxation and hypermobility. The measurement technique was precise and reproducible, but the control group was not pain-free and no specific clinical features were described. Standing films were made first. Control subjects were healthy volunteers. The following items were recorded: presence of an initial traumatic event, elapsed time before investigation, body mass index, presence of acute pain when passing from sitting to standing, effect of intradiscal steroid injection, and angle of the coccyx with respect to the seat. Hypermobility was defined as flexion of more than 25 degrees, and luxation as displacement of more than 25% of the coccyx. The base angle is a good predictor of the direction in which the coccyx moves when sitting. In the "luxation" group, a history of initial trauma, a shorter clinical course, pain when standing up, increased body mass index, and satisfactory results with intradiscal injection were found more frequently than in the "normal" group. The "hypermobility" group had characteristics intermediate between these two groups. Common coccygodynia is associated in 48.4% of patients with a luxation or hypermobility of the coccyx. A distinct clinical presentation was found in individuals with luxation of the coccyx.

  7. Protocols for quantum binary voting

    Science.gov (United States)

    Thapliyal, Kishore; Sharma, Rishi Dutt; Pathak, Anirban

    Two new protocols for quantum binary voting are proposed. One of the proposed protocols is designed using a standard scheme for controlled deterministic secure quantum communication (CDSQC), and the other is designed using the idea of a quantum cryptographic switch, which employs a technique known as permutation of particles. A few possible alternative approaches to accomplishing the same task (quantum binary voting) are also discussed. Security of the proposed protocols is analyzed. Further, the efficiencies of the proposed protocols are computed and compared with those of the existing protocols. The comparison establishes that the proposed protocols are more efficient than the existing ones.

  8. Evaluation of the quality of results obtained in institutions participating in interlaboratory experiments and of the reliability characteristics of the analytical methods used on the basis of certification of standard soil samples

    Energy Technology Data Exchange (ETDEWEB)

    Parshin, A.K.; Obol' yaninova, V.G.; Sul' dina, N.P.

    1986-08-20

    Rapid monitoring of the level of pollution of the environment and, especially, of soils necessitates preparation of standard samples (SS) close in properties and material composition to the objects to be analyzed. During 1978-1982, four sets (three types of samples in each) of State Standard Samples of different soils were developed: soddy-podzolic sandy-loamy, typical chernozem, krasnozem, and calcareous sierozem. The certification studies of the SS of the soils were carried out in accordance with the classical scheme of an interlaboratory experiment (ILE). More than 100 institutions were involved in the ILE, and the total number of independent analytical results was of the order of 10⁴. With such a volume of analytical information at their disposal, the authors were able to identify some general characteristics intrinsic to certification studies, to assess the quality of work of the ILE participants with due regard for their specialization, and to assess the reliability characteristics of the analytical methods used.

  9. Validation of the G.LAB MD2200 wrist blood pressure monitor according to the European Society of Hypertension, the British Hypertension Society, and the International Organization for Standardization Protocols.

    Science.gov (United States)

    Liu, Ze-Yu; Zhang, Qing-Han; Ye, Xiao-Lei; Liu, Da-Peng; Cheng, Kang; Zhang, Chun-Hai; Wan, Yi

    2017-04-01

    To validate the G.LAB MD2200 automated wrist blood pressure (BP) monitors according to the European Society of Hypertension International Protocol (ESH-IP) revision 2010, the British Hypertension Society (BHS), and the International Organization for Standardization (ISO) 81060-2:2013 protocols. The device was assessed on 33 participants according to the ESH requirements and was then tested on 85 participants according to the BHS and ISO 81060-2:2013 criteria. The validation procedures and data analysis followed the protocols precisely. The G.LAB MD2200 devices passed all parts of ESH-IP revision 2010 for both systolic and diastolic BP, with a device-observer difference of 2.15±5.51 and 1.51±5.16 mmHg, respectively. The device achieved A/A grading for the BHS protocol and it also fulfilled the criteria of ISO 81060-2:2013, with mean differences of systolic and diastolic BP between the device and the observer of 2.19±5.21 and 2.11±4.70 mmHg, respectively. The G.LAB MD2200 automated wrist BP monitor passed the ESH-IP revision 2010 and the ISO 81060-2:2013 protocol, and achieved the A/A grade of the BHS protocol, which can be recommended for self-measurement in the general population.
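Validation protocols of this kind grade a device largely by counting paired device-observer differences that fall within fixed bands (typically 5, 10 and 15 mmHg). A minimal sketch of the band-counting step with hypothetical readings; the official pass thresholds themselves should be taken from the ESH-IP 2010 and BHS documents, not from this example:

```python
def band_counts(device, observer, bands=(5, 10, 15)):
    """Count absolute device-observer differences within each band (mmHg)."""
    diffs = [abs(d - o) for d, o in zip(device, observer)]
    return {b: sum(x <= b for x in diffs) for b in bands}

# Hypothetical paired systolic readings (mmHg) from a validation session.
device_sbp   = [121, 135, 118, 142, 129, 150, 113, 138]
observer_sbp = [124, 133, 125, 140, 128, 158, 112, 137]
print(band_counts(device_sbp, observer_sbp))  # → {5: 6, 10: 8, 15: 8}
```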

  10. Sepsis and Critical Illness Research Center investigators: protocols and standard operating procedures for a prospective cohort study of sepsis in critically ill surgical patients.

    Science.gov (United States)

    Loftus, Tyler J; Mira, Juan C; Ozrazgat-Baslanti, Tezcan; Ghita, Gabriella L; Wang, Zhongkai; Stortz, Julie A; Brumback, Babette A; Bihorac, Azra; Segal, Mark S; Anton, Stephen D; Leeuwenburgh, Christiaan; Mohr, Alicia M; Efron, Philip A; Moldawer, Lyle L; Moore, Frederick A; Brakenridge, Scott C

    2017-08-01

    Sepsis is a common, costly and morbid cause of critical illness in trauma and surgical patients. Ongoing advances in sepsis resuscitation and critical care support strategies have led to improved in-hospital mortality. However, these patients now survive to enter a state of chronic critical illness (CCI): persistent low-grade organ dysfunction and poor long-term outcomes driven by the persistent inflammation, immunosuppression and catabolism syndrome (PICS). The Sepsis and Critical Illness Research Center (SCIRC) was created to provide a platform by which the prevalence and pathogenesis of CCI and PICS may be understood at a mechanistic level across multiple medical disciplines, leading to the development of novel management strategies and targeted therapies. Here, we describe the design, study cohort and standard operating procedures used in the prospective study of human sepsis at a level 1 trauma centre and tertiary care hospital providing care for over 2600 critically ill patients annually. These procedures include implementation of an automated sepsis surveillance initiative, augmentation of clinical decisions with a computerised sepsis protocol, strategies for direct exportation of quality-filtered data from the electronic medical record to a research database, and robust long-term follow-up. This study has been registered at ClinicalTrials.gov, approved by the University of Florida Institutional Review Board and is actively enrolling subjects. Dissemination of results is forthcoming. © Article author(s) (or their employer(s) unless otherwise stated in the text of the article) 2017. All rights reserved. No commercial use is permitted unless otherwise expressly granted.

  11. Definition of a standard protocol to determine the growth potential of Listeria monocytogenes and Yersinia enterocolitica in pork sausage produced in Abruzzo Region, Italy

    Directory of Open Access Journals (Sweden)

    Anna Franca Sperandii

    2015-11-01

    Pork meat products consumed raw or after a short period of fermentation can be considered at risk for food safety. Sausages (fresh sausages made from pork meat) are produced in several Italian regions, with variation in ingredients. In some Italian regions, including Abruzzo, these products are frequently consumed raw or undercooked, after a variable period of fermentation. European Community food regulation promotes the use of challenge tests to determine safety levels. This study aimed to assess the safety of Abruzzo's sausages by measuring the growth potential (δ) of Listeria monocytogenes and Yersinia enterocolitica, and to define an experimental standard protocol document for carrying out challenge tests. Guidelines classify ready-to-eat foods into categories that are able to support (δ>0.5 log10 cfu/g) or unable to support (δ≤0.5 log10 cfu/g) the growth of Listeria monocytogenes. The products were manufactured according to traditional recipes and were contaminated in the laboratory. Results from the experiment yielded information useful for assessing the ability of these products to support the growth of pathogenic microorganisms. Batches of sausages were stored at 8, 12, 18 and 20°C for statistical evaluation. The results showed that, despite the effects of storage temperature and water activity, both organisms persisted in the product at concentrations similar to the inoculum or were able to increase in number. In particular, the period of greatest consumption of this product (7/8 days after preparation) corresponds to the period of greatest growth of the pathogenic microorganisms studied, except for products stored at 8°C, which are safer for the consumer.

  12. Definition of a Standard Protocol to Determine the Growth Potential of Listeria Monocytogenes and Yersinia Enterocolitica in Pork Sausage Produced in Abruzzo Region, Italy.

    Science.gov (United States)

    Sperandii, Anna Franca; Neri, Diana; Romantini, Romina; Santarelli, Gino Angelo; Prencipe, Vincenza

    2015-11-02

    Pork meat products consumed raw or after a short period of fermentation can be considered at risk for food safety. Sausages (fresh sausages made from pork meat) are produced in several Italian regions, with variation in ingredients. In some Italian regions, including Abruzzo, these products are frequently consumed raw or undercooked, after a variable period of fermentation. European Community food regulation promotes the use of challenge tests to determine safety levels. This study aimed to assess the safety of Abruzzo's sausages by measuring the growth potential (δ) of Listeria monocytogenes and Yersinia enterocolitica, and to define an experimental standard protocol document for carrying out challenge tests. Guidelines classify ready-to-eat foods into categories that are able to support (δ>0.5 log10 cfu/g) or unable to support (δ≤0.5 log10 cfu/g) the growth of Listeria monocytogenes. The products were manufactured according to traditional recipes and were contaminated in the laboratory. Results from the experiment yielded information useful for assessing the ability of these products to support the growth of pathogenic microorganisms. Batches of sausages were stored at 8, 12, 18 and 20°C for statistical evaluation. The results showed that, despite the effects of storage temperature and water activity, both organisms persisted in the product at concentrations similar to the inoculum or were able to increase in number. In particular, the period of greatest consumption of this product (7/8 days after preparation) corresponds to the period of greatest growth of the pathogenic microorganisms studied, except for products stored at 8°C, which are safer for the consumer.
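The growth potential δ used in such challenge tests is the increase in log10 count between inoculation and the end of the test, with the δ > 0.5 log10 cfu/g threshold separating products that support growth from those that do not. A minimal sketch; the counts are invented, not the study's data:

```python
import math

def growth_potential(initial_cfu_per_g, final_cfu_per_g):
    """delta = log10(final) - log10(initial), in log10 cfu/g."""
    return math.log10(final_cfu_per_g) - math.log10(initial_cfu_per_g)

def supports_growth(delta, threshold=0.5):
    """Challenge-test convention: delta > 0.5 log10 cfu/g means the
    product is able to support growth of the pathogen."""
    return delta > threshold

# Hypothetical challenge-test counts for a sausage batch stored warm.
delta = growth_potential(1.0e3, 5.0e4)  # 3.0 -> ~4.7 log10 cfu/g
print(round(delta, 2), supports_growth(delta))  # → 1.7 True
```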

  13. Indoor fire in a nursing home: evaluation of the medical response to a mass casualty incident based on a standardized protocol.

    Science.gov (United States)

    Koning, S W; Ellerbroek, P M; Leenen, L P H

    2015-04-01

    This retrospective study reports the outcome of a mass casualty incident (MCI) caused by a fire in a nursing home. Data from the medical charts and registration system of the Major Incident Hospital (MIH) and the ambulance service were analyzed, together with the evaluation reports from the MIH and an independent research institute. The protocol for reporting on major accidents and disasters was used to standardize the report [Lennquist, in Int J Disaster Med 1(1):79-86, 2003]. The emergency services were quickly at the scene, and the different levels of pre-hospital management maintained tight coordination. However, miscommunication led to confusion in the registration and tracking of patients. In total, 49 persons needed medical treatment, 46 of whom were treated in the MIH. Because of (possible) inhalation injury, nine patients needed mechanical ventilation and nine patients were hospitalized to exclude delayed onset of pulmonary symptoms. No incident-related deaths occurred. The intensive care unit of the MIH was initially understaffed despite the efforts of the automated calling system and switchboard operators. The handwritten registration of incoming staff was incomplete and should be performed digitally. Some staff members were unfamiliar with the MIH procedures, and the medical chart appeared too extensive. Miscommunication between chain partners resulted in delayed sharing of (semi-)medical information. The different levels of incident managers maintained tight coordination. The MIH demonstrated its capacity to provide emergency care for 46 patients, including 9 intubated patients. No deaths or persistent disabilities occurred. Areas for improvement were identified in both the pre-hospital and hospital phases.

  14. Using fMRI to Detect Activation of the Cortical and Subcortical Auditory Centers: Development of a Standard Protocol for a Conventional 1.5-T MRI Scanner

    International Nuclear Information System (INIS)

    Tae, Woo Suk; Kim, Sam Soo; Lee, Kang Uk; Lee, Seung Hwan; Nam, Eui Cheol; Choi, Hyun Kyung

    2009-01-01

    We wanted to develop a standard protocol for auditory functional magnetic resonance imaging (fMRI) to detect blood oxygenation level-dependent (BOLD) responses at the cortical and subcortical auditory centers with a 1.5-T MRI scanner. Fourteen normal volunteers were enrolled in the study. The subjects were stimulated with four repetitions of 32-s blocks of broadband white noise alternating with silent-period blocks per run (34 echo planar images [EPIs]). Multiple regression analysis was applied for the individual analysis and one-sample t-tests for the group analysis (FDR, p < 0.05). The auditory cortex was activated in most of the volunteers (left 100% and right 92.9% at an uncorrected p value < 0.05, and left 92.9% and right 92.9% at an uncorrected p value < 0.01). The cochlear nuclei (100%, 85.7%), inferior colliculi (71.4%, 64.3%), medial geniculate bodies (64.3%, 35.7%) and superior olivary complexes (35.7%, 35.7%) showed significant BOLD responses at uncorrected p values of < 0.05 and < 0.01, respectively. On the group analysis, the cortical and subcortical auditory centers showed significant BOLD responses (FDR, p < 0.05), except for the superior olivary complex. The signal intensity time courses of the auditory centers showed biphasic wave forms. We successfully visualized BOLD responses at the cortical and subcortical auditory centers using appropriate sound stimuli and an image acquisition method with a 1.5-T MRI scanner.
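    Block designs of this kind are typically analyzed by regressing each voxel's time series on a boxcar regressor that is "on" during stimulation and "off" during silence. The following is a minimal synthetic-data sketch of that idea, not the authors' actual pipeline; the TR, effect size, and noise level are hypothetical:

```python
import numpy as np

# Minimal sketch of a block-design fMRI regression: a boxcar regressor
# (on during noise blocks, off during silence) is fit to a synthetic
# voxel time series by ordinary least squares.

tr_per_block = 8                       # volumes per 32-s block (hypothetical TR of 4 s)
n_blocks = 4                           # four on/off repetitions, as in the protocol
boxcar = np.tile(np.r_[np.ones(tr_per_block), np.zeros(tr_per_block)], n_blocks)

rng = np.random.default_rng(0)
voxel = 2.5 * boxcar + rng.normal(0.0, 1.0, boxcar.size)  # "active" voxel + noise

X = np.column_stack([boxcar, np.ones_like(boxcar)])       # regressor + intercept
beta, *_ = np.linalg.lstsq(X, voxel, rcond=None)
t_like = beta[0] / (voxel - X @ beta).std()               # crude effect-size measure
```

    A real analysis would convolve the boxcar with a hemodynamic response function and correct for multiple comparisons (as the FDR thresholds above do), but the estimation step is this regression.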

  15. Use of CTX-I and PINP as bone turnover markers: National Bone Health Alliance recommendations to standardize sample handling and patient preparation to reduce pre-analytical variability.

    Science.gov (United States)

    Szulc, P; Naylor, K; Hoyle, N R; Eastell, R; Leary, E T

    2017-09-01

    The National Bone Health Alliance (NBHA) recommends standardized sample handling and patient preparation for C-terminal telopeptide of type I collagen (CTX-I) and N-terminal propeptide of type I procollagen (PINP) measurements to reduce pre-analytical variability. Controllable and uncontrollable patient-related factors are reviewed to facilitate interpretation and minimize pre-analytical variability. The IOF and the International Federation of Clinical Chemistry (IFCC) Bone Marker Standards Working Group have identified PINP and CTX-I in blood as the reference markers of bone turnover for fracture risk prediction and monitoring of osteoporosis treatment. Although used in clinical research for many years, bone turnover markers (BTM) have not been widely adopted in clinical practice, primarily because of their poor within-subject and between-laboratory reproducibility. The NBHA Bone Turnover Marker Project team aims to reduce the pre-analytical variability of CTX-I and PINP measurements through standardized sample handling and patient preparation. Recommendations for sample handling and patient preparation were made based on a review of available publications and on pragmatic considerations. Controllable and uncontrollable patient-related factors were reviewed to facilitate interpretation and sample collection. Samples for CTX-I must be collected consistently in the morning hours in the fasted state. EDTA plasma is preferred for CTX-I because of its greater sample stability. Sample collection conditions for PINP are less critical, as PINP shows minimal circadian variability and is not affected by food intake. Sample stability limits should be observed. Uncontrollable factors (age, sex, pregnancy, immobility, recent fracture, co-morbidities, anti-osteoporotic drugs, other medications) should be considered in BTM interpretation. Adopting standardized sample handling and patient preparation procedures will significantly reduce controllable pre-analytical variability.

  16. International Electrotechnical Commission (IEC) 61850 Mapping with the Constrained Application Protocol (CoAP) in Smart Grids Based on the European Telecommunications Standards Institute Machine-to-Machine (M2M) Environment

    Directory of Open Access Journals (Sweden)

    In-Jae Shin

    2017-03-01

    As power systems develop rapidly into smarter and more flexible configurations, so too must the communication technologies that support them. Machine-to-machine (M2M) communication in power systems enables information collection by combining sensors and communication protocols. In doing so, M2M technology supports communication between machines to improve power quality and protection coordination. Within the "smart grid" environment, IEC 61850 has been labelled by the European Telecommunications Standards Institute (ETSI) as the most important standard in power network systems. As evidence, this communication platform has been used for device data collection and control in substation automation and distribution automation systems. If the IEC 61850 information model were combined with a set of contemporary web protocols, the potential benefits would be enormous. Therefore, the constrained application protocol (CoAP) has been adopted to create an ETSI M2M communication architecture. CoAP is compared with other protocols (MQTT, SOAP) to demonstrate the validity of using it. This M2M communication technology is applied in an IEC 61850 environment, and OPNET Modeler 17.1 is used to demonstrate the interoperability of the CoAP gateway. The proposed IEC 61850-CoAP mapping scheme reduces mapping time and improves throughput. CoAP is useful in the ETSI M2M environment, where device capabilities may be limited.
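    The core of such a mapping is translating hierarchical IEC 61850 object references (LogicalDevice/LogicalNode.DataObject.DataAttribute) into CoAP resource paths. The sketch below shows one plausible scheme; it is hypothetical, not the paper's exact mapping rules, and the host name is invented:

```python
# Hypothetical sketch: map an IEC 61850 object reference onto a CoAP URI
# by turning each level of the object hierarchy into a URI path segment.

def iec61850_to_coap_uri(obj_ref, host="substation.example"):
    """'LD0/MMXU1.TotW.mag.f' -> 'coap://<host>/LD0/MMXU1/TotW/mag/f'."""
    ld, _, rest = obj_ref.partition("/")          # logical device / the rest
    segments = [ld] + (rest.split(".") if rest else [])
    return "coap://{}/{}".format(host, "/".join(segments))

# Example: total active power measurement of measurement unit MMXU1
uri = iec61850_to_coap_uri("LD0/MMXU1.TotW.mag.f")
```

    A gateway built on this idea would answer CoAP GET requests on such paths with the corresponding IEC 61850 attribute values.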

  17. PCR protocols: current methods and applications

    National Research Council Canada - National Science Library

    White, Bruce Alan

    1993-01-01

    ..." between "small" and "big" labs, since its use makes certain projects, especially those related to molecular cloning, now far more feasible for the small lab with a modest budget. This new volume on PCR Protocols does not attempt the impossible task of representing all PCR-based protocols. Rather, it presents a range of protocols, both analytical ...

  18. Systematic assessment of benefits and risks: study protocol for a multi-criteria decision analysis using the Analytic Hierarchy Process for comparative effectiveness research [v1; ref status: indexed, http://f1000r.es/1fk]

    Directory of Open Access Journals (Sweden)

    Nisa M Maruthur

    2013-07-01

    Background: Regulatory decision-making involves assessment of the risks and benefits of medications at the time of approval or when relevant safety concerns arise with a medication. The Analytic Hierarchy Process (AHP) facilitates decision-making in complex situations involving tradeoffs by considering the risks and benefits of alternatives. The AHP allows a more structured method of synthesizing and understanding evidence in the context of the importance assigned to outcomes. Our objective is to evaluate the use of an AHP in a simulated committee setting selecting oral medications for type 2 diabetes. Methods: This study protocol describes the AHP in five sequential steps using a small group of diabetes experts representing various clinical disciplines. The first step will involve defining the goal of the decision and developing the AHP model. In the next step, we will collect information about how well the alternatives are expected to fulfill the decision criteria. In the third step, we will compare the ability of the alternatives to fulfill the criteria and judge the importance of eight criteria relative to the decision goal of the optimal medication choice for type 2 diabetes. We will use pairwise comparisons to sequentially compare the pairs of alternative options regarding their ability to fulfill the criteria. In the fourth step, the scales created in the third step will be combined to create a summary score indicating how well the alternatives met the decision goal. The resulting scores will be expressed as percentages and will indicate the alternative medications' relative abilities to fulfill the decision goal. The fifth step will consist of sensitivity analyses to explore the effects of changing the estimates. We will also conduct a cognitive interview and process evaluation. Discussion: Multi-criteria decision analysis using the AHP will aid, support and enhance the ability of decision makers to make evidence-based informed decisions consistent
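    The pairwise-comparison and synthesis steps described above follow standard AHP arithmetic: a reciprocal comparison matrix is reduced to a priority vector, and a consistency ratio checks that the judgements are not self-contradictory. The sketch below uses the common row geometric-mean method; the judgement values are illustrative, not the committee's:

```python
import numpy as np

# Sketch of AHP priority derivation and consistency checking.

def ahp_priorities(pairwise):
    """Priority weights from a reciprocal pairwise comparison matrix,
    via the row geometric-mean method."""
    A = np.asarray(pairwise, dtype=float)
    gm = np.prod(A, axis=1) ** (1.0 / A.shape[0])   # row geometric means
    return gm / gm.sum()

def consistency_ratio(pairwise):
    """Saaty's CR = CI / RI with CI = (lambda_max - n) / (n - 1);
    CR < 0.1 is conventionally considered acceptable."""
    A = np.asarray(pairwise, dtype=float)
    n = A.shape[0]
    w = ahp_priorities(A)
    lam_max = float(np.mean(A @ w / w))             # principal-eigenvalue estimate
    ci = (lam_max - n) / (n - 1)
    ri = {3: 0.58, 4: 0.90, 5: 1.12, 6: 1.24, 7: 1.32, 8: 1.41}[n]
    return ci / ri

# Hypothetical judgements for three criteria: criterion 1 moderately
# preferred over criterion 2 (3) and strongly over criterion 3 (5).
A = [[1, 3, 5],
     [1 / 3, 1, 3],
     [1 / 5, 1 / 3, 1]]
w = ahp_priorities(A)
```

    Expressed as percentages, the resulting weights are the "summary scores" of step four; the sensitivity analyses of step five amount to perturbing the matrix entries and recomputing `w`.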

  19. The Suspected CANcer (SCAN) pathway: protocol for evaluating a new standard of care for patients with non-specific symptoms of cancer.

    Science.gov (United States)

    Nicholson, Brian D; Oke, Jason; Friedemann Smith, Claire; Phillips, Julie-Ann; Lee, Jennifer; Abel, Lucy; Kelly, Sadie; Gould, Isabella; Mackay, Toni; Kaveney, Zoe; Anthony, Suzie; Hayles, Shelley; Lasserson, Daniel; Gleeson, Fergus

    2018-01-21

    Cancer survival in England lags behind most European countries, due partly to lower rates of early-stage diagnosis. We report the protocol for the evaluation of a multidisciplinary diagnostic centre-based pathway for the investigation of 'low-risk but not no-risk' cancer symptoms, called the Suspected CANcer (SCAN) pathway. SCAN is a new standard of care being implemented in Oxfordshire, one of a number of pathways implemented during the second wave of the Accelerate, Coordinate, Evaluate (ACE) programme, an initiative which aims to improve England's cancer survival rates by establishing effective routes to early diagnosis. To evaluate SCAN, we are collating a prospective database of patients referred onto the pathway by their general practitioner (GP). Patients aged over 40 years, with non-specific symptoms such as weight loss or fatigue, who do not meet urgent cancer referral criteria or for whom symptom causation remains unclear after investigation via other existing pathways, can be referred to SCAN. SCAN provides rapid CT scanning, laboratory testing and clinic review within 2 weeks. We will follow all patients in the primary and secondary care record for at least 2 years. The data will be used to understand the diagnostic yield of the SCAN pathway in the short term (28 days) and the long term (2 years). Routinely collected primary and secondary care data from patients not referred to SCAN but with similar symptoms will also be used to evaluate SCAN. We will map the routes to diagnosis for patients referred to SCAN to assess cost-effectiveness. Acceptability will be evaluated using patient and GP surveys. The Oxford Joint Research Office Study Classification Group has judged this to be a service evaluation and so outside of research governance. The results of this project will be disseminated by peer-reviewed publication and presentation at conferences.

  20. Bioremediation protocols

    National Research Council Canada - National Science Library

    Sheehan, David

    1997-01-01

    ... 3; 2. Granular Sludge Consortia for Bioremediation (Nina Christiansen, Indra M. Mathrani, and Birgitte K. Ahring) ... 23; PART II PROTOCOLS...

  1. Coded Splitting Tree Protocols

    DEFF Research Database (Denmark)

    Sørensen, Jesper Hemming; Stefanovic, Cedomir; Popovski, Petar

    2013-01-01

    This paper presents a novel approach to multiple access control called the coded splitting tree protocol. The approach builds on the known tree splitting protocols, code structure and successive interference cancellation (SIC). Several instances of the tree splitting protocol are initiated, each instance is terminated prematurely and subsequently iterated. The combined set of leaves from all the tree instances can then be viewed as a graph code, which is decodable using belief propagation. The main design problem is determining the order of splitting, which enables successful decoding as early as possible. Evaluations show that the proposed protocol provides considerable gains over the standard tree splitting protocol applying SIC. The improvement comes at the expense of increased feedback and receiver complexity.
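    For context, the baseline the authors improve upon, the standard (uncoded) binary splitting tree protocol, can be simulated in a few lines: on a collision, each user flips a fair coin and retransmits in the left or right subtree. The sketch below estimates the baseline's throughput; it is not the coded protocol itself:

```python
import random

# Simulation of the basic binary splitting tree collision-resolution
# protocol: count the slots needed to resolve a collision among n users.

def tree_split(n_users, rng):
    """Number of slots to resolve a collision among n_users."""
    if n_users <= 1:
        return 1                     # idle slot or successful transmission
    # Collision: each user joins the left or right subtree by a coin flip.
    left = sum(rng.random() < 0.5 for _ in range(n_users))
    return 1 + tree_split(left, rng) + tree_split(n_users - left, rng)

rng = random.Random(42)
trials = [tree_split(10, rng) for _ in range(1000)]
throughput = 10 / (sum(trials) / len(trials))   # packets per slot, roughly 0.35
```

    The coded variant in the paper terminates such trees early and recovers the remaining packets jointly via belief propagation, which is where its throughput gain over this baseline comes from.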

  2. Journal article reporting standards for qualitative primary, qualitative meta-analytic, and mixed methods research in psychology: The APA Publications and Communications Board task force report

    OpenAIRE

    Levitt, H. M.; Bamberg, M.; Creswell, J. W.; Frost, D. M.; Josselson, R.; Suárez-Orozco, C.

    2018-01-01

    The American Psychological Association Publications and Communications Board Working Group on Journal Article Reporting Standards for Qualitative Research (JARS–Qual Working Group) was charged with examining the state of journal article reporting standards as they applied to qualitative research and with generating recommendations for standards that would be appropriate for a wide range of methods within the discipline of psychology. These standards describe what should be included in a resea...

  3. ATM and Internet protocol

    CERN Document Server

    Bentall, M; Turton, B

    1998-01-01

    Asynchronous Transfer Mode (ATM) is a protocol that allows data, sound and video transferred between independent networks via ISDN links to be supplied to, and interpreted by, the various system protocols. ATM and Internet Protocol explains the working of the ATM and B-ISDN network for readers with a basic understanding of telecommunications. It provides a handy reference for everyone working with ATM who may not require the full standards in detail, but needs a comprehensive guide to ATM. A substantial section is devoted to the problems of running IP over ATM and there is some discussion o

  4. Analytical chemistry

    Czech Academy of Sciences Publication Activity Database

    Křivánková, Ludmila

    -, č. 22 (2011), s. 718-719 ISSN 1472-3395 Institutional research plan: CEZ:AV0Z40310501 Keywords : analytical chemistry * analytical methods * nanotechnologies Subject RIV: CB - Analytical Chemistry, Separation http://edition.pagesuite-professional.co.uk/launch.aspx?referral=other&pnum=&refresh=M0j83N1cQa91&EID=82bccec1-b05f-46f9-b085-701afc238b42&skip=

  5. Analytic trigonometry

    CERN Document Server

    Bruce, William J; Maxwell, E A; Sneddon, I N

    1963-01-01

    Analytic Trigonometry details the fundamental concepts and underlying principle of analytic geometry. The title aims to address the shortcomings in the instruction of trigonometry by considering basic theories of learning and pedagogy. The text first covers the essential elements from elementary algebra, plane geometry, and analytic geometry. Next, the selection tackles the trigonometric functions of angles in general, basic identities, and solutions of equations. The text also deals with the trigonometric functions of real numbers. The fifth chapter details the inverse trigonometric functions

  6. Analytical Laboratory

    Data.gov (United States)

    Federal Laboratory Consortium — The Analytical Lab specializes in Oil and Hydraulic Fluid Analysis, Identification of Unknown Materials, Engineering Investigations, Qualification Testing (to support...

  7. Influence of centrifugation conditions on the results of 77 routine clinical chemistry analytes using standard vacuum blood collection tubes and the new BD-Barricor tubes.

    Science.gov (United States)

    Cadamuro, Janne; Mrazek, Cornelia; Leichtle, Alexander B; Kipman, Ulrike; Felder, Thomas K; Wiedemann, Helmut; Oberkofler, Hannes; Fiedler, Georg M; Haschke-Becher, Elisabeth

    2018-02-15

    Although centrifugation is performed on almost every blood sample, recommendations on duration and g-force are heterogeneous and mostly based on expert opinion. In order to unify this step in a fully automated laboratory, we aimed to evaluate different centrifugation settings and their influence on the results of routine clinical chemistry analytes. We collected blood from 41 healthy volunteers into BD Vacutainer PST II-heparin-gel- (LiHepGel), BD Vacutainer SST II-serum-, and BD Vacutainer Barricor heparin-tubes with a mechanical separator (LiHepBar). Tubes were centrifuged at 2000×g for 10 minutes and at 3000×g for 7 and 5 minutes, respectively. Subsequently, 60 and 21 clinical chemistry analytes were measured in plasma and serum samples, respectively, using a Roche COBAS instrument. High-sensitivity troponin T, pregnancy-associated plasma protein A, β-human chorionic gonadotropin and rheumatoid factor had to be excluded from statistical evaluation as many of the respective results were below the measuring range. Except for free haemoglobin (fHb) measurements, no analyte result was altered by the use of shorter centrifugation times at higher g-forces. Comparing LiHepBar to LiHepGel tubes at different centrifugation settings, we found higher lactate dehydrogenase (LD) values (P = 0.003). Samples may therefore be centrifuged at higher speed (3000×g) for a shorter time (5 minutes) without alteration of the analytes tested in this study. When using LiHepBar tubes for blood collection, a separate LD reference value might be needed.

  8. Delineation of upper urinary tract segments at MDCT urography in patients with extra-urinary mass lesions: retrospective comparison of standard and low-dose protocols for the excretory phase of imaging

    Energy Technology Data Exchange (ETDEWEB)

    Mueller-Lisse, Ulrike L. [University of Munich, Department of Urology, Munich (Germany); University of Munich Medical School, Department of Urology, Muenchen (Germany); Coppenrath, Eva M.; Meindl, Thomas; Degenhart, Christoph; Scherr, Michael K.; Reiser, Maximilian F.; Mueller-Lisse, Ullrich G. [University of Munich, Department of Radiology, Munich (Germany); Stief, Christian G. [University of Munich, Department of Urology, Munich (Germany)

    2011-02-15

    Excretory-phase CT urography (CTU) may replace excretory urography in patients without urinary tumors. However, radiation exposure is a concern. We retrospectively compared upper urinary tract (UUT) delineation in low-dose and standard CTU. CTU (1-2 phases, 120 kV, 4 × 2.5 mm, pitch 0.875, i.v. non-ionic contrast media, iodine 36 g) was obtained with standard (14 patients, n = 27 UUTs, average 175.6 mAs/slice, average delay 16.8 min) or low-dose (26 patients, n = 86 UUTs, 29 mAs/slice, average delay 19.6 min) protocols. UUT was segmented into intrarenal collecting system (IRCS), upper, middle, and lower ureter (UU, MU, LU). Two independent readers (R1, R2) graded UUT segments as 1-not delineated, 2-partially delineated, 3-completely delineated (noisy margins), 4-completely delineated (clear margins). Chi-square statistics were calculated for partial versus complete delineation and complete delineation (clear margins), respectively. Complete delineation of UUT was similar in standard and low-dose CTU (R1, p > 0.15; R2, p > 0.2). IRCS, UU, and MU clearly delineated similarly often in standard and low-dose CTU (R1, p > 0.25; R2, p > 0.1). LU clearly delineated more often in standard protocols (R1, 18/6 standard, 38/31 low-dose, p > 0.1; R2 18/6 standard, 21/48 low-dose, p < 0.05). Low-dose CTU sufficiently delineated course of UUT and may locate obstruction/dilation, but appears unlikely to find intraluminal LU lesions. (orig.)

  9. Delineation of upper urinary tract segments at MDCT urography in patients with extra-urinary mass lesions: retrospective comparison of standard and low-dose protocols for the excretory phase of imaging.

    Science.gov (United States)

    Mueller-Lisse, Ulrike L; Coppenrath, Eva M; Meindl, Thomas; Degenhart, Christoph; Scherr, Michael K; Stief, Christian G; Reiser, Maximilian F; Mueller-Lisse, Ullrich G

    2011-02-01

    Excretory-phase CT urography (CTU) may replace excretory urography in patients without urinary tumors. However, radiation exposure is a concern. We retrospectively compared upper urinary tract (UUT) delineation in low-dose and standard CTU. CTU (1-2 phases, 120 kV, 4 × 2.5 mm, pitch 0.875, i.v. non-ionic contrast media, iodine 36 g) was obtained with standard (14 patients, n = 27 UUTs, average 175.6 mAs/slice, average delay 16.8 min) or low-dose (26 patients, n = 86 UUTs, 29 mAs/slice, average delay 19.6 min) protocols. UUT was segmented into intrarenal collecting system (IRCS), upper, middle, and lower ureter (UU, MU, LU). Two independent readers (R1, R2) graded UUT segments as 1-not delineated, 2-partially delineated, 3-completely delineated (noisy margins), 4-completely delineated (clear margins). Chi-square statistics were calculated for partial versus complete delineation and complete delineation (clear margins), respectively. Complete delineation of UUT was similar in standard and low-dose CTU (R1, p > 0.15; R2, p > 0.2). IRCS, UU, and MU clearly delineated similarly often in standard and low-dose CTU (R1, p > 0.25; R2, p > 0.1). LU clearly delineated more often in standard protocols (R1, 18/6 standard, 38/31 low-dose, p > 0.1; R2 18/6 standard, 21/48 low-dose, p < 0.05). Low-dose CTU sufficiently delineated course of UUT and may locate obstruction/dilation, but appears unlikely to find intraluminal LU lesions.
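    The reported reader-R2 comparison for the lower ureter (clear delineation in 18/6 of standard versus 21/48 of low-dose examinations) can be checked with a Pearson chi-square on the 2×2 table; the sketch below uses the textbook formula without continuity correction, since the abstract does not state which variant the authors applied:

```python
# Pearson chi-square statistic for a 2x2 contingency table
# [[a, b], [c, d]], computed by the shortcut formula
# chi2 = n * (ad - bc)^2 / ((a+b)(c+d)(a+c)(b+d)).

def chi_square_2x2(a, b, c, d):
    n = a + b + c + d
    num = n * (a * d - b * c) ** 2
    den = (a + b) * (c + d) * (a + c) * (b + d)
    return num / den

# R2, lower ureter: standard 18 clear / 6 not clear; low-dose 21 / 48.
stat = chi_square_2x2(18, 6, 21, 48)   # about 14.5, well above the 3.84
                                       # critical value at p = 0.05
```

    This is consistent with the abstract's p < 0.05 finding for R2's lower-ureter comparison.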

  10. Journal article reporting standards for qualitative primary, qualitative meta-analytic, and mixed methods research in psychology: The APA Publications and Communications Board task force report.

    Science.gov (United States)

    Levitt, Heidi M; Bamberg, Michael; Creswell, John W; Frost, David M; Josselson, Ruthellen; Suárez-Orozco, Carola

    2018-01-01

    The American Psychological Association Publications and Communications Board Working Group on Journal Article Reporting Standards for Qualitative Research (JARS-Qual Working Group) was charged with examining the state of journal article reporting standards as they applied to qualitative research and with generating recommendations for standards that would be appropriate for a wide range of methods within the discipline of psychology. These standards describe what should be included in a research report to enable and facilitate the review process. This publication marks a historical moment: the first inclusion of qualitative research in APA Style, which is the basis of both the Publication Manual of the American Psychological Association (APA, 2010) and APA Style CENTRAL, an online program to support APA Style. In addition to the general JARS-Qual guidelines, the Working Group has developed standards for both qualitative meta-analysis and mixed methods research. The reporting standards were developed for psychological qualitative research but may hold utility for a broad range of social sciences. They honor a range of qualitative traditions, methods, and reporting styles. The Working Group was composed of researchers with backgrounds in varying methods, research topics, and approaches to inquiry. In this article, they present these standards and their rationale, and they detail the ways that the standards differ from the quantitative research reporting standards. They describe how the standards can be used by authors in the process of writing qualitative research for submission as well as by reviewers and editors in the process of reviewing research.

  11. Cryptographic Protocols:

    DEFF Research Database (Denmark)

    Geisler, Martin Joakim Bittel

    with a variant of the classic BGW protocol. The protocol is secure against a semi-honest adversary. In Chapter 4 we describe a new protocol for VIFF that is secure against malicious adversaries. The protocol guarantees termination if the adversary allows a preprocessing phase to terminate, in which … systems hosted by an untrusted provider. It guarantees atomic read and write operations on the shared data when the service is correct and preserves fork-linearizability when the service is faulty. A prototype has been implemented on top of the Subversion revision control system; benchmarks show …

  12. Protocol for the building construction process with regard to the implementation trajectory protocols EWN and EUN. Manual for commissioners, contractors, building management offices and energy efficiency standard advisors; Handleiding opnameprotocollen EWN en EUN. Voor opdrachtgevers, aannemers, bouwmanagementbureaus en EPN-adviseurs

    Energy Technology Data Exchange (ETDEWEB)

    Neeleman, J. [DWA installatie- en energieadvies, Duitslandweg 4, Postbus 274, 2410 AG Bodegraven (Netherlands)

    2013-04-15

    In 2012 it was foreseen that the energy label for new buildings would be based on the Energy Performance Coefficient (EPC). For this purpose an assessment protocol was drawn up for residential and utility buildings, with the aim of checking whether, and to what extent, buildings were constructed in accordance with the EPC, and of determining the realized EPC. To gain experience with the new protocols and the voluntary ventilation test, the assessment Protocol for the Energy Label for New Houses (EWN) and the assessment Protocol for the Energy Label for New Utility Buildings (EUN) were applied in 12 newly built housing projects and 5 projects in the utility building sector. With this manual you can realize energy-efficient houses and/or utility buildings that meet the standards.

  13. Assessing cost-effectiveness of HPV vaccines with decision analytic models: what are the distinct challenges of low- and middle-income countries? A protocol for a systematic review.

    Science.gov (United States)

    Ekwunife, Obinna I; Grote, Andreas Gerber; Mosch, Christoph; O'Mahony, James F; Lhachimi, Stefan K

    2015-05-12

    Cervical cancer poses a huge health burden to both developed and developing nations, making prevention and control strategies necessary. However, the challenges of designing and implementing prevention strategies differ for low- and middle-income countries (LMICs) as compared to countries with fully developed health care systems. Moreover, for many LMICs, much of the data needed for decision analytic modelling, such as prevalence, will most likely be only partly available or measured with much larger uncertainty. Lastly, imperfect implementation of human papillomavirus (HPV) vaccination may influence the effectiveness of cervical cancer prevention in unpredictable ways. This systematic review aims to assess how decision analytic modelling studies of HPV cost-effectiveness in LMICs accounted for the particular challenges faced in such countries. Specifically, the study will assess the following: (1) whether the existing literature on cost-effectiveness modelling of HPV vaccines acknowledges the distinct challenges of LMICs, (2) how these challenges were accommodated in the models, (3) whether certain parameters systematically exhibited large degrees of uncertainty due to lack of data and how influential these parameters were on model-based recommendations, and (4) whether the choice of modelling herd immunity influences model-based recommendations, especially when coverage of an HPV vaccination program is not optimal. We will conduct a systematic review to identify suitable studies from MEDLINE (via PubMed), EMBASE, NHS Economic Evaluation Database (NHS EED), EconLit, Web of Science, and the CEA Registry. Searches will be conducted for studies of interest published since 2006. The searches will be supplemented by hand searching of the most relevant papers found in the search. Studies will be critically appraised using the Consolidated Health Economic Evaluation Reporting Standards (CHEERS) statement checklist. We will undertake a descriptive, narrative, and interpretative
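    On point (4), why sub-optimal coverage matters for herd effects, the textbook critical-vaccination-coverage formula illustrates the issue: coverage below the threshold leaves indirect protection incomplete. The values below are hypothetical illustrations, not parameters from the review protocol:

```python
# Textbook herd-immunity calculation: the fraction V_c of the population
# that must be vaccinated to push the effective reproduction number below 1
# is V_c = (1 - 1/R0) / E, where R0 is the basic reproduction number and
# E the vaccine efficacy.

def critical_coverage(r0, efficacy):
    """Critical vaccination coverage for herd protection."""
    return (1.0 - 1.0 / r0) / efficacy

vc = critical_coverage(r0=4.0, efficacy=0.95)   # hypothetical illustrative values
```

    When a program's achievable coverage falls short of this threshold, models that include herd immunity and models that ignore it can rank vaccination strategies differently, which is precisely the sensitivity the review sets out to examine.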

  14. N286.7-99, A Canadian standard specifying software quality management system requirements for analytical, scientific, and design computer programs and its implementation at AECL

    International Nuclear Information System (INIS)

    Abel, R.

    2000-01-01

    Analytical, scientific, and design computer programs (referred to in this paper as 'scientific computer programs') are developed for use in a large number of ways by the user-engineer to support and prove engineering calculations and assumptions. These computer programs are subject to frequent modifications inherent in their application and are often used for critical calculations and analysis relative to safety and functionality of equipment and systems. N286.7-99(4) was developed to establish appropriate quality management system requirements to deal with the development, modification, and application of scientific computer programs. N286.7-99 provides particular guidance regarding the treatment of legacy codes

  15. Analytic geometry

    CERN Document Server

    Burdette, A C

    1971-01-01

    Analytic Geometry covers several fundamental aspects of analytic geometry needed for advanced subjects, including calculus.This book is composed of 12 chapters that review the principles, concepts, and analytic proofs of geometric theorems, families of lines, the normal equation of the line, and related matters. Other chapters highlight the application of graphing, foci, directrices, eccentricity, and conic-related topics. The remaining chapters deal with the concept polar and rectangular coordinates, surfaces and curves, and planes.This book will prove useful to undergraduate trigonometric st

  16. New safety standards for large volume vans and light transporters - optimized with analytical crash simulation methods; Neue Sicherheitsstandards bei Grossraum-Limousinen und Leicht-Transportern - Optimierung durch konsequente Unterstuetzung mittels rechnerischer Crash-Simulation

    Energy Technology Data Exchange (ETDEWEB)

    Alber, P.; Boelke, D.

    1996-04-01

    Under the pseudonym T0, two parallel vehicle developments were conducted. One led to a transporter in the one-ton payload class (Vito) and the other to a comfortable large-volume van (V-class). Besides a high number of common body-in-white structural components, both vehicles share a safety concept measured against the high standards of passenger vehicle specifications. Consequently, analytical crash simulation was applied to an even larger extent than with the bigger brother 'Sprinter', beginning in the preliminary design phases, continuing through all development iterations, and ending with the optimized production version. The completely new vehicle concept lent itself to the application of modern analytical methods. (orig.)

  17. DNA repair protocols

    DEFF Research Database (Denmark)

    Bjergbæk, Lotte

    In its 3rd edition, this Methods in Molecular Biology(TM) book covers the eukaryotic response to genomic insult, including advanced protocols and standard techniques in the field of DNA repair, and offers expert guidance for DNA repair, recombination, and replication. Current knowledge of the mechanisms that regulate DNA repair has grown significantly over the past years with technology advances such as RNA interference, advanced proteomics and microscopy, as well as high-throughput screens. The third edition of DNA Repair Protocols covers various aspects of the eukaryotic response to genomic insult, including recent advanced protocols as well as standard techniques used in the field of DNA repair. Both mammalian and non-mammalian model organisms are covered in the book, and many of the techniques can be applied with only minor modifications to systems other than the one described.

  18. Critical Response Protocol

    Science.gov (United States)

    Ellingson, Charlene; Roehrig, Gillian; Bakkum, Kris; Dubinsky, Janet M.

    2016-01-01

    This article introduces the Critical Response Protocol (CRP), an arts-based technique that engages students in equitable critical discourse and aligns with the "Next Generation Science Standards" vision for providing students opportunities for language learning while advancing science learning (NGSS Lead States 2013). CRP helps teachers…

  19. Geneva protocols

    International Nuclear Information System (INIS)

    Kimminich, O.

    1990-01-01

    The First Protocol Additional to the Geneva Conventions of 1949, relating to the Protection of Victims of International Armed Conflicts, contains provisions prohibiting indiscriminate attacks. Nuclear warfare as such is not mentioned in the Protocol. It has been asserted that the Protocol does not apply to nuclear weapons for several reasons. However, close analysis shows that the rules governing the application of means and methods of warfare cannot exempt nuclear weapons. If nuclear weapons are applied in a manner not consistent with article 51 of Protocol I, their use is forbidden by this article even in situations in which general international law might grant an exception from the ban on nuclear weapons, as in the case of reprisal. (orig./HSCH)

  20. Towards standard protocols and guidelines for urine proteomics: a report on the Human Kidney and Urine Proteome Project (HKUPP) symposium and workshop, 6 October 2007, Seoul, Korea and 1 November 2007, San Francisco, CA, USA.

    Science.gov (United States)

    Yamamoto, Tadashi; Langham, Robyn G; Ronco, Pierre; Knepper, Mark A; Thongboonkerd, Visith

    2008-06-01

    The Human Kidney and Urine Proteome Project (HKUPP) was initiated in 2005 to promote proteomics research in the nephrology field, to better understand kidney functions as well as pathogenic mechanisms of kidney diseases, and to define novel biomarkers and therapeutic targets. This project was first approved in 2005 by the Human Proteome Organisation (HUPO) as a Kidney Disease Initiative under the umbrella of the HUPO Disease Biomarker Initiative (DBI), and more recently was approved as the HKUPP Initiative in 2007. Several sub-projects have been planned to achieve the ultimate goals. The most pressing is the establishment of "standard protocols and guidelines for urine proteome analysis". This sub-project was extensively discussed during the first HKUPP symposium (during the 6th HUPO Annual World Congress--October 2007, Seoul, Korea) and second workshop (during the 40th American Society of Nephrology Renal Week--November 2007, San Francisco, CA, USA). Additional data and references have been collected after the symposium and workshop. An initial draft of standard protocols and guidelines for proteome analysis of non-proteinuric urine (urine protein excretion <150 mg/day) will soon be released as the first HKUPP product.

  1. Differences in cytoplasmic maturation between the BCB+ and control porcine oocytes do not justify application of the BCB test for a standard IVM protocol.

    Science.gov (United States)

    Pawlak, Piotr; Warzych, Ewelina; Chabowska, Agnieszka; Lechniak, Dorota

    2014-03-07

    The Brilliant Cresyl Blue (BCB) test relies on G6PDH activity and provides a simple protocol for the selection of higher-quality oocytes. Although the BCB+ oocytes of all the species that have been investigated are characterized by superior quality when compared to BCB- counterparts, application of the test for embryo production still remains an open issue. The aim of our study was to compare BCB+ and control oocytes (not subjected to the BCB test) in terms of selected aspects of cytoplasmic maturation (mtDNA copy number, mitochondria distribution, relative transcript abundance of six marker genes). The results of our study revealed more relevant differences within the BCB+ and the control oocytes (before and after IVM) than between the two categories of oocytes. There was no difference between the BCB+ and the control oocytes in the transcript abundance of 5 out of 6 analyzed genes (BMP15, GDF9, ATP5A1, EEF1A, ZAR1) or in mtDNA content (pre-IVM 179609 vs. 176595 and post-IVM 187243 vs. 246984, respectively). With regard to mitochondria distribution in pre- and post-IVM oocytes, there was a nonsignificant tendency for a more frequent occurrence of the expected patterns in the BCB+ group. The results of the present study do not support the application of BCB staining in a routine IVM protocol due to the relatively high similarity in selected parameters characterizing cytoplasmic maturation of BCB+ and control oocytes. This high similarity may result from the limited amount of less competent BCB- oocytes (10%) still present among nonselected oocytes of proper morphology.

  2. Synthesis and analytics of 2,2,3,4,4-d5-19-nor-5alpha-androsterone--an internal standard in doping analysis.

    Science.gov (United States)

    Gaertner, Peter; Bica, Katharina; Felzmann, Wolfgang; Forsdahl, Guro; Gmeiner, Günter

    2007-05-01

    A short and efficient synthesis of pentadeuterated 2,2,3,4,4-d5-19-nor-5alpha-androsterone 7 starting from 19-norandrost-4-ene-3,17-dione 1 by a d1-L-Selectride mediated stereo- and regioselective reduction of the 3-keto group is presented. The use of compound 7 as internal standard for the detection of anabolic steroids via mass spectrometric techniques such as gas chromatography-mass spectrometry (GC-MS) is discussed.

  3. High pressure size exclusion chromatography (HPSEC) determination of dissolved organic matter molecular weight revisited: Accounting for changes in stationary phases, analytical standards, and isolation methods

    Science.gov (United States)

    McAdams, Brandon C.; Aiken, George R.; McKnight, Diane M.; Arnold, William A.; Chin, Yu-Ping

    2018-01-01

    We reassessed the molecular weight of dissolved organic matter (DOM) determined by high pressure size exclusion chromatography (HPSEC) using measurements made with different columns and various generations of polystyrenesulfonate (PSS) molecular weight standards. Molecular weight measurements made with a newer generation HPSEC column and PSS standards from more recent lots are roughly 200 to 400 Da lower than initial measurements made in the early 1990s. These updated numbers match DOM molecular weights measured by colligative methods and fall within a range of values calculated from hydroxyl radical kinetics. These changes suggest improved accuracy of HPSEC molecular weight measurements that we attribute to improved accuracy of PSS standards and changes in the column packing. We also isolated DOM from wetlands in the Prairie Pothole Region (PPR) using XAD-8, a cation exchange resin, and PPL, a styrene-divinylbenzene media, and observed little difference in molecular weight and specific UV absorbance at 280 nm (SUVA280) between the two solid phase extraction resins, suggesting they capture similar DOM moieties. PPR DOM also showed lower SUVA280 at similar weights compared to DOM isolates from a global range of environments, which we attribute to oxidized sulfur in PPR DOM that would increase molecular weight without affecting SUVA280.
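The calibration step described above, converting HPSEC elution behavior to molecular weight via polystyrenesulfonate (PSS) standards, is conventionally a log-linear fit of standard mass against retention time. A minimal sketch follows; the standard masses and retention times are illustrative placeholders, not values from this study:

```python
import numpy as np

# Hypothetical PSS calibration standards: peak molecular weight (Da) and
# HPSEC retention time (min). Illustrative numbers only.
pss_mw = np.array([18000.0, 8000.0, 4600.0, 1800.0, 210.0])
ret_time = np.array([6.1, 6.8, 7.4, 8.3, 9.9])

# HPSEC calibration is conventionally linear in log10(MW) vs retention time:
# larger molecules are excluded from more pore volume and elute earlier.
slope, intercept = np.polyfit(ret_time, np.log10(pss_mw), 1)

def mw_from_time(t):
    """Estimate molecular weight (Da) from a chromatogram peak time (min)."""
    return 10 ** (slope * t + intercept)

# A DOM sample eluting at 8.0 min maps to roughly 2 kDa on this toy curve.
sample_mw = mw_from_time(8.0)
```

A shift of a few hundred Da between standard lots, as reported above, corresponds to a small change in this fitted intercept.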

  4. High Pressure Size Exclusion Chromatography (HPSEC) Determination of Dissolved Organic Matter Molecular Weight Revisited: Accounting for Changes in Stationary Phases, Analytical Standards, and Isolation Methods.

    Science.gov (United States)

    McAdams, Brandon C; Aiken, George R; McKnight, Diane M; Arnold, William A; Chin, Yu-Ping

    2018-01-16

    We reassessed the molecular weight of dissolved organic matter (DOM) determined by high pressure size exclusion chromatography (HPSEC) using measurements made with different columns and various generations of polystyrenesulfonate (PSS) molecular weight standards. Molecular weight measurements made with a newer generation HPSEC column and PSS standards from more recent lots are roughly 200 to 400 Da lower than initial measurements made in the early 1990s. These updated numbers match DOM molecular weights measured by colligative methods and fall within a range of values calculated from hydroxyl radical kinetics. These changes suggest improved accuracy of HPSEC molecular weight measurements that we attribute to improved accuracy of PSS standards and changes in the column packing. We also isolated DOM from wetlands in the Prairie Pothole Region (PPR) using XAD-8, a cation exchange resin, and PPL, a styrene-divinylbenzene media, and observed little difference in molecular weight and specific UV absorbance at 280 nm (SUVA280) between the two solid phase extraction resins, suggesting they capture similar DOM moieties. PPR DOM also showed lower SUVA280 at similar weights compared to DOM isolates from a global range of environments, which we attribute to oxidized sulfur in PPR DOM that would increase molecular weight without affecting SUVA280.

  5. Relevant Standards

    Indian Academy of Sciences (India)

    X.86: Ethernet over LAPS. Standard in China and India. G.7041: Generic Framing Procedure (GFP). Supports Ethernet as well as other data formats (e.g., Fibre Channel); Protocol of ... IEEE 802.3x for flow control of incoming Ethernet data ...

  6. Analytics for Cyber Network Defense

    Energy Technology Data Exchange (ETDEWEB)

    Plantenga, Todd. [Sandia National Lab. (SNL-CA), Livermore, CA (United States); Kolda, Tamara Gibson [Sandia National Lab. (SNL-CA), Livermore, CA (United States)

    2011-06-01

    This report provides a brief survey of analytics tools considered relevant to cyber network defense (CND). Ideas and tools come from fields such as statistics, data mining, and knowledge discovery. Some analytics are considered standard mathematical or statistical techniques, while others reflect current research directions. In all cases the report attempts to explain the relevance to CND with brief examples.

  7. The factor analytic structure of the Roberts Apperception Test for Children: a comparison of the standardization sample with a sample of chronically ill children.

    Science.gov (United States)

    Palomares, R S; Crowley, S L; Worchel, F F; Olson, T K; Rae, W A

    1991-06-01

    A confirmatory principal component factor analysis of the Roberts Apperception Test for Children was conducted using the standardization sample and a sample of chronically ill children. An interpretation of three- and four-factor solutions identified the three-factor solution as superior to the four-factor solution as measured by chi-square goodness of fit and coefficients of convergence. A cluster analysis using Ward's minimum variance method was calculated to determine the typical profiles that best describe the chronically ill sample. Results of this analysis reveal two distinct profiles that differ primarily on the level of adaptive psychological functioning.
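The cluster-analysis step above, Ward's minimum variance method applied to profile data, can be sketched briefly. The scale profiles below are simulated with two latent levels of adaptive functioning, loosely mirroring the two-profile result; they are not the RATC standardization or chronically ill samples:

```python
import numpy as np
from scipy.cluster.hierarchy import linkage, fcluster

rng = np.random.default_rng(1)

# Synthetic profiles: 60 "children" scored on 8 scales, drawn from two
# groups that differ mainly in overall adaptive level (invented data).
high_adaptive = rng.normal(loc=60, scale=5, size=(30, 8))
low_adaptive = rng.normal(loc=40, scale=5, size=(30, 8))
profiles = np.vstack([high_adaptive, low_adaptive])

# Ward's minimum variance method: at each step, merge the pair of clusters
# whose fusion least increases total within-cluster variance.
Z = linkage(profiles, method="ward")

# Cut the dendrogram into two clusters, matching the two-profile solution.
labels = fcluster(Z, t=2, criterion="maxclust")
```

With well-separated simulated groups the two recovered clusters coincide with the generating groups; on real profile data the cut level (here two clusters) is the substantive modeling choice.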

  8. D/H and Water Concentrations of Submarine MORB Glass Around the World: Analytical Aspects, Standardization, and (re)defining Mantle D/H Ranges

    Science.gov (United States)

    Bindeman, I. N.; Dixon, J. E.; Langmuir, C. H.; Palandri, J. L.

    2015-12-01

    The advent and calibration of the Thermal Combustion Element Analyzer (TCEA) continuous-flow system coupled with the large-radius mass spectrometer MAT253 permits precise (±0.02 wt.% H2O, ±1-3‰ D/H) measurements in 1-10 mg of volcanic glass (0.1 wt.% H2O requires ~10 mg glass), which permits the targeting of small amounts of the freshest concentrate. This is a >100-fold reduction in sample size over conventional methods, and a fourfold reduction over the more common Delta series instruments. We investigated in triplicate 115 samples of submarine MORB glasses ranging from water-poor (0.1-0.2 wt%) to water-rich (1.2-1.5 wt%). These samples were previously investigated for major and trace elements and radiogenic isotopes; a large subset of these samples, coming from the FAZAR expedition, were studied previously by FTIR for water concentration. We also ran samples previously studied by the conventional off-line technique: MORB glass including those from the Easter Platform and the Alvin 526-1 standard (0.2 wt% H2O). We observe excellent 1:1 correspondence (1.02x+0.02, R2=0.94) between wt% water by FTIR and TCEA, suggesting complete extraction of water and no dependence on water concentration. We measure a 51‰ total range in D/H that correlates with all other chemical and isotopic indicators of mantle enrichment, with the heaviest values occurring in the most enriched samples. When uncorrected values of H2 gas run against H2 gas of known composition are used, this range agrees nicely with the previous D/H range for MORB (-30 to -90‰) measured for samples run conventionally. Uncorrected analysis of Alvin glass 526-1 gives -66‰. When run against SMOW, SLAP and -41‰ water sealed in silver cups, the range is shifted by -15‰; when standardization is done with three commonly used mica standards, as is done most commonly in different labs, the range is shifted downward by -30-32‰. There are no isotopic offsets related to total water or D/H range requiring a different slope or non-linear correction
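Standardization against reference waters of the kind mentioned (SMOW, SLAP) is conventionally a two-point linear rescaling of raw instrument values onto the VSMOW-SLAP delta scale. A minimal sketch, with invented raw instrument readings for the two reference waters:

```python
# Two-point normalization of deuterium values onto the VSMOW-SLAP scale.
# Accepted values: VSMOW dD = 0 permil, SLAP dD = -428.0 permil.
# Raw measured values for the two reference waters are invented numbers.
RAW_VSMOW = 2.1
RAW_SLAP = -418.0

# Stretch factor so the two anchors land exactly on their accepted values.
SCALE = (0.0 - (-428.0)) / (RAW_VSMOW - RAW_SLAP)

def normalize(raw_dd):
    """Map a raw instrument dD value (permil) onto the VSMOW-SLAP scale."""
    return (raw_dd - RAW_VSMOW) * SCALE
```

A constant offset in the anchors shifts every sample by the same amount, which is how the whole MORB range can move by -15‰ or -30‰ under different standardization choices while its internal spread stays the same.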

  9. Standardized processing of MALDI imaging raw data for enhancement of weak analyte signals in mouse models of gastric cancer and Alzheimer's disease.

    Science.gov (United States)

    Schwartz, Matthias; Meyer, Björn; Wirnitzer, Bernhard; Hopf, Carsten

    2015-03-01

    Conventional mass spectrometry image preprocessing methods used for denoising, such as the Savitzky-Golay smoothing or discrete wavelet transformation, typically do not only remove noise but also weak signals. Recently, memory-efficient principal component analysis (PCA) in conjunction with random projections (RP) has been proposed for reversible compression and analysis of large mass spectrometry imaging datasets. It considers single-pixel spectra in their local context and consequently offers the prospect of using information from the spectra of adjacent pixels for denoising or signal enhancement. However, little systematic analysis of key RP-PCA parameters has been reported so far, and the utility and validity of this method for context-dependent enhancement of known medically or pharmacologically relevant weak analyte signals in linear-mode matrix-assisted laser desorption/ionization (MALDI) mass spectra has not been explored yet. Here, we investigate MALDI imaging datasets from mouse models of Alzheimer's disease and gastric cancer to systematically assess the importance of selecting the right number of random projections k and of principal components (PCs) L for reconstructing reproducibly denoised images after compression. We provide detailed quantitative data for comparison of RP-PCA-denoising with the Savitzky-Golay and wavelet-based denoising in these mouse models as a resource for the mass spectrometry imaging community. Most importantly, we demonstrate that RP-PCA preprocessing can enhance signals of low-intensity amyloid-β peptide isoforms such as Aβ1-26 even in sparsely distributed Alzheimer's β-amyloid plaques and that it enables enhanced imaging of multiply acetylated histone H4 isoforms in response to pharmacological histone deacetylase inhibition in vivo. We conclude that RP-PCA denoising may be a useful preprocessing step in biomarker discovery workflows.
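The RP-PCA idea described above (compress single-pixel spectra with random projections, keep the top L principal components, then map back) can be sketched with NumPy. The dataset, k, and L below are toy values chosen for illustration, not the paper's settings:

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy "imaging" dataset: n_pixels spectra of n_bins m/z channels, with one
# weak analyte peak at bin 50 buried in noise (invented data).
n_pixels, n_bins = 200, 400
signal = np.zeros(n_bins)
signal[50] = 0.3
X = signal + rng.normal(scale=0.5, size=(n_pixels, n_bins))

# Step 1: random projection (RP) to k << n_bins dimensions for
# memory-efficient compression of the per-pixel spectra.
k = 60
R = rng.normal(size=(n_bins, k)) / np.sqrt(k)
Xp = X @ R

# Step 2: PCA on the projected spectra; keep the top L components (L is the
# key parameter the study tunes for reproducible denoising).
L = 5
mean_p = Xp.mean(axis=0)
U, s, Vt = np.linalg.svd(Xp - mean_p, full_matrices=False)
low_rank = mean_p + (U[:, :L] * s[:L]) @ Vt[:L]

# Step 3: map the low-rank approximation back to m/z space via the
# pseudo-inverse of the projection matrix.
X_denoised = low_rank @ np.linalg.pinv(R)

# Denoising shrinks the error against the clean underlying signal.
raw_err = np.mean((X - signal) ** 2)
den_err = np.mean((X_denoised - signal) ** 2)
```

Because the low-rank step discards most of the noise energy while the structured signal is concentrated in the leading components, `den_err` comes out well below `raw_err`; choosing k and L too small, however, also attenuates weak peaks, which is why the paper analyzes these parameters systematically.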

  10. Effectiveness of blood flow restricted exercise compared with standard exercise in patients with recurrent low back pain: study protocol for a randomized controlled trial.

    Science.gov (United States)

    Amano, Shinichi; Ludin, Arimi Fitri Mat; Clift, Rachel; Nakazawa, Masato; Law, Timothy D; Rush, Laura J; Manini, Todd M; Thomas, James S; Russ, David W; Clark, Brian C

    2016-02-12

    Low back pain is a highly prevalent condition in the United States and has a staggeringly negative impact on society in terms of expenses and disability. It has previously been suggested that rehabilitation strategies for persons with recurrent low back pain should be directed to the medial back muscles as these muscles provide functional support of the lumbar region. However, many individuals with low back pain cannot safely and effectively induce trunk muscle adaptation using traditional high-load resistance exercise, and no viable low-load protocols to induce trunk extensor muscle adaptation exist. Herein, we present the study protocol for a randomized controlled trial that will investigate the "cross-transfer" of effects of a novel exercise modality, blood flow restricted exercise, on cross-sectional area (primary outcome), strength and endurance (secondary outcomes) of trunk extensor muscles, as well as the pain, disability, and rate of recurrence of low back pain (tertiary outcomes). This is a single-blinded, single-site, randomized controlled trial. A minimum of 32 (and up to 40) subjects aged 18 to 50 years with recurrent low back pain and poor trunk extensor muscle endurance will be recruited, enrolled and randomized. After completion of baseline assessments, participants will be randomized in a 1:1 ratio to receive a 10-week resistance exercise training program with blood flow restriction (BFR exercise group) or without blood flow restriction (control exercise group). Repeat assessments will be taken immediately post intervention and at 12 weeks after the completion of the exercise program. Furthermore, once every 4 weeks during a 36-week follow-up period, participants will be asked to rate their perceived disability and back pain over the past 14 days. 
This study will examine the potential for blood flow restricted exercise applied to appendicular muscles to result in a "cross-transfer" of therapeutic effect to the lumbar musculature in individuals with

  11. The UK Paediatric Ocular Trauma Study 1 (POTS1): development of a global standardized protocol for prospective data collection in pediatric ocular trauma.

    Science.gov (United States)

    Sii, Freda; Barry, Robert J; Blanch, Richard J; Abbott, Joseph; MacEwen, Caroline J; Shah, Peter

    2017-01-01

    Ocular trauma is an important cause of visual morbidity in children worldwide. Pediatric ocular trauma accounts for up to one third of all ocular trauma admissions, with significant economic implications for health care providers. It is estimated that 90% of all ocular trauma is preventable. Development of strategies to reduce the incidence and severity of pediatric ocular trauma requires an understanding of the epidemiology of these injuries and their characteristics. This will enable appropriate targeting of resources toward prevention and allow effective service planning. At present, there is no standardized methodology for the collection of global cross-sectional data in pediatric ocular trauma, and the ability to undertake detailed epidemiological and health-economic analyses is limited. Furthermore, it is difficult to draw international comparisons in incidence, etiology, and outcomes of pediatric ocular trauma due to the range of published reporting criteria. This study describes two novel questionnaires for standardized data collection in pediatric ocular trauma, which can be adopted across a range of health care settings internationally. Two standardized data collection questionnaires have been developed from previously reported templates. The first enables collection of demographic and incident data on serious pediatric ocular trauma requiring hospitalization, and the second enables follow-up outcome data collection. Both the questionnaires are designed to collect primarily categorical data in order to increase ease of completion and facilitate quantitative analysis. These questionnaires enable acquisition of standardized data on the incidence, etiology, and outcomes of pediatric ocular trauma. These questionnaires enable collection of standardized data and are designed for global use across all health care settings. Through prospective data collection, epidemiological trends can be determined, allowing health care providers to develop collaborative

  12. Parallel synthesis: a new approach for developing analytical internal standards. Application to the analysis of patulin by gas chromatography-mass spectrometry.

    Science.gov (United States)

    Llovera, Montserrat; Balcells, Mercè; Torres, Mercè; Canela, Ramon

    2005-08-24

    The polymer-assisted reaction of 4-(hydroxymethyl)furan-2(5H)-one (4HM2F) with 21 carboxylic acids using polystyrene-carbodiimide (PS-carbodiimide) yielded an ester library. Four of the esters, (5-oxo-2,5-dihydrofuran-3-yl)methyl acetate (IS-1), (5-oxo-2,5-dihydrofuran-3-yl)methyl butyrate (IS-2), (5-oxo-2,5-dihydrofuran-3-yl)methyl 2-methylpropanoate (IS-3), and (5-oxo-2,5-dihydrofuran-3-yl)methyl chloroacetate (IS-4), were tested as internal standards for the quantification of patulin in apple juice by gas chromatography-mass spectrometry in the selected ion monitoring mode (GC-MS-SIM). The developed method combines an AOAC official extractive step and a GC-MS-SIM analysis. Using a chromatographic column containing trifluoropropylmethylpolysiloxane as the stationary phase and IS-1 as the internal standard, it was possible to perform an accurate and precise quantification of underivatized patulin in apple juice at concentrations down to 6 microg/L. A detection limit of 1 microg/L was established.
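Internal-standard quantification of the kind described (patulin against IS-1 by GC-MS-SIM) reduces to fitting the analyte/internal-standard peak-area ratio against known spiked concentrations, which cancels run-to-run injection and recovery variation. The areas below are invented numbers for illustration, not data from the study:

```python
import numpy as np

# Hypothetical calibration: spiked patulin levels (microg/L) and measured
# analyte / internal-standard peak-area ratios (invented numbers).
conc = np.array([6.0, 25.0, 50.0, 100.0, 200.0])
area_ratio = np.array([0.031, 0.128, 0.253, 0.510, 1.015])

# Linear calibration of the area ratio against concentration.
slope, intercept = np.polyfit(conc, area_ratio, 1)

def patulin_conc(ratio):
    """Back-calculate patulin (microg/L) from an analyte/IS area ratio."""
    return (ratio - intercept) / slope

# A juice sample giving an area ratio of 0.20 falls near 40 microg/L here.
sample = patulin_conc(0.20)
```

Because the internal standard co-extracts and co-elutes similarly to patulin, the ratio-based curve stays valid even when absolute peak areas drift between injections.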

  13. Decontamination systems information and research program -- Literature review in support of development of standard test protocols and barrier design models for in situ formed barriers project

    International Nuclear Information System (INIS)

    1994-12-01

    The US Department of Energy is responsible for approximately 3,000 sites in which contaminants such as carbon tetrachloride, trichloroethylene, perchloroethylene, non-volatile and soluble organics, and insoluble organics (PCBs and pesticides) are encountered. In specific areas of these sites radioactive contaminants are stored in underground storage tanks which were originally designed and constructed with a 30-year projected life. Many of these tanks are now 10 years beyond the design life and failures have occurred, allowing the basic liquids (pH of 8 to 9) to leak into the unconsolidated soils below. Nearly one half of the storage tanks located at the Hanford Washington Reservation are suspected of leaking and contaminating the soils beneath them. The Hanford site is located in a semi-arid climate region with rainfall of less than 6 inches annually, and studies have indicated that very little of this water finds its way to the groundwater to move the water down gradient toward the Columbia River. This provides the government with time to develop a barrier system to prevent further contamination of the groundwater, and to develop and test remediation systems to stabilize or remove the contaminant materials. In parallel to remediation efforts, confinement and containment technologies are needed to retard or prevent the advancement of contamination plumes through the environment until remediation technology efforts are completed. This project examines the various confinement and containment technologies and protocols for testing the materials in relation to their function in situ.

  14. A new improved protocol for in vitro intratubular dentinal bacterial contamination for antimicrobial endodontic tests: standardization and validation by confocal laser scanning microscopy

    Directory of Open Access Journals (Sweden)

    Flaviana Bombarda de ANDRADE

    2015-01-01

    Objectives To compare three methods of intratubular contamination that simulate endodontic infections using confocal laser scanning microscopy (CLSM). Material and Methods Two pre-existing models of dentinal contamination were used to induce intratubular infection (groups A and B). These methods were modified in an attempt to improve the model (group C). The modifications included: specimen contamination for five days, an ultrasonic bath with BHI broth after specimen sterilization, use of E. faecalis during the exponential growth phase, a greater concentration of inoculum, and two cycles of centrifugation on alternate days with changes of culture media. All specimens were longitudinally sectioned and stained with LIVE/DEAD® for 20 min. Specimens were assessed using CLSM, which provided images of the depth of viable bacterial proliferation inside the dentinal tubules. Additionally, three examiners used scores to classify the CLSM images according to the following parameters: homogeneity, density, and depth of the bacterial contamination inside the dentinal tubules. Kruskal-Wallis and Dunn's tests were used to evaluate the live and dead cell rates and the scores obtained. Results The contamination scores revealed higher contamination levels in group C when compared with groups A and B (p<0.05). The volume of live cells in group C was higher than in groups A and B (p<0.05). Conclusion The new protocol for intratubular infection resulted in high and uniform patterns of bacterial contamination and higher cell viability in all specimens when compared with the current methods.

  15. A new improved protocol for in vitro intratubular dentinal bacterial contamination for antimicrobial endodontic tests: standardization and validation by confocal laser scanning microscopy.

    Science.gov (United States)

    Andrade, Flaviana Bombarda de; Arias, Marcela Paola Castro; Maliza, Amanda Garcia Alves; Duarte, Marco Antonio Hungaro; Graeff, Márcia Sirlene Zardin; Amoroso-Silva, Pablo Andrés; Midena, Raquel Zanin; Moraes, Ivaldo Gomes de

    2015-01-01

    To compare three methods of intratubular contamination that simulate endodontic infections using confocal laser scanning microscopy (CLSM). Two pre-existing models of dentinal contamination were used to induce intratubular infection (groups A and B). These methods were modified in an attempt to improve the model (group C). The modifications included: specimen contamination for five days, an ultrasonic bath with BHI broth after specimen sterilization, use of E. faecalis during the exponential growth phase, a greater concentration of inoculum, and two cycles of centrifugation on alternate days with changes of culture media. All specimens were longitudinally sectioned and stained with LIVE/DEAD for 20 min. Specimens were assessed using CLSM, which provided images of the depth of viable bacterial proliferation inside the dentinal tubules. Additionally, three examiners used scores to classify the CLSM images according to the following parameters: homogeneity, density, and depth of the bacterial contamination inside the dentinal tubules. Kruskal-Wallis and Dunn's tests were used to evaluate the live and dead cell rates and the scores obtained. The contamination scores revealed higher contamination levels in group C when compared with groups A and B (p<0.05). The volume of live cells in group C was higher than in groups A and B (p<0.05). The new protocol for intratubular infection resulted in high and uniform patterns of bacterial contamination and higher cell viability in all specimens when compared with the current methods.

  16. Decontamination systems information and research program -- Literature review in support of development of standard test protocols and barrier design models for in situ formed barriers project

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    1994-12-01

    The US Department of Energy is responsible for approximately 3,000 sites in which contaminants such as carbon tetrachloride, trichloroethylene, perchloroethylene, non-volatile and soluble organics, and insoluble organics (PCBs and pesticides) are encountered. In specific areas of these sites radioactive contaminants are stored in underground storage tanks which were originally designed and constructed with a 30-year projected life. Many of these tanks are now 10 years beyond the design life and failures have occurred, allowing the basic liquids (pH of 8 to 9) to leak into the unconsolidated soils below. Nearly one half of the storage tanks located at the Hanford Washington Reservation are suspected of leaking and contaminating the soils beneath them. The Hanford site is located in a semi-arid climate region with rainfall of less than 6 inches annually, and studies have indicated that very little of this water finds its way to the groundwater to move the water down gradient toward the Columbia River. This provides the government with time to develop a barrier system to prevent further contamination of the groundwater, and to develop and test remediation systems to stabilize or remove the contaminant materials. In parallel to remediation efforts, confinement and containment technologies are needed to retard or prevent the advancement of contamination plumes through the environment until remediation technology efforts are completed. This project examines the various confinement and containment technologies and protocols for testing the materials in relation to their function in situ.

  17. Lee Silverman voice treatment versus standard NHS speech and language therapy versus control in Parkinson's disease (PD COMM pilot): study protocol for a randomized controlled trial.

    Science.gov (United States)

    Sackley, Catherine M; Smith, Christina H; Rick, Caroline; Brady, Marian C; Ives, Natalie; Patel, Ramilla; Roberts, Helen; Dowling, Francis; Jowett, Sue; Wheatley, Keith; Patel, Smitaa; Kelly, Debbie; Sands, Gina; Clarke, Carl

    2014-06-07

    Parkinson's disease is a common movement disorder affecting approximately 127,000 people in the UK, with an estimated two thirds having speech-related problems. Currently there is no preferred approach to speech and language therapy within the NHS and there is little evidence for the effectiveness of standard NHS therapy or Lee Silverman voice treatment. This trial aims to investigate the feasibility and acceptability of randomizing people with Parkinson's disease-related speech or voice problems to Lee Silverman voice treatment or standard speech and language therapy compared to a no-intervention control. The PD COMM pilot is a three-arm, assessor-blinded, randomized controlled trial. Randomization will be computer-generated with participants randomized at a ratio of 1:1:1. Participants randomized to intervention arms will be immediately referred to the appropriate speech and language therapist. The target population are patients with a confirmed diagnosis of idiopathic Parkinson's disease who have problems with their speech or voice. The Lee Silverman voice treatment intervention group will receive the standard regime of 16 sessions between 50 and 60 minutes in length over four weeks, with extra home practice. The standard speech and language therapy intervention group will receive a dose determined by patients' individual needs, but not exceeding eight weeks of treatment. The control group will receive standard care with no speech and language therapy input for at least six months post-randomization. Outcomes will be assessed at baseline (pre-randomization) and post-randomization at three, six, and 12 months. The outcome measures include patient-reported voice measures, quality of life, resource use, and assessor-rated speech recordings. The recruitment aim is at least 60 participants over 21 months from 11 sites, equating to at least 20 participants in each arm of the trial. This trial is ongoing and recruitment commenced in May 2012. This study will
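Computer-generated 1:1:1 randomization of the kind described is commonly implemented as permuted-block allocation, which keeps the three arms balanced throughout recruitment. A hypothetical sketch, not the PD COMM trial's actual generator; the block size and seed are arbitrary illustration choices:

```python
import random

def permuted_block_randomization(n_participants, arms, block_size, seed=42):
    """Generate an allocation list with an equal ratio across arms.

    Within each block every arm appears equally often, so the arms stay
    balanced after every completed block of recruitment.
    """
    assert block_size % len(arms) == 0, "block size must be a multiple of arms"
    rng = random.Random(seed)
    allocation = []
    while len(allocation) < n_participants:
        block = arms * (block_size // len(arms))
        rng.shuffle(block)          # random order within the block
        allocation.extend(block)
    return allocation[:n_participants]

arms = ["LSVT", "standard SLT", "control"]
alloc = permuted_block_randomization(60, arms, block_size=6)
```

In practice such a list is generated centrally and concealed from recruiting sites, and assessors remain blinded to the allocation, matching the assessor-blinded design above.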

  18. Considerations for the design and execution of protocols for animal research and treatment to improve reproducibility and standardization: "DEPART well-prepared and ARRIVE safely".

    Science.gov (United States)

    Smith, M M; Clarke, E C; Little, C B

    2017-03-01

    To review the factors in experimental design that contribute to poor translation of pre-clinical research to therapies for patients with osteoarthritis (OA) and how this might be improved. Narrative review of the literature, and evaluation of the different stages of design, conduct and analysis of studies using animal models of OA, to define specific issues that might reduce the quality of evidence and how this can be minimised. Preventing bias and improving experimental rigour and reporting are important modifiable factors to improve translation from pre-clinical animal models to successful clinical trials of therapeutic agents. Despite publication and adoption by many journals of guidelines such as Animals in Research: Reporting In Vivo Experiments (ARRIVE), experimental animal studies published in leading rheumatology journals are still deficient in their reporting. In part, this may be because researchers first consult these guidelines after the completion of experiments, at the time of publication. This review discusses factors that can (1) bias the outcome of experimental studies using animal models of osteoarthritis or (2) alter the quality of evidence for translation. We propose a checklist to consult prior to starting experiments: the Design and Execution of Protocols for Animal Research and Treatment (DEPART). Following DEPART during the design phase will enable completion of the ARRIVE checklist at the time of publication, and thus improve the quality of evidence for inclusion of experimental animal research in meta-analyses and systematic reviews: "DEPART well-prepared and ARRIVE safely". Copyright © 2016 Osteoarthritis Research Society International. Published by Elsevier Ltd. All rights reserved.

  19. The AQUA-FONTIS study: protocol of a multidisciplinary, cross-sectional and prospective longitudinal study for developing standardized diagnostics and classification of non-thyroidal illness syndrome

    Directory of Open Access Journals (Sweden)

    Klein Harald H

    2008-10-01

    Abstract Background Non-thyroidal illness syndrome (NTIS) is a characteristic functional constellation of thyrotropic feedback control that frequently occurs in critically ill patients. Although this condition is associated with significantly increased morbidity and mortality, there is still controversy on whether NTIS is caused by artefacts, is a form of beneficial adaptation, or is a disorder requiring treatment. Trials investigating substitution therapy of NTIS revealed contradictory results. The comparison of heterogeneous patient cohorts may be the cause for those inconsistencies. Objectives Primary objective of this study is the identification and differentiation of different functional states of thyrotropic feedback control in order to define relevant evaluation criteria for the prognosis of affected patients. Furthermore, we intend to assess the significance of an innovative physiological index approach (SPINA) in differential diagnosis between NTIS and latent (so-called "sub-clinical") thyrotoxicosis. Secondary objective is observation of variables that quantify distinct components of NTIS in the context of independent predictors of evolution, survival or pathophysiological condition and influencing or disturbing factors like medication. Design The approach to a quantitative follow-up of non-thyroidal illness syndrome (AQUA FONTIS) study is designed as both a cross-sectional and prospective longitudinal observation trial in critically ill patients. Patients are observed in at least two evaluation points with consecutive assessments of thyroid status, physiological and clinical data in additional weekly observations up to discharge. A second part of the study investigates the neuropsychological impact of NTIS and medium-term outcomes. The study design incorporates a two-module structure that covers a reduced protocol in form of an observation trial before patients give informed consent. Additional investigations are performed if and after

  20. Computed tomography of the cervical spine: comparison of image quality between a standard-dose and a low-dose protocol using filtered back-projection and iterative reconstruction

    Energy Technology Data Exchange (ETDEWEB)

    Becce, Fabio [University of Lausanne, Department of Diagnostic and Interventional Radiology, Centre Hospitalier Universitaire Vaudois, Lausanne (Switzerland); Universite Catholique Louvain, Department of Radiology, Cliniques Universitaires Saint-Luc, Brussels (Belgium); Ben Salah, Yosr; Berg, Bruno C. vande; Lecouvet, Frederic E.; Omoumi, Patrick [Universite Catholique Louvain, Department of Radiology, Cliniques Universitaires Saint-Luc, Brussels (Belgium); Verdun, Francis R. [University of Lausanne, Institute of Radiation Physics, Centre Hospitalier Universitaire Vaudois, Lausanne (Switzerland); Meuli, Reto [University of Lausanne, Department of Diagnostic and Interventional Radiology, Centre Hospitalier Universitaire Vaudois, Lausanne (Switzerland)

    2013-07-15

    To compare image quality of a standard-dose (SD) and a low-dose (LD) cervical spine CT protocol using filtered back-projection (FBP) and iterative reconstruction (IR). Forty patients investigated by cervical spine CT were prospectively randomised into two groups: SD (120 kVp, 275 mAs) and LD (120 kVp, 150 mAs), both applying automatic tube current modulation. Data were reconstructed using both FBP and sinogram-affirmed IR. Image noise, signal-to-noise (SNR) and contrast-to-noise (CNR) ratios were measured. Two radiologists independently and blindly assessed the following anatomical structures at C3-C4 and C6-C7 levels, using a four-point scale: intervertebral disc, content of neural foramina and dural sac, ligaments, soft tissues and vertebrae. They subsequently rated overall image quality using a ten-point scale. For both protocols and at each disc level, IR significantly decreased image noise and increased SNR and CNR, compared with FBP. SNR and CNR were statistically equivalent in LD-IR and SD-FBP protocols. Regardless of the dose and disc level, the qualitative scores with IR compared with FBP, and with LD-IR compared with SD-FBP, were significantly higher or not statistically different for intervertebral discs, neural foramina and ligaments, while significantly lower or not statistically different for soft tissues and vertebrae. The overall image quality scores were significantly higher with IR compared with FBP, and with LD-IR compared with SD-FBP. LD-IR cervical spine CT provides better image quality for intervertebral discs, neural foramina and ligaments, and worse image quality for soft tissues and vertebrae, compared with SD-FBP, while reducing radiation dose by approximately 40 %. (orig.)
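
    The image-quality metrics above rely on the usual ROI-based definitions of noise, SNR and CNR; the abstract does not spell the formulas out, so the following is a minimal sketch under that standard assumption (SNR = mean attenuation / noise SD; CNR = absolute difference of mean attenuations / noise SD), with hypothetical ROI pixel values:

```python
import numpy as np

def snr(roi):
    """Signal-to-noise ratio of a region of interest: mean value divided by
    its standard deviation (the image noise estimate)."""
    roi = np.asarray(roi, dtype=float)
    return roi.mean() / roi.std(ddof=1)

def cnr(roi_a, roi_b, noise_roi):
    """Contrast-to-noise ratio between two tissues, using the SD of a
    homogeneous background region as the noise estimate."""
    a = np.asarray(roi_a, dtype=float)
    b = np.asarray(roi_b, dtype=float)
    n = np.asarray(noise_roi, dtype=float)
    return abs(a.mean() - b.mean()) / n.std(ddof=1)
```

    Under these definitions, the reported result (IR lowering noise and thereby raising SNR and CNR relative to FBP at the same dose) follows directly: the numerators are unchanged while the denominator shrinks.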

  1. Croatian Analytical Terminology

    Directory of Open Access Journals (Sweden)

    Kastelan-Macan; M.

    2008-04-01

    oljić, I. Eškinja, M. Kaštelan-Macan, I. Piljac, Š. Cerjan-Stefanović and others translated the Chromatographic nomenclature (IUPAC Compendium of Analytical Nomenclature). The related area is covered by the books of V. Grdinić and F. Plavšić. During the project Croatian nomenclature of analytical chemistry, dictionaries, textbooks, handbooks, professional and scientific monographs and articles, official governmental and economic publications, regulations and instructions shall be analysed. The Compendium of Analytical Nomenclature is expected to have been translated and the translation mostly adjusted to the Croatian language standard. EUROLAB and EURACHEM documents related to quality assurance in analytical laboratories, especially in research and development, have not yet been included in the Compendium, and due to the globalization of the information and service market, such documents need to be adjusted to the Croatian language standard in collaboration with consultants from the Institute for Croatian Language and Linguistics. The terms shall be sorted according to the analytical process, from sampling to final information. It is expected that the project's results shall be adopted by the Croatian scientific and professional community, so as to raise awareness of the necessity of using Croatian terms in everyday professional communication and particularly in scientific and educational work. The Croatian language is rich enough for all analytical terms to be translated appropriately. This shall complete the work our predecessors began several times. We face a great challenge of contributing to the creation of Croatian scientific terminology and believe we shall succeed.

  2. In silico toxicology protocols.

    Science.gov (United States)

    Myatt, Glenn J; Ahlberg, Ernst; Akahori, Yumi; Allen, David; Amberg, Alexander; Anger, Lennart T; Aptula, Aynur; Auerbach, Scott; Beilke, Lisa; Bellion, Phillip; Benigni, Romualdo; Bercu, Joel; Booth, Ewan D; Bower, Dave; Brigo, Alessandro; Burden, Natalie; Cammerer, Zoryana; Cronin, Mark T D; Cross, Kevin P; Custer, Laura; Dettwiler, Magdalena; Dobo, Krista; Ford, Kevin A; Fortin, Marie C; Gad-McDonald, Samantha E; Gellatly, Nichola; Gervais, Véronique; Glover, Kyle P; Glowienke, Susanne; Van Gompel, Jacky; Gutsell, Steve; Hardy, Barry; Harvey, James S; Hillegass, Jedd; Honma, Masamitsu; Hsieh, Jui-Hua; Hsu, Chia-Wen; Hughes, Kathy; Johnson, Candice; Jolly, Robert; Jones, David; Kemper, Ray; Kenyon, Michelle O; Kim, Marlene T; Kruhlak, Naomi L; Kulkarni, Sunil A; Kümmerer, Klaus; Leavitt, Penny; Majer, Bernhard; Masten, Scott; Miller, Scott; Moser, Janet; Mumtaz, Moiz; Muster, Wolfgang; Neilson, Louise; Oprea, Tudor I; Patlewicz, Grace; Paulino, Alexandre; Lo Piparo, Elena; Powley, Mark; Quigley, Donald P; Reddy, M Vijayaraj; Richarz, Andrea-Nicole; Ruiz, Patricia; Schilter, Benoit; Serafimova, Rositsa; Simpson, Wendy; Stavitskaya, Lidiya; Stidl, Reinhard; Suarez-Rodriguez, Diana; Szabo, David T; Teasdale, Andrew; Trejo-Martin, Alejandra; Valentin, Jean-Pierre; Vuorinen, Anna; Wall, Brian A; Watts, Pete; White, Angela T; Wichard, Joerg; Witt, Kristine L; Woolley, Adam; Woolley, David; Zwickl, Craig; Hasselgren, Catrin

    2018-04-17

    The present publication surveys several applications of in silico (i.e., computational) toxicology approaches across different industries and institutions. It highlights the need to develop standardized protocols when conducting toxicity-related predictions. This contribution articulates the information needed for protocols to support in silico predictions for major toxicological endpoints of concern (e.g., genetic toxicity, carcinogenicity, acute toxicity, reproductive toxicity, developmental toxicity) across several industries and regulatory bodies. Such novel in silico toxicology (IST) protocols, when fully developed and implemented, will ensure that in silico toxicological assessments are performed and evaluated in a consistent, reproducible, and well-documented manner across industries and regulatory bodies, to support wider uptake and acceptance of the approaches. The development of IST protocols is an initiative of an international consortium and reflects the state of the art in in silico toxicology for hazard identification and characterization. A general outline for describing the development of such protocols is included; it is based on in silico predictions and/or available experimental data for a defined series of relevant toxicological effects or mechanisms. The publication presents a novel approach for determining the reliability of in silico predictions alongside experimental data. In addition, we discuss how to determine the level of confidence in the assessment based on the relevance and reliability of the information. Copyright © 2018. Published by Elsevier Inc.

  3. Solving signal instability to maintain the second-order advantage in the resolution and determination of multi-analytes in complex systems by modeling liquid chromatography-mass spectrometry data using alternating trilinear decomposition method assisted with piecewise direct standardization.

    Science.gov (United States)

    Gu, Hui-Wen; Wu, Hai-Long; Yin, Xiao-Li; Li, Shan-Shan; Liu, Ya-Juan; Xia, Hui; Xie, Li-Xia; Yu, Ru-Qin; Yang, Peng-Yuan; Lu, Hao-Jie

    2015-08-14

    The application of calibration transfer methods has been successful in combination with near-infrared spectroscopy and other tools for the prediction of chemical composition. One of the developed methods that can provide accurate performance is piecewise direct standardization (PDS), which in this paper is applied for the first time to transfer, from one day to another, a second-order calibration model based on the alternating trilinear decomposition (ATLD) method, built for the interference-free resolution and determination of multiple analytes in complex systems by liquid chromatography-mass spectrometry (LC-MS) in full scan mode. This is an example of LC-MS analysis in which interferences have been found, making second-order calibration necessary because of its capacity for modeling this phenomenon, which implies that analytes of interest can be resolved and quantified even in the presence of overlapped peaks and unknown interferences. Once the second-order calibration model based on the ATLD method was built, the calibration transfer was conducted to compensate for the signal instability of the LC-MS instrument over time. This reduces the heavy workload of the complete recalibration that would otherwise be necessary for later accurate determinations. The root-mean-square error of prediction (RMSEP) and average recovery were used to evaluate the performance of the proposed strategy. Results showed that, by using the PDS method, the number of calibration samples needed for the real LC-MS data was reduced from 11 to 3, while producing comparable RMSEP values and recovery values that were statistically the same (F-test, 95% confidence level) as those obtained with 11 calibration samples. This methodology is in accordance with the highly recommended green analytical chemistry principles, since it reduces the experimental effort and cost compared with building a new calibration model under the modified conditions. Copyright © 2015 Elsevier B.V. All rights reserved.
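
    Neither the ATLD model nor the authors' exact PDS implementation is given in the abstract; the sketch below illustrates only the core PDS idea, that each channel of the master (day-1) response is predicted from a small window of slave (day-2) channels by least squares, together with the RMSEP figure of merit. All names and the window size are illustrative assumptions:

```python
import numpy as np

def pds_transform(master, slave, window=5):
    """Fit a piecewise direct standardization (PDS) transfer matrix F such
    that master ≈ slave @ F: each master channel j is regressed (least
    squares) on a small window of neighbouring slave channels."""
    n_samples, n_chan = master.shape
    F = np.zeros((n_chan, n_chan))
    half = window // 2
    for j in range(n_chan):
        lo, hi = max(0, j - half), min(n_chan, j + half + 1)
        X = slave[:, lo:hi]                              # local slave window
        b, *_ = np.linalg.lstsq(X, master[:, j], rcond=None)
        F[lo:hi, j] = b                                  # sparse banded column
    return F

def rmsep(y_true, y_pred):
    """Root-mean-square error of prediction."""
    y_true, y_pred = np.asarray(y_true, float), np.asarray(y_pred, float)
    return float(np.sqrt(np.mean((y_true - y_pred) ** 2)))
```

    Applying `new_spectra @ F` maps day-2 responses back into the day-1 instrument space, so the original calibration model can be reused with only a handful of transfer standards instead of a full recalibration.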

  4. Analytical chemistry

    International Nuclear Information System (INIS)

    Choi, Jae Seong

    1993-02-01

    This book comprises nineteen chapters, describing: an introduction to analytical chemistry; experimental error and statistics; chemical equilibrium and solubility; gravimetric analysis, with the mechanism of precipitation; range and calculation of results; general principles of volumetric analysis; sedimentation methods, their types and titration curves; acid-base equilibrium; acid-base titration curves; complexation and firing reactions; an introduction to electroanalytical chemistry; electrodes and potentiometry; electrolysis and conductometry; voltammetry and polarographic spectrophotometry; atomic spectrometry; solvent extraction; and chromatography, with experiments.

  5. Meropenem vs standard of care for treatment of late onset sepsis in children of less than 90 days of age: study protocol for a randomised controlled trial

    Directory of Open Access Journals (Sweden)

    de Cabre Vincent

    2011-09-01

    Abstract Background Late onset neonatal sepsis (LOS), with a mortality of 17 to 27%, is still a serious disease. Meropenem is an antibiotic with wide antibacterial coverage. Its advantage over the standard of care could be this wider coverage and thus the use of mono- instead of combination therapy. Methods NeoMero-1, an open-label, randomised, comparator-controlled, superiority trial, aims to compare the efficacy of meropenem with a predefined standard of care (ampicillin + gentamicin or cefotaxime + gentamicin) in the treatment of LOS in neonates and infants aged less than 90 days admitted to a neonatal intensive care unit. A total of 550 subjects will be recruited following a 1:1 randomisation scheme. The trial includes patients with culture-confirmed LOS (at least one positive culture from a normally sterile site, except coagulase-negative staphylococci, in addition to one clinical or laboratory criterion) or clinical sepsis (at least two laboratory and two clinical criteria suggestive of LOS) in subjects with postmenstrual age The study will start recruitment in September 2011; the total duration is 24 months. Trial registration EudraCT 2011-001515-31

  6. Pre-analytical and post-analytical evaluation in the era of molecular diagnosis of sexually transmitted diseases: cellularity control and internal control

    Directory of Open Access Journals (Sweden)

    Loria Bianchi

    2014-06-01

    Background. The increasing number of molecular tests performed on DNA extracted from various biological materials should not come without adequate standardization of the pre-analytical and post-analytical phases. Materials and Methods. The aim of this study was to evaluate the role of an internal control (IC) in standardizing the pre-analytical phase, the role of a cellularity control (CC) in evaluating the suitability of biological matrices, and their influence on false negative results. 120 cervical swabs (CS) were pre-treated and extracted following 3 different protocols. Extraction performance was evaluated by amplification of: the IC, added to each extraction mix; the human gene HPRT1 (CC) with RT-PCR, to quantify sample cellularity; and the L1 region of HPV with SPF10 primers. 135 urine samples, 135 urethral swabs, 553 CS and 332 ThinPrep swabs (TP) were tested for C. trachomatis (CT) and U. parvum (UP) with RT-PCR, and for HPV by endpoint PCR. Samples were also tested for cellularity. Results. The extraction protocol with the highest average cellularity per sample (Ac/sample) showed the lowest number of samples with inhibitors; the highest HPV positivity was achieved by the protocol with the greatest Ac/PCR. CS and TP under 300,000 cells/sample showed a significant decrease of UP (P<0.01) and HPV (P<0.005) positivity. Female urine under 40,000 cells/mL was inadequate to detect UP (P<0.05). Conclusions. Our data show that IC and CC allow optimization of the pre-analytical phase, with an increase in analytical quality. Cellularity per sample allows better evaluation of sample adequacy, crucial to avoid false negative results, while cellularity per PCR allows better optimization of PCR amplification. Further data are required to define the optimal cut-off for result normalization.
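
    The adequacy cut-offs reported above (300,000 cells/sample for cervical and ThinPrep swabs, 40,000 cells/mL for female urine) can be turned into a simple pre-analytical screening step. The helper below is hypothetical, not part of the study's protocol; only the thresholds come from the abstract:

```python
# Cut-offs taken from the abstract: swab/ThinPrep samples below
# 300,000 cells/sample and female urine below 40,000 cells/mL showed
# significantly reduced positivity, i.e. a risk of false negatives.
SWAB_MIN_CELLS_PER_SAMPLE = 300_000
URINE_MIN_CELLS_PER_ML = 40_000

def is_adequate(sample_type: str, cellularity: float) -> bool:
    """Return True if the HPRT1-based cellularity estimate meets the
    adequacy cut-off for the given matrix (hypothetical helper)."""
    if sample_type in ("cervical_swab", "thinprep"):
        return cellularity >= SWAB_MIN_CELLS_PER_SAMPLE   # cells/sample
    if sample_type == "female_urine":
        return cellularity >= URINE_MIN_CELLS_PER_ML      # cells/mL
    raise ValueError(f"unknown sample type: {sample_type}")
```

    A negative molecular result on a sample failing this check would be reported as "inadequate" rather than "negative", which is exactly the false-negative protection the authors argue for.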

  7. Immunochemical protocols

    National Research Council Canada - National Science Library

    Pound, John D

    1998-01-01

    ... easy and important refinements often are not published. This much-anticipated 2nd edition of Immunochemical Protocols therefore aims to provide a user-friendly, up-to-date handbook of reliable techniques selected to suit the needs of molecular biologists. It covers the full breadth of the relevant established immunochemical methods, from protein blotting and immunoa...

  8. Multilaboratory validation study of a standardized multiple-locus variable-number tandem repeat analysis protocol for Shiga toxin-producing Escherichia coli O157: a novel approach to normalize fragment size data between capillary electrophoresis platforms.

    Science.gov (United States)

    Hyytia-Trees, Eija; Lafon, Patricia; Vauterin, Paul; Ribot, Efrain M

    2010-02-01

    The PulseNet USA subtyping network recently established a standardized protocol for multiple-locus variable-number tandem repeat analysis (MLVA) to characterize Shiga toxin-producing Escherichia coli O157. To enable data comparisons from different laboratories in the same database, reproducibility and high quality of the data must be ensured. The aim of this study was to test the robustness and reproducibility of the proposed standardized protocol by subjecting it to a multilaboratory validation process and to address any discrepancies that may have arisen from the study. A set of 50 strains was tested in 10 PulseNet participating laboratories that used capillary electrophoresis instruments from two manufacturers. Six out of the 10 laboratories were able to generate correct MLVA types for 46 (92%) or more strains. The discrepancies in MLVA type assignment were caused mainly by difficulties in optimizing polymerase chain reactions that were attributed to technical inexperience of the staff and suboptimal quality of reagents and instrumentation. It was concluded that proper training of staff must be an integral part of technology transfer. The interlaboratory reproducibility of fragment sizing was excellent when the same capillary electrophoresis platform was used. However, sizing discrepancies of up to six base pairs for the same fragment were detected between the two platforms. These discrepancies were attributed to different dye and polymer chemistries employed by the manufacturers. A novel software script was developed to assign alleles based on two platform-specific (Beckman Coulter CEQ8000 and Applied Biosystems Genetic Analyzer 3130xl) look-up tables containing fragment size ranges for all alleles. The new allele assignment method was validated at the PulseNet central laboratory using a diverse set of 502 Shiga toxin-producing Escherichia coli O157 isolates. The validation confirmed that the script reliably assigned the same allele for the same fragment
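
    The platform-specific look-up-table approach can be sketched as follows. The size ranges below are invented for illustration (the real PulseNet tables are locus-specific and not given in the abstract); the point is that the same physical allele can yield raw fragment sizes several base pairs apart on the two platforms, hence one table per platform:

```python
# Hypothetical fragment-size ranges (base pairs) for one VNTR locus on two
# capillary platforms; each entry is (low_bp, high_bp, allele_number).
LOOKUP = {
    "CEQ8000":   [(150.0, 153.9, 1), (154.0, 159.9, 2), (160.0, 165.9, 3)],
    "ABI3130xl": [(153.0, 156.9, 1), (157.0, 162.9, 2), (163.0, 168.9, 3)],
}

def assign_allele(platform, fragment_size):
    """Assign an allele number from a platform-specific size-range table,
    in the spirit of the script described in the study; sizes falling
    outside every range return None for manual review."""
    for lo, hi, allele in LOOKUP[platform]:
        if lo <= fragment_size <= hi:
            return allele
    return None
```

    With these illustrative tables, a 155.2 bp fragment maps to allele 2 on the CEQ8000 but allele 1 on the ABI3130xl, which is exactly why normalizing on alleles rather than raw sizes makes data comparable across platforms.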

  9. A standardized crisis management model for self-harming and suicidal individuals with three or more diagnostic criteria of borderline personality disorder: The Brief Admission Skåne randomized controlled trial protocol (BASRCT).

    Science.gov (United States)

    Liljedahl, Sophie I; Helleman, Marjolein; Daukantaité, Daiva; Westrin, Åsa; Westling, Sofie

    2017-06-15

    Brief Admission is a crisis and risk management strategy in which self-harming and suicidal individuals with three or more diagnostic criteria of borderline personality disorder self-admit to hospital at times of increasing risk, when other efforts to stay safe are failing. Standardized in the current randomized controlled trial, Brief Admission Skåne is implemented in durations of three days, with a maximum frequency of three times a month. Brief Admission is integrated into existing treatment plans in advance of crises to prevent reliance on general psychiatric admissions for risk management, as these may be lengthy, unstructured, and of uncertain therapeutic value. The overall objective of the Brief Admission Skåne randomized controlled trial is to determine if Brief Admission can replace general psychiatric admission for self-harming and suicidal individuals with complex mental illness at times of escalating risk. Other objectives of the study are to evaluate whether Brief Admission increases daily functioning, enhances coping, and reduces psychiatric symptoms, including the frequency and severity of self-harm and suicidal behaviours. A final objective is to determine if Brief Admission is an effective crisis management model for this population. Participants are randomized at an individual level to either Brief Admission Skåne plus Treatment as Usual or Treatment as Usual. Based on a priori power analyses, N = 124 participants will be recruited to the study. Data collection is in progress, and will continue until June 2018. All participant data are single-blinded and will be handled with intention-to-treat analysis. Based on the combined clinical experience of our international research group, the Brief Admission Skåne randomized controlled trial upon which the current protocol is based represents the first initiative to standardize, implement and evaluate Brief Admission amongst self-harming and suicidal individuals, including those with

  10. Video Analytics

    DEFF Research Database (Denmark)

    Nasrollahi, Kamal; Distante, Cosimo; Hua, Gang

    2017-01-01

    This book collects the papers presented at two workshops during the 23rd International Conference on Pattern Recognition (ICPR): the Third Workshop on Video Analytics for Audience Measurement (VAAM) and the Second International Workshop on Face and Facial Expression Recognition (FFER) from Real World Videos. The workshops were run on December 4, 2016, in Cancun, Mexico. The two workshops together received 13 papers. Each paper was then reviewed by at least two expert reviewers in the field. In all, 11 papers were accepted to be presented at the workshops. The topics covered in the papers

  12. Individualized versus standardized risk assessment in patients at high risk for adverse drug reactions (IDrug) - study protocol for a pragmatic randomized controlled trial.

    Science.gov (United States)

    Stingl, Julia Carolin; Kaumanns, Katharina Luise; Claus, Katrin; Lehmann, Marie-Louise; Kastenmüller, Kathrin; Bleckwenn, Markus; Hartmann, Gunther; Steffens, Michael; Wirtz, Dorothee; Leuchs, Ann-Kristin; Benda, Norbert; Meier, Florian; Schöffski, Oliver; Holdenrieder, Stefan; Coch, Christoph; Weckbecker, Klaus

    2016-04-26

    Elderly patients are particularly vulnerable to adverse drug reactions, especially if they are affected by additional risk factors such as multimorbidity, polypharmacy, impaired renal function and intake of drugs with high risk potential. Apart from these clinical parameters, drug safety and efficacy can be influenced by pharmacogenetic factors. Evidence-based recommendations concerning drug-gene combinations have been issued by international consortia and in drug labels. However, the clinical benefit of providing information on individual patient factors in a comprehensive risk assessment aiming to reduce the occurrence and severity of adverse drug reactions is not evident. The purpose of this randomized controlled trial is to compare the effect of a concise individual risk information leaflet with standard information on risk factors for side effects. The trial was designed as a prospective, two-arm, randomized, controlled, multicenter, pragmatic study. 960 elderly, multimorbid outpatients in general medicine are included if they take at least one high-risk drug and one other long-term drug (polymedication). Oral anticoagulants and antiplatelet agents were chosen as high-risk "index drugs" because of their specific, objectively assessable side effects. Following randomization, test group patients receive an individualized risk assessment leaflet evaluating their personal data concerning bleeding and thromboembolic risk scores, potential drug-drug interactions, age, renal function and pharmacogenetic factors. Control group patients obtain a standardized leaflet only containing general information on these criteria. The follow-up period is 9 months for each patient. The primary endpoint is the occurrence of a thromboembolic/bleeding event or death. Secondary endpoints are other adverse drug reactions, hospital admissions, specialist referrals and medication changes due to adverse drug reactions, the patients' adherence to the medication regimen, as well as health-related quality of life

  13. IN.PACT Amphirion paclitaxel eluting balloon versus standard percutaneous transluminal angioplasty for infrapopliteal revascularization of critical limb ischemia: rationale and protocol for an ongoing randomized controlled trial.

    Science.gov (United States)

    Zeller, Thomas; Baumgartner, Iris; Scheinert, Dierk; Brodmann, Marianne; Bosiers, Marc; Micari, Antonio; Peeters, Patrick; Vermassen, Frank; Landini, Mario

    2014-02-19

    The effectiveness and durability of endovascular revascularization therapies for chronic critical limb ischemia (CLI) are challenged by the extensive burden of infrapopliteal arterial disease and lesion-related characteristics (e.g., severe calcification, chronic total occlusions), which frequently result in poor clinical outcomes. While infrapopliteal vessel patency directly affects pain relief and wound healing, sustained patency and extravascular care both contribute to the ultimate "patient-centric" outcomes of functional limb preservation, mobility and quality of life (QoL). IN.PACT DEEP is a 2:1 randomized controlled trial designed to assess the efficacy and safety of infrapopliteal arterial revascularization between the IN.PACT Amphirion™ paclitaxel drug-eluting balloon (IA-DEB) and standard balloon angioplasty (PTA) in patients with Rutherford Class 4-5-6 CLI. This multicenter trial has enrolled 358 patients at 13 European centers with independent angiographic core lab adjudication of the primary efficacy endpoint of target lesion late luminal loss (LLL) and clinically driven target lesion revascularization (TLR) in major amputation-free surviving patients through 12-months. An independent wound core lab will evaluate all ischemic wounds to assess the extent of healing and time to healing at 1, 6, and 12 months. A QoL questionnaire including a pain scale will assess changes from baseline scores through 12 months. A Clinical Events Committee and Data Safety Monitoring Board will adjudicate the composite primary safety endpoints of all-cause death, major amputation, and clinically driven TLR at 6 months and other trial endpoints and supervise patient safety throughout the study. All patients will be followed for 5 years. A literature review is presented of the current status of endovascular treatment of CLI with drug-eluting balloon and standard PTA. The rationale and design of the IN.PACT DEEP Trial are discussed. IN.PACT DEEP is a milestone, prospective

  14. Study, design and realization of a fault-tolerant and predictable synchronous communication protocol on off-the-shelf components; Etude, conception et mise en oeuvre d'un protocole de communication synchrone tolerant aux fautes et predictible sur des composants reseaux standards

    Energy Technology Data Exchange (ETDEWEB)

    Chabrol, D

    2006-06-15

    This PhD thesis contributes to the design and realization of safety-critical real-time systems on multiprocessor architectures with distributed memory. Such systems are essential for computing platforms that must perform complex and critical functions. The thesis deals with the management of the communication medium, which strongly conditions the system's ability to meet timeliness and dependability requirements. Our contribution includes: - the design of a predictable and fault-tolerant synchronous communication protocol; - the study and definition of an execution model for efficient and safe communication management; - the proposal of a method to generate the communication schedule automatically. Our approach is based on a communication model that allows the feasibility of a distributed safety-critical real-time system with timeliness and safety requirements to be analysed before execution. This leads to the definition of an execution model based on time-triggered, parallel communication management. A system of linear constraints is generated automatically to compute the network schedule and the network load while fulfilling timeliness requirements. The proposed communication interface is based on an advanced version of the TDMA protocol, which allows the use of proprietary components (TTP, FlexRay) as well as standard components (Ethernet). The concepts presented in this thesis led to the realization and evaluation of a prototype within the framework of the OASIS project at CEA/List. (author)
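
    As a toy stand-in for the linear-constraint formulation described above (the thesis's actual constraint system is not reproduced here), a single TDMA round can be checked for feasibility by verifying that slots neither overlap nor overrun the cycle, and that each message completes before its deadline:

```python
def tdma_feasible(slots, cycle):
    """Check one TDMA round. `slots` is a list of (start, duration, deadline)
    tuples in common time units; the round is feasible if slots do not
    overlap, every slot fits within the cycle, and each message finishes
    before its deadline. A toy sketch, not the thesis's constraint system."""
    t = 0
    for start, dur, deadline in sorted(slots):
        if start < t:                       # overlaps the previous slot
            return False
        end = start + dur
        if end > cycle or end > deadline:   # overruns cycle or misses deadline
            return False
        t = end
    return True
```

    A real time-triggered schedule generator would instead emit these same conditions as linear inequalities over the slot start times and solve them offline, which is the approach the thesis automates.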

  15. [The analytic quality in laboratory medicine: problems and perspectives (a lecture)].

    Science.gov (United States)

    Émanuél', A V; Ivanov, G A; Émanuél', Iu V

    2014-03-01

    The article considers the structure of analytical errors in clinical diagnostic laboratory analysis from the standpoint of GOST R ISO 15189-2009 "Medical laboratories. Particular requirements for quality and competence". The key importance of the metrological traceability of analyses is emphasized. The roles of certified reference materials, control materials, and the statistical methods applied in quality analysis are discussed. International experience and the methodical procedures applied to implement the requirements of ISO 15189 concerning validation and verification of analytical quality are presented. The approaches of the protocols E3 23-A, ER 15-A2, and N59-A developed for US laboratory medicine by the Clinical and Laboratory Standards Institute are demonstrated. A review of reference materials and reference methods is given. The problem of optimizing the quality requirements for laboratory diagnostic products is discussed, and the expedience of organizing a National institute of laboratory standards is substantiated.

  16. No gold standard estimation of the sensitivity and specificity of two molecular diagnostic protocols for Trypanosoma brucei spp. in Western Kenya.

    Directory of Open Access Journals (Sweden)

    Barend Mark de Clare Bronsvoort

    2010-01-01

    African animal trypanosomiasis is caused by a range of tsetse-transmitted protozoan parasites including Trypanosoma vivax, Trypanosoma congolense and Trypanosoma brucei. In Western Kenya and other parts of East Africa two subspecies of T. brucei, T. b. brucei and the zoonotic T. b. rhodesiense, co-circulate in livestock. A range of polymerase chain reactions (PCRs) have been developed as important molecular diagnostic tools for epidemiological investigations of T. brucei s.l. in the animal reservoir and of its zoonotic potential. Quantification of the relative performance of different diagnostic PCRs is essential to ensure comparability of studies. This paper describes an evaluation of two diagnostic test systems for T. brucei using a T. brucei s.l.-specific PCR [1] and a single nested PCR targeting the Internal Transcribed Spacer (ITS) regions of trypanosome ribosomal DNA [2]. A Bayesian formulation of the Hui-Walter latent class model was employed to estimate their test performance in the absence of a gold standard test for detecting T. brucei s.l. infections in ear-vein blood samples from cattle, pig, sheep and goat populations in Western Kenya, stored on Whatman FTA cards. The results indicate that the system employing the T. brucei s.l.-specific PCR (Se1=0.760) had a higher sensitivity than the ITS-PCR (Se2=0.640); both have high specificity (Sp1=0.998; Sp2=0.997). The true prevalences for the livestock populations were estimated (pcattle=0.091, ppigs=0.066, pgoats=0.005, psheep=0.006), taking into account the uncertainties in the specificity and sensitivity of the two test systems. Implications of test performance include the required survey sample size; owing to its higher sensitivity and specificity, the T. brucei s.l.-specific PCR requires a consistently smaller sample size than the ITS-PCR for the detection of T. brucei s.l.
However, the ITS-PCR is able to simultaneously screen samples for other pathogenic trypanosomes and may thus be, overall, a better
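
    The abstract's sensitivity and specificity estimates illustrate how an observed (apparent) prevalence can be corrected for imperfect test performance. The study itself used a Bayesian Hui-Walter latent class model; the sketch below shows only the simpler single-test Rogan-Gladen correction, using the Se/Sp values quoted above and a hypothetical apparent prevalence.

```python
# Rogan-Gladen estimator: convert apparent prevalence (fraction of
# positive test results) into an estimate of true prevalence, given the
# test's sensitivity (se) and specificity (sp). A frequentist single-test
# analogue of the latent-class approach used in the study.
def rogan_gladen(apparent, se, sp):
    return (apparent + sp - 1.0) / (se + sp - 1.0)

# Se/Sp of the T. brucei s.l.-specific PCR from the abstract;
# the apparent prevalence of 0.07 is a hypothetical input.
true_prev = rogan_gladen(0.07, se=0.760, sp=0.998)
```

    With a highly specific but only moderately sensitive test, the corrected prevalence is noticeably higher than the apparent one, which is exactly why the survey sample size depends on which PCR system is chosen.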

  17. Combining Standard Conventional Measures and Ecological Momentary Assessment of Depression, Anxiety and Coping Using Smartphone Application in Minor Stroke Population: A Longitudinal Study Protocol

    Directory of Open Access Journals (Sweden)

    Camille Vansimaeys

    2017-07-01

    Context: Stroke has several consequences for survivors' daily life, even for those who experience short-lasting neurological symptoms with no functional disability. Depression and anxiety are common psychological disorders occurring after a stroke. They affect long-term outcomes and quality of life, but they are difficult to diagnose because of the neurobiological consequences of brain lesions. Current research priority is given to improving the detection and prevention of these post-stroke psychological disorders. Although previous studies have brought promising perspectives, their designs, based on retrospective tools, have limits regarding ecological validity. Ecological Momentary Assessment (EMA) is an alternative to conventional instruments that could be key for understanding the processes that underlie the onset of post-stroke depression and anxiety. We aim to evaluate the feasibility and validity of anxiety, depression and coping EMA for minor stroke patients. Methods: Patients hospitalized in an Intensive Neuro-vascular Care Unit between April 2016 and January 2017 for a minor stroke are involved in a study based on an EMA methodology. We use a smartphone application to assess anxiety and depression symptoms and coping strategies four times a day for 1 week at three different times after stroke (hospital discharge, 2 and 4 months). Participants' self-reports and clinician ratings of anxiety, depression and coping are collected simultaneously using conventional, standard instruments. Feasibility of the EMA method will be assessed considering participation and compliance rates. Validity will be assessed by comparing EMA with conventional self-report and clinician-rated measures. Discussion: We expect this study to contribute to the development of smartphone-based EMA in the minor stroke population. The EMA method offers promising research perspectives in the assessment and understanding of post

  18. Comparative effectiveness of a complex Ayurvedic treatment and conventional standard care in osteoarthritis of the knee--study protocol for a randomized controlled trial.

    Science.gov (United States)

    Witt, Claudia M; Michalsen, Andreas; Roll, Stephanie; Morandi, Antonio; Gupta, Shivnarain; Rosenberg, Mark; Kronpass, Ludwig; Stapelfeldt, Elmar; Hissar, Syed; Müller, Matthias; Kessler, Christian

    2013-05-23

    Traditional Indian Ayurvedic medicine uses complex treatment approaches, including manual therapies, lifestyle and nutritional advice, dietary supplements, medication, yoga, and purification techniques. Ayurvedic strategies are often used to treat osteoarthritis (OA) of the knee; however, no systematic data are available on their effectiveness in comparison with standard care. The aim of this study is to evaluate the effectiveness of complex Ayurvedic treatment in comparison with conventional methods of treating OA symptoms in patients with knee osteoarthritis. In a prospective, multicenter, randomized controlled trial, 150 patients between 40 and 70 years of age, diagnosed with osteoarthritis of the knee according to American College of Rheumatology criteria and with an average pain intensity of ≥40 mm on a 100 mm visual analog scale in the affected knee at baseline, will be randomized into two groups. In the Ayurveda group, treatment will include tailored combinations of manual treatments, massages, dietary and lifestyle advice, consideration of selected foods, nutritional supplements, yoga posture advice, and knee massage. Patients in the conventional group will receive self-care advice, pain medication, weight-loss advice (if overweight), and physiotherapy following current international guidelines. Both groups will receive 15 treatment sessions over 12 weeks. Outcomes will be evaluated after 6 and 12 weeks and 6 and 12 months. The primary endpoint is a change in the score on the Western Ontario and McMaster University Osteoarthritis Index (WOMAC) after 12 weeks. Secondary outcome measurements will use WOMAC subscales, a pain disability index, a visual analog scale for pain and sleep quality, a pain experience scale, a quality-of-life index, a profile of mood states, and Likert scales for patient satisfaction, patient diaries, and safety. Using an adapted PRECIS scale, the trial was identified as lying mainly in the middle of the efficacy-effectiveness continuum.
This trial

  19. Comparative effectiveness of a complex Ayurvedic treatment and conventional standard care in osteoarthritis of the knee – study protocol for a randomized controlled trial

    Science.gov (United States)

    2013-01-01

    Background Traditional Indian Ayurvedic medicine uses complex treatment approaches, including manual therapies, lifestyle and nutritional advice, dietary supplements, medication, yoga, and purification techniques. Ayurvedic strategies are often used to treat osteoarthritis (OA) of the knee; however, no systematic data are available on their effectiveness in comparison with standard care. The aim of this study is to evaluate the effectiveness of complex Ayurvedic treatment in comparison with conventional methods of treating OA symptoms in patients with knee osteoarthritis. Methods and design In a prospective, multicenter, randomized controlled trial, 150 patients between 40 and 70 years of age, diagnosed with osteoarthritis of the knee according to American College of Rheumatology criteria and with an average pain intensity of ≥40 mm on a 100 mm visual analog scale in the affected knee at baseline, will be randomized into two groups. In the Ayurveda group, treatment will include tailored combinations of manual treatments, massages, dietary and lifestyle advice, consideration of selected foods, nutritional supplements, yoga posture advice, and knee massage. Patients in the conventional group will receive self-care advice, pain medication, weight-loss advice (if overweight), and physiotherapy following current international guidelines. Both groups will receive 15 treatment sessions over 12 weeks. Outcomes will be evaluated after 6 and 12 weeks and 6 and 12 months. The primary endpoint is a change in the score on the Western Ontario and McMaster University Osteoarthritis Index (WOMAC) after 12 weeks. Secondary outcome measurements will use WOMAC subscales, a pain disability index, a visual analog scale for pain and sleep quality, a pain experience scale, a quality-of-life index, a profile of mood states, and Likert scales for patient satisfaction, patient diaries, and safety. Using an adapted PRECIS scale, the trial was identified as lying mainly in the middle of the efficacy

  20. Extra Physiotherapy in Critical Care (EPICC) Trial Protocol: a randomised controlled trial of intensive versus standard physical rehabilitation therapy in the critically ill.

    Science.gov (United States)

    Thomas, Kirsty; Wright, Stephen E; Watson, Gillian; Baker, Catherine; Stafford, Victoria; Wade, Clare; Chadwick, Thomas J; Mansfield, Leigh; Wilkinson, Jennifer; Shen, Jing; Deverill, Mark; Bonner, Stephen; Hugill, Keith; Howard, Philip; Henderson, Andrea; Roy, Alistair; Furneval, Julie; Baudouin, Simon V

    2015-05-25

    Patients discharged from Critical Care suffer excess longer-term morbidity and mortality. Physical and mental health measures of quality of life show a marked and immediate fall after admission to Critical Care, with some recovery over time. However, physical function is still significantly reduced at 6 months. The National Institute for Health and Care Excellence clinical guideline on rehabilitation after critical illness identified the need for high-quality randomised controlled trials to determine the most effective rehabilitation strategy for critically ill patients at risk of critical illness-associated physical morbidity. In response, we will conduct a randomised controlled trial comparing physiotherapy aimed at early and intensive patient mobilisation with routine care. We hypothesise that this intervention will improve physical outcomes and the mental health and functional well-being of survivors of critical illness. 308 adult patients who have received more than 48 h of non-invasive or invasive ventilation in Critical Care will be recruited to a patient-randomised, parallel-group, controlled trial comparing two intensities of physiotherapy. Participants will be randomised to receive either standard or intensive physiotherapy for the duration of their Critical Care admission. Outcomes will be recorded on Critical Care discharge and at 3 and 6 months following initial recruitment to the study. The primary outcome measure is physical health at 6 months, as measured by the SF-36 Physical Component Summary. Secondary outcomes include assessment of mental health, activities of daily living, delirium and ventilator-free days. We will also include a health economic analysis. The trial has ethical approval from Newcastle and North Tyneside 2 Research Ethics Committee (11/NE/0206). There is a Trial Oversight Committee including an independent chair. The results of the study will be submitted for publication in peer-reviewed journals and

  1. 7 CFR 91.23 - Analytical methods.

    Science.gov (United States)

    2010-01-01

    ...-2417. (i) Standard Analytical Methods of the Member Companies of Corn Industries Research Foundation... Regulations of the Department of Agriculture (Continued) AGRICULTURAL MARKETING SERVICE (Standards, Inspections, Marketing Practices), DEPARTMENT OF AGRICULTURE (CONTINUED) COMMODITY LABORATORY TESTING PROGRAMS...

  2. Analytical mechanics

    CERN Document Server

    Helrich, Carl S

    2017-01-01

    This advanced undergraduate textbook begins with the Lagrangian formulation of Analytical Mechanics and then passes directly to the Hamiltonian formulation and the canonical equations, with constraints incorporated through Lagrange multipliers. Hamilton's Principle and the canonical equations remain the basis of the remainder of the text. Topics considered for applications include small oscillations, motion in electric and magnetic fields, and rigid body dynamics. The Hamilton-Jacobi approach is developed with special attention to the canonical transformation in order to provide a smooth and logical transition into the study of complex and chaotic systems. Finally the text has a careful treatment of relativistic mechanics and the requirement of Lorentz invariance. The text is enriched with an outline of the history of mechanics, which particularly outlines the importance of the work of Euler, Lagrange, Hamilton and Jacobi. Numerous exercises with solutions support the exceptionally clear and concise treatment...

  3. Use of a standardized protocol to identify factors affecting the efficiency of artificial insemination services for cattle through progesterone measurement in fourteen countries

    International Nuclear Information System (INIS)

    Garcia, M.; Goodger, W.J.; Bennett, T.; Perera, B.M.A.O.

    2001-01-01

    The aim of this co-ordinated research project (CRP) was to quantify the main factors limiting the efficiency of artificial insemination (AI) services in cattle under the prevailing conditions of developing countries, in order to recommend suitable strategies for improving conception rates (CR) and the level of usage of AI by cattle farmers. A standardized approach was used in 14 countries over a five-year period (1995-1999). The countries were: Bangladesh, China, Indonesia, Myanmar, Pakistan, Sri Lanka and Vietnam in Asia; and Argentina, Chile, Costa Rica, Cuba, Peru, Uruguay and Venezuela in Latin America. A minimum of 500 cows undergoing first insemination after calving were expected to be monitored in each country. Data regarding the farms, AI technicians, semen used, cows inseminated, characteristics of heat expression and factors related to the insemination were recorded. Three milk samples (or blood samples for dairy heifers and beef cows) were collected for each service to measure progesterone by radioimmunoassay. These were collected on the day of service (day 0) and on days 10-12 and 22-24 after service. Field and laboratory data were recorded in the computer package AIDA (Artificial Insemination Database Application), which was developed for this CRP. The study established the current status of AI services at selected locations in participating countries and showed important differences between Asian and Latin American farming systems. The mean (±s.d.) of the interval from calving to first service for 7992 observations was 120.0 ± 82.1 days (median 95 days), with large differences between countries (P<0.05). The overall CR to first service was 40.9% (n=8196), the most efficient AI services being those in Vietnam (62.1%), Chile (61.9%) and Myanmar (58.9%). The interval between the first and second service was 44.6 ± 44.4 days (n=1959). Progesterone data in combination with clinical findings showed that 17.3% of the services were performed in non
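
    To give a sense of the statistical precision behind a reported rate such as the 40.9% overall conception rate (n=8196), a Wilson score confidence interval can be computed. The sketch below is purely illustrative and is not part of the study's own analysis.

```python
import math

# Wilson score 95% confidence interval for a binomial proportion.
# p_hat: observed proportion; n: number of trials; z: normal quantile.
def wilson_ci(p_hat, n, z=1.959964):
    denom = 1 + z**2 / n
    centre = (p_hat + z**2 / (2 * n)) / denom
    half = (z / denom) * math.sqrt(p_hat * (1 - p_hat) / n + z**2 / (4 * n**2))
    return centre - half, centre + half

# Overall conception rate to first service reported in the abstract.
lo, hi = wilson_ci(0.409, 8196)
```

    With roughly 8,000 observations the interval is only about two percentage points wide, so the between-country differences quoted in the abstract are far larger than the sampling uncertainty of the overall rate.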

  4. Beyond protocols

    DEFF Research Database (Denmark)

    Vanderhoeven, Sonia; Branquart, Etienne; Casaer, Jim

    2017-01-01

    Risk assessment tools for listing invasive alien species need to incorporate all available evidence and expertise. Beyond the wealth of protocols developed to date, we argue that the current way of performing risk analysis has several shortcomings. In particular, lack of data on ecological impacts...... process can be applied to better capture opinions of different experts, thereby maximizing the evidential basis. Elaborating on manageability of invasive species is further needed to fully answer all risk analysis requirements. Tackling the issue of invasive species urges better handling of the acquired...

  5. Comparison of Bruce treadmill exercise test protocols: is ramped Bruce equal or superior to standard Bruce in producing clinically valid studies for patients presenting for evaluation of cardiac ischemia or arrhythmia with body mass index equal to or greater than 30?

    Science.gov (United States)

    Bires, Angela Macci; Lawson, Dori; Wasser, Thomas E; Raber-Baer, Donna

    2013-12-01

    Clinically valid cardiac evaluation via treadmill stress testing requires patients to achieve specific target heart rates and to successfully complete the cardiac examination. A comparison of the standard Bruce protocol and the ramped Bruce protocol was performed using data collected over a 1-y period from a targeted patient population with a body mass index (BMI) equal to or greater than 30, to determine which treadmill protocol provided more successful examination results. The functional capacity, metabolic equivalent units achieved, pressure-rate product, and total time on the treadmill measured for the obese patients were clinically valid and comparable to those of normal-weight and overweight patients. The ramped Bruce protocol achieved more consistent results across all BMI groups in reaching 80%-85% of the age-predicted maximum heart rate. This study did not adequately establish that the ramped Bruce protocol was superior to the standard Bruce protocol for the examination of patients with a BMI equal to or greater than 30.
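
    The 80%-85% target in the abstract is relative to the age-predicted maximum heart rate. The abstract does not say which age-prediction formula was used; the sketch below assumes the common "220 minus age" formula, so treat it as illustrative only.

```python
# Target heart-rate window for a clinically valid treadmill stress test,
# assuming the common age-predicted maximum of (220 - age) beats/min.
# The 0.80-0.85 fractions are the targets quoted in the abstract.
def target_hr_window(age, low=0.80, high=0.85):
    apmhr = 220 - age            # age-predicted maximum heart rate
    return low * apmhr, high * apmhr

# Hypothetical 50-year-old patient.
low_bpm, high_bpm = target_hr_window(50)
```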

  6. Analytical caustic surfaces

    Science.gov (United States)

    Schmidt, R. F.

    1987-01-01

    This document discusses the determination of caustic surfaces in terms of rays, reflectors, and wavefronts. Analytical caustics are obtained as a family of lines, a set of points, and several types of equations for geometries encountered in optics and microwave applications. Standard methods of differential geometry are applied under different approaches: directly to reflector surfaces, and alternatively, to wavefronts, to obtain analytical caustics of two sheets or branches. Gauss/Seidel aberrations are introduced into the wavefront approach, forcing the retention of all three coefficients of both the first- and the second-fundamental forms of differential geometry. An existing method for obtaining caustic surfaces through exploitation of the singularities in flux density is examined, and several constant-intensity contour maps are developed using only the intrinsic Gaussian, mean, and normal curvatures of the reflector. Numerous references are provided for extending the material of the present document to the morphologies of caustics and their associated diffraction patterns.

  7. The Interlibrary Loan Protocol: An OSI Solution to ILL Messaging.

    Science.gov (United States)

    Turner, Fay

    1990-01-01

    Discusses the interlibrary loan (ILL) protocol, a standard based on the principles of the Open Systems Interconnection (OSI) Reference Model. Benefits derived from protocol use are described, the status of the protocol as an international standard is reviewed, and steps taken by the National Library of Canada to facilitate migration to an ILL…

  8. Continuous training and certification in neonatal resuscitation in remote areas using a multi-platform information and communication technology intervention, compared to standard training: A randomized cluster trial study protocol [version 3; referees: 2 approved

    Directory of Open Access Journals (Sweden)

    Carlos Alberto Delgado

    2018-03-01

    Background: About 10% of all newborns may have difficulty breathing and require support by trained personnel. In Peru, 90% of deliveries occur in health facilities; however, there is no national neonatal resuscitation and certification program for the public health sector. In addition, the Andes and Amazon regions contain large, remote rural areas, which further limit the implementation of training programs and the maintenance of continuous certification. Neonatal resuscitation training through information and communication technology (ICT) tools running on computers, tablets or mobile phones may overcome such limitations. This strategy allows online and offline access to educational resources, paving the way to more frequent and efficient training and certification processes. Objective: To evaluate the effects of a neonatal resuscitation training and certification program that uses a multi-platform ICT (MP-ICT) strategy on neonatal health care in remote areas. Methods: We propose to conduct the study as a cluster-randomized trial, in which the unit of study and analysis is the health care facility. Eligible facilities will include primary and secondary health care facilities located in provinces with neonatal mortality rates higher than 15 per 1,000 live births. We will compare the proportion of newborns with a heart rate ≥100 beats per minute at two minutes after birth in health care facilities that receive MP-ICT training and certification with the proportion in those that receive standard training and certification. Discussion: We expect the intervention to prove more effective than the current standard of care, and we are prepared to include it within a national neonatal resuscitation training and certification program to be implemented at national scale together with policymakers and other key stakeholders. Trial registration: ClinicalTrials.gov Nº NCT03210194. Status of the study: This
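
    Because randomization is by facility rather than by newborn, a design like this one needs the usual two-proportion sample size inflated by a design effect for within-cluster correlation. The sketch below uses entirely hypothetical inputs (effect size, cluster size, and ICC are not reported in the abstract).

```python
import math

# Sample size per arm for comparing two proportions, inflated by the
# cluster-trial design effect DEFF = 1 + (m - 1) * ICC, where m is the
# average cluster size. z_alpha/z_beta are the normal quantiles for a
# two-sided 5% alpha and 80% power.
def n_per_arm(p1, p2, cluster_size, icc, z_alpha=1.959964, z_beta=0.841621):
    n_individual = ((z_alpha + z_beta) ** 2
                    * (p1 * (1 - p1) + p2 * (1 - p2))) / (p1 - p2) ** 2
    deff = 1 + (cluster_size - 1) * icc   # design effect
    return math.ceil(n_individual * deff)

# Hypothetical scenario: 90% vs 80% of newborns with HR >= 100 bpm at
# two minutes, 20 newborns per facility, ICC = 0.05.
n = n_per_arm(0.90, 0.80, cluster_size=20, icc=0.05)
```

    Even a modest ICC nearly doubles the required number of newborns here (DEFF = 1.95), which is why cluster trials in facilities need many more participants than individually randomized ones.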

  9. SU-G-TeP2-03: Comparison of Standard Dosimetry Protocol in Japan and AAPM TG-51 Addendum in Order to Establish Optimal Dosimetry for FFF Beam

    Energy Technology Data Exchange (ETDEWEB)

    Matsunaga, T; Adachi, Y [Department of Radiology, Seirei Hamamatsu General Hospital, Hamamatsu, Shizuoka (Japan); Hayashi, N [Graduate School of Health Sciences, Fujita Health University, Toyoake, Aichi (Japan); Nozue, M [Department of Radiation Oncology, Seirei Hamamatsu General Hospital, Hamamatsu, Shizuoka (Japan)

    2016-06-15

    Purpose: The Japan Standard Dosimetry of Absorbed Dose to Water in External Beam Radiotherapy (JSDP12) is widely used to measure radiation dose in radiotherapy. However, JSDP12 does not take flattening-filter-free (FFF) beams into consideration. In addition, JSDP12 uses TPR20,10 as the beam quality index for photon beams. The purpose of this study is to compare JSDP12 with the AAPM TG-51 addendum in order to establish an optimal dosimetry procedure for FFF beams. Method: We evaluated the ion-recombination factor (ks) and the correction factor for the radial beam profile (Prp) in FFF beam dosimetry. The ks was obtained by the two-voltage method and verified by Jaffe's plot. The Prp was obtained both from film measurement and from treatment planning system (TPS) calculation, and the two were compared. Next, we compared the beam quality indexes (kQ) between the TPR20,10 method and the PDD(10)x method. Finally, we considered an optimal dosimetry protocol for FFF photon beams using JSDP12 with reference to the TG-51 addendum protocol. The FFF photon beams of 6 MV (6X-FFF) and 10 MV (10X-FFF) from TrueBeam were investigated in this study. Results: The ks for the 6X-FFF and 10X-FFF beams were 1.005 and 1.010, respectively. The Prp of the 0.6 cc ionization chamber for the 6X-FFF and 10X-FFF beams (Film, TPS) were (1.004, 1.008) and (1.005, 1.008), respectively. The kQ for the 6X-FFF and 10X-FFF beams (JSDP12, TG-51 addendum) were (0.9950, 0.9947) and (0.9851, 0.9845), respectively. The most significant factor for uncertainty in FFF photon beam measurement was Prp in the JSDP12 formalism. The total dosimetric differences between JSDP12 and the TG-51 addendum for 6X-FFF and 10X-FFF were -0.47% and -0.73%, respectively. Conclusion: The total dosimetric difference between JSDP12 and the TG-51 addendum was within 1%. The introduction of the kQ given by JSDP is feasible for FFF photon beam dosimetry. However, we think Prp should be considered for an optimal dosimetry procedure even if JSDP12 is used for FFF photon beam dosimetry.
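
    The two-voltage determination of ks mentioned in the Methods can be sketched as follows. For pulsed beams, TG-51 expresses the recombination correction in terms of the chamber readings at the normal and a reduced bias voltage; the readings used below are hypothetical, not the study's data.

```python
# Two-voltage method for the ion-recombination correction in a pulsed
# beam (TG-51 form): ks = (1 - VH/VL) / (MH/ML - VH/VL), where VH is the
# normal bias, VL a reduced bias (commonly VH/VL = 2), and MH, ML the
# corresponding raw chamber readings.
def ks_two_voltage(m_high, m_low, v_high=300.0, v_low=150.0):
    v_ratio = v_high / v_low
    m_ratio = m_high / m_low
    return (1.0 - v_ratio) / (m_ratio - v_ratio)

# Hypothetical readings at 300 V and 150 V.
ks = ks_two_voltage(m_high=25.13, m_low=25.00)
```

    The nearer MH/ML is to 1, the smaller the correction; values around 1.005-1.010, as reported for these FFF beams, correspond to readings that change by roughly half a percent to one percent when the bias is halved.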

  10. Hanford analytical sample projections 1996--2001

    Energy Technology Data Exchange (ETDEWEB)

    Joyce, S.M. [Westinghouse Hanford Co., Richland, WA (United States)

    1996-06-26

    This document summarizes the biannual Hanford sample projections for fiscal years 1996 to 2001. Sample projections are based on inputs submitted to Analytical Services covering Environmental Restoration, Tank Waste Remediation Systems (TWRS), Solid Waste, Liquid Effluents, Spent Nuclear Fuels, Transition Projects, Analytical Services, Site Monitoring, and Industrial Hygiene. This information will be used by Hanford Analytical Services to assure that laboratories and resources are available and effectively utilized to meet these documented needs. Sample projections are categorized by radiation level, protocol, sample matrix and Program. Analysis requirements are also presented.

  11. Comparison of National and International Standards of Good Egg Production Practices

    Directory of Open Access Journals (Sweden)

    GP Sousa

    Egg production is an important economic activity in Brazil, with about 697 million eggs produced annually. The conventional cage system is commonly used for egg production; however, there is growing concern for the welfare of laying hens around the world. In this context, many countries have issued laws, protocols, and other normative technical specifications to ensure the welfare of layers. This study aims at identifying similarities and differences between international standards and Brazilian protocols from the perspective of Comparative Law. This article reports an analytical study of selected protocols, performing three analyses using the Comparative Law method. The research concludes that some items of the Brazilian protocols of good egg production practices, such as farm inspection, treatment of diseases, temperature, ventilation, beak trimming, and feed and water supply, correspond to international specifications, whereas others, such as housing, freedom of movement, use of equipment, and transport, are less strict.

  12. Big data analytics : predicting traffic flow regimes from simulated connected vehicle messages using data analytics and machine learning.

    Science.gov (United States)

    2016-12-25

    The key objectives of this study were to: 1. Develop advanced analytical techniques that make use of a dynamically configurable connected vehicle message protocol to predict traffic flow regimes in near-real time in a virtual environment and examine ...

  13. Publication trends of study protocols in rehabilitation.

    Science.gov (United States)

    Jesus, Tiago S; Colquhoun, Heather L

    2017-09-04

    Growing evidence points to the need to publish study protocols in the health field. To observe whether the growing interest in publishing study protocols in the broader health field has been translated into increased publication of rehabilitation study protocols, we conducted an observational study using publication data and its indexation in PubMed. PubMed was searched with appropriate combinations of Medical Subject Headings up to December 2014. The effective presence of study protocols was manually screened. Regression models analyzed the yearly growth of publications. Two-sample Z-tests analyzed whether the proportions of Systematic Reviews (SRs) and Randomized Controlled Trials (RCTs) among study protocols differed from those of the same designs in the broader rehabilitation research. Up to December 2014, 746 publications of rehabilitation study protocols were identified, with exponential growth since 2005 (r2=0.981; p<0.001). RCT protocols were the most common among rehabilitation study protocols (83%), and RCTs were significantly more prevalent among study protocols than in the broader rehabilitation research (83% vs. 35.8%; p<0.001). For SRs, the picture was reversed: they were significantly less common among study protocols (2.8% vs. 9.3%; p<0.001). Funding was more often reported by rehabilitation study protocols than by the broader rehabilitation research (90% vs. 53.1%; p<0.001). Rehabilitation journals published a significantly lower share of rehabilitation study protocols than they did of the broader rehabilitation research (1.8% vs. 16.7%; p<0.001). Identifying the reasons for these discrepancies and reverting unwarranted disparities (e.g. the low rate of publication for rehabilitation SR protocols) are likely new avenues for rehabilitation research and its publication.
SRs, particularly those aggregating RCT results, are considered the best standard of evidence to guide rehabilitation clinical practice; however, that standard can be improved
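
    The two-sample Z-test used for comparisons such as the 83% RCT share among the 746 protocols versus the 35.8% share in the broader literature can be sketched as below. The abstract does not report the size of the broader-literature sample, so n2 here is a hypothetical placeholder.

```python
import math

# Pooled two-proportion z-test. Returns the z statistic and the
# two-sided p-value computed from the normal distribution via erfc.
def two_prop_ztest(x1, n1, x2, n2):
    p1, p2 = x1 / n1, x2 / n2
    pooled = (x1 + x2) / (n1 + n2)
    se = math.sqrt(pooled * (1 - pooled) * (1 / n1 + 1 / n2))
    z = (p1 - p2) / se
    p_two_sided = math.erfc(abs(z) / math.sqrt(2))
    return z, p_two_sided

# 619/746 ~ 83% RCTs among protocols (from the abstract);
# n2 = 20000 with 35.8% RCTs is a hypothetical broader-literature sample.
z, p = two_prop_ztest(x1=619, n1=746, x2=7160, n2=20000)
```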

  14. Analytic continuation in perturbative QCD

    International Nuclear Information System (INIS)

    Caprini, Irinel

    2002-01-01

    We discuss some attempts to improve the standard perturbative expansion in QCD by using analytic continuation in the momentum and Borel complex planes. We first analyse the momentum-plane analyticity properties of the Borel-summed Green functions in perturbative QCD and the connection between the Landau singularities and the infrared renormalons. By using analytic continuation in the Borel complex plane, we propose a new perturbative series replacing the standard expansion in powers of the normalized coupling constant a. The new expansion functions have branch-point and essential singularities at the origin of the complex a-plane and divergent Taylor expansions in powers of a. On the other hand, the modified expansion of the QCD correlators is convergent under rather conservative conditions. (author)
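
    For reference, the standard Borel machinery underlying this discussion can be written as follows (in a common normalization; conventions for the positions of renormalon singularities vary between authors):

```latex
% Formal perturbative series of a QCD correlator in the coupling a:
D(a) \;\sim\; \sum_{n=0}^{\infty} c_n\, a^{n+1},
\qquad
% Borel transform of the series:
B(u) \;=\; \sum_{n=0}^{\infty} \frac{c_n}{n!}\, u^{n},
\qquad
% Borel summation, when the integral exists:
D(a) \;=\; \int_{0}^{\infty} e^{-u/a}\, B(u)\, \mathrm{d}u .
```

    Infrared renormalons appear as singularities of B(u) on the positive real u-axis; they obstruct the Laplace-Borel integral and motivate the analytic-continuation techniques discussed in this record.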

  15. Collaborative Visual Analytics: A Health Analytics Approach to Injury Prevention

    Directory of Open Access Journals (Sweden)

    Samar Al-Hajj

    2017-09-01

    Background: Accurate understanding of complex health data is critical in order to deal with wicked health problems and make timely decisions. Wicked problems refer to ill-structured and dynamic problems that combine multidimensional elements, which often preclude the conventional problem-solving approach. This pilot study introduces visual analytics (VA) methods to multi-stakeholder decision-making sessions about child injury prevention. Methods: Inspired by the Delphi method, we introduced a novel methodology, group analytics (GA). GA was pilot-tested to evaluate the impact of collaborative visual analytics on facilitating problem solving and supporting decision-making. We conducted two GA sessions. Collected data included stakeholders' observations, audio and video recordings, questionnaires, and follow-up interviews. The GA sessions were analyzed using the Joint Activity Theory protocol analysis methods. Results: The GA methodology triggered the emergence of 'common ground' among stakeholders. This common ground evolved throughout the sessions to enhance stakeholders' verbal and non-verbal communication, as well as coordination of joint activities and, ultimately, collaboration on problem solving and decision-making. Conclusions: Understanding complex health data is necessary for informed decisions. Equally important, in this case, is the use of the group analytics methodology to achieve 'common ground' among diverse stakeholders about health data and their implications.

  16. Protocols | Office of Cancer Clinical Proteomics Research

    Science.gov (United States)

    Each reagent on the Antibody Portal has been characterized by a combination of methods specific for that antibody. To view the customized antibody methods and protocols (Standard Operating Procedures) used to generate and characterize each reagent, select an antibody of interest and open the protocols associated with its respective characterization methods, along with the characterization data.

  17. Multiparametric multidetector computed tomography scanning on suspicion of hyperacute ischemic stroke: validating a standardized protocol

    Directory of Open Access Journals (Sweden)

    Felipe Torres Pacheco

    2013-06-01

    Multidetector computed tomography (MDCT) scanning has enabled the early diagnosis of hyperacute brain ischemia. We aimed at validating a standardized protocol for reading and reporting MDCT techniques in a series of adult patients. The inter-observer agreement among the trained examiners was tested, and their results were compared with a standard reading. No false positives were observed, and an almost perfect agreement (Kappa > 0.81) was documented when the CT angiography (CTA) and cerebral perfusion CT (CPCT) map data were added to the noncontrast CT (NCCT) analysis. The inter-observer agreement was higher for highly trained readers, corroborating the need for specific training to interpret these modern techniques. The authors recommend adding CTA and CPCT to the NCCT analysis in order to clarify the global analysis of structural and hemodynamic brain abnormalities. Our structured report is suitable as a script for the reproducible analysis of the MDCT of patients with suspected ischemic stroke.
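The inter-observer agreement reported above (Kappa > 0.81) is a chance-corrected agreement statistic, typically Cohen's kappa for a pair of readers. A minimal sketch with made-up ratings (the reader data below are illustrative, not from the study):

```python
from collections import Counter

def cohens_kappa(rater_a, rater_b):
    """Cohen's kappa: agreement between two raters corrected for chance."""
    assert len(rater_a) == len(rater_b)
    n = len(rater_a)
    observed = sum(a == b for a, b in zip(rater_a, rater_b)) / n
    # Chance agreement from each rater's marginal label frequencies.
    freq_a = Counter(rater_a)
    freq_b = Counter(rater_b)
    expected = sum(freq_a[k] * freq_b.get(k, 0) for k in freq_a) / n**2
    return (observed - expected) / (1 - expected)

# Two readers classifying 10 scans as ischemic (1) or not (0).
reader1 = [1, 0, 1, 1, 0, 0, 1, 0, 1, 0]
reader2 = [1, 0, 1, 1, 0, 0, 1, 0, 0, 0]
kappa = cohens_kappa(reader1, reader2)  # 9/10 raw agreement -> kappa = 0.8
```

With nine agreements out of ten cases and these marginals, the example lands at kappa = 0.8, just below the "almost perfect" band (> 0.81) cited in the abstract.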

  18. National Pesticide Standard Repository

    Science.gov (United States)

    EPA's National Pesticide Standards Repository collects and maintains an inventory of analytical “standards” of pesticides registered in the United States, as well as some that are not currently registered, for use in food and product testing and monitoring.

  19. Multicenter Analytical Validation of Aβ40 Immunoassays

    Directory of Open Access Journals (Sweden)

    Linda J. C. van Waalwijk van Doorn

    2017-07-01

    Background: Before implementation in clinical practice, biomarker assays need to be thoroughly analytically validated. There is currently a strong interest in implementation of the ratio of amyloid-β peptides 1-42 and 1-40 (Aβ42/Aβ40) in clinical routine. Therefore, in this study, we compared the analytical performance of six assays detecting Aβ40 in cerebrospinal fluid (CSF) in six laboratories, according to a standard operating procedure (SOP) recently developed for the implementation of ELISA assays in clinical routine. Methods: Aβ40 assays of six vendors were validated in up to three centers per assay according to recently proposed international consensus validation protocols. The performance parameters included sensitivity, precision, dilutional linearity, recovery, and parallelism. Inter-laboratory variation was determined using a set of 20 CSF samples. In addition, the test results were used to critically evaluate the SOPs that were used to validate the assays. Results: Most performance parameters of the different Aβ40 assays were similar between labs and within the predefined acceptance criteria. The only exceptions were the out-of-range results of recovery for the majority of experiments and of parallelism in three laboratories. Additionally, the experiments to define dilutional linearity and hook effect were not executed correctly in some of the centers. The inter-laboratory variation showed acceptably low levels for all assays. Absolute concentrations measured by the assays varied by a factor of up to 4.7 at the extremes. Conclusion: All validated Aβ40 assays appeared to be of good technical quality and generally performed well according to predefined criteria. A novel version of the validation SOP has been developed based on these findings, to further facilitate the implementation of novel immunoassays in clinical practice.
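Two of the validation parameters listed above, precision and recovery, reduce to short calculations. A sketch with hypothetical Aβ40 numbers (the values and the acceptance window mentioned below are invented for illustration, not taken from the consensus protocols):

```python
from statistics import mean, stdev

def cv_percent(replicates):
    """Within-run precision as the coefficient of variation (%)."""
    return 100 * stdev(replicates) / mean(replicates)

def recovery_percent(measured_spiked, measured_base, amount_added):
    """Spike recovery (%): how much of the added analyte is found back."""
    return 100 * (measured_spiked - measured_base) / amount_added

# Hypothetical Abeta40 replicates (pg/mL) and one spike experiment.
replicates = [10100, 9900, 10200, 9800, 10000]
precision = cv_percent(replicates)              # ~1.6% CV
recovery = recovery_percent(measured_spiked=14800,
                            measured_base=10000,
                            amount_added=5000)  # 96% recovery
```

Recoveries are typically judged against a window such as 80–120%; the out-of-range recovery results mentioned in the abstract are failures of exactly this kind of check.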

  20. Hanford transuranic analytical capability

    International Nuclear Information System (INIS)

    McVey, C.B.

    1995-01-01

    With the current DOE focus on ER/WM programs, an increase in the quantity of waste samples requiring detailed analysis is forecast. One of the prime areas of growth is the demand for DOE environmental protocol analyses of TRU waste samples. Currently there is no laboratory capacity to support analysis of TRU waste samples in excess of 200 nCi/gm. This study recommends that an interim solution be undertaken to provide these services. By adding two glove boxes in room 11A of the 222-S facility, the interim waste analytical needs can be met for a period of four to five years, or until a front-end facility is erected at or near 222-S. The yearly workload is projected to average approximately 600 samples; this figure has been revised downward from 10,000 samples because of budget changes. Until these budget and sample projections become firmer, a long-term option is not recommended at this time. A revision to this document is recommended by March 1996 to review the long-term option and sample projections

  1. Network Coding Protocols for Smart Grid Communications

    DEFF Research Database (Denmark)

    Prior, Rui; Roetter, Daniel Enrique Lucani; Phulpin, Yannick

    2014-01-01

    but then switches to a denser coding structure towards the end. Our systematic mechanism maintains the sparse structure during the recombination of packets at the intermediate nodes. The performance of our protocol is compared by means of simulations of IEEE reference grids against standard master-slave protocols...... used in real systems. Our results show that network coding achieves 100% reliability, even for hostile network conditions, while gathering data 10 times faster than standard master-slave schemes....

  2. Taking an idea to a research protocol

    African Journals Online (AJOL)

    2013-11-13

    Nov 13, 2013 ... We present a nine-step process to assist with developing an idea into a research protocol. This process ensures that ... research ideas with a potentially “weak” evidence base, rather than starting with a specific research question. ...

  3. Asymptotic performance modelling of DCF protocol with prioritized channel access

    Science.gov (United States)

    Choi, Woo-Yong

    2017-11-01

    Recently, the modification of the DCF (Distributed Coordination Function) protocol by the prioritized channel access was proposed to resolve the problem that the DCF performance worsens exponentially as more nodes exist in IEEE 802.11 wireless LANs. In this paper, an asymptotic analytical performance model is presented to analyze the MAC performance of the DCF protocol with the prioritized channel access.

  4. 7 CFR 94.4 - Analytical methods.

    Science.gov (United States)

    2010-01-01

    ... Regulations of the Department of Agriculture (Continued) AGRICULTURAL MARKETING SERVICE (Standards, Inspections, Marketing Practices), DEPARTMENT OF AGRICULTURE (CONTINUED) COMMODITY LABORATORY TESTING PROGRAMS...) FDA Bacteriological Analytical Manual (BAM), AOAC INTERNATIONAL, 481 North Frederick Avenue, Suite 500...

  5. Pre-analytical phase in clinical chemistry laboratory

    OpenAIRE

    Neogi SS; Mehndiratta M; Gupta S; Puri D

    2016-01-01

    The laboratory testing process is divided into the pre-analytical, analytical and post-analytical phases. Obtaining reliable test results requires the prevention and detection of errors at all steps. While standards for the analytical phase have been developed from recognized quality control criteria, there is a scarcity of standards for the preanalytical phase. This phase is the most prone to errors, as the steps involved depend directly on humans and are out of dire...

  6. Analytics for Education

    Science.gov (United States)

    MacNeill, Sheila; Campbell, Lorna M.; Hawksey, Martin

    2014-01-01

    This article presents an overview of the development and use of analytics in the context of education. Using Buckingham Shum's three levels of analytics, the authors present a critical analysis of current developments in the domain of learning analytics, and contrast the potential value of analytics research and development with real world…

  7. Analytical chemistry instrumentation

    International Nuclear Information System (INIS)

    Laing, W.R.

    1986-01-01

    Separate abstracts were prepared for 48 papers in these conference proceedings. The topics covered include: analytical chemistry and the environment; environmental radiochemistry; automated instrumentation; advances in analytical mass spectrometry; Fourier transform spectroscopy; analytical chemistry of plutonium; nuclear analytical chemistry; chemometrics; and nuclear fuel technology

  8. Analytical chemistry instrumentation

    International Nuclear Information System (INIS)

    Laing, W.R.

    1986-01-01

    In nine sections, 48 chapters cover: 1) analytical chemistry and the environment; 2) environmental radiochemistry; 3) automated instrumentation; 4) advances in analytical mass spectrometry; 5) Fourier transform spectroscopy; 6) analytical chemistry of plutonium; 7) nuclear analytical chemistry; 8) chemometrics; and 9) nuclear fuel technology

  9. Protocol Implementation Generator

    DEFF Research Database (Denmark)

    Carvalho Quaresma, Jose Nuno; Probst, Christian W.

    2010-01-01

    necessary tools. In this paper, we present the Protocol Implementation Generator (PiG), a framework that can be used to add protocol generation to protocol negotiation, or to easily share and implement new protocols throughout a network. PiG enables the sharing, verification, and translation...... of communication protocols. With it, partners can suggest a new protocol by sending its specification. After formally verifying the specification, each partner generates an implementation, which can then be used for establishing communication. We also present a practical realisation of the Protocol Implementation...... Generator framework based on the LySatool and a translator from the LySa language into C or Java....

  10. Effects of tailored neck-shoulder pain treatment based on a decision model guided by clinical assessments and standardized functional tests. A study protocol of a randomized controlled trial

    Directory of Open Access Journals (Sweden)

    Björklund Martin

    2012-05-01

    Background: A major problem with rehabilitation interventions for neck pain is that the condition may have multiple causes, thus a single treatment approach is seldom efficient. The present study protocol outlines a single-blinded randomised controlled trial evaluating the effect of tailored treatment for neck-shoulder pain. The treatment is based on a decision model guided by standardized clinical assessment and functional tests with cut-off values. Our main hypothesis is that the tailored treatment has better short-, intermediate- and long-term effects than either non-tailored treatment or treatment-as-usual (TAU) on pain and function. We subsequently hypothesize that tailored and non-tailored treatment both have better effects than TAU. Methods/Design: 120 working women with a minimum of six weeks of nonspecific neck-shoulder pain, aged 20–65, are allocated by minimisation with the factors age, duration of pain, pain intensity and disability into the groups tailored treatment (T), non-tailored treatment (NT) or treatment-as-usual (TAU). Treatment is given to groups T and NT for 11 weeks (27 sessions, evenly distributed). An extensive presentation of the tests and the treatment decision model is provided. The main treatment components are manual therapy, cranio-cervical flexion exercise, strength training, EMG-biofeedback training, treatment for cervicogenic headache, and neck motor control training. A decision algorithm based on the baseline assessment determines the treatment components given to each participant of the T- and NT-groups. Primary outcome measures are physical functioning (Neck Disability Index) and average pain intensity during the last week (Numeric Rating Scale). Secondary outcomes are general improvement (Patient Global Impression of Change scale), symptoms (Profile Fitness Mapping neck questionnaire), capacity to work in the last 6 weeks (quality and quantity), and pressure pain threshold of m. trapezius. Primary and secondary outcomes will

  11. The SIMS trial: adjustable anchored single-incision mini-slings versus standard tension-free midurethral slings in the surgical management of female stress urinary incontinence. A study protocol for a pragmatic, multicentre, non-inferiority randomised controlled trial

    Science.gov (United States)

    Abdel-Fattah, Mohamed; MacLennan, Graeme; Kilonzo, Mary; Assassa, R Phil; McCormick, Kirsty; Davidson, Tracey; McDonald, Alison; N’Dow, James; Wardle, Judith; Norrie, John

    2017-01-01

    Introduction: Single-incision mini-slings (SIMS) represent the third generation of midurethral slings. They have been developed with the aim of offering a true ambulatory procedure for the treatment of female stress urinary incontinence (SUI), with reduced morbidity and earlier recovery while maintaining efficacy similar to that of standard midurethral slings (SMUS). The aim of this study is to determine the clinical and cost-effectiveness of adjustable anchored SIMS compared with tension-free SMUS in the surgical management of female SUI, with 3-year follow-up. Methods and analysis: A pragmatic, multicentre, non-inferiority randomised controlled trial. Primary outcome measure: The primary outcome measure is the patient-reported success rate measured by the Patient Global Impression of Improvement at 12 months. The primary economic outcome will be incremental cost per quality-adjusted life year gained at 12 months. Secondary outcome measures: The secondary outcome measures include adverse events, objective success rates, impact on other lower urinary tract symptoms, health-related quality-of-life profile and sexual function, and reoperation rates for SUI. Secondary economic outcomes include National Health Service and patient primary and secondary care resource use and costs, incremental cost-effectiveness and incremental net benefit. Statistical analysis: The statistical analysis of the primary outcome will be by intention-to-treat, with a per-protocol analysis as well. Results will be displayed as estimates and 95% CIs. CIs around observed differences will then be compared with the prespecified non-inferiority margin. Secondary outcomes will be analysed similarly. Ethics and dissemination: The North of Scotland Research Ethics Committee has approved this study (13/NS/0143). The dissemination plans include an HTA monograph, presentations at international scientific meetings and publications in high-impact, open-access journals. The results will be included in the updates of the National

  12. Entanglement distillation protocols and number theory

    International Nuclear Information System (INIS)

    Bombin, H.; Martin-Delgado, M.A.

    2005-01-01

    We show that the analysis of entanglement distillation protocols for qudits of arbitrary dimension D benefits from applying basic concepts from number theory, since the set Z_D^n associated with Bell diagonal states is a module rather than a vector space. We find that a partition of Z_D^n into divisor classes characterizes the invariant properties of mixed Bell diagonal states under local permutations. We construct a very general class of recursion protocols by means of unitary operations implementing these local permutations. We study these distillation protocols depending on whether we use twirling operations in the intermediate steps or not, and we study them both analytically and numerically with Monte Carlo methods. In the absence of twirling operations, we construct extensions of the quantum privacy algorithms valid for secure communications with qudits of any dimension D. When D is a prime number, we show that the distillation protocols are optimal both qualitatively and quantitatively
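The divisor-class partition mentioned above can be made concrete for a single copy of Z_D: grouping elements by gcd(x, D) yields the classes left invariant by multiplication with units. A small sketch (a single-copy simplification for illustration, not the paper's full Z_D^n construction):

```python
from math import gcd

def divisor_classes(D):
    """Partition Z_D = {0, ..., D-1} into classes by gcd with D.

    Elements sharing gcd(x, D) are related by multiplication with a
    unit of Z_D, so each class is an orbit under invertible scalings.
    """
    classes = {}
    for x in range(D):
        classes.setdefault(gcd(x, D), []).append(x)
    return classes

# For prime D there are only two classes: {0} and the units.
print(divisor_classes(5))  # {5: [0], 1: [1, 2, 3, 4]}
print(divisor_classes(6))  # {6: [0], 1: [1, 5], 2: [2, 4], 3: [3]}
```

For prime D the only classes are {0} and the units, which is consistent with the special role of prime dimensions noted at the end of the abstract.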

  13. ASCI 2010 standardized practice protocol for cardiac magnetic resonance imaging: a report of the Asian society of cardiovascular imaging cardiac computed tomography and cardiac magnetic resonance imaging guideline working group.

    Science.gov (United States)

    Chan, Carmen W S; Choi, Byoung Wook; Jinzaki, Masahiro; Kitagawa, Kakuya; Tsai, I-Chen; Yong, Hwan Seok; Yu, Wei

    2010-12-01

    These practice guidelines are recommended by the Asian Society of Cardiovascular Imaging (ASCI), the sole society in Asia designated for cardiovascular imaging, to provide a framework to healthcare providers for suggested essential elements in cardiac magnetic resonance (CMR) examinations of different disease spectra. The guideline is composed of recommendations on the general technique, acquisition of some basic modules, and protocols on stress tests. The protocols for specific diseases are provided in a table format for quick reference to be easily utilized for everyday clinical CMR.

  14. A Simple XML Producer-Consumer Protocol

    Science.gov (United States)

    Smith, Warren; Gunter, Dan; Quesnel, Darcy; Biegel, Bryan (Technical Monitor)

    2001-01-01

    There are many different projects from government, academia, and industry that provide services for delivering events in distributed environments. The problem with these event services is that they are not general enough to support all uses and they speak different protocols so that they cannot interoperate. We require such interoperability when we, for example, wish to analyze the performance of an application in a distributed environment. Such an analysis might require performance information from the application, computer systems, networks, and scientific instruments. In this work we propose and evaluate a standard XML-based protocol for the transmission of events in distributed systems. One recent trend in government and academic research is the development and deployment of computational grids. Computational grids are large-scale distributed systems that typically consist of high-performance compute, storage, and networking resources. Examples of such computational grids are the DOE Science Grid, the NASA Information Power Grid (IPG), and the NSF Partnerships for Advanced Computing Infrastructure (PACIs). The major effort to deploy these grids is in the area of developing the software services to allow users to execute applications on these large and diverse sets of resources. These services include security, execution of remote applications, managing remote data, access to information about resources and services, and so on. There are several toolkits for providing these services such as Globus, Legion, and Condor. As part of these efforts to develop computational grids, the Global Grid Forum is working to standardize the protocols and APIs used by various grid services. This standardization will allow interoperability between the client and server software of the toolkits that are providing the grid services. 
The goal of the Performance Working Group of the Grid Forum is to standardize protocols and representations related to the storage and distribution of
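A minimal illustration of what such an XML event message might look like, using a hypothetical schema (the element and attribute names below are invented here, not the Grid Forum's actual format):

```python
import xml.etree.ElementTree as ET

def make_event(source, event_type, value, timestamp):
    """Serialize one monitoring event as XML (hypothetical schema)."""
    event = ET.Element("event", {"source": source, "type": event_type})
    ET.SubElement(event, "timestamp").text = timestamp
    ET.SubElement(event, "value").text = str(value)
    return ET.tostring(event, encoding="unicode")

def parse_event(xml_text):
    """Consumer side: recover the event fields from the XML text."""
    event = ET.fromstring(xml_text)
    return {
        "source": event.get("source"),
        "type": event.get("type"),
        "value": float(event.find("value").text),
    }

msg = make_event("ipg.node42", "cpu.load", 0.73, "2001-05-01T12:00:00Z")
parsed = parse_event(msg)
```

Because producer and consumer agree only on the XML vocabulary, the consumer needs no knowledge of the producer's implementation, which is the interoperability argument the paper makes.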

  15. Importance of implementing an analytical quality control system in a core laboratory.

    Science.gov (United States)

    Marques-Garcia, F; Garcia-Codesal, M F; Caro-Narros, M R; Contreras-SanFeliciano, T

    2015-01-01

    The aim of the clinical laboratory is to provide useful information for the screening, diagnosis and monitoring of disease. The laboratory should ensure the quality of the extra-analytical and analytical processes, based on set criteria. To do this, it develops and implements a system of internal quality control, designed to detect errors, and compares its data with those of other laboratories through external quality control. In this way it has a tool to verify that the set objectives are being met and, in case of errors, to take corrective actions and ensure the reliability of the results. This article sets out to describe the design and implementation of an internal quality control protocol, as well as its periodic assessment at 6-month intervals to determine compliance with pre-determined specifications (Stockholm Consensus(1)). A total of 40 biochemical and 15 immunochemical methods were evaluated using three different control materials. Next, a standard operating procedure was drawn up to develop a system of internal quality control that included calculating the error of the analytical process, setting quality specifications, and verifying compliance. The quality control data were then statistically summarized as means, standard deviations, and coefficients of variation, as well as systematic, random, and total errors. The quality specifications were then fixed and the operational rules to apply in the analytical process were calculated. Finally, our data were compared with those of other laboratories through an external quality assurance program. The development of an analytical quality control system is a highly structured process. It should be designed to detect errors that compromise the stability of the analytical process. The laboratory should review its quality indicators (systematic, random and total error) at regular intervals, in order to ensure that they are meeting pre-determined specifications and, if not, apply the appropriate corrective actions.
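The statistics named in the protocol above (mean, standard deviation, coefficient of variation, systematic, random and total error) amount to a few lines of code. A sketch with invented control data; the total-error convention TE = |bias| + 1.65·CV is a common choice, not necessarily the one these authors used:

```python
from statistics import mean, stdev

def qc_summary(measurements, target):
    """Summarize one internal QC level: imprecision, bias, total error.

    Uses the common convention TE = |bias| + 1.65 * CV (an assumption,
    not taken from the article).
    """
    m = mean(measurements)
    cv = 100 * stdev(measurements) / m      # random error, %
    bias = 100 * (m - target) / target      # systematic error, %
    return {"mean": m, "cv": cv, "bias": bias,
            "total_error": abs(bias) + 1.65 * cv}

# Hypothetical glucose control material with a 100 mg/dL target.
summary = qc_summary([101, 99, 102, 100, 98, 100], target=100.0)
ok = summary["total_error"] <= 6.9   # compare against a quality specification
```

Running the same summary at each review interval, and flagging levels where `ok` is false, is exactly the "verify compliance, else apply corrective actions" loop the article describes.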

  16. Clustering in analytical chemistry.

    Science.gov (United States)

    Drab, Klaudia; Daszykowski, Michal

    2014-01-01

    Data clustering plays an important role in the exploratory analysis of analytical data, and the use of clustering methods has been acknowledged in different fields of science. In this paper, principles of data clustering are presented with a direct focus on the clustering of analytical data. The role of the clustering process in the analytical workflow is underlined, and its potential impact on that workflow is emphasized.

  17. Network Memory Protocol

    National Research Council Canada - National Science Library

    Wilcox, D

    1997-01-01

    This report presents initial research into the design of a new computer system local area network transport layer protocol, designated the network memory protocol, which provides clients with direct...

  18. Analyticity without Differentiability

    Science.gov (United States)

    Kirillova, Evgenia; Spindler, Karlheinz

    2008-01-01

    In this article we derive all salient properties of analytic functions, including the analytic version of the inverse function theorem, using only the most elementary convergence properties of series. Not even the notion of differentiability is required to do so. Instead, analytical arguments are replaced by combinatorial arguments exhibiting…

  19. Esthetic outcomes in relation to implant-abutment interface design following a standardized treatment protocol in a multicenter randomized controlled trial--a cohort of 12 cases at 1-year follow-up.

    Science.gov (United States)

    McGuire, Michael K; Scheyer, Todd; Ho, Daniel K; Stanford, Clark M; Feine, Jocelyne S; Cooper, Lyndon F

    2015-01-01

    The design of an implant-abutment interface may have an impact on the peri-implant soft tissue esthetics. In an ongoing randomized controlled trial (RCT) with 141 participants, the authors evaluated the peri-implant tissue responses around three different implant-abutment interface designs used to replace single teeth in the esthetic zone. The aim of this report is to describe the treatment protocol utilized in this ongoing RCT by (1) demonstrating in detail a clinical case treated under this protocol and (2) reporting peri-implant soft tissue responses in a cohort of 12 representative cases from the RCT at 1-year follow-up. Male and female adults requiring single implants in the anterior maxilla were enrolled in the RCT according to the study protocol. Five months following any required extraction and/or socket bone grafting/ridge augmentation, one of the following three implant-abutment interfaces was placed and immediately provisionalized: (1) conical interface (CI; OsseoSpeed, Dentsply Implants), n = 4; (2) flat-to-flat interface (FI; NobelSpeedy Replace, Nobel Biocare), n = 4; or (3) platform-switch interface (PS; NanoTite Certain Prevail, Biomet 3i), n = 4. Twelve weeks later, definitive crowns were delivered. Throughout the treatment, peri-implant buccal gingival zenith height and mesial/distal papilla height were measured on stereotactic device photographs, and pink esthetic scores (PES) were determined. The demographics of the participants in each of the three implant-abutment interface groups were very similar. All 12 study sites had ideal ridge form with a minimum width of 5.5 mm following implant site development performed according to the described treatment protocol. Using this treatment protocol for single-tooth replacement in the anterior maxilla, the clinicians were able to obtain esthetic peri-implant soft tissue outcomes with all three types of implant-abutment interface designs at 1-year follow-up as shown by the Canfield data and PES. The

  20. Accuracy of NHANES periodontal examination protocols.

    Science.gov (United States)

    Eke, P I; Thornton-Evans, G O; Wei, L; Borgnakke, W S; Dye, B A

    2010-11-01

    This study evaluates the accuracy of periodontitis prevalence determined by the National Health and Nutrition Examination Survey (NHANES) partial-mouth periodontal examination protocols. True periodontitis prevalence was determined in a new convenience sample of 454 adults ≥ 35 years old, by a full-mouth "gold standard" periodontal examination. This actual prevalence was compared with prevalence resulting from analysis of the data according to the protocols of NHANES III and NHANES 2001-2004, respectively. Both NHANES protocols substantially underestimated the prevalence of periodontitis by 50% or more, depending on the periodontitis case definition used, and thus performed below threshold levels for moderate-to-high levels of validity for surveillance. Adding measurements from lingual or interproximal sites to the NHANES 2001-2004 protocol did not improve the accuracy sufficiently to reach acceptable sensitivity thresholds. These findings suggest that NHANES protocols produce high levels of misclassification of periodontitis cases and thus have low validity for surveillance and research.
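The underestimation reported above is essentially a sensitivity problem: a partial-mouth protocol can only miss true cases, never add them. A toy calculation against a full-mouth gold standard (the data below are invented for illustration, not the study's):

```python
def protocol_accuracy(gold, partial):
    """Sensitivity/specificity of a partial-mouth periodontitis call
    against the full-mouth gold standard (1 = case, 0 = healthy)."""
    tp = sum(g and p for g, p in zip(gold, partial))
    fn = sum(g and not p for g, p in zip(gold, partial))
    tn = sum(not g and not p for g, p in zip(gold, partial))
    fp = sum(not g and p for g, p in zip(gold, partial))
    return {"sensitivity": tp / (tp + fn),
            "specificity": tn / (tn + fp),
            "prevalence_ratio": sum(partial) / sum(gold)}

# Toy data: a partial protocol that misses half of the true cases.
gold    = [1, 1, 1, 1, 0, 0, 0, 0]
partial = [1, 1, 0, 0, 0, 0, 0, 0]
acc = protocol_accuracy(gold, partial)
```

In this toy example the partial protocol has perfect specificity but only 50% sensitivity, so estimated prevalence is half the true value, the same shape of error the study reports for the NHANES protocols.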

  1. Field Monitoring Protocol. Heat Pump Water Heaters

    Energy Technology Data Exchange (ETDEWEB)

    Sparn, B. [National Renewable Energy Lab. (NREL), Golden, CO (United States); Earle, L. [National Renewable Energy Lab. (NREL), Golden, CO (United States); Christensen, D. [National Renewable Energy Lab. (NREL), Golden, CO (United States); Maguire, J. [National Renewable Energy Lab. (NREL), Golden, CO (United States); Wilson, E. [National Renewable Energy Lab. (NREL), Golden, CO (United States); Hancock, C. E. [Mountain Energy Partnership, Longmont, CO (United States)

    2013-02-01

    This document provides a standard field monitoring protocol for evaluating the installed performance of Heat Pump Water Heaters in residential buildings. The report is organized to be consistent with the chronology of field test planning and execution. Research questions are identified first, followed by a discussion of analysis methods, and then the details of measuring the required information are laid out. A field validation of the protocol at a house near the NREL campus is included for reference.

  2. Field Monitoring Protocol: Heat Pump Water Heaters

    Energy Technology Data Exchange (ETDEWEB)

    Sparn, B.; Earle, L.; Christensen, D.; Maguire, J.; Wilson, E.; Hancock, E.

    2013-02-01

    This document provides a standard field monitoring protocol for evaluating the installed performance of Heat Pump Water Heaters in residential buildings. The report is organized to be consistent with the chronology of field test planning and execution. Research questions are identified first, followed by a discussion of analysis methods, and then the details of measuring the required information are laid out. A field validation of the protocol at a house near the NREL campus is included for reference.

  3. Evaluation of a reduced centrifugation time and higher centrifugal force on various general chemistry and immunochemistry analytes in plasma and serum.

    Science.gov (United States)

    Møller, Mette F; Søndergaard, Tove R; Kristensen, Helle T; Münster, Anna-Marie B

    2017-09-01

    Background: Centrifugation of blood samples is an essential preanalytical step in the clinical biochemistry laboratory. Centrifugation settings are often altered to optimize sample flow and turnaround time. Few studies have addressed the effect of altering centrifugation settings on analytical quality, and almost all have been done using collection tubes with gel separator. Methods: In this study, we compared a centrifugation time of 5 min at 3000 × g to a standard protocol of 10 min at 2200 × g. Nine selected general chemistry and immunochemistry analytes and interference indices were studied in lithium heparin plasma tubes and serum tubes without gel separator. Results were evaluated using mean bias, difference plots and coefficient of variation, compared with the maximum allowable bias and coefficient of variation used in laboratory routine quality control. Results: For all analytes except lactate dehydrogenase, the results were within the predefined acceptance criteria, indicating that the analytical quality was not compromised. Lactate dehydrogenase showed higher values after centrifugation for 5 min at 3000 × g; the mean bias was 6.3 ± 2.2% and the coefficient of variation was 5%. Conclusions: We found that a centrifugation protocol of 5 min at 3000 × g can be used for the general chemistry and immunochemistry analytes studied, with the possible exception of lactate dehydrogenase, which requires further assessment.
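The mean-bias evaluation described above can be sketched as a paired comparison of the alternative protocol against the routine one; the values and the 4% acceptance limit below are invented for illustration:

```python
from statistics import mean

def percent_differences(test_values, reference_values):
    """Paired percent differences of a test protocol vs. the reference."""
    return [100 * (t - r) / r
            for t, r in zip(test_values, reference_values)]

def bias_acceptable(test_values, reference_values, max_bias_percent):
    """Mean bias (%) and whether it is within the allowable limit."""
    diffs = percent_differences(test_values, reference_values)
    return mean(diffs), abs(mean(diffs)) <= max_bias_percent

# Hypothetical paired LDH results (U/L): short spin vs. standard spin.
short_spin = [212, 225, 198, 240, 205]
standard   = [200, 210, 190, 225, 195]
bias, ok = bias_acceptable(short_spin, standard, max_bias_percent=4.0)
```

In this made-up example the short-spin LDH results fail the bias criterion, mirroring the paper's finding that lactate dehydrogenase needs further assessment before the faster protocol is adopted.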

  4. Analytical pervaporation: a key technique in the enological laboratory.

    Science.gov (United States)

    Luque de Castro, Maria D; Luque-García, Jose L; Mataix, Eva

    2003-01-01

    This paper reviews the use of analytical pervaporation (defined as the integration of 2 different analytical separation principles, evaporation and gas diffusion, in a single micromodule) coupled to flow-injection manifolds for the determination of analytes of interest in enology; the review discusses the advantages that these techniques can provide in wine analytical laboratories. Special attention is given to methods that enable the determination of either of 2 volatile analytes, or of one volatile analyte and one nonvolatile analyte by taking advantage of the versatility of the designed approaches. In a comparison of these methods with the official and/or standard methods, the results showed good agreement. In addition, the new methods offer improvements in linear determination range, quantitation limit, precision, rapidity, and potential for full automation. Thus, this review demonstrates that although the old technologies used in wine analytical laboratories may be supported by official and standard methods, they should be replaced by properly validated, new, and automated technologies.

  5. Advanced dementia pain management protocols.

    Science.gov (United States)

    Montoro-Lorite, Mercedes; Canalias-Reverter, Montserrat

    2017-08-04

    Pain management in advanced dementia is complex because of the neurological deficits present in these patients, and nurses are directly responsible for providing interventions for the evaluation, management and relief of pain in people suffering from this health problem. In order to support decision-makers, pain experts recommend the use of standardized protocols to guide pain management, but in Spain comprehensive pain management protocols have not yet been developed for advanced dementia. This article reflects the need for integrated management of pain in advanced dementia. From a review and analysis of the most current and relevant studies in the literature, we summarize the scales available for assessing pain in these patients, with the observational PAINAD scale being the most recommended for the hospital setting. In addition, we provide an overview of comprehensive pain management in advanced dementia through the conceptual framework «a hierarchy of pain assessment techniques by McCaffery and Pasero» for the development and implementation of standardized protocols, including a four-phase cyclical process (evaluation, planning/performance, re-evaluation and recording), which can facilitate the correct management of pain in these patients. Copyright © 2017 Elsevier España, S.L.U. All rights reserved.

  6. Protocol for a systematic review of N-of-1 trial protocol guidelines and protocol reporting guidelines.

    Science.gov (United States)

    Porcino, Antony J; Punja, Salima; Chan, An-Wen; Kravitz, Richard; Orkin, Aaron; Ravaud, Philippe; Schmid, Christopher H; Vohra, Sunita

    2017-07-06

    N-of-1 trials are multiple cross-over trials done in individual participants, generating individual treatment effect information. While the CONSORT Extension for N-of-1 Trials (CENT) reporting guideline and the Standard Protocol Items: Recommendations for Interventional Trials (SPIRIT) already exist, there is no standardized recommendation for the reporting of N-of-1 trial protocols. The objective of this study is to evaluate the current literature on N-of-1 design and reporting to identify key elements of rigorous N-of-1 protocol design. We will conduct a systematic search for all N-of-1 trial guidelines and protocol-reporting guidelines published in the peer-reviewed literature. We will search Medline, Embase, PsycINFO, CINAHL, the Cochrane Methodology Register, CENTRAL, and the NHS Economic Evaluation Database. Eligible articles will contain explicit guidance on N-of-1 protocol construction or reporting. Two reviewers will independently screen all titles and abstracts and then undertake full-text reviews of potential articles to determine eligibility. One reviewer will perform data extraction of selected articles, checked by the second reviewer. Data analysis will ascertain common features of N-of-1 trial protocols and compare them to the SPIRIT and CENT items. This systematic review assesses recommendations on the design and reporting of N-of-1 trial protocols. These findings will inform an international Delphi development process for an N-of-1 trial protocol reporting guideline. The development of this guideline is critical for improving the quality of N-of-1 protocols, leading to improvements in the quality of published N-of-1 trial research.

  7. Pre-Analytical Parameters Affecting Vascular Endothelial Growth Factor Measurement in Plasma: Identifying Confounders.

    Directory of Open Access Journals (Sweden)

    Johanna M Walz

    Full Text Available Vascular endothelial growth factor-A (VEGF-A) is intensively investigated in various medical fields. However, comparing VEGF-A measurements is difficult because sample acquisition and pre-analytic procedures differ between studies. We therefore investigated which variables act as confounders of VEGF-A measurements. Following a standardized protocol, blood was taken at three clinical sites from six healthy participants (one male and one female participant at each center) twice one week apart. The following pre-analytical parameters were varied in order to analyze their impact on VEGF-A measurements: analyzing center, anticoagulant (EDTA vs. PECT / CTAD), cannula (butterfly vs. neonatal), type of centrifuge (swing-out vs. fixed-angle), time before and after centrifugation, filling level (completely filled vs. half-filled tubes) and analyzing method (ELISA vs. multiplex bead array). Additionally, intrapersonal variations over time and sex differences were explored. Statistical analysis was performed using a linear regression model. The following parameters were identified as statistically significant independent confounders of VEGF-A measurements: analyzing center, anticoagulant, centrifuge, analyzing method and sex of the proband. The following parameters were no significant confounders in our data set: intrapersonal variation over one week, cannula, time before and after centrifugation and filling level of collection tubes. VEGF-A measurement results can be affected significantly by the identified pre-analytical parameters. We recommend the use of CTAD anticoagulant, a standardized type of centrifuge and one central laboratory using the same analyzing method for all samples.

  8. Pre-Analytical Parameters Affecting Vascular Endothelial Growth Factor Measurement in Plasma: Identifying Confounders.

    Science.gov (United States)

    Walz, Johanna M; Boehringer, Daniel; Deissler, Heidrun L; Faerber, Lothar; Goepfert, Jens C; Heiduschka, Peter; Kleeberger, Susannah M; Klettner, Alexa; Krohne, Tim U; Schneiderhan-Marra, Nicole; Ziemssen, Focke; Stahl, Andreas

    2016-01-01

    Vascular endothelial growth factor-A (VEGF-A) is intensively investigated in various medical fields. However, comparing VEGF-A measurements is difficult because sample acquisition and pre-analytic procedures differ between studies. We therefore investigated which variables act as confounders of VEGF-A measurements. Following a standardized protocol, blood was taken at three clinical sites from six healthy participants (one male and one female participant at each center) twice one week apart. The following pre-analytical parameters were varied in order to analyze their impact on VEGF-A measurements: analyzing center, anticoagulant (EDTA vs. PECT / CTAD), cannula (butterfly vs. neonatal), type of centrifuge (swing-out vs. fixed-angle), time before and after centrifugation, filling level (completely filled vs. half-filled tubes) and analyzing method (ELISA vs. multiplex bead array). Additionally, intrapersonal variations over time and sex differences were explored. Statistical analysis was performed using a linear regression model. The following parameters were identified as statistically significant independent confounders of VEGF-A measurements: analyzing center, anticoagulant, centrifuge, analyzing method and sex of the proband. The following parameters were no significant confounders in our data set: intrapersonal variation over one week, cannula, time before and after centrifugation and filling level of collection tubes. VEGF-A measurement results can be affected significantly by the identified pre-analytical parameters. We recommend the use of CTAD anticoagulant, a standardized type of centrifuge and one central laboratory using the same analyzing method for all samples.
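
    The linear-regression-with-confounders approach described above can be sketched as an ordinary least squares fit over dummy-coded factors. The coding scheme, column names and values below are illustrative assumptions, not the study's data:

```python
import numpy as np

# Each row is one VEGF-A measurement with dummy-coded pre-analytical
# factors; all column names and values are invented for illustration.
# Columns: intercept, anticoagulant=CTAD, centrifuge=fixed-angle,
#          method=multiplex, sex=female
X = np.array([
    [1, 0, 0, 0, 0],
    [1, 1, 0, 0, 0],
    [1, 0, 1, 0, 1],
    [1, 1, 1, 1, 1],
    [1, 0, 0, 1, 0],
    [1, 1, 0, 1, 1],
], dtype=float)
y = np.array([45.0, 30.0, 52.0, 38.0, 60.0, 44.0])  # pg/mL, synthetic

# Ordinary least squares: each coefficient estimates the independent
# effect of one candidate confounder on the measured VEGF-A level.
beta, residuals, rank, _ = np.linalg.lstsq(X, y, rcond=None)
```

    In a real analysis one would also test each coefficient for significance; the point here is only how dummy coding lets one model separate the independent contributions of the pre-analytical factors.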

  9. Efficacy of standard (SLA) and modified sandblasted and acid-etched (SLActive) dental implants in promoting immediate and/or early occlusal loading protocols: a systematic review of prospective studies.

    Science.gov (United States)

    Chambrone, Leandro; Shibli, Jamil Awad; Mercúrio, Carlos Eduardo; Cardoso, Bruna; Preshaw, Philip M

    2015-04-01

    To assess the survival percentage and the clinical and radiographic outcomes of sandblasted and acid-etched (SLA) dental implants and their modified-surface counterpart (SLActive) in protocols involving immediate and early occlusal loading. MEDLINE, EMBASE and the Cochrane Oral Health Group's Trials Register CENTRAL were searched in duplicate up to, and including, June 2013 to include randomised controlled trials (RCTs) and prospective observational studies of at least 6-month duration published in all languages. Studies limited to patients treated with SLA and/or SLActive implants involving a treatment protocol describing immediate and early loading of these implants were eligible for inclusion. Data on clinical and/or radiographic outcomes following implant placement were considered for inclusion. Of the 447 potentially eligible publications identified by the search strategy, seven RCTs comprising a total of 853 implants (8% titanium plasma-sprayed, 41.5% SLA and 50.5% SLActive) and 12 prospective observational studies including 1394 SLA and 145 SLActive implants were included in this review. According to the Cochrane Collaboration's tool for assessing risk of bias, one of the studies was considered to be at a low risk of bias, whereas the remaining studies were considered to be at an unclear risk. Regarding the observational studies, all of them presented a medium methodological quality based on the Modified Newcastle-Ottawa scale. There were no significant differences reported in the studies in relation to implant loss or clinical parameters between the immediate/early loading and delayed loading protocols. Overall, 95% of SLA and 97% of SLActive implants survived to the end of follow-up. Despite the positive findings of the included studies, few RCTs were available for analysis of SLActive implants. Study heterogeneity, scarcity of data and the lack of pooled estimates limit comparisons between studies and should be considered when interpreting these results.

  10. News for analytical chemists

    DEFF Research Database (Denmark)

    Andersen, Jens Enevold Thaulov; Karlberg, Bo

    2009-01-01

    The EuCheMS Division of Analytical Chemistry (DAC) maintains a website with information on analytical chemistry groups at European universities (www.dac-euchems.org). Everyone may contribute to the database, and contributors are responsible for an annual update of the information; the service is offered free of charge. The report on the activities of DAC during 2008 was published in journals of analytical chemistry, where Manfred Grasserbauer contributed his personal view on analytical chemistry in the assessment of climate changes and the sustainable application of natural resources, directed to various topics of analytical chemistry. Although affected by the global financial crisis, the Euroanalysis Conference will be held on 6 to 10 September in Innsbruck, Austria. For next year, the programme for the analytical section of the 3rd European Chemistry Congress is in preparation.

  11. Characterization and classification of external quality assessment schemes (EQA) according to objectives such as evaluation of method and participant bias and standard deviation. External Quality Assessment (EQA) Working Group A on Analytical Goals in Laboratory Medicine.

    Science.gov (United States)

    Libeer, J C; Baadenhuijsen, H; Fraser, C G; Petersen, P H; Ricós, C; Stöckl, D; Thienpont, L

    1996-08-01

    Within the scope of this paper, the Working Group has attempted to place external quality assessment (EQA) within the whole context of quality management in laboratory medicine. First, the objectives of EQA schemes are defined and current EQA schemes evaluated. In most schemes, the objectives are not defined a priori and do not allow the definition of the origin of unacceptable individual results from participants. There is an ongoing trend for making traditional EQA schemes more interesting for the participants. Analysis of the factors involved in analytical quality allow the definition of the essential analytical tasks of educational EQA schemes. Beside these quality control tasks, educational EQA also includes quality assurance elements. EQA today has not only an important role to play in the assessment of each participant's performance but also in the assessment of the method. Efficiency of the schemes and educational impact can be improved by appropriate scheme designs according to objectives. After this theoretical approach, some practical examples of problem related EQA designs are given.

  12. Analytical Chemistry in Russia.

    Science.gov (United States)

    Zolotov, Yuri

    2016-09-06

    Research in Russian analytical chemistry (AC) is carried out on a significant scale, and the analytical service solves practical tasks of geological survey, environmental protection, medicine, industry, agriculture, etc. The education system trains highly skilled professionals in AC. The development and especially the manufacturing of analytical instruments should be improved; even so, there are several good domestic instruments, and others satisfy some requirements. Russian AC has rather good historical roots.

  13. Spacelab system analysis: The modified free access protocol: An access protocol for communication systems with periodic and Poisson traffic

    Science.gov (United States)

    Ingels, Frank; Owens, John; Daniel, Steven

    1989-01-01

    The protocol definition and terminal hardware for the modified free access protocol, a communications protocol similar to Ethernet, are developed. A MFA protocol simulator and a CSMA/CD math model are also developed. The protocol is tailored to communication systems where the total traffic may be divided into scheduled traffic and Poisson traffic. The scheduled traffic should occur on a periodic basis but may occur after a given event such as a request for data from a large number of stations. The Poisson traffic will include alarms and other random traffic. The purpose of the protocol is to guarantee that scheduled packets will be delivered without collision. This is required in many control and data collection systems. The protocol uses standard Ethernet hardware and software, requiring minimum modifications to an existing system. The modification to the protocol only affects the Ethernet transmission privileges and does not affect the Ethernet receiver.
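
    The separation of guaranteed (scheduled) traffic from contending (Poisson) traffic can be illustrated with a toy slot model. The slot counts, reserved positions and function names below are invented for illustration and are not part of the MFA specification:

```python
import random

SLOTS_PER_CYCLE = 10
RESERVED = (0, 5)  # slots set aside for scheduled traffic; values are illustrative

def build_cycle(scheduled, poisson, rng=random):
    """Place scheduled packets in reserved slots (collision-free by construction);
    Poisson packets contend randomly for the remaining slots."""
    assert len(scheduled) <= len(RESERVED), "more scheduled packets than reserved slots"
    cycle = {slot: [pkt] for slot, pkt in zip(RESERVED, scheduled)}
    contention = [s for s in range(SLOTS_PER_CYCLE) if s not in RESERVED]
    for pkt in poisson:
        cycle.setdefault(rng.choice(contention), []).append(pkt)
    # A contention slot holding more than one packet is a collision;
    # reserved slots can never collide.
    collisions = [s for s in contention if len(cycle.get(s, [])) > 1]
    return cycle, collisions
```

    The key property mirrored here is the one the abstract states: scheduled packets are delivered without collision, while only the random (Poisson) traffic can collide.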

  14. Connectivity-Based Reliable Multicast MAC Protocol for IEEE 802.11 Wireless LANs

    Directory of Open Access Journals (Sweden)

    Woo-Yong Choi

    2009-01-01

    Full Text Available We propose an efficient reliable multicast MAC protocol based on the connectivity information among the recipients. Enhancing the BMMM (Batch Mode Multicast MAC) protocol, the reliable multicast MAC protocol significantly reduces the number of RAK (Request for ACK) frame transmissions in a reasonable computational time and enhances the MAC performance. By analytical performance analysis, the throughputs of the BMMM protocol and our proposed MAC protocol are derived. Numerical examples show that our proposed MAC protocol increases the reliable multicast MAC performance for IEEE 802.11 wireless LANs.

  15. Science Update: Analytical Chemistry.

    Science.gov (United States)

    Worthy, Ward

    1980-01-01

    Briefly discusses new instrumentation in the field of analytical chemistry. Advances in liquid chromatography, photoacoustic spectroscopy, the use of lasers, and mass spectrometry are also discussed. (CS)

  16. Analytical Electron Microscope

    Data.gov (United States)

    Federal Laboratory Consortium — The Titan 80-300 is a transmission electron microscope (TEM) equipped with spectroscopic detectors to allow chemical, elemental, and other analytical measurements to...

  17. Agricultural Soil Spectral Response and Properties Assessment: Effects of Measurement Protocol and Data Mining Technique

    Directory of Open Access Journals (Sweden)

    Asa Gholizadeh

    2017-10-01

    Full Text Available Soil spectroscopy has been shown to be a fast, cost-effective, environmentally friendly, non-destructive, reproducible and repeatable analytical technique. Soil components, as well as types of instruments, protocols, sampling methods, sample preparation, spectral acquisition techniques and analytical algorithms, have a combined influence on the final performance. Therefore, it is important to characterize these differences and to introduce an effective approach in order to minimize the technical factors that alter reflectance spectra and the consequent prediction. To quantify this alteration, a joint project between the Czech University of Life Sciences Prague (CULS) and Tel-Aviv University (TAU) was conducted to estimate Cox, pH-H2O, pH-KCl and selected forms of Fe and Mn. Two different soil spectral measurement protocols and two data mining techniques were used to examine seventy-eight soil samples from five agricultural areas in different parts of the Czech Republic. Spectral measurements at both laboratories were made using different ASD spectroradiometers. The CULS protocol was based on employing a contact probe (CP) spectral measurement scheme, while the TAU protocol was carried out using a CP measurement method accompanied by the internal soil standard (ISS) procedure. The two spectral datasets, acquired from the different protocols, were both analyzed using the partial least squares regression (PLSR) technique as well as PARACUDA II®, a new data mining engine for optimizing PLSR models. The results showed that spectra based on the CULS setup (non-ISS) demonstrated significantly higher albedo intensity and reflectance values relative to the TAU setup with ISS. However, the majority of statistics using the TAU protocol were not noticeably better than the CULS spectra. The paper also highlighted that under both measurement protocols, the PARACUDA II® engine proved to be a powerful tool for providing better results than PLSR. Such initiative is not only a way to
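
    The PLSR modelling step mentioned above can be sketched with a minimal PLS1 (NIPALS) implementation. This is a generic textbook version, not the PARACUDA II® engine or the study's actual pipeline, and the data in the test are synthetic:

```python
import numpy as np

def pls1_fit(X, y, n_components=2):
    """Minimal PLS1 (NIPALS) sketch: regress a soil property y on spectra X.
    Returns regression coefficients plus the training means used for centering."""
    x_mean, y_mean = X.mean(axis=0), y.mean()
    Xc, yc = X - x_mean, y - y_mean
    W, P, q = [], [], []
    for _ in range(n_components):
        w = Xc.T @ yc
        w = w / np.linalg.norm(w)   # weight: direction of maximal covariance with y
        t = Xc @ w                  # scores
        tt = t @ t
        p = Xc.T @ t / tt           # X loadings
        qk = yc @ t / tt            # y loading
        Xc = Xc - np.outer(t, p)    # deflate X
        yc = yc - qk * t            # deflate y
        W.append(w); P.append(p); q.append(qk)
    W, P, q = np.array(W).T, np.array(P).T, np.array(q)
    coef = W @ np.linalg.solve(P.T @ W, q)
    return coef, x_mean, y_mean

def pls1_predict(X, coef, x_mean, y_mean):
    return (X - x_mean) @ coef + y_mean
```

    With as many components as predictors, PLS1 reduces to ordinary least squares; in spectroscopy one instead keeps far fewer components than wavelengths, which is what makes PLSR usable on highly collinear spectra.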

  18. A Weak Value Based QKD Protocol Robust Against Detector Attacks

    Science.gov (United States)

    Troupe, James

    2015-03-01

    We propose a variation of the BB84 quantum key distribution protocol that utilizes the properties of weak values to ensure the validity of the quantum bit error rate estimates used to detect an eavesdropper. The protocol is shown theoretically to be secure against recently demonstrated attacks utilizing detector blinding and control, and should also be robust against all detector-based hacking. Importantly, the new protocol promises to achieve this additional security without negatively impacting the secure key generation rate as compared to that originally promised by the standard BB84 scheme. Implementation of the weak measurements needed by the protocol should be very feasible using standard quantum optical techniques.

  19. Low cost CD4 enumeration using generic monoclonal antibody reagents and a two-color user-defined MultiSET protocol.

    Science.gov (United States)

    Pattanapanyasat, Kovit; Shain, Hla; Prasertsilpa, Varipin; Noulsri, Egarit; Lerdwana, Surada; Eksaengsri, Achara

    2006-09-15

    The standard three-tube, three-color flow cytometric method utilizing the TriTEST reagents in conjunction with the MultiSET software commonly used in most laboratories in Thailand for CD4 enumeration is expensive and thus unavailable to most HIV-infected patients. A more affordable method, i.e., the PanLeucogating protocol using only two monoclonal antibody reagents, has been described but requires the use of the CellQUEST software that does not have automatic gating and reporting facilities. We describe a simple protocol that utilizes a two-color user-defined protocol with the automated MultiSET software for the acquisition, analysis, and reporting of CD4 results. A two-color user-defined protocol was set up following instructions in the Becton Dickinson Biosciences MultiSET manual, adhering strictly to the information regarding the Gate and Attractor Hierarchy for analyzing various reagent combinations. This simple two-color user-defined MultiSET software was evaluated using generic monoclonal reagents in comparison with the standard TriTEST/MultiSET protocol. The two-color user-defined MultiSET software is easy to use. It requires only modification of the original MultiSET program and the results obtained are comparable with those derived from the standard TriTEST/MultiSET protocol. The use of this easy and reliable two-color user-defined MultiSET protocol represents an affordable alternative to CD4 testing in resource-poor settings. Copyright 2006 International Society for Analytical Cytology.

  20. Automated extraction protocol for quantification of SARS-Coronavirus RNA in serum: an evaluation study

    Directory of Open Access Journals (Sweden)

    Lui Wing-bong

    2006-02-01

    Full Text Available Background We have previously developed a test for the diagnosis and prognostic assessment of the severe acute respiratory syndrome (SARS) based on the detection of the SARS-coronavirus RNA in serum by real-time quantitative reverse transcriptase polymerase chain reaction (RT-PCR). In this study, we evaluated the feasibility of automating the serum RNA extraction procedure in order to increase the throughput of the assay. Methods An automated nucleic acid extraction platform using the MagNA Pure LC instrument (Roche Diagnostics) was evaluated. We developed a modified protocol in compliance with the recommended biosafety guidelines from the World Health Organization, based on the use of the MagNA Pure total nucleic acid large volume isolation kit, for the extraction of SARS-coronavirus RNA. The modified protocol was compared with a column-based extraction kit (QIAamp viral RNA mini kit, Qiagen) for quantitative performance, analytical sensitivity and precision. Results The newly developed automated protocol was shown to be free from carry-over contamination and to have comparable performance with other standard protocols and kits designed for the MagNA Pure LC instrument. However, the automated method was found to be less sensitive, less precise and led to consistently lower serum SARS-coronavirus concentrations when compared with the column-based extraction method. Conclusion As the diagnostic efficiency and prognostic value of the serum SARS-CoV RNA RT-PCR test is critically associated with the analytical sensitivity and quantitative performance contributed both by the RNA extraction and RT-PCR components of the test, we recommend the use of the column-based manual RNA extraction method.

  1. A simple protocol for the routine calibration of pH meters

    Directory of Open Access Journals (Sweden)

    A. FEDERMAN NETO

    2009-01-01

    Full Text Available

    A simplified laboratory protocol for the calibration of pH meters is described and tested. It is based on the use of two analytical primary buffer solutions, potassium hydrogen phthalate and Borax (sodium tetraborate decahydrate), of precisely known concentrations and pH. The solutions may be stored at room temperature for long periods without decomposition and used directly. The calibration of the meter can be checked with standard solutions of sodium dihydrogen phosphate, sodium carbonate, sodium benzoate, sodium salicylate or potassium oxalate. Methods for the purification of Borax and potassium chloride are also given, and a new method for the neutralization of 0.9% saline is suggested. Keywords: pH meters (calibration); saline (0.9%); pH standards; potassium biphthalate; Borax.
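
    A two-point calibration of the kind this protocol supports can be sketched as follows, assuming an ideal glass electrode and the conventional 25 °C values of pH 4.01 (potassium hydrogen phthalate) and pH 9.18 (Borax); the millivolt readings and function names are illustrative assumptions:

```python
# Conventional 25 degC values for the two primary buffers named above:
PH_KHP = 4.01    # potassium hydrogen phthalate
PH_BORAX = 9.18  # sodium tetraborate decahydrate (Borax)

def two_point_calibration(e1_mV, pH1, e2_mV, pH2):
    """Derive electrode slope (mV/pH) and offset from two buffer readings."""
    slope = (e2_mV - e1_mV) / (pH2 - pH1)
    offset = e1_mV - slope * pH1
    return slope, offset

def electrode_pH(e_mV, slope, offset):
    """Convert a millivolt reading to pH using the fitted calibration line."""
    return (e_mV - offset) / slope

# Illustrative readings: 171.0 mV in the phthalate buffer, -134.0 mV in Borax.
slope, offset = two_point_calibration(171.0, PH_KHP, -134.0, PH_BORAX)
```

    An ideal Nernstian electrode gives about -59.2 mV per pH unit at 25 °C, so a fitted slope far from that value signals an electrode or calibration problem.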

  2. Performance analysis of a multiple transmission protocol for VSAT networks

    Science.gov (United States)

    Vaman, Dhadesugoor R.; Kumar, Sharad

    1990-08-01

    This paper presents a multiple transmission contention (MTC) protocol for a very small aperture terminal (VSAT) network environment. The MTC protocol offers a significant advantage in delay performance over the slotted ALOHA-based VSAT network. This is based on the fact that a frame is transmitted in a number of slots randomly selected for transmission. A frame collision is said to occur in this protocol if and only if the frame experiences collision in each of the m slots of transmission. Thus, the probability of collision is reduced and the frame transmission delay due to collisions is also reduced. However, the penalty paid by the MTC is the fact that the available channel capacity for data transmission is less than that of the single transmission case of slotted ALOHA. This paper presents analytical results for the MTC protocol performance as well as the simulation results which verify the analytical performance results.
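
    The core probabilistic argument (a frame is lost only if all m copies collide, at the cost of m-fold channel usage) can be expressed directly. This is a simplified illustration of the stated tradeoff, not the paper's full analytical model:

```python
def collision_probability(p_slot, m):
    """A frame is lost only if every one of its m slot copies collides."""
    return p_slot ** m

def effective_capacity(channel_capacity, m):
    """Sending m copies per frame divides the capacity left for distinct frames."""
    return channel_capacity / m

# Example: per-slot collision probability 0.3, three copies per frame.
p_loss = collision_probability(0.3, 3)   # roughly 0.027 instead of 0.3
capacity = effective_capacity(1.0, 3)    # one third of single-copy capacity
```

    The exponential drop in loss probability against the linear loss in capacity is exactly the delay-versus-capacity tradeoff the abstract describes.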

  3. A new tool for the evaluation of the analytical procedure: Green Analytical Procedure Index.

    Science.gov (United States)

    Płotka-Wasylka, J

    2018-05-01

    A new means for assessing analytical protocols with respect to green analytical chemistry attributes has been developed. The new tool, called GAPI (Green Analytical Procedure Index), evaluates the green character of an entire analytical methodology, from sample collection to final determination, and was created using such tools as the National Environmental Methods Index (NEMI) and the Analytical Eco-Scale to provide not only general but also qualitative information. In GAPI, a specific symbol with five pentagrams is used to evaluate and quantify the environmental impact involved in each step of an analytical methodology, coloured from green through yellow to red to depict low, medium and high impact, respectively. The proposed tool was used to evaluate analytical procedures applied in the determination of biogenic amines in wine samples, and in polycyclic aromatic hydrocarbon determination by EPA methods. The GAPI tool not only provides an immediately perceptible overview to the user/reader but also offers exhaustive information on the evaluated procedures. Copyright © 2018 Elsevier B.V. All rights reserved.

  4. Learning Analytics Considered Harmful

    Science.gov (United States)

    Dringus, Laurie P.

    2012-01-01

    This essay is written to present a prospective stance on how learning analytics, as a core evaluative approach, must help instructors uncover the important trends and evidence of quality learner data in the online course. A critique is presented of strategic and tactical issues of learning analytics. The approach to the critique is taken through…

  5. Smart city analytics

    DEFF Research Database (Denmark)

    Hansen, Casper; Hansen, Christian; Alstrup, Stephen

    2017-01-01

    is very useful when full records are not accessible or available. Smart city analytics does not necessarily require full city records. To our knowledge this preliminary study is the first to predict large increases in home care for smart city analytics....

  6. The Analytical Hierarchy Process

    DEFF Research Database (Denmark)

    Barfod, Michael Bruhn

    2007-01-01

    The technical note gathers the theory behind the Analytical Hierarchy Process (AHP) and presents its advantages and disadvantages in practical use.

  7. Quine's "Strictly Vegetarian" Analyticity

    NARCIS (Netherlands)

    Decock, L.B.

    2017-01-01

    I analyze Quine’s later writings on analyticity from a linguistic point of view. In Word and Object Quine made room for a “strictly vegetarian” notion of analyticity. In later years, he developed this notion into two more precise notions, which I have coined “stimulus analyticity” and “behaviorist analyticity”.

  8. Of the Analytical Engine

    Indian Academy of Sciences (India)

    with me, at breakfast, the various powers of the Analytical Engine. After a long conversation on the subject, he inquired what the machine could do if, .... The following conditions relate to the algebraic portion of the Analytical Engine: (e) The number of literal constants must be unlimited. (f) The number of variables must be ...

  9. European Analytical Column

    DEFF Research Database (Denmark)

    Karlberg, B.; Grasserbauer, M.; Andersen, Jens Enevold Thaulov

    2009-01-01

    The European Analytical Column has once more invited a guest columnist to give his views on various matters related to analytical chemistry in Europe. This year, we have invited Professor Manfred Grasserbauer of the Vienna University of Technology to present some of the current challenges for Eur...

  10. Analytic Moufang-transformations

    International Nuclear Information System (INIS)

    Paal, Eh.N.

    1988-01-01

    The paper is aimed to be an introduction to the concept of an analytic birepresentation of an analytic Moufang loop. To describe the deviation of (S,T) from associativity, the associators of (S,T) are defined, and certain constraints for them, called the minimality conditions of (S,T), are established.

  11. Quo vadis, analytical chemistry?

    Science.gov (United States)

    Valcárcel, Miguel

    2016-01-01

    This paper presents an open, personal, fresh approach to the future of Analytical Chemistry in the context of the deep changes Science and Technology are anticipated to experience. Its main aim is to challenge young analytical chemists, because the future of our scientific discipline is in their hands. A description of not completely accurate overall conceptions of our discipline, both past and present, to be avoided is followed by a flexible, integral definition of Analytical Chemistry and its cornerstones (viz., aims and objectives, quality trade-offs, the third basic analytical reference, the information hierarchy, social responsibility, independent research, transfer of knowledge and technology, interfaces to other scientific-technical disciplines, and well-oriented education). Obsolete paradigms, and more accurate general and specific ones that can be expected to provide the framework for our discipline in the coming years, are described. Finally, the three possible responses of analytical chemists to the proposed changes in our discipline are discussed.

  12. Analytical laboratory quality audits

    Energy Technology Data Exchange (ETDEWEB)

    Kelley, William D.

    2001-06-11

    Analytical Laboratory Quality Audits are designed to improve laboratory performance. The success of the audit, as for many activities, is based on adequate preparation, precise performance, well documented and insightful reporting, and productive follow-up. Adequate preparation starts with definition of the purpose, scope, and authority for the audit and the primary standards against which the laboratory quality program will be tested. The scope and technical processes involved lead to determining the needed audit team resources. Contact is made with the auditee and a formal audit plan is developed, approved and sent to the auditee laboratory management. Review of the auditee's quality manual, key procedures and historical information during preparation leads to better checklist development and more efficient and effective use of the limited time for data gathering during the audit itself. The audit begins with the opening meeting that sets the stage for the interactions between the audit team and the laboratory staff. Arrangements are worked out for the necessary interviews and examination of processes and records. The information developed during the audit is recorded on the checklists. Laboratory management is kept informed of issues during the audit so there are no surprises at the closing meeting. The audit report documents whether the management control systems are effective. In addition to findings of nonconformance, positive reinforcement of exemplary practices provides balance and fairness. Audit closure begins with receipt and evaluation of proposed corrective actions from the nonconformances identified in the audit report. After corrective actions are accepted, their implementation is verified. Upon closure of the corrective actions, the audit is officially closed.

  13. Interference Calculation in Asynchronous Random Access Protocols using Diversity

    OpenAIRE

    Meloni, Alessio; Murroni, Maurizio

    2015-01-01

    The use of Aloha-based Random Access protocols is interesting when channel sensing is either not possible or not convenient and the traffic from terminals is unpredictable and sporadic. In this paper an analytic model for packet interference calculation in asynchronous Random Access protocols using diversity is presented. The aim is to provide a tool that avoids time-consuming simulations to evaluate packet loss and throughput in case decodability is still possible when a certain interference...
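
    The record above is truncated, but the classical baseline such interference models extend is the Aloha throughput formula: S = G·e^(-2G) for pure (unslotted) Aloha and S = G·e^(-G) for slotted Aloha, where G is the offered load. A minimal sketch of that textbook baseline (it does not reproduce the paper's diversity-based interference model):

```python
import math

def pure_aloha_throughput(G):
    """Classical pure (unslotted) Aloha: S = G * exp(-2G)."""
    return G * math.exp(-2.0 * G)

def slotted_aloha_throughput(G):
    """Classical slotted Aloha: S = G * exp(-G)."""
    return G * math.exp(-G)

# Peak throughput occurs at G = 0.5 (pure) and G = 1.0 (slotted).
print(round(pure_aloha_throughput(0.5), 4))     # 0.1839
print(round(slotted_aloha_throughput(1.0), 4))  # 0.3679
```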

  14. Comparison of Analytical and Measured Performance Results on Network Coding in IEEE 802.11 Ad-Hoc Networks

    DEFF Research Database (Denmark)

    Zhao, Fang; Médard, Muriel; Hundebøll, Martin

    2012-01-01

    Network coding is a promising technology that has been shown to improve throughput in wireless mesh networks. In this paper, we compare the analytical and experimental performance of COPE-style network coding in IEEE 802.11 ad-hoc networks. In the experiments, we use a lightweight scheme called CATWOMAN that can run on standard WiFi hardware. We present an analytical model to evaluate the performance of COPE in simple networks, and our results show the excellent predictive quality of this model. By closely examining the performance in two simple topologies, we observe that the coding gain results from the interaction between network coding and the MAC protocol, and the gap between the theoretical and practical gains is due to the different channel qualities of sending nodes. This understanding is helpful for design of larger mesh networks that use network coding.
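
    The theoretical coding gain in the simplest COPE topology, a two-way relay, follows from counting transmissions, and the decoding step is a plain XOR. A hedged sketch with toy packets (not the CATWOMAN implementation):

```python
# Two-way relay topology (the canonical COPE example): Alice and Bob
# exchange packets via a relay. Without coding the relay forwards each
# packet separately; with coding it broadcasts the XOR of both once.
def coding_gain_two_way_relay():
    tx_without_coding = 4  # A->R, R->B, B->R, R->A
    tx_with_coding = 3     # A->R, B->R, R broadcasts (A XOR B)
    return tx_without_coding / tx_with_coding

pkt_a, pkt_b = b"hello", b"world"                         # made-up payloads
coded = bytes(x ^ y for x, y in zip(pkt_a, pkt_b))        # relay's broadcast
recovered_a = bytes(x ^ y for x, y in zip(coded, pkt_b))  # Bob decodes
print(recovered_a == pkt_a, round(coding_gain_two_way_relay(), 2))  # True 1.33
```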

  15. Technical Analysis of SSP-21 Protocol

    Energy Technology Data Exchange (ETDEWEB)

    Bromberger, S. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States)

    2017-06-09

    As part of the California Energy Systems for the Twenty-First Century (CES-21) program, in December 2016 San Diego Gas and Electric (SDG&E) contracted with Lawrence Livermore National Laboratory (LLNL) to perform an independent verification and validation (IV&V) of a white paper describing their Secure SCADA Protocol for the Twenty-First Century (SSP-21) in order to analyze the effectiveness and propriety of cryptographic protocol use within the SSP-21 specification. SSP-21 is designed to use cryptographic protocols to provide (optional) encryption, authentication, and nonrepudiation, among other capabilities. The cryptographic protocols to be used reflect current industry standards; future versions of SSP-21 will use other advanced technologies to provide a subset of security services.

  16. Network Authentication Protocol Studies

    Science.gov (United States)

    2009-04-01

    This is called the “Decision Diffie-Hellman Problem” (DDH) [68]. For the protocols discussed in this paper, the cryptographic strength of the generated group key is solely dependent upon the strength of the underlying two-party protocols. That said, we will not address the DDH problem further in this
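
    For orientation, the DDH assumption states that a tuple (g^a, g^b, g^ab) is computationally indistinguishable from (g^a, g^b, g^c) for random exponents. A toy construction of the two distributions, with deliberately small, illustrative parameters (nowhere near cryptographically secure):

```python
import secrets

# Hypothetical toy group parameters for illustration only.
p = 2**61 - 1   # a Mersenne prime
g = 3

a = secrets.randbelow(p - 2) + 1
b = secrets.randbelow(p - 2) + 1
c = secrets.randbelow(p - 2) + 1

# "Real" tuple (g^a, g^b, g^ab) versus "random" tuple (g^a, g^b, g^c).
ddh_tuple = (pow(g, a, p), pow(g, b, p), pow(g, a * b, p))
rnd_tuple = (pow(g, a, p), pow(g, b, p), pow(g, c, p))
# The DDH assumption: without knowing a or b, no efficient algorithm can
# tell which tuple is which.
```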

  17. Recombinant gene expression protocols

    National Research Council Canada - National Science Library

    Tuan, Rocky S

    1997-01-01

    .... A fundamental requirement for successful recombinant gene expression is the design of the cloning vector and the choice of the host organism for expression. Recombinant Gene Expression Protocols grows out of the need for a laboratory manual that provides the reader the background and rationale, as well as the practical protocols for the preparation of...

  18. Linear Logical Voting Protocols

    DEFF Research Database (Denmark)

    DeYoung, Henry; Schürmann, Carsten

    2012-01-01

    ... In response, we promote linear logic as a high-level language for both specifying and implementing voting protocols. Our linear logical specifications of the single-winner first-past-the-post (SW-FPTP) and single transferable vote (STV) protocols demonstrate that this approach leads to concise...

  19. Denial of Service Attacks on 802.1X Security Protocol

    National Research Council Canada - National Science Library

    Ozan, Orhan

    2004-01-01

    ... infrastructure, such as military and administrative government LANs. The IEEE 802.11 wireless standard specifies both an authentication service and encryption protocol, but research has demonstrated that these protocols are severely flawed...

  20. Google analytics integrations

    CERN Document Server

    Waisberg, Daniel

    2015-01-01

    A roadmap for turning Google Analytics into a centralized marketing analysis platform With Google Analytics Integrations, expert author Daniel Waisberg shows you how to gain a more meaningful, complete view of customers that can drive growth opportunities. This in-depth guide shows not only how to use Google Analytics, but also how to turn this powerful data collection and analysis tool into a central marketing analysis platform for your company. Taking a hands-on approach, this resource explores the integration and analysis of a host of common data sources, including Google AdWords, AdSens

  1. A new primary dental care service compared with standard care for child and family to reduce the re-occurrence of childhood dental caries (Dental RECUR): study protocol for a randomised controlled trial.

    Science.gov (United States)

    Pine, Cynthia; Adair, Pauline; Burnside, Girvan; Robinson, Louise; Edwards, Rhiannon Tudor; Albadri, Sondos; Curnow, Morag; Ghahreman, Marjan; Henderson, Mary; Malies, Clare; Wong, Ferranti; Muirhead, Vanessa; Weston-Price, Sally; Whitehead, Hilary

    2015-11-04

    also added to the challenges in implementing the Dental RECUR protocol during the recruitment phase. ISRCTN24958829 (date of registration: 27 September 2013), Current protocol version: 5.0.

  2. The Sleep Or Mood Novel Adjunctive therapy (SOMNA) trial: a study protocol for a randomised controlled trial evaluating an internet-delivered cognitive behavioural therapy program for insomnia on outcomes of standard treatment for depression in men.

    Science.gov (United States)

    Cockayne, Nicole L; Christensen, Helen M; Griffiths, Kathleen M; Naismith, Sharon L; Hickie, Ian B; Thorndike, Frances P; Ritterband, Lee M; Glozier, Nick S

    2015-02-05

    Insomnia is a significant risk factor for depression onset, can result in more disabling depressive illness, and is a common residual symptom following treatment cessation that can increase the risk of relapse. Internet-based cognitive behavioural therapy for insomnia has demonstrated efficacy and acceptability to men who are less likely than women to seek help in standard care. We aim to evaluate whether internet delivered cognitive behavioural therapy for insomnia as an adjunct to a standard depression therapeutic plan can lead to improved mood outcomes. Male participants aged 50 years or more, meeting Diagnostic and Statistical Manual of Mental Disorders criteria for current Major Depressive Episode and/or Dysthymia and self-reported insomnia symptoms, will be screened to participate in a single-centre double-blind randomised controlled trial with two parallel groups involving adjunctive internet-delivered cognitive behavioural therapy for insomnia and an internet-based control program. The trial will consist of a nine-week insomnia intervention period with a six-month follow-up period. During the insomnia intervention period participants will have their depression management coordinated by a psychiatrist using standard guideline-based depression treatments. The study will be conducted in urban New South Wales, Australia, where 80 participants from primary and secondary care and direct from the local community will be recruited. The primary outcome is change in the severity of depressive symptoms from baseline to week 12. This study will provide evidence on whether a widely accessible, evidence-based, internet-delivered cognitive behavioural therapy for insomnia intervention can lead to greater improvements than standard treatment for depression alone, in a group who traditionally do not readily access psychotherapy. The study is designed to establish effect size, feasibility and processes associated with implementing e-health solutions alongside standard

  3. Post-analytical stability of 23 common chemistry and immunochemistry analytes in incurred samples.

    Science.gov (United States)

    Nielsen, Betina Klint; Frederiksen, Tina; Friis-Hansen, Lennart; Larsen, Pia Bükmann

    2017-12-01

    Storage of blood samples after centrifugation, decapping and initial sampling allows ordering of additional blood tests. The pre-analytical stability of biochemistry and immunochemistry analytes has been studied in detail, but little is known about the post-analytical stability in incurred samples. We examined the stability of 23 routine analytes on the Dimension Vista® (Siemens Healthineers, Denmark): 42-60 routine samples in lithium-heparin gel tubes (Vacutainer, BD, USA) were centrifuged at 3000×g for 10 min. Immediately after centrifugation, initial concentrations of analytes were measured in duplicate (t = 0). The tubes were stored decapped at room temperature and re-analyzed after 2, 4, 6, 8 and 10 h in singletons. The concentrations from reanalysis were normalized to the initial concentration (t = 0). Internal acceptance criteria for bias and total error were used to determine the stability of each analyte. Additionally, evaporation from the decapped blood collection tubes and the residual platelet count in the plasma after centrifugation were quantified. We report a post-analytical stability of most routine analytes of ≥8 h and therefore - with few exceptions - suggest a standard 8-hour time limit for reordering and reanalysis of analytes in incurred samples. Copyright © 2017 The Canadian Society of Clinical Chemists. Published by Elsevier Inc. All rights reserved.
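
    The stability criterion described, re-measurements normalized to the t = 0 concentration and compared against an internal bias limit, can be sketched as follows; the concentrations and the 3% threshold are illustrative, not the study's actual acceptance criteria:

```python
def percent_deviation(initial, reanalysis):
    """Deviation of a re-measured concentration from the t = 0 result, in %."""
    return (reanalysis - initial) / initial * 100.0

def stable(initial, series, max_bias_pct):
    """An analyte counts as stable while every re-measurement stays within
    the bias limit; the threshold used below is hypothetical, not the
    study's internal acceptance criterion."""
    return all(abs(percent_deviation(initial, c)) <= max_bias_pct
               for c in series)

# Illustrative series re-measured over 10 h against a 3% bias limit:
print(stable(140.0, [139.5, 140.2, 141.0, 142.5, 143.9], max_bias_pct=3.0))  # True
```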

  4. Protocols for communication and governance of risks

    NARCIS (Netherlands)

    Vrouwenvelder, A.C.W.M.; Lind, N.C.; Faber, M.H.

    2015-01-01

    The aim of this paper is to explain the need to organize the development of standard protocols for communication about major public risks. Tragic events, such as inadequate earthquake preparedness or great unnecessary losses of life due to public misunderstandings underline the importance of such

  5. The IAEA's programme on analytical quality control

    International Nuclear Information System (INIS)

    Radecki, Zbigniew

    2001-01-01

    In the early 1960s, the IAEA decided to launch a programme for the assessment of the reliability of low-level radiochemical analysis, and since then the Analytical Quality Control Services (AQCS) has developed into a major organiser of world-wide intercomparison runs. Over the intervening years, the types of matrices studied and the analytes of interest have been extended beyond the limits of radioactivity measurements to encompass trace elements, organic contaminants, stable isotopes and methyl mercury. The Agency provides assistance to its Member States through the AQCS programme to improve the standard of analytical results in their laboratories. These results must be of a certain quality (i.e., accuracy and precision), which is determined by their intended use, and should be comparable with other analytical measurements produced elsewhere. In order to enable laboratories in Member States to generate analytical measurements of appropriate and internationally recognised quality, the main objectives of the AQCS programme are: to provide analysts with tools to compare and evaluate their performance relative to other laboratories, to assess the accuracy and precision of the analytical methods used, to provide objective evidence on the quality of the results, and to ensure comparable analytical results within projects and networks.
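
    Intercomparison schemes of this kind typically summarize a laboratory's performance with a z-score against the assigned value. A sketch using the conventional ISO 13528-style interpretation bands (the exact AQCS evaluation rules may differ):

```python
def z_score(lab_result, assigned_value, sigma_pt):
    """Proficiency-testing z-score: (x - X) / sigma_pt."""
    return (lab_result - assigned_value) / sigma_pt

def performance(z):
    """Conventional ISO 13528-style bands; a scheme's own rules may differ."""
    if abs(z) <= 2.0:
        return "satisfactory"
    if abs(z) <= 3.0:
        return "questionable"
    return "unsatisfactory"

# A lab reporting 10.9 against an assigned value of 10.0 (sigma_pt = 0.4):
print(performance(z_score(10.9, 10.0, 0.4)))  # questionable (z = 2.25)
```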

  6. Protocolized versus non-protocolized weaning for reducing the duration of mechanical ventilation in critically ill adult patients (Review)

    OpenAIRE

    Blackwood, Bronagh; Alderdice, Fiona; Burns, Karen EA; Cardwell, Chris R; Lavery, Gavin; O'Halloran, Peter

    2010-01-01

    BACKGROUND: Reducing weaning time is desirable in minimizing potential complications from mechanical ventilation. Standardized weaning protocols are purported to reduce time spent on mechanical ventilation. However, evidence supporting their use in clinical practice is inconsistent. OBJECTIVES: To assess the effects of protocolized weaning from mechanical ventilation on the total duration of mechanical ventilation for critically ill adults; ascertain differences between protocolized and non-p...

  7. Chemometrics in analytical spectroscopy

    National Research Council Canada - National Science Library

    Adams, Mike J

    1995-01-01

    This book provides students and practising analysts with a tutorial guide to the use and application of the more commonly encountered techniques used in processing and interpreting analytical spectroscopic data...

  8. Analytical strategies for phosphoproteomics

    DEFF Research Database (Denmark)

    Thingholm, Tine E; Jensen, Ole N; Larsen, Martin R

    2009-01-01

    sensitive and specific strategies. Today, most phosphoproteomic studies are conducted by mass spectrometric strategies in combination with phospho-specific enrichment methods. This review presents an overview of different analytical strategies for the characterization of phosphoproteins. Emphasis...

  9. Enzymes in Analytical Chemistry.

    Science.gov (United States)

    Fishman, Myer M.

    1980-01-01

    Presents tabular information concerning recent research in the field of enzymes in analytic chemistry, with methods, substrate or reaction catalyzed, assay, comments and references listed. The table refers to 128 references. Also listed are 13 general citations. (CS)

  10. On complex functions analyticity

    CERN Document Server

    Karavashkin, S B

    2002-01-01

    We analyse here the conventional definitions of analyticity and differentiability of functions of a complex variable. We reveal the possibility of extending the conditions of analyticity and differentiability to functions implementing non-conformal mappings. On this basis we formulate more general definitions of analyticity and differentiability that cover the conventional ones. We present some examples of such functions. By the example of a horizontal belt in the plane Z mapped non-conformally onto a crater-like harmonic vortex, we study how the trajectory of a body moving in such a field varies when the field's power function varies in time. We present a technique to solve problems of this type with the help of dynamical functions of a complex variable implementing an analytical non-conformal mapping.

  11. Mobility Data Analytics Center.

    Science.gov (United States)

    2016-01-01

    Mobility Data Analytics Center aims at building a centralized data engine to efficiently manipulate large-scale data for smart decision making. Integrating and learning the massive data are the key to the data engine. The ultimate goal of underst...

  12. Process Analytical Chemistry

    OpenAIRE

    Trevisan, Marcello G.; Poppi, Ronei J.

    2006-01-01

    Process Analytical Chemistry (PAC) is an important and growing area of analytical chemistry that has received little attention in academic centers devoted to the gathering of knowledge and to the optimization of chemical processes. PAC is an area devoted to optimization and knowledge acquisition of chemical processes, to reducing costs and wastes, and to making an important contribution to sustainable development. The main aim of this review is to present to the Brazilian community the developmen...

  13. Analytical Calculations for CAMEA

    OpenAIRE

    Markó, Márton

    2014-01-01

    CAMEA is a novel instrument concept, thus the performance has not been explored. Furthermore it is a complex instrument using many analyser arrays in a wide angular range. The performance of the instrument has been studied by use of three approaches: McStas simulations, analytical calculations, and prototyping. Due to the complexity of the instrument all of the previously mentioned methods can have faults misleading us during the instrument development. We use Monte Carlo and analytical model...

  14. Encyclopedia of analytical surfaces

    CERN Document Server

    Krivoshapko, S N

    2015-01-01

    This encyclopedia presents an all-embracing collection of analytical surface classes. It provides concise definitions and descriptions for more than 500 surfaces and categorizes them in 38 classes of analytical surfaces. All classes are cross-referenced to the original literature in an excellent bibliography. The encyclopedia is of particular interest to structural and civil engineers and serves as a valuable reference for mathematicians.

  15. Intermediate algebra & analytic geometry

    CERN Document Server

    Gondin, William R

    1967-01-01

    Intermediate Algebra & Analytic Geometry Made Simple focuses on the principles, processes, calculations, and methodologies involved in intermediate algebra and analytic geometry. The publication first offers information on linear equations in two unknowns and variables, functions, and graphs. Discussions focus on graphic interpretations, explicit and implicit functions, first quadrant graphs, variables and functions, determinate and indeterminate systems, independent and dependent equations, and defective and redundant systems. The text then examines quadratic equations in one variable, system

  16. Computer controlled quality of analytical measurements

    International Nuclear Information System (INIS)

    Clark, J.P.; Huff, G.A.

    1979-01-01

    A PDP 11/35 computer system is used in evaluating analytical chemistry measurement quality control data at the Barnwell Nuclear Fuel Plant. This computerized measurement quality control system has several features which are not available in manual systems, such as real-time measurement control, computer-calculated bias corrections and standard deviation estimates, surveillance applications, evaluation of measurement system variables, records storage, immediate analyst recertification, and the elimination of routine analysis of known bench standards. The effectiveness of the Barnwell computer system has been demonstrated in gathering and assimilating the measurements of over 1100 quality control samples obtained during a recent plant demonstration run. These data were used to determine equations for predicting measurement reliability estimates (bias and precision); to evaluate the measurement system; and to provide direction for modification of chemistry methods. The analytical chemistry measurement quality control activities represented 10% of the total analytical chemistry effort.
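
    The core calculation behind such a system, estimating bias and precision from repeated analyses of a control standard with a known value, can be sketched as follows (illustrative numbers, not Barnwell data):

```python
from statistics import mean, stdev

def bias_and_precision(measured, known_value):
    """Relative bias (%) and precision (relative standard deviation, %)
    from repeated analyses of a control standard with a known value."""
    bias_pct = (mean(measured) - known_value) / known_value * 100.0
    rsd_pct = stdev(measured) / mean(measured) * 100.0
    return bias_pct, rsd_pct

# Five hypothetical analyses of a control standard known to be 10.00:
bias_pct, rsd_pct = bias_and_precision(
    [10.12, 9.95, 10.05, 10.20, 9.98], known_value=10.00)
```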

  17. Ore and metal standards vital to labs

    International Nuclear Information System (INIS)

    Steger, H.

    1981-01-01

    Information provided on ore and metal analytical standards available from CANMET includes preparation, certification, and use. The collection of standards includes four samples of uranium ore and three samples of uranium-thorium ores

  18. Playing With Population Protocols

    Directory of Open Access Journals (Sweden)

    Xavier Koegler

    2009-06-01

    Population protocols have been introduced as a model of sensor networks consisting of very limited mobile agents with no control over their own movement: A collection of anonymous agents, modeled by finite automata, interact in pairs according to some rules. Predicates on the initial configurations that can be computed by such protocols have been characterized under several hypotheses. We discuss here whether and when the rules of interactions between agents can be seen as a game from game theory. We do so by discussing several basic protocols.
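
    A minimal population protocol is the epidemic rule computing the predicate "some agent started in state 1": whenever two agents meet and either is in state 1, both end in state 1. A toy simulation of random pairwise interactions (a standard textbook example, not one of the paper's game-theoretic protocols):

```python
import random

def one_exists_protocol(initial_states, steps=10_000, seed=1):
    """Toy population protocol for the predicate 'some agent started in 1':
    when two agents interact and either is in state 1, both move to state 1."""
    rng = random.Random(seed)
    states = list(initial_states)
    n = len(states)
    for _ in range(steps):
        i, j = rng.sample(range(n), 2)  # the scheduler picks a random pair
        if states[i] == 1 or states[j] == 1:
            states[i] = states[j] = 1
    return states

# One agent starting in state 1 spreads its answer to the whole population.
final = one_exists_protocol([1] + [0] * 9)
```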

  19. ENVIRONMENTAL ANALYTICAL CHEMISTRY OF ...

    Science.gov (United States)

    Within the scope of a number of emerging contaminant issues in environmental analysis, one area that has received a great deal of public interest has been the assessment of the role of pharmaceuticals and personal care products (PPCPs) as stressors and agents of change in ecosystems as well as their role in unplanned human exposure. The relationship between personal actions and the occurrence of PPCPs in the environment is clear-cut and comprehensible to the public. In this overview, we attempt to examine the separations aspect of the analytical approach to the vast array of potential analytes among this class of compounds. We also highlight the relationship between these compounds and endocrine disrupting compounds (EDCs) and between PPCPs and EDCs and the more traditional environmental analytes such as the persistent organic pollutants (POPs). Although the spectrum of chemical behavior extends from hydrophobic to hydrophilic, the current focus has shifted to moderately and highly polar analytes. Thus, emphasis on HPLC and LC/MS has grown and MS/MS has become a detection technique of choice with either electrospray ionization or atmospheric pressure chemical ionization. This contrasts markedly with the bench mark approach of capillary GC, GC/MS and electron ionization in traditional environmental analysis. The expansion of the analyte list has fostered new vigor in the development of environmental analytical chemistry, modernized the range of tools appli

  20. Samples and Sampling Protocols for Scientific Investigations | Joel ...

    African Journals Online (AJOL)

    ... from sampling, through sample preparation, calibration to final measurement and reporting. This paper, therefore offers useful information on practical guidance on sampling protocols in line with best practice and international standards. Keywords: Sampling, sampling protocols, chain of custody, analysis, documentation ...

  1. Analysing Password Protocol Security Against Off-line Dictionary Attacks

    NARCIS (Netherlands)

    Corin, R.J.; Doumen, J.M.; Etalle, Sandro

    2003-01-01

    We study the security of password protocols against off-line dictionary attacks. In addition to the standard adversary abilities, we also consider further cryptographic advantages given to the adversary when considering the password protocol being instantiated with particular encryption schemes. We

  3. Assessment of sampling and analytical uncertainty of trace element contents in arable field soils.

    Science.gov (United States)

    Buczko, Uwe; Kuchenbuch, Rolf O; Ubelhör, Walter; Nätscher, Ludwig

    2012-07-01

    depending not only on the analyte but also on the field and the sampling trial. Comparison of PT with CT sampling suggests that standardization of sampling protocols reduces sampling uncertainty, especially for fields of low heterogeneity.

  4. Automated statistical modeling of analytical measurement systems

    International Nuclear Information System (INIS)

    Jacobson, J.J.

    1992-01-01

    The statistical modeling of analytical measurement systems at the Idaho Chemical Processing Plant (ICPP) has been completely automated through computer software. The statistical modeling of analytical measurement systems is one part of a complete quality control program used by the Remote Analytical Laboratory (RAL) at the ICPP. The quality control program is an integration of automated data input, measurement system calibration, database management, and statistical process control. The quality control program and statistical modeling program meet the guidelines set forth by the American Society for Testing and Materials and the American National Standards Institute. A statistical model is a set of mathematical equations describing any systematic bias inherent in a measurement system and the precision of a measurement system. A statistical model is developed from data generated from the analysis of control standards. Control standards are samples which are made up at precisely known levels by an independent laboratory and submitted to the RAL. The RAL analysts who process control standards do not know the values of those control standards. The object behind statistical modeling is to describe real process samples in terms of their bias and precision, and to verify that a measurement system is operating satisfactorily. The processing of control standards gives us this ability.
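
    The systematic-bias part of such a statistical model amounts to regressing measured values on the known control-standard levels. A least-squares sketch with made-up numbers (not the ICPP/RAL software):

```python
def fit_linear(known, measured):
    """Least-squares fit measured = intercept + slope * known: a simple
    systematic-bias model of the kind built from control-standard data."""
    n = len(known)
    mean_x = sum(known) / n
    mean_y = sum(measured) / n
    slope = sum((x - mean_x) * (y - mean_y) for x, y in zip(known, measured))
    slope /= sum((x - mean_x) ** 2 for x in known)
    intercept = mean_y - slope * mean_x
    return intercept, slope

# Hypothetical control-standard levels vs. measured results showing a
# small positive proportional bias (slope slightly above 1).
intercept, slope = fit_linear([1.0, 2.0, 5.0, 10.0], [1.05, 2.08, 5.15, 10.25])
```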

  5. 40 CFR 140.5 - Analytical procedures.

    Science.gov (United States)

    2010-07-01

    ... 140.5 Protection of Environment ENVIRONMENTAL PROTECTION AGENCY (CONTINUED) WATER PROGRAMS (CONTINUED) MARINE SANITATION DEVICE STANDARD § 140.5 Analytical procedures. In determining the composition and quality of effluent discharge from marine sanitation devices, the procedures contained in 40 CFR part 136...

  6. Evaluation of sample preparation protocols for spider venom profiling by MALDI-TOF MS.

    Science.gov (United States)

    Bočánek, Ondřej; Šedo, Ondrej; Pekár, Stano; Zdráhal, Zbyněk

    2017-07-01

    Spider venoms are highly complex mixtures containing biologically active substances with potential for use in biotechnology or pharmacology. Fingerprinting of venoms by matrix-assisted laser desorption/ionization time-of-flight mass spectrometry (MALDI-TOF MS) is a thriving technology, enabling the rapid detection of peptide/protein components that can provide comparative information. In this study, we evaluated the effects of sample preparation procedures on MALDI-TOF mass spectral quality to establish a protocol providing the most reliable analytical outputs. We adopted initial sample preparation conditions from studies already published in this field. Three different MALDI matrices, three matrix solvents, two sample deposition methods, and different acid concentrations were tested. As a model sample, venom from Brachypelma albopilosa was used. The mass spectra were evaluated on the basis of absolute and relative signal intensities, and signal resolution. By conducting three series of analyses at three weekly intervals, the reproducibility of the mass spectra was assessed as a crucial factor in the selection of optimum conditions. A sample preparation protocol based on the use of an HCCA matrix dissolved in 50% acetonitrile with 2.5% TFA deposited onto the target by the dried-droplet method was found to provide the best results in terms of information yield and repeatability. We propose that this protocol should be followed as a standard procedure, enabling the comparative assessment of MALDI-TOF MS spider venom fingerprints. Copyright © 2017 Elsevier Ltd. All rights reserved.

  7. A QoS guaranteeing MAC layer protocol for the "underdog" traffic

    Directory of Open Access Journals (Sweden)

    Paolini Christopher

    2011-01-01

    With the tremendous boom in the wireless local area network arena, there has been a phenomenal spike in web traffic, triggered by the growing popularity of real-time multimedia applications. Towards this end, the IEEE 802.11e medium access control (MAC) standard specifies a set of quality-of-service (QoS) enhancement features to ensure QoS for these delay-sensitive multimedia applications. Most of these features are unfair and inefficient from the perspective of low-priority (non-real-time) traffic flows, as they tend to starve the non-real-time flows, depriving them of appropriate channel access and hence throughput. To that extent, this article proposes a MAC protocol that ensures fairness in the overall network performance by still providing QoS for real-time traffic without starving the "underdog" or non-real-time flows. The article first presents analytical expressions, supported by Matlab simulation results, which highlight the performance drawbacks of biased protocols such as 802.11e. It then evaluates the efficiency of the proposed "fair MAC protocol" through extensive simulations conducted on the QualNet simulation platform. The simulation results validate the fairness aspect of the proposed scheme.

  8. Design and Analysis of Optimization Algorithms to Minimize Cryptographic Processing in BGP Security Protocols.

    Science.gov (United States)

    Sriram, Vinay K; Montgomery, Doug

    2017-07-01

    The Internet is subject to attacks due to vulnerabilities in its routing protocols. One proposed approach to attain greater security is to cryptographically protect network reachability announcements exchanged between Border Gateway Protocol (BGP) routers. This study proposes and evaluates the performance and efficiency of various optimization algorithms for validation of digitally signed BGP updates. In particular, this investigation focuses on the BGPSEC (BGP with SECurity extensions) protocol, currently under consideration for standardization in the Internet Engineering Task Force. We analyze three basic BGPSEC update processing algorithms: Unoptimized, Cache Common Segments (CCS) optimization, and Best Path Only (BPO) optimization. We further propose and study cache management schemes to be used in conjunction with the CCS and BPO algorithms. The performance metrics used in the analyses are: (1) routing table convergence time after BGPSEC peering reset or router reboot events and (2) peak-second signature verification workload. Both analytical modeling and detailed trace-driven simulation were performed. Results show that the BPO algorithm is 330% to 628% faster than the unoptimized algorithm for routing table convergence in a typical Internet core-facing provider edge router.
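
    The CCS idea, memoizing signature verifications of AS-path segments that recur across many updates, can be sketched with a simple cache wrapper; the class name and toy verifier below are illustrative, not the authors' implementation:

```python
class SegmentVerificationCache:
    """Sketch of the Cache Common Segments (CCS) idea: memoize signature
    verifications of path segments shared by many BGPSEC updates."""

    def __init__(self, verify_fn):
        self._verify = verify_fn
        self._cache = {}
        self.hits = 0
        self.misses = 0

    def verify_segment(self, segment_bytes, signature):
        key = (segment_bytes, signature)
        if key in self._cache:
            self.hits += 1            # expensive crypto operation skipped
        else:
            self.misses += 1
            self._cache[key] = self._verify(segment_bytes, signature)
        return self._cache[key]

# Toy "verifier": accepts when the signature equals a simple checksum.
toy_verify = lambda data, sig: sig == sum(data) % 256
cache = SegmentVerificationCache(toy_verify)
segment = bytes([1, 2, 3])
cache.verify_segment(segment, 6)
cache.verify_segment(segment, 6)   # second call is served from the cache
print(cache.hits, cache.misses)    # 1 1
```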

  9. Intelligent Local Avoided Collision (iLAC) MAC Protocol for Very High Speed Wireless Network

    Science.gov (United States)

    Hieu, Dinh Chi; Masuda, Akeo; Rabarijaona, Verotiana Hanitriniala; Shimamoto, Shigeru

    Future wireless communication systems aim at very high data rates. As the medium access control (MAC) protocol plays the central role in determining the overall performance of the wireless system, designing a suitable MAC protocol is critical to fully exploit the benefit of the high-speed transmission that the physical layer (PHY) offers. In the latest 802.11n standard [2], the problem of long overhead has been addressed adequately, but the issue of excessive colliding transmissions, especially in congested situations, remains untouched. The procedure of setting the backoff value is the heart of the 802.11 distributed coordination function (DCF) for collision avoidance, in which each station makes its own decision on how to avoid collision in the next transmission. However, collision avoidance is a problem that cannot be solved by a single station. In this paper, we introduce a new MAC protocol called Intelligent Local Avoided Collision (iLAC) that redefines individual rationality in choosing the backoff counter value to avoid a colliding transmission. The distinguishing feature of iLAC is that it fundamentally changes this decision-making process from collision avoidance to collaborative collision prevention. As a result, stations can avoid colliding transmissions with much greater precision. Analytical solutions confirm the validity of this proposal, and simulation results show that the proposed algorithm outperforms the conventional algorithms by a large margin.
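
    The backoff-collision problem the paper targets can be made concrete with a Monte-Carlo estimate of how often two or more stations draw the same smallest backoff value in a DCF-style contention round; a toy model of random backoff, not the iLAC algorithm:

```python
import random

def collision_probability(n_stations, window, trials=20_000, seed=7):
    """Monte-Carlo estimate of the probability that the smallest backoff
    value is drawn by two or more stations (a collision in a DCF-style
    contention round)."""
    rng = random.Random(seed)
    collisions = 0
    for _ in range(trials):
        picks = [rng.randrange(window) for _ in range(n_stations)]
        if picks.count(min(picks)) > 1:
            collisions += 1
    return collisions / trials

p_small = collision_probability(10, window=16)
p_large = collision_probability(10, window=256)
print(p_small > p_large)  # a larger contention window makes collisions rarer
```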

  10. Throughput and Fairness of Collision Avoidance Protocols in Ad Hoc Networks

    National Research Council Canada - National Science Library

    Garcia-Luna-Aceves, J. J; Wang, Yu

    2004-01-01

    .... In Section 1, the authors present an analytical model to derive the saturation throughput of these sender-initiated collision avoidance protocols in multi-hop ad hoc networks with nodes randomly...

  11. Heavy vehicle driver workload assessment. Task 1, task analysis data and protocols review

    Science.gov (United States)

    This report contains a review of available task analytic data and protocols pertinent to heavy vehicle operation and determination of the availability and relevance of such data to heavy vehicle driver workload assessment. Additionally, a preliminary...

  12. A multi-centre, parallel group superiority trial of silk therapeutic clothing compared to standard care for the management of eczema in children (CLOTHES Trial): study protocol for a randomised controlled trial.

    Science.gov (United States)

    Harrison, Eleanor F; Haines, Rachel H; Cowdell, Fiona; Sach, Tracey H; Dean, Taraneh; Pollock, Ian; Burrows, Nigel P; Buckley, Hannah; Batchelor, Jonathan; Williams, Hywel C; Lawton, Sandra; Brown, Sara J; Bradshaw, Lucy E; Ahmed, Amina; Montgomery, Alan A; Mitchell, Eleanor J; Thomas, Kim S

    2015-09-02

    Eczema is a chronic, itchy skin condition that can have a large impact on the quality of life of patients and their families. People with eczema are often keen to try out non-pharmacological therapies such as silk therapeutic garments that could reduce itching or the damage caused by scratching. However, the effectiveness and cost-effectiveness of these garments in the management of eczema have yet to be proven. The CLOTHES Trial will test the hypothesis that 'silk therapeutic garments plus standard eczema care' is superior to 'standard care alone' for children with moderate to severe eczema. Parallel group, observer-blind, pragmatic, multi-centre randomised controlled trial of 6 months' duration. Three hundred children aged 1 to 15 years with moderate to severe eczema will be randomised (1:1) to receive silk therapeutic garments plus standard eczema care, or standard eczema care alone. The primary outcome is eczema severity, as assessed by trained and blinded investigators at 2, 4 and 6 months (using the Eczema Area and Severity Index (EASI)). Secondary outcomes include: patient-reported eczema symptoms (collected weekly for 6 months to capture long-term control); global assessment of severity; quality of life of the child, family and main carer; use of standard eczema treatments (emollients, topical corticosteroids, topical calcineurin inhibitors and wet wraps); frequency of infections; and cost-effectiveness. The acceptability and durability of the clothing will also be assessed, as will adherence to wearing the garments. A nested qualitative study will assess the views of a subset of children wearing the garments and their parents, and those of healthcare providers and commissioners. Randomisation uses a computer-generated sequence of permuted blocks of randomly varying size, stratified by recruiting hospital and child's age (< 5 years versus ≥ 5 years), and concealed using a secure web-based system. The sequence of treatment allocations will remain concealed until
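The randomisation scheme described in the protocol (permuted blocks of randomly varying size, stratified by recruiting hospital and age group, 1:1 allocation) can be sketched as below. This is a hypothetical illustration, not the trial's secure web-based system; the block sizes, stratum labels, and arm names are assumptions made for the example.

```python
import random

# Sketch of stratified randomisation with permuted blocks of randomly
# varying size: each stratum (hospital x age group) gets its own stream of
# shuffled blocks, so allocation stays balanced 1:1 within every stratum.

ARMS = ["silk garments + standard care", "standard care alone"]
BLOCK_SIZES = [4, 6]  # assumed; each must be a multiple of len(ARMS)

def block_stream(rng):
    """Yield allocations block by block; each whole block is balanced 1:1."""
    while True:
        block = ARMS * (rng.choice(BLOCK_SIZES) // len(ARMS))
        rng.shuffle(block)
        yield from block

streams = {}  # one independent allocation stream per stratum

def allocate(hospital, age_group):
    stratum = (hospital, age_group)
    if stratum not in streams:
        streams[stratum] = block_stream(random.Random())
    return next(streams[stratum])

# 12 consecutive children in one stratum: arms stay near 1:1, and are
# exactly balanced whenever the count lands on a block boundary.
alloc = [allocate("Hospital A", "under 5") for _ in range(12)]
print(alloc.count(ARMS[0]), alloc.count(ARMS[1]))
```

Randomly varying the block size is what keeps the next assignment unpredictable to recruiting staff even though each block is balanced.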

  13. Metabolism of the synthetic cannabinoid 5F-PY-PICA by human and rat hepatocytes and identification of biliary analytical targets by directional efflux in sandwich-cultured rat hepatocytes using UHPLC-HR-MS/MS

    DEFF Research Database (Denmark)

    Mardal, Marie; Annaert, Pieter; Noble, Carolina

    2018-01-01

    F-PY-PICA and to determine which analytical targets are excreted into the bile and urine. Metabolites were identified after incubation of 5F-PY-PICA with pooled human liver microsomes (pHLM), pooled human hepatocytes (pHH), or suspended and sandwich-cultured rat hepatocytes (SCRH). Rat hepatocytes were...... harvested following a two-step perfusion protocol and the SCRH were prepared between layers of rat-tail collagen. The biliary efflux of 5F-PY-PICA and its metabolites was determined in three-day-cultured SCRH by differential efflux into either standard buffer from intact bile canaliculi or standard buffer...

  14. Analytical Determination of Auxins and Cytokinins.

    Science.gov (United States)

    Dobrev, Petre I; Hoyerová, Klára; Petrášek, Jan

    2017-01-01

    Parallel determination of auxin and cytokinin levels within plant organs and tissues represents an invaluable tool for studies of their physiological effects and mutual interactions. Thanks to their different chemical structures, auxins, cytokinins and their metabolites are often determined separately, using specialized procedures of sample purification, extraction, and quantification. However, recent progress in the sensitivity of analytical methods of liquid chromatography coupled to mass spectrometry (LC-MS) allows parallel analysis of multiple compounds. Here we describe a method based on a single-step purification protocol followed by LC-MS separation and detection for the parallel analysis of auxins, cytokinins and their metabolites in various plant tissues and cell cultures.

  15. Transdiagnostic group CBT vs. standard group CBT for depression, social anxiety disorder and agoraphobia/panic disorder: Study protocol for a pragmatic, multicenter non-inferiority randomized controlled trial.

    Science.gov (United States)

    Arnfred, Sidse M; Aharoni, Ruth; Hvenegaard, Morten; Poulsen, Stig; Bach, Bo; Arendt, Mikkel; Rosenberg, Nicole K; Reinholt, Nina

    2017-01-23

    Transdiagnostic Cognitive Behavior Therapy (TCBT) manuals delivered in individual format have been reported to be just as effective as traditional diagnosis-specific CBT manuals. We have translated and modified "The Unified Protocol for Transdiagnostic Treatment of Emotional Disorders" (UP-CBT) for group delivery in the Mental Health Service (MHS), and shown effects comparable to traditional CBT in a naturalistic study. As the use of one manual instead of several diagnosis-specific manuals could simplify logistics, reduce waiting time, and increase therapist expertise compared to diagnosis-specific CBT, we aim to test the relative efficacy of group UP-CBT and diagnosis-specific group CBT. The study is a partially blinded, pragmatic, non-inferiority, parallel, multi-center randomized controlled trial (RCT) of UP-CBT vs diagnosis-specific CBT for Unipolar Depression, Social Anxiety Disorder and Agoraphobia/Panic Disorder. In total, 248 patients are recruited from three regional MHS centers across Denmark and included in two intervention arms. The primary outcome is patient-rated well-being (WHO Well-being Index, WHO-5); secondary outcomes include level of depressive and anxious symptoms, personality variables, emotion regulation, reflective functioning, and social adjustment. Assessments are conducted before and after therapy and at 6 months' follow-up. Weekly patient-rated outcomes and group evaluations are collected for every session. Outcome assessors, blind to treatment allocation, will perform the observer-based symptom ratings, and fidelity assessors will monitor manual adherence. The current study will be the first RCT investigating the dissemination of the UP in an MHS setting, the UP delivered in groups, and with depressive patients included. Hence the results are expected to add substantially to the evidence base for rational group psychotherapy in MHS. The planned moderator and mediator analyses could spur new hypotheses about mechanisms of change in

  16. Mobile Internet Protocol Analysis

    National Research Council Canada - National Science Library

    Brachfeld, Lawrence

    1999-01-01

    ...) and User Datagram Protocol (UDP). Mobile IP allows mobile computers to send and receive packets addressed with their home network IP address, regardless of the IP address of their current point of attachment on the Internet...

  17. Analytic manifolds in uniform algebras

    International Nuclear Information System (INIS)

    Tonev, T.V.

    1988-12-01

    Here we extend the Bear-Hile result concerning a version of Bishop's famous theorem for one-dimensional analytic structures in two directions: to n-dimensional complex analytic manifolds, n>1, and to generalized analytic manifolds. 14 refs

  18. USA-USSR protocol

    CERN Multimedia

    1970-01-01

    On 30 November the USA Atomic Energy Commission and the USSR State Committee for the Utilization of Atomic Energy signed, in Washington, a protocol 'on carrying out of joint projects in the field of high energy physics at the accelerators of the National Accelerator Laboratory (Batavia) and the Institute for High Energy Physics (Serpukhov)'. The protocol will be in force for five years and can be extended by mutual agreement.

  19. Principles of Protocol Design

    DEFF Research Database (Denmark)

    Sharp, Robin

    This is a new and updated edition of a book first published in 1994. The book introduces the reader to the principles used in the construction of a large range of modern data communication protocols, as used in distributed computer systems of all kinds. The approach taken is rather a formal one, primarily based on descriptions of the protocols in the notation of CSP (Communicating Sequential Processes).

  20. A Systematic Evaluation of Blood Serum and Plasma Pre-Analytics for Metabolomics Cohort Studies

    Directory of Open Access Journals (Sweden)

    Elodie Jobard

    2016-12-01

    Full Text Available The recent thriving development of biobanks and associated high-throughput phenotyping studies requires the elaboration of large-scale approaches for monitoring biological sample quality and compliance with standard protocols. We present a metabolomic investigation of human blood samples that delineates pitfalls and guidelines for the collection, storage and handling procedures for serum and plasma. A series of eight pre-processing technical parameters is systematically investigated along variable ranges commonly encountered across clinical studies. While metabolic fingerprints, as assessed by nuclear magnetic resonance, are not significantly affected by altered centrifugation parameters or delays between sample pre-processing (blood centrifugation) and storage, our metabolomic investigation highlights that both the delay and storage temperature between blood draw and centrifugation are the primary parameters impacting serum and plasma metabolic profiles. Storing the blood drawn at 4 °C is shown to be a reliable routine to confine variability associated with idle time prior to sample pre-processing. Based on their fine sensitivity to pre-analytical parameters and protocol variations, metabolic fingerprints could be exploited as valuable ways to determine compliance with standard procedures and quality assessment of blood samples within large multi-omic clinical and translational cohort studies.

  1. Treatment of subcutaneous abdominal wound healing impairment after surgery without fascial dehiscence by vacuum assisted closure™ (SAWHI-V.A.C.®-study) versus standard conventional wound therapy: study protocol for a randomized controlled trial.

    Science.gov (United States)

    Seidel, Dörthe; Lefering, Rolf; Neugebauer, Edmund A M

    2013-11-20

    A decision of the Federal Joint Committee Germany in 2008 stated that negative pressure wound therapy is not accepted as a standard therapy for full reimbursement by the health insurance companies in Germany. This decision is based on the final report of the Institute for Quality and Efficiency in Health Care in 2006, which demonstrated through systematic reviews and meta-analysis of previous study projects that an insufficient state of evidence exists regarding the use of negative pressure wound therapy for the treatment of acute and chronic wounds. Further studies were therefore indicated. The study is designed as a multinational, multicenter, prospective, randomized controlled, adaptive-design clinical superiority trial with blinded photographic analysis of the primary endpoint. Efficacy and effectiveness of negative pressure wound therapy for wounds in both medical sectors (in- and outpatient care) will be evaluated. The trial compares the treatment outcome of the application of a technical medical device based on the principle of negative pressure wound therapy (intervention group) with standard conventional wound therapy (control group) in the treatment of subcutaneous abdominal wounds after surgery. The aim of the SAWHI-V.A.C.® study is to compare the clinical, safety and economic results of both treatment arms. The study project is designed and conducted with the aim of providing solid evidence regarding the efficacy of negative pressure wound therapy. Study results will be provided by the end of 2014 to contribute to the final decision of the Federal Joint Committee Germany regarding the general admission of negative pressure wound therapy as a standard of performance within both medical sectors. ClinicalTrials.gov NCT01528033; German Clinical Trials Register DRKS00000648.

  2. A cluster randomized trial of standard quality improvement versus patient-centered interventions to enhance depression care for African Americans in the primary care setting: study protocol NCT00243425

    Directory of Open Access Journals (Sweden)

    Ghods Bri K

    2010-02-01

    Full Text Available Abstract Background Several studies document disparities in access to care and quality of care for depression for African Americans. Research suggests that patient attitudes and clinician communication behaviors may contribute to these disparities. Evidence links patient-centered care to improvements in mental health outcomes; therefore, quality improvement interventions that enhance this dimension of care are promising strategies to improve treatment and outcomes of depression among African Americans. This paper describes the design of the BRIDGE (Blacks Receiving Interventions for Depression and Gaining Empowerment) Study. The goal of the study is to compare the effectiveness of two interventions for African-American patients with depression--a standard quality improvement program and a patient-centered quality improvement program. The main hypothesis is that patients in the patient-centered group will have a greater reduction in their depression symptoms, higher rates of depression remission, and greater improvements in mental health functioning at six, twelve, and eighteen months than patients in the standard group. The study also examines patient ratings of care and receipt of guideline-concordant treatment for depression. Methods/Design A total of 36 primary care clinicians and 132 of their African-American patients with major depressive disorder were recruited into a cluster randomized trial. The study uses intent-to-treat analyses to compare the effectiveness of standard quality improvement interventions (academic detailing about depression guidelines for clinicians and disease-oriented care management for their patients) and patient-centered quality improvement interventions (communication skills training to enhance participatory decision-making for clinicians and care management focused on explanatory models, socio-cultural barriers, and treatment preferences for their patients) for improving outcomes over 12 months of follow

  3. Telemetry Standards, Part 1

    Science.gov (United States)

    2015-07-01

    [Excerpt of the standard's abbreviation list and attribute tables] RCC: Range Commanders Council; RFC: Request For Comment; RIU: remote interface unit; RMM: removable memory module; RS: Recommended Standard; RSCF... followed by hex characters. Comments field G\COM: allowed always; provide the additional information requested or any other information desired... if applicable. Range: 6 characters. Comments field M-x\COM: allowed when M\ID is specified; provide the additional information requested or

  4. Information theory in analytical chemistry

    National Research Council Canada - National Science Library

    Eckschlager, Karel; Danzer, Klaus

    1994-01-01

    Contents: The aim of analytical chemistry - Basic concepts of information theory - Identification of components - Qualitative analysis - Quantitative analysis - Multicomponent analysis - Optimum analytical...

  5. Doing social media analytics

    Directory of Open Access Journals (Sweden)

    Phillip Brooker

    2016-07-01

    Full Text Available In the few years since the advent of ‘Big Data’ research, social media analytics has begun to accumulate studies drawing on social media as a resource and tool for research work. Yet, there has been relatively little attention paid to the development of methodologies for handling this kind of data. The few works that exist in this area often reflect upon the implications of ‘grand’ social science methodological concepts for new social media research (i.e. they focus on general issues such as sampling, data validity, ethics, etc.). By contrast, we advance an abductively oriented methodological suite designed to explore the construction of phenomena played out through social media. To do this, we use a software tool – Chorus – to illustrate a visual analytic approach to data. Informed by visual analytic principles, we posit a two-by-two methodological model of social media analytics, combining two data collection strategies with two analytic modes. We go on to demonstrate each of these four approaches ‘in action’, to help clarify how and why they might be used to address various research questions.

  6. Evaluation of the CLSI EP26-A protocol for detection of reagent lot-to-lot differences.

    Science.gov (United States)

    Katzman, Brooke M; Ness, Karl M; Algeciras-Schimnich, Alicia

    2017-09-01

    Verification of new reagent lots is a required laboratory task. The Clinical and Laboratory Standards Institute (CLSI) EP26-A guideline provides a lot-to-lot verification protocol to detect significant changes in test performance. The aim of this study was to compare the performance of EP26-A with our laboratory's reagent lot verification protocol. Prospective evaluations for two reagent lots each of thyroid stimulating hormone (TSH), thyroglobulin (Tg), thyroxine (T4), triiodothyronine (T3), free triiodothyronine (fT3), and thyroid peroxidase antibody (TPOAb) were performed. The laboratory's lot verification process included evaluation of 20 patient samples with the current and new lots, with acceptability based on predefined criteria. For EP26-A, method imprecision data and critical differences based on previously defined lot-to-lot consistency goals were used to define sample size requirements and rejection limits. EP26-A required the following numbers of samples: 23 for TSH, 17 for Tg, 33 for T4, 31 for T3, 48 for fT3, and 1 for TPOAb. Our current protocol and EP26-A were in agreement in 9 of the 12 (75%) paired verifications. Of the 3 discrepant verifications, Tg and TSH reagent lots were rejected by EP26-A due to significant differences at medical decision points, whereas TPOAb was rejected by the current laboratory protocol. The EP26-A protocol arrived at the same conclusions as our protocol in 75% of the evaluations and required more samples for 4 of the 6 analytes tested. Challenges associated with determining rejection limits and the need for increased sample sizes may be critical factors that limit the utility of EP26-A. Copyright © 2017 The Canadian Society of Clinical Chemists. Published by Elsevier Inc. All rights reserved.
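The kind of rejection rule that EP26-A formalizes can be illustrated with a deliberately simplified sketch: measure shared patient samples on both lots and reject the candidate lot when the mean difference exceeds a limit derived from the method's imprecision. This is not the guideline's actual sample-size tables or limits; the CV, decision level, z-multiplier, and TSH values are assumptions made for the example.

```python
import statistics

# Simplified lot-to-lot check in the spirit of EP26-A: reject the new lot
# if the mean paired difference at a medical decision level exceeds a
# rejection limit derived from the method's within-lot imprecision.

def rejection_limit(cv_percent: float, level: float, n: int, z: float = 1.96) -> float:
    # Within-lot SD approximated from the method CV at the decision level;
    # the limit is z standard errors of a mean difference over n pairs.
    sd = cv_percent / 100 * level
    return z * sd * (2 / n) ** 0.5

def verify_lot(current, candidate, cv_percent: float, level: float) -> bool:
    diffs = [c - r for c, r in zip(candidate, current)]
    return abs(statistics.mean(diffs)) <= rejection_limit(cv_percent, level, len(diffs))

# Hypothetical TSH results (mIU/L) on 5 shared patient samples:
current   = [4.0, 4.1, 3.9, 4.2, 4.0]
candidate = [4.1, 4.2, 4.0, 4.3, 4.1]  # consistent +0.1 shift
print(verify_lot(current, candidate, cv_percent=5.0, level=4.0))  # True
```

The sample-size tension the abstract reports falls out of the same formula: detecting a smaller critical difference, or living with a larger CV, requires more paired samples before the limit tightens enough to be useful.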

  7. 40 CFR 141.74 - Analytical and monitoring requirements.

    Science.gov (United States)

    2010-07-01

    ... [Method table excerpt: Amperometric Titration 4500-ClO2 C, 4500-ClO2 C-00; DPD Method 4500-ClO2 D; Amperometric Titration 4500-ClO2 E, 4500-ClO2 E...] Analytical and monitoring requirements. (a) Analytical requirements. Only the analytical method(s) specified... in accordance with 5 U.S.C. 552(a) and 1 CFR part 51. Copies of the methods published in Standard Methods for...

  8. System Architecture Development for Energy and Water Infrastructure Data Management and Geovisual Analytics

    Science.gov (United States)

    Berres, A.; Karthik, R.; Nugent, P.; Sorokine, A.; Myers, A.; Pang, H.

    2017-12-01

    Building an integrated data infrastructure that can meet the needs of sustainable energy-water resource management requires a robust data management and geovisual analytics platform, capable of cross-domain scientific discovery and knowledge generation. Such a platform can facilitate the investigation of diverse, complex research and policy questions for emerging priorities in Energy-Water Nexus (EWN) science areas. Using advanced data analytics, machine learning techniques, multi-dimensional statistical tools, and interactive geovisualization components, such a multi-layered federated platform, the Energy-Water Nexus Knowledge Discovery Framework (EWN-KDF), is being developed. The platform utilizes several enterprise-grade software design concepts and standards, such as extensible service-oriented architecture, open standard protocols, an event-driven programming model, an enterprise service bus, and adaptive user interfaces, to provide strategic value to the integrative computational and data infrastructure. EWN-KDF is built on the Compute and Data Environment for Science (CADES) environment at Oak Ridge National Laboratory (ORNL).

  9. Advanced business analytics

    CERN Document Server

    Lev, Benjamin

    2015-01-01

    The book describes advanced business analytics and shows how to apply them to many different professional areas of engineering and management. Each chapter of the book is contributed by a different author and covers a different area of business analytics. The book connects the analytic principles with business practice and provides an interface between the main disciplines of engineering/technology and the organizational, administrative and planning abilities of management. It also refers to other disciplines such as economy, finance, marketing, behavioral economics and risk analysis. The book is of special interest to engineers, economists and researchers who are developing new advances in engineering management, as well as to practitioners working on this subject.

  10. Competing on talent analytics.

    Science.gov (United States)

    Davenport, Thomas H; Harris, Jeanne; Shapiro, Jeremy

    2010-10-01

    Do investments in your employees actually affect workforce performance? Who are your top performers? How can you empower and motivate other employees to excel? Leading-edge companies such as Google, Best Buy, Procter & Gamble, and Sysco use sophisticated data-collection technology and analysis to answer these questions, leveraging a range of analytics to improve the way they attract and retain talent, connect their employee data to business performance, differentiate themselves from competitors, and more. The authors present the six key ways in which companies track, analyze, and use data about their people-ranging from a simple baseline of metrics to monitor the organization's overall health to custom modeling for predicting future head count depending on various "what if" scenarios. They go on to show that companies competing on talent analytics manage data and technology at an enterprise level, support what analytical leaders do, choose realistic targets for analysis, and hire analysts with strong interpersonal skills as well as broad expertise.

  11. Advances in analytical chemistry

    Science.gov (United States)

    Arendale, W. F.; Congo, Richard T.; Nielsen, Bruce J.

    1991-01-01

    Implementation of computer programs based on multivariate statistical algorithms makes possible obtaining reliable information from long data vectors that contain large amounts of extraneous information, for example, noise and/or analytes that we do not wish to control. Three examples are described. Each of these applications requires the use of techniques characteristic of modern analytical chemistry. The first example, using a quantitative or analytical model, describes the determination of the acid dissociation constant for 2,2'-pyridyl thiophene using archived data. The second example describes an investigation to determine the active biocidal species of iodine in aqueous solutions. The third example is taken from a research program directed toward advanced fiber-optic chemical sensors. The second and third examples require heuristic or empirical models.

  12. Uniform measurement standard for heat supply in housing and utility building construction. A protocol to compare alternatives for heat supply at building construction sites. Version 3.1; Uniforme Maatlat voor de warmtevoorziening in de woning- en utiliteitsbouw. Een protocol voor het vergelijken van alternatieven voor de warmtevoorziening op bouwlocaties. Versie 3.1

    Energy Technology Data Exchange (ETDEWEB)

    Nuiten, P. [W/E adviseurs, Utrecht (Netherlands); Van der Ree, B. [Primum, Driebergen-Rijsenburg (Netherlands)

    2012-02-15

    The uniform measurement standard is one of the methods that the National Expertise Heat Centre develops to compare the energy performance of different techniques (such as heat pumps, heat and cold storage, collective systems, waste heat, combustion of woody biomass, etc.). The development of the uniform standard was started for the residential and utility building construction industry; in the future, the standard calculation method will also be made available for other sectors. In this version of the uniform standard, alignment has been sought with the key figures of the Energy Performance Standard for Measures at Area Level (EMG), so that the results of the uniform standard are in line with those of the EMG.

  13. Assessing the efficacy of the healthy eating and lifestyle programme (HELP compared with enhanced standard care of the obese adolescent in the community: study protocol for a randomized controlled trial

    Directory of Open Access Journals (Sweden)

    Christie Deborah

    2011-11-01

    Full Text Available Abstract Background The childhood obesity epidemic is one of the foremost UK health priorities. Childhood obesity tracks into adult life and places individuals at considerable risk of diabetes, cardiovascular disease, liver disease and other morbidities. There is widespread need for paediatric lifestyle programmes, as change may be easier to accomplish in childhood than later in life. Study Design/Method The study will evaluate the management of adolescent obesity by conducting a Medical Research Council complex intervention phase III efficacy randomised clinical trial of the Healthy Eating Lifestyle Programme within primary care. The study tests a community-delivered multi-component intervention designed for adolescents, developed from best practice as identified by the National Institute for Health and Clinical Excellence. The hospital-based pilot reduced body mass index and improved health-related quality of life. Subjects will be individually randomised to receive either the Healthy Eating Lifestyle Programme (12 fortnightly family sessions) or enhanced standard care. Baseline and follow-up assessments will be undertaken blind to allocation status. A health economic evaluation is also being conducted. 200 obese young people (13-17 years, body mass index > 98th centile for age and sex) will be recruited from primary care within the greater London area. The primary hypothesis is that a motivational and solution-focused family-based weight management programme delivered over 6 months is more efficacious in reducing body mass index in obese adolescents identified in the community than enhanced standard care. The primary outcome will be body mass index at the end of the intervention, adjusted for baseline body mass index, age and sex. The secondary hypothesis is that the Healthy Eating Lifestyle Programme is more efficacious in improving quality of life and psychological function and reducing waist circumference and cardiovascular risk factors in

  14. The Nagoya Protocol: Fragmentation or Consolidation?

    Directory of Open Access Journals (Sweden)

    Carmen Richerzhagen

    2014-02-01

    Full Text Available In October 2010, a protocol on access and benefit-sharing (ABS) of genetic resources was adopted, the so-called Nagoya Protocol on Access to Genetic Resources and the Fair and Equitable Sharing of Benefits Arising from their Utilization to the Convention on Biological Diversity. Before the adoption of the Nagoya Protocol, the governance architecture of ABS was already characterized by a multifaceted institutional environment. The use of genetic resources is confronted with many issues (conservation, research and development, intellectual property rights, food security, health issues, climate change) that are governed by different institutions and agreements. The Nagoya Protocol contributes to increased fragmentation. However, the question arises whether this new regulatory framework can help to advance the implementation of the ABS provisions of the Convention on Biological Diversity (CBD). This paper attempts to find an answer to that question by following three analytical steps. First, it analyzes the causes of change against the background of theories of institutional change. Second, it aims to assess the typology of the architecture in order to find out if this new set of rules will contribute to a more synergistic, cooperative or conflictive architecture of ABS governance. Third, the paper looks at the problem of “fit” and identifies criteria that can be used to assess the new ABS governance architecture with regard to its effectiveness.

  15. A Business Evaluation Of The Next Generation Ipv6 Protocol In Fixed And Mobile Communication Services

    NARCIS (Netherlands)

    L-F. Pau (Louis-François)

    2002-01-01

    This paper gives an analytical business model of the Internet IPv4 and IPv6 protocols, focusing on the business implications of intrinsic technical properties of these protocols. The technical properties modeled in business terms are: address space, payload, autoconfiguration, IP

  16. Development of characterization protocol for mixed liquid radioactive waste classification

    Energy Technology Data Exchange (ETDEWEB)

    Zakaria, Norasalwa, E-mail: norasalwa@nuclearmalaysia.gov.my [Waste Technology Development Centre, Malaysian Nuclear Agency, 43000 Kajang, Selangor (Malaysia); Wafa, Syed Asraf [Radioisotop Technology and Innovation, Malaysian Nuclear Agency, 43000 Kajang, Selangor (Malaysia); Wo, Yii Mei [Radiochemistry and Environment, Malaysian Nuclear Agency, 43000 Kajang, Selangor (Malaysia); Mahat, Sarimah [Material Technology Group, Malaysian Nuclear Agency, 43000 Kajang, Selangor (Malaysia)

    2015-04-29

    Mixed liquid organic waste generated from health-care and research activities containing tritium, carbon-14, and other radionuclides poses specific challenges for its management. Often, these wastes become legacy waste in many nuclear facilities and are considered ‘problematic’ waste. One of the most important recommendations made by the IAEA is to perform multistage processes aimed at declassification of the waste. At this moment, approximately 3000 bottles of mixed liquid waste, with an estimated volume of 6000 litres, are stored at the National Radioactive Waste Management Centre, Malaysia, and some have been stored for more than 25 years. The aim of this study is to develop a characterization protocol for the reclassification of these wastes. The characterization protocol entails waste identification, waste screening and segregation, and radionuclide profiling using various analytical procedures, including gross alpha/gross beta counting, gamma spectrometry, and the LSC method. The results obtained from the characterization protocol are used to establish criteria for speedy classification of the waste.

  17. Analytic number theory

    CERN Document Server

    Iwaniec, Henryk

    2004-01-01

    Analytic Number Theory distinguishes itself by the variety of tools it uses to establish results, many of which belong to the mainstream of arithmetic. One of the main attractions of analytic number theory is the vast diversity of concepts and methods it includes. The main goal of the book is to show the scope of the theory, both in classical and modern directions, and to exhibit its wealth and prospects, its beautiful theorems and powerful techniques. The book is written with graduate students in mind, and the authors tried to balance clarity, completeness, and generality. The exercis

  18. Social network data analytics

    CERN Document Server

    Aggarwal, Charu C

    2011-01-01

    Social network analysis applications have experienced tremendous advances within the last few years due in part to increasing trends towards users interacting with each other on the internet. Social networks are organized as graphs, and the data on social networks takes on the form of massive streams, which are mined for a variety of purposes. Social Network Data Analytics covers an important niche in the social network analytics field. This edited volume, contributed by prominent researchers in this field, presents a wide selection of topics on social network data mining such as Structural Pr

  19. An analytic thomism?

    Directory of Open Access Journals (Sweden)

    Daniel Alejandro Pérez Chamorro.

    2012-12-01

    Full Text Available For 50 years, philosophers of the Anglo-Saxon analytic tradition (E. Anscombe, P. Geach, A. Kenny, P. Foot) have tried to follow the school of Thomas Aquinas, which they use as a source to surpass Cartesian epistemology and to develop virtue ethics. Recently, J. Haldane has inaugurated a programme of “analytical Thomism” whose main result to date has been his theory of mind/world identity. Nevertheless, none of Aquinas’ admirers has yet found the means of assimilating his metaphysics of being.

  20. Foundations of predictive analytics

    CERN Document Server

    Wu, James

    2012-01-01

    Drawing on the authors' two decades of experience in applied modeling and data mining, Foundations of Predictive Analytics presents the fundamental background required for analyzing data and building models for many practical applications, such as consumer behavior modeling, risk and marketing analytics, and other areas. It also discusses a variety of practical topics that are frequently missing from similar texts. The book begins with the statistical and linear algebra/matrix foundation of modeling methods, from distributions to cumulant and copula functions to Cornish--Fisher expansion and o

  1. Protocol for an economic evaluation alongside the University Health Network Whiplash Intervention Trial: cost-effectiveness of education and activation, a rehabilitation program, and the legislated standard of care for acute whiplash injury in Ontario

    Directory of Open Access Journals (Sweden)

    van der Velde Gabrielle

    2011-07-01

    Full Text Available Abstract Background Whiplash injury affects 83% of persons in a traffic collision and leads to whiplash-associated disorders (WAD. A major challenge facing health care decision makers is identifying cost-effective interventions, due to a lack of economic evidence. Our objective is to compare the cost-effectiveness of: (1) physician-based education and activation; (2) a rehabilitation program developed by Aviva Canada (a group of property and casualty insurance providers); and (3) the legislated standard of care in the Canadian province of Ontario: the Pre-approved Framework Guideline for Whiplash developed by the Financial Services Commission of Ontario. Methods/Design The economic evaluation will use participant-level data from the University Health Network Whiplash Intervention Trial and will be conducted from the societal perspective over the trial's one-year follow-up. Resource use (costs) will include all health care goods and services, and benefits provided during the trial's 1-year follow-up. The primary health effect will be the quality-adjusted life year. We will identify the most cost-effective intervention using the incremental cost-effectiveness ratio and incremental net-benefit. Confidence ellipses and cost-effectiveness acceptability curves will represent uncertainty around these statistics, respectively. A budget impact analysis will assess the total annual impact of replacing the current legislated standard of care with each of the other interventions. An expected value of perfect information analysis will determine the maximum research expenditure Canadian society should be willing to pay for, and inform priority setting in, research of WAD management. Discussion Results will provide health care decision makers with much needed economic evidence on common interventions for acute whiplash management.
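    The two summary statistics named in the abstract have simple closed forms: the incremental cost-effectiveness ratio ICER = ΔC/ΔE and the incremental net benefit INB(λ) = λ·ΔE − ΔC, where λ is the willingness-to-pay threshold per quality-adjusted life year. A minimal sketch with hypothetical cost and QALY values (not trial data):

```python
# Sketch of the two statistics used to compare interventions: the
# incremental cost-effectiveness ratio (ICER) and the incremental net
# benefit (INB). All numbers below are hypothetical, not trial data.

def icer(cost_new, qaly_new, cost_ref, qaly_ref):
    """Extra cost per extra QALY of the new intervention vs. the reference."""
    return (cost_new - cost_ref) / (qaly_new - qaly_ref)

def incremental_net_benefit(cost_new, qaly_new, cost_ref, qaly_ref, wtp):
    """INB(lambda) = lambda * dE - dC; positive means cost-effective at wtp."""
    return wtp * (qaly_new - qaly_ref) - (cost_new - cost_ref)

# Hypothetical comparison: rehabilitation programme vs. standard of care,
# at a willingness-to-pay threshold of $50,000 per QALY.
print(icer(3200.0, 0.78, 2500.0, 0.74))
print(incremental_net_benefit(3200.0, 0.78, 2500.0, 0.74, wtp=50000.0))
```

    An intervention is preferred at threshold λ when its INB is highest; the ICER alone can mislead when ΔE is negative or near zero, which is why trials typically report both.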
Trial Registration http://ClinicalTrials.gov identifier NCT00546806 [Trial registry date: October 18, 2007; Date first patient was randomized: February

  2. GnRH antagonist versus long agonist protocols in IVF

    DEFF Research Database (Denmark)

    Lambalk, C B; Banga, F R; Huirne, J A

    2017-01-01

    BACKGROUND: Most reviews of IVF ovarian stimulation protocols have insufficiently accounted for various patient populations, such as ovulatory women, women with polycystic ovary syndrome (PCOS) or women with poor ovarian response, and have included studies in which the agonist or antagonist was not the only variable between the compared study arms. OBJECTIVE AND RATIONALE: The aim of the current study was to compare GnRH antagonist protocols versus standard long agonist protocols in couples undergoing IVF or ICSI, while accounting for various patient populations and treatment schedules. SEARCH METHODS: The Cochrane Menstrual Disorders and Subfertility Review Group specialized register of controlled trials and Pubmed and Embase databases were searched from inception until June 2016. Eligible trials were those that compared GnRH antagonist protocols and standard long GnRH agonist protocols

  3. An Evaluation of Protocol Enhancing Proxies and File Transport Protocols for Satellite Communication

    Science.gov (United States)

    Finch, Patrick Eugene; Sullivan, Donald; Ivancic, William D.

    2012-01-01

    NASA is utilizing Global Hawk aircraft in high-altitude, long-duration Earth science missions. Communication with the onboard research equipment and sensors (the science payload) is via Ku-band radio utilizing satellites in geostationary orbits. All payload communications use standard Internet Protocols and routing, and much of the data to be transferred comprises very large files. The science community is interested in fully utilizing these communication links to retrieve data as quickly and reliably as possible. A test bed was developed at NASA Ames to evaluate modern transport protocols as well as Protocol Enhancing Proxies (PEPs) to determine what tools best fit the needs of the science community. This paper describes the test bed, the protocols and PEPs that were evaluated, the particular tests performed, and the results and conclusions.
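    The core difficulty such PEPs address is the large bandwidth-delay product of a geostationary path: with a round-trip time near 600 ms, a classic 64 KiB TCP window caps throughput far below the link rate. A back-of-the-envelope sketch, assuming an illustrative 10 Mbit/s channel (the figures are ours, not from the paper):

```python
# Back-of-the-envelope numbers behind the use of PEPs over a GEO relay:
# the link rate (10 Mbit/s) is an assumed illustrative figure; the ~600 ms
# round-trip time is typical of a geostationary satellite hop.

RTT_S = 0.6          # round-trip time over a geostationary relay, seconds
LINK_BPS = 10e6      # assumed Ku-band channel rate, bits per second

# Bytes that must be in flight to keep the pipe full...
bdp_bytes = LINK_BPS / 8 * RTT_S
# ...versus the throughput a classic 64 KiB TCP window can sustain.
window_limited_bps = 64 * 1024 * 8 / RTT_S

print(f"bandwidth-delay product: {bdp_bytes / 1024:.0f} KiB")
print(f"64 KiB-window throughput: {window_limited_bps / 1e6:.2f} Mbit/s")
```

    With these assumed figures, the unmodified window sustains under 1 Mbit/s on a 10 Mbit/s link, which is the gap that window scaling, modern transport protocols and PEPs try to close.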

  4. Establishing a protocol for element determination in human nail clippings by neutron activation analysis

    International Nuclear Information System (INIS)

    Sanches, Thalita Pinheiro; Saiki, Mitiko

    2011-01-01

    Human nail samples have been analyzed to evaluate occupational exposure and nutritional status and to diagnose certain diseases. However, sampling and washing protocols for nail analyses vary from study to study, preventing comparisons between studies. One of the difficulties in analyzing nail samples is eliminating surface contamination without removing elements of interest from the tissue. In the present study, a protocol was defined in order to obtain reliable results for element concentrations in human nail clippings. Nail clippings collected from all 10 fingers or toes were first pre-cleaned with an ethyl alcohol solution to eliminate microbes. The clippings were then cut into small pieces and washed by shaking with different reagents. Neutron activation analysis (NAA) was applied to the nail samples, which consisted of irradiating aliquots of samples together with synthetic elemental standards in the IEA-R1 nuclear research reactor, followed by gamma-ray spectrometry. Comparisons between the results obtained for nails cleaned with different reagents indicated that the procedure using acetone and Triton X-100 solution is more effective than that using nitric acid solution. Triplicate analyses of a nail sample gave results with relative standard deviations lower than 15% for most elements, showing the homogeneity of the prepared sample. Qualitative analyses of different nail polishes showed that the presence of the elements determined in the present study is negligible in these products. Quality control of the analytical results indicated that the applied NAA procedure is adequate for human nail analysis. (author)
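    The triplicate homogeneity check described above amounts to computing the percent relative standard deviation and comparing it against the 15% bound. A minimal sketch with hypothetical concentration values (not results from the paper):

```python
# Sketch of the triplicate homogeneity check: results are acceptable when
# the percent relative standard deviation stays below 15%. The element
# concentrations below are hypothetical, not results from the paper.
import statistics

def relative_std_dev_percent(values):
    """Sample standard deviation as a percentage of the mean."""
    return 100.0 * statistics.stdev(values) / statistics.mean(values)

triplicate_ug_g = [105.0, 98.0, 101.0]  # hypothetical concentrations, ug/g
rsd = relative_std_dev_percent(triplicate_ug_g)
print(f"RSD = {rsd:.1f}%, homogeneous: {rsd < 15.0}")
```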

  5. Authentication and common key generation cryptographic protocol for vehicle tachographs

    Directory of Open Access Journals (Sweden)

    Victor S. Gorbatov

    2017-11-01

    Full Text Available We present a public key generation protocol. The key is used for subscriber authentication in tachographs installed on vehicles in order to ensure traffic safety. The protocol is based on well-known Russian cryptographic algorithms. It ensures the integrity and authenticity of data transmitted through the communication channel between the on-board devices and vehicle tachograph cards. The protocol was developed in accordance with the Rosstandart recommendations and complies with the development and modernization principles for cryptographic data protection means. The protocol has been submitted as a national standard draft and is open for public discussion in accordance with the established procedure. The main result of our study is the formulation of security tasks identical to those a potential infringer would have to solve to compromise the protocol. This makes it possible to account for structural features that ensure the protocol's compliance with the target security characteristics, and to justify the sufficiency of the feature set.

  6. Procurement and characterization of LEU special nuclear materials standards for PERLA

    International Nuclear Information System (INIS)

    Guardini, S.; Mousty, F.; Nonneman, S.; Schillebeeckx, P.; Gerard, J.; Vanaken, K.; Aigner, H.; Bagliano, G.; Deron, S.; Vandevelde, L.

    1995-01-01

    A major activity of PERLA, the PERformance LAboratory of the Joint Research Centre of the European Union, is to select, procure and characterize uranium and plutonium working reference materials representative of the nuclear materials subject to safeguards verification measurements by means of increasingly precise NDA techniques. These reference materials ensure the traceability of the measurements back to international standard units. They are used to develop, improve and calibrate the NDA instruments and to train inspectors to operate these instruments correctly. The paper describes the approach adopted for the procurement, production and characterization plans of LEU standards in the form of UO2 pellets, U3O8 powders, pins and a mock-up PWR fuel element. Homogeneity tests, weighing protocols, humidity checks and impurity analyses were performed on the feed materials (UO2 powders) and on the final product materials (U3O8 powders and UO2 pellets). The sampling and analytical scheme followed for the characterization is presented, and the analytical results obtained are statistically evaluated. The characterization of these reference materials, carried out by four experienced analytical laboratories, allows certification of the uranium content and isotopic abundance with small confidence intervals, in minimal time and with minimal analytical effort

  7. INEEL AIR MODELING PROTOCOL ext

    Energy Technology Data Exchange (ETDEWEB)

    C. S. Staley; M. L. Abbott; P. D. Ritter

    2004-12-01

    Various laws stemming from the Clean Air Act of 1970 and the Clean Air Act amendments of 1990 require air emissions modeling. Modeling is used to ensure that air emissions from new projects and from modifications to existing facilities do not exceed certain standards. For radionuclides, any new airborne release must be modeled to show that downwind receptors do not receive exposures exceeding the dose limits and to determine the requirements for emissions monitoring. For criteria and toxic pollutants, emissions usually must first exceed threshold values before modeling of downwind concentrations is required. This document was prepared to provide guidance for performing environmental compliance-driven air modeling of emissions from Idaho National Engineering and Environmental Laboratory facilities. This document assumes that the user has experience in air modeling and dose and risk assessment. It is not intended to be a "cookbook," nor should all recommendations herein be construed as requirements. However, there are certain procedures that are required by law, and these are pointed out. It is also important to understand that air emissions modeling is a constantly evolving process. This document should, therefore, be reviewed periodically and revised as needed. The document is divided into two parts. Part A is the protocol for radiological assessments, and Part B is for nonradiological assessments. This document is an update of and supersedes document INEEL/INT-98-00236, Rev. 0, INEEL Air Modeling Protocol. This updated document incorporates changes in some of the rules, procedures, and air modeling codes that have occurred since the protocol was first published in 1998.

  8. Manual of Standard Operating Procedures for Veterinary Drug Residue Analysis

    International Nuclear Information System (INIS)

    2016-01-01

    Laboratories are crucial to national veterinary drug residue monitoring programmes. However, one of the main challenges laboratories encounter is obtaining access to relevant methods of analysis. Thus, in addition to training, providing technical advice and transferring technology, the Joint FAO/IAEA Division of Nuclear Techniques in Food and Agriculture has resolved to develop clear and practical manuals to support Member State laboratories. The Coordinated Research Project (CRP) on Development of Radiometric and Allied Analytical Methods to Strengthen Residue Control Programs for Antibiotic and Anthelmintic Veterinary Drug Residues has developed a number of analytical methods as standard operating procedures (SOPs), which are now compiled here. This publication contains SOPs on chromatographic and spectrometric techniques, as well as radioimmunoassay and associated screening techniques, for various anthelmintic and antimicrobial veterinary drug residue analysis. Some analytical method validation protocols are also included. The publication is primarily aimed at food and environmental safety laboratories involved in testing veterinary drug residues, including under organized national residue monitoring programmes. It is expected to enhance laboratory capacity building and competence through the use of radiometric and complementary tools and techniques. The publication is also relevant for applied research on residues of veterinary drugs in food and environmental samples

  9. Working towards accreditation by the International Standards Organization 15189 Standard: how to validate an in-house developed method an example of lead determination in whole blood by electrothermal atomic absorption spectrometry.

    Science.gov (United States)

    Garcia Hejl, Carine; Ramirez, Jose Manuel; Vest, Philippe; Chianea, Denis; Renard, Christophe

    2014-09-01

    Laboratories working towards accreditation by the International Standards Organization (ISO) 15189 standard are required to demonstrate the validity of their analytical methods. The different guidelines set by various accreditation organizations make it difficult to provide objective evidence that an in-house method is fit for the intended purpose. Moreover, the required performance characteristic tests and acceptance criteria are not always detailed. The laboratory must choose the most suitable validation protocol and set the acceptance criteria. Therefore, we propose a validation protocol to evaluate the performance of an in-house method. As an example, we validated the process for the detection and quantification of lead in whole blood by electrothermal atomic absorption spectrometry. The fundamental parameters tested were selectivity, calibration model, precision, accuracy (and uncertainty of measurement), contamination, stability of the sample, reference interval, and analytical interference. We have developed a protocol that has been applied successfully to quantify lead in whole blood by electrothermal atomic absorption spectrometry (ETAAS). In particular, our method is selective, linear, accurate, and precise, making it suitable for use in routine diagnostics.
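    Two of the listed validation parameters, the calibration model and its linearity, lend themselves to a short sketch: fit a least-squares calibration line and check its coefficient of determination. The lead-standard concentrations, absorbance readings and the R² ≥ 0.995 acceptance bound below are illustrative assumptions, not data or limits from the paper:

```python
# Sketch of a calibration-model check for an in-house method: fit a
# least-squares line to standards and verify linearity via R^2. The
# concentrations, absorbances and the R^2 >= 0.995 criterion are
# illustrative assumptions, not data or limits from the paper.
import statistics

def linear_fit(x, y):
    """Least-squares slope, intercept and coefficient of determination."""
    mx, my = statistics.mean(x), statistics.mean(y)
    sxx = sum((a - mx) ** 2 for a in x)
    sxy = sum((a - mx) * (b - my) for a, b in zip(x, y))
    slope = sxy / sxx
    intercept = my - slope * mx
    ss_res = sum((b - (slope * a + intercept)) ** 2 for a, b in zip(x, y))
    ss_tot = sum((b - my) ** 2 for b in y)
    return slope, intercept, 1.0 - ss_res / ss_tot

conc_ug_l = [0.0, 50.0, 100.0, 200.0, 400.0]      # lead calibration standards
absorbance = [0.002, 0.051, 0.103, 0.198, 0.402]  # hypothetical readings
slope, intercept, r2 = linear_fit(conc_ug_l, absorbance)
print(f"R^2 = {r2:.4f}, acceptable: {r2 >= 0.995}")
```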

  10. Protocol for a randomised controlled trial of standard wound management versus negative pressure wound therapy in the treatment of adult patients with an open fracture of the lower limb: UK Wound management of Lower Limb Fractures (UK WOLLF).

    Science.gov (United States)

    Achten, Juul; Parsons, Nick R; Bruce, Julie; Petrou, Stavros; Tutton, Elizabeth; Willett, Keith; Lamb, Sarah E; Costa, Matthew L

    2015-09-22

    Patients who sustain open lower limb fractures have reported infection risks as high as 27%. The type of dressing applied after initial debridement could potentially affect this risk. In this trial, standard dressings will be compared with a new emerging treatment, negative pressure wound therapy, for patients with open lower limb fractures. All adult patients presenting with an open lower limb fracture of Gustilo and Anderson (G&A) grade 2/3 will be considered for inclusion. 460 consented patients will provide 90% power to detect a difference of eight points in the Disability Rating Index (DRI) score at 12 months, at the 5% level. A randomisation sequence, stratified by trial centre and G&A grade, will be produced and administered by a secure web-based service. A qualitative substudy will assess patients' experience of giving consent for the trial, and the acceptability of trial procedures to patients and staff. Patients will have clinical follow-up in a fracture clinic for a minimum of 12 months, as per standard National Health Service (NHS) practice. Functional and quality-of-life outcome data will be collected using the DRI, SF12 and EQ-5D questionnaires at 3, 6, 9 and 12 months postoperatively. In addition, information will be requested with regard to resource use and any late complications or surgical interventions related to the injury. The main analysis will investigate differences in the DRI score at 1 year after injury between the two treatment groups on an intention-to-treat basis. Tests will be two-sided and considered to provide evidence for a significant difference if p values are less than 0.05. Ethical approval was given by NRES Committee West Midlands-Coventry & Warwickshire on 6/2/2012 (ref: 12/WM/0001). The results of the trial will be disseminated via peer-reviewed publications and presentations at relevant conferences. ISRCTN33756652. Published by the BMJ Publishing Group Limited. For permission to use (where not already granted
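    A power statement of the form "460 patients give 90% power to detect an 8-point DRI difference at the 5% level" follows the standard two-sample normal-approximation formula n = 2σ²(z₁₋α/₂ + z₁₋β)²/δ² per group. In the sketch below, the assumed DRI standard deviation (25 points) and the loss-to-follow-up inflation are our illustrative assumptions, not the trial's actual figures:

```python
# Sketch of the normal-approximation sample-size calculation behind a
# "90% power to detect an 8-point difference at the 5% level" statement.
# The standard deviation (25 points) and the ~10% loss-to-follow-up
# inflation are our illustrative assumptions, not the trial's figures.
from math import ceil
from statistics import NormalDist

def n_per_group(delta, sd, alpha=0.05, power=0.90):
    """n = 2 * sd^2 * (z_{1-alpha/2} + z_{1-beta})^2 / delta^2, rounded up."""
    z = NormalDist()
    z_alpha = z.inv_cdf(1.0 - alpha / 2.0)
    z_beta = z.inv_cdf(power)
    return ceil(2.0 * sd ** 2 * (z_alpha + z_beta) ** 2 / delta ** 2)

n = n_per_group(delta=8.0, sd=25.0)
total = ceil(2 * n / 0.9)  # inflate for ~10% loss to follow-up
print(n, total)
```

    Under these assumptions the formula lands close to the trial's 460, but the trial's own SD and attrition assumptions would be needed to reproduce the exact figure.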

  11. A three-group study, internet-based, face-to-face based and standard management after acute whiplash associated disorders (WAD) - choosing the most efficient and cost-effective treatment: study protocol of a randomized controlled trial.

    Science.gov (United States)

    Söderlund, Anne; Bring, Annika; Asenlöf, Pernilla

    2009-07-22

    The management of whiplash associated disorders is one of the most complicated challenges, with high expenses for the health care system and society. There are still no general guidelines or scientific documentation to unequivocally support any single treatment for acute care following whiplash injury. The main purpose of this study is to try a new behavioural medicine intervention strategy in the acute phase, aimed at reducing the number of patients who have persistent problems after the whiplash injury. The goal is also to identify which of three different interventions is most cost-effective for patients with whiplash associated disorders. In this study we control for two factors. First, the effect of the behavioural medicine approach is compared with standard care. Second, the manner in which the behavioural medicine treatment is administered, Internet or face-to-face, is evaluated for its effectiveness and cost-effectiveness. The study is a randomized, prospective, experimental three-group study with analyses of cost-effectiveness up to two-year follow-up. An Internet-based programme and a face-to-face group treatment programme are compared to standard treatment only. Patient follow-ups take place at three, six, twelve and 24 months; that is, short-term as well as long-term effects are evaluated. Patients will be enrolled via the emergency ward during the first week after the accident. This new self-help management will concentrate on those psychosocial factors that have been shown to be predictive of long-term problems in whiplash associated disorders, i.e. the importance of self-efficacy, fear of movement, and the significance of catastrophizing as a coping strategy for restoring and sustaining activities of daily life. Within the framework of this project, we will develop, broaden and evaluate current physical therapy treatment methods for acute whiplash associated disorders. The project will contribute to the creation of a cost-effective behavioural medicine

  12. A three-group study, internet-based, face-to-face based and standard management after acute whiplash associated disorders (WAD) – choosing the most efficient and cost-effective treatment: study protocol of a randomized controlled trial

    Directory of Open Access Journals (Sweden)

    Bring Annika

    2009-07-01

    Full Text Available Abstract Background The management of Whiplash Associated Disorders is one of the most complicated challenges, with high expenses for the health care system and society. There are still no general guidelines or scientific documentation to unequivocally support any single treatment for acute care following whiplash injury. The main purpose of this study is to try a new behavioural medicine intervention strategy in the acute phase, aimed at reducing the number of patients who have persistent problems after the whiplash injury. The goal is also to identify which of three different interventions is most cost-effective for patients with Whiplash Associated Disorders. In this study we control for two factors. First, the effect of the behavioural medicine approach is compared with standard care. Second, the manner in which the behavioural medicine treatment is administered, Internet or face-to-face, is evaluated for its effectiveness and cost-effectiveness. Methods/Design The study is a randomized, prospective, experimental three-group study with analyses of cost-effectiveness up to two-year follow-up. An Internet-based programme and a face-to-face group treatment programme are compared to standard treatment only. Patient follow-ups take place at three, six, twelve and 24 months; that is, short-term as well as long-term effects are evaluated. Patients will be enrolled via the emergency ward during the first week after the accident. Discussion This new self-help management will concentrate on those psychosocial factors that have been shown to be predictive of long-term problems in Whiplash Associated Disorders, i.e. the importance of self-efficacy, fear of movement, and the significance of catastrophizing as a coping strategy for restoring and sustaining activities of daily life. Within the framework of this project, we will develop, broaden and evaluate current physical therapy treatment methods for acute Whiplash Associated Disorders. The project will

  13. Network protocols for real-time applications

    Science.gov (United States)

    Johnson, Marjory J.

    1987-01-01

    The Fiber Distributed Data Interface (FDDI) and the SAE AE-9B High Speed Ring Bus (HSRB) are emerging standards for high-performance token ring local area networks. FDDI was designed to be a general-purpose high-performance network. HSRB was designed specifically for military real-time applications. A workshop was conducted at NASA Ames Research Center in January, 1987 to compare and contrast these protocols with respect to their ability to support real-time applications. This report summarizes workshop presentations and includes an independent comparison of the two protocols. A conclusion reached at the workshop was that current protocols for the upper layers of the Open Systems Interconnection (OSI) network model are inadequate for real-time applications.

  14. Analytics: Changing the Conversation

    Science.gov (United States)

    Oblinger, Diana G.

    2013-01-01

    In this third and concluding discussion on analytics, the author notes that we live in an information culture. We are accustomed to having information instantly available and accessible, along with feedback and recommendations. We want to know what people think and like (or dislike). We want to know how we compare with "others like me."…

  15. Social Learning Analytics

    Science.gov (United States)

    Buckingham Shum, Simon; Ferguson, Rebecca

    2012-01-01

    We propose that the design and implementation of effective "Social Learning Analytics (SLA)" present significant challenges and opportunities for both research and enterprise, in three important respects. The first is that the learning landscape is extraordinarily turbulent at present, in no small part due to technological drivers.…

  16. Explanatory analytics in OLAP

    NARCIS (Netherlands)

    Caron, E.A.M.; Daniëls, H.A.M.

    2013-01-01

    In this paper the authors describe a method to integrate explanatory business analytics in OLAP information systems. This method supports the discovery of exceptional values in OLAP data and the explanation of such values by giving their underlying causes. OLAP applications offer a support tool for

  17. Ada & the Analytical Engine.

    Science.gov (United States)

    Freeman, Elisabeth

    1996-01-01

    Presents a brief history of Ada Byron King, Countess of Lovelace, focusing on her primary role in the development of the Analytical Engine--the world's first computer. Describes the Ada Project (TAP), a centralized World Wide Web site that serves as a clearinghouse for information related to women in computing, and provides a Web address for…

  18. History of analytic geometry

    CERN Document Server

    Boyer, Carl B

    2012-01-01

    Designed as an integrated survey of the development of analytic geometry, this study presents the concepts and contributions from before the Alexandrian Age through the eras of the great French mathematicians Fermat and Descartes, and on through Newton and Euler to the "Golden Age," from 1789 to 1850.

  19. Analytic number theory

    CERN Document Server

    Matsumoto, Kohji

    2002-01-01

    The book includes several survey articles on prime numbers, divisor problems, and Diophantine equations, as well as research papers on various aspects of analytic number theory, such as additive problems, Diophantine approximations and the theory of zeta and L-functions. Audience: Researchers and graduate students interested in recent developments in number theory

  20. Analytical Chemistry Laboratory

    Science.gov (United States)

    Anderson, Mark

    2013-01-01

    The Analytical Chemistry and Material Development Group maintains a capability in chemical analysis, materials R&D, failure analysis and contamination control. The uniquely qualified staff and facility support the needs of flight projects, science instrument development and various technical tasks, as well as Caltech.

  1. Social Data Analytics Tool

    DEFF Research Database (Denmark)

    Hussain, Abid; Vatrapu, Ravi

    2014-01-01

    This paper presents the design, development and demonstrative case studies of the Social Data Analytics Tool, SODATO. Adopting Action Design Framework [1], the objective of SODATO [2] is to collect, store, analyze, and report big social data emanating from the social media engagement of and socia...

  2. User Behavior Analytics

    Energy Technology Data Exchange (ETDEWEB)

    Turcotte, Melissa [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Moore, Juston Shane [Los Alamos National Lab. (LANL), Los Alamos, NM (United States)

    2017-02-28

    User Behaviour Analytics is the tracking, collecting and assessing of user data and activities. The goal is to detect misuse of user credentials by developing models for the normal behaviour of user credentials within a computer network and detecting outliers with respect to their baseline.
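    The baseline-and-outlier idea described above can be sketched with a simple z-score rule: model a credential's normal daily activity, then flag observations far from that baseline. The counts and the threshold below are hypothetical:

```python
# Minimal z-score sketch of the baseline-and-outlier idea: learn a
# credential's normal daily activity level, then flag new observations
# far from that baseline. Counts and threshold are hypothetical.
import statistics

def flag_outliers(baseline, new_values, threshold=3.0):
    """Return values whose z-score against the baseline exceeds threshold."""
    mu = statistics.mean(baseline)
    sigma = statistics.stdev(baseline)
    return [v for v in new_values if abs(v - mu) / sigma > threshold]

history = [4, 5, 3, 6, 4, 5, 5, 4]      # daily logon counts for one credential
print(flag_outliers(history, [5, 42]))  # → [42]
```

    Production systems replace this single-feature rule with richer statistical models per credential, but the detection logic (score against a learned baseline, alert on large deviations) is the same.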

  3. Environmental analytical chemistry

    International Nuclear Information System (INIS)

    Gasco Sanchez, L.

    1990-01-01

    Environmental analytical chemistry is closely related to stochastic methods. Environmetrics is an interdisciplinary science formed from computer science, statistics and environmental science. Today we must apply the logic of the laboratory; with environmetrics, chemical analysis can be applied more effectively to environmental monitoring and pollutant control

  4. Analytics for Customer Engagement

    NARCIS (Netherlands)

    Bijmolt, Tammo H. A.; Leeflang, Peter S. H.; Block, Frank; Eisenbeiss, Maik; Hardie, Bruce G. S.; Lemmens, Aurelie; Saffert, Peter

    In this article, we discuss the state of the art of models for customer engagement and the problems that are inherent to calibrating and implementing these models. The authors first provide an overview of the data available for customer analytics and discuss recent developments. Next, the authors

  5. Analytics in Higher Education

    Science.gov (United States)

    Universities UK, 2016

    2016-01-01

    Learning analytics provide a set of powerful tools to inform and support learners. They enable institutions and individuals to better understand and predict personal learning needs and performance. Universities already collect vast amounts of data about their student populations, but often this is underutilised. The current "state of the…

  6. Multiprofessional electronic protocol in ophthalmology with emphasis on strabismus

    Directory of Open Access Journals (Sweden)

    CHRISTIE GRAF RIBEIRO

    Full Text Available ABSTRACT Objective: to create and validate an electronic database in ophthalmology focused on strabismus, to computerize this database in the form of systematic data collection software named the Electronic Protocol, and to incorporate this protocol into the Integrated System of Electronic Protocols (SINPE©). Methods: this is a descriptive study, with the methodology divided into three phases: (1) development of a theoretical ophthalmologic database with emphasis on strabismus; (2) computerization of this theoretical database using SINPE©; and (3) interpretation of the information, with demonstration of results, to validate the protocol. We entered data from the charts of fifty patients with known strabismus through the Electronic Protocol for testing and validation. Results: the new electronic protocol was able to store information regarding patient history, physical examination, laboratory exams, imaging results, diagnosis and treatment of patients with ophthalmologic diseases, with emphasis on strabismus. We included 2,141 items in this master protocol and created 20 new specific electronic protocols for strabismus, each with its own specifics. Validation was achieved through correlation and corroboration of the symptoms and confirmed diagnoses of the fifty included patients with the diagnostic criteria of the twenty new strabismus protocols. Conclusion: a new, validated electronic database focusing on ophthalmology, with emphasis on strabismus, was successfully created through the standardized collection of information and computerization of the database using proprietary software. This protocol is ready for deployment to facilitate data collection, sorting and application for practitioners and researchers in numerous specialties.

  7. Business protocol in integrated Europe

    OpenAIRE

    Pavelová, Nina

    2009-01-01

    The first chapter is devoted to definitions of basic terms such as protocol and business protocol, to the differences between protocol and etiquette, and between social etiquette and business etiquette. The second chapter focuses on the factors influencing European business protocol. The third chapter is devoted to the etiquette of business protocol in European countries. It touches on topics such as punctuality and planning of business appointments, greetings, business cards, dress and appear...

  8. The DYD-RCT protocol: an on-line randomised controlled trial of an interactive computer-based intervention compared with a standard information website to reduce alcohol consumption among hazardous drinkers

    Directory of Open Access Journals (Sweden)

    Godfrey Christine

    2007-10-01

    Full Text Available Abstract Background Excessive alcohol consumption is a significant public health problem throughout the world. Although there are a range of effective interventions to help heavy drinkers reduce their alcohol consumption, these have little proven population-level impact. Researchers internationally are looking at the potential of Internet interventions in this area. Methods/Design In a two-arm randomised controlled trial, an on-line psychologically enhanced interactive computer-based intervention is compared with a flat, text-based information web-site. Recruitment, consent, randomisation and data collection are all on-line. The primary outcome is total past-week alcohol consumption; secondary outcomes include hazardous or harmful drinking, dependence, harm caused by alcohol, and mental health. A health economic analysis is included. Discussion This trial will provide information on the effectiveness and cost-effectiveness of an on-line intervention to help heavy drinkers drink less. Trial registration International Standard Randomised Controlled Trial Number Register ISRCTN31070347

  9. Protocol for Communication Networking for Formation Flying

    Science.gov (United States)

    Jennings, Esther; Okino, Clayton; Gao, Jay; Clare, Loren

    2009-01-01

    An application-layer protocol and a network architecture have been proposed for data communications among multiple autonomous spacecraft that are required to fly in a precise formation in order to perform scientific observations. The protocol could also be applied to other autonomous vehicles operating in formation, including robotic aircraft, robotic land vehicles, and robotic underwater vehicles. A group of spacecraft or other vehicles to which the protocol applies could be characterized as a precision-formation-flying (PFF) network, and each vehicle could be characterized as a node in the PFF network. In order to support precise formation flying, it would be necessary to establish a corresponding communication network, through which the vehicles could exchange position and orientation data and formation-control commands. The communication network must enable communication during early phases of a mission, when little positional knowledge is available. Particularly during early mission phases, the distances among vehicles may be so large that communication could be achieved only by relaying across multiple links. The large distances and need for omnidirectional coverage would limit communication links to operation at low bandwidth during these mission phases. Once the vehicles were in formation and distances were shorter, the communication network would be required to provide high-bandwidth, low-jitter service to support tight formation-control loops. The proposed protocol and architecture, intended to satisfy the aforementioned and other requirements, are based on a standard layered-reference-model concept. The proposed application protocol would be used in conjunction with conventional network, data-link, and physical-layer protocols. The proposed protocol includes the ubiquitous Institute of Electrical and Electronics Engineers (IEEE) 802.11 medium access control (MAC) protocol for use in the data-link layer. In addition to its widespread and proven use in

  10. Padronização de um protocolo experimental de treinamento periodizado em natação utilizando ratos Wistar Standardization of an experimental periodized training protocol in swimming rats

    Directory of Open Access Journals (Sweden)

    Gustavo Gomes de Araujo

    2010-02-01

    creatine kinase (CK) in rats. Seventy male Wistar rats were randomly separated into two groups: Control Group (CG, n = 30) and Training Periodization Group (TPG, n = 30). All experiments were preceded by 2 weeks of individual adaptation to the water. The TPG was carried out during a period of 12 weeks (w) with a frequency of 6 days/w. The training period was subdivided into three specialized series blocks: Preparation (6 w), Specific (4.5 w) and Taper (1.5 w). The Lactate Minimum Test (LACm) was adapted to determine the aerobic capacity. Anaerobic performance was evaluated by maximal exhaustion time (Tlim), verified during the hyperlactatemia induction phase of the LACm protocol. Training stimulus was based on intensities corresponding to the LACm: Endurance (END1 = 80%; END2 = 100%; END3 = 120%) and Anaerobic (ANA, 240% of the LACm). Two-way ANOVA and the Newman-Keuls post-hoc test (P < 0.05) were used. Aerobic performance was not different from initial training (Preparation: 4.57 ± 0.24% of body weight (bw); Specific: 4.59 ± 0.44% bw), but at the end of taper the LACm was higher (Taper: 5.01 ± 0.71% bw). The anaerobic parameter (Tlim) was significantly higher at the end of taper (73 ± 14 s) when compared to the Preparation (50 ± 13 s) and Specific (65 ± 18 s) blocks. The CG reduced the LACm and anaerobic performance along the experimental period. Muscle glycogen increased after taper but CK did not change during training. Training periodization in rats acted as an important tool to evaluate specific effects of training. This is supported by sensitive responses of the rats along the blocks, based on improvement of aerobic and anaerobic performance as well as the glycogen concentration obtained after the taper block.

  11. Analysis of communication systems with timed token protocols using the power-series algorithm

    NARCIS (Netherlands)

    Blanc, J.P.C.; Lenzini, L.

    1995-01-01

    The IEEE 802.4 and FDDI (Fibre Distributed Data Interface) standards are high speed MAC (Medium Access Control) protocols for LAN/MANs employing a timer-controlled token passing mechanism, the so-called Timed Token Protocol, to control station access to the shared media. These protocols support
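    The access rule at the heart of the Timed Token Protocol can be sketched in a few lines. The following is an illustrative simplification, not code from either standard: the function name and the bookkeeping are assumptions, but the rule it encodes — asynchronous traffic may only use the surplus left when the token rotates faster than the negotiated Target Token Rotation Time (TTRT) — is the mechanism both IEEE 802.4 and FDDI rely on.

    ```python
    # Hedged sketch of the timed-token access rule used by FDDI / IEEE 802.4.
    # The names (`ttrt`, `async_budget`) and the simplified model are
    # illustrative assumptions, not taken from the standards' text.

    def async_budget(ttrt: float, last_rotation_time: float) -> float:
        """Time a station may spend on asynchronous traffic this token visit.

        If the token returned early (rotation faster than TTRT), the surplus
        may be used for asynchronous frames; if it arrived late, only
        synchronous (pre-allocated) traffic may be sent.
        """
        return max(0.0, ttrt - last_rotation_time)

    # Example: TTRT negotiated at 8 ms; the token came back after 5 ms,
    # so this station may send up to 3 ms of asynchronous traffic.
    print(async_budget(8.0, 5.0))   # 3.0
    print(async_budget(8.0, 9.0))   # 0.0 -- late token, no async budget
    ```

    This early/late distinction is what bounds the worst-case token rotation and lets the protocol guarantee synchronous bandwidth while opportunistically filling spare capacity.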

  12. Evaluation of dose reduction versus standard dosing for maintenance of remission in patients with spondyloarthritis and clinical remission with anti-TNF (REDES-TNF): study protocol for a randomized controlled trial.

    Science.gov (United States)

    Pontes, Caridad; Gratacós, Jordi; Torres, Ferran; Avendaño, Cristina; Sanz, Jesús; Vallano, Antoni; Juanola, Xavier; de Miguel, Eugenio; Sanmartí, Raimon; Calvo, Gonzalo

    2015-08-20

    Dose reduction schedules of tumor necrosis factor antagonists (anti-TNF) as maintenance therapy in patients with spondyloarthritis are used empirically in clinical practice, despite the lack of clinical trials providing evidence for this practice. To address this issue, the Spanish Society of Rheumatology (SER) and the Spanish Society of Clinical Pharmacology (SEFC) designed a 3-year multicenter, randomized, open-label, controlled clinical trial (2 years for inclusion and 1 year of follow-up). The study is expected to include 190 patients with axial spondyloarthritis on stable maintenance treatment (≥4 months) with any anti-TNF agent at doses recommended in the summary of product characteristics. Patients will be randomized to either a dose reduction arm or maintenance of the dosing regimen as per the official labelling recommendations. Randomization will be stratified according to the anti-TNF agent received before study inclusion. Patient follow-up, visit schedule, and examinations will be maintained as per normal clinical practice recommendations according to SER guidelines. The study aims to test the hypothesis of noninferiority of the dose reduction strategy compared with standard treatment. The first patients were recruited in July 2012, and study completion is scheduled for the end of April 2015. The REDES-TNF study is a pragmatic clinical trial that aims to provide evidence to support a medical decision now made empirically. The study results may help inform clinical decisions relevant to both patients and healthcare decision makers. EudraCT 2011-005871-18 (21 December 2011).
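    The noninferiority hypothesis mentioned above follows a standard decision rule: the reduced-dose arm is declared noninferior if the lower confidence bound for the between-arm difference stays above a prespecified margin. The sketch below illustrates that logic only; the margin, event rates, arm sizes, and the simple normal-approximation interval are invented for demonstration and are not the REDES-TNF statistical analysis plan.

    ```python
    # Hedged illustration of a noninferiority decision rule (all numbers and
    # the z-approximation are assumptions, not the trial's actual analysis).
    import math

    def noninferior(p_new: float, p_std: float, n_new: int, n_std: int,
                    margin: float, z: float = 1.96) -> bool:
        """True if the lower bound of the approximate 95% CI for the
        difference in success rates (new - standard) exceeds -margin."""
        diff = p_new - p_std
        se = math.sqrt(p_new * (1 - p_new) / n_new
                       + p_std * (1 - p_std) / n_std)
        return diff - z * se > -margin

    # Example: 82% vs 85% remission maintained, 95 patients per arm,
    # against a hypothetical 15-percentage-point noninferiority margin.
    print(noninferior(0.82, 0.85, 95, 95, 0.15))  # True
    ```

    Note the asymmetry of the question: the trial does not ask whether dose reduction is *better*, only whether it is not unacceptably worse, which is why the margin must be fixed before the data are seen.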

  13. Tight intra-operative blood pressure control versus standard care for patients undergoing hip fracture repair - Hip Fracture Intervention Study for Prevention of Hypotension (HIP-HOP) trial: study protocol for a randomised controlled trial.

    Science.gov (United States)

    Moppett, Iain Keith; White, Stuart; Griffiths, Richard; Buggy, Donal

    2017-07-25

    Hypotension during anaesthesia for hip fracture surgery is common. Recent data suggest that there is an association between the lowest intra-operative blood pressure and mortality, even when adjusted for co-morbidities. This is consistent with data derived from the wider surgical population, where the magnitude and duration of hypotension are associated with mortality and peri-operative complications. However, there are no trial data to support more aggressive blood pressure control. We are conducting a three-centre, randomised, double-blinded pilot study in three hospitals in the United Kingdom. The sample size will be 75 patients (25 from each centre). Randomisation will be done using computer-generated concealed tables. Both participants and investigators will be blinded to group allocation. Participants will be aged >70 years, cognitively intact (Abbreviated Mental Test Score 7 or greater), able to give informed consent and admitted directly through the emergency department with a fractured neck of the femur requiring operative repair. Patients randomised to tight blood pressure control or avoidance of intra-operative hypotension will receive active treatment as required to maintain both of the following: systolic arterial blood pressure >80% of the baseline pre-operative value and mean arterial pressure >75 mmHg throughout. All participants will receive standard hospital care, including spinal or general anaesthesia, at the discretion of the clinical team. The primary outcome is a composite of the presence or absence of defined cardiovascular, renal and delirium morbidity within 7 days of surgery (myocardial injury, stroke, acute kidney injury, delirium). Secondary endpoints will include the defined individual morbidities, mortality, early mobility and discharge to usual residence. This is a small-scale pilot study investigating the feasibility of a trial of tight intra-operative blood pressure control in a frail elderly patient group with known high morbidity
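    The intervention arm's dual target (systolic pressure above 80% of the pre-operative baseline AND mean arterial pressure above 75 mmHg) reduces to a simple conjunction, sketched below as a decision helper. The function name and structure are illustrative assumptions; only the two thresholds come from the abstract.

    ```python
    # Minimal sketch of the HIP-HOP intervention targets as a decision helper
    # (the function is a hypothetical illustration; the trial protocol
    # defines the clinical thresholds quoted here).

    def needs_treatment(sbp: float, map_mmhg: float,
                        baseline_sbp: float) -> bool:
        """True if active treatment is required to restore the trial targets:
        systolic BP > 80% of the pre-operative baseline AND mean arterial
        pressure > 75 mmHg. Breaching either target triggers treatment."""
        return sbp <= 0.8 * baseline_sbp or map_mmhg <= 75.0

    # Example: baseline SBP 140 mmHg implies a target SBP > 112 mmHg.
    print(needs_treatment(sbp=105, map_mmhg=80, baseline_sbp=140))  # True
    print(needs_treatment(sbp=120, map_mmhg=80, baseline_sbp=140))  # False
    ```

    Expressing the target relative to each patient's own baseline, rather than as one absolute number, is what distinguishes this "tight control" arm from conventional fixed-threshold practice.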

  14. Clinical evaluation of short 6-mm implants alone, short 8-mm implants combined with osteotome sinus floor elevation and standard 10-mm implants combined with osteotome sinus floor elevation in posterior maxillae: study protocol for a randomized controlled trial.

    Science.gov (United States)

    Shi, Jun-Yu; Gu, Ying-Xin; Qiao, Shi-Chong; Zhuang, Long-Fei; Zhang, Xiao-Meng; Lai, Hong-Chang

    2015-07-30

    Nowadays, short dental implants are being increasingly applied in extremely resorbed posterior regions. Recent studies have indicated that short implants present a success rate similar to that of conventional implants. It is assumed that short implants can avoid additional surgical morbidity and are less technically demanding. However, high-quality evidence (≥ Ib: evidence from at least one randomized controlled trial) comparing the clinical outcomes of short implants and longer implants combined with the osteotome sinus floor elevation (OSFE) technique is limited. The proposed study is designed as a prospective, single-center, three-arm, parallel-group, randomized controlled trial. We plan to enroll 150 patients in need of dental implant treatment in the posterior maxilla. The inclusion criteria include: age ≧ 18 years, partial edentulism in the posterior maxilla for at least 3 months from tooth loss, residual bone height ranging from 6 to 8 mm, and sufficient bone width (≥ 6 mm) in the edentulous region. The patients will be divided into three groups according to a table of random numbers: group 1: short implants (6 mm) alone; group 2: short implants (8 mm) combined with osteotome sinus floor elevation (OSFE); group 3: standard implants (10 mm) combined with OSFE. The assignment will be concealed from the clinical operators until the beginning of implant surgery. The outcome examiners and patients will be kept blinded to the assignment. Implant survival rates, implant success rates, complications, resonance frequency analysis (RFA) measurements, marginal bone level, treatment time and patient-reported outcomes (visual analogue scale for intraoperative discomfort and postoperative pain) will be recorded. Clinical re-evaluations will be performed at 12, 24, 36 and 60 months after crown placement. The results of the trial will support better decision-making for dental implant treatment in atrophic maxillary ridges. If favorable, the use of short implants may avoid adjunct

  15. Standard dilution analysis.

    Science.gov (United States)

    Jones, Willis B; Donati, George L; Calloway, Clifton P; Jones, Bradley T

    2015-02-17

    Standard dilution analysis (SDA) is a novel calibration method that may be applied to most instrumental techniques that will accept liquid samples and are capable of monitoring two wavelengths simultaneously. It combines the traditional methods of standard additions and internal standards. Therefore, it simultaneously corrects for matrix effects and for fluctuations due to changes in sample size, orientation, or instrumental parameters. SDA requires only 200 s per sample with inductively coupled plasma optical emission spectrometry (ICP OES). Neither the preparation of a series of standard solutions nor the construction of a universal calibration graph is required. The analysis is performed by combining two solutions in a single container: the first containing 50% sample and 50% standard mixture; the second containing 50% sample and 50% solvent. Data are collected in real time as the first solution is diluted by the second one. The results are used to prepare a plot of the analyte-to-internal standard signal ratio on the y-axis versus the inverse of the internal standard concentration on the x-axis. The analyte concentration in the sample is determined from the ratio of the slope and intercept of that plot. The method has been applied to the determination of FD&C dye Blue No. 1 in mouthwash by molecular absorption spectrometry and to the determination of eight metals in mouthwash, wine, cola, nitric acid, and water by ICP OES. Both the accuracy and precision for SDA are better than those observed for the external calibration, standard additions, and internal standard methods using ICP OES.
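    The slope/intercept calculation described above can be made concrete with synthetic data. In the sketch below, all concentrations, sensitivities, and dilution steps are invented for illustration; only the plot-and-regress logic — analyte-to-internal-standard signal ratio versus the inverse internal-standard concentration, with the sample concentration recovered from the slope-to-intercept ratio — follows the abstract.

    ```python
    # Worked sketch of the SDA calculation with synthetic signals.
    # All numeric values are invented assumptions for demonstration.
    import numpy as np

    # Final-mixture concentrations: the standard contributes analyte c_a*f
    # and internal standard c_is*f at dilution factor f, while the sample's
    # own analyte contribution c_sample stays constant during the dilution.
    c_sample = 2.0          # analyte from the sample in the mixture (mg/L)
    c_a, c_is = 10.0, 5.0   # standard's analyte and internal-standard levels
    m_a, m_is = 1.3, 0.9    # instrument sensitivities (arbitrary units)

    f = np.linspace(1.0, 0.1, 20)   # solution 1 progressively diluted by 2
    ratio = m_a * (c_sample + f * c_a) / (m_is * f * c_is)  # analyte/IS signal
    x = 1.0 / (f * c_is)            # inverse internal-standard concentration

    slope, intercept = np.polyfit(x, ratio, 1)
    # For this model, slope/intercept = c_sample * c_is / c_a, so:
    recovered = (slope / intercept) * (c_a / c_is)
    print(round(recovered, 3))  # 2.0 -- matches c_sample
    ```

    Because the sample's analyte appears only in the slope while the standard's appears only in the intercept, matrix effects that rescale both signals cancel in the ratio, which is the point of combining standard additions with an internal standard.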

  16. Natural matrix standards

    International Nuclear Information System (INIS)

    Bowen, V.T.

    1976-01-01

    Environmental radiochemistry needs, for use in analytical intercomparison and as standard reference materials, very large homogeneous samples of a variety of matrices, each naturally contaminated by a variety of longer-lived radionuclides at several different ranges of concentrations. The reasons for this need are discussed, and the minimum assortment of matrices, of radionuclides, and of concentrations is established. Sources of suitable materials are suggested, and the international approach to meeting this need is emphasized

  17. Method and platform standardization in MRM-based quantitative plasma proteomics.

    Science.gov (United States)

    Percy, Andrew J; Chambers, Andrew G; Yang, Juncong; Jackson, Angela M; Domanski, Dominik; Burkhart, Julia; Sickmann, Albert; Borchers, Christoph H

    2013-12-16

    There exists a growing demand in the proteomics community to standardize experimental methods and liquid chromatography-mass spectrometry (LC/MS) platforms in order to enable the acquisition of more precise and accurate quantitative data. This necessity is heightened by the evolving trend of verifying and validating candidate disease biomarkers in complex biofluids, such as blood plasma, through targeted multiple reaction monitoring (MRM)-based approaches with stable isotope-labeled standards (SIS). Considering the lack of performance standards for quantitative plasma proteomics, we previously developed two reference kits to evaluate the MRM with SIS peptide approach using undepleted and non-enriched human plasma. The first kit tests the effectiveness of the LC/MRM-MS platform (kit #1), while the second evaluates the performance of an entire analytical workflow (kit #2). Here, these kits have been refined for practical use and then evaluated through intra- and inter-laboratory testing on 6 common LC/MS platforms. For an identical panel of 22 plasma proteins, similar concentrations were determined, regardless of the kit, instrument platform, and laboratory of analysis. These results demonstrate the value of the kit and reinforce the utility of standardized methods and protocols. The proteomics community needs standardized experimental protocols and quality control methods in order to improve the reproducibility of M
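    The core arithmetic of MRM quantitation with stable isotope-labeled standards is compact enough to sketch: the ratio of the light (endogenous) to heavy (spiked SIS) peak areas scales the known spiked amount. The peptide areas and concentration below are invented for illustration; the equal-response premise is the standard assumption behind SIS quantitation, since the light and heavy forms co-elute and fragment identically.

    ```python
    # Hedged sketch of SIS-based MRM quantitation (all numbers invented).

    def endogenous_conc(light_area: float, heavy_area: float,
                        sis_conc_fmol_ul: float) -> float:
        """Concentration of the endogenous peptide, assuming equal MS
        response for the light and heavy (isotope-labeled) forms."""
        return (light_area / heavy_area) * sis_conc_fmol_ul

    # Example: a light/heavy area ratio of 0.5 against 100 fmol/uL of
    # spiked SIS peptide implies 50 fmol/uL of the endogenous form.
    print(endogenous_conc(light_area=5.0e5, heavy_area=1.0e6,
                          sis_conc_fmol_ul=100.0))  # 50.0
    ```

    Reference kits like the two described in the abstract essentially verify that this ratio-to-concentration mapping comes out the same across instruments and laboratories.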