WorldWideScience

Sample records for standard analytical protocol

  1. Analytical protocols for characterisation of sulphur-free lignin

    NARCIS (Netherlands)

    Gosselink, R.J.A.; Abächerli, A.; Semke, H.; Malherbe, R.; Käuper, P.; Nadif, A.; Dam, van J.E.G.

    2004-01-01

    Interlaboratory tests for chemical characterisation of sulphur-free lignins were performed by five laboratories to develop useful analytical protocols, which are lacking, and identify quality-related properties. Protocols have been established for reproducible determination of the chemical

  2. A Standardized and Reproducible Urine Preparation Protocol for Cancer Biomarkers Discovery

    Directory of Open Access Journals (Sweden)

    Julia Beretov

    2014-01-01

    A suitable and standardized protein purification technique is essential to maintain consistency and to allow data comparison between proteomic studies for urine biomarker discovery. Ultimately, efforts should be made to standardize urine preparation protocols. The aim of this study was to develop an optimal analytical protocol to achieve maximal protein yield and to ensure that this method was applicable to examine urine protein patterns that distinguish disease and disease-free states. In this pilot study, we compared seven different urine sample preparation methods to remove salts, and to precipitate and isolate urinary proteins. Sodium dodecyl sulfate-polyacrylamide gel electrophoresis (SDS-PAGE) profiles showed that the sequential preparation of urinary proteins by combining acetone and trichloroacetic acid (TCA) alongside high speed centrifugation (HSC) provided the best separation, and retained the most urinary proteins. Therefore, this approach is the preferred method for all further urine protein analysis.

  3. MASTERS OF ANALYTICAL TRADECRAFT: CERTIFYING THE STANDARDS AND ANALYTIC RIGOR OF INTELLIGENCE PRODUCTS

    Science.gov (United States)

    2016-04-01

    AU/ACSC/2016 AIR COMMAND AND STAFF COLLEGE AIR UNIVERSITY MASTERS OF ANALYTICAL TRADECRAFT: CERTIFYING THE STANDARDS AND ANALYTIC RIGOR OF...establishing unit level certified Masters of Analytic Tradecraft (MAT) analysts to be trained and entrusted to evaluate and rate the standards and...cues) ideally should meet or exceed effective rigor (based on analytical process).4 To accomplish this, decision makers should not be left to their

  4. Telemetry Standards, IRIG Standard 106-17, Chapter 22, Network Based Protocol Suite

    Science.gov (United States)

    2017-07-01

    requirements. 22.2 Network Access Layer 22.2.1 Physical Layer Connectors and cable media should meet the electrical or optical properties required by the...Telemetry Standards, IRIG Standard 106-17 Chapter 22, July 2017 i CHAPTER 22 Network -Based Protocol Suite Acronyms...iii Chapter 22. Network -Based Protocol Suite

  5. Reactor Section standard analytical methods. Part 1

    Energy Technology Data Exchange (ETDEWEB)

    Sowden, D.

    1954-07-01

    The Standard Analytical Methods manual was prepared for the purpose of consolidating and standardizing all current analytical methods and procedures used in the Reactor Section for routine chemical analyses. All procedures are established in accordance with accepted practice and the general analytical methods specified by the Engineering Department. These procedures are specifically adapted to the requirements of the water treatment process and related operations. The methods included in this manual are organized alphabetically within the following five sections, which correspond to the various phases of the analytical control program in which these analyses are to be used: water analyses, essential material analyses, cotton plug analyses, boiler water analyses, and miscellaneous control analyses.

  6. The development of standard operating protocols for paediatric radiology

    International Nuclear Information System (INIS)

    Hardwick, J.; Mencik, C.; McLaren, C.; Young, C.; Scadden, S.; Mashford, P.; McHugh, K.; Beckett, M.; Calvert, M.; Marsden, P.J.

    2001-01-01

    This paper describes how the requirement for operating protocols for standard radiological practice was expanded to provide a comprehensive aide to the operator conducting a medical exposure. The protocols adopted now include justification criteria, patient preparation, radiographic technique, standard exposure charts, diagnostic reference levels and image quality criteria. In total, the protocols have been welcomed as a tool for ensuring that medical exposures are properly optimised. (author)

  7. Synergistic relationships between Analytical Chemistry and written standards

    International Nuclear Information System (INIS)

    Valcárcel, Miguel; Lucena, Rafael

    2013-01-01

    Graphical abstract: -- Highlights: •Analytical Chemistry is influenced by international written standards. •Different relationships can be established between them. •Synergies can be generated when these standards are conveniently managed. -- Abstract: This paper describes the mutual impact of Analytical Chemistry and several international written standards (norms and guides) related to knowledge management (CEN-CWA 14924:2004), social responsibility (ISO 26000:2010), management of occupational health and safety (OHSAS 18001/2), environmental management (ISO 14001:2004), quality management systems (ISO 9001:2008) and requirements of the competence of testing and calibration laboratories (ISO 17025:2004). The intensity of this impact, based on a two-way influence, is quite different depending on the standard considered. In any case, a new and fruitful approach to Analytical Chemistry based on these relationships can be derived

  8. Synergistic relationships between Analytical Chemistry and written standards

    Energy Technology Data Exchange (ETDEWEB)

    Valcárcel, Miguel, E-mail: qa1vacam@uco.es; Lucena, Rafael

    2013-07-25

    Graphical abstract: -- Highlights: •Analytical Chemistry is influenced by international written standards. •Different relationships can be established between them. •Synergies can be generated when these standards are conveniently managed. -- Abstract: This paper describes the mutual impact of Analytical Chemistry and several international written standards (norms and guides) related to knowledge management (CEN-CWA 14924:2004), social responsibility (ISO 26000:2010), management of occupational health and safety (OHSAS 18001/2), environmental management (ISO 14001:2004), quality management systems (ISO 9001:2008) and requirements of the competence of testing and calibration laboratories (ISO 17025:2004). The intensity of this impact, based on a two-way influence, is quite different depending on the standard considered. In any case, a new and fruitful approach to Analytical Chemistry based on these relationships can be derived.

  9. Synthetic salt cake standards for analytical laboratory quality control

    International Nuclear Information System (INIS)

    Schilling, A.E.; Miller, A.G.

    1980-01-01

    The validation of analytical results in the characterization of Hanford Nuclear Defense Waste requires the preparation of synthetic waste for standard reference materials. Two independent synthetic salt cake standards have been prepared to monitor laboratory quality control for the chemical characterization of high-level salt cake and sludge waste in support of Rockwell Hanford Operations' High-Level Waste Management Program. Each synthetic salt cake standard contains 15 characterized chemical species and was subjected to an extensive verification/characterization program in two phases. Phase I consisted of an initial verification of each analyte in salt cake form in order to determine the current analytical capability for chemical analysis. Phase II consisted of a final characterization of those chemical species in solution form where conflicting verification data were observed. The 95 percent confidence interval on the mean for the following analytes within each standard is provided: sodium, nitrate, nitrite, phosphate, carbonate, sulfate, hydroxide, chromate, chloride, fluoride, aluminum, plutonium-239/240, strontium-90, cesium-137, and water

  10. The effect of personalized versus standard patient protocols for radiostereometric analysis (RSA)

    DEFF Research Database (Denmark)

    Muharemovic, O; Troelsen, A; Thomsen, M G

    2018-01-01

    INTRODUCTION: Increasing pressure in the clinic requires a more standardized approach to radiostereometric analysis (RSA) imaging. The aim of this study was to investigate whether implementation of personalized RSA patient protocols could increase image quality and decrease examination time...... imaging. Radiographers in the control group used a standard RSA protocol. RESULTS: At three months, radiographers in the case group significantly reduced (p .... No significant improvements were found in the control group at any time point. CONCLUSION: There is strong evidence that personalized RSA patient protocols have a positive effect on image quality and radiation dose savings. Implementation of personal patient protocols as a RSA standard will contribute...

  11. Analytical method for the identification and assay of 12 phthalates in cosmetic products: application of the ISO 12787 international standard "Cosmetics-Analytical methods-Validation criteria for analytical results using chromatographic techniques".

    Science.gov (United States)

    Gimeno, Pascal; Maggio, Annie-Françoise; Bousquet, Claudine; Quoirez, Audrey; Civade, Corinne; Bonnet, Pierre-Antoine

    2012-08-31

    Esters of phthalic acid, more commonly named phthalates, may be present in cosmetic products as ingredients or contaminants. Their presence as contaminants can be due to the manufacturing process, to the raw materials used, or to the migration of phthalates from packaging when plastic (polyvinyl chloride--PVC) is used. Eight phthalates (DBP, DEHP, BBP, DMEP, DnPP, DiPP, DPP, and DiBP), classified H360 or H361, are forbidden in cosmetics according to the European regulation on cosmetics 1223/2009. A GC/MS method was developed for the assay of 12 phthalates in cosmetics, including the 8 regulated phthalates. Analyses are carried out on a GC/MS system in electron impact ionization mode (EI). The separation of phthalates is obtained on a cross-linked 5%-phenyl/95%-dimethylpolysiloxane capillary column, 30 m × 0.25 mm (i.d.) × 0.25 μm film thickness, using a temperature gradient. Phthalate quantification is performed by external calibration using an internal standard. Validation elements obtained on standard solutions highlight a satisfactory system conformity (resolution > 1.5), a common quantification limit at 0.25 ng injected, an acceptable linearity between 0.5 μg mL⁻¹ and 5.0 μg mL⁻¹, as well as a precision and an accuracy in agreement with in-house specifications. Cosmetic samples ready for analytical injection are analyzed after a dilution in ethanol, whereas more complex cosmetic matrices, like milks and creams, are assayed after a liquid/liquid extraction using tert-butyl methyl ether (TBME). Depending on the type of cosmetics analyzed, the common limits of quantification for the 12 phthalates were set at 0.5 or 2.5 μg g⁻¹. All samples were assayed using the analytical approach described in the ISO 12787 international standard "Cosmetics-Analytical methods-Validation criteria for analytical results using chromatographic techniques". This analytical protocol is particularly adapted when it is not possible to make reconstituted sample matrices. Copyright © 2012
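    As a rough illustration of the quantification step mentioned above (external calibration using an internal standard), the sketch below fits a calibration line to analyte/internal-standard peak-area ratios and back-calculates a sample concentration. All concentrations, area ratios, and the quantify helper are hypothetical examples, not values or code from the study.

        # Illustrative sketch of quantification by external calibration with an
        # internal standard. Numbers and names are hypothetical examples.
        import numpy as np

        # Calibration standards: known phthalate concentrations (ug/mL) and the
        # corresponding peak-area ratios (analyte area / internal-standard area).
        conc_std = np.array([0.5, 1.0, 2.0, 3.0, 5.0])          # ug/mL
        area_ratio_std = np.array([0.21, 0.43, 0.84, 1.27, 2.10])

        # Least-squares calibration line: area_ratio = slope * conc + intercept
        slope, intercept = np.polyfit(conc_std, area_ratio_std, 1)

        def quantify(area_ratio_sample, dilution_factor=1.0):
            """Back-calculate the concentration in the injected solution, then
            correct for the sample preparation (dilution) factor."""
            conc_injected = (area_ratio_sample - intercept) / slope
            return conc_injected * dilution_factor

        # Example: an extract diluted 10-fold in ethanol before injection.
        print(f"{quantify(0.95, dilution_factor=10):.2f} ug/mL in the undiluted extract")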

  12. The Virtual Insect Brain protocol: creating and comparing standardized neuroanatomy

    Science.gov (United States)

    Jenett, Arnim; Schindelin, Johannes E; Heisenberg, Martin

    2006-01-01

    Background In the fly Drosophila melanogaster, new genetic, physiological, molecular and behavioral techniques for the functional analysis of the brain are rapidly accumulating. These diverse investigations on the function of the insect brain use gene expression patterns that can be visualized and provide the means for manipulating groups of neurons as a common ground. To take advantage of these patterns one needs to know their typical anatomy. Results This paper describes the Virtual Insect Brain (VIB) protocol, a script suite for the quantitative assessment, comparison, and presentation of neuroanatomical data. It is based on the 3D-reconstruction and visualization software Amira, version 3.x (Mercury Inc.) [1]. Besides its backbone, a standardization procedure which aligns individual 3D images (series of virtual sections obtained by confocal microscopy) to a common coordinate system and computes average intensities for each voxel (volume pixel) the VIB protocol provides an elaborate data management system for data administration. The VIB protocol facilitates direct comparison of gene expression patterns and describes their interindividual variability. It provides volumetry of brain regions and helps to characterize the phenotypes of brain structure mutants. Using the VIB protocol does not require any programming skills since all operations are carried out at an intuitively usable graphical user interface. Although the VIB protocol has been developed for the standardization of Drosophila neuroanatomy, the program structure can be used for the standardization of other 3D structures as well. Conclusion Standardizing brains and gene expression patterns is a new approach to biological shape and its variability. The VIB protocol provides a first set of tools supporting this endeavor in Drosophila. The script suite is freely available at [2] PMID:17196102

  13. [Protocol for the study of bone tumours and standardization of pathology reports].

    Science.gov (United States)

    Machado, Isidro; Pozo, José Juan; Marcilla, David; Cruz, Julia; Tardío, Juan C; Astudillo, Aurora; Bagué, Sílvia

    Primary bone neoplasms represent a rare and heterogeneous group of mesenchymal tumours. The prevalence of benign and malignant tumours varies; the latter (sarcomas) account for less than 0.2% of all malignant tumours. Primary bone neoplasms are usually diagnosed and classified according to the criteria established and published by the World Health Organization (WHO 2013). These criteria are a result of advances in molecular pathology, which complements the histopathological diagnosis. Bone tumours should be diagnosed and treated in referral centers by a multidisciplinary team including pathologists, radiologists, orthopedic surgeons and oncologists. We analyzed different national and international protocols in order to provide a guide of recommendations for the improvement of pathological evaluation and management of bone tumours. We include specific recommendations for the pre-analytical, analytical, and post-analytical phases, as well as protocols for gross and microscopic pathology. Copyright © 2016 Sociedad Española de Anatomía Patológica. Publicado por Elsevier España, S.L.U. All rights reserved.

  14. Building America House Simulation Protocols

    Energy Technology Data Exchange (ETDEWEB)

    Hendron, Robert [National Renewable Energy Lab. (NREL), Golden, CO (United States); Engebrecht, Cheryn [National Renewable Energy Lab. (NREL), Golden, CO (United States)

    2010-09-01

    The House Simulation Protocol document was developed to track and manage progress toward Building America's multi-year, average whole-building energy reduction research goals for new construction and existing homes, using a consistent analytical reference point. This report summarizes the guidelines for developing and reporting these analytical results in a consistent and meaningful manner for all home energy uses using standard operating conditions.

  15. Navigating the Benford Labyrinth: A big-data analytic protocol illustrated using the academic library context

    Directory of Open Access Journals (Sweden)

    Michael Halperin

    2016-03-01

    Objective: Big Data Analytics is a panoply of techniques the principal intention of which is to ferret out dimensions or factors from certain data streamed or available over the WWW. We offer a subset or “second” stage protocol of Big Data Analytics (BDA) that uses these dimensional datasets as benchmarks for profiling related data. We call this Specific Context Benchmarking (SCB). Method: In effecting this benchmarking objective, we have elected to use a Digital Frequency Profiling (DFP) technique based upon the work of Newcomb and Benford, who have developed a profiling benchmark based upon the Log10 function. We illustrate the various stages of the SCB protocol using the data produced by the Academic Research Libraries to enhance insights regarding the details of the operational benchmarking context and so offer generalizations needed to encourage adoption of SCB across other functional domains. Results: An illustration of the SCB protocol is offered using the recently developed Benford Practical Profile as the Conformity Benchmarking Measure. ShareWare: We have developed a Decision Support System called SpecificContextAnalytics (SCA:DSS) to create the various information sets presented in this paper. The SCA:DSS, programmed in Excel VBA, is available from the corresponding author as a free download without restriction to its use. Conclusions: We note that SCB effected using the DFPs is an enhancement, not a replacement, for the usual statistical and analytic techniques and fits very well in the BDA milieu.
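    As a generic illustration of the Digital Frequency Profiling idea referred to above, the sketch below compares an observed leading-digit profile against the Newcomb-Benford benchmark P(d) = log10(1 + 1/d) using a simple mean-absolute-deviation conformity measure. The data and the conformity measure are hypothetical; this is not the SCA:DSS tool or the Benford Practical Profile described in the abstract.

        # Minimal sketch of leading-digit profiling against the Newcomb-Benford
        # first-digit benchmark (illustrative only).
        import math
        from collections import Counter

        def benford_expected():
            # P(d) = log10(1 + 1/d) for leading digits d = 1..9
            return {d: math.log10(1 + 1 / d) for d in range(1, 10)}

        def leading_digit_profile(values):
            digits = [int(str(abs(v)).lstrip("0.")[0]) for v in values if v]
            counts = Counter(digits)
            n = len(digits)
            return {d: counts.get(d, 0) / n for d in range(1, 10)}

        def mean_absolute_deviation(observed, expected):
            # Simple conformity measure: mean |observed - expected| over the 9 digits.
            return sum(abs(observed[d] - expected[d]) for d in range(1, 10)) / 9

        # Hypothetical benchmarking data, e.g. library expenditure figures.
        data = [1234.5, 187.0, 2290.1, 310.7, 1502.3, 96.4, 1775.8, 2023.9]
        obs = leading_digit_profile(data)
        print(mean_absolute_deviation(obs, benford_expected()))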

  16. The Virtual Insect Brain protocol: creating and comparing standardized neuroanatomy

    Directory of Open Access Journals (Sweden)

    Schindelin Johannes E

    2006-12-01

    Background In the fly Drosophila melanogaster, new genetic, physiological, molecular and behavioral techniques for the functional analysis of the brain are rapidly accumulating. These diverse investigations on the function of the insect brain use gene expression patterns that can be visualized and provide the means for manipulating groups of neurons as a common ground. To take advantage of these patterns one needs to know their typical anatomy. Results This paper describes the Virtual Insect Brain (VIB) protocol, a script suite for the quantitative assessment, comparison, and presentation of neuroanatomical data. It is based on the 3D-reconstruction and visualization software Amira, version 3.x (Mercury Inc.) [1]. Besides its backbone, a standardization procedure which aligns individual 3D images (series of virtual sections obtained by confocal microscopy) to a common coordinate system and computes average intensities for each voxel (volume pixel) the VIB protocol provides an elaborate data management system for data administration. The VIB protocol facilitates direct comparison of gene expression patterns and describes their interindividual variability. It provides volumetry of brain regions and helps to characterize the phenotypes of brain structure mutants. Using the VIB protocol does not require any programming skills since all operations are carried out at an intuitively usable graphical user interface. Although the VIB protocol has been developed for the standardization of Drosophila neuroanatomy, the program structure can be used for the standardization of other 3D structures as well. Conclusion Standardizing brains and gene expression patterns is a new approach to biological shape and its variability. The VIB protocol provides a first set of tools supporting this endeavor in Drosophila. The script suite is freely available at http://www.neurofly.de [2]

  17. Comparison of test protocols for standard room/corner tests

    Science.gov (United States)

    R. H. White; M. A. Dietenberger; H. Tran; O. Grexa; L. Richardson; K. Sumathipala; M. Janssens

    1998-01-01

    As part of international efforts to evaluate alternative reaction-to-fire tests, several series of room/comer tests have been conducted. This paper reviews the overall results of related projects in which different test protocols for standard room/corner tests were used. Differences in the test protocols involved two options for the ignition burner scenario and whether...

  18. Solution standards for quality control of nuclear-material analytical measurements

    International Nuclear Information System (INIS)

    Clark, J.P.

    1981-01-01

    Analytical chemistry measurement control depends upon reliable solution standards. At the Savannah River Plant Control Laboratory, over a thousand analytical measurements are made daily for process control, product specification, accountability, and nuclear safety. Large quantities of solution standards are required for a measurement quality control program covering the many different analytical chemistry methods. Savannah River Plant-produced uranium, plutonium, neptunium, and americium metals or oxides are dissolved to prepare stock solutions for working or Quality Control Standards (QCS). Because extensive analytical effort is required to characterize or confirm these solutions, they are prepared in large quantities. These stock solutions are diluted and blended with different chemicals and/or each other to synthesize QCS that match the matrices of different process streams. The target uncertainty of a standard's reference value is 10% of the limit of error of the methods used for routine measurements. Standard Reference Materials from NBS are used according to special procedures to calibrate the methods used in measuring the uranium and plutonium standards so traceability can be established. Special precautions are required to minimize the effects of temperature, radiolysis, and evaporation. Standard reference values are periodically corrected to eliminate systematic errors caused by evaporation or decay products. Measurement control is achieved by requiring analysts to analyze a blind QCS each shift that a measurement system is used on plant samples. Computer evaluation determines whether a measurement is within the ±3 sigma control limits. Monthly evaluations of the QCS measurements are made to determine current bias correction factors for accountability measurements and detect significant changes in the bias and precision statistics. The evaluations are also used to plan activities for improving the reliability of the analytical chemistry measurements.
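    The shift-wise ±3 sigma control check described above amounts to a simple acceptance rule. The minimal sketch below illustrates it; the reference value, measurement sigma, and example results are hypothetical, not Savannah River Plant data.

        # Sketch of the QC check described above: a blind QC standard measurement
        # is accepted only if it falls within +/- 3 sigma of the standard's
        # reference value. Numbers are hypothetical.
        def within_control_limits(measured, reference, sigma, k=3.0):
            """Return True if the measurement lies inside reference +/- k*sigma."""
            return abs(measured - reference) <= k * sigma

        # Example: a uranium QC standard with reference value 250.0 g/L and an
        # established measurement standard deviation of 1.2 g/L.
        print(within_control_limits(253.1, reference=250.0, sigma=1.2))   # True
        print(within_control_limits(256.0, reference=250.0, sigma=1.2))   # False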

  19. The effect of personalized versus standard patient protocols for radiostereometric analysis (RSA).

    Science.gov (United States)

    Muharemovic, O; Troelsen, A; Thomsen, M G; Kallemose, T; Gosvig, K K

    2018-05-01

    Increasing pressure in the clinic requires a more standardized approach to radiostereometric analysis (RSA) imaging. The aim of this study was to investigate whether implementation of personalized RSA patient protocols could increase image quality and decrease examination time and the number of exposure repetitions. Forty patients undergoing primary total hip arthroplasty were equally randomized to either a case or a control group. Radiographers in the case group were assisted by personalized patient protocols containing information about each patient's post-operative RSA imaging. Radiographers in the control group used a standard RSA protocol. At three months, radiographers in the case group significantly reduced (p RSA patient protocols have a positive effect on image quality and radiation dose savings. Implementation of personal patient protocols as a RSA standard will contribute to the reduction of examination time, thus ensuring a cost benefit for department and patient safety. Copyright © 2017 The College of Radiographers. Published by Elsevier Ltd. All rights reserved.

  20. Synergistic relationships between Analytical Chemistry and written standards.

    Science.gov (United States)

    Valcárcel, Miguel; Lucena, Rafael

    2013-07-25

    This paper describes the mutual impact of Analytical Chemistry and several international written standards (norms and guides) related to knowledge management (CEN-CWA 14924:2004), social responsibility (ISO 26000:2010), management of occupational health and safety (OHSAS 18001/2), environmental management (ISO 14001:2004), quality management systems (ISO 9001:2008) and requirements of the competence of testing and calibration laboratories (ISO 17025:2004). The intensity of this impact, based on a two-way influence, is quite different depending on the standard considered. In any case, a new and fruitful approach to Analytical Chemistry based on these relationships can be derived. Copyright © 2013 Elsevier B.V. All rights reserved.

  1. Protocol Standards for Reporting Video Data in Academic Journals.

    Science.gov (United States)

    Rowland, Pamela A; Ignacio, Romeo C; de Moya, Marc A

    2016-04-01

    Editors of biomedical journals have estimated that a majority (40%-90%) of studies published in scientific journals cannot be replicated, even though an inherent principle of publication is that others should be able to replicate and build on published claims. Each journal sets its own protocols for establishing "quality" in articles, yet over the past 50 years, few journals in any field--especially medical education--have specified protocols for reporting the use of video data in research. The authors found that technical and industry-driven aspects of video recording, as well as a lack of standardization and reporting requirements by research journals, have led to major limitations in the ability to assess or reproduce video data used in research. Specific variables in the videotaping process (e.g., camera angle), which can be changed or be modified, affect the quality of recorded data, leading to major reporting errors and, in turn, unreliable conclusions. As more data are now in the form of digital videos, the historical lack of reporting standards makes it increasingly difficult to accurately replicate medical educational studies. Reproducibility is especially important as the medical education community considers setting national high-stakes standards in medicine and surgery based on video data. The authors of this Perspective provide basic protocol standards for investigators and journals using video data in research publications so as to allow for reproducibility.

  2. Building America House Simulation Protocols (Revised)

    Energy Technology Data Exchange (ETDEWEB)

    Hendron, R.; Engebrecht, C.

    2010-10-01

    The House Simulation Protocol document was developed to track and manage progress toward Building America's multi-year, average whole-building energy reduction research goals for new construction and existing homes, using a consistent analytical reference point. This report summarizes the guidelines for developing and reporting these analytical results in a consistent and meaningful manner for all home energy uses using standard operating conditions.

  3. 42 CFR 493.1289 - Standard: Analytic systems quality assessment.

    Science.gov (United States)

    2010-10-01

    ... 42 Public Health 5 2010-10-01 2010-10-01 false Standard: Analytic systems quality assessment. 493.1289 Section 493.1289 Public Health CENTERS FOR MEDICARE & MEDICAID SERVICES, DEPARTMENT OF HEALTH AND... through 493.1283. (b) The analytic systems quality assessment must include a review of the effectiveness...

  4. Standardized CT protocols and nomenclature: better, but not yet there

    Energy Technology Data Exchange (ETDEWEB)

    Singh, Sarabjeet; Kalra, Mannudeep K. [Harvard Medical School, Department of Radiology, Massachusetts General Hospital, Boston, MA (United States)

    2014-10-15

    Radiation dose associated with CT is an important safety concern in patient care, especially in children. Technical advancements in multidetector-row CT scanner technology offer several advantages for clinical applications; these advancements have considerably increased CT utilization and enhanced the complexity of CT scanning protocols. Furthermore, several scanner manufacturers are spearheading these technical advancements, leading to different commercial names that cause confusion among users, especially at imaging sites with scanners from different vendors. Several scientific studies and the National Council on Radiation Protection and Measurements (NCRP) have shown variation in CT radiation doses for the same body region and similar scanning protocols. Therefore, there is a need for standardization of scanning protocols and nomenclature of scan parameters. The following material reviews the status and challenges in standardization of CT scanning and nomenclature. (orig.)

  5. A Novel Process Audit for Standardized Perioperative Handoff Protocols.

    Science.gov (United States)

    Pallekonda, Vinay; Scholl, Adam T; McKelvey, George M; Amhaz, Hassan; Essa, Deanna; Narreddy, Spurthy; Tan, Jens; Templonuevo, Mark; Ramirez, Sasha; Petrovic, Michelle A

    2017-11-01

    A perioperative handoff protocol provides a standardized delivery of communication during a handoff that occurs from the operating room to the postanesthesia care unit or ICU. The protocol's success is dependent, in part, on its continued proper use over time. A novel process audit was developed to help ensure that a perioperative handoff protocol is used accurately and appropriately over time. The Audit Observation Form is used for the Audit Phase of the process audit, while the Audit Averages Form is used for the Data Analysis Phase. Employing minimal resources and using quantitative methods, the process audit provides the necessary means to evaluate the proper execution of any perioperative handoff protocol. Copyright © 2017 The Joint Commission. Published by Elsevier Inc. All rights reserved.

  6. SPIRIT 2013 Statement: defining standard protocol items for clinical trials.

    Science.gov (United States)

    Chan, An-Wen; Tetzlaff, Jennifer M; Altman, Douglas G; Laupacis, Andreas; Gøtzsche, Peter C; Krle A-Jerić, Karmela; Hrobjartsson, Asbjørn; Mann, Howard; Dickersin, Kay; Berlin, Jesse A; Dore, Caroline J; Parulekar, Wendy R; Summerskill, William S M; Groves, Trish; Schulz, Kenneth F; Sox, Harold C; Rockhold, Frank W; Rennie, Drummond; Moher, David

    2015-12-01

    The protocol of a clinical trial serves as the foundation for study planning, conduct, reporting, and appraisal. However, trial protocols and existing protocol guidelines vary greatly in content and quality. This article describes the systematic development and scope of SPIRIT (Standard Protocol Items: Recommendations for Interventional Trials) 2013, a guideline for the minimum content of a clinical trial protocol. The 33-item SPIRIT checklist applies to protocols for all clinical trials and focuses on content rather than format. The checklist recommends a full description of what is planned; it does not prescribe how to design or conduct a trial. By providing guidance for key content, the SPIRIT recommendations aim to facilitate the drafting of high-quality protocols. Adherence to SPIRIT would also enhance the transparency and completeness of trial protocols for the benefit of investigators, trial participants, patients, sponsors, funders, research ethics committees or institutional review boards, peer reviewers, journals, trial registries, policymakers, regulators, and other key stakeholders.

  7. A Standard Mutual Authentication Protocol for Cloud Computing Based Health Care System.

    Science.gov (United States)

    Mohit, Prerna; Amin, Ruhul; Karati, Arijit; Biswas, G P; Khan, Muhammad Khurram

    2017-04-01

    Telecare Medical Information System (TMIS) provides a standard platform for patients to receive necessary medical treatment from doctor(s) via Internet communication. Security protection is important for the medical records (data) of patients because they contain very sensitive information. Besides, patient anonymity is another important property that must be protected. Most recently, Chiou et al. suggested an authentication protocol for TMIS by utilizing the concept of a cloud environment. They claimed that their protocol preserves patient anonymity and is well protected. We reviewed their protocol and found that it completely fails to preserve patient anonymity. Further, the same protocol is not protected against a stolen mobile device attack. In order to improve the security level and complexity, we design a lightweight authentication protocol for the same environment. Our security analysis ensures resilience against all possible security attacks. The performance of our protocol is relatively standard in comparison with the related previous research.

  8. Security analysis of standards-driven communication protocols for healthcare scenarios.

    Science.gov (United States)

    Masi, Massimiliano; Pugliese, Rosario; Tiezzi, Francesco

    2012-12-01

    The importance of the Electronic Health Record (EHR), that stores all healthcare-related data belonging to a patient, has been recognised in recent years by governments, institutions and industry. Initiatives like the Integrating the Healthcare Enterprise (IHE) have been developed for the definition of standard methodologies for secure and interoperable EHR exchanges among clinics and hospitals. Using the requisites specified by these initiatives, many large scale projects have been set up for enabling healthcare professionals to handle patients' EHRs. The success of applications developed in these contexts crucially depends on ensuring such security properties as confidentiality, authentication, and authorization. In this paper, we first propose a communication protocol, based on the IHE specifications, for authenticating healthcare professionals and assuring patients' safety. By means of a formal analysis carried out by using the specification language COWS and the model checker CMC, we reveal a security flaw in the protocol thus demonstrating that to simply adopt the international standards does not guarantee the absence of such type of flaws. We then propose how to emend the IHE specifications and modify the protocol accordingly. Finally, we show how to tailor our protocol for application to more critical scenarios with no assumptions on the communication channels. To demonstrate feasibility and effectiveness of our protocols we have fully implemented them.

  9. Analytical standards for accountability of uranium hexafluoride - 1972

    International Nuclear Information System (INIS)

    Anon.

    1976-01-01

    An analytical standard for the accountability of uranium hexafluoride is presented that includes procedures for subsampling, determination of uranium, determination of metallic impurities and isotopic analysis by gas and thermal ionization mass spectrometry

  10. Development of a standard communication protocol for an emergency situation management in nuclear power plants

    Energy Technology Data Exchange (ETDEWEB)

    Kim, Man Cheol, E-mail: charleskim@kaeri.re.k [Integrated Risk Assessment Center, Korea Atomic Energy Research Institute, 150, Deokjin-dong, Yuseong-gu, Daejeon 305-353 (Korea, Republic of); Park, Jinkyun; Jung, Wondea [Integrated Risk Assessment Center, Korea Atomic Energy Research Institute, 150, Deokjin-dong, Yuseong-gu, Daejeon 305-353 (Korea, Republic of); Kim, Hanjeom; Kim, Yoon Joong [YGN Nuclear Power Division Training Center, Korea Hydro and Nuclear Power Company, 517 Kyemari, Hongnong-eup, Yeongkwang-gun, Chonnam 513-880 (Korea, Republic of)

    2010-06-15

    Correct communication between main control room (MCR) operators is an important factor in the management of emergency situations in nuclear power plants (NPPs). For this reason, a standard communication protocol for the management of emergency situations in NPPs has been developed, with the basic direction of enhancing the safety of NPPs and the standardization of communication protocols. To validate the newly developed standard communication protocol, validation experiments with 10 licensed NPP MCR operator teams were performed. From the validation experiments, it was found that the use of the standard communication protocol required more time, but it can contribute to the enhancement of the safety of NPPs through the operators' better grasp of safety-related parameters and more efficient and clearer communication between NPP operators, while imposing little additional workload on the NPP MCR operators. The standard communication protocol is expected to be used to train existing NPP MCR operators without much aversion, as well as new operators.

  11. A standard protocol for describing individual-based and agent-based models

    Science.gov (United States)

    Grimm, Volker; Berger, Uta; Bastiansen, Finn; Eliassen, Sigrunn; Ginot, Vincent; Giske, Jarl; Goss-Custard, John; Grand, Tamara; Heinz, Simone K.; Huse, Geir; Huth, Andreas; Jepsen, Jane U.; Jorgensen, Christian; Mooij, Wolf M.; Muller, Birgit; Pe'er, Guy; Piou, Cyril; Railsback, Steven F.; Robbins, Andrew M.; Robbins, Martha M.; Rossmanith, Eva; Ruger, Nadja; Strand, Espen; Souissi, Sami; Stillman, Richard A.; Vabo, Rune; Visser, Ute; DeAngelis, Donald L.

    2006-01-01

    Simulation models that describe autonomous individual organisms (individual based models, IBM) or agents (agent-based models, ABM) have become a widely used tool, not only in ecology, but also in many other disciplines dealing with complex systems made up of autonomous entities. However, there is no standard protocol for describing such simulation models, which can make them difficult to understand and to duplicate. This paper presents a proposed standard protocol, ODD, for describing IBMs and ABMs, developed and tested by 28 modellers who cover a wide range of fields within ecology. This protocol consists of three blocks (Overview, Design concepts, and Details), which are subdivided into seven elements: Purpose, State variables and scales, Process overview and scheduling, Design concepts, Initialization, Input, and Submodels. We explain which aspects of a model should be described in each element, and we present an example to illustrate the protocol in use. In addition, 19 examples are available in an Online Appendix. We consider ODD as a first step for establishing a more detailed common format of the description of IBMs and ABMs. Once initiated, the protocol will hopefully evolve as it becomes used by a sufficiently large proportion of modellers.

  12. An approach to standardization of urine sediment analysis via suggestion of a common manual protocol.

    Science.gov (United States)

    Ko, Dae-Hyun; Ji, Misuk; Kim, Sollip; Cho, Eun-Jung; Lee, Woochang; Yun, Yeo-Min; Chun, Sail; Min, Won-Ki

    2016-01-01

    The results of urine sediment analysis have been reported semiquantitatively. However, as recent guidelines recommend quantitative reporting of urine sediment, and with the development of automated urine sediment analyzers, there is an increasing need for quantitative analysis of urine sediment. Here, we developed a protocol for urine sediment analysis and quantified the results. Based on questionnaires, various reports, guidelines, and experimental results, we developed a protocol for urine sediment analysis. The results of this new protocol were compared with those obtained with a standardized chamber and an automated sediment analyzer. Reference intervals were also estimated using the new protocol. We developed a protocol with centrifugation at 400 g for 5 min, with an average concentration factor of 30. The correlations between the quantitative results of urine sediment analysis, the standardized chamber, and the automated sediment analyzer were generally good. The conversion factor derived from the new protocol showed a better fit with the results of manual count than the default conversion factor in the automated sediment analyzer. We developed a protocol for manual urine sediment analysis to quantitatively report the results. This protocol may provide a means for standardization of urine sediment analysis.
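    The concentration factor mentioned above is what links counted sediment elements back to a quantitative result for the original urine. The arithmetic can be sketched as follows; the chamber volume, counts, and helper function are hypothetical illustrations, not part of the published protocol.

        # Illustrative arithmetic for quantitative urine sediment reporting with a
        # known concentration factor (the protocol above uses ~30x). The counting
        # chamber volume and cell counts below are hypothetical examples.
        def cells_per_microliter_urine(count, chamber_volume_ul, concentration_factor=30.0):
            """Convert a raw count of elements seen in a counting chamber of known
            volume (filled with concentrated sediment) back to the element
            concentration in the original, uncentrifuged urine."""
            conc_in_sediment = count / chamber_volume_ul      # elements per uL of sediment
            return conc_in_sediment / concentration_factor    # elements per uL of urine

        # Example: 45 RBCs counted in a 1 uL chamber of 30x-concentrated sediment.
        print(f"{cells_per_microliter_urine(45, chamber_volume_ul=1.0):.1f} RBC/uL urine")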

  13. Normalization of cortical thickness measurements across different T1 magnetic resonance imaging protocols by novel W-Score standardization.

    Science.gov (United States)

    Chung, Jinyong; Yoo, Kwangsun; Lee, Peter; Kim, Chan Mi; Roh, Jee Hoon; Park, Ji Eun; Kim, Sang Joon; Seo, Sang Won; Shin, Jeong-Hyeon; Seong, Joon-Kyung; Jeong, Yong

    2017-10-01

    The use of different 3D T1-weighted magnetic resonance (T1 MR) imaging protocols induces image incompatibility across multicenter studies, negating the many advantages of multicenter studies. A few methods have been developed to address this problem, but significant image incompatibility still remains. Thus, we developed a novel and convenient method to improve image compatibility. W-score standardization creates quality reference values by using a healthy group to obtain normalized disease values. We developed a protocol-specific w-score standardization to control the protocol effect, which is applied to each protocol separately. We used three data sets. In dataset 1, brain T1 MR images of normal controls (NC) and patients with Alzheimer's disease (AD) from two centers, acquired with different T1 MR protocols, were used (Protocol 1 and 2, n = 45/group). In dataset 2, data from six subjects, who underwent MRI with two different protocols (Protocol 1 and 2), were used with different repetition times, echo times, and slice thicknesses. In dataset 3, T1 MR images from a large number of healthy normal controls (Protocol 1: n = 148, Protocol 2: n = 343) were collected for w-score standardization. The protocol effect and disease effect on subjects' cortical thickness were analyzed before and after the application of protocol-specific w-score standardization. As expected, different protocols resulted in differing cortical thickness measurements in both NC and AD subjects. Different measurements were obtained for the same subject when imaged with different protocols. Multivariate pattern difference between measurements was observed between the protocols. Classification accuracy between two protocols was nearly 90%. After applying protocol-specific w-score standardization, the differences between the protocols substantially decreased. Most importantly, protocol-specific w-score standardization reduced both univariate and multivariate differences in the images while
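    W-scores of the kind referred to above are conventionally computed as the residual of an observed value from the value predicted by a healthy-control regression, divided by the standard deviation of the control residuals. The sketch below illustrates that general idea with a single covariate and hypothetical data; it is not the authors' protocol-specific implementation.

        # Generic w-score sketch: regress cortical thickness on a covariate (age)
        # in healthy controls, then express a subject's value as the residual from
        # that prediction divided by the SD of the control residuals.
        # Covariates and data are hypothetical.
        import numpy as np

        def fit_control_model(age_controls, thickness_controls):
            slope, intercept = np.polyfit(age_controls, thickness_controls, 1)
            residuals = thickness_controls - (slope * age_controls + intercept)
            return slope, intercept, residuals.std(ddof=1)

        def w_score(age, thickness, model):
            slope, intercept, resid_sd = model
            expected = slope * age + intercept
            return (thickness - expected) / resid_sd

        # Hypothetical healthy-control data for one protocol (age in years, thickness in mm).
        ages = np.array([55, 60, 63, 67, 70, 74, 78], dtype=float)
        thick = np.array([2.55, 2.52, 2.50, 2.47, 2.44, 2.41, 2.38])
        model = fit_control_model(ages, thick)
        print(round(w_score(68.0, 2.30, model), 2))   # negative => thinner than expected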

  14. Local properties of analytic functions and non-standard analysis

    International Nuclear Information System (INIS)

    O'Brian, N.R.

    1976-01-01

    This is an expository account which shows how the methods of non-standard analysis can be applied to prove the Nullstellensatz for germs of analytic functions. This method of proof was discovered originally by Abraham Robinson. The necessary concepts from model theory are described in some detail and the Nullstellensatz is proved by investigating the relation between the set of infinitesimal elements in the complex n-plane and the spectrum of the ring of germs of analytic functions. (author)

  15. From Expert Protocols to Standardized Management of Infectious Diseases.

    Science.gov (United States)

    Lagier, Jean-Christophe; Aubry, Camille; Delord, Marion; Michelet, Pierre; Tissot-Dupont, Hervé; Million, Matthieu; Brouqui, Philippe; Raoult, Didier; Parola, Philippe

    2017-08-15

    We report here 4 examples of management of infectious diseases (IDs) at the University Hospital Institute Méditerranée Infection in Marseille, France, to illustrate the value of expert protocols feeding standardized management of IDs. First, we describe our experience on Q fever and Tropheryma whipplei infection management based on in vitro data and clinical outcome. Second, we describe our management-based approach for the treatment of infective endocarditis, leading to a strong reduction of mortality rate. Third, we report our use of fecal microbiota transplantation to face severe Clostridium difficile infections and to perform decolonization of patients colonized by emerging highly resistant bacteria. Finally, we present the standardized management of the main acute infections in patients admitted in the emergency department, promoting antibiotics by oral route, checking compliance with the protocol, and avoiding the unnecessary use of intravenous and urinary tract catheters. Overall, the standardization of the management is the keystone to reduce both mortality and morbidity related to IDs. © The Author 2017. Published by Oxford University Press for the Infectious Diseases Society of America. All rights reserved. For permissions, e-mail: journals.permissions@oup.com.

  16. Standardization of a Videofluoroscopic Swallow Study Protocol to Investigate Dysphagia in Dogs.

    Science.gov (United States)

    Harris, R A; Grobman, M E; Allen, M J; Schachtel, J; Rawson, N E; Bennett, B; Ledyayev, J; Hopewell, B; Coates, J R; Reinero, C R; Lever, T E

    2017-03-01

    Videofluoroscopic swallow study (VFSS) is the gold standard for diagnosis of dysphagia in veterinary medicine but lacks standardized protocols that emulate physiologic feeding practices. Age impacts swallow function in humans but has not been evaluated by VFSS in dogs. To develop a protocol with custom kennels designed to allow free-feeding of 3 optimized formulations of contrast media and diets that address limitations of current VFSS protocols. We hypothesized that dogs evaluated by a free-feeding VFSS protocol would show differences in objective swallow metrics based on age. Healthy juvenile, adult, and geriatric dogs (n = 24). Prospective, experimental study. Custom kennels were developed to maintain natural feeding behaviors during VFSS. Three food consistencies (thin liquid, pureed food, and dry kibble) were formulated with either iohexol or barium to maximize palatability and voluntary prehension. Dogs were evaluated by 16 swallow metrics and compared across age groups. Development of a standardized VFSS protocol resulted in successful collection of swallow data in healthy dogs. No significant differences in swallow metrics were observed among age groups. Substantial variability was observed in healthy dogs when evaluated under these physiologic conditions. Features typically attributed to pathologic states, such as gastric reflux, were seen in healthy dogs. Development of a VFSS protocol that reflects natural feeding practices may allow emulation of physiology resulting in clinical signs of dysphagia. Age did not result in significant changes in swallow metrics, but additional studies are needed, particularly in light of substantial normal variation. Copyright © 2017 The Authors. Journal of Veterinary Internal Medicine published by Wiley Periodicals, Inc. on behalf of the American College of Veterinary Internal Medicine.

  17. Inter-laboratory variation in DNA damage using a standard comet assay protocol

    DEFF Research Database (Denmark)

    Forchhammer, Lykke; Ersson, Clara; Loft, Steffen

    2012-01-01

    determined the baseline level of DNA strand breaks (SBs)/alkaline labile sites and formamidopyrimidine DNA glycosylase (FPG)-sensitive sites in coded samples of mononuclear blood cells (MNBCs) from healthy volunteers. There were technical problems in seven laboratories in adopting the standard protocol...... analysed by the standard protocol. The SBs and FPG-sensitive sites were measured in the same experiment, indicating that the large spread in the latter lesions was the main reason for the reduced inter-laboratory variation. However, it remains worrying that half of the participating laboratories obtained...

  18. A Standardized Protocol for the Prospective Follow-Up of Cleft Lip and Palate Patients.

    Science.gov (United States)

    Salimi, Negar; Jolanta, Aleksejūnienė; Edwin, Yen; Angelina, Loo

    2018-01-01

    To develop a standardized all-encompassing protocol for the assessment of cleft lip and palate patients with clinical and research implications. Electronic database searches were conducted and 13 major cleft centers worldwide were contacted in order to prepare for the development of the protocol. In preparation, the available evidence was reviewed and potential fistula-related risk determinants from 4 different domains were identified. No standardized protocol for the assessment of cleft patients could be found in any of the electronic database searches that were conducted. Interviews with representatives from several major centers revealed that the majority of centers do not have a standardized comprehensive strategy for the reporting and follow-up of cleft lip and palate patients. The protocol was developed and consisted of the following domains of determinants: (1) the sociodemographic domain, (2) the cleft defect domain, (3) the surgery domain, and (4) the fistula domain. The proposed protocol has the potential to enhance the quality of patient care by ensuring that multiple patient-related aspects are consistently reported. It may also facilitate future multicenter research, which could contribute to the reduction of fistula occurrence in cleft lip and palate patients.

  19. Pre-analytical and post-analytical evaluation in the era of molecular diagnosis of sexually transmitted diseases: cellularity control and internal control

    Directory of Open Access Journals (Sweden)

    Loria Bianchi

    2014-06-01

    Background. The increase in molecular tests performed on DNA extracted from various biological materials should not occur without adequate standardization of the pre-analytical and post-analytical phases. Materials and Methods. The aim of this study was to evaluate the role of the internal control (IC) in standardizing the pre-analytical phase and the role of the cellularity control (CC) in the suitability evaluation of biological matrices, and their influence on false negative results. 120 cervical swabs (CS) were pre-treated and extracted following 3 different protocols. Extraction performance was evaluated by amplification of: IC, added to each extraction mix; the human gene HPRT1 (CC) with RT-PCR to quantify sample cellularity; and the L1 region of HPV with SPF10 primers. 135 urine samples, 135 urethral swabs, 553 CS and 332 ThinPrep swabs (TP) were tested for C. trachomatis (CT) and U. parvum (UP) with RT-PCR and for HPV by endpoint-PCR. Samples were also tested for cellularity. Results. The extraction protocol with the highest average cellularity (Ac) per sample showed the lowest number of samples with inhibitors; the highest HPV positivity was achieved by the protocol with the greatest Ac/PCR. CS and TP under 300,000 cells/sample showed a significant decrease of UP (P<0.01) and HPV (P<0.005) positivity. Female urine under 40,000 cells/mL was inadequate to detect UP (P<0.05). Conclusions. Our data show that the IC and CC allow optimization of the pre-analytical phase, with an increase in analytical quality. Cellularity/sample allows better evaluation of sample adequacy, crucial to avoid false negative results, while cellularity/PCR allows better optimization of PCR amplification. Further data are required to define the optimal cut-off for result normalization.
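    The cellularity thresholds reported above suggest a simple pre-analytical adequacy rule; the sketch below applies those thresholds as fixed cut-offs. The decision rule itself is an illustrative assumption, not part of the published protocol.

        # Illustrative pre-analytical adequacy check based on the cellularity
        # thresholds reported above (300,000 cells/sample for cervical/ThinPrep
        # swabs, 40,000 cells/mL for female urine). The rule is an assumption
        # made for illustration only.
        SWAB_MIN_CELLS_PER_SAMPLE = 300_000
        URINE_MIN_CELLS_PER_ML = 40_000

        def is_adequate(sample_type, cellularity):
            """cellularity: cells/sample for swabs, cells/mL for urine."""
            if sample_type in ("cervical_swab", "thinprep"):
                return cellularity >= SWAB_MIN_CELLS_PER_SAMPLE
            if sample_type == "female_urine":
                return cellularity >= URINE_MIN_CELLS_PER_ML
            raise ValueError(f"unknown sample type: {sample_type}")

        print(is_adequate("cervical_swab", 120_000))   # False -> risk of false negatives
        print(is_adequate("female_urine", 55_000))     # True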

  20. Preliminary results of testing bioassay analytical performance standards

    International Nuclear Information System (INIS)

    Fisher, D.R.; Robinson, A.V.; Hadley, R.T.

    1983-08-01

    The analytical performance of both in vivo and in vitro bioassay laboratories is being studied to determine the capability of these laboratories to meet the minimum criteria for accuracy and precision specified in the draft ANSI Standard N13.30, Performance Criteria for Radiobioassay. This paper presents preliminary results of the first round of testing

  1. Analytical chemistry methods for boron carbide absorber material. [Standard

    Energy Technology Data Exchange (ETDEWEB)

    DELVIN WL

    1977-07-01

    This standard provides analytical chemistry methods for the analysis of boron carbide powder and pellets for the following: total C and B, B isotopic composition, soluble C and B, fluoride, chloride, metallic impurities, gas content, water, nitrogen, and oxygen. (DLC)

  2. Analytical approach to cross-layer protocol optimization in wireless sensor networks

    Science.gov (United States)

    Hortos, William S.

    2008-04-01

    In the distributed operations of route discovery and maintenance, strong interaction occurs across mobile ad hoc network (MANET) protocol layers. Quality of service (QoS) requirements of multimedia service classes must be satisfied by the cross-layer protocol, along with minimization of the distributed power consumption at nodes and along routes to battery-limited energy constraints. In previous work by the author, cross-layer interactions in the MANET protocol are modeled in terms of a set of concatenated design parameters and associated resource levels by multivariate point processes (MVPPs). Determination of the "best" cross-layer design is carried out using the optimal control of martingale representations of the MVPPs. In contrast to the competitive interaction among nodes in a MANET for multimedia services using limited resources, the interaction among the nodes of a wireless sensor network (WSN) is distributed and collaborative, based on the processing of data from a variety of sensors at nodes to satisfy common mission objectives. Sensor data originates at the nodes at the periphery of the WSN, is successively transported to other nodes for aggregation based on information-theoretic measures of correlation and ultimately sent as information to one or more destination (decision) nodes. The "multimedia services" in the MANET model are replaced by multiple types of sensors, e.g., audio, seismic, imaging, thermal, etc., at the nodes; the QoS metrics associated with MANETs become those associated with the quality of fused information flow, i.e., throughput, delay, packet error rate, data correlation, etc. Significantly, the essential analytical approach to MANET cross-layer optimization, now based on the MVPPs for discrete random events occurring in the WSN, can be applied to develop the stochastic characteristics and optimality conditions for cross-layer designs of sensor network protocols. Functional dependencies of WSN performance metrics are described in

  3. Importance of Standardized DXA Protocol for Assessing Physique Changes in Athletes.

    Science.gov (United States)

    Nana, Alisa; Slater, Gary J; Hopkins, Will G; Halson, Shona L; Martin, David T; West, Nicholas P; Burke, Louise M

    2016-06-01

    The implications of undertaking DXA scans using best practice protocols (subjects fasted and rested) or a less precise but more practical protocol in assessing chronic changes in body composition following training and a specialized recovery technique were investigated. Twenty-one male cyclists completed an overload training program, in which they were randomized to four sessions per week of either cold water immersion therapy or control groups. Whole-body DXA scans were undertaken with best practice protocol (Best) or random activity protocol (Random) at baseline, after 3 weeks of overload training, and after a 2-week taper. Magnitudes of changes in total, lean and fat mass from baseline-overload, overload-taper and baseline-taper were assessed by standardization (Δmean/SD). The standard deviations of change scores for total and fat-free soft tissue mass (FFST) from Random scans (2-3%) were approximately double those observed in the Best (1-2%), owing to extra random errors associated with Random scans at baseline. There was little difference in change scores for fat mass. The effect of cold water immersion therapy on baseline-taper changes in FFST was possibly harmful (-0.7%; 90% confidence limits ±1.2%) with Best scans but unclear with Random scans (0.9%; ±2.0%). Both protocols gave similar possibly harmful effects of cold water immersion therapy on changes in fat mass (6.9%; ±13.5% and 5.5%; ±14.3%, respectively). An interesting effect of cold water immersion therapy on training-induced changes in body composition might have been missed with a less precise scanning protocol. DXA scans should be undertaken with Best.
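    The standardization used above (Δmean/SD) expresses the change in the group mean as a fraction of a between-subject standard deviation. A brief illustration with hypothetical numbers follows; the data are invented, not taken from the study.

        # Brief illustration of standardizing a change score as delta-mean / SD.
        # Numbers are hypothetical, not data from the study.
        import statistics

        baseline_ffst_kg = [58.2, 61.5, 55.9, 63.0, 59.7, 60.4]   # fat-free soft tissue
        followup_ffst_kg = [58.9, 62.0, 56.3, 63.8, 60.1, 61.0]

        delta_mean = statistics.mean(followup_ffst_kg) - statistics.mean(baseline_ffst_kg)
        between_subject_sd = statistics.stdev(baseline_ffst_kg)
        print(round(delta_mean / between_subject_sd, 2))   # standardized change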

  4. The evaluation of an analytical protocol for the determination of substances in waste for hazard classification

    Energy Technology Data Exchange (ETDEWEB)

    Hennebert, Pierre, E-mail: pierre.hennebert@ineris.fr [INERIS – Institut National de l’Environnement Industriel et des Risques, Domaine du Petit Arbois BP33, F-13545 Aix-en-Provence (France); Papin, Arnaud [INERIS, Parc Technologique ALATA, BP No. 2, 60550 Verneuil en Halatte (France); Padox, Jean-Marie [INERIS – Institut National de l’Environnement Industriel et des Risques, Domaine du Petit Arbois BP33, F-13545 Aix-en-Provence (France); Hasebrouck, Benoît [INERIS, Parc Technologique ALATA, BP No. 2, 60550 Verneuil en Halatte (France)

    2013-07-15

    Highlights: • Knowledge of the substances present in wastes will be necessary to assess the HP1–HP15 hazard properties. • A new analytical protocol is proposed for this purpose and was tested by two service laboratories on 32 samples. • Sixty-three percent of the samples had a satisfactory analytical balance between 90% and 110%. • Eighty-four percent of the samples were classified identically (Seveso Directive) for their hazardousness by the two laboratories. • The method, still in progress, is being standardized in France and will be proposed to CEN. - Abstract: The classification of waste as hazardous could soon be assessed in Europe largely using the hazard properties of its constituents, according to the Classification, Labelling and Packaging (CLP) regulation. Comprehensive knowledge of the component constituents of a given waste will therefore be necessary. An analytical protocol for determining waste composition is proposed, which includes using inductively coupled plasma (ICP) screening methods to identify major elements and gas chromatography/mass spectrometry (GC–MS) screening techniques to measure organic compounds. The method includes a gross or indicator measure of ‘pools’ of higher molecular weight organic substances that are taken to be less bioactive and less hazardous, and of unresolved ‘mass’ during the chromatography of volatile and semi-volatile compounds. The concentrations of some elements and specific compounds that are linked to specific hazard properties and are subject to specific regulation (examples include: heavy metals, chromium(VI), cyanides, organo-halogens, and PCBs) are determined by classical quantitative analysis. To check the consistency of the analysis, the sum of the concentrations (including unresolved ‘pools’) should give a mass balance between 90% and 110%. Thirty-two laboratory samples comprising different industrial wastes (liquids and solids) were tested by two routine service laboratories, to give circa 7000 parameter results.

  5. Comparison of a fast 5-min knee MRI protocol with a standard knee MRI protocol. A multi-institutional multi-reader study

    Energy Technology Data Exchange (ETDEWEB)

    FitzGerald Alaia, Erin; Beltran, Luis S.; Garwood, Elisabeth; Burke, Christopher J.; Gyftopoulos, Soterios [NYU Langone Medical Center, Department of Radiology, Musculoskeletal Division, New York, NY (United States); Benedick, Alex [Case Western Reserve University, School of Medicine, Cleveland, OH (United States); Obuchowski, Nancy A. [Cleveland Clinic, Department of Quantitative Health Sciences, Cleveland, OH (United States); Polster, Joshua M.; Schils, Jean; Subhas, Naveen [Cleveland Clinic, Department of Radiology, Musculoskeletal Division, Cleveland, OH (United States); Chang, I. Yuan Joseph [Texas Scottish Rite Hospital for Children, Dallas, TX (United States)

    2018-01-15

    To compare the diagnostic performance of a 5-min knee MRI protocol to that of a standard knee MRI. One hundred 3 T MRIs (100 patients, mean age 38.8 years) and 50 1.5 T MRIs (46 patients, mean age 46.4 years), each consisting of five fast 2D multiplanar fast-spin-echo (FSE) sequences and five standard multiplanar FSE sequences, from two academic centers (1/2015-1/2016), were retrospectively reviewed by four musculoskeletal radiologists. Agreement between fast and standard (interprotocol agreement) and between standard (intraprotocol agreement) readings for meniscal, ligamentous, chondral, and bone pathology was compared for interchangeability. Frequency of major findings, sensitivity, and specificity were also tested for each protocol. Interprotocol agreement using fast MRI was similar to intraprotocol agreement with standard MRI (83.0-99.5%), with no excess disagreement (≤ 1.2%; 95% CI, -4.2 to 3.8%), across all structures. Frequency of major findings (1.1-22.4% across structures) on fast and standard MRI was not significantly different (p ≥ 0.215), except for more ACL tears on fast MRI (p = 0.021) and more cartilage defects on standard MRI (p < 0.001). Sensitivities (59-100%) and specificities (73-99%) of fast and standard MRI were not significantly different for meniscal and ligament tears (95% CI for difference, -0.08 to 0.08). For cartilage defects, fast MRI was slightly less sensitive (95% CI for difference, -0.125 to -0.01) but slightly more specific (95% CI for difference, 0.01 to 0.5) than standard MRI. A fast 5-min MRI protocol is interchangeable with and has similar accuracy to a standard knee MRI for evaluating internal derangement of the knee. (orig.)
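
    The interchangeability criterion used above compares interprotocol agreement (fast vs. standard reading) with intraprotocol agreement (standard vs. standard reading); their difference is the excess disagreement attributable to the fast protocol. The toy sketch below illustrates only this point estimate (the study also reports confidence intervals), using fabricated readings.

        # Sketch of the interchangeability comparison described above; the reading
        # lists are fabricated toy data, not results from the study.
        fast_read      = ["tear", "normal", "tear", "normal", "normal", "tear", "normal", "normal"]
        standard_read1 = ["tear", "normal", "normal", "normal", "normal", "tear", "normal", "normal"]
        standard_read2 = ["tear", "normal", "normal", "normal", "tear", "tear", "normal", "normal"]

        def agreement(a, b):
            return sum(x == y for x, y in zip(a, b)) / len(a)

        inter = agreement(fast_read, standard_read1)        # fast vs. standard reading
        intra = agreement(standard_read2, standard_read1)   # standard vs. standard reading
        excess_disagreement = intra - inter                  # > 0 means the fast protocol agrees less

        print(f"interprotocol agreement: {inter:.1%}")
        print(f"intraprotocol agreement: {intra:.1%}")
        print(f"excess disagreement: {excess_disagreement:+.1%}")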

  6. Comparison of a fast 5-min knee MRI protocol with a standard knee MRI protocol. A multi-institutional multi-reader study

    International Nuclear Information System (INIS)

    FitzGerald Alaia, Erin; Beltran, Luis S.; Garwood, Elisabeth; Burke, Christopher J.; Gyftopoulos, Soterios; Benedick, Alex; Obuchowski, Nancy A.; Polster, Joshua M.; Schils, Jean; Subhas, Naveen; Chang, I. Yuan Joseph

    2018-01-01

    To compare the diagnostic performance of a 5-min knee MRI protocol to that of a standard knee MRI. One hundred 3 T MRIs (100 patients, mean age 38.8 years) and 50 1.5 T MRIs (46 patients, mean age 46.4 years), each consisting of five fast 2D multiplanar fast-spin-echo (FSE) sequences and five standard multiplanar FSE sequences, from two academic centers (1/2015-1/2016), were retrospectively reviewed by four musculoskeletal radiologists. Agreement between fast and standard (interprotocol agreement) and between standard (intraprotocol agreement) readings for meniscal, ligamentous, chondral, and bone pathology was compared for interchangeability. Frequency of major findings, sensitivity, and specificity were also tested for each protocol. Interprotocol agreement using fast MRI was similar to intraprotocol agreement with standard MRI (83.0-99.5%), with no excess disagreement (≤ 1.2%; 95% CI, -4.2 to 3.8%), across all structures. Frequency of major findings (1.1-22.4% across structures) on fast and standard MRI was not significantly different (p ≥ 0.215), except for more ACL tears on fast MRI (p = 0.021) and more cartilage defects on standard MRI (p < 0.001). Sensitivities (59-100%) and specificities (73-99%) of fast and standard MRI were not significantly different for meniscal and ligament tears (95% CI for difference, -0.08 to 0.08). For cartilage defects, fast MRI was slightly less sensitive (95% CI for difference, -0.125 to -0.01) but slightly more specific (95% CI for difference, 0.01 to 0.5) than standard MRI. A fast 5-min MRI protocol is interchangeable with and has similar accuracy to a standard knee MRI for evaluating internal derangement of the knee. (orig.)

  7. Effect of a Standardized Protocol of Antibiotic Therapy on Surgical Site Infection after Laparoscopic Surgery for Complicated Appendicitis.

    Science.gov (United States)

    Park, Hyoung-Chul; Kim, Min Jeong; Lee, Bong Hwa

    Although it is accepted that complicated appendicitis requires antibiotic therapy to prevent post-operative surgical infections, consensus protocols on the duration and regimens of treatment are not well established. This study aimed to compare post-operative infectious complications in patients receiving an old, non-standardized antibiotic protocol and a new standardized protocol, involving 10 or 5 days of treatment, respectively. We enrolled 1,343 patients who underwent laparoscopic surgery for complicated appendicitis between January 2009 and December 2014. With the introduction of the new protocol, the patients were divided into two groups: 10 days of various antibiotic regimens (January 2009 to June 2012, the non-standardized protocol; n = 730) and 5 days of a cefuroxime and metronidazole regimen (July 2012 to December 2014, the standardized protocol; n = 613). We compared the clinical outcomes, including surgical site infection (SSI) (superficial and deep organ/space infections), in the two groups. The standardized protocol group had a slightly shorter operative time (67 vs. 69 min), a shorter hospital stay (5 vs. 5.4 d), and lower medical costs (US$1,564 vs. US$1,654). Otherwise, there was no difference between the groups. No differences were found between the non-standardized and standardized protocol groups with regard to the rate of superficial infection (10.3% vs. 12.7%; p = 0.488) or deep organ/space infection (2.3% vs. 2.1%; p = 0.797). In patients undergoing laparoscopic surgery for complicated appendicitis, five days of cefuroxime and metronidazole did not lead to more SSIs, and it decreased medical costs compared with non-standardized antibiotic regimens.

  8. Biocoder: A programming language for standardizing and automating biology protocols.

    Science.gov (United States)

    Ananthanarayanan, Vaishnavi; Thies, William

    2010-11-08

    Published descriptions of biology protocols are often ambiguous and incomplete, making them difficult to replicate in other laboratories. However, there is increasing benefit to formalizing the descriptions of protocols, as laboratory automation systems (such as microfluidic chips) are becoming increasingly capable of executing them. Our goal in this paper is to improve both the reproducibility and automation of biology experiments by using a programming language to express the precise series of steps taken. We have developed BioCoder, a C++ library that enables biologists to express the exact steps needed to execute a protocol. In addition to being suitable for automation, BioCoder converts the code into a readable, English-language description for use by biologists. We have implemented over 65 protocols in BioCoder; the most complex of these was successfully executed by a biologist in the laboratory using BioCoder as the only reference. We argue that BioCoder exposes and resolves ambiguities in existing protocols, and could provide the software foundations for future automation platforms. BioCoder is freely available for download at http://research.microsoft.com/en-us/um/india/projects/biocoder/. BioCoder represents the first practical programming system for standardizing and automating biology protocols. Our vision is to change the way that experimental methods are communicated: rather than publishing a written account of the protocols used, researchers will simply publish the code. Our experience suggests that this practice is tractable and offers many benefits. We invite other researchers to leverage BioCoder to improve the precision and completeness of their protocols, and also to adapt and extend BioCoder to new domains.
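
    BioCoder itself is a C++ library; the sketch below is only a Python analogue of the protocol-as-code idea, in which each step is expressed programmatically and rendered as readable instructions (the same object could, in principle, drive an automation back end). The class and method names here are invented for illustration and do not reproduce the actual BioCoder API.

        # Illustrative only: a tiny protocol-as-code sketch in the spirit of BioCoder.
        # Class and method names are hypothetical; the real library is a C++ API.
        class Protocol:
            def __init__(self, name):
                self.name = name
                self.steps = []

            def add(self, text):
                self.steps.append(text)

            def measure(self, volume_ul, reagent, container):
                self.add(f"Measure {volume_ul} µL of {reagent} into {container}.")

            def incubate(self, container, temp_c, minutes):
                self.add(f"Incubate {container} at {temp_c} °C for {minutes} min.")

            def centrifuge(self, container, rpm, minutes):
                self.add(f"Centrifuge {container} at {rpm} rpm for {minutes} min.")

            def render(self):
                lines = [f"Protocol: {self.name}"]
                lines += [f"  {i + 1}. {step}" for i, step in enumerate(self.steps)]
                return "\n".join(lines)

        p = Protocol("Toy lysis step")
        p.measure(200, "lysis buffer", "tube A")
        p.incubate("tube A", 37, 30)
        p.centrifuge("tube A", 13000, 5)
        print(p.render())   # an English-language rendering of the coded protocol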

  9. Industrial wireless sensor networks applications, protocols, and standards

    CERN Document Server

    Güngör, V Çagri

    2013-01-01

    The collaborative nature of industrial wireless sensor networks (IWSNs) brings several advantages over traditional wired industrial monitoring and control systems, including self-organization, rapid deployment, flexibility, and inherent intelligent processing. In this regard, IWSNs play a vital role in creating more reliable, efficient, and productive industrial systems, thus improving companies' competitiveness in the marketplace. Industrial Wireless Sensor Networks: Applications, Protocols, and Standards examines the current state of the art in industrial wireless sensor networks and outline

  10. Shoulder muscle endurance: the development of a standardized and reliable protocol

    Directory of Open Access Journals (Sweden)

    Roy Jean-Sébastien

    2011-01-01

    Abstract Background Shoulder muscle fatigue has been proposed as a possible link to explain the association between repetitive arm use and the development of rotator cuff disorders. To our knowledge, no standardized clinical endurance protocol has been developed to evaluate the effects of muscle fatigue on shoulder function. Such a test could improve clinical examination of individuals with shoulder disorders. Therefore, the purpose of this study was to establish a reliable protocol for objective assessment of shoulder muscle endurance. Methods An endurance protocol was developed on a stationary dynamometer (Biodex System 3). The endurance protocol was performed in isotonic mode with the resistance set at 50% of each subject's peak torque as measured for shoulder external (ER) and internal rotation (IR). Each subject performed 60 continuous repetitions of IR/ER rotation. The endurance protocol was performed by 36 healthy individuals on two separate occasions at least two days apart. Maximal isometric shoulder strength tests were performed before and after the fatigue protocol to evaluate the effects of the endurance protocol and its reliability. Paired t-tests were used to evaluate the reduction in shoulder strength due to the protocol, while intraclass correlation coefficients (ICC) and minimal detectable change (MDC) were used to evaluate its reliability. Results Maximal isometric strength was significantly decreased after the endurance protocol (P [...] 0.84. Conclusions Changes in muscular performance observed during and after the muscular endurance protocol suggest that the protocol did result in muscular fatigue. Furthermore, this study established that the resultant effects of fatigue of the proposed isotonic protocol were reproducible over time. The protocol was performed without difficulty by all volunteers and took less than 10 minutes to perform, suggesting that it might be feasible for clinical practice. This protocol could be used to induce
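
    The reliability statistics named above are connected by standard formulas: the standard error of measurement is SEM = SD × sqrt(1 − ICC), and the minimal detectable change at the 95% level is MDC95 = 1.96 × sqrt(2) × SEM. The sketch below applies these formulas to invented torque values and an assumed ICC; the numbers are not from the study.

        import math
        import statistics

        # Minimal sketch with invented data: compute SEM and MDC95 from an ICC and the
        # between-subject SD of test scores, as commonly done in reliability studies.
        session1_torque = [23.1, 30.4, 27.8, 25.0, 31.2, 28.6]   # hypothetical peak torques (Nm)
        icc = 0.88                                               # hypothetical test-retest ICC

        sd = statistics.stdev(session1_torque)
        sem = sd * math.sqrt(1 - icc)              # standard error of measurement
        mdc95 = 1.96 * math.sqrt(2) * sem          # minimal detectable change at the 95% level

        print(f"SD = {sd:.2f} Nm, SEM = {sem:.2f} Nm, MDC95 = {mdc95:.2f} Nm")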

  11. Beyond communication: the role of standardized protocols in a changing health care environment.

    Science.gov (United States)

    Vardaman, James M; Cornell, Paul; Gondo, Maria B; Amis, John M; Townsend-Gervis, Mary; Thetford, Carol

    2012-01-01

    Communication errors have grave consequences in health care settings. The situation-background-assessment-recommendation (SBAR) protocol has been theorized to improve communication by creating a common language between nurses and physicians in acute care situations. This practice is gaining acceptance across the health care field. However, as yet, there has been little investigation of the ways in which SBAR may have an impact on how health care professionals operate beyond the creation of a common language. The purposes of the study were to explore the implementation of the SBAR protocol and investigate the potential impact of SBAR on the day-to-day experiences of nurses. We performed a qualitative case study of 2 hospitals that were implementing the SBAR protocol. We collected data from 80 semistructured interviews with nurses, nurse managers, and physicians; observation of nursing and other hospital activities; and documents that pertained to the implementation of the SBAR protocol. Data were analyzed using a thematic approach. Our analysis revealed 4 dimensions of impact that SBAR has beyond its use as a communication tool: schema formation, development of legitimacy, development of social capital, and reinforcement of dominant logics. The results indicate that SBAR may function as more than a tool to standardize communication among nurses and physicians. Rather, the findings indicate that SBAR may aid in schema development that allows rapid decision making by nurses, provide social capital and legitimacy for less-tenured nurses, and reinforce a move toward standardization in the nursing profession. Our findings further suggest that standardized protocols such as SBAR may be a cost-effective method for hospital managers and administrators to accelerate the socialization of nurses, particularly new hires.

  12. The impact of a new standard labor protocol on maternal and neonatal outcomes.

    Science.gov (United States)

    Wang, Dingran; Ye, Shenglong; Tao, Liyuan; Wang, Yongqing

    2017-12-01

    To analyze the clinical outcomes following the implementation of a new standard labor procedure. This was a retrospective analysis that included a study group consisting of patients managed under a new standard labor protocol and a control group comprising patients managed under an old standard labor protocol. The following maternal and perinatal outcomes were compared in the two groups: the indications for a cesarean section and the incidence of cesarean section, postpartum hemorrhage, fetal distress, neonatal asphyxia and pediatric intervention. We also compared the average number of days spent in the hospital, the incidence of medical disputes and hospitalization expenses. The cesarean section rates for the study and control groups were 19.29% (401/2079) and 33.53% (753/2246), respectively (P [...]) labor, fetal distress and intrapartum fever; the percentages of each indication were significantly different from those of the control group (P [...]) labor protocol reduced the cesarean section rate without negatively impacting maternal and neonatal outcomes. In practice, bed turnover and the hospital utilization rate should be better controlled, patient-doctor communication should be strengthened and the quality of obstetrical service should be improved.

  13. Improving biofeedback for the treatment of fecal incontinence in women: implementation of a standardized multi-site manometric biofeedback protocol.

    Science.gov (United States)

    Markland, A D; Jelovsek, J E; Whitehead, W E; Newman, D K; Andy, U U; Dyer, K; Harm-Ernandes, I; Cichowski, S; McCormick, J; Rardin, C; Sutkin, G; Shaffer, A; Meikle, S

    2017-01-01

    Standardized training and clinical protocols using biofeedback for the treatment of fecal incontinence (FI) are important for clinical care. Our primary aims were to develop, implement, and evaluate adherence to a standardized protocol for manometric biofeedback to treat FI. In a Pelvic Floor Disorders Network (PFDN) trial, participants were enrolled from eight PFDN clinical centers across the United States. A team of clinical and equipment experts developed biofeedback software on a novel tablet computer platform for conducting standardized anorectal manometry with separate manometric biofeedback protocols for improving anorectal muscle strength, sensation, and urge resistance. The training protocol also included education on bowel function, anal sphincter exercises, and bowel diary monitoring. Study interventionists completed online training prior to attending a centralized, standardized certification course. For the certification, expert trainers assessed the ability of the interventionists to perform the protocol components for a paid volunteer who acted as a standardized patient. Postcertification, the trainers audited interventionists during trial implementation to improve protocol adherence. Twenty-four interventionists attended the in-person training and certification, including 46% advanced practice registered nurses (11/24), 50% (12/24) physical therapists, and 4% physician assistants (1/24). Trainers performed audio audits for 88% (21/24), representing 84 audited visits. All certified interventionists met or exceeded the prespecified 80% pass rate for the audit process, with an average passing rate of 93%. A biofeedback protocol can be successfully imparted to experienced pelvic floor health care providers from various disciplines. Our process promoted high adherence to a standard protocol and is applicable to many clinical settings. © 2016 John Wiley & Sons Ltd.

  14. Outcomes of Optimized over Standard Protocol of Rabbit Antithymocyte Globulin for Severe Aplastic Anemia: A Single-Center Experience

    Science.gov (United States)

    Ge, Meili; Shao, Yingqi; Huang, Jinbo; Huang, Zhendong; Zhang, Jing; Nie, Neng; Zheng, Yizhou

    2013-01-01

    Background Previous reports showed that the outcome of rabbit antithymocyte globulin (rATG) as first-line therapy for severe aplastic anemia (SAA) was not satisfactory. We explored a modified schedule of administration of rATG. Design and Methods Outcomes of a cohort of 175 SAA patients, including 51 patients given the standard protocol (3.55 mg/kg/d for 5 days) and 124 given an optimized protocol (1.97 mg/kg/d for 9 days) of rATG plus cyclosporine (CSA), were analyzed retrospectively. Results Of all 175 patients, response rates at 3 and 6 months were 36.6% and 56.0%, respectively. The 51 cases who received the standard protocol had poor responses at 3 (25.5%) and 6 months (41.2%). However, the 124 patients who received the optimized protocol had better responses at 3 (41.1%, P = 0.14) and 6 months (62.1%, P = 0.01). Higher incidences of infection (57.1% versus 37.9%, P = 0.02) and early mortality (17.9% versus 0.8%, P<0.001) occurred in patients who received the standard protocol compared with the optimized protocol. A 5-year overall survival advantage in favor of the optimized over the standard rATG protocol (76.0% versus 50.3%, P<0.001) was observed. By multivariate analysis, the optimized protocol (RR = 2.21, P = 0.04), response at 3 months (RR = 10.31, P = 0.03) and a shorter interval (<23 days) between diagnosis and the initial dose of rATG (RR = 5.35, P = 0.002) were independent favorable predictors of overall survival. Conclusions The optimized rather than the standard rATG protocol, in combination with CSA, remained efficacious as a first-line immunosuppressive regimen for SAA. PMID:23554855
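
    One way to read the "optimized" schedule is that it keeps the cumulative rATG dose essentially unchanged while lowering the daily dose and extending administration; the quick arithmetic check below, using the doses quoted in the abstract, is only an illustration of that interpretation and not an analysis from the study.

        # Quick arithmetic check of the two rATG schedules quoted in the abstract.
        standard_total = 3.55 * 5    # mg/kg/day for 5 days
        optimized_total = 1.97 * 9   # mg/kg/day for 9 days

        print(f"standard protocol total dose:  {standard_total:.2f} mg/kg")
        print(f"optimized protocol total dose: {optimized_total:.2f} mg/kg")
        # Both come to roughly 17.7-17.8 mg/kg: the cumulative dose is similar, but the
        # optimized schedule spreads it over more days at a lower daily dose.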

  15. Why standard brain-computer interface (BCI) training protocols should be changed: an experimental study

    Science.gov (United States)

    Jeunet, Camille; Jahanpour, Emilie; Lotte, Fabien

    2016-06-01

    Objective. While promising, electroencephalography-based brain-computer interfaces (BCIs) are barely used due to their lack of reliability: 15% to 30% of users are unable to control a BCI. Standard training protocols may be partly responsible, as they do not satisfy recommendations from psychology. Our main objective was to determine in practice to what extent standard training protocols impact users’ motor imagery based BCI (MI-BCI) control performance. Approach. We performed two experiments. The first consisted of evaluating the efficiency of a standard BCI training protocol for the acquisition of non-BCI related skills in a BCI-free context, which enabled us to rule out the possible impact of BCIs on the training outcome. Thus, participants (N = 54) were asked to perform simple motor tasks. The second experiment was aimed at measuring the correlations between motor tasks and MI-BCI performance. The ten best and ten worst performers of the first study were recruited for an MI-BCI experiment during which they had to learn to perform two MI tasks. We also assessed users’ spatial ability and pre-training μ rhythm amplitude, as both have been related to MI-BCI performance in the literature. Main results. Around 17% of the participants were unable to learn to perform the motor tasks, which is close to the BCI illiteracy rate. This suggests that standard training protocols are suboptimal for skill teaching. No correlation was found between motor tasks and MI-BCI performance. However, spatial ability played an important role in MI-BCI performance. In addition, once the spatial ability covariable had been controlled for, using an ANCOVA, it appeared that participants who faced difficulty during the first experiment improved during the second while the others did not. Significance. These studies suggest that (1) standard MI-BCI training protocols are suboptimal for skill teaching, (2) spatial ability is confirmed as impacting on MI-BCI performance, and (3) when faced

  16. Net Analyte Signal Standard Additions Method for Simultaneous Determination of Sulfamethoxazole and Trimethoprim in Pharmaceutical Formulations and Biological Fluids

    OpenAIRE

    Givianrad, M. H.; Mohagheghian, M.

    2012-01-01

    The applicability of a novel net analyte signal standard addition method (NASSAM) to the resolution of overlapping spectra corresponding to sulfamethoxazole and trimethoprim was verified by UV-visible spectrophotometry. The results confirmed that the net analyte signal standard additions method with simultaneous addition of both analytes is suitable for the simultaneous determination of sulfamethoxazole and trimethoprim in aqueous media. Moreover, applying the net analyte signal standard a...

  17. Improving post-stroke dysphagia outcomes through a standardized and multidisciplinary protocol: an exploratory cohort study.

    Science.gov (United States)

    Gandolfi, Marialuisa; Smania, Nicola; Bisoffi, Giulia; Squaquara, Teresa; Zuccher, Paola; Mazzucco, Sara

    2014-12-01

    Stroke is a major cause of dysphagia. Few studies to date have reported on standardized multidisciplinary protocolized approaches to the management of post-stroke dysphagia. The aim of this retrospective cohort study was to evaluate the impact of a standardized multidisciplinary protocol on clinical outcomes in patients with post-stroke dysphagia. We performed retrospective chart reviews of patients with post-stroke dysphagia admitted to the neurological ward of Verona University Hospital from 2004 to 2008. Outcomes after usual treatment for dysphagia (T- group) were compared versus outcomes after treatment under a standardized diagnostic and rehabilitative multidisciplinary protocol (T+ group). Outcome measures were death, pneumonia on X-ray, need for respiratory support, and proportion of patients on tube feeding at discharge. Of the 378 patients admitted with stroke, 84 had dysphagia and were enrolled in the study. A significantly lower risk of in-hospital death (odds ratio [OR] 0.20 [0.53-0.78]), pneumonia (OR 0.33 [0.10-1.03]), need for respiratory support (OR 0.48 [0.14-1.66]), and tube feeding at discharge (OR 0.30 [0.09-0.91]) was recorded for the T+ group (N = 39) as compared to the T- group (N = 45). The adjusted OR showed no difference between the two groups for in-hospital death and tube feeding at discharge. Use of a standardized multidisciplinary protocolized approach to the management of post-stroke dysphagia may significantly reduce rates of aspiration pneumonia, in-hospital mortality, and tube feeding in dysphagic stroke survivors. Consistent with the study's exploratory purposes, our findings suggest that the multidisciplinary protocol applied in this study offers an effective model of management of post-stroke dysphagia.

  18. Standard protocol for conducting pre-operational environmental surveillance around nuclear facilities

    International Nuclear Information System (INIS)

    Hegde, A.G.; Verma, P.C.; Rajan, M.P.

    2009-02-01

    This document presents the standard procedures for evaluation of site-specific environmental transfer factors around NPP sites. The scope of this document is to provide a standard protocol to be followed for conducting pre-operational environmental surveillance around nuclear facilities. Such surveillance has been proposed to be carried out by university professionals under DAE-BRNS projects. This document contains a common methodology for the sampling, processing, measurement and analysis of elements/radionuclides, while also keeping site-specific requirements in place. (author)

  19. Standard protocol for conducting pre-operational environmental surveillance around nuclear facilities

    Energy Technology Data Exchange (ETDEWEB)

    Hegde, A G; Verma, P C; Rajan, M P [Health Safety and Environment Group, Bhabha Atomic Research Centre, Mumbai (India)

    2009-02-15

    This document presents the standard procedures for evaluation of site-specific environmental transfer factors around NPP sites. The scope of this document is to provide a standard protocol to be followed for conducting pre-operational environmental surveillance around nuclear facilities. Such surveillance has been proposed to be carried out by university professionals under DAE-BRNS projects. This document contains a common methodology for the sampling, processing, measurement and analysis of elements/radionuclides, while also keeping site-specific requirements in place. (author)

  20. Asynchronous transfer mode and Local Area Network emulation standards, protocols, and security implications

    OpenAIRE

    Kirwin, John P.

    1999-01-01

    A complex networking technology called Asynchronous Transfer Mode (ATM) and a networking protocol called Local Area Network Emulation (LANE) are being integrated into many naval networks without any security-driven naval configuration guidelines. No single publication is available that describes security issues of data delivery and signaling relating to the transition of Ethernet to LANE and ATM. The thesis' focus is to provide: (1) an overview and security analysis of standardized protocols ...

  1. Preparation of standard hair material and development of analytical methodology

    International Nuclear Information System (INIS)

    Gangadharan, S.; Ganapathi Iyer, S.; Ali, M.M.; Thantry, S.S.; Verma, R.; Arunachalam, J.; Walvekar, A.P.

    1992-01-01

    In 1976, Indian researchers suggested the possible use of hair as an indicator of environmental exposure and established, through a study of a country-wide student population and the general population of the metropolitan city of Bombay, that human scalp hair could indeed be an effective first-level monitor in a scheme of multilevel monitoring of environmental exposure to inorganic pollutants. It was in this context, and in view of the ready availability of large quantities of scalp hair subjected to minimal chemical treatment, that they proposed to participate in the preparation of a standard hair material. It was also recognized that measurements of trace element concentrations at very low levels require cross-validation by different analytical techniques, even within the same laboratory. The programme of work carried out since the first meeting of the CRP has been aimed at these two objectives: the preparation of a standard hair material and the development of analytical methodologies for the determination of elements and species of interest. 1 refs., 3 tabs

  2. Pelvic Muscle Rehabilitation: A Standardized Protocol for Pelvic Floor Dysfunction

    Directory of Open Access Journals (Sweden)

    Rodrigo Pedraza

    2014-01-01

    Introduction. Pelvic floor dysfunction syndromes present with voiding, sexual, and anorectal disturbances, which may be associated with one another, resulting in complex presentation. Thus, an integrated diagnosis and management approach may be required. Pelvic muscle rehabilitation (PMR) is a noninvasive modality involving cognitive reeducation, modification, and retraining of the pelvic floor and associated musculature. We describe our standardized PMR protocol for the management of pelvic floor dysfunction syndromes. Pelvic Muscle Rehabilitation Program. The diagnostic assessment includes electromyography and manometry analyzed in 4 phases: (1) initial baseline phase; (2) rapid contraction phase; (3) tonic contraction and endurance phase; and (4) late baseline phase. This evaluation is performed at the onset of every session. PMR management consists of 6 possible therapeutic modalities, employed depending on the diagnostic evaluation: (1) down-training; (2) accessory muscle isolation; (3) discrimination training; (4) muscle strengthening; (5) endurance training; and (6) electrical stimulation. Eight to ten sessions are performed at one-week intervals with integration of home exercises and lifestyle modifications. Conclusions. The PMR protocol offers a standardized approach to diagnose and manage pelvic floor dysfunction syndromes with potential advantages over traditional biofeedback, involving additional interventions and a continuous pelvic floor assessment with management modifications over the clinical course.

  3. Anthropometric protocols for the construction of new international fetal and newborn growth standards: the INTERGROWTH-21st Project.

    Science.gov (United States)

    Cheikh Ismail, L; Knight, H E; Bhutta, Z; Chumlea, W C

    2013-09-01

    The primary aim of the INTERGROWTH-21st Project is to construct new, prescriptive standards describing optimal fetal and preterm postnatal growth. The anthropometric measurements include the head circumference, recumbent length and weight of the infants, and the stature and weight of the parents. In such a large, international, multicentre project, it is critical that all study sites follow standardised protocols to ensure maximal validity of the growth and nutrition indicators used. This paper describes, in detail, the selection of anthropometric personnel, equipment, and measurement and calibration protocols used to construct the new standards. Implementing these protocols at each study site ensures that the anthropometric data are of the highest quality to construct the international standards. © 2013 Royal College of Obstetricians and Gynaecologists.

  4. Comparison of a new whole-body continuous-table-movement protocol versus a standard whole-body MR protocol for the assessment of multiple myeloma

    International Nuclear Information System (INIS)

    Weckbach, S.; Michaely, H.J.; Schoenberg, S.O.; Dinter, D.J.; Stemmer, A.

    2010-01-01

    To evaluate a whole body (WB) continuous-table-movement (CTM) MR protocol for the assessment of multiple myeloma (MM) in comparison to a step-by-step WB protocol. Eighteen patients with MM were examined at 1.5T using a WB CTM protocol (axial T2-w fs BLADE, T1-w GRE sequence) and a step-by-step WB protocol including coronal/sagittal T1-w SE and STIR sequences as reference. Protocol time was assessed. Image quality, artefacts, liver/spleen assessability, and the ability to depict bone marrow lesions less than or greater than 1 cm as well as diffuse infiltration and soft tissue lesions were rated. Potential changes in the Durie and Salmon Plus stage and the detectability of complications were assessed. Mean protocol time was 6:38 min (CTM) compared to 24:32 min (standard). Image quality was comparable. Artefacts were more prominent using the CTM protocol (P = 0.0039). Organ assessability was better using the CTM protocol (P < 0.001). Depiction of bone marrow and soft tissue lesions was identical without a staging shift. Vertebral fractures were not detected using the CTM protocol. The new protocol allows a higher patient throughput and facilitates the depiction of extramedullary lesions. However, as long as vertebral fractures are not detectable, the protocol cannot be safely used for clinical routine without the acquisition of an additional sagittal sequence. (orig.)

  5. Protocol for Usability Testing and Validation of the ISO Draft International Standard 19223 for Lung Ventilators

    Science.gov (United States)

    2017-01-01

    Background Clinicians, such as respiratory therapists and physicians, are often required to set up pieces of medical equipment that use inconsistent terminology. Current lung ventilator terminology that is used by different manufacturers contributes to the risk of usage errors, and in turn the risk of ventilator-associated lung injuries and other conditions. Human factors and communication issues are often associated with ventilator-related sentinel events, and inconsistent ventilator terminology compounds these issues. This paper describes our proposed protocol, which will be implemented at the University of Waterloo, Canada when this project is externally funded. Objective We propose to determine whether a standardized vocabulary improves the ease of use, safety, and utility as it relates to the usability of medical devices, compared to legacy medical devices from multiple manufacturers, which use different terms. Methods We hypothesize that usage errors by clinicians will be lower when standardization is consistently applied by all manufacturers. The proposed study will experimentally examine the impact of standardized nomenclature on performance declines in the use of an unfamiliar ventilator product in clinically relevant scenarios. Participants will be respiratory therapy practitioners and trainees, and we propose studying approximately 60 participants. Results The work reported here is in the proposal phase. Once the protocol is implemented, we will report the results in a follow-up paper. Conclusions The proposed study will help us better understand the effects of standardization on medical device usability. The study will also help identify any terms in the International Organization for Standardization (ISO) Draft International Standard (DIS) 19223 that may be associated with recurrent errors. Amendments to the standard will be proposed if recurrent errors are identified. This report contributes a protocol that can be used to assess the effect of

  6. Standard protocol for evaluation of environmental transfer factors around NPP sites

    International Nuclear Information System (INIS)

    Hegde, A.G.; Verma, P.C.; Rao, D.D.

    2009-01-01

    This document presents the standard procedures for evaluation of site-specific environmental transfer factors around NPP sites. The scope of this document is to provide a standard protocol to be followed for the evaluation of environmental transfer factors around NPP sites. Studies on transfer factors are being carried out at various NPP sites under DAE-BRNS projects for evaluation of site-specific transfer factors for radionuclides released from power plants. This document contains a common methodology for the sampling, processing, measurement and analysis of elements/radionuclides, while also keeping site-specific requirements in place. (author)

  7. Standardized protocol for artery-only fingertip replantation.

    Science.gov (United States)

    Buntic, Rudolf F; Brooks, Darrell

    2010-09-01

    Artery-only fingertip replantation can be reliable if low-resistance flow through the replant is maintained until venous outflow is restored naturally. Injuring the tip of the replant to promote ongoing bleeding augmented with anticoagulation usually accomplishes this; however, such management results in prolonged hospitalization. In this study, we analyzed the outcomes of artery-only fingertip replantation using a standardized postoperative protocol consisting of dextran-40, heparin, and leech therapy. Between 2001 and 2008, we performed 19 artery-only fingertip replants for 17 patients. All patients had the replanted nail plate removed and received intravenous dextran-40, heparin, and aspirin to promote fingertip bleeding and vascular outflow. Anticoagulation was titrated to promote a controlled bleed until physiologic venous outflow was restored by neovascularization. We used medicinal leeches and mechanical heparin scrubbing for acute decongestion. By postoperative day 6, bleeding was no longer promoted. We initiated fluorescent dye perfusion studies to assess circulatory competence and direct further anticoagulant intervention if necessary. The absence of bleeding associated with an initial rise followed by an appropriate fall in fluorescent dye concentration would trigger a weaning of anticoagulation. All of the 19 replants survived. The average length of hospital stay was 9 days (range, 7-17 d). Eleven patients received blood transfusions. The average transfusion was 1.8 units (range, 0-9 units). All patients were happy with the decision to replant, and the cosmetic result. A protocol that promotes temporary, controlled bleeding from the fingertip is protective of artery-only replants distal to the distal interphalangeal joint until physiologic venous outflow is restored. The protocol described is both safe and reliable. The patient should be informed that such replant attempts may result in the need for transfusions and extended hospital stays, factors that

  8. Standardization of Nanoparticle Characterization: Methods for Testing Properties, Stability, and Functionality of Edible Nanoparticles.

    Science.gov (United States)

    McClements, Jake; McClements, David Julian

    2016-06-10

    There has been a rapid increase in the fabrication of various kinds of edible nanoparticles for oral delivery of bioactive agents, such as those constructed from proteins, carbohydrates, lipids, and/or minerals. It is currently difficult to compare the relative advantages and disadvantages of different kinds of nanoparticle-based delivery systems because researchers use different analytical instruments and protocols to characterize them. In this paper, we briefly review the various analytical methods available for characterizing the properties of edible nanoparticles, such as composition, morphology, size, charge, physical state, and stability. This information is then used to propose a number of standardized protocols for characterizing nanoparticle properties, for evaluating their stability to environmental stresses, and for predicting their biological fate. Implementation of these protocols would facilitate comparison of the performance of nanoparticles under standardized conditions, which would facilitate the rational selection of nanoparticle-based delivery systems for different applications in the food, health care, and pharmaceutical industries.

  9. A critical analysis of a locally agreed protocol for clinical practice

    International Nuclear Information System (INIS)

    Owen, A.; Hogg, P.; Nightingale, J.

    2004-01-01

    Within the traditional scope of radiographic practice (including advanced practice) there is a need to demonstrate effective patient care and management. Such practice should be set within a context of appropriate evidence and should also reflect peer practice. In order to achieve such practice the use of protocols is encouraged. Effective protocols can maximise care and management by minimising inter- and intra-professional variation; they can also allow for detailed procedural records to be kept in case of legal claims. However, whilst literature exists to encourage the use of protocols, there is little published material available to indicate how to create, manage and archive them. This article uses an analytical approach to propose a suitable method for protocol creation and archival; it also offers suggestions on the scope and content of a protocol. To achieve this, an existing clinical protocol for radiographer reporting of barium enemas is analysed to draw out the general issues. Proposals for protocol creation, management, and archival were identified. The clinical practice described or inferred in the protocol should be drawn from evidence; such evidence could include peer-reviewed material, national standards and peer practice. The protocol should include an explanation of how to proceed when the radiographers reach the limit of their ability. It should refer to the initial training required to undertake the clinical duties as well as the ongoing continual professional updating required to maintain competence. Audit of practice should be indicated, including the preferred audit methodology, and associated with this should be a clear statement about standards and what to do if standards are not adequately met. Protocols should be archived, in a paper-based form, for lengthy periods in case of legal claims. The archived protocol should record the dates during which it was in clinical use.

  10. Microsystems for liquid-liquid extraction of radionuclides in the analytical protocols

    International Nuclear Information System (INIS)

    Helle, Gwendolyne

    2014-01-01

    Radiochemical analyses are necessary at numerous steps of nuclear waste management and of environmental monitoring. An analytical protocol generally includes several chemical separation steps, which are lengthy, manual and complicated to implement because of their confinement in glove boxes and because of the hostile chemical and radiochemical media. There is therefore great value in proposing innovative and robust solutions to automate these steps and to reduce the volumes of radioactive and chemical wastes generated at the end of the analytical cycle. One solution consists in the miniaturization of the analyses through the use of lab-on-chip devices. The objective of this thesis work was to propose a rational approach to the design of separative microsystems for the liquid-liquid extraction of radionuclides. To achieve this, the hydrodynamic behavior as well as the extraction performance were investigated in a single chip for three different chemical systems: Eu(III)-HNO3/DMDBTDMA, Eu(III)-AcO(H,Na)-HNO3/HDEHP and U(VI)-HCl/Aliquat336. A methodology was developed for the implementation of liquid-liquid extraction in microsystems for each chemical system. The influence of various geometric parameters such as channel length or specific interfacial area was studied, and the comparison of the liquid-liquid extraction performance highlighted the influence of the ratio of phase viscosities on the flows. Thanks to the modeling of both hydrodynamics and mass transfer in the microsystem, criteria related to the physical and kinetic properties of the chemical systems were distinguished to allow a rational design of tailor-made chips. Finally, several examples of the implementation of liquid-liquid extraction in microsystems are described for analytical applications in the nuclear field: U/Co separation by Aliquat336, Eu/Sm separation by DMDBTDMA, or even the coupling between a liquid-liquid extraction chip and the system of

  11. Improving treatment times for patients with in-hospital stroke using a standardized protocol.

    Science.gov (United States)

    Koge, Junpei; Matsumoto, Shoji; Nakahara, Ichiro; Ishii, Akira; Hatano, Taketo; Sadamasa, Nobutake; Kai, Yasutoshi; Ando, Mitsushige; Saka, Makoto; Chihara, Hideo; Takita, Wataru; Tokunaga, Keisuke; Kamata, Takahiko; Nishi, Hidehisa; Hashimoto, Tetsuya; Tsujimoto, Atsushi; Kira, Jun-Ichi; Nagata, Izumi

    2017-10-15

    Previous reports have shown significant delays in the treatment of in-hospital stroke (IHS). We developed and implemented our IHS alert protocol in April 2014. We aimed to determine the influence of implementation of our IHS alert protocol. Our implementation process comprised the following four main steps: IHS protocol development, workshops for hospital staff to learn about the protocol, preparation of standardized IHS treatment kits, and obtaining feedback in a monthly hospital staff conference. We retrospectively compared protocol metrics and clinical outcomes of patients with IHS treated with intravenous thrombolysis and/or endovascular therapy before (January 2008-March 2014) and after implementation (April 2014-December 2016). Fifty-five patients were included (pre, 25; post, 30). After the implementation, significant reductions occurred in the median time from stroke recognition to evaluation by a neurologist (30 vs. 13.5 min, p [...] vs. 26.5 min, p [...] vs. 16 min, p = 0.02). The median time from first neuroimaging to endovascular therapy showed a tendency to decrease (75 vs. 53 min, p = 0.08). There were no differences in favorable outcomes (modified Rankin scale score of 0-2) at discharge or in the incidence of symptomatic intracranial hemorrhage between the two periods. Our IHS alert protocol implementation saved time in treating patients with IHS without compromising safety. Copyright © 2017 Elsevier B.V. All rights reserved.

  12. Evaluation of Dogs with Border Collie Collapse, Including Response to Two Standardized Strenuous Exercise Protocols.

    Science.gov (United States)

    Taylor, Susan; Shmon, Cindy; Su, Lillian; Epp, Tasha; Minor, Katie; Mickelson, James; Patterson, Edward; Shelton, G Diane

    2016-01-01

    Clinical and metabolic variables were evaluated in 13 dogs with border collie collapse (BCC) before, during, and following completion of standardized strenuous exercise protocols. Six dogs participated in a ball-retrieving protocol, and seven dogs participated in a sheep-herding protocol. Findings were compared with 16 normal border collies participating in the same exercise protocols (11 retrieving, five herding). Twelve dogs with BCC developed abnormal mentation and/or an abnormal gait during evaluation. All dogs had post-exercise elevations in rectal temperature, pulse rate, arterial blood pH, PaO2, and lactate, and decreased PaCO2 and bicarbonate, as expected with strenuous exercise, but there were no significant differences between BCC dogs and normal dogs. Electrocardiography demonstrated sinus tachycardia in all dogs following exercise. Needle electromyography was normal, and evaluation of muscle biopsy cryosections using a standard panel of histochemical stains and reactions did not reveal a reason for collapse in 10 dogs with BCC in which these tests were performed. Genetic testing excluded the dynamin-1 related exercise-induced collapse mutation and the V547A malignant hyperthermia mutation as the cause of BCC. Common reasons for exercise intolerance were eliminated. Although a genetic basis is suspected, the cause of collapse in BCC was not determined.

  13. Standards-Based Wireless Sensor Networking Protocols for Spaceflight Applications

    Science.gov (United States)

    Wagner, Raymond S.

    2010-01-01

    Wireless sensor networks (WSNs) have the capacity to revolutionize data gathering in both spaceflight and terrestrial applications. WSNs provide a huge advantage over traditional, wired instrumentation since they do not require wiring trunks to connect sensors to a central hub. This allows for easy sensor installation in hard to reach locations, easy expansion of the number of sensors or sensing modalities, and reduction in both system cost and weight. While this technology offers unprecedented flexibility and adaptability, implementing it in practice is not without its difficulties. Recent advances in standards-based WSN protocols for industrial control applications have come a long way to solving many of the challenges facing practical WSN deployments. In this paper, we will overview two of the more promising candidates - WirelessHART from the HART Communication Foundation and ISA100.11a from the International Society of Automation - and present the architecture for a new standards-based sensor node for networking and applications research.

  14. Nanometrology, Standardization and Regulation of Nanomaterials in Brazil: A Proposal for an Analytical-Prospective Model

    Directory of Open Access Journals (Sweden)

    Ana Rusmerg Giménez Ledesma

    2013-05-01

    The main objective of this paper is to propose an analytical-prospective model as a tool to support decision-making processes concerning metrology, standardization and regulation of nanomaterials in Brazil, based on international references and ongoing initiatives around the world. In the context of nanotechnology development in Brazil, the motivation for carrying out this research was to identify the potential benefits of metrology, standardization and regulation of nanomaterials production, from the perspective of future adoption of the model by the main stakeholders in the development of these areas in Brazil. The main results can be summarized as follows: (i) an overview of international studies on metrology, standardization and regulation of nanomaterials, and nanoparticles in particular; (ii) the analytical-prospective model; and (iii) the survey questionnaire and the roadmapping tool for metrology, standardization and regulation of nanomaterials in Brazil, based on international references and ongoing initiatives around the world.

  15. The evaluation of an analytical protocol for the determination of substances in waste for hazard classification.

    Science.gov (United States)

    Hennebert, Pierre; Papin, Arnaud; Padox, Jean-Marie; Hasebrouck, Benoît

    2013-07-01

    The classification of waste as hazardous could soon be assessed in Europe largely using the hazard properties of its constituents, according to the Classification, Labelling and Packaging (CLP) regulation. Comprehensive knowledge of the component constituents of a given waste will therefore be necessary. An analytical protocol for determining waste composition is proposed, which includes using inductively coupled plasma (ICP) screening methods to identify major elements and gas chromatography/mass spectrometry (GC-MS) screening techniques to measure organic compounds. The method includes a gross or indicator measure of 'pools' of higher molecular weight organic substances that are taken to be less bioactive and less hazardous, and of unresolved 'mass' during the chromatography of volatile and semi-volatile compounds. The concentrations of some elements and specific compounds that are linked to specific hazard properties and are subject to specific regulation (examples include: heavy metals, chromium(VI), cyanides, organo-halogens, and PCBs) are determined by classical quantitative analysis. To check the consistency of the analysis, the sum of the concentrations (including unresolved 'pools') should give a mass balance between 90% and 110%. Thirty-two laboratory samples comprising different industrial wastes (liquids and solids) were tested by two routine service laboratories, to give circa 7000 parameter results. Despite discrepancies in some parameters, a satisfactory sum of estimated or measured concentrations (analytical balance) of 90% was reached for 20 samples (63% of the overall total) during this first test exercise, with identified reasons for most of the unsatisfactory results. Regular use of this protocol (which is now included in French legislation) has enabled service laboratories to reach a 90% mass balance for nearly all the solid samples tested, and for most liquid samples (difficulties were caused in some samples from polymers in solution and
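
    The consistency check described above amounts to summing the concentrations of all quantified constituents and unresolved 'pools' and verifying that the total falls between 90% and 110% of the sample mass. The sketch below illustrates that analytical balance check; the constituent categories and values are invented.

        # Minimal sketch of the analytical (mass) balance check described above.
        # Concentrations are in g/kg of waste; the values are invented for illustration.
        constituents = {
            "mineral matrix (ICP screening)": 610.0,
            "heavy metals (quantitative)": 12.0,
            "chlorides / cyanides / Cr(VI)": 8.5,
            "identified organics (GC-MS)": 95.0,
            "unresolved organic 'pool'": 230.0,
            "water (drying loss)": 38.0,
        }

        balance_pct = sum(constituents.values()) / 1000.0 * 100   # % of sample mass accounted for
        consistent = 90.0 <= balance_pct <= 110.0

        print(f"analytical balance: {balance_pct:.1f}% of sample mass")
        print("within the 90-110% acceptance window" if consistent else "outside the acceptance window")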

  16. Working towards accreditation by the International Standards Organization 15189 Standard: how to validate an in-house developed method an example of lead determination in whole blood by electrothermal atomic absorption spectrometry.

    Science.gov (United States)

    Garcia Hejl, Carine; Ramirez, Jose Manuel; Vest, Philippe; Chianea, Denis; Renard, Christophe

    2014-09-01

    Laboratories working towards accreditation by the International Standards Organization (ISO) 15189 standard are required to demonstrate the validity of their analytical methods. The different guidelines set by various accreditation organizations make it difficult to provide objective evidence that an in-house method is fit for its intended purpose. Moreover, the required performance characteristics to be tested and the acceptance criteria are not always detailed. The laboratory must therefore choose the most suitable validation protocol and set the acceptance criteria. We propose a validation protocol to evaluate the performance of an in-house method. As an example, we validated the process for the detection and quantification of lead in whole blood by electrothermal atomic absorption spectrometry. The fundamental parameters tested were selectivity, calibration model, precision, accuracy (and uncertainty of measurement), contamination, stability of the sample, reference interval, and analytical interference. We have developed a protocol that has been applied successfully to quantify lead in whole blood by electrothermal atomic absorption spectrometry (ETAAS). In particular, our method is selective, linear, accurate, and precise, making it suitable for use in routine diagnostics.
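
    As an illustration of the calibration-model and detection-capability checks that such a validation protocol typically includes, the sketch below fits a straight line to hypothetical ETAAS calibration standards and estimates a limit of detection as 3 × SD of replicate blank readings divided by the slope. All values and the acceptance criterion shown are placeholders, not the validated figures of the cited method.

        import statistics

        # Hypothetical ETAAS calibration for lead in whole blood: absorbance vs. concentration.
        # Values and acceptance criteria are invented; this is not the validated method itself.
        conc = [0.0, 25.0, 50.0, 100.0, 200.0]            # µg/L standards
        absorbance = [0.002, 0.051, 0.099, 0.197, 0.401]
        blanks = [0.001, 0.003, 0.002, 0.002, 0.001, 0.003]   # replicate blank readings

        mean_x, mean_y = statistics.mean(conc), statistics.mean(absorbance)
        slope = sum((x - mean_x) * (y - mean_y) for x, y in zip(conc, absorbance)) / \
                sum((x - mean_x) ** 2 for x in conc)
        intercept = mean_y - slope * mean_x

        # correlation coefficient as a simple linearity indicator, LOD from blank spread
        r = slope * statistics.stdev(conc) / statistics.stdev(absorbance)
        lod = 3 * statistics.stdev(blanks) / slope        # limit of detection, µg/L

        print(f"slope = {slope:.5f} AU per µg/L, intercept = {intercept:.4f} AU")
        print(f"r = {r:.4f} (placeholder acceptance, e.g., r >= 0.995)")
        print(f"estimated LOD = {lod:.1f} µg/L")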

  17. A Standardized Shift Handover Protocol: Improving Nurses’ Safe Practice in Intensive Care Units

    Directory of Open Access Journals (Sweden)

    Javad Malekzadeh

    2013-08-01

    Introduction: For maintaining the continuity of care and improving the quality of care, effective inter-shift information communication is necessary. Any handover error can endanger patient safety. Despite the importance of shift handover, there is no standard handover protocol in our healthcare settings. Methods: In this one-group pretest-posttest quasi-experimental study conducted in the spring and summer of 2011, we recruited a convenience sample of 56 ICU nurses. The Nurses' Safe Practice Evaluation Checklist was used for data collection. The Content Validity Index and the inter-rater correlation coefficient of the checklist were 0.92 and 0.89, respectively. We employed SPSS 11.5 and the McNemar and paired-samples t tests for data analysis. Results: Study findings revealed that nurses' mean score on the Safe Practice Evaluation Checklist increased significantly from 11.6 (2.7) to 17.0 (1.8) (P < 0.001). Conclusion: Using a standard handover protocol for communicating patients' needs and information improves nurses' safe practice in the area of basic nursing care.

  18. Protocols for the analytical characterization of therapeutic monoclonal antibodies. II - Enzymatic and chemical sample preparation.

    Science.gov (United States)

    Bobaly, Balazs; D'Atri, Valentina; Goyon, Alexandre; Colas, Olivier; Beck, Alain; Fekete, Szabolcs; Guillarme, Davy

    2017-08-15

    The analytical characterization of therapeutic monoclonal antibodies and related proteins usually incorporates various sample preparation methodologies. Indeed, quantitative and qualitative information can be enhanced by simplifying the sample, thanks to the removal of sources of heterogeneity (e.g. N-glycans) and/or by decreasing the molecular size of the tested protein by enzymatic or chemical fragmentation. These approaches make the sample more suitable for chromatographic and mass spectrometric analysis. Structural elucidation and quality control (QC) analysis of biopharmaceutics are usually performed at intact, subunit and peptide levels. In this paper, general sample preparation approaches used to attain peptide, subunit and glycan level analysis are overviewed. Protocols are described to perform tryptic proteolysis, IdeS and papain digestion, reduction as well as deglycosylation by PNGase F and EndoS2 enzymes. Both historical and modern sample preparation methods were compared and evaluated using rituximab and trastuzumab, two reference therapeutic mAb products approved by Food and Drug Administration (FDA) and European Medicines Agency (EMA). The described protocols may help analysts to develop sample preparation methods in the field of therapeutic protein analysis. Copyright © 2017 Elsevier B.V. All rights reserved.

  19. Net Analyte Signal Standard Additions Method for Simultaneous Determination of Sulfamethoxazole and Trimethoprim in Pharmaceutical Formulations and Biological Fluids

    Directory of Open Access Journals (Sweden)

    M. H. Givianrad

    2012-01-01

    Full Text Available The applicability of a novel net analyte signal standard addition method (NASSAM to the resolving of overlapping spectra corresponding to the sulfamethoxazole and trimethoprim was verified by UV-visible spectrophotometry. The results confirmed that the net analyte signal standard additions method with simultaneous addition of both analytes is suitable for the simultaneous determination of sulfamethoxazole and trimethoprim in aqueous media. Moreover, applying the net analyte signal standard additions method revealed that the two drugs could be determined simultaneously with the concentration ratios of sulfamethoxazole to trimethoprim varying from 1:35 to 60:1 in the mixed samples. In addition, the limits of detections were 0.26 and 0.23 μmol L-1 for sulfamethoxazole and trimethoprim, respectively. The proposed method has been effectively applied to the simultaneous determination of sulfamethoxazole and trimethoprim in some synthetic, pharmaceutical formulation and biological fluid samples.
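
    For context, the classical single-analyte standard-additions calculation (fit the signal against the added concentration and extrapolate to the x-intercept) is sketched below. The NASSAM used in the study additionally computes the net analyte signal to resolve the overlapping spectra of the two drugs, which is not reproduced here; all numbers are hypothetical.

        # Classical standard-additions estimate: fit signal vs. added concentration and
        # extrapolate to zero signal; the unknown concentration is intercept/slope.
        # This is the textbook single-analyte case, not the full net-analyte-signal variant.
        import numpy as np

        added_umol_L = np.array([0.0, 5.0, 10.0, 15.0, 20.0])          # hypothetical spiked amounts
        absorbance   = np.array([0.210, 0.305, 0.398, 0.502, 0.596])   # hypothetical readings

        slope, intercept = np.polyfit(added_umol_L, absorbance, 1)
        unknown = intercept / slope
        print(f"estimated sample concentration ~ {unknown:.2f} umol/L")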

  20. Report on the results of the FY1999 standardization for new standard interface protocol for electrical measuring use; 1999 nendo shinki sangyo ikusei sokkogata kokusai hyojun kaihatsu jigyo seika hokokusho. Denki keisokukiyo shinhyojun interface purotokoru no hyojunka

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    2000-03-01

    The purpose of this project is to create a new protocol for electric measuring instruments that use a built-in the Universal Serial Bus (USB) interface which is no incorporated in PCs. This new interface will replace the existing GPIB interface (IEEE-488), which is currently used in electric measuring instruments as standard interface. Our goal is to create a protocol that will provide the same data rate as GPIB and better connectivity. To evaluate our new protocol, we create virtual instruments using evaluation boards for USB chips, as well as software, and performed transfer tests to isolate and solve problems found in the tests. We will make the protocol and the software available for general use, and register the protocol as a standard to the USB Developer's Forum, a body that manages USB standards, to make it as a de facto standard. Ultimately we are aiming at making it an international standard. (NEDO)

  1. Use of a Microprocessor to Implement an ADCCP Protocol (Federal Standard 1003).

    Science.gov (United States)

    1980-07-01

    results of other studies, to evaluate the operational and economic impact of incorporating various options in Federal Standard 1003. The effort...the LSI interface and the microprocessor; the LSI chip deposits bytes in its buffer as the producer, and the MPU reads this data as the consumer...on the interface between the MPU and the LSI protocol chip. This requires two main processes to be running at the same time--transmit and receive. The

  2. The reasonable woman standard: a meta-analytic review of gender differences in perceptions of sexual harassment.

    Science.gov (United States)

    Blumenthal, J A

    1998-02-01

    Courts and legislatures have begun to develop the "reasonable woman standard" (RWS) as a criterion for deciding sexual harassment trials. This standard rests on assumptions of a "wide divergence" between the perceptions of men and women when viewing social-sexual behavior that may be considered harassing. Narrative reviews of the literature on such perceptions have suggested that these assumptions are only minimally supported. To test these assumptions quantitatively, a meta-analytic review was conducted that assessed the size, stability, and moderators of gender differences in perceptions of sexual harassment. The effect of the actor's status relative to the target also was evaluated meta-analytically, as one alternative to the importance of gender effects. Results supported the claims of narrative reviews for a relatively small gender effect, and draw attention to the status effect. In discussing legal implications of the present findings, earlier claims are echoed suggesting caution in establishing the reasonable woman standard, and one alternative to the RWS, the "reasonable victim standard," is discussed.

  3. Quantification of theobromine and caffeine in saliva, plasma and urine via liquid chromatography-tandem mass spectrometry: a single analytical protocol applicable to cocoa intervention studies.

    Science.gov (United States)

    Ptolemy, Adam S; Tzioumis, Emma; Thomke, Arjun; Rifai, Sami; Kellogg, Mark

    2010-02-01

    Targeted analyses of clinically relevant metabolites in human biofluids often require extensive sample preparation (e.g., desalting, protein removal and/or preconcentration) prior to quantitation. In this report, a single ultra-centrifugation based sample pretreatment combined with a designed liquid chromatography-tandem mass spectrometry (LC-MS/MS) protocol provides selective quantification of 3,7-dimethylxanthine (theobromine) and 1,3,7-trimethylxanthine (caffeine) in human saliva, plasma and urine samples. The optimized chromatography permitted elution of both analytes within 1.3 min of the applied gradient. Positive-mode electrospray ionization and a triple quadrupole MS/MS instrument operated in multiple reaction monitoring mode were used for detection. (13)C(3) isotopically labeled caffeine was included as an internal standard to improve accuracy and precision. Implementing a 20-fold dilution of the isolated low MW biofluid fraction prior to injection effectively minimized the deleterious contributions of all three matrices to quantitation. The assay was linear over a 160-fold concentration range from 2.5 to 400 micromol L(-1) for both theobromine (average R(2) 0.9968) and caffeine (average R(2) 0.9997). Analyte peak area variations for 2.5 micromol L(-1) caffeine and theobromine in saliva, plasma and urine ranged from 5 and 10% (intra-day, N=10) to 9 and 13% (inter-day, N=25), respectively. The intra- and inter-day precisions of theobromine and caffeine elution times were within 3%, and accuracies for theobromine ranged from 114 to 118% and 99 to 105% at concentration levels of 10 and 300 micromol L(-1). This validated protocol also permitted the relative saliva, plasma and urine distribution of both theobromine and caffeine to be quantified following a cocoa intervention. 2009 Elsevier B.V. All rights reserved.
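
    Quantitation with an isotopically labeled internal standard, as described above, typically regresses the analyte-to-internal-standard response ratio against concentration and back-calculates unknowns from that line. A minimal sketch follows; the peak areas, internal-standard response and sample ratio are invented for illustration.

        # Sketch of internal-standard calibration: regress analyte/IS peak-area ratio on
        # concentration, then back-calculate an unknown. All values are hypothetical.
        import numpy as np

        conc_umol_L  = np.array([2.5, 10.0, 50.0, 100.0, 200.0, 400.0])
        analyte_area = np.array([1.1e4, 4.3e4, 2.2e5, 4.4e5, 8.7e5, 1.75e6])
        istd_area    = np.full_like(analyte_area, 5.0e5)   # constant 13C3-labelled IS response

        ratio = analyte_area / istd_area
        slope, intercept = np.polyfit(conc_umol_L, ratio, 1)

        unknown_ratio = 0.31                                 # hypothetical sample measurement
        unknown_conc = (unknown_ratio - intercept) / slope
        print(f"back-calculated concentration ~ {unknown_conc:.1f} umol/L")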

  4. Standardizing serum 25-hydroxyvitamin D data from four Nordic population samples using the Vitamin D Standardization Program protocols: Shedding new light on vitamin D status in Nordic individuals

    DEFF Research Database (Denmark)

    Cashman, Kevin D; Dowling, Kirsten G; Škrabáková, Zuzana

    2015-01-01

    for the European Union are of variable quality making it difficult to estimate the prevalence of vitamin D deficiency across member states. As a consequence of the widespread, method-related differences in measurements of serum 25(OH)D concentrations, the Vitamin D Standardization Program (VDSP) developed...... protocols for standardizing existing serum 25(OH)D data from national surveys around the world. The objective of the present work was to apply the VDSP protocols to existing serum 25(OH)D data from a Danish, a Norwegian, and a Finnish population-based health survey and from a Danish randomized controlled...

  5. The need for LWR metrology standardization: the imec roughness protocol

    Science.gov (United States)

    Lorusso, Gian Francesco; Sutani, Takumichi; Rutigliani, Vito; van Roey, Frieda; Moussa, Alain; Charley, Anne-Laure; Mack, Chris; Naulleau, Patrick; Constantoudis, Vassilios; Ikota, Masami; Ishimoto, Toru; Koshihara, Shunsuke

    2018-03-01

    As semiconductor technology keeps moving forward, undeterred by the many challenges ahead, one specific deliverable is capturing the attention of many experts in the field: Line Width Roughness (LWR) specifications are expected to be less than 2nm in the near term, and to drop below 1nm in just a few years. This is a daunting challenge and engineers throughout the industry are trying to meet these targets using every means at their disposal. However, although current efforts are surely admirable, we believe they are not enough. The fact is that a specification has a meaning only if there is an agreed methodology to verify if the criterion is met or not. Such a standardization is critical in any field of science and technology and the question that we need to ask ourselves today is whether we have a standardized LWR metrology or not. In other words, if a single reference sample were provided, would everyone measuring it get reasonably comparable results? We came to realize that this is not the case and that the observed spread in the results throughout the industry is quite large. In our opinion, this makes the comparison of LWR data among institutions, or to a specification, very difficult. In this paper, we report the spread of measured LWR data across the semiconductor industry. We investigate the impact of image acquisition, measurement algorithm, and frequency analysis parameters on LWR metrology. We review critically some of the International Technology Roadmap for Semiconductors (ITRS) metrology guidelines (such as measurement box length larger than 2μm and the need to correct for SEM noise). We compare the SEM roughness results to AFM measurements. Finally, we propose a standardized LWR measurement protocol - the imec Roughness Protocol (iRP) - intended to ensure that every time LWR measurements are compared (from various sources or to specifications), the comparison is sensible and sound. We deeply believe that the industry is at a point where it is
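
    One of the points raised above, correcting LWR for SEM noise, is often handled by subtracting an estimated noise variance from the measured roughness variance. The sketch below shows that generic variance-subtraction idea; it is only an illustration of the principle, not the imec Roughness Protocol itself, and the numbers are invented.

        # Generic noise-subtraction sketch for line-width roughness (LWR):
        # unbiased variance = measured variance - noise variance (clipped at zero).
        # Illustrates the idea of SEM-noise correction, not the imec Roughness Protocol.
        import math

        measured_lwr_3sigma_nm = 2.4    # hypothetical biased LWR (3-sigma)
        noise_3sigma_nm = 1.1           # hypothetical noise floor, e.g. from repeated scans

        unbiased_var = max((measured_lwr_3sigma_nm / 3) ** 2 - (noise_3sigma_nm / 3) ** 2, 0.0)
        unbiased_lwr_3sigma_nm = 3 * math.sqrt(unbiased_var)
        print(f"noise-corrected LWR ~ {unbiased_lwr_3sigma_nm:.2f} nm (3-sigma)")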

  6. Standard Operational Protocols in professional nursing practice: use, weaknesses and potentialities.

    Science.gov (United States)

    Sales, Camila Balsero; Bernardes, Andrea; Gabriel, Carmen Silvia; Brito, Maria de Fátima Paiva; Moura, André Almeida de; Zanetti, Ariane Cristina Barboza

    2018-01-01

    to evaluate the use of Standard Operational Protocols (SOPs) in the professional practice of the nursing team based on the theoretical framework of Donabedian, as well as to identify the weaknesses and potentialities arising from their implementation. Evaluative research with a quantitative approach, performed with nursing professionals working in the Health Units of a city in São Paulo, composed of two stages: document analysis and subsequent application of a questionnaire to the nursing professionals. A total of 247 nursing professionals participated and reported changes in the way the interventions were performed. The main weaknesses were the small number of professionals, inadequate physical structure and lack of materials. Among the potentialities were the standardization of materials and the concern of managers and professionals with patient safety. The reassessment of SOPs is necessary, as well as the adoption of a strategy of permanent education of professionals aiming at improving the quality of care provided.

  7. Defining standardized protocols for determining the efficacy of a postmilking teat disinfectant following experimental exposure of teats to mastitis pathogens.

    Science.gov (United States)

    Schukken, Y H; Rauch, B J; Morelli, J

    2013-04-01

    The objective of this paper was to define standardized protocols for determining the efficacy of a postmilking teat disinfectant following experimental exposure of teats to both Staphylococcus aureus and Streptococcus agalactiae. The standardized protocols describe the selection of cows and herds and define the critical points in performing experimental exposure, performing bacterial culture, evaluating the culture results, and finally performing statistical analyses and reporting of the results. The protocols define both negative control and positive control trials. For negative control trials, the protocol states that an efficacy of reducing new intramammary infections (IMI) of at least 40% is required for a teat disinfectant to be considered effective. For positive control trials, noninferiority to a control disinfectant with a published efficacy of reducing new IMI of at least 70% is required. Sample sizes for both negative and positive control trials are calculated. Positive control trials are expected to require a large trial size. Statistical analysis methods are defined and, in the proposed methods, the rate of IMI may be analyzed using generalized linear mixed models. The efficacy of the test product can be evaluated while controlling for important covariates and confounders in the trial. Finally, standards for reporting are defined and reporting considerations are discussed. The use of the defined protocol is shown through presentation of the results of a recent trial of a test product against a negative control. Copyright © 2013 American Dairy Science Association. Published by Elsevier Inc. All rights reserved.

  8. Difficulties in controlling mobilization pain using a standardized patient-controlled analgesia protocol in burns.

    Science.gov (United States)

    Nilsson, Andreas; Kalman, Sigga; Sonesson, Lena Karin; Arvidsson, Anders; Sjöberg, Folke

    2011-01-01

    The aim of this study was to evaluate pain relief for patients with burns during rest and mobilization with morphine according to a standard protocol for patient-controlled analgesia (PCA). Eighteen patients with a mean (SD) burned TBSA% of 26 (20) were studied for 10 days. Using a numeric rating scale (NRS, 0 = no pain and 10 = unbearable pain), patients were asked to estimate their acceptable and worst experienced pain by specifying a number on a scale and at what point they would like additional analgesics. Patients were allowed free access to morphine with a PCA pump device. Bolus doses were set according to age ((100 - age)/24 = bolus dose in mg), with a 6-minute lockout time. Degrees of pain, morphine requirements, doses delivered and demanded, oral intake of food, and antiemetics given were used as endpoints. Acceptable pain (mean [SD]) was estimated to be 3.8 (1.3) on the NRS, and additional treatment was considered necessary at scores of 4.3 (1.6) or more. NRS at rest was 2.7 (2.2) and during mobilization 4.7 (2.6). Required mean morphine per day was 81 (15) mg, and the number of doses requested increased during the first 6 days after the burn. The authors found no correlation between the dose of morphine required and any other variables. Background pain can be controlled adequately with a standard PCA protocol. During mobilization, the pain experienced was too intense, even though the already high doses of morphine were increased. The present protocol must be refined further to provide analgesia adequate to cover mobilization as well.
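
    The age-based bolus rule quoted above, (100 - age)/24 mg with a 6-minute lockout, is simple arithmetic; the short sketch below only makes the dosing rule concrete as reported in the abstract and is not dosing advice.

        # Sketch of the age-based PCA bolus rule quoted in the study:
        # bolus (mg) = (100 - age) / 24, with a 6-minute lockout between doses.
        def pca_bolus_mg(age_years: int) -> float:
            return (100 - age_years) / 24.0

        for age in (20, 40, 60):
            print(f"age {age}: bolus ~ {pca_bolus_mg(age):.1f} mg, lockout 6 min")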

  9. Standard Operational Protocols in professional nursing practice: use, weaknesses and potentialities

    Directory of Open Access Journals (Sweden)

    Camila Balsero Sales

    Full Text Available ABSTRACT Objective: to evaluate the use of Standard Operational Protocols (SOPs) in the professional practice of the nursing team based on the theoretical framework of Donabedian, as well as to identify the weaknesses and potentialities arising from their implementation. Method: Evaluative research with a quantitative approach, performed with nursing professionals working in the Health Units of a city in São Paulo, composed of two stages: document analysis and subsequent application of a questionnaire to the nursing professionals. Results: A total of 247 nursing professionals participated and reported changes in the way the interventions were performed. The main weaknesses were the small number of professionals, inadequate physical structure and lack of materials. Among the potentialities were the standardization of materials and the concern of managers and professionals with patient safety. Conclusion: The reassessment of SOPs is necessary, as well as the adoption of a strategy of permanent education of professionals aiming at improving the quality of care provided.

  10. Determination of trace impurities in uranium-transition metal alloy fuels by ICP-MS using extended common analyte internal standardization (ECAIS) technique

    International Nuclear Information System (INIS)

    Saha, Abhijit; Deb, S.B.; Nagar, B.K.; Saxena, M.K.

    2015-01-01

    An analytical methodology was developed for the determination of eight trace impurities, viz. Al, B, Cd, Co, Cu, Mg, Mn and Ni, in three different uranium-transition metal alloy fuels (U-Me; Me = Ti, Zr and Mo) employing inductively coupled plasma mass spectrometry (ICP-MS). The well-known common analyte internal standardization (CAIS) chemometric technique was modified and then employed to minimize and account for the matrix effect on analyte intensity. Standard addition of the analytes to pure synthetic U-Me sample solutions, and their subsequent recovery of ≥94% in the ICP-MS measurements, validated the proposed methodology. One real sample of each of these alloys was analyzed by the developed analytical methodology and the %RSD observed was in the range of 5-8%. The method detection limits were found to be within 4-10 μg L-1. (author)
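
    The ≥94% spike-recovery criterion used to validate the synthetic samples is a straightforward calculation; a hedged sketch follows, with invented spike levels and measured values (only the 94% threshold is taken from the abstract).

        # Spike-recovery check: recovery (%) = 100 * (spiked result - unspiked result) / added.
        # Element values are hypothetical illustration numbers, not from the cited work.
        def recovery_percent(measured_spiked, measured_unspiked, added):
            return 100.0 * (measured_spiked - measured_unspiked) / added

        impurities = {
            "Al": (52.0, 3.0, 50.0),   # (spiked result, unspiked result, amount added), ug/L
            "B":  (48.5, 1.0, 50.0),
            "Cd": (49.2, 0.5, 50.0),
        }
        for element, (spiked, blank, added) in impurities.items():
            r = recovery_percent(spiked, blank, added)
            print(f"{element}: recovery {r:.0f} % -> {'OK' if r >= 94 else 'check matrix effect'}")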

  11. Standardizing data exchange for clinical research protocols and case report forms: An assessment of the suitability of the Clinical Data Interchange Standards Consortium (CDISC) Operational Data Model (ODM).

    Science.gov (United States)

    Huser, Vojtech; Sastry, Chandan; Breymaier, Matthew; Idriss, Asma; Cimino, James J

    2015-10-01

    Efficient communication of a clinical study protocol and case report forms during all stages of a human clinical study is important for many stakeholders. An electronic and structured study representation format that can be used throughout the whole study life-span can improve such communication and potentially lower total study costs. The most relevant standard for representing clinical study data, applicable to unregulated as well as regulated studies, is the Operational Data Model (ODM) in development since 1999 by the Clinical Data Interchange Standards Consortium (CDISC). ODM's initial objective was exchange of case report forms data but it is increasingly utilized in other contexts. An ODM extension called Study Design Model, introduced in 2011, provides additional protocol representation elements. Using a case study approach, we evaluated ODM's ability to capture all necessary protocol elements during a complete clinical study lifecycle in the Intramural Research Program of the National Institutes of Health. ODM offers the advantage of a single format for institutions that deal with hundreds or thousands of concurrent clinical studies and maintain a data warehouse for these studies. For each study stage, we present a list of gaps in the ODM standard and identify necessary vendor or institutional extensions that can compensate for such gaps. The current version of ODM (1.3.2) has only partial support for study protocol and study registration data mainly because it is outside the original development goal. ODM provides comprehensive support for representation of case report forms (in both the design stage and with patient level data). Inclusion of requirements of observational, non-regulated or investigator-initiated studies (outside Food and Drug Administration (FDA) regulation) can further improve future revisions of the standard. Published by Elsevier Inc.
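
    ODM is an XML format, so study metadata ends up as nested elements such as Study, MetaDataVersion, FormDef and ItemDef. The fragment below is a deliberately simplified, hypothetical sketch built with Python's ElementTree to show the general shape only; a real CDISC ODM document requires the official namespace, OIDs and many mandatory attributes that are omitted here.

        # Simplified, hypothetical sketch of ODM-like study metadata using ElementTree.
        # Real CDISC ODM requires the official namespace and many required attributes.
        import xml.etree.ElementTree as ET

        odm = ET.Element("ODM", FileType="Snapshot")
        study = ET.SubElement(odm, "Study", OID="ST.EXAMPLE")
        mdv = ET.SubElement(study, "MetaDataVersion", OID="MDV.1", Name="Draft protocol metadata")
        ET.SubElement(mdv, "FormDef", OID="F.VITALS", Name="Vital Signs", Repeating="No")
        ET.SubElement(mdv, "ItemDef", OID="IT.SBP", Name="Systolic BP", DataType="integer")

        print(ET.tostring(odm, encoding="unicode"))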

  12. Standardized communication protocol for BAS (IEIEJ/p); BAS hyojun interface shiyo (IEIEJ/p)

    Energy Technology Data Exchange (ETDEWEB)

    Toyoda, T. [Hitachi Building System Co. Ltd., Tokyo (Japan)

    2000-10-05

    For the BEMS user, constructing a BEMS under a multiple-vendor environment is very beneficial, because the user can choose the most appropriate vendor for every subsystem from the viewpoint of technique and cost at any time. The effective tool which makes a BEMS under such a multiple-vendor environment possible is the BACnet protocol, which was developed and standardized by ANSI/ASHRAE in the U.S. The Institute of Electrical Installation Engineers of Japan (IEIEJ) offers IEIEJ/p, based on BACnet, as the IEIEJ standard; it adds a function of autonomous decentralized control to enhance BEMS reliability and to fit the Japanese multiple-vendor environment. In this paper I present an outline of the specification and features of IEIEJ/p. (author)

  13. Pilot studies for the North American Soil Geochemical Landscapes Project - Site selection, sampling protocols, analytical methods, and quality control protocols

    Science.gov (United States)

    Smith, D.B.; Woodruff, L.G.; O'Leary, R. M.; Cannon, W.F.; Garrett, R.G.; Kilburn, J.E.; Goldhaber, M.B.

    2009-01-01

    In 2004, the US Geological Survey (USGS) and the Geological Survey of Canada sampled and chemically analyzed soils along two transects across Canada and the USA in preparation for a planned soil geochemical survey of North America. This effort was a pilot study to test and refine sampling protocols, analytical methods, quality control protocols, and field logistics for the continental survey. A total of 220 sample sites were selected at approximately 40-km intervals along the two transects. The ideal sampling protocol at each site called for a sample from a depth of 0-5 cm and a composite of each of the O, A, and C horizons. Each sample fraction was analyzed for Ca, Fe, K, Mg, Na, S, Ti, Ag, As, Ba, Be, Bi, Cd, Ce, Co, Cr, Cs, Cu, Ga, In, La, Li, Mn, Mo, Nb, Ni, P, Pb, Rb, Sb, Sc, Sn, Sr, Te, Th, Tl, U, V, W, Y, and Zn by inductively coupled plasma-mass spectrometry and inductively coupled plasma-atomic emission spectrometry following a near-total digestion in a mixture of HCl, HNO3, HClO4, and HF. Separate methods were used for Hg, Se, total C, and carbonate-C on this same size fraction. Only Ag, In, and Te had a large percentage of concentrations below the detection limit. Quality control (QC) of the analyses was monitored at three levels: the laboratory performing the analysis, the USGS QC officer, and the principal investigator for the study. This level of review resulted in an average of one QC sample for every 20 field samples, which proved to be minimally adequate for such a large-scale survey. Additional QC samples should be added to monitor within-batch quality, to the extent that no more than 10 samples are analyzed between QC samples. Only Cr (77%), Y (82%), and Sb (80%) fell outside the acceptable limits of accuracy (% recovery between 85 and 115%), most likely because of residence in mineral phases resistant to the acid digestion. A separate sample of 0-5-cm material was collected at each site for determination of organic compounds. A subset of 73 of these samples was analyzed for a suite of
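
    The two QC rules described above, recovery of a reference material within 85-115% and no more than 10 field samples between QC samples, are easy to encode. The sketch below uses the three out-of-range recoveries reported in the abstract plus hypothetical in-range values, and builds an illustrative batch with a QC insert every 10 samples.

        # Sketch of two QC rules described above: flag elements whose reference-material
        # recovery falls outside 85-115 %, and insert a QC sample at least every 10 samples.
        # Cr, Y and Sb recoveries are the values reported above; Cu and Zn are hypothetical.
        recoveries = {"Cr": 77.0, "Y": 82.0, "Sb": 80.0, "Cu": 98.0, "Zn": 103.0}

        for element, rec in recoveries.items():
            status = "within limits" if 85.0 <= rec <= 115.0 else "outside 85-115 % limits"
            print(f"{element}: {rec:.0f} % recovery -> {status}")

        field_samples = [f"S{i:03d}" for i in range(1, 41)]
        batch = []
        for i, sample in enumerate(field_samples, start=1):
            batch.append(sample)
            if i % 10 == 0:          # insert a QC sample after every 10 field samples
                batch.append(f"QC{i // 10}")
        print(batch)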

  14. A protocol using coho salmon to monitor Tongass National Forest Land and Resource Management Plan standards and guidelines for fish habitat.

    Science.gov (United States)

    M.D. Bryant; Trent McDonald; R. Aho; B.E. Wright; Michelle Bourassa Stahl

    2008-01-01

    We describe a protocol to monitor the effectiveness of the Tongass Land Management Plan (TLMP) management standards for maintaining fish habitat. The protocol uses juvenile coho salmon (Oncorhynchus kisutch) in small tributary streams in forested watersheds. We used a 3-year pilot study to develop detailed methods to estimate juvenile salmonid...

  15. Introduction of a standardized multimodality image protocol for navigation-guided surgery of suspected low-grade gliomas.

    Science.gov (United States)

    Mert, Aygül; Kiesel, Barbara; Wöhrer, Adelheid; Martínez-Moreno, Mauricio; Minchev, Georgi; Furtner, Julia; Knosp, Engelbert; Wolfsberger, Stefan; Widhalm, Georg

    2015-01-01

    OBJECT Surgery of suspected low-grade gliomas (LGGs) poses a special challenge for neurosurgeons due to their diffusely infiltrative growth and histopathological heterogeneity. Consequently, neuronavigation with multimodality imaging data, such as structural and metabolic data, fiber tracking, and 3D brain visualization, has been proposed to optimize surgery. However, currently no standardized protocol has been established for multimodality imaging data in modern glioma surgery. The aim of this study was therefore to define a specific protocol for multimodality imaging and navigation for suspected LGG. METHODS Fifty-one patients who underwent surgery for a diffusely infiltrating glioma with nonsignificant contrast enhancement on MRI and available multimodality imaging data were included. In the first 40 patients with glioma, the authors retrospectively reviewed the imaging data, including structural MRI (contrast-enhanced T1-weighted, T2-weighted, and FLAIR sequences), metabolic images derived from PET, or MR spectroscopy chemical shift imaging, fiber tracking, and 3D brain surface/vessel visualization, to define standardized image settings and specific indications for each imaging modality. The feasibility and surgical relevance of this new protocol was subsequently prospectively investigated during surgery with the assistance of an advanced electromagnetic navigation system in the remaining 11 patients. Furthermore, specific surgical outcome parameters, including the extent of resection, histological analysis of the metabolic hotspot, presence of a new postoperative neurological deficit, and intraoperative accuracy of 3D brain visualization models, were assessed in each of these patients. RESULTS After reviewing these first 40 cases of glioma, the authors defined a specific protocol with standardized image settings and specific indications that allows for optimal and simultaneous visualization of structural and metabolic data, fiber tracking, and 3D brain

  16. Satellite Communications Using Commercial Protocols

    Science.gov (United States)

    Ivancic, William D.; Griner, James H.; Dimond, Robert; Frantz, Brian D.; Kachmar, Brian; Shell, Dan

    2000-01-01

    NASA Glenn Research Center has been working with industry, academia, and other government agencies in assessing commercial communications protocols for satellite and space-based applications. In addition, NASA Glenn has been developing and advocating new satellite-friendly modifications to existing communications protocol standards. This paper summarizes recent research into the applicability of various commercial standard protocols for use over satellite and space- based communications networks as well as expectations for future protocol development. It serves as a reference point from which the detailed work can be readily accessed. Areas that will be addressed include asynchronous-transfer-mode quality of service; completed and ongoing work of the Internet Engineering Task Force; data-link-layer protocol development for unidirectional link routing; and protocols for aeronautical applications, including mobile Internet protocol routing for wireless/mobile hosts and the aeronautical telecommunications network protocol.

  17. Communications standards

    CERN Document Server

    Stokes, A V

    1986-01-01

    Communications Standards deals with the standardization of computer communication networks. This book examines the types of local area networks (LANs) that have been developed and looks at some of the relevant protocols in more detail. The work of Project 802 is briefly discussed, along with a protocol which has developed from one of the LAN standards and is now a de facto standard in one particular area, namely the Manufacturing Automation Protocol (MAP). Factors that affect the usage of networks, such as network management and security, are also considered. This book is divided into three se

  18. Development and validation of simple RP-HPLC-PDA analytical protocol for zileuton assisted with Design of Experiments for robustness determination

    OpenAIRE

    Saurabh B. Ganorkar; Dinesh M. Dhumal; Atul A. Shirkhedkar

    2017-01-01

    A simple, rapid, sensitive, robust, stability-indicating RP-HPLC-PDA analytical protocol was developed and validated for the analysis of zileuton racemate in bulk and in tablet formulation. Development of method and resolution of degradation products from forced; hydrolytic (acidic, basic, neutral), oxidative, photolytic (acidic, basic, neutral, solid state) and thermal (dry heat) degradation was achieved on a LC – GC Qualisil BDS C18 column (250 mm × 4.6 mm × 5 μm) by isocratic mode at ambie...

  19. Standardized terminology for clinical trial protocols based on top-level ontological categories.

    Science.gov (United States)

    Heller, B; Herre, H; Lippoldt, K; Loeffler, M

    2004-01-01

    This paper describes a new method for the ontologically based standardization of concepts with regard to the quality assurance of clinical trial protocols. We developed a data dictionary for medical and trial-specific terms in which concepts and relations are defined context-dependently. The data dictionary is provided to different medical research networks by means of the software tool Onto-Builder via the internet. The data dictionary is based on domain-specific ontologies and the top-level ontology of GOL. The concepts and relations described in the data dictionary are represented in natural language, semi-formally or formally according to their use.

  20. Accelerated rehabilitation compared with a standard protocol after distal radial fractures treated with volar open reduction and internal fixation: a prospective, randomized, controlled study.

    Science.gov (United States)

    Brehmer, Jess L; Husband, Jeffrey B

    2014-10-01

    There are relatively few studies in the literature that specifically evaluate accelerated rehabilitation protocols for distal radial fractures treated with open reduction and internal fixation (ORIF). The purpose of this study was to compare the early postoperative outcomes (at zero to twelve weeks postoperatively) of patients enrolled in an accelerated rehabilitation protocol with those of patients enrolled in a standard rehabilitation protocol following ORIF for a distal radial fracture. We hypothesized that patients with accelerated rehabilitation after volar ORIF for a distal radial fracture would have an earlier return to function compared with patients who followed a standard protocol. From November 2007 to November 2010, eighty-one patients with an unstable distal radial fracture were prospectively randomized to follow either an accelerated or a standard rehabilitation protocol after undergoing ORIF with a volar plate for a distal radial fracture. Both groups began with gentle active range of motion at three to five days postoperatively. At two weeks, the accelerated group initiated wrist/forearm passive range of motion and strengthening exercises, whereas the standard group initiated passive range of motion and strengthening at six weeks postoperatively. Patients were assessed at three to five days, two weeks, three weeks, four weeks, six weeks, eight weeks, twelve weeks, and six months postoperatively. Outcomes included Disabilities of the Arm, Shoulder and Hand (DASH) scores (primary outcome) and measurements of wrist flexion/extension, supination, pronation, grip strength, and palmar pinch. The patients in the accelerated group had better mobility, strength, and DASH scores at the early postoperative time points (zero to eight weeks postoperatively) compared with the patients in the standard rehabilitation group. The difference between the groups was both clinically relevant and statistically significant. Patients who follow an accelerated rehabilitation

  1. Standardization and Optimization of Computed Tomography Protocols to Achieve Low-Dose

    Science.gov (United States)

    Chin, Cynthia; Cody, Dianna D.; Gupta, Rajiv; Hess, Christopher P.; Kalra, Mannudeep K.; Kofler, James M.; Krishnam, Mayil S.; Einstein, Andrew J.

    2014-01-01

    The increase in radiation exposure due to CT scans has been of growing concern in recent years. CT scanners differ in their capabilities and various indications require unique protocols, but there remains room for standardization and optimization. In this paper we summarize approaches to reduce dose, as discussed in lectures comprising the first session of the 2013 UCSF Virtual Symposium on Radiation Safety in Computed Tomography. The experience of scanning at low dose in different body regions, for both diagnostic and interventional CT procedures, is addressed. An essential primary step is justifying the medical need for each scan. General guiding principles for reducing dose include tailoring a scan to a patient, minimizing scan length, use of tube current modulation and minimizing tube current, minimizing tube potential, iterative reconstruction, and periodic review of CT studies. Organized efforts for standardization have been spearheaded by professional societies such as the American Association of Physicists in Medicine. Finally, all team members should demonstrate an awareness of the importance of minimizing dose. PMID:24589403

  2. Comparison of National and International Standards of Good Egg Production Practices

    Directory of Open Access Journals (Sweden)

    GP Sousa

    Full Text Available ABSTRACT Egg production is an important economic activity in Brazil, with about 697 million eggs produced annually. The conventional cage system is commonly used for egg production. However, there has been a growing concern for the welfare of laying hens around the world. In this context, many countries have issued laws, protocols, and other normative technical specifications to ensure the welfare of layers. This study aims at identifying similarities and differences between international standards and Brazilian protocols using the Comparative Law perspective. This article reports an analytical study of selected protocols, performing three analyses using the Comparative Law method. The research concludes that some items of the Brazilian protocols of good egg production practices, such as farm inspection, treatment of diseases, temperature, ventilation, beak trimming, feed and water supply, correspond to international specifications, whereas others, such as housing, freedom movement, use of equipment, and transport, are less strict.

  3. System Architecture Development for Energy and Water Infrastructure Data Management and Geovisual Analytics

    Science.gov (United States)

    Berres, A.; Karthik, R.; Nugent, P.; Sorokine, A.; Myers, A.; Pang, H.

    2017-12-01

    Building an integrated data infrastructure that can meet the needs of a sustainable energy-water resource management requires a robust data management and geovisual analytics platform, capable of cross-domain scientific discovery and knowledge generation. Such a platform can facilitate the investigation of diverse complex research and policy questions for emerging priorities in Energy-Water Nexus (EWN) science areas. Using advanced data analytics, machine learning techniques, multi-dimensional statistical tools, and interactive geovisualization components, such a multi-layered federated platform is being developed, the Energy-Water Nexus Knowledge Discovery Framework (EWN-KDF). This platform utilizes several enterprise-grade software design concepts and standards such as extensible service-oriented architecture, open standard protocols, event-driven programming model, enterprise service bus, and adaptive user interfaces to provide a strategic value to the integrative computational and data infrastructure. EWN-KDF is built on the Compute and Data Environment for Science (CADES) environment in Oak Ridge National Laboratory (ORNL).

  4. An Advanced Encryption Standard Powered Mutual Authentication Protocol Based on Elliptic Curve Cryptography for RFID, Proven on WISP

    Directory of Open Access Journals (Sweden)

    Alaauldin Ibrahim

    2017-01-01

    Full Text Available Information in patients’ medical histories is subject to various security and privacy concerns. Meanwhile, any modification or error in a patient’s medical data may cause serious or even fatal harm. To protect and transfer this valuable and sensitive information in a secure manner, radio-frequency identification (RFID technology has been widely adopted in healthcare systems and is being deployed in many hospitals. In this paper, we propose a mutual authentication protocol for RFID tags based on elliptic curve cryptography and advanced encryption standard. Unlike existing authentication protocols, which only send the tag ID securely, the proposed protocol could also send the valuable data stored in the tag in an encrypted pattern. The proposed protocol is not simply a theoretical construct; it has been coded and tested on an experimental RFID tag. The proposed scheme achieves mutual authentication in just two steps and satisfies all the essential security requirements of RFID-based healthcare systems.
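
    The protocol described above combines elliptic-curve operations with the Advanced Encryption Standard. The sketch below illustrates only the symmetric part, protecting a tag's stored payload with an authenticated AES mode via the widely available Python cryptography package; key handling is assumed, and the ECC-based two-step mutual authentication of the authors' scheme is not shown.

        # Illustrative AES-GCM protection of an RFID tag payload (symmetric part only).
        # Key distribution and the ECC-based mutual authentication steps are not shown;
        # the key would in practice be derived or shared via that handshake.
        import os
        from cryptography.hazmat.primitives.ciphers.aead import AESGCM

        key = AESGCM.generate_key(bit_length=128)
        aesgcm = AESGCM(key)
        nonce = os.urandom(12)

        tag_payload = b"patient-record-pointer:0042"   # hypothetical data stored on the tag
        associated = b"TAG-ID-7F"                      # hypothetical tag identifier, authenticated
        ciphertext = aesgcm.encrypt(nonce, tag_payload, associated)

        recovered = aesgcm.decrypt(nonce, ciphertext, associated)
        assert recovered == tag_payload
        print("payload protected and authenticated:", ciphertext.hex()[:32], "...")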

  5. A new tool for the evaluation of the analytical procedure: Green Analytical Procedure Index.

    Science.gov (United States)

    Płotka-Wasylka, J

    2018-05-01

    A new means for assessing analytical protocols relating to green analytical chemistry attributes has been developed. The new tool, called GAPI (Green Analytical Procedure Index), evaluates the green character of an entire analytical methodology, from sample collection to final determination, and was created using such tools as the National Environmental Methods Index (NEMI) or the Analytical Eco-Scale to provide not only general but also qualitative information. In GAPI, a specific symbol with five pentagrams is used to evaluate and quantify the environmental impact involved in each step of an analytical methodology, coloured from green through yellow to red to depict low, medium and high impact, respectively. The proposed tool was used to evaluate analytical procedures applied in the determination of biogenic amines in wine samples, and polycyclic aromatic hydrocarbon determination by EPA methods. The GAPI tool not only provides an immediately perceptible perspective for the user/reader but also offers exhaustive information on the evaluated procedures. Copyright © 2018 Elsevier B.V. All rights reserved.

  6. Effectiveness of a Rapid Lumbar Spine MRI Protocol Using 3D T2-Weighted SPACE Imaging Versus a Standard Protocol for Evaluation of Degenerative Changes of the Lumbar Spine.

    Science.gov (United States)

    Sayah, Anousheh; Jay, Ann K; Toaff, Jacob S; Makariou, Erini V; Berkowitz, Frank

    2016-09-01

    Reducing lumbar spine MRI scanning time while retaining diagnostic accuracy can benefit patients and reduce health care costs. This study compares the effectiveness of a rapid lumbar MRI protocol using 3D T2-weighted sampling perfection with application-optimized contrast with different flip-angle evolutions (SPACE) sequences with a standard MRI protocol for evaluation of lumbar spondylosis. Two hundred fifty consecutive unenhanced lumbar MRI examinations performed at 1.5 T were retrospectively reviewed. Full, rapid, and complete versions of each examination were interpreted for spondylotic changes at each lumbar level, including herniations and neural compromise. The full examination consisted of sagittal T1-weighted, T2-weighted turbo spin-echo (TSE), and STIR sequences; and axial T1- and T2-weighted TSE sequences (time, 18 minutes 40 seconds). The rapid examination consisted of sagittal T1- and T2-weighted SPACE sequences, with axial SPACE reformations (time, 8 minutes 46 seconds). The complete examination consisted of the full examination plus the T2-weighted SPACE sequence. Sensitivities and specificities of the full and rapid examinations were calculated using the complete study as the reference standard. The rapid and full studies had sensitivities of 76.0% and 69.3%, with specificities of 97.2% and 97.9%, respectively, for all degenerative processes. Rapid and full sensitivities were 68.7% and 66.3% for disk herniation, 85.2% and 81.5% for canal compromise, 82.9% and 69.1% for lateral recess compromise, and 76.9% and 69.7% for foraminal compromise, respectively. Isotropic SPACE T2-weighted imaging provides high-quality imaging of lumbar spondylosis, with multiplanar reformatting capability. Our SPACE-based rapid protocol had sensitivities and specificities for herniations and neural compromise comparable to those of the protocol without SPACE. This protocol fits within a 15-minute slot, potentially reducing costs and discomfort for a large subgroup of
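
    The sensitivities and specificities reported above are computed against the complete study as the reference standard; the sketch below shows the underlying arithmetic. The counts are hypothetical, chosen only so that the example roughly reproduces the reported rapid-protocol figures for disk herniation.

        # Sensitivity and specificity from a 2x2 comparison against a reference standard.
        # Counts are hypothetical, chosen only to illustrate the calculation.
        def sensitivity_specificity(tp, fn, tn, fp):
            sensitivity = tp / (tp + fn)
            specificity = tn / (tn + fp)
            return sensitivity, specificity

        # e.g. rapid-protocol reads for disk herniation vs. the complete (reference) examination
        sens, spec = sensitivity_specificity(tp=103, fn=47, tn=1050, fp=30)
        print(f"sensitivity = {100*sens:.1f} %, specificity = {100*spec:.1f} %")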

  7. Does leaf chemistry differentially affect breakdown in tropical vs temperate streams? Importance of standardized analytical techniques to measure leaf chemistry

    Science.gov (United States)

    Marcelo Ardón; Catherine M. Pringle; Susan L. Eggert

    2009-01-01

    Comparisons of the effects of leaf litter chemistry on leaf breakdown rates in tropical vs temperate streams are hindered by incompatibility among studies and across sites of analytical methods used to measure leaf chemistry. We used standardized analytical techniques to measure chemistry and breakdown rate of leaves from common riparian tree species at 2 sites, 1...

  8. Adaptive behaviors of experts in following standard protocol in trauma management: implications for developing flexible guidelines.

    Science.gov (United States)

    Vankipuram, Mithra; Ghaemmaghami, Vafa; Patel, Vimla L

    2012-01-01

    Critical care environments are complex and dynamic. To adapt to such environments, clinicians may be required to make alterations to their workflows, resulting in deviations from standard procedures. In this work, deviations from standards in trauma critical care are studied. Thirty trauma cases were observed in a Level 1 trauma center. Activities tracked were compared to the Advanced Trauma Life Support standard to determine (i) if deviations had occurred, (ii) the type of deviations, and (iii) whether deviations were initiated by individuals or collaboratively by the team. Results show that expert clinicians deviated to innovate, while deviations by novices resulted mostly in error. Experts' well-developed knowledge allows for flexibility and adaptiveness in dealing with standards, resulting in innovative deviations while minimizing errors made. Providing an informatics solution in such a setting would mean that standard protocols would have to be flexible enough to "learn" from new knowledge, yet provide strong support for the trainees.

  9. South African Research Ethics Committee Review of Standards of Prevention in HIV Vaccine Trial Protocols.

    Science.gov (United States)

    Essack, Zaynab; Wassenaar, Douglas R

    2018-04-01

    HIV prevention trials provide a prevention package to participants to help prevent HIV acquisition. As new prevention methods are proven effective, this raises ethical and scientific design complexities regarding the prevention package or standard of prevention. Given its high HIV incidence and prevalence, South Africa has become a hub for HIV prevention research. For this reason, it is critical to study the implementation of relevant ethical-legal frameworks for such research in South Africa. This qualitative study used in-depth interviews to explore the practices and perspectives of eight members of South African research ethics committees (RECs) who have reviewed protocols for HIV vaccine trials. Their practices and perspectives are compared with ethics guideline requirements for standards of prevention.

  10. Difficulties in Controlling Mobilization Pain Using a Standardized Patient-Controlled Analgesia Protocol in Burns

    OpenAIRE

    Nilsson, Andreas; Kalman, Sigga; Arvidsson, Anders; Sjöberg, Folke

    2011-01-01

    The aim of this study was to evaluate pain relief for patients with burns during rest and mobilization with morphine according to a standard protocol for patient-controlled analgesia (PCA). Eighteen patients with a mean (SD) burned TBSA% of 26 (20) were studied for 10 days. Using a numeric rating scale (NRS, 0 = no pain and 10 = unbearable pain), patients were asked to estimate their acceptable and worst experienced pain by specifying a number on a scale and at what point they would like addi...

  11. Protocol Fuel Mix reporting

    International Nuclear Information System (INIS)

    2002-07-01

    The protocol in this document describes a method for an Electricity Distribution Company (EDC) to account for the fuel mix of electricity that it delivers to its customers, based on the best available information. Own production, purchase and sale of electricity, and certificates trading are taken into account. In chapter 2 the actual protocol is outlined. In the appendixes additional (supporting) information is given: (A) Dutch Standard Fuel Mix, 2000; (B) Calculation of the Dutch Standard fuel mix; (C) Procedures to estimate and benchmark the fuel mix; (D) Quality management; (E) External verification; (F) Recommendation for further development of the protocol; (G) Reporting examples

  12. Analytical protocol to study the food safety of (multiple-)recycled high-density polyethylene (HDPE) and polypropylene (PP) crates: Influence of recycling on the migration and formation of degradation products

    NARCIS (Netherlands)

    Coulier, L.; Orbons, H.G.M.; Rijk, R.

    2007-01-01

    An analytical protocol was set up and successfully applied to study the food safety of recycled HDPE and PP crates. A worst-case scenario was applied that focused not only on overall migration and specific migration of accepted starting materials but also on migratable degradation products of

  13. Standardized Duplex Ultrasound-Based Protocol for Early Diagnosis of Transplant Renal Artery Stenosis: Results of a Single-Institution Retrospective Cohort Study

    Directory of Open Access Journals (Sweden)

    Vincenzo Li Marzi

    2018-01-01

    Full Text Available Transplant renal artery stenosis (TRAS) is the most frequent vascular complication after kidney transplantation (KT) and has been associated with potentially reversible refractory hypertension, graft dysfunction, and reduced patient survival. The aim of the study is to describe the outcomes of a standardized Duplex Ultrasound (DU)-based screening protocol for early diagnosis of TRAS and for selection of patients potentially requiring endovascular intervention. We retrospectively reviewed our prospectively collected database of KT from January 1998 to select patients diagnosed with TRAS. The follow-up protocol was based on a risk-adapted, dynamic subdivision of eligible KT patients in different risk categories (RC) with different protocol strategies (PS). Of 598 patients included in the study, 52 (9%) patients had hemodynamically significant TRAS and underwent percutaneous angioplasty (PTA) and stent placement. Technical and clinical success rates were 97% and 90%, respectively. 7 cases of restenosis were recorded at follow-up and treated with re-PTA plus stenting. Both DU imaging and clinical parameters improved after stent placement. Prospective high-quality studies are needed to test the efficacy and safety of our protocol in larger series. Accurate trial design and standardized reporting of patient outcomes will be key to address the current clinical needs.

  14. From basic survival analytic theory to a non-standard application

    CERN Document Server

    Zimmermann, Georg

    2017-01-01

    Georg Zimmermann provides a mathematically rigorous treatment of basic survival analytic methods. His emphasis is also placed on various questions and problems, especially with regard to life expectancy calculations arising from a particular real-life dataset on patients with epilepsy. The author shows both the step-by-step analyses of that dataset and the theory the analyses are based on. He demonstrates that one may face serious and sometimes unexpected problems, even when conducting very basic analyses. Moreover, the reader learns that a practically relevant research question may look rather simple at first sight. Nevertheless, compared to standard textbooks, a more detailed account of the theory underlying life expectancy calculations is needed in order to provide a mathematically rigorous framework. Contents: Regression Models for Survival Data; Model Checking Procedures; Life Expectancy. Target Groups: Researchers, lecturers, and students in the fields of mathematics and statistics; academics and experts work...

  15. Data distribution architecture based on standard real time protocol

    International Nuclear Information System (INIS)

    Castro, R.; Vega, J.; Pereira, A.; Portas, A.

    2009-01-01

    Data distribution architecture (DDAR) has been designed conforming to new requirements, taking into account the type of data that is going to be generated from experiments in the International Thermonuclear Experimental Reactor (ITER). The main goal of this architecture is to implement a system that is able to manage online all data that is being generated by an experiment, supporting its distribution for processing, storing, analysing or visualizing. The first objective is to have a distribution architecture that supports long pulse experiments (even hours). The described system is able to distribute, using the real time protocol (RTP), stored data or live data generated while the experiment is running. It enables researchers to access data online instead of waiting for the end of the experiment. Another important objective is scalability, so the presented architecture can easily grow based on actual needs, simplifying estimation and design tasks. A third important objective is security. In this sense, the architecture is based on standards, so complete security mechanisms can be applied, from secure transmission solutions to elaborate access control policies, and it is fully compatible with multi-organization federation systems such as PAPI or Shibboleth.

  16. Data distribution architecture based on standard real time protocol

    Energy Technology Data Exchange (ETDEWEB)

    Castro, R. [Asociacion EURATOM/CIEMAT para Fusion, Avda. Complutense No. 22, 28040 Madrid (Spain)], E-mail: rodrigo.castro@ciemat.es; Vega, J.; Pereira, A.; Portas, A. [Asociacion EURATOM/CIEMAT para Fusion, Avda. Complutense No. 22, 28040 Madrid (Spain)

    2009-06-15

    Data distribution architecture (DDAR) has been designed conforming to new requirements, taking into account the type of data that is going to be generated from experiments in the International Thermonuclear Experimental Reactor (ITER). The main goal of this architecture is to implement a system that is able to manage online all data that is being generated by an experiment, supporting its distribution for processing, storing, analysing or visualizing. The first objective is to have a distribution architecture that supports long pulse experiments (even hours). The described system is able to distribute, using the real time protocol (RTP), stored data or live data generated while the experiment is running. It enables researchers to access data online instead of waiting for the end of the experiment. Another important objective is scalability, so the presented architecture can easily grow based on actual needs, simplifying estimation and design tasks. A third important objective is security. In this sense, the architecture is based on standards, so complete security mechanisms can be applied, from secure transmission solutions to elaborate access control policies, and it is fully compatible with multi-organization federation systems such as PAPI or Shibboleth.

  17. [Food Security in Europe: comparison between the "Hygiene Package" and the British Retail Consortium (BRC) & International Food Standard (IFS) protocols].

    Science.gov (United States)

    Stilo, A; Parisi, S; Delia, S; Anastasi, F; Bruno, G; Laganà, P

    2009-01-01

    The birth of the Hygiene Package and of Reg. CE no 2073/2005 in the food production field signalled a change in Italy. This process started in Italy in 1997 with legislative decree no 155 on self-control, but in reality it was implemented in the UK in 1990 with the promulgation of the Food Safety Act. This legal act was influenced by some basic rules corresponding to the application of HACCP standards. Since 1990 the British chains of distribution (Retailers) have involved all aspects of the food line in this type of responsibility. Due to this growing awareness of the need for greater regulation, a protocol edited by the British Retail Consortium was created in 1998. This protocol acted as a "stamp" of approval for food products and it is now known as the BRC Global Food Standard. In July 2008, this protocol became effective in its fifth version. After the birth of BRC, French and German Retailers also established a practically equivalent standard, perhaps more pertinent to food safety, the International Food Standard (IFS). The new approach is specific to the food field and strictly applies criteria intended to ensure the "safety, quality and legality" of food products, similarly to ISO 22000:2005 (mainly based on past BRC & IFS experiences). The new standards aim to create a sort of green list of fully "proper and fit" Suppliers only, because of the comprehensible exigencies of Retailers. It is expected, as we have shown, that Auditor authorities, who are responsible for ensuring that inspections are carried out as under the Hygiene Package, will find these new standards useful. The advantages of streamlining this system are that it will allow enterprises to diligently enforce food safety practices without fear of upset or legal consequence, to improve the quality (HACCP) of the management & traceability system, and to reduce wastes, reprocessing and withdrawal of products. However some discordances about the interpretation of certain sub-field norms (e.g., water

  18. Analysis of 855 upper extremity fistulas created using a standard protocol: the role of graft extension to achieve functional status.

    Science.gov (United States)

    Allan, Bassan J; Perez, Enrique R; Tabbara, Marwan

    2013-06-01

    The Fistula First Breakthrough Initiative (FFBI) has been one of the most important national programs to help achieve considerable improvements in the care of patients on chronic hemodialysis. FFBI has helped place guidelines to push practitioners to reduce the use of tunneled central venous catheters and to increase the rate of arteriovenous fistula use in patients requiring chronic hemodialysis access. However, despite current guidelines, no specific protocols exist for the creation and management of autogenous arteriovenous fistulas and outcomes at most centers are below national benchmarks. In this study, we examine the effectiveness of a standard protocol used at our institution for the creation of autogenous upper extremity fistulas for hemodialysis access in achieving early cannulation and early removal of tunneled dialysis catheters. Our review encompasses 855 consecutive autogenous fistulas created over a 10-year period. Our findings suggest that the use of a standard protocol for creation and management of autogenous fistulas can help increase the rate of functional accesses over national benchmarks. Additionally, extension/conversion of malfunctioning fistulas to grafts appears to be an excellent method to expedite removal of a tunneled dialysis catheter with concomitant preservation of a fistula.

  19. VALIDATION OF ANALYTICAL METHODS AND INSTRUMENTATION FOR BERYLLIUM MEASUREMENT: REVIEW AND SUMMARY OF AVAILABLE GUIDES, PROCEDURES, AND PROTOCOLS

    Energy Technology Data Exchange (ETDEWEB)

    Ekechukwu, A.

    2008-12-17

    This document proposes to provide a listing of available sources which can be used to validate analytical methods and/or instrumentation for beryllium determination. A literature review was conducted of available standard methods and publications used for method validation and/or quality control. A comprehensive listing of the articles, papers, and books reviewed is given in Appendix 1. Available validation documents and guides are listed in the appendix; each has a brief description of application and use. In the referenced sources, there are varying approaches to validation and varying descriptions of validation at different stages in method development. This discussion focuses on validation and verification of fully developed methods and instrumentation that have been offered up for use or approval by other laboratories or official consensus bodies such as ASTM International, the International Standards Organization (ISO) and the Association of Official Analytical Chemists (AOAC). This review was conducted as part of a collaborative effort to investigate and improve the state of validation for measuring beryllium in the workplace and the environment. Documents and publications from the United States and Europe are included. Unless otherwise specified, all documents were published in English.

  20. DNA repair protocols

    DEFF Research Database (Denmark)

    Bjergbæk, Lotte

    In its 3rd edition, this Methods in Molecular Biology(TM) book covers the eukaryotic response to genomic insult including advanced protocols and standard techniques in the field of DNA repair. Offers expert guidance for DNA repair, recombination, and replication. Current knowledge of the mechanisms...... that regulate DNA repair has grown significantly over the past years with technology advances such as RNA interference, advanced proteomics and microscopy as well as high throughput screens. The third edition of DNA Repair Protocols covers various aspects of the eukaryotic response to genomic insult including...... recent advanced protocols as well as standard techniques used in the field of DNA repair. Both mammalian and non-mammalian model organisms are covered in the book, and many of the techniques can be applied with only minor modifications to other systems than the one described. Written in the highly...

  1. Importance of implementing an analytical quality control system in a core laboratory.

    Science.gov (United States)

    Marques-Garcia, F; Garcia-Codesal, M F; Caro-Narros, M R; Contreras-SanFeliciano, T

    2015-01-01

    The aim of the clinical laboratory is to provide useful information for screening, diagnosis and monitoring of disease. The laboratory should ensure the quality of the extra-analytical and analytical processes, based on set criteria. To do this, it develops and implements a system of internal quality control, designed to detect errors, and compares its data with other laboratories through external quality control. In this way, the laboratory has a tool to verify that the objectives set are being met and, in case of errors, to take corrective actions and ensure the reliability of the results. This article sets out to describe the design and implementation of an internal quality control protocol, as well as its periodic assessment (at 6-month intervals) to determine compliance with pre-determined specifications (Stockholm Consensus(1)). A total of 40 biochemical and 15 immunochemical methods were evaluated using three different control materials. Next, a standard operation procedure was planned to develop a system of internal quality control that included calculating the error of the analytical process, setting quality specifications, and verifying compliance. The quality control data were then statistically depicted as means, standard deviations, and coefficients of variation, as well as systematic, random, and total errors. The quality specifications were then fixed and the operational rules to apply in the analytical process were calculated. Finally, our data were compared with those of other laboratories through an external quality assurance program. The development of an analytical quality control system is a highly structured process. This should be designed to detect errors that compromise the stability of the analytical process. The laboratory should review its quality indicators, systematic, random and total error at regular intervals, in order to ensure that they are meeting pre-determined specifications, and if not, apply the appropriate corrective actions
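
    As a rough sketch of the error calculations such a protocol relies on (bias and imprecision from repeated control-material measurements, combined into a total error estimate that is compared against the quality specification; the 1.65 multiplier is the commonly used one-sided 95% convention, and all numbers are invented):

```python
import numpy as np

# Repeated measurements of one control material (invented values) and its
# assigned target concentration.
control = np.array([4.9, 5.1, 5.0, 5.2, 4.8, 5.0, 5.1, 4.9, 5.3, 5.0])
target = 5.0
allowable_total_error_pct = 10.0       # quality specification, e.g. taken from
                                       # the Stockholm Consensus hierarchy

bias_pct = (control.mean() - target) / target * 100
cv_pct = control.std(ddof=1) / control.mean() * 100
total_error_pct = abs(bias_pct) + 1.65 * cv_pct   # common total-error convention

ok = total_error_pct <= allowable_total_error_pct
print(f"bias = {bias_pct:.1f}%, CV = {cv_pct:.1f}%, TE = {total_error_pct:.1f}% "
      f"-> {'meets' if ok else 'fails'} the specification")
```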

  2. FPGA based data-flow injection module at 10 Gbit/s reading data from network exported storage and using standard protocols

    International Nuclear Information System (INIS)

    Lemouzy, B; Garnier, J-C; Neufeld, N

    2011-01-01

    The goal of the LHCb readout upgrade is to accelerate the DAQ to 40 MHz. Such a DAQ system will certainly employ 10 Gigabit or similar technologies and might also need new networking protocols such as a customized, light-weight TCP or more specialized protocols. A test module is being implemented to be integrated into the existing LHCb infrastructure. It is a multiple 10-Gigabit traffic generator, driven by a Stratix IV FPGA, and flexible enough to generate LHCb's raw data packets. Traffic data are either internally generated or read from external storage via the network. We have implemented the light-weight industry-standard protocol ATA over Ethernet (AoE), and we present an outlook on using a file system on these network-exported disk drives.

  3. e-SCP-ECG+ Protocol: An Expansion on SCP-ECG Protocol for Health Telemonitoring—Pilot Implementation

    Directory of Open Access Journals (Sweden)

    George J. Mandellos

    2010-01-01

    Full Text Available Standard Communication Protocol for Computer-assisted Electrocardiography (SCP-ECG) provides standardized communication among different ECG devices and medical information systems. This paper extends the use of this protocol in order to be included in health monitoring systems. It introduces new sections into SCP-ECG structure for transferring data for positioning, allergies, and five additional biosignals: noninvasive blood pressure (NiBP), body temperature (Temp), carbon dioxide (CO2), blood oxygen saturation (SPO2), and pulse rate. It also introduces new tags in existing sections for transferring comprehensive demographic data. The proposed enhanced version is referred to as e-SCP-ECG+ protocol. This paper also considers the pilot implementation of the new protocol as a software component in a Health Telemonitoring System.
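
    As a rough, hypothetical sketch of what the additional e-SCP-ECG+ content might look like before being packed into the protocol's binary sections (the field names and structure below are illustrative only, not the actual section tags defined by the authors):

```python
from dataclasses import dataclass, field
from typing import List, Optional

# Hypothetical container for the extra vital-sign channels that e-SCP-ECG+
# carries alongside the ECG payload; names are illustrative, not the
# protocol's actual section identifiers.
@dataclass
class VitalSigns:
    nibp_systolic_mmHg: Optional[float] = None   # noninvasive blood pressure
    nibp_diastolic_mmHg: Optional[float] = None
    body_temperature_C: Optional[float] = None
    co2_mmHg: Optional[float] = None
    spo2_percent: Optional[float] = None
    pulse_rate_bpm: Optional[float] = None

@dataclass
class TelemonitoringRecord:
    patient_id: str
    position: Optional[tuple] = None              # e.g. (latitude, longitude)
    allergies: List[str] = field(default_factory=list)
    vitals: VitalSigns = field(default_factory=VitalSigns)

record = TelemonitoringRecord(
    patient_id="demo-001",
    position=(48.85, 2.35),
    allergies=["penicillin"],
    vitals=VitalSigns(spo2_percent=97.0, pulse_rate_bpm=72.0),
)
print(record)
```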

  4. Matrix-assisted laser desorption/ionization-time of flight mass spectrometry: protocol standardization and database expansion for rapid identification of clinically important molds.

    Science.gov (United States)

    Paul, Saikat; Singh, Pankaj; Rudramurthy, Shivaprakash M; Chakrabarti, Arunaloke; Ghosh, Anup K

    2017-12-01

    To standardize matrix-assisted laser desorption ionization-time of flight mass spectrometry protocols and to expand the existing Bruker Biotyper database for mold identification, four different sample preparation methods (protocols A, B, C and D) were evaluated. On analyzing each protein extraction method, reliable identification and the best log scores were achieved with protocol D. The same protocol was used to identify 153 clinical isolates. Of these 153, 123 (80.3%) were accurately identified using the existing database, and the remaining 30 (19.7%) were not identified because their spectra were absent from the database. On inclusion of the missing main spectrum profiles in the existing database, all 153 isolates were identified. Matrix-assisted laser desorption ionization-time of flight mass spectrometry can be used for routine identification of clinically important molds.

  5. Establishment and intra-/inter-laboratory validation of a standard protocol of reactive oxygen species assay for chemical photosafety evaluation.

    Science.gov (United States)

    Onoue, Satomi; Hosoi, Kazuhiro; Wakuri, Shinobu; Iwase, Yumiko; Yamamoto, Toshinobu; Matsuoka, Naoko; Nakamura, Kazuichi; Toda, Tsuguto; Takagi, Hironori; Osaki, Naoto; Matsumoto, Yasuhiro; Kawakami, Satoru; Seto, Yoshiki; Kato, Masashi; Yamada, Shizuo; Ohno, Yasuo; Kojima, Hajime

    2013-11-01

    A reactive oxygen species (ROS) assay was previously developed for photosafety evaluation of pharmaceuticals, and the present multi-center study aimed to establish and validate a standard protocol for the ROS assay. In three participating laboratories, two standards and 42 coded chemicals, including 23 phototoxins and 19 nonphototoxic drugs/chemicals, were assessed by the ROS assay according to the standardized protocol. Most phototoxins tended to generate singlet oxygen and/or superoxide under UV-vis exposure, but nonphototoxic chemicals were less photoreactive. In the ROS assay on quinine (200 µM), a typical phototoxic drug, the intra- and inter-day precisions (coefficient of variation; CV) were found to be 1.5-7.4% and 1.7-9.3%, respectively. The inter-laboratory CV for quinine averaged 15.4% for singlet oxygen and 17.0% for superoxide. The ROS assay on 42 coded chemicals (200 µM) provided no false negative predictions upon previously defined criteria as compared with the in vitro/in vivo phototoxicity, although several false positives appeared. Outcomes from the validation study were indicative of satisfactory transferability, intra- and inter-laboratory variability, and predictive capacity of the ROS assay. Copyright © 2012 John Wiley & Sons, Ltd.

  6. Comparing Short Dental Implants to Standard Dental Implants: Protocol for a Systematic Review.

    Science.gov (United States)

    Rokn, Amir Reza; Keshtkar, Abbasali; Monzavi, Abbas; Hashemi, Kazem; Bitaraf, Tahereh

    2018-01-18

    Short dental implants have been proposed as a simpler, cheaper, and faster alternative for the rehabilitation of atrophic edentulous areas to avoid the disadvantages of surgical techniques for increasing bone volume. This review will compare short implants (4 to 8 mm) to standard implants (larger than 8 mm) in edentulous jaws, evaluating on the basis of marginal bone loss (MBL), survival rate, complications, and prosthesis failure. We will electronically search for randomized controlled trials comparing short dental implants to standard dental implants in the following databases: PubMed, Web of Science, EMBASE, Scopus, the Cochrane Central Register of Controlled Trials, and ClinicalTrials.gov with English language restrictions. We will manually search the reference lists of relevant reviews and the included articles in this review. The following journals will also be searched: European Journal of Oral Implantology, Clinical Oral Implants Research, and Clinical Implant Dentistry and Related Research. Two reviewers will independently perform the study selection, data extraction and quality assessment (using the Cochrane Collaboration tool) of included studies. All meta-analysis procedures, including appropriate effect size combination, sub-group analysis, meta-regression, and assessment of publication or reporting bias, will be performed using Stata (StataCorp, Texas) version 12.1. Short implant effectiveness will be assessed using the mean difference of MBL in terms of weighted mean difference (WMD) and standardized mean difference (SMD) using Cohen's method. The combined effect size measures, in addition to the related 95% confidence intervals, will be estimated by a fixed effect model. The heterogeneity of the related effect sizes will be assessed using Cochran's Q test and the I² measure. The MBL will be presented by a standardized mean difference with a 95% confidence interval. The survival rate of implants, prostheses failures, and complications will be reported using a risk
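
    The pooling described above (fixed-effect combination of mean differences, with Cochran's Q and I² for heterogeneity) can be illustrated with a short inverse-variance sketch; the study effect sizes and variances below are invented for demonstration:

```python
import numpy as np

# Illustrative fixed-effect meta-analysis of mean differences, following the
# inverse-variance approach the protocol describes; the numbers are made up.
effect_sizes = np.array([0.20, 0.35, 0.10, 0.28])    # e.g. MBL mean differences (mm)
variances    = np.array([0.010, 0.020, 0.015, 0.012])

weights = 1.0 / variances
pooled = np.sum(weights * effect_sizes) / np.sum(weights)
se_pooled = np.sqrt(1.0 / np.sum(weights))
ci_low, ci_high = pooled - 1.96 * se_pooled, pooled + 1.96 * se_pooled

# Cochran's Q and the I^2 heterogeneity measure mentioned in the abstract.
Q = np.sum(weights * (effect_sizes - pooled) ** 2)
df = len(effect_sizes) - 1
I2 = max(0.0, (Q - df) / Q) * 100 if Q > 0 else 0.0

print(f"pooled MD = {pooled:.3f} (95% CI {ci_low:.3f} to {ci_high:.3f}), "
      f"Q = {Q:.2f}, I^2 = {I2:.1f}%")
```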

  7. Method and platform standardization in MRM-based quantitative plasma proteomics.

    Science.gov (United States)

    Percy, Andrew J; Chambers, Andrew G; Yang, Juncong; Jackson, Angela M; Domanski, Dominik; Burkhart, Julia; Sickmann, Albert; Borchers, Christoph H

    2013-12-16

    There exists a growing demand in the proteomics community to standardize experimental methods and liquid chromatography-mass spectrometry (LC/MS) platforms in order to enable the acquisition of more precise and accurate quantitative data. This necessity is heightened by the evolving trend of verifying and validating candidate disease biomarkers in complex biofluids, such as blood plasma, through targeted multiple reaction monitoring (MRM)-based approaches with stable isotope-labeled standards (SIS). Considering the lack of performance standards for quantitative plasma proteomics, we previously developed two reference kits to evaluate the MRM with SIS peptide approach using undepleted and non-enriched human plasma. The first kit tests the effectiveness of the LC/MRM-MS platform (kit #1), while the second evaluates the performance of an entire analytical workflow (kit #2). Here, these kits have been refined for practical use and then evaluated through intra- and inter-laboratory testing on 6 common LC/MS platforms. For an identical panel of 22 plasma proteins, similar concentrations were determined, regardless of the kit, instrument platform, and laboratory of analysis. These results demonstrate the value of the kit and reinforce the utility of standardized methods and protocols. The proteomics community needs standardized experimental protocols and quality control methods in order to improve the reproducibility of MS-based quantitative data. This need is heightened by the evolving trend for MRM-based validation of proposed disease biomarkers in complex biofluids such as blood plasma. We have developed two kits to assist in the inter- and intra-laboratory quality control of MRM experiments: the first kit tests the effectiveness of the LC/MRM-MS platform (kit #1), while the second evaluates the performance of an entire analytical workflow (kit #2). In this paper, we report the use of these kits in intra- and inter-laboratory testing on 6 common LC/MS platforms. This

  8. Performance evaluation of working standard NE2581 in comparison with reference standard NE2561 in the determination of absorbed dose to water using IAEA, HPA, NACP, AAPM, NCRP and ICRU protocols

    International Nuclear Information System (INIS)

    Dolah, M.T.; Supian Samat; Taiman Kadni

    2001-01-01

    The aim of this study was to evaluate the performance of NE 2581 in comparison with NE 2561 in the determination of the absorbed dose to water in a γ-ray beam using the IAEA, HPA, NACP, AAPM, NCRP and ICRU protocols. 13 exposures of the γ-ray beam were used, yielding 26 chamber rate readings: 13 for NE 2581 and 13 for NE 2561. From the 13 NE 2581 readings, 78 absorbed dose to water values were calculated for the IAEA (N=13), HPA (N=13), NACP (N=13), AAPM (N=13), NCRP (N=13) and ICRU (N=13) protocols. Similarly, from the 13 NE 2561 readings, 78 absorbed dose to water values were calculated for the same six protocols. From these 156 (=78 x 2) absorbed dose to water values, 78 percentage deviations between the NE 2581 and NE 2561 results were calculated, 13 for each protocol. For a single protocol, the mean μ and standard error σ_se of the percentage deviations (N=13) were calculated. The results, expressed as [protocol, μ ± σ_se], were [IAEA, 1.55 ± 0.12], [HPA, 0.98 ± 0.12], [NACP, 1.93 ± 0.12], [AAPM, -0.06 ± 0.12], [NCRP, 0.97 ± 0.12], [ICRU, 0.97 ± 0.12]. The range of percentage deviations is thus from -0.06 to 1.93. As the quoted IAEA acceptable limit of deviation is ± 3.0%, it was concluded that the working standard NE 2581 chamber has shown acceptable performance. In addition, use of the AAPM protocol enabled NE 2581 to perform very similarly to NE 2561. (Author)

  9. Reproducibility of microbial mutagenicity assays. I. Tests with Salmonella typhimurium and Escherichia coli using a standardized protocol

    International Nuclear Information System (INIS)

    Dunkel, V.C.; Zeiger, E.; Brusick, D.; McCoy, E.; McGregor, D.; Mortelmans, K.; Rosenkranz, H.S.; Simmon, V.F.

    1984-01-01

    The Salmonella/microsome test developed by Ames and his coworkers has been widely used in the evaluation of chemicals for genotoxic potential. Although the value of this assay is well recognized, there have been no comprehensive studies on the interlaboratory reproducibility of the method using a standardized protocol. A program was therefore initiated to compare the results obtained in four laboratories from testing a series of coded mutagens and nonmutagens using a standardized protocol. Additional objectives of this study were to compare male Fisher 344 rat, B6C3F1 mouse, and Syrian hamster liver S-9 preparations for the activation of chemicals; to compare Aroclor 1254-induced liver S-9 from all three species with the corresponding non-induced liver S-9's; and to compare the response of Escherichia coli WP-2 uvrA with the Salmonella typhimurium tester strains recommended by Ames. Since a primary use of in vitro microbial mutagenesis tests is the identification of potential carcinogens by their mutagenicity, the authors decided to compare the animal species and strains used by the National Cancer Institute/National Toxicology Program (NCI/NTP) for animal carcinogenicity studies

  10. Phase Transition in Protocols Minimizing Work Fluctuations

    Science.gov (United States)

    Solon, Alexandre P.; Horowitz, Jordan M.

    2018-05-01

    For two canonical examples of driven mesoscopic systems—a harmonically trapped Brownian particle and a quantum dot—we numerically determine the finite-time protocols that optimize the compromise between the standard deviation and the mean of the dissipated work. In the case of the oscillator, we observe a collection of protocols that smoothly trade off between average work and its fluctuations. However, for the quantum dot, we find that as we shift the weight of our optimization objective from average work to work standard deviation, there is an analog of a first-order phase transition in protocol space: two distinct protocols exchange global optimality with mixed protocols akin to phase coexistence. As a result, the two types of protocols possess qualitatively different properties and remain distinct even in the infinite duration limit: optimal-work-fluctuation protocols never coalesce with the minimal-work protocols, which therefore never become quasistatic.

  11. RadNet: Open network protocol for radiation data

    International Nuclear Information System (INIS)

    Rees, B.; Olson, K.; Beckes-Talcott, J.; Kadner, S.; Wenderlich, T.; Hoy, M.; Doyle, W.; Koskelo, M.

    1998-01-01

    Safeguards instrumentation is increasingly being incorporated into remote monitoring applications. In the past, vendors of radiation monitoring instruments typically provided the tools for uploading the monitoring data to a host. However, the proprietary nature of communication protocols lends itself to increased computer support needs and increased installation expenses. As a result, a working group of suppliers and customers of radiation monitoring instruments defined an open network protocol for transferring packets on a local area network from radiation monitoring equipment to network hosts. The protocol was termed RadNet. While it is now primarily used for health physics instruments, RadNet's flexibility and strength make it ideal for remote monitoring of nuclear materials. The incorporation of standard, open protocols ensures that future work will not render present work obsolete, because RadNet utilizes standard Internet protocols and is itself a non-proprietary standard. The use of industry standards also simplifies the development and implementation of ancillary services, e.g. e-mail generation or even pager systems

  12. Is a New Protocol for Acute Lymphoblastic Leukemia Research or Standard Therapy?

    NARCIS (Netherlands)

    Dekking, SAS; van der Graaf, R; de Vries, Martine; Bierings, MB; van Delden, JJM; Kodish, Eric; Lantos, John

    2015-01-01

    In the United States, doctors generally develop new cancer chemotherapy for children by testing innovative chemotherapy protocols against existing protocols in prospective randomized trials. In the Netherlands, children with leukemia are treated by protocols that are agreed upon by the Dutch

  13. Agricultural Soil Spectral Response and Properties Assessment: Effects of Measurement Protocol and Data Mining Technique

    Directory of Open Access Journals (Sweden)

    Asa Gholizadeh

    2017-10-01

    Full Text Available Soil spectroscopy has been shown to be a fast, cost-effective, environmentally friendly, non-destructive, reproducible and repeatable analytical technique. Soil components, as well as types of instruments, protocols, sampling methods, sample preparation, spectral acquisition techniques and analytical algorithms have a combined influence on the final performance. Therefore, it is important to characterize these differences and to introduce an effective approach in order to minimize the technical factors that alter reflectance spectra and consequent prediction. To quantify this alteration, a joint project between Czech University of Life Sciences Prague (CULS) and Tel-Aviv University (TAU) was conducted to estimate Cox, pH-H2O, pH-KCl and selected forms of Fe and Mn. Two different soil spectral measurement protocols and two data mining techniques were used to examine seventy-eight soil samples from five agricultural areas in different parts of the Czech Republic. Spectral measurements at both laboratories were made using different ASD spectroradiometers. The CULS protocol was based on employing a contact probe (CP) spectral measurement scheme, while the TAU protocol was carried out using a CP measurement method, accompanied by the internal soil standard (ISS) procedure. Two spectral datasets, acquired from different protocols, were both analyzed using the partial least squares regression (PLSR) technique as well as PARACUDA II®, a new data mining engine for optimizing PLSR models. The results showed that spectra based on the CULS setup (non-ISS) demonstrated significantly higher albedo intensity and reflectance values relative to the TAU setup with ISS. However, the majority of statistics using the TAU protocol were not noticeably better than those for the CULS spectra. The paper also highlighted that under both measurement protocols, the PARACUDA II® engine proved to be a powerful tool for providing better results than PLSR. Such initiative is not only a way to
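
    As an illustration of the chemometric core of such studies, a minimal PLSR sketch on synthetic "spectra" is shown below (scikit-learn's PLSRegression stands in for the PLSR step; the PARACUDA II® engine itself is proprietary and not reproduced here, and all data are simulated):

```python
import numpy as np
from sklearn.cross_decomposition import PLSRegression
from sklearn.model_selection import cross_val_predict

# Minimal PLSR sketch on synthetic "spectra": 78 samples x 500 wavelengths,
# predicting a single soil property (e.g. Cox). Real work would use the
# measured reflectance spectra and reference laboratory values.
rng = np.random.default_rng(0)
X = rng.normal(size=(78, 500))                 # stand-in for reflectance spectra
true_coef = np.zeros(500)
true_coef[100:110] = 0.5                       # a few "informative" bands
y = X @ true_coef + rng.normal(scale=0.5, size=78)

pls = PLSRegression(n_components=8)
y_cv = cross_val_predict(pls, X, y, cv=10).ravel()

rmsecv = np.sqrt(np.mean((y - y_cv) ** 2))
r2 = 1 - np.sum((y - y_cv) ** 2) / np.sum((y - y.mean()) ** 2)
print(f"RMSECV = {rmsecv:.3f}, cross-validated R^2 = {r2:.3f}")
```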

  14. Standardization and optimization of fluorescence in situ hybridization (FISH) for HER-2 assessment in breast cancer: A single center experience

    Directory of Open Access Journals (Sweden)

    Magdalena Bogdanovska-Todorovska

    2018-05-01

    Full Text Available Accurate assessment of human epidermal growth factor receptor 2 (HER-2) is crucial in selecting patients for targeted therapy. Commonly used methods for HER-2 testing are immunohistochemistry (IHC) and fluorescence in situ hybridization (FISH). Here we presented the implementation, optimization and standardization of two FISH protocols using breast cancer samples and assessed the impact of pre-analytical and analytical factors on HER-2 testing. Formalin fixed paraffin embedded (FFPE) tissue samples from 70 breast cancer patients were tested for HER-2 using the PathVysion™ HER-2 DNA Probe Kit and two different paraffin pretreatment kits, the Vysis/Abbott Paraffin Pretreatment Reagent Kit (40 samples) and the DAKO Histology FISH Accessory Kit (30 samples). The concordance between FISH and IHC results was determined. Pre-analytical and analytical factors (i.e., fixation, baking, digestion, and post-hybridization washing) affected the efficiency and quality of hybridization. The overall hybridization success in our study was 98.6% (69/70); the failure rate was 1.4%. The DAKO pretreatment kit was more time-efficient and resulted in more uniform signals that were easier to interpret, compared to the Vysis/Abbott kit. The overall concordance between IHC and FISH was 84.06%, kappa coefficient 0.5976 (p < 0.0001). The greatest discordance (82%) between IHC and FISH was observed in the IHC 2+ group. A standardized FISH protocol for HER-2 assessment, with high hybridization efficiency, is necessary due to variability in tissue processing and individual tissue characteristics. Differences in the pre-analytical and analytical steps can affect the hybridization quality and efficiency. The use of the DAKO pretreatment kit is time-saving and cost-effective.
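
    The IHC/FISH concordance and kappa coefficient reported above are standard agreement statistics; a small sketch with invented paired calls shows how they are typically computed:

```python
from sklearn.metrics import cohen_kappa_score

# Invented paired calls for illustration: IHC and FISH results recoded to the
# same positive/negative scale for the agreement analysis.
ihc  = ["pos", "pos", "neg", "neg", "pos", "neg", "pos", "neg", "neg", "pos"]
fish = ["pos", "neg", "neg", "neg", "pos", "neg", "pos", "neg", "pos", "pos"]

agreement = sum(a == b for a, b in zip(ihc, fish)) / len(ihc)
kappa = cohen_kappa_score(ihc, fish)
print(f"overall concordance = {agreement:.2%}, Cohen's kappa = {kappa:.3f}")
```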

  15. Net analyte signal standard addition method for simultaneous determination of sulphadiazine and trimethoprim in bovine milk and veterinary medicines.

    Science.gov (United States)

    Hajian, Reza; Mousavi, Esmat; Shams, Nafiseh

    2013-06-01

    The net analyte signal standard addition method has been used for the simultaneous determination of sulphadiazine and trimethoprim by spectrophotometry in some bovine milk and veterinary medicines. The method combines the advantages of the standard addition method with the net analyte signal concept, which enables the extraction of information concerning a certain analyte from spectra of multi-component mixtures. This method has some advantages, such as the use of the full spectrum; it therefore does not require a separate calibration and prediction step, and only a few measurements are required for the determination. Cloud point extraction, based on the phenomenon of solubilisation, was used for the extraction of sulphadiazine and trimethoprim from bovine milk. It is based on the induction of micellar organised media by using Triton X-100 as an extraction solvent. At the optimum conditions, the norm of the NAS vectors increased linearly with concentration in the range of 1.0-150.0 μmol L-1 for both sulphadiazine and trimethoprim. The limits of detection (LOD) for sulphadiazine and trimethoprim were 0.86 and 0.92 μmol L-1, respectively. Copyright © 2012 Elsevier Ltd. All rights reserved.
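
    A simplified sketch of the standard-addition half of the method is given below (single response channel, invented data); in the NAS variant described above, the raw signal would be replaced by the norm of the net analyte signal vector extracted from the full spectra:

```python
import numpy as np

# Classical standard-addition sketch (single response channel, invented data).
added = np.array([0.0, 20.0, 40.0, 60.0, 80.0])        # µmol/L of standard added
signal = np.array([0.210, 0.305, 0.401, 0.498, 0.590])  # measured response

slope, intercept = np.polyfit(added, signal, 1)
c_unknown = intercept / slope        # |x-intercept| = analyte concentration
print(f"estimated analyte concentration ≈ {c_unknown:.1f} µmol/L")
```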

  16. Performance Comparison of Wireless Sensor Network Standard Protocols in an Aerospace Environment: ISA100.11a and ZigBee Pro

    Science.gov (United States)

    Wagner, Raymond S.; Barton, Richard J.

    2011-01-01

    Standards-based wireless sensor network (WSN) protocols are promising candidates for spacecraft avionics systems, offering unprecedented instrumentation flexibility and expandability. Ensuring reliable data transport is key, however, when migrating from wired to wireless data gathering systems. In this paper, we conduct a rigorous laboratory analysis of the relative performances of the ZigBee Pro and ISA100.11a protocols in a representative crewed aerospace environment. Since both operate in the 2.4 GHz radio frequency (RF) band shared by systems such as Wi-Fi, they are subject at times to potentially debilitating RF interference. We compare goodput (application-level throughput) achievable by both under varying levels of 802.11g Wi-Fi traffic. We conclude that while the simpler, more inexpensive ZigBee Pro protocol performs well under moderate levels of interference, the more complex and costly ISA100.11a protocol is needed to ensure reliable data delivery under heavier interference. This paper represents the first published, rigorous analysis of WSN protocols in an aerospace environment that we are aware of and the first published head-to-head comparison of ZigBee Pro and ISA100.11a.

  17. Moving standard deviation and moving sum of outliers as quality tools for monitoring analytical precision.

    Science.gov (United States)

    Liu, Jiakai; Tan, Chin Hon; Badrick, Tony; Loh, Tze Ping

    2018-02-01

    An increase in analytical imprecision (expressed as CVa) can introduce additional variability (i.e. noise) to the patient results, which poses a challenge to the optimal management of patients. Relatively little work has been done to address the need for continuous monitoring of analytical imprecision. Through numerical simulations, we describe the use of the moving standard deviation (movSD) and a recently described moving sum of outlier (movSO) patient results as means for detecting increased analytical imprecision, and compare their performances against internal quality control (QC) and the average of normal (AoN) approaches. The power of detecting an increase in CVa is suboptimal under routine internal QC procedures. The AoN technique almost always had the highest average number of patient results affected before error detection (ANPed), indicating that it had generally the worst capability for detecting an increased CVa. On the other hand, the movSD and movSO approaches were able to detect an increased CVa at significantly lower ANPed, particularly for measurands that displayed a relatively small ratio of biological variation to CVa. Conclusion: the movSD and movSO approaches are effective in detecting an increase in CVa for high-risk measurands with small biological variation. Their performance is relatively poor when the biological variation is large. However, the clinical risk of an increase in analytical imprecision is attenuated for these measurands, as an increased analytical imprecision will only add marginally to the total variation and is less likely to impact clinical care. Copyright © 2017 The Canadian Society of Clinical Chemists. Published by Elsevier Inc. All rights reserved.
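
    A minimal sketch of the movSD idea, with simulated patient results whose analytical imprecision doubles partway through, is shown below; the window length and control limit are illustrative choices, not the values used in the cited study:

```python
import numpy as np

def moving_sd(results, window=200):
    """Moving standard deviation over consecutive patient results.

    A persistent rise above the level observed during stable operation is
    flagged as a possible increase in analytical imprecision (CVa).
    """
    results = np.asarray(results, dtype=float)
    out = np.full(results.shape, np.nan)
    for i in range(window - 1, len(results)):
        out[i] = results[i - window + 1 : i + 1].std(ddof=1)
    return out

# Simulated measurand: stable imprecision for 2000 results, then it doubles.
rng = np.random.default_rng(1)
baseline = rng.normal(5.0, 0.25, 2000)
degraded = rng.normal(5.0, 0.50, 1000)
series = np.concatenate([baseline, degraded])

msd = moving_sd(series, window=200)
threshold = 1.5 * np.nanmean(msd[:2000])     # illustrative control limit
first_alarm = int(np.argmax(msd > threshold))
print(f"first alarm at result #{first_alarm} (shift introduced at #2000)")
```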

  18. VALIDATION OF ANALYTICAL METHODS AND INSTRUMENTATION FOR BERYLLIUM MEASUREMENT: REVIEW AND SUMMARY OF AVAILABLE GUIDES, PROCEDURES, AND PROTOCOLS

    Energy Technology Data Exchange (ETDEWEB)

    Ekechukwu, A

    2009-05-27

    Method validation is the process of evaluating whether an analytical method is acceptable for its intended purpose. For pharmaceutical methods, guidelines from the United States Pharmacopeia (USP), International Conference on Harmonisation (ICH), and the United States Food and Drug Administration (USFDA) provide a framework for performing such validations. In general, methods for regulatory compliance must include studies on specificity, linearity, accuracy, precision, range, detection limit, quantitation limit, and robustness. Elements of these guidelines are readily adapted to the issue of validation for beryllium sampling and analysis. This document provides a listing of available sources which can be used to validate analytical methods and/or instrumentation for beryllium determination. A literature review was conducted of available standard methods and publications used for method validation and/or quality control. A comprehensive listing of the articles, papers and books reviewed is given in the Appendix. Available validation documents and guides are listed therein; each has a brief description of application and use. In the referenced sources, there are varying approaches to validation and varying descriptions of the validation process at different stages in method development. This discussion focuses on validation and verification of fully developed methods and instrumentation that have been offered up for use or approval by other laboratories or official consensus bodies such as ASTM International, the International Standards Organization (ISO) and the Association of Official Analytical Chemists (AOAC). This review was conducted as part of a collaborative effort to investigate and improve the state of validation for measuring beryllium in the workplace and the environment. Documents and publications from the United States and Europe are included. Unless otherwise specified, all referenced documents were published in English.

  19. Shortened screening method for phosphorus fractionation in sediments: A complementary approach to the standards, measurements and testing harmonised protocol

    International Nuclear Information System (INIS)

    Pardo, Patricia; Rauret, Gemma; Lopez-Sanchez, Jose Fermin

    2004-01-01

    The SMT protocol, a sediment phosphorus fractionation method harmonised and validated in the frame of the standards, measurements and testing (SMT) programme (European Commission), establishes five fractions of phosphorus according to their extractability. The determination of phosphate extracted is carried out spectrophotometrically. This protocol has been applied to 11 sediments of different origin and characteristics and the phosphorus extracted in each fraction was determined not only by UV-Vis spectrophotometry, but also by inductively coupled plasma-atomic emission spectrometry. The use of these two determination techniques allowed the differentiation between phosphorus that was present in the extracts as soluble reactive phosphorus and as total phosphorus. From the comparison of data obtained with both determination techniques a shortened screening method, for a quick evaluation of the magnitude and importance of the fractions given by the SMT protocol, is proposed and validated using two certified reference materials

  20. A Simple XML Producer-Consumer Protocol

    Science.gov (United States)

    Smith, Warren; Gunter, Dan; Quesnel, Darcy; Biegel, Bryan (Technical Monitor)

    2001-01-01

    There are many different projects from government, academia, and industry that provide services for delivering events in distributed environments. The problem with these event services is that they are not general enough to support all uses and they speak different protocols so that they cannot interoperate. We require such interoperability when we, for example, wish to analyze the performance of an application in a distributed environment. Such an analysis might require performance information from the application, computer systems, networks, and scientific instruments. In this work we propose and evaluate a standard XML-based protocol for the transmission of events in distributed systems. One recent trend in government and academic research is the development and deployment of computational grids. Computational grids are large-scale distributed systems that typically consist of high-performance compute, storage, and networking resources. Examples of such computational grids are the DOE Science Grid, the NASA Information Power Grid (IPG), and the NSF Partnerships for Advanced Computing Infrastructure (PACIs). The major effort to deploy these grids is in the area of developing the software services to allow users to execute applications on these large and diverse sets of resources. These services include security, execution of remote applications, managing remote data, access to information about resources and services, and so on. There are several toolkits for providing these services such as Globus, Legion, and Condor. As part of these efforts to develop computational grids, the Global Grid Forum is working to standardize the protocols and APIs used by various grid services. This standardization will allow interoperability between the client and server software of the toolkits that are providing the grid services. The goal of the Performance Working Group of the Grid Forum is to standardize protocols and representations related to the storage and distribution of

  1. Automated extraction protocol for quantification of SARS-Coronavirus RNA in serum: an evaluation study

    Directory of Open Access Journals (Sweden)

    Lui Wing-bong

    2006-02-01

    Full Text Available Abstract Background We have previously developed a test for the diagnosis and prognostic assessment of the severe acute respiratory syndrome (SARS) based on the detection of the SARS-coronavirus RNA in serum by real-time quantitative reverse transcriptase polymerase chain reaction (RT-PCR). In this study, we evaluated the feasibility of automating the serum RNA extraction procedure in order to increase the throughput of the assay. Methods An automated nucleic acid extraction platform using the MagNA Pure LC instrument (Roche Diagnostics) was evaluated. We developed a modified protocol in compliance with the recommended biosafety guidelines from the World Health Organization based on the use of the MagNA Pure total nucleic acid large volume isolation kit for the extraction of SARS-coronavirus RNA. The modified protocol was compared with a column-based extraction kit (QIAamp viral RNA mini kit, Qiagen) for quantitative performance, analytical sensitivity and precision. Results The newly developed automated protocol was shown to be free from carry-over contamination and have comparable performance with other standard protocols and kits designed for the MagNA Pure LC instrument. However, the automated method was found to be less sensitive, less precise and led to consistently lower serum SARS-coronavirus concentrations when compared with the column-based extraction method. Conclusion As the diagnostic efficiency and prognostic value of the serum SARS-CoV RNA RT-PCR test is critically associated with the analytical sensitivity and quantitative performance contributed both by the RNA extraction and RT-PCR components of the test, we recommend the use of the column-based manual RNA extraction method.

  2. Big data analytics : predicting traffic flow regimes from simulated connected vehicle messages using data analytics and machine learning.

    Science.gov (United States)

    2016-12-25

    The key objectives of this study were to: 1. Develop advanced analytical techniques that make use of a dynamically configurable connected vehicle message protocol to predict traffic flow regimes in near-real time in a virtual environment and examine ...

  3. Examination of fast reactor fuels, FBR analytical quality assurance standards and methods, and analytical methods development: irradiation tests. Progress report, April 1--June 30, 1976, and FY 1976

    International Nuclear Information System (INIS)

    Baker, R.D.

    1976-08-01

    Characterization of unirradiated and irradiated LMFBR fuels by analytical chemistry methods will continue, and additional methods will be modified and mechanized for hot cell application. Macro- and microexaminations will be made on fuel and cladding using the shielded electron microprobe, emission spectrograph, radiochemistry, gamma scanner, mass spectrometers, and other analytical facilities. New capabilities will be developed in gamma scanning, analyses to assess spatial distributions of fuel and fission products, mass spectrometric measurements of burnup and fission gas constituents and other chemical analyses. Microstructural analyses of unirradiated and irradiated materials will continue using optical and electron microscopy and autoradiographic and x-ray techniques. Analytical quality assurance standards tasks are designed to assure the quality of the chemical characterizations necessary to evaluate reactor components relative to specifications. Tasks include: (1) the preparation and distribution of calibration materials and quality control samples for use in quality assurance surveillance programs, (2) the development of and the guidance in the use of quality assurance programs for sampling and analysis, (3) the development of improved methods of analysis, and (4) the preparation of continuously updated analytical method manuals. Reliable analytical methods development for the measurement of burnup, oxygen-to-metal (O/M) ratio, and various gases in irradiated fuels is described

  4. LHCb: FPGA based data-flow injection module at 10 Gbit/s reading data from network exported storage and using standard protocols

    CERN Multimedia

    Lemouzy, B; Garnier, J-C

    2010-01-01

    The goal of the LHCb readout upgrade is to speed up the DAQ to 40 MHz. Such a DAQ system will certainly employ 10 Gigabit or similar technologies and might also need new networking protocols such as a customized, light-weight TCP or more specialised protocols. A test module is being implemented, which integrates into the existing LHCb infrastructure. It is a multiple 10-Gigabit traffic generator, driven by a Stratix IV FPGA, which is flexible enough to either generate LHCb's raw data packets internally or read them from external storage via the network. For reading the data we have implemented the light-weight industry-standard protocol ATA over Ethernet (AoE), and we present an outlook on using a filesystem on these network-exported disk drives.

  5. CPM Test-Retest Reliability: "Standard" vs "Single Test-Stimulus" Protocols.

    Science.gov (United States)

    Granovsky, Yelena; Miller-Barmak, Adi; Goldstein, Oren; Sprecher, Elliot; Yarnitsky, David

    2016-03-01

    Assessment of pain inhibitory mechanisms using conditioned pain modulation (CPM) is relevant clinically in prediction of pain and analgesic efficacy. Our objective is to provide necessary estimates of intersession CPM reliability, to enable transformation of the CPM paradigm into a clinical tool. Two cohorts of young healthy subjects (N = 65) participated in two dual-session studies. In Study I, a Bath-Thermode CPM protocol was used, with hot water immersion and contact heat as conditioning- and test-stimuli, respectively, in a classical parallel CPM design introducing the test-stimulus first, and then the conditioning- and repeated test-stimuli in parallel. Study II consisted of two CPM protocols: 1) Two-Thermodes, one for each of the stimuli, in the same parallel design as above, and 2) a single test-stimulus (STS) protocol with a single administration of a contact heat test-stimulus, partially overlapped in time by a remote shorter contact heat as conditioning stimulus. Test-retest reliability was assessed within 3-7 days. The STS-CPM had superior reliability (intraclass correlation ICC(2,1) = 0.59) over the Bath-Thermode (ICC(2,1) = 0.34) or Two-Thermodes (ICC(2,1) = 0.21) protocols. The hand immersion conditioning pain had higher reliability than thermode pain (ICC(2,1) = 0.76 vs ICC(2,1) = 0.16). Conditioned test-stimulus pain scores were of good (ICC(2,1) = 0.62) or fair (ICC(2,1) = 0.43) reliability for the Bath-Thermode and the STS, respectively, but not for the Two-Thermodes protocol (ICC(2,1) = 0.20). The newly developed STS-CPM paradigm was more reliable than the other CPM protocols tested here, and should be further investigated for its clinical relevance. It appears that a large contact size of the conditioning-stimulus and the use of a single rather than dual test-stimulus pain contribute to augmentation of CPM reliability. © 2015 American Academy of Pain Medicine. All rights reserved. For permissions, please e
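
    The reliability metric quoted above, ICC(2,1) (two-way random effects, absolute agreement, single measure), can be computed from an n-subjects × k-sessions matrix of scores; a short sketch with invented test-retest data:

```python
import numpy as np

def icc_2_1(X):
    """ICC(2,1): two-way random effects, absolute agreement, single measure
    (Shrout & Fleiss), for an n-subjects x k-sessions matrix X."""
    X = np.asarray(X, dtype=float)
    n, k = X.shape
    grand = X.mean()
    ms_rows = k * np.sum((X.mean(axis=1) - grand) ** 2) / (n - 1)
    ms_cols = n * np.sum((X.mean(axis=0) - grand) ** 2) / (k - 1)
    ss_err = np.sum((X - X.mean(axis=1, keepdims=True)
                       - X.mean(axis=0, keepdims=True) + grand) ** 2)
    ms_err = ss_err / ((n - 1) * (k - 1))
    return (ms_rows - ms_err) / (ms_rows + (k - 1) * ms_err
                                 + k * (ms_cols - ms_err) / n)

# Invented test-retest data: 8 subjects' CPM magnitudes on two days.
scores = np.array([[25, 22], [10, 14], [30, 28], [5, 9],
                   [18, 15], [27, 30], [12, 10], [20, 24]])
print(f"ICC(2,1) = {icc_2_1(scores):.2f}")
```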

  6. Impact of a standardized nurse observation protocol including MEWS after Intensive Care Unit discharge.

    Science.gov (United States)

    De Meester, K; Das, T; Hellemans, K; Verbrugghe, W; Jorens, P G; Verpooten, G A; Van Bogaert, P

    2013-02-01

    Analysis of in-hospital mortality after serious adverse events (SAE's) in our hospital showed the need for more frequent observation in medical and surgical wards. We hypothesized that the incidence of SAE's could be decreased by introducing a standard nurse observation protocol. To investigate the effect of a standard nurse observation protocol implementing the Modified Early Warning Score (MEWS) and a color graphic observation chart. Pre- and post-intervention study by analysis of patients' records for a 5-day period after Intensive Care Unit (ICU) discharge to 14 medical and surgical wards before (n=530) and after (n=509) the intervention. For the total study population the mean Patient Observation Frequency Per Nursing Shift (POFPNS) during the 5-day period after ICU discharge increased from .9993 (95% C.I. .9637-1.0350) in the pre-intervention period to 1.0732 (95% C.I. 1.0362-1.1101) (p=.005) in the post-intervention period. There was an increased risk of a SAE in patients with MEWS 4 or higher in the present nursing shift (HR 8.25; 95% C.I. 2.88-23.62) and the previous nursing shift (HR 12.83; 95% C.I. 4.45-36.99). There was an absolute risk reduction for SAE's within 120 h after ICU discharge of 2.2% (95% C.I. -0.4-4.67%) from 5.7% to 3.5%. The intervention had a positive impact on the observation frequency. MEWS had a predictive value for SAE's in patients after ICU discharge. The drop in SAE's was substantial but did not reach statistical significance. Copyright © 2012 Elsevier Ireland Ltd. All rights reserved.

  7. Development of standardized bioassay protocols for the toxicity assessment of waste, manufactured products, and effluents in Latin America: Venezuela, a Case Study

    International Nuclear Information System (INIS)

    Rodriquez-Grau, J.

    1993-01-01

    The present status of the toxicity assessment of industrial products in Latin America is well below North America/EC standards. As an example, most of Latin America regulatory laws regarding effluent discharge are still based upon concentration limits of certain major pollutants, and BOD/COD measurements; no reference is made to the necessity of aquatic bioassay toxicity data. Aware of this imperative need, the Venezuelan Petroleum Industry (PDVSA), through its R&D Corporative branch (INTEVEP), gave priority to the development of standardized acute/sublethal toxicity test protocols as sound means of evaluating their products and wastes. Throughout this presentation, the Venezuelan case will be studied, showing strategies undertaken to accelerate protocol development. Results will show the assessment of 14 different protocols encompassing a variety of species of aquatic/terrestrial organisms, and a series of toxicity test endpoints including mortality, reproductive, biological and immunological measurements, most of which are currently in use or being developed. These protocols have already yielded useful results in numerous cases where toxicity assessment was required, including evaluations of effluent, oil dispersants, drilling fluids, toxic wastes, fossil fuels and newly developed products. The Venezuelan case demonstrates that the integration of Industry, Academia and Government, which is an essential part of SETAC's philosophy, is absolutely necessary for the successful advancement of environmental scientific/regulatory issues

  8. Design and Analysis of Optimization Algorithms to Minimize Cryptographic Processing in BGP Security Protocols.

    Science.gov (United States)

    Sriram, Vinay K; Montgomery, Doug

    2017-07-01

    The Internet is subject to attacks due to vulnerabilities in its routing protocols. One proposed approach to attain greater security is to cryptographically protect network reachability announcements exchanged between Border Gateway Protocol (BGP) routers. This study proposes and evaluates the performance and efficiency of various optimization algorithms for validation of digitally signed BGP updates. In particular, this investigation focuses on the BGPSEC (BGP with SECurity extensions) protocol, currently under consideration for standardization in the Internet Engineering Task Force. We analyze three basic BGPSEC update processing algorithms: Unoptimized, Cache Common Segments (CCS) optimization, and Best Path Only (BPO) optimization. We further propose and study cache management schemes to be used in conjunction with the CCS and BPO algorithms. The performance metrics used in the analyses are: (1) routing table convergence time after BGPSEC peering reset or router reboot events and (2) peak-second signature verification workload. Both analytical modeling and detailed trace-driven simulation were performed. Results show that the BPO algorithm is 330% to 628% faster than the unoptimized algorithm for routing table convergence in a typical Internet core-facing provider edge router.
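
    A toy sketch of the Cache Common Segments idea is given below: the signature check is an intentionally slow stub rather than real BGPsec cryptography, and the cache simply ensures that signed path segments shared by many updates are verified only once:

```python
import hashlib
from functools import lru_cache

def _expensive_signature_check(segment: bytes) -> bool:
    # Stand-in for an ECDSA verification; we just burn some CPU
    # deterministically so the caching effect is visible.
    digest = segment
    for _ in range(20000):
        digest = hashlib.sha256(digest).digest()
    return True

@lru_cache(maxsize=100000)
def verify_segment_cached(segment: bytes) -> bool:
    """Cache-Common-Segments idea: identical signed segments appearing in many
    updates are verified once and the result is reused."""
    return _expensive_signature_check(segment)

def validate_update(signed_segments):
    return all(verify_segment_cached(seg) for seg in signed_segments)

# Two updates sharing a long common prefix of signed AS-path segments:
update_a = [f"AS{64500 + i}|sig".encode() for i in range(6)]
update_b = update_a[:5] + [b"AS65010|sig"]
print(validate_update(update_a), validate_update(update_b))
print(verify_segment_cached.cache_info())   # shows the hits from the shared prefix
```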

  9. Evaluation of a reduced centrifugation time and higher centrifugal force on various general chemistry and immunochemistry analytes in plasma and serum.

    Science.gov (United States)

    Møller, Mette F; Søndergaard, Tove R; Kristensen, Helle T; Münster, Anna-Marie B

    2017-09-01

    Background Centrifugation of blood samples is an essential preanalytical step in the clinical biochemistry laboratory. Centrifugation settings are often altered to optimize sample flow and turnaround time. Few studies have addressed the effect of altering centrifugation settings on analytical quality, and almost all studies have been done using collection tubes with gel separator. Methods In this study, we compared a centrifugation time of 5 min at 3000 × g to a standard protocol of 10 min at 2200 × g. Nine selected general chemistry and immunochemistry analytes and interference indices were studied in lithium heparin plasma tubes and serum tubes without gel separator. Results were evaluated using mean bias, difference plots and coefficient of variation, compared with maximum allowable bias and coefficient of variation used in laboratory routine quality control. Results For all analytes except lactate dehydrogenase, the results were within the predefined acceptance criteria, indicating that the analytical quality was not compromised. Lactate dehydrogenase showed higher values after centrifugation for 5 min at 3000 × g, mean bias was 6.3 ± 2.2% and the coefficient of variation was 5%. Conclusions We found that a centrifugation protocol of 5 min at 3000 × g can be used for the general chemistry and immunochemistry analytes studied, with the possible exception of lactate dehydrogenase, which requires further assessment.
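
    The acceptance statistics described above (mean bias against a maximum allowable bias, and imprecision expressed as a coefficient of variation) can be sketched as follows with invented paired results and replicates:

```python
import numpy as np

# Invented illustration of the acceptance statistics described above.
# Paired patient samples measured after the standard spin (10 min, 2200 x g)
# and the shortened spin (5 min, 3000 x g):
standard  = np.array([180, 195, 210, 175, 188, 202, 169, 191], dtype=float)
shortened = np.array([190, 208, 224, 186, 200, 214, 180, 203], dtype=float)

bias_pct = (shortened - standard) / standard * 100
print(f"mean bias = {bias_pct.mean():.1f}% ± {bias_pct.std(ddof=1):.1f}%")

# Replicate measurements of one pooled sample under the shortened protocol,
# used to estimate the imprecision (CV) of the new setting:
replicates = np.array([198, 202, 195, 205, 199, 201], dtype=float)
cv_pct = replicates.std(ddof=1) / replicates.mean() * 100
print(f"CV = {cv_pct:.1f}%")

# Both figures would then be checked against the maximum allowable bias and CV
# used in routine internal quality control before adopting the faster spin.
```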

  10. Connectivity-Based Reliable Multicast MAC Protocol for IEEE 802.11 Wireless LANs

    Directory of Open Access Journals (Sweden)

    Woo-Yong Choi

    2009-01-01

    Full Text Available We propose an efficient reliable multicast MAC protocol based on the connectivity information among the recipients. Enhancing the BMMM (Batch Mode Multicast MAC) protocol, the reliable multicast MAC protocol significantly reduces the RAK (Request for ACK) frame transmissions in a reasonable computational time and enhances the MAC performance. Through analytical performance analysis, the throughputs of the BMMM protocol and our proposed MAC protocol are derived. Numerical examples show that our proposed MAC protocol increases the reliable multicast MAC performance for IEEE 802.11 wireless LANs.

  11. A Systematic Evaluation of Blood Serum and Plasma Pre-Analytics for Metabolomics Cohort Studies

    Directory of Open Access Journals (Sweden)

    Elodie Jobard

    2016-12-01

    Full Text Available The recent thriving development of biobanks and associated high-throughput phenotyping studies requires the elaboration of large-scale approaches for monitoring biological sample quality and compliance with standard protocols. We present a metabolomic investigation of human blood samples that delineates pitfalls and guidelines for the collection, storage and handling procedures for serum and plasma. A series of eight pre-processing technical parameters is systematically investigated along variable ranges commonly encountered across clinical studies. While metabolic fingerprints, as assessed by nuclear magnetic resonance, are not significantly affected by altered centrifugation parameters or delays between sample pre-processing (blood centrifugation) and storage, our metabolomic investigation highlights that both the delay and storage temperature between blood draw and centrifugation are the primary parameters impacting serum and plasma metabolic profiles. Storing the blood drawn at 4 °C is shown to be a reliable routine to confine variability associated with idle time prior to sample pre-processing. Based on their fine sensitivity to pre-analytical parameters and protocol variations, metabolic fingerprints could be exploited as valuable ways to determine compliance with standard procedures and quality assessment of blood samples within large multi-omic clinical and translational cohort studies.

  12. Collaborative Visual Analytics: A Health Analytics Approach to Injury Prevention

    Directory of Open Access Journals (Sweden)

    Samar Al-Hajj

    2017-09-01

    Full Text Available Background: Accurate understanding of complex health data is critical in order to deal with wicked health problems and make timely decisions. Wicked problems refer to ill-structured and dynamic problems that combine multidimensional elements, which often preclude the conventional problem solving approach. This pilot study introduces visual analytics (VA) methods to multi-stakeholder decision-making sessions about child injury prevention; Methods: Inspired by the Delphi method, we introduced a novel methodology—group analytics (GA). GA was pilot-tested to evaluate the impact of collaborative visual analytics on facilitating problem solving and supporting decision-making. We conducted two GA sessions. Collected data included stakeholders’ observations, audio and video recordings, questionnaires, and follow up interviews. The GA sessions were analyzed using the Joint Activity Theory protocol analysis methods; Results: The GA methodology triggered the emergence of ‘common ground’ among stakeholders. This common ground evolved throughout the sessions to enhance stakeholders’ verbal and non-verbal communication, as well as coordination of joint activities and ultimately collaboration on problem solving and decision-making; Conclusions: Understanding complex health data is necessary for informed decisions. Equally important, in this case, is the use of the group analytics methodology to achieve ‘common ground’ among diverse stakeholders about health data and their implications.

  13. Collaborative Visual Analytics: A Health Analytics Approach to Injury Prevention.

    Science.gov (United States)

    Al-Hajj, Samar; Fisher, Brian; Smith, Jennifer; Pike, Ian

    2017-09-12

    Background: Accurate understanding of complex health data is critical in order to deal with wicked health problems and make timely decisions. Wicked problems refer to ill-structured and dynamic problems that combine multidimensional elements, which often preclude the conventional problem solving approach. This pilot study introduces visual analytics (VA) methods to multi-stakeholder decision-making sessions about child injury prevention; Methods: Inspired by the Delphi method, we introduced a novel methodology-group analytics (GA). GA was pilot-tested to evaluate the impact of collaborative visual analytics on facilitating problem solving and supporting decision-making. We conducted two GA sessions. Collected data included stakeholders' observations, audio and video recordings, questionnaires, and follow up interviews. The GA sessions were analyzed using the Joint Activity Theory protocol analysis methods; Results: The GA methodology triggered the emergence of 'common ground' among stakeholders. This common ground evolved throughout the sessions to enhance stakeholders' verbal and non-verbal communication, as well as coordination of joint activities and ultimately collaboration on problem solving and decision-making; Conclusions: Understanding complex health data is necessary for informed decisions. Equally important, in this case, is the use of the group analytics methodology to achieve 'common ground' among diverse stakeholders about health data and their implications.

  14. Network Coding to Enhance Standard Routing Protocols in Wireless Mesh Networks

    DEFF Research Database (Denmark)

    Pahlevani, Peyman; Roetter, Daniel Enrique Lucani; Fitzek, Frank

    2013-01-01

    This paper introduces a design and simulation of a locally optimized network coding protocol, called PlayNCool, for wireless mesh networks. PlayNCool is easy to implement and compatible with existing routing protocols and devices. This allows the system to gain from network coding capabilities i...

  15. Optimizing the high-resolution manometry (HRM) study protocol.

    Science.gov (United States)

    Patel, A; Ding, A; Mirza, F; Gyawali, C P

    2015-02-01

    Intolerance of the esophageal manometry catheter may prolong high-resolution manometry (HRM) studies and increase patient distress. We assessed the impact of obtaining the landmark phase at the end of the study when the patient has acclimatized to the HRM catheter. 366 patients (mean age 55.4 ± 0.8 years, 62.0% female) undergoing esophageal HRM over a 1-year period were studied. The standard protocol consisted of the landmark phase, ten 5 mL water swallows 20-30 s apart, and multiple rapid swallows, in which 4-6 swallows of 2 mL each were administered in rapid succession. The modified protocol consisted of the landmark phase at the end of the study after test swallows. Study duration, technical characteristics, indications, and motor findings were compared between standard and modified protocols. Of the 366 patients, 89.6% underwent the standard protocol (study duration 12.9 ± 0.3 min). In the 10.4% with poor catheter tolerance undergoing the modified protocol, study duration was significantly longer (15.6 ± 1.0 min, p = 0.004) despite similar duration of study maneuvers. Only elevated upper esophageal sphincter basal pressures at the beginning of the study segregated modified protocol patients. The 95th percentile time to landmark phase in the standard protocol patients was 6.1 min; as many as 31.4% of modified protocol patients could not obtain their first study maneuver within this period (p = 0.0003). Interpretation was not impacted by shifting the landmark phase to the end of the study. Modification of the HRM study protocol with the landmark phase obtained at the end of the study optimizes study duration without compromising quality. © 2014 John Wiley & Sons Ltd.

  16. Pre-analytical and analytical factors influencing Alzheimer's disease cerebrospinal fluid biomarker variability.

    Science.gov (United States)

    Fourier, Anthony; Portelius, Erik; Zetterberg, Henrik; Blennow, Kaj; Quadrio, Isabelle; Perret-Liaudet, Armand

    2015-09-20

    A panel of cerebrospinal fluid (CSF) biomarkers including total Tau (t-Tau), phosphorylated Tau protein at residue 181 (p-Tau) and β-amyloid peptides (Aβ42 and Aβ40), is frequently used as an aid in Alzheimer's disease (AD) diagnosis for young patients with cognitive impairment, for predicting prodromal AD in mild cognitive impairment (MCI) subjects, for AD discrimination in atypical clinical phenotypes and for inclusion/exclusion and stratification of patients in clinical trials. Due to variability in absolute levels between laboratories, there is no consensus on medical cut-off value for the CSF AD signature. Thus, for full implementation of this core AD biomarker panel in clinical routine, this issue has to be solved. Variability can be explained both by pre-analytical and analytical factors. For example, the plastic tubes used for CSF collection and storage, the lack of reference material and the variability of the analytical protocols were identified as important sources of variability. The aim of this review is to highlight these pre-analytical and analytical factors and describe efforts done to counteract them in order to establish cut-off values for core CSF AD biomarkers. This review will give the current state of recommendations. Copyright © 2015. Published by Elsevier B.V.

  17. A simple protocol for the routine calibration of pH meters

    Directory of Open Access Journals (Sweden)

    A. FEDERMAN NETO

    2009-01-01

    Full Text Available

    A simplified laboratory protocol for the calibration of pH meters is described and tested. It is based on the use of two analytical primary buffer solutions, potassium hydrogen phthalate and Borax (sodium tetraborate decahydrate), of precisely known concentrations and pH. The solutions may be stored at room temperature for long periods, without decomposition, and used directly. The calibration of the meter can be checked with standard solutions of sodium dihydrogen phosphate, sodium carbonate, sodium benzoate, sodium salicylate or potassium oxalate. Methods for the purification of Borax and potassium chloride are also given, and a new method for the neutralization of 0.9% saline is suggested. Keywords: pH meters (calibration); saline (0.9%); pH standards; potassium biphthalate; Borax.
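    The two-point calibration described above amounts to fitting a straight line through the electrode readings obtained in the two primary buffers and then inverting that line for unknown samples. The sketch below illustrates the arithmetic only; the nominal 25 °C pH values (about 4.01 for potassium hydrogen phthalate, about 9.18 for Borax) and the millivolt readings are assumptions for the example, not values taken from the published protocol.

```python
# Minimal two-point pH meter calibration sketch (illustrative values only).
# Assumed nominal 25 degC pH of the primary buffers: potassium hydrogen
# phthalate ~ 4.01 and Borax (sodium tetraborate decahydrate) ~ 9.18.

def calibrate(e1_mv, ph1, e2_mv, ph2):
    """Return (slope, intercept) of the electrode response E = slope * pH + intercept."""
    slope = (e2_mv - e1_mv) / (ph2 - ph1)
    intercept = e1_mv - slope * ph1
    return slope, intercept

def read_ph(e_mv, slope, intercept):
    """Convert a measured potential (mV) into pH using the calibration line."""
    return (e_mv - intercept) / slope

if __name__ == "__main__":
    # Hypothetical readings: +177 mV in the phthalate buffer, -129 mV in Borax.
    slope, intercept = calibrate(177.0, 4.01, -129.0, 9.18)
    print(f"slope = {slope:.1f} mV per pH unit (ideal Nernstian ~ -59.2 mV/pH at 25 degC)")
    print(f"a check-buffer reading of 0 mV corresponds to pH {read_ph(0.0, slope, intercept):.2f}")
```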

  18. Biofluid infrared spectro-diagnostics: pre-analytical considerations for clinical applications.

    Science.gov (United States)

    Lovergne, L; Bouzy, P; Untereiner, V; Garnotel, R; Baker, M J; Thiéfin, G; Sockalingum, G D

    2016-06-23

    Several proof-of-concept studies on the vibrational spectroscopy of biofluids have demonstrated that the methodology has promising potential as a clinical diagnostic tool. However, these studies also show that there is a lack of a standardised protocol in sample handling and preparation prior to spectroscopic analysis. One of the most important sources of analytical errors is the pre-analytical phase. For the technique to be translated into clinics, it is clear that a very strict protocol needs to be established for such biological samples. This study focuses on some of the aspects of the pre-analytical phase in the development of the high-throughput Fourier Transform Infrared (FTIR) spectroscopy of some of the most common biofluids such as serum, plasma and bile. Pre-analytical considerations that can impact either the samples (solvents, anti-coagulants, freeze-thaw cycles…) and/or spectroscopic analysis (sample preparation such as drying, deposit methods, volumes, substrates, operators dependence…) and consequently the quality and the reproducibility of spectral data will be discussed in this report.

  19. Evaluation of Protocol Uniformity Concerning Laparoscopic Cholecystectomy in The Netherlands

    Science.gov (United States)

    Goossens, Richard H. M.; van Eijk, Daan J.; Lange, Johan F.

    2008-01-01

    Background Iatrogenic bile duct injury remains a current complication of laparoscopic cholecystectomy. One uniform and standardized protocol, based on the “critical view of safety” concept of Strasberg, should reduce the incidence of this complication. Furthermore, owing to the rapid development of minimally invasive surgery, technicians are becoming more frequently involved. To improve communication between the operating team and technicians, standardized actions should also be defined. The aim of this study was to compare existing protocols for laparoscopic cholecystectomy from various Dutch hospitals. Methods Fifteen Dutch hospitals were contacted for evaluation of their protocols for laparoscopic cholecystectomy. All evaluated protocols were divided into six steps and were compared accordingly. Results In total, 13 hospitals responded—5 academic hospitals, 5 teaching hospitals, 3 community hospitals—of which 10 protocols were usable for comparison. Concerning the trocar positions, only minor differences were found. The concept of “critical view of safety” was represented in just one protocol. Furthermore, the order of clipping and cutting the cystic artery and duct differed. Descriptions of instruments and apparatus were also inconsistent. Conclusions Present protocols differ too much to define a universal procedure among surgeons in The Netherlands. The authors propose one (inter)national standardized protocol, including standardized actions. This uniform standardized protocol has to be officially released and recommended by national scientific associations (e.g., the Dutch Society of Surgery) or international societies (e.g., European Association for Endoscopic Surgery and Society of American Gastrointestinal and Endoscopic Surgeons). The aim is to improve patient safety and professional communication, which are necessary for new developments. PMID:18224485

  20. Security of Semi-Device-Independent Random Number Expansion Protocols.

    Science.gov (United States)

    Li, Dan-Dan; Wen, Qiao-Yan; Wang, Yu-Kun; Zhou, Yu-Qian; Gao, Fei

    2015-10-27

    Semi-device-independent random number expansion (SDI-RNE) protocols require some truly random numbers to generate fresh ones, with making no assumptions on the internal working of quantum devices except for the dimension of the Hilbert space. The generated randomness is certified by non-classical correlation in the prepare-and-measure test. Until now, the analytical relations between the amount of the generated randomness and the degree of non-classical correlation, which are crucial for evaluating the security of SDI-RNE protocols, are not clear under both the ideal condition and the practical one. In the paper, first, we give the analytical relation between the above two factors under the ideal condition. As well, we derive the analytical relation under the practical conditions, where devices' behavior is not independent and identical in each round and there exists deviation in estimating the non-classical behavior of devices. Furthermore, we choose a different randomness extractor (i.e., two-universal random function) and give the security proof.

  1. Summary Report Panel 1: The Need for Protocols and Standards in Research on Underwater Noise Impacts on Marine Life.

    Science.gov (United States)

    Erbe, Christine; Ainslie, Michael A; de Jong, Christ A F; Racca, Roberto; Stocker, Michael

    2016-01-01

    As concern about anthropogenic noise and its impacts on marine fauna is increasing around the globe, data are being compared across populations, species, noise sources, geographic regions, and time. However, much of the raw and processed data are not comparable due to differences in measurement methodology, analysis and reporting, and a lack of metadata. Common protocols and more formal, international standards are needed to ensure the effectiveness of research, conservation, regulation and practice, and unambiguous communication of information and ideas. Developing standards takes time and effort, is largely driven by a few expert volunteers, and would benefit from stakeholders' contribution and support.

  2. A Protocol Layer Trust-Based Intrusion Detection Scheme for Wireless Sensor Networks

    Directory of Open Access Journals (Sweden)

    Jian Wang

    2017-05-01

    Full Text Available This article proposes a protocol layer trust-based intrusion detection scheme for wireless sensor networks. Unlike existing work, the trust value of a sensor node is evaluated according to the deviations of key parameters at each protocol layer considering the attacks initiated at different protocol layers will inevitably have impacts on the parameters of the corresponding protocol layers. For simplicity, the paper mainly considers three aspects of trustworthiness, namely physical layer trust, media access control layer trust and network layer trust. The per-layer trust metrics are then combined to determine the overall trust metric of a sensor node. The performance of the proposed intrusion detection mechanism is then analyzed using the t-distribution to derive analytical results of false positive and false negative probabilities. Numerical analytical results, validated by simulation results, are presented in different attack scenarios. It is shown that the proposed protocol layer trust-based intrusion detection scheme outperforms a state-of-the-art scheme in terms of detection probability and false probability, demonstrating its usefulness for detecting cross-layer attacks.
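    The abstract does not give the concrete trust formulas, so the following is only a generic sketch of the idea: score each protocol layer by how far its observed parameters deviate from expected values, then fuse the per-layer scores into one node-level trust metric and flag nodes that fall below a threshold. All parameter names, tolerances, weights and the threshold are hypothetical, not taken from the cited scheme.

```python
# Illustrative sketch of per-layer trust scoring and fusion for a sensor node.
# The deviation-based scoring, equal weights and 0.5 threshold are assumptions
# made for illustration only; they are not the metrics of the cited scheme.
import math

def layer_trust(observed, expected, tolerance):
    """Map the deviation of one protocol-layer parameter to a trust score in [0, 1]."""
    deviation = abs(observed - expected) / max(tolerance, 1e-9)
    return math.exp(-deviation)          # trust decays as the deviation grows

def node_trust(layer_scores, weights):
    """Fuse per-layer trust values (physical, MAC, network) into one node metric."""
    return sum(w * s for w, s in zip(weights, layer_scores)) / sum(weights)

# Hypothetical observations: RSSI (physical), collision rate (MAC), forwarding ratio (network).
phy = layer_trust(observed=-71.0, expected=-70.0, tolerance=5.0)
mac = layer_trust(observed=0.30, expected=0.10, tolerance=0.10)
net = layer_trust(observed=0.55, expected=0.95, tolerance=0.20)

overall = node_trust([phy, mac, net], weights=[1.0, 1.0, 1.0])
print(f"per-layer trust: phy={phy:.2f}, mac={mac:.2f}, net={net:.2f}; node trust={overall:.2f}")
if overall < 0.5:                        # threshold chosen arbitrarily for the example
    print("node flagged as potentially compromised")
```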

  3. Recent applications of nuclear analytical methods to the certification of elemental content in NIST standard reference materials

    International Nuclear Information System (INIS)

    Greenberg, R.R.; Zeisler, R.; Mackey, E.A.

    2006-01-01

    Well-characterized, certified reference materials (CRMs) play an essential role in assuring the quality of analytical measurements. NIST has been producing CRMs, currently called NIST Standard Reference Materials (SRMs), to validate analytical measurements for nearly one hundred years. The predominant mode of certifying inorganic constituents in complex-matrix SRMs is through the use of two critically evaluated, independent analytical techniques at NIST. These techniques should have no significant sources of error in common. The use of nuclear analytical methods in combination with one of the chemically based analytical methods at NIST eliminates the possibility of any significant, common error source. The inherent characteristics of the various forms of nuclear analytical methods make them extremely valuable for SRM certification. Instrumental NAA is nondestructive, which eliminates the possibility of any dissolution problems, and often provides homogeneity information. Radiochemical NAA typically provides nearly blank-free determinations of some highly important, but difficult elements at very low levels. Prompt-gamma NAA complements INAA, and provides independent determinations of some key elements. In addition, all significant uncertainty components can be evaluated for these techniques, and we believe these methods can meet all the requirements of a primary method of measurement as defined by ISO and the CCQM. NIST has certified several SRMs using INAA and RNAA as primary methods. In addition, NIST has compared measurements by INAA and PGAA with other primary methods as part of the CCQM intercomparisons of national metrology institutes. Some significant SRMs recently certified for inorganic constituents with contributions from the nuclear analytical methods include: Toxic Substances in Urine (SRM 2670a), Lake Superior Fish Tissue (SRM 1946), Air Particulate on Filter Media (SRM 2783), Inorganics in Marine Sediment (SRM 2702), Sediment for Solid Sampling (Small

  4. Is the standard compliance check protocol a valid measure of the accessibility of tobacco to underage smokers?

    Science.gov (United States)

    DiFranza, J.; Savageau, J.; Bouchard, J.

    2001-01-01

    OBJECTIVE—To determine if the standard compliance check protocol is a valid measure of the experience of underage smokers when purchasing tobacco in unfamiliar communities.
SETTING—160 tobacco outlets in eight Massachusetts communities where underage tobacco sales laws are vigorously enforced.
PROCEDURE—Completed purchase rates were compared between underage smokers who behaved normally and inexperienced non-smoking youths who were not allowed to lie or present proof of age (ID).
RESULTS—The "smoker protocol" increased the likelihood of a sale nearly sixfold over that for the non-smokers (odds ratio (OR) 5.7, 95% confidence interval (CI) 1.5 to 22). When the youths presented an ID with an underage birth date, the odds of a completed sale increased dramatically (OR 27, 95% CI 3.4 to 212). Clerks judged to be under 21 years of age were seven times more likely to make an illegal sale (OR 7.6, 95% CI 2.4 to 24.0).
CONCLUSIONS—Commonly used compliance check protocols are too artificial to reflect accurately the experience of underage smokers. The validity of compliance checks might be improved by having youths present ID, and by employing either tobacco users, or non-tobacco users who are sufficiently experienced to mimic the self confidence exhibited by tobacco users in this situation. Consideration should be given to prohibiting the sale of tobacco by individuals under 21 years of age.


Keywords: compliance check protocol; underage smokers PMID:11544386

  5. Pre-Analytical Parameters Affecting Vascular Endothelial Growth Factor Measurement in Plasma: Identifying Confounders.

    Science.gov (United States)

    Walz, Johanna M; Boehringer, Daniel; Deissler, Heidrun L; Faerber, Lothar; Goepfert, Jens C; Heiduschka, Peter; Kleeberger, Susannah M; Klettner, Alexa; Krohne, Tim U; Schneiderhan-Marra, Nicole; Ziemssen, Focke; Stahl, Andreas

    2016-01-01

    Vascular endothelial growth factor-A (VEGF-A) is intensively investigated in various medical fields. However, comparing VEGF-A measurements is difficult because sample acquisition and pre-analytic procedures differ between studies. We therefore investigated which variables act as confounders of VEGF-A measurements. Following a standardized protocol, blood was taken at three clinical sites from six healthy participants (one male and one female participant at each center) twice one week apart. The following pre-analytical parameters were varied in order to analyze their impact on VEGF-A measurements: analyzing center, anticoagulant (EDTA vs. PECT / CTAD), cannula (butterfly vs. neonatal), type of centrifuge (swing-out vs. fixed-angle), time before and after centrifugation, filling level (completely filled vs. half-filled tubes) and analyzing method (ELISA vs. multiplex bead array). Additionally, intrapersonal variations over time and sex differences were explored. Statistical analysis was performed using a linear regression model. The following parameters were identified as statistically significant independent confounders of VEGF-A measurements: analyzing center, anticoagulant, centrifuge, analyzing method and sex of the proband. The following parameters were no significant confounders in our data set: intrapersonal variation over one week, cannula, time before and after centrifugation and filling level of collection tubes. VEGF-A measurement results can be affected significantly by the identified pre-analytical parameters. We recommend the use of CTAD anticoagulant, a standardized type of centrifuge and one central laboratory using the same analyzing method for all samples.

  6. Pre-Analytical Parameters Affecting Vascular Endothelial Growth Factor Measurement in Plasma: Identifying Confounders.

    Directory of Open Access Journals (Sweden)

    Johanna M Walz

    Full Text Available Vascular endothelial growth factor-A (VEGF-A) is intensively investigated in various medical fields. However, comparing VEGF-A measurements is difficult because sample acquisition and pre-analytic procedures differ between studies. We therefore investigated which variables act as confounders of VEGF-A measurements. Following a standardized protocol, blood was taken at three clinical sites from six healthy participants (one male and one female participant at each center) twice one week apart. The following pre-analytical parameters were varied in order to analyze their impact on VEGF-A measurements: analyzing center, anticoagulant (EDTA vs. PECT / CTAD), cannula (butterfly vs. neonatal), type of centrifuge (swing-out vs. fixed-angle), time before and after centrifugation, filling level (completely filled vs. half-filled tubes) and analyzing method (ELISA vs. multiplex bead array). Additionally, intrapersonal variations over time and sex differences were explored. Statistical analysis was performed using a linear regression model. The following parameters were identified as statistically significant independent confounders of VEGF-A measurements: analyzing center, anticoagulant, centrifuge, analyzing method and sex of the proband. The following parameters were no significant confounders in our data set: intrapersonal variation over one week, cannula, time before and after centrifugation and filling level of collection tubes. VEGF-A measurement results can be affected significantly by the identified pre-analytical parameters. We recommend the use of CTAD anticoagulant, a standardized type of centrifuge and one central laboratory using the same analyzing method for all samples.

  7. Multiparametric multidetector computed tomography scanning on suspicion of hyperacute ischemic stroke: validating a standardized protocol

    Directory of Open Access Journals (Sweden)

    Felipe Torres Pacheco

    2013-06-01

    Full Text Available Multidetector computed tomography (MDCT) scanning has enabled the early diagnosis of hyperacute brain ischemia. We aimed at validating a standardized protocol to read and report MDCT techniques in a series of adult patients. The inter-observer agreement among the trained examiners was tested, and their results were compared with a standard reading. No false positives were observed, and an almost perfect agreement (Kappa > 0.81) was documented when the CT angiography (CTA) and cerebral perfusion CT (CPCT) map data were added to the noncontrast CT (NCCT) analysis. The inter-observer agreement was higher for highly trained readers, corroborating the need for specific training to interpret these modern techniques. The authors recommend adding CTA and CPCT to the NCCT analysis in order to clarify the global analysis of structural and hemodynamic brain abnormalities. Our structured report is suitable as a script for the reproducible analysis of the MDCT of patients on suspicion of ischemic stroke.

  8. Asymptotic performance modelling of DCF protocol with prioritized channel access

    Science.gov (United States)

    Choi, Woo-Yong

    2017-11-01

    Recently, the modification of the DCF (Distributed Coordination Function) protocol by the prioritized channel access was proposed to resolve the problem that the DCF performance worsens exponentially as more nodes exist in IEEE 802.11 wireless LANs. In this paper, an asymptotic analytical performance model is presented to analyze the MAC performance of the DCF protocol with the prioritized channel access.

  9. Effects of Adding an Internet-Based Pain Coping Skills Training Protocol to a Standardized Education and Exercise Program for People With Persistent Hip Pain (HOPE Trial): Randomized Controlled Trial Protocol.

    Science.gov (United States)

    Bennell, Kim L; Rini, Christine; Keefe, Francis; French, Simon; Nelligan, Rachel; Kasza, Jessica; Forbes, Andrew; Dobson, Fiona; Abbott, J Haxby; Dalwood, Andrew; Vicenzino, Bill; Harris, Anthony; Hinman, Rana S

    2015-10-01

    Persistent hip pain in older people is usually due to hip osteoarthritis (OA), a major cause of pain, disability, and psychological dysfunction. The purpose of this study is to evaluate whether adding an Internet-based pain coping skills training (PCST) protocol to a standardized intervention of education followed by physical therapist-instructed home exercise leads to greater reductions in pain and improvements in function. An assessor-, therapist-, and participant-blinded randomized controlled trial will be conducted. The study will be conducted in a community setting. The participants will be 142 people over 50 years of age with self-reported hip pain consistent with hip OA. Participants will be randomly allocated to: (1) a control group receiving a 24-week standardized intervention comprising an 8-week Internet-based education package followed by 5 individual physical therapy exercise sessions plus home exercises (3 times weekly) or (2) a PCST group receiving an 8-week Internet-based PCST protocol in addition to the control intervention. Outcomes will be measured at baseline and 8, 24, and 52 weeks, with the primary time point at 24 weeks. Primary outcomes are hip pain on walking and self-reported physical function. Secondary outcomes include health-related quality-of-life, participant-perceived treatment response, self-efficacy for pain management and function, pain coping attempts, pain catastrophizing, and physical activity. Measurements of adherence, adverse events, use of health services, and process measures will be collected at 24 and 52 weeks. Cost-effectiveness will be assessed at 52 weeks. A self-reported diagnosis of persistent hip pain will be used. The findings will help determine whether adding an Internet-based PCST protocol to standardized education and physical therapist-instructed home exercise is more effective than education and exercise alone for persistent hip pain. This study has the potential to guide clinical practice toward innovative

  10. Entanglement distillation protocols and number theory

    International Nuclear Information System (INIS)

    Bombin, H.; Martin-Delgado, M.A.

    2005-01-01

    We show that the analysis of entanglement distillation protocols for qudits of arbitrary dimension D benefits from applying basic concepts from number theory, since the set Z_D^n associated with Bell diagonal states is a module rather than a vector space. We find that a partition of Z_D^n into divisor classes characterizes the invariant properties of mixed Bell diagonal states under local permutations. We construct a very general class of recursion protocols by means of unitary operations implementing these local permutations. We study these distillation protocols depending on whether we use twirling operations in the intermediate steps or not, and we study them both analytically and numerically with Monte Carlo methods. In the absence of twirling operations, we construct extensions of the quantum privacy algorithms valid for secure communications with qudits of any dimension D. When D is a prime number, we show that distillation protocols are optimal both qualitatively and quantitatively.
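    The number-theoretic structure invoked above can be made concrete for the simplest case n = 1: the elements of Z_D split into classes labelled by the divisors of D, with x assigned to the class of gcd(x, D). The sketch below only restates this standard gcd-based partition; it is not code from the paper.

```python
# Partition Z_D into divisor classes: x belongs to the class of d = gcd(x, D).
# Illustrates, for n = 1, the module structure mentioned in the abstract;
# the grouping is a standard number-theoretic fact, not code from the paper.
from math import gcd
from collections import defaultdict

def divisor_classes(D):
    classes = defaultdict(list)
    for x in range(D):
        classes[gcd(x, D)].append(x)   # gcd(0, D) = D, so 0 sits in the class of D itself
    return dict(classes)

if __name__ == "__main__":
    for d, members in sorted(divisor_classes(12).items()):
        print(f"divisor {d:2d}: {members}")
    # For prime D every nonzero element lands in the class of 1, one way of seeing
    # why prime dimensions behave optimally in the distillation analysis.
    print(divisor_classes(5))
```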

  11. ATM and Internet protocol

    CERN Document Server

    Bentall, M; Turton, B

    1998-01-01

    Asynchronous Transfer Mode (ATM) is a protocol that allows data, sound and video being transferred between independent networks via ISDN links to be supplied to, and interpreted by, the various system protocols.ATM and Internet Protocol explains the working of the ATM and B-ISDN network for readers with a basic understanding of telecommunications. It provides a handy reference to everyone working with ATM who may not require the full standards in detail, but need a comprehensive guide to ATM. A substantial section is devoted to the problems of running IP over ATM and there is some discussion o

  12. Development of Characterization Protocol for Mixed Liquid Radioactive Waste Classification

    International Nuclear Information System (INIS)

    Norasalwa Zakaria; Syed Asraf Wafa; Wo, Y.M.; Sarimah Mahat; Mohamad Annuar Assadat Husain

    2017-01-01

    Mixed organic liquid waste generated from health-care and research activities containing tritium, carbon-14, and other radionuclides posed specific challenges in its management. Often, this waste becomes legacy waste in many nuclear facilities and is considered 'problematic' waste. One of the most important recommendations made by IAEA is to perform multistage processes aiming at declassification of the waste. At this moment, approximately 3000 bottles of mixed liquid waste, with an estimated volume of 6000 litres, are currently stored at the National Radioactive Waste Management Centre, Malaysia and some have been stored for more than 25 years. The aim of this study is to develop a characterization protocol towards reclassification of these wastes. The characterization protocol entails waste identification, waste screening and segregation, and analytical radionuclide profiling using analytical procedures involving gross alpha/beta and gamma spectrometry. The results obtained from the characterization protocol are used to establish criteria for speedy classification of the waste. (author)

  13. Development of characterization protocol for mixed liquid radioactive waste classification

    Energy Technology Data Exchange (ETDEWEB)

    Zakaria, Norasalwa, E-mail: norasalwa@nuclearmalaysia.gov.my [Waste Technology Development Centre, Malaysian Nuclear Agency, 43000 Kajang, Selangor (Malaysia); Wafa, Syed Asraf [Radioisotop Technology and Innovation, Malaysian Nuclear Agency, 43000 Kajang, Selangor (Malaysia); Wo, Yii Mei [Radiochemistry and Environment, Malaysian Nuclear Agency, 43000 Kajang, Selangor (Malaysia); Mahat, Sarimah [Material Technology Group, Malaysian Nuclear Agency, 43000 Kajang, Selangor (Malaysia)

    2015-04-29

    Mixed liquid organic waste generated from health-care and research activities containing tritium, carbon-14, and other radionuclides posed specific challenges in its management. Often, these wastes become legacy waste in many nuclear facilities and being considered as ‘problematic’ waste. One of the most important recommendations made by IAEA is to perform multistage processes aiming at declassification of the waste. At this moment, approximately 3000 bottles of mixed liquid waste, with estimated volume of 6000 litres are currently stored at the National Radioactive Waste Management Centre, Malaysia and some have been stored for more than 25 years. The aim of this study is to develop a characterization protocol towards reclassification of these wastes. The characterization protocol entails waste identification, waste screening and segregation, and analytical radionuclides profiling using various analytical procedures including gross alpha/ gross beta, gamma spectrometry, and LSC method. The results obtained from the characterization protocol are used to establish criteria for speedy classification of the waste.

  14. Anthropometric standardisation and quality control protocols for the construction of new, international, fetal and newborn growth standards: the INTERGROWTH-21st Project.

    Science.gov (United States)

    Cheikh Ismail, L; Knight, H E; Ohuma, E O; Hoch, L; Chumlea, W C

    2013-09-01

    The primary aim of the INTERGROWTH-21st Project is to construct new, prescriptive standards describing optimal fetal and preterm postnatal growth. The anthropometric measurements include the head circumference, recumbent length and weight of the infants, and the stature and weight of the parents. In such a large, international, multicentre project, it is critical that all study sites follow standardised protocols to ensure maximal validity of the growth and nutrition indicators used. This paper describes in detail the anthropometric training, standardisation and quality control procedures used to collect data for these new standards. The initial standardisation session was in Nairobi, Kenya, using newborns, which was followed by similar sessions in the eight participating study sites in Brazil, China, India, Italy, Kenya, Oman, UK and USA. The intra-observer and inter-observer technical error of measurement values for head circumference range from 0.3 to 0.4 cm, and for recumbent length from 0.3 to 0.5 cm. These standardisation protocols implemented at each study site worldwide ensure that the anthropometric data collected are of the highest quality to construct international growth standards. © 2013 Royal College of Obstetricians and Gynaecologists.
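    For paired repeat measurements, the technical error of measurement quoted above is conventionally computed as TEM = sqrt(Σ d_i² / 2N), where d_i is the difference between the two measurements of subject i and N is the number of subjects. The sketch below applies this standard formula to invented head-circumference values; the numbers are not from the INTERGROWTH-21st dataset.

```python
# Intra-observer technical error of measurement (TEM) for paired repeat measurements:
#   TEM = sqrt( sum(d_i^2) / (2 * N) ), d_i = first - second measurement of subject i.
# The head-circumference values below are hypothetical, for illustration only.
import math

def tem(first, second):
    if len(first) != len(second):
        raise ValueError("need paired measurements")
    squared_diffs = [(a - b) ** 2 for a, b in zip(first, second)]
    return math.sqrt(sum(squared_diffs) / (2 * len(first)))

first_pass  = [34.2, 33.8, 35.1, 34.6, 33.9]   # cm, first measurement of each newborn
second_pass = [34.5, 33.7, 35.0, 34.2, 34.1]   # cm, repeat measurement by the same observer
print(f"intra-observer TEM = {tem(first_pass, second_pass):.2f} cm")
```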

  15. A collaborative study on a Nordic standard protocol for detection and enumeration of thermotolerant Campylobacter in food (NMKL 119, 3. Ed., 2007)

    DEFF Research Database (Denmark)

    Rosenquist, Hanne; Bengtsson, Anja; Hansen, Tina Beck

    2007-01-01

    A Nordic standard protocol for detection and enumeration of thermotolerant Campylobacter in food has been elaborated (NMKL 119, 3. Ed., 2007). Performance and precision characteristics of this protocol were evaluated in a collaborative study with participation of 14 laboratories from seven European...... jejuni (SLV-542). Expected concentrations (95% C.I.) (cfu g(-1) or ml(-1)) of both strains in matrices were 0.6-1.4 and 23-60 for qualitative detection, and 0.6-1.4; 23-60; and 420-1200 for semi-quantitative detection. For quantitative determination, the expected concentrations of C. jejuni/C. coli were...

  16. Optimization of Saanen sperm genes amplification: evaluation of standardized protocols in genetically uncharacterized rural goats reared under a subtropical environment.

    Science.gov (United States)

    Barbour, Elie K; Saade, Maya F; Sleiman, Fawwak T; Hamadeh, Shady K; Mouneimne, Youssef; Kassaifi, Zeina; Kayali, Ghazi; Harakeh, Steve; Jaber, Lina S; Shaib, Houssam A

    2012-10-01

    The purpose of this research is to optimize quantitatively the amplification of specific sperm genes in the reference genomically characterized Saanen goat and to evaluate the standardized protocols applicability on sperms of uncharacterized genome of rural goats reared under a subtropical environment for inclusion in future selection programs. The optimization of the protocols in Saanen sperms included three production genes (growth hormone (GH) exons 2, 3, and 4, αS1-casein (CSN1S1), and α-lactalbumin) and two health genes (MHC class II DRB and prion (PrP)). The optimization was based on varying the primer concentrations and the inclusion of a PCR cosolvent (Triton X). The impact of the studied variables on a statistically significant increase in the yield of amplicons was noticed in four out of five (80%) optimized protocols, namely in those related to GH, CSN1S1, α-lactalbumin, and PrP genes (P < 0.05). The applicability of the optimized protocols of Saanen sperm genes on amplification of uncharacterized rural goat sperms revealed a 100% success in tested individuals for amplification of GH, CSN1S1, α-lactalbumin, and MHC class II DRB genes and a 75% success for the PrP gene. The significant success in applicability of the Saanen quantitatively optimized protocols to other uncharacterized genome of rural goats allows for their inclusion in future selection, targeting the sustainability of this farming system in a subtropical environment and the improvement of the farmers' livelihood.

  17. Analytical quadrics

    CERN Document Server

    Spain, Barry; Ulam, S; Stark, M

    1960-01-01

    Analytical Quadrics focuses on the analytical geometry of three dimensions. The book first discusses the theory of the plane, sphere, cone, cylinder, straight line, and central quadrics in their standard forms. The idea of the plane at infinity is introduced through the homogenous Cartesian coordinates and applied to the nature of the intersection of three planes and to the circular sections of quadrics. The text also focuses on paraboloid, including polar properties, center of a section, axes of plane section, and generators of hyperbolic paraboloid. The book also touches on homogenous coordi

  18. Protocol Interoperability Between DDN and ISO (Defense Data Network and International Organization for Standardization) Protocols

    Science.gov (United States)

    1988-08-01

    services and protocols above the transport layer are usually implemented as user-callable utilities on the host computers, it is desirable to offer them... Networks, Prentice-Hall, New Jersey, 1987 [BOND 87] Bond, John, "Parallel-Processing Concepts Finally Come Together in Real Systems", Computer Design

  19. Metabolomics Workbench: An international repository for metabolomics data and metadata, metabolite standards, protocols, tutorials and training, and analysis tools.

    Science.gov (United States)

    Sud, Manish; Fahy, Eoin; Cotter, Dawn; Azam, Kenan; Vadivelu, Ilango; Burant, Charles; Edison, Arthur; Fiehn, Oliver; Higashi, Richard; Nair, K Sreekumaran; Sumner, Susan; Subramaniam, Shankar

    2016-01-04

    The Metabolomics Workbench, available at www.metabolomicsworkbench.org, is a public repository for metabolomics metadata and experimental data spanning various species and experimental platforms, metabolite standards, metabolite structures, protocols, tutorials, and training material and other educational resources. It provides a computational platform to integrate, analyze, track, deposit and disseminate large volumes of heterogeneous data from a wide variety of metabolomics studies including mass spectrometry (MS) and nuclear magnetic resonance spectrometry (NMR) data spanning over 20 different species covering all the major taxonomic categories including humans and other mammals, plants, insects, invertebrates and microorganisms. Additionally, a number of protocols are provided for a range of metabolite classes, sample types, and both MS and NMR-based studies, along with a metabolite structure database. The metabolites characterized in the studies available on the Metabolomics Workbench are linked to chemical structures in the metabolite structure database to facilitate comparative analysis across studies. The Metabolomics Workbench, part of the data coordinating effort of the National Institute of Health (NIH) Common Fund's Metabolomics Program, provides data from the Common Fund's Metabolomics Resource Cores, metabolite standards, and analysis tools to the wider metabolomics community and seeks data depositions from metabolomics researchers across the world. © The Author(s) 2015. Published by Oxford University Press on behalf of Nucleic Acids Research.

  20. Validation of an analytical methodology for the quantitative analysis of petroleum hydrocarbons in marine sediment samples

    Directory of Open Access Journals (Sweden)

    Eloy Yordad Companioni Damas

    2009-01-01

    Full Text Available This work describes a validation of an analytical procedure for the analysis of petroleum hydrocarbons in marine sediment samples. The proposed protocol is able to measure n-alkanes and polycyclic aromatic hydrocarbons (PAH) in samples at concentrations as low as 30 ng/g, with a precision better than 15% for most analytes. The extraction efficiency of fortified sediments varied from 65.1 to 105.6% and 59.7 to 97.8%, for n-alkanes and PAH in the ranges C16 - C32 and fluoranthene - benzo(a)pyrene, respectively. The analytical protocol was applied to determine petroleum hydrocarbons in sediments collected from a marine coastal zone.
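    The validation figures reported above reduce to two routine calculations: percent recovery from fortified (spiked) sediments and percent relative standard deviation across replicates. The sketch below shows both; the spiking level and replicate concentrations are invented for illustration.

```python
# Percent recovery and percent relative standard deviation (%RSD) for a spiked analyte.
# The replicate concentrations (ng/g) below are invented for illustration.
import statistics

def percent_recovery(measured, spiked_amount, background=0.0):
    return 100.0 * (measured - background) / spiked_amount

def percent_rsd(replicates):
    return 100.0 * statistics.stdev(replicates) / statistics.mean(replicates)

replicates = [92.0, 88.5, 95.2, 90.3, 93.1]   # measured in a sediment spiked at 100 ng/g
recoveries = [percent_recovery(x, spiked_amount=100.0) for x in replicates]

print(f"mean recovery = {statistics.mean(recoveries):.1f} %")
print(f"precision     = {percent_rsd(replicates):.1f} %RSD")
```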

  1. Performance Comparison of Wireless Sensor Network Standard Protocols in an Aerospace Environment: ISA100.11a and ZigBee

    Science.gov (United States)

    Wagner, Raymond S.; Barton, Richard J.

    2011-01-01

    Wireless Sensor Networks (WSNs) can provide a substantial benefit in spacecraft systems, reducing launch weight and providing unprecedented flexibility by allowing instrumentation capabilities to grow and change over time. Achieving data transport reliability on par with that of wired systems, however, can prove extremely challenging in practice. Fortunately, much progress has been made in developing standard WSN radio protocols for applications from non-critical home automation to mission-critical industrial process control. The relative performances of candidate protocols must be compared in representative aerospace environments, however, to determine their suitability for spaceflight applications. In this paper, we will present the results of a rigorous laboratory analysis of the performance of two standards-based, low power, low data rate WSN protocols: ZigBee Pro and ISA100.11a. Both are based on IEEE 802.15.4 and augment that standard's specifications to build complete, multi-hop networking stacks. ZigBee Pro targets primarily the home and office automation markets, providing an ad-hoc protocol that is computationally lightweight and easy to implement in inexpensive system-on-a-chip components. As a result of this simplicity, however, ZigBee Pro can be susceptible to radio frequency (RF) interference. ISA100.11a, on the other hand, targets the industrial process control market, providing a robust, centrally-managed protocol capable of tolerating a significant amount of RF interference. To achieve these gains, a coordinated channel hopping mechanism is employed, which entails a greater computational complexity than ZigBee and requires more sophisticated and costly hardware. To guide future aerospace deployments, we must understand how well these standards relatively perform in analog environments under expected operating conditions. Specifically, we are interested in evaluating goodput -- application level throughput -- in a representative crewed environment

  2. Development of the Diabetes Technology Society Blood Glucose Monitor System Surveillance Protocol.

    Science.gov (United States)

    Klonoff, David C; Lias, Courtney; Beck, Stayce; Parkes, Joan Lee; Kovatchev, Boris; Vigersky, Robert A; Arreaza-Rubin, Guillermo; Burk, Robert D; Kowalski, Aaron; Little, Randie; Nichols, James; Petersen, Matt; Rawlings, Kelly; Sacks, David B; Sampson, Eric; Scott, Steve; Seley, Jane Jeffrie; Slingerland, Robbert; Vesper, Hubert W

    2016-05-01

    Inaccurate blood glucose monitoring systems (BGMSs) can lead to adverse health effects. The Diabetes Technology Society (DTS) Surveillance Program for cleared BGMSs is intended to protect people with diabetes from inaccurate, unreliable BGMS products that are currently on the market in the United States. The Surveillance Program will provide an independent assessment of the analytical performance of cleared BGMSs. The DTS BGMS Surveillance Program Steering Committee included experts in glucose monitoring, surveillance testing, and regulatory science. Over one year, the committee engaged in meetings and teleconferences aiming to describe how to conduct BGMS surveillance studies in a scientifically sound manner that is in compliance with good clinical practice and all relevant regulations. A clinical surveillance protocol was created that contains performance targets and analytical accuracy-testing studies with marketed BGMS products conducted by qualified clinical and laboratory sites. This protocol, entitled "Protocol for the Diabetes Technology Society Blood Glucose Monitor System Surveillance Program", is attached as supplementary material. This program is needed because currently, once a BGMS product has been cleared for use by the FDA, no systematic postmarket Surveillance Program exists that can monitor analytical performance and detect potential problems. This protocol will allow identification of inaccurate and unreliable BGMSs currently available on the US market. The DTS Surveillance Program will provide BGMS manufacturers a benchmark to understand the postmarket analytical performance of their products. Furthermore, patients, health care professionals, payers, and regulatory agencies will be able to use the results of the study to make informed decisions to, respectively, select, prescribe, finance, and regulate BGMSs on the market. © 2015 Diabetes Technology Society.
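    The surveillance protocol's actual performance targets are defined in the attached supplementary material and are not reproduced in the abstract. As a purely illustrative stand-in, the sketch below scores paired meter/reference readings against an ISO 15197:2013-style accuracy band (within ±15 mg/dL below 100 mg/dL, within ±15% at or above 100 mg/dL); both the criterion and the data are assumptions, not the DTS program's targets.

```python
# Fraction of meter readings inside an ISO 15197:2013-style accuracy band:
# +/-15 mg/dL of the reference below 100 mg/dL, +/-15 % at or above 100 mg/dL.
# Criterion and paired readings are assumptions used only for illustration.

def within_band(reference, meter):
    if reference < 100.0:
        return abs(meter - reference) <= 15.0
    return abs(meter - reference) <= 0.15 * reference

pairs = [(85, 92), (140, 128), (250, 230), (60, 79), (180, 176)]   # (reference, meter) in mg/dL
hits = sum(within_band(r, m) for r, m in pairs)
print(f"{hits}/{len(pairs)} readings ({100.0 * hits / len(pairs):.0f} %) within the accuracy band")
```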

  3. Development of the Diabetes Technology Society Blood Glucose Monitor System Surveillance Protocol

    Science.gov (United States)

    Klonoff, David C.; Lias, Courtney; Beck, Stayce; Parkes, Joan Lee; Kovatchev, Boris; Vigersky, Robert A.; Arreaza-Rubin, Guillermo; Burk, Robert D.; Kowalski, Aaron; Little, Randie; Nichols, James; Petersen, Matt; Rawlings, Kelly; Sacks, David B.; Sampson, Eric; Scott, Steve; Seley, Jane Jeffrie; Slingerland, Robbert; Vesper, Hubert W.

    2015-01-01

    Background: Inaccurate blood glucose monitoring systems (BGMSs) can lead to adverse health effects. The Diabetes Technology Society (DTS) Surveillance Program for cleared BGMSs is intended to protect people with diabetes from inaccurate, unreliable BGMS products that are currently on the market in the United States. The Surveillance Program will provide an independent assessment of the analytical performance of cleared BGMSs. Methods: The DTS BGMS Surveillance Program Steering Committee included experts in glucose monitoring, surveillance testing, and regulatory science. Over one year, the committee engaged in meetings and teleconferences aiming to describe how to conduct BGMS surveillance studies in a scientifically sound manner that is in compliance with good clinical practice and all relevant regulations. Results: A clinical surveillance protocol was created that contains performance targets and analytical accuracy-testing studies with marketed BGMS products conducted by qualified clinical and laboratory sites. This protocol, entitled “Protocol for the Diabetes Technology Society Blood Glucose Monitor System Surveillance Program”, is attached as supplementary material. Conclusion: This program is needed because currently, once a BGMS product has been cleared for use by the FDA, no systematic postmarket Surveillance Program exists that can monitor analytical performance and detect potential problems. This protocol will allow identification of inaccurate and unreliable BGMSs currently available on the US market. The DTS Surveillance Program will provide BGMS manufacturers a benchmark to understand the postmarket analytical performance of their products. Furthermore, patients, health care professionals, payers, and regulatory agencies will be able to use the results of the study to make informed decisions to, respectively, select, prescribe, finance, and regulate BGMSs on the market. PMID:26481642

  4. Multi-site study of additive genetic effects on fractional anisotropy of cerebral white matter: comparing meta and mega analytical approaches for data pooling

    Science.gov (United States)

    Kochunov, Peter; Jahanshad, Neda; Sprooten, Emma; Nichols, Thomas E.; Mandl, René C.; Almasy, Laura; Booth, Tom; Brouwer, Rachel M.; Curran, Joanne E.; de Zubicaray, Greig I.; Dimitrova, Rali; Duggirala, Ravi; Fox, Peter T.; Hong, L. Elliot; Landman, Bennett A.; Lemaitre, Hervé; Lopez, Lorna; Martin, Nicholas G.; McMahon, Katie L.; Mitchell, Braxton D.; Olvera, Rene L.; Peterson, Charles P.; Starr, John M.; Sussmann, Jessika E.; Toga, Arthur W.; Wardlaw, Joanna M.; Wright, Margaret J.; Wright, Susan N.; Bastin, Mark E.; McIntosh, Andrew M.; Boomsma, Dorret I.; Kahn, René S.; den Braber, Anouk; de Geus, Eco JC; Deary, Ian J.; Hulshoff Pol, Hilleke E.; Williamson, Douglas E.; Blangero, John; van ’t Ent, Dennis; Thompson, Paul M.; Glahn, David C.

    2014-01-01

    Combining datasets across independent studies can boost statistical power by increasing the numbers of observations and can achieve more accurate estimates of effect sizes. This is especially important for genetic studies where a large number of observations are required to obtain sufficient power to detect and replicate genetic effects. There is a need to develop and evaluate methods for joint-analytical analyses of rich datasets collected in imaging genetics studies. The ENIGMA-DTI consortium is developing and evaluating approaches for obtaining pooled estimates of heritability through meta- and mega-genetic analytical approaches, to estimate the general additive genetic contributions to the intersubject variance in fractional anisotropy (FA) measured from diffusion tensor imaging (DTI). We used the ENIGMA-DTI data harmonization protocol for uniform processing of DTI data from multiple sites. We evaluated this protocol in five family-based cohorts providing data from a total of 2248 children and adults (ages: 9–85) collected with various imaging protocols. We used the imaging genetics analysis tool, SOLAR-Eclipse, to combine twin and family data from Dutch, Australian and Mexican-American cohorts into one large “mega-family”. We showed that heritability estimates may vary from one cohort to another. We used two meta-analytical (the sample-size and standard-error weighted) approaches and a mega-genetic analysis to calculate heritability estimates across populations. We performed leave-one-out analysis of the joint estimates of heritability, removing a different cohort each time to understand the estimate variability. Overall, meta- and mega-genetic analyses of heritability produced robust estimates of heritability. PMID:24657781
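    The two meta-analytical weighting schemes mentioned above have simple closed forms: a sample-size weighted mean and an inverse-variance (standard-error weighted) mean. The sketch below pools hypothetical per-cohort heritability estimates both ways; it reflects only the generic formulas, not the SOLAR-Eclipse implementation used by ENIGMA-DTI.

```python
# Pool per-cohort heritability estimates two ways:
#  (1) sample-size weighted mean, (2) inverse-variance (standard-error) weighted mean.
# Cohort numbers are hypothetical; this mirrors the generic formulas only.
import math

cohorts = [  # (h2 estimate, standard error, sample size)
    (0.55, 0.08, 600),
    (0.48, 0.10, 400),
    (0.62, 0.12, 300),
]

def sample_size_weighted(data):
    return sum(h2 * n for h2, _, n in data) / sum(n for _, _, n in data)

def inverse_variance_weighted(data):
    weights = [1.0 / se ** 2 for _, se, _ in data]
    pooled = sum(w * h2 for w, (h2, _, _) in zip(weights, data)) / sum(weights)
    pooled_se = math.sqrt(1.0 / sum(weights))
    return pooled, pooled_se

print(f"sample-size weighted h2 = {sample_size_weighted(cohorts):.3f}")
h2, se = inverse_variance_weighted(cohorts)
print(f"inverse-variance h2     = {h2:.3f} (SE {se:.3f})")
```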

  5. Analytic continuation in perturbative QCD

    International Nuclear Information System (INIS)

    Caprini, Irinel

    2002-01-01

    We discuss some attempts to improve standard perturbative expansion in QCD by using the analytic continuation in the momentum and the Borel complex planes. We first analyse the momentum-plane analyticity properties of the Borel-summed Green functions in perturbative QCD and the connection between the Landau singularities and the infrared renormalons. By using the analytic continuation in the Borel complex plane, we propose a new perturbative series replacing the standard expansion in powers of the normalized coupling constant a. The new expansion functions have branch point and essential singularities at the origin of the complex a-plane and divergent Taylor expansions in powers of a. On the other hand the modified expansion of the QCD correlators is convergent under rather conservative conditions. (author)

  6. Estimation of the Thurstonian model for the 2-AC protocol

    DEFF Research Database (Denmark)

    Christensen, Rune Haubo Bojesen; Lee, Hye-Seong; Brockhoff, Per B.

    2012-01-01

    The 2-AC protocol is a 2-AFC protocol with a “no-difference” option and is technically identical to the paired preference test with a “no-preference” option. The Thurstonian model for the 2-AC protocol is parameterized by δ and a decision parameter τ, the estimates of which can be obtained by fairly simple well-known methods. In this paper we describe how standard errors of the parameters can be obtained and how exact power computations can be performed. We also show how the Thurstonian model for the 2-AC protocol is closely related to a statistical model known as a cumulative probit model. This relationship makes it possible to extract estimates and standard errors of δ and τ from general statistical software, and furthermore, it makes it possible to combine standard regression modelling with the Thurstonian model for the 2-AC protocol. A model for replicated 2-AC data is proposed using cumulative...
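    Under the usual Thurstonian convention for the 2-AC protocol, the latent difference is taken to be normal with mean δ and variance 2, so the three response probabilities are Φ((-τ-δ)/√2), Φ((τ-δ)/√2) − Φ((-τ-δ)/√2) and 1 − Φ((τ-δ)/√2). The sketch below obtains δ and τ by direct maximum likelihood from three response counts; the scaling convention and the counts are assumptions made for illustration, and this is not the estimation code accompanying the paper.

```python
# Maximum-likelihood estimates of (delta, tau) from 2-AC counts (prefer A, no difference, prefer B).
# Assumed convention: latent difference ~ N(delta, 2), giving
#   P(A)  = Phi((-tau - delta)/sqrt(2))
#   P(no) = Phi(( tau - delta)/sqrt(2)) - P(A)
#   P(B)  = 1 - Phi((tau - delta)/sqrt(2))
import numpy as np
from scipy.optimize import minimize
from scipy.stats import norm

counts = np.array([12, 30, 58])            # hypothetical (A, no-difference, B) counts

def neg_log_lik(params):
    delta, log_tau = params
    tau = np.exp(log_tau)                  # keep tau > 0
    lo = norm.cdf((-tau - delta) / np.sqrt(2))
    hi = norm.cdf(( tau - delta) / np.sqrt(2))
    probs = np.clip([lo, hi - lo, 1.0 - hi], 1e-12, 1.0)
    return -np.sum(counts * np.log(probs))

fit = minimize(neg_log_lik, x0=[0.0, np.log(0.5)], method="Nelder-Mead")
delta_hat, tau_hat = fit.x[0], np.exp(fit.x[1])
print(f"delta = {delta_hat:.2f}, tau = {tau_hat:.2f}")
```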

  7. A slotted access control protocol for metropolitan WDM ring networks

    Science.gov (United States)

    Baziana, P. A.; Pountourakis, I. E.

    2009-03-01

    In this study we focus on the serious scalability problems that many access protocols for WDM ring networks introduce due to the use of a dedicated wavelength per access node for either transmission or reception. We propose an efficient slotted MAC protocol suitable for WDM ring metropolitan area networks. The proposed network architecture employs a separate wavelength for control information exchange prior to the data packet transmission. Each access node is equipped with a pair of tunable transceivers for data communication and a pair of fixed tuned transceivers for control information exchange. Also, each access node includes a set of fixed delay lines for synchronization reasons; to keep the data packets, while the control information is processed. An efficient access algorithm is applied to avoid both the data wavelengths and the receiver collisions. In our protocol, each access node is capable of transmitting and receiving over any of the data wavelengths, facing the scalability issues. Two different slot reuse schemes are assumed: the source and the destination stripping schemes. For both schemes, performance measures evaluation is provided via an analytic model. The analytical results are validated by a discrete event simulation model that uses Poisson traffic sources. Simulation results show that the proposed protocol manages efficient bandwidth utilization, especially under high load. Also, comparative simulation results prove that our protocol achieves significant performance improvement as compared with other WDMA protocols which restrict transmission over a dedicated data wavelength. Finally, performance measures evaluation is explored for diverse numbers of buffer size, access nodes and data wavelengths.

  8. Heavy element stable isotope ratios. Analytical approaches and applications

    International Nuclear Information System (INIS)

    Tanimizu, Masaharu; Sohrin, Yoshiki; Hirata, Takafumi

    2013-01-01

    Continuous developments in inorganic mass spectrometry techniques, including a combination of an inductively coupled plasma ion source and a magnetic sector-based mass spectrometer equipped with a multiple-collector array, have revolutionized the precision of isotope ratio measurements, and applications of inorganic mass spectrometry for biochemistry, geochemistry, and marine chemistry are beginning to appear on the horizon. Series of pioneering studies have revealed that natural stable isotope fractionations of many elements heavier than S (e.g., Fe, Cu, Zn, Sr, Ce, Nd, Mo, Cd, W, Tl, and U) are common on Earth, and it had been widely recognized that most physicochemical reactions or biochemical processes induce mass-dependent isotope fractionation. The variations in isotope ratios of the heavy elements can provide new insights into past and present biochemical and geochemical processes. To achieve this, the analytical community is actively solving problems such as spectral interference, mass discrimination drift, chemical separation and purification, and reduction of the contamination of analytes. This article describes data calibration and standardization protocols to allow interlaboratory comparisons or to maintain traceability of data, and basic principles of isotope fractionation in nature, together with high-selectivity and high-yield chemical separation and purification techniques for stable isotope studies.
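    Variations of the kind discussed above are conventionally reported in delta notation relative to a reference material, δ = (R_sample/R_reference − 1) × 1000 (in per mil). The sketch below applies that standard definition; the 56Fe/54Fe ratios and the reference value are invented and do not correspond to any certified standard.

```python
# Delta notation for stable isotope ratios: delta = (R_sample / R_reference - 1) * 1000 (per mil).
# The 56Fe/54Fe ratios below are invented and are used only to illustrate the arithmetic.

def delta_per_mil(r_sample, r_reference):
    return (r_sample / r_reference - 1.0) * 1000.0

R_REFERENCE = 15.700         # hypothetical 56Fe/54Fe of the reference material
samples = {"seawater": 15.697, "basalt": 15.7012, "plankton": 15.688}

for name, ratio in samples.items():
    print(f"{name:10s} d56Fe = {delta_per_mil(ratio, R_REFERENCE):+.2f} per mil")
```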

  9. Methodological Study to Develop Standard Operational Protocol on Oral Drug Administration for Children.

    Science.gov (United States)

    Bijarania, Sunil Kumar; Saini, Sushma Kumari; Verma, Sanjay; Kaur, Sukhwinder

    2017-05-01

    To develop a standard operational protocol (SOP) on oral drug administration and a checklist to assess the implementation of the developed SOP. In this prospective methodological study, SOPs were developed in five phases. In the first phase, the preliminary draft of the SOPs and checklists was prepared based on a literature review, assessment of current practices and focus group discussion (FGD) with bedside working nurses. In the second phase, content validity was checked with the help of the Delphi technique (12 experts). In total, four drafts were prepared in stages and necessary modifications were made as per suggestions after each Delphi round. The fourth Delphi round was performed after conducting a pilot study. In the fourth phase, all bedside nurses were trained as per the SOPs and asked to practice accordingly, and observation of thirty oral drug administrations in children was done to check the reliability of the checklists for implementation of the SOPs. In Phase V, 7 FGDs were conducted with bedside nurses to assess the effectiveness of the SOPs. The Content Validity Index (CVI) of the SOP and checklists was 99.77%. The overall standardized Cronbach's alpha was calculated as 0.94. All the nurses felt that the SOP is useful. A valid and feasible SOP for drug administration to children through the oral route, along with a valid and reliable checklist, was developed. It is recommended to use this document for drug administration to children.
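    The reliability figure reported above (standardized Cronbach's alpha of 0.94) follows from the standard formula α = k/(k−1) · (1 − Σ s_i²/s_T²), where s_i² are the item variances and s_T² is the variance of the total score. The sketch below computes raw Cronbach's alpha for a small invented matrix of checklist scores; it is not the study's data.

```python
# Raw Cronbach's alpha for an observations-by-items score matrix:
#   alpha = k/(k-1) * (1 - sum(item variances) / variance(total score)).
# The 0/1 checklist scores below are invented for illustration.
import numpy as np

def cronbach_alpha(scores):
    """scores: 2-D array, rows = observations, columns = checklist items."""
    scores = np.asarray(scores, dtype=float)
    k = scores.shape[1]
    item_vars = scores.var(axis=0, ddof=1)
    total_var = scores.sum(axis=1).var(ddof=1)
    return (k / (k - 1)) * (1.0 - item_vars.sum() / total_var)

observations = [
    [1, 1, 1, 0, 1],
    [1, 1, 1, 1, 1],
    [0, 1, 0, 0, 0],
    [1, 0, 1, 1, 1],
    [1, 1, 1, 1, 0],
    [0, 0, 0, 0, 1],
]
print(f"Cronbach's alpha = {cronbach_alpha(observations):.2f}")
```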

  10. Quantitative gel electrophoresis: new records in precision by elaborated staining and detection protocols.

    Science.gov (United States)

    Deng, Xi; Schröder, Simone; Redweik, Sabine; Wätzig, Hermann

    2011-06-01

    Gel electrophoresis (GE) is a very common analytical technique for proteome research and protein analysis. Although the technique was developed decades ago, there is still a considerable need to improve its precision. Using the fluorescence of Colloidal Coomassie Blue-stained proteins in near-infrared (NIR), the major error source caused by the unpredictable background staining is strongly reduced. This result was generalized for various types of detectors. Since GE is a multi-step procedure, standardization of every single step is required. After detailed analysis of all steps, the staining and destaining were identified as the major source of the remaining variation. By employing standardized protocols, pooled percent relative standard deviations of 1.2-3.1% for band intensities were achieved for one-dimensional separations in repetitive experiments. The analysis of variance suggests that the same batch of staining solution should be used for gels of one experimental series to minimize day-to-day variation and to obtain high precision. Copyright © 2011 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
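    The precision figures above are pooled percent relative standard deviations of band intensities over repeated gels, and the batch effect was examined by analysis of variance. The sketch below shows one common way of doing both on invented intensity values, with scipy's one-way ANOVA standing in for the study's actual ANOVA design.

```python
# Pooled %RSD of band intensities across repeat gels, plus a one-way ANOVA across
# staining-solution batches. Intensities are invented; root-mean-square pooling of
# per-batch RSDs is one common convention, not necessarily the one used in the study.
import numpy as np
from scipy.stats import f_oneway

def percent_rsd(values):
    values = np.asarray(values, dtype=float)
    return 100.0 * values.std(ddof=1) / values.mean()

def pooled_percent_rsd(groups):
    rsds = np.array([percent_rsd(g) for g in groups])
    return np.sqrt(np.mean(rsds ** 2))

batch_a = [101.2, 99.4, 100.8, 102.1]      # band intensities, staining batch A
batch_b = [96.7, 95.1, 97.9, 96.2]         # staining batch B
batch_c = [103.4, 104.9, 102.2, 103.8]     # staining batch C

print(f"pooled %RSD = {pooled_percent_rsd([batch_a, batch_b, batch_c]):.1f} %")
f_stat, p_value = f_oneway(batch_a, batch_b, batch_c)
print(f"batch effect: F = {f_stat:.1f}, p = {p_value:.4f}")
```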

  11. A standards-based ontology and support for Big Data Analytics in the insurance industry

    Directory of Open Access Journals (Sweden)

    Dimitrios A. Koutsomitropoulos

    2017-06-01

    Full Text Available Standardization efforts have led to the emergence of conceptual models in the insurance industry. Simultaneously, the proliferation of digital information poses new challenges for the efficient management and analysis of available data. Based on the property and casualty data model, we propose an OWL ontology to represent insurance processes and to map large data volumes collected in traditional data stores. By virtue of reasoning, we demonstrate a set of semantic queries using the ontology vocabulary that can simplify analytics and deduce implicit facts from these data. We compare this mapping approach to data in native RDF format, as in a triple store. As proof-of-concept, we use a large anonymized dataset for car policies from an actual insurance company.
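    Semantic queries of the kind described above can be issued against an OWL ontology with SPARQL, for example through rdflib in Python. In the sketch below the file name, namespace and the terms Policy, hasClaim and claimAmount are hypothetical placeholders, not terms from the actual property and casualty data model ontology.

```python
# Run a SPARQL aggregate query over an OWL ontology with rdflib.
# The ontology file, namespace and class/property names (Policy, hasClaim,
# claimAmount) are hypothetical placeholders used only for illustration.
from rdflib import Graph

g = Graph()
g.parse("insurance_ontology.owl", format="xml")   # hypothetical local copy of ontology + data

query = """
PREFIX ins: <http://example.org/insurance#>
SELECT ?policy (SUM(?amount) AS ?totalClaims)
WHERE {
    ?policy a ins:Policy ;
            ins:hasClaim ?claim .
    ?claim  ins:claimAmount ?amount .
}
GROUP BY ?policy
HAVING (SUM(?amount) > 10000)
"""

for row in g.query(query):
    print(f"{row.policy}  total claims: {row.totalClaims}")
```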

  12. Age and gender leucocytes variances and references values generated using the standardized ONE-Study protocol

    DEFF Research Database (Denmark)

    Kverneland, Anders H.; Streitz, Mathias; Geissler, Edward

    2016-01-01

    blood flow cytometry. The performance of our protocols was challenged here by profiling samples from healthy volunteers to reveal age- and gender-dependent differences and to establish a standardized reference cohort for use in clinical trials. Whole blood samples from two different cohorts were...... analyzed (first cohort: n = 52, second cohort: n = 46, both 20–84 years with equal gender distribution). The second cohort was run as a validation cohort by a different operator. The “ONE Study” panels were applied to analyze expression of >30 different surface markers to enumerate proportional...... cohort. Thus, we have proven the utility of our strategy and generated reproducible reference ranges accounting for age- and gender-dependent differences, which are crucial for a better patient monitoring and individualized therapy....

  13. Hanford analytical sample projections FY 1998 - FY 2002

    International Nuclear Information System (INIS)

    Joyce, S.M.

    1998-01-01

    Analytical Services projections are compiled for the Hanford site based on inputs from the major programs for the years 1998 through 2002. Projections are categorized by radiation level, protocol, sample matrix and program. Analyses requirements are also presented. This document summarizes the Hanford sample projections for fiscal years 1998 to 2002. Sample projections are based on inputs submitted to Analytical Services covering Environmental Restoration, Tank Waste Remediation Systems (TWRS), Solid Waste, Liquid Effluents, Spent Nuclear Fuels, Transition Projects, Site Monitoring, Industrial Hygiene, Analytical Services and miscellaneous Hanford support activities. In addition, details on laboratory scale technology (development) work, Sample Management, and Data Management activities are included. This information will be used by Hanford Analytical Services (HAS) and the Sample Management Working Group (SMWG) to assure that laboratories and resources are available and effectively utilized to meet these documented needs

  14. Hanford analytical sample projections FY 1998--FY 2002

    Energy Technology Data Exchange (ETDEWEB)

    Joyce, S.M.

    1998-02-12

    Analytical Services projections are compiled for the Hanford site based on inputs from the major programs for the years 1998 through 2002. Projections are categorized by radiation level, protocol, sample matrix and program. Analyses requirements are also presented. This document summarizes the Hanford sample projections for fiscal years 1998 to 2002. Sample projections are based on inputs submitted to Analytical Services covering Environmental Restoration, Tank Waste Remediation Systems (TWRS), Solid Waste, Liquid Effluents, Spent Nuclear Fuels, Transition Projects, Site Monitoring, Industrial Hygiene, Analytical Services and miscellaneous Hanford support activities. In addition, details on laboratory scale technology (development) work, Sample Management, and Data Management activities are included. This information will be used by Hanford Analytical Services (HAS) and the Sample Management Working Group (SMWG) to assure that laboratories and resources are available and effectively utilized to meet these documented needs.

  15. Publication trends of study protocols in rehabilitation.

    Science.gov (United States)

    Jesus, Tiago S; Colquhoun, Heather L

    2017-09-04

    Growing evidence points for the need to publish study protocols in the health field. To observe whether the growing interest in publishing study protocols in the broader health field has been translated into increased publications of rehabilitation study protocols. Observational study using publication data and its indexation in PubMed. Not applicable. Not applicable. PubMed was searched with appropriate combinations of Medical Subject Headings up to December 2014. The effective presence of study protocols was manually screened. Regression models analyzed the yearly growth of publications. Two-sample Z-tests analyzed whether the proportion of Systematic Reviews (SRs) and Randomized Controlled Trials (RCTs) among study protocols differed from that of the same designs for the broader rehabilitation research. Up to December 2014, 746 publications of rehabilitation study protocols were identified, with an exponential growth since 2005 (r2=0.981; p<0.001). RCT protocols were the most common among rehabilitation study protocols (83%), while RCTs were significantly more prevalent among study protocols than among the broader rehabilitation research (83% vs. 35.8%; p<0.001). For SRs, the picture was reversed: significantly less common among study protocols (2.8% vs. 9.3%; p<0.001). Funding was more often reported by rehabilitation study protocols than the broader rehabilitation research (90% vs. 53.1%; p<0.001). Rehabilitation journals published a significantly lower share of rehabilitation study protocols than they did for the broader rehabilitation research (1.8% vs.16.7%; p<0.001). Identifying the reasons for these discrepancies and reverting unwarranted disparities (e.g. low rate of publication for rehabilitation SR protocols) are likely new avenues for rehabilitation research and its publication. SRs, particularly those aggregating RCT results, are considered the best standard of evidence to guide rehabilitation clinical practice; however, that standard can be improved

  16. Throughput and Fairness of Collision Avoidance Protocols in Ad Hoc Networks

    National Research Council Canada - National Science Library

    Garcia-Luna-Aceves, J. J; Wang, Yu

    2004-01-01

    .... In Section 1, the authors present an analytical model to derive the saturation throughput of these sender-initiated collision avoidance protocols in multi-hop ad hoc networks with nodes randomly...

  17. Critical Response Protocol

    Science.gov (United States)

    Ellingson, Charlene; Roehrig, Gillian; Bakkum, Kris; Dubinsky, Janet M.

    2016-01-01

    This article introduces the Critical Response Protocol (CRP), an arts-based technique that engages students in equitable critical discourse and aligns with the "Next Generation Science Standards" vision for providing students opportunities for language learning while advancing science learning (NGSS Lead States 2013). CRP helps teachers…

  18. A Weak Value Based QKD Protocol Robust Against Detector Attacks

    Science.gov (United States)

    Troupe, James

    2015-03-01

    We propose a variation of the BB84 quantum key distribution protocol that utilizes the properties of weak values to ensure the validity of the quantum bit error rate estimates used to detect an eavesdropper. The protocol is shown theoretically to be secure against recently demonstrated attacks utilizing detector blinding and control, and should also be robust against all detector-based hacking. Importantly, the new protocol promises to achieve this additional security without negatively impacting the secure key generation rate as compared to that originally promised by the standard BB84 scheme. Implementation of the weak measurements needed by the protocol should be very feasible using standard quantum optical techniques.
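
    The weak-value measurement itself is not reproduced here; as a rough illustration of the quantity the protocol protects, the sketch below simulates plain BB84 sifting over a noisy channel and estimates the quantum bit error rate from a sampled subset of the sifted key (all parameters are illustrative):

      import random

      def simulate_bb84(n_qubits=20000, channel_error=0.03, sample_fraction=0.2, seed=1):
          """Toy BB84 sifting: random bits/bases for Alice, random bases for Bob,
          independent bit flips with probability channel_error, then a QBER
          estimate from a random sample of the sifted key (illustration only)."""
          rng = random.Random(seed)
          sifted = []
          for _ in range(n_qubits):
              bit = rng.randint(0, 1)
              basis_alice = rng.randint(0, 1)
              basis_bob = rng.randint(0, 1)
              if basis_alice != basis_bob:
                  continue                      # bases disagree: discarded in sifting
              received = bit ^ (rng.random() < channel_error)
              sifted.append((bit, received))
          sample = rng.sample(sifted, int(len(sifted) * sample_fraction))
          qber = sum(a != b for a, b in sample) / len(sample)
          return len(sifted), qber

      sifted_length, qber = simulate_bb84()
      print(f"sifted key length: {sifted_length}, estimated QBER: {qber:.3f}")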

  19. Design Protocols and Analytical Strategies that Incorporate Structural Reliability Models

    Science.gov (United States)

    Duffy, Stephen F.

    1997-01-01

    Ceramic matrix composites (CMC) and intermetallic materials (e.g., single crystal nickel aluminide) are high performance materials that exhibit attractive mechanical, thermal and chemical properties. These materials are critically important in advancing certain performance aspects of gas turbine engines. From an aerospace engineer's perspective the new generation of ceramic composites and intermetallics offers a significant potential for raising the thrust/weight ratio and reducing NO(x) emissions of gas turbine engines. These aspects have increased interest in utilizing these materials in the hot sections of turbine engines. However, as these materials evolve and their performance characteristics improve a persistent need exists for state-of-the-art analytical methods that predict the response of components fabricated from CMC and intermetallic material systems. This need provided the motivation for the technology developed under this research effort. Continuous ceramic fiber composites exhibit an increase in work of fracture, which allows for "graceful" rather than catastrophic failure. When loaded in the fiber direction, these composites retain substantial strength capacity beyond the initiation of transverse matrix cracking despite the fact that neither of its constituents would exhibit such behavior if tested alone. As additional load is applied beyond first matrix cracking, the matrix tends to break in a series of cracks bridged by the ceramic fibers. Any additional load is born increasingly by the fibers until the ultimate strength of the composite is reached. Thus modeling efforts supported under this research effort have focused on predicting this sort of behavior. For single crystal intermetallics the issues that motivated the technology development involved questions relating to material behavior and component design. Thus the research effort supported by this grant had to determine the statistical nature and source of fracture in a high strength, Ni

  20. The effect of a standardized protocol for iron supplementation to blood donors low in hemoglobin concentration.

    Science.gov (United States)

    Magnussen, Karin; Bork, Nanna; Asmussen, Lisa

    2008-04-01

    Iron deficiency leading to low hemoglobin concentration (cHb) is a common problem for blood donors as well as for blood banks. A standardized protocol offering iron supplementation based on P-ferritin determination may help to reduce the problem and retain donors. This was a prospective study where 879 blood donors, presenting with cHb at or below the limit of acceptance for donation, were included. The predonation cHb result was read after donation. The donors received 50 iron tablets (JernC or Ferrochel, 100 or 25 mg elemental iron, respectively), and samples for P-ferritin, mean corpuscular volume, and control of cHb were secured. Based on a P-ferritin level of less than 60 microg per L, 20 iron tablets were offered after all following donations. Mean cHb was 7.6 mmol per L (122 g/L) and 8.2 mmol per L (132 g/L) in women and men, respectively. In 80 percent of the women and 48 percent of the men, iron stores were low (P-ferritin < 60 microg per L). A standardized protocol offering iron supplementation and simple oral and written advice based on P-ferritin measurements is effective in normalizing cHb and retaining donors presenting with cHb at or below the limit of acceptance for donation.

  1. Samples and Sampling Protocols for Scientific Investigations | Joel ...

    African Journals Online (AJOL)

    ... from sampling, through sample preparation, calibration to final measurement and reporting. This paper, therefore offers useful information on practical guidance on sampling protocols in line with best practice and international standards. Keywords: Sampling, sampling protocols, chain of custody, analysis, documentation ...

  2. Multielement trace determination in SiC powders: assessment of interlaboratory comparisons aimed at the validation and standardization of analytical procedures with direct solid sampling based on ETV ICP OES and DC arc OES.

    Science.gov (United States)

    Matschat, Ralf; Hassler, Jürgen; Traub, Heike; Dette, Angelika

    2005-12-01

    The members of the committee NMP 264 "Chemical analysis of non-oxidic raw and basic materials" of the German Standards Institute (DIN) have organized two interlaboratory comparisons for multielement determination of trace elements in silicon carbide (SiC) powders via direct solid sampling methods. One of the interlaboratory comparisons was based on the application of inductively coupled plasma optical emission spectrometry with electrothermal vaporization (ETV ICP OES), and the other on the application of optical emission spectrometry with direct current arc (DC arc OES). The interlaboratory comparisons were organized and performed in the framework of the development of two standards related to "the determination of mass fractions of metallic impurities in powders and grain sizes of ceramic raw and basic materials" by both methods. SiC powders were used as typical examples of this category of material. The aim of the interlaboratory comparisons was to determine the repeatability and reproducibility of both analytical methods to be standardized. This was an important contribution to the practical applicability of both draft standards. Eight laboratories participated in the interlaboratory comparison with ETV ICP OES and nine in the interlaboratory comparison with DC arc OES. Ten analytes were investigated by ETV ICP OES and eleven by DC arc OES. Six different SiC powders were used for the calibration. The mass fractions of their relevant trace elements were determined after wet chemical digestion. All participants followed the analytical requirements described in the draft standards. In the calculation process, three of the calibration materials were used successively as analytical samples. This was managed in the following manner: the material that had just been used as the analytical sample was excluded from the calibration, so the five other materials were used to establish the calibration plot. The results from the interlaboratory comparisons were summarized and

  3. Abbreviated Combined MR Protocol: A New Faster Strategy for Characterizing Breast Lesions.

    Science.gov (United States)

    Moschetta, Marco; Telegrafo, Michele; Rella, Leonarda; Stabile Ianora, Amato Antonio; Angelelli, Giuseppe

    2016-06-01

    The use of an abbreviated magnetic resonance (MR) protocol has been recently proposed for cancer screening. The aim of our study is to evaluate the diagnostic accuracy of an abbreviated MR protocol combining short TI inversion recovery (STIR), turbo-spin-echo (TSE)-T2 sequences, a pre-contrast T1, and a single intermediate (3 minutes after contrast injection) post-contrast T1 sequence for characterizing breast lesions. A total of 470 patients underwent breast MR examination for screening, problem solving, or preoperative staging. Two experienced radiologists evaluated both standard and abbreviated protocols in consensus. Sensitivity, specificity, positive predictive value (PPV), negative predictive value (NPV), and diagnostic accuracy for both protocols were calculated (with the histological findings and 6-month ultrasound follow-up as the reference standard) and compared with the McNemar test. The post-processing and interpretation times for the MR images were compared with the paired t test. In 177 of 470 (38%) patients, the MR sequences detected 185 breast lesions. Standard and abbreviated protocols obtained sensitivity, specificity, diagnostic accuracy, PPV, and NPV values respectively of 92%, 92%, 92%, 68%, and 98% and of 89%, 91%, 91%, 64%, and 98% with no statistically significant difference (P < .0001). The mean post-processing and interpretation time were, respectively, 7 ± 1 minutes and 6 ± 3.2 minutes for the standard protocol and 1 ± 1.2 minutes and 2 ± 1.2 minutes for the abbreviated protocol, with a statistically significant difference (P < .01). An abbreviated combined MR protocol represents a time-saving tool for radiologists and patients with the same diagnostic potential as the standard protocol in patients undergoing breast MRI for screening, problem solving, or preoperative staging. Copyright © 2016 Elsevier Inc. All rights reserved.

  4. Optimized UAV Communication Protocol Based on Prior Locations

    OpenAIRE

    Sboui, Lokman; Rabah, Abdullatif

    2015-01-01

    In this paper, we adopt a new communication protocol between the UAV and fixed on-ground nodes. This protocol tends to reduce communication power consumption by stopping communication if the channel is not good enough to communicate over (i.e. far nodes, obstacles, etc.). The communication is performed using the XBee 868M standard and Libelium Waspmotes. Our designed protocol is based on a new communication model that we propose in this paper. The protocol decides whether to communicate or not after compu...
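
    A hedged sketch of a threshold-based "communicate or defer" decision of the kind described above, driven by an estimated link budget toward a node's known prior location; the free-space path-loss model and all constants are assumptions for illustration, not the model proposed in the paper:

      import math

      # Illustrative free-space link estimate; the constants are placeholders,
      # not the XBee 868 MHz parameters used in the paper.
      TX_POWER_DBM = 14.0
      RX_SENSITIVITY_DBM = -100.0
      FREQ_MHZ = 868.0
      LINK_MARGIN_DB = 10.0

      def path_loss_db(distance_m):
          """Free-space path loss in dB for a distance in metres."""
          return 20 * math.log10(distance_m) + 20 * math.log10(FREQ_MHZ) - 27.55

      def should_communicate(uav_xy, node_xy):
          """Transmit only if the estimated received power at the node's prior
          (known) location clears the sensitivity plus a safety margin."""
          rx_power = TX_POWER_DBM - path_loss_db(math.dist(uav_xy, node_xy))
          return rx_power >= RX_SENSITIVITY_DBM + LINK_MARGIN_DB

      for node in [(120.0, 40.0), (8000.0, 3000.0)]:
          decision = "transmit" if should_communicate((0.0, 0.0), node) else "defer"
          print(node, "->", decision)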

  5. A Business Evaluation Of The Next Generation Ipv6 Protocol In Fixed And Mobile Communication Services

    NARCIS (Netherlands)

    L-F. Pau (Louis-François)

    2002-01-01

    This paper gives an analytical business model of the Internet IPv4 and IPv6 protocols, focusing on the business implications of intrinsic technical properties of these protocols. The technical properties modeled in business terms are: address space, payload, autoconfiguration, IP

  6. Technical Analysis of SSP-21 Protocol

    Energy Technology Data Exchange (ETDEWEB)

    Bromberger, S. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States)

    2017-06-09

    As part of the California Energy Systems for the Twenty-First Century (CES-21) program, in December 2016 San Diego Gas and Electric (SDG&E) contracted with Lawrence Livermore National Laboratory (LLNL) to perform an independent verification and validation (IV&V) of a white paper describing their Secure SCADA Protocol for the Twenty-First Century (SSP-21) in order to analyze the effectiveness and propriety of cryptographic protocol use within the SSP-21 specification. SSP-21 is designed to use cryptographic protocols to provide (optional) encryption, authentication, and nonrepudiation, among other capabilities. The cryptographic protocols to be used reflect current industry standards; future versions of SSP-21 will use other advanced technologies to provide a subset of security services.

  7. Principles of the new quantum cryptography protocols building

    International Nuclear Information System (INIS)

    Kurochkin, V.; Kurochkin, Yu.

    2009-01-01

    The main aim of quantum cryptography protocols is maximal secrecy under the conditions of a real experiment. This work presents the result of building a new protocol through secrecy maximization. While using some well-known approaches, this method has allowed completely new results to be achieved in quantum cryptography. The protocol elaboration proceeds from an upgrade of the standard BB84 protocol to the construction of a completely new protocol with an arbitrarily large number of bases. The secrecy proofs of the elaborated protocol appear as a natural continuation of the protocol building process. This approach reveals the possibility of reaching extremely high protocol parameters. It suits both the restrictions of contemporary technologies and the requirements for a high bit rate while remaining absolutely secret.

  8. Standard operating procedures for pre-analytical handling of blood and urine for metabolomic studies and biobanks

    Energy Technology Data Exchange (ETDEWEB)

    Bernini, Patrizia; Bertini, Ivano, E-mail: bertini@cerm.unifi.it; Luchinat, Claudio [University of Florence, Magnetic Resonance Center (CERM) (Italy); Nincheri, Paola; Staderini, Samuele [FiorGen Foundation (Italy); Turano, Paola [University of Florence, Magnetic Resonance Center (CERM) (Italy)

    2011-04-15

    1H NMR metabolic profiling of urine, serum and plasma has been used to monitor the impact of the pre-analytical steps on the sample quality and stability in order to propose standard operating procedures (SOPs) for deposition in biobanks. We analyzed the quality of serum and plasma samples as a function of the elapsed time (t = 0-4 h) between blood collection and processing and of the time from processing to freezing (up to 24 h). The stability of the urine metabolic profile over time (up to 24 h) at various storage temperatures was monitored as a function of the different pre-analytical treatments like pre-storage centrifugation, filtration, and addition of the bacteriostatic preservative sodium azide. Appreciable changes in the profiles, reflecting changes in the concentration of a number of metabolites, were detected and discussed in terms of chemical and enzymatic reactions for both blood and urine samples. Appropriate procedures for blood derivatives collection and urine preservation/storage that allow maintaining as much as possible the original metabolic profile of the fresh samples emerge, and are proposed as SOPs for biobanking.

  9. Standard operating procedures for pre-analytical handling of blood and urine for metabolomic studies and biobanks.

    Science.gov (United States)

    Bernini, Patrizia; Bertini, Ivano; Luchinat, Claudio; Nincheri, Paola; Staderini, Samuele; Turano, Paola

    2011-04-01

    (1)H NMR metabolic profiling of urine, serum and plasma has been used to monitor the impact of the pre-analytical steps on the sample quality and stability in order to propose standard operating procedures (SOPs) for deposition in biobanks. We analyzed the quality of serum and plasma samples as a function of the elapsed time (t = 0-4 h) between blood collection and processing and of the time from processing to freezing (up to 24 h). The stability of the urine metabolic profile over time (up to 24 h) at various storage temperatures was monitored as a function of the different pre-analytical treatments like pre-storage centrifugation, filtration, and addition of the bacteriostatic preservative sodium azide. Appreciable changes in the profiles, reflecting changes in the concentration of a number of metabolites, were detected and discussed in terms of chemical and enzymatic reactions for both blood and urine samples. Appropriate procedures for blood derivatives collection and urine preservation/storage that allow maintaining as much as possible the original metabolic profile of the fresh samples emerge, and are proposed as SOPs for biobanking.

  10. Standard operating procedures for pre-analytical handling of blood and urine for metabolomic studies and biobanks

    International Nuclear Information System (INIS)

    Bernini, Patrizia; Bertini, Ivano; Luchinat, Claudio; Nincheri, Paola; Staderini, Samuele; Turano, Paola

    2011-01-01

    1H NMR metabolic profiling of urine, serum and plasma has been used to monitor the impact of the pre-analytical steps on the sample quality and stability in order to propose standard operating procedures (SOPs) for deposition in biobanks. We analyzed the quality of serum and plasma samples as a function of the elapsed time (t = 0−4 h) between blood collection and processing and of the time from processing to freezing (up to 24 h). The stability of the urine metabolic profile over time (up to 24 h) at various storage temperatures was monitored as a function of the different pre-analytical treatments like pre-storage centrifugation, filtration, and addition of the bacteriostatic preservative sodium azide. Appreciable changes in the profiles, reflecting changes in the concentration of a number of metabolites, were detected and discussed in terms of chemical and enzymatic reactions for both blood and urine samples. Appropriate procedures for blood derivatives collection and urine preservation/storage that allow maintaining as much as possible the original metabolic profile of the fresh samples emerge, and are proposed as SOPs for biobanking.

  11. Simultaneous validation of the SunTech CT40 automated blood pressure measurement device by the 1993 British Hypertension Society protocol and the Association for the Advancement of Medical Instrumentation/International Organization for Standardization 81060-2: 2013 standard.

    Science.gov (United States)

    Polo Friz, Hernan; Punzi, Veronica; Petri, Francesco; Orlandi, Riccardo; Maggiolini, Daniele; Polo Friz, Melisa; Primitz, Laura; Vighi, Giuseppe

    2017-10-01

    This study aimed to perform a simultaneous, third-party, independent validation of the oscillometric SunTech CT40 device for blood pressure (BP) measurement, according to the 1993 protocol of the British Hypertension Society and the standard of the Association for the Advancement of Medical Instrumentation (AAMI)/the International Organization for Standardization (ISO) 81060-2:2013. Patient recruitment, study procedures, and data analysis followed the recommendations stated by the protocols. The study was approved by the institutional review board. A total of 94 participants were included, 52 (55.3%) women, mean±SD age: 63.1±18.0 years, mean±SD arm circumference: 35.0±9.0 cm. The average of observers' entry BPs was 146.9±37.2 mmHg for systolic blood pressure (SBP) and 82.2±22.1 mmHg for diastolic blood pressure (DBP). Differences between the standard measurement and the test device within 5, 10, and 15 mmHg, for the better observer, were 79.4, 96.5, and 100.0% for SBP and 82.6, 97.5, and 100.0% for DBP, respectively. The mean±SD differences between the readings obtained using the test device and those obtained by the observers (AAMI/ISO 81060-2:2013 standard criterion 1) were 0.3±5.0 mmHg (SBP) and -0.8±4.3 mmHg (DBP), and the mean±SD differences between average of reference readings and average of test device readings in each patient (criterion 2) were 0.3±3.9 and -0.8±3.5 mmHg for SBP and DBP, respectively. The CT40 BP device achieved A/A grade of the British Hypertension Society protocol and fulfilled the requirements (criteria 1 and 2) of the AAMI/ISO standard. CT40 can be recommended for BP measurement in adults.
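
    For readers unfamiliar with the two grading schemes, the sketch below encodes the commonly cited pass criteria (BHS cumulative-percentage thresholds and AAMI/ISO 81060-2 criterion 1) and applies them to the summary figures quoted above; the per-reading data behind criterion 2 are not reproduced:

      def bhs_grade(pct_within_5, pct_within_10, pct_within_15):
          """British Hypertension Society grade from cumulative percentages of
          device-observer differences within 5/10/15 mmHg (1993 protocol)."""
          thresholds = {"A": (60, 85, 95), "B": (50, 75, 90), "C": (40, 65, 85)}
          for grade, (t5, t10, t15) in thresholds.items():
              if pct_within_5 >= t5 and pct_within_10 >= t10 and pct_within_15 >= t15:
                  return grade
          return "D"

      def aami_criterion_1(mean_diff, sd_diff):
          """AAMI/ISO 81060-2 criterion 1: mean difference within +/-5 mmHg and
          standard deviation of the differences no greater than 8 mmHg."""
          return abs(mean_diff) <= 5.0 and sd_diff <= 8.0

      # Summary figures quoted in this record (better observer / criterion 1).
      print("SBP BHS grade:", bhs_grade(79.4, 96.5, 100.0))
      print("DBP BHS grade:", bhs_grade(82.6, 97.5, 100.0))
      print("SBP AAMI/ISO criterion 1 met:", aami_criterion_1(0.3, 5.0))
      print("DBP AAMI/ISO criterion 1 met:", aami_criterion_1(-0.8, 4.3))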

  12. Automated statistical modeling of analytical measurement systems

    International Nuclear Information System (INIS)

    Jacobson, J.J.

    1992-01-01

    The statistical modeling of analytical measurement systems at the Idaho Chemical Processing Plant (ICPP) has been completely automated through computer software. The statistical modeling of analytical measurement systems is one part of a complete quality control program used by the Remote Analytical Laboratory (RAL) at the ICPP. The quality control program is an integration of automated data input, measurement system calibration, database management, and statistical process control. The quality control program and statistical modeling program meet the guidelines set forth by the American Society for Testing and Materials and the American National Standards Institute. A statistical model is a set of mathematical equations describing any systematic bias inherent in a measurement system and the precision of a measurement system. A statistical model is developed from data generated from the analysis of control standards. Control standards are samples which are made up at precise known levels by an independent laboratory and submitted to the RAL. The RAL analysts who process control standards do not know the values of those control standards. The object behind statistical modeling is to describe real process samples in terms of their bias and precision, and to verify that a measurement system is operating satisfactorily.
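
    A minimal sketch of the underlying idea, estimating systematic bias and precision from repeated analyses of control standards with an ordinary least-squares fit of measured versus known values; the numbers are invented and this is not the ICPP software:

      import numpy as np

      # Hypothetical control-standard results: known (reference) values and the
      # values reported by the measurement system for those standards.
      known    = np.array([ 5.0,  5.0, 10.0, 10.0, 20.0, 20.0, 40.0, 40.0])
      measured = np.array([ 5.2,  4.9, 10.4, 10.1, 20.6, 20.3, 41.1, 40.6])

      # Systematic bias modelled as a straight line: measured = a + b * known.
      b, a = np.polyfit(known, measured, 1)

      # Precision modelled as the standard deviation of the residuals about the
      # fit (two fitted parameters, hence ddof=2).
      residuals = measured - (a + b * known)
      precision = residuals.std(ddof=2)

      print(f"bias model: measured = {a:.3f} + {b:.4f} * known")
      print(f"precision (residual standard deviation): {precision:.3f}")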

  13. Effectiveness of individualized physiotherapy on pain and functioning compared to a standard exercise protocol in patients presenting with clinical signs of subacromial impingement syndrome. A randomized controlled trial

    Directory of Open Access Journals (Sweden)

    de Bie Rob A

    2010-06-01

    Full Text Available Abstract Background Shoulder impingement syndrome is a common musculoskeletal complaint leading to significant reduction of health and disability. Physiotherapy is often the first choice of treatment although its effectiveness is still under debate. Systematic reviews in this field highlight the need for more high quality trials to investigate the effectiveness of physiotherapy interventions in patients with subacromial impingement syndrome. Methods/Design This randomized controlled trial will investigate the effectiveness of individualized physiotherapy in patients presenting with clinical signs and symptoms of subacromial impingement, involving 90 participants aged 18-75. Participants are recruited from outpatient physiotherapy clinics, general practitioners, and orthopaedic surgeons in Germany. Eligible participants will be randomly allocated to either individualized physiotherapy or to a standard exercise protocol using central randomization. The control group will perform the standard exercise protocol aiming to restore muscular deficits in strength, mobility, and coordination of the rotator cuff and the shoulder girdle muscles to unload the subacromial space during active movements. Participants of the intervention group will perform the standard exercise protocol as a home program, and will additionally be treated with individualized physiotherapy based on clinical examination results, and guided by a decision tree. After the intervention phase both groups will continue their home program for another 7 weeks. Outcome will be measured at 5 weeks and at 3 and 12 months after inclusion using the shoulder pain and disability index and patients' global impression of change, the generic patient-specific scale, the average weekly pain score, and patient satisfaction with treatment. Additionally, the fear avoidance beliefs questionnaire, the pain catastrophizing scale, and patients' expectancies of treatment effect are assessed. Participants

  14. Use of CTX-I and PINP as bone turnover markers: National Bone Health Alliance recommendations to standardize sample handling and patient preparation to reduce pre-analytical variability.

    Science.gov (United States)

    Szulc, P; Naylor, K; Hoyle, N R; Eastell, R; Leary, E T

    2017-09-01

    The National Bone Health Alliance (NBHA) recommends standardized sample handling and patient preparation for C-terminal telopeptide of type I collagen (CTX-I) and N-terminal propeptide of type I procollagen (PINP) measurements to reduce pre-analytical variability. Controllable and uncontrollable patient-related factors are reviewed to facilitate interpretation and minimize pre-analytical variability. The IOF and the International Federation of Clinical Chemistry (IFCC) Bone Marker Standards Working Group have identified PINP and CTX-I in blood to be the reference markers of bone turnover for the fracture risk prediction and monitoring of osteoporosis treatment. Although used in clinical research for many years, bone turnover markers (BTM) have not been widely adopted in clinical practice primarily due to their poor within-subject and between-lab reproducibility. The NBHA Bone Turnover Marker Project team aim to reduce pre-analytical variability of CTX-I and PINP measurements through standardized sample handling and patient preparation. Recommendations for sample handling and patient preparations were made based on review of available publications and pragmatic considerations to reduce pre-analytical variability. Controllable and un-controllable patient-related factors were reviewed to facilitate interpretation and sample collection. Samples for CTX-I must be collected consistently in the morning hours in the fasted state. EDTA plasma is preferred for CTX-I for its greater sample stability. Sample collection conditions for PINP are less critical as PINP has minimal circadian variability and is not affected by food intake. Sample stability limits should be observed. The uncontrollable aspects (age, sex, pregnancy, immobility, recent fracture, co-morbidities, anti-osteoporotic drugs, other medications) should be considered in BTM interpretation. Adopting standardized sample handling and patient preparation procedures will significantly reduce controllable pre-analytical

  15. XPS Protocol for the Characterization of Pristine and Functionalized Single Wall Carbon Nanotubes

    Science.gov (United States)

    Sosa, E. D.; Allada, R.; Huffman, C. B.; Arepalli, S.

    2009-01-01

    Recent interest in developing new applications for carbon nanotubes (CNT) has fueled the need to use accurate macroscopic and nanoscopic techniques to characterize and understand their chemistry. X-ray photoelectron spectroscopy (XPS) has proved to be a useful analytical tool for nanoscale surface characterization of materials including carbon nanotubes. Recent nanotechnology research at NASA Johnson Space Center (NASA-JSC) helped to establish a characterization protocol for quality assessment for single wall carbon nanotubes (SWCNTs). Here, a review of some of the major factors of the XPS technique that can influence the quality of analytical data, suggestions for methods to maximize the quality of data obtained by XPS, and the development of a protocol for XPS characterization as a complementary technique for analyzing the purity and surface characteristics of SWCNTs is presented. The XPS protocol is then applied to a number of experiments including impurity analysis and the study of chemical modifications for SWCNTs.

  16. The Singapore protocol [for quantum cryptography

    International Nuclear Information System (INIS)

    Englert, B.

    2005-01-01

    The qubit protocol for quantum key distribution presented in this talk is fully tomographic and more efficient than other tomographic protocols. Under ideal circumstances the efficiency is log2(4/3) = 0.415 key bits per qubit sent, which is 25% more than the efficiency of 1/3 = 0.333 for the standard 6-state protocol. One can extract 0.4 key bits per qubit by a simple two-way communication scheme, and can so get close to the information-theoretical limit. The noise thresholds for secure key bit generation in the presence of unbiased noise will be reported and discussed. (author)
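
    The quoted efficiency figures can be checked directly; the short computation below reproduces log2(4/3) and the roughly 25% advantage over the six-state protocol's 1/3 key bit per qubit:

      import math

      eff_singapore = math.log2(4 / 3)   # ideal efficiency, key bits per qubit sent
      eff_six_state = 1 / 3              # standard 6-state protocol

      print(f"Singapore protocol: {eff_singapore:.3f} key bits per qubit")
      print(f"6-state protocol:   {eff_six_state:.3f} key bits per qubit")
      print(f"relative gain:      {(eff_singapore / eff_six_state - 1) * 100:.1f}%")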

  17. On consensus through communication without a commonly known protocol

    OpenAIRE

    Tsakas Elias; Voorneveld Mark

    2010-01-01

    The present paper extends the standard model of pairwise communication among Bayesian agents to cases where the structure of the communication protocol is not commonly known. We show that, even under strict conditions on the structure of the protocols and the nature of the transmitted signals, a consensus may never be reached if very little asymmetric information about the protocol is introduced.

  18. Addressing the need for biomarker liquid chromatography/mass spectrometry assays: a protocol for effective method development for the bioanalysis of endogenous compounds in cerebrospinal fluid.

    Science.gov (United States)

    Benitex, Yulia; McNaney, Colleen A; Luchetti, David; Schaeffer, Eric; Olah, Timothy V; Morgan, Daniel G; Drexler, Dieter M

    2013-08-30

    Research on disorders of the central nervous system (CNS) has shown that an imbalance in the levels of specific endogenous neurotransmitters may underlie certain CNS diseases. These alterations in neurotransmitter levels may provide insight into pathophysiology, but can also serve as disease and pharmacodynamic biomarkers. To measure these potential biomarkers in vivo, the relevant sample matrix is cerebrospinal fluid (CSF), which is in equilibrium with the brain's interstitial fluid and circulates through the ventricular system of the brain and spinal cord. Accurate analysis of these potential biomarkers can be challenging due to low CSF sample volume, low analyte levels, and potential interferences from other endogenous compounds. A protocol has been established for effective method development of bioanalytical assays for endogenous compounds in CSF. Database searches and standard-addition experiments are employed to qualify sample preparation and specificity of the detection thus evaluating accuracy and precision. This protocol was applied to the study of the histaminergic neurotransmitter system and the analysis of histamine and its metabolite 1-methylhistamine in rat CSF. The protocol resulted in a specific and sensitive novel method utilizing pre-column derivatization ultra high performance liquid chromatography/tandem mass spectrometry (UHPLC/MS/MS), which is also capable of separating an endogenous interfering compound, identified as taurine, from the analytes of interest. Copyright © 2013 John Wiley & Sons, Ltd.
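
    Standard-addition experiments of the kind mentioned above are commonly evaluated by regressing response against spiked concentration and reading the endogenous level from the x-intercept; the sketch below shows that calculation on invented data and is not the published method:

      import numpy as np

      # Hypothetical standard-addition data for an endogenous analyte in CSF:
      # concentration spiked into aliquots of the same sample (ng/mL) and the
      # corresponding instrument response (peak area ratio); values are invented.
      added    = np.array([0.0, 1.0, 2.0, 4.0, 8.0])
      response = np.array([0.42, 0.61, 0.79, 1.18, 1.95])

      # Linear fit: response = slope * added + intercept.
      slope, intercept = np.polyfit(added, response, 1)

      # The endogenous concentration is read from the magnitude of the x-intercept.
      endogenous = abs(-intercept / slope)

      print(f"slope = {slope:.4f}, intercept = {intercept:.4f}")
      print(f"estimated endogenous concentration: {endogenous:.2f} ng/mL")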

  19. Manual of Standard Operating Procedures for Veterinary Drug Residue Analysis

    International Nuclear Information System (INIS)

    2016-01-01

    Laboratories are crucial to national veterinary drug residue monitoring programmes. However, one of the main challenges laboratories encounter is obtaining access to relevant methods of analysis. Thus, in addition to training, providing technical advice and transferring technology, the Joint FAO/IAEA Division of Nuclear Techniques in Food and Agriculture has resolved to develop clear and practical manuals to support Member State laboratories. The Coordinated Research Project (CRP) on Development of Radiometric and Allied Analytical Methods to Strengthen Residue Control Programs for Antibiotic and Anthelmintic Veterinary Drug Residues has developed a number of analytical methods as standard operating procedures (SOPs), which are now compiled here. This publication contains SOPs on chromatographic and spectrometric techniques, as well as radioimmunoassay and associated screening techniques, for various anthelmintic and antimicrobial veterinary drug residue analysis. Some analytical method validation protocols are also included. The publication is primarily aimed at food and environmental safety laboratories involved in testing veterinary drug residues, including under organized national residue monitoring programmes. It is expected to enhance laboratory capacity building and competence through the use of radiometric and complementary tools and techniques. The publication is also relevant for applied research on residues of veterinary drugs in food and environmental samples

  20. Protocols for pressure ulcer prevention: are they evidence-based?

    Science.gov (United States)

    Chaves, Lidice M; Grypdonck, Mieke H F; Defloor, Tom

    2010-03-01

    This paper reports a study to determine the quality of protocols for pressure ulcer prevention in home care in the Netherlands. If pressure ulcer prevention protocols are evidence-based and practitioners use them correctly in practice, this will result in a reduction in pressure ulcers. Very little is known about the evidence-based content and quality of the pressure ulcer prevention protocols. In 2008, current pressure ulcer prevention protocols from 24 home-care agencies in the Netherlands were evaluated. A checklist developed and validated by two pressure ulcer prevention experts was used to assess the quality of the protocols, and weighted and unweighted quality scores were computed and analysed using descriptive statistics. The 24 pressure ulcer prevention protocols had a mean weighted quality score of 63.38 points out of a maximum of 100 (sd 5). The importance of observing the skin at the pressure points at least once a day was emphasized in 75% of the protocols. Only 42% correctly warned against the use of materials that were 'less effective or that could potentially cause harm'. Pressure ulcer prevention commands a reasonable amount of attention in home care, but the incidence of pressure ulcers and lack of a consistent, standardized document for use in actual practice indicate a need for systematic implementation of national pressure ulcer prevention standards in the Netherlands to ensure adherence to the established protocols.

  1. EU-US standards harmonization task group report : status of ITS communication standards.

    Science.gov (United States)

    2012-11-01

    Harmonization Task Groups 1 and 3 (HTG1 and 3) were established by the EU-US International Standards Harmonization Working Group to attempt to harmonize standards (including ISO, CEN, ETSI, IEEE) on security (HTG1) and communications protocols (HTG3)...

  2. Denial of Service Attacks on 802.1X Security Protocol

    National Research Council Canada - National Science Library

    Ozan, Orhan

    2004-01-01

    ... infrastructure, such as military and administrative government LANs. The IEEE 802.11 wireless standard specifies both an authentication service and encryption protocol, but research has demonstrated that these protocols are severely flawed...

  3. Analysing Password Protocol Security Against Off-line Dictionary Attacks

    NARCIS (Netherlands)

    Corin, R.J.; Doumen, J.M.; Etalle, Sandro; Busi, Nadia; Gorrieri, Roberto; Martinelli, Fabio

    We study the security of password protocols against off-line dictionary attacks. In addition to the standard adversary abilities, we also consider further cryptographic advantages given to the adversary when considering the password protocol being instantiated with particular encryption schemes. We

  4. A Mac Protocol Implementation for Wireless Sensor Network

    Directory of Open Access Journals (Sweden)

    Jamila Bhar

    2015-01-01

    Full Text Available IEEE 802.15.4 is an important standard for Low Rate Wireless Personal Area Networks (LRWPAN). The IEEE 802.15.4 standard presents a flexible MAC protocol that provides good efficiency for data transmission by adapting its parameters according to the characteristics of different applications. In this research work, some restrictions of this standard are explained and an improvement of traffic efficiency by optimizing the MAC layer is proposed. Implementation details for several blocks of the communication system are carefully modeled. The protocol implementation is done using the VHDL language. The analysis gives a full understanding of the behavior of the MAC protocol with regard to backoff delay, data loss probability, congestion probability, slot effectiveness, and traffic distribution for terminals. Two ideas are proposed and tested to improve the efficiency of the CSMA/CA mechanism for the IEEE 802.15.4 MAC layer. First, we dynamically adjust the backoff exponent (BE) according to the queue level of each node. Second, we vary the number of consecutive clear channel assessments (CCA) for packet transmission. We also demonstrate that the slot compensation provided by the enhanced MAC protocol can greatly reduce the number of unused slots. The results show the significant improvements our approach achieves over the standard IEEE 802.15.4 MAC. Synthesis results also show the hardware performance of the proposed architecture.
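
    A hedged sketch of the first idea described above, choosing the backoff exponent from transmit-queue occupancy before the usual random backoff draw; the constants follow IEEE 802.15.4 defaults (macMinBE = 3, macMaxBE = 5, 320 us unit backoff period) but the adaptation rule itself is invented for illustration:

      import random

      MAC_MIN_BE = 3          # IEEE 802.15.4 default minimum backoff exponent
      MAC_MAX_BE = 5          # IEEE 802.15.4 default maximum backoff exponent
      UNIT_BACKOFF_US = 320   # one unit backoff period (20 symbols at 2.4 GHz O-QPSK)

      def queue_aware_be(queue_len, queue_capacity):
          """Illustrative adaptation: a fuller transmit queue gets a smaller BE
          (shorter expected backoff); an emptier queue gets a larger BE."""
          occupancy = queue_len / queue_capacity
          if occupancy > 0.75:
              return MAC_MIN_BE
          if occupancy > 0.25:
              return MAC_MIN_BE + 1
          return MAC_MAX_BE

      def draw_backoff_us(queue_len, queue_capacity, rng=random):
          be = queue_aware_be(queue_len, queue_capacity)
          slots = rng.randint(0, 2 ** be - 1)   # uniform in [0, 2^BE - 1], as in CSMA/CA
          return slots * UNIT_BACKOFF_US

      random.seed(0)
      for q in (1, 5, 9):
          print(f"queue {q}/10 -> backoff {draw_backoff_us(q, 10)} us")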

  5. Review: domestic animal forensic genetics - biological evidence, genetic markers, analytical approaches and challenges.

    Science.gov (United States)

    Kanthaswamy, S

    2015-10-01

    This review highlights the importance of domestic animal genetic evidence sources, genetic testing, markers and analytical approaches as well as the challenges this field is facing in view of the de facto 'gold standard' human DNA identification. Because of the genetic similarity between humans and domestic animals, genetic analysis of domestic animal hair, saliva, urine, blood and other biological material has generated vital investigative leads that have been admitted into a variety of court proceedings, including criminal and civil litigation. Information on validated short tandem repeat, single nucleotide polymorphism and mitochondrial DNA markers and public access to genetic databases for forensic DNA analysis is becoming readily available. Although the fundamental aspects of animal forensic genetic testing may be reliable and acceptable, animal forensic testing still lacks the standardized testing protocols that human genetic profiling requires, probably because of the absence of monetary support from government agencies and the difficulty in promoting cooperation among competing laboratories. Moreover, there is a lack in consensus about how to best present the results and expert opinion to comply with court standards and bear judicial scrutiny. This has been the single most persistent challenge ever since the earliest use of domestic animal forensic genetic testing in a criminal case in the mid-1990s. Crime laboratory accreditation ensures that genetic test results have the courts' confidence. Because accreditation requires significant commitments of effort, time and resources, the vast majority of animal forensic genetic laboratories are not accredited nor are their analysts certified forensic examiners. The relevance of domestic animal forensic genetics in the criminal justice system is undeniable. However, further improvements are needed in a wide range of supporting resources, including standardized quality assurance and control protocols for sample

  6. Intelligent Local Avoided Collision (iLAC) MAC Protocol for Very High Speed Wireless Network

    Science.gov (United States)

    Hieu, Dinh Chi; Masuda, Akeo; Rabarijaona, Verotiana Hanitriniala; Shimamoto, Shigeru

    Future wireless communication systems aim at very high data rates. As the medium access control (MAC) protocol plays the central role in determining the overall performance of the wireless system, designing a suitable MAC protocol is critical to fully exploit the benefit of high speed transmission that the physical layer (PHY) offers. In the latest 802.11n standard [2], the problem of long overhead has been addressed adequately but the issue of excessive colliding transmissions, especially in congested situations, remains untouched. The procedure of setting the backoff value is the heart of the 802.11 distributed coordination function (DCF) to avoid collision, in which each station makes its own decision on how to avoid collision in the next transmission. However, collision avoidance is a problem that cannot be solved by a single station. In this paper, we introduce a new MAC protocol called Intelligent Local Avoided Collision (iLAC) that redefines individual rationality in choosing the backoff counter value to avoid a colliding transmission. The distinguishing feature of iLAC is that it fundamentally changes this decision-making process from collision avoidance to collaborative collision prevention. As a result, stations can avoid colliding transmissions with much greater precision. An analytical solution confirms the validity of this proposal, and simulation results show that the proposed algorithm outperforms the conventional algorithms by a large margin.

  7. Measuring myokines with cardiovascular functions: pre-analytical variables affecting the analytical output.

    Science.gov (United States)

    Lombardi, Giovanni; Sansoni, Veronica; Banfi, Giuseppe

    2017-08-01

    In the last few years, a growing number of molecules have been associated with an endocrine function of the skeletal muscle. Circulating myokine levels, in turn, have been associated with several pathophysiological conditions including the cardiovascular ones. However, data from different studies are often not completely comparable or even discordant. This would be due, at least in part, to the whole set of situations related to the preparation of the patient prior to blood sampling, the blood sampling procedure, processing and/or storage. This entire process constitutes the pre-analytical phase. The importance of the pre-analytical phase is often not considered. However, in routine diagnostics, about 70% of the errors occur in this phase. Moreover, errors during the pre-analytical phase are carried over into the analytical phase and affect the final output. In research, for example, when samples are collected over a long time and by different laboratories, standardized procedures for sample collection and correct procedures for sample storage are acknowledged to be essential. In this review, we discuss the pre-analytical variables potentially affecting the measurement of myokines with cardiovascular functions.

  8. Do we need 3D tube current modulation information for accurate organ dosimetry in chest CT? Protocols dose comparisons.

    Science.gov (United States)

    Lopez-Rendon, Xochitl; Zhang, Guozhi; Coudyzer, Walter; Develter, Wim; Bosmans, Hilde; Zanca, Federica

    2017-11-01

    To compare the lung and breast dose associated with three chest protocols: standard, organ-based tube current modulation (OBTCM) and fast-speed scanning; and to estimate the error associated with organ dose when modelling the longitudinal (z-) TCM versus the 3D-TCM in Monte Carlo simulations (MC) for these three protocols. Five adult and three paediatric cadavers with different BMI were scanned. The CTDIvol of the OBTCM and the fast-speed protocols were matched to the patient-specific CTDIvol of the standard protocol. Lung and breast doses were estimated using MC with both z- and 3D-TCM simulated and compared between protocols. The fast-speed scanning protocol delivered the highest doses. A slight reduction in breast dose (up to 5.1%) was observed for two of the three female cadavers with the OBTCM in comparison to the standard. For both adult and paediatric cadavers, using only the z-TCM data for organ dose estimation resulted in an accuracy within 10.0% for the standard and fast-speed protocols, while relative dose differences were up to 15.3% for the OBTCM protocol. At identical CTDIvol values, the standard protocol delivered the lowest overall doses. Only for the OBTCM protocol is the 3D-TCM needed if accurate (<10.0%) organ dosimetry is desired. • The z-TCM information is sufficient for accurate dosimetry for standard protocols. • The z-TCM information is sufficient for accurate dosimetry for fast-speed scanning protocols. • For organ-based TCM schemes, the 3D-TCM information is necessary for accurate dosimetry. • At identical CTDIvol, the fast-speed scanning protocol delivered the highest doses. • Lung dose was higher with the XCare (OBTCM) protocol than with the standard protocol at identical CTDIvol.

  9. Patient controlled sedation using a standard protocol for dressing changes in burns: patients' preference, procedural details and a preliminary safety evaluation.

    Science.gov (United States)

    Nilsson, Andreas; Steinvall, Ingrid; Bak, Zoltan; Sjöberg, Folke

    2008-11-01

    Patient controlled sedation (PCS) enables patients to titrate doses of drugs by themselves during different procedures involving pain or discomfort. We studied it in a prospective crossover design using a fixed protocol without lockout time to examine it as an alternative method of sedation for changing dressings in burned patients. Eleven patients with >10% total burn surface area (TBSA) had their dressings changed, starting with sedation by an anaesthetist (ACS). The second dressing change was done with PCS (propofol/alfentanil) and the third time the patients had to choose ACS or PCS. During the procedures, data on cardiopulmonary variables, sedation (bispectral index), pain intensity (VAS), procedural details, doses of drugs, and patients' preferences were collected to compare the two sedation techniques. The study data indicated that wound care in burned patients is feasible with a standardized PCS protocol. The patients preferred PCS to ACS on the basis of self-control, and because they had less discomfort during the recovery period. Wound care was also considered adequate by the staff during PCS. No respiratory (respiratory rate/transcutaneous PCO(2)) or cardiovascular (heart rate/blood pressure) adverse events were recorded at any time during any of the PCS procedures. The doses of propofol and alfentanil and BIS index decrease were less during PCS than ACS. Procedural pain was higher during PCS but lower after the procedure. We suggest that PCS using a standard protocol is an interesting alternative to anaesthetist-provided sedation during dressing changes. It seems effective, saves resources, is safe, and at same time is preferred by the patients. The strength of these conclusions is, however, hampered by the small size of this investigation and therefore further studies are warranted.

  10. Two analytical models for evaluating performance of Gigabit Ethernet Hosts

    International Nuclear Information System (INIS)

    Salah, K.

    2006-01-01

    Two analytical models are developed to study the impact of interrupt overhead on operating system performance of network hosts when subjected to Gigabit network traffic. Under heavy network traffic, the system performance will be negatively affected due to interrupt overhead caused by incoming traffic. In particular, excessive latency and significant degradation in system throughput can be experienced. Also, user applications may livelock as the CPU power is mostly consumed by interrupt handling and protocol processing. In this paper, we present and compare two analytical models that capture host behavior and evaluate its performance. The first model is based on Markov processes and queueing theory, while the second, which is more accurate but more complex, is a pure Markov process. For the most part, both models give mathematically-equivalent closed-form solutions for a number of important system performance metrics. These metrics include throughput, latency and stability condition, CPU utilization of interrupt handling and protocol processing, and CPU availability for user applications. The analysis yields insight into understanding and predicting the impact of system and network choices on the performance of interrupt-driven systems when subjected to light and heavy network loads. More importantly, our analytical work can also be valuable in improving host performance. The paper gives guidelines and recommendations to address design and implementation issues. Simulation and reported experimental results show that our analytical models are valid and give a good approximation. (author)
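
    The paper's closed-form solutions are not reproduced here; as an illustration of the behaviour such models capture, the sketch below evaluates a textbook-style receive-livelock approximation in which CPU time is consumed first by interrupt handling and the remainder bounds protocol processing, so delivered throughput collapses once interrupts saturate the CPU (all service times are assumed):

      # Receive-livelock style approximation: each arriving packet costs T_ISR of
      # CPU in interrupt context; whatever CPU remains bounds protocol processing
      # at T_PROTO per packet.  Both service times are assumed, not measured.
      T_ISR = 5e-6      # seconds of CPU per interrupt
      T_PROTO = 15e-6   # seconds of CPU per packet of protocol processing/delivery

      def delivered_throughput(arrival_rate_pps):
          cpu_left = 1.0 - arrival_rate_pps * T_ISR      # CPU fraction not in ISRs
          if cpu_left <= 0.0:
              return 0.0                                 # livelock: interrupts consume all CPU
          return min(arrival_rate_pps, cpu_left / T_PROTO)

      for rate in (10_000, 40_000, 60_000, 120_000, 220_000):
          print(f"{rate:>7} pkt/s offered -> {delivered_throughput(rate):>8.0f} pkt/s delivered")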

  11. Standardization of 8-color flow cytometry across different flow cytometer instruments: A feasibility study in clinical laboratories in Switzerland.

    Science.gov (United States)

    Glier, Hana; Heijnen, Ingmar; Hauwel, Mathieu; Dirks, Jan; Quarroz, Stéphane; Lehmann, Thomas; Rovo, Alicia; Arn, Kornelius; Matthes, Thomas; Hogan, Cassandra; Keller, Peter; Dudkiewicz, Ewa; Stüssi, Georg; Fernandez, Paula

    2017-07-29

    The EuroFlow Consortium developed a fully standardized flow cytometric approach from instrument settings, through antibody panel, reagents and sample preparation protocols, to data acquisition and analysis. The Swiss Cytometry Society (SCS) promoted a study to evaluate the feasibility of using such standardized measurements of 8-color data across two different flow cytometry platforms - Becton Dickinson (BD) FACSCanto II and Beckman Coulter (BC) Navios, aiming at increasing reproducibility and inter-laboratory comparability of immunophenotypic data in clinical laboratories in Switzerland. The study was performed in two phases, i.e. a learning phase (round 1) and an analytical phase (rounds 2 and 3) consisting of a total of three rounds. Overall, 10 laboratories using BD FACSCanto II (n=6) or BC Navios (n=4) flow cytometers participated. Each laboratory measured peripheral blood samples from healthy donors stained with a uniform antibody panel of reagents - EuroFlow Lymphoid Screening Tube (LST) - applying the EuroFlow standardized protocols for instrument setup and sample preparation (www.EuroFlow.org). All data files were analyzed centrally and median fluorescence intensity (MedFI) values for individual markers on defined lymphocyte subsets were recorded; variability from reference MedFI values was assessed using performance scores. Data troubleshooting and discussion of the results with the participants followed after each round at SCS meetings. The results of the learning phase demonstrated that standardized instrument setup and data acquisition are feasible in routine clinical laboratories without previous experience with EuroFlow. During the analytical phase, highly comparable data were obtained at the different laboratories using either BD FACSCanto II or BC Navios. The coefficient of variation of MedFI was repeatedly below 30% for 7 of 11 markers. In the last study round, 89% of participants scored over 90% MedFI values within the acceptance criteria

  12. Protocol of the COSMIN study: COnsensus-based Standards for the selection of health Measurement INstruments

    Directory of Open Access Journals (Sweden)

    Patrick DL

    2006-01-01

    Full Text Available Abstract Background Choosing an adequate measurement instrument depends on the proposed use of the instrument, the concept to be measured, the measurement properties (e.g. internal consistency, reproducibility, content and construct validity, responsiveness, and interpretability), the requirements, the burden for subjects, and costs of the available instruments. As far as measurement properties are concerned, there are no sufficiently specific standards for the evaluation of measurement properties of instruments to measure health status, and also no explicit criteria for what constitutes good measurement properties. In this paper we describe the protocol for the COSMIN study, the objective of which is to develop a checklist that contains COnsensus-based Standards for the selection of health Measurement INstruments, including explicit criteria for satisfying these standards. We will focus on evaluative health-related patient-reported outcomes (HR-PROs), i.e. patient-reported health measurement instruments used in a longitudinal design as an outcome measure, excluding health care related PROs, such as satisfaction with care or adherence. The COSMIN standards will be made available in the form of an easily applicable checklist. Method An international Delphi study will be performed to reach consensus on which and how measurement properties should be assessed, and on criteria for good measurement properties. Two sources of input will be used for the Delphi study: (1) a systematic review of properties, standards and criteria of measurement properties found in systematic reviews of measurement instruments, and (2) an additional literature search of methodological articles presenting a comprehensive checklist of standards and criteria. The Delphi study will consist of four (written) Delphi rounds, with approximately 30 expert panel members with different backgrounds in clinical medicine, biostatistics, psychology, and epidemiology. The final checklist will

  13. Test Protocols for Advanced Inverter Interoperability Functions - Appendices

    Energy Technology Data Exchange (ETDEWEB)

    Johnson, Jay Dean [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Gonzalez, Sigifredo [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Ralph, Mark E. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Ellis, Abraham [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Broderick, Robert Joseph [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States)

    2013-11-01

    Distributed energy resources (DER) such as photovoltaic (PV) systems, when deployed in a large scale, are capable of influencing significantly the operation of power systems. Looking to the future, stakeholders are working on standards to make it possible to manage the potentially complex interactions between DER and the power system. In 2009, the Electric Power Research Institute (EPRI), Sandia National Laboratories (SNL) with the U.S. Department of Energy (DOE), and the Solar Electric Power Association (SEPA) initiated a large industry collaborative to identify and standardize definitions for a set of DER grid support functions. While the initial effort concentrated on grid-tied PV inverters and energy storage systems, the concepts have applicability to all DER. A partial product of this on-going effort is a reference definitions document (IEC TR 61850-90-7, Object models for power converters in distributed energy resources (DER) systems) that has become a basis for expansion of related International Electrotechnical Commission (IEC) standards, and is supported by US National Institute of Standards and Technology (NIST) Smart Grid Interoperability Panel (SGIP). Some industry-led organizations advancing communications protocols have also embraced this work. As standards continue to evolve, it is necessary to develop test protocols to independently verify that the inverters are properly executing the advanced functions. Interoperability is assured by establishing common definitions for the functions and a method to test compliance with operational requirements. This document describes test protocols developed by SNL to evaluate the electrical performance and operational capabilities of PV inverters and energy storage, as described in IEC TR 61850-90-7. While many of these functions are not now required by existing grid codes or may not be widely available commercially, the industry is rapidly moving in that direction. Interoperability issues are already apparent as
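
    One of the IEC TR 61850-90-7 grid-support functions exercised by such test protocols is the volt-var characteristic; the sketch below evaluates a generic piecewise-linear volt-var curve, with curve points chosen for illustration rather than taken from the Sandia documents:

      # Illustrative piecewise-linear volt-var curve: (grid voltage in per unit,
      # reactive power command as a fraction of available VArs).  The points are
      # examples only, not values from the referenced test protocols.
      CURVE = [(0.92, 1.0), (0.98, 0.0), (1.02, 0.0), (1.08, -1.0)]

      def volt_var_setpoint(v_pu):
          """Linear interpolation on the curve, clamped at the end points."""
          if v_pu <= CURVE[0][0]:
              return CURVE[0][1]
          if v_pu >= CURVE[-1][0]:
              return CURVE[-1][1]
          for (v1, q1), (v2, q2) in zip(CURVE, CURVE[1:]):
              if v1 <= v_pu <= v2:
                  return q1 + (q2 - q1) * (v_pu - v1) / (v2 - v1)

      for v in (0.90, 0.95, 1.00, 1.05, 1.10):
          print(f"V = {v:.2f} pu -> Q command = {volt_var_setpoint(v):+.2f} of available VArs")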

  14. The effect of the introduction of a standard monitoring protocol on ...

    African Journals Online (AJOL)

    2011-03-07

    Mar 7, 2011 ... management protocol had had any effect on the metabolic and nonmetabolic .... Ages were not recorded. ..... of happiness and wealth.14 Three-quarters of the girls associated ... Information on lifestyle and advice to reduce.

  15. On Equivalence between Critical Probabilities of Dynamic Gossip Protocol and Static Site Percolation

    Science.gov (United States)

    Ishikawa, Tetsuya; Hayakawa, Tomohisa

    The relationship between the critical probability of gossip protocol on the square lattice and the critical probability of site percolation on the square lattice is discussed. Specifically, these two critical probabilities are analytically shown to be equal to each other. Furthermore, we present a way of evaluating the critical probability of site percolation by approximating the saturation of gossip protocol. Finally, we provide numerical results which support the theoretical analysis.
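    For orientation, the site-percolation threshold on the square lattice is approximately 0.5927 (a standard literature value, not a number from this paper). A generic Monte Carlo sketch such as the one below, which estimates the probability that a randomly occupied lattice contains a top-to-bottom spanning cluster, shows the sharp rise of that probability near the threshold; the lattice size and trial count are arbitrary choices for illustration, not the paper's method.

        # Generic spanning-cluster Monte Carlo; not the paper's analytical approach.
        import random
        from collections import deque

        def spans(n, p, rng):
            occupied = [[rng.random() < p for _ in range(n)] for _ in range(n)]
            seen = [[False] * n for _ in range(n)]
            queue = deque((0, c) for c in range(n) if occupied[0][c])
            for _, c in queue:
                seen[0][c] = True
            while queue:
                r, c = queue.popleft()
                if r == n - 1:          # reached the bottom row: spanning cluster found
                    return True
                for dr, dc in ((1, 0), (-1, 0), (0, 1), (0, -1)):
                    rr, cc = r + dr, c + dc
                    if 0 <= rr < n and 0 <= cc < n and occupied[rr][cc] and not seen[rr][cc]:
                        seen[rr][cc] = True
                        queue.append((rr, cc))
            return False

        def spanning_probability(p, n=64, trials=200, seed=1):
            rng = random.Random(seed)
            return sum(spans(n, p, rng) for _ in range(trials)) / trials

        if __name__ == "__main__":
            for p in (0.55, 0.59, 0.63):
                print(f"p = {p:.2f}  P(span) ~ {spanning_probability(p):.2f}")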

  16. Croatian Analytical Terminology

    Directory of Open Access Journals (Sweden)

    Kaštelan-Macan, M.

    2008-04-01

    oljić, I. Eškinja, M. Kaštelan-Macan, I. Piljac. Š. Cerjan-Stefanović and others translated Chromatographic nomenclature (IUPAC Compendium of Analytical Nomenclature). The related area is covered by books of V. Grdinić and F. Plavšić. During the project Croatian nomenclature of analytical chemistry there shall be an analysis of dictionaries, textbooks, handbooks, professional and scientific monographs and articles, official governmental and economic publications, regulations and instructions. The Compendium of Analytical Nomenclature is expected to have been translated and the translation mostly adjusted to the Croatian language standard. EUROLAB and EURACHEM documents related to quality assurance in analytical laboratories, especially in research and development have not yet been included in the Compendium, and due to the globalization of the information and service market, such documents need to be adjusted to the Croatian language standard in collaboration with consultants from the Institute for Croatian Language and Linguistics. The terms shall be sorted according to the analytical process from sampling to final information. It is expected that the project's results shall be adopted by the Croatian scientific and professional community, so as to raise the awareness of the necessity of using Croatian terms in everyday professional communication and particularly in scientific and educational work. The Croatian language is rich enough for all analytical terms to be translated appropriately. This shall complete the work our predecessors began several times. We face a great challenge of contributing to the creation of the Croatian scientific terminology and believe we shall succeed.

  17. Dynamic federations: storage aggregation using open tools and protocols

    CERN Document Server

    Fabrizio Furano, F F; Ricardo Brito da Rocha, R R; Adrien Devresse, A D; Oliver Keeble, O K; Alejandro Alvarez Ayllon, A A

    2012-01-01

    A number of storage elements now offer standard protocol interfaces like NFS 4.1/pNFS and WebDAV, for access to their data repositories, in line with the standardization effort of the European Middleware Initiative (EMI). Also the LCG FileCatalogue (LFC) can offer such features. Here we report on work that seeks to exploit the federation potential of these protocols and build a system that offers a unique view of the storage and metadata ensemble and the possibility of integration of other compatible resources such as those from cloud providers. The challenge, here undertaken by the providers of dCache and DPM, and pragmatically open to other Grid and Cloud storage solutions, is to build such a system while being able to accommodate name translations from existing catalogues (e.g. LFCs), experiment-based metadata catalogues, or stateless algorithmic name translations, also known as “trivial file catalogues”. Such so-called storage federations of standard protocols-based storage elements give a unique vie...

  18. EU-US standards harmonization task group report : feedback to standards development organizations - security

    Science.gov (United States)

    2012-11-12

    Harmonization Task Groups 1 and 3 (HTG1 and 3) were established by the EU-US International Standards Harmonization Working Group to attempt to harmonize standards (including ISO, CEN, ETSI, IEEE) on security (HTG1) and communications protocols (HTG3)...

  19. CytometryML: a markup language for analytical cytology

    Science.gov (United States)

    Leif, Robert C.; Leif, Stephanie H.; Leif, Suzanne B.

    2003-06-01

    Cytometry Markup Language, CytometryML, is a proposed new analytical cytology data standard. CytometryML is a set of XML schemas for encoding both flow cytometry and digital microscopy text based data types. CytometryML schemas reference both DICOM (Digital Imaging and Communications in Medicine) codes and FCS keywords. These schemas provide representations for the keywords in FCS 3.0 and will soon include DICOM microscopic image data. Flow Cytometry Standard (FCS) list-mode has been mapped to the DICOM Waveform Information Object. A preliminary version of a list mode binary data type, which does not presently exist in DICOM, has been designed. This binary type is required to enhance the storage and transmission of flow cytometry and digital microscopy data. Index files based on Waveform indices will be used to rapidly locate the cells present in individual subsets. DICOM has the advantage of employing standard file types, TIF and JPEG, for Digital Microscopy. Using an XML schema based representation means that standard commercial software packages such as Excel and MathCad can be used to analyze, display, and store analytical cytometry data. Furthermore, by providing one standard for both DICOM data and analytical cytology data, it eliminates the need to create and maintain special purpose interfaces for analytical cytology data thereby integrating the data into the larger DICOM and other clinical communities. A draft version of CytometryML is available at www.newportinstruments.com.

  20. International Society for Analytical Cytology biosafety standard for sorting of unfixed cells.

    Science.gov (United States)

    Schmid, Ingrid; Lambert, Claude; Ambrozak, David; Marti, Gerald E; Moss, Delynn M; Perfetto, Stephen P

    2007-06-01

    Cell sorting of viable biological specimens has become very prevalent in laboratories involved in basic and clinical research. As these samples can contain infectious agents, precautions to protect instrument operators and the environment from hazards arising from the use of sorters are paramount. To this end the International Society of Analytical Cytology (ISAC) took a lead in establishing biosafety guidelines for sorting of unfixed cells (Schmid et al., Cytometry 1997;28:99-117). During the time period these recommendations have been available, they have become recognized worldwide as the standard practices and safety precautions for laboratories performing viable cell sorting experiments. However, the field of cytometry has progressed since 1997, and the document requires an update. Initially, suggestions about the document format and content were discussed among members of the ISAC Biosafety Committee and were incorporated into a draft version that was sent to all committee members for review. Comments were collected, carefully considered, and incorporated as appropriate into a draft document that was posted on the ISAC web site to invite comments from the flow cytometry community at large. The revised document was then submitted to ISAC Council for review. Simultaneously, further comments were sought from newly-appointed ISAC Biosafety committee members. This safety standard for performing viable cell sorting experiments was recently generated. The document contains background information on the biohazard potential of sorting and the hazard classification of infectious agents as well as recommendations on (1) sample handling, (2) operator training and personal protection, (3) laboratory design, (4) cell sorter set-up, maintenance, and decontamination, and (5) testing the instrument for the efficiency of aerosol containment. This standard constitutes an updated and expanded revision of the 1997 biosafety guideline document. It is intended to provide

  1. Hopping control channel MAC protocol for opportunistic spectrum access networks

    Institute of Scientific and Technical Information of China (English)

    FU Jing-tuan; JI Hong; MAO Xu

    2010-01-01

    Opportunistic spectrum access (OSA) is considered as a promising approach to mitigate spectrum scarcity by allowing unlicensed users to exploit spectrum opportunities in licensed frequency bands. Derived from the existing channel-hopping multiple access (CHMA) protocol, we introduce a hopping control channel medium access control (MAC) protocol in the context of OSA networks. In our proposed protocol, all nodes in the network follow a common channel-hopping sequence; every frequency channel can be used as control channel and data channel. Considering primary users' occupancy of the channel, we use a primary user (PU) detection model to calculate the channel availability for unlicensed users' access. Then, a discrete Markov chain analytical model is applied to describe the channel states and deduce the system throughput. Through simulation, we present numerical results to demonstrate the throughput performance of our protocol and thus validate our work.
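    As a minimal illustration of the kind of Markov-chain bookkeeping the abstract refers to (not the authors' actual model), the sketch below computes the stationary idle probability of a two-state licensed-channel chain and a naive throughput proxy; the transition probabilities and payload size are invented.

        # Two-state (idle/busy) discrete-time Markov chain for a licensed channel.
        def stationary_two_state(p_idle_to_busy, p_busy_to_idle):
            # Balance: pi_idle * P(idle->busy) = pi_busy * P(busy->idle), pi_idle + pi_busy = 1
            pi_idle = p_busy_to_idle / (p_idle_to_busy + p_busy_to_idle)
            return pi_idle, 1.0 - pi_idle

        if __name__ == "__main__":
            pi_idle, pi_busy = stationary_two_state(0.2, 0.6)
            payload_bits_per_slot = 1024            # hypothetical
            print("P(idle) =", round(pi_idle, 3),
                  "throughput proxy =", pi_idle * payload_bits_per_slot, "bits/slot")

    The full analysis in the paper additionally has to account for the hopping sequence, collisions among secondary users and the PU detection model, which this toy calculation ignores.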

  2. Development of a manualized protocol of massage therapy for clinical trials in osteoarthritis

    Directory of Open Access Journals (Sweden)

    Ali Ather

    2012-10-01

    Full Text Available Abstract Background Clinical trial design of manual therapies may be especially challenging as techniques are often individualized and practitioner-dependent. This paper describes our methods in creating a standardized Swedish massage protocol tailored to subjects with osteoarthritis of the knee while respectful of the individualized nature of massage therapy, as well as implementation of this protocol in two randomized clinical trials. Methods The manualization process involved a collaborative process between methodologic and clinical experts, with the explicit goals of creating a reproducible semi-structured protocol for massage therapy, while allowing some latitude for therapists’ clinical judgment and maintaining consistency with a prior pilot study. Results The manualized protocol addressed identical specified body regions with distinct 30- and 60-min protocols, using standard Swedish strokes. Each protocol specifies the time allocated to each body region. The manualized 30- and 60-min protocols were implemented in a dual-site 24-week randomized dose-finding trial in patients with osteoarthritis of the knee, and are currently being implemented in a three-site 52-week efficacy trial of manualized Swedish massage therapy. In the dose-finding study, therapists adhered to the protocols and significant treatment effects were demonstrated. Conclusions The massage protocol was manualized, using standard techniques, and made flexible for individual practitioner and subject needs. The protocol has been applied in two randomized clinical trials. This manualized Swedish massage protocol has real-world utility and can be readily utilized both in the research and clinical settings. Trial registration Clinicaltrials.gov NCT00970008 (18 August 2009)

  3. A standardized imaging protocol for the endoscopic prediction of dysplasia within sessile serrated polyps (with video).

    Science.gov (United States)

    Tate, David J; Jayanna, Mahesh; Awadie, Halim; Desomer, Lobke; Lee, Ralph; Heitman, Steven J; Sidhu, Mayenaaz; Goodrick, Kathleen; Burgess, Nicholas G; Mahajan, Hema; McLeod, Duncan; Bourke, Michael J

    2018-01-01

    Dysplasia within sessile serrated polyps (SSPs) is difficult to detect and may be mistaken for an adenoma, risking incomplete resection of the background serrated tissue, and is strongly implicated in interval cancer after colonoscopy. The use of endoscopic imaging to detect dysplasia within SSPs has not been systematically studied. Consecutively detected SSPs ≥8 mm in size were evaluated by using a standardized imaging protocol at a tertiary-care endoscopy center over 3 years. Lesions suspected as SSPs were analyzed with high-definition white light then narrow-band imaging. A demarcated area with a neoplastic pit pattern (Kudo type III/IV, NICE type II) was sought among the serrated tissue. If this was detected, the lesion was labeled dysplastic (sessile serrated polyp with dysplasia); if not, it was labeled non-dysplastic (sessile serrated polyp without dysplasia). Histopathology was reviewed by 2 blinded specialist GI pathologists. A total of 141 SSPs were assessed in 83 patients. Median lesion size was 15.0 mm (interquartile range 10-20), and 54.6% were in the right side of the colon. Endoscopic evidence of dysplasia was detected in 36 of 141 (25.5%) SSPs; of these, 5 of 36 (13.9%) lacked dysplasia at histopathology. Two of 105 (1.9%) endoscopically designated non-dysplastic SSPs had dysplasia at histopathology. Endoscopic imaging, therefore, had an accuracy of 95.0% (95% confidence interval [CI], 90.1%-97.6%) and a negative predictive value of 98.1% (95% CI, 92.6%-99.7%) for detection of dysplasia within SSPs. Dysplasia within SSPs can be detected accurately by using a simple, broadly applicable endoscopic imaging protocol that allows complete resection. Independent validation of this protocol and its dissemination to the wider endoscopic community may have a significant impact on rates of interval cancer. (Clinical trial registration number: NCT03100552.). Copyright © 2018 American Society for Gastrointestinal Endoscopy. Published by Elsevier Inc. All
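    The headline accuracy and negative predictive value follow directly from the counts given in the abstract; the short check below redoes that arithmetic (confidence intervals are omitted).

        # Cross-check of the reported figures: 36 lesions called dysplastic (5 without
        # dysplasia on histology) and 105 called non-dysplastic (2 with dysplasia).
        tp, fp = 36 - 5, 5        # true / false positives
        tn, fn = 105 - 2, 2       # true / false negatives
        accuracy = (tp + tn) / (tp + fp + tn + fn)   # ~0.950
        npv = tn / (tn + fn)                         # ~0.981
        print(f"accuracy = {accuracy:.3f}, NPV = {npv:.3f}")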

  4. Visualization of the internal globus pallidus: sequence and orientation for deep brain stimulation using a standard installation protocol at 3.0 Tesla.

    Science.gov (United States)

    Nölte, Ingo S; Gerigk, Lars; Al-Zghloul, Mansour; Groden, Christoph; Kerl, Hans U

    2012-03-01

    Deep-brain stimulation (DBS) of the internal globus pallidus (GPi) has shown remarkable therapeutic benefits for treatment-resistant neurological disorders including dystonia and Parkinson's disease (PD). The success of the DBS is critically dependent on the reliable visualization of the GPi. The aim of the study was to evaluate promising 3.0 Tesla magnetic resonance imaging (MRI) methods for pre-stereotactic visualization of the GPi using a standard installation protocol. MRI at 3.0 T of nine healthy individuals and of one patient with PD was acquired (FLAIR, T1-MPRAGE, T2-SPACE, T2*-FLASH2D, susceptibility-weighted imaging mapping (SWI)). Image quality and visualization of the GPi for each sequence were assessed by two neuroradiologists independently using a 6-point scale. Axial, coronal, and sagittal planes of the T2*-FLASH2D images were compared. Inter-rater reliability, contrast-to-noise ratios (CNR) and signal-to-noise ratios (SNR) for the GPi were determined. For illustration, axial T2*-FLASH2D images were fused with a section schema of the Schaltenbrand-Wahren stereotactic atlas. The GPi was best and reliably visualized in axial and to a lesser degree on coronal T2*-FLASH2D images. No major artifacts in the GPi were observed in any of the sequences. SWI offered a significantly higher CNR for the GPi compared to standard T2-weighted imaging using the standard parameters. The fusion of the axial T2*-FLASH2D images and the atlas projected the GPi clearly in the boundaries of the section schema. Using a standard installation protocol at 3.0 T T2*-FLASH2D imaging (particularly axial view) provides optimal and reliable delineation of the GPi.
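    The abstract reports contrast-to-noise and signal-to-noise ratios for the GPi without spelling out the formulas; one commonly used convention (an assumption here, not necessarily the authors' exact definition) is

        \mathrm{SNR}_{\mathrm{GPi}} = \frac{\bar{S}_{\mathrm{GPi}}}{\sigma_{\mathrm{noise}}}, \qquad
        \mathrm{CNR} = \frac{\left|\bar{S}_{\mathrm{GPi}} - \bar{S}_{\mathrm{ref}}\right|}{\sigma_{\mathrm{noise}}},

    where \bar{S} is the mean signal in a region of interest (the GPi or an adjacent reference region such as white matter) and \sigma_{\mathrm{noise}} is the standard deviation of background noise.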

  5. Evaluation of sample preparation protocols for spider venom profiling by MALDI-TOF MS.

    Science.gov (United States)

    Bočánek, Ondřej; Šedo, Ondrej; Pekár, Stano; Zdráhal, Zbyněk

    2017-07-01

    Spider venoms are highly complex mixtures containing biologically active substances with potential for use in biotechnology or pharmacology. Fingerprinting of venoms by Matrix-Assisted Laser Desorption-Ionization - Time of Flight Mass Spectrometry (MALDI-TOF MS) is a thriving technology, enabling the rapid detection of peptide/protein components that can provide comparative information. In this study, we evaluated the effects of sample preparation procedures on MALDI-TOF mass spectral quality to establish a protocol providing the most reliable analytical outputs. We adopted initial sample preparation conditions from studies already published in this field. Three different MALDI matrixes, three matrix solvents, two sample deposition methods, and different acid concentrations were tested. As a model sample, venom from Brachypelma albopilosa was used. The mass spectra were evaluated on the basis of absolute and relative signal intensities, and signal resolution. By conducting three series of analyses at three weekly intervals, the reproducibility of the mass spectra were assessed as a crucial factor in the selection for optimum conditions. A sample preparation protocol based on the use of an HCCA matrix dissolved in 50% acetonitrile with 2.5% TFA deposited onto the target by the dried-droplet method was found to provide the best results in terms of information yield and repeatability. We propose that this protocol should be followed as a standard procedure, enabling the comparative assessment of MALDI-TOF MS spider venom fingerprints. Copyright © 2017 Elsevier Ltd. All rights reserved.

  6. Computer controlled quality of analytical measurements

    International Nuclear Information System (INIS)

    Clark, J.P.; Huff, G.A.

    1979-01-01

    A PDP 11/35 computer system is used in evaluating analytical chemistry measurements quality control data at the Barnwell Nuclear Fuel Plant. This computerized measurement quality control system has several features which are not available in manual systems, such as real-time measurement control, computer calculated bias corrections and standard deviation estimates, surveillance applications, evaluation of measurement system variables, records storage, immediate analyst recertification, and the elimination of routine analysis of known bench standards. The effectiveness of the Barnwell computer system has been demonstrated in gathering and assimilating the measurements of over 1100 quality control samples obtained during a recent plant demonstration run. These data were used to determine equations for predicting measurement reliability estimates (bias and precision); to evaluate the measurement system; and to provide direction for modification of chemistry methods. The analytical chemistry measurement quality control activities represented 10% of the total analytical chemistry effort
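    As a hedged sketch of the bookkeeping such a computerized measurement-control system automates (not the Barnwell implementation itself), the snippet below estimates bias and precision from repeated measurements of a known bench standard and flags a result that falls outside a ±3-sigma control limit; all numbers are invented.

        from statistics import mean, stdev

        def control_stats(measured, certified):
            """Bias and 1-sigma precision from repeated measurements of a known standard."""
            return mean(measured) - certified, stdev(measured)

        def out_of_control(value, certified, bias, precision, k=3.0):
            """Flag a new result outside certified + bias +/- k * precision."""
            return abs(value - (certified + bias)) > k * precision

        if __name__ == "__main__":
            certified = 10.00                              # hypothetical certified value
            history = [9.97, 10.02, 10.05, 9.99, 10.01, 10.03]
            bias, precision = control_stats(history, certified)
            print(round(bias, 4), round(precision, 4),
                  out_of_control(10.45, certified, bias, precision))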

  7. Analytical and pre-analytical performance characteristics of a novel cartridge-type blood gas analyzer for point-of-care and laboratory testing.

    Science.gov (United States)

    Oyaert, Matthijs; Van Maerken, Tom; Bridts, Silke; Van Loon, Silvi; Laverge, Heleen; Stove, Veronique

    2018-03-01

    Point-of-care blood gas test results may benefit therapeutic decision making by their immediate impact on patient care. We evaluated the (pre-)analytical performance of a novel cartridge-type blood gas analyzer, the GEM Premier 5000 (Werfen), for the determination of pH, partial carbon dioxide pressure (pCO2), partial oxygen pressure (pO2), sodium (Na+), potassium (K+), chloride (Cl-), ionized calcium (iCa2+), glucose, lactate, and total hemoglobin (tHb). Total imprecision was estimated according to the CLSI EP5-A2 protocol. The estimated total error was calculated based on the mean of the range claimed by the manufacturer. Based on the CLSI EP9-A2 evaluation protocol, a method comparison with the Siemens RapidPoint 500 and Abbott i-STAT CG8+ was performed. Obtained data were compared against preset quality specifications. Interference of potential pre-analytical confounders on co-oximetry and electrolyte concentrations were studied. The analytical performance was acceptable for all parameters tested. Method comparison demonstrated good agreement to the RapidPoint 500 and i-STAT CG8+, except for some parameters (RapidPoint 500: pCO2, K+, lactate and tHb; i-STAT CG8+: pO2, Na+, iCa2+ and tHb) for which significant differences between analyzers were recorded. No interference of lipemia or methylene blue on CO-oximetry results was found. On the contrary, significant interference for benzalkonium and hemolysis on electrolyte measurements were found, for which the user is notified by an interferent specific flag. Identification of sample errors from pre-analytical sources, such as interferences and automatic corrective actions, along with the analytical performance, ease of use and low maintenance time of the instrument, makes the evaluated instrument a suitable blood gas analyzer for both POCT and laboratory use. Copyright © 2018 The Canadian Society of Clinical Chemists. Published by Elsevier Inc. All rights reserved.
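    The abstract does not spell out how total error was combined from its components; a widely used formulation (an assumption here, not a statement about this study's exact calculation) is

        \mathrm{TE} = |\mathrm{bias}| + z \cdot s,

    with s the imprecision (standard deviation) from the CLSI EP5-style experiment and z a coverage factor, commonly 1.65 for one-sided or 1.96 for two-sided 95% coverage; the result is then compared against the preset quality specification for each analyte.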

  8. An analytical model for the performance of geographical multi-hop broadcast

    NARCIS (Netherlands)

    Klein Wolterink, W.; Heijenk, G.; Berg, J.L. van den

    2012-01-01

    In this paper we present an analytical model accurately describing the behaviour of a multi-hop broadcast protocol. Our model covers the scenario in which a message is forwarded over a straight road and inter-node distances are distributed exponentially. Intermediate forwarders draw a small random

  9. Results of ERAS protocol in patients with colorectal cancer

    Directory of Open Access Journals (Sweden)

    A. O. Rasulov

    2016-01-01

    Full Text Available Objective: explore the use of enhanced recovery after surgery (ERAS) in the treatment of patients with colorectal cancer, evaluate its efficacy and safety. Materials and methods. A prospective, single-site, randomized study for the implementation of enhanced recovery after surgery in patients with colorectal cancer has been conducted from October 2014 till the present time. All patients after laparoscopic surgeries undergo treatment according to the ERAS protocol; patients after open surgeries are randomized (1:1) into groups of standard treatment or treatment according to the ERAS protocol. The study included patients with localized and locally disseminated colorectal cancer aged from 18 to 75 years, ECOG score ≤ 2. The primary evaluated parameters were the following: the number of postoperative complications (according to the Clavien–Dindo classification), postoperative hospital days, incidence of complications and mortality in the 30-day period, and timing of activation. Results. To date, the study includes 105 patients: laparoscopic group – 51 patients, open-surgery group treated by the ERAS protocol – 27 patients, open-surgery group with standard post-op treatment – 26 patients. Complications requiring emergency surgery for anastomotic leak (p = 0.159) developed in 3.7 % of patients with the standard post-op treatment and in 3.9 % of patients after laparoscopic surgery, while 1 patient required repeat hospitalization. The total number of complications was significantly lower in the open-surgery group treated by the ERAS protocol compared with the standard post-op treatment (p = 0.021). However, there were no differences between the laparoscopic and the open-surgery group with standard post-op treatment (p = 0.159). The average hospital stay in patients with standard post-op treatment was 10 days, compared to 7 days in patients treated by the ERAS protocol (p = 0.067) and 6 days after laparoscopic

  10. Comparison of protocols and RNA carriers for plasma miRNA isolation. Unraveling RNA carrier influence on miRNA isolation

    Science.gov (United States)

    Martos, Laura; Fernández-Pardo, Álvaro; Oto, Julia; Medina, Pilar; España, Francisco; Navarro, Silvia

    2017-01-01

    microRNAs are promising biomarkers in biological fluids in several diseases. Different plasma RNA isolation protocols and carriers are available, but their efficiencies have been scarcely compared. Plasma microRNAs were isolated using a phenol and column-based procedure and a column-based procedure, in the presence or absence of two RNA carriers (yeast RNA and MS2 RNA). We evaluated the presence of PCR inhibitors and the relative abundance of certain microRNAs by qRT-PCR. Furthermore, we analyzed the association between different isolation protocols, the relative abundance of the miRNAs in the sample, the GC content and the free energy of microRNAs. In all microRNAs analyzed, the addition of yeast RNA as a carrier in the different isolation protocols used gave lower raw Cq values, indicating higher microRNA recovery. Moreover, this increase in microRNAs recovery was dependent on their own relative abundance in the sample, their GC content and the free-energy of their own most stable secondary structure. Furthermore, the normalization of microRNA levels by an endogenous microRNA is more reliable than the normalization by plasma volume, as it reduced the difference in microRNA fold abundance between the different isolation protocols evaluated. Our thorough study indicates that a standardization of pre- and analytical conditions is necessary to obtain reproducible inter-laboratory results in plasma microRNA studies. PMID:29077772
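    The comparison between normalizing to plasma volume and to an endogenous microRNA ultimately comes down to the arithmetic applied to the qRT-PCR quantification cycles. The sketch below shows the standard 2^(-ΔCq) calculation; the miRNA names and Cq values are invented, and the example is constructed so that both "protocols" shift all Cq values equally, which endogenous normalization cancels out.

        # Standard 2^(-dCq) relative-abundance arithmetic (invented names and values).
        def relative_abundance(cq_target, cq_reference):
            """Fold abundance of the target miRNA relative to an endogenous reference."""
            return 2.0 ** (-(cq_target - cq_reference))

        if __name__ == "__main__":
            samples = {
                "protocol_A": {"miR-X": 28.4, "miR-ref": 24.1},
                "protocol_B": {"miR-X": 27.2, "miR-ref": 22.9},
            }
            for name, cq in samples.items():
                print(name, round(relative_abundance(cq["miR-X"], cq["miR-ref"]), 4))

    Both invented samples yield the same normalized value (about 0.051) despite different raw Cq values, which illustrates why normalization to an endogenous miRNA is more robust to recovery differences between isolation protocols than normalization by plasma volume.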

  11. Standardization and optimization of arthropod inventories-the case of Iberian spiders

    DEFF Research Database (Denmark)

    Bondoso Cardoso, Pedro Miguel

    2009-01-01

    and optimization of sampling protocols, especially for mega-diverse arthropod taxa. This study had two objectives: (1) to propose guidelines and statistical methods to improve the standardization and optimization of arthropod inventories, and (2) to propose a standardized and optimized protocol for Iberian spiders …, by finding common results between the optimal options for the different sites. The steps listed were successfully followed in the determination of a sampling protocol for Iberian spiders. A protocol with three sub-protocols of varying degrees of effort (24, 96 and 320 h of sampling) is proposed. I also...

  12. Effects of standard training in the use of closed-circuit televisions in visually impaired adults: design of a training protocol and a randomized controlled trial

    Directory of Open Access Journals (Sweden)

    van Rens Ger HMB

    2010-03-01

    Full Text Available Abstract Background Reading problems are frequently reported by visually impaired persons. A closed-circuit television (CCTV) can be helpful to maintain reading ability, however, it is difficult to learn how to use this device. In the Netherlands, an evidence-based rehabilitation program in the use of CCTVs was lacking. Therefore, a standard training protocol needed to be developed and tested in a randomized controlled trial (RCT) to provide an evidence-based training program in the use of this device. Methods/Design To develop a standard training program, information was collected by studying literature, observing training in the use of CCTVs, discussing the content of the training program with professionals and organizing focus and discussion groups. The effectiveness of the program was evaluated in an RCT, to obtain an evidence-based training program. Dutch patients (n = 122) were randomized into a treatment group: normal instructions from the supplier combined with training in the use of CCTVs, or into a control group: instructions from the supplier only. The effect of the training program was evaluated in terms of: change in reading ability (reading speed and reading comprehension), patients' skills to operate the CCTV, perceived (vision-related) quality of life and tasks performed in daily living. Discussion The development of the CCTV training protocol and the design of the RCT in the present study may serve as an example to obtain an evidence-based training program. The training program was adjusted to the needs and learning abilities of individual patients, however, for scientific reasons it might have been preferable to standardize the protocol further, in order to gain more comparable results. Trial registration http://www.trialregister.nl, identifier: NTR1031

  13. Superposition Attacks on Cryptographic Protocols

    DEFF Research Database (Denmark)

    Damgård, Ivan Bjerre; Funder, Jakob Løvstad; Nielsen, Jesper Buus

    2011-01-01

    Attacks on classical cryptographic protocols are usually modeled by allowing an adversary to ask queries from an oracle. Security is then defined by requiring that as long as the queries satisfy some constraint, there is some problem the adversary cannot solve, such as compute a certain piece of information. In this paper, we introduce a fundamentally new model of quantum attacks on classical cryptographic protocols, where the adversary is allowed to ask several classical queries in quantum superposition. This is a strictly stronger attack than the standard one, and we consider the security of several primitives in this model. We show that a secret-sharing scheme that is secure with threshold $t$ in the standard model is secure against superposition attacks if and only if the threshold is lowered to $t/2$. We use this result to give zero-knowledge proofs for all of NP in the common reference...

  14. Persistent RCSMA: A MAC Protocol for a Distributed Cooperative ARQ Scheme in Wireless Networks

    Directory of Open Access Journals (Sweden)

    J. Alonso-Zárate

    2008-05-01

    Full Text Available The persistent relay carrier sensing multiple access (PRCSMA) protocol is presented in this paper as a novel medium access control (MAC) protocol that allows for the execution of a distributed cooperative automatic retransmission request (ARQ) scheme in IEEE 802.11 wireless networks. The underlying idea of the PRCSMA protocol is to modify the basic rules of the IEEE 802.11 MAC protocol to execute a distributed cooperative ARQ scheme in wireless networks in order to enhance their performance and to extend coverage. A closed formulation of the distributed cooperative ARQ average packet transmission delay in a saturated network is derived in the paper. The analytical equations are then used to evaluate the performance of the protocol under different network configurations. Both the accuracy of the analysis and the performance evaluation of the protocol are supported and validated through computer simulations.

  15. Advanced dementia pain management protocols.

    Science.gov (United States)

    Montoro-Lorite, Mercedes; Canalias-Reverter, Montserrat

    Pain management in advanced dementia is complex because of neurological deficits present in these patients, and nurses are directly responsible for providing interventions for the evaluation, management and relief of pain for people suffering from this health problem. In order to facilitate and help decision-makers, pain experts recommend the use of standardized protocols to guide pain management, but in Spain, comprehensive pain management protocols have not yet been developed for advanced dementia. This article reflects the need for an integrated management of pain in advanced dementia. From the review and analysis of the most current and relevant studies in the literature, we performed an approximation of the scales for the determination of pain in these patients, with the observational scale PAINAD being the most recommended for the hospital setting. In addition, we provide an overview for comprehensive management of pain in advanced dementia through the conceptual framework «a hierarchy of pain assessment techniques by McCaffery and Pasero» for the development and implementation of standardized protocols, including a four-phase cyclical process (evaluation, planning/performance, revaluation and recording), which can facilitate the correct management of pain in these patients. Copyright © 2017 Elsevier España, S.L.U. All rights reserved.

  16. Biologic comparison of partial breast irradiation protocols

    International Nuclear Information System (INIS)

    Rosenstein, Barry S.; Lymberis, Stella C.; Formenti, Silvia C.

    2004-01-01

    Purpose: To analyze the dose/fractionation schedules currently used in ongoing clinical trials of partial breast irradiation (PBI) by comparing their biologically effective dose (BED) values to those of three standard whole breast protocols commonly used after segmental mastectomy in the treatment of breast cancer. Methods and materials: The BED equation derived from the linear-quadratic model for radiation-induced cell killing was used to calculate the BEDs for three commonly used whole breast radiotherapy regimens, in addition to a variety of external beam radiotherapy, as well as high-dose-rate and low-dose-rate brachytherapy, PBI protocols. Results: The BED values of most PBI protocols resulted in tumor control BEDs roughly equivalent to a 50-Gy standard treatment, but consistently lower than the BEDs for regimens in which the tumor bed receives a total dose of either 60 Gy or 66 Gy. The BED values calculated for the acute radiation responses of erythema and desquamation were nearly all lower for the PBI schedules, and the late-response BEDs for most PBI regimens were in a similar range to the BEDs for the standard treatments. Conclusion: Biologically effective dose modeling raises the concern that inadequate doses might be delivered by PBI to ensure optimal in-field tumor control
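    For reference, the BED expression from the linear-quadratic model used in such comparisons is

        \mathrm{BED} = n\,d\left(1 + \frac{d}{\alpha/\beta}\right),

    where n is the number of fractions, d the dose per fraction and \alpha/\beta the tissue-specific ratio. As a worked example with conventional (assumed) ratios, a standard 50 Gy in 25 fractions of 2 Gy gives 50(1 + 2/10) = 60 Gy_10 for tumor control (\alpha/\beta = 10 Gy) and 50(1 + 2/3) ≈ 83.3 Gy_3 for late-responding normal tissue (\alpha/\beta = 3 Gy); the specific \alpha/\beta values used by the authors may differ.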

  17. EU-US standards harmonization task group report : feedback to ITS standards development organizations communications.

    Science.gov (United States)

    2012-11-01

    Harmonization Task Groups 1 and 3 (HTG1 and 3) were established by the EU-US International Standards Harmonization Working Group to attempt to harmonize standards (including ISO, CEN, ETSI, IEEE) on security (HTG1) and communications protocols (HTG3)...

  18. Analytical chemistry

    International Nuclear Information System (INIS)

    Anon.

    1985-01-01

    The division for Analytical Chemistry continued to try to develop an accurate method for the separation of trace amounts from mixtures which contain various other elements. Ion exchange chromatography is of special importance in this regard. New separation techniques were tried on certain trace amounts in South African standard rock materials and special ceramics. Methods were also tested for the separation of carrier-free radioisotopes from irradiated cyclotron discs

  19. Relevant Standards

    Indian Academy of Sciences (India)

    .86: Ethernet over LAPS. Standard in China and India. G.7041: Generic Framing Procedure (GFP). Supports Ethernet as well as other data formats (e.g., Fibre Channel); Protocol of ... IEEE 802.3x for flow control of incoming Ethernet data ...

  20. The Nagoya Protocol: Fragmentation or Consolidation?

    Directory of Open Access Journals (Sweden)

    Carmen Richerzhagen

    2014-02-01

    Full Text Available In October, 2010, a protocol on access and benefit-sharing (ABS) of genetic resources was adopted, the so-called Nagoya Protocol on Access to Genetic Resources and the Fair and Equitable Sharing of Benefits Arising from their Utilization to the Convention on Biological Diversity. Before the adoption of the Nagoya Protocol, the governance architecture of ABS was already characterized by a multifaceted institutional environment. The use of genetic resources is confronted with many issues (conservation, research and development, intellectual property rights, food security, health issues, climate change) that are governed by different institutions and agreements. The Nagoya Protocol contributes to increased fragmentation. However, the question arises whether this new regulatory framework can help to advance the implementation of the ABS provisions of the Convention on Biological Diversity (CBD). This paper attempts to find an answer to that question by following three analytical steps. First, it analyzes the causes of change against the background of theories of institutional change. Second, it aims to assess the typology of the architecture in order to find out if this new set of rules will contribute to a more synergistic, cooperative or conflictive architecture of ABS governance. Third, the paper looks at the problem of “fit” and identifies criteria that can be used to assess the new ABS governance architecture with regard to its effectiveness.

  1. In silico toxicology protocols.

    Science.gov (United States)

    Myatt, Glenn J; Ahlberg, Ernst; Akahori, Yumi; Allen, David; Amberg, Alexander; Anger, Lennart T; Aptula, Aynur; Auerbach, Scott; Beilke, Lisa; Bellion, Phillip; Benigni, Romualdo; Bercu, Joel; Booth, Ewan D; Bower, Dave; Brigo, Alessandro; Burden, Natalie; Cammerer, Zoryana; Cronin, Mark T D; Cross, Kevin P; Custer, Laura; Dettwiler, Magdalena; Dobo, Krista; Ford, Kevin A; Fortin, Marie C; Gad-McDonald, Samantha E; Gellatly, Nichola; Gervais, Véronique; Glover, Kyle P; Glowienke, Susanne; Van Gompel, Jacky; Gutsell, Steve; Hardy, Barry; Harvey, James S; Hillegass, Jedd; Honma, Masamitsu; Hsieh, Jui-Hua; Hsu, Chia-Wen; Hughes, Kathy; Johnson, Candice; Jolly, Robert; Jones, David; Kemper, Ray; Kenyon, Michelle O; Kim, Marlene T; Kruhlak, Naomi L; Kulkarni, Sunil A; Kümmerer, Klaus; Leavitt, Penny; Majer, Bernhard; Masten, Scott; Miller, Scott; Moser, Janet; Mumtaz, Moiz; Muster, Wolfgang; Neilson, Louise; Oprea, Tudor I; Patlewicz, Grace; Paulino, Alexandre; Lo Piparo, Elena; Powley, Mark; Quigley, Donald P; Reddy, M Vijayaraj; Richarz, Andrea-Nicole; Ruiz, Patricia; Schilter, Benoit; Serafimova, Rositsa; Simpson, Wendy; Stavitskaya, Lidiya; Stidl, Reinhard; Suarez-Rodriguez, Diana; Szabo, David T; Teasdale, Andrew; Trejo-Martin, Alejandra; Valentin, Jean-Pierre; Vuorinen, Anna; Wall, Brian A; Watts, Pete; White, Angela T; Wichard, Joerg; Witt, Kristine L; Woolley, Adam; Woolley, David; Zwickl, Craig; Hasselgren, Catrin

    2018-04-17

    The present publication surveys several applications of in silico (i.e., computational) toxicology approaches across different industries and institutions. It highlights the need to develop standardized protocols when conducting toxicity-related predictions. This contribution articulates the information needed for protocols to support in silico predictions for major toxicological endpoints of concern (e.g., genetic toxicity, carcinogenicity, acute toxicity, reproductive toxicity, developmental toxicity) across several industries and regulatory bodies. Such novel in silico toxicology (IST) protocols, when fully developed and implemented, will ensure in silico toxicological assessments are performed and evaluated in a consistent, reproducible, and well-documented manner across industries and regulatory bodies to support wider uptake and acceptance of the approaches. The development of IST protocols is an initiative developed through a collaboration among an international consortium to reflect the state-of-the-art in in silico toxicology for hazard identification and characterization. A general outline for describing the development of such protocols is included and it is based on in silico predictions and/or available experimental data for a defined series of relevant toxicological effects or mechanisms. The publication presents a novel approach for determining the reliability of in silico predictions alongside experimental data. In addition, we discuss how to determine the level of confidence in the assessment based on the relevance and reliability of the information. Copyright © 2018. Published by Elsevier Inc.

  2. Do we need 3D tube current modulation information for accurate organ dosimetry in chest CT? Protocols dose comparisons

    Energy Technology Data Exchange (ETDEWEB)

    Lopez-Rendon, Xochitl; Develter, Wim [KU Leuven, Department of Imaging and Pathology, Division of Medical Physics and Quality Assessment, Leuven (Belgium); Zhang, Guozhi; Coudyzer, Walter; Zanca, Federica [University Hospitals of the KU Leuven, Department of Radiology, Leuven (Belgium); Bosmans, Hilde [KU Leuven, Department of Imaging and Pathology, Division of Medical Physics and Quality Assessment, Leuven (Belgium); University Hospitals of the KU Leuven, Department of Radiology, Leuven (Belgium)

    2017-11-15

    To compare the lung and breast dose associated with three chest protocols: standard, organ-based tube current modulation (OBTCM) and fast-speed scanning; and to estimate the error associated with organ dose when modelling the longitudinal (z-) TCM versus the 3D-TCM in Monte Carlo simulations (MC) for these three protocols. Five adult and three paediatric cadavers with different BMI were scanned. The CTDIvol of the OBTCM and the fast-speed protocols were matched to the patient-specific CTDIvol of the standard protocol. Lung and breast doses were estimated using MC with both z- and 3D-TCM simulated and compared between protocols. The fast-speed scanning protocol delivered the highest doses. A slight reduction for breast dose (up to 5.1%) was observed for two of the three female cadavers with the OBTCM in comparison to the standard. For both adult and paediatric cadavers, the implementation of the z-TCM data only for organ dose estimation resulted in 10.0% accuracy for the standard and fast-speed protocols, while relative dose differences were up to 15.3% for the OBTCM protocol. At identical CTDIvol values, the standard protocol delivered the lowest overall doses. Only for the OBTCM protocol is the 3D-TCM needed if an accurate (<10.0%) organ dosimetry is desired. (orig.)

  3. Orthogonal analytical methods for botanical standardization: determination of green tea catechins by qNMR and LC-MS/MS.

    Science.gov (United States)

    Napolitano, José G; Gödecke, Tanja; Lankin, David C; Jaki, Birgit U; McAlpine, James B; Chen, Shao-Nong; Pauli, Guido F

    2014-05-01

    The development of analytical methods for parallel characterization of multiple phytoconstituents is essential to advance the quality control of herbal products. While chemical standardization is commonly carried out by targeted analysis using gas or liquid chromatography-based methods, more universal approaches based on quantitative ¹H NMR (qHNMR) measurements are being used increasingly in the multi-targeted assessment of these complex mixtures. The present study describes the development of a 1D qHNMR-based method for simultaneous identification and quantification of green tea constituents. This approach utilizes computer-assisted ¹H iterative Full Spin Analysis (HiFSA) and enables rapid profiling of seven catechins in commercial green tea extracts. The qHNMR results were cross-validated against quantitative profiles obtained with an orthogonal LC-MS/MS method. The relative strengths and weaknesses of both approaches are discussed, with special emphasis on the role of identical reference standards in qualitative and quantitative analyses. Copyright © 2013 Elsevier B.V. All rights reserved.
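    The abstract does not restate the underlying quantification relation; for orientation, the standard internal-calibrant qHNMR equation (a textbook relation, not a claim about this paper's exact workflow) is

        m_{a} = \frac{I_{a}}{I_{\mathrm{cal}}} \cdot \frac{N_{\mathrm{cal}}}{N_{a}} \cdot \frac{M_{a}}{M_{\mathrm{cal}}} \cdot m_{\mathrm{cal}} \cdot P_{\mathrm{cal}},

    where I is the integrated signal area, N the number of protons contributing to that signal, M the molar mass, m the weighed mass, P the purity, and the subscripts a and cal denote the analyte (e.g. a catechin) and the internal calibrant, respectively.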

  4. Whole-Body Magnetic Resonance Angiography at 3 Tesla Using a Hybrid Protocol in Patients with Peripheral Arterial Disease

    International Nuclear Information System (INIS)

    Nielsen, Yousef W.; Eiberg, Jonas P.; Logager, Vibeke B.; Schroeder, Torben V.; Just, Sven; Thomsen, Henrik S.

    2009-01-01

    The purpose of this study was to determine the diagnostic performance of 3T whole-body magnetic resonance angiography (WB-MRA) using a hybrid protocol in comparison with a standard protocol in patients with peripheral arterial disease (PAD). In 26 consecutive patients with PAD two different protocols were used for WB-MRA: a standard sequential protocol (n = 13) and a hybrid protocol (n = 13). WB-MRA was performed using a gradient echo sequence, body coil for signal reception, and gadoterate meglumine as contrast agent (0.3 mmol/kg body weight). Two blinded observers evaluated all WB-MRA examinations with regard to presence of stenoses, as well as diagnostic quality and degree of venous contamination in each of the four stations used in WB-MRA. Digital subtraction angiography served as the method of reference. Sensitivity for detecting significant arterial disease (luminal narrowing ≥ 50%) using standard-protocol WB-MRA for the two observers was 0.63 (95%CI: 0.51-0.73) and 0.66 (0.58-0.78). Specificities were 0.94 (0.91-0.97) and 0.96 (0.92-0.98), respectively. In the hybrid protocol WB-MRA sensitivities were 0.75 (0.64-0.84) and 0.70 (0.58-0.8), respectively. Specificities were 0.93 (0.88-0.96) and 0.95 (0.91-0.97). Interobserver agreement was good using both the standard and the hybrid protocol, with κ = 0.62 (0.44-0.67) and κ = 0.70 (0.59-0.79), respectively. WB-MRA quality scores were significantly higher in the lower leg using the hybrid protocol compared to standard protocol (p = 0.003 and p = 0.03, observers 1 and 2). Distal venous contamination scores were significantly lower with the hybrid protocol (p = 0.02 and p = 0.01, observers 1 and 2). In conclusion, hybrid-protocol WB-MRA shows a better diagnostic performance than standard protocol WB-MRA at 3 T in patients with PAD.
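    Inter-observer agreement of the kind quoted (κ of roughly 0.62 to 0.70) is Cohen's kappa computed from a 2×2 table of segment-level ratings. The abstract does not give the underlying counts, so the sketch below uses invented agreement counts purely to show the calculation.

        # Cohen's kappa for two observers rating segments as diseased / not diseased.
        def cohens_kappa(both_pos, obs1_only, obs2_only, both_neg):
            n = both_pos + obs1_only + obs2_only + both_neg
            p_observed = (both_pos + both_neg) / n
            p1_pos = (both_pos + obs1_only) / n
            p2_pos = (both_pos + obs2_only) / n
            p_expected = p1_pos * p2_pos + (1 - p1_pos) * (1 - p2_pos)
            return (p_observed - p_expected) / (1 - p_expected)

        if __name__ == "__main__":
            print(round(cohens_kappa(both_pos=40, obs1_only=10, obs2_only=12, both_neg=150), 2))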

  5. A CAD system and quality assurance protocol for bone age assessment utilizing digital hand atlas

    Science.gov (United States)

    Gertych, Arakadiusz; Zhang, Aifeng; Ferrara, Benjamin; Liu, Brent J.

    2007-03-01

    Determination of bone age assessment (BAA) in pediatric radiology is a task based on detailed analysis of patient's left hand X-ray. The current standard utilized in clinical practice relies on a subjective comparison of the hand with patterns in the book atlas. The computerized approach to BAA (CBAA) utilizes automatic analysis of the regions of interest in the hand image. This procedure is followed by extraction of quantitative features sensitive to skeletal development that are further converted to a bone age value utilizing knowledge from the digital hand atlas (DHA). This also allows providing BAA results resembling current clinical approach. All developed methodologies have been combined into one CAD module with a graphical user interface (GUI). CBAA can also improve the statistical and analytical accuracy based on a clinical work-flow analysis. For this purpose a quality assurance protocol (QAP) has been developed. Implementation of the QAP helped to make the CAD more robust and find images that cannot meet conditions required by DHA standards. Moreover, the entire CAD-DHA system may gain further benefits if clinical acquisition protocol is modified. The goal of this study is to present the performance improvement of the overall CAD-DHA system with QAP and the comparison of the CAD results with chronological age of 1390 normal subjects from the DHA. The CAD workstation can process images from local image database or from a PACS server.

  6. Manual of Standard Operating Procedures for Veterinary Drug Residue Analysis (Spanish Edition)

    International Nuclear Information System (INIS)

    2017-01-01

    Laboratories are crucial to national veterinary drug residue monitoring programmes. However, one of the main challenges laboratories encounter is obtaining access to relevant methods of analysis. Thus, in addition to training, providing technical advice and transferring technology, the Joint FAO/IAEA Division of Nuclear Techniques in Food and Agriculture has resolved to develop clear and practical manuals to support Member State laboratories. The Coordinated Research Project (CRP) on Development of Radiometric and Allied Analytical Methods to Strengthen Residue Control Programs for Antibiotic and Anthelmintic Veterinary Drug Residues has developed a number of analytical methods as standard operating procedures (SOPs), which are now compiled here. This publication contains SOPs on chromatographic and spectrometric techniques, as well as radioimmunoassay and associated screening techniques, for various anthelmintic and antimicrobial veterinary drug residue analysis. Some analytical method validation protocols are also included. The publication is primarily aimed at food and environmental safety laboratories involved in testing veterinary drug residues, including under organized national residue monitoring programmes. It is expected to enhance laboratory capacity building and competence through the use of radiometric and complementary tools and techniques. The publication is also relevant for applied research on residues of veterinary drugs in food and environmental samples

  7. Manual of Standard Operating Procedures for Veterinary Drug Residue Analysis (French Edition)

    International Nuclear Information System (INIS)

    2017-01-01

    Laboratories are crucial to national veterinary drug residue monitoring programmes. However, one of the main challenges laboratories encounter is obtaining access to relevant methods of analysis. Thus, in addition to training, providing technical advice and transferring technology, the Joint FAO/IAEA Division of Nuclear Techniques in Food and Agriculture has resolved to develop clear and practical manuals to support Member State laboratories. The Coordinated Research Project (CRP) on Development of Radiometric and Allied Analytical Methods to Strengthen Residue Control Programs for Antibiotic and Anthelmintic Veterinary Drug Residues has developed a number of analytical methods as standard operating procedures (SOPs), which are now compiled here. This publication contains SOPs on chromatographic and spectrometric techniques, as well as radioimmunoassay and associated screening techniques, for various anthelmintic and antimicrobial veterinary drug residue analysis. Some analytical method validation protocols are also included. The publication is primarily aimed at food and environmental safety laboratories involved in testing veterinary drug residues, including under organized national residue monitoring programmes. It is expected to enhance laboratory capacity building and competence through the use of radiometric and complementary tools and techniques. The publication is also relevant for applied research on residues of veterinary drugs in food and environmental samples

  8. Performance Improvement Based Authentication Protocol for Intervessel Traffic Service Data Exchange Format Protocol Based on U-Navigation System in WoT Environment

    Directory of Open Access Journals (Sweden)

    Byunggil Lee

    2014-01-01

    Full Text Available The International Association of Lighthouse Authorities (IALA) is developing the standard intersystem VTS exchange format (IVEF) protocol for the exchange of navigation and vessel information between VTS systems and between VTS and vessels. VTS (vessel traffic system) is an important marine traffic monitoring system which is designed to improve the safety and efficiency of navigation and the protection of the marine environment. The demand for inter-VTS networking has increased with the move toward e-Navigation and shore-side collaboration for maritime safety, and IVEF (inter-VTS data exchange format) for inter-VTS networks has become a hot research topic in VTS systems. Currently, the IVEF developed by IALA does not include any highly trusted certification technology for the connectors. The output of standardization is distributed as IALA recommendation V-145, and the protocol is implemented as open source. The IVEF open source, however, is code intended to check the functions of the standard protocols; it is too slow to be used in the field and requires a large amount of memory. In addition, vessel traffic information requires strong security, since it is closely protected by national authorities. Therefore, this paper proposes an authentication protocol to increase the security of VTS systems using the main certification server and IVEF.

  9. Reactive GTS Allocation Protocol for Sporadic Events Using the IEEE 802.15.4

    Directory of Open Access Journals (Sweden)

    Mukhtar Azeem

    2014-01-01

    by the IEEE 802.15.4 standard. The proposed control protocol ensures that a given offline sporadic schedule can be adapted online in a timely manner such that the static periodic schedule has not been disturbed and the IEEE 802.15.4 standard compliance remains intact. The proposed protocol is simulated in OPNET. The simulation results are analyzed and presented in this paper to prove the correctness of the proposed protocol regarding the efficient real-time sporadic event delivery along with the periodic event propagation.

  10. INTERGROWTH-21st Gestational Dating and Fetal and Newborn Growth Standards in Peri-Urban Nairobi, Kenya: Quasi-Experimental Implementation Study Protocol.

    Science.gov (United States)

    Millar, Kathryn; Patel, Suha; Munson, Meghan; Vesel, Linda; Subbiah, Shalini; Jones, Rachel M; Little, Sarah; Papageorghiou, Aris T; Villar, Jose; Wegner, Mary Nell; Pearson, Nick; Muigai, Faith; Ongeti, Catherine; Langer, Ana

    2018-06-22

    The burden of preterm birth, fetal growth impairment, and associated neonatal deaths disproportionately falls on low- and middle-income countries where modern obstetric tools are not available to date pregnancies and monitor fetal growth accurately. The INTERGROWTH-21st gestational dating, fetal growth monitoring, and newborn size at birth standards make this possible. To scale up the INTERGROWTH-21st standards, it is essential to assess the feasibility and acceptability of their implementation and their effect on clinical decision-making in a low-resource clinical setting. This study protocol describes a pre-post, quasi-experimental implementation study of the standards at Jacaranda Health, a maternity hospital in peri-urban Nairobi, Kenya. All women with viable fetuses receiving antenatal and delivery services, their resulting newborns, and the clinicians caring for them from March 2016 to March 2018 are included. The study comprises a 12-month preimplementation phase, a 12-month implementation phase, and a 5-month post-implementation phase to be completed in August 2018. Quantitative clinical and qualitative data collected during the preimplementation and implementation phases will be assessed. A clinician survey was administered eight months into the implementation phase, month 20 of the study. Implementation outcomes include quantitative and qualitative analyses of feasibility, acceptability, adoption, appropriateness, fidelity, and penetration of the standards. Clinical outcomes include appropriateness of referral and effect of the standards on clinical care and decision-making. Descriptive analyses will be conducted, and comparisons will be made between pre- and postimplementation outcomes. Qualitative data will be analyzed using thematic coding and compared across time. The study was approved by the Amref Ethics and Scientific Review Committee (Kenya) and the Harvard University Institutional Review Board. Study results will be shared with stakeholders

  11. An Improved 6LoWPAN Hierarchical Routing Protocol

    Directory of Open Access Journals (Sweden)

    Xue Li

    2015-10-01

    Full Text Available The IETF 6LoWPAN working group develops the IPv6 protocol stack for networks based on the IEEE 802.15.4 standard. Within this working group, routing protocols are one of the important research topics. In 6LoWPAN, HiLow is a well-known hierarchical routing protocol. This paper puts forward an improved hierarchical routing protocol, GHiLow, which improves HiLow's parent node selection and path restoration strategy. GHiLow improves parent node selection by increasing the number of parameters considered. Simultaneously, it improves path recovery by analysing different path-recovery situations. Therefore, GHiLow enhances network performance and decreases network energy consumption.
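    HiLow-style protocols avoid routing tables by assigning hierarchical 16-bit short addresses from which a node's parent can be recovered arithmetically. The toy sketch below follows the commonly described HiLow assignment rule; treat the exact formula and the MC value as assumptions, since the abstract does not restate them.

        # Toy HiLow-style hierarchical address assignment (MC = max children per node).
        MC = 4  # hypothetical

        def child_address(parent_addr, child_index):
            """Address of the child_index-th child (1..MC) of parent_addr."""
            assert 1 <= child_index <= MC
            return parent_addr * MC + child_index

        def parent_address(addr):
            """Recover the parent of a non-root address without any routing table."""
            return (addr - 1) // MC

        if __name__ == "__main__":
            root = 0
            a = child_address(root, 2)      # -> 2
            b = child_address(a, 3)         # -> 11
            print(a, b, parent_address(b))  # -> 2 11 2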

  12. Standardized cardiovascular magnetic resonance imaging (CMR protocols, society for cardiovascular magnetic resonance: board of trustees task force on standardized protocols

    Directory of Open Access Journals (Sweden)

    Kim Raymond J

    2008-07-01

    Full Text Available Index
    1. General techniques
    1.1. Stress and safety equipment
    1.2. Left ventricular (LV) structure and function module
    1.3. Right ventricular (RV) structure and function module
    1.4. Gadolinium dosing module
    1.5. First pass perfusion
    1.6. Late gadolinium enhancement (LGE)
    2. Disease specific protocols
    2.1. Ischemic heart disease
    2.1.1. Acute myocardial infarction (MI)
    2.1.2. Chronic ischemic heart disease and viability
    2.1.3. Dobutamine stress
    2.1.4. Adenosine stress perfusion
    2.2. Angiography
    2.2.1. Peripheral magnetic resonance angiography (MRA)
    2.2.2. Thoracic MRA
    2.2.3. Anomalous coronary arteries
    2.2.4. Pulmonary vein evaluation
    2.3. Other
    2.3.1. Non-ischemic cardiomyopathy
    2.3.2. Arrhythmogenic right ventricular cardiomyopathy (ARVC)
    2.3.3. Congenital heart disease
    2.3.4. Valvular heart disease
    2.3.5. Pericardial disease
    2.3.6. Masses

  13. Preoperative vestibular assessment protocol of cochlear implant surgery: an analytical descriptive study.

    Science.gov (United States)

    Bittar, Roseli Saraiva Moreira; Sato, Eduardo Setsuo; Ribeiro, Douglas Jósimo Silva; Tsuji, Robinson Koji

    Cochlear implants are undeniably an effective method for the recovery of hearing function in patients with hearing loss. To describe the preoperative vestibular assessment protocol in subjects who will be submitted to cochlear implants. Our institutional protocol provides the vestibular diagnosis through six simple tests: Romberg and Fukuda tests, assessment for spontaneous nystagmus, Head Impulse Test, evaluation for Head Shaking Nystagmus, and caloric test. Twenty-one patients were evaluated, with a mean age of 42.75±14.38 years. Only 28% of the sample had all normal test results. The presence of asymmetric vestibular information was documented through the caloric test in 32% of the sample, and spontaneous nystagmus was an important clue for the diagnosis. Bilateral vestibular areflexia was present in four subjects, unilateral areflexia in three, and bilateral hyporeflexia in two. The Head Impulse Test was a significant indicator for the diagnosis of areflexia in the tested ear (p=0.0001). The sensitized Romberg test using a foam pad was able to diagnose severe vestibular function impairment (p=0.003). The six clinical tests were able to identify the presence or absence of vestibular function and function asymmetry between the ears of the same individual. Copyright © 2016 Associação Brasileira de Otorrinolaringologia e Cirurgia Cérvico-Facial. Published by Elsevier Editora Ltda. All rights reserved.

  14. Use of a Deuterated Internal Standard with Pyrolysis-GC/MS Dimeric Marker Analysis to Quantify Tire Tread Particles in the Environment

    Directory of Open Access Journals (Sweden)

    Julie M. Panko

    2012-11-01

    Full Text Available Pyrolysis (pyr)-GC/MS analysis of characteristic thermal decomposition fragments has been previously used for qualitative fingerprinting of organic sources in environmental samples. A quantitative pyr-GC/MS method based on characteristic tire polymer pyrolysis products was developed for tread particle quantification in environmental matrices including soil, sediment, and air. The feasibility of quantitative pyr-GC/MS analysis of tread was confirmed in a method evaluation study using artificial soil spiked with known amounts of cryogenically generated tread. Tread concentration determined by blinded analyses was highly correlated (r2 ≥ 0.88) with the known tread spike concentration. Two critical refinements to the initial pyrolysis protocol were identified including use of an internal standard and quantification by the dimeric markers vinylcyclohexene and dipentene, which have good specificity for rubber polymer with no other appreciable environmental sources. A novel use of deuterated internal standards of similar polymeric structure was developed to correct the variable analyte recovery caused by sample size, matrix effects, and ion source variability. The resultant quantitative pyr-GC/MS protocol is reliable and transferable between laboratories.

  15. Use of a deuterated internal standard with pyrolysis-GC/MS dimeric marker analysis to quantify tire tread particles in the environment.

    Science.gov (United States)

    Unice, Kenneth M; Kreider, Marisa L; Panko, Julie M

    2012-11-08

    Pyrolysis(pyr)-GC/MS analysis of characteristic thermal decomposition fragments has been previously used for qualitative fingerprinting of organic sources in environmental samples. A quantitative pyr-GC/MS method based on characteristic tire polymer pyrolysis products was developed for tread particle quantification in environmental matrices including soil, sediment, and air. The feasibility of quantitative pyr-GC/MS analysis of tread was confirmed in a method evaluation study using artificial soil spiked with known amounts of cryogenically generated tread. Tread concentration determined by blinded analyses was highly correlated (r2 ≥ 0.88) with the known tread spike concentration. Two critical refinements to the initial pyrolysis protocol were identified including use of an internal standard and quantification by the dimeric markers vinylcyclohexene and dipentene, which have good specificity for rubber polymer with no other appreciable environmental sources. A novel use of deuterated internal standards of similar polymeric structure was developed to correct the variable analyte recovery caused by sample size, matrix effects, and ion source variability. The resultant quantitative pyr-GC/MS protocol is reliable and transferable between laboratories.
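    A minimal sketch of the internal-standard quantification idea described above: marker-to-internal-standard peak-area ratios from spiked calibration soils are regressed against the known tread concentrations, and the fit is inverted for unknowns. The calibration values below are made-up placeholders, not data from the study.

        import numpy as np

        # Hypothetical calibration: known tread spike (mg/g soil) vs. marker/IS peak-area ratio.
        spike_mg_g = np.array([0.0, 0.5, 1.0, 2.0, 5.0])
        area_ratio = np.array([0.02, 0.26, 0.51, 1.02, 2.48])

        slope, intercept = np.polyfit(spike_mg_g, area_ratio, 1)   # linear calibration

        def tread_concentration(sample_ratio: float) -> float:
            """Back-calculate tread concentration (mg/g) from a marker/IS peak-area ratio."""
            return (sample_ratio - intercept) / slope

        print(round(tread_concentration(0.75), 2))   # concentration of a hypothetical unknown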

  16. National protocol framework for the inventory and monitoring of bees

    Science.gov (United States)

    Droege, Sam; Engler, Joseph D.; Sellers, Elizabeth A.; O'Brien, Lee

    2016-01-01

    This national protocol framework is a standardized tool for the inventory and monitoring of the approximately 4,200 species of native and non-native bee species that may be found within the National Wildlife Refuge System (NWRS) administered by the U.S. Fish and Wildlife Service (USFWS). However, this protocol framework may also be used by other organizations and individuals to monitor bees in any given habitat or location. Our goal is to provide USFWS stations within the NWRS (NWRS stations are land units managed by the USFWS such as national wildlife refuges, national fish hatcheries, wetland management districts, conservation areas, leased lands, etc.) with techniques for developing an initial baseline inventory of what bee species are present on their lands and to provide an inexpensive, simple technique for monitoring bees continuously and for monitoring and evaluating long-term population trends and management impacts. The latter long-term monitoring technique requires a minimal time burden for the individual station, yet can provide a good statistical sample of changing populations that can be investigated at the station, regional, and national levels within the USFWS’ jurisdiction, and compared to other sites within the United States and Canada. This protocol framework was developed in cooperation with the United States Geological Survey (USGS), the USFWS, and a worldwide network of bee researchers who have investigated the techniques and methods for capturing bees and tracking population changes. The protocol framework evolved from field and lab-based investigations at the USGS Bee Inventory and Monitoring Laboratory at the Patuxent Wildlife Research Center in Beltsville, Maryland starting in 2002 and was refined by a large number of USFWS, academic, and state groups. It includes a Protocol Introduction and a set of 8 Standard Operating Procedures or SOPs and adheres to national standards of protocol content and organization. The Protocol Narrative

  17. Clinical trials of CCLSG L874 and I874 protocols without cranial irradiation for standard-risk acute lymphoblastic leukemia in childhood

    International Nuclear Information System (INIS)

    Koizumi, Shoichi; Fujimoto, Takeo; Tsurusawa, Masahito

    1992-01-01

    In the CCLSG-874 protocol for children with low-risk (LR) and intermediate-risk (IR) acute lymphoblastic leukemia (ALL), two regimens with or without cranial irradiation (CI) were compared with respect to their ability to prevent central nervous system (CNS) leukemia and to improve the overall outcome of ALL. From 1987 to 1990, 82 and 109 evaluable patients were registered into the L874 and I874 protocols for LR and IR patients, respectively. All responders to induction therapy were randomized to treatment with 18 Gy of CI plus intrathecal methotrexate (MTX it) or to treatment with high-dose MTX plus MTX it. Patients were then treated with the standard maintenance regimens of L874 and I874. At a median follow-up of 39 months (range 14-58 months) there was no difference in the rate of hematologic relapse between the CI group and the MTX group. The rate of CNS relapse in the MTX group seemed to be higher (3 of 39 in L874 and 2 of 54 in I874) than that in the CI group (1 of 43 in L874 and 0 of 55 in I874), but the difference was not statistically significant. The rates of 4-year event-free survival (EFS) in L874 were 81.1±7.6% (mean±SE) and 75.2±7.9% (ns) for the CI and MTX groups, respectively, and the rates of EFS in I874 were 70.0±13.6% and 70.0±9.0% (ns) for the CI and MTX groups, respectively. These data suggest that MTX alone may be as effective as CI in prolonging disease-free survival in LR and IR ALL, although further continuous studies are needed. Analysis of serial CCLSG protocols for ALL from 1981 revealed that the rate of EFS for ALL overall, including all risk groups, has gradually increased from 44.2±3.6% for the 811 protocol and 53.1±3.5% for the 841 protocol to 65.5±3.6% for the present 874 protocol. (author)

  18. Analytical Chemistry Laboratory: Progress report for FY 1988

    International Nuclear Information System (INIS)

    Green, D.W.; Heinrich, R.R.; Graczyk, D.G.; Lindahl, P.C.; Erickson, M.D.

    1988-12-01

    The purpose of this report is to summarize the activities of the Analytical Chemistry Laboratory (ACL) at Argonne National Laboratory (ANL) for fiscal year 1988 (October 1987 through September 1988). The Analytical Chemistry Laboratory is a full-cost recovery service center, with the primary mission of providing a broad range of analytical chemistry support services to the scientific and engineering programs at ANL. In addition, the ACL conducts a research program in analytical chemistry, works on instrumental and methods development, and provides analytical services for governmental, educational, and industrial organizations. The ACL handles a wide range of analytical problems, from routine standard analyses to unique problems that require significant development of methods and techniques

  19. Analytical Chemistry Laboratory progress report for FY 1989

    International Nuclear Information System (INIS)

    Green, D.W.; Heinrich, R.R.; Graczyk, D.G.; Lindahl, P.C.; Erickson, M.D.

    1989-12-01

    The purpose of this report is to summarize the activities of the Analytical Chemistry Laboratory (ACL) at Argonne National Laboratory (ANL) for Fiscal Year 1989 (October 1988 through September 1989). The Analytical Chemistry Laboratory is a full-cost-recovery service center, with the primary mission of providing a broad range of analytical chemistry support services to the scientific and engineering programs at ANL. In addition, the ACL conducts a research program in analytical chemistry, works on instrumental and methods development, and provides analytical services for governmental, educational, and industrial organizations. The ACL handles a wide range of analytical problems, from routine standard analyses to unique problems that require significant development of methods and techniques

  20. Analytical Chemistry Laboratory: Progress report for FY 1988

    Energy Technology Data Exchange (ETDEWEB)

    Green, D.W.; Heinrich, R.R.; Graczyk, D.G.; Lindahl, P.C.; Erickson, M.D.

    1988-12-01

    The purpose of this report is to summarize the activities of the Analytical Chemistry Laboratory (ACL) at Argonne National Laboratory (ANL) for fiscal year 1988 (October 1987 through September 1988). The Analytical Chemistry Laboratory is a full-cost recovery service center, with the primary mission of providing a broad range of analytical chemistry support services to the scientific and engineering programs at ANL. In addition, the ACL conducts a research program in analytical chemistry, works on instrumental and methods development, and provides analytical services for governmental, educational, and industrial organizations. The ACL handles a wide range of analytical problems, from routine standard analyses to unique problems that require significant development of methods and techniques.

  1. Aligning research assessment in the Humanities to the national Standard Evaluation Protocol Challenges and developments in the Dutch research landscape

    Energy Technology Data Exchange (ETDEWEB)

    Prins, A.; Spaapen, J.; Van Vree, F

    2016-07-01

    The purpose of this session is a debate about innovation in comprehensive methods for the assessment of humanities research. Input will come from preliminary outcomes of an ongoing project in the Netherlands to find adequate indicators for humanities research that will fit in the national Standard Evaluation Protocol. The project includes processes of ‘bottom up’ data collection (that is, with input coming from the research community) and discussion with Humanities researchers, investigating the specific characteristics of publication and communication cultures in the Humanities, and the prospects for the use of quantitative and qualitative indicators. (Author)

  2. Characterization of Analytical Reference Glass-1 (ARG-1)

    International Nuclear Information System (INIS)

    Smith, G.L.

    1993-12-01

    High-level radioactive waste may be immobilized in borosilicate glass at the West Valley Demonstration Project, West Valley, New York, the Defense Waste Processing Facility (DWPF), Aiken, South Carolina, and the Hanford Waste Vitrification Project (HWVP), Richland, Washington. The vitrified waste form will be stored in stainless steel canisters before its eventual transfer to a geologic repository for long-term disposal. Waste Acceptance Product Specifications (WAPS) (DOE 1993), Section 1.1.2 requires that the waste form producers must report the measured chemical composition of the vitrified waste in their production records before disposal. Chemical analysis of glass waste forms is receiving increased attention due to qualification requirements of vitrified waste forms. The Pacific Northwest Laboratory (PNL) has been supporting the glass producers' analytical laboratories by a continuing program of multilaboratory analytical testing using interlaboratory "round robin" methods. At the PNL Materials Characterization Center Analytical Round Robin 4 workshop "Analysis of Nuclear Waste Glass and Related Materials," January 16--17, 1990, Pleasanton, California, the meeting attendees decided that simulated nuclear waste analytical reference glasses were needed for use as analytical standards. Use of common standard analytical reference materials would allow the glass producers' analytical laboratories to calibrate procedures and instrumentation, to control laboratory performance and conduct self-appraisals, and to help qualify their various waste forms

  3. Dynamic Channel Slot Allocation Scheme and Performance Analysis of Cyclic Quorum Multichannel MAC Protocol

    Directory of Open Access Journals (Sweden)

    Xing Hu

    2017-01-01

    Full Text Available In situations with a high number of diverse nodes, a multichannel MAC protocol can improve frequency efficiency, owing to fewer collisions compared with a single-channel MAC protocol. The performance of the cyclic quorum-based multichannel (CQM) MAC protocol is outstanding: based on a cyclic quorum system and channel slot allocation, it can avoid the bottleneck that other protocols suffer from and can be easily realized with only one transceiver. To obtain the accurate performance of the CQM MAC protocol, a Markov chain model, which combines the channel-hopping strategy of the CQM protocol and the IEEE 802.11 distributed coordination function (DCF), is proposed. The results of numerical analysis show that the optimal performance of the CQM protocol is obtained in the saturation bound situation, and the saturation bound of the CQM system is then obtained by a bird swarm algorithm. In addition, to improve the performance of the CQM protocol in unsaturated situations, a dynamic channel slot allocation CQM (DCQM) protocol is proposed, based on a wavelet neural network. Finally, the performance of the CQM and DCQM protocols is simulated on the Qualnet platform. The simulation results show that the analytical and simulation results match very well; DCQM performs better in unsaturated situations.
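    Because the abstract's Markov model combines CQM channel hopping with the IEEE 802.11 DCF, a sketch of the classic Bianchi saturation fixed point for plain DCF is shown below as the underlying building block; the contention-window parameters are illustrative, and the CQM-specific channel-hopping extension is not reproduced here.

        def bianchi_tau(n: int, W: int = 32, m: int = 5) -> float:
            """Per-slot transmission probability tau in saturation, solved by bisection."""
            def residual(tau: float) -> float:
                p = 1.0 - (1.0 - tau) ** (n - 1)          # conditional collision probability
                rhs = 2.0 * (1.0 - 2.0 * p) / ((1.0 - 2.0 * p) * (W + 1)
                                               + p * W * (1.0 - (2.0 * p) ** m))
                return tau - rhs
            lo, hi = 1e-9, 0.999
            for _ in range(100):
                mid = 0.5 * (lo + hi)
                if residual(lo) * residual(mid) <= 0.0:
                    hi = mid
                else:
                    lo = mid
            return 0.5 * (lo + hi)

        n = 20                                            # contending nodes on one channel
        tau = bianchi_tau(n)
        p_one_tx = n * tau * (1.0 - tau) ** (n - 1)       # probability of a collision-free slot
        print(round(tau, 4), round(p_one_tx, 4))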

  4. Interoperability through standardization: Electronic mail, and X Window systems

    Science.gov (United States)

    Amin, Ashok T.

    1993-01-01

    Since the introduction of computing machines, there have been continual advances in computer and communication technologies that are now approaching limits. The user interface has evolved from a row of switches, through character-based interfaces using teletype and then video terminals, to the present-day graphical user interface. It is expected that the next significant advances will come in the availability of services, such as electronic mail and directory services, as standards for applications are developed, and in 'easy to use' interfaces, such as graphical user interfaces, for example Windows and X Window, which are being standardized. Various proprietary electronic mail (email) systems are in use within organizations at each NASA center. Each system provides email services to users within an organization; however, support for email services across organizations and across centers varies from center to center and is often not easy to use. A recent NASA email initiative is intended 'to provide a simple way to send email across organizational boundaries without disruption of installed base.' The initiative calls for integration of existing organizational email systems through gateways connected by a message switch, supporting X.400 and SMTP protocols, to create a NASA-wide email system, and for implementation of NASA-wide email directory services based on the OSI standard X.500. A brief overview of MSFC efforts as part of this initiative is described. Window-based graphical user interfaces make computers easy to use. The X Window protocol was developed at the Massachusetts Institute of Technology in 1984/1985 to provide a uniform window-based interface in a distributed computing environment with heterogeneous computers. It has since become a standard supported by a number of major manufacturers. X Window systems, terminals and workstations, and X Window applications are becoming available. However, the impact of its use in the local area network environment on the network

  5. Efficient MAC Protocol for Hybrid Wireless Network with Heterogeneous Sensor Nodes

    Directory of Open Access Journals (Sweden)

    Md. Nasre Alam

    2016-01-01

    Full Text Available Although several Directional Medium Access Control (DMAC) protocols have been designed for use with homogeneous networks, it can take a substantial amount of time to replace sensor nodes equipped with an omnidirectional antenna with sensor nodes that have a directional antenna. Thus, we require a novel MAC protocol for use with an intermediate wireless network that consists of heterogeneous sensor nodes equipped with either an omnidirectional or a directional antenna. The MAC protocols that have been designed for homogeneous networks are not suitable for a hybrid network because of deaf, hidden, and exposed nodes. Therefore, we propose a MAC protocol that exploits the characteristics of a directional antenna and can also work efficiently with omnidirectional nodes in a hybrid network. In order to address the deaf, hidden, and exposed node problems, we define RTS/CTS for the neighbor (RTSN/CTSN) and Neighbor Information (NIP) packets. The performance of the proposed MAC protocol is evaluated through a numerical analysis using a Markov model. In addition, the analytical results of the MAC protocol are verified through an OPNET simulation.

  6. Automated Planning Enables Complex Protocols on Liquid-Handling Robots.

    Science.gov (United States)

    Whitehead, Ellis; Rudolf, Fabian; Kaltenbach, Hans-Michael; Stelling, Jörg

    2018-03-16

    Robotic automation in synthetic biology is especially relevant for liquid handling to facilitate complex experiments. However, research tasks that are not highly standardized are still rarely automated in practice. Two main reasons for this are the substantial investments required to translate molecular biological protocols into robot programs, and the fact that the resulting programs are often too specific to be easily reused and shared. Recent developments of standardized protocols and dedicated programming languages for liquid-handling operations addressed some aspects of ease-of-use and portability of protocols. However, either they focus on simplicity, at the expense of enabling complex protocols, or they entail detailed programming, with corresponding skills and efforts required from the users. To reconcile these trade-offs, we developed Roboliq, a software system that uses artificial intelligence (AI) methods to integrate (i) generic formal, yet intuitive, protocol descriptions, (ii) complete, but usually hidden, programming capabilities, and (iii) user-system interactions to automatically generate executable, optimized robot programs. Roboliq also enables high-level specifications of complex tasks with conditional execution. To demonstrate the system's benefits for experiments that are difficult to perform manually because of their complexity, duration, or time-critical nature, we present three proof-of-principle applications for the reproducible, quantitative characterization of GFP variants.

  7. Field Monitoring Protocol. Heat Pump Water Heaters

    Energy Technology Data Exchange (ETDEWEB)

    Sparn, B. [National Renewable Energy Lab. (NREL), Golden, CO (United States); Earle, L. [National Renewable Energy Lab. (NREL), Golden, CO (United States); Christensen, D. [National Renewable Energy Lab. (NREL), Golden, CO (United States); Maguire, J. [National Renewable Energy Lab. (NREL), Golden, CO (United States); Wilson, E. [National Renewable Energy Lab. (NREL), Golden, CO (United States); Hancock, C. E. [Mountain Energy Partnership, Longmont, CO (United States)

    2013-02-01

    This document provides a standard field monitoring protocol for evaluating the installed performance of Heat Pump Water Heaters in residential buildings. The report is organized to be consistent with the chronology of field test planning and execution. Research questions are identified first, followed by a discussion of analysis methods, and then the details of measuring the required information are laid out. A field validation of the protocol at a house near the NREL campus is included for reference.

  8. Field Monitoring Protocol: Heat Pump Water Heaters

    Energy Technology Data Exchange (ETDEWEB)

    Sparn, B.; Earle, L.; Christensen, D.; Maguire, J.; Wilson, E.; Hancock, E.

    2013-02-01

    This document provides a standard field monitoring protocol for evaluating the installed performance of Heat Pump Water Heaters in residential buildings. The report is organized to be consistent with the chronology of field test planning and execution. Research questions are identified first, followed by a discussion of analysis methods, and then the details of measuring the required information are laid out. A field validation of the protocol at a house near the NREL campus is included for reference.

  9. The international protocol for the dosimetry of external radiotherapy beams based on standards of absorbed dose to water

    International Nuclear Information System (INIS)

    Andreo, P.

    2001-01-01

    An International Code of Practice (CoP, or dosimetry protocol) for external beam radiotherapy dosimetry based on standards of absorbed dose to water has been published by the IAEA on behalf of IAEA, WHO, PAHO and ESTRO. The CoP provides a systematic and internationally unified approach for the determination of the absorbed dose to water in reference conditions with radiotherapy beams. The development of absorbed-dose-to-water standards for high-energy photons and electrons offers the possibility of reducing the uncertainty in the dosimetry of radiotherapy beams. Many laboratories already provide calibrations at the radiation quality of 60Co gamma-rays and some have extended calibrations to high-energy photon and electron beams. The dosimetry of kilovoltage x-rays, as well as that of proton and ion beams, can also be based on these standards. Thus, a coherent dosimetry system based on the same formalism is achieved for practically all radiotherapy beams. The practical use of the CoP is simple. The document is formed by a set of different CoPs for each radiation type, which include detailed procedures and worksheets. All CoPs are based on ND,w chamber calibrations at a reference beam quality Q0, together with radiation beam quality correction factors kQ, preferably measured directly for the user's chamber in a standards laboratory. Calculated values of kQ are provided together with their uncertainty estimates. Beam quality specifiers are 60Co, TPR20,10 (high-energy photons), R50 (electrons), HVL and kV (x-rays) and Rres (protons and ions)
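    For reference, the core relation of the absorbed-dose-to-water formalism on which the CoP is built can be sketched, in standard notation, as

        D_{w,Q} = M_Q \, N_{D,w,Q_0} \, k_{Q,Q_0}

    where M_Q is the corrected dosimeter reading at the user's beam quality Q, N_{D,w,Q_0} is the calibration coefficient in terms of absorbed dose to water at the reference quality Q_0 (typically 60Co), and k_{Q,Q_0} corrects for the difference between the reference and user beam qualities.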

  10. Establishing a protocol for element determination in human nail clippings by neutron activation analysis

    International Nuclear Information System (INIS)

    Sanches, Thalita Pinheiro; Saiki, Mitiko

    2011-01-01

    Human nail samples have been analyzed to evaluate occupational exposure and nutritional status and to diagnose certain diseases. However, sampling and washing protocols for nail analyses vary from study to study, preventing comparisons between studies. One of the difficulties in analyzing nail samples is eliminating surface contamination without removing elements of interest in this tissue. In the present study, a protocol was defined in order to obtain reliable results for element concentrations in human nail clippings. Nail clippings collected from all 10 fingers or toes were first pre-cleaned with an ethyl alcohol solution to eliminate microbes. Then, the clippings were cut into small pieces and washed by shaking with different reagents. Neutron activation analysis (NAA) was applied to the nail samples: aliquots of samples were irradiated together with synthetic elemental standards in the IEA-R1 nuclear research reactor, followed by gamma-ray spectrometry. Comparisons between the results obtained for nails cleaned with different reagents indicated that the procedure using acetone and Triton X100 solution is more effective than that using nitric acid solution. Triplicate analyses of a nail sample gave relative standard deviations lower than 15% for most elements, showing the homogeneity of the prepared sample. Qualitative analyses of different nail polishes showed that the presence of the elements determined in the present study is negligible in these products. Quality control of the analytical results indicated that the applied NAA procedure is adequate for human nail analysis. (author)
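    A minimal sketch of the comparator (relative) relation typically used when aliquots are irradiated together with synthetic elemental standards, as in this procedure (the paper's exact expression and decay corrections are not reproduced here):

        C_{\mathrm{sample}} = C_{\mathrm{standard}} \cdot \frac{A_{\mathrm{sample}} \, m_{\mathrm{standard}}}{A_{\mathrm{standard}} \, m_{\mathrm{sample}}}

    where A is the decay-corrected gamma-ray peak area of the element's indicator radionuclide and m the mass of the sample or standard aliquot.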

  11. An Energy-Efficient Link Layer Protocol for Reliable Transmission over Wireless Networks

    Directory of Open Access Journals (Sweden)

    Iqbal Adnan

    2009-01-01

    Full Text Available In multihop wireless networks, hop-by-hop reliability is generally achieved through positive acknowledgments at the MAC layer. However, positive acknowledgments introduce significant energy inefficiencies on battery-constrained devices. This inefficiency becomes particularly significant on high error rate channels. We propose to reduce the energy consumption during retransmissions using a novel protocol that localizes bit-errors at the MAC layer. The proposed protocol, referred to as Selective Retransmission using Virtual Fragmentation (SRVF), requires simple modifications to the positive-ACK-based reliability mechanism but provides substantial improvements in energy efficiency. The main premise of the protocol is to localize bit-errors by performing partial checksums on disjoint parts or virtual fragments of a packet. In case of error, only the corrupted virtual fragments are retransmitted. We develop stochastic models of the Simple Positive-ACK-based reliability, the previously-proposed Packet Length Optimization (PLO) protocol, and the SRVF protocol operating over an arbitrary-order Markov wireless channel. Our analytical models show that SRVF provides significant theoretical improvements in energy efficiency over existing protocols. We then use bit-error traces collected over different real networks to empirically compare the proposed and existing protocols. These experimental results further substantiate that SRVF provides considerably better energy efficiency than Simple Positive-ACK and Packet Length Optimization protocols.
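    The following sketch illustrates the virtual-fragmentation idea: per-fragment checksums let the receiver pinpoint which parts of a packet are corrupted so that only those parts are retransmitted. The frame layout, fragment count, and checksum choice here are assumptions for illustration, not the exact SRVF format.

        import zlib

        FRAGMENTS = 4   # assumed number of virtual fragments per packet

        def fragment(payload: bytes, n: int = FRAGMENTS):
            size = -(-len(payload) // n)                  # ceiling division
            return [payload[i * size:(i + 1) * size] for i in range(n)]

        def checksums(payload: bytes):
            return [zlib.crc32(frag) for frag in fragment(payload)]

        def corrupted_fragments(received: bytes, sent_checksums):
            """Indices of virtual fragments whose partial checksum does not match."""
            return [i for i, (frag, c) in enumerate(zip(fragment(received), sent_checksums))
                    if zlib.crc32(frag) != c]

        sent = bytes(range(64))
        recv = bytearray(sent); recv[40] ^= 0xFF          # corrupt one byte in fragment 2
        print(corrupted_fragments(bytes(recv), checksums(sent)))   # -> [2]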

  12. Analytical Chemistry Laboratory progress report for FY 1991

    Energy Technology Data Exchange (ETDEWEB)

    Green, D.W.; Heinrich, R.R.; Graczyk, D.G.; Lindahl, P.C.; Boparai, A.S.

    1991-12-01

    The purpose of this report is to summarize the activities of the Analytical Chemistry Laboratory (ACL) at Argonne National Laboratory (ANL) for Fiscal Year 1991 (October 1990 through September 1991). This is the eighth annual report for the ACL. The Analytical Chemistry Laboratory is a full-cost-recovery service center, with the primary mission of providing a broad range of analytical chemistry support services to the scientific and engineering programs at ANL. In addition, the ACL conducts a research program in analytical chemistry, works on instrumental and methods development, and provides analytical services for governmental, educational, and industrial organizations. The ACL handles a wide range of analytical problems, from routine standard analyses to unique problems that require significant development of methods and techniques.

  13. Wireless networking for the dental office: current wireless standards and security protocols.

    Science.gov (United States)

    Mupparapu, Muralidhar; Arora, Sarika

    2004-11-15

    Digital radiography has gained immense popularity in dentistry today, in spite of the profession's early difficulty in embracing the technology. The transition from film to digital has been happening at a faster pace in the fields of Orthodontics, Oral Surgery, Endodontics, Periodontics, and other specialties where the radiographic images (periapical, bitewing, panoramic, cephalometric, and skull radiographs) are acquired digitally, stored on a local server, and eventually accessed for diagnostic purposes, along with the rest of the patient data, via the patient management software (PMS). A review of the literature shows that the diagnostic performance of digital radiography is at least comparable to, or even better than, that of conventional radiography. Similarly, other digital diagnostic tools such as caries detectors, cephalometric analysis software, and digital scanners have been used for many years for diagnosis and treatment planning purposes. The introduction of wireless charge-coupled device (CCD) sensors in early 2004 (Schick Technologies, Long Island City, NY) has moved digital radiography a step further into the wireless era. As with any emerging technology, there are concerns that should be looked into before adopting the wireless environment. Foremost is the network security involved in the installation and usage of these wireless networks. This article deals with the existing standards and choices in wireless technologies that are available for implementation within a contemporary dental office. The network security protocols that protect patient data and boost the efficiency of modern-day dental clinics are also enumerated.

  14. Evaluation of video transmission of MAC protocols in wireless sensor network

    Science.gov (United States)

    Maulidin, Mahmuddin, M.; Kamaruddin, L. M.; Elsaikh, Mohamed

    2016-08-01

    A Wireless Sensor Network (WSN) is a wireless network consisting of sensor nodes scattered in a particular area that are used to monitor physical or environmental conditions. Because the nodes are scattered across the sensor field, an appropriate MAC protocol scheme is needed to establish communication links for data transfer. Video transmission is an important future application that must be supported at low cost and with low power consumption. In this paper, five different WSN MAC protocols for video transmission, namely the IEEE 802.11 standard, the IEEE 802.15.4 standard, CSMA/CA, Berkeley-MAC, and the Lightweight-MAC protocol, are compared. Simulation experiments were conducted in OMNeT++ with the INET network simulator to evaluate performance. The results indicate that IEEE 802.11 performs better than the other protocols in terms of packet delivery, throughput, and latency.

  15. Simultaneous validation of the Grandway MD2301 digital automatic blood pressure monitor by the British Hypertension Society and the Association for the Advancement of Medical Instrumentation/the International Organization for Standardization protocols.

    Science.gov (United States)

    Huang, Jinhua; Wang, Yun; Liu, Zhaoying; Wang, Yuling

    2017-02-01

    The aim of this study was to determine the accuracy of the Grandway MD2301 digital automatic blood pressure monitor by the British Hypertension Society (BHS) and the Association for the Advancement of Medical Instrumentation (AAMI)/the International Organization for Standardization (ISO) protocols. A total of 85 participants were included for evaluation based on the requirements of the BHS and the AAMI/ISO protocols. The validation procedure and data analysis followed the protocols precisely. The device achieved A/A grading for the BHS protocol and maintained A/A grading throughout the low, medium and high blood pressure ranges. The device also fulfilled the requirement of the AAMI/ISO protocol with device-observer differences of -0.9±5.6 and 0.8±5.2 mmHg for systolic and diastolic blood pressure, respectively, for criterion 1, and -0.9±4.7 and 0.8±4.2 mmHg, respectively, for criterion 2. The Grandway MD2301 digital automatic blood pressure monitor achieved A/A grade of the BHS protocol and passed the requirements of the AAMI/ISO protocol in adults.
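    As a sketch of how the criterion 1 figures above are checked (criterion 2, which applies a stricter, mean-dependent limit to per-subject averaged differences, is not reproduced here):

        def passes_criterion_1(mean_mmHg: float, sd_mmHg: float) -> bool:
            """AAMI/ISO criterion 1: |mean device-observer difference| <= 5 mmHg and SD <= 8 mmHg."""
            return abs(mean_mmHg) <= 5.0 and sd_mmHg <= 8.0

        # Systolic and diastolic differences reported in the abstract:
        print(passes_criterion_1(-0.9, 5.6), passes_criterion_1(0.8, 5.2))   # True True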

  16. Quantifying uncertainty in nuclear analytical measurements

    International Nuclear Information System (INIS)

    2004-07-01

    The lack of international consensus on the expression of uncertainty in measurements was recognised by the late 1970s and led, after the issuance of a series of rather generic recommendations, to the publication of a general guidance document, known as GUM, the Guide to the Expression of Uncertainty in Measurement. This publication, issued in 1993, was based on co-operation over several years by the Bureau International des Poids et Mesures, the International Electrotechnical Commission, the International Federation of Clinical Chemistry, the International Organization for Standardization (ISO), the International Union of Pure and Applied Chemistry, the International Union of Pure and Applied Physics and the Organisation internationale de metrologie legale. The purpose was to promote full information on how uncertainty statements are arrived at and to provide a basis for harmonized reporting and the international comparison of measurement results. The need to provide more specific guidance to different measurement disciplines was soon recognized, and the field of analytical chemistry was addressed by EURACHEM in 1995 in the first edition of a guidance report on Quantifying Uncertainty in Analytical Measurements, produced by a group of experts from the field. That publication translated the general concepts of the GUM into specific applications for analytical laboratories and illustrated the principles with a series of selected examples as a didactic tool. Based on feedback from actual practice, the EURACHEM publication was extensively reviewed in 1997-1999 under the auspices of the Co-operation on International Traceability in Analytical Chemistry (CITAC), and a second edition was published in 2000. Still, except for a single example on the measurement of radioactivity in GUM, the field of nuclear and radiochemical measurements was not covered. The explicit requirement of ISO standard 17025:1999, General Requirements for the Competence of Testing and Calibration

  17. An efficient multi-carrier position-based packet forwarding protocol for wireless sensor networks

    KAUST Repository

    Bader, Ahmed

    2012-01-01

    Beaconless position-based forwarding protocols have recently evolved as a promising solution for packet forwarding in wireless sensor networks. However, as the node density grows, the overhead incurred in the process of relay selection grows significantly. As such, end-to-end performance in terms of energy and latency is adversely impacted. With the motivation of developing a packet forwarding mechanism that is tolerant to variation in node density, an alternative position-based protocol is proposed in this paper. In contrast to existing beaconless protocols, the proposed protocol is designed such that it eliminates the need for potential relays to undergo a relay selection process. Rather, any eligible relay may decide to forward the packet ahead, thus significantly reducing the underlying overhead. The operation of the proposed protocol is empowered by exploiting favorable features of orthogonal frequency division multiplexing (OFDM) at the physical layer. The end-to-end performance of the proposed protocol is evaluated against existing beaconless position-based protocols, both analytically and by means of simulations. The proposed protocol is demonstrated in this paper to be more efficient. In particular, it is shown that for the same amount of energy the proposed protocol transports one bit from source to destination much quicker. © 2012 IEEE.

  18. Test Protocols for Advanced Inverter Interoperability Functions – Main Document

    Energy Technology Data Exchange (ETDEWEB)

    Johnson, Jay Dean [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Gonzalez, Sigifredo [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Ralph, Mark E. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Ellis, Abraham [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Broderick, Robert Joseph [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States)

    2013-11-01

    Distributed energy resources (DER) such as photovoltaic (PV) systems, when deployed in a large scale, are capable of influencing significantly the operation of power systems. Looking to the future, stakeholders are working on standards to make it possible to manage the potentially complex interactions between DER and the power system. In 2009, the Electric Power Research Institute (EPRI), Sandia National Laboratories (SNL) with the U.S. Department of Energy (DOE), and the Solar Electric Power Association (SEPA) initiated a large industry collaborative to identify and standardize definitions for a set of DER grid support functions. While the initial effort concentrated on grid-tied PV inverters and energy storage systems, the concepts have applicability to all DER. A partial product of this on-going effort is a reference definitions document (IEC TR 61850-90-7, Object models for power converters in distributed energy resources (DER) systems) that has become a basis for expansion of related International Electrotechnical Commission (IEC) standards, and is supported by US National Institute of Standards and Technology (NIST) Smart Grid Interoperability Panel (SGIP). Some industry-led organizations advancing communications protocols have also embraced this work. As standards continue to evolve, it is necessary to develop test protocols to independently verify that the inverters are properly executing the advanced functions. Interoperability is assured by establishing common definitions for the functions and a method to test compliance with operational requirements. This document describes test protocols developed by SNL to evaluate the electrical performance and operational capabilities of PV inverters and energy storage, as described in IEC TR 61850-90-7. While many of these functions are not currently required by existing grid codes or may not be widely available commercially, the industry is rapidly moving in that direction. Interoperability issues are already

  19. A cross-platform survey of CT image quality and dose from routine abdomen protocols and a method to systematically standardize image quality

    International Nuclear Information System (INIS)

    Favazza, Christopher P; Duan, Xinhui; Zhang, Yi; Yu, Lifeng; Leng, Shuai; Kofler, James M; Bruesewitz, Michael R; McCollough, Cynthia H

    2015-01-01

    Through this investigation we developed a methodology to evaluate and standardize CT image quality from routine abdomen protocols across different manufacturers and models. The influence of manufacturer-specific automated exposure control systems on image quality was directly assessed to standardize performance across a range of patient sizes. We evaluated 16 CT scanners across our health system, including Siemens, GE, and Toshiba models. Using each practice's routine abdomen protocol, we measured spatial resolution, image noise, and scanner radiation output (CTDIvol). Axial and in-plane spatial resolutions were assessed through slice sensitivity profile (SSP) and modulation transfer function (MTF) measurements, respectively. Image noise and CTDIvol values were obtained for three different phantom sizes. SSP measurements demonstrated a bimodal distribution in slice widths: an average of 6.2 ± 0.2 mm using GE's 'Plus' mode reconstruction setting and 5.0 ± 0.1 mm for all other scanners. MTF curves were similar for all scanners. Average spatial frequencies at 50%, 10%, and 2% MTF values were 3.24 ± 0.37, 6.20 ± 0.34, and 7.84 ± 0.70 lp/cm, respectively. For all phantom sizes, image noise and CTDIvol varied considerably: 6.5-13.3 HU (noise) and 4.8-13.3 mGy (CTDIvol) for the smallest phantom; 9.1-18.4 HU and 9.3-28.8 mGy for the medium phantom; and 7.8-23.4 HU and 16.0-48.1 mGy for the largest phantom. Using these measurements and benchmark SSP, MTF, and image noise targets, CT image quality can be standardized across a range of patient sizes. (paper)

  20. A cross-platform survey of CT image quality and dose from routine abdomen protocols and a method to systematically standardize image quality.

    Science.gov (United States)

    Favazza, Christopher P; Duan, Xinhui; Zhang, Yi; Yu, Lifeng; Leng, Shuai; Kofler, James M; Bruesewitz, Michael R; McCollough, Cynthia H

    2015-11-07

    Through this investigation we developed a methodology to evaluate and standardize CT image quality from routine abdomen protocols across different manufacturers and models. The influence of manufacturer-specific automated exposure control systems on image quality was directly assessed to standardize performance across a range of patient sizes. We evaluated 16 CT scanners across our health system, including Siemens, GE, and Toshiba models. Using each practice's routine abdomen protocol, we measured spatial resolution, image noise, and scanner radiation output (CTDIvol). Axial and in-plane spatial resolutions were assessed through slice sensitivity profile (SSP) and modulation transfer function (MTF) measurements, respectively. Image noise and CTDIvol values were obtained for three different phantom sizes. SSP measurements demonstrated a bimodal distribution in slice widths: an average of 6.2 ± 0.2 mm using GE's 'Plus' mode reconstruction setting and 5.0 ± 0.1 mm for all other scanners. MTF curves were similar for all scanners. Average spatial frequencies at 50%, 10%, and 2% MTF values were 3.24 ± 0.37, 6.20 ± 0.34, and 7.84 ± 0.70 lp cm(-1), respectively. For all phantom sizes, image noise and CTDIvol varied considerably: 6.5-13.3 HU (noise) and 4.8-13.3 mGy (CTDIvol) for the smallest phantom; 9.1-18.4 HU and 9.3-28.8 mGy for the medium phantom; and 7.8-23.4 HU and 16.0-48.1 mGy for the largest phantom. Using these measurements and benchmark SSP, MTF, and image noise targets, CT image quality can be standardized across a range of patient sizes.
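    As an illustration of how the 50%, 10%, and 2% MTF frequencies reported above can be read off a measured curve, the sketch below interpolates crossing points on a made-up MTF curve; the curve itself is a placeholder, not data from the study.

        import numpy as np

        freq = np.linspace(0, 10, 101)                    # spatial frequency, lp/cm
        mtf = np.exp(-(freq / 4.5) ** 2)                  # hypothetical MTF curve
        targets = [0.50, 0.10, 0.02]
        # np.interp needs increasing x, so interpolate frequency as a function of the reversed MTF.
        crossings = [np.interp(t, mtf[::-1], freq[::-1]) for t in targets]
        print([round(c, 2) for c in crossings])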

  1. Towards Standardization of Sampling Methodology for Evaluation of ...

    African Journals Online (AJOL)

    This article proposes a procedure that may be adopted for comparable, representative, and cost-effective soil sampling, and thereafter explores policy issues regarding standardization of sampling activities and the analytical process as they relate to soil pollution in Nigeria. Standardized sampling and analytical data for soil ...

  2. Variation in radiographic protocols in paediatric interventional cardiology

    International Nuclear Information System (INIS)

    McFadden, S L; Hughes, C M; Winder, R J

    2013-01-01

    The aim of this work is to determine current radiographic protocols in paediatric interventional cardiology (IC) in the UK and Ireland. To do this we investigated which imaging parameters/protocols are commonly used in IC in different hospitals, to identify if a standard technique is used and illustrate any variation in practice. A questionnaire was sent to all hospitals in the UK and Ireland which perform paediatric IC to obtain information on techniques used in each clinical department and on the range of clinical examinations performed. Ethical and research governance approval was sought from the Office for Research Ethics Committees Northern Ireland and the individual trusts. A response rate of 79% was achieved, and a wide variation in technique was found between hospitals. The main differences in technique involved variations in the use of an anti-scatter grid and the use of additional filtration to the radiation beam, frame rates for digital acquisition and pre-programmed projections/paediatric specific programming in the equipment. We conclude that there is no standard protocol for carrying out paediatric IC in the UK or Ireland. Each hospital carries out the IC procedure according to its own local protocols resulting in a wide variation in radiation dose. (paper)

  3. Variation in radiographic protocols in paediatric interventional cardiology.

    Science.gov (United States)

    McFadden, S L; Hughes, C M; Winder, R J

    2013-06-01

    The aim of this work is to determine current radiographic protocols in paediatric interventional cardiology (IC) in the UK and Ireland. To do this we investigated which imaging parameters/protocols are commonly used in IC in different hospitals, to identify if a standard technique is used and illustrate any variation in practice. A questionnaire was sent to all hospitals in the UK and Ireland which perform paediatric IC to obtain information on techniques used in each clinical department and on the range of clinical examinations performed. Ethical and research governance approval was sought from the Office for Research Ethics Committees Northern Ireland and the individual trusts. A response rate of 79% was achieved, and a wide variation in technique was found between hospitals. The main differences in technique involved variations in the use of an anti-scatter grid and the use of additional filtration to the radiation beam, frame rates for digital acquisition and pre-programmed projections/paediatric specific programming in the equipment. We conclude that there is no standard protocol for carrying out paediatric IC in the UK or Ireland. Each hospital carries out the IC procedure according to its own local protocols resulting in a wide variation in radiation dose.

  4. Efficacy of Low-Dose Protocol in Follow-Up of Lymphoproliferative Disorders - Preliminary Results

    International Nuclear Information System (INIS)

    Popic-Ramac, J.; Brnic, Z.; Klasic, B.; Hebrang, A.; Knezevic, Z.

    2011-01-01

    Most medically-related radiation is caused by diagnostic examinations, in particular by computed tomography (CT). The purpose of this research is to reduce the radiation doses faced by a population frequently exposed to such procedures: patients with lymphoproliferative disorders. The research compared the radiation exposure doses received by the radiosensitive organs (thyroid, lens, breast and gonads) using the standard thoracic CT protocol with those received using the low-dose protocol, while maintaining image quality. The standard-dose thoracic protocol uses 120 kV and 150 mAs. The low-dose protocol was conducted on the same device using 120 kV and 30 mAs. We confirmed the hypothesis that the use of the low-dose thoracic CT protocol leads to a reduction in radiation dose without compromising image quality. It is further expected that a reduction in doses will reduce the risk of radiation-related mutations. (author)
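    To first order, CT tube output at fixed kV scales linearly with the tube current-time product, so the expected dose ratio between the two protocols can be roughly estimated (ignoring any automatic exposure modulation) as

        \frac{D_{\mathrm{low}}}{D_{\mathrm{std}}} \approx \frac{30\ \mathrm{mAs}}{150\ \mathrm{mAs}} = 0.2,

    i.e. approximately an 80% reduction in dose at matched kV.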

  5. Improving the efficiency of quantitative (1)H NMR: an innovative external standard-internal reference approach.

    Science.gov (United States)

    Huang, Yande; Su, Bao-Ning; Ye, Qingmei; Palaniswamy, Venkatapuram A; Bolgar, Mark S; Raglione, Thomas V

    2014-01-01

    The classical internal standard quantitative NMR (qNMR) method determines the purity of an analyte by the determination of a solution containing the analyte and a standard. Therefore, the standard must meet the requirements of chemical compatibility and lack of resonance interference with the analyte as well as a known purity. The identification of such a standard can be time consuming and must be repeated for each analyte. In contrast, the external standard qNMR method utilizes a standard with a known purity to calibrate the NMR instrument. The external standard and the analyte are measured separately, thereby eliminating the matter of chemical compatibility and resonance interference between the standard and the analyte. However, the instrumental factors, including the quality of NMR tubes, must be kept the same. Any deviations will compromise the accuracy of the results. An innovative qNMR method reported herein utilizes an internal reference substance along with an external standard to assume the role of the standard used in the traditional internal standard qNMR method. In this new method, the internal reference substance must only be chemically compatible and be free of resonance-interference with the analyte or external standard whereas the external standard must only be of a known purity. The exact purity or concentration of the internal reference substance is not required as long as the same quantity is added to the external standard and the analyte. The new method reduces the burden of searching for an appropriate standard for each analyte significantly. Therefore the efficiency of the qNMR purity assay increases while the precision of the internal standard method is retained. Copyright © 2013 Elsevier B.V. All rights reserved.
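    For context, the classical internal-standard qNMR relation on which both the internal-standard and the external-standard/internal-reference approaches build can be sketched as

        P_a = P_s \cdot \frac{I_a}{I_s} \cdot \frac{N_s}{N_a} \cdot \frac{M_a}{M_s} \cdot \frac{m_s}{m_a}

    where, for analyte a and standard s, P is the purity, I the integrated signal area, N the number of nuclei contributing to the integrated signal, M the molar mass, and m the weighed mass.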

  6. Identification of a research protocol to study orthodontic tooth movement

    Directory of Open Access Journals (Sweden)

    Annalisa Dichicco

    2014-06-01

    Full Text Available Aim: Orthodontic movement is associated with a process of tissue remodeling together with the release of several chemical mediators in periodontal tissues. Each mediator is a potential marker of tooth movement and expresses biological processes such as tissue inflammation and bone remodeling. Different amounts of each mediator are present in the various tissues and fluids of the oral cavity; therefore, several sampling methods with different degrees of invasiveness are available. Chemical mediators are also substances of different molecular natures, and multiple analysis methods allow their detection. The purpose of this study was to draft the best research protocol for an optimal study of orthodontic movement efficiency. Methods: An analysis of the international literature was made to identify the gold standard for each aspect of the protocol: type of mediator, source and method of sampling, and analysis method. Results: From the analysis of the international literature, an original research protocol was created for the study and assessment of orthodontic movement using the biomarkers of tooth movement. Conclusions: The protocol created is based on the choice of the gold standard for every aspect already analyzed in the literature and in existing protocols for the monitoring of orthodontic tooth movement through the markers of tooth movement. Clinical trials are required for the evaluation and validation of the created protocol.

  7. Optimization of analytical and pre-analytical conditions for MALDI-TOF-MS human urine protein profiles.

    Science.gov (United States)

    Calvano, C D; Aresta, A; Iacovone, M; De Benedetto, G E; Zambonin, C G; Battaglia, M; Ditonno, P; Rutigliano, M; Bettocchi, C

    2010-03-11

    Protein analysis in biological fluids, such as urine, by means of mass spectrometry (MS) still suffers from insufficient standardization in protocols for sample collection, storage and preparation. In this work, the influence of these variables on healthy-donor human urine protein profiling performed by matrix-assisted laser desorption ionization time-of-flight mass spectrometry (MALDI-TOF-MS) was studied. A screening of various urine sample pre-treatment procedures and different sample deposition approaches on the MALDI target was performed. The influence of urine sample storage time and temperature on spectral profiles was evaluated by means of principal component analysis (PCA). The whole optimized procedure was eventually applied to the MALDI-TOF-MS analysis of human urine samples taken from prostate cancer patients. The best results in terms of the number and abundance of ions detected in the MS spectra were obtained by using home-made microcolumns packed with hydrophilic-lipophilic balance (HLB) resin as the sample pre-treatment method; this procedure was also less expensive and suitable for high-throughput analyses. Afterwards, the spin-coating approach for sample deposition on the MALDI target plate was optimized, obtaining homogeneous and reproducible spots. PCA then indicated that low storage temperatures of acidified and centrifuged samples, together with short handling times, allowed reproducible profiles to be obtained without artifact contributions due to experimental conditions. Finally, interesting differences were found by comparing the MALDI-TOF-MS protein profiles of pooled urine samples from healthy donors and prostate cancer patients. The results showed that analytical and pre-analytical variables are crucial for the success of urine analysis and for obtaining meaningful and reproducible data, even if intra-patient variability is very difficult to avoid. It has been shown that pooled urine samples can be an interesting way to make easier the comparison between
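    A minimal illustration of the PCA step used to compare profiles across storage conditions; the spectra below are random placeholders, not the study's data, and scikit-learn is assumed to be available.

        import numpy as np
        from sklearn.decomposition import PCA

        rng = np.random.default_rng(0)
        spectra = rng.random((12, 500))         # 12 urine profiles x 500 m/z bins (placeholder data)
        scores = PCA(n_components=2).fit_transform(spectra)
        print(scores.shape)                     # (12, 2): coordinates for a scores plot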

  8. Efficacy of 2 finishing protocols in the quality of orthodontic treatment outcome.

    Science.gov (United States)

    Stock, Gregory J; McNamara, James A; Baccetti, Tiziano

    2011-11-01

    The objectives of this prospective clinical study were to evaluate the quality of treatment outcomes achieved with a complex orthodontic finishing protocol involving serpentine wires and a tooth positioner, and to compare it with the outcomes of a standard finishing protocol involving archwire bends used to detail the occlusion near the end of active treatment. The complex finishing protocol sample consisted of 34 consecutively treated patients; 1 week before debonding, their molar bands were removed, and serpentine wires were placed; this was followed by active wear of a tooth positioner for up to 1 month after debonding. The standard finishing protocol group consisted of 34 patients; their dental arches were detailed with archwire bends and vertical elastics. The objective grading system of the American Board of Orthodontics was used to quantify the quality of the finish at each time point. The Wilcoxon signed rank test was used to compare changes in the complex finishing protocol; the Mann-Whitney U test was used to compare changes between groups. The complex finishing protocol group experienced a clinically significant improvement in objective grading system scores after treatment with the positioner. Mild improvement in posterior space closure was noted after molar band removal, but no improvement in the occlusion was observed after placement of the serpentine wires. Patients managed with the complex finishing protocol also had a lower objective grading system score (14.7) at the end of active treatment than did patients undergoing the standard finishing protocol (23.0). Tooth positioners caused a clinically significant improvement in interocclusal contacts, interproximal contacts, and net objective grading system score; mild improvement in posterior band space was noted after molar band removal 1 week before debond. Copyright © 2011 American Association of Orthodontists. Published by Mosby, Inc. All rights reserved.
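    An illustrative sketch of the nonparametric comparisons described above, with made-up objective grading system scores standing in for the patient data (SciPy assumed available):

        from scipy.stats import mannwhitneyu, wilcoxon

        complex_finish = [12, 15, 14, 17, 13, 16, 15, 14]     # hypothetical final OGS scores
        standard_finish = [22, 25, 21, 24, 23, 26, 22, 21]
        u_stat, p_between = mannwhitneyu(complex_finish, standard_finish, alternative="two-sided")

        before_positioner = [25, 24, 27, 23, 26, 28, 24, 25]  # hypothetical paired pre-positioner scores
        w_stat, p_within = wilcoxon(before_positioner, complex_finish)
        print(round(p_between, 4), round(p_within, 4))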

  9. A Review of Communications Protocol for Intelligent Remote Terminal Unit Development

    Directory of Open Access Journals (Sweden)

    Mohd Ruddin Ab. Ghani

    2013-11-01

    Full Text Available This paper reviews the possible interfacing communication protocols for remote terminal units (RTU). A Supervisory Control and Data Acquisition (SCADA) system is a central station that communicates with other networks using such protocols. Fundamentally, the architectures of all networks are based on the seven-layer Open Systems Interconnection (OSI) reference model of the International Organization for Standardization (ISO). The objective of designing the protocols is to check the status of all the input and output field devices and to send reports according to that status. The corresponding protocol and communication parameters between the connected devices must be included when designing a complex SCADA system. The protocols available for developing RTU communication are Modbus/ASCII, the distributed network protocol (DNP3), the controller area network (CAN), International Electrotechnical Commission (IEC) 60870, and the transmission control protocol/internet protocol (TCP/IP).
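
    As a hedged illustration of one protocol from this list, the sketch below hand-assembles a Modbus TCP "read holding registers" request frame (MBAP header plus PDU). The unit and register values are hypothetical; a real RTU or SCADA integration would normally rely on an existing Modbus library.

        # Hedged illustration: building a Modbus TCP "read holding registers"
        # request by hand. Device address and register values are hypothetical.
        import struct

        def modbus_read_holding_registers(transaction_id: int, unit_id: int,
                                          start_addr: int, count: int) -> bytes:
            # MBAP header: transaction id, protocol id (0), remaining length, unit id
            # PDU: function code 0x03, starting address, quantity of registers
            return struct.pack(">HHHBBHH",
                               transaction_id, 0x0000, 6, unit_id,
                               0x03, start_addr, count)

        frame = modbus_read_holding_registers(transaction_id=1, unit_id=17,
                                              start_addr=0x0000, count=2)
        print(frame.hex())  # 000100000006110300000002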

  10. Audit protocol of compliance test on x-ray and interventional radiodiagnostic

    International Nuclear Information System (INIS)

    Endang Kunarsih; Fitria Sandra

    2011-01-01

    A testing protocol is a document defined and implemented by a testing agency when conducting compliance testing, to ensure that the quality of the testing is planned and controlled in accordance with applicable regulations and standards. A testing protocol is required when filing an application to become a qualified testing agency. Auditors review the testing protocol document to assess the adequacy of the acceptance criteria before proceeding to the next step of the process. This paper presents the acceptance criteria required in an audit of the testing protocol document submitted by an applicant testing agency. (author)

  11. An ultra low-power and traffic-adaptive medium access control protocol for wireless body area network.

    Science.gov (United States)

    Ullah, Sana; Kwak, Kyung Sup

    2012-06-01

    A Wireless Body Area Network (WBAN) consists of low-power, miniaturized, and autonomous wireless sensor nodes that enable physicians to remotely monitor the vital signs of patients and provide real-time feedback with medical diagnosis and consultations. It is the most reliable and cheapest way to care for patients suffering from chronic diseases such as asthma, diabetes and cardiovascular disease. Among the most important attributes of a WBAN are low power consumption and low delay. These can be achieved by introducing flexible duty-cycling techniques on the energy-constrained sensor nodes. Stated otherwise, low-duty-cycle nodes should not receive frequent synchronization and control packets if they have no data to send or receive. In this paper, we introduce a traffic-adaptive MAC protocol (TaMAC) that takes into account the traffic information of the sensor nodes. The protocol dynamically adjusts the duty cycle of the sensor nodes according to their traffic patterns, thus solving the idle-listening and overhearing problems. The traffic patterns of all sensor nodes are organized and maintained by the coordinator. The TaMAC protocol is supported by a wake-up radio that is used to accommodate emergency and on-demand events in a reliable manner. The wake-up radio uses a separate control channel along with the data channel and therefore has considerably lower power consumption requirements. Analytical expressions are derived to analyze and compare the performance of the TaMAC protocol with the well-known beacon-enabled IEEE 802.15.4 MAC, WiseMAC, and SMAC protocols. The analytical derivations are further validated by simulation results. It is shown that the TaMAC protocol outperforms all the other protocols in terms of power consumption and delay.
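
    The abstract's point about duty cycling can be illustrated with a back-of-the-envelope power model. The sketch below is not the analytical expressions derived in the paper; the radio power figures are invented and the wake-up radio is modelled simply as a constant low-power listener.

        # Back-of-the-envelope duty-cycle model, not the paper's derivation:
        # average node power as a function of time spent in each radio state.
        def average_power_mw(duty_cycle: float,
                             p_active_mw: float = 60.0,    # transmit/receive (hypothetical)
                             p_sleep_mw: float = 0.003,    # deep sleep (hypothetical)
                             p_wakeup_radio_mw: float = 0.05) -> float:
            """Mean power of a node active for `duty_cycle` of the time, otherwise
            asleep, with a low-power wake-up radio always listening."""
            return (duty_cycle * p_active_mw
                    + (1.0 - duty_cycle) * p_sleep_mw
                    + p_wakeup_radio_mw)

        for dc in (0.10, 0.01, 0.001):   # traffic-adaptive protocols push this down
            print(f"duty cycle {dc:6.3f} -> {average_power_mw(dc):7.3f} mW")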

  12. Protocol for Communication Networking for Formation Flying

    Science.gov (United States)

    Jennings, Esther; Okino, Clayton; Gao, Jay; Clare, Loren

    2009-01-01

    An application-layer protocol and a network architecture have been proposed for data communications among multiple autonomous spacecraft that are required to fly in a precise formation in order to perform scientific observations. The protocol could also be applied to other autonomous vehicles operating in formation, including robotic aircraft, robotic land vehicles, and robotic underwater vehicles. A group of spacecraft or other vehicles to which the protocol applies could be characterized as a precision-formation-flying (PFF) network, and each vehicle could be characterized as a node in the PFF network. In order to support precise formation flying, it would be necessary to establish a corresponding communication network, through which the vehicles could exchange position and orientation data and formation-control commands. The communication network must enable communication during early phases of a mission, when little positional knowledge is available. Particularly during early mission phases, the distances among vehicles may be so large that communication could be achieved only by relaying across multiple links. The large distances and need for omnidirectional coverage would limit communication links to operation at low bandwidth during these mission phases. Once the vehicles were in formation and distances were shorter, the communication network would be required to provide high-bandwidth, low-jitter service to support tight formation-control loops. The proposed protocol and architecture, intended to satisfy the aforementioned and other requirements, are based on a standard layered-reference-model concept. The proposed application protocol would be used in conjunction with conventional network, data-link, and physical-layer protocols. The proposed protocol specifies the ubiquitous Institute of Electrical and Electronics Engineers (IEEE) 802.11 medium access control (MAC) protocol for use in the data-link layer. In addition to its widespread and proven use in

  13. Delineation of upper urinary tract segments at MDCT urography in patients with extra-urinary mass lesions: retrospective comparison of standard and low-dose protocols for the excretory phase of imaging

    Energy Technology Data Exchange (ETDEWEB)

    Mueller-Lisse, Ulrike L. [University of Munich, Department of Urology, Munich (Germany); University of Munich Medical School, Department of Urology, Muenchen (Germany); Coppenrath, Eva M.; Meindl, Thomas; Degenhart, Christoph; Scherr, Michael K.; Reiser, Maximilian F.; Mueller-Lisse, Ullrich G. [University of Munich, Department of Radiology, Munich (Germany); Stief, Christian G. [University of Munich, Department of Urology, Munich (Germany)

    2011-02-15

    Excretory-phase CT urography (CTU) may replace excretory urography in patients without urinary tumors. However, radiation exposure is a concern. We retrospectively compared upper urinary tract (UUT) delineation in low-dose and standard CTU. CTU (1-2 phases, 120 kV, 4 x 2.5 mm, pitch 0.875, i.v. non-ionic contrast media, iodine 36 g) was obtained with standard (14 patients, n = 27 UUTs, average 175.6 mAs/slice, average delay 16.8 min) or low-dose (26 patients, n = 86 UUTs, 29 mAs/slice, average delay 19.6 min) protocols. UUT was segmented into intrarenal collecting system (IRCS), upper, middle, and lower ureter (UU, MU, LU). Two independent readers (R1, R2) graded UUT segments as 1-not delineated, 2-partially delineated, 3-completely delineated (noisy margins), 4-completely delineated (clear margins). Chi-square statistics were calculated for partial versus complete delineation and complete delineation (clear margins), respectively. Complete delineation of UUT was similar in standard and low-dose CTU (R1, p > 0.15; R2, p > 0.2). IRCS, UU, and MU clearly delineated similarly often in standard and low-dose CTU (R1, p > 0.25; R2, p > 0.1). LU clearly delineated more often in standard protocols (R1, 18/6 standard, 38/31 low-dose, p > 0.1; R2, 18/6 standard, 21/48 low-dose, p < 0.05). Low-dose CTU sufficiently delineated course of UUT and may locate obstruction/dilation, but appears unlikely to find intraluminal LU lesions. (orig.)
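
    The chi-square comparison mentioned above can be reproduced in outline. The snippet below uses scipy on one plausible reading of the 18/6 versus 21/48 counts reported for reader R2 (clearly versus not clearly delineated lower ureters); the column interpretation is an assumption, not taken from the paper.

        # Illustrative only: a chi-square test of clear vs. not-clear delineation
        # counts between the two CTU protocols (assumed reading of the R2 figures).
        from scipy.stats import chi2_contingency

        #            clear  not clear
        observed = [[18,  6],    # standard-dose CTU
                    [21, 48]]    # low-dose CTU
        chi2, p, dof, expected = chi2_contingency(observed)
        print(f"chi2={chi2:.2f}, dof={dof}, p={p:.4f}")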

  14. A novel protocol for dispatcher assisted CPR improves CPR quality and motivation among rescuers-A randomized controlled simulation study.

    Science.gov (United States)

    Rasmussen, Stinne Eika; Nebsbjerg, Mette Amalie; Krogh, Lise Qvirin; Bjørnshave, Katrine; Krogh, Kristian; Povlsen, Jonas Agerlund; Riddervold, Ingunn Skogstad; Grøfte, Thorbjørn; Kirkegaard, Hans; Løfgren, Bo

    2017-01-01

    Emergency dispatchers use protocols to instruct bystanders in cardiopulmonary resuscitation (CPR). Studies changing one element of the dispatcher's protocol report improved CPR quality. Whether several changes interact is unknown, and the effect of combining multiple changes previously reported to improve CPR quality into one protocol remains to be investigated. We hypothesize that a novel dispatch protocol, combining multiple beneficial elements, improves CPR quality compared with a standard protocol. A novel dispatch protocol was designed including new wording on chest compressions, use of a metronome, regular encouragement and a 10-s rest each minute. In a simulated cardiac arrest scenario, laypersons were randomized to perform single-rescuer CPR guided by either the novel or the standard protocol. The primary outcome was a composite endpoint of time to first compression, hand position, compression depth and rate, and hands-off time (maximum score: 22 points). Afterwards, participants answered a questionnaire evaluating the dispatcher assistance. The novel protocol (n=61) improved the CPR quality score compared with the standard protocol (n=64) (mean (SD): 18.6 (1.4) points vs. 17.5 (1.7) points). A novel bundle-of-care protocol improved CPR quality score and motivation among rescuers. Copyright © 2016 Elsevier Ireland Ltd. All rights reserved.

  15. Validation of the G.LAB MD2200 wrist blood pressure monitor according to the European Society of Hypertension, the British Hypertension Society, and the International Organization for Standardization Protocols.

    Science.gov (United States)

    Liu, Ze-Yu; Zhang, Qing-Han; Ye, Xiao-Lei; Liu, Da-Peng; Cheng, Kang; Zhang, Chun-Hai; Wan, Yi

    2017-04-01

    To validate the G.LAB MD2200 automated wrist blood pressure (BP) monitors according to the European Society of Hypertension International Protocol (ESH-IP) revision 2010, the British Hypertension Society (BHS), and the International Organization for Standardization (ISO) 81060-2:2013 protocols. The device was assessed on 33 participants according to the ESH requirements and was then tested on 85 participants according to the BHS and ISO 81060-2:2013 criteria. The validation procedures and data analysis followed the protocols precisely. The G.LAB MD2200 devices passed all parts of ESH-IP revision 2010 for both systolic and diastolic BP, with a device-observer difference of 2.15±5.51 and 1.51±5.16 mmHg, respectively. The device achieved A/A grading for the BHS protocol and it also fulfilled the criteria of ISO 81060-2:2013, with mean differences of systolic and diastolic BP between the device and the observer of 2.19±5.21 and 2.11±4.70 mmHg, respectively. The G.LAB MD2200 automated wrist BP monitor passed the ESH-IP revision 2010 and the ISO 81060-2:2013 protocol, and achieved the A/A grade of the BHS protocol, which can be recommended for self-measurement in the general population.
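
    Validation reports of this kind summarise device-observer differences as mean ± SD and count readings falling within 5/10/15 mmHg bands. The sketch below illustrates that bookkeeping on simulated readings; it does not reproduce the actual ESH-IP 2010, BHS or ISO 81060-2:2013 pass thresholds.

        # Sketch under stated assumptions: device-observer difference summary on
        # simulated systolic readings. Pass criteria of the protocols are omitted.
        import numpy as np

        rng = np.random.default_rng(1)
        observer = rng.normal(130, 15, size=99)            # reference systolic BP (mmHg)
        device = observer + rng.normal(2.0, 5.0, size=99)  # hypothetical device error

        diff = device - observer
        print(f"mean difference {diff.mean():+.2f} mmHg, SD {diff.std(ddof=1):.2f} mmHg")
        for band in (5, 10, 15):
            print(f"within {band:2d} mmHg: {(np.abs(diff) <= band).sum()} of {diff.size}")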

  16. Implementation of a Rapid, Protocol-based TIA Management Pathway.

    Science.gov (United States)

    Jarhult, Susann J; Howell, Melissa L; Barnaure-Nachbar, Isabelle; Chang, Yuchiao; White, Benjamin A; Amatangelo, Mary; Brown, David F; Singhal, Aneesh B; Schwamm, Lee H; Silverman, Scott B; Goldstein, Joshua N

    2018-03-01

    Our goal was to assess whether use of a standardized clinical protocol improves efficiency for patients who present to the emergency department (ED) with symptoms of transient ischemic attack (TIA). We performed a structured, retrospective, cohort study at a large, urban, tertiary care academic center. In July 2012 this hospital implemented a standardized protocol for patients with suspected TIA. The protocol selected high-risk patients for admission and low/intermediate-risk patients for workup in an ED observation unit. Recommended workup included brain imaging, vascular imaging, cardiac monitoring, and observation. Patients were included if clinical providers determined the need for workup for TIA. We included consecutive patients presenting during a six-month period prior to protocol implementation and those presenting between 6 and 12 months after implementation. Outcomes included ED length of stay (LOS), hospital LOS, use of neuroimaging, and 90-day risk of stroke or TIA. From 01/2012 to 06/2012, 130 patients were evaluated for TIA symptoms in the ED, and from 01/2013 to 06/2013, 150 patients. The final diagnosis was TIA or stroke in 45% before vs. 41% after (p=0.18). Following the intervention, the inpatient admission rate decreased from 62% to 24%. The 90-day risk of stroke or TIA among those with a final diagnosis of TIA was 3% for both periods. Implementation of a TIA protocol significantly reduced ED LOS and total hospital LOS.

  17. Standardization and optimization of fluorescence in situ hybridization (FISH) for HER-2 assessment in breast cancer: A single center experience.

    Science.gov (United States)

    Bogdanovska-Todorovska, Magdalena; Petrushevska, Gordana; Janevska, Vesna; Spasevska, Liljana; Kostadinova-Kunovska, Slavica

    2018-05-20

    Accurate assessment of human epidermal growth factor receptor 2 (HER-2) is crucial in selecting patients for targeted therapy. Commonly used methods for HER-2 testing are immunohistochemistry (IHC) and fluorescence in situ hybridization (FISH). Here we present the implementation, optimization and standardization of two FISH protocols using breast cancer samples and assess the impact of pre-analytical and analytical factors on HER-2 testing. Formalin-fixed, paraffin-embedded (FFPE) tissue samples from 70 breast cancer patients were tested for HER-2 using the PathVysion™ HER-2 DNA Probe Kit and two different paraffin pretreatment kits, the Vysis/Abbott Paraffin Pretreatment Reagent Kit (40 samples) and the DAKO Histology FISH Accessory Kit (30 samples). The concordance between FISH and IHC results was determined. Pre-analytical and analytical factors (i.e., fixation, baking, digestion, and post-hybridization washing) affected the efficiency and quality of hybridization. The overall hybridization success in our study was 98.6% (69/70); the failure rate was 1.4%. The DAKO pretreatment kit was more time-efficient and resulted in more uniform signals that were easier to interpret, compared with the Vysis/Abbott kit. The overall concordance between IHC and FISH was 84.06% (kappa coefficient 0.5976). Differences in the pre-analytical and analytical steps can affect the hybridization quality and efficiency. The use of the DAKO pretreatment kit is time-saving and cost-effective.
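
    Concordance between two categorical assays, as reported above, is commonly expressed as overall agreement plus Cohen's kappa. The snippet below shows the calculation on hypothetical paired IHC/FISH calls (not the study data), using scikit-learn.

        # Illustrative concordance calculation on invented paired calls
        # (1 = HER-2 positive, 0 = negative), not the study data.
        from sklearn.metrics import cohen_kappa_score

        ihc  = [1, 1, 0, 0, 1, 0, 0, 1, 1, 0, 0, 0, 1, 0, 1, 1, 0, 0, 1, 0]
        fish = [1, 1, 0, 0, 1, 0, 0, 0, 1, 0, 0, 1, 1, 0, 1, 1, 0, 0, 1, 0]

        agreement = sum(a == b for a, b in zip(ihc, fish)) / len(ihc)
        print(f"overall concordance {agreement:.1%}, kappa {cohen_kappa_score(ihc, fish):.3f}")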

  18. Network protocols. Special issue; Netwerkprotocollen. Themanummer

    Energy Technology Data Exchange (ETDEWEB)

    Jansen, G.A. [RTB Van Heugten, Nijmegen (Netherlands); Rooijakkers, G.W.J. [GTI Building Automation, Amsterdam (Netherlands); Peterse, A. [Regel Partners, Hoevelaken (Netherlands); Smits, P. [Konnex Nederland, Valkenswaard (Netherlands); Hamers, E.P. [Van Dorp Installaties, Breda (Netherlands); Van der Velden, J.A.J. [Kropman, Rijswijk (Netherlands); Van Lingen, G.; Wijn, D.M. [Engineer Johnson Controls, Gorinchem (Netherlands); Deckere, W.J.M.A. [Deerns raadgevende ingenieurs, Rijswijk (Netherlands); Driessen, B. [Saia Burgess, Gouda (Netherlands); Van Olst, K. [K en R Consultants, Deventer (Netherlands); Mosterman, F. [Wago Building Technology, Harderwijk (Netherlands); Staub, R. [BUS-House, Zuerich (Switzerland); Meiring, O.B.; Hut, W.H. [Sauter Building Control Nederland, Amsterdam (Netherlands); Tukker, A. [Webeasy Products, Sliedrecht (Netherlands); Bakker, L.G.; Soethout, L.L.; Elkhuizen, P.A. [TNO Bouw en Ondergrond, Delft (Netherlands); Haeseler, U. [TAC GmbH, Berlin (Germany); Kerdel, J.F. [Siemens Building Technologies, Zoetermeer (Netherlands); Lugt, G.L.; Draijer, G.W.

    2007-11-15

    In 20 articles attention is paid to several aspects of the network protocols by means of which building automation systems exchange data: building automation and management, the history of technical installation management, the open communication standard BACnet (Building Automation and Control network), the ISO/IEC domotics and communication standard KNX (Konnex), the integration of electrical and mechanical installations using LonWorks technology, other standard protocols such as Modbus, M-bus and OPC (OLE for Process Control), an outline of TCP/IP, smart design of networks, building owners on automation and networks, the use of BACnet and Ethernet in a monumental building renovated into offices, the use of an open management network in buildings, wireless open integrated systems, terminology in network communication, the use of BACnet in combination with KNX, the impact of BACnet on building automation, the role of the installation sector in the ICT environment, knowledge of building automation and management, regulations with respect to building automation, and BACnet MS/TP (Master-Slave/Token-Passing) [Dutch] In 20 artikelen wordt in dit themanummer aandacht besteed aan diverse aspecten m.b.t. netwerkprotocollen waarmee verschillende automatiseringssystemen gegevens met elkaar uitwisselen: gebouwautomatisering en beheer, geschiedenis van technisch installatie beheer, de open communicatie standaard BACnet (Building Automation and Control network), de zogenaamde ISO/IEC domotica en communicatie standaard KNX of Konnex, de integratie van electrotechnische en werktuigbouwkundige installaties met behulp van de LonWorks technologie, andere standaard protocollen zoals Modbus, M-bus, OPC (OLE for Process Control), uitleg over TCP/IP, slim ontwerpen van netwerken, gebouweigenaren over automatisering en netwerken, het gebruik van BACnet en Ethernet in een tot kantoorgebouw gerenoveerd monumentaal gebouw, het gebruik van een open management netwerk in gebouwen, draadloos met

  19. Ad-Hoc vs. Standardized and Optimized Arthropod Diversity Sampling

    Directory of Open Access Journals (Sweden)

    Pedro Cardoso

    2009-09-01

    Full Text Available The use of standardized and optimized protocols has recently been advocated for different arthropod taxa, instead of ad-hoc sampling or sampling with protocols defined on a case-by-case basis. We present a comparison of both sampling approaches applied to spiders in a natural area of Portugal. Tests were made of their efficiency, over-collection of common species, singleton proportions, species abundance distributions, average specimen size, average taxonomic distinctness, and the behavior of richness estimators. The standardized protocol revealed three main advantages: (1) higher efficiency; (2) more reliable estimation of true richness; and (3) meaningful comparisons between undersampled areas.
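
    As a hedged example of the richness estimators whose behaviour such studies compare, the sketch below implements the classic Chao1 estimator; the paper does not specify its estimators in this abstract, and the abundance list is made up.

        # Classic Chao1 richness estimator on a hypothetical abundance list.
        from collections import Counter

        def chao1(abundances):
            """Chao1: S_obs + F1^2 / (2 * F2); bias-corrected form when F2 = 0."""
            counts = Counter(abundances)
            s_obs = sum(1 for a in abundances if a > 0)
            f1, f2 = counts[1], counts[2]       # singletons, doubletons
            if f2 == 0:
                return s_obs + f1 * (f1 - 1) / 2.0
            return s_obs + (f1 * f1) / (2.0 * f2)

        abundances = [12, 7, 5, 3, 2, 2, 1, 1, 1, 1]   # one hypothetical sampling plot
        print(f"observed species: {len(abundances)}, Chao1 estimate: {chao1(abundances):.1f}")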

  20. Analytical quality control [An IAEA service]

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    1973-07-01

    In analytical chemistry the determination of small or trace amounts of elements or compounds in different types of materials is increasingly important. The results of these findings have a great influence on different fields of science, and on human life. Their reliability, precision and accuracy must, therefore, be checked by analytical quality control measures. The International Atomic Energy Agency (IAEA) set up an Analytical Quality Control Service (AQCS) in 1962 to assist laboratories in Member States in the assessment of their reliability in radionuclide analysis, and in other branches of applied analysis in which radionuclides may be used as analytical implements. For practical reasons, most analytical laboratories are not in a position to check accuracy internally, as frequently resources are available for only one method; standardized sample material, particularly in the case of trace analysis, is not available and can be prepared by the institutes themselves only in exceptional cases; intercomparisons are organized rather seldom and many important types of analysis are so far not covered. AQCS assistance is provided by the shipment to laboratories of standard reference materials containing known quantities of different trace elements or radionuclides, as well as by the organization of analytical intercomparisons in which the participating laboratories are provided with aliquots of homogenized material of unknown composition for analysis. In the latter case the laboratories report their data to the Agency's laboratory, which calculates averages and distributions of results and advises each laboratory of its performance relative to all the others. Throughout the years several dozens of intercomparisons have been organized and many thousands of samples provided. The service offered, as a consequence, has grown enormously. The programme for 1973 and 1974, which is currently being distributed to Member States, will contain 31 different types of materials.

  1. Characterization of Catalytic Fast Pyrolysis Oils: The Importance of Solvent Selection for Analytical Method Development

    Energy Technology Data Exchange (ETDEWEB)

    Ferrell, Jack R [National Renewable Energy Laboratory (NREL), Golden, CO (United States); Ware, Anne E [National Renewable Energy Laboratory (NREL), Golden, CO (United States)

    2018-02-25

    Two catalytic fast pyrolysis (CFP) oils (bottom/heavy fraction) were analyzed in various solvents that are used in common analytical methods (nuclear magnetic resonance - NMR, gas chromatography - GC, gel permeation chromatography - GPC, thermogravimetric analysis - TGA) for oil characterization and speciation. A more accurate analysis of the CFP oils can be obtained by identification and exploitation of solvent miscibility characteristics. Acetone and tetrahydrofuran can be used to completely solubilize CFP oils for analysis by GC and tetrahydrofuran can be used for traditional organic GPC analysis of the oils. DMSO-d6 can be used to solubilize CFP oils for analysis by 13C NMR. The fractionation of oils into solvents that did not completely solubilize the whole oils showed that miscibility can be related to the oil properties. This allows for solvent selection based on physico-chemical properties of the oils. However, based on semi-quantitative comparisons of the GC chromatograms, the organic solvent fractionation schemes did not speciate the oils based on specific analyte type. On the other hand, chlorinated solvents did fractionate the oils based on analyte size to a certain degree. Unfortunately, like raw pyrolysis oil, the matrix of the CFP oils is complicated and is not amenable to simple liquid-liquid extraction (LLE) or solvent fractionation to separate the oils based on the chemical and/or physical properties of individual components. For reliable analyses, for each analytical method used, it is critical that the bio-oil sample is both completely soluble and also not likely to react with the chosen solvent. The adoption of the standardized solvent selection protocols presented here will allow for greater reproducibility of analysis across different users and facilities.

  2. Variability in donation after cardiac death protocols: a national survey.

    Science.gov (United States)

    Fugate, Jennifer E; Stadtler, Maria; Rabinstein, Alejandro A; Wijdicks, Eelco F M

    2011-02-27

    As donation after cardiac death practices expand, the number of institutional policies is increasing. We contacted organ procurement organizations throughout the United States and requested protocols in hospitals in their donor service areas. Sixty-four protocols were obtained with representation from 16 different states. The terminology and recommended practices varied substantially. The methods for death determination were not specified in 28 (44%) protocols. Most adhered to a 2- to 5-min observation time between circulatory arrest and organ procurement, but 10 (16%) provided no information. This variability reveals a need to define a uniform standard in donation after cardiac death protocols and death determination practices.

  3. Standards and (self)implosion

    DEFF Research Database (Denmark)

    Brøgger, Katja; Staunæs, Dorthe

    2016-01-01

    of standards often tends to conceptualize the travelling of standards as contagious processes resulting in epidemic spreads. In this article, the abstract metaphor of epidemic spread is replaced by an analytical configuration of a new mode of educational governance in which orchestrating webs of incentives...

  4. SPECT/CT workflow and imaging protocols

    Energy Technology Data Exchange (ETDEWEB)

    Beckers, Catherine [University Hospital of Liege, Division of Nuclear Medicine and Oncological Imaging, Department of Medical Physics, Liege (Belgium); Hustinx, Roland [University Hospital of Liege, Division of Nuclear Medicine and Oncological Imaging, Department of Medical Physics, Liege (Belgium); Domaine Universitaire du Sart Tilman, Service de Medecine Nucleaire et Imagerie Oncologique, CHU de Liege, Liege (Belgium)

    2014-05-15

    Introducing a hybrid imaging method such as single photon emission computed tomography (SPECT)/CT greatly alters the routine in the nuclear medicine department. It requires designing new workflow processes and the revision of original scheduling process and imaging protocols. In addition, the imaging protocol should be adapted for each individual patient, so that performing CT is fully justified and the CT procedure is fully tailored to address the clinical issue. Such refinements often occur before the procedure is started but may be required at some intermediate stage of the procedure. Furthermore, SPECT/CT leads in many instances to a new partnership with the radiology department. This article presents practical advice and highlights the key clinical elements which need to be considered to help understand the workflow process of SPECT/CT and optimise imaging protocols. The workflow process using SPECT/CT is complex in particular because of its bimodal character, the large spectrum of stakeholders, the multiplicity of their activities at various time points and the need for real-time decision-making. With help from analytical tools developed for quality assessment, the workflow process using SPECT/CT may be separated into related, but independent steps, each with its specific human and material resources to use as inputs or outputs. This helps identify factors that could contribute to failure in routine clinical practice. At each step of the process, practical aspects to optimise imaging procedure and protocols are developed. A decision-making algorithm for justifying each CT indication as well as the appropriateness of each CT protocol is the cornerstone of routine clinical practice using SPECT/CT. In conclusion, implementing hybrid SPECT/CT imaging requires new ways of working. It is highly rewarding from a clinical perspective, but it also proves to be a daily challenge in terms of management. (orig.)

  5. SPECT/CT workflow and imaging protocols

    International Nuclear Information System (INIS)

    Beckers, Catherine; Hustinx, Roland

    2014-01-01

    Introducing a hybrid imaging method such as single photon emission computed tomography (SPECT)/CT greatly alters the routine in the nuclear medicine department. It requires designing new workflow processes and the revision of original scheduling process and imaging protocols. In addition, the imaging protocol should be adapted for each individual patient, so that performing CT is fully justified and the CT procedure is fully tailored to address the clinical issue. Such refinements often occur before the procedure is started but may be required at some intermediate stage of the procedure. Furthermore, SPECT/CT leads in many instances to a new partnership with the radiology department. This article presents practical advice and highlights the key clinical elements which need to be considered to help understand the workflow process of SPECT/CT and optimise imaging protocols. The workflow process using SPECT/CT is complex in particular because of its bimodal character, the large spectrum of stakeholders, the multiplicity of their activities at various time points and the need for real-time decision-making. With help from analytical tools developed for quality assessment, the workflow process using SPECT/CT may be separated into related, but independent steps, each with its specific human and material resources to use as inputs or outputs. This helps identify factors that could contribute to failure in routine clinical practice. At each step of the process, practical aspects to optimise imaging procedure and protocols are developed. A decision-making algorithm for justifying each CT indication as well as the appropriateness of each CT protocol is the cornerstone of routine clinical practice using SPECT/CT. In conclusion, implementing hybrid SPECT/CT imaging requires new ways of working. It is highly rewarding from a clinical perspective, but it also proves to be a daily challenge in terms of management. (orig.)

  6. Acute effects of various weighted bat warm-up protocols on bat velocity.

    Science.gov (United States)

    Reyes, G Francis; Dolny, Dennis

    2009-10-01

    Although research has provided evidence of increased muscular performance following a facilitation set of resistance exercise, this has not been established for use prior to measuring baseball bat velocity. The purpose of this study was to determine the effectiveness of selected weighted-bat warm-up protocols in enhancing bat velocity in collegiate baseball players. Nineteen collegiate baseball players (age = 20.15 +/- 1.46 years) were tested for upper-body strength by a 3-repetition maximum (RM) bench press (mean = 97.98 +/- 14.54 kg) and for mean bat velocity. Nine weighted-bat warm-up protocols, each consisting of 3 sets of 6 swings with 3 bats (light = 794 g; standard = 850 g; heavy = 1,531 g) in different orders, were compared. A control trial involved a warm-up protocol using only the standard bat. Pearson product-moment correlation revealed a significant relationship between 3RM strength and pretest bat velocity (r = 0.51, p = 0.01). Repeated-measures analysis of variance (ANOVA) revealed no significant treatment effects of warm-up protocol on bat velocity. However, the standard, light, heavy bat sequence resulted in the greatest increase in bat velocity (+6.03%). These results suggest that upper-body muscle strength influences bat velocity. It appears that the standard, light, heavy warm-up order may provide the greatest benefit for increasing subsequent bat velocity and may warrant use in game situations.

  7. Non-Intrusive Load Monitoring Assessment: Literature Review and Laboratory Protocol

    Energy Technology Data Exchange (ETDEWEB)

    Butner, R. Scott; Reid, Douglas J.; Hoffman, Michael G.; Sullivan, Greg; Blanchard, Jeremy

    2013-07-01

    To evaluate the accuracy of NILM technologies, a literature review was conducted to identify any test protocols or standardized testing approaches currently in use. The literature review indicated that no consistent conventions were currently in place for measuring the accuracy of these technologies. Consequently, PNNL developed a testing protocol and metrics to provide the basis for quantifying and analyzing the accuracy of commercially available NILM technologies. This report discusses the results of the literature review and the proposed test protocol and metrics in more detail.

  8. Network protocol 'EPAP'; Network protokoru 'EPAP'

    Energy Technology Data Exchange (ETDEWEB)

    Kobori, T.; Fujita, F.; Iwamoto, S. [Fuji Electric Co. Ltd., Toyo (Japan)

    2000-10-10

    Ethernet, a standard for information networks, has begun to be applied to control local area networks (LANs). To apply Ethernet at the field level, Fuji Electric has newly developed the communication protocol 'Ethernet precision access protocol (EPAP)', in which a command/response method is built on top of the user datagram protocol (UDP) to realize real-time behavior and high reliability. Further, we have implemented EPAP on the bus interface module of the open PIO. This paper outlines EPAP and its implementation. (author)
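
    The general command/response-over-UDP pattern described above can be sketched as follows. This is not Fuji Electric's EPAP frame format; it only illustrates layering a request identifier, a timeout and retries on top of UDP to obtain a degree of reliability and bounded response time.

        # Generic command/response over UDP: request id + timeout + retries.
        # Frame layout and addresses are hypothetical, not the EPAP specification.
        import socket
        import struct

        def send_command(sock, addr, request_id: int, command: bytes,
                         timeout_s: float = 0.2, retries: int = 3) -> bytes:
            frame = struct.pack(">I", request_id) + command
            sock.settimeout(timeout_s)
            for _ in range(retries):
                sock.sendto(frame, addr)
                try:
                    data, _ = sock.recvfrom(2048)
                except socket.timeout:
                    continue                      # retransmit on timeout
                if data[:4] == frame[:4]:         # match response to request id
                    return data[4:]
            raise TimeoutError(f"no response for request {request_id}")

        # Usage (assumes some UDP responder listens at this hypothetical address):
        # sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
        # payload = send_command(sock, ("192.0.2.10", 5000), 1, b"READ IO_STATUS")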

  9. Protocol for the building construction process with regard to the implementation trajectory protocols EWN and EUN. Manual for commissioners, contractors, building management offices and energy efficiency standard advisors; Handleiding opnameprotocollen EWN en EUN. Voor opdrachtgevers, aannemers, bouwmanagementbureaus en EPN-adviseurs

    Energy Technology Data Exchange (ETDEWEB)

    Neeleman, J. [DWA installatie- en energieadvies, Duitslandweg 4, Postbus 274, 2410 AG Bodegraven (Netherlands)

    2013-04-15

    In the year 2012 it was foreseen to base the energy label for new buildings on the Energy Efficiency Coefficient (EPC in Dutch). This is a protocol for residential and utility buildings, with the aim to check whether and to what extent buildings were constructed according the EPC and to determine the realized EPC. In order to gain experience with the new protocols and the voluntary ventilation test the Protocol for the Energy Label for New Houses (EWN in Dutch) and the Protocol for the Energy Label for New Utility Buildings (EUN in Dutch) were conducted in 12 newly built housing projects and 5 projects in the utility building sector. With this manual you can realize energy efficient houses and/or utility buildings that meet the standards [Dutch] In het jaar 2012 was voorzien om het nieuwbouwlabel te baseren op de EPC (Energie Prestatie Coefficient). Hiervoor is een opnameprotocol opgesteld voor de woningbouw en de utiliteitsbouw, met als doel te controleren of en in hoeverre conform de EPC is gebouwd en om de gerealiseerde EPC te bepalen. Om ervaring op te doen met de nieuwe opnameprotocollen en de vrijwillige ventilatietoets werden het Opnameprotocol Energielabel Woningen Nieuwbouw (EWN) en Opnameprotocol Energielabel Utiliteitsgebouwen Nieuwbouw (EUN) uitgevoerd bij 12 nieuwbouwprojecten in de woningbouw en 5 projecten in de utiliteitsbouw. Met deze handleiding realiseert u energiezuinige woningen en/of utiliteitsgebouwen die aan de verwachtingen voldoen.

  10. Standardization of a protocol to obtain genomic DNA for the quantification of 5mC in epicormics buds of Tectona grandis L.

    OpenAIRE

    Elisa Quiala; Luis Valledor; Rodrigo Hazbun; Raúl Barbón; Manuel de Feria; Maité Chávez

    2008-01-01

    The present investigation was carried out with the objective of defining an extraction and purification method that provides deoxyribonucleic acid (DNA) suitable for determining the percentage of 5mC in the genomic DNA of epicormics buds of Tectona grandis L. During the standardization of the protocol, four methods were compared: 1) a classic method based on saline-shock solution with CTAB (hexadecyltrimethylammonium bromide); 2) the DNeasy Plant Mini Kit (QIAGEN) for extraction of plant DNA, accor...

  11. A Web Resource for Standardized Benchmark Datasets, Metrics, and Rosetta Protocols for Macromolecular Modeling and Design.

    Directory of Open Access Journals (Sweden)

    Shane Ó Conchúir

    Full Text Available The development and validation of computational macromolecular modeling and design methods depend on suitable benchmark datasets and informative metrics for comparing protocols. In addition, if a method is intended to be adopted broadly in diverse biological applications, there needs to be information on appropriate parameters for each protocol, as well as metrics describing the expected accuracy compared to experimental data. In certain disciplines, there exist established benchmarks and public resources where experts in a particular methodology are encouraged to supply their most efficient implementation of each particular benchmark. We aim to provide such a resource for protocols in macromolecular modeling and design. We present a freely accessible web resource (https://kortemmelab.ucsf.edu/benchmarks to guide the development of protocols for protein modeling and design. The site provides benchmark datasets and metrics to compare the performance of a variety of modeling protocols using different computational sampling methods and energy functions, providing a "best practice" set of parameters for each method. Each benchmark has an associated downloadable benchmark capture archive containing the input files, analysis scripts, and tutorials for running the benchmark. The captures may be run with any suitable modeling method; we supply command lines for running the benchmarks using the Rosetta software suite. We have compiled initial benchmarks for the resource spanning three key areas: prediction of energetic effects of mutations, protein design, and protein structure prediction, each with associated state-of-the-art modeling protocols. With the help of the wider macromolecular modeling community, we hope to expand the variety of benchmarks included on the website and continue to evaluate new iterations of current methods as they become available.

  12. Early Appropriate Care: A Protocol to Standardize Resuscitation Assessment and to Expedite Fracture Care Reduces Hospital Stay and Enhances Revenue.

    Science.gov (United States)

    Vallier, Heather A; Dolenc, Andrea J; Moore, Timothy A

    2016-06-01

    We hypothesized that a standardized protocol for fracture care would enhance revenue by reducing complications and length of stay. Prospective consecutive series. Level 1 trauma center. Two hundred and fifty-three adult patients with a mean age of 40.7 years and mean Injury Severity Score of 26.0. Femur, pelvis, or spine fractures treated surgically. Hospital and professional charges and collections were analyzed. Fixation was defined as early (<36 hours) or delayed. Complications and hospital stay were recorded. Mean charges were US $180,145 with a mean of US $66,871 collected (37%). The revenue multiplier was US $59,882/$6989 (8.57), indicating hospital collection of US $8.57 for every professional dollar, less than half of which went to orthopaedic surgeons. Delayed fracture care was associated with more intensive care unit days (4.5 vs. 9.4) and total hospital days (9.4 vs. 15.3), with a mean loss of actual revenue of US $6380 per patient delayed (n = 47) because of the costs of a longer length of stay. Complications were associated with the highest expenses: a mean of US $291,846 in charges and US $101,005 in collections, with facility collections decreased by 5.1%. An uncomplicated course of care was associated with the most favorable total collections (US $60,017/$158,454 = 38%) and the shortest mean stay (8.7 days). Facility collections were nearly 9 times higher than professional collections. Delayed fixation was associated with more complications, and facility collections decreased 5% with a complication. Furthermore, delayed fixation was associated with a longer hospital stay, accounting for US $300K more in actual costs during the study. A standardized protocol to expedite definitive fixation enhances the profitability of the trauma service line. Economic Level IV. See Instructions for Authors for a complete description of levels of evidence.

  13. Quantifying the measurement uncertainty of results from environmental analytical methods.

    Science.gov (United States)

    Moser, J; Wegscheider, W; Sperka-Gottlieb, C

    2001-07-01

    The Eurachem-CITAC Guide Quantifying Uncertainty in Analytical Measurement was put into practice in a public laboratory devoted to environmental analytical measurements. In doing so due regard was given to the provisions of ISO 17025 and an attempt was made to base the entire estimation of measurement uncertainty on available data from the literature or from previously performed validation studies. Most environmental analytical procedures laid down in national or international standards are the result of cooperative efforts and put into effect as part of a compromise between all parties involved, public and private, that also encompasses environmental standards and statutory limits. Central to many procedures is the focus on the measurement of environmental effects rather than on individual chemical species. In this situation it is particularly important to understand the measurement process well enough to produce a realistic uncertainty statement. Environmental analytical methods will be examined as far as necessary, but reference will also be made to analytical methods in general and to physical measurement methods where appropriate. This paper describes ways and means of quantifying uncertainty for frequently practised methods of environmental analysis. It will be shown that operationally defined measurands are no obstacle to the estimation process as described in the Eurachem/CITAC Guide if it is accepted that the dominating component of uncertainty comes from the actual practice of the method as a reproducibility standard deviation.
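
    The bottom-up combination of uncertainty components that the Eurachem/CITAC guide describes reduces, for independent contributions, to a root-sum-of-squares followed by multiplication with a coverage factor. The sketch below shows that arithmetic with invented component values; a real uncertainty budget would list the components identified for the specific method.

        # Minimal GUM/Eurachem-style sketch: combine independent relative standard
        # uncertainties in quadrature, then expand with coverage factor k = 2.
        from math import sqrt

        def combined_standard_uncertainty(components_relative):
            """Root-sum-of-squares of independent relative standard uncertainties."""
            return sqrt(sum(u * u for u in components_relative))

        components = {                          # hypothetical budget entries
            "calibration standard":   0.010,
            "volumetric steps":       0.008,
            "recovery":               0.020,
            "method reproducibility": 0.035,    # often the dominant term in practice
        }
        u_c = combined_standard_uncertainty(components.values())
        U = 2 * u_c                             # expanded uncertainty, k = 2 (~95 %)
        print(f"relative u_c = {u_c:.3f}, expanded U (k=2) = {U:.3f}")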

  14. Analytic central path, sensitivity analysis and parametric linear programming

    NARCIS (Netherlands)

    A.G. Holder; J.F. Sturm; S. Zhang (Shuzhong)

    1998-01-01

    In this paper we consider properties of the central path and the analytic center of the optimal face in the context of parametric linear programming. We first show that if the right-hand side vector of a standard linear program is perturbed, then the analytic center of the optimal face

  15. Analytical protocols for the determination of sulphur compounds characteristic of the metabolism of Chlorobium limicola

    Directory of Open Access Journals (Sweden)

    A. Aliboni

    2015-09-01

    Full Text Available Chlorobium limicola belongs to the green sulphur bacteria, which have potential for technological applications such as biogas clean-up, oxidising hydrogen sulphide to elemental sulphur through a photosynthetic process. In the present work, analytical methods are described for the determination of different sulphur species in C. limicola cultures: sulphide by GC-FPD, sulphate by ionic HPLC and elemental sulphur by RP HPLC. The latter method eliminates the need for chloroform extraction of aqueous suspensions of elemental sulphur. Data from the sulphide and elemental sulphur analyses have been compared with those obtained by more traditional analytical methodologies.

  16. A Business Evaluation Of The Next Generation Ipv6 Protocol In Fixed And Mobile Communication Services

    OpenAIRE

    Pau, Louis-François

    2002-01-01

    This paper gives an analytical business model of the Internet IPv4 and IPv6 protocols, focusing on the business implications of intrinsic technical properties of these protocols. The technical properties modeled in business terms are: address space, payload, autoconfiguration, IP mobility, security, and flow label. Three operational cash-flow-focused performance indexes are defined for, respectively, an Internet operator or ISP, the address domain owner, and the end user...

  17. Implementation and Evaluation of Four Interoperable Open Standards for the Internet of Things

    Science.gov (United States)

    Jazayeri, Mohammad Ali; Liang, Steve H. L.; Huang, Chih-Yuan

    2015-01-01

    Recently, researchers are focusing on a new use of the Internet called the Internet of Things (IoT), in which enabled electronic devices can be remotely accessed over the Internet. As the realization of IoT concept is still in its early stages, manufacturers of Internet-connected devices and IoT web service providers are defining their proprietary protocols based on their targeted applications. Consequently, IoT becomes heterogeneous in terms of hardware capabilities and communication protocols. Addressing these heterogeneities by following open standards is a necessary step to communicate with various IoT devices. In this research, we assess the feasibility of applying existing open standards on resource-constrained IoT devices. The standard protocols developed in this research are OGC PUCK over Bluetooth, TinySOS, SOS over CoAP, and OGC SensorThings API. We believe that by hosting open standard protocols on IoT devices, not only do the devices become self-describable, self-contained, and interoperable, but innovative applications can also be easily developed with standardized interfaces. In addition, we use memory consumption, request message size, response message size, and response latency to benchmark the efficiency of the implemented protocols. In all, this research presents and evaluates standard-based solutions to better understand the feasibility of applying existing standards to the IoT vision. PMID:26402683

  18. Implementation and Evaluation of Four Interoperable Open Standards for the Internet of Things.

    Science.gov (United States)

    Jazayeri, Mohammad Ali; Liang, Steve H L; Huang, Chih-Yuan

    2015-09-22

    Recently, researchers are focusing on a new use of the Internet called the Internet of Things (IoT), in which enabled electronic devices can be remotely accessed over the Internet. As the realization of IoT concept is still in its early stages, manufacturers of Internet-connected devices and IoT web service providers are defining their proprietary protocols based on their targeted applications. Consequently, IoT becomes heterogeneous in terms of hardware capabilities and communication protocols. Addressing these heterogeneities by following open standards is a necessary step to communicate with various IoT devices. In this research, we assess the feasibility of applying existing open standards on resource-constrained IoT devices. The standard protocols developed in this research are OGC PUCK over Bluetooth, TinySOS, SOS over CoAP, and OGC SensorThings API. We believe that by hosting open standard protocols on IoT devices, not only do the devices become self-describable, self-contained, and interoperable, but innovative applications can also be easily developed with standardized interfaces. In addition, we use memory consumption, request message size, response message size, and response latency to benchmark the efficiency of the implemented protocols. In all, this research presents and evaluates standard-based solutions to better understand the feasibility of applying existing standards to the IoT vision.

  19. Protocol for the management of psychiatric patients with psychomotor agitation.

    Science.gov (United States)

    Vieta, Eduard; Garriga, Marina; Cardete, Laura; Bernardo, Miquel; Lombraña, María; Blanch, Jordi; Catalán, Rosa; Vázquez, Mireia; Soler, Victòria; Ortuño, Noélia; Martínez-Arán, Anabel

    2017-09-08

    Psychomotor agitation (PMA) is a state of motor restlessness and mental tension that requires prompt recognition, appropriate assessment and management to minimize anxiety for the patient and reduce the risk for escalation to aggression and violence. Standardized and applicable protocols and algorithms can assist healthcare providers to identify patients at risk of PMA, achieve timely diagnosis and implement minimally invasive management strategies to ensure patient and staff safety and resolution of the episode. Spanish experts in PMA from different disciplines (psychiatrists, psychologists and nurses) convened in Barcelona for a meeting in April 2016. Based on recently issued international consensus guidelines on the standard of care for psychiatric patients with PMA, the meeting provided the opportunity to address the complexities in the assessment and management of PMA from different perspectives. The attendees worked towards producing a consensus for a unified approach to PMA according to the local standards of care and current local legislations. The draft protocol developed was reviewed and ratified by all members of the panel prior to its presentation to the Catalan Society of Psychiatry and Mental Health, the Spanish Society of Biological Psychiatry (SEPB) and the Spanish Network Centre for Research in Mental Health (CIBERSAM) for input. The final protocol and algorithms were then submitted to these organizations for endorsement. The protocol presented here provides guidance on the appropriate selection and use of pharmacological agents (inhaled/oral/IM), seclusion, and physical restraint for psychiatric patients suspected of or presenting with PMA. The protocol is applicable within the Spanish healthcare system. Implementation of the protocol and the constituent algorithms described here should ensure the best standard of care of patients at risk of PMA. Episodes of PMA could be identified earlier in their clinical course and patients could be managed in

  20. Evaluation of analytical performance of a new high-sensitivity immunoassay for cardiac troponin I.

    Science.gov (United States)

    Masotti, Silvia; Prontera, Concetta; Musetti, Veronica; Storti, Simona; Ndreu, Rudina; Zucchelli, Gian Carlo; Passino, Claudio; Clerico, Aldo

    2018-02-23

    The study aim was to evaluate the analytical performance of the new chemiluminescent immunoassay for cardiac troponin I (cTnI), called Access hs-TnI, on the DxI platform, and to compare it with that of the Access AccuTnI+3 method and of the high-sensitivity (hs) cTnI method for the ARCHITECT platform. The limits of blank (LoB), detection (LoD) and quantitation (LoQ) at 10% and 20% CV were evaluated according to international standardized protocols. For the evaluation of analytical performance and the comparison of cTnI results, both heparinized plasma samples, collected from healthy subjects and patients with cardiac diseases, and quality control samples distributed in external quality assessment programs were used. The LoB, LoD and LoQ at 20% and 10% CV of the Access hs-cTnI method were 0.6, 1.3, 2.1 and 5.3 ng/L, respectively. The Access hs-cTnI method showed analytical performance significantly better than that of the Access AccuTnI+3 method and similar to that of the hs ARCHITECT cTnI method. Moreover, the cTnI concentrations measured with the Access hs-cTnI method showed close linear regressions with both the Access AccuTnI+3 and ARCHITECT hs-cTnI methods, although there were systematic differences between these methods. There was no difference between cTnI values measured by Access hs-cTnI in heparinized plasma and serum samples, whereas there was a significant difference between cTnI values measured in EDTA and heparin plasma samples, respectively. Access hs-cTnI has analytical sensitivity parameters significantly improved compared with the Access AccuTnI+3 method and similar to those of the high-sensitivity method on the ARCHITECT platform.
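
    The LoB/LoD figures quoted above are typically derived with a parametric calculation of the kind standardized in CLSI EP17. The snippet below shows that style of calculation on simulated blank and low-concentration replicates; the protocol actually followed in the study may differ in detail.

        # Hedged sketch of a common parametric LoB/LoD calculation (EP17-style);
        # replicate values are simulated, not taken from the study.
        import numpy as np

        rng = np.random.default_rng(7)
        blank_ng_l = rng.normal(0.3, 0.25, size=60).clip(min=0)   # blank replicates
        low_sample_ng_l = rng.normal(2.0, 0.5, size=60)           # low-level replicates

        lob = blank_ng_l.mean() + 1.645 * blank_ng_l.std(ddof=1)
        lod = lob + 1.645 * low_sample_ng_l.std(ddof=1)
        print(f"LoB = {lob:.2f} ng/L, LoD = {lod:.2f} ng/L")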

  1. Chasing the effects of Pre-analytical Confounders - a Multicentre Study on CSF-AD biomarkers

    Directory of Open Access Journals (Sweden)

    Maria Joao Leitao

    2015-07-01

    Full Text Available Core cerebrospinal fluid (CSF) biomarkers (Aβ42, Tau and pTau) have recently been incorporated in the revised criteria for Alzheimer's disease (AD). However, their widespread clinical application lacks standardization. Pre-analytical sample handling and storage play an important role in the reliable measurement of these biomarkers across laboratories. In this study, we aim to surpass the efforts of previous studies by employing a multicentre approach to assess the impact of less-studied CSF pre-analytical confounders on AD-biomarker quantification. Four centres participated in this study and followed the same established protocol. CSF samples were analysed for three biomarkers (Aβ42, Tau and pTau) and tested for different spinning conditions (temperature: room temperature (RT) vs. 4°C; speed: 500 g vs. 2000 g vs. 3000 g), storage volume variations (25%, 50% and 75% of the tube's total volume) as well as freeze-thaw cycles (up to 5 cycles). The influence of routine sample parameters, inter-centre variability and the relative value of each biomarker (reported as normal/abnormal) was analysed. Centrifugation conditions did not influence biomarker levels, except for samples with a high CSF total protein content, where either no centrifugation or centrifugation at RT, compared with 4°C, led to higher Aβ42 levels. Reducing the CSF storage volume from 75% to 50% of the total tube capacity decreased the Aβ42 concentration (within the analytical CV of the assay), whereas no change in Tau or pTau was observed. Moreover, the concentrations of Tau and pTau appear to be stable for up to 5 freeze-thaw cycles, whereas Aβ42 levels decrease if CSF is freeze-thawed more than 3 times. This systematic study reinforces the need for CSF centrifugation at 4°C prior to storage and highlights the influence of storage conditions on Aβ42 levels. This study contributes to the establishment of harmonized standard operating procedures that will help reduce the inter-laboratory variability of CSF-AD biomarkers.

  2. Application of low-dose radiation protocols in survey CT scans

    International Nuclear Information System (INIS)

    Fu Qiang; Liu Ting; Lu Tao; Xu Ke; Zhang Lin

    2009-01-01

    Objective: To characterize low-dose radiation protocols for survey CT scans used for localization. Methods: Eighty standard adult patients and head and body phantoms were recruited. The default protocols provided by the operator's manual set the tube voltage for head, chest, abdomen and lumbar scans at 120 kV, with tube currents of 20, 10, 20 and 40 mA, respectively. The kV and mA values in the low-dose experiments were optimized according to the device options. For the chest and abdomen, tube positions of 0 degrees (default) and 180 degrees were compared. Phantoms were scanned with the above protocols, and the radiation doses were measured. Paired t-tests were used to compare the standard deviation of CT values, noise and exposure surface dose (ESD) between the default-protocol and optimized-protocol groups. Results: The optimized protocols for low-dose CT survey scans were 80 kV, 10 mA for the head; 80 kV, 10 mA for the chest; 80 kV, 10 mA for the abdomen; and 100 kV, 10 mA for the lumbar spine. The ESD values for the phantom scans with the default and optimized protocols were 0.38 mGy/0.16 mGy for the head, 0.30 mGy/0.20 mGy for the chest, 0.74 mGy/0.30 mGy for the abdomen and 0.81 mGy/0.44 mGy for the lumbar spine, respectively. Compared with the default protocols, the optimized protocols reduced the radiation dose by 59%, 33%, 59% and 46% for the head, chest, abdomen and lumbar spine. When the tube position was changed from 0 degrees to 180 degrees, the ESD was 0.24 mGy/0.20 mGy for the chest and 0.37 mGy/0.30 mGy for the abdomen, and the radiation doses were reduced by 20% and 17%. Conclusion: A certain amount of image noise is added by the low-dose protocols, but image quality remains acceptable for CT localization. The reduction in radiation dose and in radiation harm to patients is the main advantage. (authors)

  3. Field validation of protocols developed to evaluate in-line mastitis detection systems.

    Science.gov (United States)

    Kamphuis, C; Dela Rue, B T; Eastwood, C R

    2016-02-01

    This paper reports on a field validation of previously developed protocols for evaluating the performance of in-line mastitis-detection systems. The protocols outlined 2 requirements of these systems: (1) to detect cows with clinical mastitis (CM) promptly and accurately to enable timely and appropriate treatment and (2) to identify cows with high somatic cell count (SCC) to manage bulk milk SCC levels. Gold standard measures, evaluation tests, performance measures, and performance targets were proposed. The current study validated the protocols on commercial dairy farms with automated in-line mastitis-detection systems using both electrical conductivity (EC) and SCC sensor systems that both monitor at whole-udder level. The protocol for requirement 1 was applied on 3 commercial farms. For requirement 2, the protocol was applied on 6 farms; 3 of them had low bulk milk SCC (128×10³ cells/mL) and were the same farms as used for field evaluation of requirement 1. Three farms with high bulk milk SCC (270×10³ cells/mL) were additionally enrolled. The field evaluation methodology and results were presented at a workshop including representation from 7 international suppliers of in-line mastitis-detection systems. Feedback was sought on the acceptance of standardized performance evaluation protocols and recommended refinements to the protocols. Although the methodology for requirement 1 was relatively labor intensive and required organizational skills over an extended period, no major issues were encountered during the field validation of both protocols. The validation, thus, proved the protocols to be practical. Also, no changes to the data collection process were recommended by the technology supplier representatives. However, 4 recommendations were made to refine the protocols: inclusion of an additional analysis that ignores small (low-density) clot observations in the definition of CM, extension of the time window from 4 to 5 milkings for timely alerts for CM
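
    For requirement 1, accuracy against a gold standard is usually reported as sensitivity and specificity. The sketch below computes both from hypothetical alert/gold-standard pairs; it is not the evaluation test defined in the protocols themselves.

        # Illustration only: sensitivity/specificity of alerts vs. a gold standard,
        # using made-up alert/observation pairs.
        def sensitivity_specificity(alerts, gold):
            tp = sum(a and g for a, g in zip(alerts, gold))
            tn = sum(not a and not g for a, g in zip(alerts, gold))
            fp = sum(a and not g for a, g in zip(alerts, gold))
            fn = sum(not a and g for a, g in zip(alerts, gold))
            return tp / (tp + fn), tn / (tn + fp)

        alerts = [1, 0, 1, 0, 0, 1, 0, 0, 1, 0, 0, 0]   # hypothetical system alerts
        gold   = [1, 0, 1, 0, 0, 0, 0, 1, 1, 0, 0, 0]   # hypothetical confirmed CM cases
        se, sp = sensitivity_specificity(alerts, gold)
        print(f"sensitivity {se:.2f}, specificity {sp:.2f}")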

  4. New Acquisition Protocol of 18F-Choline PET/CT in Prostate Cancer Patients: Review of the Literature about Methodology and Proposal of Standardization

    Directory of Open Access Journals (Sweden)

    Sotirios Chondrogiannis

    2014-01-01

    Full Text Available Purpose. (1) To evaluate a new acquisition protocol of 18F-choline (FCH) PET/CT for prostate cancer (PC) patients, (2) to review acquisition 18F-choline PET/CT methodology, and (3) to propose a standardized acquisition protocol for FCH PET/CT in PC patients. Materials. 100 consecutive PC patients (mean age 70.5 years, mean PSA 21.35 ng/mL) were prospectively evaluated. The new protocol consisted of an early scan of the pelvis immediately after injection of the tracer (1 bed position of 4 min) followed by a whole-body scan at 1 hour. Early and 1-hour images were compared for interfering activity and pathologic findings. Results. The overall detection rate of FCH PET/CT was 64%. The early static images of the pelvis showed an absence of radioactive urine in the ureters, bladder, or urethra, which allowed a clean evaluation of the prostatic fossae. Uptake in the prostatic region was better visualized in the early phase in 26% (7/30) of cases. Other pelvic pathologic findings (bone and lymph nodes) were visualized in both early and late images. Conclusion. Early 18F-choline images improve visualization of abnormal uptake in the prostate fossae. All pathologic pelvic deposits (prostate, lymph nodes, and bone) were visualized in both early and late images.

  5. Remote Memory Access Protocol Target Node Intellectual Property

    Science.gov (United States)

    Haddad, Omar

    2013-01-01

    The MagnetoSpheric Multiscale (MMS) mission had a requirement to use the Remote Memory Access Protocol (RMAP) over its SpaceWire network. At the time, no known intellectual property (IP) cores were available for purchase. Additionally, MMS preferred to implement the RMAP functionality with control over the low-level details of the design. For example, not all the RMAP standard functionality was needed, and it was desired to implement only the portions of the RMAP protocol that were needed. RMAP functionality had been previously implemented in commercial off-the-shelf (COTS) products, but the IP core was not available for purchase. The RMAP Target IP core is a VHDL (VHSIC Hardware Description Language) description of a digital logic design suitable for implementation in an FPGA (field-programmable gate array) or ASIC (application-specific integrated circuit) that parses SpaceWire packets that conform to the RMAP standard. The RMAP packet protocol allows a network host to access and control a target device using address mapping. This capability allows SpaceWire devices to be managed in a standardized way that simplifies the hardware design of the device, as well as the development of the software that controls the device. The RMAP Target IP core has some features that are unique and not specified in the RMAP standard. One such feature is the ability to automatically abort transactions if the back-end logic does not respond to read/write requests within a predefined time. When a request times out, the RMAP Target IP core automatically retracts the request and returns a command response with an appropriate status in the response packet's header. Another such feature is the ability to control the SpaceWire node or router using RMAP transactions in the extended address range. This allows the SpaceWire network host to manage the SpaceWire network elements using RMAP packets, which reduces the number of protocols that the network host needs to support.

  6. Estimation of the radiation exposure of a chest pain protocol with ECG-gating in dual-source computed tomography

    International Nuclear Information System (INIS)

    Ketelsen, Dominik; Luetkhoff, Marie H.; Thomas, Christoph; Werner, Matthias; Tsiflikas, Ilias; Reimann, Anja; Kopp, Andreas F.; Claussen, Claus D.; Heuschmid, Martin; Buchgeister, Markus; Burgstahler, Christof

    2009-01-01

    The aim of the study was to evaluate the radiation exposure of a chest pain protocol with ECG-gated dual-source computed tomography (DSCT). An Alderson Rando phantom equipped with thermoluminescent dosimeters was used for dose measurements. Exposure was performed on a dual-source computed tomography system with a standard protocol for chest pain evaluation (120 kV, 320 mAs/rot) with different simulated heart rates (HRs). The dose of a standard chest CT examination (120 kV, 160 mAs) was also measured. The effective dose of the chest pain protocol was 19.3/21.9 mSv (male/female, HR 60), 17.9/20.4 mSv (male/female, HR 80) and 14.7/16.7 mSv (male/female, HR 100). The effective dose of a standard chest examination was 6.3 mSv (males) and 7.2 mSv (females). The radiation dose of the chest pain protocol increases significantly with a lower heart rate for both males (p = 0.040) and females (p = 0.044). The average radiation dose of a standard chest CT examination is about 36.5% that of a CT examination performed for chest pain. Using DSCT, the evaluated chest pain protocol revealed a higher radiation exposure compared with standard chest CT. Furthermore, HRs markedly influenced the dose exposure when using the ECG-gated chest pain protocol. (orig.)
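
    As a cross-check, the 36.5% figure can be reproduced from the effective doses listed above. The averaging below (a plain mean over sexes and heart rates) is an assumption, since the abstract does not state exactly how the average was formed.

```python
# Effective doses (mSv) from the abstract: chest pain protocol at HR 60/80/100
# (male/female) and a standard chest CT (male/female).
chest_pain = [19.3, 21.9, 17.9, 20.4, 14.7, 16.7]
standard_chest = [6.3, 7.2]

ratio = (sum(standard_chest) / len(standard_chest)) / (sum(chest_pain) / len(chest_pain))
print(f"standard chest CT is about {100 * ratio:.1f}% of the chest pain protocol dose")
# Prints roughly 36.5%, matching the value quoted in the abstract.
```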

  7. Evaluation of the quality of results obtained in institutions participating in interlaboratory experiments and of the reliability characteristics of the analytical methods used on the basis of certification of standard soil samples

    Energy Technology Data Exchange (ETDEWEB)

    Parshin, A.K.; Obol' yaninova, V.G.; Sul' dina, N.P.

    1986-08-20

    Rapid monitoring of the level of pollution of the environment and, especially, of soils necessitates preparation of standard samples (SS) close in properties and material composition to the objects to be analyzed. During 1978-1982 four sets (three types of samples in each) of State Standard Samples of different soils were developed: soddy-podzolic sandy-loamy, typical chernozem, krasnozem, and calcareous sierozem. The certification studies of the SS of the soils were carried out in accordance with the classical scheme of interlab experiment (ILE). More than 100 institutions were involved in the ILE and the total number of independent analytical results was of the order of 10⁴. With such a volume of analytical information at their disposal they were able to find some general characteristics intrinsic to certification studies, to assess the quality of work of the ILE participants with due regard for their specialization, and the reliability characteristics of the analytical methods used.

  8. The Internet of Things Key Applications and Protocols

    CERN Document Server

    Hersent, Olivier; Elloumi, Omar

    2011-01-01

    An all-in-one reference to the major Home Area Networking, Building Automation and AMI protocols, including 802.15.4 over radio or PLC, 6LowPAN/RPL, ZigBee 1.0 and Smart Energy 2.0, Zwave, LON, BACNet, KNX, ModBus, mBus, C.12 and DLMS/COSEM, and the new ETSI M2M system level standard. In-depth coverage of Smart-grid and EV charging use cases. This book describes the Home Area Networking, Building Automation and AMI protocols and their evolution towards open protocols based on IP such as 6LowPAN and ETSI M2M. The authors discuss the approach taken by service providers to interconnect the protoc

  9. Analytic continuation of quantum Monte Carlo data by stochastic analytical inference.

    Science.gov (United States)

    Fuchs, Sebastian; Pruschke, Thomas; Jarrell, Mark

    2010-05-01

    We present an algorithm for the analytic continuation of imaginary-time quantum Monte Carlo data which is strictly based on principles of Bayesian statistical inference. Within this framework we are able to obtain an explicit expression for the calculation of a weighted average over possible energy spectra, which can be evaluated by standard Monte Carlo simulations, yielding as a by-product the distribution function of the regularization parameter. Our algorithm thus avoids the usual ad hoc assumptions introduced in similar algorithms to fix the regularization parameter. We apply the algorithm to imaginary-time quantum Monte Carlo data and compare the resulting energy spectra with those from a standard maximum-entropy calculation.

  10. [Studies on the identification of psychotropic substances. VIII. Preparation and various analytical data of reference standard of some stimulants, amfepramone, cathinone, N-ethylamphetamine, fenethylline, fenproporex and mefenorex].

    Science.gov (United States)

    Shimamine, M; Takahashi, K; Nakahara, Y

    1992-01-01

    The Reference Standards for amfepramone, cathinone, N-ethylamphetamine, fenethylline, fenproporex and mefenorex were prepared. Their purities determined by HPLC were more than 99.5%. For the identification and determination of these six drugs, their analytical data were measured and discussed by TLC, UV, IR, HPLC, GC/MS and NMR.

  11. Teachable, high-content analytics for live-cell, phase contrast movies.

    Science.gov (United States)

    Alworth, Samuel V; Watanabe, Hirotada; Lee, James S J

    2010-09-01

    CL-Quant is a new solution platform for broad, high-content, live-cell image analysis. Powered by novel machine learning technologies and teach-by-example interfaces, CL-Quant provides a platform for the rapid development and application of scalable, high-performance, and fully automated analytics for a broad range of live-cell microscopy imaging applications, including label-free phase contrast imaging. The authors used CL-Quant to teach off-the-shelf universal analytics, called standard recipes, for cell proliferation, wound healing, cell counting, and cell motility assays using phase contrast movies collected on the BioStation CT and BioStation IM platforms. Similar to application modules, standard recipes are intended to work robustly across a wide range of imaging conditions without requiring customization by the end user. The authors validated the performance of the standard recipes by comparing their performance with truth created manually, or by custom analytics optimized for each individual movie (and therefore yielding the best possible result for the image), and validated by independent review. The validation data show that the standard recipes' performance is comparable with the validated truth with low variation. The data validate that the CL-Quant standard recipes can provide robust results without customization for live-cell assays in broad cell types and laboratory settings.

  12. Non operative management of blunt splenic trauma: a prospective evaluation of a standardized treatment protocol.

    Science.gov (United States)

    Brillantino, A; Iacobellis, F; Robustelli, U; Villamaina, E; Maglione, F; Colletti, O; De Palma, M; Paladino, F; Noschese, G

    2016-10-01

    The advantages of the conservative approach for major spleen injuries are still debated. This study was designed to evaluate the safety and effectiveness of NOM in the treatment of minor (grade I-II according to the American Association for the Surgery of Trauma; AAST) and severe (AAST grade III-V) blunt splenic trauma, following a standardized treatment protocol. All the hemodynamically stable patients with computed tomography (CT) diagnosis of blunt splenic trauma underwent NOM, which included strict clinical and laboratory observation, 48-72 h contrast-enhanced ultrasonography (CEUS) follow-up and splenic angioembolization, performed both in patients with admission CT evidence of vascular injuries and in patients with falling hematocrit during observation. 87 patients [32 (36.7 %) women and 55 (63.2 %) men, median age 34 (range 14-68)] were included. Of these, 28 patients (32.1 %) had grade I, 22 patients (25.2 %) grade II, 20 patients (22.9 %) grade III, 11 patients (12.6 %) grade IV and 6 patients (6.8 %) grade V injuries. The overall success rate of NOM was 95.4 % (82/87). There was no significant difference in the success rate between the patients with different splenic injury grades. Of the 24 patients who had undergone angioembolization, 22 (91.6 %) showed a high splenic injury grade. The success rate of embolization was 91.6 % (22/24). No major complications were observed. The minor complications (2 pleural effusions, 1 pancreatic fistula and 2 splenic abscesses) were successfully treated by EAUS or CT guided drainage. The non operative management of blunt splenic trauma, according to our protocol, represents a safe and effective treatment for both minor and severe injuries, achieving an overall success rate of 95 %. The angiographic study could be indicated both in patients with CT evidence of vascular injuries and in patients with high-grade splenic injuries, regardless of CT findings.

  13. Comparison of mRNA splicing assay protocols across multiple laboratories: recommendations for best practice in standardized clinical testing.

    Science.gov (United States)

    Whiley, Phillip J; de la Hoya, Miguel; Thomassen, Mads; Becker, Alexandra; Brandão, Rita; Pedersen, Inge Sokilde; Montagna, Marco; Menéndez, Mireia; Quiles, Francisco; Gutiérrez-Enríquez, Sara; De Leeneer, Kim; Tenés, Anna; Montalban, Gemma; Tserpelis, Demis; Yoshimatsu, Toshio; Tirapo, Carole; Raponi, Michela; Caldes, Trinidad; Blanco, Ana; Santamariña, Marta; Guidugli, Lucia; de Garibay, Gorka Ruiz; Wong, Ming; Tancredi, Mariella; Fachal, Laura; Ding, Yuan Chun; Kruse, Torben; Lattimore, Vanessa; Kwong, Ava; Chan, Tsun Leung; Colombo, Mara; De Vecchi, Giovanni; Caligo, Maria; Baralle, Diana; Lázaro, Conxi; Couch, Fergus; Radice, Paolo; Southey, Melissa C; Neuhausen, Susan; Houdayer, Claude; Fackenthal, Jim; Hansen, Thomas Van Overeem; Vega, Ana; Diez, Orland; Blok, Rien; Claes, Kathleen; Wappenschmidt, Barbara; Walker, Logan; Spurdle, Amanda B; Brown, Melissa A

    2014-02-01

    Accurate evaluation of unclassified sequence variants in cancer predisposition genes is essential for clinical management and depends on a multifactorial analysis of clinical, genetic, pathologic, and bioinformatic variables and assays of transcript length and abundance. The integrity of assay data in turn relies on appropriate assay design, interpretation, and reporting. We conducted a multicenter investigation to compare mRNA splicing assay protocols used by members of the ENIGMA (Evidence-Based Network for the Interpretation of Germline Mutant Alleles) consortium. We compared similarities and differences in results derived from analysis of a panel of breast cancer 1, early onset (BRCA1) and breast cancer 2, early onset (BRCA2) gene variants known to alter splicing (BRCA1: c.135-1G>T, c.591C>T, c.594-2A>C, c.671-2A>G, and c.5467+5G>C and BRCA2: c.426-12_8delGTTTT, c.7988A>T, c.8632+1G>A, and c.9501+3A>T). Differences in protocols were then assessed to determine which elements were critical in reliable assay design. PCR primer design strategies, PCR conditions, and product detection methods, combined with a prior knowledge of expected alternative transcripts, were the key factors for accurate splicing assay results. For example, because of the position of primers and PCR extension times, several isoforms associated with BRCA1, c.594-2A>C and c.671-2A>G, were not detected by many sites. Variation was most evident for the detection of low-abundance transcripts (e.g., BRCA2 c.8632+1G>A Δ19,20 and BRCA1 c.135-1G>T Δ5q and Δ3). Detection of low-abundance transcripts was sometimes addressed by using more analytically sensitive detection methods (e.g., BRCA2 c.426-12_8delGTTTT ins18bp). We provide recommendations for best practice and raise key issues to consider when designing mRNA assays for evaluation of unclassified sequence variants.

  14. Analytical chemistry of nuclear materials

    International Nuclear Information System (INIS)

    1963-01-01

    The last two decades have witnessed an enormous development in chemical analysis. The rapid progress of nuclear energy, of solid-state physics and of other fields of modern industry has extended the concept of purity to limits previously unthought of, and to reach the new dimensions of these extreme demands, entirely new techniques have been invented and applied and old ones have been refined. Recognizing these facts, the International Atomic Energy Agency convened a Panel on Analytical Chemistry of Nuclear Materials to discuss the general problems facing the analytical chemist engaged in nuclear energy development, particularly in newly developing centres and countries, to analyse the present situation and to advise as to the directions in which research and development appear to be most necessary. The Panel also discussed the analytical programme of the Agency's laboratory at Seibersdorf, where the Agency has already started a programme of international comparison of analytical methods which may lead to the establishment of international standards for many materials of interest. Refs and tabs

  15. Analytical Chemistry Laboratory progress report for FY 1985

    Energy Technology Data Exchange (ETDEWEB)

    Green, D.W.; Heinrich, R.R.; Jensen, K.J.

    1985-12-01

    The Analytical Chemistry Laboratory is a full-cost-recovery service center, with the primary mission of providing a broad range of technical support services to the scientific and engineering programs at ANL. In addition, ACL conducts a research program in analytical chemistry, works on instrumental and methods development, and provides analytical services for governmental, educational, and industrial organizations. The ACL handles a wide range of analytical problems, from routine standard analyses to unique problems that require significant development of methods and techniques. The purpose of this report is to summarize the technical and administrative activities of the Analytical Chemistry Laboratory (ACL) at Argonne National Laboratory (ANL) for Fiscal Year 1985 (October 1984 through September 1985). This is the second annual report for the ACL. 4 figs., 1 tab.

  16. Analytical Chemistry Laboratory progress report for FY 1985

    International Nuclear Information System (INIS)

    Green, D.W.; Heinrich, R.R.; Jensen, K.J.

    1985-12-01

    The Analytical Chemistry Laboratory is a full-cost-recovery service center, with the primary mission of providing a broad range of technical support services to the scientific and engineering programs at ANL. In addition, ACL conducts a research program in analytical chemistry, works on instrumental and methods development, and provides analytical services for governmental, educational, and industrial organizations. The ACL handles a wide range of analytical problems, from routine standard analyses to unique problems that require significant development of methods and techniques. The purpose of this report is to summarize the technical and administrative activities of the Analytical Chemistry Laboratory (ACL) at Argonne National Laboratory (ANL) for Fiscal Year 1985 (October 1984 through September 1985). This is the second annual report for the ACL. 4 figs., 1 tab

  17. Energy-Aware RFID Anti-Collision Protocol.

    Science.gov (United States)

    Arjona, Laura; Simon, Hugo Landaluce; Ruiz, Asier Perallos

    2018-06-11

    The growing interest in mobile devices is transforming wireless identification technologies. Mobile and battery-powered Radio Frequency Identification (RFID) readers, such as hand readers and smart phones, are becoming increasingly attractive. These RFID readers require energy-efficient anti-collision protocols to minimize the tag collisions and to expand the reader's battery life. Furthermore, there is an increasing interest in RFID sensor networks with a growing number of RFID sensor tags. Thus, RFID application developers must be mindful of tag anti-collision protocols. Energy-efficient protocols involve a low reader energy consumption per tag. This work presents a thorough study of the reader energy consumption per tag and analyzes the main factor that affects this metric: the frame size update strategy. Using the conclusions of this analysis, the anti-collision protocol Energy-Aware Slotted Aloha (EASA) is presented to decrease the energy consumption per tag. The frame size update strategy of EASA is configured to minimize the energy consumption per tag. As a result, EASA presents an energy-aware frame. The performance of the proposed protocol is evaluated and compared with several state-of-the-art Aloha-based anti-collision protocols based on the current RFID standard. Simulation results show that EASA, with an average of 15 mJ consumed per tag identified, achieves a 6% average improvement in the energy consumption per tag in relation to the strategies of the comparison.
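
    The abstract attributes the savings to the frame size update strategy of a dynamic frame slotted Aloha scheme. The sketch below is illustrative only: it uses the classical Schoute backlog estimate as a stand-in for EASA's energy-aware update rule, and the per-slot energy figures are invented for the example.

```python
import random

def run_dfsa(num_tags, slot_energy_mJ=(0.4, 1.0, 1.2), seed=0):
    """Toy dynamic frame slotted Aloha reader.

    slot_energy_mJ: assumed reader energy spent on an idle, successful and
    collision slot, respectively (illustrative values, not from the paper).
    Returns total reader energy divided by the number of identified tags.
    """
    rng = random.Random(seed)
    e_idle, e_success, e_collision = slot_energy_mJ
    remaining, energy, frame = num_tags, 0.0, 16
    while remaining > 0:
        slots = [0] * frame
        for _ in range(remaining):
            slots[rng.randrange(frame)] += 1
        idle = sum(1 for s in slots if s == 0)
        success = sum(1 for s in slots if s == 1)
        collision = frame - idle - success
        energy += idle * e_idle + success * e_success + collision * e_collision
        remaining -= success
        # Frame size update: Schoute's backlog estimate (about 2.39 tags per
        # collision slot), used here as a stand-in for EASA's update rule.
        if remaining:
            frame = max(1, round(2.39 * collision))
    return energy / num_tags

print(f"energy per identified tag ~ {run_dfsa(200):.2f} mJ")
```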

  18. Using the ACR/NEMA standard with TCP/IP and Ethernet

    Science.gov (United States)

    Chimiak, William J.; Williams, Rodney C.

    1991-07-01

    There is a need for a consolidated picture archival and communications system (PACS) in hospitals. At the Bowman Gray School of Medicine of Wake Forest University (BGSM), the authors are enhancing the ACR/NEMA Version 2 protocol using UNIX sockets and TCP/IP to greatly improve connectivity. Initially, nuclear medicine studies using gamma cameras are to be sent to PACS. The ACR/NEMA Version 2 protocol provides the functionality of the upper three layers of the open system interconnection (OSI) model in this implementation. The images, imaging equipment information, and patient information are then sent in ACR/NEMA format to a software socket. From there it is handed to the TCP/IP protocol, which provides the transport and network service. TCP/IP, in turn, uses the services of IEEE 802.3 (Ethernet) to complete the connectivity. The advantage of this implementation is threefold: (1) Only one I/O port is consumed by numerous nuclear medicine cameras, instead of a physical port for each camera. (2) Standard protocols are used which maximize interoperability with ACR/NEMA compliant PACSs. (3) The use of sockets allows a migration path to the transport and networking services of OSI's TP4 and connectionless network service as well as the high-performance protocol being considered by the American National Standards Institute (ANSI) and the International Standards Organization (ISO) -- the Xpress Transfer Protocol (XTP). The use of sockets also gives access to ANSI's Fiber Distributed Data Interface (FDDI) as well as other high-speed network standards.
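
    A minimal sketch of the socket-based transport idea follows; it is not the BGSM implementation. The 4-byte length prefix used for framing and the host/port values are assumptions for illustration.

```python
import socket
import struct

def send_acr_nema_message(host: str, port: int, payload: bytes) -> None:
    """Send one ACR/NEMA-formatted message over a TCP socket.

    The 4-byte big-endian length prefix used for framing is an assumption of
    this sketch; the original implementation's framing is not described in the
    abstract. `payload` would hold the ACR/NEMA Version 2 encoded image,
    equipment and patient information.
    """
    with socket.create_connection((host, port)) as sock:
        sock.sendall(struct.pack(">I", len(payload)) + payload)

# Hypothetical usage: one receiving port serves many acquisition devices.
# send_acr_nema_message("pacs.example.org", 104, acr_nema_bytes)
```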

  19. Performance Evaluation of Security Protocols

    DEFF Research Database (Denmark)

    Bodei, Chiara; Buchholtz, Mikael; Curti, Michele

    2005-01-01

    We use a special operational semantics which drives us in inferring quantitative measures on systems describing cryptographic protocols. We assign rates to transitions by only looking at these labels. The rates reflect the distributed architecture running the applications and the use of possibly different cryptosystems. We then map transition systems to Markov chains and evaluate the performance of systems, using standard tools.

  20. Copper and tin isotopic analysis of ancient bronzes for archaeological investigation: development and validation of a suitable analytical methodology.

    Science.gov (United States)

    Balliana, Eleonora; Aramendía, Maite; Resano, Martin; Barbante, Carlo; Vanhaecke, Frank

    2013-03-01

    Although in many cases Pb isotopic analysis can be relied on for provenance determination of ancient bronzes, sometimes the use of "non-traditional" isotopic systems, such as those of Cu and Sn, is required. The work reported on in this paper aimed at revising the methodology for Cu and Sn isotope ratio measurements in archaeological bronzes via optimization of the analytical procedures in terms of sample pre-treatment, measurement protocol, precision, and analytical uncertainty. For Cu isotopic analysis, both Zn and Ni were investigated for their merit as internal standard (IS) relied on for mass bias correction. The use of Ni as IS seems to be the most robust approach as Ni is less prone to contamination, has a lower abundance in bronzes and an ionization potential similar to that of Cu, and provides slightly better reproducibility values when applied to NIST SRM 976 Cu isotopic reference material. The possibility of carrying out direct isotopic analysis without prior Cu isolation (with AG-MP-1 anion exchange resin) was investigated by analysis of CRM IARM 91D bronze reference material, synthetic solutions, and archaeological bronzes. Both procedures (Cu isolation/no Cu isolation) provide similar δ⁶⁵Cu results with similar uncertainty budgets in all cases (±0.02-0.04 per mil in delta units, k = 2, n = 4). Direct isotopic analysis of Cu therefore seems feasible, without evidence of spectral interference or matrix-induced effect on the extent of mass bias. For Sn, a separation protocol relying on TRU-Spec anion exchange resin was optimized, providing a recovery close to 100 % without on-column fractionation. Cu was recovered quantitatively together with the bronze matrix with this isolation protocol. Isotopic analysis of this Cu fraction provides δ⁶⁵Cu results similar to those obtained upon isolation using AG-MP-1 resin. This means that Cu and Sn isotopic analysis of bronze alloys can therefore be carried out after a single chromatographic

  1. Assessing the service quality of Iran military hospitals: Joint Commission International standards and Analytic Hierarchy Process (AHP) technique

    Science.gov (United States)

    Bahadori, Mohammadkarim; Ravangard, Ramin; Yaghoubi, Maryam; Alimohammadzadeh, Khalil

    2014-01-01

    Background: Military hospitals are responsible for preserving, restoring and improving the health of not only armed forces, but also other people. According to the military organizations' strategy, which is to be a leader and pioneer in all areas, providing quality health services is one of the main goals of the military health care organizations. This study aimed to evaluate the service quality of selected military hospitals in Iran based on the Joint Commission International (JCI) standards, comparing these hospitals with each other and ranking them using the analytic hierarchy process (AHP) technique in 2013. Materials and Methods: This was a cross-sectional and descriptive study conducted on five military hospitals, selected using the purposive sampling method, in 2013. Required data were collected using checklists of accreditation standards and the nominal group technique. The AHP technique was used for prioritizing. Furthermore, Expert Choice 11.0 was used to analyze the collected data. Results: Among JCI standards, the standards of access to care and continuity of care (weight = 0.122), quality improvement and patient safety (weight = 0.121) and leadership and management (weight = 0.117) had the greatest importance, respectively. Furthermore, in the overall ranking, BGT (weight = 0.369), IHM (0.238), SAU (0.202), IHK (weight = 0.125) and SAB (weight = 0.066) ranked first to fifth, respectively. Conclusion: AHP is an appropriate technique for measuring the overall performance of hospitals and their quality of services. It is a holistic approach that takes all hospital processes into consideration. The results of the present study can be used to improve hospitals' performance through identifying areas which are in need of focus for quality improvement and selecting strategies to improve service quality. PMID:25250364
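
    The weights reported above are the kind of priority vectors AHP derives from pairwise comparison matrices. The sketch below shows the standard eigenvector calculation on an illustrative 3x3 reciprocal matrix; it does not use the study's actual judgments.

```python
import numpy as np

def ahp_priorities(pairwise: np.ndarray) -> np.ndarray:
    """Return AHP priority weights as the normalized principal eigenvector."""
    eigenvalues, eigenvectors = np.linalg.eig(pairwise)
    principal = np.real(eigenvectors[:, np.argmax(np.real(eigenvalues))])
    return principal / principal.sum()

# Illustrative reciprocal matrix comparing three JCI standard groups
# (not the actual judgments collected in the study).
A = np.array([
    [1.0, 2.0, 3.0],
    [1/2, 1.0, 2.0],
    [1/3, 1/2, 1.0],
])
print(ahp_priorities(A).round(3))  # roughly [0.539 0.297 0.164]
```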

  2. Assessing the service quality of Iran military hospitals: Joint Commission International standards and Analytic Hierarchy Process (AHP) technique.

    Science.gov (United States)

    Bahadori, Mohammadkarim; Ravangard, Ramin; Yaghoubi, Maryam; Alimohammadzadeh, Khalil

    2014-01-01

    Military hospitals are responsible for preserving, restoring and improving the health of not only armed forces, but also other people. According to the military organizations' strategy, which is to be a leader and pioneer in all areas, providing quality health services is one of the main goals of the military health care organizations. This study aimed to evaluate the service quality of selected military hospitals in Iran based on the Joint Commission International (JCI) standards, comparing these hospitals with each other and ranking them using the analytic hierarchy process (AHP) technique in 2013. This was a cross-sectional and descriptive study conducted on five military hospitals, selected using the purposive sampling method, in 2013. Required data were collected using checklists of accreditation standards and the nominal group technique. The AHP technique was used for prioritizing. Furthermore, Expert Choice 11.0 was used to analyze the collected data. Among JCI standards, the standards of access to care and continuity of care (weight = 0.122), quality improvement and patient safety (weight = 0.121) and leadership and management (weight = 0.117) had the greatest importance, respectively. Furthermore, in the overall ranking, BGT (weight = 0.369), IHM (0.238), SAU (0.202), IHK (weight = 0.125) and SAB (weight = 0.066) ranked first to fifth, respectively. AHP is an appropriate technique for measuring the overall performance of hospitals and their quality of services. It is a holistic approach that takes all hospital processes into consideration. The results of the present study can be used to improve hospitals' performance through identifying areas which are in need of focus for quality improvement and selecting strategies to improve service quality.

  3. Ocean fertilization, carbon credits and the Kyoto Protocol

    Science.gov (United States)

    Westley, M. B.; Gnanadesikan, A.

    2008-12-01

    Commercial interest in ocean fertilization as a carbon sequestration tool was excited by the December 1997 agreement of the Kyoto Protocol to the United Nations Convention on Climate Change. The Protocol commits industrialized countries to caps on net greenhouse gas emissions and allows for various flexible mechanisms to achieve these caps in the most economically efficient manner possible, including trade in carbon credits from projects that reduce emissions or enhance sinks. The carbon market was valued at $64 billion in 2007, with the bulk of the trading ($50 billion) taking place in the highly regulated European Union Emission Trading Scheme, which deals primarily in emission allowances in the energy sector. A much smaller amount, worth $265 million, was traded in the largely unregulated "voluntary" market (Capoor and Ambrosi 2008). As the voluntary market grows, so do calls for its regulation, with several efforts underway to set rules and standards for the sale of voluntary carbon credits using the Kyoto Protocol as a starting point. Four US-based companies and an Australian company currently seek to develop ocean fertilization technologies for the generation of carbon credits. We review these plans through the lens of the Kyoto Protocol and its flexible mechanisms, and examine whether and how ocean fertilization could generate tradable carbon credits. We note that at present, ocean sinks are not included in the Kyoto Protocol, and that furthermore, the Kyoto Protocol only addresses sources and sinks of greenhouse gases within national boundaries, making open-ocean fertilization projects a jurisdictional challenge. We discuss the negotiating history behind the limited inclusion of land use, land use change and forestry in the Kyoto Protocol and the controversy and eventual compromise concerning methodologies for terrestrial carbon accounting. We conclude that current technologies for measuring and monitoring carbon sequestration following ocean fertilization

  4. Modelling the protocol stack in NCS with deterministic and stochastic petri net

    Science.gov (United States)

    Hui, Chen; Chunjie, Zhou; Weifeng, Zhu

    2011-06-01

    The protocol stack is the basis of networked control systems (NCS). Full or partial reconfiguration of the protocol stack offers both optimised communication service and system performance. Nowadays, field testing is unrealistic for determining the performance of a reconfigurable protocol stack, and the Petri net formal description technique offers the best combination of intuitive representation, tool support and analytical capabilities. Traditionally, separation between the different layers of the OSI model has been a common practice. Nevertheless, such a layered modelling analysis framework of the protocol stack leads to a lack of global optimisation for protocol reconfiguration. In this article, we propose a general modelling analysis framework for NCS based on the cross-layer concept, which establishes an efficient system scheduling model by abstracting the time constraint, the task interrelation, the processor and the bus sub-models from upper and lower layers (application, data link and physical layer). Cross-layer design can help to overcome the inadequacy of global optimisation based on information sharing between protocol layers. To illustrate the framework, we take controller area network (CAN) as a case study. The simulation results of the deterministic and stochastic Petri-net (DSPN) model can help us adjust the message scheduling scheme and obtain better system performance.

  5. The stability of uranium microspheres for future application as reference standard in analytical measurements

    Energy Technology Data Exchange (ETDEWEB)

    Middendorp, R.; Duerr, M.; Bosbach, D. [Forschungszentrum Juelich GmbH, IEK-6, 52428 Juelich (Germany)

    2016-07-01

    The monitoring of fuel-cycle facilities provides a tool to confirm compliant operation, for example with respect to emissions into the environment or to supervise non-proliferation commitments. Hereby, anomalous situations can be detected in a timely manner and responsive action can be initiated to prevent an escalation into an event of severe consequence to society. In order to verify non-nuclear weapon states' compliance with the non-proliferation treaty (NPT), international authorities such as the International Atomic Energy Agency (IAEA) conduct inspections at facilities dealing with fissile or fertile nuclear materials. One measure consists of the collection of swipe samples by inspectors for later analysis of collected nuclear material traces in the laboratory. Highly sensitive mass spectrometric methods provide a means to detect traces from nuclear material handling activities that give an indication of undeclared use of the facility. There are, however, no relevant (certified) reference materials available that can be used as calibration or quality control standards. Therefore, an aerosol-generation based process was established at Forschungszentrum Juelich for the production of spherical, mono-disperse uranium oxide micro-particles with accurately characterized isotopic compositions and amounts of uranium in the picogram range. The synthesized particles are studied with respect to their suitability as (certified) reference material in ultra-trace analysis. Several options for preparation and stabilization of the particles are available, where preparation of particles in suspension offers the possibility to produce specific particle mixtures. In order to assess the stability of particles, the dissolution behavior and isotope exchange effects of particles in liquid suspension are studied on the bulk of suspended particles and also via micro-analytical methods applied for single particle characterization. The insights gained within these studies will

  6. Control Chart on Semi Analytical Weighting

    Science.gov (United States)

    Miranda, G. S.; Oliveira, C. C.; Silva, T. B. S. C.; Stellato, T. B.; Monteiro, L. R.; Marques, J. R.; Faustino, M. G.; Soares, S. M. V.; Ulrich, J. C.; Pires, M. A. F.; Cotrim, M. E. B.

    2018-03-01

    Semi-analytical balance verification intends to assess the balance performance using graphs that illustrate measurement dispersion through time, and to demonstrate that measurements were performed in a reliable manner. This study presents the internal quality control of a semi-analytical balance (GEHAKA BG400) using control charts. From 2013 to 2016, 2 weight standards were monitored before any balance operation. This work intended to evaluate whether any significant difference or bias was present in the weighing procedure over time, to check the reliability of the generated data. This work also exemplifies how control intervals are established.
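
    A minimal sketch of the kind of Shewhart-style chart described above, using invented check-weight readings rather than the BG400 data:

```python
import statistics

def control_limits(readings, sigma_multiplier=3.0):
    """Return (lower, center, upper) Shewhart control limits for check-weight readings."""
    center = statistics.mean(readings)
    spread = statistics.stdev(readings)
    return (center - sigma_multiplier * spread, center, center + sigma_multiplier * spread)

# Hypothetical daily readings (g) of a 100 g check weight on a semi-analytical balance.
baseline = [100.002, 99.999, 100.001, 100.000, 100.003, 99.998, 100.001, 100.002]
lcl, center, ucl = control_limits(baseline)

new_reading = 100.009
flagged = not (lcl <= new_reading <= ucl)
print(f"LCL={lcl:.4f} g, CL={center:.4f} g, UCL={ucl:.4f} g, out of control: {flagged}")
```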

  7. Development and validation of simple RP-HPLC-PDA analytical protocol for zileuton assisted with Design of Experiments for robustness determination

    Directory of Open Access Journals (Sweden)

    Saurabh B. Ganorkar

    2017-02-01

    Full Text Available A simple, rapid, sensitive, robust, stability-indicating RP-HPLC-PDA analytical protocol was developed and validated for the analysis of zileuton racemate in bulk and in tablet formulation. Development of the method and resolution of degradation products from forced hydrolytic (acidic, basic, neutral), oxidative, photolytic (acidic, basic, neutral), solid-state and thermal (dry heat) degradation was achieved on a LC-GC Qualisil BDS C18 column (250 mm × 4.6 mm × 5 μm) in isocratic mode at ambient temperature, employing a mobile phase of methanol and (0.2%, v/v) orthophosphoric acid in a ratio of (80:20, v/v) at a flow rate of 1.0 mL min−1 and detection at 260 nm. 'Design of Experiments' (DOE) employing 'Central Composite Design' (CCD) and 'Response Surface Methodology' (RSM) was applied as an advancement over the traditional 'One Variable at a Time' (OVAT) approach to evaluate the effects of variations in selected factors (methanol content, flow rate, concentration of orthophosphoric acid) as a graphical interpretation of robustness, and statistical interpretation was achieved with Multiple Linear Regression (MLR) and ANOVA. The method succeeded over the validation parameters: linearity, precision, accuracy, limit of detection and limit of quantitation, and robustness. The method was applied effectively for analysis of in-house zileuton tablets.
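
    A rotatable central composite design for the three robustness factors named above can be generated as in the sketch below; the coded levels are illustrative, since the actual factor levels are not given in the abstract.

```python
from itertools import product

def central_composite_design(num_factors: int, alpha: float = 1.682):
    """Coded runs of a rotatable central composite design: factorial, axial and center points."""
    factorial = [list(run) for run in product((-1.0, 1.0), repeat=num_factors)]
    axial = []
    for i in range(num_factors):
        for a in (-alpha, alpha):
            run = [0.0] * num_factors
            run[i] = a
            axial.append(run)
    center = [[0.0] * num_factors]
    return factorial + axial + center

# Three factors, as in the robustness study: methanol content, flow rate,
# orthophosphoric acid concentration (coded levels only).
runs = central_composite_design(3)
print(len(runs), "runs")  # 8 factorial + 6 axial + 1 center = 15
```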

  8. Identification and Quantification of N-Acyl Homoserine Lactones Involved in Bacterial Communication by Small-Scale Synthesis of Internal Standards and Matrix-Assisted Laser Desorption/Ionization Mass Spectrometry

    Science.gov (United States)

    Leipert, Jan; Treitz, Christian; Leippe, Matthias; Tholey, Andreas

    2017-12-01

    N-acyl homoserine lactones (AHL) are small signal molecules involved in the quorum sensing of many gram-negative bacteria, and play an important role in biofilm formation and pathogenesis. Present analytical methods for identification and quantification of AHL require time-consuming sample preparation steps and are hampered by the lack of appropriate standards. By aiming at a fast and straightforward method for AHL analytics, we investigated the applicability of matrix-assisted laser desorption/ionization time-of-flight mass spectrometry (MALDI-TOF MS). Suitable MALDI matrices, including crystalline and ionic liquid matrices, were tested and the fragmentation of different AHL in collision-induced dissociation MS/MS was studied, providing information about characteristic marker fragment ions. Employing small-scale synthesis protocols, we established a versatile and cost-efficient procedure for fast generation of isotope-labeled AHL standards, which can be used without extensive purification and yielded accurate standard curves. Quantitative analysis was possible in the low picomolar range, with lower limits of quantification reaching from 1 to 5 pmol for different AHL. The developed methodology was successfully applied in a quantitative MALDI MS analysis of low-volume culture supernatants of Pseudomonas aeruginosa.
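
    Quantification against an isotope-labeled internal standard typically reduces to an intensity ratio scaled by the spiked amount. The sketch below shows that one-point model with invented intensities; the authors' actual calibration relies on standard curves and may differ.

```python
def quantify_with_internal_standard(analyte_intensity: float,
                                    standard_intensity: float,
                                    standard_amount_pmol: float,
                                    response_factor: float = 1.0) -> float:
    """Estimate an AHL amount from MALDI peak intensities.

    Assumes the common one-point internal-standard model: the analyte/standard
    intensity ratio, corrected by a relative response factor, scales the known
    amount of spiked isotope-labeled AHL standard.
    """
    return response_factor * (analyte_intensity / standard_intensity) * standard_amount_pmol

# Hypothetical peak intensities for an AHL and its labeled analogue spiked at 10 pmol.
print(f"{quantify_with_internal_standard(4.2e4, 6.0e4, 10.0):.1f} pmol")  # 7.0 pmol
```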

  9. Optimizing RDF Data Cubes for Efficient Processing of Analytical Queries

    DEFF Research Database (Denmark)

    Jakobsen, Kim Ahlstrøm; Andersen, Alex B.; Hose, Katja

    2015-01-01

    data warehouses and data cubes. Today, external data sources are essential for analytics and, as the Semantic Web gains popularity, more and more external sources are available in native RDF. With the recent SPARQL 1.1 standard, performing analytical queries over RDF data sources has finally become...

  10. Preston and Park-Sanders protocols adapted for semi-quantitative isolation of thermotolerant Campylobacter from chicken rinse

    DEFF Research Database (Denmark)

    Josefsen, Mathilde Hartmann; Lübeck, Peter Stephensen; Aalbaek, B.

    2003-01-01

    Human campylobacteriosis has become the major cause of foodborne gastrointestinal diseases in several European countries. In order to implement effective control measures in the primary production, and as a tool in risk assessment studies, it is necessary to have sensitive and quantitative detection methods. Thus, semi-quantitative detection of thermophilic Campylobacter spp. in 20 naturally contaminated chicken rinse samples was carried out using the two most common standard protocols: Preston and Park-Sanders, as proposed by Nordic Committee on Food Analysis (NMKL) and International Standard Organization (ISO), respectively. For both protocols, the chicken rinse samples were prepared in 500 ml buffered peptone water, as recommended in the ISO protocol no. 6887-2. The results indicated that the Preston protocol was superior to the Park-Sanders protocol in supporting growth

  11. Data transmission protocol for Pi-of-the-Sky cameras

    Science.gov (United States)

    Uzycki, J.; Kasprowicz, G.; Mankiewicz, M.; Nawrocki, K.; Sitek, P.; Sokolowski, M.; Sulej, R.; Tlaczala, W.

    2006-10-01

    The large amount of data collected by the automatic astronomical cameras has to be transferred to the fast computers in a reliable way. The method chosen should ensure data streaming in both directions, but in a nonsymmetrical way. The Ethernet interface is a very good choice because of its popularity and proven performance. However, it requires a TCP/IP stack implementation in devices like cameras for full compliance with existing network and operating systems. This paper describes the NUDP protocol, which was designed as a supplement to the standard UDP protocol and can be used as a simple network protocol. NUDP does not need a TCP protocol implementation and makes it possible to run the Ethernet network with simple devices based on microcontroller and/or FPGA chips. The data transmission idea was created especially for the "Pi of the Sky" project.
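
    The abstract does not specify NUDP's wire format, so the sketch below uses a generic sequence-numbered datagram layout over plain UDP merely to illustrate how a lightweight, non-symmetric protocol can ride on UDP for microcontroller/FPGA devices; the header fields and flag are hypothetical.

```python
import socket
import struct

# Hypothetical datagram layout (NOT the actual NUDP format, which the abstract
# does not specify): 4-byte sequence number, 1-byte flags, then payload.
HEADER = struct.Struct(">IB")
FLAG_ACK_REQUESTED = 0x01

def send_frame(sock: socket.socket, addr, seq: int, payload: bytes) -> None:
    sock.sendto(HEADER.pack(seq, FLAG_ACK_REQUESTED) + payload, addr)

def parse_frame(datagram: bytes):
    seq, flags = HEADER.unpack_from(datagram)
    return seq, flags, datagram[HEADER.size:]

# Usage sketch (camera side): stream data one way, acknowledgements the other.
# sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
# send_frame(sock, ("192.0.2.10", 5000), seq=1, payload=b"image chunk")
```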

  12. Analysis of the Implementation of Standardized Clinical Protocol «Diabetes Mellitus Type 2» by Quality Indicators in Institutions of Kyiv Region

    Directory of Open Access Journals (Sweden)

    V.I. Tkachenko

    2014-10-01

    Full Text Available In Ukraine, a standardized clinical protocol (SCP) for the provision of medical care in diabetes mellitus type 2 (order of the Ministry of Healthcare of Ukraine dated 21.12.2012 № 1118), which identifies 4 quality indicators, is being implemented. The objective of the research was to analyze the implementation of the SCP based on monitoring of quality indicators in the institutions of the Kyiv region. Materials and Methods. A technique for assessing the quality of diabetes care, one element of which is the monitoring of the quality indicators specified in the SCP, has been developed and applied. Collection and analysis of information was carried out using forms of primary records № 025/030 and 030/o and forms of statistical reporting № 12 and 20. Statistical analysis was performed using Excel 2007 and SPSS. Results. Today, primary health care institutions in the Kyiv region have developed local protocols, which confirms the implementation of the first quality indicator in accordance with the indicator value desired by the SCP. The second indicator, the percentage of patients whose glycated hemoglobin level was measured in the reporting period, amounted to 12.2 %, which is higher than in 2012 (8.84 %) but remains low. The third quality indicator, the percentage of patients who were admitted to hospital for diabetes mellitus and its complications during the reporting period, amounted to 15.01 %, while in 2012 it stood at 8.66 %. For comparison, this figure in 2007 was 9.37 %. Conclusions. The quality of care at this early stage of implementation is insufficient, partly due to physicians' limited awareness of the major provisions of the protocol, lack of equipment, the need for patients to pay for medical services specified in the protocol, doctors' limited understanding of the characteristics of different types of medical and technological documents, and difficulties in the development and implementation of local protocols in particular. The obtained results are

  13. A Protocol for Advanced Psychometric Assessment of Surveys

    Science.gov (United States)

    Squires, Janet E.; Hayduk, Leslie; Hutchinson, Alison M.; Cranley, Lisa A.; Gierl, Mark; Cummings, Greta G.; Norton, Peter G.; Estabrooks, Carole A.

    2013-01-01

    Background and Purpose. In this paper, we present a protocol for advanced psychometric assessments of surveys based on the Standards for Educational and Psychological Testing. We use the Alberta Context Tool (ACT) as an exemplar survey to which this protocol can be applied. Methods. Data mapping, acceptability, reliability, and validity are addressed. Acceptability is assessed with missing data frequencies and the time required to complete the survey. Reliability is assessed with internal consistency coefficients and information functions. A unitary approach to validity consisting of accumulating evidence based on instrument content, response processes, internal structure, and relations to other variables is taken. We also address assessing performance of survey data when aggregated to higher levels (e.g., nursing unit). Discussion. In this paper we present a protocol for advanced psychometric assessment of survey data using the Alberta Context Tool (ACT) as an exemplar survey; application of the protocol to the ACT survey is underway. Psychometric assessment of any survey is essential to obtaining reliable and valid research findings. This protocol can be adapted for use with any nursing survey. PMID:23401759
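
    Internal consistency, one of the reliability measures named in the protocol, is commonly reported as Cronbach's alpha; a minimal sketch on invented Likert-scale responses:

```python
import numpy as np

def cronbach_alpha(item_scores: np.ndarray) -> float:
    """Cronbach's alpha for a respondents x items score matrix."""
    item_variances = item_scores.var(axis=0, ddof=1)
    total_variance = item_scores.sum(axis=1).var(ddof=1)
    k = item_scores.shape[1]
    return (k / (k - 1)) * (1.0 - item_variances.sum() / total_variance)

# Hypothetical 5-point Likert responses (6 respondents x 4 items), illustration only.
scores = np.array([
    [4, 4, 5, 4],
    [3, 3, 4, 3],
    [5, 4, 5, 5],
    [2, 2, 3, 2],
    [4, 5, 4, 4],
    [3, 3, 3, 4],
])
print(f"alpha = {cronbach_alpha(scores):.2f}")
```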

  14. Benchmarking pediatric cranial CT protocols using a dose tracking software system: a multicenter study

    Energy Technology Data Exchange (ETDEWEB)

    Bondt, Timo de; Parizel, Paul M. [Antwerp University Hospital and University of Antwerp, Department of Radiology, Antwerp (Belgium); Mulkens, Tom [H. Hart Hospital, Department of Radiology, Lier (Belgium); Zanca, Federica [GE Healthcare, DoseWatch, Buc (France); KU Leuven, Imaging and Pathology Department, Leuven (Belgium); Pyfferoen, Lotte; Casselman, Jan W. [AZ St. Jan Brugge-Oostende AV Hospital, Department of Radiology, Brugge (Belgium)

    2017-02-15

    To benchmark regional standard practice for paediatric cranial CT-procedures in terms of radiation dose and acquisition parameters. Paediatric cranial CT-data were retrospectively collected during a 1-year period, in 3 different hospitals of the same country. A dose tracking system was used to automatically gather information. Dose (CTDI and DLP), scan length, number of retakes and demographic data were stratified by age and clinical indication; appropriate use of child-specific protocols was assessed. In total, 296 paediatric cranial CT-procedures were collected. Although the median dose of each hospital was below the national and international diagnostic reference level (DRL) for all age categories, statistically significant (p-value < 0.001) dose differences among hospitals were observed. The hospital with the lowest dose levels showed the smallest dose variability and used age-stratified protocols for standardizing paediatric head exams. Erroneous selection of adult protocols for children still occurred, mostly in the oldest age-group. Even though all hospitals complied with national and international DRLs, dose tracking and benchmarking showed that further dose optimization and standardization is possible by using age-stratified protocols for paediatric cranial CT. Moreover, having a dose tracking system revealed that adult protocols are still applied for paediatric CT, a practice that must be avoided. (orig.)

  15. Toward analytic aids for standard setting in nuclear regulation

    International Nuclear Information System (INIS)

    Brown, R.V.; O'Connor, M.F.; Peterson, C.R.

    1979-05-01

    The US NRC promulgates standards for nuclear reprocessing and other facilities to safeguard against the diversion of nuclear material. Two broad tasks have been directed toward establishing performance criteria for standard setting: general-purpose modeling, and analysis specific to a particular performance criterion option. This report emphasizes work on the second task. The purpose is to provide a framework for the evaluation of such options that organizes the necessary components in a way that provides for meaningful assessments with respect to required inputs

  16. 7 CFR 93.13 - Analytical methods.

    Science.gov (United States)

    2010-01-01

    7 CFR 93.13, Analytical methods (2010-01-01). Section 93.13, Agriculture Regulations of the Department of Agriculture (Continued), Agricultural Marketing Service (Standards...); USDA, Agricultural Marketing Service, Science and Technology, 3521 South Agriculture Building...

  17. Quality system implementation for nuclear analytical techniques

    International Nuclear Information System (INIS)

    2004-01-01

    The international effort (UNIDO, ILAC, BIPM, etc.) to establish a functional infrastructure for metrology and accreditation in many developing countries needs to be complemented by assistance to implement high quality practices and high quality output by service providers and producers in the respective countries. Knowledge of how to approach QA systems that justify a formal accreditation is available in only a few countries, and the dissemination of know-how and the development of skills are needed bottom-up from the working level of laboratories and institutes. Awareness building, convincing of management, introduction of good management practices, technical expertise and good documentation will lead to the creation of a quality culture that assures sustainability and the inherent development of quality practices as a prerequisite of economic success. Quality assurance and quality control can be used as a valuable management tool and are a prerequisite for international trade and information exchange. This publication tries to assist quality managers, laboratory managers and staff involved in setting up a QA/QC system in a nuclear analytical laboratory to take appropriate action to start and complete the necessary steps for a successful quality system for ultimate national accreditation. This guidebook contributes to a better understanding of the basic ideas behind ISO/IEC 17025, the international standard for 'General requirements for the competence of testing and calibration laboratories'. It provides basic information and detailed explanations about the establishment of a QC system in analytical and nuclear analytical laboratories. It is proper training material for the training of trainers and familiarizes managers with QC management and implementation. This training material aims to facilitate the implementation of internationally accepted quality principles and to promote attempts by Member States' laboratories to obtain accreditation for nuclear analytical

  18. A protocol for lifetime energy and environmental impact assessment of building insulation materials

    International Nuclear Information System (INIS)

    Shrestha, Som S.; Biswas, Kaushik; Desjarlais, Andre O.

    2014-01-01

    This article describes a proposed protocol that is intended to provide a comprehensive list of factors to be considered in evaluating the direct and indirect environmental impacts of building insulation materials, as well as detailed descriptions of standardized calculation methodologies to determine those impacts. The energy and environmental impacts of insulation materials can generally be divided into two categories: (1) direct impact due to the embodied energy of the insulation materials and other factors and (2) indirect or environmental impacts avoided as a result of reduced building energy use due to the addition of insulation. Standards and product category rules exist, which provide guidelines about the life cycle assessment (LCA) of materials, including building insulation products. However, critical reviews have suggested that these standards fail to provide complete guidance to LCA studies and suffer from ambiguities regarding the determination of the environmental impacts of building insulation and other products. The focus of the assessment protocol described here is to identify all factors that contribute to the total energy and environmental impacts of different building insulation products and, more importantly, provide standardized determination methods that will allow comparison of different insulation material types. Further, the intent is not to replace current LCA standards but to provide a well-defined, easy-to-use comparison method for insulation materials using existing LCA guidelines. - Highlights: • We proposed a protocol to evaluate the environmental impacts of insulation materials. • The protocol considers all life cycle stages of an insulation material. • Both the direct environmental impacts and the indirect impacts are defined. • Standardized calculation methods for the 'avoided operational energy' are defined. • Standardized calculation methods for the 'avoided environmental impact' are defined

  19. Pulmonary CT angiography protocol adapted to the hemodynamic effects of pregnancy.

    LENUS (Irish Health Repository)

    Ridge, Carole A

    2012-02-01

    OBJECTIVE: The purpose of this study was to compare the image quality of a standard pulmonary CT angiography (CTA) protocol with a pulmonary CTA protocol optimized for use in pregnant patients with suspected pulmonary embolism (PE). MATERIALS AND METHODS: Forty-five consecutive pregnant patients with suspected PE were retrospectively included in the study: 25 patients (group A) underwent standard-protocol pulmonary CTA and 20 patients (group B) were imaged using a protocol modified for pregnancy. The modified protocol used a shallow inspiration breath-hold and a high concentration, high rate of injection, and high volume of contrast material. Objective image quality and subjective image quality were evaluated by measuring pulmonary arterial enhancement, determining whether there was transient interruption of the contrast bolus by unopacified blood from the inferior vena cava (IVC), and assessing diagnostic adequacy. RESULTS: Objective and subjective image quality were significantly better for group B, that is, for the group who underwent the CTA protocol optimized for pregnancy. Mean pulmonary arterial enhancement and the percentage of studies characterized as adequate for diagnosis were higher in group B than in group A: 321 ± 148 HU (SD) versus 178 ± 67 HU (p = 0.0001) and 90% versus 64% (p = 0.05), respectively. Transient interruption of contrast material by unopacified blood from the IVC was observed more frequently in group A (39%) than in group B (10%) (p = 0.05). CONCLUSION: A pulmonary CTA protocol optimized for pregnancy significantly improved image quality by increasing pulmonary arterial opacification, improving diagnostic adequacy, and decreasing transient interruption of the contrast bolus by unopacified blood from the IVC.

  20. Hanford performance evaluation program for Hanford site analytical services

    International Nuclear Information System (INIS)

    Markel, L.P.

    1995-09-01

    The U.S. Department of Energy (DOE) Order 5700.6C, Quality Assurance, and Title 10 of the Code of Federal Regulations, Part 830.120, Quality Assurance Requirements, state that it is the responsibility of DOE contractors to ensure that "quality is achieved and maintained by those who have been assigned the responsibility for performing the work." The Hanford Analytical Services Quality Assurance Plan (HASQAP) is designed to meet the needs of the Richland Operations Office (RL) for maintaining a consistent level of quality for the analytical chemistry services provided by contractor and commercial analytical laboratory operations. Therefore, services supporting Hanford environmental monitoring, environmental restoration, and waste management analytical services shall meet appropriate quality standards. This performance evaluation program will monitor the quality standards of all analytical laboratories supporting the Hanford Site, including on-site and off-site laboratories. The monitoring and evaluation of laboratory performance can be completed by the use of several tools. This program will discuss the tools that will be utilized for laboratory performance evaluations. Revision 0 will primarily focus on presently available programs using readily available performance evaluation materials provided by DOE, EPA or commercial sources. Discussion of project-specific PE materials and evaluations will be described in section 9.0 and Appendix A

  1. Preparation of standard hair material and development of analytical methodology

    International Nuclear Information System (INIS)

    Gangadharan, S.; Walvekar, A.P.; Ali, M.M.; Thantry, S.S.; Verma, R.; Devi, R.

    1995-01-01

    The concept of the use of human scalp hair as a first level indicator of exposure to inorganic pollutants has been established by us earlier. Efforts towards the preparation of a hair reference material are described. The analytical approaches for the determination of total mercury by cold vapour AAS and INAA and of methylmercury by extraction combined with gas chromatography coupled to an ECD are summarized with results on some of the samples analyzed, including the stability of values over a period of time of storage. (author)

  2. Analytical method for heavy metal determination in algae and turtle eggs from Guanahacabibes Protected Sea Park

    Directory of Open Access Journals (Sweden)

    Abel I. Balbín Tamayo

    2014-12-01

    A standard digestion method coupled to electrochemical detection for the monitoring of heavy metals in biological samples has been used for the simultaneous analysis of the target analytes. Square wave anodic stripping voltammetry (SWASV) coupled to disposable screen-printed electrodes (SPEs) was employed as a fast and sensitive electroanalytical method for the detection of heavy metals. The aim of our study was to determine Cd, Pb and Cu by SWASV in brown algae (Sargassum natans) and green turtle eggs (Chelonia mydas) using screen-printed electrodes. The method proved useful for the simultaneous analysis of these metals by comparison between two different procedures for preparing the samples. Two different approaches to the digestion protocol were assessed. The study was focused on Guanahacabibes brown algae and green turtle eggs because the metal concentrations recorded in this area may be used for intraspecific comparison within the Guanahacabibes Protected Sea Park area, a body of water for which information is still very scarce. The best results were obtained by digesting biological samples with the EPA 3050B method. This treatment allowed fast and quantitative extraction of the target analytes from brown algae and green turtle eggs, with high sensitivity and without organic residues that could otherwise affect the electrochemical measurements.

  3. Who needs inpatient detox? Development and implementation of a hospitalist protocol for the evaluation of patients for alcohol detoxification.

    Science.gov (United States)

    Stephens, John R; Liles, E Allen; Dancel, Ria; Gilchrist, Michael; Kirsch, Jonathan; DeWalt, Darren A

    2014-04-01

    Clinicians caring for patients seeking alcohol detoxification face many challenges, including lack of evidence-based guidelines for treatment and high recidivism rates. To develop a standardized protocol for determining which alcohol dependent patients seeking detoxification need inpatient versus outpatient treatment, and to study the protocol's implementation. Review of best evidence by ad hoc task force and subsequent creation of standardized protocol. Prospective observational evaluation of initial protocol implementation. Patients presenting for alcohol detoxification. Development and implementation of a protocol for evaluation and treatment of patients requesting alcohol detoxification. Number of admissions per month with primary alcohol related diagnosis (DRG), 30-day readmission rate, and length of stay, all measured before and after protocol implementation. We identified one randomized clinical trial and three cohort studies to inform the choice of inpatient versus outpatient detoxification, along with one prior protocol in this population, and combined that data with clinical experience to create an institutional protocol. After implementation, the average number of alcohol related admissions was 15.9 per month, compared with 18.9 per month before implementation (p = 0.037). There was no difference in readmission rate or length of stay. Creation and utilization of a protocol led to standardization of care for patients requesting detoxification from alcohol. Initial evaluation of protocol implementation showed a decrease in number of admissions.

  4. A round-robin gamma stereotactic radiosurgery dosimetry interinstitution comparison of calibration protocols

    Energy Technology Data Exchange (ETDEWEB)

    Drzymala, R. E., E-mail: drzymala@wustl.edu [Department of Radiation Oncology, Washington University, St. Louis, Missouri 63110 (United States); Alvarez, P. E. [Imaging and Radiation Oncology Core Houston, UT MD Anderson Cancer Center, Houston, Texas 77030 (United States); Bednarz, G. [Radiation Oncology Department, University of Pittsburgh Medical Center, Pittsburgh, Pennsylvania 15232 (United States); Bourland, J. D. [Department of Radiation Oncology, Wake Forest University, Winston-Salem, North Carolina 27157 (United States); DeWerd, L. A. [Department of Medical Physics, University of Wisconsin-Madison, Madison, Wisconsin 53705 (United States); Ma, L. [Department of Radiation Oncology, University California San Francisco, San Francisco, California 94143 (United States); Meltsner, S. G. [Department of Radiation Oncology, Duke University Medical Center, Durham, North Carolina 27710 (United States); Neyman, G. [Department of Radiation Oncology, The Cleveland Clinic Foundation, Cleveland, Ohio 44195 (United States); Novotny, J. [Medical Physics Department, Hospital Na Homolce, Prague 15030 (Czech Republic); Petti, P. L. [Gamma Knife Center, Washington Hospital Healthcare System, Fremont, California 94538 (United States); Rivard, M. J. [Department of Radiation Oncology, Tufts University School of Medicine, Boston, Massachusetts 02111 (United States); Shiu, A. S. [Department of Radiation Oncology, University of Southern California, Los Angeles, California 90033 (United States); Goetsch, S. J. [San Diego Medical Physics, Inc., La Jolla, California 92037 (United States)

    2015-11-15

    Purpose: Absorbed dose calibration for gamma stereotactic radiosurgery is challenging due to the unique geometric conditions, dosimetry characteristics, and nonstandard field size of these devices. Members of the American Association of Physicists in Medicine (AAPM) Task Group 178 on Gamma Stereotactic Radiosurgery Dosimetry and Quality Assurance have participated in a round-robin exchange of calibrated measurement instrumentation and phantoms exploring two approved and two proposed calibration protocols or formalisms on ten gamma radiosurgery units. The objectives of this study were to benchmark and compare new formalisms to existing calibration methods, while maintaining traceability to U.S. primary dosimetry calibration laboratory standards. Methods: Nine institutions made measurements using ten gamma stereotactic radiosurgery units in three different 160 mm diameter spherical phantoms [acrylonitrile butadiene styrene (ABS) plastic, Solid Water, and liquid water] and in air using a positioning jig. Two calibrated miniature ionization chambers and one calibrated electrometer were circulated for all measurements. Reference dose-rates at the phantom center were determined using the well-established AAPM TG-21 or TG-51 dose calibration protocols and using two proposed dose calibration protocols/formalisms: an in-air protocol and a formalism proposed by the International Atomic Energy Agency (IAEA) working group for small and nonstandard radiation fields. Each institution’s results were normalized to the dose-rate determined at that institution using the TG-21 protocol in the ABS phantom. Results: Percentages of dose-rates within 1.5% of the reference dose-rate (TG-21 + ABS phantom) for the eight chamber-protocol-phantom combinations were the following: 88% for TG-21, 70% for TG-51, 93% for the new IAEA nonstandard-field formalism, and 65% for the new in-air protocol. Averages and standard deviations for dose-rates over all measurements relative to the TG-21 + ABS

  5. Utilizing distributional analytics and electronic records to assess timeliness of inpatient blood glucose monitoring in non-critical care wards

    Directory of Open Access Journals (Sweden)

    Ying Chen

    2016-04-01

    Background: Regular and timely monitoring of blood glucose (BG) levels in hospitalized patients with diabetes mellitus is crucial to optimizing inpatient glycaemic control. However, methods to quantify timeliness as a measurement of quality of care are lacking. We propose an analytical approach that utilizes BG measurements from electronic records to assess adherence to an inpatient BG monitoring protocol in hospital wards. Methods: We applied our proposed analytical approach to electronic records obtained from 24 non-critical care wards in November and December 2013 from a tertiary care hospital in Singapore. We applied distributional analytics to evaluate daily adherence to BG monitoring timings. A one-sample Kolmogorov-Smirnov (1S-KS) test was performed to test daily BG timings against non-adherence represented by the uniform distribution. This test was performed among wards with high power, determined through simulation. The 1S-KS test was coupled with visualization via the cumulative distribution function (cdf) plot and a two-sample Kolmogorov-Smirnov (2S-KS) test, enabling comparison of the BG timing distributions between two consecutive days. We also applied mixture modelling to identify the key features in daily BG timings. Results: We found that 11 out of the 24 wards had high power. Among these wards, the 1S-KS test with cdf plots indicated adherence to BG monitoring protocols. Integrating both 1S-KS and 2S-KS information within a moving window consisting of two consecutive days did not suggest frequent potential change from or towards non-adherence to protocol. From mixture modelling among wards with high power, we consistently identified four components with high concentration of BG measurements taken before mealtimes and around bedtime. This agnostic analysis provided additional evidence that the wards were adherent to BG monitoring protocols. Conclusions: We demonstrated the utility of our proposed analytical approach as a monitoring
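
    As a rough illustration of the one-sample Kolmogorov-Smirnov check described above, the sketch below tests whether one ward-day's BG measurement times depart from a uniform distribution over the 24-hour day. The timing values, the rescaling to [0, 1], and the use of scipy.stats.kstest are illustrative assumptions, not the authors' actual analysis code.

        # Hedged sketch: 1S-KS test of daily BG measurement times against a uniform
        # distribution (the stand-in for non-adherence). Timings are hypothetical.
        from scipy import stats

        def adherence_1s_ks(timings_minutes, alpha=0.05):
            # Rescale minutes-after-midnight to [0, 1] so the reference is U(0, 1).
            scaled = [t / 1440.0 for t in timings_minutes]
            stat, p_value = stats.kstest(scaled, "uniform")
            # Rejecting uniformity is read here as evidence of structured, protocol-driven timing.
            return stat, p_value, p_value < alpha

        # Example: timings clustered before meals and around bedtime.
        example_day = [420, 425, 690, 700, 1050, 1055, 1320, 1330]
        print(adherence_1s_ks(example_day))

    A comparison of two consecutive days, as in the 2S-KS step, could be sketched analogously with scipy.stats.ks_2samp on the two days' timing samples.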

  6. A Universal Standard for the Validation of Blood Pressure Measuring Devices: Association for the Advancement of Medical Instrumentation/European Society of Hypertension/International Organization for Standardization (AAMI/ESH/ISO) Collaboration Statement.

    Science.gov (United States)

    Stergiou, George S; Alpert, Bruce; Mieke, Stephan; Asmar, Roland; Atkins, Neil; Eckert, Siegfried; Frick, Gerhard; Friedman, Bruce; Graßl, Thomas; Ichikawa, Tsutomu; Ioannidis, John P; Lacy, Peter; McManus, Richard; Murray, Alan; Myers, Martin; Palatini, Paolo; Parati, Gianfranco; Quinn, David; Sarkis, Josh; Shennan, Andrew; Usuda, Takashi; Wang, Jiguang; Wu, Colin O; O'Brien, Eoin

    2018-03-01

    In the past 30 years, several organizations, such as the US Association for the Advancement of Medical Instrumentation (AAMI), the British Hypertension Society, the European Society of Hypertension (ESH) Working Group on Blood Pressure (BP) Monitoring, and the International Organization for Standardization (ISO), have developed protocols for clinical validation of BP measuring devices. However, it is recognized that science, as well as patients, consumers, and manufacturers, would be best served if all BP measuring devices were assessed for accuracy according to an agreed single validation protocol that had global acceptance. Therefore, an international initiative was taken by the AAMI, ESH, and ISO experts who agreed to develop a universal standard for device validation. This statement presents the key aspects of a validation procedure, which were agreed by the AAMI, ESH, and ISO representatives as the basis for a single universal validation protocol. As soon as the AAMI/ESH/ISO standard is fully developed, this will be regarded as the single universal standard and will replace all other previous standards/protocols. © 2018 American Heart Association, Inc., and Wolters Kluwer Health, Inc.

  7. User-oriented end-to-end transport protocols for the real-time distribution of telemetry data from NASA spacecraft

    Science.gov (United States)

    Hooke, A. J.

    1979-01-01

    A set of standard telemetry protocols for downlink data flow facilitating the end-to-end transport of instrument data from the spacecraft to the user in real time is proposed. The direct switching of data by autonomous message 'packets' that are assembled by the source instrument on the spacecraft is discussed. The data system is thus based on a message format rather than a word format, and such packet telemetry would include standardized protocol headers. Standards are being developed within the NASA End-to-End Data System (NEEDS) program for the source packet and transport frame protocols. The source packet protocol contains identification of both the sequence number of the packet as it is generated by the source and the total length of the packet, while the transport frame protocol includes a sequence count defining the serial number of the frame as it is generated by the spacecraft data system, and a field specifying any 'options' selected in the format of the frame itself.
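
    To make the packet idea concrete, the sketch below packs a hypothetical source-packet header carrying a sequence number and total packet length ahead of the instrument data. The field names and widths are assumptions for illustration only; they are not the actual NEEDS source packet or transport frame formats.

        # Illustrative source packet with a sequence number and total length field.
        # Header layout (16-bit source id, 16-bit sequence, 16-bit length) is assumed.
        import struct

        def build_source_packet(source_id: int, sequence: int, payload: bytes) -> bytes:
            total_length = 6 + len(payload)          # header is 6 bytes in this sketch
            header = struct.pack(">HHH", source_id, sequence & 0xFFFF, total_length)
            return header + payload

        def parse_source_packet(packet: bytes):
            source_id, sequence, total_length = struct.unpack(">HHH", packet[:6])
            return source_id, sequence, packet[6:total_length]

        pkt = build_source_packet(0x01AB, 42, b"instrument data")
        print(parse_source_packet(pkt))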

  8. A Fair Cooperative MAC Protocol in IEEE 802.11 WLAN

    Directory of Open Access Journals (Sweden)

    Seyed Davoud Mousavi

    2018-05-01

    Cooperative communication techniques have recently enabled wireless technologies to overcome their challenges. The main objective of these techniques is to improve resource allocation. In this paper, we propose a new protocol in the medium access control (MAC) of the IEEE 802.11 standard. In our new protocol, which is called Fair Cooperative MAC (FC-MAC), every relay node participates in cooperation proportionally to its provided cooperation gain. This technique improves network resource allocation by exploiting the potential capacity of all relay candidates. Simulation results demonstrate that the FC-MAC protocol presents better performance in terms of throughput, fairness, and network lifetime.
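
    The core idea, participation proportional to the cooperation gain each relay provides, can be illustrated with a trivial sketch; the gain metric and values below are invented, and the real FC-MAC contention rules are considerably more involved.

        # Minimal sketch of proportional participation: each relay's share of
        # cooperative transmissions is proportional to its (hypothetical) gain.
        def participation_shares(cooperation_gains: dict) -> dict:
            total = sum(cooperation_gains.values())
            if total == 0:
                return {relay: 0.0 for relay in cooperation_gains}
            return {relay: gain / total for relay, gain in cooperation_gains.items()}

        print(participation_shares({"relay_A": 2.0, "relay_B": 1.0, "relay_C": 1.0}))
        # relay_A handles half of the cooperative slots, B and C one quarter each.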

  9. Development of a micro total analytic system based on isotachophoresis for the separation and characterization of lanthanides

    International Nuclear Information System (INIS)

    Vio, L.

    2010-01-01

    The accurate and reproducible characterization of radioactive solutions in isotope composition and concentration is an essential topic for analytical laboratories in the nuclear field. In order to reduce manipulation time in the glove box and the production of contaminated wastes, it is necessary to propose innovative and efficient solutions for these analyses. In recent years, microchips have become a major field of development in analytical chemistry, and such devices could provide a solution that fits the needs of the nuclear industry. The aim of this work is to design a disposable analytical micro-device devoted to lanthanide separation from spent nuclear fuel before their analysis by mass spectrometry. Designed to replace a separation step by liquid chromatography within a three-step protocol, the new protocol based on isotachophoresis (ITP) remains compatible with the other two steps. The complete separation of lanthanides by ITP was obtained by the use of a single, rigorously selected chelating compound: 2-hydroxy-2-methylbutyric acid (HMBA). The main parameters involved in solute resolution were defined from the theoretical models of ITP, and experimental studies of the influence of these parameters made it possible to optimize the geometry of the system and to improve its performance. To eliminate cleaning of the system and, consequently, to strongly reduce both the liquid waste volume and the handling of radioactive material, the ITP protocol was transferred to a polymeric (COC) disposable microchip especially developed for this purpose. (author)

  10. A History of the Improvement of Internet Protocols Over Satellites Using ACTS

    Science.gov (United States)

    Allman, Mark; Kruse, Hans; Ostermann, Shawn

    2000-01-01

    This paper outlines the main results of a number of ACTS experiments on the efficacy of using standard Internet protocols over long-delay satellite channels. These experiments have been jointly conducted by NASA's Glenn Research Center and Ohio University over the last six years. The focus of our investigations has been the impact of long-delay networks with non-zero bit-error rates on the performance of the suite of Internet protocols. In particular, we have focused on the most widely used transport protocol, the Transmission Control Protocol (TCP), as well as several application layer protocols. This paper presents our main results, as well as references to more verbose discussions of our experiments.
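
    A worked example of why long-delay channels stress standard TCP: throughput is bounded by the window size divided by the round-trip time. The window and RTT values below are generic assumptions for a geostationary link, not measurements from the ACTS experiments.

        # Back-of-envelope bound: TCP throughput <= window / RTT.
        def tcp_throughput_bound(window_bytes: float, rtt_seconds: float) -> float:
            return window_bytes / rtt_seconds          # bytes per second

        default_window = 65535     # classic TCP window without window scaling
        geo_rtt = 0.6              # ~600 ms round trip over a geostationary satellite (assumed)
        print(tcp_throughput_bound(default_window, geo_rtt) * 8 / 1e6, "Mbit/s")  # ~0.87 Mbit/s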

  11. Multiprofessional electronic protocol in ophthalmology with emphasis on strabismus

    Directory of Open Access Journals (Sweden)

    CHRISTIE GRAF RIBEIRO

    Objective: To create and validate an electronic database in ophthalmology focused on strabismus, to computerize this database in the form of a systematic data collection software named Electronic Protocol, and to incorporate this protocol into the Integrated System of Electronic Protocols (SINPE©). Methods: This is a descriptive study, with the methodology divided into three phases: (1) development of a theoretical ophthalmologic database with emphasis on strabismus; (2) computerization of this theoretical ophthalmologic database using SINPE©; and (3) interpretation of the information with demonstration of results to validate the protocol. We entered data from the charts of fifty patients with known strabismus through the Electronic Protocol for testing and validation. Results: The new electronic protocol was able to store information regarding patient history, physical examination, laboratory exams, imaging results, diagnosis and treatment of patients with ophthalmologic diseases, with emphasis on strabismus. We included 2,141 items in this master protocol and created 20 new specific electronic protocols for strabismus, each with its own specifics. Validation was achieved through correlation and corroboration of the symptoms and confirmed diagnoses of the fifty included patients with the diagnostic criteria for the twenty new strabismus protocols. Conclusion: A new, validated electronic database focusing on ophthalmology, with emphasis on strabismus, was successfully created through the standardized collection of information and computerization of the database using proprietary software. This protocol is ready for deployment to facilitate data collection, sorting and application for practitioners and researchers in numerous specialties.

  12. Interior beam searchlight semi-analytical benchmark

    International Nuclear Information System (INIS)

    Ganapol, Barry D.; Kornreich, Drew E.

    2008-01-01

    Multidimensional semi-analytical benchmarks to provide highly accurate standards to assess routine numerical particle transport algorithms are few and far between. Because of the well-established 1D theory for the analytical solution of the transport equation, it is sometimes possible to 'bootstrap' a 1D solution to generate a more comprehensive solution representation. Here, we consider the searchlight problem (SLP) as a multidimensional benchmark. A variation of the usual SLP is the interior beam SLP (IBSLP) where a beam source lies beneath the surface of a half space and emits directly towards the free surface. We consider the establishment of a new semi-analytical benchmark based on a new FN formulation. This problem is important in radiative transfer experimental analysis to determine cloud absorption and scattering properties. (authors)

  13. Fibrinolysis standards: a review of the current status.

    Science.gov (United States)

    Thelwell, C

    2010-07-01

    Biological standards are used to calibrate measurements of components of the fibrinolytic system, either for assigning potency values to therapeutic products, or to determine levels in human plasma as an indicator of thrombotic risk. Traditionally WHO International Standards are calibrated in International Units based on consensus values from collaborative studies. The International Unit is defined by the response activity of a given amount of the standard in a bioassay, independent of the method used. Assay validity is based on the assumption that both standard and test preparation contain the same analyte, and the response in an assay is a true function of this analyte. This principle is reflected in the diversity of source materials used to prepare fibrinolysis standards, which has depended on the contemporary preparations they were employed to measure. With advancing recombinant technology, and improved analytical techniques, a reference system based on reference materials and associated reference methods has been recommended for future fibrinolysis standards. Careful consideration and scientific judgement must however be applied when deciding on an approach to develop a new standard, with decisions based on the suitability of a standard to serve its purpose, and not just to satisfy a metrological ideal. 2010 The International Association for Biologicals. Published by Elsevier Ltd. All rights reserved.

  14. Update of Standard Practices for New Method Validation in Forensic Toxicology.

    Science.gov (United States)

    Wille, Sarah M R; Coucke, Wim; De Baere, Thierry; Peters, Frank T

    2017-01-01

    International agreement concerning validation guidelines is important to obtain quality forensic bioanalytical research and routine applications, as it all starts with the reporting of reliable analytical data. Standards for fundamental validation parameters are provided in guidelines such as those from the US Food and Drug Administration (FDA), the European Medicines Agency (EMA), the German-speaking Gesellschaft für Toxikologie und Forensische Chemie (GTFCH) and the Scientific Working Group of Forensic Toxicology (SWGTOX). These validation parameters include selectivity, matrix effects, method limits, calibration, accuracy and stability, as well as other parameters such as carryover, dilution integrity and incurred sample reanalysis. It is, however, not easy for laboratories to implement these guidelines into practice, as these international guidelines remain nonbinding protocols that depend on the applied analytical technique and that need to be updated according to the analyst's method requirements and the application type. In this manuscript, a review of the current guidelines and literature concerning bioanalytical validation parameters in a forensic context is given and discussed. In addition, suggestions for the experimental set-up, the pros and cons of statistical approaches and adequate acceptance criteria for the validation of bioanalytical applications are given. Copyright© Bentham Science Publishers; For any queries, please email at epub@benthamscience.org.
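
    As a small illustration of two of the fundamental parameters these guidelines cover, the sketch below computes bias (accuracy) and precision (coefficient of variation) from replicate quality-control measurements at one concentration level. The numbers are illustrative and the acceptance limits of any particular guideline are not implied.

        # Hedged sketch: bias and CV from replicate QC measurements (values invented).
        import statistics

        def bias_and_cv(measured, nominal):
            mean = statistics.mean(measured)
            bias_pct = (mean - nominal) / nominal * 100.0       # accuracy as % bias
            cv_pct = statistics.stdev(measured) / mean * 100.0  # precision as % CV
            return bias_pct, cv_pct

        qc_replicates = [9.6, 10.2, 9.9, 10.4, 9.8]   # ng/mL, nominal 10 ng/mL (hypothetical)
        print(bias_and_cv(qc_replicates, nominal=10.0))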

  15. Psychometric Properties of a Standardized Observation Protocol to Quantify Pediatric Physical Therapy Actions

    NARCIS (Netherlands)

    Sonderer, Patrizia; Ziegler, Schirin Akhbari; Oertle, Barbara Gressbach; Meichtry, Andre; Hadders-Algra, Mijna

    Purpose: Pediatric physical therapy (PPT) is characterized by heterogeneity. This blurs the evaluation of effective components of PPT. The Groningen Observation Protocol (GOP) was developed to quantify contents of PPT. This study assesses the reliability and completeness of the GOP. Methods: Sixty

  16. Spiked sample standards; their uses and disadvantages in analytical quality control

    International Nuclear Information System (INIS)

    Bowen, V.T.; Volchok, H.L.

    1980-01-01

    The advantages and disadvantages of spiked standards are discussed and contrasted with those of natural matrix standards. The preparation of the former class of standards and the evidence supporting recommendation of caution in their use are considered. (author)

  17. Challenges in standardization of blood pressure measurement at the population level.

    Science.gov (United States)

    Tolonen, Hanna; Koponen, Päivikki; Naska, Androniki; Männistö, Satu; Broda, Grazyna; Palosaari, Tarja; Kuulasmaa, Kari

    2015-04-10

    Accurate blood pressure measurements are needed in clinical practice, intervention studies and health examination surveys. Blood pressure measurements are sensitive: their accuracy can be affected by the measurement environment, the behaviour of the subject, the measurement procedures, the devices used for the measurement and the observer. To minimize errors in blood pressure measurement, a standardized measurement protocol is needed. The European Health Examination Survey (EHES) Pilot project was conducted in 2009-2012. A pilot health examination survey was conducted in 12 countries using a standardized protocol. The measurement protocols used in each survey, training provided for the measurers, measurement data, and observations during site visits were collected and evaluated to assess the level of standardization. The EHES measurement protocol for blood pressure was followed accurately in all 12 pilot surveys. Most of the surveys succeeded in organizing a quiet and comfortable measurement environment, and staff instructed survey participants appropriately before examination visits. In all surveys, blood pressure was measured three times, from the right arm in a sitting posture. The biggest variation was in the device used for the blood pressure measurement. It is possible to reach a high level of standardization for blood pressure measurements across countries and over time. A detailed, standardized measurement protocol, adequate training, monitoring during the fieldwork, and centrally organized quality assessment of the data are needed. The recent EU regulation banning the sale of mercury sphygmomanometers in European Union Member States has set new challenges for the standardization of measurement devices, since the validity of oscillometric measurements is device-specific and the performance of aneroid devices depends very much on calibration.

  18. Comparison of absorbed dose determinations using the IAEA dosimetry protocol and the ferrous sulphate dosimeter

    International Nuclear Information System (INIS)

    Mattsson, Olof

    1988-01-01

    In 1985 a comparison of different revised protocols for the dosimetry of high-energy photon and electron beams was published (Mattsson, 1985). The conclusions were that the agreement in absorbed dose to water determined using the different protocols is very good and that the agreement between ionization chamber and ferrous sulphate dosimetry is generally good. For electron beams the differences obtained with the ionization chamber and ferrous sulphate dosimeters were up to about 2%. The influence of the energy and angular distribution of the electron beams on the ionization chamber dosimetry is not fully considered in the dosimetry protocols. The basis for ionization chamber dosimetry changed when the Bureau International des Poids et Mesures (BIPM) revised the air-kerma standard in 1986. The reason was the adoption of the new stopping-power values reported in ICRU Report No. 37. To achieve consistency in ionization chamber dosimetry, the interaction coefficients and correction factors given in the dosimetry protocols should also be based on the same set of stopping-power values. This is not the case with the protocols included in the comparison made by Mattsson. However, in the international code of practice by the International Atomic Energy Agency (IAEA, 1987) the new stopping-power values have been used. The formalism is the same as in most of the previous protocols. Mattsson et al. (1989) have shown that the differences in the various steps cancel out for the protocols published by NACP (1980) and by IAEA (1987) for the cobalt-60 gamma quality. However, it is also of interest to investigate the influence of the new air-kerma standard and the new values of the coefficients and factors given in the IAEA protocol for other beam qualities. Therefore, the data given by Mattsson (1985) have been recalculated using the new air-kerma standard and the IAEA protocol.

  19. SPIRIT 2013 explanation and elaboration: guidance for protocols of clinical trials.

    Science.gov (United States)

    Chan, An-Wen; Tetzlaff, Jennifer M; Gøtzsche, Peter C; Altman, Douglas G; Mann, Howard; Berlin, Jesse A; Dickersin, Kay; Hróbjartsson, Asbjørn; Schulz, Kenneth F; Parulekar, Wendy R; Krleza-Jeric, Karmela; Laupacis, Andreas; Moher, David

    2013-01-08

    High quality protocols facilitate proper conduct, reporting, and external review of clinical trials. However, the completeness of trial protocols is often inadequate. To help improve the content and quality of protocols, an international group of stakeholders developed the SPIRIT 2013 Statement (Standard Protocol Items: Recommendations for Interventional Trials). The SPIRIT Statement provides guidance in the form of a checklist of recommended items to include in a clinical trial protocol. This SPIRIT 2013 Explanation and Elaboration paper provides important information to promote full understanding of the checklist recommendations. For each checklist item, we provide a rationale and detailed description; a model example from an actual protocol; and relevant references supporting its importance. We strongly recommend that this explanatory paper be used in conjunction with the SPIRIT Statement. A website of resources is also available (www.spirit-statement.org). The SPIRIT 2013 Explanation and Elaboration paper, together with the Statement, should help with the drafting of trial protocols. Complete documentation of key trial elements can facilitate transparency and protocol review for the benefit of all stakeholders.

  20. MAC protocols for wireless sensor networks (WSN): a comparative study

    International Nuclear Information System (INIS)

    Arshad, J.; Akram, Q.; Saleem, Y.

    2014-01-01

    Data communication between nodes is carried out under a Medium Access Control (MAC) protocol, which is defined at the data link layer. MAC protocols are responsible for communication and coordination between nodes according to the defined standards in WSN (Wireless Sensor Networks). The design of a MAC protocol should also address the issues of energy efficiency and transmission efficiency. A number of MAC protocols for WSN have been proposed in the literature. In this paper, nine MAC protocols, namely S-MAC, T-MAC, Wise-MAC, Mu-MAC, Z-MAC, A-MAC, D-MAC, B-MAC and B-MAC+ for WSN, have been explored, studied and analyzed. These nine protocols are classified into contention-based and hybrid (combination of contention- and schedule-based) MAC protocols. The goal of this comparative study is to provide a basis for MAC protocols and to highlight the different mechanisms used, with respect to parameters for the evaluation of energy and transmission efficiency in WSN. This study also aims to give the reader a better understanding of the concepts, processes and flow of information used in these MAC protocols for WSN. A comparison with respect to energy reservation scheme, idle listening avoidance, latency, fairness, data synchronization, and throughput maximization has been presented. It was found that contention-based MAC protocols are less energy efficient than hybrid MAC protocols. From the analysis of contention-based MAC protocols in terms of energy consumption, it was observed that protocols based on preamble sampling consume less energy than protocols based on a static or dynamic sleep schedule. (author)

  1. 75 FR 4323 - Additional Quantitative Fit-testing Protocols for the Respiratory Protection Standard

    Science.gov (United States)

    2010-01-27

    ...-mask and full face piece respirators are normally considered two different types of air purifying... article published in a peer-reviewed industrial-hygiene journal describing the protocol and explaining how... article from an industrial- hygiene journal describing the accuracy and reliability of these proposed...

  2. Advertisement-Based Energy Efficient Medium Access Protocols for Wireless Sensor Networks

    Science.gov (United States)

    Ray, Surjya Sarathi

    One of the main challenges that prevents the large-scale deployment of Wireless Sensor Networks (WSNs) is providing the applications with the required quality of service (QoS) given the sensor nodes' limited energy supplies. WSNs are an important tool in supporting applications ranging from environmental and industrial monitoring, to battlefield surveillance and traffic control, among others. Most of these applications require sensors to function for long periods of time without human intervention and without battery replacement. Therefore, energy conservation is one of the main goals for protocols for WSNs. Energy conservation can be performed in different layers of the protocol stack. In particular, as the medium access control (MAC) layer can access and control the radio directly, large energy savings are possible through intelligent MAC protocol design. To maximize the network lifetime, MAC protocols for WSNs aim to minimize idle listening of the sensor nodes, packet collisions, and overhearing. Several approaches such as duty cycling and low power listening have been proposed at the MAC layer to achieve energy efficiency. In this thesis, I explore the possibility of further energy savings through the advertisement of data packets in the MAC layer. In the first part of my research, I propose Advertisement-MAC or ADV-MAC, a new MAC protocol for WSNs that utilizes the concept of advertising for data contention. This technique lets nodes listen dynamically to any desired transmission and sleep during transmissions not of interest. This minimizes the energy lost in idle listening and overhearing while maintaining an adaptive duty cycle to handle variable loads. Additionally, ADV-MAC enables energy efficient MAC-level multicasting. An analytical model for the packet delivery ratio and the energy consumption of the protocol is also proposed. The analytical model is verified with simulations and is used to choose an optimal value of the advertisement period
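
    A generic duty-cycle energy estimate of the kind such analytical models build on is sketched below; the power draws and interval lengths are hypothetical, and this is not the ADV-MAC model itself.

        # Generic sketch: energy per cycle for a node that advertises briefly,
        # listens for a short window, and sleeps the rest of the cycle.
        def energy_per_cycle(p_tx, p_listen, p_sleep, t_adv, t_listen, t_cycle):
            t_sleep = t_cycle - t_adv - t_listen
            return p_tx * t_adv + p_listen * t_listen + p_sleep * t_sleep  # joules

        # Assumed values: 60 mW transmit, 45 mW listen, 0.09 mW sleep;
        # 5 ms advertisement and 20 ms listening window in a 1 s cycle.
        print(energy_per_cycle(0.060, 0.045, 0.00009, 0.005, 0.020, 1.0))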

  3. Survey of protocols for the manual segmentation of the hippocampus: preparatory steps towards a joint EADC-ADNI harmonized protocol.

    Science.gov (United States)

    Boccardi, Marina; Ganzola, Rossana; Bocchetta, Martina; Pievani, Michela; Redolfi, Alberto; Bartzokis, George; Camicioli, Richard; Csernansky, John G; de Leon, Mony J; deToledo-Morrell, Leyla; Killiany, Ronald J; Lehéricy, Stéphane; Pantel, Johannes; Pruessner, Jens C; Soininen, H; Watson, Craig; Duchesne, Simon; Jack, Clifford R; Frisoni, Giovanni B

    2011-01-01

    Manual segmentation from magnetic resonance imaging (MR) is the gold standard for evaluating hippocampal atrophy in Alzheimer's disease (AD). Nonetheless, different segmentation protocols provide up to 2.5-fold volume differences. Here we surveyed the most frequently used segmentation protocols in the AD literature as a preliminary step for international harmonization. The anatomical landmarks (anteriormost and posteriormost slices, superior, inferior, medial, and lateral borders) were identified from 12 published protocols for hippocampal manual segmentation ([Abbreviation] first author, publication year: [B] Bartzokis, 1998; [C] Convit, 1997; [dTM] deToledo-Morrell, 2004; [H] Haller, 1997; [J] Jack, 1994; [K] Killiany, 1993; [L] Lehericy, 1994; [M] Malykhin, 2007; [Pa] Pantel, 2000; [Pr] Pruessner, 2000; [S] Soininen, 1994; [W] Watson, 1992). The hippocampi of one healthy control and one AD patient taken from the 1.5T MR ADNI database were segmented by a single rater according to each protocol. The accuracy of the protocols' interpretation and translation into practice was checked with lead authors of protocols through individual interactive web conferences. Semantically harmonized landmarks and differences were then extracted, regarding: (a) the posteriormost slice, protocol [B] being the most restrictive, and [H, M, Pa, Pr, S] the most inclusive; (b) inclusion [C, dTM, J, L, M, Pr, W] or exclusion [B, H, K, Pa, S] of alveus/fimbria; (c) separation from the parahippocampal gyrus, [C] being the most restrictive, [B, dTM, H, J, Pa, S] the most inclusive. There were no substantial differences in the definition of the anteriormost slice. This survey will allow us to operationalize differences among protocols into tracing units, measure their impact on the repeatability and diagnostic accuracy of manual hippocampal segmentation, and finally develop a harmonized protocol.

  4. A Lightweight Protocol for Secure Video Streaming.

    Science.gov (United States)

    Venčkauskas, Algimantas; Morkevicius, Nerijus; Bagdonas, Kazimieras; Damaševičius, Robertas; Maskeliūnas, Rytis

    2018-05-14

    The Internet of Things (IoT) introduces many new challenges which cannot be solved using traditional cloud and host computing models. A new architecture known as fog computing is emerging to address these technological and security gaps. Traditional security paradigms focused on providing perimeter-based protections and client/server point to point protocols (e.g., Transport Layer Security (TLS)) are no longer the best choices for addressing new security challenges in fog computing end devices, where energy and computational resources are limited. In this paper, we present a lightweight secure streaming protocol for the fog computing "Fog Node-End Device" layer. This protocol is lightweight, connectionless, supports broadcast and multicast operations, and is able to provide data source authentication, data integrity, and confidentiality. The protocol is based on simple and energy efficient cryptographic methods, such as Hash Message Authentication Codes (HMAC) and symmetrical ciphers, and uses modified User Datagram Protocol (UDP) packets to embed authentication data into streaming data. Data redundancy could be added to improve reliability in lossy networks. The experimental results summarized in this paper confirm that the proposed method efficiently uses energy and computational resources and at the same time provides security properties on par with the Datagram TLS (DTLS) standard.
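
    The general mechanism, embedding an HMAC over the streaming payload so the receiver can check data origin and integrity, can be sketched as follows. Key management, encryption, and the actual packet layout of the proposed protocol are omitted; the key, header, and tag sizes are assumptions, not the authors' implementation.

        # Hedged sketch: HMAC-protected datagram payload (key and layout are assumed).
        import hmac, hashlib

        SECRET_KEY = b"pre-shared-session-key"   # hypothetical pre-shared key

        def protect(payload: bytes, seq: int) -> bytes:
            header = seq.to_bytes(4, "big")
            tag = hmac.new(SECRET_KEY, header + payload, hashlib.sha256).digest()
            return header + payload + tag        # tag travels inside the datagram

        def verify(datagram: bytes):
            header, payload, tag = datagram[:4], datagram[4:-32], datagram[-32:]
            expected = hmac.new(SECRET_KEY, header + payload, hashlib.sha256).digest()
            if not hmac.compare_digest(tag, expected):
                raise ValueError("authentication failed")
            return int.from_bytes(header, "big"), payload

        print(verify(protect(b"video frame bytes", seq=1)))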

  5. Reduction of cancer risk by optimization of Computed Tomography head protocols: far eastern Cuban experience

    International Nuclear Information System (INIS)

    Miller Clemente, R.; Adame Brooks, D.; Lores Guevara, M.; Perez Diaz, M.; Arias Garlobo, M. L.; Ortega Rodriguez, O.; Nepite Haber, R.; Grinnan Hernandez, O.; Guillama Llosas, A.

    2015-01-01

    Cancer risk estimation is one way of evaluating the public health impact of computed tomography (CT) exposures. Starting from the hypothesis that the optimization of CT protocols would significantly reduce the added cancer risk, the purpose of this research was to apply optimization strategies to head CT protocols in order to reduce the factors affecting the risk of induced cancer. The applied systemic approach included technological and human components, represented by quantitative physical factors. The volumetric kerma indexes, compared with respect to standard, optimized and reference values, were evaluated with a multiple means comparison method. The added cancer risk resulted from the application of the methodology for biological effects evaluation at low doses with low Linear Energy Transfer. Human observers evaluated image quality in all scenarios. The reduced dose was significantly lower than for standard head protocols and reference levels: (1) for pediatric patients aged 10-14, a reduction of 31% compared with the standard protocol by using an Automatic Exposure Control system, and (2) for adults, using a Bilateral Filter for images obtained at low doses, 62% lower than those of the standard head protocol. The risk reduction was higher than 25%. The systemic approach used allows the effective identification of factors involved in cancer risk related to CT exposures. The combination of dose modulation and image restoration with a Bilateral Filter provides a significant reduction of cancer risk, with acceptable diagnostic image quality. (Author)

  6. Informatics and Standards for Nanomedicine Technology

    Science.gov (United States)

    Thomas, Dennis G.; Klaessig, Fred; Harper, Stacey L.; Fritts, Martin; Hoover, Mark D.; Gaheen, Sharon; Stokes, Todd H.; Reznik-Zellen, Rebecca; Freund, Elaine T.; Klemm, Juli D.; Paik, David S.; Baker, Nathan A.

    2011-01-01

    There are several issues to be addressed concerning the management and effective use of information (or data), generated from nanotechnology studies in biomedical research and medicine. These data are large in volume, diverse in content, and are beset with gaps and ambiguities in the description and characterization of nanomaterials. In this work, we have reviewed three areas of nanomedicine informatics: information resources; taxonomies, controlled vocabularies, and ontologies; and information standards. Informatics methods and standards in each of these areas are critical for enabling collaboration, data sharing, unambiguous representation and interpretation of data, semantic (meaningful) search and integration of data; and for ensuring data quality, reliability, and reproducibility. In particular, we have considered four types of information standards in this review, which are standard characterization protocols, common terminology standards, minimum information standards, and standard data communication (exchange) formats. Currently, due to gaps and ambiguities in the data, it is also difficult to apply computational methods and machine learning techniques to analyze, interpret and recognize patterns in data that are high dimensional in nature, and also to relate variations in nanomaterial properties to variations in their chemical composition, synthesis, characterization protocols, etc. Progress towards resolving the issues of information management in nanomedicine using informatics methods and standards discussed in this review will be essential to the rapidly growing field of nanomedicine informatics. PMID:21721140

  7. Multicriteria evaluation of power plants impact on the living standard using the analytic hierarchy process

    International Nuclear Information System (INIS)

    Chatzimouratidis, Athanasios I.; Pilavachi, Petros A.

    2008-01-01

    The purpose of this study is to evaluate 10 types of power plants available at present, including fossil fuel, nuclear and renewable-energy-based power plants, with regard to their overall impact on the living standard of local communities. Both positive and negative impacts of power plant operation are considered using the analytic hierarchy process (AHP). The current study covers the set of criteria weights considered typical for many local communities in many developed countries. The results presented here are illustrative only and user-defined weighting is required to make this study valuable for a specific group of users. A sensitivity analysis examines the most important weight variations, thus giving an overall view of the problem evaluation to every decision maker. Regardless of criteria weight variations, the five types of renewable energy power plant rank in the first five positions. Nuclear plants are in the sixth position when priority is given to quality of life and last when socioeconomic aspects are considered more important. Natural gas, oil and coal/lignite power plants rank between the sixth and tenth positions, with a slightly better ranking when priority is given to socioeconomic aspects.
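
    The core AHP step, deriving priority weights from a pairwise comparison matrix via its principal eigenvector, is sketched below; the 3x3 matrix is a made-up example and does not reproduce the study's actual criteria or judgments.

        # Hedged sketch of AHP weight derivation (example judgments are invented).
        import numpy as np

        def ahp_weights(pairwise: np.ndarray) -> np.ndarray:
            eigenvalues, eigenvectors = np.linalg.eig(pairwise)
            principal = eigenvectors[:, np.argmax(eigenvalues.real)].real
            principal = np.abs(principal)          # eigenvector sign is arbitrary
            return principal / principal.sum()     # normalized priority weights

        # Example pairwise judgments among three hypothetical criteria.
        A = np.array([[1.0, 3.0, 5.0],
                      [1/3, 1.0, 2.0],
                      [1/5, 1/2, 1.0]])
        print(ahp_weights(A))   # weights sum to 1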

  8. Handbook of Carbon Offset Programs. Trading Systems, Funds, Protocols and Standards

    Energy Technology Data Exchange (ETDEWEB)

    Kollmuss, Anja; Lazarus, Michael; Lee, Carrie; Polycarp, Clifford (SEI-US (United States)); LeFranc, Maurice (US EPA (United States))

    2010-03-15

    Greenhouse gas (GHG) offsets have long been promoted as an important element of a comprehensive climate policy approach. Offset programs can reduce the overall cost of achieving a given emission goal by enabling emission reductions to occur where costs are lower. Offsets have the potential to deliver sustainability co-benefits, through technology development and transfer. They can also develop human and institutional capacity for reducing emissions in sectors and locations not included in a cap and trade or a mandatory government policy. However, offsets can pose a risk to the environmental integrity of climate actions, especially if issues surrounding additionality, permanence, leakage, quantification and verification are not adequately addressed. The challenge is to design offset programs and policies that can maximize their potential benefits while minimizing their potential risks. This handbook provides a systematic and comprehensive review of existing offset programs. It looks at what offsets are, how offset mechanisms function, and the successes and pitfalls they have encountered. Coverage includes offset programs across the full swath of applications including mandatory and voluntary systems, government regulated and private markets, carbon offset funds, and accounting and reporting protocols such as the WBCSD/WRI GHG Protocol and ISO 14064. Learning from the successes and failures of these programs will be essential to crafting effective climate policy. A reference for regulators, policy makers, business leaders and NGOs concerned with the design and operation of GHG offset programs world-wide.

  9. Protocol for Uniformly Measuring and Expressing the Performance of Energy Storage Systems

    Energy Technology Data Exchange (ETDEWEB)

    Conover, David R.; Crawford, Aladsair J.; Viswanathan, Vilayanur V.; Ferreira, Summer; Schoenwald, David

    2014-06-01

    The Protocol for Uniformly Measuring and Expressing the Performance of Energy Storage Systems (PNNL-22010) was first issued in November 2012 as a first step toward providing a foundational basis for developing an initial standard for the uniform measurement and expression of energy storage system (ESS) performance. Its subsequent use in the field and review by the protocol working group and most importantly the users’ subgroup and the thermal subgroup has led to the fundamental modifications reflected in this update of the 2012 Protocol. As an update of the 2012 Protocol, this document (the June 2014 Protocol) is intended to supersede its predecessor and be used as the basis for measuring and expressing ESS performance. The foreword provides general and specific details about what additions, revisions, and enhancements have been made to the 2012 Protocol and the rationale for them in arriving at the June 2014 Protocol.

  10. 75 FR 43059 - Mandatory Reliability Standards for the Calculation of Available Transfer Capability, Capacity...

    Science.gov (United States)

    2010-07-23

    ... Standards for Business Practices and Communications Protocols for Public Utilities July 15, 2010. AGENCY... Practices and Communication Protocols for Public Utilities. Order No. 729-B Order on Rehearing and..., Order No. 729-A, 131 FERC ¶ 61,109 (2010). \\2\\ Standards for Business Practices and Communication...

  11. Analytical capabilities of laser-probe mass spectrometry

    International Nuclear Information System (INIS)

    Kovalev, I.D.; Madsimov, G.A.; Suchkov, A.I.; Larin, N.V.

    1978-01-01

    The physical bases and quantitative analytical procedures of laser-probe mass spectrometry are considered in this review. A comparison is made of the capabilities of static and dynamic mass spectrometers. Techniques are studied for improving the analytical characteristics of laser-probe mass spectrometers. The advantages, for quantitative analysis, of the Q-switched mode over the normal pulse mode for lasers are: (a) the possibility of analysing metals, semiconductors and insulators without the use of standards; and (b) the possibility of layer-by-layer and local analysis. (Auth.)

  12. Aldefluor protocol to sort keratinocytes stem cells from skin

    OpenAIRE

    Noronha, Samuel Marcos Ribeiro; Gragnani, Alfredo; Pereira, Thiago Antônio Calado; Correa, Silvana Aparecida Alves; Bonucci, Jessica; Ferreira, Lydia Masako

    2017-01-01

    Purpose: To investigate the use of Aldefluor® and N,N-dimethylaminobenzaldehyde (DEAB) to design a protocol to sort keratinocyte stem cells from cultured keratinocytes from burned patients. Methods: Activated Aldefluor® aliquots were prepared and maintained at a temperature between 2 and 8°C, or stored at -20°C. Next, the cells were collected following the standard protocol of sample preparation. Results: Best results were obtained with Aldefluor® 1.5µl and DEAB 15 µl for 1 x 106 c...

  13. Abbreviated protocol for breast MRI: Are multiple sequences needed for cancer detection?

    International Nuclear Information System (INIS)

    Mango, Victoria L.; Morris, Elizabeth A.; David Dershaw, D.; Abramson, Andrea; Fry, Charles; Moskowitz, Chaya S.; Hughes, Mary; Kaplan, Jennifer; Jochelson, Maxine S.

    2015-01-01

    Highlights: • Abbreviated breast MR demonstrates high sensitivity for breast carcinoma detection. • Time to perform/interpret the abbreviated exam is shorter than a standard MRI exam. • An abbreviated breast MRI could reduce costs and make MRI screening more available. - Abstract: Objective: To evaluate the ability of an abbreviated breast magnetic resonance imaging (MRI) protocol, consisting of a precontrast T1 weighted (T1W) image and single early post-contrast T1W image, to detect breast carcinoma. Materials and methods: A HIPAA compliant Institutional Review Board approved review of 100 consecutive breast MRI examinations in patients with biopsy proven unicentric breast carcinoma. 79% were invasive carcinomas and 21% were ductal carcinoma in situ. Four experienced breast radiologists, blinded to carcinoma location, history and prior examinations, assessed the abbreviated protocol evaluating only the first post-contrast T1W image, post-processed subtracted first post-contrast and subtraction maximum intensity projection images. Detection and localization of tumor were compared to the standard full diagnostic examination consisting of 13 pre-contrast, post-contrast and post-processed sequences. Results: All 100 cancers were visualized on initial reading of the abbreviated protocol by at least one reader. The mean sensitivity for each sequence was 96% for the first post-contrast sequence, 96% for the first post-contrast subtraction sequence and 93% for the subtraction MIP sequence. Within each sequence, there was no significant difference between the sensitivities among the 4 readers (p = 0.471, p = 0.656, p = 0.139). Mean interpretation time was 44 s (range 11–167 s). The abbreviated imaging protocol could be performed in approximately 10–15 min, compared to 30–40 min for the standard protocol. Conclusion: An abbreviated breast MRI protocol allows detection of breast carcinoma. One pre and post-contrast T1W sequence may be adequate for detecting

  14. Abbreviated protocol for breast MRI: Are multiple sequences needed for cancer detection?

    Energy Technology Data Exchange (ETDEWEB)

    Mango, Victoria L., E-mail: vlm2125@columbia.edu [Columbia University Medical Center, Herbert Irving Pavilion, 161 Fort Washington Avenue, 10th Floor, New York, NY 10032 (United States); Memorial Sloan-Kettering Cancer Center, Breast and Imaging Center, 300 East 66th Street, New York, NY 10065 (United States); Morris, Elizabeth A., E-mail: morrise@mskcc.org [Memorial Sloan-Kettering Cancer Center, Breast and Imaging Center, 300 East 66th Street, New York, NY 10065 (United States); David Dershaw, D., E-mail: dershawd@mskcc.org [Memorial Sloan-Kettering Cancer Center, Breast and Imaging Center, 300 East 66th Street, New York, NY 10065 (United States); Abramson, Andrea, E-mail: abramsoa@mskcc.org [Memorial Sloan-Kettering Cancer Center, Breast and Imaging Center, 300 East 66th Street, New York, NY 10065 (United States); Fry, Charles, E-mail: charles_fry@nymc.edu [Memorial Sloan-Kettering Cancer Center, Breast and Imaging Center, 300 East 66th Street, New York, NY 10065 (United States); New York Medical College, 40 Sunshine Cottage Rd, Valhalla, NY 10595 (United States); Moskowitz, Chaya S. [Memorial Sloan-Kettering Cancer Center, Breast and Imaging Center, 300 East 66th Street, New York, NY 10065 (United States); Hughes, Mary, E-mail: hughesm@mskcc.org [Memorial Sloan-Kettering Cancer Center, Breast and Imaging Center, 300 East 66th Street, New York, NY 10065 (United States); Kaplan, Jennifer, E-mail: kaplanj@mskcc.org [Memorial Sloan-Kettering Cancer Center, Breast and Imaging Center, 300 East 66th Street, New York, NY 10065 (United States); Jochelson, Maxine S., E-mail: jochelsm@mskcc.org [Memorial Sloan-Kettering Cancer Center, Breast and Imaging Center, 300 East 66th Street, New York, NY 10065 (United States)

    2015-01-15

    Highlights: • Abbreviated breast MR demonstrates high sensitivity for breast carcinoma detection. • Time to perform/interpret the abbreviated exam is shorter than a standard MRI exam. • An abbreviated breast MRI could reduce costs and make MRI screening more available. - Abstract: Objective: To evaluate the ability of an abbreviated breast magnetic resonance imaging (MRI) protocol, consisting of a precontrast T1 weighted (T1W) image and single early post-contrast T1W image, to detect breast carcinoma. Materials and methods: A HIPAA compliant Institutional Review Board approved review of 100 consecutive breast MRI examinations in patients with biopsy proven unicentric breast carcinoma. 79% were invasive carcinomas and 21% were ductal carcinoma in situ. Four experienced breast radiologists, blinded to carcinoma location, history and prior examinations, assessed the abbreviated protocol evaluating only the first post-contrast T1W image, post-processed subtracted first post-contrast and subtraction maximum intensity projection images. Detection and localization of tumor were compared to the standard full diagnostic examination consisting of 13 pre-contrast, post-contrast and post-processed sequences. Results: All 100 cancers were visualized on initial reading of the abbreviated protocol by at least one reader. The mean sensitivity for each sequence was 96% for the first post-contrast sequence, 96% for the first post-contrast subtraction sequence and 93% for the subtraction MIP sequence. Within each sequence, there was no significant difference between the sensitivities among the 4 readers (p = 0.471, p = 0.656, p = 0.139). Mean interpretation time was 44 s (range 11–167 s). The abbreviated imaging protocol could be performed in approximately 10–15 min, compared to 30–40 min for the standard protocol. Conclusion: An abbreviated breast MRI protocol allows detection of breast carcinoma. One pre and post-contrast T1W sequence may be adequate for detecting

  15. Surface and subsurface cleanup protocol for radionuclides, Gunnison, Colorado, UMTRA project processing site

    International Nuclear Information System (INIS)

    1993-09-01

    Surface and subsurface soil cleanup protocols for the Gunnison, Colorado, processing site are summarized as follows: In accordance with EPA-promulgated land cleanup standards (40 CFR 192), in situ Ra-226 is to be cleaned up based on bulk concentrations not exceeding 5 and 15 pCi/g in 15-cm surface and subsurface depth increments, averaged over 100-m² grid blocks, where the parent Ra-226 concentrations are greater than, or in secular equilibrium with, the Th-230 parent. A bulk interpretation of these EPA standards has been accepted by the Nuclear Regulatory Commission (NRC), and while the concentration of the finer-sized soil fraction less than a No. 4 mesh sieve contains the higher concentration of radioactivity, the bulk approach in effect integrates the total sample radioactivity over the entire sample mass. In locations where Th-230 has differentially migrated in subsoil relative to Ra-226, a Th-230 cleanup protocol has been developed in accordance with the Supplemental Standard provisions of 40 CFR 192 for NRC/Colorado Department of Health (CDH) approval for timely implementation. Detailed elements of the protocol are contained in Appendix A, Generic Protocol for Thorium-230 Cleanup/Verification at UMTRA Project Processing Sites. The cleanup of other radionuclides or nonradiological hazards that pose a significant threat to the public and the environment will be determined and implemented in accordance with pathway analysis to assess impacts and the implications of ALARA specified in 40 CFR 192 relative to supplemental standards.
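
    The bulk averaging rule described above can be illustrated with a trivial compliance check; the sample concentrations below are invented, and the sketch ignores the Th-230 supplemental-standard provisions.

        # Illustrative check of the bulk Ra-226 criterion: 15-cm depth-increment
        # concentrations averaged over a 100-m2 grid block must not exceed
        # 5 pCi/g (surface) or 15 pCi/g (subsurface). Values are hypothetical.
        def grid_block_compliant(samples_pci_per_g, surface: bool) -> bool:
            limit = 5.0 if surface else 15.0
            average = sum(samples_pci_per_g) / len(samples_pci_per_g)
            return average <= limit

        surface_samples = [3.2, 4.8, 6.1, 4.0]      # pCi/g in 15-cm surface increments
        print(grid_block_compliant(surface_samples, surface=True))   # True: mean ~4.5 <= 5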

  16. New method development in prehistoric stone tool research: evaluating use duration and data analysis protocols.

    Science.gov (United States)

    Evans, Adrian A; Macdonald, Danielle A; Giusca, Claudiu L; Leach, Richard K

    2014-10-01

    Lithic microwear is a research field of prehistoric stone tool (lithic) analysis that has been developed with the aim to identify how stone tools were used. It has been shown that laser scanning confocal microscopy has the potential to be a useful quantitative tool in the study of prehistoric stone tool function. In this paper, two important lines of inquiry are investigated: (1) whether the texture of worn surfaces is constant under varying durations of tool use, and (2) the development of rapid objective data analysis protocols. This study reports on the attempt to further develop these areas of study and results in a better understanding of the complexities underlying the development of flexible analytical algorithms for surface analysis. The results show that when sampling is optimised, surface texture may be linked to contact material type, independent of use duration. Further research is needed to validate this finding and test an expanded range of contact materials. The use of automated analytical protocols has shown promise but is only reliable if sampling location and scale are defined. Results suggest that the sampling protocol reports on the degree of worn surface invasiveness, complicating the ability to investigate duration related textural characterisation. Copyright © 2014. Published by Elsevier Ltd.

  17. RCRA groundwater data analysis protocol for the Hanford Site, Washington

    International Nuclear Information System (INIS)

    Chou, C.J.; Jackson, R.L.

    1992-04-01

    The Resource Conservation and Recovery Act of 1976 (RCRA) groundwater monitoring program currently involves site-specific monitoring of 20 facilities on the Hanford Site in southeastern Washington. The RCRA groundwater monitoring program has collected abundant data on groundwater quality. These data are used to assess the impact of a facility on groundwater quality or whether remediation efforts under RCRA corrective action programs are effective. Both evaluations rely on statistical analysis of groundwater monitoring data. The need for information on groundwater quality by regulators and environmental managers makes statistical analysis of monitoring data an important part of RCRA groundwater monitoring programs. The complexity of groundwater monitoring programs and variabilities (spatial, temporal, and analytical) exhibited in groundwater quality variables indicate the need for a data analysis protocol to guide statistical analysis. A data analysis protocol was developed from the perspective of addressing regulatory requirements, data quality, and management information needs. This data analysis protocol contains four elements: data handling methods; graphical evaluation techniques; statistical tests for trend, central tendency, and excursion analysis; and reporting procedures for presenting results to users
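
    The trend element of such a protocol is often implemented with a nonparametric test; the sketch below shows a generic Mann-Kendall trend test (simplified: no tie correction, no adjustment for serial correlation) as an illustration, not the Hanford protocol's actual procedure, and the concentration series is invented.

        # Hedged sketch of a Mann-Kendall trend test of the kind a groundwater
        # data analysis protocol might use (simplified: ties and serial
        # correlation are ignored). The data are hypothetical.
        import math

        def mann_kendall(series):
            n = len(series)
            s = 0
            for i in range(n - 1):
                for j in range(i + 1, n):
                    diff = series[j] - series[i]
                    s += (diff > 0) - (diff < 0)
            var_s = n * (n - 1) * (2 * n + 5) / 18.0
            if s > 0:
                z = (s - 1) / math.sqrt(var_s)
            elif s < 0:
                z = (s + 1) / math.sqrt(var_s)
            else:
                z = 0.0
            return s, z  # |z| > 1.96 suggests a monotonic trend at ~5% significance

        # Hypothetical quarterly concentrations (mg/L) from one monitoring well
        concentrations = [2.1, 2.3, 2.2, 2.6, 2.8, 3.0, 2.9, 3.3]
        print(mann_kendall(concentrations))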

  18. Use of a standardized JaCVAM in vivo rat comet assay protocol to assess the genotoxicity of three coded test compounds; ampicillin trihydrate, 1,2-dimethylhydrazine dihydrochloride, and N-nitrosodimethylamine.

    Science.gov (United States)

    McNamee, J P; Bellier, P V

    2015-07-01

    As part of the Japanese Center for the Validation of Alternative Methods (JaCVAM)-initiated international validation study of the in vivo rat alkaline comet assay (comet assay), our laboratory examined ampicillin trihydrate (AMP), 1,2-dimethylhydrazine dihydrochloride (DMH), and N-nitrosodimethylamine (NDA) using a standard comet assay validation protocol (v14.2) developed by the JaCVAM validation management team (VMT). Coded samples were received by our laboratory along with basic MSDS information. Solubility analysis and range-finding experiments of the coded test compounds were conducted for dose selection. Animal dosing schedules, the comet assay processing and analysis, and statistical analysis were conducted in accordance with the standard protocol. Based upon our blinded evaluation, AMP was not found to exhibit evidence of genotoxicity in either the rat liver or stomach. However, both NDA and DMH were observed to cause a significant increase in % tail DNA in the rat liver at all dose levels tested. While acute hepatotoxicity was observed for these compounds in the high dose group, in the investigators' opinion there were a sufficient number of consistently damaged/measurable cells in the medium and low dose groups to judge these compounds as genotoxic. There was no evidence of genotoxicity from either NDA or DMH in the rat stomach. In conclusion, our laboratory observed increased DNA damage from two blinded test compounds in rat liver (later identified as genotoxic carcinogens), while no evidence of genotoxicity was observed for the third blinded test compound (later identified as a non-genotoxic, non-carcinogen). These data support the use of a standardized protocol of the in vivo comet assay as a cost-effective alternative genotoxicity assay for regulatory testing purposes. Crown Copyright © 2015. Published by Elsevier B.V. All rights reserved.

  19. Evaluation of an app-based stress protocol

    Directory of Open Access Journals (Sweden)

    Noeh Claudius

    2016-09-01

    Stress is a major influence on the quality of life in our fast-moving society. This paper describes a standardized and contemporary protocol that is capable of inducing moderate psychological stress in a laboratory setting. Furthermore, it evaluates its effects on physiological biomarkers. The protocol, called “THM-Stresstest”, mainly consists of a rest period (30 min), an app-based stress test under the surveillance of an audience (4 min) and a regeneration period (32 min). We investigated 12 subjects to evaluate the developed protocol. We could show significant changes in heart rate variability, electromyography, electrodermal activity and salivary cortisol and α-amylase. From these data we conclude that the THM-Stresstest can serve as a psychobiological tool for provoking responses in the cardiovascular, endocrine and exocrine systems as well as the sympathetic part of the central nervous system.

  20. Thermal/optical methods for elemental carbon quantification in soils and urban dusts: equivalence of different analysis protocols.

    Directory of Open Access Journals (Sweden)

    Yongming Han

    Quantifying elemental carbon (EC) content in geological samples is challenging due to interferences of crustal, salt, and organic material. Thermal/optical analysis, combined with acid pretreatment, represents a feasible approach. However, the consistency of various thermal/optical analysis protocols for this type of samples has never been examined. In this study, urban street dust and soil samples from Baoji, China were pretreated with acids and analyzed with four thermal/optical protocols to investigate how analytical conditions and optical correction affect EC measurement. The EC values measured with reflectance correction (ECR) were found always higher and less sensitive to temperature program than the EC values measured with transmittance correction (ECT). A high-temperature method with extended heating times (STN120) showed the highest ECT/ECR ratio (0.86) while a low-temperature protocol (IMPROVE-550), with heating time adjusted for sample loading, showed the lowest (0.53). STN ECT was higher than IMPROVE ECT, in contrast to results from aerosol samples. A higher peak inert-mode temperature and extended heating times can elevate ECT/ECR ratios for pretreated geological samples by promoting pyrolyzed organic carbon (PyOC) removal over EC under trace levels of oxygen. Considering that PyOC within filter increases ECR while decreases ECT from the actual EC levels, simultaneous ECR and ECT measurements would constrain the range of EC loading and provide information on method performance. Further testing with standard reference materials of common environmental matrices supports the findings. Char and soot fractions of EC can be further separated using the IMPROVE protocol. The char/soot ratio was lower in street dusts (2.2 on average) than in soils (5.2 on average), most likely reflecting motor vehicle emissions. The soot concentrations agreed with EC from CTO-375, a pure thermal method.

  1. Thermal/optical methods for elemental carbon quantification in soils and urban dusts: equivalence of different analysis protocols.

    Science.gov (United States)

    Han, Yongming; Chen, Antony; Cao, Junji; Fung, Kochy; Ho, Fai; Yan, Beizhan; Zhan, Changlin; Liu, Suixin; Wei, Chong; An, Zhisheng

    2013-01-01

    Quantifying elemental carbon (EC) content in geological samples is challenging due to interferences of crustal, salt, and organic material. Thermal/optical analysis, combined with acid pretreatment, represents a feasible approach. However, the consistency of various thermal/optical analysis protocols for this type of samples has never been examined. In this study, urban street dust and soil samples from Baoji, China were pretreated with acids and analyzed with four thermal/optical protocols to investigate how analytical conditions and optical correction affect EC measurement. The EC values measured with reflectance correction (ECR) were found always higher and less sensitive to temperature program than the EC values measured with transmittance correction (ECT). A high-temperature method with extended heating times (STN120) showed the highest ECT/ECR ratio (0.86) while a low-temperature protocol (IMPROVE-550), with heating time adjusted for sample loading, showed the lowest (0.53). STN ECT was higher than IMPROVE ECT, in contrast to results from aerosol samples. A higher peak inert-mode temperature and extended heating times can elevate ECT/ECR ratios for pretreated geological samples by promoting pyrolyzed organic carbon (PyOC) removal over EC under trace levels of oxygen. Considering that PyOC within filter increases ECR while decreases ECT from the actual EC levels, simultaneous ECR and ECT measurements would constrain the range of EC loading and provide information on method performance. Further testing with standard reference materials of common environmental matrices supports the findings. Char and soot fractions of EC can be further separated using the IMPROVE protocol. The char/soot ratio was lower in street dusts (2.2 on average) than in soils (5.2 on average), most likely reflecting motor vehicle emissions. The soot concentrations agreed with EC from CTO-375, a pure thermal method.

  2. Establishment of reference intervals of clinical chemistry analytes for the adult population in Saudi Arabia: a study conducted as a part of the IFCC global study on reference values.

    Science.gov (United States)

    Borai, Anwar; Ichihara, Kiyoshi; Al Masaud, Abdulaziz; Tamimi, Waleed; Bahijri, Suhad; Armbuster, David; Bawazeer, Ali; Nawajha, Mustafa; Otaibi, Nawaf; Khalil, Haitham; Kawano, Reo; Kaddam, Ibrahim; Abdelaal, Mohamed

    2016-05-01

    This study is a part of the IFCC global study to derive reference intervals (RIs) for 28 chemistry analytes in Saudis. Healthy individuals (n=826) aged ≥18 years were recruited using the global study protocol. All specimens were measured using an Architect analyzer. RIs were derived by both parametric and non-parametric methods for comparative purposes. The need for secondary exclusion of reference values based on the latent abnormal values exclusion (LAVE) method was examined. The magnitude of variation attributable to gender, age and region was calculated as the standard deviation ratio (SDR). Sources of variation (age, BMI, physical exercise and smoking levels) were investigated using multiple regression analysis. SDRs for gender, age and regional differences were significant for 14, 8 and 2 analytes, respectively. BMI-related changes in test results were noted conspicuously for CRP. For some metabolism-related parameters the ranges of RIs by the non-parametric method were wider than by the parametric method, and RIs derived using the LAVE method were significantly different from those derived without it. RIs were derived with and without gender partitioning (BMI, drugs and supplements were considered). RIs applicable to Saudis were established for the majority of chemistry analytes, whereas gender, regional and age RI partitioning was required for some analytes. The elevated upper limits of metabolic analytes reflect the high prevalence of metabolic syndrome in the Saudi population.
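
    For illustration of the two derivation routes compared above, the sketch below computes a central 95% reference interval both parametrically and non-parametrically on simulated data; it omits the LAVE secondary-exclusion step and the partitioning decisions used in the actual study.

        # Hedged sketch: reference interval derived by a parametric and a
        # non-parametric method, as contrasted in the study. Data are simulated;
        # the LAVE exclusion step and partitioning are not reproduced here.
        import numpy as np

        rng = np.random.default_rng(0)
        values = rng.lognormal(mean=1.0, sigma=0.25, size=826)  # hypothetical analyte results

        # Parametric: mean +/- 1.96 SD on log-transformed values, back-transformed
        log_v = np.log(values)
        parametric = (np.exp(log_v.mean() - 1.96 * log_v.std(ddof=1)),
                      np.exp(log_v.mean() + 1.96 * log_v.std(ddof=1)))

        # Non-parametric: 2.5th and 97.5th percentiles
        nonparametric = np.percentile(values, [2.5, 97.5])

        print("parametric RI:     %.2f - %.2f" % parametric)
        print("non-parametric RI: %.2f - %.2f" % tuple(nonparametric))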

  3. Yarn supplier selection using analytical hierarchy process (AHP) and standardized unitless rating (SUR) method on textile industry

    Science.gov (United States)

    Erfaisalsyah, M. H.; Mansur, A.; Khasanah, A. U.

    2017-11-01

    For a company engaged in the textile field, selecting the supplier of raw materials for production is an important part of supply chain management that can affect the company's business processes. This study aims to identify the best suppliers of yarn raw material for PC. PKBI based on several criteria. In this study, the integration of the Analytical Hierarchy Process (AHP) and the Standardized Unitless Rating (SUR) method is used to assess the performance of the suppliers. AHP provides the relative weight of each criterion, while SUR ranks the suppliers by performance value. The resulting supplier ranking can be used to identify the strengths and weaknesses of each supplier against the performance criteria. From the final result, it can be determined which suppliers should improve their performance in order to build long-term cooperation with the company.
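
    As a rough sketch of the AHP weighting step described above (the criteria, pairwise judgements and random-index value are hypothetical, not taken from the study), the priority weights can be approximated from a pairwise comparison matrix as follows.

        # Hedged sketch of the AHP step: approximate priority vector from a
        # pairwise comparison matrix plus a simple consistency check.
        # Criteria names and judgements are hypothetical.
        import numpy as np

        criteria = ["price", "quality", "delivery"]
        # Saaty 1-9 scale; A[i][j] = importance of criterion i over criterion j
        A = np.array([
            [1.0, 3.0, 5.0],
            [1 / 3, 1.0, 3.0],
            [1 / 5, 1 / 3, 1.0],
        ])

        # Approximate principal eigenvector: normalize columns, average the rows
        weights = (A / A.sum(axis=0)).mean(axis=1)

        # Consistency ratio (0.58 is the usual random index for a 3x3 matrix)
        n = len(criteria)
        lambda_max = (A @ weights / weights).mean()
        CR = ((lambda_max - n) / (n - 1)) / 0.58
        print(dict(zip(criteria, weights.round(3))), "CR = %.3f" % CR)  # CR < 0.1 is acceptable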

  4. Lead isotope analyses of standard rock samples

    International Nuclear Information System (INIS)

    Koide, Yoshiyuki; Nakamura, Eizo

    1990-01-01

    New results on the lead isotope compositions of standard rock samples and the analytical procedures used are reported. A bromide-form anion exchange chromatography technique was adopted for the chemical separation of lead from the rock samples. The lead contamination during the whole analytical procedure was low enough to determine the lead isotope composition of common natural rocks. The silica-gel activator method was applied for emission of lead ions in the mass spectrometer. Using the data reduction of 'unfractionated ratios', we obtained good reproducibility, precision and accuracy for the lead isotope compositions of the NBS SRMs. Here we present new reliable lead isotope compositions of GSJ standard rock samples and the USGS standard rock BCR-1. (author)

  5. Nursing Music Protocol and Postoperative Pain.

    Science.gov (United States)

    Poulsen, Michael J; Coto, Jeffrey

    2018-04-01

    Pain has always been a major concern for patients and nurses during the postoperative period. Therapies, medicines, and protocols have been developed to manage pain and anxiety but carry undesirable risks for the patient. Complementary and alternative medicine therapies have been studied but have not been applied as regular protocols in the hospital setting. Music is one type of complementary and alternative medicine therapy that has been reported to have favorable results in reducing postoperative pain, anxiety, and opioid usage. However, music lacks a protocol that nurses can implement during the perioperative process. This paper is an in-depth literature review assessing a best practice recommendation and protocol that establishes a consensus in the use of music therapy. The results suggest that music therapy may consist of calming, soft tones of 60-80 beats per minute for at least 15-30 minutes at least twice daily during the pre- and postoperative periods. It is suggested that music only be used in conjunction with standards of care and not as the primary intervention for pain or anxiety. This evidence suggests that proper use of music therapy can significantly reduce surgical pain. Implementing these protocols and allowing nursing staff the freedom to use them may lead to greater reductions in surgical pain and anxiety and a reduction in opioid use. Copyright © 2017 American Society for Pain Management Nursing. Published by Elsevier Inc. All rights reserved.

  6. Standards and Protocols for Digital Libraries Dijital Kütüphanelerde Standartlar ve Protokoller

    Directory of Open Access Journals (Sweden)

    Mehmet Emin Küçük

    2003-06-01

    New paradigms have emerged in the fields of librarianship and publishing with the developments in the electronic environment and the rapid increase in electronically archived information and its acquisition and retrieval. While libraries show more interest in electronic information, publishers' current tendency is towards the production of electronic information. In addition to the acquisition of electronic information, libraries have begun to digitise some of their holdings which do not present copyright problems. The economies provided by digitised information in terms of archiving, sharing and retrieval, as well as the users' requirements, are the basic motivation for digitisation. However, digital library standards play a vital role in effective library cooperation and interoperability between systems. In this descriptive study, the most commonly encountered standards in digital library applications are examined and evaluated under the following headings: (i) record structure standards, (ii) encoding standards, and (iii) communication standards and protocols. Metadata standards, which could be categorised under record structure standards, are not included since they are well studied in several papers in Turkish. Developments in the electronic environment and the increase in electronically stored information have brought about significant changes in the acquisition, storage and retrieval of information, leading to new paradigms in publishing and librarianship. While publishers have begun to tend towards producing resources electronically, libraries have shown greater interest in electronic resources. In addition to acquiring electronically produced resources, a number of libraries have begun to digitise copyright-free materials in their collections and offer them to their users. Dijital ortamda bilginin çok daha rahat depolanması, payla

  7. A protocol for amide bond formation with electron deficient amines and sterically hindered substrates

    DEFF Research Database (Denmark)

    Due-Hansen, Maria E; Pandey, Sunil K; Christiansen, Elisabeth

    2016-01-01

    A protocol for amide coupling by in situ formation of acyl fluorides and reaction with amines at elevated temperature has been developed and found to be efficient for coupling of sterically hindered substrates and electron deficient amines where standard methods failed.

  8. Developing protocols for geochemical baseline studies: An example from the Coles Hill uranium deposit, Virginia, USA

    International Nuclear Information System (INIS)

    Levitan, Denise M.; Schreiber, Madeline E.; Seal, Robert R.; Bodnar, Robert J.; Aylor, Joseph G.

    2014-01-01

    Highlights: • We outline protocols for baseline geochemical surveys of stream sediments and water. • Regression on order statistics was used to handle non-detect data. • U concentrations in stream water near this unmined ore were below regulatory standards. • Concentrations of major and trace elements were correlated with stream discharge. • Methods can be applied to other extraction activities, including hydraulic fracturing. - Abstract: In this study, we determined baseline geochemical conditions in stream sediments and surface waters surrounding an undeveloped uranium deposit. Emphasis was placed on study design, including site selection to encompass geological variability and temporal sampling to encompass hydrological and climatic variability, in addition to statistical methods for baseline data analysis. The concentrations of most elements in stream sediments were above analytical detection limits, making them amenable to standard statistical analysis. In contrast, some trace elements in surface water had concentrations that were below the respective detection limits, making statistical analysis more challenging. We describe and compare statistical methods appropriate for concentrations that are below detection limits (non-detect data) and conclude that regression on order statistics provided the most rigorous analysis of our results, particularly for trace elements. Elevated concentrations of U and deposit-associated elements (e.g. Ba, Pb, and V) were observed in stream sediments and surface waters downstream of the deposit, but concentrations were below regulatory guidelines for the protection of aquatic ecosystems and for drinking water. Analysis of temporal trends indicated that concentrations of major and trace elements were most strongly related to stream discharge. These findings highlight the need for sampling protocols that will identify and evaluate the temporal and spatial variations in a thorough baseline study
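
    The regression-on-order-statistics treatment of non-detect data mentioned above can be sketched as follows for the simple case of a single detection limit and a lognormal assumption; the plotting positions, values and imputation shortcut are all simplifications for illustration.

        # Hedged sketch of regression on order statistics (ROS) for censored
        # (non-detect) data, simplified to one detection limit and a lognormal
        # assumption. All values are hypothetical.
        import numpy as np
        from scipy import stats

        detection_limit = 0.5                      # ug/L, hypothetical
        detects = np.sort(np.array([0.6, 0.8, 1.1, 1.5, 2.3, 3.0]))
        n_nondetects = 4
        n = len(detects) + n_nondetects

        # Blom-type plotting positions for the full ranked sample
        ranks = np.arange(1, n + 1)
        z = stats.norm.ppf((ranks - 0.375) / (n + 0.25))

        # With a single detection limit the non-detects occupy the lowest ranks
        z_nondetects, z_detects = z[:n_nondetects], z[n_nondetects:]

        # Fit log(concentration) vs. normal quantile using detected values only
        fit = stats.linregress(z_detects, np.log(detects))

        # Impute the censored observations from the fitted line
        imputed = np.exp(fit.intercept + fit.slope * z_nondetects)
        full_sample = np.concatenate([imputed, detects])
        print("ROS mean = %.3f ug/L, sd = %.3f ug/L" % (full_sample.mean(), full_sample.std(ddof=1)))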

  9. Assessment of Grade of Dysphonia and Correlation With Quality of Life Protocol.

    Science.gov (United States)

    Spina, Ana Lúcia; Crespo, Agrício Nubiato

    2017-03-01

    The main objective of this study is to check the correlation between vocal self-assessment and results of the Voice-Related Quality of Life (V-RQOL) protocol, and whether there is a correlation between perceptual vocal assessment made by voice therapists and the results from the V-RQOL protocol. The study included 245 subjects with vocal complaints. This was a prospective analytical clinical study. Vocal perceptual assessment of each subject with dysphonia was made by three voice therapists, followed by self-assessment made by the subjects themselves, and the application of the V-RQOL protocol. The results have shown a poor level of agreement between the vocal assessment made by the voice therapists and the self-assessment made by the subjects. The statistical analysis indicated that the results of the V-RQOL protocol showed a significant correlation with the vocal assessment made by the voice therapists and the self-assessment by the subjects. The agreement between the assessments was low and variable; age, gender, professional voice use, and clinical laryngoscopic diagnosis did not influence the agreement level. The V-RQOL protocol is sensitive to the vocal assessment made by the voice therapists and the self-assessment made by the patient. Copyright © 2017 The Voice Foundation. Published by Elsevier Inc. All rights reserved.

  10. H-point standard additions method for simultaneous determination of sulfamethoxazole and trimethoprim in pharmaceutical formulations and biological fluids with simultaneous addition of two analytes

    Science.gov (United States)

    Givianrad, M. H.; Saber-Tehrani, M.; Aberoomand-Azar, P.; Mohagheghian, M.

    2011-03-01

    The applicability of the H-point standard additions method (HPSAM) to the resolution of the overlapping spectra corresponding to sulfamethoxazole and trimethoprim is verified by UV-vis spectrophotometry. The results show that the H-point standard additions method with simultaneous addition of both analytes is suitable for the simultaneous determination of sulfamethoxazole and trimethoprim in aqueous media. The results of applying the H-point standard additions method showed that the two drugs could be determined simultaneously with concentration ratios of sulfamethoxazole to trimethoprim varying from 1:18 to 16:1 in the mixed samples. Also, the limits of detection were 0.58 and 0.37 μmol L⁻¹ for sulfamethoxazole and trimethoprim, respectively. In addition, the mean calculated RSDs (%) were 1.63 and 2.01 for SMX and TMP, respectively, in synthetic mixtures. The proposed method has been successfully applied to the simultaneous determination of sulfamethoxazole and trimethoprim in some synthetic, pharmaceutical formulation and biological fluid samples.
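
    Schematically, the HPSAM evaluation described above fits the two standard-addition lines (absorbance versus added analyte at the two selected wavelengths) and takes the abscissa of their intersection, with sign reversed, as the analyte concentration in the sample; the absorbance values below are invented for illustration.

        # Hedged sketch of the H-point standard additions calculation: locate
        # the intersection (H-point) of the two standard-addition lines.
        # All absorbance readings are hypothetical.
        import numpy as np

        added = np.array([0.0, 2.0, 4.0, 6.0, 8.0])                     # added analyte, umol/L
        abs_lambda1 = np.array([0.390, 0.470, 0.550, 0.630, 0.710])     # wavelength 1
        abs_lambda2 = np.array([0.305, 0.341, 0.377, 0.413, 0.449])     # wavelength 2

        m1, b1 = np.polyfit(added, abs_lambda1, 1)
        m2, b2 = np.polyfit(added, abs_lambda2, 1)

        x_h = (b2 - b1) / (m1 - m2)      # intersection abscissa (negative)
        c_analyte = -x_h                 # estimated analyte concentration in the sample
        a_h = m1 * x_h + b1              # common absorbance at the H-point
        print("estimated analyte concentration: %.2f umol/L (A_H = %.3f)" % (c_analyte, a_h))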

  11. Application of X-ray fluorescence analytical techniques in phytoremediation and plant biology studies

    International Nuclear Information System (INIS)

    Necemer, Marijan; Kump, Peter; Scancar, Janez; Jacimovic, Radojko; Simcic, Jurij; Pelicon, Primoz; Budnar, Milos; Jeran, Zvonka; Pongrac, Paula; Regvar, Marjana; Vogel-Mikus, Katarina

    2008-01-01

    Phytoremediation is an emerging technology that employs the use of higher plants for the clean-up of contaminated environments. Progress in the field is, however, handicapped by limited knowledge of the biological processes involved in plant metal uptake, translocation, tolerance and plant-microbe-soil interactions; therefore a better understanding of the basic biological mechanisms involved in plant/microbe/soil/contaminant interactions would allow further optimization of phytoremediation technologies. In view of the needs of global environmental protection, it is important that in phytoremediation and plant biology studies the analytical procedures for elemental determination in plant tissues and soil should be fast and cheap, with simple sample preparation, and of adequate accuracy and reproducibility. The aim of this study was therefore to present the main characteristics, sample preparation protocols and applications of X-ray fluorescence-based analytical techniques (energy dispersive X-ray fluorescence spectrometry-EDXRF, total reflection X-ray fluorescence spectrometry-TXRF and micro-proton induced X-ray emission-micro-PIXE). Element concentrations in plant leaves from metal-polluted and non-polluted sites, as well as standard reference materials, were analyzed by the mentioned techniques, and additionally by instrumental neutron activation analysis (INAA) and atomic absorption spectrometry (AAS). The results were compared and critically evaluated in order to assess the performance and capability of X-ray fluorescence-based techniques in phytoremediation and plant biology studies. EDXRF is recommended as suitable for the analysis of large numbers of samples, because it is multi-elemental, requires only simple preparation of the sample material, and is analytically comparable to the most frequently used instrumental chemical techniques. TXRF is compatible with FAAS in sample preparation, but relative to AAS it is fast, sensitive and

  12. Safe bunker designing for the 18 MV Varian 2100 Clinac: a comparison between Monte Carlo simulation based upon data and new protocol recommendations.

    Science.gov (United States)

    Beigi, Manije; Afarande, Fatemeh; Ghiasi, Hosein

    2016-01-01

    The aim of this study was to compare two bunkers, one designed using only protocol recommendations and one using Monte Carlo (MC)-derived data, for an 18 MV Varian 2100 Clinac accelerator. High-energy radiation therapy is associated with fast and thermal photoneutrons. Adequate shielding against the contaminant neutrons is recommended by the new IAEA and NCRP protocols. The latest protocols released by the IAEA (Safety Report No. 47) and NCRP Report No. 151 were used for the bunker design calculations, and MC-based data were also derived. Two bunkers, one based on the protocols and one on the MC data, were designed and discussed. For the door, the designs obtained from the MC simulation and from the Wu-McGinley analytical method were close in both BPE and lead thickness. In the case of the primary and secondary barriers, MC simulation resulted in 440.11 mm of ordinary concrete; a total concrete thickness of 1709 mm was required. Calculating the same parameters with the recommended analytical methods resulted in a required thickness of 1762 mm, using the recommended TVL of 445 mm for concrete. Additionally, for the secondary barrier a thickness of 752.05 mm was obtained. Our results showed that the MC simulation and the protocol recommendations are in good agreement for the contamination dose calculation. The differences between the analytical and MC simulation methods revealed that the application of only one method for bunker design may lead to underestimation or overestimation in dose and shielding calculations.
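
    The analytical barrier calculations referred to above generally follow the NCRP 151 / IAEA Safety Report 47 approach of converting a required transmission factor into a number of tenth-value layers; the sketch below uses that generic formula with hypothetical workload, use, occupancy, distance and TVL values rather than the paper's inputs.

        # Hedged sketch of a TVL-based primary-barrier thickness estimate in the
        # spirit of NCRP 151 / IAEA SRS 47. All numerical inputs are hypothetical.
        import math

        P = 0.1       # weekly design dose limit at the occupied point, mSv/week
        d = 6.0       # target-to-occupied-point distance, m
        W = 500.0     # weekly workload at 1 m, Gy/week
        U = 0.25      # use factor for this barrier
        T = 1.0       # occupancy factor

        B = P * d ** 2 / (W * 1000.0 * U * T)   # required transmission (1 Gy taken as 1000 mSv)
        n_tvl = -math.log10(B)                  # number of tenth-value layers

        TVL1, TVLe = 0.47, 0.43                 # hypothetical concrete TVLs for an 18 MV beam, m
        thickness = TVL1 + (n_tvl - 1) * TVLe
        print("B = %.2e, n = %.2f TVLs, concrete thickness = %.2f m" % (B, n_tvl, thickness))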

  13. Getting started with Greenplum for big data analytics

    CERN Document Server

    Gollapudi, Sunila

    2013-01-01

    A standard tutorial-based approach. "Getting Started with Greenplum for Big Data Analytics" is great for data scientists and data analysts with a basic knowledge of Data Warehousing and Business Intelligence platforms who are new to Big Data and who are looking to get a good grounding in how to use the Greenplum Platform. It is assumed that you will have some experience with database design and programming as well as be familiar with analytics tools like R and Weka.

  14. Analytic of elements for the determination of soil->plant transfer factors

    International Nuclear Information System (INIS)

    Liese, T.

    1985-02-01

    This article describes a part of the conventional analytical work which was done to determine soil-to-plant transfer factors. The analytical methods, the experiments to find out the best way of sample digestion and the resulting analytical procedures are described. The analytical methods are graphite furnace atomic absorption spectrometry (GFAAS) and inductively coupled plasma atomic emission spectrometry (ICP-AES). In the case of ICP-AES, the necessity of proper background correction and correction of spectral interferences is shown. The reliability of the analytical procedure is demonstrated by measuring different kinds of standard reference materials and by comparison of AAS and AES. (orig./HP)

  15. Streetlight Control System Based on Wireless Communication over DALI Protocol

    Science.gov (United States)

    Bellido-Outeiriño, Francisco José; Quiles-Latorre, Francisco Javier; Moreno-Moreno, Carlos Diego; Flores-Arias, José María; Moreno-García, Isabel; Ortiz-López, Manuel

    2016-01-01

    Public lighting represents a large part of the energy consumption of towns and cities. Efficient management of public lighting can entail significant energy savings. This work presents a smart system for managing public lighting networks based on wireless communication and the DALI protocol. Wireless communication entails significant economic savings, as there is no need to install new wiring and visual impacts and damage to the facades of historical buildings in city centers are avoided. The DALI protocol uses bidirectional communication with the ballast, which allows its status to be controlled and monitored at all times. The novelty of this work is that it tackles all aspects related to the management of public lighting: a standard protocol, DALI, was selected to control the ballast, a wireless node based on the IEEE 802.15.4 standard with a DALI interface was designed, a network layer that considers the topology of the lighting network has been developed, and lastly, some user-friendly applications for the control and maintenance of the system by the technical crews of the different towns and cities have been developed. PMID:27128923

  16. Streetlight Control System Based on Wireless Communication over DALI Protocol.

    Science.gov (United States)

    Bellido-Outeiriño, Francisco José; Quiles-Latorre, Francisco Javier; Moreno-Moreno, Carlos Diego; Flores-Arias, José María; Moreno-García, Isabel; Ortiz-López, Manuel

    2016-04-27

    Public lighting represents a large part of the energy consumption of towns and cities. Efficient management of public lighting can entail significant energy savings. This work presents a smart system for managing public lighting networks based on wireless communication and the DALI protocol. Wireless communication entails significant economic savings, as there is no need to install new wiring and visual impacts and damage to the facades of historical buildings in city centers are avoided. The DALI protocol uses bidirectional communication with the ballast, which allows its status to be controlled and monitored at all times. The novelty of this work is that it tackles all aspects related to the management of public lighting: a standard protocol, DALI, was selected to control the ballast, a wireless node based on the IEEE 802.15.4 standard with a DALI interface was designed, a network layer that considers the topology of the lighting network has been developed, and lastly, some user-friendly applications for the control and maintenance of the system by the technical crews of the different towns and cities have been developed.
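
    To make the ballast-control side of such a system more concrete, the sketch below encodes a DALI forward frame (one start bit, an address byte and a command byte, Manchester/bi-phase coded) as it is commonly described for the DALI bus; the address and command values are arbitrary examples, and transport of the frame over the IEEE 802.15.4 link is not shown.

        # Hedged sketch: Manchester (bi-phase) encoding of a 16-bit DALI forward
        # frame (start bit + address byte + command byte). Values are arbitrary;
        # the wireless hop described in the paper is out of scope here.

        def manchester(bit):
            # Common DALI convention: logical 1 -> low-to-high, logical 0 -> high-to-low
            return (0, 1) if bit else (1, 0)

        def dali_forward_frame(address_byte, command_byte):
            bits = [1]                                               # start bit
            for byte in (address_byte, command_byte):
                bits += [(byte >> i) & 1 for i in range(7, -1, -1)]  # MSB first
            half_bits = []
            for b in bits:
                half_bits.extend(manchester(b))
            half_bits += [1, 1, 1, 1]                                # idle-high stop condition
            return half_bits

        # Example: broadcast-style command frame (values purely illustrative)
        frame = dali_forward_frame(0xFF, 0x05)
        print(len(frame), "half-bit levels:", frame[:10], "...")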

  17. May the Kyoto protocol produce results?

    International Nuclear Information System (INIS)

    Jaureguy-Naudin, M.

    2009-01-01

    A poorly managed drastic reduction of greenhouse gas emissions might result in a significant decrease in living standards, but without such reduction efforts climate change might impose costs five to twenty times higher. Thus, while presenting estimated consequences and evolutions of greenhouse gas emissions and temperature, the author stresses the need for emission reduction. She discusses the role of economic instruments that can be used in policies aimed at combating climate change. She recalls the emission reduction commitments specified in the Kyoto protocol, discusses the present status, operation and results of the international emission trading scheme and the lessons learned after the first years of operation, and comments on the involvement of emerging countries in relation to another mechanism defined in the protocol: the Clean Development Mechanism.

  18. A new testing protocol for zirconia dental implants.

    Science.gov (United States)

    Sanon, Clarisse; Chevalier, Jérôme; Douillard, Thierry; Cattani-Lorente, Maria; Scherrer, Susanne S; Gremillard, Laurent

    2015-01-01

    Given the current lack of standards concerning zirconia dental implants, we aim to develop a protocol to validate their functionality and safety prior to their clinical use. The protocol is designed to account for the specific brittle nature of ceramics and the specific behavior of zirconia in terms of phase transformation. Several types of zirconia dental implants with different surface textures (porous, alveolar, rough) were assessed. The implants were first characterized in their as-received state by Scanning Electron Microscopy (SEM), Focused Ion Beam (FIB) and X-Ray Diffraction (XRD). Fracture tests following a method adapted from ISO 14801 were conducted to evaluate their initial mechanical properties. Accelerated aging was performed on the implants, and the XRD monoclinic content was measured directly at their surface instead of on polished samples as in ISO 13356. The implants were then characterized again after aging. Implants with an alveolar surface presented large defects. The protocol shows that such defects compromise the long-term mechanical properties. Implants with a porous surface exhibited sufficient strength but a significant sensitivity to aging. Even though it was associated with microcracking clearly observed by FIB, aging did not decrease the mechanical strength of the implants. As each dental implant company has its own process, all zirconia implants may behave differently, even if the starting powder is the same. In particular, surface modifications have a large influence on strength and aging resistance, which is not taken into account by the current standards. Protocols adapted from this work could be useful. Copyright © 2014 Academy of Dental Materials. Published by Elsevier Ltd. All rights reserved.

  19. Quality Measures in Pre-Analytical Phase of Tissue Processing: Understanding Its Value in Histopathology.

    Science.gov (United States)

    Rao, Shalinee; Masilamani, Suresh; Sundaram, Sandhya; Duvuru, Prathiba; Swaminathan, Rajendiran

    2016-01-01

    Quality monitoring in a histopathology unit is categorized into three phases, pre-analytical, analytical and post-analytical, to cover the various steps in the entire test cycle. A review of the literature on quality evaluation studies pertaining to histopathology revealed that earlier reports were mainly focused on analytical aspects, with limited studies on assessment of the pre-analytical phase. The pre-analytical phase encompasses several processing steps and handling of the specimen/sample by multiple individuals, thus allowing enough scope for errors. Due to its critical nature and the limited studies in the past to assess quality in the pre-analytical phase, it deserves more attention. This study was undertaken to analyse and assess the quality parameters of the pre-analytical phase in a histopathology laboratory. This was a retrospective study of pre-analytical parameters in the histopathology laboratory of a tertiary care centre, covering 18,626 tissue specimens received in 34 months. Registers and records were checked for efficiency and errors for the pre-analytical quality variables: specimen identification, specimen in appropriate fixatives, lost specimens, daily internal quality control performance on staining, performance in an inter-laboratory quality assessment program {External quality assurance program (EQAS)} and evaluation of internal non-conformities (NC) for other errors. The study revealed incorrect specimen labelling in 0.04%, 0.01% and 0.01% of specimens in 2007, 2008 and 2009 respectively. About 0.04%, 0.07% and 0.18% of specimens were not sent in fixatives in 2007, 2008 and 2009 respectively. There was no incidence of specimen loss. A total of 113 non-conformities were identified, of which 92.9% belonged to the pre-analytical phase. The predominant NC (any deviation from the normal standard which may generate an error and compromise quality standards) identified was wrong labelling of slides. Performance in EQAS for the pre-analytical phase was satisfactory in 6 of 9 cycles. A low incidence

  20. Actions of a protocol for radioactive waste management

    International Nuclear Information System (INIS)

    Sousa, Joyce Caroline de Oliveira; Andrade, Idalmar Gomes da Silva; Frazão, Denys Wanderson Pereira; Abreu, Lukas Maxwell Oliveira de; França, Clyslane Alves; Macedo, Paulo de Tarso Silva de

    2017-01-01

    Radioactive wastes are all those materials generated in the various uses of radioactive materials, which cannot be reused and which contain radioactive substances in quantities that cannot be treated as ordinary waste. All management of these wastes must be carried out carefully, including actions ranging from their collection at the point where they are generated to their final destination. However, any and all procedures must be carried out in order to comply with the requirements for the protection of workers, individuals, the public and the environment. The final product of the study was a descriptive tutorial on the procedures and actions of a standard radioactive waste management protocol developed from scientific publications on radiation protection. The management of radioactive waste is one of the essential procedures in the radiological protection of man and the environment wherever radioactive materials are handled. The standard radioactive waste management protocol includes: collection, segregation of the various types of wastes, transport, characterization, treatment, storage and final disposal. The radioactive waste typology affects the sequencing and the way in which the actions are carried out. The standardization of mechanisms in the management of radioactive waste contributes to the radiological safety of all those involved.

  1. Analytical quality control of neutron activation analysis by interlaboratory comparison and proficiency test

    International Nuclear Information System (INIS)

    Kim, S. H.; Moon, J. H.; Jeong, Y. S.

    2002-01-01

    Two air filters (V-50, P-50) artificially loaded with urban dust were provided by the IAEA, and trace elements were determined non-destructively by instrumental neutron activation analysis for an inter-laboratory comparison and proficiency test. The standard reference material Urban Particulate Matter (NIST SRM 1648) of the National Institute of Standards and Technology was used for internal analytical quality control. About 20 elements were determined in each loaded filter sample. Our analytical data were compared with statistical results obtained using neutron activation analysis, particle induced X-ray emission spectrometry, inductively coupled plasma mass spectrometry, etc., which were collected from 49 laboratories in 40 countries. From the statistical re-treatment of the reported values, the Z-scores of our analytical values are within ±2. In addition, the proficiency test was passed, and the accuracy and precision of the analytical values are reliable. Consequently, it was shown that the analytical quality control for the analysis of air dust samples is reasonable.

  2. Establishment of gold-quartz standard GQS-1

    Science.gov (United States)

    Millard, Hugh T.; Marinenko, John; McLane, John E.

    1969-01-01

    A homogeneous gold-quartz standard, GQS-1, was prepared from a heterogeneous gold-bearing quartz by chemical treatment. The concentration of gold in GQS-1 was determined by both instrumental neutron activation analysis and radioisotope dilution analysis to be 2.61 ± 0.10 parts per million. Analysis of 10 samples of the standard by both instrumental neutron activation analysis and radioisotope dilution analysis failed to reveal heterogeneity within the standard. The precision of the analytical methods, expressed as standard error, was approximately 0.1 part per million. The analytical data were also used to estimate the average size of gold particles. The chemical treatment apparently reduced the average diameter of the gold particles by at least an order of magnitude and increased the concentration of gold grains by a factor of at least 4,000.

  3. Protocol - realist and meta-narrative evidence synthesis: Evolving Standards (RAMESES)

    Directory of Open Access Journals (Sweden)

    Westhorp Gill

    2011-08-01

    Abstract Background There is growing interest in theory-driven, qualitative and mixed-method approaches to systematic review as an alternative to (or to extend and supplement) conventional Cochrane-style reviews. These approaches offer the potential to expand the knowledge base in policy-relevant areas - for example by explaining the success, failure or mixed fortunes of complex interventions. However, the quality of such reviews can be difficult to assess. This study aims to produce methodological guidance, publication standards and training resources for those seeking to use the realist and/or meta-narrative approach to systematic review. Methods/design We will: [a] collate and summarise existing literature on the principles of good practice in realist and meta-narrative systematic review; [b] consider the extent to which these principles have been followed by published and in-progress reviews, thereby identifying how rigour may be lost and how existing methods could be improved; [c] using an online Delphi method with an interdisciplinary panel of experts from academia and policy, produce a draft set of methodological steps and publication standards; [d] produce training materials with learning outcomes linked to these steps; [e] pilot these standards and training materials prospectively on real reviews-in-progress, capturing methodological and other challenges as they arise; [f] synthesise expert input, evidence review and real-time problem analysis into more definitive guidance and standards; [g] disseminate outputs to audiences in academia and policy. The outputs of the study will be threefold: 1. Quality standards and methodological guidance for realist and meta-narrative reviews for use by researchers, research sponsors, students and supervisors 2. A 'RAMESES' (Realist And Meta-narrative Evidence Synthesis: Evolving Standards) statement (comparable to CONSORT or PRISMA) of publication standards for such reviews, published in an open

  4. Backpressure-based control protocols: design and computational aspects

    NARCIS (Netherlands)

    Miretskiy, D.I.; Scheinhardt, Willem R.W.; Mandjes, M.R.H.

    2009-01-01

    Congestion control in packet-based networks is often realized by feedback protocols. In this paper we assess their performance under a back-pressure mechanism that has been proposed and standardized for Ethernet metropolitan networks. In such a mechanism the service rate of an upstream queue is

  5. Backpressure-based control protocols: Design and computational aspects

    NARCIS (Netherlands)

    Miretskiy, D.I.; Scheinhardt, W.R.W.; Mandjes, M.R.H.

    2009-01-01

    Congestion control in packet-based networks is often realized by feedback protocols. In this paper we assess their performance under a back-pressure mechanism that has been proposed and standardized for Ethernet metropolitan networks. In such a mechanism the service rate of an upstream queue is
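
    A toy discrete-time illustration of the back-pressure idea assessed above (not the model analyzed in these papers): the upstream queue's service is paused whenever the downstream queue exceeds a high-water mark and resumed once it falls below a low-water mark; all rates and thresholds are invented.

        # Hedged toy simulation of Ethernet-style back-pressure: the downstream
        # queue signals PAUSE above a high threshold and RESUME below a low one,
        # throttling the upstream service. All parameters are invented.
        import random

        random.seed(1)
        HIGH, LOW = 20, 10            # downstream thresholds (packets)
        upstream = downstream = 0
        paused = False

        for t in range(200):
            upstream += random.randint(0, 2)               # arrivals to the upstream queue
            if not paused and upstream > 0:                # upstream serves one packet per slot
                upstream -= 1
                downstream += 1
            if downstream > 0 and random.random() < 0.7:   # downstream service
                downstream -= 1
            if downstream >= HIGH:
                paused = True                              # back-pressure signal sent upstream
            elif downstream <= LOW:
                paused = False

        print("final queue lengths (upstream, downstream):", upstream, downstream)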

  6. Control room envelope unfiltered air inleakage test protocols

    International Nuclear Information System (INIS)

    Lagus, P.L.; Grot, R.A.

    1997-01-01

    In 1983, the Advisory Committee on Reactor Safeguards (ACRS) recommended that the US NRC develop a control room HVAC performance testing protocol. To date no such protocol has been forthcoming. Beginning in mid-1994, an effort was funded by NRC under a Small Business Innovation Research (SBIR) grant to develop several simplified test protocols based on the principles of tracer gas testing in order to measure the total unfiltered inleakage entering a CRE during emergency mode operation of the control room ventilation system. These would allow accurate assessment of unfiltered air inleakage as required in SRP 6.4. The continuing lack of a standard protocol is unfortunate since one of the significant parameters required to calculate operator dose is the amount of unfiltered air inleakage into the control room. Often it is assumed that, if the Control Room Envelope (CRE) is maintained at +1/8 in. w.g. differential pressure relative to the surroundings, no significant unfiltered inleakage can occur; it is further assumed that inleakage due to door openings is the only source of unfiltered air. 23 refs., 13 figs., 2 tabs

  7. Control room envelope unfiltered air inleakage test protocols

    Energy Technology Data Exchange (ETDEWEB)

    Lagus, P.L. [Lagus Applied Technology, San Diego, CA (United States); Grot, R.A. [Lagus Applied Technology, Olney, MD (United States)

    1997-08-01

    In 1983, the Advisory Committee on Reactor Safeguards (ACRS) recommended that the US NRC develop a control room HVAC performance testing protocol. To date no such protocol has been forthcoming. Beginning in mid-1994, an effort was funded by NRC under a Small Business Innovation Research (SBIR) grant to develop several simplified test protocols based on the principles of tracer gas testing in order to measure the total unfiltered inleakage entering a CRE during emergency mode operation of the control room ventilation system. These would allow accurate assessment of unfiltered air inleakage as required in SRP 6.4. The continuing lack of a standard protocol is unfortunate since one of the significant parameters required to calculate operator dose is the amount of unfiltered air inleakage into the control room. Often it is assumed that, if the Control Room Envelope (CRE) is maintained at +1/8 in. w.g. differential pressure relative to the surroundings, no significant unfiltered inleakage can occur; it is further assumed that inleakage due to door openings is the only source of unfiltered air. 23 refs., 13 figs., 2 tabs.

  8. Task Group on Computer/Communication Protocols for Bibliographic Data Exchange. Interim Report = Groupe de Travail sur les Protocoles de Communication/Ordinateurs pour l'Exchange de Donnees Bibliographiques. Rapport d'Etape. May 1983.

    Science.gov (United States)

    Canadian Network Papers, 1983

    1983-01-01

    This preliminary report describes the work to date of the Task Group on Computer/Communication protocols for Bibliographic Data Interchange, which was formed in 1980 to develop a set of protocol standards to facilitate communication between heterogeneous library and information systems within the framework of Open Systems Interconnection (OSI). A…

  9. Analytical performance of 17 general chemistry analytes across countries and across manufacturers in the INPUtS project of EQA organizers in Italy, the Netherlands, Portugal, United Kingdom and Spain.

    Science.gov (United States)

    Weykamp, Cas; Secchiero, Sandra; Plebani, Mario; Thelen, Marc; Cobbaert, Christa; Thomas, Annette; Jassam, Nuthar; Barth, Julian H; Perich, Carmen; Ricós, Carmen; Faria, Ana Paula

    2017-02-01

    Optimum patient care in relation to laboratory medicine is achieved when results of laboratory tests are equivalent, irrespective of the analytical platform used or the country where the laboratory is located. Standardization and harmonization minimize differences and the success of efforts to achieve this can be monitored with international category 1 external quality assessment (EQA) programs. An EQA project with commutable samples, targeted with reference measurement procedures (RMPs), was organized by EQA institutes in Italy, the Netherlands, Portugal, UK, and Spain. Results of 17 general chemistry analytes were evaluated across countries and across manufacturers according to performance specifications derived from biological variation (BV). For K, uric acid, glucose, cholesterol and high-density lipoprotein (HDL) cholesterol, the minimum performance specification was met in all countries and by all manufacturers. For Na, Cl, and Ca, the minimum performance specifications were met by none of the countries and manufacturers. For enzymes, the situation was complicated, as standardization of results of enzymes toward RMPs was still not achieved in 20% of the laboratories and questionable in the remaining 80%. The overall performance of the measurement of 17 general chemistry analytes in European medical laboratories met the minimum performance specifications. In this general picture, there were no significant differences per country and no significant differences per manufacturer. There were major differences between the analytes. There were six analytes for which the minimum quality specifications were not met and manufacturers should improve their performance for these analytes. Standardization of results of enzymes requires ongoing efforts.
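
    The performance specifications "derived from biological variation" mentioned above are commonly computed with Fraser-type formulas, sketched below; the within- and between-subject CVs used here are placeholders, not the values applied in the INPUtS evaluation.

        # Hedged sketch of desirable analytical performance specifications derived
        # from biological variation (Fraser-type formulas). CVi/CVg values below
        # are placeholders, not the study's inputs.
        import math

        def desirable_specs(cv_within, cv_between):
            cv_analytical = 0.5 * cv_within
            bias = 0.25 * math.sqrt(cv_within ** 2 + cv_between ** 2)
            total_error = 1.65 * cv_analytical + bias
            return cv_analytical, bias, total_error

        for analyte, cvi, cvg in [("glucose", 5.6, 7.5), ("sodium", 0.6, 0.7)]:
            cva, bias, tea = desirable_specs(cvi, cvg)
            print(f"{analyte}: CVa <= {cva:.2f}%, bias <= {bias:.2f}%, TEa <= {tea:.2f}%")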

  10. New Communication Network Protocol for a Data Acquisition System

    Science.gov (United States)

    Uchida, T.; Fujii, H.; Nagasaka, Y.; Tanaka, M.

    2006-02-01

    An event builder based on communication networks has been used in high-energy physics experiments, and various networks have been adopted, for example, IEEE 802.3 (Ethernet), asynchronous transfer mode (ATM), and so on. In particular, Ethernet is widely used because its infrastructure is very cost effective. Many systems adopt standard protocols that are designed for a general network. However, in the case of an event builder, the communication pattern between stations is different from that in a general network. This unique communication pattern causes congestion and thus makes it difficult to design the network quantitatively. To solve this problem, we have developed a simple network protocol for a data acquisition (DAQ) system. The protocol is designed to keep the sequence of senders so that no congestion occurs. We implemented the protocol on a small hardware component [a field programmable gate array (FPGA)] and measured its performance, so that it will be ready for a generic DAQ system

  11. CytometryML: a data standard which has been designed to interface with other standards

    Science.gov (United States)

    Leif, Robert C.

    2007-02-01

    Because of the differences in the requirements, needs, and past histories including existing standards of the creating organizations, a single encompassing cytology-pathology standard will not, in the near future, replace the multiple existing or under development standards. Except for DICOM and FCS, these standardization efforts are all based on XML. CytometryML is a collection of XML schemas, which are based on the Digital Imaging and Communications in Medicine (DICOM) and Flow Cytometry Standard (FCS) datatypes. The CytometryML schemas contain attributes that link them to the DICOM standard and FCS. Interoperability with DICOM has been facilitated by, wherever reasonable, limiting the difference between CytometryML and the previous standards to syntax. In order to permit the Resource Description Framework, RDF, to reference the CytometryML datatypes, id attributes have been added to many CytometryML elements. The Laboratory Digital Imaging Project (LDIP) Data Exchange Specification and the Flowcyt standards development effort employ RDF syntax. Documentation from DICOM has been reused in CytometryML. The unity of analytical cytology was demonstrated by deriving a microscope type and a flow cytometer type from a generic cytometry instrument type. The feasibility of incorporating the Flowcyt gating schemas into CytometryML has been demonstrated. CytometryML is being extended to include many of the new DICOM Working Group 26 datatypes, which describe patients, specimens, and analytes. In situations where multiple standards are being created, interoperability can be facilitated by employing datatypes based on a common set of semantics and building in links to standards that employ different syntax.

  12. Radiologic procedures, policies and protocols for pediatric emergency medicine

    International Nuclear Information System (INIS)

    Woodward, George A.

    2008-01-01

    Protocol development between radiology and pediatric emergency medicine requires a multidisciplinary approach to manage straightforward as well as complex and time-sensitive needs for emergency department patients. Imaging evaluation requires coordination of radiologic technologists, radiologists, transporters, nurses and coordinators, among others, and might require accelerated routines or occur at sub-optimal times. Standardized protocol development enables providers to design a best practice in all of these situations and should be predicated on evidence, mission, and service expectations. As in any new process, constructive feedback channels are imperative for evaluation and modification. (orig.)

  13. Standardized Treatment of Neonatal Status Epilepticus Improves Outcome.

    Science.gov (United States)

    Harris, Mandy L; Malloy, Katherine M; Lawson, Sheena N; Rose, Rebecca S; Buss, William F; Mietzsch, Ulrike

    2016-12-01

    We aimed to decrease practice variation in treatment of neonatal status epilepticus by implementing a standardized protocol. Our primary goal was to achieve 80% adherence to the algorithm within 12 months. Secondary outcome measures included serum phenobarbital concentrations, number of patients progressing from seizures to status epilepticus, and length of hospital stay. Data collection occurred for 6 months prior and 12 months following protocol implementation. Adherence of 80% within 12 months was partially achieved in patients diagnosed in our hospital; in pretreated patients, adherence was not achieved. Maximum phenobarbital concentrations were decreased (56.8 vs 41.0 µg/mL), fewer patients progressed from seizures to status epilepticus (46% vs 36%), and hospital length of stay decreased by 9.7 days in survivors. In conclusion, standardized, protocol-driven treatment of neonatal status epilepticus improves consistency and short-term outcome. © The Author(s) 2016.

  14. Calibration of working standard ionization chambers and dose standardization

    International Nuclear Information System (INIS)

    Abd Elmahoud, A. A. B.

    2011-01-01

    Measurements were performed for the calibration of two working standard ionization chambers in the secondary standard dosimetry laboratory of Sudan. A 600 cc cylindrical Farmer-type chamber and an 1800 cc cylindrical radical radiation protection level ionization chamber were calibrated against a 1000 cc spherical reference standard ionization chamber. The chambers were calibrated in the X-ray narrow spectrum series, with beam energies ranging from 33 to 116 keV, in addition to a 137Cs beam with 662 keV energy. Therapy-level ionization chambers of 0.6 cc and 0.3 cc were used for dose standardization and beam output calibration of the cobalt-60 radiotherapy machine located at the National Cancer Institute, University of Gazira. Concerning beam output measurements for the 60Co radiotherapy machine, dosimetric measurements were performed in accordance with the relevant IAEA dosimetry protocols, TRS-277 and TRS-398. The kinetic energy released per unit mass in air (air kerma) was obtained by multiplying the corrected electrometer reading (nC/min) by the calibration factor (Gy/nC) of the chamber given in the calibration certificate. The uncertainty of the air kerma measurements (combined uncertainty) was calculated for all ionization chambers. The calibration factors of these ionization chambers were then calculated by comparing the air kerma reading of the secondary standard ionization chamber to that from the radical and Farmer chambers. The results of calibrating the working standard ionization chambers showed calibration factors ranging from 0.99 to 1.52 for the different radiation energies; these differences were due to the chambers' response and specifications. The absorbed dose to water was calculated for the therapy ionization chambers, using the two codes of practice TRS-277 and TRS-398, as the beam output for the 60Co radiotherapy machine, and it can be used as a reference for future beam output calibrations in radiotherapy dosimetry. The measurement of absorbed dose to water showed that the
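
    In outline, the calibration-factor determination described above reduces to dividing the air kerma established with the reference chamber by the corrected reading of the chamber under calibration; the sketch below is generic, with fictitious readings and a standard temperature-pressure correction, and is not the laboratory's actual worksheet.

        # Hedged sketch of a substitution-type calibration: calibration factor =
        # reference air kerma / corrected reading of the chamber under test.
        # All readings and conditions are fictitious.

        T_REF_C, P_REF_KPA = 20.0, 101.325   # reference conditions assumed here

        def k_tp(temp_c, pressure_kpa):
            """Air-density correction for an open (vented) ionization chamber."""
            return ((273.15 + temp_c) / (273.15 + T_REF_C)) * (P_REF_KPA / pressure_kpa)

        air_kerma_reference = 12.45        # mGy/min, from the secondary standard chamber
        reading_under_test = 8.30          # nC/min, chamber being calibrated
        temp_c, pressure_kpa = 23.5, 99.8  # conditions during the measurement

        corrected_reading = reading_under_test * k_tp(temp_c, pressure_kpa)
        n_k = air_kerma_reference / corrected_reading
        print("air kerma calibration factor N_K = %.4f mGy/nC" % n_k)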

  15. An automated baseline correction protocol for infrared spectra of atmospheric aerosols collected on polytetrafluoroethylene (Teflon) filters

    Science.gov (United States)

    Kuzmiakova, Adele; Dillner, Ann M.; Takahama, Satoshi

    2016-06-01

    A growing body of research on statistical applications for characterization of atmospheric aerosol Fourier transform infrared (FT-IR) samples collected on polytetrafluoroethylene (PTFE) filters (e.g., Russell et al., 2011; Ruthenburg et al., 2014) and a rising interest in analyzing FT-IR samples collected by air quality monitoring networks call for an automated PTFE baseline correction solution. The existing polynomial technique (Takahama et al., 2013) is not scalable to a project with a large number of aerosol samples because it contains many parameters and requires expert intervention. Therefore, the question of how to develop an automated method for baseline correcting hundreds to thousands of ambient aerosol spectra given the variability in both environmental mixture composition and PTFE baselines remains. This study approaches the question by detailing the statistical protocol, which allows for the precise definition of analyte and background subregions, applies nonparametric smoothing splines to reproduce sample-specific PTFE variations, and integrates performance metrics from atmospheric aerosol and blank samples alike in the smoothing parameter selection. Referencing 794 atmospheric aerosol samples from seven Interagency Monitoring of PROtected Visual Environment (IMPROVE) sites collected during 2011, we start by identifying key FT-IR signal characteristics, such as non-negative absorbance or analyte segment transformation, to capture sample-specific transitions between background and analyte. While referring to qualitative properties of PTFE background, the goal of smoothing splines interpolation is to learn the baseline structure in the background region to predict the baseline structure in the analyte region. We then validate the model by comparing smoothing splines baseline-corrected spectra with uncorrected and polynomial baseline (PB)-corrected equivalents via three statistical applications: (1) clustering analysis, (2) functional group quantification
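
    As a rough sketch of the smoothing-spline idea (this is not the authors' implementation), the Python snippet below fits a spline to background-only points of a toy spectrum and subtracts the interpolated baseline across the analyte region. The wavenumber ranges treated as background, the smoothing parameter and the toy spectrum are all placeholders.

        import numpy as np
        from scipy.interpolate import UnivariateSpline

        def baseline_correct(wavenumber, absorbance, background_mask, smoothing):
            """Fit a smoothing spline to background points only and subtract the
            interpolated baseline over the full spectrum, analyte region included."""
            spline = UnivariateSpline(wavenumber[background_mask],
                                      absorbance[background_mask], s=smoothing)
            return absorbance - spline(wavenumber)

        wn = np.linspace(400.0, 4000.0, 1800)                        # cm^-1
        spectrum = 0.02 * np.sin(wn / 300.0) + np.exp(-((wn - 2900.0) / 60.0) ** 2)
        background = (wn < 1200.0) | (wn > 3600.0)                   # hypothetical analyte window
        corrected = baseline_correct(wn, spectrum, background, smoothing=0.5)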

  16. Analytical chemistry: Principles and techniques

    International Nuclear Information System (INIS)

    Hargis, L.G.

    1988-01-01

    Although this text seems to have been intended for use in a one-semester course in undergraduate analytical chemistry, it includes the range of topics usually encountered in a two-semester introductory course in chemical analysis. The material is arranged logically for use in a two-semester course: the first 12 chapters contain the subjects most often covered in the first term, and the next 10 chapters pertain to the second (instrumental) term. Overall breadth and level of treatment are standard for an undergraduate text of this sort, and the only major omission is that of kinetic methods (which is a common omission in analytical texts). In the first 12 chapters coverage of the basic material is quite good. The emphasis on the underlying principles of the techniques rather than on specifics and design of instrumentation is welcomed. This text may be more useful for the instrumental portion of an analytical chemistry course than for the solution chemistry segment. The instrumental analysis portion is appropriate for an introductory textbook

  17. Interlaboratory test comparison among Environmental Radioactivity Laboratories using the ISO/IUPAC/AOAC Protocol

    International Nuclear Information System (INIS)

    Romero, L.; Ramos, L.; Salas, R.

    1998-01-01

    World-wide acceptance of results from radiochemical analyses requires reliable and comparable measurements traceable to SI units, particularly when data sets generated by laboratories are to contribute to the evaluation of data from environmental pollution research and monitoring programmes. The Spanish Nuclear Safety Council (CSN), in collaboration with CIEMAT, organizes periodic interlaboratory test comparisons for environmental radioactivity laboratories, aiming to provide them with the necessary means to assess the quality of their results. This paper presents data from the most recent exercise which, for the first time, was evaluated following the procedure recommended in the ISO/IUPAC/AOAC Harmonized Protocol for the proficiency testing of analytical laboratories (1). The test sample was a Reference Material provided by the IAEA-AQCS, a lake sediment containing the following radionuclides: K-40, Ra-226, Ac-228, Cs-137, Sr-90, Pu-(239+240). The results of the proficiency test were computed for the 28 participating laboratories using the z-score approach; the evaluation of the exercise is presented in the paper. The use of a z-score classification has been shown to provide laboratories with a more objective means of assessing and demonstrating the reliability of the data they are producing. Analytical proficiency of the participating laboratories was found to be satisfactory in 57 to 100 percent of cases. (1) The international harmonized protocol for the proficiency testing of (chemical) analytical laboratories. Pure and Appl. Chem., Vol. 65, No. 9, pp. 2123-2144, 1993, IUPAC. (Author) 3 refs
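
    The z-score evaluation used in the harmonized protocol reduces to z = (laboratory result - assigned value) / target standard deviation, with |z| ≤ 2 conventionally read as satisfactory, 2 < |z| ≤ 3 as questionable and |z| > 3 as unsatisfactory. A short Python illustration with invented laboratory results:

        def z_score(lab_result, assigned_value, target_sd):
            return (lab_result - assigned_value) / target_sd

        def classify(z):
            if abs(z) <= 2.0:
                return "satisfactory"
            if abs(z) <= 3.0:
                return "questionable"
            return "unsatisfactory"

        cs137_assigned, sigma_p = 14.2, 1.1       # Bq/kg, hypothetical assigned value and target SD
        for lab, result in {"Lab A": 13.5, "Lab B": 17.9}.items():
            z = z_score(result, cs137_assigned, sigma_p)
            print(lab, round(z, 2), classify(z))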

  18. Epidemiological cut-off values for Flavobacterium psychrophilum MIC data generated by a standard test protocol.

    Science.gov (United States)

    Smith, P; Endris, R; Kronvall, G; Thomas, V; Verner-Jeffreys, D; Wilhelm, C; Dalsgaard, I

    2016-02-01

    Epidemiological cut-off values were developed for application to antibiotic susceptibility data for Flavobacterium psychrophilum generated by standard CLSI test protocols. The MIC values for ten antibiotic agents against Flavobacterium psychrophilum were determined in two laboratories. For five antibiotics, the data sets were of sufficient quality and quantity to allow the setting of valid epidemiological cut-off values. For these agents, the cut-off values, calculated by the application of the statistically based normalized resistance interpretation method, were ≤16 mg L(-1) for erythromycin, ≤2 mg L(-1) for florfenicol, ≤0.025 mg L(-1) for oxolinic acid (OXO), ≤0.125 mg L(-1) for oxytetracycline and ≤20 (1/19) mg L(-1) for trimethoprim/sulphamethoxazole. For ampicillin and amoxicillin, the majority of putative wild-type observations were 'off scale', and therefore, statistically valid cut-off values could not be calculated. For ormetoprim/sulphadimethoxine, the data were excessively diverse and a valid cut-off could not be determined. For flumequine, the putative wild-type data were extremely skewed, and for enrofloxacin, there was inadequate separation in the MIC values for putative wild-type and non-wild-type strains. It is argued that the adoption of OXO as a class representative for the quinolone group would be a valid method of determining susceptibilities to these agents. © 2014 John Wiley & Sons Ltd.

  19. Analytical methods for heat transfer and fluid flow problems

    CERN Document Server

    Weigand, Bernhard

    2015-01-01

    This book describes useful analytical methods by applying them to real-world problems rather than the usual over-simplified classroom problems. The book demonstrates the applicability of analytical methods even for complex problems and guides the reader to a more intuitive understanding of approaches and solutions. Although the solution of partial differential equations by numerical methods is the standard practice in industry, analytical methods are still important for the critical assessment of results derived from advanced computer simulations and for the improvement of the underlying numerical techniques. Literature devoted to analytical methods, however, often focuses on theoretical and mathematical aspects and is therefore useless to most engineers. Analytical Methods for Heat Transfer and Fluid Flow Problems addresses engineers and engineering students. The second edition has been updated: the chapters on non-linear problems and on axial heat conduction problems were extended, and worked-out exam...

  20. Analytical quality control in environmental analysis - Recent results and future trends of the IAEA's analytical quality control programme

    Energy Technology Data Exchange (ETDEWEB)

    Suschny, O; Heinonen, J

    1973-12-01

    The significance of analytical results depends critically on the degree of their reliability; an assessment of this reliability is indispensable if the results are to have any meaning at all. Environmental radionuclide analysis is a relatively new analytical field in which new methods are continuously being developed and into which many new laboratories have entered during the last ten to fifteen years. The scarcity of routine methods and the lack of experience of the new laboratories have made the need for the assessment of the reliability of results particularly urgent in this field. The IAEA, since 1962, has provided assistance to its member states by making available to their laboratories analytical quality control services in the form of standard samples, reference materials and the organization of analytical intercomparisons. The scope of this programme has increased over the years and now includes, in addition to environmental radionuclides, non-radioactive environmental contaminants which may be analysed by nuclear methods, materials for forensic neutron activation analysis, bioassay materials and nuclear fuel. The results obtained in recent intercomparisons demonstrate the continued need for these services. (author)

  1. Multidisciplinary perioperative protocol in patients undergoing acute high-risk abdominal surgery

    DEFF Research Database (Denmark)

    Tengberg, L. T.; Bay-Nielsen, M.; Bisgaard, T.

    2017-01-01

    Background: Acute high-risk abdominal (AHA) surgery carries a very high risk of morbidity and mortality and represents a massive healthcare burden. The aim of the present study was to evaluate the effect of a standardized multidisciplinary perioperative protocol in patients undergoing AHA surgery...... = 0·004). Conclusion: The introduction of a multidisciplinary perioperative protocol was associated with a significant reduction in postoperative mortality in patients undergoing AHA surgery. NCT01899885 (http://www.clinicaltrials.gov)....

  2. Lyophilization: a useful approach to the automation of analytical processes?

    OpenAIRE

    de Castro, M. D. Luque; Izquierdo, A.

    1990-01-01

    An overview of the state-of-the-art in the use of lyophilization for the pretreatment of samples and standards prior to their storage and/or preconcentration is presented. The different analytical applications of this process are dealt with according to the type of material (reagent, standard, samples) and matrix involved.

  3. Valid analytical performance specifications for combined analytical bias and imprecision for the use of common reference intervals.

    Science.gov (United States)

    Hyltoft Petersen, Per; Lund, Flemming; Fraser, Callum G; Sandberg, Sverre; Sölétormos, György

    2018-01-01

    Background Many clinical decisions are based on comparison of patient results with reference intervals. Therefore, an estimation of the analytical performance specifications for the quality that would be required to allow sharing common reference intervals is needed. The International Federation of Clinical Chemistry (IFCC) recommended a minimum of 120 reference individuals to establish reference intervals. This number implies a certain level of quality, which could then be used for defining analytical performance specifications as the maximum combination of analytical bias and imprecision required for sharing common reference intervals, the aim of this investigation. Methods Two methods were investigated for defining the maximum combination of analytical bias and imprecision that would give the same quality of common reference intervals as the IFCC recommendation. Method 1 is based on a formula for the combination of analytical bias and imprecision and Method 2 is based on the Microsoft Excel formula NORMINV including the fractional probability of reference individuals outside each limit and the Gaussian variables of mean and standard deviation. The combinations of normalized bias and imprecision are illustrated for both methods. The formulae are identical for Gaussian and log-Gaussian distributions. Results Method 2 gives the correct results with a constant percentage of 4.4% for all combinations of bias and imprecision. Conclusion The Microsoft Excel formula NORMINV is useful for the estimation of analytical performance specifications for both Gaussian and log-Gaussian distributions of reference intervals.
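
    For readers without Excel, NORMINV is simply the inverse normal cumulative distribution function, available in scipy as norm.ppf. The sketch below is a hedged illustration of how a reference limit might be recomputed when analytical bias shifts the mean and analytical imprecision inflates the standard deviation; it is not the authors' exact formula, and the analyte values are invented.

        from scipy.stats import norm

        def reference_limit(tail_probability, mean, sd, bias=0.0, imprecision=0.0):
            """NORMINV-style limit with bias added to the mean and imprecision added
            in quadrature to the biological standard deviation (illustrative model)."""
            total_sd = (sd ** 2 + imprecision ** 2) ** 0.5
            return norm.ppf(tail_probability, loc=mean + bias, scale=total_sd)

        print(reference_limit(0.025, 5.0, 0.5))                              # lower limit, no analytical error
        print(reference_limit(0.025, 5.0, 0.5, bias=0.1, imprecision=0.2))   # with bias and imprecision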

  4. Toward a standardized investigation protocol in sudden unexpected deaths in infancy in South Africa: a multicenter study of medico-legal investigation procedures and outcomes.

    Science.gov (United States)

    du Toit-Prinsloo, Lorraine; Dempers, Johan; Verster, Janette; Hattingh, Christa; Nel, Hestelle; Brandt, V D; Jordaan, Joyce; Saayman, Gert

    2013-09-01

    South Africa manifests a socio-economic dichotomy that shows features of both a developed and a developing country. As a result, areas exist where a lack of resources and expertise prevents the implementation of a highly standardized protocol for the investigation of sudden and unexpected deaths in infants (SUDI). Although the medico-legal mortuaries attached to academic centers have the capacity to implement standardized protocols, a previous study conducted at two large medico-legal mortuaries indicated otherwise. This study also revealed that the exact number and incidence of sudden infant death syndrome (SIDS) cases were unknown. These findings prompted a multicenter study of the medico-legal investigation procedures and outcomes in five academic centers in South Africa. A retrospective case audit was conducted for a 5-year period (2005-2009) at medico-legal laboratories attached to universities in Bloemfontein, Cape Town-Tygerberg, Durban, Johannesburg, and Pretoria. The total case load as well as the total number of infants younger than 1 year of age admitted to these mortuaries was documented. The case files on all infants younger than 1 year of age who were admitted as sudden and unexpected or unexplained deaths were included in the study population. Data collected on the target population included demographic details, the nature and scope of the post-mortem examinations, and the final outcome (cause of death). A total case load of 80,399 cases was admitted to the mortuaries over the 5-year period, including a total of 3,295 (6.5 %) infants. In the infant group, 591 (0.7 %) died from non-natural causes, and 2,704 (3.3 %) cases of sudden, unexpected and/or unexplained deaths in infants were admitted and included in the detailed case analysis study. One hundred and ninety-nine babies were between 0 and 7 days of age and 210 babies between 8 and 30 days. The remaining 2,295 infants were between 1 month and 12 months of age. Death scene investigation was

  5. Scaling HEP to Web size with RESTful protocols: The frontier example

    International Nuclear Information System (INIS)

    Dykstra, Dave

    2011-01-01

    The World-Wide-Web has scaled to an enormous size. The largest single contributor to its scalability is the HTTP protocol, particularly when used in conformity to REST (REpresentational State Transfer) principles. High Energy Physics (HEP) computing also has to scale to an enormous size, so it makes sense to base much of it on RESTful protocols. Frontier, which reads databases with an HTTP-based RESTful protocol, has successfully scaled to deliver production detector conditions data from both the CMS and ATLAS LHC detectors to hundreds of thousands of computer cores worldwide. Frontier is also able to re-use a large amount of standard software that runs the Web: on the clients, caches, and servers. I discuss the specific ways in which HTTP and REST enable high scalability for Frontier. I also briefly discuss another protocol used in HEP computing that is HTTP-based and RESTful, and another protocol that could benefit from it. My goal is to encourage HEP protocol designers to consider HTTP and REST whenever the same information is needed in many places.

  6. Energy Efficient Medium Access Control Protocol for Clustered Wireless Sensor Networks with Adaptive Cross-Layer Scheduling.

    Science.gov (United States)

    Sefuba, Maria; Walingo, Tom; Takawira, Fambirai

    2015-09-18

    This paper presents an Energy Efficient Medium Access Control (MAC) protocol for clustered wireless sensor networks that aims to improve energy efficiency and delay performance. The proposed protocol employs adaptive cross-layer intra-cluster scheduling and inter-cluster relay selection diversity. The scheduling is based on the available data packets and the remaining energy level of the source node (SN). This helps to minimize idle listening on nodes without data to transmit as well as to reduce control packet overhead. The relay selection diversity is carried out between the clusters, by the cluster head (CH), and the base station (BS). The diversity helps to improve network reliability and prolong the network lifetime. Relay selection is determined based on the communication distance, the remaining energy and the channel quality indicator (CQI) of the relay cluster head (RCH). An analytical framework for energy consumption and transmission delay of the proposed MAC protocol is presented in this work. The performance of the proposed MAC protocol is evaluated in terms of transmission delay, energy consumption, and network lifetime. The results obtained indicate that the proposed MAC protocol performs better than traditional cluster-based MAC protocols.
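
    A toy sketch of the relay-selection idea follows; the weighting scheme, the normalisation constants and the candidate values are assumptions for illustration, not taken from the paper.

        from dataclasses import dataclass

        @dataclass
        class RelayCandidate:
            name: str
            distance_m: float          # communication distance
            residual_energy_j: float   # remaining energy
            cqi: float                 # channel quality indicator, 0..1

        def relay_score(c, w_cqi=0.5, w_energy=0.3, w_distance=0.2, max_distance_m=100.0):
            """Higher is better: good channel, plenty of energy, short distance."""
            return (w_cqi * c.cqi
                    + w_energy * min(c.residual_energy_j / 10.0, 1.0)
                    + w_distance * (1.0 - min(c.distance_m / max_distance_m, 1.0)))

        candidates = [RelayCandidate("RCH-1", 40.0, 6.2, 0.8),
                      RelayCandidate("RCH-2", 25.0, 2.5, 0.9)]
        print("selected relay:", max(candidates, key=relay_score).name)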

  7. GnRH antagonist versus long agonist protocols in IVF

    DEFF Research Database (Denmark)

    Lambalk, C B; Banga, F R; Huirne, J A

    2017-01-01

    BACKGROUND: Most reviews of IVF ovarian stimulation protocols have insufficiently accounted for various patient populations, such as ovulatory women, women with polycystic ovary syndrome (PCOS) or women with poor ovarian response, and have included studies in which the agonist or antagonist...... was not the only variable between the compared study arms. OBJECTIVE AND RATIONALE: The aim of the current study was to compare GnRH antagonist protocols versus standard long agonist protocols in couples undergoing IVF or ICSI, while accounting for various patient populations and treatment schedules. SEARCH...... in couples undergoing IVF or ICSI. The primary outcome was ongoing pregnancy rate. Secondary outcomes were: live birth rate, clinical pregnancy rate, number of oocytes retrieved and safety with regard to ovarian hyperstimulation syndrome (OHSS). Separate comparisons were performed for the general IVF...

  8. Protocol for Uniformly Measuring and Expressing the Performance of Energy Storage Systems

    Energy Technology Data Exchange (ETDEWEB)

    Conover, David R. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Crawford, Aladsair J. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Fuller, Jason C. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Gourisetti, Sri Nikhil Gup [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Viswanathan, Vilayanur V. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Ferreira, Summer [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Schoenwald, David [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Rosewater, David [Pacific Northwest National Lab. (PNNL), Richland, WA (United States)

    2016-04-01

    The Protocol for Uniformly Measuring and Expressing the Performance of Energy Storage Systems (PNNL-22010) was first issued in November 2012 as a first step toward providing a foundational basis for developing an initial standard for the uniform measurement and expression of energy storage system (ESS) performance. Based on experiences with the application and use of that document, and to include additional ESS applications and associated duty cycles, test procedures and performance metrics, a first revision of the November 2012 Protocol was issued in June 2014 (PNNL 22010 Rev. 1). As an update of the 2014 revision 1 to the Protocol, this document (the March 2016 revision 2 to the Protocol) is intended to supersede the June 2014 revision 1 to the Protocol and provide a more user-friendly yet more robust and comprehensive basis for measuring and expressing ESS performance.

  9. A review of analytical procedures for the simultaneous determination of medically important veterinary antibiotics in environmental water: Sample preparation, liquid chromatography, and mass spectrometry.

    Science.gov (United States)

    Kim, Chansik; Ryu, Hong-Duck; Chung, Eu Gene; Kim, Yongseok; Lee, Jae-Kwan

    2018-07-01

    Medically important (MI) antibiotics are defined by the United States Food and Drug Administration as drugs containing certain active antimicrobial ingredients that are used for the treatment of human diseases or enteric pathogens causing food-borne diseases. The presence of MI antibiotic residues in environmental water is a major concern for both aquatic ecosystems and public health, particularly because of their potential to contribute to the development of antimicrobial-resistant microorganisms. In this article, we present a review of global trends in the sales of veterinary MI antibiotics and the analytical methodologies used for the simultaneous determination of antibiotic residues in environmental water. According to recently published government reports, sales volumes have increased steadily, despite many countries having adopted strategies for reducing the consumption of antibiotics. Global attention needs to be directed urgently at establishing new management strategies for reducing the use of MI antimicrobial products in the livestock industry. The development of standardized analytical methods for the detection of multiple residues is required to monitor and understand the fate of antibiotics in the environment. Simultaneous analyses of antibiotics have mostly been conducted using high-performance liquid chromatography-tandem mass spectrometry with a solid-phase extraction (SPE) pretreatment step. Currently, on-line SPE protocols are used for the rapid and sensitive detection of antibiotics in water samples. On-line detection protocols must be established for the monitoring and screening of unknown metabolites and transformation products of antibiotics in environmental water. Copyright © 2018 Elsevier Ltd. All rights reserved.

  10. 42 CFR 493.1276 - Standard: Clinical cytogenetics.

    Science.gov (United States)

    2010-10-01

    ... SERVICES (CONTINUED) STANDARDS AND CERTIFICATION LABORATORY REQUIREMENTS Quality System for Nonwaived Testing Analytic Systems § 493.1276 Standard: Clinical cytogenetics. (a) The laboratory must have policies and procedures for ensuring accurate and reliable patient specimen identification during the process...

  11. Reduced-dose chest CT with 3D automatic exposure control vs. standard chest CT: Quantitative assessment of emphysematous changes in smokers’ lung parenchyma

    International Nuclear Information System (INIS)

    Koyama, Hisanobu; Ohno, Yoshiharu; Yamazaki, Youichi; Matsumoto, Keiko; Onishi, Yumiko; Takenaka, Daisuke; Yoshikawa, Takeshi; Nishio, Mizuho; Matsumoto, Sumiaki; Murase, Kenya; Nishimura, Yoshihiro

    2012-01-01

    Objectives: To determine the capability of reduced-dose chest CT with three-dimensional (3D) automatic exposure control (AEC) for quantitative assessment of emphysematous change in smokers' lung parenchyma, compared with standard chest CT. Methods: Twenty consecutive smokers (mean age 62.8 years) underwent CT examinations using a standard protocol (150 mAs) and a protocol with 3D-AEC; the target standard deviation number was set to 160. For quantitative assessment of emphysematous change in lung parenchyma with the standard protocol, the percentage of lung voxels below −950 HU (%LAA−950) was calculated for each subject. The 3D-AEC protocol %LAA was computed as the percentage of voxels below a selected threshold CT value. The radiation doses of the two protocols were compared, and %LAA−950 was compared with the 3D-AEC protocol %LAAs. Results: Mean dose-length products were 780.2 ± 145.5 mGy cm (standard protocol) and 192.0 ± 95.9 mGy cm (3D-AEC protocol), a significant difference (paired Student's t test, p < 0.05). When feasible threshold CT values were adopted for the 3D-AEC protocol, the 3D-AEC protocol %LAAs were significantly correlated with %LAA−950 (r = 0.98, p < 0.001), and the limits of agreement from Bland–Altman analysis were 0.52 ± 4.3%. Conclusions: With an adjusted threshold CT value, reduced-dose chest CT with 3D-AEC can substitute for the standard protocol in assessing emphysematous change in smokers' lung parenchyma.
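
    The %LAA metric used above is simply the percentage of lung voxels whose attenuation falls below a threshold (−950 HU for the standard protocol, an adjusted threshold for the reduced-dose data). A minimal Python sketch with synthetic voxel values:

        import numpy as np

        def percent_laa(lung_hu, threshold_hu=-950.0):
            """Percentage of lung voxels below threshold_hu."""
            lung_hu = np.asarray(lung_hu)
            return 100.0 * np.count_nonzero(lung_hu < threshold_hu) / lung_hu.size

        rng = np.random.default_rng(0)
        lung = rng.normal(-870.0, 60.0, size=100_000)      # hypothetical lung voxels (HU)
        print(percent_laa(lung))                           # standard threshold, -950 HU
        print(percent_laa(lung, threshold_hu=-940.0))      # adjusted threshold, e.g. for 3D-AEC data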

  12. Development of high-reliable real-time communication network protocol for SMART

    Energy Technology Data Exchange (ETDEWEB)

    Song, Ki Sang; Kim, Young Sik [Korea National University of Education, Chongwon (Korea); No, Hee Chon [Korea Advanced Institute of Science and Technology, Taejon (Korea)

    1999-04-01

    In this research, we first define protocol subsets for the SMART (System-integrated Modular Advanced Reactor) communication network based on the SMART MMIS transmission delay and traffic requirements and on the network protocol functions of the OSI (Open System Interconnection) seven-layer model. Current industrial LAN protocols are also analyzed and the applicability of commercial protocols is checked. For the suitability test, we applied approximated SMART data traffic and the maximum allowable transmission delay requirement. From the simulation results, we conclude that IEEE 802.5 and FDDI, an ANSI standard, are the most suitable for SMART. We further analyzed the FDDI and token ring protocols for the SMART and nuclear plant network environment, including IEEE 802.4, IEEE 802.5, and ARCnet. The most suitable protocol for SMART is FDDI; the FDDI MAC and RMT protocol specifications have been verified with LOTOS, and the verification results show that FDDI MAC and RMT satisfy reachability and liveness and exhibit neither deadlock nor livelock. Therefore, we conclude that FDDI MAC and RMT constitute a highly reliable protocol for the SMART MMIS network. We then consider the stacking fault of the IEEE 802.5 token ring protocol and propose a fault-tolerant MAM (Modified Active Monitor) protocol. The simulation results show that the MAM protocol improves the lower-priority traffic service rate when a stacking fault occurs. Therefore, the proposed MAM protocol can be applied to the SMART communication network for high-reliability and hard real-time communication in the data acquisition and inter-channel networks. (author). 37 refs., 79 figs., 39 tabs.

  13. BioBlocks: Programming Protocols in Biology Made Easier.

    Science.gov (United States)

    Gupta, Vishal; Irimia, Jesús; Pau, Iván; Rodríguez-Patón, Alfonso

    2017-07-21

    The methods to execute biological experiments are evolving. Affordable fluid handling robots and on-demand biology enterprises are making automating entire experiments a reality. Automation offers the benefit of high-throughput experimentation, rapid prototyping, and improved reproducibility of results. However, learning to automate and codify experiments is a difficult task as it requires programming expertise. Here, we present a web-based visual development environment called BioBlocks for describing experimental protocols in biology. It is based on Google's Blockly and Scratch, and requires little or no experience in computer programming to automate the execution of experiments. The experiments can be specified, saved, modified, and shared between multiple users in an easy manner. BioBlocks is open-source and can be customized to execute protocols on local robotic platforms or remotely, that is, in the cloud. It aims to serve as a de facto open standard for programming protocols in Biology.

  14. National standards for the nuclear industry

    International Nuclear Information System (INIS)

    Laing, W.R.; Corbin, L.T.

    1981-01-01

    Standards needs for the nuclear industry are being met by a number of voluntary organizations, such as ANS, ASTM, AWS, ASME, and IEEE. The American National Standards Institute (ANSI) coordinates these activities and approves completed standards as American National Standards. ASTM has two all-nuclear committees, E-10 and C-26. A C-26 subcommittee, Test Methods, has been active in writing analytical chemistry standards for twelve years. Thirteen have been approved as ANSI standards and others are ready for ballot. Work is continuing in all areas of the nuclear fuel cycle

  15. D-RATS 2011: RAFT Protocol Overview

    Science.gov (United States)

    Utz, Hans

    2011-01-01

    A brief overview presentation on the protocol used during the D-RATS 2011 field test for file transfer from the field-test robots at Black Point Lava Flow, AZ, to Johnson Space Center, Houston, TX, over a simulated time delay. The file transfer actually uses a commercial implementation of an open communications standard. The focus of the work lies on how to make the state of the distributed system observable.

  16. Application Protocol, Initial Graphics Exchange Specification (IGES), Layered Electrical Product

    Energy Technology Data Exchange (ETDEWEB)

    O`Connell, L.J. [ed.

    1994-12-01

    An application protocol is an information systems engineering view of a specific product. The view represents an agreement on the generic activities needed to design and fabricate the product, the agreement on the information needed to support those activities, and the specific constructs of a product data standard for use in transferring some or all of the information required. This application protocol describes the data for electrical and electronic products in terms of a product description standard called the Initial Graphics Exchange Specification (IGES). More specifically, the Layered Electrical Product IGES Application Protocol (AP) specifies the mechanisms for defining and exchanging computer models and their associated data for those products which have been designed in two-dimensional geometry so as to be produced as a series of layers, in IGES format. The AP defines the appropriateness of the data items for describing the geometry of the various parts of a product (shape and location), the connectivity, and the processing and material characteristics. Excluded are the behavioral requirements which the product was intended to satisfy, except as those requirements have been recorded as design rules or product testing requirements.

  17. Streetlight Control System Based on Wireless Communication over DALI Protocol

    Directory of Open Access Journals (Sweden)

    Francisco José Bellido-Outeiriño

    2016-04-01

    Full Text Available Public lighting represents a large part of the energy consumption of towns and cities. Efficient management of public lighting can entail significant energy savings. This work presents a smart system for managing public lighting networks based on wireless communication and the DALI protocol. Wireless communication entails significant economic savings, as there is no need to install new wiring and visual impacts and damage to the facades of historical buildings in city centers are avoided. The DALI protocol uses bidirectional communication with the ballast, which allows its status to be controlled and monitored at all times. The novelty of this work is that it tackles all aspects related to the management of public lighting: a standard protocol, DALI, was selected to control the ballast, a wireless node based on the IEEE 802.15.4 standard with a DALI interface was designed, a network layer that considers the topology of the lighting network has been developed, and lastly, some user-friendly applications for the control and maintenance of the system by the technical crews of the different towns and cities have been developed.

  18. Rational Selection, Criticality Assessment, and Tiering of Quality Attributes and Test Methods for Analytical Similarity Evaluation of Biosimilars.

    Science.gov (United States)

    Vandekerckhove, Kristof; Seidl, Andreas; Gutka, Hiten; Kumar, Manish; Gratzl, Gyöngyi; Keire, David; Coffey, Todd; Kuehne, Henriette

    2018-05-10

    Leading regulatory agencies recommend biosimilar assessment to proceed in a stepwise fashion, starting with a detailed analytical comparison of the structural and functional properties of the proposed biosimilar and reference product. The degree of analytical similarity determines the degree of residual uncertainty that must be addressed through downstream in vivo studies. Substantive evidence of similarity from comprehensive analytical testing may justify a targeted clinical development plan, and thus enable a shorter path to licensing. The importance of a careful design of the analytical similarity study program therefore should not be underestimated. Designing a state-of-the-art analytical similarity study meeting current regulatory requirements in regions such as the USA and EU requires a methodical approach, consisting of specific steps that far precede the work on the actual analytical study protocol. This white paper discusses scientific and methodological considerations on the process of attribute and test method selection, criticality assessment, and subsequent assignment of analytical measures to US FDA's three tiers of analytical similarity assessment. Case examples of selection of critical quality attributes and analytical methods for similarity exercises are provided to illustrate the practical implementation of the principles discussed.

  19. Appendix 1: Analytical Techniques (Online supplementary material ...

    Indian Academy of Sciences (India)

    HP

    Further details of analytical techniques are given in http://www.actlabs.com. Zircon U–Pb dating and trace element analysis. The zircons were separated using standard procedures including crushing (in iron mortar and pestle), sieving (375 to 75 micron), tabling, heavy liquid separation (bromoform and methylene iodide) ...

  20. Newly developed standard reference materials for organic contaminant analysis

    Energy Technology Data Exchange (ETDEWEB)

    Poster, D.; Kucklick, J.; Schantz, M.; Porter, B.; Wise, S. [National Inst. of Stand. and Technol., Gaithersburg, MD (USA). Center for Anal. Chem.

    2004-09-15

    The National Institute of Standards and Technology (NIST) has issued a number of Standard Reference Materials (SRMs) for specified analytes. The SRMs include biota and biological materials as well as sediment and particle-related materials. The certified compounds include polychlorinated biphenyls (PCBs), polycyclic aromatic hydrocarbons (PAHs) and their nitro-analogues, chlorinated pesticides, methylmercury, organic tin compounds, fatty acids, and polybrominated diphenyl ethers (PBDEs). The authors report on the origin of the materials and the analytical methods. (uke)

  1. Environmental monitoring standardization of effluent from nuclear fuel cycle facilities in China

    International Nuclear Information System (INIS)

    Gao Mili

    1993-01-01

    China has established a number of environmental monitoring standards for effluent from nuclear fuel cycle facilities. To date, 33 standards have been issued, 10 are awaiting issue, and 11 are being drafted. These standards cover sampling, gross activity measurement, analytical methods, management rules, and so on. They apply to almost all nuclear fuel cycle facilities and form a complete standards system. By the end of the century, we intend to draft a series of analytical and determination standards for various environmental media, covering 36 radionuclides from nuclear fuel cycle facilities. (3 tabs.)

  2. WelFur - mink: development of on-farm welfare assessment protocols for mink

    DEFF Research Database (Denmark)

    Møller, Steen Henrik; Hansen, Steffen W; Rousing, Tine

    2012-01-01

    European Fur Breeder's Association initiated the "WelFur" project in 2009 in order to develop a welfare assessment protocol for mink and fox farms after the Welfare Quality® standards. The assessment is based on four welfare principles (good feeding, good housing, good health and appropriate behaviour) ...... mink production seasons: Winter, spring, and autumn, in order to cover the life cycle of mink, and proved feasible for a one-day visit.

  3. Some recent developments in the surface-analytical application of X-ray fluorescence spectrometry

    International Nuclear Information System (INIS)

    Gries, W.H.

    1987-01-01

    Standard depth profiles of an analyte deposited into (diffusion or ion implantation) or on (thin-film deposition) a plane surface can be analyzed for profile type and centroid depth or film thickness by means of a standardless method in which the matrix-attenuated signals of the fluorescing analyte measured at two different take-off angles are related to the mathematical distribution moments of the profile. For a binary thin film the element ratio can also be established. Results obtained on phosphorus profiles in silicon and on zinc sulphide optical coatings are referred to. The quantity or concentration level can be determined by use of a reference standard which may contain the analyte in an entirely different distribution. This simplifies the calibration of secondary reference standards. A good lateral resolution in the sub-millimeter range can be achieved with synchrotron radiation. A further improvement of lateral resolution is possible by direct excitation with electron microbeams, though at significantly inferior detection limits. (orig.)

  4. Comparison of IAEA protocols for clinical electron beam dosimetry

    International Nuclear Information System (INIS)

    Novotny, J.; Soukup, M.

    2002-01-01

    In most beam calibration protocols used so far in clinical practice, the method recommended for the determination of absorbed dose to water in high-energy electron beams is based on either an exposure or an air kerma calibration factor of an ionisation chamber in a Co-60 gamma-ray or 2 MV x-ray beam. These protocols are complex, and the overall uncertainty in the absorbed dose to water under reference conditions is about 3-4%. The new generation of protocols, namely IAEA TRS-398, is based on absorbed dose-to-water standards in photon beams from Co-60 and accelerator beams. The possible errors in absorbed dose determination under reference conditions in practical clinical dosimetry caused by replacing the TRS-277 and TRS-381 protocols with the new TRS-398 protocol were carefully studied for clinical electron beams in the energy range 6-20 MeV. All measurements were performed on a Varian CLINAC 2100 C linear accelerator, with electron beam energies ranging from 6 to 20 MeV. Three different detectors were used: a PTW Roos plane-parallel ionization chamber, a calibrated PTW 30002 Farmer-type ionization chamber, and a Scanditronix electron diode detector. Central-axis percentage depth doses were measured with the diode using a Wellhoefer WP700 beam scanner in a 40 cm x 40 cm x 50 cm water phantom. A reference chamber or semiconductor diode mounted on the electron treatment cone was used to correct for beam output variations during scanning. Absolute dose measurements were carried out with the Roos plane-parallel chamber connected to a PTW UNIDOS electrometer, always for a preselected number of monitor units. In the new IAEA dosimetry protocol, clinical reference dosimetry for electron beams is performed at a depth d_ref = 0.6 R50 - 0.1 [cm] instead of at d_max as in the previous protocols. To check the stability of the electron beam energies and to establish d_ref and the standard deviation of the reference depth position, the depth dose curves obtained during the quality
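
    The reference depth quoted above is a one-line formula, d_ref = 0.6 R50 - 0.1 cm, with R50 the half-value depth in water. A small worked illustration (the R50 values are hypothetical):

        def reference_depth_cm(r50_cm):
            """TRS-398 reference depth for electron beams: d_ref = 0.6 * R50 - 0.1 (cm)."""
            return 0.6 * r50_cm - 0.1

        for r50 in (2.5, 4.1, 8.2):     # hypothetical R50 values in cm
            print(f"R50 = {r50} cm -> d_ref = {reference_depth_cm(r50):.2f} cm")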

  5. DELAMINATION AND XRF ANALYSIS OF NIST LEAD IN PAINT FILM STANDARDS

    Science.gov (United States)

    The objectives of this protocol were to remove the laminate coating from lead paint film standards acquired from NIST by means of surface heating. The average XRF value did not change after removal of the polymer coating suggesting that this protocol is satisfactory for renderin...

  6. Surface and subsurface cleanup protocol for radionuclides, Gunnison, Colorado, UMTRA project processing site. Final [report

    Energy Technology Data Exchange (ETDEWEB)

    1993-09-01

    Surface and subsurface soil cleanup protocols for the Gunnison, Colorado, processing site are summarized as follows: In accordance with EPA-promulgated land cleanup standards (40 CFR 192), in situ Ra-226 is to be cleaned up based on bulk concentrations not exceeding 5 and 15 pCi/g in 15-cm surface and subsurface depth increments, averaged over 100-m² grid blocks, where the parent Ra-226 concentrations are greater than, or in secular equilibrium with, the Th-230 parent. A bulk interpretation of these EPA standards has been accepted by the Nuclear Regulatory Commission (NRC); although the finer-sized soil fraction (less than a No. 4 mesh sieve) contains the higher concentration of radioactivity, the bulk approach in effect integrates the total sample radioactivity over the entire sample mass. In locations where Th-230 has differentially migrated in subsoil relative to Ra-226, a Th-230 cleanup protocol has been developed, in accordance with the supplemental standards provisions of 40 CFR 192, for NRC/Colorado Department of Health (CDH) approval and timely implementation. Detailed elements of the protocol are contained in Appendix A, Generic Protocol for Thorium-230 Cleanup/Verification at UMTRA Project Processing Sites. The cleanup of other radionuclides or nonradiological hazards that pose a significant threat to the public and the environment will be determined and implemented in accordance with pathway analysis to assess impacts and the implications of ALARA specified in 40 CFR 192 relative to supplemental standards.
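
    The bulk Ra-226 criterion described above reduces to comparing a grid-block average against the 5 pCi/g (surface) or 15 pCi/g (subsurface) limit for each 15-cm depth increment. A hedged sketch with invented sample concentrations:

        def block_complies(sample_concentrations_pci_g, surface_layer=True):
            """Average the Ra-226 concentrations over one 100-m2 grid block and compare
            with the applicable limit (5 pCi/g surface, 15 pCi/g subsurface)."""
            limit = 5.0 if surface_layer else 15.0
            average = sum(sample_concentrations_pci_g) / len(sample_concentrations_pci_g)
            return average <= limit, average

        ok, avg = block_complies([3.2, 6.8, 4.1, 5.5], surface_layer=True)   # hypothetical samples
        print(f"block average {avg:.1f} pCi/g ->", "meets standard" if ok else "requires remediation")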

  7. Dual-energy CT workflow: multi-institutional consensus on standardization of abdominopelvic MDCT protocols.

    Science.gov (United States)

    Patel, Bhavik N; Alexander, Lauren; Allen, Brian; Berland, Lincoln; Borhani, Amir; Mileto, Achille; Moreno, Courtney; Morgan, Desiree; Sahani, Dushyant; Shuman, William; Tamm, Eric; Tublin, Mitchell; Yeh, Benjamin; Marin, Daniele

    2017-03-01

    To standardize workflow for dual-energy computed tomography (DECT) involving common abdominopelvic exam protocols. Nine institutions (4 rsDECT, 1 dsDECT, 4 both) with 32 participants [average number of years (range) in practice and in DECT experience, 12.3 (1-35) and 4.6 (1-14), respectively] filled out a single survey (n = 9). A five-point agreement scale (0 = contraindicated, 1 = not indicated, 2 = mildly indicated, 3 = moderately indicated, 4 = strongly indicated) and a utilization scale (0 = not performing and shouldn't; 1 = performing but not clinically useful; 2 = performing but not sure if clinically useful; 3 = not performing but would like to; 4 = performing and clinically useful) were used. Consensus was defined as a score of ≥2.5. Survey results were discussed over three separate live webinar sessions. 5/9 (56%) institutions exclude large patients from DECT; 2 (40%) use weight, 2 (40%) use transverse dimension, and 1 (20%) uses both. 7/9 (78%) use 50 keV for low and 70 keV for medium monochromatic reconstructed images. DECT is indicated for dual liver [agreement score (AS) 3.78; utilization score (US) 3.22] and dual pancreas in the arterial phase (AS 3.78; US 3.11), mesenteric ischemia/gastrointestinal bleeding in both the arterial and venous phases (AS 2.89; US 2.79), RCC exams in the arterial phase (AS 3.33; US 2.78), and CT urography in the nephrographic phase (AS 3.11; US 2.89). DECT for renal stone and certain single-phase exams is indicated (AS 3.00). DECT is indicated during the arterial phase for multiphasic abdominal exams, the nephrographic phase for CTU, and for certain single-phase and renal stone exams.

  8. A field protocol to monitor cavity-nesting birds

    Science.gov (United States)

    J. Dudley; V. Saab

    2003-01-01

    We developed a field protocol to monitor populations of cavity-nesting birds in burned and unburned coniferous forests of western North America. Standardized field methods are described for implementing long-term monitoring strategies and for conducting field research to evaluate the effects of habitat change on cavity-nesting birds. Key references (but not...

  9. Analytical study on the determination of boron in environmental water samples

    International Nuclear Information System (INIS)

    Lopez, F.J.; Gimenez, E.; Hernandez, F.

    1993-01-01

    An analytical study on the determination of boron in environmental water samples was carried out. The curcumin and carmine standard methods were compared with the more recent Azomethine-H method in order to evaluate their analytical characteristics and feasibility for the analysis of boron in water samples. Analyses of synthetic water, ground water, sea water and waste water samples were carried out and a statistical evaluation of the results was made. The Azomethine-H method was found to be the most sensitive (detection limit 0.02 mg l-1) and selective (no interference from commonly occurring ions in water was observed), and it also showed the best precision (relative standard deviation lower than 4%). Moreover, it gave good results for all types of samples analyzed. The accuracy of this method was tested by the addition of known amounts of standard solutions to different types of water samples. The slopes of the standard additions and direct calibration graphs were similar, and recoveries of added boron ranged from 99 to 107%. (orig.)
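
    The recovery figures quoted above follow from a simple ratio: the measured increase after spiking divided by the amount of boron added. A short illustration with invented concentrations:

        def recovery_percent(measured_spiked, measured_unspiked, added):
            return 100.0 * (measured_spiked - measured_unspiked) / added

        # Hypothetical ground water sample: 0.35 mg/l native boron, spiked with 0.50 mg/l
        print(f"{recovery_percent(0.86, 0.35, 0.50):.0f}% recovery")   # about 102%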

  10. Entanglement fidelity of the standard quantum teleportation channel

    Energy Technology Data Exchange (ETDEWEB)

    Li, Gang; Ye, Ming-Yong, E-mail: myye@fjnu.edu.cn; Lin, Xiu-Min

    2013-09-16

    We consider the standard quantum teleportation protocol in which a general bipartite state is used as the entanglement resource. We use the entanglement fidelity to describe how well the standard quantum teleportation channel transmits quantum entanglement and give a simple expression for the entanglement fidelity when it is averaged over all input states.

  11. Cryotherapy for acute ankle sprains: a randomised controlled study of two different icing protocols.

    Science.gov (United States)

    Bleakley, C M; McDonough, S M; MacAuley, D C; Bjordal, J

    2006-08-01

    The use of cryotherapy in the management of acute soft tissue injury is largely based on anecdotal evidence. Preliminary evidence suggests that intermittent cryotherapy applications are most effective at reducing tissue temperature to optimal therapeutic levels. However, its efficacy in treating injured human subjects is not yet known. To compare the efficacy of an intermittent cryotherapy treatment protocol with a standard cryotherapy treatment protocol in the management of acute ankle sprains. Sportsmen (n = 44) and members of the general public (n = 45) with mild/moderate acute ankle sprains. Subjects were randomly allocated, under strictly controlled double blind conditions, to one of two treatment groups: standard ice application (n = 46) or intermittent ice application (n = 43). The mode of cryotherapy was standardised across groups and consisted of melting iced water (0 degrees C) in a standardised pack. Function, pain, and swelling were recorded at baseline and one, two, three, four, and six weeks after injury. Subjects treated with the intermittent protocol had significantly (p<0.05) less ankle pain on activity than those using a standard 20 minute protocol; however, one week after ankle injury, there were no significant differences between groups in terms of function, swelling, or pain at rest. Intermittent applications may enhance the therapeutic effect of ice in pain relief after acute soft tissue injury.

  12. Integrated Array/Metadata Analytics

    Science.gov (United States)

    Misev, Dimitar; Baumann, Peter

    2015-04-01

    Data comes in various forms and types, and integration usually presents a problem that is often simply ignored and solved with ad-hoc solutions. Multidimensional arrays are a ubiquitous data type that we find at the core of virtually all science and engineering domains, as sensor, model, image, and statistics data. Naturally, arrays are richly described by, and intertwined with, additional metadata (alphanumeric relational data, XML, JSON, etc.). Database systems, however, a fundamental building block of what we call "Big Data", lack adequate support for modelling and expressing these array data/metadata relationships. Array analytics is hence quite primitive, or non-existent, in modern relational DBMSs. Recognizing this, we extended SQL with a new SQL/MDA part, seamlessly integrating multidimensional array analytics into the standard database query language. We demonstrate the benefits of SQL/MDA with real-world examples executed in ASQLDB, an open-source mediator system based on HSQLDB and rasdaman that already implements SQL/MDA.

  13. Detection of p-coumaric acid from cell supernatant using surface enhanced Raman scattering

    DEFF Research Database (Denmark)

    Morelli, Lidia; Jendresen, Christian Bille; Zor, Kinga

    2017-01-01

    A standard protocol for the analysis of microbial factories requires the screening of several populations in order to find the best-performing ones. Standard analytical methods usually include high performance liquid chromatography (HPLC), thin layer chromatography (TLC) or spectrophotometry, which...

  14. On the need of improved Accelerated Degradation Protocols (ADPs)

    DEFF Research Database (Denmark)

    Pizzutilo, E.; Geiger, S.; Grote, J. P.

    2016-01-01

    protocol covering the whole potential range from 0.6 to 1.5 VRHE. The latter is typically not addressed in literature. This finding is explained by taking into account platinum catalyzed carbon corrosion and transient platinum dissolution. Based on the obtained results, the question is raised...... protocols that are commonly used in the fuel cell community to simulate load cycle and start-stop conditions in proton exchange membrane fuel cells (PEMFCs). In contrast to previous assumptions, claiming a separation between carbon corrosion and platinum dissolution, in both standard protocols platinum...... dissolution and carbon corrosion are present at low rates, which is also reflected by a comparably low ECSA decrease. On the other hand, a huge increase in rate of both processes is observed during transitions from low to high potential regimes experienced by a PEMFC in operation, here studied in a third...

  15. A Group Neighborhood Average Clock Synchronization Protocol for Wireless Sensor Networks

    Science.gov (United States)

    Lin, Lin; Ma, Shiwei; Ma, Maode

    2014-01-01

    Clock synchronization is a very important issue for wireless sensor network applications. The sensors need to keep strict clocks so that users know exactly when events happen in the monitoring area. This paper proposes a novel internal distributed clock synchronization solution using group neighborhood averaging. Each sensor node collects the offsets and skew rates of its neighbors. Group averages of the offset and skew rate values are calculated instead of using the conventional point-to-point averaging method. The sensor node then returns the compensated values to its neighbors. The propagation delay is considered and compensated for. An analytical treatment of offset and skew compensation is presented. Simulation results validate the effectiveness of the protocol and reveal that it allows sensor networks to quickly establish a consensus clock and maintain a small deviation from that consensus clock. PMID:25120163
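
    A toy sketch of the group-neighborhood-average idea follows; it is not the authors' algorithm, and the delay handling and the reported neighbor values are assumptions for illustration only.

        def group_average_correction(neighbor_offsets, neighbor_skews, propagation_delay=0.0):
            """Average the offsets and skew rates reported by all neighbors (after a
            simple propagation-delay correction) instead of pairwise averaging."""
            n = len(neighbor_offsets)
            offset_avg = sum(o - propagation_delay for o in neighbor_offsets) / n
            skew_avg = sum(neighbor_skews) / n
            return offset_avg, skew_avg

        offsets = [0.012, 0.009, 0.015]      # seconds, hypothetical neighbor reports
        skews = [3.1, 2.8, 3.4]              # ppm, hypothetical neighbor reports
        print(group_average_correction(offsets, skews, propagation_delay=0.001))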

  16. Analytical Chemistry Laboratory, progress report for FY 1993

    Energy Technology Data Exchange (ETDEWEB)

    1993-12-01

    The purpose of this report is to summarize the activities of the Analytical Chemistry Laboratory (ACL) at Argonne National Laboratory (ANL) for Fiscal Year (FY) 1993 (October 1992 through September 1993). This annual report is the tenth for the ACL and describes continuing effort on projects, work on new projects, and contributions of the ACL staff to various programs at ANL. The Analytical Chemistry Laboratory is a full-cost-recovery service center, with the primary mission of providing a broad range of analytical chemistry support services to the scientific and engineering programs at ANL. The ACL also has research programs in analytical chemistry, conducts instrumental and methods development, and provides analytical services for governmental, educational, and industrial organizations. The ACL handles a wide range of analytical problems. Some routine or standard analyses are done, but it is common for the Argonne programs to generate unique problems that require development or modification of methods and adaption of techniques to obtain useful analytical data. The ACL is administratively within the Chemical Technology Division (CMT), its principal ANL client, but provides technical support for many of the technical divisions and programs at ANL. The ACL has four technical groups--Chemical Analysis, Instrumental Analysis, Organic Analysis, and Environmental Analysis--which together include about 45 technical staff members. Talents and interests of staff members cross the group lines, as do many projects within the ACL.

  17. Inter-comparison of NIOSH and IMPROVE protocols for OC and EC determination: implications for inter-protocol data conversion

    Science.gov (United States)

    Wu, Cheng; Huang, X. H. Hilda; Ng, Wai Man; Griffith, Stephen M.; Zhen Yu, Jian

    2016-09-01

    Organic carbon (OC) and elemental carbon (EC) are operationally defined by analytical methods. As a result, OC and EC measurements are protocol dependent, leading to uncertainties in their quantification. In this study, more than 1300 Hong Kong samples were analyzed using both National Institute for Occupational Safety and Health (NIOSH) thermal optical transmittance (TOT) and Interagency Monitoring of Protected Visual Environment (IMPROVE) thermal optical reflectance (TOR) protocols to explore the cause of EC disagreement between the two protocols. EC discrepancy mainly (83 %) arises from a difference in peak inert mode temperature, which determines the allocation of OC4NSH, while the rest (17 %) is attributed to a difference in the optical method (transmittance vs. reflectance) applied for the charring correction. Evidence shows that the magnitude of the EC discrepancy is positively correlated with the intensity of the biomass burning signal, whereby biomass burning increases the fraction of OC4NSH and widens the disagreement in the inter-protocol EC determination. It is also found that the EC discrepancy is positively correlated with the abundance of metal oxide in the samples. Two approaches (M1 and M2) that translate NIOSH TOT OC and EC data into IMPROVE TOR OC and EC data are proposed. M1 uses direct relationship between ECNSH_TOT and ECIMP_TOR for reconstruction: M1 : ECIMP_TOR = a × ECNSH_TOT + b; while M2 deconstructs ECIMP_TOR into several terms based on analysis principles and applies regression only on the unknown terms: M2 : ECIMP_TOR = AECNSH + OC4NSH - (a × PCNSH_TOR + b), where AECNSH, apparent EC by the NIOSH protocol, is the carbon that evolves in the He-O2 analysis stage, OC4NSH is the carbon that evolves at the fourth temperature step of the pure helium analysis stage of NIOSH, and PCNSH_TOR is the pyrolyzed carbon as determined by the NIOSH protocol. The implementation of M1 to all urban site data (without considering seasonal specificity
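
    The two conversion approaches quoted above translate directly into code; in the sketch below the regression coefficients a and b and the sample values are placeholders, since in practice they come from fitting collocated NIOSH and IMPROVE measurements.

        def m1_ec_improve(ec_niosh_tot, a, b):
            """M1: EC_IMP_TOR = a * EC_NSH_TOT + b."""
            return a * ec_niosh_tot + b

        def m2_ec_improve(aec_nsh, oc4_nsh, pc_nsh_tor, a, b):
            """M2: EC_IMP_TOR = AEC_NSH + OC4_NSH - (a * PC_NSH_TOR + b)."""
            return aec_nsh + oc4_nsh - (a * pc_nsh_tor + b)

        # Hypothetical sample, all in ug C per m^3
        print(m1_ec_improve(2.4, a=1.3, b=0.1))
        print(m2_ec_improve(aec_nsh=2.0, oc4_nsh=1.1, pc_nsh_tor=0.8, a=0.9, b=0.05))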

  18. Superhydrophobic analyte concentration utilizing colloid-pillar array SERS substrates.

    Science.gov (United States)

    Wallace, Ryan A; Charlton, Jennifer J; Kirchner, Teresa B; Lavrik, Nickolay V; Datskos, Panos G; Sepaniak, Michael J

    2014-12-02

    The ability to detect a few molecules present in a large sample is of great interest for the detection of trace components in both medicinal and environmental samples. Surface-enhanced Raman spectroscopy (SERS) is a technique that can be utilized to detect molecules at very low absolute numbers. However, detection at trace concentration levels in real samples requires properly designed delivery and detection systems. The present work involves superhydrophobic surfaces built on a framework of deterministic or stochastic silicon pillar arrays formed by lithographic or metal-dewetting protocols, respectively. In order to generate the necessary plasmonic substrate for SERS detection, a simple, flow-stable Ag colloid was added to the functionalized pillar array system via soaking. Native pillars and pillars with hydrophobic modification are used. The pillars provide a means to concentrate analyte via superhydrophobic droplet evaporation effects. A ≥ 100-fold concentration of analyte was estimated, with a limit of detection of 2.9 × 10⁻¹² M for mitoxantrone dihydrochloride. Additionally, analytes were delivered to the surface via a multiplex approach in order to demonstrate the ability to control droplet size and placement for scaled-up use in real-world applications. Finally, a concentration process involving transport and sequestration based on surface-treatment-selective wicking is demonstrated.
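
    For context on what such a molar detection limit implies in absolute molecule numbers, the short calculation below converts the reported limit of detection into a molecule count for an assumed droplet volume; the 1 µL volume is an illustrative assumption, not a value taken from the study.

```python
# Illustrative back-of-the-envelope calculation (droplet volume is assumed).
AVOGADRO = 6.022e23        # molecules per mole
lod_molar = 2.9e-12        # mol/L, reported limit of detection
droplet_volume_l = 1.0e-6  # L (1 µL), assumed droplet volume

molecules_at_lod = lod_molar * droplet_volume_l * AVOGADRO
print(f"~{molecules_at_lod:.1e} molecules in a 1 µL droplet at the LOD")
# roughly 1.7e+06 molecules, before any superhydrophobic evaporative concentration
```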

  19. Traffic Adaptive MAC Protocols in Wireless Body Area Networks

    Directory of Open Access Journals (Sweden)

    Farhan Masud

    2017-01-01

    Full Text Available In Wireless Body Area Networks (WBANs), every healthcare application that is based on physical sensors is responsible for monitoring the patient's vital-signs data. WBAN applications involve heterogeneous and dynamic traffic loads: routine patient observation is described as low-load traffic, while an alarming situation, unpredictable by nature, is referred to as high-load traffic. This paper offers a thematic review of traffic-adaptive Medium Access Control (MAC) protocols in WBANs. First, we categorize them based on their goals, methods, and metrics of evaluation. The ZigBee standard IEEE 802.15.4 and the baseline MAC IEEE 802.15.6 are also reviewed in terms of traffic-adaptive approaches. Furthermore, a comparative analysis of the protocols is made and their performance is analyzed in terms of delay, packet delivery ratio (PDR), and energy consumption. The literature shows that no review work has been done on traffic-adaptive MAC protocols in WBANs. This review could therefore contribute to the enhancement of traffic-adaptive MAC protocols and stimulate better solutions to the traffic-adaptivity problem.
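
    As a purely illustrative sketch of the traffic-adaptivity idea discussed in the review (not any specific protocol it covers), the toy policy below shortens the superframe and allocates more active slots when the sensed packet rate rises or an alarm condition is flagged; the thresholds, superframe lengths, and slot counts are arbitrary assumptions.

```python
from dataclasses import dataclass

@dataclass
class MacConfig:
    superframe_ms: int   # superframe length in milliseconds
    active_slots: int    # slots a node is allocated or may contend for

def adapt_mac(packet_rate_hz: float, alarm: bool) -> MacConfig:
    """Toy traffic-adaptive MAC policy: react to traffic load and alarms."""
    if alarm or packet_rate_hz > 50:            # high-load / emergency traffic
        return MacConfig(superframe_ms=50, active_slots=16)
    if packet_rate_hz > 5:                      # moderate load
        return MacConfig(superframe_ms=200, active_slots=8)
    return MacConfig(superframe_ms=1000, active_slots=2)  # routine monitoring
```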

  20. A Secure Key Establishment Protocol for ZigBee Wireless Sensor Networks

    DEFF Research Database (Denmark)

    Yuksel, Ender; Nielson, Hanne Riis; Nielson, Flemming

    2009-01-01

    ZigBee is a wireless sensor network standard that defines network and application layers on top of IEEE 802.15.4’s physical and medium access control layers. In the latest version of ZigBee, enhancements are prescribed for the security sublayer, but we show in this paper that problems persist. In particular, we show that the End-to-End Application Key Establishment Protocol is flawed, and we propose a secure protocol instead. We do so by using formal verification techniques based on static program analysis and process algebras. We present a way of using formal methods in wireless network security and propose a secure key establishment protocol for ZigBee networks.