WorldWideScience

Sample records for identify evaluate verify

  1. Evaluation of verifiability in HAL/S [programming language for aerospace computers]

    Science.gov (United States)

    Young, W. D.; Tripathi, A. R.; Good, D. I.; Browne, J. C.

    1979-01-01

    The ability to write verifiable programs in HAL/S, a characteristic highly desirable in aerospace applications, is lacking, since many of the features of HAL/S do not lend themselves to existing verification techniques. The methods of language evaluation are described, along with the means by which language features are evaluated for verifiability. These methods are applied in this study to various features of HAL/S to identify specific areas in which the language fails with respect to verifiability. Some conclusions are drawn for the design of programming languages for aerospace applications, and ongoing work to identify a verifiable subset of HAL/S is described.

  2. A performance evaluation of personnel identity verifiers

    International Nuclear Information System (INIS)

    Maxwell, R.L.; Wright, L.J.

    1987-01-01

    Personnel identity verification devices, which are based on the examination and assessment of a body feature or a unique repeatable personal action, are steadily improving. These biometric devices are becoming more practical with respect to accuracy, speed, user compatibility, reliability and cost, but more development is necessary to satisfy the varied and sometimes ill-defined future requirements of the security industry. In an attempt to maintain an awareness of the availability and the capabilities of identity verifiers for the DOE security community, Sandia Laboratories continues to comparatively evaluate the capabilities and improvements of developing devices. An evaluation of several recently available verifiers is discussed in this paper. Operating environments and procedures more typical of physical access control use can reveal performance substantially different from that seen in basic laboratory tests.

  3. Software Platform Evaluation - Verifiable Fuel Cycle Simulation (VISION) Model

    International Nuclear Information System (INIS)

    J. J. Jacobson; D. E. Shropshire; W. B. West

    2005-01-01

    The purpose of this Software Platform Evaluation (SPE) is to document the top-level evaluation of potential software platforms on which to construct a simulation model that satisfies the requirements for a Verifiable Fuel Cycle Simulation Model (VISION) of the Advanced Fuel Cycle (AFC). See the Software Requirements Specification for Verifiable Fuel Cycle Simulation (VISION) Model (INEEL/EXT-05-02643, Rev. 0) for a discussion of the objective and scope of the VISION model. VISION is intended to serve as a broad systems analysis and study tool applicable to work conducted as part of the AFCI (including cost estimates) and Generation IV reactor development studies. This document will serve as a guide for selecting the most appropriate software platform for VISION. This is a "living document" that will be modified over the course of the execution of this work.

  4. USCIS E-Verify Program Reports

    Data.gov (United States)

    Department of Homeland Security — The report builds on the last comprehensive evaluation of the E-Verify Program and demonstrates that E-Verify produces accurate results and that accuracy rates have...

  5. Reasoning about knowledge: Children's evaluations of generality and verifiability.

    Science.gov (United States)

    Koenig, Melissa A; Cole, Caitlin A; Meyer, Meredith; Ridge, Katherine E; Kushnir, Tamar; Gelman, Susan A

    2015-12-01

    In a series of experiments, we examined 3- to 8-year-old children's (N=223) and adults' (N=32) use of two properties of testimony to estimate a speaker's knowledge: generality and verifiability. Participants were presented with a "Generic speaker" who made a series of 4 general claims about "pangolins" (a novel animal kind), and a "Specific speaker" who made a series of 4 specific claims about "this pangolin" as an individual. To investigate the role of verifiability, we systematically varied whether the claim referred to a perceptually-obvious feature visible in a picture (e.g., "has a pointy nose") or a non-evident feature that was not visible (e.g., "sleeps in a hollow tree"). Three main findings emerged: (1) young children showed a pronounced reliance on verifiability that decreased with age. Three-year-old children were especially prone to credit knowledge to speakers who made verifiable claims, whereas 7- to 8-year-olds and adults credited knowledge to generic speakers regardless of whether the claims were verifiable; (2) children's attributions of knowledge to generic speakers were not detectable until age 5, and only when those claims were also verifiable; (3) children often generalized speakers' knowledge outside of the pangolin domain, indicating a belief that a person's knowledge about pangolins likely extends to new facts. Findings indicate that young children may be inclined to doubt speakers who make claims they cannot verify themselves, and reveal a developmentally increasing appreciation for speakers who make general claims. Copyright © 2015 Elsevier Inc. All rights reserved.

  6. [The development and evaluation of software to verify diagnostic accuracy].

    Science.gov (United States)

    Jensen, Rodrigo; de Moraes Lopes, Maria Helena Baena; Silveira, Paulo Sérgio Panse; Ortega, Neli Regina Siqueira

    2012-02-01

    This article describes the development and evaluation of software that verifies the accuracy of diagnoses made by nursing students. The software was based on a model that uses fuzzy logic concepts and was implemented in PERL with a MySQL database for Internet accessibility, using the NANDA-I 2007-2008 classification system. The software was evaluated in terms of its technical quality and usability through specific instruments. The activity proposed in the software involves four stages in which students establish the relationship values between nursing diagnoses, defining characteristics/risk factors and clinical cases. The relationship values determined by students are compared to those of specialists, generating performance scores for the students. In the evaluation, the software demonstrated satisfactory outcomes regarding technical quality and, according to the students, helped in their learning and may become an educational tool to teach the process of nursing diagnosis.
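
    A minimal sketch of the scoring idea described above, assuming fuzzy relationship values in [0, 1]. The names and numbers are illustrative, and the authors' actual system was implemented in PERL with MySQL, not Python.

    ```python
    # Minimal sketch (not the authors' PERL/MySQL system): score a student's
    # diagnosis by comparing her relationship values against a specialist's.
    # Values are fuzzy memberships in [0, 1]; diagnosis names are illustrative.

    def performance_score(student: dict, expert: dict) -> float:
        """Return 1 minus the mean absolute difference over the expert's items."""
        keys = expert.keys()
        error = sum(abs(student.get(k, 0.0) - expert[k]) for k in keys) / len(keys)
        return 1.0 - error

    expert = {"ineffective airway clearance": 0.9, "acute pain": 0.3}
    student = {"ineffective airway clearance": 0.7, "acute pain": 0.5}
    print(f"score = {performance_score(student, expert):.2f}")  # score = 0.80
    ```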

  7. Reasoning about knowledge: Children’s evaluations of generality and verifiability

    Science.gov (United States)

    Koenig, Melissa A.; Cole, Caitlin A.; Meyer, Meredith; Ridge, Katherine E.; Kushnir, Tamar; Gelman, Susan A.

    2015-01-01

    In a series of experiments, we examined 3- to 8-year-old children’s (N = 223) and adults’ (N = 32) use of two properties of testimony to estimate a speaker’s knowledge: generality and verifiability. Participants were presented with a “Generic speaker” who made a series of 4 general claims about “pangolins” (a novel animal kind), and a “Specific speaker” who made a series of 4 specific claims about “this pangolin” as an individual. To investigate the role of verifiability, we systematically varied whether the claim referred to a perceptually-obvious feature visible in a picture (e.g., “has a pointy nose”) or a non-evident feature that was not visible (e.g., “sleeps in a hollow tree”). Three main findings emerged: (1) Young children showed a pronounced reliance on verifiability that decreased with age. Three-year-old children were especially prone to credit knowledge to speakers who made verifiable claims, whereas 7- to 8-year-olds and adults credited knowledge to generic speakers regardless of whether the claims were verifiable; (2) Children’s attributions of knowledge to generic speakers were not detectable until age 5, and only when those claims were also verifiable; (3) Children often generalized speakers’ knowledge outside of the pangolin domain, indicating a belief that a person’s knowledge about pangolins likely extends to new facts. Findings indicate that young children may be inclined to doubt speakers who make claims they cannot verify themselves, and reveal a developmentally increasing appreciation for speakers who make general claims. PMID:26451884

  8. Status of personnel identity verifiers

    International Nuclear Information System (INIS)

    Maxwell, R.L.

    1985-01-01

    Identity verification devices based on the interrogation of six different human biometric features or actions now exist and in general have been in development for about ten years. The capability of these devices to meet the cost and operational requirements of speed, accuracy, ease of use and reliability has generally increased, although the verifier industry is still immature. Sandia Laboratories makes a continuing effort to stay abreast of identity verifier developments and to assess the capabilities and improvements of each device. Operating environments and procedures more typical of field use can often reveal performance results substantially different from laboratory tests. An evaluation of several recently available verifiers is reported herein.

  9. Methods to verify absorbed dose of irradiated containers and evaluation of dosimeters

    International Nuclear Information System (INIS)

    Gao Meixu; Wang Chuanyao; Tang Zhangxong; Li Shurong

    2001-01-01

    The research on dose distribution in irradiated food containers and an evaluation of several methods to verify absorbed dose were carried out. The minimum absorbed dose in the five treated orange containers occurred at the top of the highest or the bottom of the lowest container. The Dmax/Dmin ratio in this study was 1.45 for containers irradiated in a commercial 60Co facility. The density of the orange containers was about 0.391 g/cm3. The evaluation of dosimeters showed that the PMMA-YL and clear PMMA dosimeters have a linear dose response, and the word NOT in the STERIN-125 and STERIN-300 indicators was covered completely at doses of 125 and 300 Gy, respectively. (author)
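
    As a worked illustration of the quoted uniformity figure: only the reported Dmax/Dmin ratio is taken from the study; the minimum-dose target below is a hypothetical value chosen for the example.

    ```python
    # Illustrative arithmetic only: the 300 Gy minimum process dose is an
    # assumed target, not a value from the study; only the ratio is reported.
    dur = 1.45                 # reported Dmax/Dmin for the orange containers
    d_min_target = 300.0       # Gy, hypothetical minimum process dose
    print(f"expected Dmax = {dur * d_min_target:.0f} Gy")   # 435 Gy
    ```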

  10. The status of personnel identity verifiers

    International Nuclear Information System (INIS)

    Maxwell, R.L.

    1985-01-01

    Identity verification devices based on the interrogation of six different human biometric features or actions now exist and in general have been in development for about ten years. The capability of these devices to meet the cost and operational requirements of speed, accuracy, ease of use and reliability has generally increased, although the verifier industry is still immature. Sandia Laboratories makes a continuing effort to stay abreast of identity verifier developments and to assess the capabilities and improvements of each device. Operating environments and procedures more typical of field use can often reveal performance results substantially different from laboratory tests. An evaluation of several recently available verifiers is reported herein.

  11. Appraising the value of independent EIA follow-up verifiers

    Energy Technology Data Exchange (ETDEWEB)

    Wessels, Jan-Albert, E-mail: janalbert.wessels@nwu.ac.za [School of Geo and Spatial Sciences, Department of Geography and Environmental Management, North-West University, C/O Hoffman and Borcherd Street, Potchefstroom, 2520 (South Africa); Retief, Francois, E-mail: francois.retief@nwu.ac.za [School of Geo and Spatial Sciences, Department of Geography and Environmental Management, North-West University, C/O Hoffman and Borcherd Street, Potchefstroom, 2520 (South Africa); Morrison-Saunders, Angus, E-mail: A.Morrison-Saunders@murdoch.edu.au [School of Geo and Spatial Sciences, Department of Geography and Environmental Management, North-West University, C/O Hoffman and Borcherd Street, Potchefstroom, 2520 (South Africa); Environmental Assessment, School of Environmental Science, Murdoch University (Australia)]

    2015-01-15

    Independent Environmental Impact Assessment (EIA) follow-up verifiers such as monitoring agencies, checkers, supervisors and control officers are active on various construction sites across the world. There are, however, differing views on the value that these verifiers add and very limited learning in EIA has been drawn from independent verifiers. This paper aims to appraise how and to what extent independent EIA follow-up verifiers add value in major construction projects in the developing country context of South Africa. A framework for appraising the role of independent verifiers was established and four South African case studies were examined through a mixture of site visits, project document analysis, and interviews. Appraisal results were documented in the performance areas of: planning, doing, checking, acting, public participating and integration with other programs. The results indicate that independent verifiers add most value to major construction projects when involved with screening EIA requirements of new projects, allocation of financial and human resources, checking legal compliance, influencing implementation, reporting conformance results, community and stakeholder engagement, integration with self-responsibility programs such as environmental management systems (EMS), and controlling records. It was apparent that verifiers could be more creatively utilized in pre-construction preparation, providing feedback of knowledge into assessment of new projects, giving input to the planning and design phase of projects, and performance evaluation. The study confirms the benefits of proponent and regulator follow-up, specifically in having independent verifiers that disclose information, facilitate discussion among stakeholders, are adaptable and proactive, aid in the integration of EIA with other programs, and instill trust in EIA enforcement by conformance evaluation. Overall, the study provides insight on how to harness the learning opportunities

  12. Appraising the value of independent EIA follow-up verifiers

    International Nuclear Information System (INIS)

    Wessels, Jan-Albert; Retief, Francois; Morrison-Saunders, Angus

    2015-01-01

    Independent Environmental Impact Assessment (EIA) follow-up verifiers such as monitoring agencies, checkers, supervisors and control officers are active on various construction sites across the world. There are, however, differing views on the value that these verifiers add and very limited learning in EIA has been drawn from independent verifiers. This paper aims to appraise how and to what extent independent EIA follow-up verifiers add value in major construction projects in the developing country context of South Africa. A framework for appraising the role of independent verifiers was established and four South African case studies were examined through a mixture of site visits, project document analysis, and interviews. Appraisal results were documented in the performance areas of: planning, doing, checking, acting, public participating and integration with other programs. The results indicate that independent verifiers add most value to major construction projects when involved with screening EIA requirements of new projects, allocation of financial and human resources, checking legal compliance, influencing implementation, reporting conformance results, community and stakeholder engagement, integration with self-responsibility programs such as environmental management systems (EMS), and controlling records. It was apparent that verifiers could be more creatively utilized in pre-construction preparation, providing feedback of knowledge into assessment of new projects, giving input to the planning and design phase of projects, and performance evaluation. The study confirms the benefits of proponent and regulator follow-up, specifically in having independent verifiers that disclose information, facilitate discussion among stakeholders, are adaptable and proactive, aid in the integration of EIA with other programs, and instill trust in EIA enforcement by conformance evaluation. Overall, the study provides insight on how to harness the learning opportunities

  13. Evaluation of wastewater contaminant transport in surface waters using verified Lagrangian sampling

    Science.gov (United States)

    Antweiler, Ronald C.; Writer, Jeffrey H.; Murphy, Sheila F.

    2014-01-01

    Contaminants released from wastewater treatment plants can persist in surface waters for substantial distances. Much research has gone into evaluating the fate and transport of these contaminants, but this work has often assumed constant flow from wastewater treatment plants. However, effluent discharge commonly varies widely over a 24-hour period, and this variation controls contaminant loading and can profoundly influence interpretations of environmental data. We show that methodologies relying on the normalization of downstream data to conservative elements can give spurious results, and should not be used unless it can be verified that the same parcel of water was sampled. Lagrangian sampling, which in theory samples the same water parcel as it moves downstream (the Lagrangian parcel), links hydrologic and chemical transformation processes so that the in-stream fate of wastewater contaminants can be quantitatively evaluated. However, precise Lagrangian sampling is difficult, and small deviations – such as missing the Lagrangian parcel by less than 1 h – can cause large differences in measured concentrations of all dissolved compounds at downstream sites, leading to erroneous conclusions regarding in-stream processes controlling the fate and transport of wastewater contaminants. Therefore, we have developed a method termed “verified Lagrangian” sampling, which can be used to determine if the Lagrangian parcel was actually sampled, and if it was not, a means for correcting the data to reflect the concentrations which would have been obtained had the Lagrangian parcel been sampled. To apply the method, it is necessary to have concentration data for a number of conservative constituents from the upstream, effluent, and downstream sites, along with upstream and effluent concentrations that are constant over the short-term (typically 2–4 h). These corrections can subsequently be applied to all data, including non-conservative constituents.

  14. Evaluation of wastewater contaminant transport in surface waters using verified Lagrangian sampling.

    Science.gov (United States)

    Antweiler, Ronald C; Writer, Jeffrey H; Murphy, Sheila F

    2014-02-01

    Contaminants released from wastewater treatment plants can persist in surface waters for substantial distances. Much research has gone into evaluating the fate and transport of these contaminants, but this work has often assumed constant flow from wastewater treatment plants. However, effluent discharge commonly varies widely over a 24-hour period, and this variation controls contaminant loading and can profoundly influence interpretations of environmental data. We show that methodologies relying on the normalization of downstream data to conservative elements can give spurious results, and should not be used unless it can be verified that the same parcel of water was sampled. Lagrangian sampling, which in theory samples the same water parcel as it moves downstream (the Lagrangian parcel), links hydrologic and chemical transformation processes so that the in-stream fate of wastewater contaminants can be quantitatively evaluated. However, precise Lagrangian sampling is difficult, and small deviations - such as missing the Lagrangian parcel by less than 1 h - can cause large differences in measured concentrations of all dissolved compounds at downstream sites, leading to erroneous conclusions regarding in-stream processes controlling the fate and transport of wastewater contaminants. Therefore, we have developed a method termed "verified Lagrangian" sampling, which can be used to determine if the Lagrangian parcel was actually sampled, and if it was not, a means for correcting the data to reflect the concentrations which would have been obtained had the Lagrangian parcel been sampled. To apply the method, it is necessary to have concentration data for a number of conservative constituents from the upstream, effluent, and downstream sites, along with upstream and effluent concentrations that are constant over the short-term (typically 2-4 h). These corrections can subsequently be applied to all data, including non-conservative constituents.
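
    The two-component mixing calculation at the heart of such a correction can be sketched as follows. This is a hedged reconstruction of the general idea, not the authors' published algorithm; the tracer names and concentrations are invented for illustration.

    ```python
    # Sketch: estimate the effluent fraction f of the sampled parcel from
    # conservative tracers, then check how far a measured conservative
    # concentration deviates from its expected two-component mixture.
    import statistics

    def effluent_fraction(c_up, c_eff, c_down):
        """Two-component mixing: c_down = f*c_eff + (1 - f)*c_up, solved for f."""
        return (c_down - c_up) / (c_eff - c_up)

    # Conservative tracers (e.g., chloride, boron); values are illustrative.
    upstream = {"Cl": 10.0, "B": 0.020}
    effluent = {"Cl": 90.0, "B": 0.200}
    downstream = {"Cl": 35.0, "B": 0.075}

    f = statistics.mean(
        effluent_fraction(upstream[k], effluent[k], downstream[k]) for k in upstream
    )
    expected_cl = f * effluent["Cl"] + (1 - f) * upstream["Cl"]
    correction = expected_cl / downstream["Cl"]   # ~1.0 if the parcel was sampled
    print(f"f = {f:.3f}, Cl correction factor = {correction:.3f}")
    ```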

  15. Identifying the 'right patient': nurse and consumer perspectives on verifying patient identity during medication administration.

    Science.gov (United States)

    Kelly, Teresa; Roper, Cath; Elsom, Stephen; Gaskin, Cadeyrn

    2011-10-01

    Accurate verification of patient identity during medication administration is an important component of medication administration practice. In medical and surgical inpatient settings, the use of identification aids, such as wristbands, is common. In many psychiatric inpatient units in Victoria, Australia, however, standardized identification aids are not used. The present paper outlines the findings of a qualitative research project that employed focus groups to examine mental health nurse and mental health consumer perspectives on the identification of patients during routine medication administration in psychiatric inpatient units. The study identified a range of different methods currently employed to verify patient identity, including technical methods, such as wristbands and photographs, and interpersonal methods, such as patient recognition. There were marked similarities in the perspectives of mental health nurses and mental health consumers regarding their opinions and preferences. Technical aids were seen as important, but not as a replacement for the therapeutic nurse-patient encounter. © 2011 The Authors. International Journal of Mental Health Nursing © 2011 Australian College of Mental Health Nurses Inc.

  16. POSSIBILITIES TO EVALUATE THE QUALITY OF EDUCATION BY VERIFYING THE DISTRIBUTION OF MARKS

    Directory of Open Access Journals (Sweden)

    Alexandru BOROIU

    2015-05-01

    In higher education, it is of great interest to evaluate the education process using numeric indicators obtained from the database of final results achieved by students in the examination session. The following numeric indicators can be used for this purpose: the proportion of students absent from the final evaluation, the proportion of students who did not pass, and the degree of normality of the passing-marks distribution. To this end we built an Excel calculation program that can be applied to each discipline. The inputs are concrete (total number of students, number of students present at the final evaluation, absolute frequencies of marks) and the outputs for the three indicators are binary (compliant or non-compliant), the verdict in the latter case being: “Give explanations. Propose an action plan, with actions, responsible persons and deadlines.” To verify the imposed degree of normality we developed a calculation program based on the Kolmogorov-Smirnov concordance test. This increased the objectivity of the analysis and created the opportunity to apply corrective measures to improve the education process.
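
    A minimal sketch of the normality check, using the Kolmogorov-Smirnov test from SciPy in place of the authors' Excel program. The marks are illustrative; strictly speaking, estimating the mean and standard deviation from the same sample calls for the Lilliefors variant of the test.

    ```python
    # Test whether the distribution of passing marks is plausibly normal.
    import statistics
    from scipy import stats

    marks = [5, 6, 6, 7, 7, 7, 8, 8, 9, 10, 6, 7, 8, 5, 9]   # illustrative
    mu, sigma = statistics.mean(marks), statistics.stdev(marks)

    d_stat, p_value = stats.kstest(marks, "norm", args=(mu, sigma))
    verdict = "compliant" if p_value > 0.05 else "non-compliant"
    print(f"D = {d_stat:.3f}, p = {p_value:.3f} -> {verdict}")
    ```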

  17. Business rescue decision making through verifier determinants – ask the specialists

    Directory of Open Access Journals (Sweden)

    Marius Pretorius

    2013-11-01

    Orientation: Business rescue has become a critical part of business strategy decision making, especially during economic downturns and recessions. Past legislation has generally supported creditor-friendly regimes, and its mind-set still applies, which increases the difficulty of such turnarounds. There are many questions and critical issues faced by those involved in rescue. Despite extensive theory in the literature on failure, there is a void regarding practical verifiers of the signs and causes of venture decline, as specialists are not forthcoming about what they regard as their “competitive advantage”. Research purpose: This article introduces the concept and role of “verifier determinants” of early warning signs, as a tool to confirm the causes of decline in order to direct rescue strategies and, most importantly, reduce the time between the first observation and the implementation of the rescue. Motivation for the study: Knowing how specialist practitioners confirm causes of business decline could assist in deciding on strategies for the rescue earlier than can be done using traditional due diligence, which is time consuming. Reducing time is a crucial element of a successful rescue. Research design and approach: The researchers interviewed specialist practitioners with extensive experience in rescue and turnaround. An experimental design was used to ensure the specialists evaluated the same real cases to extract their experiences and base their decisions on. Main findings: The specialists confirmed the use of verifier determinants and identified such determinants as they personally used them to confirm causes of decline. These verifier determinants were classified into five categories, namely management, finance, strategic, banking, and operations and marketing of the ventures under investigation. The verifier determinants and their use often depend heavily on subconscious (non-factual) information based on previous experiences

  18. Verifying Architectural Design Rules of the Flight Software Product Line

    Science.gov (United States)

    Ganesan, Dharmalingam; Lindvall, Mikael; Ackermann, Chris; McComas, David; Bartholomew, Maureen

    2009-01-01

    This paper presents experiences of verifying architectural design rules of the NASA Core Flight Software (CFS) product line implementation. The goal of the verification is to check whether the implementation is consistent with the CFS architectural rules derived from the developer's guide. The results indicate that consistency checking helps a) identify architecturally significant deviations that eluded code reviews, b) clarify the design rules to the team, and c) assess the overall implementation quality. Furthermore, it helps connect business goals to architectural principles and to the implementation. This paper is the first step in the definition of a method for analyzing and evaluating product line implementations from an architecture-centric perspective.
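
    One common form of architectural rule, an allowed-dependency matrix, can be checked mechanically as in the sketch below. This illustrates the general technique only; it is not the CFS tooling, and the module names are invented.

    ```python
    # Check observed module-to-module dependencies against the allowed
    # dependencies of a documented architecture (all names illustrative).
    ALLOWED = {
        "app": {"lib", "os_api"},
        "lib": {"os_api"},
        "os_api": set(),
    }

    observed = [("app", "lib"), ("app", "os_api"), ("lib", "app")]  # from code

    violations = [(s, d) for s, d in observed if d not in ALLOWED.get(s, set())]
    for src, dst in violations:
        print(f"rule violation: dependency {src} -> {dst} is not permitted")
    ```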

  19. Evaluating MC and A effectiveness to verify the presence of nuclear materials

    International Nuclear Information System (INIS)

    Dawson, P.G.; Morzinski, J.A.; Ostenak, Carl A.; Longmire, V.L.; Jewell, D.; Williams, J.D.

    2001-01-01

    Traditional materials accounting is focused exclusively on the material balance area (MBA), and involves periodically closing a material balance based on accountability measurements conducted during a physical inventory. In contrast, the physical inventory for Los Alamos National Laboratory's near-real-time accounting system is established around processes and looks more like an item inventory. That is, the intent is not to measure material for accounting purposes, since materials have already been measured in the normal course of daily operations. A given unit process operates many times over the course of a material balance period. The product of a given unit process may move for processing within another unit process in the same MBA or may be transferred out of the MBA. Since few materials are unmeasured, the physical inventory for a near-real-time process area looks more like an item inventory. Thus, the intent of the physical inventory is to locate the materials on the books and verify information about the materials contained in the books. Closing a materials balance for such an area is a matter of summing all the individual mass balances for the batches processed by all unit processes in the MBA. Additionally, performance parameters are established to measure the program's effectiveness. Program effectiveness for verifying the presence of nuclear material is required to be equal to or greater than a prescribed performance level, process measurements must be within established precision and accuracy values, physical inventory results must meet or exceed performance requirements, and inventory differences must be less than a target/goal quantity. This approach exceeds DOE-established accounting and physical inventory program requirements. Hence, LANL is committed to this approach and to seeking opportunities for further improvement through integrated technologies. This paper will provide a detailed description of this evaluation process.
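
    The balance-closing arithmetic described above reduces to summing per-batch balances and comparing the resulting inventory difference against a target quantity. The sketch below uses invented numbers and a hypothetical target; it illustrates the bookkeeping only, not LANL's accounting system.

    ```python
    # Close a material balance by summing per-batch balances for an MBA.
    batches = [
        # (input_kg, output_kg, measured_holdup_kg) for each processed batch
        (12.00, 11.95, 0.04),
        (8.50, 8.47, 0.02),
    ]

    inventory_difference = sum(i - o - h for i, o, h in batches)
    TARGET = 0.05   # kg, hypothetical target/goal quantity
    status = "meets" if abs(inventory_difference) < TARGET else "exceeds"
    print(f"ID = {inventory_difference:.3f} kg ({status} the {TARGET} kg target)")
    ```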

  20. Automated measurement and control of concrete properties in a ready mix truck with VERIFI.

    Science.gov (United States)

    2014-02-01

    In this research, twenty batches of concrete with six different mixture proportions were tested with VERIFI to evaluate 1) accuracy and repeatability of VERIFI measurements, 2) ability of VERIFI to adjust slump automatically with water and admixtur...

  1. Externally Verifiable Oblivious RAM

    Directory of Open Access Journals (Sweden)

    Gancher Joshua

    2017-04-01

    We present the idea of externally verifiable oblivious RAM (ORAM). Our goal is to allow a client and server carrying out an ORAM protocol to have disputes adjudicated by a third party, allowing for the enforcement of penalties against an unreliable or malicious server. We give a security definition that guarantees protection not only against a malicious server but also against a client making false accusations. We then give modifications of the Path ORAM [15] and Ring ORAM [9] protocols that meet this security definition. These protocols both have the same asymptotic runtimes as the semi-honest original versions and require the external verifier to be involved only when the client or server deviates from the protocol. Finally, we implement externally verified ORAM, along with an automated cryptocurrency contract to use as the external verifier.

  2. Identifying Anomalous Citations for Objective Evaluation of Scholarly Article Impact.

    Directory of Open Access Journals (Sweden)

    Xiaomei Bai

    Evaluating the impact of a scholarly article is of great significance and has attracted great attention. Although citation-based evaluation approaches have been widely used, these approaches face limitations, e.g., in identifying anomalous citation patterns. This negligence would inevitably cause unfairness and inaccuracy in article impact evaluation. In this study, in order to discover anomalous citations and ensure the fairness and accuracy of research outcome evaluation, we investigate the citation relationships between articles using the following factors: collaboration times, the time span of collaboration, citing times and the time span of citing, to weaken the relationship of Conflict of Interest (COI) in the citation network. Meanwhile, we study a special kind of COI, namely the suspected COI relationship. Based on the COI relationship, we further bring forward the COIRank algorithm, an innovative scheme for accurately assessing the impact of an article. Our method distinguishes citation strength, and utilizes the PageRank and HITS algorithms to rank scholarly articles comprehensively. The experiments are conducted on the American Physical Society (APS) dataset. We find that about 80.88% of articles contain citations contributed by co-authors in 26,366 articles, and 75.55% of these articles are cited by authors belonging to the same affiliation, indicating that COI and suspected COI should not be ignored when evaluating the impact of scientific papers objectively. Moreover, our experimental results demonstrate that the COIRank algorithm significantly outperforms state-of-the-art solutions. The validity of our approach is verified using the probability of Recommendation Intensity.

  3. Identifying Anomalous Citations for Objective Evaluation of Scholarly Article Impact.

    Science.gov (United States)

    Bai, Xiaomei; Xia, Feng; Lee, Ivan; Zhang, Jun; Ning, Zhaolong

    2016-01-01

    Evaluating the impact of a scholarly article is of great significance and has attracted great attention. Although citation-based evaluation approaches have been widely used, these approaches face limitations, e.g., in identifying anomalous citation patterns. This negligence would inevitably cause unfairness and inaccuracy in article impact evaluation. In this study, in order to discover anomalous citations and ensure the fairness and accuracy of research outcome evaluation, we investigate the citation relationships between articles using the following factors: collaboration times, the time span of collaboration, citing times and the time span of citing, to weaken the relationship of Conflict of Interest (COI) in the citation network. Meanwhile, we study a special kind of COI, namely the suspected COI relationship. Based on the COI relationship, we further bring forward the COIRank algorithm, an innovative scheme for accurately assessing the impact of an article. Our method distinguishes citation strength, and utilizes the PageRank and HITS algorithms to rank scholarly articles comprehensively. The experiments are conducted on the American Physical Society (APS) dataset. We find that about 80.88% of articles contain citations contributed by co-authors in 26,366 articles, and 75.55% of these articles are cited by authors belonging to the same affiliation, indicating that COI and suspected COI should not be ignored when evaluating the impact of scientific papers objectively. Moreover, our experimental results demonstrate that the COIRank algorithm significantly outperforms state-of-the-art solutions. The validity of our approach is verified using the probability of Recommendation Intensity.
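
    The core idea, down-weighting citation edges that carry a conflict of interest before ranking, can be sketched with networkx. This illustrates the approach rather than the authors' implementation: the 0.2 weight is arbitrary, the COI pair is assumed, and the HITS component is omitted for brevity.

    ```python
    # Rank articles with PageRank after weakening COI citation edges.
    import networkx as nx

    citations = [("A", "B"), ("C", "B"), ("D", "B"), ("B", "A")]
    coi_pairs = {("A", "B")}      # e.g., A and B share authors (assumed)

    g = nx.DiGraph()
    for citing, cited in citations:
        weight = 0.2 if (citing, cited) in coi_pairs else 1.0
        g.add_edge(citing, cited, weight=weight)

    scores = nx.pagerank(g, weight="weight")
    print(sorted(scores.items(), key=lambda kv: -kv[1]))
    ```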

  4. Auto-identification fiberoptical seal verifier

    International Nuclear Information System (INIS)

    Yamamoto, Yoichi; Mukaiyama, Takehiro

    1998-08-01

    An automatic COBRA seal verifier was developed by the Japan Atomic Energy Research Institute (JAERI) to provide more efficient and simpler inspection measures for IAEA safeguards. The verifier is designed to provide a means of simple, quantitative and objective judgment on in-situ verification of the COBRA seal. The equipment is a portable, hand-held unit that can be operated by battery or AC power. The verifier reads a COBRA seal signature using a built-in CCD camera and carries out the signature comparison procedure automatically on a digital basis. The result of the signature comparison is given as a YES/NO answer. The production model of the verifier was completed in July 1996. The development was carried out in collaboration with Mitsubishi Heavy Industries, Ltd. This report describes the design and functions of the COBRA seal verifier and the results of environmental and functional tests. The development of the COBRA seal verifier was carried out in the framework of the Japan Support Programme for Agency Safeguards (JASPAS) as project JD-4, begun in 1981. (author)

  5. Verifier Theory and Unverifiability

    OpenAIRE

    Yampolskiy, Roman V.

    2016-01-01

    Despite significant developments in Proof Theory, surprisingly little attention has been devoted to the concept of proof verifier. In particular, the mathematical community may be interested in studying different types of proof verifiers (people, programs, oracles, communities, superintelligences) as mathematical objects. Such an effort could reveal their properties, their powers and limitations (particularly in human mathematicians), minimum and maximum complexity, as well as self-verificati...

  6. Verified OS Interface Code Synthesis

    Science.gov (United States)

    2016-12-01

    results into the larger proof framework of the seL4 microkernel to be directly usable in practice. Beyond the stated project goals, the solution techniques that were used for the verified ML compiler CakeML can now also be used in the Isabelle/HOL system that was used for the verified seL4 microkernel. This combination increases proof productivity.

  7. Verifiably Truthful Mechanisms

    DEFF Research Database (Denmark)

    Branzei, Simina; Procaccia, Ariel D.

    2015-01-01

    the computational sense). Our approach involves three steps: (i) specifying the structure of mechanisms, (ii) constructing a verification algorithm, and (iii) measuring the quality of verifiably truthful mechanisms. We demonstrate this approach using a case study: approximate mechanism design without money...
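
    For step (ii), a verification algorithm can be as simple as exhaustive checking over a discretised type space. The sketch below verifies by brute force that a single-bidder posted-price mechanism is truthful on a grid; the mechanism and grid are illustrative and not taken from the paper.

    ```python
    # Brute-force truthfulness check for a posted-price mechanism.
    PRICE = 5.0

    def mechanism(bid: float):
        """Sell at the posted price whenever the bid covers it."""
        wins = bid >= PRICE
        return wins, (PRICE if wins else 0.0)

    def utility(value: float, bid: float) -> float:
        wins, payment = mechanism(bid)
        return (value - payment) if wins else 0.0

    grid = [i / 2 for i in range(21)]    # valuations and bids 0.0 .. 10.0
    truthful = all(utility(v, v) >= utility(v, b) for v in grid for b in grid)
    print("verifiably truthful on the grid:", truthful)   # True
    ```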

  8. 28 CFR 802.13 - Verifying your identity.

    Science.gov (United States)

    2010-07-01

    ... 28 Judicial Administration 2 2010-07-01 2010-07-01 false Verifying your identity. 802.13 Section... COLUMBIA DISCLOSURE OF RECORDS Privacy Act § 802.13 Verifying your identity. (a) Requests for your own records. When you make a request for access to records about yourself, you must verify your identity. You...

  9. 20 CFR 401.45 - Verifying your identity.

    Science.gov (United States)

    2010-04-01

    ... 20 Employees' Benefits 2 2010-04-01 2010-04-01 false Verifying your identity. 401.45 Section 401... INFORMATION The Privacy Act § 401.45 Verifying your identity. (a) When required. Unless you are making a... representative, you must verify your identity in accordance with paragraph (b) of this section if: (1) You make a...

  10. Verifying the integrity of hardcopy document using OCR

    CSIR Research Space (South Africa)

    Mthethwa, Sthembile

    2018-03-01

    Verifying the Integrity...) of the document to be defined. Each text in the meta-template is labelled with a unique identifier, which makes the process of validation easier. The meta-template consists of two types of text: normal text and validation text (important text that must...
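
    A minimal sketch of the integrity idea, under the assumption that the scheme amounts to hashing the labelled validation texts when the document is issued and re-hashing the OCR-extracted values for comparison; the field names and values are invented.

    ```python
    # Compare a stored digest of the validation texts against a digest of
    # the values extracted from the scanned hardcopy by OCR.
    import hashlib

    def digest(fields: dict) -> str:
        canonical = "|".join(f"{k}={fields[k]}" for k in sorted(fields))
        return hashlib.sha256(canonical.encode("utf-8")).hexdigest()

    issued = {"name": "J. Doe", "amount": "R1,500.00", "date": "2018-03-01"}
    reference = digest(issued)             # stored when the document is issued

    extracted = {"name": "J. Doe", "amount": "R1,500.00", "date": "2018-03-01"}
    print("document intact:", digest(extracted) == reference)   # True
    ```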

  11. Verifying competence of operations personnel in nuclear power plants

    International Nuclear Information System (INIS)

    Farber, G.H.

    1986-01-01

    To ensure that only competent people are authorized to fill positions in a nuclear power plant, both the initial competence of personnel and the continuous maintenance of competence have to be verified. Two main methods are normally used for verifying competence, namely evaluation of a person's performance over a period of time, and evaluation of his knowledge and skills at a particular time by means of an examination. Both methods have limitations, and in practice they are often used together to give different and to some extent complementary evaluations of a person's competence. Verification of competence is itself a problem area, because objective judging of human competence is extremely difficult. Formal verification methods, such as tests and examinations, are particularly or exclusively applied to the direct operating personnel in the control room (very rarely to management personnel). Out of the many elements contributing to a person's competence, the knowledge which is needed and the intellectual skills are the main subjects of the formal verification methods. Therefore, the presentation will concentrate on the proof of the technical qualification of operators by means of examinations. The examination process in the Federal Republic of Germany for the proof of knowledge and skills will serve as an example to describe and analyze the important aspects. From that, recommendations are derived regarding standardization of the procedure as well as validation. (orig./GL)

  12. A control system verifier using automated reasoning software

    International Nuclear Information System (INIS)

    Smith, D.E.; Seeman, S.E.

    1985-08-01

    An on-line, automated reasoning software system for verifying the actions of other software or human control systems has been developed. It was demonstrated by verifying the actions of an automated procedure generation system. The verifier uses an interactive theorem prover as its inference engine with the rules included as logical axioms. Operation of the verifier is generally transparent except when the verifier disagrees with the actions of the monitored software. Testing with an automated procedure generation system demonstrates the successful application of automated reasoning software for verification of logical actions in a diverse, redundant manner. A higher degree of confidence may be placed in the verified actions of the combined system.
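
    A toy illustration of the verifier concept: rules encoded as predicates, a proposed action checked against them, and output only on disagreement. The actual system used an interactive theorem prover with the rules as logical axioms; the rules and plant state below are invented.

    ```python
    # Verify a proposed control action against a rule base; stay silent
    # unless some rule is violated (all rules and values illustrative).
    RULES = [
        ("pump must not start on low level",
         lambda s, a: not (a == "start_pump" and s["level"] < 0.2)),
        ("valve must be open before pump start",
         lambda s, a: not (a == "start_pump" and not s["valve_open"])),
    ]

    def verify(state: dict, action: str) -> list:
        """Return descriptions of the rules the action would violate."""
        return [desc for desc, ok in RULES if not ok(state, action)]

    state = {"level": 0.1, "valve_open": False}
    for violation in verify(state, "start_pump"):
        print("verifier disagrees:", violation)
    ```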

  13. Verifying FreeRTOS; a feasibility study

    NARCIS (Netherlands)

    Pronk, C.

    2010-01-01

    This paper presents a study on modeling and verifying the kernel of Real-Time Operating Systems (RTOS). The study will show advances in formally verifying such an RTOS both by refinement and by model checking approaches. This work fits in the context of Hoare’s verification challenge.

  14. Verifying a nuclear weapon's response to radiation environments

    Energy Technology Data Exchange (ETDEWEB)

    Dean, F.F.; Barrett, W.H.

    1998-05-01

    The process described in the paper is being applied as part of the design verification of a replacement component designed for a nuclear weapon currently in the active stockpile. This process is an adaptation of the process successfully used in nuclear weapon development programs. The verification process concentrates on evaluating system response to radiation environments, verifying system performance during and after exposure to radiation environments, and assessing system survivability.

  15. Biochemically verified smoking cessation and vaping beliefs among vape store customers.

    Science.gov (United States)

    Tackett, Alayna P; Lechner, William V; Meier, Ellen; Grant, DeMond M; Driskill, Leslie M; Tahirkheli, Noor N; Wagener, Theodore L

    2015-05-01

    To evaluate biochemically verified smoking status and electronic nicotine delivery systems (ENDS) use behaviors and beliefs among a sample of customers from vapor stores (stores specializing in ENDS). A cross-sectional survey of 215 adult vapor store customers at four retail locations in the Midwestern United States; a subset of participants (n = 181) also completed exhaled carbon monoxide (CO) testing to verify smoking status. Outcomes evaluated included ENDS preferences, harm beliefs, use behaviors, smoking history and current biochemically verified smoking status. Most customers reported starting ENDS as a means of smoking cessation (86%), using newer-generation devices (89%), vaping non-tobacco/non-menthol flavors (72%) and using e-liquid with nicotine strengths of ≤20 mg/ml (72%). There was a high rate of switching (91.4%) to newer-generation ENDS among those who started with a first-generation product. Exhaled CO readings confirmed that 66% of the tested sample had quit smoking. Among those who continued to smoke, mean cigarettes per day decreased from 22.1 to 7.5. Among vapor store customers in the United States who use electronic nicotine delivery devices to stop smoking, vaping longer, using newer-generation devices and using non-tobacco and non-menthol flavored e-liquid appear to be associated with higher rates of smoking cessation. © 2015 Society for the Study of Addiction.

  16. Verified Interval Orbit Propagation in Satellite Collision Avoidance

    NARCIS (Netherlands)

    Römgens, B.A.; Mooij, E.; Naeije, M.C.

    2011-01-01

    Verified interval integration methods enclose a solution set corresponding to interval initial values and parameters, and bound integration and rounding errors. Verified methods suffer from overestimation of the solution, i.e., non-solutions are also included in the solution enclosure.
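
    The overestimation is easy to reproduce with naive interval arithmetic, as in the sketch below: a verified Euler enclosure for dx/dt = -x that provably contains the true solution set but grows much wider than it. Real verified integrators (e.g., Taylor-model methods) are designed to control exactly this effect; the step size and initial interval are illustrative.

    ```python
    # One-dimensional verified Euler steps for dx/dt = -x with interval state.
    def i_add(a, b):
        return (a[0] + b[0], a[1] + b[1])

    def i_neg(a):
        return (-a[1], -a[0])

    def i_scale(k, a):   # k >= 0
        return (k * a[0], k * a[1])

    x, h = (0.9, 1.1), 0.1          # x(0) in [0.9, 1.1], step 0.1
    for _ in range(10):             # integrate to t = 1.0
        x = i_add(x, i_scale(h, i_neg(x)))
    print(f"enclosure at t = 1: [{x[0]:.3f}, {x[1]:.3f}]")
    # The true solution set is [0.9/e, 1.1/e] ~ [0.331, 0.405]; the naive
    # enclosure contains it but is much wider -- the overestimation above.
    ```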

  17. Bottom-up communication. Identifying opportunities and limitations through an exploratory field-based evaluation

    Energy Technology Data Exchange (ETDEWEB)

    Wilson, C.; Irvine, K.N. [Institute of Energy and Sustainable Development, De Montfort University, Leicester, LE1 9BH (United Kingdom)

    2013-02-15

    Communication to promote behaviours like energy saving can use significant resources. What is less clear is the comparative value of different approaches available to communicators. While it is generally agreed that 'bottom-up' approaches, where individuals are actively involved rather than passive, are preferable to 'top-down' authority-led projects, there is a dearth of evidence that verifies why this should be. Additionally, while the literature has examined the mechanics of the different approaches, there has been less attention paid to the associated psychological implications. This paper reports on an exploratory comparative study that examined the effects of six distinct communication activities. The activities used different communication approaches, some participative and others more top-down informational. Two theories, from behavioural studies and communication, were used to identify key variables for consideration in this field-based evaluation. The evaluation aimed to assess not just which activity might be most successful, as this has limited generalisability, but to also gain insight into what psychological impacts might contribute to success. Analysis found support for the general hypothesis that bottom-up approaches have more impact on behaviour change than top-down. The study also identified that, in this instance, the difference in reported behaviour across the activities related partly to the extent to which intentions to change behaviour were implemented. One possible explanation for the difference in reported behaviour change across the activities is that a bottom-up approach may offer a supportive environment where participants can discuss progress with like-minded individuals. A further possible explanation is that despite controlling for intention at an individual level, the pre-existence of strong intentions may have an effect on group success. These suggestive findings point toward the critical need for additional and larger-scale studies.

  18. Design of a verifiable subset for HAL/S

    Science.gov (United States)

    Browne, J. C.; Good, D. I.; Tripathi, A. R.; Young, W. D.

    1979-01-01

    An attempt to evaluate the applicability of program verification techniques to the existing programming language HAL/S is discussed. HAL/S is a general purpose high level language designed to accommodate the software needs of the NASA Space Shuttle project. A diversity of features for scientific computing, concurrent and real-time programming, and error handling is discussed. The criteria by which features were evaluated for inclusion into the verifiable subset are described. Individual features of HAL/S are examined with respect to these criteria, and justification for the omission of various features from the subset is provided. Conclusions drawn from the research are presented along with recommendations made for the use of HAL/S with respect to the area of program verification.

  19. Verifying design patterns in Hoare Type Theory

    DEFF Research Database (Denmark)

    Svendsen, Kasper; Buisse, Alexandre; Birkedal, Lars

    In this technical report we document our experiments formally verifying three design patterns in Hoare Type Theory.

  20. New concepts in nuclear arms control: verified cutoff and verified disposal

    International Nuclear Information System (INIS)

    Donnelly, W.H.

    1990-01-01

    Limiting the numbers of nuclear warheads by reducing military production and stockpiles of fissionable materials has been a constant item on the nuclear arms control agenda for the last 45 years. It has become more salient recently, however, because of two events: the enforced closure for safety reasons of the current United States military plutonium production facilities; and the possibility that the US and USSR may soon conclude an agreement providing for the verified destruction of significant numbers of nuclear warheads and the recovery of the fissionable material they contain with the option of transferring these materials to peaceful uses. A study has been made of the practical problems of verifying the cut off of fissionable material production for military purposes in the nuclear weapon states, as well as providing assurance that material recovered from warheads is not re-used for proscribed military purposes and facilitating its transfer to civil uses. Implementation of such measures would have important implications for non-proliferation. The resultant paper was presented to a meeting of the PPNN Core Group held in Baden, close to Vienna, over the weekend of 18/19th November 1989 and is reprinted in this booklet. (author)

  1. Application of automated reasoning software: procedure generation system verifier

    International Nuclear Information System (INIS)

    Smith, D.E.; Seeman, S.E.

    1984-09-01

    An on-line, automated reasoning software system for verifying the actions of other software or human control systems has been developed. It was demonstrated by verifying the actions of an automated procedure generation system. The verifier uses an interactive theorem prover as its inference engine with the rules included as logic axioms. Operation of the verifier is generally transparent except when the verifier disagrees with the actions of the monitored software. Testing with an automated procedure generation system demonstrates the successful application of automated reasoning software for verification of logical actions in a diverse, redundant manner. A higher degree of confidence may be placed in the verified actions gathered by the combined system.

  2. The AutoProof Verifier: Usability by Non-Experts and on Standard Code

    Directory of Open Access Journals (Sweden)

    Carlo A. Furia

    2015-08-01

    Formal verification tools are often developed by experts for experts; as a result, their usability by programmers with little formal methods experience may be severely limited. In this paper, we discuss this general phenomenon with reference to AutoProof: a tool that can verify the full functional correctness of object-oriented software. In particular, we present our experiences of using AutoProof in two contrasting contexts representative of non-expert usage. First, we discuss its usability by students in a graduate course on software verification, who were tasked with verifying implementations of various sorting algorithms. Second, we evaluate its usability in verifying code developed for programming assignments of an undergraduate course. The first scenario represents usability by serious non-experts; the second represents usability on "standard code", developed without full functional verification in mind. We report our experiences and lessons learnt, from which we derive some general suggestions for furthering the development of verification tools with respect to improving their usability.

  3. Experience with in vivo diode dosimetry for verifying radiotherapy dose delivery: Practical implementation of cost-effective approaches

    International Nuclear Information System (INIS)

    Thwaites, D.I.; Blyth, C.; Carruthers, L.; Elliott, P.A.; Kidane, G.; Millwater, C.J.; MacLeod, A.S.; Paolucci, M.; Stacey, C.

    2002-01-01

    A systematic programme of in vivo dosimetry using diodes to verify radiotherapy delivered doses began in Edinburgh in 1992. The aims were to investigate the feasibility of routine systematic use of diodes as part of a comprehensive QA programme, to carry out clinical pilot studies to assess the accuracy of dose delivery on each machine and for each site and technique, to identify and rectify systematic deviations, to assess departmental dosimetric precision and to compare it to clinical requirements. A further aim was to carry out a cost-benefit evaluation based on the results from the pilot studies to consider how best to use diodes routinely.

  4. An alternative test for verifying electronic balance linearity

    International Nuclear Information System (INIS)

    Thomas, I.R.

    1998-02-01

    This paper presents an alternative method for verifying electronic balance linearity and accuracy. This method is being developed for safeguards weighings (weighings for the control and accountability of nuclear material) at the Idaho National Engineering and Environmental Laboratory (INEEL). With regard to balance linearity and accuracy, DOE Order 5633.3B, Control and Accountability of Nuclear Materials, Paragraph 2, 4, e, (1), (a) Scales and Balances Program, states: "All scales and balances used for accountability purposes shall be maintained in good working condition, recalibrated according to an established schedule, and checked for accuracy and linearity on each day that the scale or balance is used for accountability purposes." Various tests have been proposed for testing accuracy and linearity. In the 1991 Measurement Science Conference, Dr. Walter E. Kupper presented a paper entitled "Validation of High Accuracy Weighing Equipment." Dr. Kupper emphasized that tolerance checks for calibrated, state-of-the-art electronic equipment need not be complicated, and he presented four easy steps for verifying that a calibrated balance is operating correctly. These tests evaluate the standard deviation of successive weighings (of the same load), the off-center error, the calibration error, and the error due to nonlinearity. This method of balance validation is undoubtedly an authoritative means of ensuring balance operability, yet it could have two drawbacks: one, the test for linearity is not intuitively obvious, especially from a statistical viewpoint; and two, there is an absence of definitively defined testing limits. Hence, this paper describes an alternative means of verifying electronic balance linearity and accuracy that is being developed for safeguards measurements at the INEEL.
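
    One statistically explicit way to phrase such a linearity check, offered as a hedged sketch rather than the procedure the paper develops, is to regress balance readings on certified masses and bound the residuals; the readings and tolerance below are invented.

    ```python
    # Fit readings vs. certified masses and take the largest residual as
    # the nonlinearity estimate (all values illustrative).
    import numpy as np

    masses = np.array([0.0, 50.0, 100.0, 150.0, 200.0])        # g, standards
    readings = np.array([0.0002, 50.0004, 99.9998, 150.0007, 200.0003])

    slope, intercept = np.polyfit(masses, readings, 1)
    residuals = readings - (slope * masses + intercept)
    nonlinearity = float(np.max(np.abs(residuals)))

    TOLERANCE = 0.001   # g, hypothetical limit for this balance class
    verdict = "PASS" if nonlinearity <= TOLERANCE else "FAIL"
    print(f"max nonlinearity = {nonlinearity * 1000:.2f} mg ({verdict})")
    ```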

  5. Privacy-Preserving Verifiability: A Case for an Electronic Exam Protocol

    DEFF Research Database (Denmark)

    Giustolisi, Rosario; Iovino, Vincenzo; Lenzini, Gabriele

    2017-01-01

    We introduce the notion of privacy-preserving verifiability for security protocols. It holds when a protocol admits a verifiability test that does not reveal, to the verifier that runs it, more pieces of information about the protocol’s execution than those required to run the test. Our definition of privacy-preserving verifiability is general and applies to cryptographic protocols as well as to human security protocols. In this paper we exemplify it in the domain of e-exams. We prove that the notion is meaningful by studying an existing exam protocol that is verifiable but whose verifiability tests are not privacy-preserving. We prove that the notion is applicable: we review the protocol using functional encryption so that it admits a verifiability test that preserves privacy according to our definition. We analyse, in ProVerif, that the verifiability holds despite malicious parties.

  6. USCIS E-Verify Self-Check

    Data.gov (United States)

    Department of Homeland Security — E-Verify is an internet based system that contains datasets to compare information from an employee's Form I-9, Employment Eligibility Verification, to data from the...

  7. A Finite Equivalence of Verifiable Multi-secret Sharing

    Directory of Open Access Journals (Sweden)

    Hui Zhao

    2012-02-01

    We give an abstraction of verifiable multi-secret sharing schemes that is accessible to a fully mechanized analysis. This abstraction is formalized within the applied pi-calculus by using an equational theory which characterizes the cryptographic semantics of secret sharing. We also present an encoding from the equational theory into a convergent rewriting system, which is suitable for the automated protocol verifier ProVerif. Based on that, we verify the threshold certificate protocol in ProVerif.
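
    For concreteness, the sketch below shows the standard Feldman construction that makes Shamir-style shares publicly verifiable, the kind of scheme such an equational theory abstracts. The parameters are toy-sized and insecure, and this is not the paper's pi-calculus model.

    ```python
    # Feldman verifiable secret sharing, toy parameters: q divides p - 1
    # and g has multiplicative order q modulo p.
    p, q, g = 23, 11, 2

    coeffs = [7, 3, 9]            # f(x) = 7 + 3x + 9x^2 (mod q); secret = 7
    commits = [pow(g, a, p) for a in coeffs]      # dealer broadcasts g^a_j

    def share(i):                 # dealer privately sends (i, f(i)) to party i
        return sum(a * pow(i, j, q) for j, a in enumerate(coeffs)) % q

    def verify(i, s):             # anyone checks: g^s == prod_j commits[j]^(i^j)
        rhs = 1
        for j, c in enumerate(commits):
            rhs = rhs * pow(c, pow(i, j), p) % p
        return pow(g, s, p) == rhs

    print(verify(4, share(4)))            # True  -- honest share
    print(verify(4, (share(4) + 1) % q))  # False -- corrupted share
    ```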

  8. USCIS E-Verify Customer Satisfaction Survey, January 2013

    Data.gov (United States)

    Department of Homeland Security — This report focuses on the customer satisfaction of companies currently enrolled in the E-Verify program. Satisfaction with E-Verify remains high and follows up a...

  9. Association between cotinine-verified smoking status and hypertension in 167,868 Korean adults.

    Science.gov (United States)

    Kim, Byung Jin; Han, Ji Min; Kang, Jung Gyu; Kim, Bum Soo; Kang, Jin Ho

    2017-10-01

    Previous studies showed inconsistent results concerning the relationship between chronic smoking and blood pressure. Most of the studies involved self-reported smoking status. This study was performed to evaluate the association of urinary cotinine or self-reported smoking status with hypertension and blood pressure in Korean adults. Among individuals enrolled in the Kangbuk Samsung Health Study and Kangbuk Samsung Cohort Study, 167,868 participants (men, 55.7%; age, 37.5 ± 6.9 years) between 2011 and 2013 who had urinary cotinine measurements were included. Individuals with urinary cotinine levels ≥50 ng/mL were defined as cotinine-verified current smokers. The prevalence of hypertension and cotinine-verified current smokers in the overall population was 6.8% and 22.7%, respectively (10.0% in men and 2.8% in women for hypertension; 37.7% in men and 3.9% in women for cotinine-verified current smokers). In a multivariate regression analysis adjusted for age, sex, body mass index, waist circumference, alcohol drinking, vigorous exercise, and diabetes, cotinine-verified current smoking was associated with a lower prevalence of hypertension compared with cotinine-verified never smoking (OR [95% CI], 0.79 [0.75, 0.84]). Log-transformed cotinine levels and unobserved smoking were negatively associated with hypertension, respectively (0.96 [0.96, 0.97] and 0.55 [0.39, 0.79]). In a multivariate linear regression analysis, cotinine-verified current smoking was inversely associated with systolic and diastolic blood pressure (BP) (regression coefficient [95% CI], -1.23 [-1.39, -1.07] for systolic BP and -0.71 [-0.84, -0.58] for diastolic BP). In subgroup analyses according to sex, the inverse associations between cotinine-verified current smoking and hypertension were observed only in men. This large observational study showed that cotinine-verified current smoking and unobserved smoking were inversely associated with hypertension in Korean adults, especially in men.
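
    Adjusted odds ratios of the kind reported here typically come from multivariate logistic regression. The sketch below shows that computation on synthetic data; the variable names, effect sizes, and use of statsmodels are assumptions, not the authors' analysis code.

```python
import numpy as np
import pandas as pd
import statsmodels.api as sm

rng = np.random.default_rng(0)
n = 10_000
# Synthetic stand-ins for the study variables (names and effects assumed)
df = pd.DataFrame({
    "smoker": rng.binomial(1, 0.227, n),
    "age": rng.normal(37.5, 6.9, n),
    "male": rng.binomial(1, 0.557, n),
    "bmi": rng.normal(24.0, 3.0, n),
})
true_logit = -2.6 - 0.23 * df["smoker"] + 0.03 * (df["age"] - 37.5)
df["hypertension"] = rng.binomial(1, 1.0 / (1.0 + np.exp(-true_logit)))

# Logistic regression adjusted for the covariates; exp(coef) is the OR
X = sm.add_constant(df[["smoker", "age", "male", "bmi"]])
fit = sm.Logit(df["hypertension"], X).fit(disp=0)
or_ci = np.exp(fit.conf_int().loc["smoker"])
print(f"adjusted OR = {np.exp(fit.params['smoker']):.2f}, "
      f"95% CI = [{or_ci[0]:.2f}, {or_ci[1]:.2f}]")
```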

  10. Verifying Digital Components of Physical Systems: Experimental Evaluation of Test Quality

    Science.gov (United States)

    Laputenko, A. V.; López, J. E.; Yevtushenko, N. V.

    2018-03-01

    This paper continues the study of high quality test derivation for verifying digital components which are used in various physical systems, such as sensors, data transfer components, etc. We have used logic circuits b01-b10 of the package of ITC'99 benchmarks (Second Release) for experimental evaluation, which, as stated before, describe digital components of physical systems designed for various applications. Test sequences are derived for detecting the most known faults of the reference logic circuit using three different approaches to test derivation. Three widely used fault types, namely stuck-at faults, bridges, and faults which slightly modify the behavior of one gate, are considered as possible faults of the reference behavior. The most interesting test sequences are short test sequences that can provide appropriate guarantees after testing, and thus we experimentally study various approaches to the derivation of so-called complete test suites which detect all fault types. In the first series of experiments, we compare two approaches for deriving complete test suites. In the first approach, a shortest test sequence is derived for testing each fault. In the second approach, a test sequence is pseudo-randomly generated by the use of appropriate software for logic synthesis and verification (the ABC system in our study) and thus can be longer. However, after deleting sequences detecting the same set of faults, a test suite returned by the second approach is shorter. The latter underlines the fact that in many cases it is useless to spend 'time and effort' deriving a shortest distinguishing sequence; it is better to use test minimization afterwards. The performed experiments also show that the use of only randomly generated test sequences is not very efficient, since such sequences do not detect all the faults of any type. After reaching a fault coverage of around 70%, saturation is observed, and the fault coverage cannot be increased anymore. ...
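
    The minimization step described above (deleting sequences whose detected fault sets add nothing new) can be approximated with a greedy set-cover pass. A minimal sketch with made-up test and fault names:

```python
def minimize_test_suite(tests, all_faults):
    """Greedy set-cover minimization: repeatedly keep the sequence that
    detects the most still-uncovered faults; drop sequences that add nothing.
    `tests` is a list of (name, set-of-detected-faults) pairs."""
    uncovered = set(all_faults)
    kept = []
    while uncovered:
        best_name, best_faults = max(tests, key=lambda t: len(t[1] & uncovered))
        gained = best_faults & uncovered
        if not gained:              # remaining faults are undetectable
            break
        kept.append(best_name)
        uncovered -= gained
    return kept, uncovered

tests = [("t1", {"saf1", "bridge3"}), ("t2", {"saf1"}), ("t3", {"gate7"})]
suite, undetected = minimize_test_suite(tests, {"saf1", "bridge3", "gate7"})
print(suite, undetected)            # ['t1', 't3'] set()
```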

  11. A Novel Simple Phantom for Verifying the Dose of Radiation Therapy

    Directory of Open Access Journals (Sweden)

    J. H. Lee

    2015-01-01

    Full Text Available A standard protocol of dosimetric measurements is used by the organizations responsible for verifying that the doses delivered in radiation-therapy institutions are within authorized limits. This study evaluated a self-designed simple auditing phantom for use in verifying the dose of radiation therapy; the phantom design, dose audit system, and clinical tests are described. Thermoluminescent dosimeters (TLDs) were used as postal dosimeters, and mailable phantoms were produced for use in postal audits. Correction factors are important for converting TLD readout values from phantoms into the absorbed dose in water. The phantom scatter correction factor was used to quantify the difference in the scattered dose between a solid water phantom and homemade phantoms; its value ranged from 1.084 to 1.031. The energy-dependence correction factor was used to compare the TLD readout of the unit dose irradiated by audit beam energies with 60Co in the solid water phantom; its value was 0.99 to 1.01. The setup-condition factor was used to correct for differences in dose-output calibration conditions. Clinical tests of the device calibrating the dose output revealed that the dose deviation was within 3%. Therefore, our homemade phantoms and dosimetric system can be applied for accurately verifying the doses applied in radiation-therapy institutions.
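
    The conversion from TLD readout to absorbed dose in water is simply the product of a calibration coefficient and the correction factors described above. A minimal sketch (the function name, calibration coefficient, and example values are assumed for illustration):

```python
def absorbed_dose_gy(tld_readout, cal_gy_per_count,
                     phantom_scatter, energy_dependence, setup_condition):
    """Convert a TLD readout into absorbed dose in water by applying the
    three correction factors described in the abstract (values illustrative)."""
    return (tld_readout * cal_gy_per_count
            * phantom_scatter * energy_dependence * setup_condition)

dose = absorbed_dose_gy(1.95e5, 1.0e-5, 1.031, 1.00, 1.00)
reference_gy = 2.00
deviation = (dose - reference_gy) / reference_gy
print(f"dose = {dose:.3f} Gy, deviation = {deviation:+.1%}")  # within +/-3%?
```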

  12. 76 FR 14678 - Communications Unit Leader Prerequisite and Evaluation

    Science.gov (United States)

    2011-03-17

    ... evaluation form. OEC will use the evaluation form to identify course attendees, verify satisfaction of course... and evaluation of OEC events. Evaluation forms will be available in hard copy at each training session... Prerequisite and Evaluation. OMB Number: 1670--NEW. COML Prerequisites Verification Frequency: On occasion...

  13. Personal identifiers in medical research networks: evaluation of the personal identifier generator in the Competence Network Paediatric Oncology and Haematology

    Directory of Open Access Journals (Sweden)

    Pommerening, Klaus

    2006-06-01

    Full Text Available The Society for Paediatric Oncology and Haematology (GPOH) and the corresponding Competence Network Paediatric Oncology and Haematology conduct various clinical trials. The comprehensive analysis requires reliable identification of the recruited patients. Therefore, a personal identifier (PID) generator is used to assign unambiguous, pseudonymous, non-reversible PIDs to participants in those trials. We tested the matching algorithm of the PID generator using a configuration specific to the GPOH. False data was used to verify the correct processing of PID requests (functionality tests), while test data was used to evaluate the matching outcome. We also assigned PIDs to more than 44,000 data records from the German Childhood Cancer Registry (GCCR) and assessed the status of the associated patient list which contains the PIDs, partly encrypted data items and information on the PID generation process for each data record. All the functionality tests showed the expected results. Neither 14,915 test data records nor the GCCR data records yielded any homonyms. Six synonyms were found in the test data, due to erroneous birth dates, and 22 synonyms were found when the GCCR data was run against the actual patient list of 2579 records. In the resulting patient list of 45,693 entries, duplicate record submissions were found for about 7% of all listed patients, while more frequent submissions occurred in less than 1% of cases. The synonym error rate depends mainly on the quality of the input data and on the frequency of multiple submissions. Depending on the requirements on maximally tolerable synonym and homonym error rates, additional measures for securing input data quality might be necessary. The results demonstrate that the PID generator is an appropriate tool for reliably identifying trial participants in medical research networks.
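
    The core behaviour of such a PID generator (an unambiguous, pseudonymous, non-reversible identifier derived from identity attributes) can be sketched as a keyed hash over normalized inputs. This is a simplified stand-in rather than the GPOH implementation; the key handling and normalization rules are assumptions:

```python
import hashlib
import hmac
import unicodedata

SECRET_KEY = b"registry-secret"   # hypothetical key held by a trusted office

def normalize(value: str) -> str:
    """Reduce trivial spelling variation before PID generation."""
    folded = unicodedata.normalize("NFKD", value)
    return folded.encode("ascii", "ignore").decode().upper().strip()

def generate_pid(first: str, last: str, birth_date: str) -> str:
    """Pseudonymous, non-reversible identifier: a keyed hash of normalized
    identity attributes. Identical records map to the same PID; avoiding
    synonyms depends on input quality, as the evaluation shows."""
    msg = "|".join(normalize(x) for x in (first, last, birth_date)).encode()
    return hmac.new(SECRET_KEY, msg, hashlib.sha256).hexdigest()[:16]

print(generate_pid("Anna", "Müller", "2001-04-17"))
# Same person, different transliteration: yields a different PID, i.e. a
# synonym error of the kind counted in the evaluation above.
print(generate_pid("ANNA", "Mueller", "2001-04-17"))
```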

  14. Incentivizing Verifiable Privacy-Protection Mechanisms for Offline Crowdsensing Applications.

    Science.gov (United States)

    Sun, Jiajun; Liu, Ningzhong

    2017-09-04

    Incentive mechanisms of crowdsensing have recently been intensively explored. Most of these mechanisms mainly focus on standard economic goals like truthfulness and utility maximization. However, enormous privacy and security challenges need to be faced directly in real-life environments, such as the privacy of users' costs. In this paper, we investigate offline verifiable privacy-protection crowdsensing issues. We first present a general verifiable privacy-protection incentive mechanism for the offline homogeneous and heterogeneous sensing job model. In addition, we propose a more complex verifiable privacy-protection incentive mechanism for the offline submodular sensing job model. The two mechanisms not only explore the privacy-protection issues of users and platform, but also ensure the verifiable correctness of payments between platform and users. Finally, we demonstrate that the two mechanisms satisfy privacy protection, verifiable correctness of payments, and the same revenue as the generic mechanism without privacy protection. Our experiments also validate that the two mechanisms are both scalable and efficient, and applicable for mobile devices in crowdsensing applications based on auctions, where the main incentive for the user is the remuneration.

  15. Review of Ground Systems Development and Operations (GSDO) Tools for Verifying Command and Control Software

    Science.gov (United States)

    Aguilar, Michael L.; Bonanne, Kevin H.; Favretto, Jeffrey A.; Jackson, Maddalena M.; Jones, Stephanie L.; Mackey, Ryan M.; Sarrel, Marc A.; Simpson, Kimberly A.

    2014-01-01

    The Exploration Systems Development (ESD) Standing Review Board (SRB) requested the NASA Engineering and Safety Center (NESC) conduct an independent review of the plan developed by Ground Systems Development and Operations (GSDO) for identifying models and emulators to create a tool(s) to verify their command and control software. The NESC was requested to identify any issues or weaknesses in the GSDO plan. This document contains the outcome of the NESC review.

  16. 31 CFR 363.14 - How will you verify my identity?

    Science.gov (United States)

    2010-07-01

    ... 31 Money and Finance: Treasury 2 2010-07-01 2010-07-01 false How will you verify my identity? 363... you verify my identity? (a) Individual. When you establish an account, we may use a verification service to verify your identity using information you provide about yourself on the online application. At...

  17. Some Proxy Signature and Designated verifier Signature Schemes over Braid Groups

    OpenAIRE

    Lal, Sunder; Verma, Vandani

    2009-01-01

    Braid groups provide an alternative to number-theoretic public-key cryptography and can be implemented quite efficiently. The paper proposes five signature schemes based on braid groups: proxy signature, designated verifier, bi-designated verifier, designated verifier proxy signature, and bi-designated verifier proxy signature. We also discuss the security aspects of each of the proposed schemes.

  18. Unconditionally verifiable blind quantum computation

    Science.gov (United States)

    Fitzsimons, Joseph F.; Kashefi, Elham

    2017-07-01

    Blind quantum computing (BQC) allows a client to have a server carry out a quantum computation for them such that the client's input, output, and computation remain private. A desirable property for any BQC protocol is verification, whereby the client can verify with high probability whether the server has followed the instructions of the protocol or if there has been some deviation resulting in a corrupted output state. A verifiable BQC protocol can be viewed as an interactive proof system leading to consequences for complexity theory. We previously proposed [A. Broadbent, J. Fitzsimons, and E. Kashefi, in Proceedings of the 50th Annual Symposium on Foundations of Computer Science, Atlanta, 2009 (IEEE, Piscataway, 2009), p. 517] a universal and unconditionally secure BQC scheme where the client only needs to be able to prepare single qubits in separable states randomly chosen from a finite set and send them to the server, who has the balance of the required quantum computational resources. In this paper we extend that protocol with additional functionality allowing blind computational basis measurements, which we use to construct another verifiable BQC protocol based on a different class of resource states. We rigorously prove that the probability of failing to detect an incorrect output is exponentially small in a security parameter, while resource overhead remains polynomial in this parameter. This resource state allows entangling gates to be performed between arbitrary pairs of logical qubits with only constant overhead. This is a significant improvement on the original scheme, which required that all computations to be performed must first be put into a nearest-neighbor form, incurring linear overhead in the number of qubits. Such an improvement has important consequences for efficiency and fault-tolerance thresholds.

  19. Construct a procedure to verify radiation protection for apparatus of industrial gamma radiography

    International Nuclear Information System (INIS)

    Nghiem Xuan Long; Trinh Dinh Truong; Dinh Chi Hung; Le Ngoc Hieu

    2013-01-01

    Apparatus for industrial gamma radiography include an exposure container, source guide tube, remote control hand crank assembly, and other attached equipment. They are widely used in the inspection and evaluation of projects. In Vietnam, there are now more than 50 companies in the radiography field, and more than 100 apparatus are being used on site. Therefore, verification and evaluation are very necessary and important. This project constructs a procedure for verifying the radiation protection of industrial gamma radiography apparatus for application in Vietnam. (author)

  20. An IBM 370 assembly language program verifier

    Science.gov (United States)

    Maurer, W. D.

    1977-01-01

    The paper describes a program written in SNOBOL which verifies the correctness of programs written in assembly language for the IBM 360 and 370 series of computers. The motivation for using assembly language as a source language for a program verifier was the realization that many errors in programs are caused by misunderstanding or ignorance of the characteristics of specific computers. The proof of correctness of a program written in assembly language must take these characteristics into account. The program has been compiled and is currently running at the Center for Academic and Administrative Computing of The George Washington University.

  1. Verifying different-modality properties for concepts produces switching costs.

    Science.gov (United States)

    Pecher, Diane; Zeelenberg, René; Barsalou, Lawrence W

    2003-03-01

    According to perceptual symbol systems, sensorimotor simulations underlie the representation of concepts. It follows that sensorimotor phenomena should arise in conceptual processing. Previous studies have shown that switching from one modality to another during perceptual processing incurs a processing cost. If perceptual simulation underlies conceptual processing, then verifying the properties of concepts should exhibit a switching cost as well. For example, verifying a property in the auditory modality (e.g., BLENDER-loud) should be slower after verifying a property in a different modality (e.g., CRANBERRIES-tart) than after verifying a property in the same modality (e.g., LEAVES-rustling). Only words were presented to subjects, and there were no instructions to use imagery. Nevertheless, switching modalities incurred a cost, analogous to the cost of switching modalities in perception. A second experiment showed that this effect was not due to associative priming between properties in the same modality. These results support the hypothesis that perceptual simulation underlies conceptual processing.

  2. Identifying and Evaluating External Validity Evidence for Passing Scores

    Science.gov (United States)

    Davis-Becker, Susan L.; Buckendahl, Chad W.

    2013-01-01

    A critical component of the standard setting process is collecting evidence to evaluate the recommended cut scores and their use for making decisions and classifying students based on test performance. Kane (1994, 2001) proposed a framework by which practitioners can identify and evaluate evidence of the results of the standard setting from (1)…

  3. Verifying pronunciation dictionaries using conflict analysis

    CSIR Research Space (South Africa)

    Davel, MH

    2010-09-01

    Full Text Available The authors describe a new language-independent technique for automatically identifying errors in an electronic pronunciation dictionary by analyzing the source of conflicting patterns directly. They evaluate the effectiveness of the technique in two...

  4. A two-dimensional deformable phantom for quantitatively verifying deformation algorithms

    Energy Technology Data Exchange (ETDEWEB)

    Kirby, Neil; Chuang, Cynthia; Pouliot, Jean [Department of Radiation Oncology, University of California San Francisco, San Francisco, California 94143-1708 (United States)

    2011-08-15

    Purpose: The incorporation of deformable image registration into the treatment planning process is rapidly advancing. For this reason, the methods used to verify the underlying deformation algorithms must evolve equally fast. This manuscript proposes a two-dimensional deformable phantom, which can objectively verify the accuracy of deformation algorithms, as the next step for improving these techniques. Methods: The phantom represents a single plane of the anatomy for a head and neck patient. Inflation of a balloon catheter inside the phantom simulates tumor growth. CT and camera images of the phantom are acquired before and after its deformation. Nonradiopaque markers reside on the surface of the deformable anatomy and are visible through an acrylic plate, which enables an optical camera to measure their positions; thus, establishing the ground-truth deformation. This measured deformation is directly compared to the predictions of deformation algorithms, using several similarity metrics. The ratio of the number of points with more than a 3 mm deformation error over the number that are deformed by more than 3 mm is used for an error metric to evaluate algorithm accuracy. Results: An optical method of characterizing deformation has been successfully demonstrated. For the tests of this method, the balloon catheter deforms 32 out of the 54 surface markers by more than 3 mm. Different deformation errors result from the different similarity metrics. The most accurate deformation predictions had an error of 75%. Conclusions: The results presented here demonstrate the utility of the phantom for objectively verifying deformation algorithms and determining which is the most accurate. They also indicate that the phantom would benefit from more electron density heterogeneity. The reduction of the deformable anatomy to a two-dimensional system allows for the use of nonradiopaque markers, which do not influence deformation algorithms. This is the fundamental advantage of this
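
    The error metric defined in this abstract is easy to compute from marker displacements. A minimal numpy sketch (the example displacements are invented, and the numerator convention, errors above 3 mm counted over all markers, follows one reading of the text):

```python
import numpy as np

def deformation_error_ratio(measured, predicted, threshold_mm=3.0):
    """Ratio of markers whose prediction error exceeds 3 mm to markers
    that truly deform by more than 3 mm (the abstract's error metric)."""
    measured = np.asarray(measured, dtype=float)    # ground-truth displacements
    predicted = np.asarray(predicted, dtype=float)  # algorithm's predictions
    error = np.linalg.norm(predicted - measured, axis=1)
    moved = np.linalg.norm(measured, axis=1) > threshold_mm  # e.g. 32 of 54
    return np.sum(error > threshold_mm) / np.sum(moved)

# Two markers with 2-D ground-truth displacements (mm) and predictions
print(deformation_error_ratio([[5.0, 0.0], [4.0, 3.0]],
                              [[5.5, 0.2], [0.0, 0.0]]))   # 1/2 = 0.5
```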

  5. Developing an Approach for Analyzing and Verifying System Communication

    Science.gov (United States)

    Stratton, William C.; Lindvall, Mikael; Ackermann, Chris; Sibol, Deane E.; Godfrey, Sally

    2009-01-01

    This slide presentation reviews a project for developing an approach for analyzing and verifying inter-system communications. The motivation for the study was that software systems in the aerospace domain are inherently complex and operate under tight constraints for resources, so that systems of systems must communicate with each other to fulfill their tasks. These systems of systems require reliable communications. The technical approach was to develop a system, DynSAVE, that detects communication problems among the systems. The project enhanced the proven Software Architecture Visualization and Evaluation (SAVE) tool to create Dynamic SAVE (DynSAVE). The approach monitors and records low-level network traffic, converts the low-level traffic into meaningful messages, and displays the messages in a way that allows issues to be detected.

  6. Comparison of VerifyNow-P2Y12 test and Flow Cytometry for monitoring individual platelet response to clopidogrel. What is the cut-off value for identifying patients who are low responders to clopidogrel therapy?

    Directory of Open Access Journals (Sweden)

    Castelli Alfredo

    2009-05-01

    Full Text Available Abstract Background: Dual anti-platelet therapy with aspirin and a thienopyridine (DAT) is used to prevent stent thrombosis after percutaneous coronary intervention (PCI). Low response to clopidogrel therapy (LR) occurs, but laboratory tests have a controversial role in the identification of this condition. Methods: We studied LR in patients with stable angina undergoing elective PCI, all on DAT for at least 7 days, by comparing: (1) flow cytometry (FC) to measure platelet membrane expression of P-selectin (CD62P) and PAC-1 binding following double stimulation with ADP and collagen type I in the presence of prostaglandin (PG) E1; (2) the VerifyNow-P2Y12 test, in which results are reported as absolute P2Y12 Reaction Units (PRU) or % of inhibition (% inhibition). Results: Thirty controls and 52 patients were analyzed. The median percentage of platelets exhibiting CD62P expression and PAC-1 binding by FC evaluation after stimulation in the presence of PG E1 was 25.4% (IQR: 21.4–33.1%) and 3.5% (1.7–9.4%), respectively. Only 6 patients receiving DAT (11.5%) had both values above the 1st quartile of controls, and were defined as LR. Evaluation of the same patients with the VerifyNow-P2Y12 test revealed that the area under the receiver-operating-characteristic (ROC) curve was 0.94 (95% CI: 0.84–0.98); a cut-off value of ≤ 15% inhibition or > 213 PRU gave the maximum accuracy for the detection of patients defined as having LR by FC. Conclusion: Our findings show that a cut-off value of ≤ 15% inhibition or > 213 PRU in the VerifyNow-P2Y12 test may provide the best accuracy for the identification of patients with LR.
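
    Deriving such a cut-off usually means picking the threshold that maximizes accuracy or Youden's J along the ROC curve. A rough sketch on synthetic PRU values (the group sizes mirror the abstract, but the distributions and the choice of Youden's J are assumptions):

```python
import numpy as np
from sklearn.metrics import roc_auc_score, roc_curve

rng = np.random.default_rng(1)
# Synthetic PRU values: higher in FC-defined low responders (distributions assumed)
pru_lr = rng.normal(260, 40, 6)     # the 6 low responders
pru_ok = rng.normal(160, 45, 46)    # the 46 adequate responders
y = np.r_[np.ones(6), np.zeros(46)]
scores = np.r_[pru_lr, pru_ok]

fpr, tpr, thresholds = roc_curve(y, scores)
print("AUC =", round(roc_auc_score(y, scores), 2))
best = thresholds[np.argmax(tpr - fpr)]   # Youden's J: a common cut-off choice
print("PRU cut-off with maximum discrimination =", round(float(best)))
```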

  7. Verifiable Distribution of Material Goods Based on Cryptology

    Directory of Open Access Journals (Sweden)

    Radomír Palovský

    2015-12-01

    Full Text Available Counterfeiting of material goods is a general problem. In this paper an architecture for verifiable distribution of material goods is presented. The distribution is based on printing on the goods a QR code that contains the digitally signed serial number of the product, so that the validity of the digital signature can be verified by a customer. An extension that adds digital signatures to the revenue stamps used for state-controlled goods is also presented. A discussion of the possibilities for making copies leads to the conclusion that cryptographic security needs to be complemented by technical difficulties of copying.
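
    The sign-then-verify flow described here can be sketched with an off-the-shelf signature scheme. The example below uses Ed25519 from the Python cryptography package as a stand-in (the paper does not specify the algorithm, and the payload format and key handling are assumptions); encoding the payload into an actual QR image is left to any barcode library:

```python
import base64

from cryptography.exceptions import InvalidSignature
from cryptography.hazmat.primitives.asymmetric.ed25519 import Ed25519PrivateKey

# Issuer: sign the serial number; the QR code would carry serial + signature
issuer_key = Ed25519PrivateKey.generate()
serial = b"PRODUCT-2015-0001234"
sig_b64 = base64.b64encode(issuer_key.sign(serial))
qr_payload = serial + b"|" + sig_b64     # the string a QR library would encode

# Customer: scan the code and verify against the issuer's public key
public_key = issuer_key.public_key()
scanned_serial, scanned_sig = qr_payload.split(b"|")
try:
    public_key.verify(base64.b64decode(scanned_sig), scanned_serial)
    print("genuine serial:", scanned_serial.decode())
except InvalidSignature:
    print("counterfeit or tampered code")
```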

  8. Verified compilation of Concurrent Managed Languages

    Science.gov (United States)

    2017-11-01

    VERIFIED COMPILATION OF CONCURRENT MANAGED LANGUAGES. Purdue University, November 2017. Final technical report, approved for public release; published by the Information Directorate in the interest of scientific and technical information exchange.

  9. Guidance for Identifying, Selecting and Evaluating Open Literature Studies

    Science.gov (United States)

    This guidance for Office of Pesticide Programs staff will assist in their evaluation of open literature studies of pesticides. It also describes how we identify and select studies, and how we ensure that the data we use in risk assessments are of sufficient scientific quality.

  10. Classroom Experiment to Verify the Lorentz Force

    Indian Academy of Sciences (India)

    Somnath Basu, Anindita Bose, Sumit Kumar Sinha, Pankaj Vishe, and S Chatterjee. Resonance – Journal of Science Education, Volume 8, Issue 3, March 2003, pp. 81-86.

  11. On alternative approach for verifiable secret sharing

    OpenAIRE

    Kulesza, Kamil; Kotulski, Zbigniew; Pieprzyk, Joseph

    2002-01-01

    Secret sharing allows split/distributed control over a secret (e.g., a master key). Verifiable secret sharing (VSS) is secret sharing extended by a verification capacity. Usually verification comes at a price. We propose a ''free lunch'', an approach that allows this inconvenience to be overcome.
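
    For context, the classical way to make Shamir-style sharing verifiable is Feldman's scheme, in which public commitments to the polynomial coefficients let each holder check their share. A toy sketch with deliberately tiny, insecure parameters (this is background, not the authors' ''free lunch'' construction):

```python
import random

# Tiny demo group: p = 2q + 1 with q prime; g generates the order-q subgroup
q = 1019
p = 2 * q + 1          # 2039, prime
g = 4                  # quadratic residue, hence order q
assert pow(g, q, p) == 1

def share(secret, k, n):
    """Feldman VSS: Shamir shares plus public commitments g^{a_i} that let
    each holder verify their share without learning the secret."""
    coeffs = [secret % q] + [random.randrange(q) for _ in range(k - 1)]
    commitments = [pow(g, a, p) for a in coeffs]
    shares = [(x, sum(a * x**i for i, a in enumerate(coeffs)) % q)
              for x in range(1, n + 1)]
    return shares, commitments

def verify(x, s, commitments):
    """Check g^s == prod_i C_i^{x^i} (mod p)."""
    lhs = pow(g, s, p)
    rhs = 1
    for i, c in enumerate(commitments):
        rhs = rhs * pow(c, x**i, p) % p
    return lhs == rhs

shares, comms = share(secret=123, k=3, n=5)
print(all(verify(x, s, comms) for x, s in shares))          # True
print(verify(shares[0][0], (shares[0][1] + 1) % q, comms))  # False: corrupted
```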

  12. A Verifiable Secret Shuffle of Homomorphic Encryptions

    DEFF Research Database (Denmark)

    Groth, Jens

    2003-01-01

    We show how to prove in honest-verifier zero-knowledge the correctness of a shuffle of homomorphic encryptions (or homomorphic commitments). A shuffle consists of a rearrangement of the input ciphertexts and a re-encryption of them so that the permutation is not revealed...
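
    The shuffle operation itself (permute, then re-encrypt by multiplying with a fresh encryption of 1) can be sketched with toy ElGamal; the honest-verifier zero-knowledge proof of correctness is the contribution of the paper and is omitted here. All parameters are illustrative and insecure:

```python
import random

# Toy ElGamal over a small safe-prime group (illustration only)
q, p, g = 1019, 2039, 4        # p = 2q + 1; g generates the order-q subgroup
sk = random.randrange(1, q)    # decryption key held by the tallying authority
pk = pow(g, sk, p)

def encrypt(m):
    r = random.randrange(1, q)
    return pow(g, r, p), m * pow(pk, r, p) % p

def reencrypt(ct):
    """Multiply by a fresh encryption of 1: same plaintext, new randomness."""
    a, b = encrypt(1)
    return ct[0] * a % p, ct[1] * b % p

def shuffle(cts):
    """A shuffle = secret permutation + re-encryption of every ciphertext."""
    perm = random.sample(range(len(cts)), len(cts))
    return [reencrypt(cts[i]) for i in perm]

def decrypt(ct):
    c1, c2 = ct
    return c2 * pow(c1, q - sk, p) % p   # c1^(-sk) in the order-q subgroup

cts = [encrypt(m) for m in (12, 34, 56)]
print(sorted(decrypt(ct) for ct in shuffle(cts)))  # [12, 34, 56], order hidden
```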

  13. The impact and applicability of critical experiment evaluations

    Energy Technology Data Exchange (ETDEWEB)

    Brewer, R. [Los Alamos National Lab., NM (United States)

    1997-06-01

    This paper very briefly describes a project to evaluate previously performed critical experiments. The evaluation is intended for use by criticality safety engineers to verify calculations, and may also be used to identify data which need further investigation. The evaluation process is briefly outlined; the accepted benchmark critical experiments will be used as a standard for verification and validation. The end result of the project will be a comprehensive reference document.

  14. Multilingual Validation of the Questionnaire for Verifying Stroke-Free Status in West Africa.

    Science.gov (United States)

    Sarfo, Fred; Gebregziabher, Mulugeta; Ovbiagele, Bruce; Akinyemi, Rufus; Owolabi, Lukman; Obiako, Reginald; Akpa, Onoja; Armstrong, Kevin; Akpalu, Albert; Adamu, Sheila; Obese, Vida; Boa-Antwi, Nana; Appiah, Lambert; Arulogun, Oyedunni; Mensah, Yaw; Adeoye, Abiodun; Tosin, Aridegbe; Adeleye, Osimhiarherhuo; Tabi-Ajayi, Eric; Phillip, Ibinaiye; Sani, Abubakar; Isah, Suleiman; Tabari, Nasir; Mande, Aliyu; Agunloye, Atinuke; Ogbole, Godwin; Akinyemi, Joshua; Laryea, Ruth; Melikam, Sylvia; Uvere, Ezinne; Adekunle, Gregory; Kehinde, Salaam; Azuh, Paschal; Dambatta, Abdul; Ishaq, Naser; Saulson, Raelle; Arnett, Donna; Tiwari, Hemnant; Jenkins, Carolyn; Lackland, Dan; Owolabi, Mayowa

    2016-01-01

    The Questionnaire for Verifying Stroke-Free Status (QVSFS), a method for verifying stroke-free status in participants of clinical, epidemiological, and genetic studies, has not been validated in low-income settings where populations have limited knowledge of stroke symptoms. We aimed to validate the QVSFS in 3 languages, Yoruba, Hausa, and Akan, for ascertainment of the stroke-free status of control subjects enrolled in an ongoing stroke epidemiological study in West Africa. Data were collected using a cross-sectional study design in which 384 participants were consecutively recruited from neurology and general medicine clinics of 5 tertiary referral hospitals in Nigeria and Ghana. Ascertainment of stroke status was by neurologists using structured neurological examination, review of case records, and neuroimaging (gold standard). Relative performance of the QVSFS without and with pictures of stroke symptoms (pictograms) was assessed using sensitivity, specificity, positive predictive value, and negative predictive value. The overall median age of the study participants was 54 years, and 48.4% were male. Of 165 stroke cases identified by the gold standard, 98% were determined to have had stroke, whereas of 219 without stroke, 87% were determined to be stroke-free by the QVSFS. The negative predictive value of the QVSFS across the 3 languages was 0.97 (range, 0.93-1.00); sensitivity, specificity, and positive predictive value were 0.98, 0.82, and 0.80, respectively. Agreement between the questionnaire with and without the pictogram was excellent/strong, with Cohen's κ = 0.92. The QVSFS is a valid tool for verifying stroke-free status across culturally diverse populations in West Africa. © 2015 American Heart Association, Inc.
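
    The reported sensitivity, specificity, PPV, and NPV all derive from one 2x2 table against the gold standard. A quick sketch (the cell counts are back-calculated approximations from the abstract, not the study's exact table):

```python
def diagnostic_metrics(tp, fp, fn, tn):
    """Sensitivity, specificity, PPV, and NPV from a 2x2 table."""
    return {
        "sensitivity": tp / (tp + fn),
        "specificity": tn / (tn + fp),
        "ppv": tp / (tp + fp),
        "npv": tn / (tn + fn),
    }

# Approximate table: 165 gold-standard stroke cases (about 98% detected),
# 219 stroke-free participants (about 87% classified as negative)
print(diagnostic_metrics(tp=162, fp=28, fn=3, tn=191))
```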

  15. Verified scientific findings

    International Nuclear Information System (INIS)

    Bullinger, M.G.

    1982-01-01

    In this essay, the author attempts to enlighten the reader as to the meaning of the term ''verified scientific findings'' in section 13, sub-section 1, sentence 2 of the new Chemicals Control Law. The examples given here are the generally accepted regulations with regard to technology (that is, sections 7a and 18b of the WHG (law on water economy) and section 3, sub-section 1 of the machine and engine protection laws), the status of technology (section 3, sub-section 6 of the BImSchG (Federal law on prevention of air-borne pollution)), and the status of science (section 5, sub-section 2 of the AMG (drug legislation)). The ''status of science and technology'' as defined in sections 4 ff of the Atomic Energy Law (AtomG) and in sections 3, 4, 12, 2) of the First Radiation Protection Ordinance (1.StrlSch. VO) is also discussed. The author defines this, in his opinion dynamic, term as the generally recognized result of scientific research and the respective possibilities of practical utilization of technology. (orig.) [de]

  16. An experiment designed to verify the general theory of relativity; Une experience destinee a verifier la theorie de la relativite generalisee

    Energy Technology Data Exchange (ETDEWEB)

    Surdin, Maurice [Commissariat a l' energie atomique et aux energies alternatives - CEA (France)

    1960-07-01

    A project for an experiment which uses the effect of gravitation on maser-type clocks placed on the ground at two different heights and which is designed to verify the general theory of relativity. Reprint of a paper published in Comptes rendus des seances de l'Academie des Sciences, t. 250, p. 299-301, sitting of 11 January 1960.

  17. Analytic solution to verify code predictions of two-phase flow in a boiling water reactor core channel

    International Nuclear Information System (INIS)

    Chen, K.F.; Olson, C.A.

    1983-01-01

    One reliable method that can be used to verify the solution scheme of a computer code is to compare the code prediction to a simplified problem for which an analytic solution can be derived. An analytic solution for the axial pressure drop as a function of the flow was obtained for the simplified problem of homogeneous equilibrium two-phase flow in a vertical, heated channel with a cosine axial heat flux shape. This analytic solution was then used to verify the predictions of the CONDOR computer code, which is used to evaluate the thermal-hydraulic performance of boiling water reactors. The results show excellent agreement between the analytic solution and CONDOR prediction

  18. Optimised resource construction for verifiable quantum computation

    International Nuclear Information System (INIS)

    Kashefi, Elham; Wallden, Petros

    2017-01-01

    Recent developments have brought the possibility of achieving scalable quantum networks and quantum devices closer. From the computational point of view these emerging technologies become relevant when they are no longer classically simulatable. Hence a pressing challenge is the construction of practical methods to verify the correctness of the outcome produced by universal or non-universal quantum devices. A promising approach that has been extensively explored is the scheme of verification via encryption through blind quantum computation. We present here a new construction that simplifies the required resources for any such verifiable protocol. We obtain an overhead that is linear in the size of the input (computation), while the security parameter remains independent of the size of the computation and can be made exponentially small (with a small extra cost). Furthermore our construction is generic and could be applied to any universal or non-universal scheme with a given underlying graph. (paper)

  19. Verifying versus falsifying banknotes

    Science.gov (United States)

    van Renesse, Rudolf L.

    1998-04-01

    A series of counterfeit Dutch, German, English, and U.S. banknotes was examined with respect to the various modi operandi used to imitate paper-based, printed, and post-printed security features. These features provide positive evidence (verifiability) as well as negative evidence (falsifiability). It appears that the positive evidence provided is in most cases insufficiently convincing: banknote inspection mainly rests on negative evidence. The act of falsifying (proving to be false), however, is an inefficacious procedure. Ergonomic, verificatory security features are demanded. This demand is increasingly met by security features based on nano-technology. The potential of nano-security has a twofold basis: (1) the unique optical effects displayed allow simple, fast, and unambiguous inspection, and (2) the nano-technology they are based on makes successful counterfeiting or simulation extremely improbable.

  20. Identifying Method of Drunk Driving Based on Driving Behavior

    Directory of Open Access Journals (Sweden)

    Xiaohua Zhao

    2011-05-01

    Full Text Available Drunk driving is one of the leading causes contributing to traffic crashes. There are numerous issues that need to be resolved with the current method of identifying drunk driving. Driving behavior, which can be observed in real time, has been extensively researched to identify impaired driving. In this paper, drives with BACs above 0.05% were defined as the drunk driving state. A detailed comparison was made between normal driving and drunk driving. An experiment in a driving simulator was designed to collect the driving performance data of the groups. Based on an analysis of the effect of alcohol on driving performance, seven significant indicators were extracted, and drunk driving was identified by the Fisher discriminant method. The discriminant function demonstrated a high accuracy of classification. The optimal critical score to differentiate the normal from the drinking state was found to be 0. The evaluation result verifies the accuracy of the classification method.
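
    Fisher discriminant classification with a critical score of 0 can be reproduced in a few lines. The sketch below uses synthetic stand-ins for the seven driving-performance indicators; the feature values and class separation are assumptions:

```python
import numpy as np
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis

rng = np.random.default_rng(7)
# Synthetic stand-ins for seven indicators (e.g., lane-position SD,
# speed SD, steering reversal rate, ...)
normal = rng.normal(0.0, 1.0, size=(60, 7))
drunk = rng.normal(0.8, 1.2, size=(60, 7))   # shifted: impaired driving
X = np.vstack([normal, drunk])
y = np.r_[np.zeros(60), np.ones(60)]

lda = LinearDiscriminantAnalysis().fit(X, y)
scores = lda.decision_function(X)            # Fisher discriminant score
pred = (scores > 0).astype(int)              # critical score 0, as in the paper
print("classification accuracy:", (pred == y).mean())
```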

  1. Robustness and device independence of verifiable blind quantum computing

    International Nuclear Information System (INIS)

    Gheorghiu, Alexandru; Kashefi, Elham; Wallden, Petros

    2015-01-01

    Recent advances in theoretical and experimental quantum computing bring us closer to scalable quantum computing devices. This makes the need for protocols that verify the correct functionality of quantum operations timely and has led to the field of quantum verification. In this paper we address key challenges to make quantum verification protocols applicable to experimental implementations. We prove the robustness of the single server verifiable universal blind quantum computing protocol of Fitzsimons and Kashefi (2012 arXiv:1203.5217) in the most general scenario. This includes the case where the purification of the deviated input state is in the hands of an adversarial server. The proved robustness property allows the composition of this protocol with a device-independent state tomography protocol that we give, which is based on the rigidity of CHSH games as proposed by Reichardt et al (2013 Nature 496 456–60). The resulting composite protocol has lower round complexity for the verification of entangled quantum servers with a classical verifier and, as we show, can be made fault tolerant. (paper)

  2. NOS CO-OPS Water Level Data, Verified, Hourly

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — This dataset has verified (quality-controlled), hourly, water level (tide) data from NOAA NOS Center for Operational Oceanographic Products and Services (CO-OPS)....

  3. Unary self-verifying symmetric difference automata

    CSIR Research Space (South Africa)

    Marais, Laurette

    2016-07-01

    Full Text Available 18th International Workshop on Descriptional Complexity of Formal Systems, 5-8 July 2016, Bucharest, Romania. Unary self-verifying symmetric difference automata. Laurette Marais and Lynette van Zijl, Department of Computer Science, Stellenbosch...

  4. Verifying the agreed framework between the United States and North Korea

    International Nuclear Information System (INIS)

    May, M.M.

    2001-01-01

    Under the 1994 Agreed Framework (AF) between the United States and the Democratic People's Republic of Korea (DPRK), the US and its allies will provide two nuclear-power reactors and other benefits to the DPRK in exchange for an agreement by the DPRK to declare how much nuclear-weapon material it has produced; to identify, freeze, and eventually dismantle specified facilities for producing this material; and to remain a party to the nuclear Non-Proliferation Treaty (NPT) and allow the implementation of its safeguards agreement. This study assesses the verifiability of these provisions. The study concludes verification can be accomplished, given cooperation and openness from the DPRK. Special effort will be needed from the IAEA, as well as support from the US and the Republic of Korea. (author)

  5. A Practical Voter-Verifiable Election Scheme.

    OpenAIRE

    Chaum, D; Ryan, PYA; Schneider, SA

    2005-01-01

    We present an election scheme designed to allow voters to verify that their vote is accurately included in the count. The scheme provides a high degree of transparency whilst ensuring the secrecy of votes. Assurance is derived from close auditing of all the steps of the vote recording and counting process with minimal dependence on the system components. Thus, assurance arises from verification of the election rather than having to place trust in the correct behaviour of components of the vot...

  6. Identifying and Evaluating Chaotic Behavior in Hydro-Meteorological Processes

    Directory of Open Access Journals (Sweden)

    Soojun Kim

    2015-01-01

    Full Text Available The aim of this study is to identify and evaluate chaotic behavior in hydro-meteorological processes. The study poses two hypotheses for identifying chaotic behavior in the processes. First, it assumes that the input data are the significant factor providing chaotic characteristics to the output data. Second, it assumes that the system itself is the significant factor providing chaotic characteristics to the output data. To address this issue, hydro-meteorological time series such as precipitation, air temperature, discharge, and storage volume were collected in the Great Salt Lake and Bear River Basin, USA. Time series with a period of approximately one year were extracted from the original series using the wavelet transform. Time series generated from a summation of sine functions were fitted to each series and used for investigating the hypotheses. Artificial neural networks were then built to model the reservoir system, and the correlation dimension was analyzed to evaluate chaotic behavior between inputs and outputs. From the results, we found that the chaotic characteristic of the storage volume (the output) is likely a byproduct of the chaotic behavior of the reservoir system itself rather than that of the input data.
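
    Correlation dimension is commonly estimated with the Grassberger-Procaccia algorithm: delay-embed the series, count near pairs at several radii, and take the slope of log C(r) against log r. A rough sketch (the embedding parameters and the logistic-map test series are assumptions, not the study's data):

```python
import numpy as np

def correlation_dimension(x, m=3, tau=1, radii=None):
    """Grassberger-Procaccia sketch: slope of log C(r) vs log r for a
    delay-embedded series (m dimensions, delay tau)."""
    emb = np.column_stack([x[i * tau: len(x) - (m - 1 - i) * tau]
                           for i in range(m)])
    d = np.linalg.norm(emb[:, None, :] - emb[None, :, :], axis=-1)
    d = d[np.triu_indices_from(d, k=1)]             # unique pairwise distances
    if radii is None:
        radii = np.geomspace(d[d > 0].min(), d.max(), 12)[2:-2]
    C = np.array([np.mean(d <= r) for r in radii])  # correlation sums
    slope, _ = np.polyfit(np.log(radii), np.log(C), 1)
    return slope

# Logistic map as a stand-in chaotic series
x = np.empty(1000)
x[0] = 0.4
for i in range(len(x) - 1):
    x[i + 1] = 3.9 * x[i] * (1 - x[i])
print(correlation_dimension(x))   # a low, non-integer estimate suggests chaos
```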

  7. A synthesis of evaluation monitoring projects by the forest health monitoring program (1998-2007)

    Science.gov (United States)

    William A. Bechtold; Michael J. Bohne; Barbara L. Conkling; Dana L. Friedman

    2012-01-01

    The national Forest Health Monitoring Program of the Forest Service, U.S. Department of Agriculture, has funded over 200 Evaluation Monitoring projects. Evaluation Monitoring is designed to verify and define the extent of deterioration in forest ecosystems where potential problems have been identified. This report is a synthesis of results from over 150 Evaluation...

  8. Development of measurement standards for verifying functional performance of surface texture measuring instruments

    Energy Technology Data Exchange (ETDEWEB)

    Fujii, A [Life and Industrial Product Development Department Olympus Corporation, 2951 Ishikawa-machi, Hachiouji-shi, Tokyo (Japan); Suzuki, H [Industrial Marketing and Planning Department Olympus Corporation, Shinjyuku Monolith, 3-1 Nishi-Shinjyuku 2-chome, Tokyo (Japan); Yanagi, K, E-mail: a_fujii@ot.olympus.co.jp [Department of Mechanical Engineering, Nagaoka University of Technology, 1603-1 Kamitomioka-machi, Nagaoka-shi, Niigata (Japan)

    2011-08-19

    A new measurement standard is proposed for verifying the overall functional performance of surface texture measuring instruments. Its surface is composed of sinusoidal waveforms of chirp signals along horizontal cross sections of the material measure. One of the notable features is that the amplitude of each cycle in the chirp signal is geometrically modulated so that the maximum slope is kept constant. The maximum slope of the chirp-like signal is gradually decreased with movement in the lateral direction. We fabricated the measurement standard by FIB processing, and it was calibrated by AFM. We evaluated the functional performance of a laser scanning microscope with this standard in terms of amplitude response at varying slope angles. As a result, it was concluded that the proposed standard can easily evaluate the performance of surface texture measuring instruments.
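
    The constant-maximum-slope geometry follows from max |d/dx (A sin(2*pi*f*x))| = 2*pi*f*A: as the frequency rises, the amplitude is scaled by 1/f. A sketch of such a profile (the frequency ratio, cycle count, and units are assumptions):

```python
import numpy as np

def constant_slope_chirp(x, s_max=1.0, f0=1.0, k=1.5, cycles=8):
    """Piecewise sinusoidal chirp whose frequency grows by a factor k each
    cycle while the amplitude shrinks so the maximum slope 2*pi*f*A stays
    equal to s_max (a sketch of the geometry described above)."""
    y = np.zeros_like(x)
    start, f = 0.0, f0
    for _ in range(cycles):
        period = 1.0 / f
        mask = (x >= start) & (x < start + period)
        amplitude = s_max / (2 * np.pi * f)         # keeps 2*pi*f*A = s_max
        y[mask] = amplitude * np.sin(2 * np.pi * f * (x[mask] - start))
        start += period
        f *= k
    return y

x = np.linspace(0, 3.0, 6000)
profile = constant_slope_chirp(x)                   # surface height vs position
slopes = np.abs(np.diff(profile) / np.diff(x))
print(profile.max(), slopes.max())                  # max slope is about 1
```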

  9. The Method of a Standalone Functional Verifying Operability of Sonar Control Systems

    Directory of Open Access Journals (Sweden)

    A. A. Sotnikov

    2014-01-01

    Full Text Available This article describes a method for standalone verification of a sonar control system, based on functional checking of control system operability. The main features of the realized method are the development of a valid mathematical model for simulating sonar signals at the point of the hydroacoustic antenna, a valid representation of the sonar control system modes as a discrete Markov model, and functional object verification in real-time mode. Some ways are proposed to control computational complexity in case of insufficient computing resources of the simulation equipment, namely reducing model functionality or reducing adequacy. Experiments were made using testing equipment developed by the department of the Research Institute of Information Control Systems at Bauman Moscow State Technical University to verify the technical validity of industrial sonar complexes. On-board software was artificially changed to create malfunctions in the functionality of the sonar control systems during the verifying process in order to estimate the performance of the verifying system. The method's efficiency was proved by theory and experimental results in comparison with the basic methodology of verifying technical systems. This method could also be used in debugging on-board software of sonar complexes and in developing new promising algorithms for sonar signal processing.

  10. A Trustworthy Internet Auction Model with Verifiable Fairness.

    Science.gov (United States)

    Liao, Gen-Yih; Hwang, Jing-Jang

    2001-01-01

    Describes an Internet auction model achieving verifiable fairness, a requirement aimed at enhancing the trust of bidders in auctioneers. Analysis results demonstrate that the proposed model satisfies various requirements regarding fairness and privacy. Moreover, in the proposed model, the losing bids remain sealed. (Author/AEF)

  11. Reliability of stable Pb isotopes to identify Pb sources and verifying biological fractionation of Pb isotopes in goats and chickens

    International Nuclear Information System (INIS)

    Nakata, Hokuto; Nakayama, Shouta M.M.; Yabe, John; Liazambi, Allan; Mizukawa, Hazuki; Darwish, Wageh Sobhy; Ikenaka, Yoshinori; Ishizuka, Mayumi

    2016-01-01

    Stable Pb isotope ratios (Pb-IRs) have been recognized as an efficient tool for identifying sources. This study was carried out at the Kabwe mining area, Zambia, to elucidate the presence or absence of Pb isotope fractionation in goats and chickens, to evaluate the reliability of identifying Pb pollution sources via analysis of Pb-IRs, and to assess whether a threshold of blood Pb levels (Pb-B) for biological fractionation was present. The variation of Pb-IRs in goats decreased with an increase in Pb-B and was fixed at certain values close to those of the dominant source of Pb exposure at Pb-B > 5 μg/dL. However, chickens did not show a clear relationship for Pb-IRs against Pb-B, or a fractionation threshold. Given these findings, the biological fractionation of Pb isotopes should not occur in chickens but does in goats, and the threshold for triggering biological fractionation is at around 5 μg/dL of Pb-B in goats. - Highlights: • Presence of Pb isotope fractionation in goats and chickens was studied. • The variation of Pb-IRs in goats decreased with an increase in Pb-B. • Chickens did not show a clear relationship for Pb-IRs against Pb-B. • The biological fractionation of Pb isotopes should not occur in chickens but does in goats. • The threshold for triggering biological fractionation is at 5 μg/dL of Pb-B in goats. - Biological fractionation and its threshold for stable Pb isotope ratios in goats and chickens were examined.

  12. A record and verify system for radiotherapy treatment

    International Nuclear Information System (INIS)

    Koens, M.L.; Vroome, H. de

    1984-01-01

    The Record and Verify system developed for the radiotherapy department of the Leiden University Hospital is described. The system has been in use since 1980 and will now be installed in at least four of the Dutch University Hospitals. The system provides the radiographer with a powerful tool for checking the set-up of the linear accelerator preceding the irradiation of a field. After the irradiation of a field, the machine settings are registered in the computer system together with the newly calculated cumulative dose. These registrations are used by the system to produce a daily report which provides the management of the department with insight into the established differences between treatment and treatment planning. Buying a record and verify system from the manufacturer of the linear accelerator is not an optimal solution, especially for a department with more than one accelerator from different manufacturers. Integration in a Hospital Information System (HIS) has important advantages over the development of a dedicated departmental system. (author)

  13. Verifiable Outsourced Decryption of Attribute-Based Encryption with Constant Ciphertext Length

    Directory of Open Access Journals (Sweden)

    Jiguo Li

    2017-01-01

    Full Text Available Outsourced decryption ABE systems largely reduce the computation cost for users who intend to access encrypted files stored in the cloud. However, the correctness of the transformation ciphertext cannot be guaranteed because the user does not have the original ciphertext. Lai et al. provided an ABE scheme with verifiable outsourced decryption which helps the user check whether the transformation done by the cloud is correct. In order to improve the computation performance and reduce communication overhead, we propose a new verifiable outsourcing scheme with constant ciphertext length. To be specific, our scheme achieves the following goals. (1) Our scheme is verifiable, which ensures that the user efficiently checks whether the transformation is done correctly by the CSP. (2) The size of the ciphertext and the number of expensive pairing operations are constant, and do not grow with the complexity of the access structure. (3) The access structure in our scheme is AND gates on multivalued attributes, and we prove our scheme is verifiable and secure against selectively chosen-plaintext attack in the standard model. (4) We give some performance analysis which indicates that our scheme is adaptable for various limited-bandwidth and computation-constrained devices, such as mobile phones.

  14. Building Program Verifiers from Compilers and Theorem Provers

    Science.gov (United States)

    2015-05-14

    Slide excerpts: UFO, an LLVM-based front-end (partially reused in SeaHorn), combines abstract interpretation with interpolation-based model checking; counter-examples are long, and it is hard to determine (from main) which assertion is relevant. (Gurfinkel, 2015)

  15. NOS CO-OPS Water Level Data, Verified, High Low

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — This dataset has verified (quality-controlled), daily, high low water level (tide) data from NOAA NOS Center for Operational Oceanographic Products and Services...

  16. Methodology to identify, review, and evaluate components for license renewal

    International Nuclear Information System (INIS)

    Carlson, D.D.; Gregor, F.E.; Walker, R.S.

    1988-01-01

    A methodology has been developed to systematically identify, review, and evaluate plant equipment for license renewal. The method builds upon the existing licensing basis, operating history, and accepted deterministic and probabilistic techniques. Use of these approaches provides a focus for license renewal upon those safety-significant systems and components that are not routinely replaced, refurbished, or subject to detailed inspection as part of the plant's existing test, maintenance, and surveillance programs. Application of the method identified the PWR and BWR systems that should be subjected to detailed license renewal review. Detailed examination of two example systems demonstrates the approach. The review and evaluation of plant equipment for license renewal differ from the initial licensing of the plant. A substantial operating history has been established, the licensing basis has evolved from the original one, and plant equipment has been subject to periodic maintenance and surveillance throughout its life. In consideration of these differences, a basis for license renewal is needed. License renewal should be based upon continuation of the existing licensing basis and recognition of existing programs and operating history

  17. NOS CO-OPS Water Level Data, Verified, 6-Minute

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — This dataset has verified (quality-controlled), 6-minute, water level (tide) data from NOAA NOS Center for Operational Oceanographic Products and Services (CO-OPS)....

  18. Verifying mapping, monitoring and modeling of fine sediment pollution sources in West Maui, Hawai'i, USA

    Science.gov (United States)

    Cerovski-Darriau, C.; Stock, J. D.

    2017-12-01

    Coral reef ecosystems, and the fishing and tourism industries they support, depend on clean waters. Fine sediment pollution from nearshore watersheds threatens these enterprises in West Maui, Hawai'i. To effectively mitigate sediment pollution, we first have to know where the sediment is coming from, and how fast it erodes. In West Maui, we know that nearshore sediment plumes originate from erosion of fine sand- to silt-sized air fall deposits where they are exposed by grazing, agriculture, or other disturbances. We identified and located these sediment sources by mapping watershed geomorphological processes using field traverses, historic air photos, and modern orthophotos. We estimated bank lowering rates using erosion pins, and other surface erosion rates were extrapolated from data collected elsewhere on the Hawaiian Islands. These measurements and mapping led to a reconnaissance sediment budget which showed that annual loads are dominated by bank erosion of legacy terraces. Field observations during small storms confirm that nearshore sediment plumes are sourced from bank erosion of in-stream, legacy agricultural deposits. To further verify this sediment budget, we used geochemical fingerprinting to uniquely identify each potential source (e.g. stream banks, agricultural fields, roads, other human-modified soils, and hillslopes) from the Wahikuli watershed (10 km2) and analyzed the fine fraction using ICP-MS for elemental geochemistry. We propose to apply the fingerprinting results to nearshore suspended sediment samples taken during storms to identify the proportion of sediment coming from each source. By combining traditional geomorphic mapping, monitoring, and geochemistry, we hope to provide a powerful tool to verify the primary source of sediment reaching the nearshore.

  19. Verified Subtyping with Traits and Mixins

    Directory of Open Access Journals (Sweden)

    Asankhaya Sharma

    2014-07-01

    Full Text Available Traits allow decomposing programs into smaller parts, and mixins are a form of composition that resembles multiple inheritance. Unfortunately, in the presence of traits, programming languages like Scala give up on the subtyping relation between objects. In this paper, we present a method to check subtyping between objects based on entailment in separation logic. We implement our method as a domain-specific language in Scala and apply it to the Scala standard library. We have verified that 67% of mixins used in the Scala standard library do indeed conform to subtyping between the traits that are used to build them.

  20. Evolution of optically nondestructive and data-non-intrusive credit card verifiers

    Science.gov (United States)

    Sumriddetchkajorn, Sarun; Intaravanne, Yuttana

    2010-04-01

    Since the deployment of the credit card, the number of credit card fraud cases has grown rapidly, with losses amounting to millions of US dollars. Instead of asking the cardholder for more information or taking on risk by approving the payment, a nondestructive and data-non-intrusive credit card verifier is highly desirable before a transaction begins. In this paper, we review optical techniques that have been proposed and invented to make a genuine credit card more distinguishable from a counterfeit one. Several optical approaches for the implementation of credit card verifiers are also included. In particular, we highlight our invention of a hyperspectral-imaging based portable credit card verifier structure that offers a very low false error rate of 0.79%. Other key features include low cost, simplicity in design and implementation, no moving parts, no need for an additional decoding key, and adaptive learning.

  1. Verifying Safety Messages Using Relative-Time and Zone Priority in Vehicular Ad Hoc Networks

    Science.gov (United States)

    Banani, Sam; Thiemjarus, Surapa; Kittipiyakul, Somsak

    2018-01-01

    In high-density road networks, with each vehicle broadcasting multiple messages per second, the arrival rate of safety messages can easily exceed the rate at which digital signatures can be verified. Since not all messages can be verified, algorithms for selecting which messages to verify are required to ensure that each vehicle receives appropriate awareness about neighbouring vehicles. This paper presents a novel scheme to select important safety messages for verification in vehicular ad hoc networks (VANETs). The proposed scheme uses location and direction of the sender, as well as proximity and relative-time between vehicles, to reduce the number of irrelevant messages verified (i.e., messages from vehicles that are unlikely to cause an accident). Compared with other existing schemes, the analysis results show that the proposed scheme can verify messages from nearby vehicles with lower inter-message delay and reduced packet loss and thus provides high level of awareness of the nearby vehicles. PMID:29652840
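
    A selection policy of this kind can be sketched as a priority queue over received messages, scored by sender proximity and message age, with only the highest-priority messages verified in each cycle. The scoring weights below are illustrative stand-ins for the paper's zone and relative-time rules:

```python
import heapq
import time

def select_for_verification(messages, budget, now=None):
    """Pick the `budget` most relevant safety messages when arrivals exceed
    the signature-verification rate. Lower score = verify first; the weight
    on age is an assumed stand-in for the paper's actual scheme."""
    now = now if now is not None else time.time()
    def priority(msg):
        distance = msg["distance_m"]        # proximity of the sender
        age = now - msg["timestamp"]        # relative time of the message
        return distance + 50.0 * age
    return heapq.nsmallest(budget, messages, key=priority)

msgs = [
    {"id": 1, "distance_m": 15,  "timestamp": time.time() - 0.1},
    {"id": 2, "distance_m": 300, "timestamp": time.time() - 0.2},
    {"id": 3, "distance_m": 40,  "timestamp": time.time() - 2.0},
]
print([m["id"] for m in select_for_verification(msgs, budget=2)])  # [1, 3]
```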

  2. What are the ultimate limits to computational techniques: verifier theory and unverifiability

    International Nuclear Information System (INIS)

    Yampolskiy, Roman V

    2017-01-01

    Despite significant developments in proof theory, surprisingly little attention has been devoted to the concept of proof verifiers. In particular, the mathematical community may be interested in studying different types of proof verifiers (people, programs, oracles, communities, superintelligences) as mathematical objects. Such an effort could reveal their properties, their powers and limitations (particularly in human mathematicians), minimum and maximum complexity, as well as self-verification and self-reference issues. We propose an initial classification system for verifiers and provide some rudimentary analysis of solved and open problems in this important domain. Our main contribution is a formal introduction of the notion of unverifiability, for which the paper could serve as a general citation in domains of theorem proving, as well as software and AI verification. (invited comment)

  3. What are the ultimate limits to computational techniques: verifier theory and unverifiability

    Science.gov (United States)

    Yampolskiy, Roman V.

    2017-09-01

    Despite significant developments in proof theory, surprisingly little attention has been devoted to the concept of proof verifiers. In particular, the mathematical community may be interested in studying different types of proof verifiers (people, programs, oracles, communities, superintelligences) as mathematical objects. Such an effort could reveal their properties, their powers and limitations (particularly in human mathematicians), minimum and maximum complexity, as well as self-verification and self-reference issues. We propose an initial classification system for verifiers and provide some rudimentary analysis of solved and open problems in this important domain. Our main contribution is a formal introduction of the notion of unverifiability, for which the paper could serve as a general citation in domains of theorem proving, as well as software and AI verification.

  4. Dynamic Symmetric Key Mobile Commerce Scheme Based on Self-Verified Mechanism

    Directory of Open Access Journals (Sweden)

    Jiachen Yang

    2014-01-01

    Full Text Available In terms of the security and efficiency of mobile e-commerce, the authors summarized the advantages and disadvantages of several related schemes, especially the self-verified mobile payment scheme based on the elliptic curve cryptosystem (ECC), and then proposed a new type of dynamic symmetric key mobile commerce scheme based on a self-verified mechanism. The authors analyzed the basic algorithm based on self-verified mechanisms and detailed the complete transaction process of the proposed scheme. The authors analyzed the payment scheme with respect to security and efficiency. The analysis shows that the proposed scheme not only meets the efficiency requirements of mobile electronic payment but also takes security into account. The user confirmation mechanism at the end of the proposed scheme further strengthens its security. In brief, the proposed scheme is more efficient and practical than most of the existing schemes.
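
    The paper's scheme is built on ECC with a self-verified mechanism, whose details are not reproduced here. As a generic illustration of the dynamic-symmetric-key idea only, the following sketch derives a fresh per-transaction key from a shared long-term secret with HMAC; the key sizes and message format are hypothetical.

        import hmac, hashlib, os

        def derive_session_key(long_term_key: bytes, counter: int,
                               nonce: bytes) -> bytes:
            """Fresh symmetric key per transaction from a shared secret."""
            msg = counter.to_bytes(8, "big") + nonce
            return hmac.new(long_term_key, msg, hashlib.sha256).digest()

        ltk = os.urandom(32)       # provisioned out of band
        nonce = os.urandom(16)     # sent in the clear with the payment
        k1 = derive_session_key(ltk, 1, nonce)
        mac = hmac.new(k1, b"pay 9.99 EUR to merchant-42",
                       hashlib.sha256).hexdigest()
        print(mac)                 # authenticates this one transaction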

  5. A credit card verifier structure using diffraction and spectroscopy concepts

    Science.gov (United States)

    Sumriddetchkajorn, Sarun; Intaravanne, Yuttana

    2008-04-01

    We propose and experimentally demonstrate an angle-multiplexing based optical structure for verifying a credit card. Our key idea comes from the fact that the fine detail of the embossed hologram stamped on the credit card is hard to duplicate, and therefore its key color features can be used for distinguishing between the real and counterfeit ones. As the embossed hologram is a diffractive optical element, we shine, one at a time, a number of broadband light sources, each at a different incident angle, on the embossed hologram of the credit card in such a way that a different color spectrum per incident angle is diffracted and separated in space. In this way, the number of pixels of each color plane is investigated. Then we apply a feed-forward back-propagation neural network configuration to separate the counterfeit credit card from the real one. Our experimental demonstration using two off-the-shelf broadband white light emitting diodes, one digital camera, a 3-layer neural network, and a notebook computer can identify all 69 counterfeit credit cards from eight real credit cards.

  6. Process to identify and evaluate restoration options

    International Nuclear Information System (INIS)

    Strand, J.; Senner, S.; Weiner, A.; Rabinowitch, S.; Brodersen, M.; Rice, K.; Klinge, K.; MacMullin, S.; Yender, R.; Thompson, R.

    1993-01-01

    The restoration planning process has yielded a number of possible alternatives for restoring resources and services injured by the Exxon Valdez oil spill. They were developed by resource managers, scientists, and the public, taking into consideration the results of damage assessment and restoration studies and information from the scientific literature. The alternatives thus far identified include no action (natural recovery), management of human uses, manipulation of resources, habitat protection and acquisition, acquisition of equivalent resources, and combinations of the above. Each alternative consists of a different mix of resource- or service-specific restoration options. To decide whether it was appropriate to spend restoration funds on a particular resource or service, criteria first had to be developed that evaluated available evidence for consequential injury and the adequacy and rate of natural recovery. Then, recognizing the range of effective restoration options, a second set of criteria was applied to determine which restoration options were the most beneficial. These criteria included technical feasibility, potential to improve the rate or degree of recovery, the relationship of expected costs to benefits, cost effectiveness, and the potential to restore the ecosystem as a whole. The restoration options considered to be most beneficial will be grouped together in several or more of the above alternatives and presented in a draft restoration plan. They will be further evaluated in a companion draft environmental impact statement.

  7. Verifiable Measurement-Only Blind Quantum Computing with Stabilizer Testing.

    Science.gov (United States)

    Hayashi, Masahito; Morimae, Tomoyuki

    2015-11-27

    We introduce a simple protocol for verifiable measurement-only blind quantum computing. Alice, a client, can perform only single-qubit measurements, whereas Bob, a server, can generate and store entangled many-qubit states. Bob generates copies of a graph state, which is a universal resource state for measurement-based quantum computing, and sends each of their qubits to Alice one by one. Alice adaptively measures each qubit according to her program. If Bob is honest, he generates the correct graph state, and, therefore, Alice can obtain the correct computation result. Regarding security, whatever Bob does, he cannot get any information about Alice's computation because of the no-signaling principle. Furthermore, a malicious Bob does not necessarily send copies of the correct graph state, but Alice can check the correctness of Bob's state by directly verifying the stabilizers of some copies.
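
    The stabilizers being tested are K_a = X_a ∏_{b∈N(a)} Z_b, one per vertex a of the graph. The sketch below is a classical state-vector check that these operators fix a small graph state; it illustrates the algebra only, not the single-qubit-measurement test performed in the protocol.

        import numpy as np

        def graph_state(n, edges):
            """|G> = prod_{(a,b) in E} CZ_ab |+>^n, indexed by bitstrings."""
            psi = np.full(2**n, 2.0**(-n / 2))
            for a, b in edges:
                for idx in range(2**n):
                    if (idx >> a) & 1 and (idx >> b) & 1:
                        psi[idx] *= -1.0   # CZ flips sign when both bits set
            return psi

        def apply_stabilizer(psi, n, a, edges):
            """Apply K_a = X_a * prod_{b in N(a)} Z_b to psi."""
            out = psi.copy()
            for u, v in edges:
                if a in (u, v):
                    b = v if u == a else u
                    for idx in range(2**n):
                        if (idx >> b) & 1:
                            out[idx] *= -1.0
            return out[np.arange(2**n) ^ (1 << a)]  # X_a permutes basis states

        n, edges = 3, [(0, 1), (1, 2)]              # a 3-qubit line graph
        psi = graph_state(n, edges)
        for a in range(n):
            assert np.allclose(apply_stabilizer(psi, n, a, edges), psi)
        print("all stabilizers verified")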

  8. Characterizing Verified Head Impacts in High School Girls' Lacrosse.

    Science.gov (United States)

    Caswell, Shane V; Lincoln, Andrew E; Stone, Hannah; Kelshaw, Patricia; Putukian, Margot; Hepburn, Lisa; Higgins, Michael; Cortes, Nelson

    2017-12-01

    Girls' high school lacrosse players have higher rates of head and facial injuries than boys. Research indicates that these injuries are caused by stick, player, and ball contacts. Yet, no studies have characterized head impacts in girls' high school lacrosse. To characterize girls' high school lacrosse game-related impacts by frequency, magnitude, mechanism, player position, and game situation. Descriptive epidemiology study. Thirty-five female participants (mean age, 16.2 ± 1.2 years; mean height, 1.66 ± 0.05 m; mean weight, 61.2 ± 6.4 kg) volunteered during 28 games in the 2014 and 2015 lacrosse seasons. Participants wore impact sensors affixed to the right mastoid process before each game. All game-related impacts recorded by the sensors were verified using game video. Data were summarized for all verified impacts in terms of frequency, peak linear acceleration (PLA), and peak rotational acceleration (PRA). Descriptive statistics and impact rates were calculated. Fifty-eight verified game-related impacts ≥20 g were recorded (median PLA, 33.8 g; median PRA, 6151.1 rad/s²) during 467 player-games. The impact rate for all game-related verified impacts was 0.12 per athlete-exposure (AE) (95% CI, 0.09-0.16), equivalent to 2.1 impacts per team game, indicating that each athlete suffered fewer than 2 head impacts per season ≥20 g. Of these impacts, 28 (48.3%) were confirmed to directly strike the head, corresponding with an impact rate of 0.05 per AE (95% CI, 0.00-0.10). Overall, midfielders (n = 28, 48.3%) sustained the most impacts, followed by defenders (n = 12, 20.7%), attackers (n = 11, 19.0%), and goalies (n = 7, 12.1%). Goalies demonstrated the highest median PLA and PRA (38.8 g and 8535.0 rad/s², respectively). The most common impact mechanisms were contact with a stick (n = 25, 43.1%) and a player (n = 17, 29.3%), followed by the ball (n = 7, 12.1%) and the ground (n = 7, 12.1%). One hundred percent of ball impacts occurred to goalies. Most impacts
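
    The reported impact rate and its confidence interval can be reproduced from the counts in the abstract, assuming a normal approximation to the Poisson-distributed impact count:

        import math

        impacts, exposures = 58, 467
        rate = impacts / exposures
        se = math.sqrt(impacts) / exposures   # Poisson SE of count, scaled
        lo, hi = rate - 1.96 * se, rate + 1.96 * se
        print(f"{rate:.2f} per AE (95% CI {lo:.2f}-{hi:.2f})")
        # 0.12 per AE (95% CI 0.09-0.16), matching the reported values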

  9. Methods for verifying compliance with low-level radioactive waste acceptance criteria

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    1993-09-01

    This report summarizes the methods that are currently employed and those that can be used to verify compliance with low-level radioactive waste (LLW) disposal facility waste acceptance criteria (WAC). This report presents the applicable regulations representing the Federal, State, and site-specific criteria for accepting LLW. Typical LLW generators are summarized, along with descriptions of their waste streams and final waste forms. General procedures and methods used by the LLW generators to verify compliance with the disposal facility WAC are presented. The report was written to provide an understanding of how a regulator could verify compliance with a LLW disposal facility's WAC. A comprehensive study of the methodology used to verify waste generator compliance with the disposal facility WAC is presented in this report. The study involved compiling the relevant regulations to define the WAC, reviewing regulatory agency inspection programs, and summarizing waste verification technology and equipment. The results of the study indicate that waste generators conduct verification programs that include packaging, classification, characterization, and stabilization elements. The current LLW disposal facilities perform waste verification steps on incoming shipments. A model inspection and verification program, which includes an emphasis on the generator's waste application documentation of their waste verification program, is recommended. The disposal facility verification procedures primarily involve the use of portable radiological survey instrumentation. The actual verification of generator compliance to the LLW disposal facility WAC is performed through a combination of incoming shipment checks and generator site audits.

  10. Methods for verifying compliance with low-level radioactive waste acceptance criteria

    International Nuclear Information System (INIS)

    1993-09-01

    This report summarizes the methods that are currently employed and those that can be used to verify compliance with low-level radioactive waste (LLW) disposal facility waste acceptance criteria (WAC). This report presents the applicable regulations representing the Federal, State, and site-specific criteria for accepting LLW. Typical LLW generators are summarized, along with descriptions of their waste streams and final waste forms. General procedures and methods used by the LLW generators to verify compliance with the disposal facility WAC are presented. The report was written to provide an understanding of how a regulator could verify compliance with a LLW disposal facility's WAC. A comprehensive study of the methodology used to verify waste generator compliance with the disposal facility WAC is presented in this report. The study involved compiling the relevant regulations to define the WAC, reviewing regulatory agency inspection programs, and summarizing waste verification technology and equipment. The results of the study indicate that waste generators conduct verification programs that include packaging, classification, characterization, and stabilization elements. The current LLW disposal facilities perform waste verification steps on incoming shipments. A model inspection and verification program, which includes an emphasis on the generator's waste application documentation of their waste verification program, is recommended. The disposal facility verification procedures primarily involve the use of portable radiological survey instrumentation. The actual verification of generator compliance to the LLW disposal facility WAC is performed through a combination of incoming shipment checks and generator site audits

  11. Getting What We Paid for: a Script to Verify Full Access to E-Resources

    Directory of Open Access Journals (Sweden)

    Kristina M. Spurgin

    2014-07-01

    Full Text Available Libraries regularly pay for packages of e-resources containing hundreds to thousands of individual titles. Ideally, library patrons could access the full content of all titles in such packages. In reality, library staff and patrons inevitably stumble across inaccessible titles, but no library has the resources to manually verify full access to all titles, and basic URL checkers cannot check for access. This article describes the E-Resource Access Checker—a script that automates the verification of full access. With the Access Checker, library staff can identify all inaccessible titles in a package and bring these problems to content providers’ attention to ensure we get what we pay for.
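
    A plain URL checker stops at the HTTP status code; checking full access means inspecting what the provider actually delivers. The sketch below is a much-simplified Python illustration of that distinction, with hypothetical paywall markers; it is not the published Access Checker, which handles provider-specific page structures.

        import requests

        # Hypothetical strings that betray a paywalled landing page.
        PAYWALL_MARKERS = ["Purchase PDF", "Get Access", "Sign in to view"]

        def has_full_access(url: str) -> bool:
            """HTTP 200 alone is not enough; the delivered page must also
            be free of paywall text."""
            resp = requests.get(url, timeout=30, allow_redirects=True)
            if resp.status_code != 200:
                return False
            return not any(m in resp.text for m in PAYWALL_MARKERS)

        print(has_full_access("https://example.org/title/12345"))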

  12. Identifying and evaluating E-procurement in supply chain risk by Fuzzy MADM

    Directory of Open Access Journals (Sweden)

    Mostafa Memarzade

    2012-08-01

    Full Text Available E-procurement risk has emerged as an important issue for researchers and practitioners because mitigating supply chain risk helps improve firms' as well as supply chains' performance. E-marketplaces have been growing steadily and there has been significant interest in e-business research. There are different risks and uncertainties involved with E-marketplaces, which jeopardize the sector, yet despite considerable hype the business continues to grow. The primary aim of this study is to identify E-procurement risks and evaluate them using a fuzzy AHP framework. We contribute to E-procurement risk research by identifying 13 critical criteria and determining four important ones for evaluating suppliers' risk: the extent of acceptable information, interrelationship risk, lack of honesty in relationships, and product quality and safety.

  13. Identifying and overcoming barriers to technology implementation

    International Nuclear Information System (INIS)

    Bailey, M.; Warren, S.; McCune, M.

    1996-01-01

    In a recent General Accounting Office report, the Department of Energy's (DOE) Office of Environmental Management was found to be ineffective in integrating its environmental technology development efforts with cleanup actions. As a result of these findings, a study of remediation documents was performed by the Technology Applications Team within DOE's Office of Environmental Restoration (EM-40) to validate this finding and to understand why it was occurring. A second initiative built on the foundation of the remediation document study and evaluated solutions to the ineffective implementation of improved technologies. The Technology Applications Team examined over 50 remediation documents (17 projects) which included nearly 600 proposed remediation technologies. It was determined that very few technologies reach the Records of Decision documents. In fact, most are eliminated in the early stages of consideration. These observations stem from regulators' and stakeholders' uncertainties about the cost and performance of a technology and the inability of the technology to meet site-specific conditions. The Technology Applications Team also set out to identify and evaluate solutions to barriers to implementing innovative technology in the DOE's environmental management activities. Through the combined efforts of DOE and the Hazardous Waste Action Coalition (HWAC), a full-day workshop was conducted at the annual HWAC meeting in June 1995 to address barriers to innovative technology implementation. Three barriers were identified as widespread throughout the DOE complex and industry: a lack of verified or certified cost and performance data for innovative technologies; risk of failure to reach cleanup goals using innovative technologies; and communication barriers that are present at virtually every stage of the characterization/remediation process from development through implementation.

  14. Making Digital Artifacts on the Web Verifiable and Reliable

    NARCIS (Netherlands)

    Kuhn, T.; Dumontier, M.

    2015-01-01

    The current Web has no general mechanisms to make digital artifacts - such as datasets, code, texts, and images - verifiable and permanent. For digital artifacts that are supposed to be immutable, there is moreover no commonly accepted method to enforce this immutability. These shortcomings have a
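
    One common way to make a digital artifact verifiable and immutable, in the spirit of this record, is to embed a cryptographic hash of the content in its identifier so that anyone can recompute and compare it. A minimal sketch follows; the authors' actual scheme uses its own hash encoding, and the URI layout here is hypothetical.

        import hashlib

        BASE = "https://example.org/artifact/"   # hypothetical namespace

        def make_verifiable_uri(artifact: bytes) -> str:
            """Embed the content hash in the identifier itself."""
            return BASE + hashlib.sha256(artifact).hexdigest()

        def verify(uri: str, artifact: bytes) -> bool:
            """Re-hash the artifact and compare with the identifier."""
            return uri.rsplit("/", 1)[-1] == hashlib.sha256(artifact).hexdigest()

        data = b"immutable dataset v1"
        uri = make_verifiable_uri(data)
        assert verify(uri, data)                  # intact artifact passes
        assert not verify(uri, data + b" edit")   # any change is detected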

  15. Evaluating the Atrial Myopathy Underlying Atrial Fibrillation: Identifying the Arrhythmogenic and Thrombogenic Substrate

    Science.gov (United States)

    Goldberger, Jeffrey J.; Arora, Rishi; Green, David; Greenland, Philip; Lee, Daniel C.; Lloyd-Jones, Donald M.; Markl, Michael; Ng, Jason; Shah, Sanjiv J.

    2015-01-01

    Atrial disease or myopathy forms the substrate for atrial fibrillation (AF) and underlies the potential for atrial thrombus formation and subsequent stroke. Current diagnostic approaches in patients with AF focus on identifying clinical predictors with evaluation of left atrial size by echocardiography serving as the sole measure specifically evaluating the atrium. Although the atrial substrate underlying AF is likely developing for years prior to the onset of AF, there is no current evaluation to identify the pre-clinical atrial myopathy. Atrial fibrosis is one component of the atrial substrate that has garnered recent attention based on newer MRI techniques that have been applied to visualize atrial fibrosis in humans with prognostic implications regarding success of treatment. Advanced ECG signal processing, echocardiographic techniques, and MRI imaging of fibrosis and flow provide up-to-date approaches to evaluate the atrial myopathy underlying AF. While thromboembolic risk is currently defined by clinical scores, their predictive value is mediocre. Evaluation of stasis via imaging and biomarkers associated with thrombogenesis may provide enhanced approaches to assess risk for stroke in patients with AF. Better delineation of the atrial myopathy that serves as the substrate for AF and thromboembolic complications might improve treatment outcomes. Furthermore, better delineation of the pathophysiologic mechanisms underlying the development of the atrial substrate for AF, particularly in its earlier stages, could help identify blood and imaging biomarkers that could be useful to assess risk for developing new onset AF and suggest specific pathways that could be targeted for prevention. PMID:26216085

  16. 77 FR 70484 - Preoperational Testing of Onsite Electric Power Systems To Verify Proper Load Group Assignments...

    Science.gov (United States)

    2012-11-26

    ...-1294, "Preoperational Testing of On-Site Electric Power Systems to Verify Proper Load Group... entitled "Preoperational Testing of On-Site Electric Power Systems to Verify Proper Load Group... Electric Power Systems to Verify Proper Load Group Assignments, Electrical Separation, and Redundancy...

  17. TrustGuard: A Containment Architecture with Verified Output

    Science.gov (United States)

    2017-01-01

    ...that the TrustGuard system has minimal performance decline, despite restrictions such as high communication latency and limited available bandwidth. ...design are the availability of high bandwidth and low delays between the host and the monitoring chip. 3-D integration provides an alternate way of... (Ph.D. dissertation, Soumyadeep Ghosh, Princeton University.)

  18. Verifying a smart design of TCAP : a synergetic experience

    NARCIS (Netherlands)

    T. Arts; I.A. van Langevelde

    1999-01-01

    An optimisation of the SS No. 7 Transport Capabilities Procedures is verified by specifying both the original and the optimised TCAP in μCRL, generating transition systems for both using the μCRL tool set, and checking weak bisimulation

  19. Scoring of the radiological picture of idiopathic interstitial pneumonia: a study to verify the reliability of the method

    International Nuclear Information System (INIS)

    Kocova, Eva; Vanasek, Jiri; Koblizek, Vladimir; Novosad, Jakub; Elias, Pavel; Bartos, Vladimir; Sterclova, Martina

    2015-01-01

    Idiopathic pulmonary fibrosis (IPF) is a clinical form of usual interstitial pneumonia (UIP). Computed chest tomography (CT) has a fundamental role in the multidisciplinary diagnostics. However, it has not been verified whether and how the subjective opinion of a radiologist or pneumologist can influence the assessment and overall diagnostic summary. To verify the reliability of the scoring system, assessment of the conformity of the radiological score of high-resolution CT (HRCT) of the lungs in patients with IPF was performed by a group of radiologists and pneumologists. Personal data were blinded and the assessment was performed independently using the Dutka/Vasakova scoring system (a modification of the Gay system). The final scores of the single assessors were then evaluated by means of pairwise Spearman correlation and principal component analysis. Two principal components, cumulatively explaining 62% or 73% of the variability in the single assessors' ratings, were extracted during the analysis. The groups differed neither in specialty nor in experience with the assessment of HRCT findings. According to our study, scoring of a radiological image using the Dutka/Vasakova system is a reliable method in the hands of experienced radiologists. Significant differences occur in the assessments performed by pneumologists, especially during the evaluation of alveolar changes.
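
    The analysis pipeline described, pairwise Spearman correlations between assessors followed by principal component extraction, can be sketched in a few lines of Python on synthetic scores; the data below are random stand-ins, not the study's ratings.

        import numpy as np
        from scipy.stats import spearmanr
        from sklearn.decomposition import PCA

        # scores[i, j]: HRCT score given by assessor j to patient i
        rng = np.random.default_rng(0)
        scores = rng.integers(0, 20, size=(30, 6)).astype(float)

        rho, _ = spearmanr(scores)            # 6x6 assessor agreement matrix
        pca = PCA().fit(scores)
        cum = np.cumsum(pca.explained_variance_ratio_)
        print(np.round(rho, 2))
        print("first two components explain",
              round(cum[1] * 100), "% of the variance")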

  20. Identifying protein phosphorylation sites with kinase substrate specificity on human viruses.

    Directory of Open Access Journals (Sweden)

    Neil Arvin Bretaña

    Full Text Available Viruses infect humans and progress inside the body, leading to various diseases and complications. The phosphorylation of viral proteins catalyzed by host kinases plays crucial regulatory roles in enhancing replication and inhibiting normal host-cell functions. Due to its biological importance, there is a desire to identify the protein phosphorylation sites on human viruses. However, the use of mass spectrometry-based experiments has proven to be expensive and labor-intensive. Furthermore, previous studies which have identified phosphorylation sites in human viruses do not include the investigation of the responsible kinases. Thus, we are motivated to propose a new method to identify protein phosphorylation sites with their kinase substrate specificity on human viruses. The experimentally verified phosphorylation data were extracted from virPTM, a database containing 301 experimentally verified phosphorylation data on 104 human kinase-phosphorylated virus proteins. In an attempt to investigate kinase substrate specificities in viral protein phosphorylation sites, maximal dependence decomposition (MDD) is employed to cluster a large set of phosphorylation data into subgroups containing significantly conserved motifs. The experimental human phosphorylation sites are collected from Phospho.ELM, grouped according to their kinase annotation, and compared with the virus MDD clusters. This investigation identifies human kinases such as CK2, PKB, CDK, and MAPK as potential kinases for catalyzing virus protein substrates, as confirmed by published literature. A profile hidden Markov model is then applied to learn a predictive model for each subgroup. A five-fold cross-validation evaluation on the MDD-clustered HMMs yields an average accuracy of 84.93% for serine and 78.05% for threonine. Furthermore, independent testing data collected from UniProtKB and Phospho.ELM are used to make a comparison of predictive performance on three popular kinase

  1. Identifying protein phosphorylation sites with kinase substrate specificity on human viruses.

    Science.gov (United States)

    Bretaña, Neil Arvin; Lu, Cheng-Tsung; Chiang, Chiu-Yun; Su, Min-Gang; Huang, Kai-Yao; Lee, Tzong-Yi; Weng, Shun-Long

    2012-01-01

    Viruses infect humans and progress inside the body, leading to various diseases and complications. The phosphorylation of viral proteins catalyzed by host kinases plays crucial regulatory roles in enhancing replication and inhibiting normal host-cell functions. Due to its biological importance, there is a desire to identify the protein phosphorylation sites on human viruses. However, the use of mass spectrometry-based experiments has proven to be expensive and labor-intensive. Furthermore, previous studies which have identified phosphorylation sites in human viruses do not include the investigation of the responsible kinases. Thus, we are motivated to propose a new method to identify protein phosphorylation sites with their kinase substrate specificity on human viruses. The experimentally verified phosphorylation data were extracted from virPTM, a database containing 301 experimentally verified phosphorylation data on 104 human kinase-phosphorylated virus proteins. In an attempt to investigate kinase substrate specificities in viral protein phosphorylation sites, maximal dependence decomposition (MDD) is employed to cluster a large set of phosphorylation data into subgroups containing significantly conserved motifs. The experimental human phosphorylation sites are collected from Phospho.ELM, grouped according to their kinase annotation, and compared with the virus MDD clusters. This investigation identifies human kinases such as CK2, PKB, CDK, and MAPK as potential kinases for catalyzing virus protein substrates, as confirmed by published literature. A profile hidden Markov model is then applied to learn a predictive model for each subgroup. A five-fold cross-validation evaluation on the MDD-clustered HMMs yields an average accuracy of 84.93% for serine and 78.05% for threonine. Furthermore, independent testing data collected from UniProtKB and Phospho.ELM are used to make a comparison of predictive performance on three popular kinase-specific phosphorylation site
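
    The evaluation setup, fixed-width residue windows around a candidate site scored by five-fold cross validation, can be illustrated with a toy sketch. Note the substitution: the paper trains MDD-clustered profile HMMs, while this sketch uses one-hot windows and logistic regression purely to show the encoding and cross-validation mechanics; the example windows and labels are invented.

        import numpy as np
        from sklearn.linear_model import LogisticRegression
        from sklearn.model_selection import cross_val_score

        AMINO = "ACDEFGHIKLMNPQRSTVWY"

        def one_hot(window: str) -> np.ndarray:
            """Encode a +/-4 residue window around a candidate serine."""
            vec = np.zeros(len(window) * len(AMINO))
            for i, aa in enumerate(window):
                if aa in AMINO:
                    vec[i * len(AMINO) + AMINO.index(aa)] = 1.0
            return vec

        # Hypothetical windows; the centre residue is the candidate site.
        windows = ["RRASVAGSA", "LKRSRSEDS", "AAAGAAASA", "GGGSGGGTG"]
        labels = [1, 1, 0, 0]
        X = np.array([one_hot(w) for w in windows])
        print(cross_val_score(LogisticRegression(), X, labels, cv=2))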

  2. Identify User’s Satisfaction from Platform Using Behavior

    Directory of Open Access Journals (Sweden)

    Kuo L.H.

    2016-01-01

    Full Text Available The purpose of this study was to verify a model of user satisfaction with an e-learning environment based upon platform usage behaviors. This is a non-experimental study. The data were collected from system management logs and a user satisfaction survey administered after the learning service. A total of 314 users were included in this study. First, the theory model was identified. Second, the satisfaction survey results were prepared. Third, the behavior data of each survey subject were prepared. A CFA procedure was conducted to verify whether the data fit the model. The model fit is good, with χ² = 2.06, p = .151, df = 1, RMSEA = .058. The proposed two-factor theory model with simple structure fits the data. The 'love of e-learning' and 'satisfaction with e-learning' factors significantly support the hypothesized relationship between the factors. The findings suggest that identifying satisfaction from behavior is possible.
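
    The reported RMSEA can be checked against the other fit numbers, assuming the standard formula RMSEA = sqrt(max(0, χ² − df) / (df · (N − 1))) with N = 314:

        import math

        chi2, df, n = 2.06, 1, 314
        rmsea = math.sqrt(max(0.0, chi2 - df) / (df * (n - 1)))
        print(round(rmsea, 3))   # 0.058, matching the reported fit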

  3. Analyzing Interaction Patterns to Verify a Simulation/Game Model

    Science.gov (United States)

    Myers, Rodney Dean

    2012-01-01

    In order for simulations and games to be effective for learning, instructional designers must verify that the underlying computational models being used have an appropriate degree of fidelity to the conceptual models of their real-world counterparts. A simulation/game that provides incorrect feedback is likely to promote misunderstanding and…

  4. Verifying Correct Usage of Atomic Blocks and Typestate: Technical Companion

    National Research Council Canada - National Science Library

    Beckman, Nels E; Aldrich, Jonathan

    2008-01-01

    In this technical report, we present a static and dynamic semantics as well as a proof of soundness for a programming language presented in the paper entitled, 'Verifying Correct Usage of Atomic Blocks and Typestate...

  5. Development of material measures for performance verifying surface topography measuring instruments

    International Nuclear Information System (INIS)

    Leach, Richard; Giusca, Claudiu; Rickens, Kai; Riemer, Oltmann; Rubert, Paul

    2014-01-01

    The development of two irregular-geometry material measures for performance verifying surface topography measuring instruments is described. The material measures are designed to be used to performance verify tactile and optical areal surface topography measuring instruments. The manufacture of the material measures using diamond turning followed by nickel electroforming is described in detail. Measurement results are then obtained using a traceable stylus instrument and a commercial coherence scanning interferometer, and the results are shown to agree to within the measurement uncertainties. The material measures are now commercially available as part of a suite of material measures aimed at the calibration and performance verification of areal surface topography measuring instruments

  6. Verifying Temporal Properties of Reactive Systems by Transformation

    OpenAIRE

    Hamilton, Geoff

    2015-01-01

    We show how program transformation techniques can be used for the verification of both safety and liveness properties of reactive systems. In particular, we show how the program transformation technique distillation can be used to transform reactive systems specified in a functional language into a simplified form that can subsequently be analysed to verify temporal properties of the systems. Example systems which are intended to model mutual exclusion are analysed using these techniques with...

  7. Identifying Knowledge Gaps in Clinicians Who Evaluate and Treat Vocal Performing Artists in College Health Settings.

    Science.gov (United States)

    McKinnon-Howe, Leah; Dowdall, Jayme

    2018-05-01

    The goal of this study was to identify knowledge gaps in clinicians who evaluate and treat performing artists for illnesses and injuries that affect vocal function in college health settings. This pilot study utilized a web-based cross-sectional survey design incorporating common clinical scenarios to test knowledge of evaluation and management strategies in the vocal performing artist. A web-based survey was administered to a purposive sample of 28 clinicians to identify the approach utilized to evaluate and treat vocal performing artists in college health settings, and factors that might affect knowledge gaps and influence referral patterns to voice specialists. Twenty-eight clinicians were surveyed, with 36% of respondents incorrectly identifying appropriate vocal hygiene measures, 56% of respondents failing to identify symptoms of vocal fold hemorrhage, 84% failing to identify other indications for referral to a voice specialist, 96% of respondents acknowledging unfamiliarity with the Voice Handicap Index and the Singers Voice Handicap Index, and 68% acknowledging unfamiliarity with the Reflux Symptom Index. The data elucidated specific knowledge gaps in college health providers who are responsible for evaluating and treating common illnesses that affect vocal function, and triaging and referring students experiencing symptoms of potential vocal emergencies. Future work is needed to improve the standard of care for this population. Copyright © 2018 The Voice Foundation. Published by Elsevier Inc. All rights reserved.

  8. Use of tiling array data and RNA secondary structure predictions to identify noncoding RNA genes

    DEFF Research Database (Denmark)

    Weile, Christian; Gardner, Paul P; Hedegaard, Mads M

    2007-01-01

    neuroblastoma cell line SK-N-AS. Using this strategy, we identify thousands of human candidate RNA genes. To further verify the expression of these genes, we focused on candidate genes that had stable hairpin structures or a high level of covariance. Using northern blotting, we verify the expression of 2 out

  9. Verifiable process monitoring through enhanced data authentication

    International Nuclear Information System (INIS)

    Goncalves, Joao G.M.; Schwalbach, Peter; Schoeneman, Barry Dale; Ross, Troy D.; Baldwin, George Thomas

    2010-01-01

    To ensure the peaceful intent for production and processing of nuclear fuel, verifiable process monitoring of the fuel production cycle is required. As part of a U.S. Department of Energy (DOE)-EURATOM collaboration in the field of international nuclear safeguards, the DOE Sandia National Laboratories (SNL), the European Commission Joint Research Centre (JRC) and Directorate General-Energy (DG-ENER) developed and demonstrated a new concept in process monitoring, enabling the use of operator process information by branching a second, authenticated data stream to the Safeguards inspectorate. This information would be complementary to independent safeguards data, improving the understanding of the plant's operation. The concept is called the Enhanced Data Authentication System (EDAS). EDAS transparently captures, authenticates, and encrypts communication data that is transmitted between operator control computers and connected analytical equipment utilized in nuclear processes controls. The intent is to capture information as close to the sensor point as possible to assure the highest possible confidence in the branched data. Data must be collected transparently by the EDAS: Operator processes should not be altered or disrupted by the insertion of the EDAS as a monitoring system for safeguards. EDAS employs public key authentication providing 'jointly verifiable' data and private key encryption for confidentiality. Timestamps and data source are also added to the collected data for analysis. The core of the system hardware is in a security enclosure with both active and passive tamper indication. Further, the system has the ability to monitor seals or other security devices in close proximity. This paper will discuss the EDAS concept, recent technical developments, intended application philosophy and the planned future progression of this system.
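
    The core mechanism, branching a timestamped, source-tagged, signed copy of operator data to the inspectorate, can be sketched generically. The sketch below uses an Ed25519 signature from the pyca/cryptography package; EDAS's actual cryptographic design is not reproduced here, and all record fields are hypothetical.

        import json, time
        from cryptography.hazmat.primitives.asymmetric.ed25519 import (
            Ed25519PrivateKey,
        )

        signing_key = Ed25519PrivateKey.generate()  # inside secure enclosure
        verify_key = signing_key.public_key()       # held by the inspectorate

        def branch_record(sensor_id: str, payload: dict) -> dict:
            """Timestamp, tag and sign a copy of the operator data stream."""
            record = {"source": sensor_id, "time": time.time(),
                      "data": payload}
            blob = json.dumps(record, sort_keys=True).encode()
            return {"record": record, "sig": signing_key.sign(blob).hex()}

        r = branch_record("scale-7", {"mass_kg": 12.4})
        blob = json.dumps(r["record"], sort_keys=True).encode()
        verify_key.verify(bytes.fromhex(r["sig"]), blob)  # raises if tampered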

  10. Identifying the critical financial ratios for stocks evaluation: A fuzzy delphi approach

    Science.gov (United States)

    Mokhtar, Mazura; Shuib, Adibah; Mohamad, Daud

    2014-12-01

    Stocks evaluation has always been an interesting and challenging problem for both researchers and practitioners. Generally, the evaluation can be made based on a set of financial ratios. Nevertheless, there is a variety of financial ratios that can be considered, and if all ratios in the set are placed into the evaluation process, data collection would be more difficult and time-consuming. Thus, the objective of this paper is to identify the most important financial ratios upon which to focus in order to evaluate the stock's performance. For this purpose, a survey was carried out using an approach which is based on expert judgement, namely the Fuzzy Delphi Method (FDM). The results of this study indicated that return on equity, return on assets, net profit margin, operating profit margin, earnings per share and debt to equity are the most important ratios.
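
    In a Fuzzy Delphi round, each expert rates a ratio's importance as a triangular fuzzy number; the ratings are aggregated, defuzzified, and compared against a threshold. A minimal sketch, assuming the common min/mean/max aggregation and (l + 2m + u)/4 defuzzification; the ratings, threshold, and conventions are illustrative and may differ from the authors'.

        import numpy as np

        # Expert ratings of one ratio's importance as TFNs (l, m, u).
        ratings = np.array([(0.5, 0.7, 0.9),
                            (0.7, 0.9, 1.0),
                            (0.3, 0.5, 0.7)])

        # Aggregate: pessimistic lower bound, mean mode, optimistic upper.
        l = ratings[:, 0].min()
        m = ratings[:, 1].mean()
        u = ratings[:, 2].max()

        score = (l + 2 * m + u) / 4     # simple defuzzification
        THRESHOLD = 0.6                 # hypothetical acceptance cut-off
        print("keep ratio" if score >= THRESHOLD else "drop ratio",
              round(score, 3))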

  11. Efficient Verifiable Range and Closest Point Queries in Zero-Knowledge

    Directory of Open Access Journals (Sweden)

    Ghosh Esha

    2016-10-01

    Full Text Available We present an efficient method for answering one-dimensional range and closest-point queries in a verifiable and privacy-preserving manner. We consider a model where a data owner outsources a dataset of key-value pairs to a server, who answers range and closest-point queries issued by a client and provides proofs of the answers. The client verifies the correctness of the answers while learning nothing about the dataset besides the answers to the current and previous queries. Our work yields for the first time a zero-knowledge privacy assurance to authenticated range and closest-point queries. Previous work leaked the size of the dataset and used an inefficient proof protocol. Our construction is based on hierarchical identity-based encryption. We prove its security and analyze its efficiency both theoretically and with experiments on synthetic and real data (Enron email and Boston taxi datasets).

  12. Use of models and mockups in verifying man-machine interfaces

    International Nuclear Information System (INIS)

    Seminara, J.L.

    1985-01-01

    The objective of Human Factors Engineering is to tailor the design of facilities and equipment systems to match the capabilities and limitations of the personnel who will operate and maintain the system. This optimization of the man-machine interface is undertaken to enhance the prospects for safe, reliable, timely, and error-free human performance in meeting system objectives. To ensure the eventual success of a complex man-machine system it is important to systematically and progressively test and verify the adequacy of man-machine interfaces from initial design concepts to system operation. Human factors specialists employ a variety of methods to evaluate the quality of the human-system interface. These methods include: (1) reviews of two-dimensional drawings using appropriately scaled transparent overlays of personnel spanning the anthropometric range, considering clothing and protective gear encumbrances; (2) use of articulated, scaled, plastic templates or manikins that are overlayed on equipment or facility drawings; (3) development of computerized manikins in computer aided design approaches; (4) use of three-dimensional scale models to better conceptualize work stations, control rooms or maintenance facilities; (5) full- or half-scale mockups of system components to evaluate operator/maintainer interfaces; (6) part- or full-task dynamic simulation of operator or maintainer tasks and interactive system responses; (7) laboratory and field research to establish human performance capabilities with alternative system design concepts or configurations. Of the design verification methods listed above, this paper will only consider the use of models and mockups in the design process

  13. Trends in the incidence rate, type and treatment of surgically verified endometriosis - a nationwide cohort study.

    Science.gov (United States)

    Saavalainen, Liisu; Tikka, Tuulia; But, Anna; Gissler, Mika; Haukka, Jari; Tiitinen, Aila; Härkki, Päivi; Heikinheimo, Oskari

    2018-01-01

    To study the trends in incidence rate, type and surgical treatment, and patient characteristics of surgically verified endometriosis during 1987-2012. This is a register-based cohort study. We identified women receiving their first diagnosis of endometriosis in surgery from the Finnish Hospital Discharge Register (FHDR). Quality of the FHDR records was assessed bidirectionally. The age-standardized incidence rates of the first surgically verified endometriosis was assessed by calendar year. The cohort comprises 49 956 women. The quality assessment suggested the FHDR data to be of good quality. The most common diagnosis, ovarian endometriosis (46%), was associated with highest median age 38.5 years (interquartile range 31.0-44.8) and the second most common diagnosis, peritoneal endometriosis (40%), with median age 34.9 years (28.6-41.7). Between 1987 and 2012, a decrease was observed in the median age, from 38.8 (32.3-43.6) to 34.0 (28.9-41.0) years, and in the age-standardized incidence rate from 116 [95% confidence interval (CI) 112-121] to 45 (42-48) per 100 000 women. The proportion of hysterectomy as a first surgical treatment decreased from 38 to 19%, whereas that of laparoscopy increased from 42 to 73% when comparing 1987-1995 with 1996-2012. This nationwide cohort of surgically verified endometriosis showed a decrease in the incidence rate and in the patient age at the time of first diagnosis, even though the proportion of laparoscopy has increased. The number of hysterectomies has decreased. These changes are likely to reflect the evolving diagnostics, increasing awareness of endometriosis, and effective use of medical treatment before surgery. © 2017 Nordic Federation of Societies of Obstetrics and Gynecology.

  14. Experimental evaluation of the exposure level onboard Czech Airlines aircraft - measurements verified the routine method

    International Nuclear Information System (INIS)

    Ploc, O.; Spurny, F.; Turek, K.; Kovar, I.

    2008-01-01

    Air-crew members are exposed to ionizing radiation due to their work on board aircraft. The International Commission on Radiological Protection (ICRP) recommended in 1990 that exposure to cosmic radiation in the operation of jet aircraft be recognised as occupational exposure. Czech air transport operators are therefore obliged to ensure: - that air-crew members are well informed about the exposure level and health risks; - an analysis of the complete exposure level of aircraft crew and its continued monitoring in cases exceeding the informative value of 1 mSv; - compliance with the limit of 1 mSv during pregnancy. Since 1998, after receiving the proper accreditation, the Department of Radiation Dosimetry of the Nuclear Physics Institute of the Czech Academy of Sciences (DRD) has been the competent dosimetric service realizing the requirements of Notice No. 307 of the State Office for Nuclear Safety concerning air-crew exposure (paragraphs 87-90). The DRD developed a routine method of personal dosimetry for aircraft crew in 1998, which has been applied since receiving accreditation in the same year. DRD therefore helps Czech Airlines a.s. (CSA) with the legislative obligations mentioned above; in return, once every four years, under a business contract, CSA allows scientific measurements performed by DRD onboard its aircraft with the aim of verifying the routine method of individual monitoring of aircraft crew exposure. (authors)

  15. Preclinical Evaluations To Identify Optimal Linezolid Regimens for Tuberculosis Therapy

    Science.gov (United States)

    Drusano, George L.; Adams, Jonathan R.; Rodriquez, Jaime L.; Jambunathan, Kalyani; Baluya, Dodge L.; Brown, David L.; Kwara, Awewura; Mirsalis, Jon C.; Hafner, Richard; Louie, Arnold

    2015-01-01

    Linezolid is an oxazolidinone with potent activity against Mycobacterium tuberculosis. Linezolid toxicity in patients correlates with the dose and duration of therapy. These toxicities are attributable to the inhibition of mitochondrial protein synthesis. Clinically relevant linezolid regimens were simulated in the in vitro hollow-fiber infection model (HFIM) system to identify the linezolid therapies that minimize toxicity, maximize antibacterial activity, and prevent drug resistance. Linezolid inhibited mitochondrial proteins in an exposure-dependent manner, with toxicity being driven by trough concentrations. Once-daily linezolid killed M. tuberculosis in an exposure-dependent manner. Further, 300 mg linezolid given every 12 hours generated more bacterial kill but more toxicity than 600 mg linezolid given once daily. None of the regimens prevented linezolid resistance. These findings show that with linezolid monotherapy, a clear tradeoff exists between antibacterial activity and toxicity. By identifying the pharmacokinetic parameters linked with toxicity and antibacterial activity, these data can provide guidance for clinical trials evaluating linezolid in multidrug antituberculosis regimens. PMID:26530386

  16. An Evaluation of the Technical Adequacy of a Revised Measure of Quality Indicators of Transition

    Science.gov (United States)

    Morningstar, Mary E.; Lee, Hyunjoo; Lattin, Dana L.; Murray, Angela K.

    2016-01-01

    This study confirmed the reliability and validity of the Quality Indicators of Exemplary Transition Programs Needs Assessment-2 (QI-2). Quality transition program indicators were identified through a systematic synthesis of transition research, policies, and program evaluation measures. To verify reliability and validity of the QI-2, we…

  17. People consider reliability and cost when verifying their autobiographical memories.

    Science.gov (United States)

    Wade, Kimberley A; Nash, Robert A; Garry, Maryanne

    2014-02-01

    Because memories are not always accurate, people rely on a variety of strategies to verify whether the events that they remember really did occur. Several studies have examined which strategies people tend to use, but none to date has asked why people opt for certain strategies over others. Here we examined the extent to which people's beliefs about the reliability and the cost of different strategies would determine their strategy selection. Subjects described a childhood memory and then suggested strategies they might use to verify the accuracy of that memory. Next, they rated the reliability and cost of each strategy, and the likelihood that they might use it. Reliability and cost each predicted strategy selection, but a combination of the two ratings provided even greater predictive value. Cost was significantly more influential than reliability, which suggests that a tendency to seek and to value "cheap" information more than reliable information could underlie many real-world memory errors. Copyright © 2013 Elsevier B.V. All rights reserved.

  18. Large test rigs verify Clinch River control rod reliability

    International Nuclear Information System (INIS)

    Michael, H.D.; Smith, G.G.

    1983-01-01

    The purpose of the Clinch River control test programme was to use multiple full-scale prototypic control rod systems for verifying the system's ability to perform reliably during simulated reactor power control and emergency shutdown operations. Two major facilities, the Shutdown Control Rod and Maintenance (Scram) facility and the Dynamic and Seismic Test (Dast) facility, were constructed. The test programme of each facility is described. (UK)

  19. Verified Gaming

    DEFF Research Database (Denmark)

    Kiniry, Joseph Roland; Zimmerman, Daniel

    2011-01-01

    In recent years, several Grand Challenges (GCs) of computing have been identified and expounded upon by various professional organizations in the U.S. and England. These GCs are typically very difficult problems that will take many hundreds, or perhaps thousands, of man-years to solve. Researchers... falls every year and any mention of mathematics in the classroom seems to frighten students away. So the question is: How do we attract new students in computing to the area of dependable software systems? Over the past several years at three universities we have experimented with the use of computer games...

  20. Identifying, Preparing and Evaluating Army Instructors

    Science.gov (United States)

    2016-04-01

    ...is perhaps the most prominent and widely-used framework for evaluating training courses and programs (Hilbert, Preskill & Russ-Eft, 1997; Hoole...). Glazerman, S., & Seifullah, A. (2012). An evaluation of the Chicago Teacher Advancement Program (Chicago TAP) after four years. (Report prepared for...). ..., H., & Russ-Eft, D. (1997). Evaluating training. In L. J. Bassi & D. Russ-Eft (Eds.), What works: Assessment, development, and measurement (pp. 109...

  1. Accuracy of self-reported length of coma and posttraumatic amnesia in persons with medically verified traumatic brain injury.

    Science.gov (United States)

    Sherer, Mark; Sander, Angelle M; Maestas, Kacey Little; Pastorek, Nicholas J; Nick, Todd G; Li, Jingyun

    2015-04-01

    To determine the accuracy of self-reported length of coma and posttraumatic amnesia (PTA) in persons with medically verified traumatic brain injury (TBI) and to investigate factors that affect self-report of length of coma and PTA duration. Prospective cohort study. Specialized rehabilitation center with inpatient and outpatient programs. Persons (N=242) with medically verified TBI who were identified from a registry of persons who had previously participated in TBI-related research. Not applicable. Self-reported length of coma and self-reported PTA duration. Review of medical records revealed that the mean medically documented length of coma and PTA duration was 6.9±12 and 19.2±22 days, respectively, and the mean self-reported length of coma and PTA duration was 16.7±22 and 106±194 days, respectively. The average discrepancy between self-report and medical record for length of coma and PTA duration was 8.2±21 and 64±176 days, respectively. Multivariable regression models revealed that time since injury, performance on cognitive tests, and medical record values were associated with self-reported values for both length of coma and PTA duration. In this investigation, persons with medically verified TBI showed poor accuracy in their self-report of length of coma and PTA duration. Discrepancies were large enough to affect injury severity classification. Caution should be exercised when considering self-report of length of coma and PTA duration. Copyright © 2015 American Congress of Rehabilitation Medicine. Published by Elsevier Inc. All rights reserved.

  2. Descriptional complexity of non-unary self-verifying symmetric difference automata

    CSIR Research Space (South Africa)

    Marais, Laurette

    2017-09-01

    Full Text Available Previously, self-verifying symmetric difference automata were defined and a tight bound of 2^(n-1)-1 was shown for state complexity in the unary case. We now consider the non-unary case and show that, for every n at least 2, there is a regular...

  3. Verifying the gravitational shift due to the earth's rotation

    International Nuclear Information System (INIS)

    Briatore, L.; Leschiutta, S.

    1976-01-01

    Data on various independent time scales kept in different laboratories are analysed in order to verify the gravitational shift due to the earth's rotation. It is shown that the state of the art in time measurement now makes measurements of Δt/t ≈ 10^-13 possible. Moreover, experimental evidence of relativistic effects due to the earth's rotation is presented.

  4. Systematic Correlation Matrix Evaluation (SCoMaE) - a bottom-up, science-led approach to identifying indicators

    Science.gov (United States)

    Mengis, Nadine; Keller, David P.; Oschlies, Andreas

    2018-01-01

    This study introduces the Systematic Correlation Matrix Evaluation (SCoMaE) method, a bottom-up approach which combines expert judgment and statistical information to systematically select transparent, nonredundant indicators for a comprehensive assessment of the state of the Earth system. The method consists of two basic steps: (1) the calculation of a correlation matrix among variables relevant for a given research question and (2) the systematic evaluation of the matrix, to identify clusters of variables with similar behavior and respective mutually independent indicators. Optional further analysis steps include (3) the interpretation of the identified clusters, enabling a learning effect from the selection of indicators, (4) testing the robustness of identified clusters with respect to changes in forcing or boundary conditions, (5) enabling a comparative assessment of varying scenarios by constructing and evaluating a common correlation matrix, and (6) the inclusion of expert judgment, for example, to prescribe indicators, to allow for considerations other than statistical consistency. The example application of the SCoMaE method to Earth system model output forced by different CO2 emission scenarios reveals the necessity of reevaluating indicators identified in a historical scenario simulation for an accurate assessment of an intermediate-high, as well as a business-as-usual, climate change scenario simulation. This necessity arises from changes in prevailing correlations in the Earth system under varying climate forcing. For a comparative assessment of the three climate change scenarios, we construct and evaluate a common correlation matrix, in which we identify robust correlations between variables across the three considered scenarios.
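
    Steps (1) and (2), building a correlation matrix and grouping variables with similar behavior, map naturally onto hierarchical clustering of a correlation-based distance. A minimal sketch on synthetic model output follows; the distance choice and clustering cut-off are illustrative assumptions, not the paper's exact procedure.

        import numpy as np
        from scipy.cluster.hierarchy import fcluster, linkage
        from scipy.spatial.distance import squareform

        # rows: time steps, cols: Earth-system variables (synthetic)
        rng = np.random.default_rng(1)
        data = rng.standard_normal((200, 8))
        data[:, 1] = data[:, 0] + 0.1 * rng.standard_normal(200)  # a correlated pair

        corr = np.corrcoef(data, rowvar=False)
        dist = 1.0 - np.abs(corr)           # similar behavior -> small distance
        np.fill_diagonal(dist, 0.0)
        Z = linkage(squareform(dist, checks=False), method="average")
        clusters = fcluster(Z, t=0.5, criterion="distance")
        print(clusters)   # one indicator can then be chosen per cluster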

  5. An approach for verifying biogenic greenhouse gas emissions inventories with atmospheric CO2 concentration data

    Science.gov (United States)

    Stephen M Ogle; Kenneth Davis; Thomas Lauvaux; Andrew Schuh; Dan Cooley; Tristram O West; Linda S Heath; Natasha L Miles; Scott Richardson; F Jay Breidt; James E Smith; Jessica L McCarty; Kevin R Gurney; Pieter Tans; A Scott. Denning

    2015-01-01

    Verifying national greenhouse gas (GHG) emissions inventories is a critical step to ensure that reported emissions data to the United Nations Framework Convention on Climate Change (UNFCCC) are accurate and representative of a country's contribution to GHG concentrations in the atmosphere. Furthermore, verifying biogenic fluxes provides a check on estimated...

  6. The Guided System Development Framework: Modeling and Verifying Communication Systems

    DEFF Research Database (Denmark)

    Carvalho Quaresma, Jose Nuno; Probst, Christian W.; Nielson, Flemming

    2014-01-01

    the verified specification. The refinement process thus carries security properties from the model to the implementation. Our approach also supports verification of systems previously developed and deployed. Internally, the reasoning in our framework is based on the Beliefs and Knowledge tool, a verification tool based on belief logics and explicit attacker knowledge.

  7. The evaluation of trustworthiness to identify health insurance fraud in dentistry.

    Science.gov (United States)

    Wang, Shu-Li; Pai, Hao-Ting; Wu, Mei-Fang; Wu, Fan; Li, Chen-Lin

    2017-01-01

    According to the investigations of the U.S. Government Accountability Office (GAO), health insurance fraud has caused an enormous pecuniary loss in the U.S. In Taiwan, the problem in dentistry is made worse when dentists (authorized entities) file fraudulent claims. Several methods have been developed to address health insurance fraud; however, these methods are essentially rule-based mechanisms. Because they do not explore behavior patterns, they are time-consuming and ineffective; in addition, they are inadequate for managing fraudulent dentists. Based on social network theory, we develop an evaluation approach to solve the problem of cross-dentist fraud. The trustworthiness score of a dentist is calculated based upon the amount and type of dental operations performed on the same patient and the same tooth by that dentist and other dentists. The simulation provides the following evidence. (1) This specific type of fraud can be identified effectively using our evaluation approach. (2) A retrospective study of the claims is also performed. (3) The proposed method is effective in identifying fraudulent dentists. We provide a new direction for investigating the genuineness of claims data. If the insurer can detect fraudulent dentists using the traditional method and the proposed method simultaneously, the detection will be more transparent and will ultimately reduce the losses caused by fraudulent claims. Copyright © 2016 Elsevier B.V. All rights reserved.
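
    The trustworthiness evaluation rests on how often different dentists bill operations on the same patient and the same tooth. As a rough illustration of extracting that cross-dentist overlap from claims data, with invented triples and a counting rule that is not the paper's scoring formula:

        from collections import defaultdict
        from itertools import combinations

        # (dentist, patient, tooth) triples from hypothetical billing data
        claims = [("D1", "P1", 16), ("D2", "P1", 16), ("D1", "P2", 24),
                  ("D2", "P1", 16), ("D3", "P3", 11)]

        seen = defaultdict(set)
        for dentist, patient, tooth in claims:
            seen[(patient, tooth)].add(dentist)

        overlap = defaultdict(int)
        for dentists in seen.values():
            for a, b in combinations(sorted(dentists), 2):
                overlap[(a, b)] += 1   # same patient+tooth, two dentists

        # Higher cross-dentist overlap on a tooth lowers trustworthiness.
        print(dict(overlap))           # {('D1', 'D2'): 1}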

  8. The verification, refinement and application of lexicographic rulers ...

    African Journals Online (AJOL)

    Lexicographic rulers for Afrikaans and the African languages are a decade old and are widely used in the compilation of dictionaries. The compilers have so far not considered it necessary to verify or refine these rulers. Criticism has, however, been expressed of the compilation of the Afrikaans ruler, and this is ...

  9. Identifying and assessing strategies for evaluating the impact of mobile eye health units on health outcomes.

    Science.gov (United States)

    Fu, Shiwan; Turner, Angus; Tan, Irene; Muir, Josephine

    2017-12-01

    To identify and assess strategies for evaluating the impact of mobile eye health units on health outcomes. Systematic literature review. Worldwide. Peer-reviewed journal articles that included the use of a mobile eye health unit. Journal articles were included if outcome measures reflected an assessment of the impact of a mobile eye health unit on health outcomes. Six studies were identified with mobile services offering diabetic retinopathy screening (three studies), optometric services (two studies) and orthoptic services (one study). This review identified and assessed strategies in existing literature used to evaluate the impact of mobile eye health units on health outcomes. Studies included in this review used patient outcomes (i.e. disease detection, vision impairment, treatment compliance) and/or service delivery outcomes (i.e. cost per attendance, hospital transport use, inappropriate referrals, time from diabetic retinopathy photography to treatment) to evaluate the impact of mobile eye health units. Limitations include difficulty proving causation of specific outcome measures and the overall shortage of impact evaluation studies. Variation in geographical location, service population and nature of eye care providers limits broad application. © 2017 National Rural Health Alliance Inc.

  10. Identifying motivators and barriers to student completion of instructor evaluations: A multi-faceted, collaborative approach from four colleges of pharmacy.

    Science.gov (United States)

    McAuley, James W; Backo, Jennifer Lynn; Sobota, Kristen Finley; Metzger, Anne H; Ulbrich, Timothy

    To identify motivators and barriers to pharmacy student completion of instructor evaluations, and to develop potential strategies to improve the evaluation process. Completed at four Ohio Colleges of Pharmacy, Phase I consisted of a student/faculty survey and Phase II consisted of joint student/faculty focus groups to discuss Phase I data and to problem solve. In Phase I, the top three student-identified and faculty-perceived motivators to completion of evaluations were to (1) make the course better, (2) earn bonus points, and (3) improve the instructor's teaching. The top three student-identified barriers to completion of evaluations were having to (1) evaluate multiple instructors, (2) complete several evaluations around the same time, and (3) complete lengthy evaluations. Phase II focus groups identified a number of potential ways to enhance the motivators and reduce barriers, including but not limited to making sure faculty convey to students that the feedback they provide is useful and to provide examples of how student feedback has been used to improve their teaching/the course. Students and faculty identified motivators and barriers to completing instructor evaluations and were willing to work together to improve the process. Copyright © 2016 Elsevier Inc. All rights reserved.

  11. Measuring reporting verifying. A primer on MRV for nationally appropriate mitigation actions

    Energy Technology Data Exchange (ETDEWEB)

    Hinostroza, M. (ed.); Luetken, S.; Holm Olsen, K. (Technical Univ. of Denmark. UNEP Risoe Centre, Roskilde (Denmark)); Aalders, E.; Pretlove, B.; Peters, N. (Det Norske Veritas, Hellerup (Denmark))

    2012-03-15

    The requirements for measurement, reporting and verification (MRV) of nationally appropriate mitigation actions (NAMAs) are one of the crucial topics on the agenda of international negotiations to address climate change mitigation. According to agreements so far, the general guidelines for domestic MRV are to be developed by the Subsidiary Body for Scientific and Technological Advice (SBSTA). Further, the Subsidiary Body for Implementation (SBI) will be conducting international consultations and analysis (ICA) of biennial update reports (BUR) to improve the transparency of mitigation actions, which should be measured, reported and verified. What is clear from the ongoing discussions both at SBSTA and at SBI is that MRV for NAMAs should not be a burden for controlling greenhouse gas (GHG) emissions connected to economic activities. Instead, the MRV process should facilitate mitigation actions, encourage the redirection of investments, and address concerns regarding the carbon content of emission-intensive operations of private and public companies and enterprises worldwide. While MRV requirements are being shaped within the Convention, there are a number of initiatives supporting developing countries moving forward with NAMA development and demonstration activities. How these actions shall be measured, reported and verified, however, remains unanswered. MRV is not new. It is present in most existing policies and frameworks related to climate change mitigation. With an aim to contribute to the international debate and capacity building on this crucial issue, the UNEP Risoe Centre, in cooperation with UNDP, is pleased to present this publication, which, through direct collaboration with Det Norske Veritas (DNV), builds on existing MRV practices in current carbon markets, provides insights on how MRV for NAMAs can be performed, and identifies elements and drivers to be considered when designing adequate MRV systems for NAMAs in developing countries. This primer is the second

  12. Verifying real-time systems against scenario-based requirements

    DEFF Research Database (Denmark)

    Larsen, Kim Guldstrand; Li, Shuhao; Nielsen, Brian

    2009-01-01

    We propose an approach to automatic verification of real-time systems against scenario-based requirements. A real-time system is modeled as a network of Timed Automata (TA), and a scenario-based requirement is specified as a Live Sequence Chart (LSC). We define a trace-based semantics for a kernel subset of the LSC language. By equivalently translating an LSC chart into an observer TA and then non-intrusively composing this observer with the original system model, the problem of verifying a real-time system against a scenario-based requirement reduces to a classical real-time model checking…

  13. From Operating-System Correctness to Pervasively Verified Applications

    Science.gov (United States)

    Daum, Matthias; Schirmer, Norbert W.; Schmidt, Mareike

    Though program verification is known and has been used for decades, the verification of a complete computer system still remains a grand challenge. Part of this challenge is the interaction of application programs with the operating system, which is usually entrusted with retrieving input data from and transferring output data to peripheral devices. In this scenario, the correct operation of the applications inherently relies on operating-system correctness. Based on the formal correctness of our real-time operating system Olos, this paper describes an approach to pervasively verify applications running on top of the operating system.

  14. Efficient logistic regression designs under an imperfect population identifier.

    Science.gov (United States)

    Albert, Paul S; Liu, Aiyi; Nansel, Tonja

    2014-03-01

    Motivated by actual study designs, this article considers efficient logistic regression designs where the population is identified with a binary test that is subject to diagnostic error. We consider the case where the imperfect test is obtained on all participants, while the gold standard test is measured on a small chosen subsample. Under maximum-likelihood estimation, we evaluate the optimal design in terms of sample selection as well as verification. We show that there may be substantial efficiency gains by choosing a small percentage of individuals who test negative on the imperfect test for inclusion in the sample (e.g., verifying 90% test-positive cases). We also show that a two-stage design may be a good practical alternative to a fixed design in some situations. Under optimal and nearly optimal designs, we compare maximum-likelihood and semi-parametric efficient estimators under correct and misspecified models with simulations. The methodology is illustrated with an analysis from a diabetes behavioral intervention trial. © 2013, The International Biometric Society.
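    To make the design question concrete, here is a small Monte Carlo sketch under assumed values (sensitivity 0.9, specificity 0.8, a unit log-odds covariate effect; none of these numbers come from the paper): everyone receives the imperfect test, all test-positives plus a fraction of test-negatives are verified with the gold standard, and the slope is estimated by inverse-probability-weighted maximum likelihood.

```python
# Monte Carlo sketch: all subjects get the imperfect test; the gold standard
# is verified for every test-positive plus a fraction f of test-negatives,
# and the logistic slope is estimated by inverse-probability-weighted ML.
# Sensitivity 0.9, specificity 0.8 and the covariate effect are assumptions.
import numpy as np

rng = np.random.default_rng(3)

def weighted_logit_slope(x, y, w, iters=25):
    """Newton-Raphson for weighted logistic regression with an intercept."""
    X = np.column_stack([np.ones_like(x), x])
    beta = np.zeros(2)
    for _ in range(iters):
        p = 1.0 / (1.0 + np.exp(-X @ beta))
        grad = X.T @ (w * (y - p))
        hess = (X * (w * p * (1 - p))[:, None]).T @ X
        beta += np.linalg.solve(hess, grad)
    return beta[1]

def one_trial(f_neg, n=2000):
    x = rng.standard_normal(n)
    y = (rng.random(n) < 1.0 / (1.0 + np.exp(1.0 - x))).astype(float)  # true status
    test = np.where(y == 1, rng.random(n) < 0.9, rng.random(n) < 0.2)  # imperfect test
    verified = test | (rng.random(n) < f_neg)          # all positives + f of the rest
    w = np.where(test, 1.0, 1.0 / f_neg)               # inverse selection probabilities
    return weighted_logit_slope(x[verified], y[verified], w[verified])

for f in (0.1, 0.5):
    slopes = [one_trial(f) for _ in range(200)]
    print(f"verify {f:.0%} of test-negatives: slope SD = {np.std(slopes):.3f}")
```

    Comparing the empirical standard deviations across verification fractions is the kind of efficiency comparison the paper's optimal-design calculation formalizes.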

  15. Building and Verifying a Predictive Model of Interruption Resumption

    Science.gov (United States)

    2012-03-01

    Extracted fragments of the report: "…the gardener to remember those plants (and whether they need to be removed), and so will not commit resources to remember that information." "…camera), the storyteller needed help much less often. This result suggests that when there is no one to help them remember the last thing they said…" Invited paper: Building and Verifying a Predictive Model of Interruption Resumption. Help from a robot, to allow a human storyteller to continue.

  16. 40 CFR 8.9 - Measures to assess and verify environmental impacts.

    Science.gov (United States)

    2010-07-01

    40 CFR 8.9 - Measures to assess and verify environmental impacts. Title 40, Protection of Environment; ENVIRONMENTAL PROTECTION AGENCY; GENERAL; ENVIRONMENTAL IMPACT ASSESSMENT OF NONGOVERNMENTAL ACTIVITIES IN ANTARCTICA; § 8.9 Measures to assess and verify environmental impacts. (a) The operator shall conduct appropriate monitoring of key environmental indicators as…

  17. Eddy-Current Testing of Welded Stainless Steel Storage Containers to Verify Integrity and Identity

    International Nuclear Information System (INIS)

    Tolk, Keith M.; Stoker, Gerald C.

    1999-01-01

    An eddy-current scanning system is being developed to allow the International Atomic Energy Agency (IAEA) to verify the integrity of nuclear material storage containers. Such a system is necessary to detect attempts to remove material from the containers in facilities where continuous surveillance of the containers is not practical. Initial tests have shown that the eddy-current system is also capable of verifying the identity of each container using the electromagnetic signature of its welds. The DOE-3013 containers proposed for use in some US facilities are made of an austenitic stainless steel alloy, which is nonmagnetic in its normal condition. When the material is cold worked by forming or by local stresses experienced in welding, it loses its austenitic grain structure and its magnetic permeability increases. This change in magnetic permeability can be measured using an eddy-current probe specifically designed for this purpose. Initial tests have shown that variations of magnetic permeability and material conductivity in and around welds can be detected, and form a pattern unique to the container. The changes in conductivity that are present around a mechanically inserted plug can also be detected. Further development of the system is currently underway to adapt the system to verifying the integrity and identity of sealable, tamper-indicating enclosures designed to prevent unauthorized access to measurement equipment used to verify international agreements

  18. An experiment designed to verify the general theory of relativity

    International Nuclear Information System (INIS)

    Surdin, Maurice

    1960-01-01

    A proposal for an experiment that uses the effect of gravitation on maser-type clocks placed on the ground at two different heights, designed to verify the general theory of relativity. Reprint of a paper published in Comptes rendus des séances de l'Académie des Sciences, t. 250, p. 299-301, sitting of 11 January 1960 [fr]

  19. Elements of a system for verifying a Comprehensive Test Ban

    International Nuclear Information System (INIS)

    Hannon, W.J.

    1987-01-01

    The paper discusses the goals of a monitoring system for a CTB, its functions, the challenges to verification, discrimination techniques, and some recent developments. It is concluded that technical, military and political efforts are required to establish and verify test ban treaties that will contribute to stability in the long term. It currently appears that there will be a significant number of unidentified events.

  20. Towards Verifying National CO2 Emissions

    Science.gov (United States)

    Fung, I. Y.; Wuerth, S. M.; Anderson, J. L.

    2017-12-01

    With the Paris Agreement, nations around the world have pledged their voluntary reductions in future CO2 emissions. Satellite observations of atmospheric CO2 have the potential to verify self-reported emission statistics around the globe. We present a carbon-weather data assimilation system, wherein raw weather observations together with satellite observations of the mixing ratio of column CO2 from the Orbiting Carbon Observatory-2 are assimilated every 6 hours into the NCAR carbon-climate model CAM5 coupled to the Ensemble Kalman Filter of DART. In an OSSE, we reduced the fossil fuel emissions from a country, and estimated the emissions innovations demanded by the atmospheric CO2 observations. The uncertainties in the innovation are analyzed with respect to the uncertainties in the meteorology to determine the significance of the result. The work follows from "On the use of incomplete historical data to infer the present state of the atmosphere" (Charney et al. 1969), which maps the path for continuous data assimilation for weather forecasting and the five decades of progress since.
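    The assimilation step at the heart of this approach can be sketched with a toy ensemble Kalman filter update: estimate a single national emission scale factor from one column-CO2 observation. Everything below (the linear stand-in for atmospheric transport, the error magnitudes) is an illustrative assumption, not the CAM5/DART configuration.

```python
# Toy ensemble Kalman filter update in the spirit of the described system:
# estimate a national fossil-fuel emission scale factor from one column-CO2
# observation. The linear 'transport model' and error sizes are invented;
# the real system assimilates raw weather plus OCO-2 data into CAM5/DART.
import numpy as np

rng = np.random.default_rng(1)
n_ens = 50
true_scale = 0.8                        # country actually emits 80% of reported
prior = rng.normal(1.0, 0.2, n_ens)     # ensemble of emission scale factors

def forward(scale):
    """Stand-in transport: emission scale -> column-CO2 enhancement (ppm)."""
    return 2.5 * scale

obs_err = 0.1
obs = forward(true_scale) + rng.normal(0.0, obs_err)    # satellite-like retrieval

hx = forward(prior)                                     # ensemble in obs space
gain = np.cov(prior, hx)[0, 1] / (np.var(hx, ddof=1) + obs_err**2)
posterior = prior + gain * (obs + rng.normal(0.0, obs_err, n_ens) - hx)

print(f"prior mean {prior.mean():.2f} -> posterior mean {posterior.mean():.2f} "
      f"(truth {true_scale})")
```

    The posterior shift away from the self-reported value (scale 1.0) toward the truth is the "emissions innovation" the OSSE measures against the meteorological uncertainty.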

  1. Isotope correlation techniques for verifying input accountability measurements at a reprocessing plant

    International Nuclear Information System (INIS)

    Umezawa, H.; Nakahara, Y.

    1983-01-01

    Isotope correlation techniques were studied to verify input accountability measurements at a reprocessing plant. On the basis of a historical data bank, correlation between plutonium-to-uranium ratio and isotopic variables was derived as a function of burnup. The burnup was determined from the isotopic ratios of uranium and plutonium, too. Data treatment was therefore made in an iterative manner. The isotopic variables were defined to cover a wide spectrum of isotopes of uranium and plutonium. The isotope correlation techniques evaluated important parameters such as the fuel burnup, the most probable ratio of plutonium to uranium, and the amounts of uranium and plutonium in reprocessing batches in connection with fresh fuel fabrication data. In addition, the most probable values of isotope abundance of plutonium and uranium could be estimated from the plutonium-to-uranium ratio determined, being compared with the reported data for verification. A pocket-computer-based system was developed to enable inspectors to collect and evaluate data in a timely fashion at the input accountability measurement point by the isotope correlation techniques. The device is supported by battery power and completely independent of the operator's system. The software of the system was written in BASIC. The data input can be stored in a cassette tape and transferred into a higher level computer. The correlations used for the analysis were given as a form of analytical function. Coefficients for the function were provided relevant to the type of reactor and the initial enrichment of fuel. (author)

  2. System for verifiable CT radiation dose optimization based on image quality. part II. process control system.

    Science.gov (United States)

    Larson, David B; Malarik, Remo J; Hall, Seth M; Podberesky, Daniel J

    2013-10-01

    To evaluate the effect of an automated computed tomography (CT) radiation dose optimization and process control system on the consistency of estimated image noise and size-specific dose estimates (SSDEs) of radiation in CT examinations of the chest, abdomen, and pelvis. This quality improvement project was determined not to constitute human subject research. An automated system was developed to analyze each examination immediately after completion, and to report individual axial-image-level and study-level summary data for patient size, image noise, and SSDE. The system acquired data for 4 months beginning October 1, 2011. Protocol changes were made by using parameters recommended by the prediction application, and 3 months of additional data were acquired. Preimplementation and postimplementation mean image noise and SSDE were compared by using unpaired t tests and F tests. Common-cause variation was differentiated from special-cause variation by using a statistical process control individual chart. A total of 817 CT examinations, 490 acquired before and 327 acquired after the initial protocol changes, were included in the study. Mean patient age and water-equivalent diameter were 12.0 years and 23.0 cm, respectively. The difference between actual and target noise increased from -1.4 to 0.3 HU, and the process control chart identified several special causes of variation. Implementation of an automated CT radiation dose optimization system led to a verifiable simultaneous decrease in image noise variation and SSDE. The automated nature of the system provides the opportunity for consistent CT radiation dose optimization on a broad scale. © RSNA, 2013.
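    The individuals (I) chart used to separate common-cause from special-cause variation is easy to reproduce. The sketch below uses the standard moving-range control limits (the 2.66 constant is 3/d2 for subgroups of two); the per-exam noise differences are fabricated for illustration, not the study's data.

```python
# The 'individuals' control chart used to flag special-cause variation:
# limits are center +/- 2.66 * (mean moving range), the standard I-chart
# constant (3/d2 for subgroups of two). The noise differences are made up.
import numpy as np

noise_diff = np.array([-1.5, -1.2, -1.6, -1.3, -1.4, 0.2, 0.4, 0.3, 3.1, 0.5])
center = noise_diff.mean()
mr_bar = np.abs(np.diff(noise_diff)).mean()
ucl, lcl = center + 2.66 * mr_bar, center - 2.66 * mr_bar

for i, x in enumerate(noise_diff):
    tag = "  <- special cause" if not (lcl <= x <= ucl) else ""
    print(f"exam {i:2d}: {x:+.1f} HU{tag}")
print(f"center {center:+.2f}, limits [{lcl:+.2f}, {ucl:+.2f}]")
```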

  3. Readiness evaluation report -- High-exposure rate hardware removal resumption of activities

    International Nuclear Information System (INIS)

    Volkman, C.L.

    1996-11-01

    In August 1996, N Basin Project Management proactively ceased activities in the N Basin after noting several radiological control anomalies during the performance of the high exposure rate hardware (HERH) removal activity. The HERH removal is one of several activities that will be accomplished to complete deactivation of N Basin. Three project critiques were performed to identify causes, and several corrective actions were identified. To ensure the true causes of the events were identified, the N Basin Project Manager requested that a root cause analysis be performed for the events covered by the three critiques. The intent was to identify recurring events and evaluate the effectiveness of corrective action implementation. These three review elements were used by the project to develop a corrective action plan (CAP), which consisted of both project-unique and programmatic items. The N Basin Project is using this BHI Readiness Evaluation (RE) process as a mechanism to independently verify that corrective actions identified in the CAP have been completed and that no changes were made during the stand-down that affect the resumption of the HERH activities. A readiness evaluation (RE) plan (Attachment 3) was prepared. The completed Readiness Evaluation Records, which document the results of the team member evaluations, are in Attachment 1. The independent readiness evaluation team identified five deficiencies, of which two are post-startup and three are pre-startup. All deficiencies are in the area of training. These findings are explained in detail in Attachment 2

  4. Evaluation of CT in identifying colorectal carcinoma in the frail and disabled patient

    International Nuclear Information System (INIS)

    Ng, C.S.; Dixon, A.K.; Doyle, T.C.; Courtney, H.M.; Bull, R.K.; Freeman, A.H.; Pinto, E.M.; Prevost, A.T.; Campbell, G.A.

    2002-01-01

    Frail and physically or mentally disabled patients frequently have difficulty in tolerating formal colonic investigations. The aims of this study were to evaluate the accuracy of minimal-preparation CT in identifying colorectal carcinoma in this population and to determine the clinical indications and radiological signs with the highest yield for tumour. The CT technique involved helical acquisition (10-mm collimation, 1.5 pitch) following 2 days of preparation with oral contrast medium only. The outcome of 4 years of experience was retrospectively reviewed. The gold standards were pathological and cancer registration records, together with colonoscopy and barium enema when undertaken, with a minimum of 15 months follow-up. One thousand seventy-seven CT studies in 1031 patients (median age 80 years) were evaluated. CT correctly identified 83 of the 98 colorectal carcinomas in this group but missed 15 cases; sensitivity and specificity (with 95% confidence interval) 85% (78-92%) and 91% (90-93%), respectively. Multivariate analysis identified: (a) a palpable abdominal mass and anaemia to be the strongest clinical indications, particularly in combination (p<0.0025); and (b) lesion width and blurring of the serosal margin of lesions to be associated with tumours (p<0.0001). Computed tomography has a valuable role in the investigation of frail and otherwise disabled patients with symptoms suspicious for a colonic neoplasm. Although interpretation can be difficult, the technique is able to exclude malignancy with good accuracy. (orig.)

  5. Verifying atom entanglement schemes by testing Bell's inequality

    International Nuclear Information System (INIS)

    Angelakis, D.G.; Knight, P.L.; Tregenna, B.; Munro, W.J.

    2001-01-01

    Recent experiments testing Bell's inequality with entangled photons and ions have aimed at tests of basic quantum mechanical principles. Interesting results have been obtained and many loopholes could be closed. In this paper we point out that tests of Bell's inequality also play an important role in verifying atom entanglement schemes. We describe as an example a scheme to prepare arbitrary entangled states of N two-level atoms using a leaky optical cavity and a scheme to entangle atoms inside a photonic crystal. During the state preparation no photons are emitted, and observing a violation of Bell's inequality is the only way to test whether a scheme works with high precision or not. (orig.)
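    For the two-atom case, the check reduces to estimating the CHSH combination of correlations, which no local realistic model can push beyond 2. A minimal numerical sketch, assuming ideal projective measurements in the z-x plane on a singlet state:

```python
# CHSH check on a two-qubit singlet: classically |S| <= 2, while quantum
# mechanics allows up to 2*sqrt(2) at the angle settings used below.
import numpy as np

def corr(a, b, state):
    """E(a,b): correlation of spin measurements along angles a and b (z-x plane)."""
    sz = np.array([[1.0, 0.0], [0.0, -1.0]])
    sx = np.array([[0.0, 1.0], [1.0, 0.0]])
    A = np.cos(a) * sz + np.sin(a) * sx
    B = np.cos(b) * sz + np.sin(b) * sx
    return float(state.conj() @ np.kron(A, B) @ state)

singlet = np.array([0.0, 1.0, -1.0, 0.0]) / np.sqrt(2.0)
a0, a1, b0, b1 = 0.0, np.pi / 2, np.pi / 4, -np.pi / 4
S = (corr(a0, b0, singlet) + corr(a0, b1, singlet)
     + corr(a1, b0, singlet) - corr(a1, b1, singlet))
print(f"|S| = {abs(S):.3f} (classical bound 2, quantum max {2 * np.sqrt(2):.3f})")
```

    Measuring |S| close to 2√2 on the prepared atoms certifies that the entangling scheme worked, which matters here precisely because no emitted photons exist to check otherwise.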

  6. Changing Climate, Challenging Choices: Identifying and Evaluating Climate Change Adaptation Options for Protected Areas Management in Ontario, Canada

    Science.gov (United States)

    Lemieux, Christopher J.; Scott, Daniel J.

    2011-10-01

    Climate change will pose increasingly significant challenges to managers of parks and other forms of protected areas around the world. Over the past two decades, numerous scientific publications have identified potential adaptations, but their suitability from legal, policy, financial, internal capacity, and other management perspectives has not been evaluated for any protected area agency or organization. In this study, a panel of protected area experts applied a Policy Delphi methodology to identify and evaluate climate change adaptation options across the primary management areas of a protected area agency in Canada. The panel identified and evaluated one hundred and sixty five (165) adaptation options for their perceived desirability and feasibility. While the results revealed a high level of agreement with respect to the desirability of adaptation options and a moderate level of capacity pertaining to policy formulation and management direction, a perception of low capacity for implementation in most other program areas was identified. A separate panel of senior park agency decision-makers used a multiple criterion decision-facilitation matrix to further evaluate the institutional feasibility of the 56 most desirable adaptation options identified by the initial expert panel and to prioritize them for consideration in a climate change action plan. Critically, only two of the 56 adaptation options evaluated by senior decision-makers were deemed definitely implementable, due largely to fiscal and internal capacity limitations. These challenges are common to protected area agencies in developed countries and pervade those in developing countries, revealing that limited adaptive capacity represents a substantive barrier to biodiversity conservation and other protected area management objectives in an era of rapid climate change.

  7. Regressive-transgressive cycle of the Devonian sea in Uruguay verified by palynology

    International Nuclear Information System (INIS)

    Da Silva, J.

    1990-01-01

    This work presents the results and conclusions of a study of palynomorph populations carried out in Devonian formations in central Uruguay. The existence of a regressive-transgressive cycle is verified by analyzing the vertical distribution of the palynomorphs; the presence of chitinozoans (Hoesphaeridium and Cyathochitina types) is also noted for the studied section

  8. Evaluation of Antigen-Conjugated Fluorescent Beads to Identify Antigen-Specific B Cells

    Directory of Open Access Journals (Sweden)

    Isabel Correa

    2018-03-01

    Full Text Available Selection of single antigen-specific B cells to identify their expressed antibodies is of considerable interest for evaluating human immune responses. Here, we present a method to identify single antibody-expressing cells using antigen-conjugated fluorescent beads. To establish this, we selected Folate Receptor alpha (FRα) as a model antigen and a mouse B cell line, expressing both the soluble and the membrane-bound forms of a human/mouse chimeric antibody (MOv18 IgG1) specific for FRα, as test antibody-expressing cells. Beads were conjugated to FRα using streptavidin/avidin-biotin bridges and used to select single cells expressing the membrane-bound form of anti-FRα. Bead-bound cells were single cell-sorted and processed for single cell RNA retrotranscription and PCR to isolate antibody heavy and light chain variable regions. Variable regions were then cloned and expressed as human IgG1/k antibodies. Like the original clone, engineered antibodies from single cells recognized native FRα. To evaluate whether antigen-coated beads could identify specific antibody-expressing cells in mixed immune cell populations, human peripheral blood mononuclear cells (PBMCs) were spiked with test antibody-expressing cells. Antigen-specific cells could comprise up to 75% of cells selected with antigen-conjugated beads when the frequency of the antigen-positive cells was 1:100 or higher. In PBMC pools, beads conjugated to recombinant antigens FRα and HER2 bound antigen-specific anti-FRα MOv18 and anti-HER2 Trastuzumab antibody-expressing cells, respectively. From melanoma patient-derived B cells selected with melanoma cell line-derived protein-coated fluorescent beads, we generated a monoclonal antibody that recognized melanoma antigen-coated beads. This approach may be further developed to facilitate analysis of B cells and their antibody profiles at the single cell level and to help unravel humoral immune repertoires.

  9. An approach for verifying biogenic greenhouse gas emissions inventories with atmospheric CO2 concentration data

    International Nuclear Information System (INIS)

    Ogle, Stephen M; Davis, Kenneth; Lauvaux, Thomas; Miles, Natasha L; Richardson, Scott; Schuh, Andrew; Cooley, Dan; Breidt, F Jay; West, Tristram O; Heath, Linda S; Smith, James E; McCarty, Jessica L; Gurney, Kevin R; Tans, Pieter; Denning, A Scott

    2015-01-01

    Verifying national greenhouse gas (GHG) emissions inventories is a critical step to ensure that reported emissions data to the United Nations Framework Convention on Climate Change (UNFCCC) are accurate and representative of a country's contribution to GHG concentrations in the atmosphere. Furthermore, verifying biogenic fluxes provides a check on estimated emissions associated with managing lands for carbon sequestration and other activities, which often have large uncertainties. We report here on the challenges and results associated with a case study using atmospheric measurements of CO2 concentrations and inverse modeling to verify nationally-reported biogenic CO2 emissions. The biogenic CO2 emissions inventory was compiled for the Mid-Continent region of the United States based on methods and data used by the US government for reporting to the UNFCCC, along with additional sources and sinks to produce a full carbon balance. The biogenic emissions inventory produced an estimated flux of −408 ± 136 Tg CO2 for the entire study region, which was not statistically different from the biogenic flux of −478 ± 146 Tg CO2 that was estimated using the atmospheric CO2 concentration data. At sub-regional scales, the spatial density of atmospheric observations did not appear sufficient to verify emissions in general. However, a difference between the inventory and inversion results was found in one isolated area of West-central Wisconsin. This part of the region is dominated by forestlands, suggesting that further investigation may be warranted into the forest C stock or harvested wood product data from this portion of the study area. The results suggest that observations of atmospheric CO2 concentration data and inverse modeling could be used to verify biogenic emissions, and provide more confidence in biogenic GHG emissions reporting to the UNFCCC. (letter)

  10. Verifying the functional ability of microstructured surfaces by model-based testing

    Science.gov (United States)

    Hartmann, Wito; Weckenmann, Albert

    2014-09-01

    Micro- and nanotechnology enables the use of new product features such as improved light absorption, self-cleaning or protection, which are based, on the one hand, on the size of the functional nanostructures and, on the other hand, on material-specific properties. With the need to reliably measure progressively smaller geometric features, coordinate and surface-measuring instruments have been refined and now allow high-resolution topography and structure measurements down to the sub-nanometre range. Nevertheless, in many cases it is not possible to make a clear statement about the functional ability of the workpiece or its topography, because conventional concepts of dimensioning and tolerancing are solely geometry oriented and standardized surface parameters are not sufficient to consider interaction with non-geometric parameters, which are dominant for functions such as sliding, wetting, sealing and optical reflection. To verify the functional ability of microstructured surfaces, a method was developed based on a parameterized mathematical-physical model of the function. From this model, function-related properties can be identified and geometric parameters can be derived, which may be different for the manufacturing and verification processes. With this method it is possible to optimize the definition of the shape of the workpiece regarding the intended function by applying theoretical and experimental knowledge, as well as modelling and simulation. Advantages of this approach will be discussed and demonstrated by the example of a microstructured inking roll.

  11. Use of instruments to evaluate leadership in nursing and health services.

    Science.gov (United States)

    Carrara, Gisleangela Lima Rodrigues; Bernardes, Andrea; Balsanelli, Alexandre Pazetto; Camelo, Silvia Helena Henriques; Gabriel, Carmen Silvia; Zanetti, Ariane Cristina Barboza

    2018-03-12

    To identify the available scientific evidence about the use of instruments for the evaluation of leadership in health and nursing services, and to verify the use of leadership styles/models/theories in the construction of these tools. Integrative literature review of studies indexed in the LILACS, PUBMED, CINAHL and EMBASE databases from 2006 to 2016. Thirty-eight articles were analyzed, covering 19 leadership evaluation tools; the most used were the Multifactor Leadership Questionnaire, the Global Transformational Leadership Scale, the Leadership Practices Inventory, the Servant Leadership Questionnaire, the Servant Leadership Survey and the Authentic Leadership Questionnaire. The literature search allowed the identification of the main theories/styles/models of contemporary leadership and an analysis of their use in the design of leadership evaluation tools, with the transformational, situational, servant and authentic leadership categories standing out as the most prominent. To a lesser extent, the quantum, charismatic and clinical leadership types were evidenced.

  12. Identifying Bitcoin users by transaction behavior

    Science.gov (United States)

    Monaco, John V.

    2015-05-01

    Digital currencies, such as Bitcoin, offer convenience and security to criminals operating in the black marketplace. Some Bitcoin marketplaces, such as Silk Road, even claim anonymity. This claim contradicts the findings in this work, where long-term transactional behavior is used to identify and verify account holders. Transaction timestamps and network properties observed over time contribute to this finding. The timestamp of each transaction is the result of many factors: the desire to purchase an item, daily schedule and activities, as well as hardware and network latency. Dynamic network properties of the transaction, such as coin flow and the number of edge outputs and inputs, contribute further to reveal account identity. In this paper, we propose a novel methodology for identifying and verifying Bitcoin users based on the observation of Bitcoin transactions over time. The behavior we attempt to quantify roughly occurs in the social band of Newell's time scale. A subset of the Blockchain (up to block 230686) is taken, selecting users that initiated between 100 and 1000 unique transactions per month for at least 6 different months. This dataset shows evidence of being nonrandom and nonlinear, thus a dynamical systems approach is taken. Classification and authentication accuracies are obtained under various representations of the monthly Bitcoin samples: outgoing transactions, as well as both outgoing and incoming transactions, are considered, along with the timing and dynamic network properties of transaction sequences. The most appropriate representations of monthly Bitcoin samples are proposed. Results show an inherent lack of anonymity by exploiting patterns in long-term transactional behavior.
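    A stripped-down version of the timing idea already conveys why timestamps leak identity. In the sketch below, the daily-rhythm model and nearest-neighbour matching are invented stand-ins, not the paper's dynamical-systems features:

```python
# Stripped-down timing-identification sketch: each month of activity becomes
# an hour-of-day histogram of transaction timestamps, and a later month is
# matched to the enrolled profiles by nearest neighbour. The daily-rhythm
# model is an invented stand-in for the paper's dynamical-systems features.
import numpy as np

rng = np.random.default_rng(2)
n_users, n_tx = 5, 300
rhythms = rng.dirichlet(np.ones(24) * 0.5, size=n_users)   # per-user daily rhythm

def month_profile(p):
    hours = rng.choice(24, size=n_tx, p=p)                 # one month of timestamps
    return np.bincount(hours, minlength=24) / n_tx

enrolled = [month_profile(rhythms[u]) for u in range(n_users)]  # reference month
true_user = 3
probe = month_profile(rhythms[true_user])                       # a later month
dists = [np.abs(probe - e).sum() for e in enrolled]             # L1 distance
print(f"identified user {int(np.argmin(dists))} (true user {true_user})")
```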

  13. How to Verify and Manage the Translational Plagiarism?

    Science.gov (United States)

    Wiwanitkit, Viroj

    2016-01-01

    The use of Google Translate as a tool for detecting translational plagiarism is a big challenge. As noted, plagiarism of original papers written in Macedonian and translated into other languages can be verified after computerised translation into those languages. Attempts to screen for translational plagiarism should be supported, and the Google Translate tool might be helpful. Special focus should be on any non-English reference that might be the source of plagiarised material, and on any non-English article that might be translated from an original English article, since neither can be detected by a simple plagiarism screening tool. It is a hard job for any journal to detect complex translational plagiarism, but the harder job might be how to effectively manage the case. PMID:27703588

  14. 13 CFR 127.403 - What happens if SBA verifies the concern's eligibility?

    Science.gov (United States)

    2010-01-01

    13 CFR 127.403 - What happens if SBA verifies the concern's eligibility? Title 13, Business Credit and Assistance; SMALL BUSINESS ADMINISTRATION; WOMEN-OWNED SMALL BUSINESS FEDERAL CONTRACT ASSISTANCE PROCEDURES; Eligibility Examinations; § 127…

  15. Techniques and methodologies to identify potential NORM-generating industries in the Republic of Angola and evaluate their impacts

    International Nuclear Information System (INIS)

    Diogo, José Manuel Sucumula

    2017-01-01

    Numerous steps have been taken worldwide to identify and quantify the radiological risks associated with the mining of ores containing Naturally Occurring Radioactive Material (NORM). These activities often result in unnecessary exposures to individuals and high environmental damage, with devastating consequences for the health of workers and damage to the economies of many countries, due to a lack of regulations or to inadequate regulations. For these and other reasons, the objective of this work was to identify industries with the potential to generate NORM in the Republic of Angola and to estimate their radiological environmental impacts. To achieve this objective, we studied the theoretical aspects and identified the main industrial activities internationally recognized as NORM-generating. The Brazilian experience with the regulatory aspect was taken as a reference for the criteria used to classify industries that generate NORM, the mining methods and their radiological environmental impacts, as well as the main techniques applied to evaluate the concentrations of radionuclides in a specific environmental matrix and/or a NORM sample. This approach allowed the elaboration of a NORM map for the main provinces of Angola, the establishment of evaluation criteria for implementing a Radiation Protection Plan in the extractive industry, the establishment of measures to control ionizing radiation in mining, and the identification and quantification of radionuclides present in oil sludge (lees) samples. However, in order to adequately assess the radiological environmental impact of a NORM industry, it is not enough to identify it; it is important to know the origin, quantify the radioactive material released as liquid and gaseous effluents, identify the main routes of exposure and examine how this material spreads into the environment until it reaches man. (author)

  16. [Diagnostic evaluation of the developmental level in children identified at risk of delay through the Child Development Evaluation Test].

    Science.gov (United States)

    Rizzoli-Córdoba, Antonio; Campos-Maldonado, Martha Carmen; Vélez-Andrade, Víctor Hugo; Delgado-Ginebra, Ismael; Baqueiro-Hernández, César Iván; Villasís-Keever, Miguel Ángel; Reyes-Morales, Hortensia; Ojeda-Lara, Lucía; Davis-Martínez, Erika Berenice; O'Shea-Cuevas, Gabriel; Aceves-Villagrán, Daniel; Carrasco-Mendoza, Joaquín; Villagrán-Muñoz, Víctor Manuel; Halley-Castillo, Elizabeth; Sidonio-Aguayo, Beatriz; Palma-Tavera, Josuha Alexander; Muñoz-Hernández, Onofre

    The Child Development Evaluation (CDE) Test was developed in Mexico as a screening tool for child developmental problems. It yields three possible results: normal, slow development or risk of delay. The modified version was elaborated using the information obtained during the validation study, but its properties in the base population are not known. The objective of this work was to establish diagnostic confirmation of developmental delay in children 16 to 59 months of age previously identified as at risk of delay through the CDE Test in primary care facilities. A population-based cross-sectional study was conducted in one Mexican state. The CDE Test was administered to 11,455 children 16 to 59 months of age from December 2013 to March 2014. The eligible population represented the 6.2% of children (n=714) who were identified as at risk of delay through the CDE Test. For inclusion in the study, block randomization stratified by sex and age group was performed. Each participant included in the study had a diagnostic evaluation using the Battelle Developmental Inventory, 2nd edition. Of the 355 participants included with risk of delay, 65.9% were male and 80.2% were from rural areas; 6.5% were false positives (Total Development Quotient >90) and 6.8% did not have any domain with delay (Domain Developmental Quotient <80). The proportion of delay for each domain was as follows: communication 82.5%; cognitive 80.8%; social-personal 33.8%; motor 55.5%; and adaptive 41.7%. There were significant differences in the percentages of delay both by age and by domain/subdomain evaluated. In 93.2% of the participants, developmental delay was corroborated in at least one domain evaluated. Copyright © 2015 Hospital Infantil de México Federico Gómez. Publicado por Masson Doyma México S.A. All rights reserved.

  17. A smart growth evaluation model based on data envelopment analysis

    Science.gov (United States)

    Zhang, Xiaokun; Guan, Yongyi

    2018-04-01

    With the rapid spread of urbanization, smart growth (SG) has attracted plenty of attention from all over the world. In this paper, after establishing an index system for smart growth, a data envelopment analysis (DEA) model is suggested to evaluate the SG level of a city's current growth situation. In order to further improve the information on both radial and non-radial detection, we introduce the non-Archimedean infinitesimal to form the C2GS2 control model. Finally, we evaluate the SG level in Canberra and identify a series of problems, which verifies the applicability of the model and provides additional information for improvement.
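    For readers unfamiliar with DEA, the basic input-oriented CCR model behind such an evaluation is a small linear program per city (decision-making unit). The sketch below solves it with scipy; the input/output data are invented stand-ins, not the paper's smart-growth indicators, and the C2GS2 refinement with the non-Archimedean infinitesimal is omitted.

```python
# Input-oriented CCR DEA as one linear program per decision-making unit:
# minimize theta subject to X@lam <= theta * x_o and Y@lam >= y_o, lam >= 0.
# The city inputs/outputs below are illustrative, not the paper's indicators.
import numpy as np
from scipy.optimize import linprog

X = np.array([[4.0, 6.0, 5.0],      # inputs: rows = input kind, cols = cities
              [3.0, 2.0, 4.0]])
Y = np.array([[60.0, 70.0, 80.0]])  # outputs: rows = output kind

def ccr_efficiency(o):
    m, n = X.shape
    s = Y.shape[0]
    c = np.r_[1.0, np.zeros(n)]                # variables z = [theta, lam_1..lam_n]
    A_in = np.hstack([-X[:, [o]], X])          # X@lam - theta*x_o <= 0
    A_out = np.hstack([np.zeros((s, 1)), -Y])  # -Y@lam <= -y_o
    A_ub = np.vstack([A_in, A_out])
    b_ub = np.r_[np.zeros(m), -Y[:, o]]
    bounds = [(None, None)] + [(0.0, None)] * n
    return linprog(c, A_ub=A_ub, b_ub=b_ub, bounds=bounds).x[0]

for o in range(X.shape[1]):
    print(f"city {o}: efficiency {ccr_efficiency(o):.3f}")
```

    An efficiency of 1 marks a city on the best-practice frontier; lower values quantify how far its inputs could shrink while still producing its outputs.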

  18. Waste Handling Equipment Development Test and Evaluation Study

    International Nuclear Information System (INIS)

    R.L. Tome

    1998-01-01

    The purpose of this study is to identify candidate Monitored Geologic Repository (MGR) surface waste handling equipment for development testing. This study will also identify strategies for performing the development tests. Development testing shall be implemented to support detail design and reduce design risks. Development testing shall be conducted to confirm design concepts, evaluate alternative design concepts, show the availability of needed technology, and provide design documentation. The candidate equipment will be selected from MGR surface waste handling equipment that is the responsibility of the Management and Operating Contractor (M and O) Surface Design Department. The equipment identified in this study is based on Viability Assessment (VA) design. The "Monitored Geologic Repository Test and Evaluation Plan" (MGR T and EP), Reference 5.1, was used as a basis for this study. The MGR T and EP reflects the extent of test planning and analysis that can be conducted, given the current status of the MGR requirements and latest VA design information. The MGR T and EP supports the appropriate sections in the license application (LA) in accordance with 10 CFR 60.21(c)(14). The MGR T and EP describes the following test activities: site characterization to confirm, by test and analysis, the suitability of the Yucca Mountain site for housing a geologic repository; development testing to investigate and document design concepts to reduce risk; qualification testing to verify equipment compliance with design requirements, specifications, and regulatory requirements; system testing to validate compliance with MGR requirements, which include the receipt, handling, retrieval, and disposal of waste; periodic performance testing to verify preclosure requirements and to demonstrate safe and reliable MGR operation; and performance confirmation modeling, testing, and analysis to verify adherence to postclosure regulatory requirements. Development test activities can be

  1. A Secure and Verifiable Outsourced Access Control Scheme in Fog-Cloud Computing.

    Science.gov (United States)

    Fan, Kai; Wang, Junxiong; Wang, Xin; Li, Hui; Yang, Yintang

    2017-07-24

    With the rapid development of big data and the Internet of Things (IoT), the number of networking devices and the data volume are increasing dramatically. Fog computing, which extends cloud computing to the edge of the network, can effectively solve the bottleneck problems of data transmission and data storage. However, security and privacy challenges are also arising in the fog-cloud computing environment. Ciphertext-policy attribute-based encryption (CP-ABE) can be adopted to realize data access control in fog-cloud computing systems. In this paper, we propose a verifiable outsourced multi-authority access control scheme, named VO-MAACS. In our construction, most encryption and decryption computations are outsourced to fog devices and the computation results can be verified by using our verification method. Meanwhile, to address the revocation issue, we design an efficient user and attribute revocation method for it. Finally, analysis and simulation results show that our scheme is both secure and highly efficient.

  2. The Comprehensive Evaluation Method of Supervision Risk in Electricity Transaction Based on Unascertained Rational Number

    Science.gov (United States)

    Haining, Wang; Lei, Wang; Qian, Zhang; Zongqiang, Zheng; Hongyu, Zhou; Chuncheng, Gao

    2018-03-01

    To address the uncertainty problems in the comprehensive evaluation of supervision risk in electricity transactions, this paper uses unascertained rational numbers to evaluate the supervision risk, obtaining the possible evaluation results with their corresponding credibilities and realizing the quantification of the risk indexes. The model yields the risk degree of the various indexes, which makes it easier for electricity transaction supervisors to identify transaction risk and determine the risk level, assisting decision-making and realizing effective supervision of the risk. The results of the case analysis verify the effectiveness of the model.

  3. Using Integer Clocks to Verify the Timing-Sync Sensor Network Protocol

    Science.gov (United States)

    Huang, Xiaowan; Singh, Anu; Smolka, Scott A.

    2010-01-01

    We use the UPPAAL model checker for Timed Automata to verify the Timing-Sync time-synchronization protocol for sensor networks (TPSN). The TPSN protocol seeks to provide network-wide synchronization of the distributed clocks in a sensor network. Clock-synchronization algorithms for sensor networks such as TPSN must be able to perform arithmetic on clock values to calculate clock drift and network propagation delays. They must be able to read the value of a local clock and assign it to another local clock. Such operations are not directly supported by the theory of Timed Automata. To overcome this formal-modeling obstacle, we augment the UPPAAL specification language with the integer clock derived type. Integer clocks, which are essentially integer variables that are periodically incremented by a global pulse generator, greatly facilitate the encoding of the operations required to synchronize clocks as in the TPSN protocol. With this integer-clock-based model of TPSN in hand, we use UPPAAL to verify that the protocol achieves network-wide time synchronization and is devoid of deadlock. We also use the UPPAAL Tracer tool to illustrate how integer clocks can be used to capture clock drift and resynchronization during protocol execution
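    The integer-clock trick is easy to mimic outside UPPAAL. In the sketch below (a toy simulation with made-up tick counts, initial skew, and a symmetric propagation delay, not the verified UPPAAL model), two integer clocks are ticked by a global pulse generator, and a TPSN-style two-way exchange computes and applies the offset, exactly the read/assign arithmetic that plain TA clocks do not support:

```python
# Toy rendition of the integer-clock idea: plain integers ticked by a global
# pulse generator can be read, subtracted and assigned, the operations TPSN
# needs but native Timed Automata clocks do not directly offer. Tick counts,
# the initial skew and the symmetric delay below are invented for illustration.
class IntegerClock:
    def __init__(self, start):
        self.value = start
    def tick(self):                       # driven by the global pulse generator
        self.value += 1

parent, child = IntegerClock(0), IntegerClock(37)   # child starts 37 ticks ahead

def advance(ticks):
    for _ in range(ticks):
        parent.tick()
        child.tick()

delay = 3                 # symmetric propagation delay, in pulses
advance(100)
t1 = child.value          # child sends sync_pulse
advance(delay)
t2 = parent.value         # parent receives it
t3 = parent.value         # ...and replies immediately
advance(delay)
t4 = child.value          # child receives the acknowledgement
offset = ((t2 - t1) - (t4 - t3)) // 2               # TPSN two-way offset estimate
child.value += offset                               # clock assignment
print(f"estimated offset {offset}; child {child.value}, parent {parent.value}")
```

    With a skew of +37 and symmetric delays, the exchange recovers an offset of -37 and the assignment brings the child's reading into line with the parent's.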

  4. Organic food processing: a framework for concept, starting definitions and evaluation.

    Science.gov (United States)

    Kahl, Johannes; Alborzi, Farnaz; Beck, Alexander; Bügel, Susanne; Busscher, Nicolaas; Geier, Uwe; Matt, Darja; Meischner, Tabea; Paoletti, Flavio; Pehme, Sirli; Ploeger, Angelika; Rembiałkowska, Ewa; Schmid, Otto; Strassner, Carola; Taupier-Letage, Bruno; Załęcka, Aneta

    2014-10-01

    In 2007, EU Regulation (EC) 834/2007 introduced principles and criteria for organic food processing. These regulations have been analysed and discussed in several scientific publications and research project reports. Recently, organic food quality was described by principles, aspects and criteria. These principles from organic agriculture were verified and adapted for organic food processing, and different levels for evaluation were suggested. In another document, the underlying paradigms and consumer perception of organic food were reviewed against functional food, identifying integral product identity as the underlying paradigm and a holistic quality view connected to naturalness as consumers' perception of organic food quality. In a European study, the quality concept was applied to the organic food chain, revealing a problem: clear principles and related criteria for evaluating processing methods were missing. The goal of this paper is therefore to describe and discuss the topic of organic food processing to make it operational. A conceptual background for organic food processing is given by verifying the underlying paradigms and principles of organic farming and organic food, as well as of organic processing. The proposed definition connects organic processing to related systems such as minimal, sustainable and careful, gentle processing, and describes clear principles and related criteria. Based on food examples, such as milk with different heat treatments, the concept and definitions were verified. Organic processing can be defined by clear paradigms and principles and evaluated according to criteria from a multidimensional approach. Further work has to be done on developing indicators and parameters for the assessment of organic food quality. © 2013 Society of Chemical Industry.

  5. Evaluation of puberty by verifying spontaneous and stimulated gonadotropin values in girls.

    Science.gov (United States)

    Chin, Vivian L; Cai, Ziyong; Lam, Leslie; Shah, Bina; Zhou, Ping

    2015-03-01

    Changes in pharmacological agents and advancements in laboratory assays have changed the gonadotropin-releasing hormone analog stimulation test. To determine the best predictive model for detecting puberty in girls. Thirty-five girls, aged 2 years 7 months to 9 years 3 months, with central precocious puberty (CPP) (n=20) or premature thelarche/premature adrenarche (n=15). Diagnoses were based on clinical information, baseline hormones, bone age, and pelvic sonogram. Gonadotropins and E2 were analyzed using immunochemiluminometric assay. Logistic regression for CPP was performed. The best predictor of CPP is the E2-change model based on 3- to 24-h values, providing 80% sensitivity and 87% specificity. Three-hour luteinizing hormone (LH) provided 75% sensitivity and 87% specificity. Basal LH lowered sensitivity to 65% and specificity to 53%. The E2-change model provided the best predictive power; however, 3-h LH was more practical and convenient when evaluating puberty in girls.

  6. Verifying Galileo's discoveries: telescope-making at the Collegio Romano

    Science.gov (United States)

    Reeves, Eileen; van Helden, Albert

    The Jesuits of the Collegio Romano in Rome, especially the mathematicians Clavius and Grienberger, were very interested in Galilei's discoveries. After they had failed to observe the celestial phenomena with telescopes of their own construction, they expressed serious doubts. But from November 1610 onward, after they had built a better telescope and had in addition obtained another one from Venice, and could verify Galilei's observations, they accepted them completely. Clavius, who adhered to the Ptolemaic system until his death in 1612, even pointed out these facts in his last edition of Sacrobosco's Sphaera. He, as well as his confreres, however, avoided any conclusions with respect to the planetary system.

  8. ASTUS system for verifying the transport seal TITUS 1

    International Nuclear Information System (INIS)

    Barillaux; Monteil, D.; Destain, G.D.

    1991-01-01

    ASTUS, a system for acquiring and processing the ultrasonic signatures of TITUS 1 seals, has been developed. TITUS seals are used to verify that containers of fissile material have remained sealed during transport. An autonomous portable reading case makes it possible to record seal signatures at the point of departure and to transmit these reference signatures to a central safeguards computer over a telephone modem. At the destination, an authority uses a similar reading case to record the seal signatures again and immediately transmits them to the central safeguards computer. The central computer processes the data in real time by autocorrelation and returns its verdict to the destination.

  9. Test, measurement and evaluation with the mine boot test and evaluation system

    CSIR Research Space (South Africa)

    Ramaloko, PM

    2014-09-01

    Full Text Available Protective footwear that mitigates the shock transferred to the victim’s leg during an antipersonnel landmine blast needs to be evaluated to verify its protection level. The Mine Boot Test and Evaluation System, which includes a surrogate lower leg...

  10. Election Verifiability: Cryptographic Definitions and an Analysis of Helios and JCJ

    Science.gov (United States)

    2015-08-06

    [26] David Chaum. Untraceable electronic mail, return addresses, and digital pseudonyms. Communications of the ACM, 24(2):84–88, 1981. [27] David Chaum. Secret-ballot receipts: True voter-verifiable elections. IEEE Security and Privacy, 2(1):38–47, 2004. [28] David Chaum, Richard Carback, Jeremy Clark, Aleksander Essex, Stefan Popoveniuc, Ronald L. Rivest, Peter Y. A. Ryan, Emily Shen, and Alan T. Sherman...

  11. A framework for verifying the dismantlement and abandonment of nuclear weapons. A policy implication for the denuclearization of Korea Peninsula

    International Nuclear Information System (INIS)

    Ichimasa, Sukeyuki

    2011-01-01

    Denuclearization of the Korean Peninsula has been a serious security issue in the Northeast Asian region. Although the Six-Party Talks have been suspended since North Korea declared a boycott in 2008, the aim of denuclearizing North Korea is still under discussion. For instance, the recent Japan-U.S. '2+2' dialogue affirmed the importance of achieving the complete and verifiable denuclearization of North Korea, including scrutinizing its uranium enrichment program, through irreversible steps under the Six-Party process. In order to identify an effective and efficient framework for the denuclearization of North Korea, this paper examines five major denuclearization methods: (1) the Nunn-Lugar method, (2) the Iraqi method, (3) the South African method, (4) the Libyan method and (5) the denuclearization method set out in the Nuclear Weapons Convention (NWC), while referring to recent developments in verification studies for nuclear disarmament, such as the joint research conducted by the United Kingdom and Norway, and other arguments made by disarmament experts. Moreover, this paper considers what political and security conditions would be required for North Korea to accept intrusive verification of its denuclearization. Conditions for successful denuclearization talks among the Six-Party member states and a realistic approach to verifiable denuclearization are also examined. (author)

  12. Identifying and evaluating electronic learning resources for use in adult-gerontology nurse practitioner education.

    Science.gov (United States)

    Thompson, Hilaire J; Belza, Basia; Baker, Margaret; Christianson, Phyllis; Doorenbos, Ardith; Nguyen, Huong

    2014-01-01

    Enhancing existing curricula to meet newly published adult-gerontology advanced practice registered nurse (APRN) competencies in an efficient manner presents a challenge to nurse educators. Incorporating shared, published electronic learning resources (ELRs) in existing or new courses may be appropriate in order to assist students in achieving competencies. The purposes of this project were to (a) identify relevant available ELRs for use in enhancing geriatric APRN education and (b) evaluate the educational utility of identified ELRs based on established criteria. A multilevel search strategy was used. Two independent team members reviewed identified ELRs against established criteria to ensure utility. Only resources meeting all criteria were retained. Resources were found for each of the competency areas and included formats such as podcasts, Web casts, case studies, and teaching videos. In many cases, resources were identified using supplemental strategies and not through traditional search or search of existing geriatric repositories. Resources identified have been useful to advanced practice educators in improving lecture and seminar content in particular topic areas and in providing students and preceptors with additional self-learning resources. Addressing sustainability within geriatric APRN education is critical for sharing best practices among educators and for the sustainability of teaching and related resources. © 2014.

  13. Psychological and social aspects verified after the Goiania radioactive accident

    International Nuclear Information System (INIS)

    Helou, Suzana

    1995-01-01

    Psychological and social aspects observed after the radioactive accident that occurred in 1987 in Goiania, a Brazilian city, are discussed. To this end, a public opinion survey was conducted in order to capture the residual psychological effects of the Goiania radioactive accident, consolidating data obtained from 1,126 interviews. Four groups with different levels of involvement in the accident are compared with regard to the event. The survey led to the conclusion that the accident psychologically affected, to some degree, the entire population of Goiania. In addition, the survey allowed an analysis of the quality standard of the professionals' performance in handling the accident.

  14. Spin temperature concept verified by optical magnetometry of nuclear spins

    Science.gov (United States)

    Vladimirova, M.; Cronenberger, S.; Scalbert, D.; Ryzhov, I. I.; Zapasskii, V. S.; Kozlov, G. G.; Lemaître, A.; Kavokin, K. V.

    2018-01-01

    We develop a method of nonperturbative optical control over adiabatic remagnetization of the nuclear spin system and apply it to verify the spin temperature concept in GaAs microcavities. The nuclear spin system is shown to exactly follow the predictions of the spin temperature theory, despite the quadrupole interaction that was earlier reported to disrupt nuclear spin thermalization. These findings open a way for the deep cooling of nuclear spins in semiconductor structures, with the prospect of realizing nuclear spin-ordered states for high-fidelity spin-photon interfaces.

  15. Independent technique of verifying high-dose rate (HDR) brachytherapy treatment plans

    International Nuclear Information System (INIS)

    Saw, Cheng B.; Korb, Leroy J.; Darnell, Brenda; Krishna, K. V.; Ulewicz, Dennis

    1998-01-01

    Purpose: An independent technique for verifying high-dose rate (HDR) brachytherapy treatment plans has been formulated and validated clinically. Methods and Materials: In HDR brachytherapy, dwell times at respective dwell positions are computed using an optimization algorithm in an HDR treatment-planning system to deliver a specified dose to many target points simultaneously. Because of the variability of dwell times, concerns have been expressed regarding the ability of the algorithm to compute the correct dose. To address this concern, a commercially available low-dose rate (LDR) algorithm was used to compute the doses at defined distances, based on the dwell times obtained from the HDR treatment plans. The percent deviation between doses computed using the HDR and LDR algorithms was reviewed for HDR procedures performed over the last year. Results: In this retrospective study, the difference between computed doses using the HDR and LDR algorithms was found to be within 5% for about 80% of the HDR procedures. All of the reviewed procedures had dose differences of less than 10%. Conclusion: An independent technique for verifying HDR brachytherapy treatment plans has been validated based on clinical data. Provided both systems are available, this technique is universal in its applications and not limited to a particular implant applicator, implant site, or implant type.
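
    As an illustration of the independent-check idea, the sketch below recomputes a point dose from planned dwell times with a bare inverse-square point-source kernel and flags the percent deviation. The kernel, the nominal dose-rate constant, and all function names are illustrative stand-ins, not the paper's algorithm or a clinical TG-43 calculation.

    ```python
    import numpy as np

    # Recompute the dose at a reference point as the sum over dwell positions of
    # dwell time x dose rate at that distance (simplified point-source kernel).
    def recomputed_dose(dwell_times_s, distances_cm, dose_rate_at_1cm=4.0e-2):
        rates = dose_rate_at_1cm / np.asarray(distances_cm) ** 2  # cGy/s at each distance
        return float(np.sum(np.asarray(dwell_times_s) * rates))

    def percent_deviation(planned, independent):
        return 100.0 * abs(planned - independent) / planned

    dose = recomputed_dose(dwell_times_s=[12.0, 20.0, 8.0], distances_cm=[1.0, 1.5, 2.0])
    dev = percent_deviation(planned=1.05 * dose, independent=dose)
    print(f"{dev:.1f}% deviation")  # a deviation beyond ~10% would trigger manual review
    ```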

  16. Middle-aged patients with an MRI-verified medial meniscal tear report symptoms commonly associated with knee osteoarthritis

    DEFF Research Database (Denmark)

    Hare, Kristoffer B.; Stefan Lohmander, L.; Kise, Nina Jullum

    2017-01-01

    Background and purpose — No consensus exists on when to perform arthroscopic partial meniscectomy in patients with a degenerative meniscal tear. Since MRI and clinical tests are not accurate in detecting a symptomatic meniscal lesion, the patient’s symptoms often play a large role when deciding when to perform surgery. We determined the prevalence and severity of self-reported knee symptoms in patients eligible for arthroscopic partial meniscectomy due to a degenerative meniscal tear. We investigated whether symptoms commonly considered to be related to meniscus injury were associated with early radiographic signs of knee osteoarthritis. Patients and methods — We included individual baseline items from the Knee injury and Osteoarthritis Outcome Score collected in 2 randomized controlled trials evaluating treatment for an MRI-verified degenerative medial meniscal tear in 199 patients aged...

  17. Evaluation of potential regulatory elements identified as DNase I hypersensitive sites in the CFTR gene

    DEFF Research Database (Denmark)

    Phylactides, M.; Rowntree, R.; Nuthall, H.

    2002-01-01

    ...hypersensitive sites (DHS) within the locus. We previously identified at least 12 clusters of DHS across the CFTR gene and here further evaluate DHS in introns 2, 3, 10, 16, 17a, 18, 20 and 21 to assess their functional importance in the regulation of CFTR gene expression. Transient transfections of enhancer/reporter...

  18. Technical evaluation of methods for identifying chemotherapy-induced febrile neutropenia in healthcare claims databases

    OpenAIRE

    Weycker Derek; Sofrygin Oleg; Seefeld Kim; Deeter Robert G; Legg Jason; Edelsberg John

    2013-01-01

    Abstract Background Healthcare claims databases have been used in several studies to characterize the risk and burden of chemotherapy-induced febrile neutropenia (FN) and the effectiveness of colony-stimulating factors against FN. The accuracy of methods previously used to identify FN in such databases has not been formally evaluated. Methods Data comprised linked electronic medical records from Geisinger Health System and healthcare claims data from Geisinger Health Plan. Subjects were classified...

  19. Evaluation and technologic improvement of an enhanced imaging system

    International Nuclear Information System (INIS)

    Henry, D.

    1990-08-01

    Feature-based systems that combine imaging and signal analysis capabilities may be useful for nondestructive evaluation (NDE) of plant components. This report describes the metallurgical evaluation conducted to verify the performance of a feature-based system in discriminating intergranular stress corrosion cracking (IGSCC) from benign geometrical reflectors. The ultrasonic examination results were also evaluated by examination personnel trained in IGSCC detection techniques. The welds were examined prior to their removal from the recirculation and Residual-Heat-Removal (RHR) piping systems of the Peach Bottom Atomic Power Plant, as described in the Phase 2 Interim Report issued in June 1989. In this phase of the program, a metallurgical evaluation was performed on piping system welds that were examined ultrasonically using a feature-based system for analysis. The feature-based system correctly identified cracks, but incorrectly identified other features, e.g., root geometry and metallurgical interfaces, as cracks. While the results of the analysis by the feature-based system were not identical to the results of analysis by trained personnel, the overall performance of the feature-based system was comparable to that of the trained personnel. Based on the results of this program, the feature-based system may be useful as a supplementary method of identifying IGSCC indications. When used in conjunction with existing methods and techniques, it could improve the accuracy of IGSCC identification.

  20. A method of verifying period signals based on a data acquisition card

    International Nuclear Information System (INIS)

    Zeng Shaoli

    2005-01-01

    This paper introduces a method to verify the index voltage of a period signal generator using a data acquisition card; the method's error is less than 0.5%. A corresponding Win32 program has been developed for the Windows platform; it uses a purpose-built VxD driver for direct I/O control of the data acquisition card and multithreading to achieve the best time-scale precision. The program collects index voltage data in real time and automatically measures the period. (authors)

  1. Using Concept Space to Verify Hyponymy in Building a Hyponymy Lexicon

    Science.gov (United States)

    Liu, Lei; Zhang, Sen; Diao, Lu Hong; Yan, Shu Ying; Cao, Cun Gen

    Verification of hyponymy relations is a basic problem in knowledge acquisition. We present a method of hyponymy verification based on concept space. First, we define the concept space of a group of candidate hyponymy relations. Second, we analyze the concept space and define a set of hyponymy features based on the space structure. We then use these features to verify the candidate hyponymy relations. Experimental results show that the method provides adequate verification of hyponymy.

  2. Flux wire measurements in Cavalier for verifying computer code applications

    International Nuclear Information System (INIS)

    Fehr, M.; Stubbs, J.; Hosticka, B.

    1988-01-01

    The Cavalier and UVAR research reactors are to be converted from high-enrichment uranium (HEU) to low-enrichment uranium (LEU) fuel. As a first step, an extensive set of gold wire activation measurements has been taken on the Cavalier reactor. Axial traverses show internal consistency to the order of ±5%, while horizontal traverses show somewhat larger deviations. The activation measurements will be converted to flux measurements via the Thermos code and will then be used to verify the Leopard-2DB codes. The codes will ultimately be used to design an upgraded LEU core for the UVAR

  3. A detailed and verified wind resource atlas for Denmark

    Energy Technology Data Exchange (ETDEWEB)

    Mortensen, N G; Landberg, L; Rathmann, O; Nielsen, M N [Risoe National Lab., Roskilde (Denmark); Nielsen, P [Energy and Environmental Data, Aalberg (Denmark)

    1999-03-01

    A detailed and reliable wind resource atlas covering the entire land area of Denmark has been established. Key words of the methodology are wind atlas analysis, interpolation of wind atlas data sets, automated generation of digital terrain descriptions and modelling of local wind climates. The atlas contains wind speed and direction distributions, as well as mean energy densities of the wind, for 12 sectors and four heights above ground level: 25, 45, 70 and 100 m. The spatial resolution is 200 meters in the horizontal. The atlas has been verified by comparison with actual wind turbine power productions from over 1200 turbines. More than 80% of these turbines were predicted to within 10%. The atlas will become available on CD-ROM and on the Internet. (au)

  4. Analyser Framework to Verify Software Components

    Directory of Open Access Journals (Sweden)

    Rolf Andreas Rasenack

    2009-01-01

    Full Text Available Today, it is important for software companies to build software systems in a short time interval, to reduce costs and to secure a good market position. Well-organized and systematic development approaches are therefore required. Reusing software components that are well tested can be a good way to develop software applications effectively: reuse of software components is less expensive and less time-consuming than development from scratch. But it is dangerous to assume that software components can be combined without any problems. The components themselves are well tested, of course, but when they are composed, problems can still occur, most of them rooted in interaction and communication. To avoid such errors, a framework has to be developed for analysing software components; this framework determines the compatibility of the corresponding software components. The promising approach discussed here presents a novel technique for analysing software components by applying an Abstract Syntax Language Tree (ASLT). A supportive environment is designed that checks the compatibility of black-box software components. This article is concerned with the question of how coupled software components can be verified using an analyser framework, and it describes the use of the ASLT. Black-box software components and the Abstract Syntax Language Tree are the basis for the proposed framework and are discussed here to provide the background knowledge. The practical implementation of this framework is discussed, and results are shown using a test environment.

  5. The International Criticality Safety Benchmark Evaluation Project (ICSBEP)

    International Nuclear Information System (INIS)

    Briggs, J.B.

    2003-01-01

    The International Criticality Safety Benchmark Evaluation Project (ICSBEP) was initiated in 1992 by the United States Department of Energy. The ICSBEP became an official activity of the Organisation for Economic Cooperation and Development (OECD) - Nuclear Energy Agency (NEA) in 1995. Representatives from the United States, United Kingdom, France, Japan, the Russian Federation, Hungary, Republic of Korea, Slovenia, Yugoslavia, Kazakhstan, Israel, Spain, and Brazil are now participating. The purpose of the ICSBEP is to identify, evaluate, verify, and formally document a comprehensive and internationally peer-reviewed set of criticality safety benchmark data. The work of the ICSBEP is published as an OECD handbook entitled 'International Handbook of Evaluated Criticality Safety Benchmark Experiments.' The 2003 Edition of the Handbook contains benchmark model specifications for 3070 critical or subcritical configurations that are intended for validating computer codes that calculate effective neutron multiplication and for testing basic nuclear data. (author)

  6. Identifiability and Identification of Trace Continuous Pollutant Source

    Directory of Open Access Journals (Sweden)

    Hongquan Qu

    2014-01-01

    Full Text Available Accidental pollution events often threaten people’s health and lives, so identifying the pollutant source promptly is necessary for taking remedial actions. In this paper, a trace continuous pollutant source identification method is developed to identify a sudden continuous emission pollutant source in an enclosed space. A location probability model is set up first, and the identification method is then realized by searching for the global optimum of the location probability objective. To assess the identifiability of the presented method, the concept of a synergy degree of velocity fields is introduced in order to quantitatively analyze the impact of the velocity field on identification performance. Based on this concept, several simulation cases were conducted, and the application conditions of the method were obtained from these simulation studies. To verify the presented method, we designed an experiment and identified an unknown source appearing in the experimental space. The results showed that the method can identify a sudden trace continuous source when the studied situation satisfies the application conditions.
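
    Conceptually, the identification step is a global search over candidate source locations for the maximum of the location probability. The sketch below shows only that search structure; the inverse-distance forward model and all names and values are assumed for illustration and differ from the model used in the study.

    ```python
    import numpy as np

    # Toy stand-in for the location-probability model: score each candidate
    # source cell by how well an assumed inverse-distance dilution kernel
    # reproduces the sensor readings (purely illustrative forward model).
    def location_probability(candidate, sensors, readings):
        predicted = 1.0 / (1e-6 + np.linalg.norm(sensors - candidate, axis=1))
        return np.exp(-np.sum((readings - predicted) ** 2))

    def identify_source(grid, sensors, readings):
        scores = [location_probability(c, sensors, readings) for c in grid]
        return grid[int(np.argmax(scores))]       # global optimum over the grid

    # synthetic test: four wall-mounted sensors, true source at (2, 3)
    sensors = np.array([[0.0, 0.0], [9.0, 0.0], [0.0, 9.0], [9.0, 9.0]])
    true_source = np.array([2.0, 3.0])
    readings = 1.0 / np.linalg.norm(sensors - true_source, axis=1)
    grid = np.array([[x, y] for x in np.arange(0, 10, 0.5) for y in np.arange(0, 10, 0.5)])
    print(identify_source(grid, sensors, readings))   # -> [2. 3.]
    ```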

  7. Leveraging Parallel Data Processing Frameworks with Verified Lifting

    Directory of Open Access Journals (Sweden)

    Maaz Bin Safeer Ahmad

    2016-11-01

    Full Text Available Many parallel data frameworks have been proposed in recent years that let sequential programs access parallel processing. To capitalize on the benefits of such frameworks, existing code must often be rewritten to the domain-specific languages that each framework supports. This rewriting–tedious and error-prone–also requires developers to choose the framework that best optimizes performance given a specific workload. This paper describes Casper, a novel compiler that automatically retargets sequential Java code for execution on Hadoop, a parallel data processing framework that implements the MapReduce paradigm. Given a sequential code fragment, Casper uses verified lifting to infer a high-level summary expressed in our program specification language that is then compiled for execution on Hadoop. We demonstrate that Casper automatically translates Java benchmarks into Hadoop. The translated results execute on average 3.3x faster than the sequential implementations and scale better, as well, to larger datasets.
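
    To illustrate the kind of rewriting that verified lifting automates, the sketch below pairs a sequential aggregation loop with a map/reduce-style equivalent of the same computation. It is written in Python for brevity; Casper itself translates Java into Hadoop jobs, and nothing here is Casper's actual output.

    ```python
    from functools import reduce

    # Sequential fragment, the kind of code a developer hands to a lifting compiler:
    def word_count_sequential(docs):
        counts = {}
        for doc in docs:
            for word in doc.split():
                counts[word] = counts.get(word, 0) + 1
        return counts

    # The inferred high-level summary ("count occurrences of each word") maps
    # directly onto the MapReduce paradigm: emit (word, 1) pairs, then merge.
    def word_count_mapreduce(docs):
        mapped = [(w, 1) for doc in docs for w in doc.split()]        # map phase
        def merge(acc, kv):
            k, v = kv
            acc[k] = acc.get(k, 0) + v
            return acc
        return reduce(merge, mapped, {})                              # reduce phase

    docs = ["to be or not to be", "to verify is to check"]
    assert word_count_sequential(docs) == word_count_mapreduce(docs)
    ```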

  8. Building a Laboratory-Scale Biogas Plant and Verifying its Functionality

    Science.gov (United States)

    Boleman, Tomáš; Fiala, Jozef; Blinová, Lenka; Gerulová, Kristína

    2011-01-01

    The paper deals with the process of building a laboratory-scale biogas plant and verifying its functionality. The laboratory-scale prototype was constructed in the Department of Safety and Environmental Engineering at the Faculty of Materials Science and Technology in Trnava, of the Slovak University of Technology. The Department has already built a solar laboratory to promote and utilise solar energy, and designed the SETUR hydro engine. The biogas laboratory is the next step in the Department's activities in the field of renewable energy sources and biomass. The Department is also involved in a European Union project whose goal is to upgrade all existing renewable energy sources used in the Department.

  9. A new (k,n) verifiable secret image sharing scheme (VSISS)

    Directory of Open Access Journals (Sweden)

    Amitava Nag

    2014-11-01

    Full Text Available In this paper, a new (k,n) verifiable secret image sharing scheme (VSISS) is proposed in which a third-order LFSR (linear-feedback shift register)-based public key cryptosystem is applied for cheating prevention and for preview before decryption. In the proposed scheme the secret image is first partitioned into several non-overlapping blocks of k pixels. Each block of k pixels is then used to form m=⌈k/4⌉+1 pixels of one encrypted share. The original secret image can be reconstructed by gathering any k or more encrypted share images. The experimental results show that the proposed VSISS is an efficient and safe method.
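
    The share-size arithmetic follows directly from the description above: each block of k pixels contributes m=⌈k/4⌉+1 pixels to every encrypted share. A minimal sketch of that geometry, with the cryptographic steps omitted:

    ```python
    import math

    # Share sizing implied by the scheme: the secret image is split into
    # non-overlapping blocks of k pixels, and every block contributes
    # m = ceil(k/4) + 1 pixels to each encrypted share. (The LFSR-based
    # public-key encryption and cheating-prevention steps are omitted.)
    def share_geometry(n_pixels, k):
        n_blocks = math.ceil(n_pixels / k)
        m = math.ceil(k / 4) + 1
        return n_blocks, m, n_blocks * m

    blocks, m, share_pixels = share_geometry(n_pixels=512 * 512, k=8)
    print(m)             # 3 pixels per block in each share
    print(share_pixels)  # each share holds ~3/8 as many pixels as the secret image
    ```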

  10. Technical evaluation of methods for identifying chemotherapy-induced febrile neutropenia in healthcare claims databases.

    Science.gov (United States)

    Weycker, Derek; Sofrygin, Oleg; Seefeld, Kim; Deeter, Robert G; Legg, Jason; Edelsberg, John

    2013-02-13

    Healthcare claims databases have been used in several studies to characterize the risk and burden of chemotherapy-induced febrile neutropenia (FN) and the effectiveness of colony-stimulating factors against FN. The accuracy of methods previously used to identify FN in such databases has not been formally evaluated. Data comprised linked electronic medical records from Geisinger Health System and healthcare claims data from Geisinger Health Plan. Subjects were classified into subgroups based on whether or not they were hospitalized for FN per the presumptive "gold standard" (ANC-based) definition and per the claims-based definition (diagnosis codes for neutropenia, fever, and/or infection). Accuracy was evaluated principally based on positive predictive value (PPV) and sensitivity. Among 357 study subjects, 82 (23%) met the gold standard for hospitalized FN. For the claims-based definition including diagnosis codes for neutropenia plus fever in any position (n=28), PPV was 100% and sensitivity was 34% (95% CI: 24-45). For the definition including neutropenia in the primary position (n=54), PPV was 87% (78-95) and sensitivity was 57% (46-68). For the definition including neutropenia in any position (n=71), PPV was 77% (68-87) and sensitivity was 67% (56-77). Patients hospitalized for chemotherapy-induced FN can be identified in healthcare claims databases, with an acceptable level of misclassification, using diagnosis codes for neutropenia, or neutropenia plus fever.
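
    The reported statistics follow from the standard definitions PPV = TP/(TP+FP) and sensitivity = TP/(TP+FN); the snippet below reconstructs them from the published counts, assuming the true-positive counts implied by the PPVs:

    ```python
    # Reconstructing the reported accuracy statistics from the cross-tabulation:
    # PPV = TP / (TP + FP), sensitivity = TP / (TP + FN). True-positive counts
    # of 28, 47, and 55 are inferred from the published PPVs (an assumption).
    def ppv_sensitivity(true_pos, flagged, gold_standard_pos):
        return true_pos / flagged, true_pos / gold_standard_pos

    GOLD_POSITIVES = 82  # subjects meeting the ANC-based gold standard
    for label, tp, flagged in [
        ("neutropenia + fever, any position", 28, 28),
        ("neutropenia, primary position",     47, 54),
        ("neutropenia, any position",         55, 71),
    ]:
        ppv, sens = ppv_sensitivity(tp, flagged, GOLD_POSITIVES)
        print(f"{label}: PPV {ppv:.0%}, sensitivity {sens:.0%}")
    # -> 100%/34%, 87%/57%, 77%/67%, matching the reported values
    ```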

  11. Two statistics for evaluating parameter identifiability and error reduction

    Science.gov (United States)

    Doherty, John; Hunt, Randall J.

    2009-01-01

    Two statistics are presented that can be used to rank input parameters utilized by a model in terms of their relative identifiability based on a given or possible future calibration dataset. Identifiability is defined here as the capability of model calibration to constrain parameters used by a model. Both statistics require that the sensitivity of each model parameter be calculated for each model output for which there are actual or presumed field measurements. Singular value decomposition (SVD) of the weighted sensitivity matrix is then undertaken to quantify the relation between the parameters and observations that, in turn, allows selection of calibration solution and null spaces spanned by unit orthogonal vectors. The first statistic presented, "parameter identifiability", is quantitatively defined as the direction cosine between a parameter and its projection onto the calibration solution space. This varies between zero and one, with zero indicating complete non-identifiability and one indicating complete identifiability. The second statistic, "relative error reduction", indicates the extent to which the calibration process reduces error in estimation of a parameter from its pre-calibration level, where its value must be assigned purely on the basis of prior expert knowledge. This is more sophisticated than identifiability, in that it takes greater account of the noise associated with the calibration dataset. Like identifiability, it has a maximum value of one (which can only be achieved if there is no measurement noise). Conceptually it can fall to zero, and even below zero if a calibration problem is poorly posed. An example, based on a coupled groundwater/surface-water model, is included that demonstrates the utility of the statistics. © 2009 Elsevier B.V.
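
    A minimal numerical sketch of the first statistic, assuming a weighted sensitivity matrix and a chosen solution-space dimension: the direction cosine between a parameter axis and its projection onto the space spanned by the leading right singular vectors reduces to a row norm of that basis. Names and the toy data are illustrative.

    ```python
    import numpy as np

    def parameter_identifiability(J, weights, n_solution_vectors):
        """Rank parameters by identifiability from a weighted sensitivity matrix.

        J : (n_obs, n_par) array of sensitivities d(output)/d(parameter).
        weights : (n_obs,) observation weights (e.g. 1/sigma of each measurement).
        n_solution_vectors : dimension chosen for the calibration solution space.
        """
        Q = weights[:, None] * J                     # weighted sensitivity matrix
        _, _, Vt = np.linalg.svd(Q, full_matrices=False)
        V1 = Vt[:n_solution_vectors].T               # columns span the solution space
        # Direction cosine between parameter axis e_i and its projection onto the
        # solution space reduces to the norm of the i-th row of V1:
        # 0 = completely non-identifiable, 1 = completely identifiable.
        return np.linalg.norm(V1, axis=1)

    rng = np.random.default_rng(0)
    J = rng.normal(size=(5, 3))
    J[:, 2] = 0.0                                    # third parameter has no influence
    print(parameter_identifiability(J, np.ones(5), n_solution_vectors=2))
    # -> last entry is ~0: the uninformed parameter is flagged as non-identifiable
    ```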

  12. Evaluation of the ability of rod drop tests to verify the stability margins in FTR

    International Nuclear Information System (INIS)

    Harris, R.A.; Sevenich, R.A.

    1976-01-01

    Predictions of the stability characteristics of FTR indicate that the reactor can be easily controlled even under the worst possible conditions. Nevertheless, experimental verification and monitoring of these characteristics will be performed during operation of the reactor. An initial evaluation of rod drop experiments which could possibly provide this verification is presented

  13. Vehicle systems and payload requirements evaluation. [computer programs for identifying launch vehicle system requirements

    Science.gov (United States)

    Rea, F. G.; Pittenger, J. L.; Conlon, R. J.; Allen, J. D.

    1975-01-01

    Techniques developed for identifying launch vehicle system requirements for NASA automated space missions are discussed. Emphasis is placed on the development of computer programs and the investigation of astrionics for OSS missions and Scout. The Earth Orbit Mission Program-1, which performs linear error analysis of launch vehicle dispersions for both vehicle and navigation system factors, is described, along with the Interactive Graphic Orbit Selection program, which allows the user to select orbits that satisfy mission requirements and to evaluate the necessary injection accuracy.

  14. Developing a flexible and verifiable integrated dose assessment capability

    International Nuclear Information System (INIS)

    Parzyck, D.C.; Rhea, T.A.; Copenhaver, E.D.; Bogard, J.S.

    1987-01-01

    A flexible yet verifiable system of computing and recording personnel doses is needed. Recent directions in statutes establish the trend of combining internal and external doses. We are developing a Health Physics Information Management System (HPIMS) that will centralize dosimetry calculations and data storage; integrate health physics records with other health-related disciplines, such as industrial hygiene, medicine, and safety; provide a more auditable system with published algorithms and clearly defined flowcharts of system operation; readily facilitate future changes dictated by new regulations, new dosimetric models, and new systems of units; and address ad-hoc inquiries regarding worker/workplace interactions, including potential synergisms with non-radiation exposures. The system is modular and provides a high degree of isolation from low-level detail, allowing flexibility for changes without adversely affecting other parts of the system. 10 refs., 3 figs

  15. FACEBOOK for CoP of Researchers: Identifying the Needs and Evaluating the Compatibility

    Directory of Open Access Journals (Sweden)

    Sami Miniaoui

    2011-11-01

    Full Text Available Communities of practice (CoPs) are increasingly capturing the interest of many fields such as business, education and organizations. Many CoPs have been developed for people who share a common interest in healthcare, agriculture and environment, and teaching. However, there is a lack of CoPs dedicated to researchers. This research aims to explore the appropriateness of Facebook (FB) as a platform for serving a CoP of researchers. To achieve this goal, we first identify the needs of CoPs for researchers within the UAE context, adopting a qualitative research approach to elicit the needs and applying the grounded theory method to analyze the data. The results of the analysis revealed the following main needs: collaboration, debating, awareness/notification, reference management, cross search, customization, tracking, and user orientation. Second, we evaluated the compatibility of FB features with the identified needs. Although we found that FB covers most of the CoP needs, a few needs are not met successfully, raising some technical and practical issues, which are highlighted in the paper.

  16. Evaluation of 19,460 Wheat Accessions Conserved in the Indian National Genebank to Identify New Sources of Resistance to Rust and Spot Blotch Diseases

    Science.gov (United States)

    Jacob, Sherry R.; Srinivasan, Kalyani; Radhamani, J.; Parimalan, R.; Sivaswamy, M.; Tyagi, Sandhya; Yadav, Mamata; Kumari, Jyotisna; Deepali; Sharma, Sandeep; Bhagat, Indoo; Meeta, Madhu; Bains, N. S.; Chowdhury, A. K.; Saha, B. C.; Bhattacharya, P. M.; Kumari, Jyoti; Singh, M. C.; Gangwar, O. P.; Prasad, P.; Bharadwaj, S. C.; Gogoi, Robin; Sharma, J. B.; GM, Sandeep Kumar; Saharan, M. S.; Bag, Manas; Roy, Anirban; Prasad, T. V.; Sharma, R. K.; Dutta, M.; Sharma, Indu; Bansal, K. C.

    2016-01-01

    A comprehensive germplasm evaluation study of wheat accessions conserved in the Indian National Genebank was conducted to identify sources of rust and spot blotch resistance. Genebank accessions comprising three species of wheat–Triticum aestivum, T. durum and T. dicoccum were screened sequentially at multiple disease hotspots, during the 2011–14 crop seasons, carrying only resistant accessions to the next step of evaluation. Wheat accessions which were found to be resistant in the field were then assayed for seedling resistance and profiled using molecular markers. In the primary evaluation, 19,460 accessions were screened at Wellington (Tamil Nadu), a hotspot for wheat rusts. We identified 4925 accessions to be resistant and these were further evaluated at Gurdaspur (Punjab), a hotspot for stripe rust and at Cooch Behar (West Bengal), a hotspot for spot blotch. The second round evaluation identified 498 accessions potentially resistant to multiple rusts and 868 accessions potentially resistant to spot blotch. Evaluation of rust resistant accessions for seedling resistance against seven virulent pathotypes of three rusts under artificial epiphytotic conditions identified 137 accessions potentially resistant to multiple rusts. Molecular analysis to identify different combinations of genetic loci imparting resistance to leaf rust, stem rust, stripe rust and spot blotch using linked molecular markers, identified 45 wheat accessions containing known resistance genes against all three rusts as well as a QTL for spot blotch resistance. The resistant germplasm accessions, particularly against stripe rust, identified in this study can be excellent potential candidates to be employed for breeding resistance into the background of high yielding wheat cultivars through conventional or molecular breeding approaches, and are expected to contribute toward food security at national and global levels. PMID:27942031

  17. Evaluation of 19,460 Wheat Accessions Conserved in the Indian National Genebank to Identify New Sources of Resistance to Rust and Spot Blotch Diseases.

    Directory of Open Access Journals (Sweden)

    Sundeep Kumar

    Full Text Available A comprehensive germplasm evaluation study of wheat accessions conserved in the Indian National Genebank was conducted to identify sources of rust and spot blotch resistance. Genebank accessions comprising three species of wheat-Triticum aestivum, T. durum and T. dicoccum were screened sequentially at multiple disease hotspots, during the 2011-14 crop seasons, carrying only resistant accessions to the next step of evaluation. Wheat accessions which were found to be resistant in the field were then assayed for seedling resistance and profiled using molecular markers. In the primary evaluation, 19,460 accessions were screened at Wellington (Tamil Nadu), a hotspot for wheat rusts. We identified 4925 accessions to be resistant and these were further evaluated at Gurdaspur (Punjab), a hotspot for stripe rust, and at Cooch Behar (West Bengal), a hotspot for spot blotch. The second round evaluation identified 498 accessions potentially resistant to multiple rusts and 868 accessions potentially resistant to spot blotch. Evaluation of rust resistant accessions for seedling resistance against seven virulent pathotypes of three rusts under artificial epiphytotic conditions identified 137 accessions potentially resistant to multiple rusts. Molecular analysis to identify different combinations of genetic loci imparting resistance to leaf rust, stem rust, stripe rust and spot blotch using linked molecular markers, identified 45 wheat accessions containing known resistance genes against all three rusts as well as a QTL for spot blotch resistance. The resistant germplasm accessions, particularly against stripe rust, identified in this study can be excellent potential candidates to be employed for breeding resistance into the background of high yielding wheat cultivars through conventional or molecular breeding approaches, and are expected to contribute toward food security at national and global levels.

  18. Evaluation of 19,460 Wheat Accessions Conserved in the Indian National Genebank to Identify New Sources of Resistance to Rust and Spot Blotch Diseases.

    Science.gov (United States)

    Kumar, Sundeep; Archak, Sunil; Tyagi, R K; Kumar, Jagdish; Vk, Vikas; Jacob, Sherry R; Srinivasan, Kalyani; Radhamani, J; Parimalan, R; Sivaswamy, M; Tyagi, Sandhya; Yadav, Mamata; Kumari, Jyotisna; Deepali; Sharma, Sandeep; Bhagat, Indoo; Meeta, Madhu; Bains, N S; Chowdhury, A K; Saha, B C; Bhattacharya, P M; Kumari, Jyoti; Singh, M C; Gangwar, O P; Prasad, P; Bharadwaj, S C; Gogoi, Robin; Sharma, J B; Gm, Sandeep Kumar; Saharan, M S; Bag, Manas; Roy, Anirban; Prasad, T V; Sharma, R K; Dutta, M; Sharma, Indu; Bansal, K C

    2016-01-01

    A comprehensive germplasm evaluation study of wheat accessions conserved in the Indian National Genebank was conducted to identify sources of rust and spot blotch resistance. Genebank accessions comprising three species of wheat-Triticum aestivum, T. durum and T. dicoccum were screened sequentially at multiple disease hotspots, during the 2011-14 crop seasons, carrying only resistant accessions to the next step of evaluation. Wheat accessions which were found to be resistant in the field were then assayed for seedling resistance and profiled using molecular markers. In the primary evaluation, 19,460 accessions were screened at Wellington (Tamil Nadu), a hotspot for wheat rusts. We identified 4925 accessions to be resistant and these were further evaluated at Gurdaspur (Punjab), a hotspot for stripe rust and at Cooch Behar (West Bengal), a hotspot for spot blotch. The second round evaluation identified 498 accessions potentially resistant to multiple rusts and 868 accessions potentially resistant to spot blotch. Evaluation of rust resistant accessions for seedling resistance against seven virulent pathotypes of three rusts under artificial epiphytotic conditions identified 137 accessions potentially resistant to multiple rusts. Molecular analysis to identify different combinations of genetic loci imparting resistance to leaf rust, stem rust, stripe rust and spot blotch using linked molecular markers, identified 45 wheat accessions containing known resistance genes against all three rusts as well as a QTL for spot blotch resistance. The resistant germplasm accessions, particularly against stripe rust, identified in this study can be excellent potential candidates to be employed for breeding resistance into the background of high yielding wheat cultivars through conventional or molecular breeding approaches, and are expected to contribute toward food security at national and global levels.

  19. Evaluation of training programs: A pragmatic perspective

    International Nuclear Information System (INIS)

    Wilkinson, J.D.

    1996-01-01

    The Canadian nuclear regulatory agency endorses the Systematic Approach to Training (SAT) as the most reliable method of providing effective, efficient training to Nuclear Power Plant (NPP) personnel. However the benefits of SAT cannot be realized unless all five phases of SAT are implemented. This is particularly true with respect to evaluation. Although each phase of SAT builds on the preceding one, the evaluation phase continuously feeds back into each of the others and also provides the means to verify the entire training programme building process. It is useful, therefore, to examine the issues relating to the what, why, who, when and how of training programme evaluation. ''What'' identifies the various aspects of the training programme to be evaluated, including the need for training, the training standard, the task list, trainer competence, test results, training results, program acceptance and numerous indicators that identify a need for evaluation. ''Why'' addresses legal and regulatory aspects, resource management, worker and public safety, worker and trainer competence and morale, and the cost/benefit of the training program. ''Who'' examines the need to involve trainers, trainees, plant subject matter experts (SMEs), and both plant and training centre supervisory and management staff. ''When'' addresses time-related concerns such as the importance of ensuring at the outset that the training program is actually needed, the necessity of responding promptly to local, national and world events, changes in legal and regulatory responsibilities, and the overriding importance of timely, routine training program evaluations. ''How'' describes the process of conducting a training program evaluation, and addresses the relationships of these five aspects of evaluation to each other. (author). 10 refs

  20. Calling Out Cheaters: Covert Security with Public Verifiability

    DEFF Research Database (Denmark)

    Asharov, Gilad; Orlandi, Claudio

    2012-01-01

    We introduce the notion of covert security with public verifiability, building on the covert security model introduced by Aumann and Lindell (TCC 2007). Protocols that satisfy covert security guarantee that the honest parties involved in the protocol will notice any cheating attempt with some constant probability ε. The idea behind the model is that the fear of being caught cheating will be enough of a deterrent to prevent any cheating attempt. However, in the basic covert security model, the honest parties are not able to persuade any third party (say, a judge) that a cheating occurred. We propose (and formally define) an extension of the model where, when an honest party detects cheating, it also receives a certificate that can be published and used to persuade other parties, without revealing any information about the honest party’s input. In addition, malicious parties cannot create fake...

  1. A Formally Verified Conflict Detection Algorithm for Polynomial Trajectories

    Science.gov (United States)

    Narkawicz, Anthony; Munoz, Cesar

    2015-01-01

    In air traffic management, conflict detection algorithms are used to determine whether or not aircraft are predicted to lose horizontal and vertical separation minima within a time interval assuming a trajectory model. In the case of linear trajectories, conflict detection algorithms have been proposed that are both sound, i.e., they detect all conflicts, and complete, i.e., they do not present false alarms. In general, for arbitrary nonlinear trajectory models, it is possible to define detection algorithms that are either sound or complete, but not both. This paper considers the case of nonlinear aircraft trajectory models based on polynomial functions. In particular, it proposes a conflict detection algorithm that precisely determines whether, given a lookahead time, two aircraft flying polynomial trajectories are in conflict. That is, it has been formally verified that, assuming that the aircraft trajectories are modeled as polynomial functions, the proposed algorithm is both sound and complete.
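
    For polynomial trajectories the conflict question reduces to deciding whether s(t) = |p(t) - q(t)|² - D² becomes negative on [0, T]. The formally verified algorithm decides this exactly with symbolic reasoning; the sketch below mirrors the idea numerically (horizontal separation only, floating-point root finding), so it approximates rather than reproduces the verified procedure.

    ```python
    from numpy.polynomial import polynomial as P

    def horizontal_conflict(px, py, qx, qy, D, T):
        """Decide whether 2-D polynomial trajectories p and q come within
        horizontal separation D during [0, T]. Coefficient arrays are ordered
        lowest degree first. Conflict at time t means s(t) < 0 for
        s(t) = |p(t) - q(t)|^2 - D^2."""
        dx = P.polysub(px, qx)
        dy = P.polysub(py, qy)
        s = P.polysub(P.polyadd(P.polymul(dx, dx), P.polymul(dy, dy)), [D * D])
        # candidate times: interval endpoints, real roots of s in (0, T),
        # and midpoints between consecutive candidates (to catch sign dips)
        roots = [r.real for r in P.polyroots(s)
                 if abs(r.imag) < 1e-9 and 0.0 < r.real < T]
        ts = sorted([0.0, T] + roots)
        mids = [(a + b) / 2 for a, b in zip(ts, ts[1:])]
        return any(P.polyval(t, s) < 0 for t in ts + mids)

    # head-on encounter along the x-axis: p(t) = (t, 0), q(t) = (10 - t, 0)
    print(horizontal_conflict([0, 1], [0], [10, -1], [0], D=1.0, T=10.0))  # True
    ```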

  2. Standardized evaluation of lung congestion during COPD exacerbation better identifies patients at risk of dying

    Directory of Open Access Journals (Sweden)

    Høiseth AD

    2013-12-01

    Full Text Available Arne Didrik Høiseth, Torbjørn Omland, Bo Daniel Karlsson, Pål H Brekke, Vidar Søyseth. Cardiothoracic Research Group, Division of Medicine, Akershus University Hospital and Institute of Clinical Medicine, University of Oslo, Oslo, Norway; Department of Radiology, Akershus University Hospital, Lørenskog, Norway. Background: Congestive heart failure is underdiagnosed in patients with chronic obstructive pulmonary disease (COPD). Pulmonary congestion on chest radiograph at admission for acute exacerbation of COPD (AECOPD) is associated with an increased risk of mortality. A standardized evaluation of chest radiographs may enhance prognostic accuracy. Purpose: We aimed to evaluate whether a standardized, liberal assessment of pulmonary congestion is superior to the routine assessment in identifying patients at increased risk of long-term mortality, and to investigate the association of heart failure with N-terminal prohormone of brain natriuretic peptide (NT-proBNP) concentrations. Material and methods: This was a prospective cohort study of 99 patients admitted for AECOPD. Chest radiographs obtained on admission were routinely evaluated and then later evaluated by blinded investigators using a standardized protocol looking for Kerley B lines, enlarged vessels in the lung apex, perihilar cuffing, peribronchial haze, and interstitial or alveolar edema, defining the presence of pulmonary congestion. Adjusted associations with long-term mortality and NT-proBNP concentration were calculated. Results: The standardized assessment was positive for pulmonary congestion in 32 of the 195 radiographs (16%) ruled negative in the routine assessment. The standardized assessment was superior in predicting death during a median follow-up of 1.9 years (P=0.022), and in multivariable analysis, only the standardized assessment showed a significant association with mortality (hazard ratio 2.4, 95% confidence interval [CI] 1.2–4.7; P=0.016) and NT-proBNP (relative...

  3. Superfund TIO videos. Set A. Identifying PRPS. Removal process: Removal site evaluation. Part 2. Audio-Visual

    International Nuclear Information System (INIS)

    1990-01-01

    The videotape is divided into three sections. Section 1 details the liability of Potentially Responsible Parties (PRPs) and describes the four classes of PRPs: current owners and operators, former owners and operators, generators, and transporters (if they selected the site). Section 2 lists the goals of the PRP search and explains how to identify key players during the PRP search. How to plan and conduct the PRP search is also outlined. Section 3 outlines the steps involved in conducting a removal site evaluation. A discussion of when to conduct a removal preliminary assessment, a removal site inspection, and an Engineering Evaluation/Cost Analysis (EE/CA) is also covered.

  4. Noninteractive Verifiable Outsourcing Algorithm for Bilinear Pairing with Improved Checkability

    Directory of Open Access Journals (Sweden)

    Yanli Ren

    2017-01-01

    Full Text Available It is well known that the computation of bilinear pairing is the most expensive operation in pairing-based cryptography. In this paper, we propose a noninteractive verifiable outsourcing algorithm for bilinear pairing based on two servers in the one-malicious model. The outsourcer need not execute any expensive operation, such as scalar multiplication or modular exponentiation. Moreover, the outsourcer can detect any failure with probability close to 1 if one of the servers misbehaves. Therefore, the proposed algorithm improves checkability and decreases communication cost compared with previous ones. Finally, we utilize the proposed algorithm as a subroutine to achieve an anonymous identity-based encryption (AIBE) scheme with outsourced decryption and an identity-based signature (IBS) scheme with outsourced verification.

  5. Verifying reciprocal relations for experimental diffusion coefficients in multicomponent mixtures

    DEFF Research Database (Denmark)

    Medvedev, Oleg; Shapiro, Alexander

    2003-01-01

    The goal of the present study is to verify the agreement of the available data on diffusion in ternary mixtures with the theoretical requirement of linear non-equilibrium thermodynamics, namely the symmetry of the matrix of phenomenological coefficients. A common set of measured diffusion coefficients for a three-component mixture consists of four Fickian diffusion coefficients, each being reported separately. However, the Onsager theory predicts the existence of only three independent coefficients, as one of them disappears due to the symmetry requirement. Re-calculation of the Fickian ... extended sets of experimental data and reliable thermodynamic models were available. The sensitivity of the symmetry property to different thermodynamic parameters of the models was also checked. © 2003 Elsevier Science B.V. All rights reserved.

  6. Thoughts on identifiers

    CERN Multimedia

    CERN. Geneva

    2005-01-01

    As business processes and information transactions have become inextricably intertwined with the Web, the importance of the assignment, registration, discovery, and maintenance of identifiers has increased. In spite of this, integrated frameworks for managing identifiers have been slow to emerge. Instead, identification systems arise (quite naturally) from immediate business needs without consideration for how they fit into larger information architectures. In addition, many legacy identifier systems further complicate the landscape, making it difficult for content managers to select and deploy identifier systems that meet both the business case and long-term information management objectives. This presentation will outline a model for evaluating identifier applications and the functional requirements of the systems necessary to support them. The model is based on a layered analysis of the characteristics of identifier systems, including: * Functional characteristics * Technology * Policy * Business * Social T...

  7. The websites of primary and secondary schools in Portugal: an evaluation proposal

    Directory of Open Access Journals (Sweden)

    Ana Maria SANTOS

    2017-10-01

    Full Text Available This article proposes an evaluation of the quality of educational websites at two levels of education, primary school and secondary school. The analysis uses a Model of Evaluation of the Quality of Educational Websites (EQEWS) divided into Functional Aspects, with five criteria: authority, update, usability, accessibility and communication; and Technical-Aesthetic Aspects, with five principles: graphic design and multimedia quality, content, navigation, speed of access and interaction. The proposed evaluation model was applied to 57 websites, scored on a Likert scale of 0 to 4. We conclude that secondary school websites achieved better results in most of the evaluated criteria, which suggests that the authors of these resources are closely attuned to the needs and demands that this level of education places on students, who are focused on achieving excellent results for university entrance.

  8. Evaluating Internal Technological Capabilities in Energy Companies

    Directory of Open Access Journals (Sweden)

    Mingook Lee

    2016-03-01

    Full Text Available As global competition increases, technological capability must be evaluated objectively as one of the most important factors for predominance in technological competition and to ensure sustainable business excellence. Most existing capability evaluation models utilize either quantitative methods, such as patent analysis, or qualitative methods, such as expert panels. Accordingly, they may be in danger of reflecting only fragmentary aspects of technological capabilities, and produce inconsistent results when different models are used. To solve these problems, this paper proposes a comprehensive framework for evaluating technological capabilities in energy companies by considering the complex properties of technological knowledge. For this purpose, we first explored various factors affecting technological capabilities and divided the factors into three categories: individual, organizational, and technology competitiveness. Second, we identified appropriate evaluation items for each category to measure the technological capability. Finally, by using a hybrid approach of qualitative and quantitative methods, we developed an evaluation method for each item and suggested a method to combine the results. The proposed framework was then verified with an energy generation and supply company to investigate its practicality. As one of the earliest attempts to evaluate multi-faceted technological capabilities, the suggested model can support technology and strategic planning.

  9. Identifying partial topology of complex dynamical networks via a pinning mechanism

    Science.gov (United States)

    Zhu, Shuaibing; Zhou, Jin; Lu, Jun-an

    2018-04-01

    In this paper, we study the problem of identifying the partial topology of complex dynamical networks via a pinning mechanism. By using network synchronization theory and the adaptive feedback control method, we propose a method that can greatly reduce the number of nodes and observers in the response network. In particular, this method can also identify the whole topology of complex networks. A theorem is established rigorously, from which some corollaries are also derived in order to make our method more cost-effective. Several numerical examples are provided to verify the effectiveness of the proposed method. In the simulation, an approach is also given to avoid possible identification failure caused by inner synchronization of the drive network.

  10. Predictive value of soluble haemoglobin scavenger receptor CD163 serum levels for survival in verified tuberculosis patients

    DEFF Research Database (Denmark)

    Knudsen, T.B.; Gustafson, P.; Kronborg, G.

    2005-01-01

    Pre-treatment serum levels of sCD163 were measured in a cohort of 236 suspected tuberculosis (TB) cases from Guinea-Bissau, with a median follow-up period of 3.3 years (range 0-6.4 years). In 113 cases, the diagnosis of TB was verified by positive sputum microscopy and/or culture. Among the verified...

  11. Verifying Real-Time Systems using Explicit-time Description Methods

    Directory of Open Access Journals (Sweden)

    Hao Wang

    2009-12-01

    Full Text Available Timed model checking has been extensively researched in recent years. Many new formalisms with time extensions and tools based on them have been presented. On the other hand, Explicit-Time Description Methods aim to verify real-time systems with general untimed model checkers. Lamport presented an explicit-time description method using a clock-ticking process (Tick to simulate the passage of time together with a group of global variables for time requirements. This paper proposes a new explicit-time description method with no reliance on global variables. Instead, it uses rendezvous synchronization steps between the Tick process and each system process to simulate time. This new method achieves better modularity and facilitates usage of more complex timing constraints. The two explicit-time description methods are implemented in DIVINE, a well-known distributed-memory model checker. Preliminary experiment results show that our new method, with better modularity, is comparable to Lamport's method with respect to time and memory efficiency.

  12. Intra-urban biomonitoring: Source apportionment using tree barks to identify air pollution sources.

    Science.gov (United States)

    Moreira, Tiana Carla Lopes; de Oliveira, Regiani Carvalho; Amato, Luís Fernando Lourenço; Kang, Choong-Min; Saldiva, Paulo Hilário Nascimento; Saiki, Mitiko

    2016-05-01

    It is of great interest to evaluate whether there is a relationship between possible sources and trace elements using biomonitoring techniques. In this study, bark samples from 171 trees were collected using a biomonitoring technique in the inner city of São Paulo. The trace elements (Al, Ba, Ca, Cl, Cu, Fe, K, Mg, Mn, Na, P, Rb, S, Sr and Zn) were determined by energy dispersive X-ray fluorescence (EDXRF) spectrometry. Principal Component Analysis (PCA) was applied to identify the plausible sources associated with the tree bark measurements. The largest source was vehicle-induced non-tailpipe emissions, derived mainly from brake and tire wear and road dust resuspension (characterized by Al, Ba, Cu, Fe, Mn and Zn), which explained 27.1% of the variance, followed by cement (14.8%), sea salt (11.6%), biomass burning (10%), and fossil fuel combustion (9.8%). We also verified that the elements related to vehicular emission showed different concentrations at different sites on the same street, which might be helpful for a new street classification according to emission source. Spatial distribution maps of element concentrations were obtained to evaluate the different levels of pollution in streets and avenues. The results indicate that biomonitoring techniques using tree bark can be applied to evaluate the dispersion of air pollution and provide reliable data for further epidemiological studies. Copyright © 2016 Elsevier Ltd. All rights reserved.
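
    The analysis workflow (standardize element concentrations, extract principal components, read candidate source signatures off the loadings) can be sketched as follows on synthetic data with two planted sources; the loadings and variance shares shown are illustrative, not the study's values.

    ```python
    import numpy as np
    from sklearn.decomposition import PCA
    from sklearn.preprocessing import StandardScaler

    # Synthetic stand-in for the bark dataset: 171 samples x 8 elements, with
    # two planted "sources" (traffic-like and sea-salt-like signatures).
    elements = ["Al", "Ba", "Cu", "Fe", "Mn", "Zn", "Ca", "Cl"]
    rng = np.random.default_rng(1)
    traffic = rng.normal(size=(171, 1)) * np.array([1.0, 0.9, 0.8, 1.0, 0.7, 0.6, 0.0, 0.0])
    sea_salt = rng.normal(size=(171, 1)) * np.array([0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.3, 1.0])
    X = traffic + sea_salt + 0.1 * rng.normal(size=(171, len(elements)))

    # Standardize, extract components, and read source signatures off the loadings.
    pca = PCA(n_components=3)
    pca.fit(StandardScaler().fit_transform(X))
    print(pca.explained_variance_ratio_)          # variance share per component
    for comp in pca.components_:
        top = sorted(zip(elements, comp), key=lambda p: -abs(p[1]))[:4]
        print([f"{e}:{w:+.2f}" for e, w in top])  # dominant elements per component
    ```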

  13. Evaluation of food emergency response laboratories' capability for 210Po analysis using proficiency test material with verifiable traceability

    International Nuclear Information System (INIS)

    Zhongyu Wu; Zhichao Lin; Mackill, P.; Cong Wei; Noonan, J.; Cherniack, J.; Gillis-Landrum, D.

    2009-01-01

    Measurement capability and data comparability are essential for emergency response when analytical data from cooperating laboratories are used for risk assessment and post-incident decision making. In this study, the current capability of food emergency response laboratories for the analysis of 210Po in water was evaluated using a proficiency test scheme in compliance with ISO-43 and ILAC G13 guidelines, which comprises a test sample preparation and verification protocol and an insightful statistical data evaluation. The results of performance evaluations on relative bias, value trueness, precision, false positive detection, minimum detection limit, and limit of quantification are presented. (author)

  14. Evaluation of an online family history tool for identifying hereditary and familial colorectal cancer.

    Science.gov (United States)

    Kallenberg, F G J; Aalfs, C M; The, F O; Wientjes, C A; Depla, A C; Mundt, M W; Bossuyt, P M M; Dekker, E

    2017-09-21

    Identifying a hereditary colorectal cancer (CRC) syndrome or familial CRC (FCC) in a CRC patient may enable the patient and relatives to enroll in surveillance protocols. As these individuals are insufficiently recognized, we evaluated an online family history tool, consisting of a patient-administered family history questionnaire and an automated genetic referral recommendation, to facilitate the identification of patients with hereditary CRC or FCC. Between 2015 and 2016, all newly diagnosed CRC patients in five Dutch outpatient clinics, were included in a trial with a stepped-wedge design, when first visiting the clinic. Each hospital continued standard procedures for identifying patients at risk (control strategy) and then, after a predetermined period, switched to offering the family history tool to included patients (intervention strategy). After considering the tool-based recommendation, the health care provider could decide on and arrange the referral. Primary outcome was the relative number of CRC patients who received screening or surveillance recommendations for themselves or relatives because of hereditary CRC or FCC, provided by genetic counseling. The intervention effect was evaluated using a logit-linear model. With the tool, 46/489 (9.4%) patients received a screening or surveillance recommendation, compared to 35/292 (12.0%) in the control group. In the intention-to-treat-analysis, accounting for time trends and hospital effects, this difference was not statistically significant (p = 0.58). A family history tool does not necessarily assist in increasing the number of CRC patients and relatives enrolled in screening or surveillance recommendations for hereditary CRC or FCC. Other interventions should be considered.

  15. Technical evaluation of methods for identifying chemotherapy-induced febrile neutropenia in healthcare claims databases

    Directory of Open Access Journals (Sweden)

    Weycker Derek

    2013-02-01

    Background Healthcare claims databases have been used in several studies to characterize the risk and burden of chemotherapy-induced febrile neutropenia (FN) and the effectiveness of colony-stimulating factors against FN. The accuracy of methods previously used to identify FN in such databases has not been formally evaluated. Methods Data comprised linked electronic medical records from Geisinger Health System and healthcare claims data from Geisinger Health Plan. Subjects were classified into subgroups based on whether or not they were hospitalized for FN per the presumptive "gold standard" (ANC <0.5 × 10⁹/L and body temperature ≥38.3°C or receipt of antibiotics) and the claims-based definition (diagnosis codes for neutropenia, fever, and/or infection). Accuracy was evaluated principally based on positive predictive value (PPV) and sensitivity. Results Among 357 study subjects, 82 (23%) met the gold standard for hospitalized FN. For the claims-based definition including diagnosis codes for neutropenia plus fever in any position (n=28), PPV was 100% and sensitivity was 34% (95% CI: 24-45). For the definition including neutropenia in the primary position (n=54), PPV was 87% (78-95) and sensitivity was 57% (46-68). For the definition including neutropenia in any position (n=71), PPV was 77% (68-87) and sensitivity was 67% (56-77). Conclusions Patients hospitalized for chemotherapy-induced FN can be identified in healthcare claims databases--with an acceptable level of misclassification--using diagnosis codes for neutropenia, or neutropenia plus fever.
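
    The reported metrics follow from the 2x2 table of claims-based flags against the chart-review gold standard. The sketch below recomputes them from hypothetical counts chosen to be roughly consistent with the any-position definition above (71 flagged, PPV 77%, sensitivity 67%); the exact cell splits are assumptions, not the study's data.

```python
# Sketch: PPV, sensitivity, and specificity from a 2x2 validation table.
def accuracy_metrics(tp, fp, fn, tn):
    ppv = tp / (tp + fp)
    sensitivity = tp / (tp + fn)
    specificity = tn / (tn + fp)
    return ppv, sensitivity, specificity

# "Neutropenia in any position": 71 flagged out of 357, 82 true FN cases.
tp, fp = 55, 16                      # hypothetical split of the 71 flagged patients
fn, tn = 82 - tp, 357 - 82 - fp
print("PPV=%.2f sensitivity=%.2f specificity=%.2f" % accuracy_metrics(tp, fp, fn, tn))
```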

  16. Veterans’ Pensions: Verifying Income with Tax Data Can Identify Significant Payment Problems.

    Science.gov (United States)

    1988-03-01

  17. 78 FR 69871 - Agency Information Collection Activities: myE-Verify, Revision of a Currently Approved Collection

    Science.gov (United States)

    2013-11-21

    ... Collection (1) Type of Information Collection: Revision of a Currently Approved Collection. (2) Title of the... respond: E-Verify Self Check--Identity Authentication 2,900,000 responses at 0.0833 hours (5 minutes) per...

  18. Memoization in Type-Directed Partial Evaluation

    DEFF Research Database (Denmark)

    Balat, Vincent; Danvy, Olivier

    2002-01-01

    We use a code generator—type-directed partial evaluation—to verify conversions between isomorphic types, or more precisely to verify that a composite function is the identity function at some complicated type. A typed functional language such as ML provides a natural support to express the funct... originate in the handling of sums, which uses delimited continuations. We successfully eliminate these redundancies by extending type-directed partial evaluation with memoization capabilities. The result only works for pure functional programs, but it provides an unexpected use of code generation... and it yields orders-of-magnitude improvements both in time and in space for type isomorphisms. Basic Research in Computer Science (www.brics.dk), funded by the Danish National Research Foundation.

  19. Tools for monitoring aquatic environments to identify anthropic effects.

    Science.gov (United States)

    da Rocha, Monyque Palagano; Dourado, Priscila Leocadia Rosa; Cardoso, Claudia Andrea Lima; Cândido, Liliam Silva; Pereira, Joelson Gonçalves; de Oliveira, Kelly Mari Pires; Grisolia, Alexeia Barufatti

    2018-01-05

    Anthropic activities are directly related to the contamination of aquatic ecosystems owing to the release of numerous chemicals from agricultural and urban waste. These contaminants cause environmental degradation and reduce the availability of good-quality water. The objective of this study was to evaluate the efficiency of physicochemical, chemical, and microbiological tests; extraction of chlorophyll a; and genetic parameters in identifying the effects of anthropic activities and weather conditions on stream water quality, and the consequences of the water's use by the population. The physicochemical parameters were within the limits allowed by Brazilian law. However, contamination by metals (Cd 0.510 mg L⁻¹, Co 0.405 mg L⁻¹, and Ni 0.316 mg L⁻¹) above the allowable values was found at various collection points. The antibiotic oxytetracycline was detected in stream water in quantities of up to 89 μg L⁻¹. Regarding microbiological contamination, Escherichia coli and Pseudomonas spp. were isolated. Chlorophyll a averages reached 0.15558 mg cm⁻². Genetic tools identified greater numbers of micronuclei and more DNA damage in periods with lower rainfall rates and lower amounts of metals. The analyses used for monitoring were efficient in verifying the interference that animal breeding and the planting of different cultures have caused in the stream. Thus, the continued use of this water for drinking, irrigation of vegetables, and recreational activities makes the population susceptible to contamination by bacteria and creates conditions for the development of genetic alterations in the long run.

  20. Utilization of two web-based continuing education courses evaluated by Markov chain model.

    Science.gov (United States)

    Tian, Hao; Lin, Jin-Mann S; Reeves, William C

    2012-01-01

    To evaluate the web structure of two web-based continuing education courses, identify problems and assess the effects of web site modifications. Markov chain models were built from 2008 web usage data to evaluate the courses' web structure and navigation patterns. The web site was then modified to resolve identified design issues and the improvement in user activity over the subsequent 12 months was quantitatively evaluated. Web navigation paths were collected between 2008 and 2010. The probability of navigating from one web page to another was analyzed. The continuing education courses' sequential structure design was clearly reflected in the resulting actual web usage models, and none of the skip transitions provided was heavily used. The web navigation patterns of the two different continuing education courses were similar. Two possible design flaws were identified and fixed in only one of the two courses. Over the following 12 months, the drop-out rate in the modified course significantly decreased from 41% to 35%, but remained unchanged in the unmodified course. The web improvement effects were further verified via a second-order Markov chain model. The results imply that differences in web content have less impact than web structure design on how learners navigate through continuing education courses. Evaluation of user navigation can help identify web design flaws and guide modifications. This study showed that Markov chain models provide a valuable tool to evaluate web-based education courses. Both the results and techniques in this study would be very useful for public health education and research specialists.
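
    A first-order Markov chain of this kind is estimated by normalizing page-to-page transition counts from the observed paths. The sketch below uses invented page names and paths; the study's models were built from actual 2008-2010 usage logs.

```python
# Sketch: first-order Markov transition probabilities from navigation paths.
from collections import Counter, defaultdict

paths = [
    ["intro", "module1", "module2", "quiz"],
    ["intro", "module1", "quiz"],
    ["intro", "module1", "module1", "module2"],
]

counts = defaultdict(Counter)
for path in paths:
    for src, dst in zip(path, path[1:]):
        counts[src][dst] += 1            # count each observed page-to-page move

for src, dsts in counts.items():
    total = sum(dsts.values())
    for dst, n in dsts.items():
        print(f"P({dst} | {src}) = {n / total:.2f}")
```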

  1. A Benchmark for Comparing Different Approaches for Specifying and Verifying Real-Time Systems

    Science.gov (United States)

    1993-01-01

    To be considered correct or useful, real-time systems must deliver results within specified time intervals, either without exception or with high...probability. Recently, a large number of formal methods have been invented for specifying and verifying real-time systems. It has been suggested that...these formal methods need to be tested out on actual real-time systems. Such testing will allow the scalability of the methods to be assessed and also...

  2. 13 CFR 127.404 - What happens if SBA is unable to verify a concern's eligibility?

    Science.gov (United States)

    2010-01-01

    ... 13 Business Credit and Assistance 1 2010-01-01 2010-01-01 false What happens if SBA is unable to verify a concern's eligibility? 127.404 Section 127.404 Business Credit and Assistance SMALL BUSINESS ADMINISTRATION WOMEN-OWNED SMALL BUSINESS FEDERAL CONTRACT ASSISTANCE PROCEDURES Eligibility Examinations § 127...

  3. On Verifying Currents and Other Features in the Hawaiian Islands Region Using Fully Coupled Ocean/Atmosphere Mesoscale Prediction System Compared to Global Ocean Model and Ocean Observations

    Science.gov (United States)

    Jessen, P. G.; Chen, S.

    2014-12-01

    This poster introduces and evaluates ocean features in the Hawaii, USA region using the U.S. Navy's fully Coupled Ocean/Atmosphere Mesoscale Prediction System (COAMPS-OS™) coupled to the Navy Coastal Ocean Model (NCOM). It also outlines some challenges in verifying ocean currents in the open ocean. The system is evaluated using in situ ocean data and initial forcing fields from the operational global Hybrid Coordinate Ocean Model (HYCOM). Verification shows difficulties in modelling downstream currents off the Hawaiian islands (Hawaii's wake). Comparing HYCOM to NCOM current fields shows some displacement of small features such as eddies. Generally, there is fair agreement between HYCOM and NCOM in salinity and temperature fields. There is good agreement in SSH fields.

  4. Verifying operator fitness - an imperative not an option

    International Nuclear Information System (INIS)

    Scott, A.B. Jr.

    1987-01-01

    In the early morning hours of April 26, 1986, whatever credence those who operate nuclear power plants around the world could then muster suffered a jarring reversal. Through an incredible series of personal errors, the operators at what was later to be termed one of the best-operated plants in the USSR systematically stripped away the physical and procedural safeguards inherent to their installation and precipitated the worst reactor accident the world has yet seen. This challenge to the adequacy of nuclear operators comes at a time when many companies throughout the world - not only those that involve nuclear power - are grappling with the problem of how to assure the fitness for duty of those in their employ, specifically those users of substances that have an impact on the ability to function safely and productively in the workplace. In actuality, operator fitness for duty is far more than the absence of impairment from substance abuse, which many today consider it to be. Full fitness for duty implies mental and moral fitness as well, and physical fitness in a more general sense. If we are to earn the confidence of the public, credible ways to verify total fitness on an operator-by-operator basis must be considered.

  5. Use of rank sum method in identifying high occupational dose jobs for ALARA implementation

    International Nuclear Information System (INIS)

    Cho, Yeong Ho; Kang, Chang Sun

    1998-01-01

    The cost-effective reduction of occupational radiation exposure (ORE) dose at a nuclear power plant cannot be achieved without an extensive analysis of the accumulated ORE dose data of existing plants. It is necessary to identify which jobs incur high ORE doses before ALARA can be implemented. In this study, the Rank Sum Method (RSM) is used to identify high ORE jobs. As a case study, the database of ORE-related maintenance and repair jobs for Kori Units 3 and 4 is used for assessment, and the top twenty high ORE jobs are identified. The results are also verified and validated using the Friedman test, and RSM is found to be a very efficient way of analyzing the data. (author)
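
    As a hedged illustration of the rank sum idea: rank every job on each dose-related criterion, sum the ranks per job, and treat the smallest sums as the highest-priority jobs for ALARA review. The jobs, criteria, and numbers below are invented, not the Kori 3 and 4 data.

```python
# Sketch: rank-sum screening of jobs across several ORE-related criteria.
import numpy as np

jobs = ["SG nozzle work", "valve overhaul", "RCP seal repair", "insulation removal"]
# rows = jobs; columns = criteria (e.g. collective dose, frequency, man-hours);
# higher raw value = worse.
data = np.array([
    [12.0,  4, 300],
    [ 3.5, 20, 150],
    [ 9.0,  6, 500],
    [ 1.2, 30,  80],
])

# Within each criterion, rank 1 = highest (worst) value; ties are ignored here.
ranks = data.shape[0] - data.argsort(axis=0).argsort(axis=0)
rank_sums = ranks.sum(axis=1)
for job, rs in sorted(zip(jobs, rank_sums), key=lambda t: t[1]):
    print(f"{job}: rank sum = {rs}")     # smallest sum = highest ALARA priority
```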

  6. Modelling and Verifying Communication Failure of Hybrid Systems in HCSP

    DEFF Research Database (Denmark)

    Wang, Shuling; Nielson, Flemming; Nielson, Hanne Riis

    2016-01-01

    Hybrid systems are dynamic systems with interacting discrete computation and continuous physical processes. They have become ubiquitous in our daily life, e.g. automotive, aerospace and medical systems, and in particular, many of them are safety-critical. For a safety-critical hybrid system..., in the presence of communication failure, the expected control from the controller will get lost and as a consequence the physical process cannot behave as expected. In this paper, we mainly consider the communication failure caused by the non-engagement of one party in communication action, i.e. the communication itself fails to occur. To address this issue, this paper proposes a formal framework by extending HCSP, a formal modeling language for hybrid systems, for modeling and verifying hybrid systems in the absence of receiving messages due to communication failure. We present two inference systems...

  7. Protocol: a systematic review of studies developing and/or evaluating search strategies to identify prognosis studies.

    Science.gov (United States)

    Corp, Nadia; Jordan, Joanne L; Hayden, Jill A; Irvin, Emma; Parker, Robin; Smith, Andrea; van der Windt, Danielle A

    2017-04-20

    Prognosis research is on the rise, its importance recognised because chronic health conditions and diseases are increasingly common and costly. Prognosis systematic reviews are needed to collate and synthesise these research findings, especially to help inform effective clinical decision-making and healthcare policy. A detailed, comprehensive search strategy is central to any systematic review. However, within prognosis research, this is challenging due to poor reporting and inconsistent use of available indexing terms in electronic databases. Whilst many published search filters exist for finding clinical trials, this is not the case for prognosis studies. This systematic review aims to identify and compare existing methodological filters developed and evaluated to identify prognosis studies of any of the three main types: overall prognosis, prognostic factors, and prognostic [risk prediction] models. Primary studies reporting the development and/or evaluation of methodological search filters to retrieve any type of prognosis study will be included in this systematic review. Multiple electronic bibliographic databases will be searched, grey literature will be sought from relevant organisations and websites, experts will be contacted, and citation tracking of key papers and reference list checking of all included papers will be undertaken. Titles will be screened by one person, and abstracts and full articles will be reviewed for inclusion independently by two reviewers. Data extraction and quality assessment will also be undertaken independently by two reviewers with disagreements resolved by discussion or by a third reviewer if necessary. Filters' characteristics and performance metrics reported in the included studies will be extracted and tabulated. To enable comparisons, filters will be grouped according to database, platform, type of prognosis study, and type of filter for which it was intended. This systematic review will identify all existing validated

  8. An Evaluation of Algorithms for Identifying Metastatic Breast, Lung, or Colorectal Cancer in Administrative Claims Data.

    Science.gov (United States)

    Whyte, Joanna L; Engel-Nitz, Nicole M; Teitelbaum, April; Gomez Rey, Gabriel; Kallich, Joel D

    2015-07-01

    Administrative health care claims data are used for epidemiologic, health services, and outcomes cancer research and thus play a significant role in policy. Cancer stage, which is often a major driver of cost and clinical outcomes, is not typically included in claims data. Evaluate algorithms used in a dataset of cancer patients to identify patients with metastatic breast (BC), lung (LC), or colorectal (CRC) cancer using claims data. Clinical data on BC, LC, or CRC patients (between January 1, 2007 and March 31, 2010) were linked to a health care claims database. Inclusion required health plan enrollment ≥3 months before initial cancer diagnosis date. Algorithms were used in the claims database to identify patients' disease status, which was compared with physician-reported metastases. Generic and tumor-specific algorithms were evaluated using ICD-9 codes, varying diagnosis time frames, and including/excluding other tumors. Positive and negative predictive values, sensitivity, and specificity were assessed. The linked databases included 14,480 patients; of whom, 32%, 17%, and 14.2% had metastatic BC, LC, and CRC, respectively, at diagnosis and met inclusion criteria. Nontumor-specific algorithms had lower specificity than tumor-specific algorithms. Tumor-specific algorithms' sensitivity and specificity were 53% and 99% for BC, 55% and 85% for LC, and 59% and 98% for CRC, respectively. Algorithms to distinguish metastatic BC, LC, and CRC from locally advanced disease should use tumor-specific primary cancer codes with 2 claims for the specific primary cancer >30-42 days apart to reduce misclassification. These performed best overall in specificity, positive predictive values, and overall accuracy to identify metastatic cancer in a health care claims database.
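
    The recommended two-claim rule is simple to implement once the claim dates carrying the tumor-specific primary cancer code have been extracted. The sketch below uses an invented claim history and a 30-day gap; the paper evaluated windows of 30-42 days.

```python
# Sketch: flag a primary cancer only if two claims with its diagnosis code
# fall more than `min_gap_days` apart, per the rule described above.
from datetime import date

def meets_two_claim_rule(claim_dates, min_gap_days=30):
    ds = sorted(claim_dates)
    return any((b - a).days > min_gap_days for a in ds for b in ds if b > a)

claims = [date(2009, 1, 5), date(2009, 1, 20), date(2009, 3, 2)]
print(meets_two_claim_rule(claims))      # True: Jan 5 and Mar 2 are 56 days apart
```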

  9. AUTOMATIC ESTIMATION OF SIZE PARAMETERS USING VERIFIED COMPUTERIZED STEREOANALYSIS

    Directory of Open Access Journals (Sweden)

    Peter R Mouton

    2011-05-01

    State-of-the-art computerized stereology systems combine high-resolution video microscopy and hardware-software integration with stereological methods to assist users in quantifying multidimensional parameters of importance to biomedical research, including volume, surface area, length, number, their variation and spatial distribution. The requirement for constant interactions between a trained, non-expert user and the targeted features of interest currently limits the throughput efficiency of these systems. To address this issue we developed a novel approach for automatic stereological analysis of 2-D images, Verified Computerized Stereoanalysis (VCS). The VCS approach minimizes the need for user interactions with high-contrast [high signal-to-noise ratio (S:N)] biological objects of interest. Performance testing of the VCS approach confirmed dramatic increases in the efficiency of total object volume (size) estimation, without a loss of accuracy or precision compared to conventional computerized stereology. The broad application of high-efficiency VCS to high-contrast biological objects on tissue sections could reduce labor costs, enhance hypothesis testing, and accelerate the progress of biomedical research focused on improvements in health and the management of disease.

  10. Prevalence of Ex Vivo High On-treatment Platelet Reactivity on Antiplatelet Therapy after Transient Ischemic Attack or Ischemic Stroke on the PFA-100® and VerifyNow®.

    LENUS (Irish Health Repository)

    Kinsella, Justin A

    2012-09-12

    BACKGROUND: The prevalence of ex vivo high on-treatment platelet reactivity (HTPR) to commonly prescribed antiplatelet regimens after transient ischemic attack (TIA) or ischemic stroke is uncertain. METHODS: Platelet function inhibition was simultaneously assessed with modified light transmission aggregometry (VerifyNow; Accumetrics Inc, San Diego, CA) and with a moderately high shear stress platelet function analyzer (PFA-100; Siemens Medical Solutions USA, Inc, Malvern, PA) in a pilot, cross-sectional study of TIA or ischemic stroke patients. Patients were assessed on aspirin-dipyridamole combination therapy (n = 51) or clopidogrel monotherapy (n = 25). RESULTS: On the VerifyNow, HTPR on aspirin was identified in 4 of 51 patients (8%) on aspirin-dipyridamole combination therapy (≥550 aspirin reaction units on the aspirin cartridge). Eleven of 25 (44%) patients had HTPR on clopidogrel (≥194 P2Y12 reaction units on the P2Y12 cartridge). On the PFA-100, 21 of 51 patients (41%) on aspirin-dipyridamole combination therapy had HTPR on the collagen-epinephrine (C-EPI) cartridge. Twenty-three of 25 patients (92%) on clopidogrel had HTPR on the collagen-adenosine diphosphate (C-ADP) cartridge. The proportion of patients with antiplatelet HTPR was lower on the VerifyNow than PFA-100 in patients on both regimens (P < .001). CONCLUSIONS: The prevalence of ex vivo antiplatelet HTPR after TIA or ischemic stroke is markedly influenced by the method used to assess platelet reactivity. The PFA-100 C-ADP cartridge is not sensitive at detecting the antiplatelet effects of clopidogrel ex vivo. Larger prospective studies with the VerifyNow and with the PFA-100 C-EPI and recently released Innovance PFA P2Y cartridges (Siemens Medical Solutions USA, Inc) in addition to newer tests of platelet function are warranted to assess whether platelet function monitoring predicts clinical outcome in ischemic cerebrovascular disease.

  11. EVALUATION OF MUSCLE STRENGTH IN MEDULLAR INJURY: A LITERATURE REVIEW

    Directory of Open Access Journals (Sweden)

    Tânia Valdameri Capelari

    Objective: To identify the tools used to evaluate muscle strength in subjects with spinal cord injury in both clinical practice and scientific research. Methods: Initially, a literature review was carried out to identify the tools used in scientific research. The search was conducted in the following databases: Virtual Health Library (VHL), PEDro, and PubMed. Studies published between 1990 and 2016 that reported an evaluation of muscle strength, whether as an endpoint or for characterization of the sample, were considered and selected. Next, a survey was carried out with physiotherapists to identify the instruments used for evaluation in clinical practice and the degree of satisfaction of professionals with them. Results: 495 studies were found; 93 were included for qualitative evaluation. In the studies, we identified the use of manual muscle testing with different grading systems, isokinetic dynamometers, hand-held dynamometers, and manual dynamometers. In clinical practice, manual muscle testing using the motor score recommended by the American Spinal Injury Association was the most used method, despite the limitations highlighted by the physiotherapists interviewed. Conclusion: In scientific research there is great variation in the methods and tools used to evaluate muscle strength in individuals with spinal cord injury, unlike in clinical practice. The tools available and currently used have important limitations, which were highlighted by the professionals interviewed. No instrument directly relates muscle strength to the functionality of the subject. There is no consensus as to the best method for assessing muscle strength in spinal cord injury, and new instruments specific to this population are needed.

  12. Development and validation of a nursing professionalism evaluation model in a career ladder system.

    Science.gov (United States)

    Kim, Yeon Hee; Jung, Young Sun; Min, Ja; Song, Eun Young; Ok, Jung Hui; Lim, Changwon; Kim, Kyunghee; Kim, Ji-Su

    2017-01-01

    The clinical ladder system categorizes degrees of nursing professionalism and the corresponding rewards, and is an important human resource tool for managing nursing. We developed a model to evaluate nursing professionalism, which determines the clinical ladder system levels, and verified its validity. Data were collected using a clinical competence tool developed in this study, together with existing methods such as the nursing professionalism evaluation tool, peer reviews, and face-to-face interviews, to evaluate promotions and verify the presented content in a medical institution. Reliability and convergent and discriminant validity of the clinical competence evaluation tool were verified using SmartPLS software. The validity of the model for evaluating overall nursing professionalism was also analyzed. Clinical competence was determined by five dimensions of nursing practice: scientific, technical, ethical, aesthetic, and existential. The structural model explained 66% of the variance. Clinical competence scales, peer reviews, and face-to-face interviews directly determined nursing professionalism levels. The evaluation system can be used for evaluating nurses' professionalism in actual medical institutions from a nursing practice perspective. A conceptual framework for establishing a human resources management system for nurses and a tool for evaluating nursing professionalism at medical institutions are provided.

  13. VISION User Guide - VISION (Verifiable Fuel Cycle Simulation) Model

    International Nuclear Information System (INIS)

    Jacobson, Jacob J.; Jeffers, Robert F.; Matthern, Gretchen E.; Piet, Steven J.; Baker, Benjamin A.; Grimm, Joseph

    2009-01-01

    The purpose of this document is to provide a guide for using the current version of the Verifiable Fuel Cycle Simulation (VISION) model. This is a complex model with many parameters; the user is strongly encouraged to read this user guide before attempting to run the model. This model is an R and D work in progress and may contain errors and omissions. It is based upon numerous assumptions. This model is intended to assist in evaluating 'what if' scenarios and in comparing fuel, reactor, and fuel processing alternatives at a systems level for U.S. nuclear power. The model is not intended as a tool for process flow and design modeling of specific facilities nor for tracking individual units of fuel or other material through the system. The model is intended to examine the interactions among the components of a fuel system as a function of time-varying system parameters; this model represents a dynamic rather than steady-state approximation of the nuclear fuel system. VISION models the nuclear cycle at the system level, not individual facilities, e.g., 'reactor types' not individual reactors and 'separation types' not individual separation plants. Natural uranium can be enriched, which produces enriched uranium, which goes into fuel fabrication, and depleted uranium (DU), which goes into storage. Fuel is transformed (transmuted) in reactors and then goes into a storage buffer. Used fuel can be pulled from storage into either separations or disposal. If sent to separations, fuel is transformed (partitioned) into fuel products, recovered uranium, and various categories of waste. Recycled material is stored until used by its assigned reactor type. Note that recovered uranium is itself often partitioned: some RU flows with recycled transuranic elements, some flows with wastes, and the rest is designated RU. RU comes out of storage if needed to correct the U/TRU ratio in new recycled fuel. Neither RU nor DU is designated as waste. VISION is comprised of several Microsoft...
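
    As a toy illustration of the system-level, time-stepped mass balance sketched above (and emphatically not VISION itself), the snippet below advances annual stocks through fabrication, irradiation, storage, and separations; every rate and fraction is invented.

```python
# Sketch: a toy annual stock-and-flow loop in the spirit of the flows above.
stocks = {"fresh_fuel": 0.0, "in_reactor": 0.0, "used_fuel": 0.0,
          "separated_products": 0.0, "waste": 0.0}
FAB_RATE, BURN_RATE, SEP_FRACTION = 100.0, 100.0, 0.8   # tHM/yr, tHM/yr, fraction

for year in range(1, 11):
    stocks["fresh_fuel"] += FAB_RATE                    # fuel fabrication
    load = min(stocks["fresh_fuel"], BURN_RATE)         # load reactors
    stocks["fresh_fuel"] -= load
    stocks["in_reactor"] += load
    discharged = stocks["in_reactor"]                   # toy one-year residence
    stocks["in_reactor"] = 0.0
    stocks["used_fuel"] += discharged                   # used-fuel storage buffer
    sep = SEP_FRACTION * stocks["used_fuel"]            # pull a share to separations
    stocks["used_fuel"] -= sep
    stocks["separated_products"] += 0.99 * sep          # recovered products
    stocks["waste"] += 0.01 * sep                       # separation losses to waste

print({k: round(v, 1) for k, v in stocks.items()})
```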

  14. Verified Representations of Landau's "Grundlagen" in the lambda-delta Family and in the Calculus of Constructions

    Directory of Open Access Journals (Sweden)

    Ferruccio Guidi

    2016-01-01

    Landau's "Grundlagen der Analysis", formalized in the language Aut-QE, represents an early milestone in computer-checked mathematics and is the only non-trivial development finalized in the languages of the Automath family. Here we discuss an implemented procedure producing a faithful representation of the Grundlagen in the Calculus of Constructions, verified by the proof assistant Coq 8.4.3. The point at issue is distinguishing lambda-abstractions from pi-abstractions where the original text uses Automath unified binders, taking care of the cases in which a binder corresponds to both abstractions at one time. It is a fact that some binders can be disambiguated only by verifying the Grundlagen in a calculus accepting both Aut-QE and the Calculus of Constructions. To this end, we rely on lambda-delta version 3, a system that the author is proposing here for the first time.
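
    The binder ambiguity at issue can be stated in a few lines of a modern proof assistant. The Lean 4 fragment below is only an illustration (the paper targets Coq 8.4.3): an Automath unified binder must be rendered either as the Pi-abstraction appearing in a type or as the lambda-abstraction appearing in a term.

```lean
-- Illustration: the two readings of a unified binder in the
-- Calculus of Constructions.
def idFun : (A : Type) → A → A :=   -- Π-abstraction: a dependent function type
  fun A a => a                      -- λ-abstraction: the function term itself
```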

  15. Verifying detailed fluctuation relations for discrete feedback-controlled quantum dynamics

    Science.gov (United States)

    Camati, Patrice A.; Serra, Roberto M.

    2018-04-01

    Discrete quantum feedback control consists of a managed dynamics according to the information acquired by a previous measurement. Energy fluctuations along such dynamics satisfy generalized fluctuation relations, which are useful tools to study the thermodynamics of systems far away from equilibrium. Due to the practical challenge of assessing energy fluctuations in the quantum scenario, the experimental verification of detailed fluctuation relations in the presence of feedback control remains elusive. We present a feasible method to experimentally verify detailed fluctuation relations for discrete feedback control quantum dynamics. Two detailed fluctuation relations are developed and employed. The method is based on a quantum interferometric strategy that allows the verification of fluctuation relations in the presence of feedback control. An analytical example to illustrate the applicability of the method is discussed. The comprehensive technique introduced here can be experimentally implemented at a microscale with the current technology in a variety of experimental platforms.
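
    For orientation, one standard generalized fluctuation relation for feedback-controlled processes is the Sagawa-Ueda equality shown below, where W is the work, ΔF the free-energy difference, and I the mutual information acquired by the measurement; the two detailed relations developed in the paper itself are not reproduced here.

```latex
% Integral fluctuation relation with feedback and its second-law corollary.
\begin{align}
  \bigl\langle e^{-\beta (W - \Delta F) - I} \bigr\rangle = 1
  \qquad\Longrightarrow\qquad
  \langle W \rangle \,\ge\, \Delta F - k_{\mathrm{B}} T \,\langle I \rangle .
\end{align}
```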

  16. Long-term mortality and causes of death in endoscopically verified upper gastrointestinal bleeding: comparison of bleeding patients and population controls.

    Science.gov (United States)

    Miilunpohja, S; Jyrkkä, J; Kärkkäinen, J M; Kastarinen, H; Heikkinen, M; Paajanen, H; Rantanen, T; Hartikainen, Jek

    2017-11-01

    Upper gastrointestinal bleeding (UGIB) is a common emergency, with in-hospital mortality between 3 and 14%. However, the long-term mortality and causes of death are unknown. We investigated the long-term mortality and causes of death in UGIB patients in a retrospective single-centre case-control study design. A total of 569 consecutive patients, aged ≥18 years, admitted to Kuopio University Hospital for their first endoscopically verified UGIB during the years 2009-2011 were identified from hospital records. For each UGIB patient, an age, sex and hospital district matched control patient was identified from the Statistics Finland database. Data on endoscopy procedures, laboratory values, comorbidities and medication were obtained from patient records. Data on deaths and causes of death were obtained from Statistics Finland. In-hospital mortality of UGIB patients was low at 3.3%. The long-term (mean follow-up 32 months) mortality of UGIB patients was significantly higher than that of controls (34.1 versus 12.1%). The risk of death compared to controls was highest shortly after the bleeding episode (HR 19.2, 95% CI 7.0-52.4). Causes of death were related to comorbidities and did not differ from causes of death in controls. UGIB patients have three times higher long-term mortality than population controls.

  17. Neuropsychological Testing in Pathologically Verified Alzheimer Disease and Frontotemporal Dementia: How Well Do the Uniform Data Set Measures Differentiate Between Diseases?

    Science.gov (United States)

    Ritter, Aaron R; Leger, Gabriel C; Miller, Justin B; Banks, Sarah J

    2017-01-01

    Differences in cognition between frontotemporal dementia (FTD) and Alzheimer disease (AD) are well described in clinical cohorts, but have rarely been confirmed in studies with pathologic verification. For emerging therapeutics to succeed, determining underlying pathology early in the disease course is increasingly important. Neuropsychological evaluation is an important component of the diagnostic workup for AD and FTD. Patients with FTD are thought to have greater deficits in language and executive function while patients with AD are more likely to have deficits in memory. To determine if performance on initial cognitive testing can reliably distinguish between patients with frontotemporal lobar degeneration (FTLD) and AD neuropathology. In addition, are there other factors of the neuropsychological assessment that can be used to enhance the accuracy of underlying pathology? Using a logistic regression we retrospectively compared neurocognitive performance on initial evaluation of 106 patients with pathologically verified FTLD (pvFTLD), with 558 pathologically verified AD (pvAD) patients from the National Alzheimer's Coordinating Center using data from the Uniform Data Set (UDS) and the neuropathology data set. As expected, pvFTLD patients were younger, demonstrated better memory performance, and had more neuropsychiatric symptoms than pvAD patients. Other results were less predictable: pvFTLD patients performed better on one test of executive function (trail making test part B) but worse on another (digit span backward). Performance on language testing did not strongly distinguish the 2 groups. To determine what factors led to a misdiagnosis of AD in patients with FTLD, we further analyzed a small group of pvFTLD patients. These patients demonstrated older age and lower Neuropsychiatric Inventory Questionnaire counts compared with accurately diagnosed cases. Other than memory, numerical scores of neurocognitive performance on the UDS are of limited value in

  18. Evaluation of the WinROP system for identifying retinopathy of prematurity in Czech preterm infants.

    Science.gov (United States)

    Timkovic, Juraj; Pokryvkova, Martina; Janurova, Katerina; Barinova, Denisa; Polackova, Renata; Masek, Petr

    2017-03-01

    Retinopathy of Prematurity (ROP) is a potentially serious condition that can afflict preterm infants. Timely and correct identification of individuals at risk of developing a serious form of ROP is therefore of paramount importance. WinROP is an online system for predicting ROP based on birth weight and weight increments. However, the results vary significantly for various populations. It has not been evaluated in the Czech population. This study evaluates the test characteristics (specificity, sensitivity, positive and negative predictive values) of the WinROP system in Czech preterm infants. Data on 445 prematurely born infants included in the ROP screening program at the University Hospital Ostrava, Czech Republic, were retrospectively entered into the WinROP system and the outcomes of the WinROP and regular screening were compared. All 24 infants who developed high-risk (Type 1 or Type 2) ROP were correctly identified by the system. The sensitivity and negative predictive values for this group were 100%. However, the specificity and positive predictive values were substantially lower, resulting in a large number of false positives. Extending the analysis to low risk ROP, the system did not provide such reliable results. The system is a valuable tool for identifying infants who are not likely to develop high-risk ROP and this could help to substantially reduce the number of preterm infants in need of regular ROP screening. It is not suitable for predicting the development of less serious forms of ROP which is however in accordance with the declared aims of the WinROP system.

  19. International Reactor Physics Experiment Evaluation (IRPhE) Project. IRPhE Handbook - 2015 edition

    International Nuclear Information System (INIS)

    Bess, John D.; Gullifor, Jim

    2015-03-01

    The purpose of the International Reactor Physics Experiment Evaluation (IRPhE) Project is to provide an extensively peer-reviewed set of reactor physics-related integral data that can be used by reactor designers and safety analysts to validate the analytical tools used to design next-generation reactors and establish the safety basis for operation of these reactors. This work of the IRPhE Project is formally documented in the 'International Handbook of Evaluated Reactor Physics Benchmark Experiments', a single source of verified and extensively peer-reviewed reactor physics benchmark measurements data. The evaluation process entails the following steps: identify a comprehensive set of reactor physics experimental measurements data; evaluate the data and quantify overall uncertainties through various types of sensitivity analysis to the extent possible; verify the data by reviewing original and subsequently revised documentation, and by talking with the experimenters or individuals who are familiar with the experimental facility; compile the data into a standardized format; perform calculations of each experiment with standard reactor physics codes where it would add information; and formally document the work into a single source of verified and peer-reviewed reactor physics benchmark measurements data. The International Handbook of Evaluated Reactor Physics Benchmark Experiments contains reactor physics benchmark specifications that have been derived from experiments that were performed at nuclear facilities around the world. The benchmark specifications are intended for use by reactor designers, safety analysts and nuclear data evaluators to validate calculation techniques and data. Example calculations are presented; these do not constitute a validation or endorsement of the codes or cross-section data. The 2015 edition of the International Handbook of Evaluated Reactor Physics Benchmark Experiments contains data from 143 experimental series that were...

  20. Mechanisms of change in psychotherapy for depression : An empirical update and evaluation of research aimed at identifying psychological mediators

    NARCIS (Netherlands)

    Lemmens, L.H.J.M.; Müller, V.N.L.S.; Arntz, A.; Huibers, M.J.H.

    2016-01-01

    We present a systematic empirical update and critical evaluation of the current status of research aimed at identifying a variety of psychological mediators in various forms of psychotherapy for depression. We summarize study characteristics and results of 35 relevant studies, and discuss the extent

  1. An audit of the global carbon budget: identifying and reducing sources of uncertainty

    Science.gov (United States)

    Ballantyne, A. P.; Tans, P. P.; Marland, G.; Stocker, B. D.

    2012-12-01

    Uncertainties in our carbon accounting practices may limit our ability to objectively verify emission reductions on regional scales. Furthermore, uncertainties in the global C budget must be reduced to benchmark Earth System Models that incorporate carbon-climate interactions. Here we present an audit of the global C budget in which we try to identify the sources of uncertainty for its major terms. The atmospheric growth rate of CO2 has increased significantly over the last 50 years, while the uncertainty in calculating the global atmospheric growth rate has been reduced from 0.4 ppm/yr to 0.2 ppm/yr (95% confidence). Although we have greatly reduced global CO2 growth rate uncertainties, there remain regions, such as the Southern Hemisphere, Tropics and Arctic, where changes in regional sources/sinks will remain difficult to detect without additional observations. Increases in fossil fuel (FF) emissions are the primary factor driving the increase in the global CO2 growth rate; however, our confidence in FF emission estimates has actually gone down. Based on a comparison of multiple estimates, FF emissions have increased from 2.45 ± 0.12 PgC/yr in 1959 to 9.40 ± 0.66 PgC/yr in 2010. Major sources of increasing FF emission uncertainty are increased emissions from emerging economies, such as China and India, as well as subtle differences in accounting practices. Lastly, we evaluate emission estimates from Land Use Change (LUC). Although relative errors in emission estimates from LUC are quite high (2σ ≈ 50%), LUC emissions have remained fairly constant in recent decades. We evaluate the three commonly used approaches to estimating LUC emissions (bookkeeping, satellite imagery, and model simulations) to identify their main sources of error and their ability to detect net emissions from LUC.

  2. An improved system to verify CANDU spent fuel elements in dry storage silos

    International Nuclear Information System (INIS)

    Almeida, Gevaldo L. de; Soares, Milton G.; Filho, Anizio M.; Martorelli, Daniel S.; Fonseca, Manoel

    2000-01-01

    An improved system to verify CANDU spent fuel elements stored in dry storage silos was developed. It consists of a mechanical device, which moves a semiconductor detector along a vertical verification pipe incorporated into the silo, and a modified portable multi-channel analyzer. The mechanical device contains a winding drum accommodating a cable from which the detector hangs, in such a way that the drum rotates as the detector descends under its own weight. The detector is coupled to the multi-channel analyzer operating in multi-scaler mode, generating a spectrum of total counts against time. To assure a linear transformation of time into detector position, the mechanical device dictating the detector speed is controlled by the multi-channel analyzer. This control is performed via a clock-type escapement device activated by a solenoid. Whenever the multi-channel analyzer shifts to the next channel, the associated pulse is amplified, powering the solenoid and causing the drum to rotate a fixed angle. Spectra taken in the laboratory using radioactive sources have shown good reproducibility. This qualifies the system as equipment to obtain a fingerprint of the overall distribution of the fuel elements along the silo axis and, hence, to verify possible diversion of the nuclear material by comparing spectra taken at consecutive safeguards inspections. The whole system is battery operated and is thus capable of operating in the field where no power supply is available. (author)

  3. An improved system to verify CANDU spent fuel elements in dry storage silos

    Energy Technology Data Exchange (ETDEWEB)

    Almeida, Gevaldo L. de; Soares, Milton G.; Filho, Anizio M.; Martorelli, Daniel S.; Fonseca, Manoel [Instituto de Engenharia Nuclear (IEN), Rio de Janeiro, RJ (Brazil)

    2000-07-01

    An improved system to verify CANDU spent fuel elements stored in dry storage silos was developed. It consists of a mechanical device, which moves a semiconductor detector along a vertical verification pipe incorporated into the silo, and a modified portable multi-channel analyzer. The mechanical device contains a winding drum accommodating a cable from which the detector hangs, in such a way that the drum rotates as the detector descends under its own weight. The detector is coupled to the multi-channel analyzer operating in multi-scaler mode, generating a spectrum of total counts against time. To assure a linear transformation of time into detector position, the mechanical device dictating the detector speed is controlled by the multi-channel analyzer. This control is performed via a clock-type escapement device activated by a solenoid. Whenever the multi-channel analyzer shifts to the next channel, the associated pulse is amplified, powering the solenoid and causing the drum to rotate a fixed angle. Spectra taken in the laboratory using radioactive sources have shown good reproducibility. This qualifies the system as equipment to obtain a fingerprint of the overall distribution of the fuel elements along the silo axis and, hence, to verify possible diversion of the nuclear material by comparing spectra taken at consecutive safeguards inspections. The whole system is battery operated and is thus capable of operating in the field where no power supply is available. (author)
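
    The fingerprint comparison implied by both records above reduces to correlating two count-versus-position profiles from successive inspections. The sketch below uses synthetic profiles with four source peaks; a correlation near 1 suggests an unchanged axial distribution.

```python
# Sketch: compare two axial count profiles taken at successive inspections.
import numpy as np

z = np.linspace(0.0, 5.0, 256)                     # detector position (m), invented
baseline = sum(np.exp(-((z - c) ** 2) / 0.02) for c in (1.0, 2.0, 3.0, 4.0))
noise = 1 + 0.03 * np.random.default_rng(1).normal(size=z.size)
inspection = baseline * noise                      # re-measured profile with counting noise

r = np.corrcoef(baseline, inspection)[0, 1]
print(f"fingerprint correlation: {r:.4f}")         # ~1.0 -> no apparent diversion
```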

  4. Evaluating predictive models for solar energy growth in the US states and identifying the key drivers

    Science.gov (United States)

    Chakraborty, Joheen; Banerji, Sugata

    2018-03-01

    Driven by a desire to control climate change and reduce dependence on fossil fuels, governments around the world are increasing the adoption of renewable energy sources. However, among the US states, we observe a wide disparity in renewable penetration. In this study, we have identified and cleaned over a dozen datasets representing solar energy penetration in each US state, and the potentially relevant socioeconomic and other factors that may be driving the growth in solar. We have applied a number of predictive modeling approaches - including machine learning and regression - to these datasets over a 17-year period and evaluated the relative performance of the models. Our goals were: (1) identify the most important factors that are driving the growth in solar, (2) choose the most effective predictive modeling technique for solar growth, and (3) develop a model for predicting next year's solar growth using this year's data. We obtained very promising results with random forests (about 90% efficacy) and varying degrees of success with support vector machines and regression techniques (linear, polynomial, ridge). We also identified states with solar growth slower than expected and representing a potential for stronger growth in future.
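
    Below is a sketch of the best-performing approach reported above: a random forest regressor mapping this year's state-level features to next year's solar growth. The features, data, and scores are synthetic stand-ins for the study's datasets.

```python
# Sketch: random-forest prediction of next-year solar growth from
# this-year state-level features (all synthetic).
import numpy as np
from sklearn.ensemble import RandomForestRegressor
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(42)
n = 50 * 17                                  # ~states x years of observations
X = rng.normal(size=(n, 5))                  # e.g. price, income, policy, insolation, GDP
y = 2.0 * X[:, 0] - 1.0 * X[:, 3] + rng.normal(scale=0.5, size=n)

X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)
model = RandomForestRegressor(n_estimators=200, random_state=0).fit(X_tr, y_tr)

print("R^2 on held-out data:", round(model.score(X_te, y_te), 3))
print("feature importances:", np.round(model.feature_importances_, 3))
```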

  5. Standard test method for verifying the alignment of X-Ray diffraction instrumentation for residual stress measurement

    CERN Document Server

    American Society for Testing and Materials. Philadelphia

    2010-01-01

    1.1 This test method covers the preparation and use of a flat stress-free test specimen for the purpose of checking the systematic error caused by instrument misalignment or sample positioning in X-ray diffraction residual stress measurement, or both. 1.2 This test method is applicable to apparatus intended for X-ray diffraction macroscopic residual stress measurement in polycrystalline samples employing measurement of a diffraction peak position in the high-back reflection region, and in which the θ, 2θ, and ψ rotation axes can be made to coincide (see Fig. 1). 1.3 This test method describes the use of iron powder which has been investigated in round-robin studies for the purpose of verifying the alignment of instrumentation intended for stress measurement in ferritic or martensitic steels. To verify instrument alignment prior to stress measurement in other metallic alloys and ceramics, powder having the same or lower diffraction angle as the material to be measured should be prepared in similar fashion...

  6. Why so many "rigorous" evaluations fail to identify unintended consequences of development programs: How mixed methods can contribute.

    Science.gov (United States)

    Bamberger, Michael; Tarsilla, Michele; Hesse-Biber, Sharlene

    2016-04-01

    Many widely-used impact evaluation designs, including randomized control trials (RCTs) and quasi-experimental designs (QEDs), frequently fail to detect what are often quite serious unintended consequences of development programs. This seems surprising, as experienced planners and evaluators are well aware that unintended consequences frequently occur. Most evaluation designs are intended to determine whether there is credible evidence (statistical, theory-based or narrative) that programs have achieved their intended objectives, and the logic of many evaluation designs, even those that are considered the most "rigorous," does not permit the identification of outcomes that were not specified in the program design. We take the example of RCTs as they are considered by many to be the most rigorous evaluation designs. We present a number of cases to illustrate how infusing RCTs with a mixed-methods approach (sometimes called an "RCT+" design) can strengthen the credibility of these designs and can also capture important unintended consequences. We provide a Mixed Methods Evaluation Framework that identifies nine ways in which unintended consequences can occur, and we apply this framework to two of the case studies. Copyright © 2016 Elsevier Ltd. All rights reserved.

  7. Instructor and course evaluation based on student-identified criteria.

    Science.gov (United States)

    Jackson, M O

    1977-02-01

    Students have come to school for an education, and it is their right to evaluate the quality of the education they are receiving. They should not have to demand or even ask for the privilege of saying what they think. Instructors should be providing the opportunity for evaluation by requesting that information from the students. No value judgment can be totally objective, but an instrument composed of mutually agreed upon statements should encourage the greatest possible degree of objectivity. Using one accepted form throughout the school, all students would be considering the same characteristics and traits for every instructor and course evaluated. Each instructor would receive similar information about personal performance and about the course presented. Students would be free to talk to the faculty or to add comments if they so desired, but a questionnaire used in every course would allow and even encourage responses from every student enrolled. Faculty responsibility would not end with the preparation and implementation of an evaluation instrument. Instructors would have to let the students know their opinions are important and will be considered in curricular and instructional decisions.

  8. Verifying Elimination Programs with a Special Emphasis on Cysticercosis Endpoints and Postelimination Surveillance

    Directory of Open Access Journals (Sweden)

    Sukwan Handali

    2012-01-01

    Methods are needed for determining program endpoints or postprogram surveillance for any elimination program. Cysticercosis has the necessary effective strategies and diagnostic tools for establishing an elimination program; however, tools to verify program endpoints have not been determined. Using a statistical approach, the present study proposed that taeniasis and porcine cysticercosis antibody assays could be used to determine with a high statistical confidence whether an area is free of disease. Confidence would be improved by using secondary tests such as the taeniasis coproantigen assay and necropsy of the sentinel pigs.

  9. Floorball game skills (evaluation criteria)

    OpenAIRE

    Chlumský, Marek

    2013-01-01

    Title: Playing skills in floorball (evaluation criteria). Objective: To create a list of the playing skills an ideal player should demonstrate, to find and verify evaluation criteria for these skills, and to inspire trainers to develop these skills in the best way. Methods: Informal interviews, individually structured interviews, analysis and verification of data, pilot testing. Results: defined playing skills in floorball, a developed scale of values for floorball playing skills, creation of exercis...

  10. Transcriptomic analysis of rice aleurone cells identified a novel abscisic acid response element.

    Science.gov (United States)

    Watanabe, Kenneth A; Homayouni, Arielle; Gu, Lingkun; Huang, Kuan-Ying; Ho, Tuan-Hua David; Shen, Qingxi J

    2017-09-01

    Seeds serve as a great model to study plant responses to drought stress, which is largely mediated by abscisic acid (ABA). The ABA responsive element (ABRE) is a key cis-regulatory element in ABA signalling. However, its consensus sequence (ACGTG(G/T)C) is present in the promoters of only about 40% of ABA-induced genes in rice aleurone cells, suggesting other ABREs may exist. To identify novel ABREs, RNA sequencing was performed on aleurone cells of rice seeds treated with 20 μM ABA. Gibbs sampling was used to identify enriched elements, and particle bombardment-mediated transient expression studies were performed to verify the function. Gene ontology analysis was performed to predict the roles of genes containing the novel ABREs. This study revealed 2443 ABA-inducible genes and a novel ABRE, designated as ABREN, which was experimentally verified to mediate ABA signalling in rice aleurone cells. Many of the ABREN-containing genes are predicted to be involved in stress responses and transcription. Analysis of other species suggests that the ABREN may be monocot specific. This study also revealed interesting expression patterns of genes involved in ABA metabolism and signalling. Collectively, this study advanced our understanding of diverse cis-regulatory sequences and the transcriptomes underlying ABA responses in rice aleurone cells. © 2017 John Wiley & Sons Ltd.
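
    Scanning promoters for the classical ABRE consensus is a one-line pattern match, shown below on invented sequences; discovering the novel ABREN in the paper relied on Gibbs sampling over ABA-induced promoters, which is not reproduced here.

```python
# Sketch: locate the classical ABRE consensus ACGTG(G/T)C in promoter sequences.
import re

ABRE = re.compile(r"ACGTG[GT]C")

promoters = {
    "geneA": "TTACGTGGCATTTACGTGTCGG",   # invented sequence with two ABRE hits
    "geneB": "GGGATCCTTAAGGCCTT",        # invented sequence with no hit
}

for gene, seq in promoters.items():
    hits = [m.start() for m in ABRE.finditer(seq)]
    print(gene, "ABRE positions:", hits)
```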

  11. Per-service supervised learning for identifying desired WoT apps from user requests in natural language.

    Directory of Open Access Journals (Sweden)

    Young Yoon

    Web of Things (WoT) platforms are growing fast, and so are the needs for composing WoT apps more easily and efficiently. We have recently commenced a campaign to develop an interface where users can issue requests for WoT apps entirely in natural language. This requires an effort to build a system that can learn to identify the relevant WoT functions that fulfill a user's request. In our preceding work, we trained a supervised learning system with thousands of publicly available IFTTT app recipes based on conditional random fields (CRF). However, the sub-par accuracy and excessive training time motivated us to devise a better approach. In this paper, we present a novel solution that creates a separate learning engine for each trigger service. With this approach, parallel and incremental learning become possible. For inference, our system first identifies the most relevant trigger service for a given user request by using an information retrieval technique. Then, the learning engine associated with the trigger service predicts the most likely pair of trigger and action functions. We expect that such a two-phase inference method with parallel learning engines improves the accuracy of identifying related WoT functions. We verify our new solution through an empirical evaluation with training and test sets sampled from a pool of refined IFTTT app recipes. We also meticulously analyze the characteristics of the recipes to find future research directions.

  12. Per-service supervised learning for identifying desired WoT apps from user requests in natural language.

    Science.gov (United States)

    Yoon, Young

    2017-01-01

    Web of Things (WoT) platforms are growing fast, and so are the needs for composing WoT apps more easily and efficiently. We have recently commenced a campaign to develop an interface where users can issue requests for WoT apps entirely in natural language. This requires an effort to build a system that can learn to identify relevant WoT functions that fulfill users' requests. In our preceding work, we trained a supervised learning system with thousands of publicly available IFTTT app recipes based on conditional random fields (CRF). However, the sub-par accuracy and excessive training time motivated us to devise a better approach. In this paper, we present a novel solution that creates a separate learning engine for each trigger service. With this approach, parallel and incremental learning become possible. For inference, our system first identifies the most relevant trigger service for a given user request by using an information retrieval technique. Then, the learning engine associated with that trigger service predicts the most likely pair of trigger and action functions. We expect that such a two-phase inference method, given parallel learning engines, would improve the accuracy of identifying related WoT functions. We verify our new solution through an empirical evaluation with training and test sets sampled from a pool of refined IFTTT app recipes. We also meticulously analyze the characteristics of the recipes to find future research directions.
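
    The two-phase inference described in these records can be sketched compactly. The Python fragment below is a hedged stand-in rather than the authors' system: a bag-of-words retrieval step picks the most relevant trigger service, and a per-service nearest-neighbour "engine" (standing in for the trained CRF learners) predicts the (trigger, action) pair; the recipe data are invented:

      from collections import Counter
      import math

      # Hypothetical per-service training data:
      # service -> [(request, (trigger_fn, action_fn))].
      RECIPES = {
          "weather": [("text me when it rains", ("rain_detected", "send_sms"))],
          "email":   [("save email attachments to my drive", ("new_attachment", "upload_file"))],
      }

      def bag(text):
          # Bag-of-words representation of a request.
          return Counter(text.lower().split())

      def cosine(a, b):
          # Cosine similarity between two bags of words.
          dot = sum(a[t] * b[t] for t in a)
          na = math.sqrt(sum(v * v for v in a.values()))
          nb = math.sqrt(sum(v * v for v in b.values()))
          return dot / (na * nb) if na and nb else 0.0

      def retrieve_service(request):
          # Phase 1: pick the most relevant trigger service by retrieval.
          profiles = {s: bag(" ".join(r for r, _ in ex)) for s, ex in RECIPES.items()}
          return max(profiles, key=lambda s: cosine(bag(request), profiles[s]))

      def predict(request):
          # Phase 2: the per-service engine (nearest neighbour here, standing
          # in for a trained CRF) predicts the (trigger, action) pair.
          service = retrieve_service(request)
          _, pair = max(RECIPES[service],
                        key=lambda ex: cosine(bag(request), bag(ex[0])))
          return service, pair

      print(predict("notify me by sms when rain starts"))

    Because each service owns its own engine and training data, engines can be trained in parallel and updated incrementally without retraining the rest, which is the design motivation the abstract describes.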

  13. 24 CFR 5.218 - Penalties for failing to disclose and verify Social Security and Employer Identification Numbers.

    Science.gov (United States)

    2010-04-01

    ... and verify Social Security and Employer Identification Numbers. 5.218 Section 5.218 Housing and Urban... REQUIREMENTS; WAIVERS Disclosure and Verification of Social Security Numbers and Employer Identification Numbers; Procedures for Obtaining Income Information Disclosure and Verification of Social Security...

  14. Alternate approaches to verifying the structural adequacy of the Defense High Level Waste Shipping Cask

    International Nuclear Information System (INIS)

    Zimmer, A.; Koploy, M.

    1991-12-01

    In the early 1980s, the US Department of Energy/Defense Programs (DOE/DP) initiated a project to develop a safe and efficient transportation system for defense high level waste (DHLW). A long-standing objective of the DHLW transportation project is to develop a truck cask that represents the leading edge of cask technology as well as one that fully complies with all applicable DOE, Nuclear Regulatory Commission (NRC), and Department of Transportation (DOT) regulations. General Atomics (GA) designed the DHLW Truck Shipping Cask using state-of-the-art analytical techniques verified by model testing performed by Sandia National Laboratories (SNL). The analytical techniques include two approaches: inelastic analysis and elastic analysis. This topical report presents the results of the two analytical approaches and the model testing results. The purpose of this work is to show that there are two viable analytical alternatives to verify the structural adequacy of a Type B package and to obtain an NRC license. In addition, these data will help support the future acceptance by the NRC of inelastic analysis as a tool in packaging design and licensing.

  15. Verifying large modular systems using iterative abstraction refinement

    International Nuclear Information System (INIS)

    Lahtinen, Jussi; Kuismin, Tuomas; Heljanko, Keijo

    2015-01-01

    Digital instrumentation and control (I&C) systems are increasingly used in the nuclear engineering domain. The exhaustive verification of these systems is challenging, and the usual verification methods such as testing and simulation are typically insufficient. Model checking is a formal method that is able to exhaustively analyse the behaviour of a model against a formally written specification. If the model checking tool detects a violation of the specification, it will give out a counter-example that demonstrates how the specification is violated in the system. Unfortunately, sometimes real life system designs are too big to be directly analysed by traditional model checking techniques. We have developed an iterative technique for model checking large modular systems. The technique uses abstraction based over-approximations of the model behaviour, combined with iterative refinement. The main contribution of the work is the concrete abstraction refinement technique based on the modular structure of the model, the dependency graph of the model, and a refinement sampling heuristic similar to delta debugging. The technique is geared towards proving properties, and outperforms BDD-based model checking, the k-induction technique, and the property directed reachability algorithm (PDR) in our experiments. - Highlights: • We have developed an iterative technique for model checking large modular systems. • The technique uses BDD-based model checking, k-induction, and PDR in parallel. • We have tested our algorithm by verifying two models with it. • The technique outperforms classical model checking methods in our experiments
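
    The iteration described in this record follows the general abstraction-refinement pattern. The Python skeleton below is a hedged sketch: model_check, is_spurious, and the module structures are hypothetical stand-ins for a real model-checking back-end, and the refinement step here is a plain dependency-graph expansion rather than the paper's delta-debugging-like sampling heuristic:

      def verify(prop, all_modules, dependency_graph, model_check, is_spurious):
          # Start from the modules the property directly refers to.
          abstraction = set(prop.referenced_modules)
          while True:
              # The abstraction over-approximates behaviour, so proofs are sound.
              result = model_check(prop, abstraction)
              if result.proved:
                  return "property holds"
              if not is_spurious(result.counterexample, all_modules):
                  return "property violated"
              # Refine: pull in neighbouring modules from the dependency graph
              # that the spurious counterexample depends on.
              frontier = {d for m in abstraction for d in dependency_graph[m]}
              new_modules = frontier - abstraction
              if not new_modules:
                  return "inconclusive"  # nothing left to refine
              abstraction |= new_modules

    The loop is geared towards proving properties: a proof on the over-approximation is conclusive, while a counterexample must be checked against the concrete model before it can be trusted.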

  16. Testing and evaluation of existing techniques for identifying uptakes and measuring retention of uranium in mill workers

    International Nuclear Information System (INIS)

    1983-03-01

    Preliminary tests and evaluations of existing bio-analytical techniques for identifying uptakes and measuring retention of uranium in mill workers were made at two uranium mills. Urinalysis tests were found to be more reliable indicators of uranium uptakes than personal air sampling. Static air samples were not found to be good indicators of personal uptakes. In vivo measurements of uranium in lung were successfully carried out in the presence of high and fluctuating background radiation. Interference from external contamination was common during end of shift measurements. A full scale study to evaluate model parameters for the uptake, retention and elimination of uranium should include, in addition to the above techniques, particle size determination of airborne uranium, solubility in simulated lung fluid, uranium analysis in faeces and bone and minute volume measurements for each subject

  17. Evaluating vibration performance of a subsea pump module by full-scale testing and numerical modelling

    NARCIS (Netherlands)

    Beek, P.J.G. van; Pereboom, H.P.; Slot, H.J.

    2016-01-01

    Prior to subsea installation, a subsea system has to be tested to verify whether it performs in accordance with specifications and component specific performance evaluation criteria. It is important to verify that the assembled components work in accordance with the assumptions and design criteria

  18. Evaluation and proposal of improvement for the measurement system in ATLAS

    International Nuclear Information System (INIS)

    Cho, Dong Woo; Kim, Jong Rok; Park, Jun Kwon

    2007-03-01

    The project independently evaluated the validity and reliability of the measurement system in ATLAS and proposed plans to improve the system based on the evaluation results. To this end, we evaluated the design, technical background, and verification data of the ATLAS measurement system. From this evaluation, we proposed improvement plans for the parts that need improvement.

  19. Study and survey of assembling parameters to a radioactive source production laboratory used to verify equipment

    International Nuclear Information System (INIS)

    Gauglitz, Erica

    2010-01-01

    This paper presents a survey of parameters for proper and safe flooring, doors, windows, fume hoods and other items in a radiochemical laboratory. The layout of each item follows the guidelines and national standards of the National Commission of Nuclear Energy (CNEN) and the International Atomic Energy Agency (IAEA), aiming to ensure the radiological protection of workers and the environment. An adequate arrangement of items in the radiochemical laboratory ensures quality and safety in the production of 57Co, 137Cs and 133Ba radioactive sealed sources, with activities of 185, 9.3 and 5.4 MBq, respectively. These sources are used to verify activity meter equipment and should be available throughout the Nuclear Medicine Center, following the recommendations of standard CNEN-NN-3.05, "Requirements for Radiation Protection and Safety Services for Nuclear Medicine", to verify the activity of radiopharmaceuticals administered to patients for diagnosis and therapy. Verification of activity measuring equipment will include accuracy, reproducibility and linearity tests, which should show results within the limits specified in standard CNEN-NN-3.05. (author)

  20. EMAS in Germany. Evaluation 2012; EMAS in Deutschland. Evaluierung 2012

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    2013-03-15

    Since the 1990s, the Eco-Management and Audit Scheme (EMAS) has been established in the European Union. Arqum GmbH (Munich, Germany) and Infra dimap (Berlin, Germany) performed the survey "EMAS in Germany - Evaluation 2012" at 573 EMAS organizations in the period between March 2012 and July 2012 in order to gain insight into the current EMAS practice in German organizations and to identify potentials for the future development of the EMAS system. The survey covered the following topics: (1) Reasons for participation of the companies and organizations in the EMAS system; (2) Cost-effectiveness of EMAS at the site of the organization; (3) Experiences with the environmental statement; (4) Experiences after the last EMAS amendment (EMAS III); (5) Experience with the environmental verifier and the validation process; (6) Evaluation of the continuation of the EMAS system; (7) Requests to the environmental policy. The main results of this survey are presented in the contribution under consideration.

  1. Proposed procedure and analysis of results to verify the dose-area product indicator in radiology equipment

    International Nuclear Information System (INIS)

    Garcia Marcos, R.; Gallego Franco, P.; Sierra Diaz, F.; Gonzalez Ruiz, C.; Rodriguez Checa, M.; Brasa Estevez, M.; Gomez Calvar, R.

    2013-01-01

    The aim of this work is to establish a procedure to verify the dose-area product value displayed by certain radiology units, as an alternative to the use of external transmission chambers. (Author)

  2. Development of a TL personal dosimeter capable of identifying PA exposure, and comparison with commercial TL dosimeters

    International Nuclear Information System (INIS)

    Kwon, J.W.; Kim, H.K.; Lee, J.K.; Kim, J.L.

    2004-01-01

    A single dosimeter worn on the anterior surface of the body of a worker was found to significantly underestimate the effective dose to the worker when the radiation comes from the back. Several researchers suggested that this sort of underestimation can be corrected to a certain extent by using an extra dosimeter on the back. However, use of multiple dosimeters also has disadvantages, such as complication in control or incurrence of extra cost. Instead of the common multi-dosimeter approach, in this study a single dosimeter with asymmetric filters that made it possible to identify PA exposure was designed, and its dose evaluation algorithm for AP-PA mixed radiation fields was established. A prototype TL personal dosimeter was designed and constructed. Monte Carlo simulations were utilized in the design process and verified by experiments. The dosimeter and algorithm were applicable to photon radiation having an effective energy beyond 100 keV in AP-PA mixed radiation fields. A simplified performance test based on ANSI N13.11 showed satisfactory results. Considering that the requirements of the International Electrotechnical Commission (IEC) and the American National Standards Institute (ANSI) on the angular dependency of dosimeters are being reinforced, the dosimeter and the dose evaluation algorithm developed in this study provide a useful approach in practical personal dosimetry against inhomogeneous high energy radiation fields. (author)

  3. A Hybrid Verifiable and Delegated Cryptographic Model in Cloud Computing

    Directory of Open Access Journals (Sweden)

    Jaber Ibrahim Naser

    2018-02-01

    Full Text Available Access control is very important in cloud data sharing. Especially in domains like healthcare, it is essential to have access control mechanisms in place for confidentiality and secure data access. Attribute-based encryption has been around for many years to secure data and provide controlled access. In this paper, we propose a framework that supports a circuit- and attribute-based encryption mechanism involving multiple parties: data owner, data user, cloud server and attribute authority. An important feature of the proposed system is the verifiable delegation of the decryption process to the cloud server. The data owner encrypts data and delegates the decryption process to the cloud. The cloud server performs partial decryption, and the final decrypted data are shared with users according to their privileges. The data owner thus reduces computational complexity by delegating the decryption process to the cloud server. We built a prototype application using the Microsoft .NET platform as a proof of concept. The empirical results revealed that there is controlled access with multiple user roles and access control rights for secure and confidential data access in cloud computing.

  4. Defining and Verifying Research Grade Airborne Laser Swath Mapping (ALSM) Observations

    Science.gov (United States)

    Carter, W. E.; Shrestha, R. L.; Slatton, C. C.

    2004-12-01

    The first and primary goal of the National Science Foundation (NSF) supported Center for Airborne Laser Mapping (NCALM), operated jointly by the University of Florida and the University of California, Berkeley, is to make "research grade" ALSM data widely available at affordable cost to the national scientific community. Cost aside, researchers need to know what NCALM considers research grade data and how the quality of the data is verified, to be able to determine the likelihood that the data they receive will meet their project specific requirements. Given the current state of the technology it is reasonable to expect a well planned and executed survey to produce surface elevations with uncertainties less than 10 centimeters and horizontal uncertainties of a few decimeters. Various components of the total error are generally associated with the aircraft trajectory, aircraft orientation, or laser vectors. Aircraft trajectory error is dependent largely on the Global Positioning System (GPS) observations, aircraft orientation on Inertial Measurement Unit (IMU) observations, and laser vectors on the scanning and ranging instrumentation. In addition to the issue of the precision or accuracy of the coordinates of the surface points, consideration must also be given to the point-to-point spacing and voids in the coverage. The major sources of error produce distinct artifacts in the data set. For example, aircraft trajectory errors tend to change slowly as the satellite constellation geometry varies, producing slopes within swaths and offsets between swaths. Roll, pitch and yaw biases in the IMU observations tend to persist through whole flights, and create distinctive artifacts in the swath overlap areas. Errors in the zero-point and scale of the laser scanner cause the edges of swaths to turn up or down. Range walk errors cause offsets between bright and dark surfaces, making painted stripes appear to float above the dark road surface. The three keys to producing

  5. MUSE: An Efficient and Accurate Verifiable Privacy-Preserving Multikeyword Text Search over Encrypted Cloud Data

    Directory of Open Access Journals (Sweden)

    Zhu Xiangyang

    2017-01-01

    Full Text Available With the development of cloud computing, services outsourcing in clouds has become a popular business model. However, due to the fact that data storage and computing are completely outsourced to the cloud service provider, sensitive data of data owners is exposed, which could bring serious privacy disclosure. In addition, some unexpected events, such as software bugs and hardware failure, could cause incomplete or incorrect results returned from clouds. In this paper, we propose an efficient and accurate verifiable privacy-preserving multikeyword text search over encrypted cloud data based on hierarchical agglomerative clustering, which is named MUSE. In order to improve the efficiency of text searching, we propose a novel index structure, HAC-tree, which is based on a hierarchical agglomerative clustering method and tends to gather high-relevance documents in clusters. Based on the HAC-tree, a noncandidate pruning depth-first search algorithm is proposed, which can filter out unqualified subtrees and thus accelerate the search process. The secure inner product algorithm is used to encrypt the HAC-tree index and the query vector. Meanwhile, a completeness verification algorithm is given to verify search results. Experiment results demonstrate that the proposed method outperforms the existing works, DMRS and MRSE-HCI, in efficiency and accuracy, respectively.
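
    The pruning idea behind the HAC-tree search can be illustrated with plaintext scores. The Python sketch below is a hedged approximation: node upper bounds stand in for the cluster pruning information of the HAC-tree, and plain relevance scores stand in for the secure inner products computed over the encrypted index:

      import heapq

      class Node:
          def __init__(self, upper_bound, children=None, doc=None, score=None):
              self.upper_bound = upper_bound  # max achievable relevance in subtree
              self.children = children or []
              self.doc, self.score = doc, score  # set on leaves only

      def top_k(root, k):
          best = []  # min-heap of (score, doc) holding the current top-k
          def dfs(node):
              # Prune: this subtree cannot beat the current k-th best score.
              if len(best) == k and node.upper_bound <= best[0][0]:
                  return
              if node.doc is not None:
                  heapq.heappush(best, (node.score, node.doc))
                  if len(best) > k:
                      heapq.heappop(best)
                  return
              # Visit the most promising child first to tighten the bound early.
              for child in sorted(node.children, key=lambda c: -c.upper_bound):
                  dfs(child)
          dfs(root)
          return sorted(best, reverse=True)

      # Tiny invented tree: high-relevance documents clustered together.
      leaves = [Node(0.9, doc="d1", score=0.9), Node(0.4, doc="d2", score=0.4)]
      root = Node(0.9, children=[Node(0.9, children=leaves),
                                 Node(0.2, children=[Node(0.2, doc="d3", score=0.2)])])
      print(top_k(root, 2))  # -> [(0.9, 'd1'), (0.4, 'd2')]; the 0.2 cluster is pruned

    Clustering high-relevance documents together is what makes the bounds tight, so whole low-relevance subtrees are skipped without being decrypted or scored.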

  6. Small scale combustion of reed canary grass - inventory and evaluation of available technology; Smaaskalig foerbraenning av roerflen - inventering och vaerdering av tillgaenglig teknik

    Energy Technology Data Exchange (ETDEWEB)

    Gustavsson, Lennart; Paulrud, Susanne

    2011-07-01

    The feasibility of commercially available boilers in the interval 50 kW to 1 MW for use with reed canary grass (RCG) as fuel has been preliminarily evaluated. The capacity to handle the large ash volumes generated by RCG, both in terms of ash withdrawal and combustion quality, was used as the main criterion. Nine boilers and two burners were identified and classified on a three-step scale, from verified functioning on RCG to possible functioning with some design changes.

  7. Identifying and Evaluating Options for Improving Sediment Management and Fish Passage at Hydropower Dams in the Lower Mekong River Basin

    Science.gov (United States)

    Wild, T. B.; Reed, P. M.; Loucks, D. P.

    2015-12-01

    The Mekong River basin in Southeast Asia is undergoing intensive and pervasive hydropower development to satisfy demand for increased energy and income to support its growing population of 60 million people. Just 20 years ago this river flowed freely. Today some 30 large dams exist in the basin, and over 100 more are being planned for construction. These dams will alter the river's natural water, sediment and nutrient flows, thereby impacting river morphology and ecosystems, and will fragment fish migration pathways. In doing so, they will degrade one of the world's most valuable and productive freshwater fish habitats. For those dams that have not yet been constructed, there still exist opportunities to modify their siting, design and operation (SDO) to potentially achieve a more balanced set of tradeoffs among hydropower production, sediment/nutrient passage and fish passage. We introduce examples of such alternative SDO opportunities for Sambor Dam in Cambodia, planned to be constructed on the main stem of the Mekong River. To evaluate the performance of such alternatives, we developed a Python-based simulation tool called PySedSim. PySedSim is a daily time step mass balance model that identifies the relative tradeoffs among hydropower production, and flow and sediment regime alteration, associated with reservoir sediment management techniques such as flushing, sluicing, bypassing, density current venting and dredging. To date, there has been a very limited acknowledgement or evaluation of the significant uncertainties that impact the evaluation of SDO alternatives. This research is formalizing a model diagnostic assessment of the key assumptions and parametric uncertainties that strongly influence PySedSim SDO evaluations. Using stochastic hydrology and sediment load data, our diagnostic assessment evaluates and compares several Sambor Dam alternatives using several performance measures related to energy production, sediment trapping and regime alteration, and
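
    The core of a daily time step mass balance model such as PySedSim can be sketched in a few lines. The following Python fragment is a hedged illustration with hypothetical variable names, not PySedSim's actual code; the real model additionally represents management rules such as flushing, sluicing, bypassing and venting:

      def daily_step(storage, sediment_mass, q_in, q_out, sed_in, trap_efficiency):
          """Advance reservoir water storage and sediment mass by one day."""
          storage += q_in - q_out             # water balance [m^3]
          trapped = trap_efficiency * sed_in  # fraction of inflowing load retained
          sediment_mass += trapped            # cumulative deposited sediment [t]
          sed_out = sed_in - trapped          # load passed downstream [t]
          return storage, sediment_mass, sed_out

      state = (1.0e9, 0.0)  # initial storage [m^3], deposited sediment [t]
      storage, sediment, released = daily_step(*state, q_in=2.0e7, q_out=1.9e7,
                                               sed_in=5.0e3, trap_efficiency=0.6)
      print(storage, sediment, released)

    Running such a balance over stochastic hydrology and sediment load series is what allows the tradeoffs among hydropower, sediment trapping and flow alteration to be compared across siting, design and operation alternatives.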

  8. Evaluation of resistivity meters for concrete quality assurance.

    Science.gov (United States)

    2015-06-01

    This research evaluated a series of MoDOT concrete mixtures to verify existing relationships between surface resistivity (SR), rapid chloride permeability (RCP), chloride ion diffusion, and the AASHTO penetrability classes. The research also perfor...

  9. Evaluation of three refurbished Guralp CMG-3TB seismometers.

    Energy Technology Data Exchange (ETDEWEB)

    Hart, Darren M. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Merchant, Bion J. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States)

    2014-05-01

    The overall objective of testing the Guralp CMG-3TB refurbished seismometers is to determine whether or not the refurbished sensors exhibit better data quality and require less maintenance when deployed than the original Guralp CMG-3TBs. SNL will test these 3 refurbished Guralps to verify performance specifications. The specifications to be evaluated are sensitivity, bandwidth, self-noise, output impedance, clip-level, and dynamic range over the application passband, together with verification of the mathematical response and calibration response parameters for amplitude and phase.

  10. 30 CFR 253.27 - When I submit audited annual financial statements to verify my unencumbered assets, what...

    Science.gov (United States)

    2010-07-01

    ... financial statements to verify my unencumbered assets, what standards must they meet? Any audited annual financial statements that you submit must: (a) Meet the standards in § 253.24; and (b) Include a certification by the independent accountant who audited the financial statements that states: (1) The value of...

  11. High-Resolution Melting Curve Analysis of the 16S Ribosomal Gene to Detect and Identify Pathogenic and Saprophytic Leptospira Species in Colombian Isolates.

    Science.gov (United States)

    Peláez Sánchez, Ronald G; Quintero, Juan Álvaro López; Pereira, Martha María; Agudelo-Flórez, Piedad

    2017-05-01

    It is important to identify the circulating Leptospira agent to enhance the performance of serodiagnostic tests by incorporating specific antigens of native species, develop vaccines that take into account the species/serovars circulating in different regions, and optimize prevention and control strategies. The objectives of this study were to develop a polymerase chain reaction (PCR)-high-resolution melting (HRM) assay for differentiating between species of the genus Leptospira and to verify its usefulness in identifying unknown samples to species level. A set of primers from the initial region of the 16S ribosomal gene was designed to detect and differentiate the 22 species of Leptospira. Eleven reference strains were used as controls to establish the reference species and differential melting curves. Twenty-five Colombian Leptospira isolates were studied to evaluate the usefulness of the PCR-HRM assay in identifying unknown samples to species level. This identification was confirmed by sequencing and phylogenetic analysis of the 16S ribosomal gene. Eleven Leptospira species were successfully identified, except for Leptospira meyeri/Leptospira yanagawae, because their sequences were 100% identical. The 25 isolates from humans, animals, and environmental water sources were identified as Leptospira santarosai (twelve), Leptospira interrogans (nine), and L. meyeri/L. yanagawae (four). The species verification was 100% concordant between PCR-HRM and phylogenetic analysis of the 16S ribosomal gene. The PCR-HRM assay designed in this study is a useful tool for identifying Leptospira species from isolates.

  12. Experimental observation of G banding verifying X-ray workers' chromosome translocation detected by FISH

    International Nuclear Information System (INIS)

    Sun Yuanming; Li Jin; Wang Qin; Tang Weisheng; Wang Zhiquan

    2002-01-01

    Objective: FISH is the most effective way of detecting chromosome aberrations, and many factors affect its accuracy. G-banding was used to verify the results of the chromosome translocation analysis of early X-ray workers examined by FISH. Methods: The chromosome translocations of early X-ray workers were analysed by FISH (fluorescence in situ hybridization) and G-banding, and the translocation yields were treated statistically. Results: The chromosome aberration frequencies obtained by the two methods are closely correlated. Conclusion: FISH is a feasible way to analyse chromosome aberrations of X-ray workers and reconstruct dose.

  13. Verifying Quality of Service of ARCnet Based ATOMOS Communication System for Integrated Ship Control

    DEFF Research Database (Denmark)

    Nielsen, N.N.; Nielsen, Jens Frederik Dalsgaard; Schiøler, Henrik

    As part of the ATOMOS project (Funded by EU, DG VII) a reliable communication system with predictable behaviour has been designed. The selected solution is a network based on redundant ARCnet segments extended with an EN50170 compliant fieldbus based layer on top of an ARCnet SAP (service access point) layer. An important characteristic of the communication system is that the functionality and timing must be verifiable in order to satisfy requirements from classification companies like Lloyds and Norsk Veritas. By including Service Categories, Traffic Descriptors and Quality of Service concepts...

  14. Verifying Quality of Service of ARCnet Based ATOMOS Communication System for Integrated Ship Control

    DEFF Research Database (Denmark)

    Nielsen, N.N.; Nielsen, Jens Frederik Dalsgaard; Schiøler, Henrik

    1999-01-01

    As part of the ATOMOS project (Funded by EU, DG VII) a reliable communication system with predictable behaviour has been designed. The selected solution is a network based on redundant ARCnet segments extended with an EN50170 compliant fieldbus based layer on top of an ARCnet SAP (service access point) layer. An important characteristic of the communication system is that the functionality and timing must be verifiable in order to satisfy requirements from classification companies like Lloyds and Norsk Veritas. By including Service Categories, Traffic Descriptors and Quality of Service concepts...

  15. Hombres Sanos: evaluation of a social marketing campaign for heterosexually identified Latino men who have sex with men and women.

    Science.gov (United States)

    Martínez-Donate, Ana P; Zellner, Jennifer A; Sañudo, Fernando; Fernandez-Cerdeño, Araceli; Hovell, Melbourne F; Sipan, Carol L; Engelberg, Moshe; Carrillo, Hector

    2010-12-01

    We evaluated the effectiveness of Hombres Sanos [Healthy Men] a social marketing campaign to increase condom use and HIV testing among heterosexually identified Latino men, especially among heterosexually identified Latino men who have sex with men and women (MSMW). Hombres Sanos was implemented in northern San Diego County, California, from June 2006 through December 2006. Every other month we conducted cross-sectional surveys with independent samples of heterosexually identified Latino men before (n = 626), during (n = 752), and after (n = 385) the campaign. Respondents were randomly selected from 12 targeted community venues to complete an anonymous, self-administered survey on sexual practices and testing for HIV and other sexually transmitted infections. About 5.6% of respondents (n = 98) were heterosexually identified Latino MSMW. The intervention was associated with reduced rates of recent unprotected sex with both females and males among heterosexually identified Latino MSMW. The campaign was also associated with increases in perception of HIV risk, knowledge of testing locations, and condom carrying among heterosexual Latinos. Social marketing represents a promising approach for abating HIV transmission among heterosexually identified Latinos, particularly for heterosexually identified Latino MSMW. Given the scarcity of evidence-based HIV prevention interventions for these populations, this prevention strategy warrants further investigation.

  16. Developing the Inundu fast-jet electronics test and evaluation pod

    CSIR Research Space (South Africa)

    Jamison, Kevin

    2015-09-01

    Full Text Available modifications. The pod’s radio-frequency (RF) antenna is pointed towards the target despite aircraft manoeuvres using a gimbal. To steer the gimbal, the pod incorporates a combined satellite navigation and inertial navigation system (INS) that, coupled... the ECU operational environment. The performance of the ECU can be evaluated and verified at this level. To verify the pod’s performance in flight, its compatibility with the Hawker Hunter aircraft and to measure the internal environment...

  17. Further Evaluation of Methods to Identify Matched Stimulation

    OpenAIRE

    Rapp, John T

    2007-01-01

    The effects of preferred stimulation on the vocal stereotypy of 2 individuals were evaluated in two experiments. The results of Experiment 1 showed that (a) the vocal stereotypy of both participants persisted in the absence of social consequences, (b) 1 participant manipulated toys that did and did not produce auditory stimulation, but only sound-producing toys decreased his vocal stereotypy, and (c) only noncontingent music decreased vocal stereotypy for the other participant, but stereotypy ...

  18. Stress wave nondestructive evaluation of Douglas-fir peeler cores

    Science.gov (United States)

    Robert J. Ross; John I. Zerbe; Xiping Wang; David W. Green; Roy F. Pellerin

    2005-01-01

    With the need for evaluating the utilization of veneer peeler log cores in higher value products and the increasing importance of utilizing round timbers in poles, posts, stakes, and building construction components, we conducted a cooperative project to verify the suitability of stress wave nondestructive evaluation techniques for assessing peeler cores and some...

  19. Error prevention in radiotherapy treatments using a record and verify system

    International Nuclear Information System (INIS)

    Navarrete Campos, S.; Hernandez Vitoria, A.; Canellas Anoz, M.; Millan Cebrian, E.; Garcia Romero, A.

    2001-01-01

    Computerized record-and-verify systems (RVS) are being used increasingly to improve the precision of radiotherapy treatments. With the introduction of new treatment devices, such as multileaf or asymmetric collimators and virtual wedges, the responsibility to ensure correct treatment has increased. The purpose of this paper is to present the method that we are following to prevent some potential radiotherapy errors and to point out some errors that can be easily detected using an RVS through a check of the daily recorded treatment information. We conclude that an RVS prevents the occurrence of many errors when the settings of the treatment machine do not match the intended parameters within some maximal authorized deviation, and makes it easy to detect other potential errors related to an incorrect selection of the patient treatment data. A quality assurance program, including a check of all beam data and a weekly control of the manual and electronic chart, has helped reduce errors. (author)

  20. Statistical Evaluation of the Identified Structural Parameters of an idling Offshore Wind Turbine

    International Nuclear Information System (INIS)

    Kramers, Hendrik C.; Van der Valk, Paul L.C.; Van Wingerden, Jan-Willem

    2016-01-01

    With the increased need for renewable energy, new offshore wind farms are being developed at an unprecedented scale. However, as the costs of offshore wind energy are still too high, design optimization and new innovations are required for lowering its cost. The design of modern-day offshore wind turbines relies on numerical models for estimating ultimate and fatigue loads of the turbines. The dynamic behavior and the resulting structural loading of a turbine are determined in large part by its structural properties, such as the natural frequencies and damping ratios. Hence, it is important to obtain accurate estimates of these modal properties. For this purpose stochastic subspace identification (SSI), in combination with clustering and statistical evaluation methods, is used to obtain the variance of the identified modal properties of an installed 3.6MW offshore wind turbine in idling conditions. It is found that one is able to obtain confidence intervals for the means of eigenfrequencies and damping ratios of the fore-aft and side-side modes of the wind turbine. (paper)

  1. Usability Evaluation of An Electronic Medication Administration Record (eMAR) Application

    Science.gov (United States)

    Guo, J.; Iribarren, S.; Kapsandoy, S.; Perri, S.; Staggers, N.

    2011-01-01

    Background Electronic medication administration records (eMARs) have been widely used in recent years. However, formal usability evaluations are not yet available for these vendor applications, especially from the perspective of nurses, the largest group of eMAR users. Objective To conduct a formal usability evaluation of an implemented eMAR. Methods Four evaluators examined a commercial vendor eMAR using heuristic evaluation techniques. The evaluators defined seven tasks typical of eMAR use and independently evaluated the application. Consensus techniques were used to obtain 100% agreement of identified usability problems and severity ratings. Findings were reviewed with 5 clinical staff nurses and the Director of Clinical Informatics, who verified findings with a small group of clinical nurses. Results Evaluators found 60 usability problems categorized into 233 heuristic violations. Match, Error, and Visibility heuristics were the most frequently violated. Administer Medication and Order and Modify Medications tasks had the highest number of heuristic violations and usability problems rated as major or catastrophic. Conclusion The high number of usability problems could impact the effectiveness, efficiency, and satisfaction of nurses' medication administration activities and raises concerns about patient safety. Usability is a joint responsibility between sites and vendors. We offer a call to action for usability evaluations at all sites and eMAR application redesign as necessary to improve the user experience and promote patient safety. PMID:23616871

  2. Verifying the Simulation Hypothesis via Infinite Nested Universe Simulacrum Loops

    Science.gov (United States)

    Sharma, Vikrant

    2017-01-01

    The simulation hypothesis proposes that local reality exists as a simulacrum within a hypothetical computer's dimension. More specifically, Bostrom's trilemma proposes that the number of simulations an advanced 'posthuman' civilization could produce makes the proposition very likely. In this paper a hypothetical method to verify the simulation hypothesis is discussed using infinite regression applied to a new type of infinite loop. Assign dimension n to any computer in our present reality, where dimension signifies the hierarchical level in nested simulations our reality exists in. A computer simulating known reality would be dimension (n-1), and likewise a computer simulating an artificial reality, such as a video game, would be dimension (n+1). In this method, among others, four key assumptions are made about the nature of the original computer dimension n. Summations show that regressing such a reality infinitely will create convergence, implying that verifying whether local reality is a grand simulation is feasible with adequate compute capability. The action of reaching said convergence point halts the simulation of local reality. Sensitivities to the four assumptions and implications are discussed.
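
    The convergence claim can be made concrete with a geometric-series sketch. Assuming, purely for illustration, that each nested simulation one level deeper receives a fixed fraction r < 1 of its parent's compute budget C (an assumption of this sketch, not necessarily the paper's), the total compute across infinitely many nested levels converges:

      % Illustrative convergence of total compute over infinite nesting.
      \sum_{k=0}^{\infty} C\, r^{k} \;=\; \frac{C}{1-r} \;<\; \infty, \qquad 0 < r < 1,

    so an infinite regress need not demand unbounded resources; whether the paper's four assumptions justify such a fixed ratio is the speculative part.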

  3. Evaluation factors for verification and validation of low-level waste disposal site models

    International Nuclear Information System (INIS)

    Moran, M.S.; Mezga, L.J.

    1982-01-01

    The purpose of this paper is to identify general evaluation factors to be used to verify and validate LLW disposal site performance models in order to assess their site-specific applicability and to determine their accuracy and sensitivity. It is intended that the information contained in this paper be employed by model users involved with LLW site performance model verification and validation. It should not be construed as providing protocols, but rather as providing a framework for the preparation of specific protocols or procedures. A brief description of each evaluation factor is provided. The factors have been categorized according to recommended use during either the model verification or the model validation process. The general responsibilities of the developer and user are provided. In many cases it is difficult to separate the responsibilities of the developer and user, but the user is ultimately accountable for both verification and validation processes. 4 refs

  4. Green Capital: Student Capital student-led evaluation

    OpenAIRE

    Runkle, Q.; Haines, T.; Piper, K.; Leach, S.

    2016-01-01

    To assess and evaluate the impact of the Green Capital: Student Capital project, the partnership (the University of the West of England, the University of Bristol, the Students' Union at UWE, and Bristol Students' Union) worked with NUS to train a team of students from both universities to lead an evaluation process. There were two key aims for the evaluation: • To verify the quantitative outputs of the Green Capital: Student Capital project; • And to make a qualitative assessment...

  5. Verification of Decision-Analytic Models for Health Economic Evaluations: An Overview.

    Science.gov (United States)

    Dasbach, Erik J; Elbasha, Elamin H

    2017-07-01

    Decision-analytic models for cost-effectiveness analysis are developed in a variety of software packages where the accuracy of the computer code is seldom verified. Although modeling guidelines recommend using state-of-the-art quality assurance and control methods for software engineering to verify models, the fields of pharmacoeconomics and health technology assessment (HTA) have yet to establish and adopt guidance on how to verify health and economic models. The objective of this paper is to introduce to our field the variety of methods the software engineering field uses to verify that software performs as expected. We identify how many of these methods can be incorporated in the development process of decision-analytic models in order to reduce errors and increase transparency. Given the breadth of methods used in software engineering, we recommend a more in-depth initiative to be undertaken (e.g., by an ISPOR-SMDM Task Force) to define the best practices for model verification in our field and to accelerate adoption. Establishing a general guidance for verifying models will benefit the pharmacoeconomics and HTA communities by increasing accuracy of computer programming, transparency, accessibility, sharing, understandability, and trust of models.

  6. Large-scale evaluation of candidate genes identifies associations between VEGF polymorphisms and bladder cancer risk.

    Directory of Open Access Journals (Sweden)

    Montserrat García-Closas

    2007-02-01

    Full Text Available Common genetic variation could alter the risk for developing bladder cancer. We conducted a large-scale evaluation of single nucleotide polymorphisms (SNPs) in candidate genes for cancer to identify common variants that influence bladder cancer risk. An Illumina GoldenGate assay was used to genotype 1,433 SNPs within or near 386 genes in 1,086 cases and 1,033 controls in Spain. The most significant finding was in the 5' UTR of VEGF (rs25648; p for likelihood ratio test, 2 degrees of freedom = 1 × 10^-5). To further investigate the region, we analyzed 29 additional SNPs in VEGF, selected to saturate the promoter and 5' UTR and to tag common genetic variation in this gene. Three additional SNPs in the promoter region (rs833052, rs1109324, and rs1547651) were associated with increased risk for bladder cancer: odds ratio (95% confidence interval): 2.52 (1.06-5.97), 2.74 (1.26-5.98), and 3.02 (1.36-6.63), respectively; and a polymorphism in intron 2 (rs3024994) was associated with reduced risk: 0.65 (0.46-0.91). Two of the promoter SNPs and the intron 2 SNP showed linkage disequilibrium with rs25648. Haplotype analyses revealed three blocks of linkage disequilibrium with significant associations for two blocks including the promoter and 5' UTR (global p = 0.02 and 0.009, respectively). These findings are biologically plausible since VEGF is critical in angiogenesis, which is important for tumor growth; its elevated expression in bladder tumors correlates with tumor progression, and specific 5' UTR haplotypes have been shown to influence promoter activity. Associations between bladder cancer risk and other genes in this report were not robust based on false discovery rate calculations. In conclusion, this large-scale evaluation of candidate cancer genes has identified common genetic variants in the regulatory regions of VEGF that could be associated with bladder cancer risk.

  7. A lanthipeptide library used to identify a protein-protein interaction inhibitor.

    Science.gov (United States)

    Yang, Xiao; Lennard, Katherine R; He, Chang; Walker, Mark C; Ball, Andrew T; Doigneaux, Cyrielle; Tavassoli, Ali; van der Donk, Wilfred A

    2018-04-01

    In this article we describe the production and screening of a genetically encoded library of 10^6 lanthipeptides in Escherichia coli using the substrate-tolerant lanthipeptide synthetase ProcM. This plasmid-encoded library was combined with a bacterial reverse two-hybrid system for the interaction of the HIV p6 protein with the UEV domain of the human TSG101 protein, which is a critical protein-protein interaction for HIV budding from infected cells. Using this approach, we identified an inhibitor of this interaction from the lanthipeptide library, whose activity was verified in vitro and in cell-based virus-like particle-budding assays. Given the variety of lanthipeptide backbone scaffolds that may be produced with ProcM, this method may be used for the generation of genetically encoded libraries of natural product-like lanthipeptides containing substantial structural diversity. Such libraries may be combined with any cell-based assay to identify lanthipeptides with new biological activities.

  8. Establishment of a Quantitative Medical Technology Evaluation System and Indicators within Medical Institutions.

    Science.gov (United States)

    Wu, Suo-Wei; Chen, Tong; Pan, Qi; Wei, Liang-Yu; Wang, Qin; Li, Chao; Song, Jing-Chen; Luo, Ji

    2018-06-05

    The development and application of medical technologies reflect the medical quality and clinical capacity of a hospital. It is also an effective approach to upgrading medical service and core competitiveness among medical institutions. This study aimed to build a quantitative medical technology evaluation system, through a questionnaire survey within medical institutions, to assess medical technologies more objectively and accurately, promote the management of medical quality technologies, and ensure the medical safety of various operations in hospitals. A two-level quantitative medical technology evaluation system was built through a two-round questionnaire survey of chosen experts. The Delphi method was applied in identifying the structure of the evaluation system and its indicators. The judgments of the experts on the indicators were used to build the matrix so that the weight coefficient, maximum eigenvalue (λmax), consistency index (CI), and random consistency ratio (CR) could be obtained and collected. The results were verified through consistency tests, and the index weight coefficient of each indicator was calculated through the analytic hierarchy process. Twenty-six experts from different medical fields were involved in the questionnaire survey, 25 of whom successfully responded to the two-round research. Altogether, 4 primary indicators (safety, effectiveness, innovativeness, and benefits), as well as 13 secondary indicators, were included in the evaluation system. The matrix was built to obtain the λmax, CI, and CR of each expert in the survey; the index weight coefficients of the primary indicators were 0.33, 0.28, 0.27, and 0.12, respectively, and the index weight coefficients of the secondary indicators were calculated accordingly. As the two-round questionnaire survey of experts and statistical analysis were performed and the credibility of the results was verified through consistency evaluation test, the
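
    The consistency quantities named in this record follow the standard analytic-hierarchy-process formulas, CI = (λmax − n)/(n − 1) and CR = CI/RI, where RI is Saaty's random index for an n × n matrix. A minimal Python sketch, using an invented 4 × 4 pairwise comparison matrix for the four primary indicators (safety, effectiveness, innovativeness, benefits), not the experts' actual judgments:

      import numpy as np

      # Saaty's random index RI for matrix sizes 1..8.
      RI = {1: 0.0, 2: 0.0, 3: 0.58, 4: 0.90, 5: 1.12, 6: 1.24, 7: 1.32, 8: 1.41}

      def consistency(matrix):
          """Return lambda_max, CI and CR for a pairwise comparison matrix."""
          n = matrix.shape[0]
          lambda_max = max(np.linalg.eigvals(matrix).real)
          ci = (lambda_max - n) / (n - 1)
          return lambda_max, ci, ci / RI[n]

      # Hypothetical pairwise judgments; a[i][j] = relative importance of i over j.
      A = np.array([[1.0, 2.0, 2.0, 3.0],
                    [0.5, 1.0, 1.0, 2.0],
                    [0.5, 1.0, 1.0, 3.0],
                    [1/3, 0.5, 1/3, 1.0]])
      lmax, ci, cr = consistency(A)
      print(f"lambda_max={lmax:.3f}, CI={ci:.3f}, CR={cr:.3f}")

    A CR below 0.10 is conventionally taken to mean the expert's judgments are acceptably consistent, which is the check the study's consistency tests perform for each respondent.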

  9. Evaluating genome-wide association study-identified breast cancer risk variants in African-American women.

    Directory of Open Access Journals (Sweden)

    Jirong Long

    Full Text Available Genome-wide association studies (GWAS), conducted mostly in European or Asian descendants, have identified approximately 67 genetic susceptibility loci for breast cancer. Given the large differences in genetic architecture between the African-ancestry genome and the genomes of Asians and Europeans, it is important to investigate these loci in African-ancestry populations. We evaluated index SNPs in all 67 breast cancer susceptibility loci identified to date in our study including up to 3,300 African-American women (1,231 cases and 2,069 controls) recruited in the Southern Community Cohort Study (SCCS) and the Nashville Breast Health Study (NBHS). Seven SNPs were statistically significant (P ≤ 0.05) with the risk of overall breast cancer in the same direction as previously reported: rs10069690 (5p15/TERT), rs999737 (14q24/RAD51L1), rs13387042 (2q35/TNP1), rs1219648 (10q26/FGFR2), rs8170 (19p13/BABAM1), rs17817449 (16q12/FTO), and rs13329835 (16q23/DYL2). A marginally significant association (P<0.10) was found for three additional SNPs: rs1045485 (2q33/CASP8), rs4849887 (2q14/INHBB), and rs4808801 (19p13/ELL). Three additional SNPs, including rs1011970 (9p21/CDKN2A/2B), rs941764 (14q32/CCDC88C), and rs17529111 (6q14/FAM46A), showed a significant association in analyses conducted by breast cancer subtype. The risk of breast cancer was elevated with an increasing number of risk variants, as measured by quintile of the genetic risk score, from 1.00 (reference) to 1.75 (1.30-2.37), 1.56 (1.15-2.11), 2.02 (1.50-2.74) and 2.63 (1.96-3.52), respectively (P = 7.8 × 10^-10). Results from this study highlight the need for large genetic studies in AAs to identify risk variants impacting this population.
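
    The quintile analysis in this record rests on a genetic risk score. As a hedged sketch (unweighted risk-allele counting on simulated genotypes, not the study's data or weighting scheme), the score and its quintiles can be computed as:

      import numpy as np

      # Hypothetical (subjects x SNPs) matrix of risk-allele counts (0, 1 or 2),
      # simulated here; 67 loci as in the study.
      rng = np.random.default_rng(0)
      genotypes = rng.integers(0, 3, size=(1000, 67))

      risk_score = genotypes.sum(axis=1)  # unweighted allele count per subject
      edges = np.quantile(risk_score, [0.2, 0.4, 0.6, 0.8])
      quintile = np.digitize(risk_score, edges)  # 0..4, lowest to highest risk

      for q in range(5):
          print(f"quintile {q + 1}: n={np.sum(quintile == q)}")

    Odds ratios per quintile, as reported above, would then come from a logistic regression of case/control status on the quintile indicator, with the lowest quintile as reference.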

  10. Can a structured questionnaire identify patients with reduced renal function?

    DEFF Research Database (Denmark)

    Azzouz, Manal; Rømsing, Janne; Thomsen, Henrik

    2014-01-01

    To evaluate a structured questionnaire in identifying outpatients with renal dysfunction before MRI or CT in various age groups.

  11. Lightweight ECC based RFID authentication integrated with an ID verifier transfer protocol.

    Science.gov (United States)

    He, Debiao; Kumar, Neeraj; Chilamkurti, Naveen; Lee, Jong-Hyouk

    2014-10-01

    The radio frequency identification (RFID) technology has been widely adopted and is being deployed as a dominant identification technology in the health care domain, for applications such as medical information authentication, patient tracking, and blood transfusion medicine. With increasingly stringent security and privacy requirements for RFID-based authentication schemes, elliptic curve cryptography (ECC) based RFID authentication schemes have been proposed to meet these requirements. However, many recently published ECC-based RFID authentication schemes have serious security weaknesses. In this paper, we propose a new ECC-based RFID authentication scheme integrated with an ID verifier transfer protocol that overcomes the weaknesses of the existing schemes. A comprehensive security analysis has been conducted to show the strong security properties provided by the proposed authentication scheme. Moreover, the performance of the proposed authentication scheme is analyzed in terms of computational cost, communication cost, and storage requirement.

  12. Synthesis, characterization and biocompatibility evaluation of hydroxyapatite - gelatin - poly lactic acid ternary nanocomposite

    Directory of Open Access Journals (Sweden)

    Z. Nabipour

    2016-04-01

    Full Text Available Objective(s): The current study reports the production and biocompatibility evaluation of a ternary nanocomposite consisting of HA, PLA, and gelatin for biomedical application. Materials and Methods: Hydroxyapatite nanopowder (HA: Ca10(PO4)6(OH)2) was produced by burning bovine cortical bone within the temperature range of 350-450 °C, followed by heating in an oven at 800 °C. Synthesis of the ternary nanocomposite was carried out in two steps: synthesis of a gelatin-hydroxyapatite binary nanocomposite and addition of poly lactic acid in different percentages to the resulting composition. The crystal structure was determined by X-ray diffraction (XRD), while major elements and impurities of hydroxyapatite were identified by elemental analysis with X-ray fluorescence (XRF). Functional groups were determined by Fourier transform infrared spectroscopy (FTIR). Morphology and size of the nanocomposites were evaluated using a field emission scanning electron microscope (FE-SEM). Biocompatibility of the nanocomposites was investigated by MTT assay. Results: XRD patterns verified the ideal crystal structure of the hydroxyapatite, which indicated an appropriate synthesis process and the absence of disturbing phases. Results of FTIR analysis determined the polymers' functional groups, specified formation of the polymers on the hydroxyapatite surface, and verified synthesis of the nHA/PLA/Gel composite. FE-SEM images also indicated the homogeneous structure of the composite in the range of 50 nanometers. MTT assay results confirmed the biocompatibility of the nanocomposite samples. Conclusion: This study suggested that the ternary nanocomposite nHA/PLA/Gel can be a good candidate for biomedical applications such as drug delivery systems, but for evaluation of its potential in hard tissue replacement, mechanical tests should be performed.

  13. A housing stock model of non-heating end-use energy in England verified by aggregate energy use data

    International Nuclear Information System (INIS)

    Lorimer, Stephen

    2012-01-01

    This paper proposes a housing stock model of non-heating end-use energy for England that can be verified using aggregate energy use data available for small areas. These end-uses, commonly referred to as appliances and lighting, are a rapidly increasing part of residential energy demand. This paper proposes a model that can be verified using aggregated data of electricity meters in small areas and census data on housing. Secondly, any differences that open up between major collections of housing could potentially be resolved by using data from frequently updated expenditure surveys. For the year 2008, the model overestimated domestic non-heating energy use at the national scale by 1.5%. This model was then used on the residential sector with various area classifications, which found that rural and suburban areas were generally underestimated by up to 3.3% and urban areas overestimated by up to 5.2% with the notable exception of “professional city life” classifications. The model proposed in this paper has the potential to be a verifiable and adaptable model for non-heating end-use energy in households in England for the future. - Highlights: ► Housing stock energy model was developed for end-uses outside of heating for UK context. ► This entailed changes to the building energy model that serves as the bottom of the stock model. ► The model is adaptable to reflect rapid changes in consumption between major housing surveys. ► Verification was done against aggregated consumption data and, for the first time, used a measured size of the housing stock. ► The verification process revealed spatial variations in consumption patterns for future research.
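
    The verification step this model enables is a simple per-area comparison. A hedged Python sketch with invented numbers illustrates the idea of checking modelled non-heating energy against aggregated meter data for small areas:

      # Modelled vs metered annual non-heating energy per small area [kWh].
      # All values are hypothetical, chosen only to mirror the kind of
      # over/underestimates reported above.
      modelled_kwh = {"urban_A": 9.8e6, "suburban_B": 7.1e6, "rural_C": 4.9e6}
      metered_kwh  = {"urban_A": 9.3e6, "suburban_B": 7.3e6, "rural_C": 5.05e6}

      for area in modelled_kwh:
          error = (modelled_kwh[area] - metered_kwh[area]) / metered_kwh[area]
          print(f"{area}: {error:+.1%}")  # positive = overestimate

    Aggregating such errors by area classification is what exposes the rural/suburban underestimation and urban overestimation the paper reports.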

  14. Can 3D ultrasound identify trochlea dysplasia in newborns? Evaluation and applicability of a technique

    Energy Technology Data Exchange (ETDEWEB)

    Kohlhof, Hendrik, E-mail: Hendrik.Kohlhof@ukb.uni-bonn.de [Clinic for Orthopedics and Trauma Surgery, University Hospital Bonn, Sigmund-Freud-Str. 25, 53127 Bonn (Germany); Heidt, Christoph, E-mail: Christoph.heidt@kispi.uzh.ch [Department of Orthopedic Surgery, University Children's Hospital Zurich, Steinwiesstrasse 74, 8032 Switzerland (Switzerland); Bähler, Alexandrine, E-mail: Alexandrine.baehler@insel.ch [Department of Pediatric Radiology, University Children's Hospital Berne, Freiburgstrasse 18, 3010 Berne (Switzerland); Kohl, Sandro, E-mail: sandro.kohl@insel.ch [Department of Orthopedic Surgery, University Hospital Berne, Freiburgstrasse 18, 3010 Berne (Switzerland); Gravius, Sascha, E-mail: sascha.gravius@ukb.uni-bonn.de [Clinic for Orthopedics and Trauma Surgery, University Hospital Bonn, Sigmund-Freud-Str. 25, 53127 Bonn (Germany); Friedrich, Max J., E-mail: Max.Friedrich@ukb.uni-bonn.de [Clinic for Orthopedics and Trauma Surgery, University Hospital Bonn, Sigmund-Freud-Str. 25, 53127 Bonn (Germany); Ziebarth, Kai, E-mail: kai.ziebarth@insel.ch [Department of Orthopedic Surgery, University Hospital Berne, Freiburgstrasse 18, 3010 Berne (Switzerland); Stranzinger, Enno, E-mail: Enno.Stranzinger@insel.ch [Department of Pediatric Radiology, University Children's Hospital Berne, Freiburgstrasse 18, 3010 Berne (Switzerland)

    2015-06-15

    Highlights: • We evaluated a possible screening method for trochlea dysplasia. • 3D ultrasound was used to perform the measurements on standardized axial planes. • The evaluation of the technique showed comparable results to other studies. • This technique may be used as a screening technique as it is quick and easy to perform. - Abstract: Femoro-patellar dysplasia is considered as a significant risk factor of patellar instability. Different studies suggest that the shape of the trochlea is already developed in early childhood. Therefore early identification of a dysplastic configuration might be relevant information for the treating physician. An easily applicable routine screening of the trochlea is not yet available. The purpose of this study was to establish and evaluate a screening method for femoro-patellar dysplasia using 3D ultrasound. From 2012 to 2013 we prospectively imaged 160 consecutive femoro-patellar joints in 80 newborns from the 36th to 61st gestational week that underwent a routine hip sonography (Graf). All ultrasounds were performed by a pediatric radiologist with only minimal additional time to the routine hip ultrasound. In 30° flexion of the knee, axial, coronal, and sagittal reformats were used to standardize a reconstructed axial plane through the femoral condyle and the mid-patella. The sulcus angle, the lateral-to-medial facet ratio of the trochlea and the shape of the patella (Wiberg Classification) were evaluated. In all examinations reconstruction of the standardized axial plane was achieved, the mean trochlea angle was 149.1° (SD 4.9°), the lateral-to-medial facet ratio of the trochlea was 1.3 (SD 0.22), and a Wiberg type I patella was found in 95% of the newborns. No statistical difference was detected between boys and girls. Using standardized reconstructions of the axial plane allows measurements to be made with lower operator dependency and higher accuracy in a short time. Therefore 3D ultrasound is an easy

  15. Can 3D ultrasound identify trochlea dysplasia in newborns? Evaluation and applicability of a technique

    International Nuclear Information System (INIS)

    Kohlhof, Hendrik; Heidt, Christoph; Bähler, Alexandrine; Kohl, Sandro; Gravius, Sascha; Friedrich, Max J.; Ziebarth, Kai; Stranzinger, Enno

    2015-01-01

    Highlights: • We evaluated a possible screening method for trochlea dysplasia. • 3D ultrasound was used to perform the measurements on standardized axial planes. • The evaluation of the technique showed comparable results to other studies. • This technique may be used as a screening technique as it is quick and easy to perform. - Abstract: Femoro-patellar dysplasia is considered as a significant risk factor of patellar instability. Different studies suggest that the shape of the trochlea is already developed in early childhood. Therefore early identification of a dysplastic configuration might be relevant information for the treating physician. An easily applicable routine screening of the trochlea is not yet available. The purpose of this study was to establish and evaluate a screening method for femoro-patellar dysplasia using 3D ultrasound. From 2012 to 2013 we prospectively imaged 160 consecutive femoro-patellar joints in 80 newborns from the 36th to 61st gestational week that underwent a routine hip sonography (Graf). All ultrasounds were performed by a pediatric radiologist with only minimal additional time to the routine hip ultrasound. In 30° flexion of the knee, axial, coronal, and sagittal reformats were used to standardize a reconstructed axial plane through the femoral condyle and the mid-patella. The sulcus angle, the lateral-to-medial facet ratio of the trochlea and the shape of the patella (Wiberg Classification) were evaluated. In all examinations reconstruction of the standardized axial plane was achieved, the mean trochlea angle was 149.1° (SD 4.9°), the lateral-to-medial facet ratio of the trochlea was 1.3 (SD 0.22), and a Wiberg type I patella was found in 95% of the newborns. No statistical difference was detected between boys and girls. Using standardized reconstructions of the axial plane allows measurements to be made with lower operator dependency and higher accuracy in a short time. Therefore 3D ultrasound is an easy

  16. Controlled Experiment Replication in Evaluation of E-Learning System's Educational Influence

    Science.gov (United States)

    Grubisic, Ani; Stankov, Slavomir; Rosic, Marko; Zitko, Branko

    2009-01-01

    We believe that every effectiveness evaluation should be replicated at least once, in order to verify the original results and to indicate the evaluated e-learning system's advantages or disadvantages. This paper presents the methodology for conducting a controlled experiment replication, as well as results of a controlled experiment and an internal…

  17. Method for verifying the pressure in a nuclear reactor fuel rod

    International Nuclear Information System (INIS)

    Jones, W.J.

    1979-01-01

    Disclosed is a method of accurately verifying the pressure contained in a sealed pressurized fuel rod by utilizing a pressure-balance measurement technique, wherein an end of the fuel rod extends through and is sealed in a wall of a small chamber. The chamber is pressurized to the nominal (desired) fuel rod pressure, and the fuel rod is then pierced to interconnect the chamber and the fuel rod. The deviation of the chamber pressure is noted. The final combined pressure of the fuel rod and drill chamber is substantially equal to the nominal rod pressure; the departure of the combined pressure from nominal is directly proportional to the departure of the rod pressure from nominal. The maximum error in computing the rod pressure from the deviation of the combined pressure from nominal is estimated at plus or minus 3.0 psig for rod pressures within the specified production limits. If the rod pressure is corrected for rod void volume using a digital printer data record, the accuracy improves to about plus or minus 2.0 psig.
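
    As a rough illustration of the pressure-balance relation described in this record, the following sketch solves the ideal-gas mixing balance for the rod pressure. The volume and pressure values are invented for illustration and are not from the patent.

    ```python
    # Hypothetical illustration of the pressure-balance relation: at a common
    # temperature, p_combined*(v_rod + v_chamber) = p_rod*v_rod + p_nominal*v_chamber,
    # so the combined pressure's departure from nominal is proportional to the
    # rod's departure from nominal. All numbers below are assumed, not the patent's.

    def rod_pressure_from_combined(p_nominal, p_combined, v_rod, v_chamber):
        """Infer the original rod pressure from the combined reading."""
        return p_nominal + (p_combined - p_nominal) * (v_rod + v_chamber) / v_rod

    # Example: chamber pre-charged to a 350 psig nominal, combined reading 349.0 psig
    p_rod = rod_pressure_from_combined(350.0, 349.0, v_rod=10.0, v_chamber=2.0)
    print(f"inferred rod pressure: {p_rod:.1f} psig")  # 348.8 psig
    ```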

  18. Agreement between self-reported and physically verified male circumcision status in Nyanza region, Kenya: Evidence from the TASCO study.

    Science.gov (United States)

    Odoyo-June, Elijah; Agot, Kawango; Mboya, Edward; Grund, Jonathan; Musingila, Paul; Emusu, Donath; Soo, Leonard; Otieno-Nyunya, Boaz

    2018-01-01

    Self-reported male circumcision (MC) status is widely used to estimate community prevalence of circumcision, although its accuracy varies in different settings depending on the extent of misreporting. Despite this challenge, self-reported MC status remains essential because it is the most feasible method of collecting MC status data in community surveys. Its accuracy is therefore an important determinant of the reliability of MC prevalence estimates based on such surveys. We measured the concurrence between self-reported and physically verified MC status among men aged 25-39 years during a baseline household survey for a study testing strategies to enhance MC uptake by older men in the Nyanza region of Kenya. The objective was to determine the accuracy of self-reported MC status in communities where MC for HIV prevention is being rolled out. Agreement between self-reported and physically verified MC status was measured among 4,232 men. A structured questionnaire was used to collect data on MC status, followed by a physical examination to verify actual MC status, whose outcome was recorded as fully circumcised (no foreskin), partially circumcised (foreskin past the coronal sulcus but covering less than half of the glans) or uncircumcised (foreskin covering half or more of the glans). The sensitivity and specificity of self-reported MC status were calculated using physically verified MC status as the gold standard. Out of 4,232 men, 2,197 (51.9%) reported being circumcised, of whom 99.0% were confirmed to be fully circumcised on physical examination. Among 2,035 men who reported being uncircumcised, 93.7% (1,907/2,035) were confirmed uncircumcised on physical examination. Agreement between self-reported and physically verified MC status was almost perfect, kappa (k) = 98.6% (95% CI, 98.1%-99.1%). The sensitivity of self-reporting being circumcised was 99.6% (95% CI, 99.2-99.8), while the specificity of self-reporting uncircumcised was 99.0% (95% CI, 98.4-99.4), and did not differ…
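
    The agreement statistics quoted above (sensitivity, specificity, Cohen's kappa) can be reproduced from a 2x2 table of self-reported versus physically verified status. The sketch below uses illustrative counts loosely approximated from the abstract, not the study's exact data.

    ```python
    # Agreement statistics from a 2x2 table; rows = self-report, columns = exam.
    # The counts are illustrative approximations, not the TASCO study data.

    def agreement_stats(tp, fp, fn, tn):
        """tp: reported and verified circumcised; tn: reported and verified
        uncircumcised; fp/fn are the discordant cells."""
        n = tp + fp + fn + tn
        sensitivity = tp / (tp + fn)          # P(report circ | verified circ)
        specificity = tn / (tn + fp)          # P(report uncirc | verified uncirc)
        p_observed = (tp + tn) / n
        # Chance agreement from the marginal proportions:
        p_chance = ((tp + fp) * (tp + fn) + (fn + tn) * (fp + tn)) / n**2
        kappa = (p_observed - p_chance) / (1 - p_chance)
        return sensitivity, specificity, kappa

    sens, spec, kappa = agreement_stats(tp=2175, fp=22, fn=9, tn=1907)
    print(f"sensitivity={sens:.3f} specificity={spec:.3f} kappa={kappa:.3f}")
    ```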

  19. How to Verify Plagiarism of the Paper Written in Macedonian and Translated in Foreign Language?

    Science.gov (United States)

    Spiroski, Mirko

    2016-03-15

    The aim of this study was to show how to verify plagiarism of a paper written in Macedonian and translated into a foreign language. The original article "Ethics in Medical Research Involving Human Subjects", written in Macedonian, was submitted as essay-2 for the subject Ethics and published by Ilina Stefanovska, a PhD candidate from the Iustinianus Primus Faculty of Law, Ss Cyril and Methodius University of Skopje (UKIM), Skopje, Republic of Macedonia, in February 2013. The article suspected of plagiarism was published by Prof. Dr. Gordana Panova from the Faculty of Medical Sciences, University Goce Delchev, Shtip, Republic of Macedonia, in English, with identical title and identical content, in the international scientific on-line journal "SCIENCE & TECHNOLOGIES", published by the Union of Scientists - Stara Zagora. The original document (written in Macedonian) was translated with Google Translator; the suspected article (published as an English PDF file) was converted into a Word document, and the two documents were compared with several programs for plagiarism detection. The two documents were found to be 71%, 78% and 82% identical, respectively, depending on the computer program used for plagiarism detection. It was obvious that the original paper was entirely plagiarised by Prof. Dr. Gordana Panova, including six references from the original paper. Plagiarism of original papers written in Macedonian and translated into other languages can thus be verified after computerised translation; the original and translated documents can then be compared with available software for plagiarism detection.
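
    The comparison step described above, machine-translating the original and then measuring document overlap, can be approximated with standard library tools. In this sketch, difflib stands in for the commercial plagiarism detectors, and the text snippets are placeholders.

    ```python
    # Word-level overlap between a machine-translated original and a suspected
    # paper. The two strings are placeholders, not the actual documents.
    from difflib import SequenceMatcher

    original_translated = "ethics in medical research involving human subjects requires informed consent"
    suspected_paper = "ethics in medical research involving human subjects needs informed consent"

    ratio = SequenceMatcher(None,
                            original_translated.lower().split(),
                            suspected_paper.lower().split()).ratio()
    print(f"similarity: {ratio:.0%}")  # the study reports 71-82%, tool-dependent
    ```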

  20. An experimental method to verify soil conservation by check dams on the Loess Plateau, China.

    Science.gov (United States)

    Xu, X Z; Zhang, H W; Wang, G Q; Chen, S C; Dang, W Q

    2009-12-01

    A successful experiment with a physical model requires the necessary similarity conditions to be met. This study presents an experimental method using a semi-scale physical model, which is used to monitor and verify soil conservation by check dams in a small watershed on the Loess Plateau of China. During the experiments, the model-prototype ratio of geomorphic variables was kept constant for each rainfall event. Consequently, the experimental data can be used to verify soil erosion processes in the field and to predict soil loss in a model watershed with check dams; the method can thus predict the amount of soil loss in a catchment. The study applies four criteria: similarity of watershed geometry, of grain size and bare land, of the Froude number (Fr) for each rainfall event, and of soil erosion in the downscaled models. The efficacy of the proposed method was confirmed using these criteria in two different downscaled model experiments. The B-Model, a large-scale model, simulates the watershed prototype. The two small-scale models, D(a) and D(b), have different erosion rates but are the same size; they simulate the hydraulic processes in the B-Model. The experimental results show that when soil loss in the small-scale models was converted by multiplying by the soil-loss scale number, it was very close to that of the B-Model. With a semi-scale physical model, experiments can therefore verify and predict soil loss in a small watershed with a check dam system on the Loess Plateau, China.
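
    The Froude-similarity criterion mentioned among the four criteria can be illustrated as follows; the length-scale ratio, velocities, and flow depths are assumed values, not the study's.

    ```python
    # Froude similarity: keep Fr_model == Fr_prototype. For a geometric scale
    # ratio L (prototype/model), velocities then scale with sqrt(L).
    # All numbers are illustrative assumptions.
    import math

    G = 9.81  # m/s^2

    def froude(velocity, depth):
        return velocity / math.sqrt(G * depth)

    L = 10.0                       # assumed geometric scale ratio
    v_prototype = 2.0              # m/s, assumed overland flow velocity
    v_model = v_prototype / math.sqrt(L)

    print(froude(v_prototype, depth=0.10))      # prototype Fr
    print(froude(v_model, depth=0.10 / L))      # model Fr, identical by design
    ```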

  1. Robust Approach to Verifying the Weak Form of the Efficient Market Hypothesis

    Science.gov (United States)

    Střelec, Luboš

    2011-09-01

    The weak form of the efficient markets hypothesis states that prices incorporate only past information about the asset. An implication of this form of the hypothesis is that one cannot detect mispriced assets and consistently outperform the market through technical analysis of past prices. One possible formulation of the efficient market hypothesis used for weak-form tests is that share prices follow a random walk, meaning that returns are realizations of an IID sequence of random variables. Consequently, for verifying the weak form of the efficient market hypothesis, we can use distribution tests, among others, i.e. tests of normality and/or graphical methods. Many procedures for testing the normality of univariate samples have been proposed in the literature [7]. Today the most popular omnibus test of normality for general use is the Shapiro-Wilk test. The Jarque-Bera test is the most widely adopted omnibus test of normality in econometrics and related fields. In particular, the Jarque-Bera test (i.e. a test based on the classical measures of skewness and kurtosis) is frequently used when one is more concerned about heavy-tailed alternatives. As these measures are based on moments of the data, the test has a zero breakdown value [2]; in other words, a single outlier can make the test worthless. The reason so many classical procedures are nonrobust to outliers is that the parameters of the model are expressed in terms of moments, and their classical estimators are expressed in terms of sample moments, which are very sensitive to outliers. Another approach to robustness is to concentrate on the parameters of interest suggested by the problem under study. Consequently, novel robust procedures for testing normality are presented in this paper to overcome the shortcomings of classical normality tests for financial data, which typically exhibit remote data points and additional types of deviations from…
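
    A small numerical sketch of the point made above, that the moment-based Jarque-Bera test has a zero breakdown value: a single outlier inflates the sample skewness and kurtosis and hence the statistic. The data are synthetic.

    ```python
    # Jarque-Bera statistic JB = n/6 * (S^2 + (K-3)^2/4) from sample moments,
    # demonstrated on a synthetic return series with and without one outlier.
    import numpy as np

    def jarque_bera(x):
        n = len(x)
        z = x - x.mean()
        s = (z**3).mean() / (z**2).mean()**1.5      # sample skewness
        k = (z**4).mean() / (z**2).mean()**2        # sample kurtosis
        return n / 6.0 * (s**2 + (k - 3.0)**2 / 4.0)

    rng = np.random.default_rng(0)
    returns = rng.normal(0.0, 0.01, size=500)
    print(jarque_bera(returns))     # small: normality not rejected
    returns[0] = 0.25               # one extreme "return"
    print(jarque_bera(returns))     # explodes because of a single outlier
    ```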

  2. Evaluation of Learning and Competence in the Training of Nurses

    Directory of Open Access Journals (Sweden)

    Cícera Maria Braz da Silva

    2017-02-01

    Introduction: Health education has become a more complex process, since it aims to ensure the training of professionals with the knowledge, skills, attitudes and values necessary for their performance, requiring the adoption of strategies that allow the integral evaluation of these competences. Objective: To analyze the scientific evidence about the evaluation of learning and competence in undergraduate nursing education. Method: An integrative literature review with online searches of the LILACS, MEDLINE, Web of Science, SCOPUS and CINAHL databases, using the descriptors Competence-Based Education, Nursing Education, Learning, and Assessment. Results: The 18 articles analyzed, based on a synthesis and critical analysis, allowed the identification of the following thematic categories: concept of competence; competences essential to the training of nurses; learning strategies; and evaluation. Despite the polysemy around the term competence, the concept presented more similarities than differences. The nursing competences identified are similar to those recommended by the National Curriculum Guidelines, with emphasis on learning strategies in simulated settings, and doubts remain about methods and the construction of evaluation tools. Conclusions: The evaluation of learning and competence continues to be a challenge for nursing educators, and difficulties in this process are recognized. It therefore seems necessary to develop reliable evaluation tools, based on criteria and indicators, that can verify the student's performance in action and approximate, as early as possible, real learning scenarios. Keywords: Competency-Based Education. Education. Nursing. Learning. Evaluation.

  3. Evaluation of the Impact of EISA Federal Project Investments

    Energy Technology Data Exchange (ETDEWEB)

    Judd, Kathleen S.; Wendel, Emily M.; Morris, Scott L.; Williamson, Jennifer L.; Halverson, Mark A.; Livingston, Olga V.; Loper, Susan A.

    2012-12-31

    The DOE's Federal Energy Management Program has been charged by the Office of Management and Budget with conducting an evaluation of the actual and verifiable energy savings and carbon emissions reductions from federal energy management investments made across the Federal government as a result of the Energy Independence and Security Act (EISA) of 2007. This study presents the findings from that evaluation.

  4. Deep geological structure of a volcano verified by seismic waves. Jishinha de mita kazan no shinbu kozo

    Energy Technology Data Exchange (ETDEWEB)

    Hasegawa, A. (Tohoku University, Sendai (Japan). Faculty of Science)

    1991-09-01

    The three-dimensional seismic velocity structure of the crust and upper mantle beneath northeastern Japan was determined by seismic tomography, using natural earthquakes recorded by the microearthquake observation network. The results indicate that a low-velocity region exists in the upper mantle directly beneath each volcano. Further, the following can be verified: microearthquakes recorded by this network at depths of 25-40 km occur at a rate of less than 1% of that of conventional inland earthquakes (whose lower depth limit is about 15 km) in the same region, show a lower dominant frequency, and cluster around volcanoes. A prominent S-wave reflector found at depths of 10-20 km appears to be caused by a molten mass. The low-velocity region is interpreted as the ascent path of high-temperature magma, the low-frequency microearthquakes as magma activity around it, and the S-wave reflector as part of the magma body. 8 refs., 4 figs.

  5. Guidelines for evaluating software configuration management plans for digital instrumentation and control systems

    International Nuclear Information System (INIS)

    Cheon, Se Woo; Park, Jong Kyun; Lee, Ki Young; Lee, Jang Soo; Kim, Jang Yeon

    2001-08-01

    Software configuration management (SCM) is the process for identifying software configuration items (CIs), controlling the implementation of and changes to software, recording and reporting the status of changes, and verifying the completeness and correctness of the released software. SCM consists of two major aspects: planning and implementation. Effective SCM involves planning how activities are to be performed and performing these activities in accordance with the plan. This report first reviews the background of SCM, including key standards, SCM disciplines, SCM basic functions, baselines, software entities, the SCM process, the implementation of SCM, and SCM tools. It then provides guidelines for evaluating the SCM plan for digital I and C systems of nuclear power plants. Most of the guidelines in the report are based on IEEE Std 828 and ANSI/IEEE Std 1042. According to BTP-14, NUREG-0800, the evaluation topics for the SCM plan are classified into three categories: management, implementation, and resource characteristics.

  6. Reliability of coded data to identify earliest indications of cognitive decline, cognitive evaluation and Alzheimer's disease diagnosis: a pilot study in England.

    Science.gov (United States)

    Dell'Agnello, Grazia; Desai, Urvi; Kirson, Noam Y; Wen, Jody; Meiselbach, Mark K; Reed, Catherine C; Belger, Mark; Lenox-Smith, Alan; Martinez, Carlos; Rasmussen, Jill

    2018-03-22

    Evaluate the reliability of using diagnosis codes and prescription data to identify the timing of symptomatic onset, cognitive assessment and diagnosis of Alzheimer's disease (AD) among patients diagnosed with AD. This was a retrospective cohort study using the UK Clinical Practice Research Datalink (CPRD). The study cohort consisted of a random sample of 50 patients with first AD diagnosis in 2010-2013. Additionally, patients were required to have a valid text-field code and a hospital episode or a referral in the 3 years before the first AD diagnosis. The earliest indications of cognitive impairment, cognitive assessment and AD diagnosis were identified using two approaches: (1) using an algorithm based on diagnostic codes and prescription drug information and (2) using information compiled from manual review of both text-based and coded data. The reliability of the code-based algorithm for identifying the earliest dates of the three measures described earlier was evaluated relative to the comprehensive second approach. Additionally, common cognitive assessments (with and without results) were described for both approaches. The two approaches identified the same first dates of cognitive symptoms in 33 (66%) of the 50 patients, first cognitive assessment in 29 (58%) patients and first AD diagnosis in 43 (86%) patients. Allowing for the dates from the two approaches to be within 30 days, the code-based algorithm's success rates increased to 74%, 70% and 94%, respectively. Mini-Mental State Examination was the most commonly observed cognitive assessment in both approaches; however, of the 53 tests performed, only 19 results were observed in the coded data. The code-based algorithm shows promise for identifying the first AD diagnosis. However, the reliability of using coded data to identify earliest indications of cognitive impairment and cognitive assessments is questionable. Additionally, CPRD is not a recommended data source to identify results of cognitive
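
    The within-30-days agreement check described above can be expressed compactly. The date pairs below are fabricated examples, not CPRD data.

    ```python
    # Exact and 30-day-tolerance agreement between algorithm-derived dates and
    # manual-review dates. The pairs are invented for illustration.
    from datetime import date

    pairs = [  # (code-based algorithm date, manual-review date)
        (date(2011, 3, 1), date(2011, 3, 1)),
        (date(2012, 6, 10), date(2012, 6, 28)),
        (date(2010, 1, 5), date(2010, 4, 20)),
    ]

    exact = sum(a == b for a, b in pairs)
    within_30 = sum(abs((a - b).days) <= 30 for a, b in pairs)
    print(f"exact: {exact / len(pairs):.0%}, within 30 days: {within_30 / len(pairs):.0%}")
    ```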

  7. A New Tool for Identifying Research Standards and Evaluating Research Performance

    Science.gov (United States)

    Bacon, Donald R.; Paul, Pallab; Stewart, Kim A.; Mukhopadhyay, Kausiki

    2012-01-01

    Much has been written about the evaluation of faculty research productivity in promotion and tenure decisions, including many articles that seek to determine the rank of various marketing journals. Yet how faculty evaluators combine journal quality, quantity, and author contribution to form judgments of a scholar's performance is unclear. A…

  8. Applying the Water Vapor Radiometer to Verify the Precipitable Water Vapor Measured by GPS

    Directory of Open Access Journals (Sweden)

    Ta-Kang Yeh

    2014-01-01

    Taiwan is located at the land-sea interface in a subtropical region. Because the climate is warm and moist year round, there is a large and highly variable amount of water vapor in the atmosphere. In this study, we calculated the Zenith Wet Delay (ZWD) of the troposphere using the ground-based Global Positioning System (GPS). The ZWD measured by two Water Vapor Radiometers (WVRs) was then used to verify the ZWD calculated from GPS. We also analyzed the correlation between the ZWD and the precipitation data of these two types of station, and used observational data from 14 GPS and rainfall stations to evaluate three cases. The offset between the GPS-ZWD and the WVR-ZWD ranged from 1.31 to 2.57 cm, and the correlation coefficient ranged from 0.89 to 0.93; the results calculated from GPS and those measured using the WVR were thus very similar. Moreover, under no rain, light rain, moderate rain, or heavy rain, the flatland station ZWD was 0.31, 0.36, 0.38, or 0.40 m, respectively; the mountain station ZWD exhibited the same trend. These results demonstrate that the potential and strength of precipitation in a region can be estimated from its ZWD values. Now that the precision of GPS-ZWD has been confirmed, this method can eventually be expanded to the more than 400 GPS stations in Taiwan and its surrounding islands. The near real-time ZWD data, with improved spatial and temporal resolution, can be provided to the city and countryside weather-forecasting system currently under development, fundamentally improving the resources used to generate weather forecasts.
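
    The offset and correlation comparison between the GPS-derived and WVR-measured ZWD series amounts to the following computation; the arrays are synthetic stand-ins for the paired station records.

    ```python
    # Mean offset and Pearson correlation between two paired ZWD series.
    # The values are synthetic, not the station data from the study.
    import numpy as np

    zwd_gps = np.array([0.31, 0.36, 0.38, 0.40, 0.35])   # m, illustrative
    zwd_wvr = zwd_gps - 0.02 + np.random.default_rng(1).normal(0, 0.005, 5)

    offset = np.mean(zwd_gps - zwd_wvr)
    r = np.corrcoef(zwd_gps, zwd_wvr)[0, 1]
    print(f"mean offset: {offset * 100:.2f} cm, correlation: {r:.2f}")
    ```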

  9. 30 CFR 253.24 - When I submit audited annual financial statements to verify my net worth, what standards must...

    Science.gov (United States)

    2010-07-01

    ... statements to verify my net worth, what standards must they meet? (a) Your audited annual financial statements must be bound. (b) Your audited annual financial statements must include the unqualified opinion of an independent accountant that states: (1) The financial statements are free from material...

  10. A Novel Method to Verify Multilevel Computational Models of Biological Systems Using Multiscale Spatio-Temporal Meta Model Checking.

    Science.gov (United States)

    Pârvu, Ovidiu; Gilbert, David

    2016-01-01

    Insights gained from multilevel computational models of biological systems can be translated into real-life applications only if the correctness of the model has been verified first. One of the most frequently employed in silico techniques for computational model verification is model checking. Traditional model checking approaches only consider the evolution of numeric values, such as concentrations, over time and are appropriate for computational models of small-scale systems (e.g. intracellular networks). However, for gaining a systems-level understanding of how biological organisms function, it is essential to consider more complex, large-scale biological systems (e.g. organs). Verifying computational models of such systems requires capturing how both numeric values and properties of (emergent) spatial structures (e.g. the area of a multicellular population) change over time and across multiple levels of organization, which are not considered by existing model checking approaches. To address this limitation we have developed a novel approximate probabilistic multiscale spatio-temporal meta model checking methodology for verifying multilevel computational models relative to specifications describing the desired/expected system behaviour. The methodology is generic and supports computational models encoded using various high-level modelling formalisms, because it is defined relative to time series data and not the models used to generate them. In addition, the methodology can be automatically adapted to case-study-specific types of spatial structures and properties using the spatio-temporal meta model checking concept. To automate the computational model verification process we have implemented the model checking approach in the software tool Mule (http://mule.modelchecking.org). Its applicability is illustrated against four systems biology computational models previously published in the literature, encoding the rat cardiovascular system dynamics, the uterine contractions of labour…

  11. Evaluating a satellite-based seasonal evapotranspiration product and identifying its relationship with other satellite-derived products and crop yield: A case study for Ethiopia

    Science.gov (United States)

    Tadesse, Tsegaye; Senay, Gabriel B.; Berhan, Getachew; Regassa, Teshome; Beyene, Shimelis

    2015-08-01

    Satellite-derived evapotranspiration anomalies and normalized difference vegetation index (NDVI) products from Moderate Resolution Imaging Spectroradiometer (MODIS) data are currently used for African agricultural drought monitoring and food security assessment. In this study, a process was developed to evaluate satellite-derived evapotranspiration (ETa) products with a geospatial statistical exploratory technique that uses NDVI, satellite-derived rainfall estimates (RFE), and crop yield data. The main goal was to evaluate ETa using NDVI and RFE, and to identify the relationship between ETa and Ethiopia's cereal crop (i.e., teff, sorghum, corn/maize, barley, and wheat) yields during the main rainy season. Since crop production is one of the main factors affecting food security, the remote sensing-based seasonal ETa was evaluated to establish its appropriateness as a proxy for monitoring vegetation condition in drought-vulnerable and food-insecure areas, in support of decision makers. The results showed that the comparison between seasonal ETa and RFE produced a strong correlation (R2 > 0.99) for all 41 crop-growing zones in Ethiopia. The spatial regression analyses of seasonal ETa and NDVI, using Ordinary Least Squares and Geographically Weighted Regression, showed relatively weak yearly spatial relationships (R2 …); ETa products have a good predictive potential for the 31 identified zones in Ethiopia. Decision makers may potentially use ETa products for monitoring cereal crop yields and for early warning of food insecurity during drought years in these identified zones.
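
    The zone-level ETa-RFE comparison reported above is, in essence, an ordinary least squares fit plus an R² statistic, sketched below with invented per-zone values.

    ```python
    # Ordinary least squares of seasonal ETa against RFE, with R^2.
    # The per-zone values are invented for illustration.
    import numpy as np

    rfe = np.array([210.0, 340.0, 415.0, 500.0, 620.0])   # seasonal rainfall (mm)
    eta = np.array([180.0, 300.0, 365.0, 450.0, 555.0])   # seasonal ETa (mm)

    slope, intercept = np.polyfit(rfe, eta, 1)
    pred = slope * rfe + intercept
    r2 = 1 - np.sum((eta - pred) ** 2) / np.sum((eta - eta.mean()) ** 2)
    print(f"ETa = {slope:.2f}*RFE + {intercept:.1f}, R^2 = {r2:.3f}")
    ```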

  12. Verifier-based three-party authentication schemes using extended chaotic maps for data exchange in telecare medicine information systems.

    Science.gov (United States)

    Lee, Tian-Fu

    2014-12-01

    Telecare medicine information systems provide a communication platform for accessing remote medical resources through public networks, and help health care workers and medical personnel to rapidly make correct clinical decisions and treatments. An authentication scheme for data exchange in telecare medicine information systems enables legal users in hospitals and medical institutes to establish a secure channel and exchange electronic medical records or electronic health records securely and efficiently. This investigation develops an efficient and secure verifier-based three-party authentication scheme using extended chaotic maps for data exchange in telecare medicine information systems. The proposed scheme does not require the server's public keys and avoids the time-consuming modular exponentiation and elliptic-curve scalar multiplications used in previous related approaches. Additionally, the proposed scheme is proven secure in the random oracle model and realizes the lower bounds on messages and rounds of communication. Compared to related verifier-based approaches, the proposed scheme not only possesses higher security, but also has lower computational cost and fewer transmissions. Copyright © 2014 Elsevier Ireland Ltd. All rights reserved.
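
    Extended chaotic-map schemes of this kind typically rest on the semigroup property of Chebyshev polynomials, T_r(T_s(x)) = T_rs(x), which plays the role of modular exponentiation in Diffie-Hellman-style key agreement. The sketch below verifies the property with toy parameters, not the paper's.

    ```python
    # Chebyshev polynomials over Z_p via the recurrence T_n = 2x*T_{n-1} - T_{n-2},
    # and a check of the semigroup property underlying chaotic-map key agreement.
    # The prime and inputs are toy values.

    def chebyshev(n, x, p):
        """T_n(x) mod p."""
        t0, t1 = 1, x % p
        if n == 0:
            return t0
        for _ in range(n - 1):
            t0, t1 = t1, (2 * x * t1 - t0) % p
        return t1

    p, x = 101, 7
    r, s = 5, 9
    assert chebyshev(r, chebyshev(s, x, p), p) == chebyshev(r * s, x, p)
    print("semigroup property holds: T_r(T_s(x)) == T_rs(x) (mod p)")
    ```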

  13. Failure Capacity Evaluation for Anchor System of NPP Facilities by using a Shaking Table Test

    International Nuclear Information System (INIS)

    Kwon, Hyung O; Jung, Min Ki; Park, Jin Wan; Lim, Ji Hoon

    2010-02-01

    This study investigates the influence of crack location on anchor performance, in order to evaluate the seismic performance of NPP equipment anchored to damaged concrete. For this purpose, small-scale specimens were fabricated for three cases: 1) an undamaged anchor; 2) cracks running through the anchor; and 3) cracks along the expected cone-shaped fracture surface away from the anchor. The results, verified with the finite element method, are as follows: in the first and second cases (an undamaged anchor, and cracks running through the anchor), failure occurred in the anchor steel. In the third case (cracks around the anchor), a 30% decline in seismic performance was identified. This result indicates that an evaluation of seismic performance and relevant reinforcement are required when cracks occur away from the anchor along the expected failure surface.

  14. Using measurements for evaluation of black carbon modeling

    Directory of Open Access Journals (Sweden)

    S. Gilardoni

    2011-01-01

    The ever increasing use of air quality and climate model assessments to underpin economic, public health, and environmental policy decisions makes effective model evaluation critical. This paper discusses the properties of black carbon, light attenuation, and absorption observations that are key to a reliable evaluation of black carbon models, and compares parametric and nonparametric statistical tools for quantifying the agreement between models and observations. Black carbon concentrations are simulated with the TM5/M7 global model from July 2002 to June 2003 at four remote sites (Alert, Jungfraujoch, Mace Head, and Trinidad Head) and two regional background sites (Bondville and Ispra). Equivalent black carbon (EBC) concentrations are calculated using light attenuation measurements from January 2000 to December 2005. Seasonal trends in the measurements are determined by fitting sinusoidal functions, and the representativeness of the period simulated by the model is verified based on the scatter of the experimental values relative to the fit curves. When the resolution of the model grid is larger than 1° × 1°, it is recommended to verify that the measurement site is representative of the grid cell. For this purpose, equivalent black carbon measurements at Alert, Bondville and Trinidad Head are compared to light absorption and elemental carbon measurements performed at different sites inside the same model grid cells. Comparison of these equivalent black carbon and elemental carbon measurements indicates that uncertainties in black carbon optical properties can compromise the comparison between model and observations. During model evaluation it is important to examine the extent to which a model is able to simulate the variability in the observations over different integration periods, as this will help to identify the most appropriate timescales. The agreement between model and observation is accurately described by the overlap of…
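
    The seasonal-trend step described above, fitting a sinusoid with a one-year period and inspecting the residual scatter, might look as follows; the monthly series is synthetic and scipy's curve_fit stands in for the paper's fitting procedure.

    ```python
    # Fit a one-year-period sinusoid to a monthly concentration series and
    # report the residual scatter. The series is synthetic.
    import numpy as np
    from scipy.optimize import curve_fit

    def seasonal(t, mean, amplitude, phase):
        return mean + amplitude * np.sin(2 * np.pi * t / 12.0 + phase)

    months = np.arange(72)                        # six years of monthly values
    rng = np.random.default_rng(2)
    ebc = seasonal(months, 0.30, 0.12, 1.0) + rng.normal(0, 0.03, months.size)

    params, _ = curve_fit(seasonal, months, ebc, p0=(0.3, 0.1, 0.0))
    residual_sd = np.std(ebc - seasonal(months, *params))
    print(f"fit: mean={params[0]:.2f}, amp={params[1]:.2f}; residual sd={residual_sd:.3f}")
    ```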

  15. Verifying Embedded Systems using Component-based Runtime Observers

    DEFF Research Database (Denmark)

    Guan, Wei; Marian, Nicolae; Angelov, Christo K.

    …against formally specified properties. This paper presents a component-based design method for runtime observers, which are configured from instances of prefabricated reusable components: the Predicate Evaluator (PE) and the Temporal Evaluator (TE). The PE computes atomic propositions for the TE; the latter is a reconfigurable component processing a data structure that represents the state transition diagram of a non-deterministic state machine, i.e. a Buchi automaton derived from a system property specified in Linear Temporal Logic (LTL). Observer components have been implemented using design models and design patterns…

  16. 49 CFR 40.137 - On what basis does the MRO verify test results involving marijuana, cocaine, amphetamines, or PCP?

    Science.gov (United States)

    2010-10-01

    ... involving marijuana, cocaine, amphetamines, or PCP? 40.137 Section 40.137 Transportation Office of the... results involving marijuana, cocaine, amphetamines, or PCP? (a) As the MRO, you must verify a confirmed positive test result for marijuana, cocaine, amphetamines, and/or PCP unless the employee presents a...

  17. Detection of Botnet Command and Control Traffic by the Multistage Trust Evaluation of Destination Identifiers

    Directory of Open Access Journals (Sweden)

    Pieter Burghouwt

    2015-10-01

    Network-based detection of botnet Command and Control communication is a difficult task if the traffic has a relatively low volume and if popular protocols, such as HTTP, are used to resemble normal traffic. We present a new network-based detection approach that is capable of detecting this type of Command and Control traffic in an enterprise network by estimating the trustworthiness of the traffic destinations. If the destination identifier of a traffic flow originates directly from human input, from prior traffic with a trusted destination, or from a defined set of legitimate applications, the destination is trusted and its associated traffic is classified as normal. Advantages of this approach are the ability to detect zero-day malicious traffic, low exposure to malware through passive host-external traffic monitoring, and applicability to real-time filtering. Experimental evaluation demonstrates successful detection of diverse types of Command and Control traffic.
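
    The trust rule summarized above reduces to a small classification function; the flow records, trusted set, and application whitelist below are invented placeholders.

    ```python
    # A destination is trusted if its identifier originates from human input,
    # from prior traffic with an already-trusted destination, or from a
    # whitelisted application; everything else is a C&C candidate.
    trusted_destinations = {"intranet.example.com"}
    whitelisted_apps = {"update-agent"}

    def classify_flow(flow, human_entered):
        dest = flow["dest"]
        if dest in human_entered:                          # typed/clicked by a user
            trusted_destinations.add(dest)
            return "normal"
        if flow.get("referrer") in trusted_destinations:   # derived from trusted traffic
            trusted_destinations.add(dest)
            return "normal"
        if flow.get("app") in whitelisted_apps:            # legitimate application
            return "normal"
        return "suspect C&C candidate"

    flow = {"dest": "203.0.113.7", "app": "unknown-process"}
    print(classify_flow(flow, human_entered={"news.example.org"}))  # suspect
    ```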

  18. Project evaluation: main characteristics

    OpenAIRE

    Moutinho, Nuno

    2010-01-01

    The evaluation process for real investment projects must consider not only the traditional financial approach but also non-financial aspects, as non-financial analysis can provide additional relevant information about projects. We investigate the financial and non-financial areas most relevant to project appraisal, and present the main critical success factors and areas of analysis that lead to the perception of project success. Finally, companies are segmented to verify their financial and non-financial...

  19. Standard guide to In-Plant performance evaluation of Hand-Held SNM monitors

    CERN Document Server

    American Society for Testing and Materials. Philadelphia

    1999-01-01

    1.1 This guide is one of a series on the application and evaluation of special nuclear material (SNM) monitors. Other guides in the series are listed in Section 2, and the relationship of in-plant performance evaluation to other procedures described in the series is illustrated in Fig. 1. Hand-held SNM monitors are described in Guide C1112, and performance criteria illustrating their capabilities can be found in Appendix X1. 1.2 The purpose of this guide to in-plant performance evaluation is to provide a comparatively rapid procedure to verify that a hand-held SNM monitor performs as expected for detecting SNM or alternative test sources, or to disclose the need for repair. The procedure can be used as a routine operational evaluation, or it can be used to verify performance after a monitor is calibrated. 1.3 In-plant performance evaluations are more comprehensive than daily functional tests. They take place less often, at intervals ranging from weekly to once every three months, and derive their result fr...

  20. A Usability Evaluation Model for Academic Library Websites: Efficiency, Effectiveness and Learnability

    Directory of Open Access Journals (Sweden)

    Soohyung Joo

    2011-12-01

    Purpose – This paper aimed to develop a usability evaluation model and an associated survey tool in the context of academic libraries; the study proposes not only a usability evaluation model but also a practical survey tool tailored to academic library websites. Design/methodology – A usability evaluation model was developed for academic library websites based on a literature review and expert consultation. The authors then verified the reliability and validity of the usability evaluation model empirically, using survey data from actual users. Statistical analyses, such as descriptive statistics, an internal consistency test, and a factor analysis, were applied to ensure both the reliability and validity of the evaluation tool. Findings – From the document analysis and expert consultation, this study identified eighteen measurement items to survey the three usability constructs: effectiveness, efficiency, and learnability, in academic library websites. The evaluation tool was then validated with regard to data distribution, reliability, and validity. The empirical examination, based on 147 actual user responses, showed that the survey tool suggested herein is acceptable for assessing academic library website usability. Originality/Value – This research is one of the few studies to produce a practical survey tool for evaluating library website usability. The usability model and corresponding survey tool would be useful for librarians and library administrators in academic libraries who plan to conduct a usability evaluation involving a large sample.
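
    The internal consistency test mentioned in the methodology is commonly Cronbach's alpha; a sketch over a fabricated respondents-by-items response matrix follows (the items and responses are invented, not the study's data).

    ```python
    # Cronbach's alpha for the items of one usability construct.
    # The response matrix is fabricated for illustration.
    import numpy as np

    def cronbach_alpha(items):
        """items: 2-D array, rows = respondents, columns = scale items."""
        items = np.asarray(items, dtype=float)
        k = items.shape[1]
        item_vars = items.var(axis=0, ddof=1).sum()
        total_var = items.sum(axis=1).var(ddof=1)
        return k / (k - 1) * (1 - item_vars / total_var)

    rng = np.random.default_rng(3)
    base = rng.normal(3.5, 0.8, size=(147, 1))          # latent usability score
    responses = np.clip(np.round(base + rng.normal(0, 0.5, size=(147, 6))), 1, 5)
    print(f"alpha = {cronbach_alpha(responses):.2f}")
    ```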

  1. Novel mutations in CRB1 gene identified in a chinese pedigree with retinitis pigmentosa by targeted capture and next generation sequencing

    Science.gov (United States)

    Lo, David; Weng, Jingning; Liu, Xiaohong; Yang, Juhua; He, Fen; Wang, Yun; Liu, Xuyang

    2016-01-01

    PURPOSE To detect the disease-causing gene in a Chinese pedigree with autosomal recessive retinitis pigmentosa (ARRP). METHODS All subjects in this family underwent a complete ophthalmic examination. Targeted-capture next generation sequencing (NGS) was performed on the proband to detect variants. All variants were verified in the remaining family members by PCR amplification and Sanger sequencing. RESULTS All affected subjects in this pedigree were diagnosed with retinitis pigmentosa (RP). The compound heterozygous c.138delA (p.Asp47IlefsX24) and c.1841G>T (p.Gly614Val) mutations in the Crumbs homolog 1 (CRB1) gene were identified in all affected patients but in none of the unaffected individuals in this family; the two mutations were inherited from the parents, respectively. CONCLUSION Novel compound heterozygous mutations in CRB1 were identified in a Chinese pedigree with ARRP using targeted-capture next generation sequencing. After evaluating the inheritance pattern and the impaired protein function, the compound heterozygous c.138delA (p.Asp47IlefsX24) and c.1841G>T (p.Gly614Val) mutations are concluded to be the cause of early-onset ARRP in this pedigree. To the best of our knowledge, there is no previous report of these compound mutations. PMID:27806333

  2. On the safe use of verify-and-record systems in external beam radiation therapy

    International Nuclear Information System (INIS)

    Seelantag, W.W.; Davis, J.B.

    1997-01-01

    Verify-and-record (V and R) systems are being used increasingly, not only for verification but also for computer-aided setup and chart printing. The close interconnection between the V and R system and the treatment routine requires new ideas for quality assurance (QA): pure "machine checking", as with treatment units, is no longer sufficient. The level of QA obviously depends on the tasks of the V and R system: the most advanced case, with the system being used for computer-aided setup and for chart printing, is discussed; both are indispensable for efficient use of V and R systems. Seven propositions are defined to make this use not only efficient but safe. (author)

  3. High performance APCS conceptual design and evaluation scoping study

    International Nuclear Information System (INIS)

    Soelberg, N.; Liekhus, K.; Chambers, A.; Anderson, G.

    1998-02-01

    This Air Pollution Control System (APCS) Conceptual Design and Evaluation study was conducted to evaluate a high-performance air pollution control (APC) system for minimizing air emissions from mixed waste thermal treatment systems. Seven variations of high-performance APCS designs were conceptualized against several design objectives. One of the system designs was selected for detailed process simulation using ASPEN PLUS to determine material and energy balances and evaluate performance. Installed system capital costs were also estimated. Sensitivity studies were conducted to evaluate the incremental cost and benefit of added carbon adsorber beds for mercury control, selective catalytic reduction for NOx control, and offgas retention tanks for holding the offgas until sample analysis verifies that it meets emission limits. Results show that the high-performance dry-wet APCS can easily meet all expected emission limits except possibly for mercury. The capability to achieve high levels of mercury control (potentially necessary for thermally treating some DOE mixed waste streams) could not be validated using current performance data for mercury control technologies. The engineering approach and ASPEN PLUS modeling tool developed and used in this study identified APC equipment and system performance, size, cost, and other issues that are not yet resolved. These issues need to be addressed in feasibility studies and conceptual designs for new facilities, or when determining how to modify existing facilities to meet expected emission limits. The ASPEN PLUS process simulation, with current and refined input assumptions and calculations, can be used to provide system performance information for decision-making, identifying the best options, estimating costs, reducing the potential for emission violations, providing information needed for waste flow analysis, incorporating new APCS technologies into existing designs, and performing facility design and permitting activities.

  4. Evaluation of unique identifiers used for citation linking [version 1; referees: 1 approved, 2 approved with reservations

    Directory of Open Access Journals (Sweden)

    Heidi Holst Madsen

    2016-06-01

    Unique identifiers (UIDs) are seen as an effective tool for creating links between identical publications in databases or for identifying duplicates within a database. The purpose of the present study is to investigate how well UIDs work for citation linking. We have two objectives: to explore the coverage, precision, and characteristics of publications matched versus not matched with UIDs as the match key; and to illustrate how publication sets formed by using UIDs as the match key may affect the bibliometric indicators: number of publications, number of citations, and average number of citations per publication. The objectives are addressed in a literature review and a case study. The literature review shows that only a few studies evaluate how well UIDs work as a match key. From the literature we identify four error types: duplicate digital object identifiers (DOIs), incorrect DOIs in reference lists and databases, DOIs not registered by the database where a bibliometric analysis is performed, and erroneous optical or special character recognition. The case study explores the use of UIDs in the integration between the databases Pure and SciVal; specifically, journal publications in English are matched between the two databases. We find all error types except erroneous optical or special character recognition in our publication sets. In particular, duplicate DOIs constitute a problem for the calculation of bibliometric indicators, as both keeping the duplicates (to improve the reliability of citation counts) and deleting them (to improve the reliability of publication counts) will distort the calculation of the average number of citations per publication. The use of UIDs as a match key in citation linking is implemented in many settings, and the availability of UIDs may become critical for the inclusion of a publication or a database in a bibliometric analysis.
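
    The duplicate-DOI error type discussed above can be screened for before UIDs are used as a match key; the records below are invented examples.

    ```python
    # Flag duplicate DOIs and records with no UID before citation linking.
    # The record list is fabricated for illustration.
    from collections import Counter

    records = [
        {"id": "pure-1", "doi": "10.1000/j.jx.2015.001"},
        {"id": "pure-2", "doi": "10.1000/j.jx.2015.002"},
        {"id": "pure-3", "doi": "10.1000/j.jx.2015.001"},   # duplicate DOI
        {"id": "pure-4", "doi": None},                      # missing UID
    ]

    counts = Counter(r["doi"] for r in records if r["doi"])
    duplicates = {doi for doi, n in counts.items() if n > 1}
    print("duplicate DOIs:", duplicates)
    print("unmatched (no UID):", [r["id"] for r in records if not r["doi"]])
    ```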

  5. Force10 networks performance in world's first transcontinental 10 gigabit ethernet network verified by Ixia

    CERN Multimedia

    2003-01-01

    Force10 Networks, Inc., today announced that the performance of the Force10 E-Series switch/routers deployed in a transcontinental network has been verified as line-rate 10 GE throughput by Ixia, a leading provider of high-speed network performance and conformance analysis systems. The network, the world's first transcontinental 10 GE wide area network, consists of a SURFnet OC-192 lambda between Geneva and the StarLight facility in Chicago via Amsterdam, and another OC-192 lambda between this same facility in Chicago and Carleton University in Ottawa, Canada, provided by CANARIE and ORANO.

  6. On verifying magnetic dipole moment of a magnetic torquer by experiments

    Science.gov (United States)

    Kuyyakanont, Aekjira; Kuntanapreeda, Suwat; Fuengwarodsakul, Nisai H.

    2018-01-01

    Magnetic torquers are used for the attitude control of small satellites, such as CubeSats in Low Earth Orbit (LEO). When designing a magnetic torquer, it is necessary to confirm that its magnetic dipole moment is sufficient to control the satellite attitude, since the magnetic dipole moment affects the detumbling time and the satellite rotation time. It is also necessary to understand how to design the magnetic torquer for operation in a CubeSat under the space environment at LEO. This paper reports an investigation of the magnetic dipole moment and the magnetic field generated by a circular air-core magnetic torquer, using experimental measurements. The experimental testbed was built on an air bearing placed in a magnetic field generated by a Helmholtz coil. This paper also describes the procedure to determine and verify the magnetic dipole moment of the designed circular air-core magnetic torquer. The experimental results are compared with the design calculations; according to this comparison, the designed magnetic torquer reaches the required magnetic dipole moment. This magnetic torquer will be applied to the attitude control system of a 1U CubeSat in the project "KNACKSAT."
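
    The design calculation being verified is, at its core, the dipole moment of a circular air-core coil, m = N·I·A, and the torque it can exert in the geomagnetic field. All parameter values below are illustrative assumptions, not the KNACKSAT design numbers.

    ```python
    # Dipole moment of a circular air-core coil and the maximum torque it can
    # produce in an assumed LEO-strength field. Values are illustrative.
    import math

    def dipole_moment(turns, current_a, radius_m):
        return turns * current_a * math.pi * radius_m**2   # A*m^2

    m = dipole_moment(turns=200, current_a=0.1, radius_m=0.04)
    b_leo = 3.0e-5                                         # T, typical LEO field
    torque_max = m * b_leo                                 # N*m, when m is perpendicular to B
    print(f"m = {m:.4f} A*m^2, max torque = {torque_max:.2e} N*m")
    ```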

  7. To what extent can behaviour change techniques be identified within an adaptable implementation package for primary care? A prospective directed content analysis.

    Science.gov (United States)

    Glidewell, Liz; Willis, Thomas A; Petty, Duncan; Lawton, Rebecca; McEachan, Rosemary R C; Ingleson, Emma; Heudtlass, Peter; Davies, Andrew; Jamieson, Tony; Hunter, Cheryl; Hartley, Suzanne; Gray-Burrows, Kara; Clamp, Susan; Carder, Paul; Alderson, Sarah; Farrin, Amanda J; Foy, Robbie

    2018-02-17

    Interpreting evaluations of complex interventions can be difficult without sufficient description of key intervention content. We aimed to develop an implementation package for primary care which could be delivered using typically available resources and adapted to target determinants of behaviour for each of four quality indicators: diabetes control, blood pressure control, anticoagulation for atrial fibrillation, and risky prescribing. We describe the development and prospective verification of the behaviour change techniques (BCTs) embedded within the adaptable implementation packages. We used an overlapping, multi-staged process. We identified evidence-based candidate delivery mechanisms, mainly audit and feedback, educational outreach, and computerised prompts and reminders. We drew upon interviews with primary care professionals, using the Theoretical Domains Framework, to explore likely determinants of adherence to the quality indicators. We linked determinants to candidate BCTs. With input from stakeholder panels, we prioritised likely determinants and intervention content prior to piloting the implementation packages. Our content analysis assessed the extent to which embedded BCTs could be identified within the packages and compared them across the delivery mechanisms and the four quality indicators. Each implementation package included at least 27 of 30 potentially applicable BCTs, representing 15 of 16 BCT categories. Whilst 23 BCTs were shared across all four implementation packages (e.g. BCTs relating to feedback and comparing behaviour), some BCTs were unique to certain delivery mechanisms (e.g. 'graded tasks' and 'problem solving' for educational outreach). BCTs addressing the determinants 'environmental context' and 'social and professional roles' (e.g. 'restructuring the social and physical environment' and 'adding objects to the environment') were indicator specific. We found it challenging to operationalise BCTs targeting 'environmental context…

  8. A Tentative Study on the Evaluation of Community Health Service Quality*

    Science.gov (United States)

    Ma, Zhi-qiang; Zhu, Yong-yue

    Community health service is a key point of health reform in China. Based on pertinent studies, this paper constructs an indicator system for evaluating community health service quality from five perspectives: visible image, reliability, responsiveness, assurance, and sympathy, following the service quality evaluation scale designed by Parasuraman, Zeithaml, and Berry. A multilevel fuzzy synthetic evaluation model was constructed to evaluate community health service using fuzzy mathematics. The applicability and operability of the evaluation indicator system and the evaluation model were verified by empirical analysis.
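
    The multilevel fuzzy synthetic evaluation named above composes a weight vector with a membership matrix, B = W · R (here with the weighted-average operator). The weights, grades, and membership values below are invented for illustration.

    ```python
    # Single-level fuzzy synthetic evaluation: B = W . R.
    # Rows of R: visible image, reliability, responsiveness, assurance, sympathy;
    # columns: membership in grades (excellent/good/fair/poor). Values invented.
    import numpy as np

    R = np.array([
        [0.30, 0.40, 0.20, 0.10],
        [0.25, 0.45, 0.20, 0.10],
        [0.20, 0.40, 0.30, 0.10],
        [0.35, 0.40, 0.15, 0.10],
        [0.30, 0.35, 0.25, 0.10],
    ])
    W = np.array([0.15, 0.25, 0.20, 0.25, 0.15])   # indicator weights, sum to 1

    B = W @ R                                      # weighted-average operator
    grades = ["excellent", "good", "fair", "poor"]
    print(dict(zip(grades, np.round(B, 3))))
    print("overall grade:", grades[int(np.argmax(B))])
    ```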

  9. A radiating shock evaluated using Implicit Monte Carlo Diffusion

    International Nuclear Information System (INIS)

    Cleveland, M.; Gentile, N.

    2013-01-01

    Implicit Monte Carlo [1] (IMC) has been shown to be very expensive when used to evaluate a radiation field in opaque media. Implicit Monte Carlo Diffusion (IMD) [2], which evaluates a spatially discretized diffusion equation using a Monte Carlo algorithm, can be used to reduce the cost of evaluating the radiation field in opaque media [2]. This work couples IMD to the hydrodynamics equations to evaluate opaque diffusive radiating shocks. The Lowrie semi-analytic diffusive radiating shock benchmark [a] is used to verify our implementation of the coupled system of equations. (authors)

  10. Identifying MRI markers to evaluate early treatment-related changes post-laser ablation for cancer pain management

    Science.gov (United States)

    Tiwari, Pallavi; Danish, Shabbar; Madabhushi, Anant

    2014-03-01

    …by correcting for intensity drift in order to examine tissue-specific response, and (3) quantification of MRI maps via texture and intensity features to evaluate changes in MR markers pre- and post-LITT. A total of 78 texture features, comprising non-steerable and steerable gradient and second-order statistical features, were extracted from pre- and post-LITT MP-MRI on a per-voxel basis. Quantitative, voxel-wise comparison of the changes in MRI texture features between pre- and post-LITT MRI indicates that (a) steerable and non-steerable gradient texture features were highly sensitive as well as specific in predicting subtle micro-architectural changes within and around the ablation zone pre- and post-LITT, (b) FLAIR was identified as the most sensitive MRI protocol in identifying early treatment changes, yielding a normalized percentage change of 360% within the ablation zone relative to its pre-LITT value, and (c) GRE was identified as the most sensitive MRI protocol in quantifying changes outside the ablation zone post-LITT. Our preliminary results thus indicate great potential for non-invasive computerized MRI features in determining localized micro-architectural focal treatment-related changes post-LITT.

  11. Identifying multiple influential spreaders based on generalized closeness centrality

    Science.gov (United States)

    Liu, Huan-Li; Ma, Chuang; Xiang, Bing-Bing; Tang, Ming; Zhang, Hai-Feng

    2018-02-01

    To maximize the spreading influence of multiple spreaders in complex networks, one important fact cannot be ignored: the multiple spreaders should be dispersively distributed in the network, which can effectively reduce the redundancy of information spreading. For this purpose, we define a generalized closeness centrality (GCC) index by generalizing the closeness centrality index to a set of nodes. The problem then becomes identifying multiple spreaders such that an objective function attains its minimal value. By comparing with the K-means clustering algorithm, we find that this optimization problem is very similar to minimizing the objective function in the K-means method. Therefore, finding multiple nodes with the highest GCC value can be approximately solved by the K-means method. Two typical transmission dynamics, the epidemic spreading process and the rumor spreading process, are implemented on real networks to verify the good performance of our proposed method.
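
    The set-level objective behind the GCC can be sketched as a K-medoids-like greedy selection: for a candidate spreader set S, sum each node's shortest-path distance to its nearest member of S, and grow S one node at a time. This is an illustrative heuristic under those assumptions, not the paper's exact algorithm.

    ```python
    # Greedy selection of k dispersed spreaders by minimizing the total
    # distance from every node to its nearest chosen spreader.
    import networkx as nx

    def dispersion_cost(graph, spreaders):
        dists = [nx.single_source_shortest_path_length(graph, s) for s in spreaders]
        return sum(min(d[v] for d in dists) for v in graph.nodes)

    def greedy_spreaders(graph, k):
        chosen = []
        for _ in range(k):
            best = min((n for n in graph.nodes if n not in chosen),
                       key=lambda n: dispersion_cost(graph, chosen + [n]))
            chosen.append(best)
        return chosen

    g = nx.karate_club_graph()
    print(greedy_spreaders(g, 3))   # three well-separated spreader candidates
    ```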

  12. Procedures for measuring and verifying gastric tube placement in newborns: an integrative review.

    Science.gov (United States)

    Dias, Flávia de Souza Barbosa; Emidio, Suellen Cristina Dias; Lopes, Maria Helena Baena de Moraes; Shimo, Antonieta Keiko Kakuda; Beck, Ana Raquel Medeiros; Carmona, Elenice Valentim

    2017-07-10

    Objective: to investigate evidence in the literature on procedures for measuring gastric tube insertion length in newborns and for verifying tube placement, using procedures that are alternatives to radiological examination. Method: an integrative review of the literature in the Cochrane, LILACS, CINAHL, EMBASE, MEDLINE and Scopus databases, using the descriptors "intubation, gastrointestinal" and "newborns" in original articles. Results: seventeen publications were included and categorized as "measuring method" or "technique for verifying placement". Regarding measuring methods, measurements of two morphological distances and the application of two formulas, one based on weight and another based on height, were found. Regarding the techniques for assessing placement, the following were found: electromagnetic tracing, diaphragm electrical activity, CO2 detection, indigo carmine solution, epigastrium auscultation, gastric secretion aspiration, color inspection, and evaluation of pH, enzymes and bilirubin. Conclusion: the measuring method using the distance from the nose to the earlobe to a point midway between the xiphoid process and the umbilicus presents the best evidence. Equations based on weight and height need to be experimentally tested. The return of secretion on tube aspiration, assessment of its color, and secretion pH are reliable indicators for identifying gastric tube placement, and are the currently indicated techniques.

  13. New verifiable stationarity concepts for a class of mathematical programs with disjunctive constraints.

    Science.gov (United States)

    Benko, Matúš; Gfrerer, Helmut

    2018-01-01

    In this paper, we consider a sufficiently broad class of non-linear mathematical programs with disjunctive constraints, which, e.g., include mathematical programs with complementarity/vanishing constraints. We present an extension of the concept of [Formula: see text]-stationarity which can easily be combined with the well-known notion of M-stationarity to obtain the stronger property of so-called [Formula: see text]-stationarity. We show how the property of [Formula: see text]-stationarity (and thus also of M-stationarity) can be efficiently verified for the considered problem class by computing [Formula: see text]-stationary solutions of a certain quadratic program. We further consider the situation where the point to be tested for [Formula: see text]-stationarity is not known exactly, but is approximated by some convergent sequence, as is usually the case when applying a numerical method.

  14. IRPhEP-handbook, International Handbook of Evaluated Reactor Physics Benchmark Experiments

    International Nuclear Information System (INIS)

    Sartori, Enrico; Blair Briggs, J.

    2008-01-01

    1 - Description: The purpose of the International Reactor Physics Experiment Evaluation Project (IRPhEP) is to provide an extensively peer-reviewed set of reactor physics-related integral data that can be used by reactor designers and safety analysts to validate the analytical tools used to design next-generation reactors and to establish the safety basis for operation of these reactors. The work of the IRPhEP is formally documented in the 'International Handbook of Evaluated Reactor Physics Benchmark Experiments,' a single source of verified and extensively peer-reviewed reactor physics benchmark measurement data. The IRPhE Handbook is available on DVD; a DVD may be requested by completing the DVD Request Form available at http://irphep.inl.gov/handbook/hbrequest.shtml. The evaluation process entails the following steps: 1. identify a comprehensive set of reactor physics experimental measurement data; 2. evaluate the data and quantify overall uncertainties through various types of sensitivity analysis to the extent possible, and verify the data by reviewing original and subsequently revised documentation and by talking with the experimenters or individuals who are familiar with the experimental facility; 3. compile the data into a standardized format; 4. perform calculations of each experiment with standard reactor physics codes where this would add information; 5. formally document the work in a single source of verified and peer-reviewed reactor physics benchmark measurement data. The International Handbook of Evaluated Reactor Physics Benchmark Experiments contains reactor physics benchmark specifications derived from experiments performed at various nuclear experimental facilities around the world. The benchmark specifications are intended for use by reactor physics personnel to validate calculational techniques. The 2008 Edition of the International Handbook of Evaluated Reactor Physics Experiments contains data from 25 different

  15. Automatic system for evaluation of ionizing field

    International Nuclear Information System (INIS)

    Pimenta, N.L.; Calil, S.J.

    1992-01-01

    A three-dimensional Cartesian manipulator, able to position an ionization chamber at any point in space, was developed for evaluating the ionizing field. The control system is implemented on an IBM microcomputer. The system is aimed at the study of isodose curves from ionizing sources, verifying the performance of radiotherapeutic equipment. (C.G.C.)

  16. Evaluating SPARQL queries on massive RDF datasets

    KAUST Repository

    Al-Harbi, Razen; Abdelaziz, Ibrahim; Kalnis, Panos; Mamoulis, Nikos

    2015-01-01

    In this paper, we propose AdHash, a distributed RDF system which addresses the shortcomings of previous work. First, AdHash initially applies lightweight hash partitioning, which drastically minimizes the startup cost while favoring the parallel processing of join patterns on subjects, without any data communication. Using a locality-aware planner, queries that cannot be processed in parallel are evaluated with minimal communication. Second, AdHash monitors the data access patterns and adapts dynamically to the query load by incrementally redistributing and replicating frequently accessed data. As a result, the communication cost for future queries is drastically reduced or even eliminated. Our experiments with synthetic and real data verify that AdHash (i) starts faster than all existing systems, (ii) processes thousands of queries before other systems come online, and (iii) gracefully adapts to the query load, being able to evaluate queries on billion-scale RDF data in sub-seconds. In this demonstration, the audience can use a graphical interface of AdHash to verify its performance superiority compared to state-of-the-art distributed RDF systems.
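
    As a minimal sketch of the lightweight hash-partitioning idea described above, the Python fragment below assigns each triple to a worker by hashing its subject, so that star-join patterns on a subject can be answered without cross-worker communication. The worker count and triple format are illustrative assumptions, not AdHash's actual interface.

        import hashlib

        NUM_WORKERS = 4  # hypothetical cluster size

        def partition(triple, num_workers=NUM_WORKERS):
            """Route a (subject, predicate, object) triple to a worker by
            hashing its subject; all triples sharing a subject co-locate,
            so joins on that subject need no data communication."""
            subject = triple[0]
            digest = hashlib.md5(subject.encode("utf-8")).digest()
            return int.from_bytes(digest[:4], "big") % num_workers

        triples = [
            ("ex:alice", "foaf:knows", "ex:bob"),
            ("ex:alice", "foaf:age", "42"),
            ("ex:bob", "foaf:knows", "ex:carol"),
        ]
        for t in triples:
            print(partition(t), t)

    A deterministic digest is used instead of Python's built-in hash() so that placement is stable across processes, which any distributed partitioner would require.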

  17. Toward verifying fossil fuel CO2 emissions with the CMAQ model: motivation, model description and initial simulation.

    Science.gov (United States)

    Liu, Zhen; Bambha, Ray P; Pinto, Joseph P; Zeng, Tao; Boylan, Jim; Huang, Maoyi; Lei, Huimin; Zhao, Chun; Liu, Shishi; Mao, Jiafu; Schwalm, Christopher R; Shi, Xiaoying; Wei, Yaxing; Michelsen, Hope A

    2014-04-01

    Motivated by the question of whether and how a state-of-the-art regional chemical transport model (CTM) can facilitate characterization of CO2 spatiotemporal variability and verify CO2 fossil-fuel emissions, we applied, for the first time, the Community Multiscale Air Quality (CMAQ) model to simulate CO2. This paper presents methods, input data, and initial results for CO2 simulation using CMAQ over the contiguous United States in October 2007. Modeling experiments have been performed to understand the roles of fossil-fuel emissions, biosphere-atmosphere exchange, and meteorology in regulating the spatial distribution of CO2 near the surface over the contiguous United States. Three sets of net ecosystem exchange (NEE) fluxes were used as input to assess the impact of the uncertainty of NEE on CO2 concentrations simulated by CMAQ. Observational data from six tall tower sites across the country were used to evaluate model performance. In particular, at the Boulder Atmospheric Observatory (BAO), a tall tower site that receives urban emissions from Denver, CO, the CMAQ model using hourly varying, high-resolution CO2 fossil-fuel emissions from the Vulcan inventory and CarbonTracker-optimized NEE reproduced the observed diurnal profile of CO2 reasonably well but with a low bias in the early morning. The spatial distribution of CO2 was found to correlate with NO(x), SO2, and CO, because of their similar fossil-fuel emission sources and common transport processes. These initial results from CMAQ demonstrate the potential of using a regional CTM to help interpret CO2 observations and understand CO2 variability in space and time. The ability to simulate a full suite of air pollutants in CMAQ will also facilitate investigations of their use as tracers for CO2 source attribution. This work serves as a proof of concept and the foundation for more comprehensive examinations of CO2 spatiotemporal variability and various uncertainties in the future. Atmospheric CO2 has long been modeled

  18. Evaluation of the performance of an ultrasonic cross-correlation flowmeter

    International Nuclear Information System (INIS)

    Bazerghi, H.; Serdula, K.J.

    1977-09-01

    An ultrasonic cross-correlation flowmeter, developed to assist in improving the performance of heavy water plants, was evaluated. The overall performance of the flowmeter is satisfactory. The flowmeter is ideally suited to industrial applications and has an accuracy and repeatability comparable to many laboratory instruments. An accuracy of 3% is readily obtainable. This new 'clamp-on' portable flowmeter should prove useful for providing flow measurements in systems where pipe penetration is too costly or not practical, for verifying or replacing existing flowmeters, and for measuring flows in lines not previously instrumented, in order to provide better control or to verify the performance of systems.
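
    The transit-time principle behind such a cross-correlation flowmeter can be illustrated with a short, self-contained sketch: turbulence "signatures" seen by an upstream sensor reappear at a downstream sensor after a delay proportional to flow velocity, and the delay is recovered as the peak of the cross-correlation. The sensor spacing, sample rate and noise level are invented for the example.

        import numpy as np

        rng = np.random.default_rng(0)
        fs = 10_000.0        # sample rate [Hz] (assumed)
        sensor_gap = 0.10    # distance between the two sensors [m] (assumed)
        true_delay = 0.0123  # transit time to estimate [s]

        upstream = rng.normal(size=4_000)
        shift = int(round(true_delay * fs))
        downstream = np.roll(upstream, shift) + 0.3 * rng.normal(size=upstream.size)

        # The peak of the cross-correlation gives the lag between the signals.
        corr = np.correlate(downstream, upstream, mode="full")
        lag = corr.argmax() - (upstream.size - 1)
        delay = lag / fs
        print(f"estimated delay {delay * 1e3:.2f} ms -> velocity {sensor_gap / delay:.3f} m/s")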

  19. Identifying attributes of food literacy: a scoping review.

    Science.gov (United States)

    Azevedo Perry, Elsie; Thomas, Heather; Samra, H Ruby; Edmonstone, Shannon; Davidson, Lyndsay; Faulkner, Amy; Petermann, Lisa; Manafò, Elizabeth; Kirkpatrick, Sharon I

    2017-09-01

    An absence of food literacy measurement tools makes it challenging for nutrition practitioners to assess the impact of food literacy on healthy diets and to evaluate the outcomes of food literacy interventions. The objective of the present scoping review was to identify the attributes of food literacy. A scoping review of peer-reviewed and grey literature was conducted and attributes of food literacy identified. Subjects included in the search were high-risk groups. Eligible articles were limited to research from Canada, USA, the UK, Australia and New Zealand. The search identified nineteen peer-reviewed and thirty grey literature sources. Fifteen identified food literacy attributes were organized into five categories. Food and Nutrition Knowledge informs decisions about intake and distinguishing between 'healthy' and 'unhealthy' foods. Food Skills focuses on techniques of food purchasing, preparation, handling and storage. Self-Efficacy and Confidence represent one's capacity to perform successfully in specific situations. Ecologic refers to beyond self and the interaction of macro- and microsystems with food decisions and behaviours. Food Decisions reflects the application of knowledge, information and skills to make food choices. These interdependent attributes are depicted in a proposed conceptual model. The lack of evaluated tools inhibits the ability to assess and monitor food literacy; tailor, target and evaluate programmes; identify gaps in programming; engage in advocacy; and allocate resources. The present scoping review provides the foundation for the development of a food literacy measurement tool to address these gaps.

  20. Verifying the transition from low levels of nuclear weapons to a nuclear weapon-free world. VERTIC research report no. 2

    International Nuclear Information System (INIS)

    Milne, T.; Wilson, H.

    1999-01-01

    The process of verifying the complete elimination of nuclear warheads in national stockpiles can be divided, conceptually, into four stages: first, comprehensive declarations of warhead and material inventories, as a baseline from which verified disarmament can proceed; second, the transfer of all nuclear weapons and weapons-grade fissile material into bonded store; third, demilitarisation measures, such as to render warheads unusable without disassembly and refabrication; fourth, dismantlement of warheads and disposition of fissile material. Many of the technologies and techniques needed for verifying the elimination of nuclear warheads have been worked out at a general level, largely in US studies. While it is essential that these techniques are refined and improved, what is most important now, if disarmament is to proceed expeditiously, is for each of the nuclear weapon states (NWS) themselves to study the central verification problems and requirements in order to identify particular techniques and approaches that meet their needs. As yet there is no system of integrated data exchange and verification that any of the NWS is willing to endorse. Each of the NWS should give detailed consideration to the logistics of dismantling the warheads in their respective stockpiles, including, for example, the practicalities of accommodating international verification at their potential dismantlement facilities. Each of the NWS might usefully review exactly which details of warhead design and construction have to remain secret in the course of the disarmament process, in the first place from one another, and second from the IAEA or any other international body that might be involved in international disarmament arrangements. Introducing transparency and verification into national nuclear weapons programmes might have a significant financial cost. Research and ingenuity might reduce this cost, however, and early investments in these fields, with sharing of

  1. Guidance to Risk-Informed Evaluation of Technical Specifications using PSA

    International Nuclear Information System (INIS)

    Baeckstroem, Ola; Haeggstroem, Anna; Maennistoe, Ilkka

    2010-04-01

    This report presents guidance for the evaluation of Technical Specification (TS) conditions with PSA. It covers quality in PSA, how to verify that the PSA model is sufficiently robust and sufficiently complete, and general requirements on methods. Acceptance criteria for the evaluation of changes in the TS conditions are presented. As probabilistic safety assessment (PSA) has developed over the years, it has proven to be a useful tool for evaluating many aspects of the TS from a risk point of view, and in that way making the PSAs, as well as the decision tools, better. This also means that it will be possible to take credit for safety system overcapacity as well as inherent safety features and the strength of non-safety-classed systems. However, PSA is only one of the tools that shall be used in an evaluation process of TS changes (strengthening/relaxation). PSA is an excellent tool for verifying the importance, and thereby possibly the relaxation, of TS requirements. But, since PSA is only one tool in the evaluation, it is not sufficient in itself for defining which equipment shall or shall not have TS requirements. The purpose of this guidance document is to provide general requirements, requirements on methods and acceptance criteria for risk-informed evaluation of TS changes based on PSA. The purpose is not to provide a single solution. As part of the review of the TS conditions, this guidance specifies requirements on: - Quality verification of the PSA model; - Verification that the PSA model is sufficiently robust with regard to SSCs for which requirements both are and are not defined by the TS; - Verification that the SSCs, for which TS demands are to be evaluated, are modelled in a sufficient manner; - Methods for performing the evaluation; - Which evaluation criteria shall be used (and how they are verified to be correct); - Acceptance criteria. This guidance also briefly discusses the documentation of the analysis of the TS changes. This guidance

  2. Guidance to risk-informed evaluation of technical specifications using PSA

    International Nuclear Information System (INIS)

    Baeckstroem, O.; Haeggstroem, A.; Maennistoe, I.

    2010-10-01

    This report presents guidance for the evaluation of Technical Specification (TS) conditions with PSA. It covers quality in PSA, how to verify that the PSA model is sufficiently robust and sufficiently complete, and general requirements on methods. Acceptance criteria for the evaluation of changes in the TS conditions are presented. As probabilistic safety assessment (PSA) has developed over the years, it has proven to be a useful tool for evaluating many aspects of the TS from a risk point of view, and in that way making the PSAs, as well as the decision tools, better. This also means that it will be possible to take credit for safety system overcapacity as well as inherent safety features and the strength of non-safety-classed systems. However, PSA is only one of the tools that shall be used in an evaluation process of TS changes (strengthening/relaxation). PSA is an excellent tool for verifying the importance, and thereby possibly the relaxation, of TS requirements. But, since PSA is only one tool in the evaluation, it is not sufficient in itself for defining which equipment shall or shall not have TS requirements. The purpose of this guidance document is to provide general requirements, requirements on methods and acceptance criteria for risk-informed evaluation of TS changes based on PSA. The purpose is not to provide a single solution. As part of the review of the TS conditions, this guidance specifies requirements on: - Quality verification of the PSA model; - Verification that the PSA model is sufficiently robust with regard to SSCs for which requirements both are and are not defined by the TS; - Verification that the SSCs, for which TS demands are to be evaluated, are modelled in a sufficient manner; - Methods for performing the evaluation; - Which evaluation criteria shall be used (and how they are verified to be correct); - Acceptance criteria. This guidance also briefly discusses the documentation of the analysis of the TS changes. This guidance

  3. Guidance to risk-informed evaluation of technical specifications using PSA

    Energy Technology Data Exchange (ETDEWEB)

    Baeckstroem, O.; Haeggstroem, A. (Scandpower AB, Stockholm (Sweden)); Maennistoe, I. (VTT, Helsingfors (Finland))

    2010-04-15

    This report presents guidance for the evaluation of Technical Specification (TS) conditions with PSA. It covers quality in PSA, how to verify that the PSA model is sufficiently robust and sufficiently complete, and general requirements on methods. Acceptance criteria for the evaluation of changes in the TS conditions are presented. As probabilistic safety assessment (PSA) has developed over the years, it has proven to be a useful tool for evaluating many aspects of the TS from a risk point of view, and in that way making the PSAs, as well as the decision tools, better. This also means that it will be possible to take credit for safety system overcapacity as well as inherent safety features and the strength of non-safety-classed systems. However, PSA is only one of the tools that shall be used in an evaluation process of TS changes (strengthening/relaxation). PSA is an excellent tool for verifying the importance, and thereby possibly the relaxation, of TS requirements. But, since PSA is only one tool in the evaluation, it is not sufficient in itself for defining which equipment shall or shall not have TS requirements. The purpose of this guidance document is to provide general requirements, requirements on methods and acceptance criteria for risk-informed evaluation of TS changes based on PSA. The purpose is not to provide a single solution. As part of the review of the TS conditions, this guidance specifies requirements on: - Quality verification of the PSA model; - Verification that the PSA model is sufficiently robust with regard to SSCs for which requirements both are and are not defined by the TS; - Verification that the SSCs, for which TS demands are to be evaluated, are modelled in a sufficient manner; - Methods for performing the evaluation; - Which evaluation criteria shall be used (and how they are verified to be correct); - Acceptance criteria. This guidance also briefly discusses the documentation of the analysis of the TS changes. This guidance

  4. Guidance to Risk-Informed Evaluation of Technical Specifications using PSA

    Energy Technology Data Exchange (ETDEWEB)

    Baeckstroem, Ola; Haeggstroem, Anna (Scandpower AB, Stockholm (Sweden)); Maennistoe, Ilkka (VTT, Helsingfors (Finland))

    2010-04-15

    This report presents guidance for the evaluation of Technical Specification (TS) conditions with PSA. It covers quality in PSA, how to verify that the PSA model is sufficiently robust and sufficiently complete, and general requirements on methods. Acceptance criteria for the evaluation of changes in the TS conditions are presented. As probabilistic safety assessment (PSA) has developed over the years, it has proven to be a useful tool for evaluating many aspects of the TS from a risk point of view, and in that way making the PSAs, as well as the decision tools, better. This also means that it will be possible to take credit for safety system overcapacity as well as inherent safety features and the strength of non-safety-classed systems. However, PSA is only one of the tools that shall be used in an evaluation process of TS changes (strengthening/relaxation). PSA is an excellent tool for verifying the importance, and thereby possibly the relaxation, of TS requirements. But, since PSA is only one tool in the evaluation, it is not sufficient in itself for defining which equipment shall or shall not have TS requirements. The purpose of this guidance document is to provide general requirements, requirements on methods and acceptance criteria for risk-informed evaluation of TS changes based on PSA. The purpose is not to provide a single solution. As part of the review of the TS conditions, this guidance specifies requirements on: - Quality verification of the PSA model; - Verification that the PSA model is sufficiently robust with regard to SSCs for which requirements both are and are not defined by the TS; - Verification that the SSCs, for which TS demands are to be evaluated, are modelled in a sufficient manner; - Methods for performing the evaluation; - Which evaluation criteria shall be used (and how they are verified to be correct); - Acceptance criteria. This guidance also briefly discusses the documentation of the analysis of the TS changes. This guidance

  5. Identifying e-cigarette vape stores: description of an online search methodology.

    Science.gov (United States)

    Kim, Annice E; Loomis, Brett; Rhodes, Bryan; Eggers, Matthew E; Liedtke, Christopher; Porter, Lauren

    2016-04-01

    Although the overall impact of Electronic Nicotine Delivery Systems (ENDS) on public health is unclear, awareness, use, and marketing of the products have increased markedly in recent years. Identifying the increasing number of 'vape stores' that specialise in selling ENDS can be challenging given the lack of regulatory policies and licensing. This study assesses the utility of online search methods in identifying ENDS vape stores. We conducted online searches in Google Maps, Yelp, and YellowPages to identify listings of ENDS vape stores in Florida, and used a crowdsourcing platform to call and verify stores that primarily sold ENDS to consumers. We compared the store listings generated by the online search and crowdsourcing methodology to the list of licensed tobacco and ENDS retailers from the Florida Department of Business and Professional Regulation. The combined results from all three online sources yielded a total of 403 ENDS vape stores. About 32.5% of these stores were on the state tobacco licensure list, while 67.5% were not. Accuracy of online results was highest for Yelp (77.6%), followed by YellowPages (77.1%) and Google (53.0%). Using the online search methodology, we identified more ENDS vape stores than were on the state tobacco licensure list. This approach may be a promising strategy to identify and track the growth of ENDS vape stores over time, especially in states without a systematic licensing requirement for such stores. Published by the BMJ Publishing Group Limited. For permission to use (where not already granted under a licence) please go to http://www.bmj.com/company/products-services/rights-and-licensing/

  6. Is it possible to verify directly a proton-treatment plan using positron emission tomography?

    International Nuclear Information System (INIS)

    Vynckier, S.; Derreumaux, S.; Richard, F.; Wambersie, A.; Bol, A.; Michel, C.

    1993-01-01

    A PET camera is used to visualize the positron activity induced during proton-beam therapy in order to verify proton-treatment plans directly. The positron emitters created are predominantly ¹⁵O and ¹¹C, whose total activity amounts to 12 MBq after an irradiation with 85 MeV protons delivering 3 Gy in a volume of approximately 300 cm³. Although this method is a useful verification of patient setup, care must be taken when deriving dose distributions from activity distributions. Correlation between the two quantities is difficult; moreover, over the last millimeters of their range, protons no longer activate tissue. Due to the short half-lives, the PET camera must be located close to the treatment facility. (author) 17 refs

  7. How to verify lightning protection efficiency for electrical systems? Testing procedures and practical applications

    Energy Technology Data Exchange (ETDEWEB)

    Birkl, Josef; Zahlmann, Peter [DEHN and SOEHNE, Neumarkt (Germany)], Emails: Josef.Birkl@technik.dehn.de, Peter.Zahlmann@technik.dehn.de

    2007-07-01

    There are increasing numbers of applications in which Surge Protective Devices (SPDs) carrying partial lightning currents and highly sensitive electronic devices must be installed close to each other, because the design of electric distribution systems and switchgear installations is becoming more and more compact. In these cases, the protective function of the SPDs has to be coordinated with the individual immunity of the equipment to energetic, conducted impulse voltages and impulse currents. In order to verify the immunity of the complete system against partial lightning currents, laboratory tests at the system level are a suitable approach. The proposed test schemes for complete systems have been successfully performed in various applications. Examples will be presented. (author)

  8. RETRANS - A tool to verify the functional equivalence of automatically generated source code with its specification

    International Nuclear Information System (INIS)

    Miedl, H.

    1998-01-01

    Following the applicable technical standards (e.g. IEC 880), it is necessary to verify each step in the development process of safety-critical software. This also holds for the verification of automatically generated source code. To avoid human errors during this verification step and to limit the cost, a tool should be used that is developed independently of the code generator. For this purpose, ISTec has developed the tool RETRANS, which demonstrates the functional equivalence of automatically generated source code with its underlying specification. (author)

  9. Validating the use of the evaluation tool of children's handwriting-manuscript to identify handwriting difficulties and detect change in school-age children.

    Science.gov (United States)

    Brossard-Racine, Marie; Mazer, Barbara; Julien, Marilyse; Majnemer, Annette

    2012-01-01

    In this study we sought to validate the discriminant ability of the Evaluation Tool of Children's Handwriting-Manuscript in identifying children in Grades 2-3 with handwriting difficulties, and to determine the percentage of change in handwriting scores that is consistently detected by occupational therapists. Thirty-four therapists judged and compared 35 pairs of handwriting samples. Receiver operating characteristic (ROC) analyses were performed to determine (1) the optimal cutoff values for word and letter legibility scores that identify children with handwriting difficulties who should be seen in rehabilitation and (2) the minimal clinically important difference (MCID) in handwriting scores. Cutoff scores of 75.0% for total word legibility and 76.0% for total letter legibility were found to provide excellent levels of accuracy. A difference of 10.0%-12.5% for total word legibility and of 6.0%-7.0% for total letter legibility was found to be the MCID. The study findings enable therapists to quantitatively support clinical judgment when evaluating handwriting. Copyright © 2012 by the American Occupational Therapy Association, Inc.
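
    The ROC-based cutoff selection the abstract describes can be mirrored in a few lines; here synthetic legibility scores stand in for the therapist-judged samples and Youden's J statistic picks the threshold, so only the method, not the data, reflects the study.

        import numpy as np
        from sklearn.metrics import roc_curve

        rng = np.random.default_rng(1)
        # Synthetic word-legibility scores (%): lower for the difficulty group.
        scores = np.concatenate([rng.normal(85, 8, 40), rng.normal(65, 10, 20)])
        difficulty = np.concatenate([np.zeros(40), np.ones(20)])

        # Low scores indicate difficulty, so negate the score for roc_curve.
        fpr, tpr, thresholds = roc_curve(difficulty, -scores)
        best = thresholds[(tpr - fpr).argmax()]  # maximize Youden's J = TPR - FPR
        print(f"optimal cutoff: {-best:.1f}% legibility")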

  10. National, Regional and Global Certification Bodies for Polio Eradication: A Framework for Verifying Measles Elimination.

    Science.gov (United States)

    Deblina Datta, S; Tangermann, Rudolf H; Reef, Susan; William Schluter, W; Adams, Anthony

    2017-07-01

    The Global Certification Commission (GCC), Regional Certification Commissions (RCCs), and National Certification Committees (NCCs) provide a framework of independent bodies to assist the Global Polio Eradication Initiative (GPEI) in certifying and maintaining polio eradication in a standardized, ongoing, and credible manner. Their members meet regularly to comprehensively review population immunity, surveillance, laboratory, and other data to assess polio status in the country (NCC), World Health Organization (WHO) region (RCC), or globally (GCC). These highly visible bodies provide a framework to be replicated to independently verify measles and rubella elimination in the regions and globally. © The Author 2017. Published by Oxford University Press for the Infectious Diseases Society of America.

  11. Identifying key genes associated with acute myocardial infarction.

    Science.gov (United States)

    Cheng, Ming; An, Shoukuan; Li, Junquan

    2017-10-01

    This study aimed to identify key genes associated with acute myocardial infarction (AMI) by reanalyzing microarray data. Three gene expression profile datasets, GSE66360, GSE34198, and GSE48060, were downloaded from the GEO database. After data preprocessing, genes without heterogeneity across the different platforms were subjected to differential expression analysis between the AMI group and the control group using the metaDE package. The differentially expressed genes (DEGs) were used to construct a functional interaction (FI) network. Then, DEGs in each module were subjected to pathway enrichment analysis using DAVID. MiRNAs and transcription factors predicted to regulate the target DEGs were identified. Quantitative real-time polymerase chain reaction (RT-PCR) was applied to verify the expression of the genes. A total of 913 upregulated genes and 1060 downregulated genes were identified in the AMI group. The FI network consists of 21 modules, and DEGs in 12 modules were significantly enriched in pathways. The transcription factor-miRNA-gene network contains 2 transcription factors, FOXO3 and MYBL2, and 2 miRNAs, hsa-miR-21-5p and hsa-miR-30c-5p. RT-PCR validations showed that expression levels of FOXO3 and MYBL2 were significantly increased in AMI, and expression levels of hsa-miR-21-5p and hsa-miR-30c-5p were markedly decreased in AMI. A total of 41 DEGs, such as SOCS3, VAPA, and COL5A2, are speculated to have roles in the pathogenesis of AMI; the 2 transcription factors FOXO3 and MYBL2 and the 2 miRNAs hsa-miR-21-5p and hsa-miR-30c-5p may be involved in the regulation of the expression of these DEGs.
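
    metaDE itself is an R package, but the differential-expression screening step it performs can be illustrated with a toy per-gene test; the expression matrix and threshold below are invented, and a real analysis would also correct for multiple testing.

        import numpy as np
        from scipy import stats

        rng = np.random.default_rng(2)
        n_genes, n_ami, n_ctrl = 1000, 30, 30
        expr = rng.normal(size=(n_genes, n_ami + n_ctrl))
        expr[:50, :n_ami] += 1.5  # spike in 50 "true" DEGs for the demo

        # Per-gene two-sample t-test between AMI and control columns.
        t, p = stats.ttest_ind(expr[:, :n_ami], expr[:, n_ami:], axis=1)
        degs = np.flatnonzero(p < 0.05)
        print(f"{degs.size} genes pass p < 0.05")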

  12. In silico analysis to identify vaccine candidates common to multiple serotypes of Shigella and evaluation of their immunogenicity

    KAUST Repository

    Pahil, Sapna

    2017-08-02

    Shigellosis, or bacillary dysentery, is an important cause of diarrhea, with the majority of cases occurring in developing countries. Considering the high disease burden, increasing antibiotic resistance, serotype-specific immunity and the post-infectious sequelae associated with shigellosis, there is a pressing need for an effective vaccine against multiple serotypes of the pathogen. In the present study, we used a bioinformatics approach to identify antigens shared among multiple serotypes of Shigella spp. This approach led to the identification of many immunogenic peptides. The five most promising peptides based on MHC binding efficiency were a putative lipoprotein (EL PGI I), a putative heat shock protein (EL PGI II), Spa32 (EL PGI III), IcsB (EL PGI IV) and a hypothetical protein (EL PGI V). These peptides were synthesized and their immunogenicity was evaluated in BALB/c mice by ELISA and cytokine assays. The putative heat shock protein (HSP) and the hypothetical protein elicited a good humoral response, whereas the putative lipoprotein, Spa32 and IcsB elicited a good T-cell response, as revealed by increased IFN-γ and TNF-α cytokine levels. Patient sera from confirmed cases of shigellosis were also evaluated for the presence of peptide-specific antibodies, with significant IgG and IgA antibodies against the HSP and the hypothetical protein, bestowing them as potential future vaccine candidates. The antigens reported in this study are novel and have not been tested as vaccine candidates against Shigella. This study offers a time- and cost-effective way of identifying unprecedented immunogenic antigens to be used as potential vaccine candidates. Moreover, this approach should easily be extendable to find new potential vaccine candidates for other pathogenic bacteria.

  13. In silico analysis to identify vaccine candidates common to multiple serotypes of Shigella and evaluation of their immunogenicity.

    Science.gov (United States)

    Pahil, Sapna; Taneja, Neelam; Ansari, Hifzur Rahman; Raghava, G P S

    2017-01-01

    Shigellosis, or bacillary dysentery, is an important cause of diarrhea, with the majority of cases occurring in developing countries. Considering the high disease burden, increasing antibiotic resistance, serotype-specific immunity and the post-infectious sequelae associated with shigellosis, there is a pressing need for an effective vaccine against multiple serotypes of the pathogen. In the present study, we used a bioinformatics approach to identify antigens shared among multiple serotypes of Shigella spp. This approach led to the identification of many immunogenic peptides. The five most promising peptides based on MHC binding efficiency were a putative lipoprotein (EL PGI I), a putative heat shock protein (EL PGI II), Spa32 (EL PGI III), IcsB (EL PGI IV) and a hypothetical protein (EL PGI V). These peptides were synthesized and their immunogenicity was evaluated in BALB/c mice by ELISA and cytokine assays. The putative heat shock protein (HSP) and the hypothetical protein elicited a good humoral response, whereas the putative lipoprotein, Spa32 and IcsB elicited a good T-cell response, as revealed by increased IFN-γ and TNF-α cytokine levels. Patient sera from confirmed cases of shigellosis were also evaluated for the presence of peptide-specific antibodies, with significant IgG and IgA antibodies against the HSP and the hypothetical protein, bestowing them as potential future vaccine candidates. The antigens reported in this study are novel and have not been tested as vaccine candidates against Shigella. This study offers a time- and cost-effective way of identifying unprecedented immunogenic antigens to be used as potential vaccine candidates. Moreover, this approach should easily be extendable to find new potential vaccine candidates for other pathogenic bacteria.

  14. In silico analysis to identify vaccine candidates common to multiple serotypes of Shigella and evaluation of their immunogenicity

    KAUST Repository

    Pahil, Sapna; Taneja, Neelam; Ansari, Hifzur Rahman; Raghava, G. P. S.

    2017-01-01

    Shigellosis, or bacillary dysentery, is an important cause of diarrhea, with the majority of cases occurring in developing countries. Considering the high disease burden, increasing antibiotic resistance, serotype-specific immunity and the post-infectious sequelae associated with shigellosis, there is a pressing need for an effective vaccine against multiple serotypes of the pathogen. In the present study, we used a bioinformatics approach to identify antigens shared among multiple serotypes of Shigella spp. This approach led to the identification of many immunogenic peptides. The five most promising peptides based on MHC binding efficiency were a putative lipoprotein (EL PGI I), a putative heat shock protein (EL PGI II), Spa32 (EL PGI III), IcsB (EL PGI IV) and a hypothetical protein (EL PGI V). These peptides were synthesized and their immunogenicity was evaluated in BALB/c mice by ELISA and cytokine assays. The putative heat shock protein (HSP) and the hypothetical protein elicited a good humoral response, whereas the putative lipoprotein, Spa32 and IcsB elicited a good T-cell response, as revealed by increased IFN-γ and TNF-α cytokine levels. Patient sera from confirmed cases of shigellosis were also evaluated for the presence of peptide-specific antibodies, with significant IgG and IgA antibodies against the HSP and the hypothetical protein, bestowing them as potential future vaccine candidates. The antigens reported in this study are novel and have not been tested as vaccine candidates against Shigella. This study offers a time- and cost-effective way of identifying unprecedented immunogenic antigens to be used as potential vaccine candidates. Moreover, this approach should easily be extendable to find new potential vaccine candidates for other pathogenic bacteria.

  15. In silico analysis to identify vaccine candidates common to multiple serotypes of Shigella and evaluation of their immunogenicity.

    Directory of Open Access Journals (Sweden)

    Sapna Pahil

    Full Text Available Shigellosis, or bacillary dysentery, is an important cause of diarrhea, with the majority of cases occurring in developing countries. Considering the high disease burden, increasing antibiotic resistance, serotype-specific immunity and the post-infectious sequelae associated with shigellosis, there is a pressing need for an effective vaccine against multiple serotypes of the pathogen. In the present study, we used a bioinformatics approach to identify antigens shared among multiple serotypes of Shigella spp. This approach led to the identification of many immunogenic peptides. The five most promising peptides based on MHC binding efficiency were a putative lipoprotein (EL PGI I), a putative heat shock protein (EL PGI II), Spa32 (EL PGI III), IcsB (EL PGI IV) and a hypothetical protein (EL PGI V). These peptides were synthesized and their immunogenicity was evaluated in BALB/c mice by ELISA and cytokine assays. The putative heat shock protein (HSP) and the hypothetical protein elicited a good humoral response, whereas the putative lipoprotein, Spa32 and IcsB elicited a good T-cell response, as revealed by increased IFN-γ and TNF-α cytokine levels. Patient sera from confirmed cases of shigellosis were also evaluated for the presence of peptide-specific antibodies, with significant IgG and IgA antibodies against the HSP and the hypothetical protein, bestowing them as potential future vaccine candidates. The antigens reported in this study are novel and have not been tested as vaccine candidates against Shigella. This study offers a time- and cost-effective way of identifying unprecedented immunogenic antigens to be used as potential vaccine candidates. Moreover, this approach should easily be extendable to find new potential vaccine candidates for other pathogenic bacteria.

  16. Experimentally verifiable Yang-Mills spin 2 gauge theory of gravity with group U(1) x SU(2)

    International Nuclear Information System (INIS)

    Peng, H.

    1988-01-01

    In this work, a Yang-Mills spin 2 gauge theory of gravity is proposed. Based on both the verification of the helicity 2 property of the SU(2) gauge bosons of the theory and the agreement of the theory with most observational and experimental evidence, the author argues that the theory is truly a gravitational theory. An internal symmetry group, the eigenvalues of whose generators are identical with quantum numbers, characterizes the interactions of a given class. The author demonstrates that the 4-momentum Pμ of a fermion field generates the U(1) x SU(2) internal symmetry group for gravity, but not the transformation group T4. That particles are classified by mass and spin implies that U(1) x SU(2), instead of the Poincare group, is a symmetry group of gravity. It is shown that the U(1) x SU(2) group represents the time displacement and rotation in ordinary space. Thereby the internal space associated with gravity is identical with Minkowski spacetime, so a gauge potential of gravity carries two space-time indices. He then verifies that the SU(2) gravitational boson has helicity 2. It is this fact, spin from internal spin, that alternatively explains why the gravitational field is the only field characterized by spin 2. The physical meaning of the gauge potentials of gravity is determined by comparing theory with the results of experiments, such as the Colella-Overhauser-Werner (COW) experiment and the Newtonian limit, etc. The gauge potentials must thus be identified with ordinary gravitational potentials

  17. Validation, verification and evaluation of a Train to Train Distance Measurement System by means of Colored Petri Nets

    International Nuclear Information System (INIS)

    Song, Haifeng; Liu, Jieyu; Schnieder, Eckehard

    2017-01-01

    Validation, verification and evaluation are necessary processes to assure the safety and functionality of a system before its application in practice. This paper presents a Train to Train Distance Measurement System (TTDMS), which can provide distance information independently of existing onboard equipment. We then propose a new process using Colored Petri Nets to verify the TTDMS system's functional safety as well as to evaluate its performance. Three main contributions are made in the paper. Firstly, the paper proposes a formalized TTDMS model, whose correctness is validated using state space analysis and simulation-based verification. Secondly, corresponding checking queries are proposed for the purpose of functional safety verification. Further, the TTDMS performance is evaluated by applying parameters in the formal model. Thirdly, the reliability of a functional prototype TTDMS is estimated. It is found that the procedure can accompany the system development, and both formal and simulation-based verifications are performed. Compared to executable code and mathematical methods, our process for evaluating and verifying a system is easier to read and more reliable. - Highlights: • A new Train to Train Distance Measurement System. • A new approach to verifying system functional safety and evaluating system performance by means of CPN. • System formalization using the system property concept. • Verification of system functional safety using state space analysis. • Evaluation of system performance applying simulation-based analysis.

  18. Histomorphological evaluation of Compound bone of Granulated Ricinus in bone regeneration in rabbits

    International Nuclear Information System (INIS)

    Mateus, Christiano Pavan; Chierice, Gilberto Orivaldo; Okamoto, Tetuo

    2011-01-01

    Histological evaluation is an effective method for the qualitative and quantitative behavioral description of implanted materials. This research validated the performance of Compound bone of Granulated Ricinus in bone regeneration through histomorphological analysis. Thirty female rabbits were selected and divided into 3 groups of 10 animals (G1, G2, G3), with postoperative times of 45, 70 and 120 days, respectively. Each animal underwent 2 bone lesions in the ilium, one implanted with the material, Compound bone of Granulated Ricinus, and the other kept as a control. After euthanasia, the iliac bone was removed, identified and subjected to histological processing. The histomorphological results were interpreted and described by quantitative and qualitative analysis, based on the findings verified in the three experimental groups, evaluating the rate of absorption of the material during tissue regeneration on the basis of new bone formation. The histomorphological results classify the material as biocompatible and biologically active. Its action in regeneration by bone resorption occurs slowly and gradually. The time and rate of absorption and of new bone formation of the biomaterial in the bone segment could thus be determined, which is applicable in the clinical surgical area.

  19. Quality assurance for high dose rate brachytherapy treatment planning optimization: using a simple optimization to verify a complex optimization

    International Nuclear Information System (INIS)

    Deufel, Christopher L; Furutani, Keith M

    2014-01-01

    As dose optimization for high dose rate brachytherapy becomes more complex, it becomes increasingly important to have a means of verifying that optimization results are reasonable. A method is presented for using a simple optimization as quality assurance for the more complex optimization algorithms typically found in commercial brachytherapy treatment planning systems. Quality assurance tests may be performed during commissioning, at regular intervals, and/or on a patient specific basis. A simple optimization method is provided that optimizes conformal target coverage using an exact, variance-based, algebraic approach. Metrics such as dose volume histogram, conformality index, and total reference air kerma agree closely between simple and complex optimizations for breast, cervix, prostate, and planar applicators. The simple optimization is shown to be a sensitive measure for identifying failures in a commercial treatment planning system that are possibly due to operator error or weaknesses in planning system optimization algorithms. Results from the simple optimization are surprisingly similar to the results from a more complex, commercial optimization for several clinical applications. This suggests that there are only modest gains to be made from making brachytherapy optimization more complex. The improvements expected from sophisticated linear optimizations, such as PARETO methods, will largely be in making systems more user friendly and efficient, rather than in finding dramatically better source strength distributions. (paper)
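
    A minimal sketch of such a "simple" algebraic optimization, assuming a toy 1/r^2 dose kernel rather than clinical TG-43 dosimetry: non-negative dwell times are solved for directly by least squares, which is the kind of transparent cross-check the paper proposes for QA. The geometry and prescription are illustrative assumptions.

        import numpy as np
        from scipy.optimize import nnls

        dwells = np.linspace(0.0, 4.0, 9)     # dwell positions [cm] (assumed)
        targets = np.linspace(-0.5, 4.5, 25)  # dose points [cm] (assumed)
        standoff = 1.0                        # lateral distance [cm] (assumed)

        # Dose-rate matrix: contribution of unit dwell time at each position.
        r2 = (targets[:, None] - dwells[None, :]) ** 2 + standoff**2
        A = 1.0 / r2

        prescription = np.ones(targets.size)  # uniform relative dose
        times, _ = nnls(A, prescription)      # non-negative dwell times

        achieved = A @ times
        print("max deviation from prescription:", np.abs(achieved - 1).max())

    Metrics computed from such an independent solution (dose volume histogram points, total reference air kerma) can then be compared against the commercial optimizer's plan.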

  20. An evaluation of spatial resolution using a single target on a medical image

    Energy Technology Data Exchange (ETDEWEB)

    Lee, Kyung Sung [Dept. of Radiotechnology, Cheju Halla University, Cheju (Korea, Republic of)

    2016-12-15

    Hitherto, spatial resolution has commonly been evaluated using test patterns or phantoms built with specific distances (from close to far) between two objects (double targets). The shortcoming of this evaluation method is that resolution is restricted to the target distances of the phantoms made for the test. Therefore, in order to solve this problem, this study proposes and verifies a new method to efficiently test spatial resolution with a single target. For the research, I used the PSF and JND to propose a way of measuring spatial resolution. After that, I carried out experiments with commonly used phantoms to verify the new evaluation hypothesis inferred from the above method. To analyse the hypothesis, I used the LabVIEW program and extracted a line of pixels from the digital image. The result was identical to the spatial-resolution hypothesis inferred from a single target. The findings of the experiment prove that a single target alone is enough to relatively evaluate spatial resolution on a digital image. In other words, the limit of the traditional double-target evaluation method can be overcome by the new single-target method.
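
    The single-target idea can be made concrete: image one point target, take its point-spread function (PSF), and read the resolution off its frequency response (MTF) instead of imaging double-target patterns at many spacings. The Gaussian PSF, pixel pitch and 10% criterion below are assumed values for illustration.

        import numpy as np

        pitch = 0.1    # pixel pitch [mm] (assumed)
        sigma = 0.25   # PSF width [mm] (assumed)
        x = (np.arange(256) - 128) * pitch
        psf = np.exp(-x**2 / (2 * sigma**2))
        psf /= psf.sum()

        # MTF = normalized magnitude of the PSF's Fourier transform.
        mtf = np.abs(np.fft.rfft(psf))
        mtf /= mtf[0]
        freqs = np.fft.rfftfreq(x.size, d=pitch)  # cycles/mm

        # Spatial frequency where contrast drops to 10%, a common criterion.
        f10 = freqs[np.argmax(mtf < 0.1)]
        print(f"MTF10 ~ {f10:.2f} cycles/mm")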

  1. Material control test and evaluation system at the ICPP

    International Nuclear Information System (INIS)

    Johnson, C.E.

    1979-01-01

    The US DOE is evaluating process monitoring as part of a total nuclear material safeguards system. A monitoring system is being installed at the Idaho Chemical Processing Plant to test and evaluate material control and surveillance concepts in an operating nuclear fuel reprocessing plant. Process monitoring for nuclear material control complements conventional safeguards accountability and physical protection to assure adherence to approved safeguards procedures and verify containment of nuclear materials within the processing plant

  2. Statistical identifiability and convergence evaluation for nonlinear pharmacokinetic models with particle swarm optimization.

    Science.gov (United States)

    Kim, Seongho; Li, Lang

    2014-02-01

    The statistical identifiability of nonlinear pharmacokinetic (PK) models with the Michaelis-Menten (MM) kinetic equation is considered using a global optimization approach, namely particle swarm optimization (PSO). If a model is statistically non-identifiable, the conventional derivative-based estimation approach is often terminated early without converging, due to the singularity. To circumvent this difficulty, we develop a derivative-free global optimization algorithm by combining PSO with a derivative-free local optimization algorithm to improve the rate of convergence of PSO. We further propose an efficient approach to not only checking the convergence of estimation but also detecting the identifiability of nonlinear PK models. PK simulation studies demonstrate that the convergence and identifiability of the PK model can be detected efficiently through the proposed approach. The proposed approach is then applied to clinical PK data along with a two-compartmental model. Copyright © 2013 Elsevier Ireland Ltd. All rights reserved.
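
    A bare-bones global-best PSO fitting Michaelis-Menten elimination parameters to concentration-time data shows the derivative-free flavor of the approach; the one-compartment model, data and PSO settings are illustrative assumptions, not the authors' implementation.

        import numpy as np
        from scipy.integrate import solve_ivp

        rng = np.random.default_rng(3)
        t_obs = np.linspace(0.0, 10.0, 25)

        def simulate(vmax, km, c0=10.0):
            """Integrate dC/dt = -Vmax*C/(Km + C) on the observation grid."""
            sol = solve_ivp(lambda _, c: -vmax * c / (km + c),
                            (t_obs[0], t_obs[-1]), [c0], t_eval=t_obs)
            return sol.y[0]

        c_obs = simulate(2.0, 3.0) + rng.normal(0.0, 0.1, t_obs.size)
        sse = lambda p: np.sum((simulate(*p) - c_obs) ** 2)

        # Plain global-best PSO over (Vmax, Km) within a box.
        lo, hi = np.array([0.1, 0.1]), np.array([10.0, 10.0])
        n, iters, w, c1, c2 = 30, 100, 0.7, 1.5, 1.5
        pos = rng.uniform(lo, hi, (n, 2))
        vel = np.zeros_like(pos)
        pbest, pbest_val = pos.copy(), np.array([sse(p) for p in pos])
        gbest = pbest[pbest_val.argmin()].copy()

        for _ in range(iters):
            r1, r2 = rng.random((n, 2)), rng.random((n, 2))
            vel = w * vel + c1 * r1 * (pbest - pos) + c2 * r2 * (gbest - pos)
            pos = np.clip(pos + vel, lo, hi)
            vals = np.array([sse(p) for p in pos])
            better = vals < pbest_val
            pbest[better], pbest_val[better] = pos[better], vals[better]
            gbest = pbest[pbest_val.argmin()].copy()

        print("estimated (Vmax, Km):", gbest)

    A non-identifiable parameterization would show up here as many particles reaching equally low objective values at widely scattered parameter pairs.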

  3. De-identifying Swedish clinical text - refinement of a gold standard and experiments with Conditional random fields

    Directory of Open Access Journals (Sweden)

    Dalianis Hercules

    2010-04-01

    Full Text Available Abstract. Background: In order to perform research on the information contained in Electronic Patient Records (EPRs), access to the data itself is needed. This is often very difficult due to confidentiality regulations. The data sets need to be fully de-identified before they can be distributed to researchers. De-identification is a difficult task where the definitions of annotation classes are not self-evident. Results: We present work on the creation of two refined variants of a manually annotated gold standard for de-identification, one created automatically, and one created through discussions among the annotators. The data is a subset of the Stockholm EPR Corpus, a data set available within our research group. These variants are used for the training and evaluation of an automatic system based on the Conditional Random Fields algorithm. Evaluating with four-fold cross-validation on sets of around 4,000-6,000 annotation instances, we obtained very promising results for both gold standards: F-scores around 0.80 for a number of experiments, with higher results for certain annotation classes. Moreover, 49 of the system's false positives were verified to be true positives that had been missed by the annotators. Conclusions: Our intention is to make this gold standard, the Stockholm EPR PHI Corpus, available to other research groups in the future. Despite being slightly more time-consuming, we believe the manual consensus gold standard is the most valuable for further research. We also propose a set of annotation classes to be used for similar de-identification tasks.
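
    In the same spirit, a token-level CRF de-identifier can be set up with the third-party sklearn-crfsuite package; the toy tokens, features and PHI labels below are invented for illustration and bear no relation to the Stockholm EPR corpus itself.

        import sklearn_crfsuite  # pip install sklearn-crfsuite

        def features(sent, i):
            w = sent[i]
            return {
                "lower": w.lower(),
                "is_title": w.istitle(),  # names are often capitalized
                "is_digit": w.isdigit(),  # dates, ID numbers
                "suffix3": w[-3:],
                "BOS": i == 0,
            }

        train = [
            (["Patient", "Anna", "Svensson", "admitted", "2009-05-01"],
             ["O", "B-Name", "I-Name", "O", "B-Date"]),
            (["Seen", "by", "Dr", "Lind", "today"],
             ["O", "O", "O", "B-Name", "O"]),
        ]
        X = [[features(s, i) for i in range(len(s))] for s, _ in train]
        y = [labels for _, labels in train]

        crf = sklearn_crfsuite.CRF(algorithm="lbfgs", c1=0.1, c2=0.1,
                                   max_iterations=100)
        crf.fit(X, y)

        test = ["Nurse", "Berg", "called", "2010-04-01"]
        print(crf.predict([[features(test, i) for i in range(len(test))]]))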

  4. Diagnostic tools for identifying sleepy drivers in the field.

    Science.gov (United States)

    2013-05-06

    The overarching goal of this project was to identify and evaluate cognitive and behavioral indices that are sensitive to sleep deprivation and may help identify commercial motor vehicle (CMV) drivers who are at risk for driving in a sleep deprived ...

  5. Systematic Evaluation of Pleiotropy Identifies 6 Further Loci Associated With Coronary Artery Disease

    NARCIS (Netherlands)

    Webb, Thomas R.; Erdmann, Jeanette; Stirrups, Kathleen E.; Stitziel, Nathan O.; Masca, Nicholas G. D.; Jansen, Henning; Kanoni, Stavroula; Nelson, Christopher P.; Ferrario, Paola G.; König, Inke R.; Eicher, John D.; Johnson, Andrew D.; Hamby, Stephen E.; Betsholtz, Christer; Ruusalepp, Arno; Franzén, Oscar; Schadt, Eric E.; Björkegren, Johan L. M.; Weeke, Peter E.; Auer, Paul L.; Schick, Ursula M.; Lu, Yingchang; Zhang, He; Dube, Marie-Pierre; Goel, Anuj; Farrall, Martin; Peloso, Gina M.; Won, Hong-Hee; Do, Ron; van Iperen, Erik; Kruppa, Jochen; Mahajan, Anubha; Scott, Robert A.; Willenborg, Christina; Braund, Peter S.; van Capelleveen, Julian C.; Doney, Alex S. F.; Donnelly, Louise A.; Asselta, Rosanna; Merlini, Pier A.; Duga, Stefano; Marziliano, Nicola; Denny, Josh C.; Shaffer, Christian; El-Mokhtari, Nour Eddine; Franke, Andre; Heilmann, Stefanie; Hengstenberg, Christian; Hoffmann, Per; Holmen, Oddgeir L.; Hveem, Kristian; Jansson, Jan-Håkan; Jöckel, Karl-Heinz; Kessler, Thorsten; Kriebel, Jennifer; Laugwitz, Karl L.; Marouli, Eirini; Martinelli, Nicola; McCarthy, Mark I.; van Zuydam, Natalie R.; Meisinger, Christa; Esko, Tõnu; Mihailov, Evelin; Escher, Stefan A.; Alver, Maris; Moebus, Susanne; Morris, Andrew D.; Virtamo, Jarma; Nikpay, Majid; Olivieri, Oliviero; Provost, Sylvie; AlQarawi, Alaa; Robertson, Neil R.; Akinsansya, Karen O.; Reilly, Dermot F.; Vogt, Thomas F.; Yin, Wu; Asselbergs, Folkert W.; Kooperberg, Charles; Jackson, Rebecca D.; Stahl, Eli; Müller-Nurasyid, Martina; Strauch, Konstantin; Varga, Tibor V.; Waldenberger, Melanie; Zeng, Lingyao; Chowdhury, Rajiv; Salomaa, Veikko; Ford, Ian; Jukema, J. Wouter; Amouyel, Philippe; Kontto, Jukka; Nordestgaard, Børge G.; Ferrières, Jean; Saleheen, Danish; Sattar, Naveed; Surendran, Praveen; Wagner, Aline; Young, Robin; Howson, Joanna M. M.; Butterworth, Adam S.; Danesh, John; Ardissino, Diego; Bottinger, Erwin P.; Erbel, Raimund; Franks, Paul W.; Girelli, Domenico; Hall, Alistair S.; Hovingh, G. Kees; Kastrati, Adnan; Lieb, Wolfgang; Meitinger, Thomas; Kraus, William E.; Shah, Svati H.; McPherson, Ruth; Orho-Melander, Marju; Melander, Olle; Metspalu, Andres; Palmer, Colin N. A.; Peters, Annette; Rader, Daniel J.; Reilly, Muredach P.; Loos, Ruth J. F.; Reiner, Alex P.; Roden, Dan M.; Tardif, Jean-Claude; Thompson, John R.; Wareham, Nicholas J.; Watkins, Hugh; Willer, Cristen J.; Samani, Nilesh J.; Schunkert, Heribert; Deloukas, Panos; Kathiresan, Sekar

    2017-01-01

    Genome-wide association studies have so far identified 56 loci associated with risk of coronary artery disease (CAD). Many CAD loci show pleiotropy; that is, they are also associated with other diseases or traits. This study sought to systematically test if genetic variants identified for non-CAD

  6. Can surveillance systems identify and avert adverse drug events? A prospective evaluation of a commercial application.

    Science.gov (United States)

    Jha, Ashish K; Laguette, Julia; Seger, Andrew; Bates, David W

    2008-01-01

    Computerized monitors can effectively detect and potentially prevent adverse drug events (ADEs). Most monitors have been developed in large academic hospitals and are not readily usable in other settings. We assessed the ability of a commercial program to identify and prevent ADEs in a community hospital. Design and Measurement: We prospectively evaluated the commercial application in a community-based hospital. We examined the frequency and types of alerts produced, how often they were associated with ADEs and potential ADEs, and the potential financial impact of monitoring for ADEs. Among 2,407 patients screened, the application generated 516 high-priority alerts. We were able to review 266 alerts at the time they were generated and, among these, 30 (11.3%) were considered sufficiently important to warrant contacting the physician caring for the patient. These 30 alerts were associated with 4 ADEs and 11 potential ADEs. In all 15 cases, the responsible physician was unaware of the event, leading to a change in clinical care in 14 cases. Overall, 23% of high-priority alerts were associated with an ADE (95% confidence interval [CI] 12% to 34%) and another 15% were associated with a potential ADE (95% CI 6% to 24%). Active surveillance used approximately 1.5 hours of pharmacist time daily. A commercially available, computer-based ADE detection tool was effective at identifying ADEs. When used as part of an active surveillance program, it can have an impact on preventing or ameliorating ADEs.

  7. Evaluation of Information Requirements of Reliability Methods in Engineering Design

    DEFF Research Database (Denmark)

    Marini, Vinicius Kaster; Restrepo-Giraldo, John Dairo; Ahmed-Kristensen, Saeema

    2010-01-01

    This paper aims to characterize the information needed to perform methods for robustness and reliability, and to verify their applicability to early design stages. Several methods were evaluated for their support of synthesis in engineering design. Of those methods, FMEA, FTA and HAZOP were selected...

  8. Hanford Site Storm Water Comprehensive Site Compliance Evaluation Report - July 1, 1997 Through June 30, 1998

    International Nuclear Information System (INIS)

    Landon, R.J.

    1999-01-01

    On September 9, 1992, the U.S. Environmental Protection Agency (EPA) issued General Permit No. WA-R-00-000F, ''Authorization to Discharge Under the National Pollutant Discharge Elimination System (NPDES) for Storm Water Discharges Associated with Industrial Activity'' (EPA 1992) to the U.S. Department of Energy, Richland Operations Office (RL). As required by General Permit, Section IV, Part D, Section 4.c (EPA 1992), an annual report must be developed by RL and retained onsite to verify that the requirements listed in the General Permit are implemented. This document fulfills the requirement to prepare an annual report. This report also describes the methods used to conduct the Storm Water Comprehensive Site Compliance Evaluation (SWCSCE) as required in the General Permit, Part IV, Section D.4.c (EPA 1992); identifies the pollution prevention team (PPT) (Appendix A); summarizes the results of the compliance evaluation (Appendix B); and documents significant leaks and spills (Appendix C)

  9. Evaluation of multiple approaches to identify genome-wide polymorphisms in closely related genotypes of sweet cherry (Prunus avium L.)

    Directory of Open Access Journals (Sweden)

    Seanna Hewitt

    Full Text Available Identification of genetic polymorphisms and subsequent development of molecular markers is important for marker-assisted breeding of superior cultivars of economically important species. Sweet cherry (Prunus avium L.) is an economically important non-climacteric tree fruit crop in the Rosaceae family and has undergone a genetic bottleneck due to breeding, resulting in limited genetic diversity in the germplasm that is utilized for breeding new cultivars. Therefore, it is critical to recognize the best platforms for identifying genome-wide polymorphisms that can help identify, and consequently preserve, the diversity in a genetically constrained species. For the identification of polymorphisms in five closely related genotypes of sweet cherry, a gel-based approach (TRAP), reduced representation sequencing (TRAPseq), a 6k cherry SNParray, and whole genome sequencing (WGS) approaches were evaluated in the identification of genome-wide polymorphisms in sweet cherry cultivars. All platforms facilitated detection of polymorphisms among the genotypes with variable efficiency. In assessing multiple SNP detection platforms, this study has demonstrated that a combination of appropriate approaches is necessary for efficient polymorphism identification, especially between closely related cultivars of a species. The information generated in this study provides a valuable resource for future genetic and genomic studies in sweet cherry, and the insights gained from the evaluation of multiple approaches can be utilized for other closely related species with limited genetic diversity in the breeding germplasm. Keywords: Polymorphisms, Prunus avium, Next-generation sequencing, Target region amplification polymorphism (TRAP), Genetic diversity, SNParray, Reduced representation sequencing, Whole genome sequencing (WGS)

  10. Baccalaureate Nursing Students' Abilities in Critically Identifying and Evaluating the Quality of Online Health Information.

    Science.gov (United States)

    Theron, Maggie; Redmond, Anne; Borycki, Elizabeth M

    2017-01-01

    Both the Internet and social media have become important tools that patients and health professionals, including health professional students, use to obtain information and support their decision-making surrounding health care. Students in the health sciences require increased competence to select, appraise, and use online sources to adequately educate and support patients and advocate for patient needs and best practices. The purpose of this study was to ascertain if second year nursing students have the ability to critically identify and evaluate the quality of online health information through comparisons between student and expert assessments of selected online health information postings using an adapted Trust in Online Health Information scale. Interviews with experts provided understanding of how experts applied the selected criteria and what experts recommend for implementing nursing informatics literacy in curriculums. The difference between student and expert assessments of the quality of the online information is on average close to 40%. Themes from the interviews highlighted several possible factors that may influence informatics competency levels in students, specifically regarding the critical appraisal of the quality of online health information.

  11. Verifying the competition between haloperidol and biperiden in serum albumin through a model based on spectrofluorimetry

    Science.gov (United States)

    Muniz da Silva Fragoso, Viviane; Patrícia de Morais e Coura, Carla; Paulino, Erica Tex; Valdez, Ethel Celene Narvaez; Silva, Dilson; Cortez, Celia Martins

    2017-11-01

    The aim of this work was to apply mathematical-computational modeling to study the interactions of haloperidol (HLP) and biperiden (BPD) with human (HSA) and bovine (BSA) serum albumin in order to verify the competition of these drugs for binding sites in HSA, using intrinsic tryptophan fluorescence quenching data. The estimated association constants at 37 °C were 2.17(±0.05) × 10⁷ M⁻¹ for HLP-HSA and 2.01(±0.03) × 10⁸ M⁻¹ for BPD-HSA. Results have shown that the drugs do not compete for the same binding sites in albumin.

  12. Building and verifying a severity prediction model of acute pancreatitis (AP) based on BISAP, MEWS and routine test indexes.

    Science.gov (United States)

    Ye, Jiang-Feng; Zhao, Yu-Xin; Ju, Jian; Wang, Wei

    2017-10-01

    To discuss the value of the Bedside Index for Severity in Acute Pancreatitis (BISAP), the Modified Early Warning Score (MEWS), serum Ca2+ (similarly hereinafter) and red cell distribution width (RDW) for predicting the severity grade of acute pancreatitis, and to develop and verify a more accurate scoring system to predict the severity of AP. In 302 patients with AP, we calculated BISAP and MEWS scores and conducted regression analyses on the relationships of BISAP scoring, RDW, MEWS, and serum Ca2+ with the severity of AP using single-factor logistics. The variables with statistical significance in the single-factor logistic regression were used in a multi-factor logistic regression model; forward stepwise regression was used to screen variables and build a multi-factor prediction model. A receiver operating characteristic curve (ROC curve) was constructed, and the significance of the multi- and single-factor prediction models in predicting the severity of AP was evaluated using the area under the ROC curve (AUC). The internal validity of the model was verified through bootstrapping. Among the 302 patients with AP, 209 had mild acute pancreatitis (MAP) and 93 had severe acute pancreatitis (SAP). According to the single-factor logistic regression analysis, we found that BISAP, MEWS and serum Ca2+ are prediction indexes of the severity of AP (P-value < 0.05). The multi-factor logistic regression analysis showed that BISAP and serum Ca2+ are independent prediction indexes of AP severity (P-value < 0.05); BISAP is negatively related to serum Ca2+ (r = -0.330, P-value < 0.05). The model is as follows: ln(p/(1-p)) = 7.306 + 1.151*BISAP - 4.516*serum Ca2+. The predictive ability of each model for SAP follows the order of the combined BISAP and serum Ca2+ prediction model > Ca2+ > BISAP. The difference in predictive ability between BISAP and serum Ca2+ is not statistically significant (P-value > 0.05); however, the newly built prediction model performs significantly better than BISAP
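
    The reported model can be turned into a probability estimate directly. A minimal sketch, assuming the standard logistic (logit) link and the coefficients quoted above; the logit form and the units of serum Ca2+ (mmol/L) are assumptions that would need the full paper to confirm.

      # Sketch of the reported severity model, assuming
      # ln(p/(1-p)) = 7.306 + 1.151*BISAP - 4.516*Ca, where p is the
      # probability of severe acute pancreatitis (SAP) and Ca is serum
      # calcium (assumed mmol/L). Coefficients are taken from the abstract.
      import math

      def sap_probability(bisap_score: float, serum_ca: float) -> float:
          logit = 7.306 + 1.151 * bisap_score - 4.516 * serum_ca
          return 1.0 / (1.0 + math.exp(-logit))

      # Example: BISAP = 3, serum Ca2+ = 2.0 mmol/L (hypothetical patient)
      print(f"P(SAP) = {sap_probability(3, 2.0):.3f}")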

  13. Identifying the Evaluative Impulse in Local Culture: Insights from West African Proverbs

    Science.gov (United States)

    Easton, Peter B.

    2012-01-01

    Attention to cultural competence has significantly increased in the human services over the last two decades. Evaluators have long had similar concerns and have made a more concentrated effort in recent years to adapt evaluation methodology to varying cultural contexts. Little of this literature, however, has focused on the extent to which local…

  14. Structural identifiability of cyclic graphical models of biological networks with latent variables.

    Science.gov (United States)

    Wang, Yulin; Lu, Na; Miao, Hongyu

    2016-06-13

    Graphical models have long been used to describe biological networks for a variety of important tasks such as the determination of key biological parameters, and the structure of a graphical model ultimately determines whether such unknown parameters can be unambiguously obtained from experimental observations (i.e., the identifiability problem). Limited by resources or technical capacities, complex biological networks are usually only partially observed in experiments, which thus introduces latent variables into the corresponding graphical models. A number of previous studies have tackled the parameter identifiability problem for graphical models such as linear structural equation models (SEMs) with or without latent variables. However, the limited resolution and efficiency of existing approaches necessarily calls for further development of novel structural identifiability analysis algorithms. An efficient structural identifiability analysis algorithm is developed in this study for a broad range of network structures. The proposed method adopts Wright's path coefficient method to generate identifiability equations in the form of symbolic polynomials, and then converts these symbolic equations to binary matrices (called the identifiability matrix). Several matrix operations are introduced for identifiability matrix reduction with system equivalency maintained. Based on the reduced identifiability matrices, the structural identifiability of each parameter is determined. A number of benchmark models are used to verify the validity of the proposed approach. Finally, the network module for influenza A virus replication is employed as a real example to illustrate the application of the proposed approach in practice. The proposed approach can deal with cyclic networks with latent variables. The key advantage is that it intentionally avoids symbolic computation and is thus highly efficient. Also, this method is capable of determining the identifiability of each single parameter and

  15. Molecular docking and NMR binding studies to identify novel inhibitors of human phosphomevalonate kinase

    Energy Technology Data Exchange (ETDEWEB)

    Boonsri, Pornthip [Chemical Proteomics Facility at Marquette, Department of Chemistry, Marquette University, Milwaukee, WI 53201 (United States); Department of Chemistry, NANOTEC Center of Nanotechnology, National Nanotechnology Center, Faculty of Science, Kasetsart University, Bangkok 10900 (Thailand); Neumann, Terrence S.; Olson, Andrew L.; Cai, Sheng [Chemical Proteomics Facility at Marquette, Department of Chemistry, Marquette University, Milwaukee, WI 53201 (United States); Herdendorf, Timothy J.; Miziorko, Henry M. [Division of Molecular Biology and Biochemistry, School of Biological Sciences, University of Missouri-Kansas City, Kansas City, MO 64110 (United States); Hannongbua, Supa [Department of Chemistry, NANOTEC Center of Nanotechnology, National Nanotechnology Center, Faculty of Science, Kasetsart University, Bangkok 10900 (Thailand); Sem, Daniel S., E-mail: daniel.sem@cuw.edu [Chemical Proteomics Facility at Marquette, Department of Chemistry, Marquette University, Milwaukee, WI 53201 (United States)

    2013-01-04

    Highlights: • Natural and synthetic inhibitors of human phosphomevalonate kinase identified. • Virtual screening yielded a hit rate of 15%, with inhibitor Kd's of 10-60 μM. • NMR studies indicate significant protein conformational changes upon binding. -- Abstract: Phosphomevalonate kinase (PMK) phosphorylates mevalonate-5-phosphate (M5P) in the mevalonate pathway, which is the sole source of isoprenoids and steroids in humans. We have identified new PMK inhibitors by virtual screening, using AutoDock. Promising hits were verified and their affinity measured using NMR-based ¹H-¹⁵N heteronuclear single quantum coherence (HSQC) chemical shift perturbation and fluorescence titrations. Chemical shift changes were monitored, plotted, and fitted to obtain dissociation constants (Kd). Tight-binding compounds with Kd's ranging from 6-60 μM were identified. These compounds tended to have significant polarity and negative charge, similar to the natural substrates (M5P and ATP). HSQC cross peak changes suggest that binding induces a global conformational change, such as domain closure. Compounds identified in this study serve as chemical genetic probes of human PMK, to explore the pharmacology of the mevalonate pathway, as well as starting points for further drug development.
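
    A sketch of the fitting step described above: chemical shift perturbations are fitted against ligand concentration to extract Kd. The single-site, fast-exchange isotherm used here is a common simplification (it neglects protein concentration), and the titration data points are hypothetical.

      # Sketch: estimating Kd from HSQC chemical shift perturbation (CSP)
      # titration data, assuming the ligand-excess regime where
      # CSP = CSP_max * [L] / (Kd + [L]). Data values are hypothetical.
      import numpy as np
      from scipy.optimize import curve_fit

      ligand_uM = np.array([5, 10, 25, 50, 100, 200, 400])   # [L], micromolar
      csp_ppm   = np.array([0.011, 0.020, 0.037, 0.052, 0.066, 0.075, 0.081])

      def binding_isotherm(L, csp_max, kd):
          return csp_max * L / (kd + L)

      (csp_max, kd), _ = curve_fit(binding_isotherm, ligand_uM, csp_ppm, p0=(0.1, 50))
      print(f"Kd ~ {kd:.1f} uM, CSP_max ~ {csp_max:.3f} ppm")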

  16. Which Doctor to Trust: A Recommender System for Identifying the Right Doctors.

    Science.gov (United States)

    Guo, Li; Jin, Bo; Yao, Cuili; Yang, Haoyu; Huang, Degen; Wang, Fei

    2016-07-07

    Key opinion leaders (KOLs) are people who can influence public opinion on a certain subject matter. In the field of medical and health informatics, it is critical to identify KOLs on various disease conditions. However, there have been very few studies on this topic. We aimed to develop a recommender system for identifying KOLs for any specific disease with health care data mining. We exploited an unsupervised aggregation approach for integrating various ranking features to identify doctors who have the potential to be KOLs on a range of diseases. We introduce the design, implementation, and deployment details of the recommender system. This system collects the professional footprints of doctors, such as papers in scientific journals, presentation activities, patient advocacy, and media exposure, and uses them as ranking features to identify KOLs. We collected the information of 2,381,750 doctors in China from 3,657,797 medical journal papers they published, together with their profiles, academic publications, and funding. The empirical results demonstrated that our system outperformed several benchmark systems by a significant margin. Moreover, we conducted a case study in a real-world system to verify the applicability of our proposed method. Our results show that doctors' profiles and their academic publications are key data sources for identifying KOLs in the field of medical and health informatics. Moreover, we deployed the recommender system and applied the data service to a recommender system of the China-based Internet technology company NetEase. Patients can obtain authority ranking lists of doctors with this system on any given disease.
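
    The abstract does not spell out the aggregation formula, so the sketch below uses a simple Borda-style rank sum as one plausible unsupervised way to combine ranking features; the feature names and scores are hypothetical.

      # Sketch: unsupervised rank aggregation (Borda count) over several
      # ranking features for candidate doctors.
      from scipy.stats import rankdata

      doctors = ["A", "B", "C", "D"]
      features = {                      # higher is better for every feature
          "papers":    [12, 40, 7, 22],
          "citations": [90, 310, 15, 120],
          "media":     [3, 1, 0, 5],
      }

      # Rank doctors within each feature, then sum the ranks (Borda score)
      borda = [0.0] * len(doctors)
      for scores in features.values():
          for i, r in enumerate(rankdata(scores)):   # rank 1 = lowest score
              borda[i] += r

      for name, score in sorted(zip(doctors, borda), key=lambda x: -x[1]):
          print(name, score)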

  17. Sustainability in Health care by Allocating Resources Effectively (SHARE) 6: investigating methods to identify, prioritise, implement and evaluate disinvestment projects in a local healthcare setting.

    Science.gov (United States)

    Harris, Claire; Allen, Kelly; Brooke, Vanessa; Dyer, Tim; Waller, Cara; King, Richard; Ramsey, Wayne; Mortimer, Duncan

    2017-05-25

    This is the sixth in a series of papers reporting Sustainability in Health care by Allocating Resources Effectively (SHARE) in a local healthcare setting. The SHARE program was established to investigate a systematic, integrated, evidence-based approach to disinvestment within a large Australian health service. This paper describes the methods employed in undertaking pilot disinvestment projects. It draws a number of lessons regarding the strengths and weaknesses of these methods; particularly regarding the crucial first step of identifying targets for disinvestment. Literature reviews, survey, interviews, consultation and workshops were used to capture and process the relevant information. A theoretical framework was adapted for evaluation and explication of disinvestment projects, including a taxonomy for the determinants of effectiveness, process of change and outcome measures. Implementation, evaluation and costing plans were developed. Four literature reviews were completed, surveys were received from 15 external experts, 65 interviews were conducted, 18 senior decision-makers attended a data gathering workshop, 22 experts and local informants were consulted, and four decision-making workshops were undertaken. Mechanisms to identify disinvestment targets and criteria for prioritisation and decision-making were investigated. A catalogue containing 184 evidence-based opportunities for disinvestment and an algorithm to identify disinvestment projects were developed. An Expression of Interest process identified two potential disinvestment projects. Seventeen additional projects were proposed through a non-systematic nomination process. Four of the 19 proposals were selected as pilot projects but only one reached the implementation stage. Factors with potential influence on the outcomes of disinvestment projects are discussed and barriers and enablers in the pilot projects are summarised. This study provides an in-depth insight into the experience of disinvestment

  18. k∞-meter concept verified via subcritical-critical TRIGA experiments

    International Nuclear Information System (INIS)

    Ocampo Mansilla, H.

    1983-01-01

    This work presents a technique for building a device to measure the k∞ of a spent nuclear fuel assembly discharged from the core of a nuclear power plant. The device, called a k∞-meter, consists of a cross-shaped subcritical assembly, two artificial neutron sources, and two separate neutron counting systems. The central position of the subcritical assembly is used to measure the k∞ of the spent fuel assembly. The initial subcritical assembly is calibrated to determine its k_eff and verify the assigned k∞ of a selected fuel assembly placed in the central position. Count rates are taken with a fuel assembly of known k∞ placed in the central position and then repeated with a fuel assembly of unknown k∞ placed in the central position. The count rate ratio of the unknown fuel assembly to the known fuel assembly is used to determine the k∞ of the unknown fuel assembly. The k∞ of the unknown fuel assembly is represented as a polynomial function of the count rate ratios. The coefficients of the polynomial equation are determined using the neutronic codes LEOPARD and EXTERMINATOR-II. The analytical approach has been validated by performing several subcritical/critical experiments, using the Penn State Breazeale TRIGA Reactor (PSBR), and comparing the experimental results with the calculations.
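
    A sketch of the final calibration step: k∞ is expressed as a polynomial in the count-rate ratio and then evaluated for an unknown assembly. The calibration pairs below are hypothetical; in the study they derive from LEOPARD and EXTERMINATOR-II calculations.

      # Sketch: representing k-infinity as a polynomial in the count-rate
      # ratio, as the abstract describes. Calibration pairs are hypothetical.
      import numpy as np

      ratio = np.array([0.85, 0.95, 1.00, 1.08, 1.20])   # R = C_unknown / C_known
      kinf  = np.array([0.92, 1.00, 1.04, 1.10, 1.19])   # assigned k-infinity

      coeffs = np.polyfit(ratio, kinf, deg=2)            # quadratic calibration fit
      k_unknown = np.polyval(coeffs, 1.05)               # measured ratio for unknown
      print(f"estimated k-infinity = {k_unknown:.3f}")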

  19. Performance Evaluation in Sodium-to-Sodium Heat Exchangers in STELLA-2

    International Nuclear Information System (INIS)

    Jo, Youngchul; Son, Seok-kwon; Yoon, Jung; Jeong, Jiyoung

    2016-01-01

    The program aiming at an integral effect test, called STELLA-2, will be used for a synthetic review of the key safety issues of PGSFR. The basic and detailed design phases of the STELLA-2 test facility are underway in accordance with the specific design requirements reflecting the whole design features of PGSFR. Based on the STELLA-2 platform, a simulation of the PGSFR transient will be made to evaluate the plant dynamic behaviors and demonstrate the decay heat removal performance. The multi-dimensional effects coming from a large sodium pool system will be identified as well. Among the several components of STELLA-2, there are five different types of model heat exchangers: IHX, DHX, FHX, AHX, and UHX. Each heat exchanger has different characteristics, and it is very important to verify the heat transfer and pressure drop performance of each heat exchanger. The performance evaluation of the sodium-to-sodium heat exchangers (IHX and DHX) in STELLA-2 is performed using CFD. These results are also compared with a 1-D heat exchanger design code. The shell/tube outlet temperature and heat transfer rate of the heat exchanger obtained by the CFD are not significantly different from the predictions of the 1-D design code

  20. Quality Issues Identified During the Evaluation of Biosimilars by the European Medicines Agency's Committee for Medicinal Products for Human Use.

    Science.gov (United States)

    Cilia, Mark; Ruiz, Sol; Richardson, Peter; Salmonson, Tomas; Serracino-Inglott, Anthony; Wirth, Francesca; Borg, John Joseph

    2018-02-01

    The aim of this study was to identify trends in deficiencies raised during the EU evaluation of the quality part of dossiers for marketing authorisation applications of biosimilar medicinal products. All adopted day-120 lists of questions on the quality module of 22 marketing authorisation applications for biosimilars submitted to the European Medicines Agency and concluded by the end of October 2015 were analysed. Frequencies of common deficiencies identified were calculated and summarised descriptions included. Frequencies and trends of quality deficiencies were recorded and presented for the 22 biosimilar applications. Thirty-two 'major objections' for 9 products were identified from 14 marketing authorisation applications, with 15 raised for drug substance and 17 for drug product. In addition, 547 'other concerns' for drug substance and 495 for drug product were also adopted. The frequencies and trends of the identified deficiencies, together with their impact, were discussed from a regulatory perspective, including how they affect key manufacturing processes and key materials used in the production of biosimilars. This study provides an insight into the regulatory challenges prospective companies need to consider when developing biosimilars; it also helps elucidate common pitfalls in the development and production of biosimilars and in the submission of dossiers for their marketing authorisations. The results are expected to be of interest to pharmaceutical companies but also to regulators to obtain consistent information on medicinal products based on transparent rules safeguarding the necessary pharmaceutical quality of medicinal products.

  1. Evaluating a satellite-based seasonal evapotranspiration product and identifying its relationship with other satellite-derived products and crop yield: A case study for Ethiopia

    Science.gov (United States)

    Tadesse, Tsegaye; Senay, Gabriel B.; Berhan, Getachew; Regassa, Teshome; Beyene, Shimelis

    2015-01-01

    Satellite-derived evapotranspiration anomalies and normalized difference vegetation index (NDVI) products from Moderate Resolution Imaging Spectroradiometer (MODIS) data are currently used for African agricultural drought monitoring and food security status assessment. In this study, a process to evaluate satellite-derived evapotranspiration (ETa) products with a geospatial statistical exploratory technique that uses NDVI, satellite-derived rainfall estimate (RFE), and crop yield data has been developed. The main goal of this study was to evaluate the ETa using the NDVI and RFE, and to identify a relationship between the ETa and Ethiopia's cereal crop (i.e., teff, sorghum, corn/maize, barley, and wheat) yields during the main rainy season. Since crop production is one of the main factors affecting food security, the evaluation of remote sensing-based seasonal ETa was done to identify the appropriateness of this tool as a proxy for monitoring vegetation condition in drought vulnerable and food insecure areas to support decision makers. The results of this study showed that the comparison between seasonal ETa and RFE produced a strong correlation (R2 > 0.99) for all 41 crop growing zones in Ethiopia. The results of the spatial regression analyses of seasonal ETa and NDVI using Ordinary Least Squares and Geographically Weighted Regression showed relatively weak yearly spatial relationships (R2 < 0.7) for all cropping zones. However, for individual crop zones, the correlation between NDVI and ETa ranged between 0.3 and 0.84 for about 44% of the cropping zones. Similarly, for individual crop zones, the correlation (R2) between the seasonal ETa anomaly and de-trended cereal crop yield was between 0.4 and 0.82 for 76% (31 out of 41) of the crop growing zones. The preliminary results indicated that the ETa products have a good predictive potential for these 31 identified zones in Ethiopia. Decision makers may potentially use ETa products for monitoring cereal
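
    A sketch of the zone-level screening described above: an ordinary least squares fit between seasonal ETa and RFE, reporting R². The values are hypothetical seasonal totals; real inputs would be per-zone seasonal sums extracted from the gridded products.

      # Sketch: OLS comparison of seasonal ETa against RFE for one crop zone.
      import numpy as np
      from scipy.stats import linregress

      rfe = np.array([310, 420, 505, 610, 720, 815])   # seasonal rainfall estimate (mm)
      eta = np.array([295, 401, 488, 590, 701, 790])   # seasonal actual ET (mm)

      fit = linregress(rfe, eta)
      print(f"slope = {fit.slope:.2f}, R^2 = {fit.rvalue**2:.3f}")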

  2. SPARQL-enabled identifier conversion with Identifiers.org.

    Science.gov (United States)

    Wimalaratne, Sarala M; Bolleman, Jerven; Juty, Nick; Katayama, Toshiaki; Dumontier, Michel; Redaschi, Nicole; Le Novère, Nicolas; Hermjakob, Henning; Laibe, Camille

    2015-06-01

    On the semantic web, in life sciences in particular, data is often distributed via multiple resources. Each of these sources is likely to use their own International Resource Identifier for conceptually the same resource or database record. The lack of correspondence between identifiers introduces a barrier when executing federated SPARQL queries across life science data. We introduce a novel SPARQL-based service to enable on-the-fly integration of life science data. This service uses the identifier patterns defined in the Identifiers.org Registry to generate a plurality of identifier variants, which can then be used to match source identifiers with target identifiers. We demonstrate the utility of this identifier integration approach by answering queries across major producers of life science Linked Data. The SPARQL-based identifier conversion service is available without restriction at http://identifiers.org/services/sparql. © The Author 2015. Published by Oxford University Press.

  4. Performance evaluation methodology for historical document image binarization.

    Science.gov (United States)

    Ntirogiannis, Konstantinos; Gatos, Basilis; Pratikakis, Ioannis

    2013-02-01

    Document image binarization is of great importance in the document image analysis and recognition pipeline since it affects further stages of the recognition process. The evaluation of a binarization method aids in studying its algorithmic behavior, as well as verifying its effectiveness, by providing qualitative and quantitative indication of its performance. This paper addresses a pixel-based binarization evaluation methodology for historical handwritten/machine-printed document images. In the proposed evaluation scheme, the recall and precision evaluation measures are properly modified using a weighting scheme that diminishes any potential evaluation bias. Additional performance metrics of the proposed evaluation scheme consist of the percentage rates of broken and missed text, false alarms, background noise, character enlargement, and merging. Several experiments conducted in comparison with other pixel-based evaluation measures demonstrate the validity of the proposed evaluation scheme.
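
    A sketch of pixel-based evaluation in the spirit of the proposed scheme: recall and precision over foreground pixels, with optional per-pixel weight maps (uniform weights reduce to the plain measures). The weighting functions of the actual methodology are more elaborate than this.

      # Sketch: weighted pixel-based recall/precision/F-measure for a
      # binarization result against ground truth (True = foreground ink).
      import numpy as np

      def evaluate(gt, result, recall_w=None, precision_w=None):
          recall_w = np.ones_like(gt, float) if recall_w is None else recall_w
          precision_w = np.ones_like(gt, float) if precision_w is None else precision_w
          tp_r = (recall_w * (gt & result)).sum()
          recall = tp_r / (recall_w * gt).sum()
          tp_p = (precision_w * (gt & result)).sum()
          precision = tp_p / (precision_w * result).sum()
          f = 2 * recall * precision / (recall + precision)
          return recall, precision, f

      gt = np.zeros((4, 4), bool); gt[1:3, 1:3] = True     # toy ground truth
      res = np.zeros((4, 4), bool); res[1:3, 1:4] = True   # toy binarization
      print(evaluate(gt, res))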

  5. Quantitative phosphoproteomics using acetone-based peptide labeling: Method evaluation and application to a cardiac ischemia/reperfusion model

    Science.gov (United States)

    Wijeratne, Aruna B.; Manning, Janet R.; Schultz, Jo El J.; Greis, Kenneth D.

    2013-01-01

    Mass spectrometry (MS) techniques to globally profile protein phosphorylation in cellular systems that are relevant to physiological or pathological changes have been of significant interest in biological research. In this report, an MS-based strategy utilizing an inexpensive acetone-based peptide labeling technique known as reductive alkylation by acetone (RABA) for quantitative phosphoproteomics was explored to evaluate its capacity. Since the chemistry of RABA labeling for phosphorylation profiling had not been previously reported, it was first validated using a standard phosphoprotein and identical phosphoproteomes from cardiac tissue extracts. A workflow was then utilized to compare cardiac tissue phosphoproteomes from mouse hearts not expressing FGF2 vs. hearts expressing low molecular weight fibroblast growth factor-2 (LMW FGF2), in order to relate LMW FGF2-mediated cardioprotective phenomena induced by ischemia/reperfusion (I/R) injury of hearts with downstream phosphorylation changes in LMW FGF2 signaling cascades. Statistically significant phosphorylation changes were identified at 14 different sites on 10 distinct proteins, including some with mechanisms already established for LMW FGF2-mediated cardioprotective signaling (e.g. connexin-43), some with new details linking LMW FGF2 to the cardioprotective mechanisms (e.g. cardiac myosin binding protein C or cMyBPC), and also several new downstream effectors not previously recognized for cardioprotective signaling by LMW FGF2. Additionally, one of the phosphopeptides identified, cMyBPC/pSer-282, was further verified with site-specific quantification using an SRM (selected reaction monitoring)-based approach that also relies on isotope labeling of a synthetic phosphopeptide with deuterated acetone as an internal standard. Overall, this study confirms that inexpensive acetone-based peptide labeling can be used in both exploratory and targeted quantification

  6. Scenarios for exercising technical approaches to verified nuclear reductions

    International Nuclear Information System (INIS)

    Doyle, James

    2010-01-01

    Presidents Obama and Medvedev in April 2009 committed to a continuing process of step-by-step nuclear arms reductions beyond the new START treaty that was signed April 8, 2010, and to the eventual goal of a world free of nuclear weapons. In addition, the US Nuclear Posture Review released April 6, 2010 commits the US to initiate a comprehensive national research and development program to support continued progress toward a world free of nuclear weapons, including expanded work on verification technologies and the development of transparency measures. It is impossible to predict the specific directions that US-RU nuclear arms reductions will take over the next 5-10 years. Additional bilateral treaties could be reached requiring effective verification, as indicated by statements made by the Obama administration. There could also be transparency agreements or other initiatives (unilateral, bilateral or multilateral) that require monitoring with a standard of verification lower than formal arms control, but that still need to establish confidence for domestic, bilateral and multilateral audiences that declared actions are implemented. The US Nuclear Posture Review and other statements give some indication of the kinds of actions and declarations that may need to be confirmed in a bilateral or multilateral setting. Several new elements of the nuclear arsenals could be directly limited. For example, it is likely that both strategic and nonstrategic nuclear warheads (deployed and in storage), warhead components, and aggregate stocks of such items could be accountable under a future treaty or transparency agreement. In addition, new initiatives or agreements may require the verified dismantlement of a certain number of nuclear warheads over a specified time period. Eventually, procedures for confirming the elimination of nuclear warheads, components and fissile materials from military stocks will need to be established. This paper is intended to provide useful background information

  7. Identifying trace evidence in data wiping application software

    Directory of Open Access Journals (Sweden)

    Gregory H. Carlton

    2012-06-01

    One area of particular concern for computer forensics examiners involves situations in which someone utilized software applications to destroy evidence. There are products available in the marketplace that are relatively inexpensive and advertised as being able to destroy targeted portions of data stored within a computer system. This study was undertaken to identify these tools and analyze them to determine the extent to which each of the evaluated data wiping applications perform their tasks and to identify trace evidence, if any, left behind on disk media after executing these applications. We evaluated five Windows 7 compatible software products whose advertised features include the ability for users to wipe targeted files, folders, or evidence of selected activities. We conducted a series of experiments that involved executing each application on systems with identical data, and we then analyzed the results and compared the before and after images for each application. We identified information for each application that is beneficial to forensics examiners when faced with similar situations. This paper describes our application selection process, our application evaluation methodology, and our findings. Following this, we describe limitations of this study and suggest areas of additional research that will benefit the study of digital forensics.

  8. The complexity of evaluating and increasing adherence in inflammatory bowel disease

    DEFF Research Database (Denmark)

    Weimers, Petra; Burisch, Johan; Munkholm, Pia

    2017-01-01

    Nonetheless, adherence remains a common and complex issue in IBD care. Patient characteristics such as young age, male sex and employment have previously been verified as possible predictors of non-adherence. Additionally, evaluating adherence is in itself a challenge, since both accurate and easy...

  9. Verifying Operational and Developmental Air Force Weather Cloud Analysis and Forecast Products Using Lidar Data from Department of Energy Atmospheric Radiation Measurement (ARM) Sites

    Science.gov (United States)

    Hildebrand, E. P.

    2017-12-01

    Air Force Weather has developed various cloud analysis and forecast products designed to support global Department of Defense (DoD) missions. A World-Wide Merged Cloud Analysis (WWMCA) and short term Advected Cloud (ADVCLD) forecast is generated hourly using data from 16 geostationary and polar-orbiting satellites. Additionally, WWMCA and Numerical Weather Prediction (NWP) data are used in a statistical long-term (out to five days) cloud forecast model known as the Diagnostic Cloud Forecast (DCF). The WWMCA and ADVCLD are generated on the same polar stereographic 24 km grid for each hemisphere, whereas the DCF is generated on the same grid as its parent NWP model. When verifying the cloud forecast models, the goal is to understand not only the ability to detect cloud, but also the ability to assign it to the correct vertical layer. ADVCLD and DCF forecasts traditionally have been verified using WWMCA data as truth, but this might over-inflate the performance of those models because WWMCA also is a primary input dataset for those models. Because of this, in recent years, a WWMCA Reanalysis product has been developed, but this too is not a fully independent dataset. This year, work has been done to incorporate data from external, independent sources to verify not only the cloud forecast products, but the WWMCA data itself. One such dataset that has been useful for examining the 3-D performance of the cloud analysis and forecast models is Atmospheric Radiation Measurement (ARM) data from various sites around the globe. This presentation will focus on the use of the Department of Energy (DoE) ARM data to verify Air Force Weather cloud analysis and forecast products. Results will be presented to show relative strengths and weaknesses of the analyses and forecasts.

  10. Use of macroinvertebrates to identify cultivated wetlands in the Prairie Pothole Region

    Science.gov (United States)

    Euliss, Ned H.; Mushet, David M.; Johnson, Douglas H.

    2001-01-01

    We evaluated the use of macroinvertebrates as a potential tool to identify dry and intensively farmed temporary and seasonal wetlands in the Prairie Pothole Region. The techniques we designed and evaluated used the dried remains of invertebrates or their egg banks in soils as indicators of wetlands. For both the dried remains of invertebrates and their egg banks, we weighted each taxon according to its affinity for wetlands or uplands. Our study clearly demonstrated that shells, exoskeletons, head capsules, eggs, and other remains of macroinvertebrates can be used to identify wetlands, even when they are dry, intensively farmed, and difficult to identify as wetlands using standard criteria (i.e., hydrology, hydrophytic vegetation, and hydric soils). Although both dried remains and egg banks identified wetlands, the combination was more useful, especially for identifying drained or filled wetlands. We also evaluated the use of coarse taxonomic groupings to stimulate use of the technique by nonspecialists and obtained satisfactory results in most situations.

  11. Identifying sources of atmospheric fine particles in Havana City using Positive Matrix Factorization technique

    International Nuclear Information System (INIS)

    Pinnera, I.; Perez, G.; Ramos, M.; Guibert, R.; Aldape, F.; Flores M, J.; Martinez, M.; Molina, E.; Fernandez, A.

    2011-01-01

    In a previous study, a set of samples of fine and coarse airborne particulate matter collected in an urban area of Havana City was analyzed by the Particle-Induced X-ray Emission (PIXE) technique. The concentrations of 14 elements (S, Cl, K, Ca, Ti, V, Cr, Mn, Fe, Ni, Cu, Zn, Br and Pb) were consistently determined in both particle sizes. The analytical database provided by PIXE was statistically analyzed in order to determine the local pollution sources. The Positive Matrix Factorization (PMF) technique was applied to the fine particle data in order to identify possible pollution sources. These sources were further verified by enrichment factor (EF) calculation. A general discussion of these results is presented in this work. (Author)
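
    PMF itself weights the factorization residuals by per-observation uncertainties; as a simplified stand-in, the sketch below uses plain non-negative matrix factorization from scikit-learn on a hypothetical samples-by-elements concentration matrix.

      # Sketch: factorizing an element-concentration matrix into source
      # profiles and contributions. Plain NMF is used here as a simplified
      # stand-in for PMF (which additionally weights by uncertainty).
      import numpy as np
      from sklearn.decomposition import NMF

      rng = np.random.default_rng(0)
      X = rng.random((50, 14))            # 50 samples x 14 elements (S, Cl, K, ...)

      model = NMF(n_components=4, init="nndsvda", max_iter=500, random_state=0)
      G = model.fit_transform(X)          # source contributions per sample
      F = model.components_               # source profiles (element signatures)
      print(G.shape, F.shape)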

  12. Evaluation of radiation protection conditions in intraoral radiology

    Energy Technology Data Exchange (ETDEWEB)

    Miguel, Cristiano; Barros, Frieda Saicla; Rocha, Anna Silvia Penteado Setti da, E-mail: miguel_cristianoch@yahoo.com.br [Universidade Tecnologica Federal do Parana (PPGEB/UTFPR), Curitiba, PR (Brazil). Programa de Pos-graduacao em Engenharia Biomedica; Tilly Junior, Joao Gilberto [Universidade Federal do Parana (UNIR/UFPR), Curitiba, PR (Brazil). Hospital de Clinicas. Unidade de Imagem e Radioterapia; Almeida, Claudio Domingues de [Instituto de Radioprotecao e Dosimetria (IRD/CNEN-RJ), Rio de Janeiro, RJ (Brazil). Dept. de Fisica Medica

    2016-04-15

    Introduction: Dental radiology represents about 20% of human exposure to radiation in radiodiagnostics. Although the doses delivered in intraoral dentistry are considered low, they should not be ignored due to the volume of procedures performed. This study presents the radiation protection conditions for intraoral radiology in Curitiba - PR. Methods: Data were collected through quantitative, descriptive field research during the period between September of 2013 and December of 2014. The survey sample consisted of 97 dentists and 130 intraoral equipment units. The data related to the equipment were collected using structured questions and quality control evaluations. The evaluations of the entrance skin dose, the size of the radiation field and the total filtration were performed with dosimetry kits provided and evaluated by IRD/CNEN. The exposure time and voltage were measured using noninvasive detectors. The occupational dose was verified by thermoluminescent dosimeters. The existence of personal protection equipment, the type of image processing and the knowledge of dentists about radiation protection were verified through the application of a questionnaire. Results: Among the survey's results, it is important to emphasize that 90% of the evaluated equipment does not meet all the requirements of the Brazilian radiation protection standards. Conclusion: The lack of knowledge about radiation protection, the poor operating conditions of the equipment, and image processing through the visual method are mainly responsible for the unnecessary exposure of patients to ionizing radiation. (author)

  13. Performance Evaluation of Spectral Clustering Algorithm using Various Clustering Validity Indices

    OpenAIRE

    M. T. Somashekara; D. Manjunatha

    2014-01-01

    In spite of the popularity of the spectral clustering algorithm, its evaluation procedures are still in a developmental stage. In this article, we have taken the benchmark Iris dataset for performing a comparative study of twelve indices for evaluating the spectral clustering algorithm. The results of the spectral clustering technique were also compared with the k-means algorithm. The validity of the indices was also verified with accuracy and Normalized Mutual Information (NMI) score. Spectral clustering algo...
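
    A minimal sketch of the experiment described: spectral clustering of the Iris data with an external validity check against the true labels via NMI, using scikit-learn. The hyperparameters are illustrative, not those of the study.

      # Sketch: spectral clustering on the Iris benchmark, scored with NMI.
      from sklearn.cluster import SpectralClustering
      from sklearn.datasets import load_iris
      from sklearn.metrics import normalized_mutual_info_score

      X, y = load_iris(return_X_y=True)
      labels = SpectralClustering(n_clusters=3, affinity="rbf",
                                  random_state=0).fit_predict(X)
      print("NMI:", normalized_mutual_info_score(y, labels))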

  14. Verification and Performance Evaluation of Timed Game Strategies

    DEFF Research Database (Denmark)

    David, Alexandre; Fang, Huixing; Larsen, Kim Guldstrand

    2014-01-01

    Control synthesis techniques, based on timed games, derive strategies to ensure a given control objective, e.g., time-bounded reachability. Model checking verifies correctness properties of systems. Statistical model checking can be used to analyse performance aspects of systems, e.g., energy consumption. In this work, we propose to combine these three techniques. In particular, given a strategy synthesized for a timed game and a given control objective, we want to make a deeper examination of the consequences of adopting this strategy. Firstly, we want to apply model checking to the timed game under the synthesized strategy in order to verify additional correctness properties. Secondly, we want to apply statistical model checking to evaluate various performance aspects of the synthesized strategy. For this, the underlying timed game is extended with relevant price and stochastic information...

  15. Ringhals 1 PSA - Evaluation of safety raising measures

    International Nuclear Information System (INIS)

    Hellstroem, P.; Enerholm, A.; Holmgren, P.

    1995-09-01

    A PSA study of the BWR reactor has been evaluated, and the following comments are made: The objectives of the study have been reached, but an overview is lacking. Means for verifying the functional demands at primary events should have a very high priority. A deeper analysis is needed to verify that overpressurizing and overfilling do not cause HS. The analysis should include a report on dynamic effects at a LOCA. The level of detail in modelling functional dependencies should be increased. The low probabilities for faults caused by the operating personnel should be explained, and possible deficiencies in instructions etc. documented. The results of the study should be analyzed in more depth in comparison with other studies. Suggestions for improvement of the plant, and priorities for these, are missing. The documentation should be improved

  16. External evaluation of the Radiation Therapy Oncology Group brachial plexus contouring protocol: several issues identified

    International Nuclear Information System (INIS)

    Min, Myo; Carruthers, Scott; Zanchetta, Lydia; Roos, Daniel; Keating, Elly; Shakeshaft, John; Baxi, Siddhartha; Penniment, Michael; Wong, Karen

    2014-01-01

    The aims of the study were to evaluate interobserver variability in contouring the brachial plexus (BP) using the Radiation Therapy Oncology Group (RTOG)-approved protocol and to analyse BP dosimetries. Seven outliners independently contoured the BPs of 15 consecutive patients. Interobserver variability was reviewed qualitatively (visually, by using planning axial computed-tomography images and anteroposterior digitally reconstructed radiographs) and quantitatively (by volumetric and statistical analyses). Dose–volume histograms of BPs were calculated and compared. We found significant interobserver variability among outliners in both qualitative and quantitative analyses. These were most pronounced for the T1 nerve roots on visual inspection and for the BP volume on statistical analysis. The BP volumes were smaller than those described in the RTOG atlas paper, with a mean volume of 20.8 cc (range 11–40.7 cc) compared with 33±4 cc (25.1–39.4 cc). The average values of mean dose, maximum dose, V60Gy, V66Gy and V70Gy for patients treated with conventional radiotherapy and IMRT were 42.2 Gy versus 44.8 Gy, 64.5 Gy versus 68.5 Gy, 6.1% versus 7.6%, 2.9% versus 2.4% and 0.6% versus 0.3%, respectively. This is the first independent external evaluation of the published protocol. We have identified several issues, including significant interobserver variation. Although radiation oncologists should contour BPs to avoid dose dumping, especially when using IMRT, the RTOG atlas should be used with caution. Because BPs are largely radiologically occult on CT, we propose the term brachial-plexus regions (BPRs) to represent regions where BPs are likely to be present. Consequently, BPRs should in principle be contoured generously.

  17. Identifying modular relations in complex brain networks

    DEFF Research Database (Denmark)

    Andersen, Kasper Winther; Mørup, Morten; Siebner, Hartwig

    2012-01-01

    We evaluate the infinite relational model (IRM) against two simpler alternative nonparametric Bayesian models for identifying structures in multi-subject brain networks. The models are evaluated for their ability to predict new data and infer reproducible structures. Prediction and reproducibility... and obtains comparable reproducibility and predictability. For resting-state functional magnetic resonance imaging data from 30 healthy controls, the IRM model is also superior to the two simpler alternatives, suggesting that brain networks indeed exhibit universal complex relational structure...

  18. Simulation-Based Evaluation of the Performances of an Algorithm for Detecting Abnormal Disease-Related Features in Cattle Mortality Records.

    Science.gov (United States)

    Perrin, Jean-Baptiste; Durand, Benoît; Gay, Emilie; Ducrot, Christian; Hendrikx, Pascal; Calavas, Didier; Hénaux, Viviane

    2015-01-01

    We performed a simulation study to evaluate the performances of an anomaly detection algorithm considered in the frame of an automated surveillance system of cattle mortality. The method consisted in a combination of temporal regression and spatial cluster detection which allows identifying, for a given week, clusters of spatial units showing an excess of deaths in comparison with their own historical fluctuations. First, we simulated 1,000 outbreaks of a disease causing extra deaths in the French cattle population (about 200,000 herds and 20 million cattle) according to a model mimicking the spreading patterns of an infectious disease and injected these disease-related extra deaths in an authentic mortality dataset, spanning from January 2005 to January 2010. Second, we applied our algorithm on each of the 1,000 semi-synthetic datasets to identify clusters of spatial units showing an excess of deaths considering their own historical fluctuations. Third, we verified if the clusters identified by the algorithm did contain simulated extra deaths in order to evaluate the ability of the algorithm to identify unusual mortality clusters caused by an outbreak. Among the 1,000 simulations, the median duration of simulated outbreaks was 8 weeks, with a median number of 5,627 simulated deaths and 441 infected herds. Within the 12-week trial period, 73% of the simulated outbreaks were detected, with a median timeliness of 1 week, and a mean of 1.4 weeks. The proportion of outbreak weeks flagged by an alarm was 61% (i.e. sensitivity) whereas one in three alarms was a true alarm (i.e. positive predictive value). The performances of the detection algorithm were evaluated for alternative combination of epidemiologic parameters. The results of our study confirmed that in certain conditions automated algorithms could help identifying abnormal cattle mortality increases possibly related to unidentified health events.
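
    A sketch of how the reported performance measures can be computed from weekly indicators for one simulated outbreak: sensitivity as the share of outbreak weeks flagged, positive predictive value as the share of alarms that are true, and timeliness as weeks from outbreak onset to the first true alarm. The indicator vectors below are hypothetical.

      # Sketch: detection-performance metrics from weekly boolean indicators.
      def evaluate_detection(outbreak_weeks, alarm_weeks):
          tp = sum(o and a for o, a in zip(outbreak_weeks, alarm_weeks))
          sensitivity = tp / sum(outbreak_weeks)            # flagged outbreak weeks
          ppv = tp / sum(alarm_weeks)                       # true alarms
          first_onset = outbreak_weeks.index(True)
          hits = [i for i, (o, a) in enumerate(zip(outbreak_weeks, alarm_weeks))
                  if o and a]
          timeliness = hits[0] - first_onset if hits else None  # weeks to detection
          return sensitivity, ppv, timeliness

      outbreak = [False, True, True, True, True, False]
      alarms   = [False, False, True, True, False, True]
      print(evaluate_detection(outbreak, alarms))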

  19. Evaluation Guide for the Evaluated Spent Nuclear Fuel Assay Database (SFCOMPO)

    International Nuclear Information System (INIS)

    Ortego, P.; Boulanger, D.; Gauld, I.; Gysemans, M.; Hennebach, M.; Mennerdahl, D.; Neuber, J.C.; Tore, C.; Tittelbach, S.; Van Winckel, S.; Zwicky, H.-U.

    2016-01-01

    The Expert Group on Assay Data for Spent Nuclear Fuel (EGADSNF) of the Working Party on Nuclear Criticality Safety (WPNCS) under the auspices of the NEA Nuclear Science Committee was set up in 2007 with the objective of compiling and evaluating experimental data on the nuclide compositions of irradiated nuclear fuel. Experimental data refer not only to the measured nuclide inventories and uncertainties as determined mainly by destructive radiochemical assay of spent fuel samples, but also the fuel assembly design data, reactor design information, and operating data necessary to develop benchmark models. These data provide an important basis for validating calculation methods (computer codes and nuclear data) used in fuel burn-up analyses applied to spent fuel criticality safety analyses using burnup credit, thermal analysis, radiation shielding, accident dose consequence analysis, fuel cycle safety, reprocessing, and deep geological repository safety studies. The quality and usefulness of the experimental data for methods validation can be improved by developing complete descriptions of the experiments, providing benchmark specifications, and by performing independent evaluations of the experimental and benchmark data. Evaluations of the experimental data have been identified as an important task to verify the quality of the information in the spent fuel composition (SFCOMPO) database maintained by the Nuclear Energy Agency (NEA) Data Bank. This guide defines the evaluation document format and data review procedures for evaluators tasked with reviewing the experimental data. Guidance is developed to provide recommended procedures and criteria developed by experts in the field on how to perform standardised reviews, how to identify potential problems in the measurement data and gaps in the experimental description, and provide guidance on how to resolve these issues, when possible, using a consistent technical basis. Procedures for deriving benchmark data and models

  20. Evaluation of management information systems: A study at a further education and training college

    Directory of Open Access Journals (Sweden)

    Mariette Visser

    2013-03-01

    Objectives: The main objective was to propose an MIS evaluation model and evaluation tool (questionnaire), and to verify the model empirically by evaluating the MIS at a selected FET college. The supporting objectives were, firstly, to identify the most appropriate MIS evaluation models from the literature. Secondly, to propose an MIS evaluation model for FET colleges based on the literature. Thirdly, to develop the evaluation tool (questionnaire) based on these models. Fourthly, to capture and analyse data from one FET college in order to evaluate the performance of the MIS at the college. The final supporting objective was to evaluate the proposed model by triangulating the findings from the survey with the findings from the interviews. Method: The proposed MIS evaluation model is based on the integration of three existing MIS evaluation models. The evaluation tool was developed by combining four empirically tested questionnaires that capture the constructs in the underlying models. A survey and semi-structured interviews were used as data collection methods. Statistical tests for consistency, scale reliability (Cronbach's alpha) and unidimensionality (Principal Component Analysis) were applied to explore the constructs in the model. Results: Results from the empirical testing of the newly designed evaluation tool were used to refine the initial model. The qualitative data capturing and analysis added value in explaining and contextualising the quantitative findings. Conclusion: The main contribution is the SA-FETMIS success model and evaluation tool, which managers can use to evaluate the MIS at an educational institution. The novelty of the research lies in using a mixed methods approach where previous MIS success evaluation studies mainly used quantitative methods.
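
    A sketch of the scale-reliability check named above: Cronbach's alpha computed from a respondents-by-items score matrix. The item scores are hypothetical.

      # Sketch: Cronbach's alpha for one construct of the evaluation tool.
      # alpha = k/(k-1) * (1 - sum(item variances) / variance of total score)
      import numpy as np

      def cronbach_alpha(items: np.ndarray) -> float:
          k = items.shape[1]
          item_vars = items.var(axis=0, ddof=1).sum()
          total_var = items.sum(axis=1).var(ddof=1)
          return k / (k - 1) * (1 - item_vars / total_var)

      scores = np.array([[4, 5, 4], [3, 3, 4], [5, 5, 5], [2, 3, 2], [4, 4, 5]])
      print(f"alpha = {cronbach_alpha(scores):.2f}")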

  1. Communication difficulties in children identified with psychiatric problems

    OpenAIRE

    Helland, Wenche Andersen

    2010-01-01

    Several studies have pointed to an overlap between different developmental psychopathological conditions and language impairments, and difficulties with communication have been identified in children of various diagnostic backgrounds. This thesis is based on three empirical studies, and its purposes are to investigate communication difficulties, as reported by parents, in children identified with psychiatric problems, as well as to evaluate a Norwegian adaptation of the Children'...

  2. Mechanical and assembly units of viral capsids identified via quasi-rigid domain decomposition.

    Directory of Open Access Journals (Sweden)

    Guido Polles

    Key steps in a viral life-cycle, such as self-assembly of a protective protein container or, in some cases, subsequent maturation events, are governed by the interplay of physico-chemical mechanisms involving various spatial and temporal scales. These salient aspects of a viral life cycle are hence well described and rationalised from a mesoscopic perspective. Accordingly, various experimental and computational efforts have been directed towards identifying the fundamental building blocks that are instrumental for the mechanical response, or constitute the assembly units, of a few specific viral shells. Motivated by these earlier studies, we introduce and apply a general and efficient computational scheme for identifying the stable domains of a given viral capsid. The method is based on elastic network models and quasi-rigid domain decomposition. It is first applied to a heterogeneous set of well-characterized viruses (CCMV, MS2, STNV, STMV) for which the known mechanical or assembly domains are correctly identified. The validated method is next applied to other viral particles such as L-A, Pariacoto and polyoma viruses, whose fundamental functional domains are still unknown or debated and for which we formulate verifiable predictions. The numerical code implementing the domain decomposition strategy is made freely available.
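
    A sketch of the elastic-network starting point for such a decomposition: a Gaussian-network (Kirchhoff) matrix is built from pseudo-residue coordinates with a distance cutoff, and its softest non-trivial eigenmodes are extracted; a quasi-rigid decomposition would then cluster residues that move together in those modes. The coordinates below are random stand-ins for a real capsid structure.

      # Sketch: Gaussian network model from coordinates, soft modes via eigh.
      import numpy as np

      rng = np.random.default_rng(1)
      coords = rng.random((100, 3)) * 30.0           # 100 pseudo-residues, Angstroms
      cutoff = 10.0

      d = np.linalg.norm(coords[:, None, :] - coords[None, :, :], axis=-1)
      contact = (d < cutoff) & (d > 0)
      kirchhoff = np.diag(contact.sum(axis=1)) - contact.astype(float)

      evals, evecs = np.linalg.eigh(kirchhoff)
      soft_modes = evecs[:, 1:6]                     # skip the trivial zero mode
      print("softest non-trivial eigenvalues:", np.round(evals[1:6], 3))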

  3. Shuttle orbiter Ku-band radar/communications system design evaluation. Deliverable test equipment evaluation

    Science.gov (United States)

    Maronde, R. G.

    1980-07-01

    The Ku-band test equipment, known as the Deliverable System Test Equipment (DSTE), is reviewed and evaluated. The DSTE is semiautomated, and computer programs were generated for 14 communication mode tests and 17 radar mode tests. The 31 test modules provide a good cross section of tests with which to exercise the Ku-band system; however, the set is very limited when used to verify Ku-band system performance. More detailed test descriptions are needed, and a major area of concern is the DSTE sell-off procedure, which is inadequate.

  4. Open MR imaging of the unstable shoulder in the apprehension test position: description and evaluation of an alternative MR examination position

    International Nuclear Information System (INIS)

    Wintzell, G.; Larsson, S.; Larsson, H.; Zyto, K.

    1999-01-01

    The aim of this study was to describe and evaluate an alternative MR assessment procedure for analysis of unstable shoulders. Twelve patients with unilateral recurrent anterior shoulder dislocation had both shoulders examined. Magnetic resonance imaging was performed with an open-MR system in the apprehension position, with the shoulder in 90° of abduction and maximum tolerable external rotation. Contrast enhancement was achieved with intravenous gadolinium. Correlations were made to the findings at operation. In 10 of 12 unstable shoulders the inferior glenohumeral ligament labral complex (IGHLLC) was detached from the glenoid as seen on MR and later verified during surgery. In one shoulder MR was unable to show a capsulolabral detachment that was verified at surgery, whereas in one shoulder both MR and surgical assessment revealed no soft tissue detachment (accuracy 92%). A Hill-Sachs lesion was visualized and verified in all unstable shoulders, whereas the stable controls revealed normal IGHLLC and no Hill-Sachs lesion. Open-MRI evaluation of the shoulder in the apprehension test position may become a useful tool for the evaluation of anterior shoulder instability. (orig.)

  5. The International Criticality Safety Benchmark Evaluation Project

    International Nuclear Information System (INIS)

    Briggs, B. J.; Dean, V. F.; Pesic, M. P.

    2001-01-01

    In order to properly manage the risk of a nuclear criticality accident, it is important to establish the conditions for which such an accident becomes possible for any activity involving fissile material. Only when this information is known is it possible to establish the likelihood of actually achieving such conditions. It is therefore important that criticality safety analysts have confidence in the accuracy of their calculations. Confidence in analytical results can only be gained through comparison of those results with experimental data. The Criticality Safety Benchmark Evaluation Project (CSBEP) was initiated in October of 1992 by the US Department of Energy. The project was managed through the Idaho National Engineering and Environmental Laboratory (INEEL), but involved nationally known criticality safety experts from Los Alamos National Laboratory, Lawrence Livermore National Laboratory, Savannah River Technology Center, Oak Ridge National Laboratory and the Y-12 Plant, Hanford, Argonne National Laboratory, and the Rocky Flats Plant. An International Criticality Safety Data Exchange component was added to the project during 1994 and the project became what is currently known as the International Criticality Safety Benchmark Evaluation Project (ICSBEP). Representatives from the United Kingdom, France, Japan, the Russian Federation, Hungary, Kazakhstan, Korea, Slovenia, Yugoslavia, Spain, and Israel are now participating on the project. In December of 1994, the ICSBEP became an official activity of the Organization for Economic Cooperation and Development - Nuclear Energy Agency's (OECD-NEA) Nuclear Science Committee. The United States currently remains the lead country, providing most of the administrative support. The purpose of the ICSBEP is to: (1) identify and evaluate a comprehensive set of critical benchmark data; (2) verify the data, to the extent possible, by reviewing original and subsequently revised documentation, and by talking with the

  6. Learning to identify Protected Health Information by integrating knowledge- and data-driven algorithms: A case study on psychiatric evaluation notes.

    Science.gov (United States)

    Dehghan, Azad; Kovacevic, Aleksandar; Karystianis, George; Keane, John A; Nenadic, Goran

    2017-11-01

    De-identification of clinical narratives is one of the main obstacles to making healthcare free text available for research. In this paper we describe our experience in expanding and tailoring two existing tools as part of the 2016 CEGS N-GRID Shared Tasks Track 1, which evaluated de-identification methods on a set of psychiatric evaluation notes for up to 25 different types of Protected Health Information (PHI). The methods we used rely on machine learning on either a large or small feature space, with additional strategies, including two-pass tagging and multi-class models, which both proved to be beneficial. The results show that the integration of the proposed methods can identify Health Insurance Portability and Accountability Act (HIPAA)-defined PHI with overall F1-scores of ~90% and above. Yet, some classes (Profession, Organization) proved again to be challenging given the variability of expressions used to reference given information. Copyright © 2017. Published by Elsevier Inc.
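
    A sketch of the two-pass tagging idea mentioned above: whatever the first-pass model finds is propagated to every other literal occurrence of the same string on a second pass over the note. Here first_pass() is a hypothetical stand-in for the trained tagger.

      # Sketch: two-pass PHI tagging via literal-mention propagation.
      import re

      def first_pass(text):
          # placeholder: pretend the model found one NAME mention
          return [("NAME", "John Smith")]

      def second_pass(text, mentions):
          spans = []
          for label, surface in mentions:
              for m in re.finditer(re.escape(surface), text):
                  spans.append((m.start(), m.end(), label))
          return sorted(spans)

      note = "John Smith was admitted. Mr John Smith reports low mood."
      print(second_pass(note, first_pass(note)))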

  7. Evaluation of bentonite alteration due to interactions with iron. Sensitivity analyses to identify the important factors for the bentonite alteration

    International Nuclear Information System (INIS)

    Sasamoto, Hiroshi; Wilson, James; Sato, Tsutomu

    2013-01-01

    Performance assessment of geological disposal systems for high-level radioactive waste requires a consideration of long-term systems behaviour. It is possible that the alteration of swelling clay present in bentonite buffers might have an impact on buffer functions. In the present study, iron (as a candidate overpack material)-bentonite (I-B) interactions were evaluated as the main buffer alteration scenario. Existing knowledge on alteration of bentonite during I-B interactions was first reviewed, then the evaluation methodology was developed considering modeling techniques previously used overseas. A conceptual model for smectite alteration during I-B interactions was produced. The following reactions and processes were selected: 1) release of Fe2+ due to overpack corrosion; 2) diffusion of Fe2+ in compacted bentonite; 3) sorption of Fe2+ on smectite edge and ion exchange in interlayers; 4) dissolution of primary phases and formation of alteration products. Sensitivity analyses were performed to identify the most important factors for the alteration of bentonite by I-B interactions. (author)
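
    To illustrate how processes 1)-3) are typically combined in scoping calculations of this kind, the sketch below solves a one-dimensional diffusion equation for Fe2+ in compacted bentonite, with sorption lumped into a retardation factor. It is a generic finite-difference sketch under invented parameter values, not the evaluation methodology of this study.

    ```python
    import numpy as np

    def fe_profile(n=100, dx=1e-3, D=1e-11, R=5.0, years=100.0, c_iface=1.0):
        """Explicit finite differences for Fe2+ diffusing from the overpack
        interface into compacted bentonite (node spacing dx in metres);
        sorption is lumped into the retardation factor R. All parameter
        values are hypothetical."""
        c = np.zeros(n)
        dt = 0.4 * R * dx**2 / D              # below the 0.5 stability limit
        for _ in range(int(years * 3.156e7 / dt)):
            c[0] = c_iface                    # Fe2+ supplied by corrosion
            c[1:-1] += (D / R) * dt / dx**2 * (c[2:] - 2 * c[1:-1] + c[:-2])
        return c                              # concentration vs. depth

    profile = fe_profile()
    print(f"1% front after 100 y: {(profile > 0.01).sum()} mm")
    ```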

  8. Non-destructive technique to verify clearance of pipes

    Directory of Open Access Journals (Sweden)

    Savidou Anastasia

    2010-01-01

    Full Text Available A semi-empirical, non-destructive technique to evaluate the activity of gamma ray emitters in contaminated pipes is discussed. The technique is based on in-situ measurements by a portable NaI gamma ray spectrometer. The efficiency of the detector for the pipe and detector configuration was evaluated by Monte Carlo calculations performed using the MCNP code. Gamma ray detector full-energy peak efficiency was predicted assuming a homogeneous activity distribution over the internal surface of the pipe for 344 keV, 614 keV, 662 keV, and 1332 keV photons, representing Eu-152, Ag-108m, Cs-137, and Co-60 contamination, respectively. The effect of inhomogeneity on the accuracy of the technique was also examined. The model was validated against experimental measurements performed using a Cs-137 volume calibration source representing a contaminated pipe, and good agreement was found between the calculated and experimental results. The technique represents a sensitive and cost-effective technology for calibrating portable gamma ray spectrometry systems and can be applied in a range of radiation protection and waste management applications.
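
    For a single nuclide, the activity reconstruction described above reduces to dividing the net full-energy-peak counts by the product of the simulated efficiency, the live time, and the gamma emission probability. A minimal sketch with hypothetical numbers (0.851 is the emission probability of the 662 keV Cs-137 line):

    ```python
    def activity_bq(net_peak_counts, fep_efficiency, live_time_s, gamma_yield):
        """Activity (Bq) from a net full-energy-peak area, using the
        MCNP-derived full-energy-peak efficiency for the pipe geometry."""
        return net_peak_counts / (fep_efficiency * live_time_s * gamma_yield)

    # Hypothetical measurement: 1.2e4 net counts in the 662 keV Cs-137 peak
    # over 600 s, with a simulated efficiency of 2.5e-3 counts/photon.
    print(f"{activity_bq(1.2e4, 2.5e-3, 600.0, 0.851):.3e} Bq")
    ```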

  9. Evaluation of an inpatient fall risk screening tool to identify the most critical fall risk factors in inpatients.

    Science.gov (United States)

    Hou, Wen-Hsuan; Kang, Chun-Mei; Ho, Mu-Hsing; Kuo, Jessie Ming-Chuan; Chen, Hsiao-Lien; Chang, Wen-Yin

    2017-03-01

    To evaluate the accuracy of the inpatient fall risk screening tool and to identify the most critical fall risk factors in inpatients. Variations exist in several screening tools applied in acute care hospitals for examining risk factors for falls and identifying high-risk inpatients. Secondary data analysis. A subset of inpatient data for the period from June 2011-June 2014 was extracted from the nursing information system and adverse event reporting system of an 818-bed teaching medical centre in Taipei. Data were analysed using descriptive statistics, receiver operating characteristic curve analysis and logistic regression analysis. During the study period, 205 fallers and 37,232 nonfallers were identified. The results revealed that the inpatient fall risk screening tool (cut-off point of ≥3) had a low sensitivity level (60%), satisfactory specificity (87%), a positive predictive value of 2·0% and a negative predictive value of 99%. The receiver operating characteristic curve analysis revealed an area under the curve of 0·805 (sensitivity, 71·8%; specificity, 78%). To increase the sensitivity values, the Youden index suggests at least 1·5 points to be the most suitable cut-off point for the inpatient fall risk screening tool. Multivariate logistic regression analysis revealed a considerably increased fall risk in patients with impaired balance and impaired elimination. The fall risk factor was also significantly associated with days of hospital stay and with admission to surgical wards. The findings can raise awareness about the two most critical risk factors for falls among future clinical nurses and other healthcare professionals and thus facilitate the development of fall prevention interventions. This study highlights the needs for redefining the cut-off points of the inpatient fall risk screening tool to effectively identify inpatients at a high risk of falls. Furthermore, inpatients with impaired balance and impaired elimination should be closely
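
    The Youden-index re-derivation of the cut-off point can be reproduced with standard tools; the sketch below uses synthetic screening scores in place of the study's data.

    ```python
    import numpy as np
    from sklearn.metrics import roc_curve, roc_auc_score

    rng = np.random.default_rng(0)
    y = np.r_[np.ones(205), np.zeros(2000)]        # fallers vs. non-fallers
    scores = np.r_[rng.normal(4.0, 1.5, 205),      # hypothetical tool scores
                   rng.normal(2.0, 1.5, 2000)]

    fpr, tpr, thresholds = roc_curve(y, scores)
    j = tpr - fpr                                  # Youden J = sens + spec - 1
    best = np.argmax(j)
    print(f"AUC = {roc_auc_score(y, scores):.3f}")
    print(f"best cut-off = {thresholds[best]:.2f} "
          f"(sens = {tpr[best]:.2f}, spec = {1 - fpr[best]:.2f})")
    ```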

  10. Non-Destructive Evaluation Method Based On Dynamic Invariant Stress Resultants

    Directory of Open Access Journals (Sweden)

    Zhang Junchi

    2015-01-01

    Full Text Available Most vibration-based damage detection methods are based on changes in frequencies, mode shapes, mode shape curvature, and flexibilities. These methods are limited and typically can only detect the presence and location of damage. Current methods can seldom identify the exact severity of damage to structures. This paper will present research in the development of a new non-destructive evaluation method to identify the existence, location, and severity of damage for structural systems. The method utilizes the concept of invariant stress resultants (ISR). The basic concept of ISR is that at any given cross section the resultant internal force distribution in a structural member is not affected by the inflicted damage. The method utilizes dynamic analysis of the structure to simulate direct measurements of acceleration, velocity and displacement simultaneously. The proposed dynamic ISR method is developed and utilized to detect damage from the corresponding changes in mass, damping and stiffness. The objectives of this research are to develop the basic theory of the dynamic ISR method, apply it to specific types of structures, and verify the accuracy of the developed theory. Numerical results that demonstrate the application of the method reflect its advanced sensitivity and accuracy in characterizing multiple damage locations.

  11. Identifying a breeding habitat of a critically endangered fish, Acheilognathus typus, in a natural river in Japan

    Science.gov (United States)

    Sakata, Masayuki K.; Maki, Nobutaka; Sugiyama, Hideki; Minamoto, Toshifumi

    2017-12-01

    Freshwater biodiversity has been severely threatened in recent years, and to conserve endangered species, their distribution and breeding habitats need to be clarified. However, identifying breeding sites in a large area is generally difficult. Here, by combining the emerging environmental DNA (eDNA) analysis with subsequent traditional collection surveys, we successfully identified a breeding habitat for the critically endangered freshwater fish Acheilognathus typus in the mainstream of Omono River in Akita Prefecture, Japan, which is one of the original habitats of this species. Based on DNA cytochrome B sequences of A. typus and closely related species, we developed species-specific primers and a probe that were used in real-time PCR for detecting A. typus eDNA. After verifying the specificity and applicability of the primers and probe on water samples from known artificial habitats, eDNA analysis was applied to water samples collected at 99 sites along Omono River. Two of the samples were positive for A. typus eDNA, and thus, small fixed nets and bottle traps were set out to capture adult fish and verify egg deposition in bivalves (the preferred breeding substrate for A. typus) in the corresponding regions. Mature female and male individuals and bivalves containing laid eggs were collected at one of the eDNA-positive sites. This was the first record of adult A. typus in Omono River in 11 years. This study highlights the value of eDNA analysis to guide conventional monitoring surveys and shows that combining both methods can provide important information on breeding sites that is essential for species' conservation.

  12. Identifying a breeding habitat of a critically endangered fish, Acheilognathus typus, in a natural river in Japan.

    Science.gov (United States)

    Sakata, Masayuki K; Maki, Nobutaka; Sugiyama, Hideki; Minamoto, Toshifumi

    2017-11-14

    Freshwater biodiversity has been severely threatened in recent years, and to conserve endangered species, their distribution and breeding habitats need to be clarified. However, identifying breeding sites in a large area is generally difficult. Here, by combining the emerging environmental DNA (eDNA) analysis with subsequent traditional collection surveys, we successfully identified a breeding habitat for the critically endangered freshwater fish Acheilognathus typus in the mainstream of Omono River in Akita Prefecture, Japan, which is one of the original habitats of this species. Based on DNA cytochrome B sequences of A. typus and closely related species, we developed species-specific primers and a probe that were used in real-time PCR for detecting A. typus eDNA. After verifying the specificity and applicability of the primers and probe on water samples from known artificial habitats, eDNA analysis was applied to water samples collected at 99 sites along Omono River. Two of the samples were positive for A. typus eDNA, and thus, small fixed nets and bottle traps were set out to capture adult fish and verify egg deposition in bivalves (the preferred breeding substrate for A. typus) in the corresponding regions. Mature female and male individuals and bivalves containing laid eggs were collected at one of the eDNA-positive sites. This was the first record of adult A. typus in Omono River in 11 years. This study highlights the value of eDNA analysis to guide conventional monitoring surveys and shows that combining both methods can provide important information on breeding sites that is essential for species' conservation.

  13. The Patent Literature As A Shortcut To Identify Knowledge Suppliers

    DEFF Research Database (Denmark)

    Søberg, Peder Veng

    The present paper explores characteristics of valuable patents that have been subject to litigation which resulted in some of the largest fines to patent infringers reported in history. The valuable patents are compared with less valuable patents in order to identify new methods of evaluating patents, which decreases the time span between when a patent is filed and when its value can be evaluated when searching the patent literature. A potential benefit thereof could be that the patent literature could become relevant in order to identify potential knowledge suppliers.

  14. A novel method of evaluating the lift force on the bluff body based on Noca’s flux equation

    International Nuclear Information System (INIS)

    Sui Xiang-Kun; Jiang Nan

    2015-01-01

    The influence of experimental error on the lift force evaluated by Noca’s flux equation is studied by adding errors to direct numerical simulation data for flow past a cylinder at Re = 100. As Noca suggested using a low-pass filter to remove the high-frequency noise in the evaluated lift force, we verify that his method is inapplicable to a dataset with 1% experimental error, although the precision is acceptable in practice. To overcome this defect, a novel method is proposed in this paper: the average of the lift forces calculated using multiple control volumes is taken as the evaluation before the low-pass filter is applied. The method is applied to experimental data for flow past a cylinder at approximately Re = 900 for validation. The results show that it evaluates the lift forces much more accurately. (paper)
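
    A minimal sketch of the proposed remedy, under assumed sampling parameters: average the lift histories evaluated on several control volumes first, then low-pass filter the mean (a Butterworth filter is used here for illustration; the filter design is not taken from the paper).

    ```python
    import numpy as np
    from scipy.signal import butter, filtfilt

    def averaged_filtered_lift(lift_per_cv, fs, cutoff):
        """lift_per_cv: (n_control_volumes, n_samples) lift histories from
        Noca's flux equation. Averaging suppresses control-volume-dependent
        noise; the filter removes the residual high-frequency component."""
        mean_lift = lift_per_cv.mean(axis=0)
        b, a = butter(4, cutoff / (fs / 2))       # 4th-order low-pass
        return filtfilt(b, a, mean_lift)

    # Hypothetical demo: 8 control volumes, 2000 samples at 100 Hz.
    fs, t = 100.0, np.arange(2000) / 100.0
    true = 0.3 * np.sin(2 * np.pi * 0.16 * t)     # shedding-like lift signal
    noisy = true + np.random.normal(0.0, 0.05, (8, t.size))
    clean = averaged_filtered_lift(noisy, fs, cutoff=1.0)
    ```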

  15. Verifying three-dimensional skull model reconstruction using cranial index of symmetry.

    Science.gov (United States)

    Kung, Woon-Man; Chen, Shuo-Tsung; Lin, Chung-Hsiang; Lu, Yu-Mei; Chen, Tzu-Hsuan; Lin, Muh-Shi

    2013-01-01

    Difficulty exists in scalp adaptation for cranioplasty with customized computer-assisted design/manufacturing (CAD/CAM) implant in situations of excessive wound tension and sub-cranioplasty dead space. To solve this clinical problem, the CAD/CAM technique should include algorithms to reconstruct a depressed contour to cover the skull defect. Satisfactory CAM-derived alloplastic implants are based on highly accurate three-dimensional (3-D) CAD modeling. Thus, it is quite important to establish a symmetrically regular CAD/CAM reconstruction prior to depressing the contour. The purpose of this study is to verify the aesthetic outcomes of CAD models with regular contours using cranial index of symmetry (CIS). From January 2011 to June 2012, decompressive craniectomy (DC) was performed for 15 consecutive patients in our institute. 3-D CAD models of skull defects were reconstructed using commercial software. These models were checked in terms of symmetry by CIS scores. CIS scores of CAD reconstructions were 99.24±0.004% (range 98.47-99.84). CIS scores of these CAD models were statistically significantly greater than 95%, identical to 99.5%, but lower than 99.6% (matched-pairs signed rank test). These data evidenced the highly accurate symmetry of these CAD models with regular contours. CIS calculation is beneficial to assess aesthetic outcomes of CAD-reconstructed skulls in terms of cranial symmetry. This enables more accurate CAD models and CAM cranial implants with depressed contours, which are essential in patients with difficult scalp adaptation.

  16. CommWalker: correctly evaluating modules in molecular networks in light of annotation bias.

    Science.gov (United States)

    Luecken, M D; Page, M J T; Crosby, A J; Mason, S; Reinert, G; Deane, C M

    2018-03-15

    Detecting novel functional modules in molecular networks is an important step in biological research. In the absence of gold standard functional modules, functional annotations are often used to verify whether detected modules/communities have biological meaning. However, as we show, the uneven distribution of functional annotations means that such evaluation methods favor communities of well-studied proteins. We propose a novel framework for the evaluation of communities as functional modules. Our proposed framework, CommWalker, takes communities as inputs and evaluates them in their local network environment by performing short random walks. We test CommWalker's ability to overcome annotation bias using input communities from four community detection methods on two protein interaction networks. We find that modules accepted by CommWalker are similarly co-expressed as those accepted by current methods. Crucially, CommWalker performs well not only in well-annotated regions, but also in regions otherwise obscured by poor annotation. CommWalker community prioritization both faithfully captures well-validated communities and identifies functional modules that may correspond to more novel biology. The CommWalker algorithm is freely available at opig.stats.ox.ac.uk/resources or as a docker image on the Docker Hub at hub.docker.com/r/lueckenmd/commwalker/. deane@stats.ox.ac.uk. Supplementary data are available at Bioinformatics online.
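
    The local-environment idea can be sketched as follows: score a community by how often short random walks started inside it end at nodes sharing an annotation with the walk's start node. This is an illustrative simplification, not CommWalker's actual scoring function.

    ```python
    import random

    def walk_annotation_score(G, community, annotations, n_walks=100, length=3):
        """Score a community by short random walks in its local network
        environment: the fraction of walks whose endpoint shares at least
        one functional term with the walk's start node.

        G           -- adjacency, e.g. a networkx Graph (G[node] -> neighbours)
        community   -- iterable of member nodes
        annotations -- dict: node -> set of functional terms
        """
        hits = trials = 0
        for start in community:
            for _ in range(n_walks):
                node = start
                for _ in range(length):
                    neighbours = list(G[node])
                    if not neighbours:
                        break
                    node = random.choice(neighbours)
                trials += 1
                if annotations.get(start, set()) & annotations.get(node, set()):
                    hits += 1
        return hits / trials if trials else 0.0
    ```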

  17. The validation of synthetic spectra used in the performance evaluation of radionuclide identifiers

    International Nuclear Information System (INIS)

    Flynn, A.; Boardman, D.; Reinhard, M.I.

    2013-01-01

    This work has evaluated synthetic gamma-ray spectra created by the RASE sampler using experimental data. The RASE sampler resamples experimental data to create large data libraries which are subsequently available for use in evaluation of radionuclide identification algorithms. A statistical evaluation of the synthetic energy bins has shown the variation to follow a Poisson distribution identical to experimental data. The minimum amount of statistics required in each base spectrum to ensure the subsequent use of the base spectrum in the generation of statistically robust synthetic data was determined. A requirement that the simulated acquisition time of the synthetic spectra was not more than 4% of the acquisition time of the base spectrum was also determined. Further validation of RASE was undertaken using two different radionuclide identification algorithms. - Highlights: • A validation of synthetic data created in order to evaluate radionuclide identification systems has been carried out. • Statistical analysis has shown that the data accurately represents experimental data. • A limit to the amount of data which could be created using this method was evaluated. • Analysis of the synthetic gamma spectra show identical results to analysis carried out with experimental data
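
    The resampling principle described above, Poisson-distributed channel counts scaled to a much shorter acquisition time, can be sketched in a few lines; the spectrum file name is hypothetical.

    ```python
    import numpy as np

    def synthetic_spectrum(base_counts, base_time, synth_time, rng=None):
        """Draw one synthetic spectrum from a high-statistics base spectrum:
        each channel is Poisson-sampled around the base count rate scaled to
        the synthetic acquisition time."""
        rng = rng or np.random.default_rng()
        return rng.poisson(base_counts * (synth_time / base_time))

    # A 10 s synthetic spectrum from a 1 h base measurement, comfortably
    # inside the ~4% acquisition-time limit reported above.
    base = np.loadtxt("cs137_base_spectrum.txt")   # hypothetical file
    synth = synthetic_spectrum(base, base_time=3600.0, synth_time=10.0)
    ```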

  18. PGSFR Core Thermal Design Procedure to Evaluate the Safety Margin

    Energy Technology Data Exchange (ETDEWEB)

    Choi, Sun Rock; Kim, Sang-Ji [Korea Atomic Energy Research Institute, Daejeon (Korea, Republic of)

    2014-10-15

    The Korea Atomic Energy Research Institute (KAERI) has performed an SFR design with the final goal of constructing a prototype plant by 2028. The main objective of the SFR prototype plant is to verify the TRU metal fuel performance, reactor operation, and transmutation ability of high-level wastes. The core thermal design must ensure safe fuel performance throughout plant operation. In contrast to the critical heat flux limit in typical light water reactors, nuclear fuel damage in SFR subassemblies arises from creep-induced failure. The creep limit is evaluated based on the maximum cladding temperature, power, neutron flux, and uncertainties in the design parameters, as shown in Fig. 1. In this work, core thermal design procedures are compared to verify the present PGSFR methodology against the nuclear plant design criteria/guidelines and previous SFR thermal design methods. The PGSFR core thermal design procedure is verified based on the nuclear plant design criteria/guidelines and previous methods in LWRs and SFRs. The present method aims to directly evaluate fuel cladding failure and to assure a greater safety margin. The 2σ uncertainty is similar to the 95% one-sided tolerance limit of 1.96σ used in LWRs. The HCFs (hot channel factors), ITDP, and MCM reveal similar uncertainty propagation for cladding midwall temperature under typical SFR conditions. The present HCFs are mainly adopted from the CRBR, except for fuel-related uncertainties such as an incorrect fuel distribution. Preliminary PGSFR-specific HCFs will be developed by the end of 2015.
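
    As a generic illustration of how hot channel factors are often combined, and not the report's exact rule, the semi-statistical sketch below multiplies the nominal temperature rise by the direct factors and combines the statistical 2σ factors by root-sum-square; every number is invented.

    ```python
    import math

    def peak_cladding_temp(inlet_temp, nominal_rise, direct_hcfs, stat_hcfs):
        """Semi-statistical hot-channel-factor combination: direct factors
        multiply the nominal temperature rise; statistical (2-sigma) factors
        combine by root-sum-square of their deviations from unity."""
        direct = math.prod(direct_hcfs)
        statistical = math.sqrt(sum((f - 1.0) ** 2 for f in stat_hcfs))
        return inlet_temp + nominal_rise * direct * (1.0 + statistical)

    # Hypothetical factors, for illustration only (temperatures in deg C).
    print(peak_cladding_temp(390.0, 150.0,
                             direct_hcfs=[1.02, 1.03],
                             stat_hcfs=[1.05, 1.04, 1.02]))
    ```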

  19. Clinical implications of nonspecific pulmonary nodules identified during the initial evaluation of patients with head and neck squamous cell carcinoma

    Energy Technology Data Exchange (ETDEWEB)

    Kwon, Minsu [Eulji University School of Medicine, Department of Otorhinolaryngology, Eulji Medical Center, Seoul (Korea, Republic of); Lee, Sang Hoon; Lee, Yoon Se; Roh, Jong-Lyel; Choi, Seung-Ho; Nam, Soon Yuhl; Kim, Sang Yoon [Asan Medical Center, University of Ulsan College of Medicine, Department of Otolaryngology, Songpa-gu, Seoul (Korea, Republic of); Lee, Choong Wook [Asan Medical Center, University of Ulsan College of Medicine, Department of Radiology, Seoul (Korea, Republic of)

    2017-09-15

    We aimed to identify the clinical implications of nonspecific pulmonary nodules (NPNs) detected in the initial staging workup for patients with head and neck squamous cell carcinoma (HNSCC). Medical records of patients who had been diagnosed and treated in our hospital were retrospectively analysed. After definitive treatment, changes of NPNs detected on initial evaluation were monitored via serial chest computed tomography. The associations between NPNs and the clinicopathological characteristics of primary HNSCC were evaluated. Survival analyses were performed according to the presence of NPNs. The study consisted of 158 (49.4%) patients without NPNs and 162 (50.6%) patients with NPNs. The cumulative incidence probabilities of pulmonary malignancy (PM) development at 2 years after treatment were 9.0% and 6.2% in NPN-negative and NPN-positive patients, respectively. Overall and PM-free survival rates were not significantly different according to NPN status. Cervical lymph node (LN) involvement and a platelet-lymphocyte ratio (PLR) ≥126 increased the risk of PMs (both P < 0.05). NPNs detected in the initial evaluation of patients with HNSCC did not predict the risk of pulmonary malignancies. Cervical LN involvement and PLR ≥126 may be independent prognostic factors affecting PM-free survival regardless of NPN status. (orig.)

  20. Evaluation of Haddam Neck (Connecticut Yankee) Nuclear Power Plant, environmental impact prediction, based on monitoring programs

    International Nuclear Information System (INIS)

    Gore, K.L.; Thomas, J.M.; Kannberg, L.D.; Mahaffey, J.A.; Waton, D.G.

    1976-12-01

    A study was undertaken by the U.S. Nuclear Regulatory Commission (NRC) to evaluate the nonradiological environmental data obtained from three nuclear power plants operating for a period of one year or longer. This document reports on the second of the three nuclear power plants to be evaluated in detail by Battelle, Pacific Northwest Laboratories. Haddam Neck (Connecticut Yankee) Nuclear Power Plant nonradiological monitoring data were assessed to determine their effectiveness in the measurement of environmental impacts. Efforts were made to determine if: (1) monitoring programs, as designed, can detect environmental impacts, (2) appropriate statistical analyses were performed and if they were sensitive enough to detect impacts, (3) predicted impacts could be verified by monitoring programs, and (4) monitoring programs satisfied the requirements of the Environmental Technical Specifications. Both preoperational and operational monitoring data were examined to test the usefulness of baseline information in evaluating impacts. This included an examination of the methods used to measure ecological, chemical, and physical parameters, and an assessment of sampling periodicity and sensitivity where appropriate data sets were available. From this type of analysis, deficiencies in both preoperational and operational monitoring programs may be identified, providing a basis for suggested improvements.

  1. Mechanisms of change in psychotherapy for depression: An empirical update and evaluation of research aimed at identifying psychological mediators.

    Science.gov (United States)

    Lemmens, Lotte H J M; Müller, Viola N L S; Arntz, Arnoud; Huibers, Marcus J H

    2016-12-01

    We present a systematic empirical update and critical evaluation of the current status of research aimed at identifying a variety of psychological mediators in various forms of psychotherapy for depression. We summarize study characteristics and results of 35 relevant studies, and discuss the extent to which these studies meet several important requirements for mechanism research. Our review indicates that in spite of increased attention to the topic, advances in theoretical consensus about necessities for mechanism research, and sophistication of study designs, research in this field is still heterogeneous and methodologically unsatisfactory. Probably the biggest challenge in the field is demonstrating the causal relation between change in the mediator and change in depressive symptoms. The field would benefit from a further refinement of research methods to identify processes of therapeutic change. Recommendations for future research are discussed. However, even in the most optimal research designs, explaining psychotherapeutic change remains a challenge. Psychotherapy is a multi-dimensional phenomenon that might work through the interplay of multiple mechanisms at several levels. As a result, it might be too complex to be explained in relatively simple causal models of psychological change. Copyright © 2016 Elsevier Ltd. All rights reserved.

  2. Quality of life evaluation of workers for diagnostic radiology services

    International Nuclear Information System (INIS)

    Fernandes, Ivani Martins

    2011-01-01

    The main objective of this study was to evaluate the quality of life (QOL) of diagnostic radiology services workers at a hospital in Sao Paulo city. It also aimed to draw a profile of these workers, identifying the variables that influence their quality of life. A descriptive exploratory study with qualitative and quantitative approaches was carried out. The data were collected using two self-administered questionnaires: the abbreviated instrument for the assessment of QOL, the World Health Organization Quality of Life Instrument bref (WHOQOL-bref), and a questionnaire covering sociodemographic variables, work conditions and the variables that express the lifestyle of individuals. The sample comprised 118 workers, among them physicians, technologists/technicians in radiology, nurses, technicians and assistants in nursing, and other health professionals. The data analysis included descriptive statistics, nonparametric tests and a linear regression model. The reliability of the instrument for the studied sample was verified by Cronbach's alpha coefficient (α). The WHOQOL-bref proved to be an adequate instrument, with a good level of internal consistency (α=0.884), and was easily and quickly administered for the evaluation of QOL. The study provided an overview of the perception of quality of life of the studied group. (author)

  3. Flaw evaluation of pressure vessel in pressurized water reactor

    International Nuclear Information System (INIS)

    Park, Ki Sung; Kim, Min Geol; Jeon, Chae Hong; Rhim, Soon Hyung; Kim, Seung Tae

    1999-01-01

    Flaw evaluation should be performed to determine whether a surface or subsurface flaw detected during in-service inspection is acceptable without any repair or replacement. In this paper, the evaluation methodology and procedure were established according to ASME Code Sec. XI and coded into an evaluation program. Using this program, a field engineer without deep knowledge of fracture mechanics can perform prompt and accurate flaw evaluation on site and decide whether a detected flaw is allowable. Analysis results were compared with those obtained from the Westinghouse programs KCAL and FCG. Both sets of results were in good agreement, and the accuracy of the program developed in this paper was verified.

  4. A 6-gene signature identifies four molecular subgroups of neuroblastoma

    Science.gov (United States)

    2011-01-01

    Background There are currently three postulated genomic subtypes of the childhood tumour neuroblastoma (NB): Type 1, Type 2A, and Type 2B. The most aggressive forms of NB are characterized by amplification of the oncogene MYCN (MNA) and low expression of the favourable marker NTRK1. Recently, mutations or high expression of the familial predisposition gene Anaplastic Lymphoma Kinase (ALK) were associated with unfavourable biology of sporadic NB. Also, various other genes have been linked to NB pathogenesis. Results The present study explores subgroup discrimination by gene expression profiling using three published microarray studies on NB (47 samples). Four distinct clusters were identified by Principal Components Analysis (PCA) in two separate data sets, which could be verified by an unsupervised hierarchical clustering in a third independent data set (101 NB samples) using a set of 74 discriminative genes. The expression signature of six NB-associated genes (ALK, BIRC5, CCND1, MYCN, NTRK1, and PHOX2B) significantly discriminated the four clusters (p < 0.05), and cluster membership was associated with adverse outcome (INSS stage 4 and/or dead of disease, p < 0.05, Fisher's exact test). Conclusions Based on expression profiling we have identified four molecular subgroups of neuroblastoma, which can be distinguished by a 6-gene signature. The fourth subgroup has not been described elsewhere, and efforts are currently being made to further investigate this group's specific characteristics. PMID:21492432
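
    The cluster-discovery step, PCA on a discriminative gene set followed by unsupervised grouping, can be sketched with standard tools; random numbers stand in for the expression matrix, and k-means stands in for the hierarchical clustering used in the study.

    ```python
    import numpy as np
    from sklearn.decomposition import PCA
    from sklearn.cluster import KMeans

    rng = np.random.default_rng(1)
    X = rng.normal(size=(101, 74))     # stand-in: 101 tumours x 74 genes

    scores = PCA(n_components=3).fit_transform(X)      # PCA score map
    labels = KMeans(n_clusters=4, n_init=10).fit_predict(scores)
    print(np.bincount(labels))                         # cluster sizes
    ```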

  5. Clinical evaluation of a mobile digital specimen radiography system for intraoperative specimen verification.

    Science.gov (United States)

    Wang, Yingbing; Ebuoma, Lilian; Saksena, Mansi; Liu, Bob; Specht, Michelle; Rafferty, Elizabeth

    2014-08-01

    Use of mobile digital specimen radiography systems expedites intraoperative verification of excised breast specimens. The purpose of this study was to evaluate the performance of such a system for verifying targets. A retrospective review included 100 consecutive pairs of breast specimen radiographs. Specimens were imaged in the operating room with a mobile digital specimen radiography system and then with a conventional digital mammography system in the radiology department. Two expert reviewers independently scored each image for image quality on a 3-point scale and confidence in target visualization on a 5-point scale. A target was considered confidently verified only if both reviewers declared the target to be confidently detected. The 100 specimens contained a total of 174 targets, including 85 clips (49%), 53 calcifications (30%), 35 masses (20%), and one architectural distortion (1%). Although a significantly higher percentage of mobile digital specimen radiographs were considered poor quality by at least one reviewer (25%) compared with conventional digital mammograms (1%), 169 targets (97%) were confidently verified with mobile specimen radiography; 172 targets (98%) were verified with conventional digital mammography. Three faint masses were not confidently verified with mobile specimen radiography, and conventional digital mammography was needed for confirmation. One faint mass and one architectural distortion were not confidently verified with either method. Mobile digital specimen radiography allows high diagnostic confidence for verification of target excision in breast specimens across target types, despite lower image quality. Substituting this modality for conventional digital mammography can eliminate delays associated with specimen transport, potentially decreasing surgical duration and increasing operating room throughput.

  6. Model Checking Artificial Intelligence Based Planners: Even the Best Laid Plans Must Be Verified

    Science.gov (United States)

    Smith, Margaret H.; Holzmann, Gerard J.; Cucullu, Gordon C., III; Smith, Benjamin D.

    2005-01-01

    Automated planning systems (APS) are gaining acceptance for use on NASA missions, as evidenced by APS flown on missions such as Orbiter and Deep Space 1, both of which were commanded by onboard planning systems. The planning system takes high-level goals and expands them onboard into a detailed sequence of actions that the spacecraft executes. The system must be verified to ensure that the automatically generated plans achieve the goals as expected and do not generate actions that would harm the spacecraft or mission. These systems are typically tested using empirical methods. Formal methods, such as model checking, offer exhaustive or measurable test coverage, which leads to much greater confidence in correctness. This paper describes a formal method based on the SPIN model checker. This method guarantees that possible plans meet certain desirable properties. We express the input model in Promela, the language of SPIN, and express the properties of desirable plans formally.

  7. Wastewater screening method for evaluating applicability of zero-valent iron to industrial wastewater

    International Nuclear Information System (INIS)

    Lee, J.W.; Cha, D.K.; Oh, Y.K.; Ko, K.B.; Jin, S.H.

    2010-01-01

    This study presents a screening protocol to evaluate the applicability of ZVI pretreatment to various industrial wastewaters whose major constituents are not identified. The screening protocol consisted of sequential analysis by UV-vis spectrophotometry, high-performance liquid chromatography (HPLC), and bioassay. The UV-vis and HPLC analyses indicated the potential reductive transformation of unknown constituents in the wastewater by the ZVI. The UV-vis and HPLC results were quantified using principal component analysis (PCA) and Euclidean distance (ED). The short-term bioassay was used to assess the increased biodegradability of wastewater constituents after ZVI treatment. The screening protocol was applied to seven different types of real industrial wastewaters. After one wastewater was identified as the best candidate for ZVI treatment, the benefit of ZVI pretreatment was verified through continuous operation of an integrated iron-sequencing batch reactor (SBR), resulting in increased organic removal efficiency compared with the control. Iron pretreatment was suggested as an economical option to modify some costly physico-chemical processes in an existing wastewater treatment facility. The screening protocol could be used as a robust strategy to estimate the applicability of ZVI pretreatment to a given wastewater of unknown composition.
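
    One plausible reading of the PCA/ED quantification, shown as a sketch rather than the authors' exact computation: project replicate spectra measured before and after ZVI treatment into a common principal-component space and report the distance between the group centroids.

    ```python
    import numpy as np
    from sklearn.decomposition import PCA

    def zvi_shift(spectra_before, spectra_after, n_components=2):
        """Project replicate UV-vis (or HPLC) traces of one wastewater,
        measured before and after ZVI treatment, into a common PC space
        and return the Euclidean distance between the group centroids.
        A large shift suggests reductive transformation of the (unknown)
        constituents."""
        X = np.vstack([spectra_before, spectra_after])
        scores = PCA(n_components=n_components).fit_transform(X)
        c_before = scores[: len(spectra_before)].mean(axis=0)
        c_after = scores[len(spectra_before):].mean(axis=0)
        return np.linalg.norm(c_before - c_after)
    ```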

  8. Evaluating financial education initiatives in South Africa: The importance of multiple evaluation approaches

    Directory of Open Access Journals (Sweden)

    Emily Massey

    2016-06-01

    Objectives: This study aims to show that, particularly in a South African context, where investment in financial education interventions is mandated by the Financial Sector Codes, impact should not be the only criterion assessed when evaluating financial education projects. Research method and design: This study was informed by a literature review, a synthesis of team experience on a range of financial education projects in South Africa and the development of case studies. Results: Describing the success or failure of a project needs to go beyond impact and explore factors such as project relevance, design and quality. In order to verify these other factors, different types of evaluations are necessary at the various stages of the project’s life-cycle. Conclusion: Expanding the learning objective beyond the exclusive identification of whether financial behaviour was achieved is particularly important where financial education projects, and the monitoring and evaluation thereof, are mandated. In the African context, where resources are scarce, money for monitoring and evaluation should be selectively channelled into determining project relevance, effectiveness, efficiency and only then impact.

  9. Preliminary results of an attempt to provide soil moisture datasets in order to verify numerical weather prediction models

    International Nuclear Information System (INIS)

    Cassardo, C.; Loglisci, N.

    2005-01-01

    In recent years, there has been significant growth in recognition of the importance of soil moisture in large-scale hydrology and climate modelling. Soil moisture is a lower boundary condition that governs the partitioning of energy into sensible and latent heat fluxes. Wrong estimates of soil moisture lead to wrong simulation of the surface layer evolution, and precipitation and cloud cover forecasts can consequently be affected. This is true for large-scale medium-range weather forecasts as well as for local-scale short-range weather forecasts, particularly in situations in which local convection is well developed. Unfortunately, despite the importance of this physical parameter, only a few soil moisture datasets exist, and they are sparse in time and space. Because of this scarcity of soil moisture observations, we developed an alternative method to provide soil moisture datasets in order to verify numerical weather prediction models. This paper presents the preliminary results of an attempt to verify soil moisture fields predicted by a mesoscale model. The data for the comparison were provided by simulations of the diagnostic land surface scheme LSPM (Land Surface Process Model), widely used at the Piedmont Regional Weather Service for agro-meteorological purposes. To this end, LSPM was initialized and driven by Synop observations, while the surface (vegetation and soil) parameter values were initialized from the ECOCLIMAP global dataset at 1 km2 resolution

  10. Standardized methods to verify absorbed dose in irradiated food for insect control. Proceedings of a final research co-ordination meeting

    International Nuclear Information System (INIS)

    2001-03-01

    Irradiation to control insect infestation of food is increasingly accepted and applied, especially as a phytosanitary treatment of food as an alternative to fumigation. However, unlike other processes for insect control, irradiation does not always result in immediate insect death. Thus, it is conceivable that fresh and dried fruits and tree nuts, which have been correctly irradiated to meet insect disinfestation/quarantine requirements, may still contain live insects at the time of importation. There is, however, a movement by plant quarantine authorities away from inspecting to ensure the absence of live insects in imported consignments towards examining through administrative procedures that a treatment required by law has been given. Nevertheless, there is a need to provide plant quarantine inspectors with a reliable objective method to verify that a minimum absorbed dose of radiation was given to supplement administrative procedures. Such an objective method is expected to bolster the confidence of the inspectors in clearing the consignment without delay and to facilitate trade in irradiated commodities. The Joint FAO/IAEA Division of Nuclear Techniques in Food and Agriculture initiated a co-ordinated research project (CRP) in 1994 to generate data on the verification of absorbed dose of irradiation in fresh, dried fruits and tree nuts for insect disinfestation/quarantine purposes. A standardized label dose indicator available commercially was used to verify the minimum/maximum absorbed dose of the irradiated commodities for these purposes as required by regulations in certain countries. It appears that such a label dose indicator with certain modifications could be made available to assist national authorities and the food industry to verify the absorbed dose of irradiation to facilitate trade in such irradiated commodities. This TECDOC reports on the accomplishments of this co-ordinated research project and includes the papers presented by the participants

  11. EMI Evaluation on Wireless Computer Devices in Nuclear Power Plants

    International Nuclear Information System (INIS)

    Lee, Jae Ki; JI Yeong Hwa; Sung, Chan Ho

    2011-01-01

    Wireless computer devices, for example mice and keyboards, are widely used in various industries. However, I and C (instrumentation and control) equipment in nuclear power plants is very susceptible to EMI (electromagnetic interference), and there are concerns regarding EMI-induced transients caused by wireless computer devices, which emit electromagnetic waves for communication. In this paper, industrial practices and nuclear-related international standards are investigated to establish the requirements for wireless devices. In addition, the EMI intensity of some commercially available wireless devices is measured and evaluated to verify their compatibility in terms of EMI. Finally, we suggest an appropriate method of using wireless computer devices in nuclear power plant control rooms to improve the working environment of operators

  12. Gene-environment interaction involving recently identified colorectal cancer susceptibility loci

    Science.gov (United States)

    Kantor, Elizabeth D.; Hutter, Carolyn M.; Minnier, Jessica; Berndt, Sonja I.; Brenner, Hermann; Caan, Bette J.; Campbell, Peter T.; Carlson, Christopher S.; Casey, Graham; Chan, Andrew T.; Chang-Claude, Jenny; Chanock, Stephen J.; Cotterchio, Michelle; Du, Mengmeng; Duggan, David; Fuchs, Charles S.; Giovannucci, Edward L.; Gong, Jian; Harrison, Tabitha A.; Hayes, Richard B.; Henderson, Brian E.; Hoffmeister, Michael; Hopper, John L.; Jenkins, Mark A.; Jiao, Shuo; Kolonel, Laurence N.; Le Marchand, Loic; Lemire, Mathieu; Ma, Jing; Newcomb, Polly A.; Ochs-Balcom, Heather M.; Pflugeisen, Bethann M.; Potter, John D.; Rudolph, Anja; Schoen, Robert E.; Seminara, Daniela; Slattery, Martha L.; Stelling, Deanna L.; Thomas, Fridtjof; Thornquist, Mark; Ulrich, Cornelia M.; Warnick, Greg S.; Zanke, Brent W.; Peters, Ulrike; Hsu, Li; White, Emily

    2014-01-01

    BACKGROUND Genome-wide association studies have identified several single nucleotide polymorphisms (SNPs) that are associated with risk of colorectal cancer (CRC). Prior research has evaluated the presence of gene-environment interaction involving the first 10 identified susceptibility loci, but little work has been conducted on interaction involving SNPs at recently identified susceptibility loci, including: rs10911251, rs6691170, rs6687758, rs11903757, rs10936599, rs647161, rs1321311, rs719725, rs1665650, rs3824999, rs7136702, rs11169552, rs59336, rs3217810, rs4925386, and rs2423279. METHODS Data on 9160 cases and 9280 controls from the Genetics and Epidemiology of Colorectal Cancer Consortium (GECCO) and Colon Cancer Family Registry (CCFR) were used to evaluate the presence of interaction involving the above-listed SNPs and sex, body mass index (BMI), alcohol consumption, smoking, aspirin use, post-menopausal hormone (PMH) use, as well as intake of dietary calcium, dietary fiber, dietary folate, red meat, processed meat, fruit, and vegetables. Interaction was evaluated using a fixed-effects meta-analysis of an efficient Empirical Bayes estimator, and permutation was used to account for multiple comparisons. RESULTS None of the permutation-adjusted p-values reached statistical significance. CONCLUSIONS The associations between recently identified genetic susceptibility loci and CRC are not strongly modified by sex, BMI, alcohol, smoking, aspirin, PMH use, and various dietary factors. IMPACT Results suggest no evidence of strong gene-environment interactions involving the recently identified 16 susceptibility loci for CRC taken one at a time. PMID:24994789
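
    Permutation adjustment for a family of interaction tests can be sketched as a Westfall-Young-style min-p procedure. The simplified version below permutes the exposure vector, which only approximates a proper interaction permutation scheme, and all variable names are hypothetical.

    ```python
    import numpy as np
    import statsmodels.api as sm

    def minp_adjusted(y, G, E, covars, n_perm=1000, rng=None):
        """Family-wise permutation adjustment for SNP x environment
        interaction p-values (one logistic model per SNP).
        y: case/control (n,), G: genotypes (n, n_snps), E: exposure (n,),
        covars: covariates (n, k). Returns observed and adjusted p-values."""
        rng = rng or np.random.default_rng(0)

        def interaction_p(e):
            ps = []
            for g in G.T:
                X = sm.add_constant(np.column_stack([g, e, g * e, covars]))
                fit = sm.Logit(y, X).fit(disp=0)
                ps.append(fit.pvalues[3])          # the g*e interaction term
            return np.asarray(ps)

        obs = interaction_p(E)
        min_null = np.array([interaction_p(rng.permutation(E)).min()
                             for _ in range(n_perm)])
        adj = np.array([(min_null <= p).mean() for p in obs])
        return obs, adj
    ```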

  13. Data-Wave-Based Features Extraction and Its Application in Symbol Identifier Recognition and Positioning Suitable for Multi-Robot Systems

    Directory of Open Access Journals (Sweden)

    Xilong Liu

    2012-12-01

    Full Text Available In this paper, feature extraction based on data-wave is proposed. The concept of data-wave is introduced to describe the rising and falling trends of the data over the long term, which are detected based on ripple and wave filters. Supported by data-wave, a novel symbol identifier with significant structural features is designed, and these features are extracted by constructing pixel chains. On this basis, the corresponding recognition and positioning approach is presented. The effectiveness of the proposed approach is verified by experiments.

  14. Evaluation of neural networks to identify types of activity using accelerometers

    NARCIS (Netherlands)

    Vries, S.I. de; Garre, F.G.; Engbers, L.H.; Hildebrandt, V.H.; Buuren, S. van

    2011-01-01

    Purpose: To develop and evaluate two artificial neural network (ANN) models based on single-sensor accelerometer data and an ANN model based on the data of two accelerometers for the identification of types of physical activity in adults. Methods: Forty-nine subjects (21 men and 28 women; age range
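
    An ANN of the kind evaluated here can be sketched as a small multilayer perceptron over windowed accelerometer features; synthetic features stand in for the real accelerometer data.

    ```python
    from sklearn.neural_network import MLPClassifier
    from sklearn.model_selection import train_test_split
    from sklearn.datasets import make_classification

    # Stand-in for windowed accelerometer features (e.g. mean, variance,
    # axis correlations per window) labelled with one of four activities.
    X, y = make_classification(n_samples=2000, n_features=12,
                               n_informative=8, n_classes=4, random_state=0)
    Xtr, Xte, ytr, yte = train_test_split(X, y, random_state=0)

    ann = MLPClassifier(hidden_layer_sizes=(20,), max_iter=1000,
                        random_state=0).fit(Xtr, ytr)
    print(f"held-out accuracy: {ann.score(Xte, yte):.2f}")
    ```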

  15. Identifying Armed Respondents to Domestic Violence Restraining Orders and Recovering Their Firearms: Process Evaluation of an Initiative in California

    Science.gov (United States)

    Frattaroli, Shannon; Claire, Barbara E.; Vittes, Katherine A.; Webster, Daniel W.

    2014-01-01

    Objectives. We evaluated a law enforcement initiative to screen respondents to domestic violence restraining orders for firearm ownership or possession and recover their firearms. Methods. The initiative was implemented in San Mateo and Butte counties in California from 2007 through 2010. We used descriptive methods to evaluate the screening process and recovery effort in each county, relying on records for individual cases. Results. Screening relied on an archive of firearm transactions, court records, and petitioner interviews; no single source was adequate. Screening linked 525 respondents (17.7%) in San Mateo County to firearms; 405 firearms were recovered from 119 (22.7%) of them. In Butte County, 88 (31.1%) respondents were linked to firearms; 260 firearms were recovered from 45 (51.1%) of them. Nonrecovery occurred most often when orders were never served or respondents denied having firearms. There were no reports of serious violence or injury. Conclusions. Recovering firearms from persons subject to domestic violence restraining orders is possible. We have identified design and implementation changes that may improve the screening process and the yield from recovery efforts. Larger implementation trials are needed. PMID:24328660

  16. Identifying Key Attributes for Protein Beverages.

    Science.gov (United States)

    Oltman, A E; Lopetcharat, K; Bastian, E; Drake, M A

    2015-06-01

    This study identified key attributes of protein beverages and evaluated effects of priming on liking of protein beverages. An adaptive choice-based conjoint study was conducted along with Kano analysis to gain insight into protein beverage consumers (n = 432). Attributes evaluated included label claim, protein type, amount of protein, carbohydrates, sweeteners, and metabolic benefits. Utility scores for levels and importance scores for attributes were determined. Subsequently, two pairs of clear acidic whey protein beverages were manufactured that differed by age of protein source or the amount of whey protein per serving. Beverages were evaluated by 151 consumers on two occasions with or without priming statements. One priming statement declared "great flavor," the other priming statement declared 20 g protein per serving. A two-way analysis of variance was applied to discern the role of each priming statement. The most important attribute for protein beverages was sweetener type, followed by amount of protein, followed by type of protein, followed by label claim. Beverages with whey protein, naturally sweetened, reduced sugar and ≥15 g protein per serving were most desired. Three consumer clusters were identified, differentiated by their preferences for protein type, sweetener and amount of protein. Priming statements positively impacted concept liking (P < 0.05). Consistent with trained panel profiles of increased cardboard flavor with higher protein content, consumers liked beverages with 10 g protein more than beverages with 20 g protein (6.8 compared with 5.7, P < 0.05), suggesting that higher protein content can decrease consumer appeal. © 2015 Institute of Food Technologists®

  17. The anterior choroidal artery syndrome. Pt. 2. CT and/or MR in angiographically verified cases

    International Nuclear Information System (INIS)

    Takahashi, S.; Ishii, K.; Matsumoto, K.; Higano, S.; Ishibashi, T.; Suzuki, M.; Sakamoto, K.

    1994-01-01

    We reviewed 12 cases of infarcts in the territory of the anterior choroidal artery (AChA) on CT and/or MRI. In each case vascular occlusion in the region was verified angiographically. Although the extent of the lesion on CT/MR images was variable, all were located on the axial images within an arcuate zone between the striatum anterolaterally and the thalamus posteromedially. The distribution of the lesions on multiplanar MRI conformed well to the territory of the AChA demonstrated microangiographically. The variability of the extent of the infarcts may be explained by variations in the degree of occlusive changes in the AChA or the development of collateral circulation through anastomoses between the AChA and the posterior communicating and posterior cerebral arteries. The extent of the lesion appeared to be closely related to the degree of neurological deficit. (orig.)

  18. Reference Material Properties and Standard Problems to Verify the Fuel Performance Models Ver 1.0

    International Nuclear Information System (INIS)

    Yang, Yong Sik; Kim, Jae Yong; Koo, Yang Hyun

    2010-12-01

    All fuel performance models must be validated by in-pile and out-of-pile tests. However, model validation requires much effort and time to confirm a model's exactness. In many fields, new performance models and codes are confirmed by a code-to-code benchmarking process using simplified standard problem analysis. At present, the development of DUOS, a steady-state fuel performance analysis code for dual-cooled annular fuel, is in progress, and a new FEM module has been developed to analyze fuel performance during transient periods. In addition, a verification process is planned to examine the correctness of the new models and module by comparison with commercial finite element analysis codes such as ADINA, ABAQUS, and ANSYS. This report contains the results of the unification of material properties and the establishment of standard problems to verify the newly developed models against commercial FEM codes.

  19. FEMO, A FLOW AND ENRICHMENT MONITOR FOR VERIFYING COMPLIANCE WITH INTERNATIONAL SAFEGUARDS REQUIREMENTS AT A GAS CENTRIFUGE ENRICHMENT FACILITY

    International Nuclear Information System (INIS)

    Gunning, John E.; Laughter, Mark D.; March-Leuba, Jose A.

    2008-01-01

    A number of countries have received construction licenses or are contemplating the construction of large-capacity gas centrifuge enrichment plants (GCEPs). The capability to independently verify nuclear material flows is a key component of international safeguards approaches, and the IAEA does not currently have an approved method to continuously monitor the mass flow of 235U in uranium hexafluoride (UF6) gas streams. Oak Ridge National Laboratory is investigating the development of a flow and enrichment monitor, or FEMO, based on an existing blend-down monitoring system (BDMS). The BDMS was designed to continuously monitor both 235U mass flow and enrichment of UF6 streams at low pressures similar to those that exist at GCEPs. BDMSs have been installed at three sites; the first unit has operated successfully in an unattended environment for approximately 10 years. To be acceptable to GCEP operators, it is essential that the instrument be installed and maintained without interrupting operations. A means to continuously verify flow, as proposed for the FEMO, will likely be needed to monitor safeguards at large-capacity plants. This will enable the safeguards effectiveness that currently exists at smaller plants to be maintained at the larger facilities and also has the potential to reduce labor costs associated with inspections at current and future plants. This paper describes the FEMO design requirements, operating capabilities, and development work required before field demonstration.

  20. THE EVALUATION AND IMPROVEMENT OF IT GOVERNANCE

    Directory of Open Access Journals (Sweden)

    Patricia Pérez Lorences

    2013-08-01

    Full Text Available The present article proposes a general procedure to evaluate and improve Information Technology (IT) governance in an organization, considering business–IT alignment and risk management. The procedure integrates management tools such as business process management, risk management, strategic alignment and the balanced scorecard. Additionally, to assess the IT governance level, we propose an indicator based on process maturity. The concepts and ideas presented here have been applied in four case studies, verifying their implementation feasibility. The results indicate a low level of IT governance and the existence of several problems, primarily in the Plan and Organize and Monitor and Evaluate domains.

  1. Reaching young women who sell sex: Methods and results of social mapping to describe and identify young women for DREAMS impact evaluation in Zimbabwe.

    Science.gov (United States)

    Chiyaka, Tarisai; Mushati, Phillis; Hensen, Bernadette; Chabata, Sungai; Hargreaves, James R; Floyd, Sian; Birdthistle, Isolde J; Cowan, Frances M; Busza, Joanna R

    2018-01-01

    Young women (aged 15-24) who exchange sex for money or other support are among the highest risk groups for HIV acquisition, particularly in high prevalence settings. To prepare for introduction and evaluation of the DREAMS programme in Zimbabwe, which provides biomedical and social interventions to reduce adolescent girls' and young women's HIV vulnerability, we conducted a rapid needs assessment in 6 towns using a "social mapping" approach. In each site, we talked to adult sex workers and other key informants to identify locations where young women sell sex, followed by direct observation, group discussions and interviews. We collected data on socio-demographic characteristics of young women who sell sex, the structure and organisation of their sexual exchanges, interactions with each other and adult sex workers, and engagement with health services. Over a two-week period, we developed a "social map" for each study site, identifying similarities and differences across contexts and their implications for programming and research. Similarities include the concentration of younger women in street-based venues in town centres, their conflict with older sex workers due to competition for clients and acceptance of lower payments, and reluctance to attend existing services. Key differences were found in the 4 university towns included in our sample, where female students participate in diverse forms of sexual exchange but do not identify themselves as selling sex. In smaller towns where illegal gold panning or trucking routes were found, young women migrated in from surrounding rural areas specifically to sell sex. Young women who sell sex are different from each other, and do not work with or attend the same services as adult sex workers. Our findings are being used to inform appropriate intervention activities targeting these vulnerable young women, and to identify effective strategies for recruiting them into the DREAMS process and impact evaluations.

  2. Reaching young women who sell sex: Methods and results of social mapping to describe and identify young women for DREAMS impact evaluation in Zimbabwe.

    Directory of Open Access Journals (Sweden)

    Tarisai Chiyaka

    Full Text Available Young women (aged 15-24) who exchange sex for money or other support are among the highest risk groups for HIV acquisition, particularly in high prevalence settings. To prepare for introduction and evaluation of the DREAMS programme in Zimbabwe, which provides biomedical and social interventions to reduce adolescent girls' and young women's HIV vulnerability, we conducted a rapid needs assessment in 6 towns using a "social mapping" approach. In each site, we talked to adult sex workers and other key informants to identify locations where young women sell sex, followed by direct observation, group discussions and interviews. We collected data on socio-demographic characteristics of young women who sell sex, the structure and organisation of their sexual exchanges, interactions with each other and adult sex workers, and engagement with health services. Over a two-week period, we developed a "social map" for each study site, identifying similarities and differences across contexts and their implications for programming and research. Similarities include the concentration of younger women in street-based venues in town centres, their conflict with older sex workers due to competition for clients and acceptance of lower payments, and reluctance to attend existing services. Key differences were found in the 4 university towns included in our sample, where female students participate in diverse forms of sexual exchange but do not identify themselves as selling sex. In smaller towns where illegal gold panning or trucking routes were found, young women migrated in from surrounding rural areas specifically to sell sex. Young women who sell sex are different from each other, and do not work with or attend the same services as adult sex workers. Our findings are being used to inform appropriate intervention activities targeting these vulnerable young women, and to identify effective strategies for recruiting them into the DREAMS process and impact

  3. Reaching young women who sell sex: Methods and results of social mapping to describe and identify young women for DREAMS impact evaluation in Zimbabwe

    Science.gov (United States)

    Chiyaka, Tarisai; Mushati, Phillis; Hensen, Bernadette; Chabata, Sungai; Hargreaves, James R.; Floyd, Sian; Birdthistle, Isolde J.; Cowan, Frances M.; Busza, Joanna R.

    2018-01-01

    Young women (aged 15–24) who exchange sex for money or other support are among the highest risk groups for HIV acquisition, particularly in high prevalence settings. To prepare for introduction and evaluation of the DREAMS programme in Zimbabwe, which provides biomedical and social interventions to reduce adolescent girls’ and young women’s HIV vulnerability, we conducted a rapid needs assessment in 6 towns using a “social mapping” approach. In each site, we talked to adult sex workers and other key informants to identify locations where young women sell sex, followed by direct observation, group discussions and interviews. We collected data on socio-demographic characteristics of young women who sell sex, the structure and organisation of their sexual exchanges, interactions with each other and adult sex workers, and engagement with health services. Over a two-week period, we developed a “social map” for each study site, identifying similarities and differences across contexts and their implications for programming and research. Similarities include the concentration of younger women in street-based venues in town centres, their conflict with older sex workers due to competition for clients and acceptance of lower payments, and reluctance to attend existing services. Key differences were found in the 4 university towns included in our sample, where female students participate in diverse forms of sexual exchange but do not identify themselves as selling sex. In smaller towns where illegal gold panning or trucking routes were found, young women migrated in from surrounding rural areas specifically to sell sex. Young women who sell sex are different from each other, and do not work with or attend the same services as adult sex workers. Our findings are being used to inform appropriate intervention activities targeting these vulnerable young women, and to identify effective strategies for recruiting them into the DREAMS process and impact evaluations.

  4. [Key effect genes responding to nerve injury identified by gene ontology and computer pattern recognition].

    Science.gov (United States)

    Pan, Qian; Peng, Jin; Zhou, Xue; Yang, Hao; Zhang, Wei

    2012-07-01

    In order to screen out important genes from the large datasets produced by gene microarrays after nerve injury, we combined the gene ontology (GO) method with computer pattern recognition technology to find key genes responding to nerve injury, and then verified one of the screened-out genes. Data mining and gene ontology analysis of the gene chip dataset GSE26350 were carried out with MATLAB software. Cd44 was selected from the screened-out key genes by comparing the genes' GO terms and their positions on the principal-component score map. Functional interference was employed to disturb the normal binding of Cd44 and one of its ligands, chondroitin sulfate C (CSC), in order to observe neurite extension. Gene ontology analysis showed that the top-ranked genes on the score map (marked by red *) were mainly distributed in molecular-function GO terms such as molecular transducer activity, receptor activity and protein binding. Cd44, one of six effector protein genes, attracted our attention because of its functional diversity. After different reagents were added to the medium to interfere with the normal binding of CSC and Cd44, the inhibition of neurite extension by CSC was relieved to varying degrees. CSC can inhibit neurite extension by binding Cd44 on the neuron membrane. This verifies that important genes in given physiological processes can be identified by gene ontology analysis of gene chip data.
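
    A minimal Python sketch of the score-map screening idea described above (the authors worked in MATLAB): project the expression matrix onto its first two principal components and flag the genes lying farthest from the origin. The matrix and the cut-off of 20 genes are invented for illustration:

        import numpy as np
        from sklearn.decomposition import PCA

        rng = np.random.default_rng(0)
        expr = rng.normal(size=(500, 12))       # hypothetical genes x arrays matrix
        scores = PCA(n_components=2).fit_transform(expr)

        # Genes far from the origin of the PC1/PC2 score map are candidates
        # for further GO-based screening.
        dist = np.linalg.norm(scores, axis=1)
        candidates = np.argsort(dist)[::-1][:20]
        print("candidate gene indices:", candidates)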

  5. Evaluation of the effectiveness of the sealed double-ringed infiltrometers and the effects of changes in atmospheric pressure on hydraulic conductivity

    International Nuclear Information System (INIS)

    McMullin, S.R.

    1994-01-01

    The Savannah River Site (SRS) is currently evaluating some 40 hazardous and radioactive-waste sites for remediation. A remedial alternative under consideration is the closing of a waste site with a RCRA-style closure cap. The closure cap is a moisture barrier designed to inhibit the free flow of water downward into the buried wastes. When a remedial design is prepared, it is often necessary to test the cap materials to verify compliance with this recommended limit. Among the EPA-recommended test instruments is the sealed double-ring infiltrometer (SDRI). During recent testing at the Savannah River Site (SRS), six SDRI were installed and tested on a single kaolin clay cap. The purpose of this testing was to obtain a measure of the distribution of hydraulic conductivity across a model kaolin clay cap. The test results provide an evaluation of instrument performance and a measure of the repeatability of results. In addition, the testing identified variations in the unsaturated hydraulic conductivity. This paper presents an overview of the SDRI, the testing program at SRS, and an evaluation of the observations and test results

  6. Green Degree Comprehensive Evaluation of Elevator Based on Fuzzy Analytic Hierarchy Process

    Directory of Open Access Journals (Sweden)

    Lizhen

    2015-01-01

    Full Text Available The green design of an elevator involves many factors, both qualitative and quantitative. In view of the fuzziness of the evaluation index information, the fuzzy analytic hierarchy process and a fuzzy comprehensive evaluation model are combined to evaluate the green degree of an elevator. In this method, the fuzzy analytic hierarchy process is used to calculate the weights of the indexes at each level. The feasibility of the method is verified by taking the green degree evaluation of an elevator system as an example.
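
    A minimal Python sketch of the fuzzy comprehensive evaluation step described above: index weights (as might come from fuzzy AHP) are combined with a fuzzy membership matrix to grade the design. All index names, weights and membership values are invented:

        import numpy as np

        # Hypothetical fuzzy-AHP weights for three green-design indexes
        # (e.g. energy use, materials, noise).
        W = np.array([0.5, 0.3, 0.2])

        # Membership matrix R: each row maps one index onto the fuzzy grades
        # (excellent, good, fair, poor); all values invented.
        R = np.array([[0.4, 0.3, 0.2, 0.1],
                      [0.2, 0.5, 0.2, 0.1],
                      [0.1, 0.4, 0.3, 0.2]])

        B = W @ R                                   # comprehensive evaluation vector
        grades = ["excellent", "good", "fair", "poor"]
        print(dict(zip(grades, np.round(B, 3))))
        print("green degree grade:", grades[int(np.argmax(B))])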

  7. Multiple parameters anomalies for verifying the geosystem spheres coupling effect: a case study of the 2010 Ms7.1 Yushu earthquake in China

    Directory of Open Access Journals (Sweden)

    Shuo Zheng

    2014-08-01

    Full Text Available In research on earthquake anomaly recognition, the coupling effect of multiple geosystem spheres offers a reasonable interpretation of the correlation between the various anomalous signals observed before strong earthquakes. In particular, the Lithosphere–Atmosphere–Ionosphere (LAI) coupling model has been supported by experimental, thermal and electromagnetic data. However, quasi-synchronous anomalies of multiple parameters, including thermal, radon and electromagnetic data, have not been reported for a single event as verification of the geosystem spheres coupling effect. In this paper, we first summarize the reported studies on the power spectrum density (PSD) in the ELF/VLF band and the radon data recorded at Guza seismic station. Then, historical surface latent heat flux (SLHF) data from the NCEP/NCAR Reanalysis Project were employed to investigate anomalous changes in the month before the April 14, 2010, Ms7.1 Yushu earthquake, one of the typical intra-continental earthquakes of the Tibetan Plateau. The results of the spatial and temporal analysis revealed that the anomalous fields of the PSD and SLHF data were located close to the epicenter and to the ends of some active faults in the Bayan Har Block, and that all anomalous dates converged between April 8 and 11 (6 to 3 days before the Yushu earthquake). Therefore, we suggest that the anomalies of multiple parameters before the main shock are related to the Yushu earthquake. This paper offers an ideal case study for verifying the geosystem spheres coupling effect in a single event.
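
    A hedged Python sketch of the kind of single-parameter anomaly screening implied above: flag days on which a value exceeds its historical mean by three standard deviations. The SLHF series and the planted spike are invented:

        import numpy as np

        rng = np.random.default_rng(1)
        slhf = rng.normal(80.0, 10.0, size=365)    # invented daily SLHF series (W/m2)
        slhf[250] = 140.0                          # planted pre-seismic-style spike

        # Flag days exceeding the historical baseline mean by 3 standard deviations.
        base = slhf[:200]
        anomalous_days = np.where(slhf > base.mean() + 3 * base.std())[0]
        print("anomalous days:", anomalous_days)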

  8. Actuation and system design and evaluation OMS engine shutoff valve, Volume 1. [space shuttles

    Science.gov (United States)

    Dunn, V. B.

    1975-01-01

    A technology program was conducted to identify and verify the optimum valve and actuation system concept for the Space Shuttle Orbit Maneuvering System engine. Of major importance to the valve and actuation system selection was the ten-year, 100-mission, 10,000-cycle life requirement, to be met while maintaining high reliability, low leakage, and low weight. Valve and actuation system concepts were comparatively evaluated against past valve failure reports and against potential failure modes arising from the shuttle mission profile to aid in the selection of the optimum concept for design, manufacture and verification testing. Two valve concepts were considered during the preliminary design stage, i.e., the moving seat and the lifting ball. Two actuation systems were manufactured and tested. Test results demonstrate the viability of the lifting ball concept as well as the applicability of an ac motor actuation system to best meet the requirements of the shuttle mission.

  9. Establishment of a Quantitative Medical Technology Evaluation System and Indicators within Medical Institutions

    Directory of Open Access Journals (Sweden)

    Suo-Wei Wu

    2018-01-01

    Conclusions: A two-round questionnaire survey of experts and a statistical analysis were performed, and the credibility of the results was verified through a consistency evaluation test; on this basis the study established a quantitative medical technology evaluation system model and assessment indicators for use within medical institutions, based on the Delphi method and the analytic hierarchy process. Moreover, further verifications, adjustments, and optimizations of the system and indicators will be performed in follow-up studies.
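
    A minimal Python sketch of the consistency evaluation test commonly used with the analytic hierarchy process (Saaty's consistency ratio); the pairwise comparison matrix is invented:

        import numpy as np

        # Invented 3x3 pairwise comparison matrix from one expert round.
        A = np.array([[1.0, 3.0, 5.0],
                      [1/3, 1.0, 2.0],
                      [1/5, 1/2, 1.0]])

        n = A.shape[0]
        lam_max = max(np.linalg.eigvals(A).real)   # principal eigenvalue
        CI = (lam_max - n) / (n - 1)               # consistency index
        RI = 0.58                                  # Saaty's random index for n = 3
        CR = CI / RI                               # consistency ratio
        print(f"CR = {CR:.3f}", "(acceptable)" if CR < 0.1 else "(revise judgments)")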

  10. Evaluation of the DNA barcodes in Dendrobium (Orchidaceae) from mainland Asia.

    Science.gov (United States)

    Xu, Songzhi; Li, Dezhu; Li, Jianwu; Xiang, Xiaoguo; Jin, Weitao; Huang, Weichang; Jin, Xiaohua; Huang, Luqi

    2015-01-01

    DNA barcoding has been proposed as one of the most promising tools for accurate and rapid identification of taxa. However, few publications have evaluated the efficiency of DNA barcoding for the large genera of flowering plants. Dendrobium, one of the largest genera of flowering plants, contains many species that are important in horticulture, medicine and biodiversity conservation. Moreover, Dendrobium is a notoriously difficult group to identify. DNA barcoding was expected to be a supplementary means for species identification, conservation and future studies in Dendrobium. We assessed the power of 11 candidate barcodes on the basis of 1,698 accessions of 184 Dendrobium species obtained primarily from mainland Asia. Our results indicated that five single barcodes, i.e., ITS, ITS2, matK, rbcL and trnH-psbA, can be easily amplified and sequenced with the currently established primers. Four barcodes, ITS, ITS2, ITS+matK, and ITS2+matK, have distinct barcoding gaps. ITS+matK was the optimal barcode based on all evaluation methods. Furthermore, the efficiency of ITS+matK was verified in four other large genera, including Ficus, Lysimachia, Paphiopedilum, and Pedicularis, in this study. Therefore, we tentatively recommend the combination ITS+matK as a core DNA barcode for large flowering plant genera.
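
    A minimal Python sketch of the barcoding-gap criterion mentioned above: a barcode shows a distinct gap when the largest intraspecific distance falls below the smallest interspecific distance. All distances are invented:

        import numpy as np

        # Invented pairwise genetic distances for one candidate barcode.
        intra = np.array([0.002, 0.004, 0.006, 0.005])   # within species
        inter = np.array([0.031, 0.045, 0.052, 0.038])   # between species

        gap = inter.min() - intra.max()
        print(f"barcoding gap: {gap:.3f}",
              "(distinct gap)" if gap > 0 else "(distributions overlap)")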

  11. Identifying Adverse Drug Events by Relational Learning.

    Science.gov (United States)

    Page, David; Costa, Vítor Santos; Natarajan, Sriraam; Barnard, Aubrey; Peissig, Peggy; Caldwell, Michael

    2012-07-01

    The pharmaceutical industry, consumer protection groups, users of medications and government oversight agencies are all strongly interested in identifying adverse reactions to drugs. While a clinical trial of a drug may use only a thousand patients, once a drug is released on the market it may be taken by millions of patients. As a result, in many cases adverse drug events (ADEs) are observed in the broader population that were not identified during clinical trials. Therefore, there is a need for continued, post-marketing surveillance of drugs to identify previously-unanticipated ADEs. This paper casts this problem as a reverse machine learning task, related to relational subgroup discovery, and provides an initial evaluation of this approach based on experiments with an actual EMR/EHR and known adverse drug events.

  12. Capital Cost Optimization for Prefabrication: A Factor Analysis Evaluation Model

    Directory of Open Access Journals (Sweden)

    Hong Xue

    2018-01-01

    Full Text Available High capital cost is a significant hindrance to the promotion of prefabrication. In order to optimize cost management and reduce capital cost, this study aims to explore the latent factors and a factor analysis evaluation model. Semi-structured interviews were conducted to explore potential variables, and a questionnaire survey was then employed to collect professionals’ views on their effects. After data collection, exploratory factor analysis was adopted to explore the latent factors. Seven latent factors were identified, including “Management Index”, “Construction Dissipation Index”, “Productivity Index”, “Design Efficiency Index”, “Transport Dissipation Index”, “Material Increment Index” and “Depreciation Amortization Index”. With these latent factors, a factor analysis evaluation model (FAEM), divided into a factor analysis model (FAM) and a comprehensive evaluation model (CEM), was established. The FAM was used to explore the effect of the observed variables on the high capital cost of prefabrication, while the CEM was used to evaluate the comprehensive cost management level of prefabrication projects. Case studies were conducted to verify the models. The results revealed that collaborative management had a positive effect on the capital cost of prefabrication. Material increment costs and labor costs had significant impacts on production cost. This study demonstrated the potential of on-site management and standardization design to reduce capital cost. Hence, collaborative management is necessary for the cost management of prefabrication. Innovation and detailed design are needed to improve cost performance. New forms of precast component factories can be explored to reduce transportation cost. Meanwhile, targeted strategies can be adopted for different prefabrication projects. The findings optimize capital cost and improve cost performance by providing an evaluation and optimization model, which helps managers to
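
    A hedged Python sketch of the exploratory-factor-analysis step described above, using scikit-learn's FactorAnalysis (0.24+ for varimax rotation) as a stand-in for the authors' statistical package. The response matrix is random, so it only demonstrates the mechanics; on real questionnaire data the strongly loading variables define the latent cost indexes:

        import numpy as np
        from sklearn.decomposition import FactorAnalysis

        rng = np.random.default_rng(2)
        X = rng.normal(size=(120, 10))   # hypothetical 120 questionnaires x 10 items

        fa = FactorAnalysis(n_components=7, rotation="varimax").fit(X)
        loadings = fa.components_.T      # items x latent factors
        for j in range(loadings.shape[1]):
            strong = np.where(np.abs(loadings[:, j]) > 0.4)[0]
            print(f"factor {j}: strongly loading items {strong}")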

  13. Comparison of two heuristic evaluation methods for evaluating the usability of health information systems.

    Science.gov (United States)

    Khajouei, Reza; Hajesmaeel Gohari, Sadrieh; Mirzaee, Moghaddameh

    2018-04-01

    In addition to following the usual Heuristic Evaluation (HE) method, the usability of health information systems can also be evaluated using a checklist. The objective of this study is to compare the performance of these two methods in identifying usability problems of health information systems. Eight evaluators independently evaluated different parts of a Medical Records Information System using two methods of HE (usual and with a checklist). The two methods were compared in terms of the number of problems identified, problem type, and the severity of identified problems. In all, 192 usability problems were identified by the two methods combined in the Medical Records Information System; this was significantly higher than the number of usability problems identified by the checklist method (148) or the usual method (92) alone. The results demonstrated that the checklist method had significantly better performance in terms of the number of identified usability problems; however, the performance of the usual method for identifying problems of higher severity was significantly better. Although the checklist method can be more efficient for less experienced evaluators, wherever usability is critical the checklist should be used with caution in usability evaluations. Copyright © 2018 Elsevier Inc. All rights reserved.

  14. Identifying candidate agents for lung adenocarcinoma by walking the human interactome

    Directory of Open Access Journals (Sweden)

    Sun Y

    2016-09-01

    Full Text Available Yajiao Sun, Ranran Zhang, Zhe Jiang, Rongyao Xia, Jingwen Zhang, Jing Liu, Fuhui Chen (The Second Affiliated Hospital of Harbin Medical University; Harbin First Hospital, Harbin, People’s Republic of China) Abstract: Despite recent advances in therapeutic strategies for lung cancer, mortality is still increasing. Therefore, there is an urgent need to identify effective novel drugs. In the present study, we implement drug repositioning for lung adenocarcinoma (LUAD) by a bioinformatics method followed by experimental validation. We first identified differentially expressed genes between LUAD tissues and nontumor tissues from RNA sequencing data obtained from The Cancer Genome Atlas database. Then, candidate small-molecule drugs were ranked according to the effect of their targets on the differentially expressed genes of LUAD by a random walk with restart algorithm in protein–protein interaction networks. Our method identified some potentially novel agents for LUAD besides those that had been previously reported (eg, hesperidin). Finally, we experimentally verified that atracurium, one of the potential agents, could induce cell death in non-small-cell lung cancer-derived A549 cells by an MTT assay, acridine orange and ethidium bromide staining, and electron microscopy. Furthermore, Western blot assays demonstrated that atracurium upregulated the proapoptotic Bad and Bax proteins, downregulated the antiapoptotic p-Bad and Bcl-2 proteins, and enhanced caspase-3 activity. It could also reduce the expression of p53 and p21Cip1/Waf1 in A549 cells. In brief, the candidate agents identified by our approach may provide greater insights into improving the therapeutic status of LUAD. Keywords: lung adenocarcinoma, drug repositioning, bioinformatics, protein–protein interaction network, atracurium
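
    A minimal Python sketch of the random walk with restart used above to rank nodes of a protein-protein interaction network by proximity to a seed; the toy adjacency matrix, seed node and restart probability are invented:

        import numpy as np

        def rwr(adj, seeds, restart=0.3, tol=1e-10):
            """Random walk with restart on a column-normalised adjacency matrix."""
            W = adj / adj.sum(axis=0, keepdims=True)
            p0 = np.zeros(adj.shape[0])
            p0[seeds] = 1.0 / len(seeds)
            p = p0.copy()
            while True:
                p_next = (1 - restart) * (W @ p) + restart * p0
                if np.abs(p_next - p).sum() < tol:
                    return p_next
                p = p_next

        # Toy 4-protein network; node 0 plays the role of a drug-target seed.
        A = np.array([[0, 1, 1, 0],
                      [1, 0, 1, 0],
                      [1, 1, 0, 1],
                      [0, 0, 1, 0]], dtype=float)
        print(np.round(rwr(A, seeds=[0]), 3))   # steady-state affinity to the seed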

  15. Protein functional links in Trypanosoma brucei, identified by gene fusion analysis

    Directory of Open Access Journals (Sweden)

    Trimpalis Philip

    2011-07-01

    Full Text Available Abstract Background Domain or gene fusion analysis is a bioinformatics method for detecting gene fusions in one organism by comparing its genome to that of other organisms. The occurrence of gene fusions suggests that the two original genes that participated in the fusion are functionally linked, i.e. their gene products interact either as part of a multi-subunit protein complex, or in a metabolic pathway. Gene fusion analysis has been used to identify protein functional links in prokaryotes as well as in eukaryotic model organisms, such as yeast and Drosophila. Results In this study we have extended this approach to include a number of recently sequenced protists, four of which are pathogenic, to identify fusion linked proteins in Trypanosoma brucei, the causative agent of African sleeping sickness. We have also examined the evolution of the gene fusion events identified, to determine whether they can be attributed to fusion or fission, by looking at the conservation of the fused genes and of the individual component genes across the major eukaryotic and prokaryotic lineages. We find relatively limited occurrence of gene fusions/fissions within the protist lineages examined. Our results point to two trypanosome-specific gene fissions, which have recently been experimentally confirmed, one fusion involving proteins involved in the same metabolic pathway, as well as two novel putative functional links between fusion-linked protein pairs. Conclusions This is the first study of protein functional links in T. brucei identified by gene fusion analysis. We have used strict thresholds and only discuss results which are highly likely to be genuine and which either have already been or can be experimentally verified. We discuss the possible impact of the identification of these novel putative protein-protein interactions, to the development of new trypanosome therapeutic drugs.
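
    A hedged Python sketch of the core test behind gene fusion analysis: a composite protein is a fusion candidate when two distinct query genes align to non-overlapping regions of it. Gene names and coordinates are invented; a real pipeline would take BLAST hits as input:

        # Each entry: composite protein -> aligned (query_gene, start, end) hits.
        hits = {
            "Tb927.1.100": [("geneA", 1, 180), ("geneB", 210, 400)],
            "Tb927.2.200": [("geneC", 5, 350), ("geneC", 360, 500)],
        }
        for prot, hs in hits.items():
            hs = sorted(hs, key=lambda h: h[1])
            for (g1, s1, e1), (g2, s2, e2) in zip(hs, hs[1:]):
                if g1 != g2 and s2 > e1:   # distinct genes on non-overlapping spans
                    print(f"{prot}: candidate fusion of {g1} + {g2}")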

  16. Declining trend in the incidence of biopsy-verified coeliac disease in the adult population of Finland, 2005-2014.

    Science.gov (United States)

    Virta, L J; Saarinen, M M; Kolho, K-L

    2017-12-01

    The frequency of coeliac disease (CD) has been on the rise over the past decades, especially in Western Europe, but current trends are unclear. To investigate the recent temporal changes in the incidence of adult, biopsy-verified coeliac disease and dermatitis herpetiformis (DH) in Finland, a country with a high frequency of coeliac disease. All coeliac disease and DH cases diagnosed at age 20-79 years during 2005-2014 were retrieved from a nationwide database documenting all applicants for monthly compensation to cover the extra cost of maintaining a gluten-free diet. This benefit is granted on the basis of histology, not socioeconomic status. Temporal trends in the annual incidences were estimated using Poisson regression analyses. The total incidence of coeliac disease decreased from 33/100 000 during the years 2005-2006 to 29/100 000 during 2013-2014. The mean annual incidence of coeliac disease was nearly twice as high among women as among men, 42 vs 22 per 100 000, respectively. For middle- and old-aged women, the average rate of decrease in incidence was 4.8% (95% CI 3.9-5.7) per year and for men 3.0% (1.8-4.1) (P for linear trend adults, the rate of change remained low and nonsignificant throughout the period 2005-2014. Although the awareness of coeliac disease has increased during the past decades, the incidence of biopsy-verified diagnoses is not increasing, which suggests that exposure to yet unidentified triggering factors for coeliac disease has plateaued among the Finnish adult population. © 2017 John Wiley & Sons Ltd.
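
    A minimal Python sketch of the Poisson-regression trend estimate used above, via statsmodels; the annual case counts and population at risk are invented:

        import numpy as np
        import statsmodels.api as sm

        # Invented annual case counts and population at risk, 2005-2014.
        years = np.arange(2005, 2015)
        cases = np.array([330, 325, 318, 310, 300, 295, 288, 282, 276, 270])
        pop = np.full(10, 1_000_000)

        X = sm.add_constant(years - years[0])
        fit = sm.GLM(cases, X, family=sm.families.Poisson(),
                     offset=np.log(pop)).fit()
        change = (np.exp(fit.params[1]) - 1) * 100
        print(f"estimated change in incidence: {change:+.1f}% per year")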

  17. System reliability analysis using dominant failure modes identified by selective searching technique

    International Nuclear Information System (INIS)

    Kim, Dong-Seok; Ok, Seung-Yong; Song, Junho; Koh, Hyun-Moo

    2013-01-01

    The failure of a redundant structural system is often described by innumerable system failure modes such as combinations or sequences of local failures. An efficient approach is proposed to identify dominant failure modes in the space of random variables, and then perform system reliability analysis to compute the system failure probability. To identify dominant failure modes in the decreasing order of their contributions to the system failure probability, a new simulation-based selective searching technique is developed using a genetic algorithm. The system failure probability is computed by a multi-scale matrix-based system reliability (MSR) method. Lower-scale MSR analyses evaluate the probabilities of the identified failure modes and their statistical dependence. A higher-scale MSR analysis evaluates the system failure probability based on the results of the lower-scale analyses. Three illustrative examples demonstrate the efficiency and accuracy of the approach through comparison with existing methods and Monte Carlo simulations. The results show that the proposed method skillfully identifies the dominant failure modes, including those neglected by existing approaches. The multi-scale MSR method accurately evaluates the system failure probability with statistical dependence fully considered. The decoupling between the failure mode identification and the system reliability evaluation allows for effective applications to larger structural systems
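
    A hedged Python sketch of the general idea: estimate the system failure probability while tallying which component combinations (failure modes) dominate. This toy Monte Carlo stands in for the authors' genetic-algorithm search and matrix-based system reliability method, and all failure probabilities are invented:

        import numpy as np
        from collections import Counter

        rng = np.random.default_rng(3)
        p_fail = np.array([0.05, 0.05, 0.02, 0.02])   # invented component probabilities

        def system_fails(f):
            # Toy redundant system: fails if both of {0,1} or both of {2,3} fail.
            return (f[0] and f[1]) or (f[2] and f[3])

        n_sim, fails, modes = 100_000, 0, Counter()
        for _ in range(n_sim):
            f = rng.random(4) < p_fail
            if system_fails(f):
                fails += 1
                modes[tuple(np.where(f)[0])] += 1

        print("P(system failure) ~", fails / n_sim)
        print("dominant failure modes:", modes.most_common(3))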

  18. The chemokine receptor CCR1 is identified in mast cell-derived exosomes.

    Science.gov (United States)

    Liang, Yuting; Qiao, Longwei; Peng, Xia; Cui, Zelin; Yin, Yue; Liao, Huanjin; Jiang, Min; Li, Li

    2018-01-01

    Mast cells are important effector cells of the immune system, and mast cell-derived exosomes carrying RNAs play a role in immune regulation. However, the molecular function of mast cell-derived exosomes is currently unknown, and here we identify differentially expressed genes (DEGs) between mast cells and their exosomes. We isolated mast cell-derived exosomes through differential centrifugation and screened the DEGs from mast cell-derived exosomes using the GSE25330 array dataset downloaded from the Gene Expression Omnibus database. Biochemical pathways were analyzed by Gene Ontology (GO) annotation and Kyoto Encyclopedia of Genes and Genomes (KEGG) pathway analysis with the online tool DAVID. DEG-associated protein-protein interaction networks (PPIs) were constructed using the STRING database and Cytoscape software. The genes identified by these bioinformatics analyses were verified by qRT-PCR and Western blot in mast cells and exosomes. We identified 2121 DEGs (843 up- and 1278 down-regulated genes) between HMC-1 cell-derived exosomes and HMC-1 cells. The up-regulated DEGs were classified into two significant modules. The chemokine receptor CCR1 was screened as a hub gene and was enriched in the cytokine-mediated signaling pathway in module one. Seven genes, including CCR1, CD9, KIT, TGFBR1, TLR9, TPSAB1 and TPSB2, were screened and validated through qRT-PCR analysis. We have achieved a comprehensive view of the pivotal genes and pathways in mast cells and exosomes and identified CCR1 as a hub gene in mast cell-derived exosomes. Our results provide novel clues to the biological processes through which mast cell-derived exosomes modulate immune responses.
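
    A minimal Python sketch of hub-gene selection by node degree in a protein-protein interaction network, with networkx standing in for Cytoscape; the edge list is invented rather than taken from STRING:

        import networkx as nx

        # Invented interactions among the validated genes; real input would be
        # the STRING edge table for the up-regulated DEGs.
        G = nx.Graph([("CCR1", "CD9"), ("CCR1", "KIT"), ("CCR1", "TGFBR1"),
                      ("CCR1", "TLR9"), ("CD9", "KIT"), ("TPSAB1", "TPSB2")])
        hub, degree = max(G.degree, key=lambda nd: nd[1])
        print(f"hub gene: {hub} (degree {degree})")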

  19. Verifying 4D gated radiotherapy using time-integrated electronic portal imaging: a phantom and clinical study

    Directory of Open Access Journals (Sweden)

    Slotman Ben J

    2007-08-01

    Full Text Available Abstract Background Respiration-gated radiotherapy (RGRT) can decrease treatment toxicity by allowing for smaller treatment volumes for mobile tumors. RGRT is commonly performed using external surrogates of tumor motion. We describe the use of time-integrated electronic portal imaging (TI-EPI) to verify the position of internal structures during RGRT delivery. Methods TI-EPI portals were generated by continuously collecting exit dose data (aSi500 EPID, Portal Vision, Varian Medical Systems) when a respiratory motion phantom was irradiated during expiration, inspiration and free breathing phases. RGRT was delivered using the Varian RPM system, and grey value profile plots over a fixed trajectory were used to study object positions. Time-related positional information was derived by subtracting grey values from TI-EPI portals sharing the pixel matrix. TI-EPI portals were also collected in 2 patients undergoing RPM-triggered RGRT for a lung and a hepatic tumor (with fiducial markers), and the corresponding planning 4-dimensional CT (4DCT) scans were analyzed for motion amplitude. Results Integral grey values of phantom TI-EPI portals correlated well with mean object position in all respiratory phases. Cranio-caudal motion of internal structures ranged from 17.5–20.0 mm on planning 4DCT scans. TI-EPI of bronchial images reproduced with a mean value of 5.3 mm (1 SD 3.0 mm), located cranial to the planned position. Mean hepatic fiducial markers reproduced with 3.2 mm (SD 2.2 mm), caudal to the planned position. After bony alignment to exclude set-up errors, the mean displacement in the two structures was 2.8 mm and 1.4 mm, respectively, and the corresponding reproducibility in anatomy improved to 1.6 mm (1 SD). Conclusion TI-EPI appears to be a promising method for verifying delivery of RGRT. The RPM system was a good indirect surrogate of internal anatomy, but use of TI-EPI allowed for a direct link between anatomy and breathing patterns.
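
    A hedged Python sketch of the grey-value-profile idea above: estimate the cranio-caudal shift of a structure by cross-correlating profiles taken from two portals that share a pixel matrix. The images are synthetic:

        import numpy as np

        rng = np.random.default_rng(4)
        planned = rng.normal(1000.0, 20.0, size=(64, 64))  # synthetic reference portal
        treated = np.roll(planned, 3, axis=0)              # same portal, shifted 3 rows

        # Grey-value profiles along one column of the shared pixel matrix.
        ref = planned[:, 32] - planned[:, 32].mean()
        trt = treated[:, 32] - treated[:, 32].mean()
        lag = np.argmax(np.correlate(trt, ref, mode="full")) - (len(ref) - 1)
        print("estimated cranio-caudal shift (pixels):", lag)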

  20. Personality and Student Performance on Evaluation Methods Used in Business Administration Courses

    Science.gov (United States)

    Lakhal, Sawsen; Sévigny, Serge; Frenette, Éric

    2015-01-01

    The objective of this study was to verify whether personality (Big Five model) influences performance on the evaluation methods used in business administration courses. A sample of 169 students enrolled in two compulsory undergraduate business courses responded to an online questionnaire. As it is difficult within the same course to assess…

  1. Identifying genes that mediate anthracycline toxicity in immune cells

    Directory of Open Access Journals (Sweden)

    Amber Frick

    2015-04-01

    Full Text Available The role of the immune system in response to chemotherapeutic agents remains elusive. The interpatient variability observed in immune and chemotherapeutic cytotoxic responses is likely, at least in part, due to complex genetic differences. Through the use of a panel of genetically diverse mouse inbred strains, we developed a drug screening platform aimed at identifying genes underlying these chemotherapeutic cytotoxic effects on immune cells. Using genome-wide association studies (GWAS), we identified four genome-wide significant quantitative trait loci (QTL) that contributed to the sensitivity of doxorubicin and idarubicin in immune cells. Of particular interest, a locus on chromosome 16 was significantly associated with cell viability following idarubicin administration (p = 5.01x10^-8). Within this QTL lies App, which encodes amyloid beta precursor protein. Comparison of dose-response curves verified that T-cells in App knockout mice were more sensitive to idarubicin than those of C57BL/6J control mice (p < 0.05). In conclusion, the cellular screening approach coupled with GWAS led to the identification and subsequent validation of a gene involved in T-cell viability after idarubicin treatment. Previous studies have suggested a role for App in in vitro and in vivo cytotoxicity to anticancer agents; the overexpression of App enhances resistance, while the knockdown of this gene is deleterious to cell viability. Thus, further investigations should include performing mechanistic studies, validating additional genes from the GWAS, including Ppfia1 and Ppfibp1, and ultimately translating the findings to in vivo and human studies.
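
    A minimal Python sketch of flagging genome-wide significant loci from a scan of per-locus p-values; the p-values and the conventional 5x10^-8 cutoff are illustrative (thresholds for mouse panels differ):

        import numpy as np

        rng = np.random.default_rng(5)
        pvals = rng.uniform(size=10_000)   # invented per-locus association p-values
        pvals[1234] = 3e-9                 # planted associated locus

        gw_threshold = 5e-8                # conventional genome-wide cutoff
        hits = np.where(pvals <= gw_threshold)[0]
        print("genome-wide significant loci:", hits)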

  2. Validation of a noninvasive diagnostic tool to verify neuter status in dogs: The urinary FSH to creatinine ratio.

    Science.gov (United States)

    Albers-Wolthers, C H J; de Gier, J; Oei, C H Y; Schaefers-Okkens, A C; Kooistra, H S

    2016-09-15

    Determining the presence of functional gonadal tissue in dogs can be challenging, especially in bitches during anestrus or not known to have been ovariectomized, or in male dogs with nonscrotal testes. Furthermore, in male dogs treated with deslorelin, a slow-release GnRH agonist implant for reversible chemical castration, the verification of complete downregulation of the hypothalamic-pituitary-gonadal (HPG) axis can be difficult, especially if pretreatment parameters such as the size of the testes or prostate gland are not available. The aims of this study were to validate an immunoradiometric assay for measurement of FSH in canine urine, to determine if the urinary FSH to creatinine ratio can be used to verify the neuter status in bitches and male dogs as an alternative to the plasma FSH concentration, and to determine if downregulation of the HPG axis is achieved in male dogs during deslorelin treatment. Recovery of added canine FSH and serial dilutions of urine showed that the immunoradiometric assay measures urinary FSH concentration accurately and with high precision. Plasma FSH concentrations (the mean of two samples, taken 40 minutes apart) and the urinary FSH to creatinine ratio were determined before gonadectomy and 140 days (median, range 121-225 days) and 206 days (median, range 158-294 days) after gonadectomy of 13 bitches and five male dogs, respectively, and in 13 male dogs before and 132 days (median, range 117-174 days) after administration of a deslorelin implant. In both bitches and male dogs, the plasma FSH concentration and the urinary FSH to creatinine ratio were significantly higher after gonadectomy, with no overlapping of their ranges. Receiver operating characteristic analysis of the urinary FSH to creatinine ratio revealed a cut-off value of 2.9 in bitches and 6.5 in males to verify the presence or absence of functional gonadal tissue. In male dogs treated with deslorelin, the plasma FSH concentrations and urinary FSH to

  3. The Researching on Evaluation of Automatic Voltage Control Based on Improved Zoning Methodology

    Science.gov (United States)

    Xiao-jun, ZHU; Ang, FU; Guang-de, DONG; Rui-miao, WANG; De-fen, ZHU

    2018-03-01

    As power systems grow in size and structural complexity, hierarchically structured automatic voltage control (AVC) has become a focus of research. In this paper, a reduced control model is built, and an adaptive reduced control model is investigated to improve voltage control performance. The theories of HCSD, HCVS, SKC and FCM are introduced, and the effect of different zoning methodologies on coordinated voltage regulation is examined. A generic framework for evaluating the performance of coordinated voltage regulation is built. Finally, the IEEE-96 system is used to divide the network, the 2383-bus Polish system is used to verify that the choice of zoning methodology affects not only coordinated voltage regulation but also its robustness to erroneous data, and the New England 39-bus network is used to verify the performance of the adaptive reduced control models.
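
    A hedged Python sketch of network zoning as clustering of bus feature vectors; k-means here is a plain stand-in for the HCSD/HCVS/SKC/FCM methods named above, and the sensitivity features are invented:

        import numpy as np
        from sklearn.cluster import KMeans

        rng = np.random.default_rng(6)
        # Invented bus features (e.g. rows of a voltage sensitivity matrix).
        sens = np.vstack([rng.normal(0.0, 1.0, (10, 5)),
                          rng.normal(4.0, 1.0, (12, 5))])   # 22 buses, 5 features

        zones = KMeans(n_clusters=2, n_init=10, random_state=0).fit_predict(sens)
        print("bus-to-zone assignment:", zones)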

  4. Verifying three-dimensional skull model reconstruction using cranial index of symmetry.

    Directory of Open Access Journals (Sweden)

    Woon-Man Kung

    Full Text Available BACKGROUND: Difficulty exists in scalp adaptation for cranioplasty with a customized computer-assisted design/manufacturing (CAD/CAM) implant in situations of excessive wound tension and sub-cranioplasty dead space. To solve this clinical problem, the CAD/CAM technique should include algorithms to reconstruct a depressed contour to cover the skull defect. Satisfactory CAM-derived alloplastic implants are based on highly accurate three-dimensional (3-D) CAD modeling. Thus, it is quite important to establish a symmetrically regular CAD/CAM reconstruction prior to depressing the contour. The purpose of this study is to verify the aesthetic outcomes of CAD models with regular contours using the cranial index of symmetry (CIS). MATERIALS AND METHODS: From January 2011 to June 2012, decompressive craniectomy (DC) was performed for 15 consecutive patients in our institute. 3-D CAD models of the skull defects were reconstructed using commercial software. These models were checked in terms of symmetry by CIS scores. RESULTS: CIS scores of the CAD reconstructions were 99.24±0.004% (range 98.47-99.84%). CIS scores of these CAD models were statistically significantly greater than 95%, identical to 99.5%, but lower than 99.6% (p<0.001, p = 0.064, p = 0.021, respectively; Wilcoxon matched pairs signed rank test). These data evidenced the highly accurate symmetry of these CAD models with regular contours. CONCLUSIONS: CIS calculation is beneficial for assessing the aesthetic outcomes of CAD-reconstructed skulls in terms of cranial symmetry. This enables further accurate CAD models and CAM cranial implants with depressed contours, which are essential in patients with difficult scalp adaptation.
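
    A minimal Python sketch of the Wilcoxon matched-pairs signed-rank comparison used above, testing invented CIS scores against a 95% symmetry benchmark:

        import numpy as np
        from scipy.stats import wilcoxon

        # Invented CIS scores (%) for 15 reconstructed models.
        cis = np.array([99.1, 99.3, 99.5, 98.9, 99.6, 99.2, 99.4, 99.0,
                        99.7, 99.3, 99.2, 99.5, 99.4, 99.1, 99.6])

        stat, p = wilcoxon(cis - 95.0, alternative="greater")
        print(f"W = {stat}, one-sided p = {p:.2g}")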

  5. Project W-314 specific test and evaluation plan for 241-AN-A valve pit

    International Nuclear Information System (INIS)

    Hays, W.H.

    1997-01-01

    The purpose of this Specific Test and Evaluation Plan (STEP) is to provide a detailed written plan for the systematic testing of modifications made to the 241-AN-A Valve Pit by the W-314 Project. The STEP develops the outline for test procedures that verify the system's performance against the established Project design criteria. The STEP is a "lower tier" document based on the W-314 Test and Evaluation Plan (TEP). This STEP encompasses all testing activities required to demonstrate compliance with the project design criteria as it relates to the modifications of the AN-A valve pit. The Project Design Specifications (PDS) identify the specific testing activities required for the Project. Testing includes Validations and Verifications (e.g., Commercial Grade Item Dedication activities), Factory Acceptance Tests (FATs), installation tests and inspections, Construction Acceptance Tests (CATs), Acceptance Test Procedures (ATPs), Pre-Operational Test Procedures (POTPs), and Operational Test Procedures (OTPs). It should be noted that POTPs are not required for testing of the modifications to the 241-AN-A Valve Pit. The STEP will be utilized in conjunction with the TEP for verification and validation.

  6. Verifying cell loss requirements in high-speed communication networks

    Directory of Open Access Journals (Sweden)

    Kerry W. Fendick

    1998-01-01

    Full Text Available In high-speed communication networks it is common to have requirements of very small cell loss probabilities due to buffer overflow. Losses are measured to verify that the cell loss requirements are being met, but it is not clear how to interpret such measurements. We propose methods for determining whether or not cell loss requirements are being met. A key idea is to look at the stream of losses as successive clusters of losses. Often clusters of losses, rather than individual losses, should be regarded as the important “loss events”. Thus we propose modeling the cell loss process by a batch Poisson stochastic process. Successive clusters of losses are assumed to arrive according to a Poisson process. Within each cluster, cell losses do not occur at a single time, but the distance between losses within a cluster should be negligible compared to the distance between clusters. Thus, for the purpose of estimating the cell loss probability, we ignore the spaces between successive cell losses in a cluster of losses. Asymptotic theory suggests that the counting process of losses initiating clusters often should be approximately a Poisson process even though the cell arrival process is not nearly Poisson. The batch Poisson model is relatively easy to test statistically and fit; e.g., the batch-size distribution and the batch arrival rate can readily be estimated from cell loss data. Since batch (cluster) sizes may be highly variable, it may be useful to focus on the number of batches instead of the number of cells in a measurement interval. We also propose a method for approximately determining the parameters of a special batch Poisson cell loss process with a geometric batch-size distribution from a queueing model of the buffer content. For this step, we use a reflected Brownian motion (RBM) approximation of a G/D/1/C queueing model. We also use the RBM model to estimate the input burstiness given the cell loss rate. In addition, we use the RBM model to
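
    A minimal Python sketch of fitting the batch Poisson model from loss timestamps: gaps above a threshold delimit clusters, the cluster rate estimates the Poisson intensity, and the mean cluster size gives a geometric batch-size parameter. All timestamps are invented:

        import numpy as np

        # Invented loss timestamps (s); within-cluster gaps are ~1 ms.
        t = np.array([10.000, 10.001, 10.002, 55.400, 55.401,
                      90.120, 90.121, 90.122, 90.123, 130.500])
        T = 150.0                       # length of the measurement interval (s)

        starts = np.r_[True, np.diff(t) > 1.0]          # gaps > 1 s delimit clusters
        sizes = np.diff(np.r_[np.where(starts)[0], len(t)])

        lam = starts.sum() / T                          # cluster (batch) arrival rate
        p_geo = 1.0 / sizes.mean()                      # geometric batch-size parameter
        print(f"rate = {lam:.4f}/s, mean batch = {sizes.mean():.2f}, p = {p_geo:.2f}")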

  7. Selective use of the biphasic-contrast barium enema study for evaluation of colonic lesions: Results of a prospective study

    International Nuclear Information System (INIS)

    De Lange, E.E.; Shaffer, H.A. Jr.; Riddervold, H.O.

    1987-01-01

    The authors performed a prospective study to determine the value of selective use of the biphasic contrast technique for a variety of indications. In a series of 571 double-contrast barium enema examinations, the examination was immediately followed by a single-contrast study in 85 cases. The biphasic procedure was performed to reexamine a colonic segment that was poorly evaluated initially because of diverticulosis (n = 35), incomplete filling (n = 28), or poor mucosal coating (n = 26); or to verify or exclude a possible lesion identified during the double-contrast examination (n = 22). The single-contrast study confirmed five polyps and excluded lesions in 17 cases with suspected polyps (n = 5), strictures (n = 4), and spasm (n = 8)

  8. Project W-314 specific test and evaluation plan 241-AN-B valve pit

    International Nuclear Information System (INIS)

    Hays, W.H.

    1998-01-01

    The purpose of this Specific Test and Evaluation Plan (STEP) is to provide a detailed written plan for the systematic testing of modifications made to the 241-AN-B Valve Pit by the W-314 Project. The STEP develops the outline for test procedures that verify the system's performance to the established Project design criteria. The STEP is a lower tier document based on the W-314 Test and Evaluation Plan (TEP)

  9. Identifying hotspots of coastal risk and evaluating DRR measures: results from the RISC-KIT project.

    Science.gov (United States)

    Van Dongeren, A.; Ciavola, P.; Viavattene, C.; Dekleermaeker, S.; Martinez, G.; Ferreira, O.; Costa, C.

    2016-02-01

    High-impact storm events have demonstrated the vulnerability of coastal zones in Europe and beyond. These impacts are likely to increase due to predicted climate change and ongoing coastal development. In order to reduce impacts, disaster risk reduction (DRR) measures need to be taken, which prevent or mitigate the effects of storm events. To drive the DRR agenda, the UNISDR formulated the Sendai Framework for Action, and the EU has issued the Floods Directive. However, neither is specific about the methods to be used to develop actionable DRR measures in the coastal zone. Therefore, there is a need to develop methods, tools and approaches which make it possible to: identify and prioritize the coastal zones which are most at risk, through a Coastal Risk Assessment Framework, and evaluate the effectiveness of DRR options for these coastal areas, using an Early Warning/Decision Support System which can be used both in the planning and the event phase. This paper gives an overview of the products and results obtained in the FP7-funded project RISC-KIT, which aims to develop and apply a set of tools with which highly-vulnerable coastal areas (so-called "hotspots") can be identified. The identification is done using the Coastal Risk Assessment Framework, or CRAF, which computes the intensity of multi-hazards, the exposure and the vulnerability, all components of risk, including network and cascading effects. Based on this analysis, hotspots of risk which warrant coastal protection investments are selected. For these hotspot areas, high-resolution Early Warning and Decision Support Tools are developed with which it is possible to compute in detail the effectiveness of disaster risk reduction measures in storm event scenarios, which helps decide which measures to implement in the planning phase. The same systems, driven with real-time data, can also be used for early warning. All tools are tested on eleven case study areas, at least one on each EU Regional Sea.

  10. Whole genome DNA copy number changes identified by high density oligonucleotide arrays

    Directory of Open Access Journals (Sweden)

    Huang Jing

    2004-05-01

    Full Text Available Abstract Changes in DNA copy number are one of the hallmarks of the genetic instability common to most human cancers. Previous micro-array-based methods have been used to identify chromosomal gains and losses; however, they are unable to genotype alleles at the level of single nucleotide polymorphisms (SNPs). Here we describe a novel algorithm that uses a recently developed high-density oligonucleotide array-based SNP genotyping method, whole genome sampling analysis (WGSA), to identify genome-wide chromosomal gains and losses at high resolution. WGSA simultaneously genotypes over 10,000 SNPs by allele-specific hybridisation to perfect match (PM) and mismatch (MM) probes synthesised on a single array. The copy number algorithm jointly uses PM intensity and discrimination ratios between paired PM and MM intensity values to identify and estimate genetic copy number changes. Values from an experimental sample are compared with SNP-specific distributions derived from a reference set containing over 100 normal individuals to gain statistical power. Genomic regions with statistically significant copy number changes can be identified using both single point analysis and contiguous point analysis of SNP intensities. We identified multiple regions of amplification and deletion using a panel of human breast cancer cell lines. We verified these results using an independent method based on quantitative polymerase chain reaction and found that our approach is both sensitive and specific and can tolerate samples which contain a mixture of both tumour and normal DNA. In addition, by using known allele frequencies from the reference set, statistically significant genomic intervals can be identified containing contiguous stretches of homozygous markers, potentially allowing the detection of regions undergoing loss of heterozygosity (LOH) without the need for a matched normal control sample. The coupling of LOH analysis, via SNP genotyping, with copy number
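
    A hedged Python sketch of the reference-distribution idea above: per-SNP z-scores of a sample against a normal reference panel, with a sliding window as a crude contiguous-point analysis. All intensities are simulated:

        import numpy as np

        rng = np.random.default_rng(7)
        ref = rng.normal(1.0, 0.08, size=(100, 1000))   # reference panel intensities
        sample = rng.normal(1.0, 0.08, size=1000)
        sample[400:420] *= 1.5                          # simulated amplification

        z = (sample - ref.mean(axis=0)) / ref.std(axis=0)
        window = np.convolve(z, np.ones(10) / 10, mode="valid")  # 10-SNP mean z
        print("candidate gain starts near SNP:", np.where(window > 3)[0][:3])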

  11. The design and performance evaluation of the ultrasonic random coil identity-integrity element for underwater safeguards seals

    International Nuclear Information System (INIS)

    Allen, V.H.; Backer, S.; Smith, M.T.

    1983-06-01

    Irradiated fuel discharged from CANDU power reactors is stored underwater and, in order to comply with the requirements of International Safeguards, the fuel is stacked in sealed containers which are examined at intervals by IAEA inspectors. The seals are verified for identity and integrity, and this report describes the design of an identity/integrity element for the seals. The element is in the form of a random coil of wire which is interrogated by ultrasonic methods. An evaluation of thirty-six seals is reported. The application of seals to stacks of fuel was simulated in a water-filled bay at CRNL, and repetitive verification measurements were made which simulated inspection procedures. The seal identity signatures were compared using cross-correlation methods, and the results show that a broken or tampered seal can be identified with a high level of confidence
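
    A minimal Python sketch of verifying a seal signature by peak normalised cross-correlation; the signatures are synthetic:

        import numpy as np

        def ncc_peak(a, b):
            """Peak normalised cross-correlation of two signatures."""
            a = (a - a.mean()) / (a.std() * len(a))
            b = (b - b.mean()) / b.std()
            return np.correlate(a, b, mode="full").max()

        rng = np.random.default_rng(8)
        enrolled = rng.normal(size=256)                  # reference signature
        remeasured = enrolled + rng.normal(0, 0.1, 256)  # same seal, re-verified
        tampered = rng.normal(size=256)                  # different or altered seal

        print(f"same seal: {ncc_peak(enrolled, remeasured):.2f}")
        print(f"tampered:  {ncc_peak(enrolled, tampered):.2f}")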

  12. Verifying influenza and pneumococcal immunization status of children in 2009-2010 from primary care practice records and from the North Carolina Immunization Registry.

    Science.gov (United States)

    Poehling, Katherine A; Vannoy, Lauren; Peters, Timothy R

    2013-01-01

    The North Carolina Immunization Registry (NCIR) has been available since 2004. We sought to measure its utilization among practices that provide primary care for children who are enrolled in a prospective influenza surveillance study. This study included children aged 0.5-17 years who presented with fever or acute respiratory symptoms to an emergency department or inpatient setting in Winston-Salem, North Carolina, from September 1, 2009, through May 19, 2010. Study team members verified influenza and pneumococcal immunization status by requesting records from each child's primary care practice and by independently reviewing the NCIR. We assessed agreement of nonregistry immunization medical records with NCIR data using the kappa statistic. Fifty-six practices confirmed the immunization status of 292 study-enrolled children. For most children (238/292, 82%), practices verified the child's immunizations by providing a copy of the NCIR record. For 54 children whose practices verified their immunizations by providing practice records alone, agreement with the NCIR by the kappa statistic was 0.6-0.7 for seasonal and monovalent H1N1 influenza vaccines and 0.8-0.9 for pneumococcal conjugate and polysaccharide vaccines. A total of 221 (98%) of 226 enrolled children younger than 6 years of age had 2 or more immunizations documented in the NCIR. NCIR usage may vary in other regions of North Carolina. More than 95% of children younger than 6 years of age had 2 or more immunizations documented in the NCIR; thus, the Centers for Disease Control and Prevention 2010 goal for immunization information systems was met in this population. We found substantial agreement between practice records and the NCIR for influenza and pneumococcal immunizations in children.
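
    A minimal Python sketch of the kappa statistic used above, computed on an invented 2x2 table of practice-record versus registry calls:

        import numpy as np

        # Invented 2x2 table: practice-record call (rows) vs registry call (cols).
        table = np.array([[40, 4],
                          [3, 7]])
        n = table.sum()
        po = np.trace(table) / n                              # observed agreement
        pe = (table.sum(axis=1) @ table.sum(axis=0)) / n**2   # chance agreement
        kappa = (po - pe) / (1 - pe)
        print(f"kappa = {kappa:.2f}")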

  13. Development of various NDA approach for verifying the hold inventory

    International Nuclear Information System (INIS)

    Nakashima, Shinichi; Yamada, Shigeki; Takahashi, Syunya

    1999-01-01

    This report describes the phase 1 activity to investigate various nondestructive assay methods which proved useful for evaluating the amount of uranium holdup inventory. The study was carried out in the Ningyo-Toge demonstration uranium enrichment facility. This feasibility study is part of the JNC/DOE cooperative safeguards agreement. We expect that a combination of neutron counting and gamma-ray measurement methods is required to obtain a quantitative measure of process holdup. As a result of the investigation, the basic measurement approaches were discussed and their applicability to gas centrifuge cascades was evaluated. (author)

  14. Hazard evaluation of The International Fusion Materials Irradiation Facility

    Energy Technology Data Exchange (ETDEWEB)

    Burgazzi, Luciano [ENEA-Centro Ricerche 'Ezio Clementel', Advanced Physics Technology Division, Via Martiri di Monte Sole, 4, 40129 Bologna (Italy)]. E-mail: burgazzi@bologna.enea.it

    2005-01-15

    The International Fusion Materials Irradiation Facility (IFMIF) is intended to provide an intense neutron source, by means of a high-current deuteron linear accelerator and a high-speed lithium flow target, for testing candidate materials for fusion. Liquid lithium is circulated through a loop and is kept at a temperature above its freezing point. Within the design phase called the Key Element technology Phase (KEP), performed jointly by an international team to verify the most important risk factors, a safety assessment of the whole plant was required in order to identify the hazards associated with plant operation. This paper discusses the safety assessments that were performed and their outcome: a Failure Mode and Effect Analysis (FMEA) approach was adopted in order to accomplish the task. The main conclusion of the study is that, on account of the safety and preventive measures adopted, potential plant-related hazards are confined within the IFMIF security boundaries, and great care must be exercised to protect workers and site personnel operating the plant. The analysis has provided as a result a set of Postulated Initiating Events (PIEs), that is, off-normal events that could result in hazardous consequences for the plant, together with their total frequency and the list of component failures which could induce each PIE: this assures an exhaustive list of the major initiating events of accident sequences, helpful for the subsequent accident sequence analysis phase. Finally, for each of the identified PIEs, the evaluation of the accident evolution, in terms of effects on the plant and the relative countermeasures, has made it possible to verify that adequate measures are being taken both to prevent the accident occurrence and to cope with the accident consequences, thus assuring the fulfilment of the safety requirements.
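
    A hedged Python sketch of conventional FMEA risk ranking by risk priority number (RPN = severity x occurrence x detection); the failure modes and scores are invented and are not taken from the IFMIF assessment:

        # Invented failure modes scored 1-10 for severity, occurrence, detection.
        modes = {
            "lithium loop leak":       (9, 3, 4),
            "beam mis-steering":       (7, 4, 3),
            "target flow instability": (8, 2, 5),
        }
        rpn = {m: s * o * d for m, (s, o, d) in modes.items()}
        for m, v in sorted(rpn.items(), key=lambda kv: -kv[1]):
            print(f"{m}: RPN = {v}")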

  15. Identifying microRNA/mRNA dysregulations in ovarian cancer.

    Science.gov (United States)

    Miles, Gregory D; Seiler, Michael; Rodriguez, Lorna; Rajagopal, Gunaretnam; Bhanot, Gyan

    2012-03-27

    novel microRNA/mRNA relationships that can be verified experimentally. We identify both generic microRNA/mRNA regulation mechanisms in the ovary as well as specific microRNA/mRNA controls which are turned on or off in ovarian tumours. Our results suggest that the disease process uses specific mechanisms which may be significant for their utility as early detection biomarkers or in the development of microRNA therapies in treating ovarian cancers. The positively correlated microRNA/mRNA pairs suggest the existence of novel regulatory mechanisms that proceed via intermediate states (indirect regulation) in ovarian tumorigenesis.
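
    A minimal Python sketch of the correlation screen implied above, separating canonical (negative) from indirect (positive) microRNA/mRNA pairs; the expression matrices are simulated, with one repressive pair planted:

        import numpy as np
        from scipy.stats import spearmanr

        rng = np.random.default_rng(9)
        mirna = rng.normal(size=(5, 40))   # 5 miRNAs x 40 samples (simulated)
        mrna = rng.normal(size=(8, 40))    # 8 mRNAs x 40 samples (simulated)
        mrna[2] = -0.8 * mirna[0] + rng.normal(0, 0.5, 40)   # planted repression

        for i in range(mirna.shape[0]):
            for j in range(mrna.shape[0]):
                rho, p = spearmanr(mirna[i], mrna[j])
                if p < 0.001:
                    kind = "canonical (negative)" if rho < 0 else "indirect (positive)"
                    print(f"miR{i} ~ mRNA{j}: rho = {rho:.2f}, {kind}")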

  16. In-111 platelets used in evaluation of emboli and thrombi in patients with cerebrovascular accident

    International Nuclear Information System (INIS)

    Kuecuek, N.O.; Aras, G.; Ibis, E.; Soylu, A.; Tascilar, N.; Yuecemen, N.; Mutluer, N.

    2000-01-01

    Studies with In-111 platelets were conducted to evaluate pulmonary embolus, deep vein thrombus and cardiac thrombus. This study aimed to evaluate active thrombi and possible new emboli in patients with cerebrovascular accident (CVA) in the first 24 hours by using autologous In-111 platelets. Twenty-five patients were included in the study. Carotid artery thrombi observed in 10 patients with this technique were confirmed by Doppler ultrasonography. Intracranial thrombi appearing in 3 cases were verified by X-ray computed tomography (CT). Scintigraphy of 8 patients who showed findings suggesting CVA in CT revealed no abnormal accumulation. This was attributed to the possibility that they were small in size, deep in location and/or were also quite aged. Abnormal accumulations observed in the lungs of 3 patients and in the mediastinum and pelvis in one patient were verified by other radiological methods. In-111 platelet study was found to be useful in patients with CVA to evaluate the active thrombi and possible emboli in the early period before clinical symptoms appeared. (author)

  17. Standard evaluation techniques for containment and surveillance radiation monitors

    International Nuclear Information System (INIS)

    Fehlau, P.E.

    1982-01-01

    Evaluation techniques used at Los Alamos for personnel and vehicle radiation monitors that safeguard nuclear material determine the worst-case sensitivity. An evaluation tests a monitor's lowest-sensitivity regions with sources that have minimum emission rates. The results of our performance tests are analyzed as a binomial experiment. The number of trials that are required to verify the monitor's probability of detection is determined by a graph derived from the confidence limits for a binomial distribution. Our testing results are reported in a way that characterizes the monitor yet does not compromise security by revealing its routine performance for detecting process materials
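
    A minimal Python sketch of the binomial logic described above: the smallest number of consecutive successful trials that demonstrates a required detection probability at a given confidence level:

        import math

        def trials_required(p_detect, confidence):
            """Zero-failure binomial demonstration: smallest n such that n
            consecutive detections show P(detect) >= p_detect at the given
            confidence level."""
            return math.ceil(math.log(1 - confidence) / math.log(p_detect))

        print(trials_required(0.95, 0.95))   # 59 consecutive detections
        print(trials_required(0.99, 0.95))   # 299 consecutive detections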

  18. Validation of search filters for identifying pediatric studies in PubMed

    NARCIS (Netherlands)

    Leclercq, Edith; Leeflang, Mariska M. G.; van Dalen, Elvira C.; Kremer, Leontien C. M.

    2013-01-01

    To identify and validate PubMed search filters for retrieving studies including children and to develop a new pediatric search filter for PubMed. We developed 2 different datasets of studies to evaluate the performance of the identified pediatric search filters, expressed in terms of sensitivity,
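
    A minimal Python sketch of scoring a search filter against a gold-standard reference set; all identifiers are invented:

        # Invented PMIDs: gold standard of pediatric studies vs filter output.
        gold = {"101", "102", "103", "104", "105"}
        retrieved = {"101", "102", "103", "200", "201"}

        tp = len(gold & retrieved)
        sensitivity = tp / len(gold)
        precision = tp / len(retrieved)
        print(f"sensitivity = {sensitivity:.2f}, precision = {precision:.2f}")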

  19. Radioresponse of thymomas verified with histologic response

    Energy Technology Data Exchange (ETDEWEB)

    Ohara, Kiyoshi; Tatsuzaki, Hideo; Okumura, Toshiyuki; Itai, Yuji [Dept. of Radiology, Tsukuba Univ., Tsukuba City (Japan); Inst. of Clinical Medicine, Tsukuba Univ., Tsukuba City (Japan)]; Fuji, Hiroshi [Dept. of Radiology, Tsukuba Univ., Tsukuba City (Japan)]; Sugahara, Shinji [Dept. of Radiology, Hitachi General Hospital, Hitachi City (Japan)]; Akaogi, Eiichi; Onizuka, Masataka; Ishikawa, Shigemi; Mitsui, Kiyofumi [Dept. of Surgery, Tsukuba Univ., Tsukuba City (Japan); Inst. of Clinical Medicine, Tsukuba Univ., Tsukuba City (Japan)]

    1998-12-31

    Patterns of radiologic response of 10 thymomas treated by preoperative radiotherapy (RT) (18-20 Gy/2 weeks) were determined in conjunction with histologic response. Changes in tumor volume were evaluated with CT scans obtained 5 to 36 days before and 14 to 24 days after the initiation of RT and before surgery. The extent of tumor volume reduction (TR) varied widely (40-78%), while the mean daily volume decrement expressed as a percentage of the pre-RT tumor volume correlated significantly with the pre-RT tumor volume. Histologically, the tumors, all of which were resected 17 to 33 days after RT initiation, generally consisted of predominant fibrous tissues, rare necrotic foci, and few epithelial cells. The TR did not correlate with pre-RT tumor volume, observation period, histologic subtype, or quantity of remaining epithelial cells. The TR of thymomas does not predict RT impact on tumor cells but does reflect the quantity of inherent tumor stroma. (orig.)

  20. Radioresponse of thymomas verified with histologic response

    International Nuclear Information System (INIS)

    Ohara, Kiyoshi; Tatsuzaki, Hideo; Okumura, Toshiyuki; Itai, Yuji; Fuji, Hiroshi; Sugahara, Shinji; Akaogi, Eiichi; Onizuka, Masataka; Ishikawa, Shigemi; Mitsui, Kiyofumi

    1998-01-01

    Patterns of radiologic response of 10 thymomas treated by preoperative radiotherapy (RT) (18-20 Gy/2 weeks) were determined in conjunction with histologic response. Changes in tumor volume were evaluated with CT scans obtained 5 to 36 days before and 14 to 24 days after the initiation of RT and before surgery. The extent of tumor volume reduction (TR) varied widely (40-78%), while the mean daily volume decrement expressed as a percentage of the pre-RT tumor volume correlated significantly with the pre-RT tumor volume. Histologically, the tumors, all of which were resected 17 to 33 days after RT initiation, generally consisted of predominant fibrous tissues, rare necrotic foci, and few epithelial cells. The TR did not correlate with pre-RT tumor volume, observation period, histologic subtype, or quantity of remaining epithelial cells. The TR of thymomas does not predict RT impact on tumor cells but does reflect the quantity of inherent tumor stroma. (orig.)
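
    A worked version of the two volume metrics above (tumor volume reduction and mean daily decrement as a percentage of the pre-RT volume), with invented volumes:

        v_pre, v_post = 120.0, 40.0    # tumor volumes (cm3), invented
        days = 20                      # days between the two CT scans

        tr = (v_pre - v_post) / v_pre * 100    # tumor volume reduction (%)
        daily = tr / days                      # mean daily decrement (% of pre-RT volume)
        print(f"TR = {tr:.0f}%, mean daily decrement = {daily:.1f}% per day")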