WorldWideScience

Sample records for identify evaluate verify

  1. A performance evaluation of personnel identity verifiers

    International Nuclear Information System (INIS)

    Maxwell, R.L.; Wright, L.J.

    1987-01-01

    Personnel identity verification devices, which are based on the examination and assessment of a body feature or a unique repeatable personal action, are steadily improving. These biometric devices are becoming more practical with respect to accuracy, speed, user compatibility, reliability and cost, but more development is necessary to satisfy the varied and sometimes ill-defined future requirements of the security industry. In an attempt to maintain an awareness of the availability and the capabilities of identity verifiers for the DOE security community, Sandia Laboratories continues to comparatively evaluate the capabilities and improvements of developing devices. An evaluation of several recently available verifiers is discussed in this paper. Operating environments and procedures more typical of physical access control use can reveal performance substantially different from the basic laboratory tests

  2. Evaluation of verifiability in HAL/S. [programming language for aerospace computers]

    Science.gov (United States)

    Young, W. D.; Tripathi, A. R.; Good, D. I.; Browne, J. C.

    1979-01-01

    The suitability of HAL/S for writing verifiable programs, a characteristic highly desirable in aerospace applications, is limited because many features of HAL/S do not lend themselves to existing verification techniques. The methods of language evaluation are described, along with the means by which language features are evaluated for verifiability. These methods are applied in this study to various features of HAL/S to identify specific areas in which the language falls short with respect to verifiability. Some conclusions are drawn for the design of programming languages for aerospace applications, and ongoing work to identify a verifiable subset of HAL/S is described.

  3. Software Platform Evaluation - Verifiable Fuel Cycle Simulation (VISION) Model

    International Nuclear Information System (INIS)

    J. J. Jacobson; D. E. Shropshire; W. B. West

    2005-01-01

    The purpose of this Software Platform Evaluation (SPE) is to document the top-level evaluation of potential software platforms on which to construct a simulation model that satisfies the requirements for a Verifiable Fuel Cycle Simulation Model (VISION) of the Advanced Fuel Cycle (AFC). See the Software Requirements Specification for Verifiable Fuel Cycle Simulation (VISION) Model (INEEL/EXT-05-02643, Rev. 0) for a discussion of the objective and scope of the VISION model. VISION is intended to serve as a broad systems analysis and study tool applicable to work conducted as part of the AFCI (including cost estimates) and Generation IV reactor development studies. This document will serve as a guide for selecting the most appropriate software platform for VISION. This is a "living document" that will be modified over the course of the execution of this work.

  4. [The development and evaluation of software to verify diagnostic accuracy].

    Science.gov (United States)

    Jensen, Rodrigo; de Moraes Lopes, Maria Helena Baena; Silveira, Paulo Sérgio Panse; Ortega, Neli Regina Siqueira

    2012-02-01

    This article describes the development and evaluation of software that verifies the accuracy of diagnoses made by nursing students. The software was based on a model that uses fuzzy logic concepts and was developed with Perl, the MySQL database for Internet accessibility, and the NANDA-I 2007-2008 classification system. The software was evaluated in terms of its technical quality and usability through specific instruments. The activity proposed in the software involves four stages in which students establish the relationship values between nursing diagnoses, defining characteristics/risk factors and clinical cases. The relationship values determined by students are compared to those of specialists, generating performance scores for the students. In the evaluation, the software demonstrated satisfactory outcomes regarding technical quality and, according to the students, helped in their learning and may become an educational tool to teach the process of nursing diagnosis.

  5. Reasoning about knowledge: Children's evaluations of generality and verifiability.

    Science.gov (United States)

    Koenig, Melissa A; Cole, Caitlin A; Meyer, Meredith; Ridge, Katherine E; Kushnir, Tamar; Gelman, Susan A

    2015-12-01

    In a series of experiments, we examined 3- to 8-year-old children's (N=223) and adults' (N=32) use of two properties of testimony to estimate a speaker's knowledge: generality and verifiability. Participants were presented with a "Generic speaker" who made a series of 4 general claims about "pangolins" (a novel animal kind), and a "Specific speaker" who made a series of 4 specific claims about "this pangolin" as an individual. To investigate the role of verifiability, we systematically varied whether the claim referred to a perceptually-obvious feature visible in a picture (e.g., "has a pointy nose") or a non-evident feature that was not visible (e.g., "sleeps in a hollow tree"). Three main findings emerged: (1) young children showed a pronounced reliance on verifiability that decreased with age. Three-year-old children were especially prone to credit knowledge to speakers who made verifiable claims, whereas 7- to 8-year-olds and adults credited knowledge to generic speakers regardless of whether the claims were verifiable; (2) children's attributions of knowledge to generic speakers were not detectable until age 5, and only when those claims were also verifiable; (3) children often generalized speakers' knowledge outside of the pangolin domain, indicating a belief that a person's knowledge about pangolins likely extends to new facts. Findings indicate that young children may be inclined to doubt speakers who make claims they cannot verify themselves, and suggest a developmentally increasing appreciation for speakers who make general claims. Copyright © 2015 Elsevier Inc. All rights reserved.

  6. Identifying the 'right patient': nurse and consumer perspectives on verifying patient identity during medication administration.

    Science.gov (United States)

    Kelly, Teresa; Roper, Cath; Elsom, Stephen; Gaskin, Cadeyrn

    2011-10-01

    Accurate verification of patient identity during medication administration is an important component of medication administration practice. In medical and surgical inpatient settings, the use of identification aids, such as wristbands, is common. In many psychiatric inpatient units in Victoria, Australia, however, standardized identification aids are not used. The present paper outlines the findings of a qualitative research project that employed focus groups to examine mental health nurse and mental health consumer perspectives on the identification of patients during routine medication administration in psychiatric inpatient units. The study identified a range of different methods currently employed to verify patient identity, including technical methods, such as wristbands and photographs, and interpersonal methods, such as patient recognition. There were marked similarities in the perspectives of mental health nurses and mental health consumers regarding their opinions and preferences. Technical aids were seen as important, but not as a replacement for the therapeutic nurse-patient encounter. © 2011 The Authors. International Journal of Mental Health Nursing © 2011 Australian College of Mental Health Nurses Inc.

  7. Methods to verify absorbed dose of irradiated containers and evaluation of dosimeters

    International Nuclear Information System (INIS)

    Gao Meixu; Wang Chuanyao; Tang Zhangxong; Li Shurong

    2001-01-01

    Research on the dose distribution in irradiated food containers and an evaluation of several methods to verify absorbed dose were carried out. The minimum absorbed dose of the five treated orange containers occurred at the top of the highest or the bottom of the lowest container. Dmax/Dmin in this study was 1.45 for irradiation in a commercial 60Co facility. The density of the orange containers was about 0.391 g/cm3. The evaluation of dosimeters showed that the PMMA-YL and clear PMMA dosimeters have a linear dose response, and the word NOT in the STERIN-125 and STERIN-300 indicators was covered completely at doses of 125 and 300 Gy, respectively. (author)
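
    For readers unfamiliar with the Dmax/Dmin figure quoted above, the short sketch below (Python, with made-up dosimeter readings rather than the study's data) shows how a dose uniformity ratio is obtained from mapped dose readings.

```python
# Tiny illustrative calculation (hypothetical dosimeter readings, not the study's
# data): the dose uniformity ratio Dmax/Dmin used to characterise the dose
# distribution across a stack of product containers.
readings_gy = {
    "top of highest container": 410.0,
    "centre of middle container": 520.0,
    "bottom of lowest container": 395.0,
}
d_max, d_min = max(readings_gy.values()), min(readings_gy.values())
print(f"Dmax/Dmin = {d_max / d_min:.2f}")   # 1.32 for these made-up values
```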

  8. Evaluation of wastewater contaminant transport in surface waters using verified Lagrangian sampling

    Science.gov (United States)

    Antweiler, Ronald C.; Writer, Jeffrey H.; Murphy, Sheila F.

    2014-01-01

    Contaminants released from wastewater treatment plants can persist in surface waters for substantial distances. Much research has gone into evaluating the fate and transport of these contaminants, but this work has often assumed constant flow from wastewater treatment plants. However, effluent discharge commonly varies widely over a 24-hour period, and this variation controls contaminant loading and can profoundly influence interpretations of environmental data. We show that methodologies relying on the normalization of downstream data to conservative elements can give spurious results, and should not be used unless it can be verified that the same parcel of water was sampled. Lagrangian sampling, which in theory samples the same water parcel as it moves downstream (the Lagrangian parcel), links hydrologic and chemical transformation processes so that the in-stream fate of wastewater contaminants can be quantitatively evaluated. However, precise Lagrangian sampling is difficult, and small deviations – such as missing the Lagrangian parcel by less than 1 h – can cause large differences in measured concentrations of all dissolved compounds at downstream sites, leading to erroneous conclusions regarding in-stream processes controlling the fate and transport of wastewater contaminants. Therefore, we have developed a method termed “verified Lagrangian” sampling, which can be used to determine if the Lagrangian parcel was actually sampled, and if it was not, a means for correcting the data to reflect the concentrations which would have been obtained had the Lagrangian parcel been sampled. To apply the method, it is necessary to have concentration data for a number of conservative constituents from the upstream, effluent, and downstream sites, along with upstream and effluent concentrations that are constant over the short-term (typically 2–4 h). These corrections can subsequently be applied to all data, including non-conservative constituents. Finally, we

  9. Evaluation of wastewater contaminant transport in surface waters using verified Lagrangian sampling.

    Science.gov (United States)

    Antweiler, Ronald C; Writer, Jeffrey H; Murphy, Sheila F

    2014-02-01

    Contaminants released from wastewater treatment plants can persist in surface waters for substantial distances. Much research has gone into evaluating the fate and transport of these contaminants, but this work has often assumed constant flow from wastewater treatment plants. However, effluent discharge commonly varies widely over a 24-hour period, and this variation controls contaminant loading and can profoundly influence interpretations of environmental data. We show that methodologies relying on the normalization of downstream data to conservative elements can give spurious results, and should not be used unless it can be verified that the same parcel of water was sampled. Lagrangian sampling, which in theory samples the same water parcel as it moves downstream (the Lagrangian parcel), links hydrologic and chemical transformation processes so that the in-stream fate of wastewater contaminants can be quantitatively evaluated. However, precise Lagrangian sampling is difficult, and small deviations - such as missing the Lagrangian parcel by less than 1h - can cause large differences in measured concentrations of all dissolved compounds at downstream sites, leading to erroneous conclusions regarding in-stream processes controlling the fate and transport of wastewater contaminants. Therefore, we have developed a method termed "verified Lagrangian" sampling, which can be used to determine if the Lagrangian parcel was actually sampled, and if it was not, a means for correcting the data to reflect the concentrations which would have been obtained had the Lagrangian parcel been sampled. To apply the method, it is necessary to have concentration data for a number of conservative constituents from the upstream, effluent, and downstream sites, along with upstream and effluent concentrations that are constant over the short-term (typically 2-4h). These corrections can subsequently be applied to all data, including non-conservative constituents. Finally, we show how data

  10. POSSIBILITIES TO EVALUATE THE QUALITY OF EDUCATION BY VERIFYING THE DISTRIBUTION OF MARKS

    Directory of Open Access Journals (Sweden)

    Alexandru BOROIU

    2015-05-01

    Full Text Available In higher education, it is of considerable interest to evaluate the educational process using numeric indicators obtained from the database of final results achieved by students in the examination session. The following numeric indicators can be used for this purpose: the proportion of students absent from the final evaluation, the proportion of students who did not pass, and the degree of normality of the distribution of passing marks. To this end we developed an Excel calculation program that can be applied to each discipline. The inputs are concrete (total number of students, number of students present at the final evaluation, absolute frequency of marks) and the outputs for the three indicators are binary (competent or noncompetent), in the latter case the verdict being: “Give explanations. Propose an action plan, with actions, responsible persons and terms”. To verify the imposed degree of normality we developed a calculation program based on the Kolmogorov-Smirnov concordance test. This increased the objectivity of the analysis and created the opportunity to apply corrective measures in order to improve the education process.
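
    As a rough illustration of the three indicators and the Kolmogorov-Smirnov check described above, the following Python sketch computes them for one course. It is not the authors' Excel program; the threshold values and the use of scipy's kstest against a fitted normal distribution are assumptions made for the example.

```python
# Illustrative sketch only (not the authors' program): the three course indicators
# described above, with hypothetical acceptance thresholds.
import numpy as np
from scipy import stats

def course_indicators(n_enrolled, n_present, passing_marks,
                      max_absent=0.20, max_failed=0.30, alpha=0.05):
    marks = np.asarray(passing_marks, dtype=float)
    p_absent = 1 - n_present / n_enrolled
    p_failed = 1 - len(marks) / n_present
    # Kolmogorov-Smirnov concordance test of the passing marks against a normal
    # distribution fitted to them (assumed stand-in for the paper's test)
    _, p_value = stats.kstest(marks, "norm", args=(marks.mean(), marks.std(ddof=1)))
    return {
        "absent_ok": p_absent <= max_absent,
        "failed_ok": p_failed <= max_failed,
        "normality_ok": p_value >= alpha,
    }

# Hypothetical exam-session data: 60 enrolled, 55 present, passing marks on a 5-10 scale
marks = [5, 6, 6, 7, 7, 7, 8, 8, 8, 8, 9, 9, 10] * 4
print(course_indicators(n_enrolled=60, n_present=55, passing_marks=marks))
```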

  11. Evaluating MC and A effectiveness to verify the presence of nuclear materials

    International Nuclear Information System (INIS)

    Dawson, P.G.; Morzinski, J.A.; Ostenak, Carl A.; Longmire, V.L.; Jewell, D.; Williams, J.D.

    2001-01-01

    Traditional materials accounting is focused exclusively on the material balance area (MBA), and involves periodically closing a material balance based on accountability measurements conducted during a physical inventory. In contrast, the physical inventory for Los Alamos National Laboratory's near-real-time accounting system is established around processes and looks more like an item inventory. That is, the intent is not to measure material for accounting purposes, since materials have already been measured in the normal course of daily operations. A given unit process operates many times over the course of a material balance period. The product of a given unit process may move for processing within another unit process in the same MBA or may be transferred out of the MBA. Since few materials are unmeasured the physical inventory for a near-real-time process area looks more like an item inventory. Thus, the intent of the physical inventory is to locate the materials on the books and verify information about the materials contained in the books. Closing a materials balance for such an area is a matter of summing all the individual mass balances for the batches processed by all unit processes in the MBA. Additionally, performance parameters are established to measure the program's effectiveness. Program effectiveness for verifying the presence of nuclear material is required to be equal to or greater than a prescribed performance level, process measurements must be within established precision and accuracy values, physical inventory results meet or exceed performance requirements, and inventory differences are less than a target/goal quantity. This approach exceeds DOE established accounting and physical inventory program requirements. Hence, LANL is committed to this approach and to seeking opportunities for further improvement through integrated technologies. This paper will provide a detailed description of this evaluation process.

  12. Reliability of stable Pb isotopes to identify Pb sources and verifying biological fractionation of Pb isotopes in goats and chickens

    International Nuclear Information System (INIS)

    Nakata, Hokuto; Nakayama, Shouta M.M.; Yabe, John; Liazambi, Allan; Mizukawa, Hazuki; Darwish, Wageh Sobhy; Ikenaka, Yoshinori; Ishizuka, Mayumi

    2016-01-01

    Stable Pb isotope ratios (Pb-IRs) have been recognized as an efficient tool for identifying sources. This study was carried out at the Kabwe mining area, Zambia, to elucidate the presence or absence of Pb isotope fractionation in goats and chickens, to evaluate the reliability of identifying Pb pollution sources via analysis of Pb-IRs, and to assess whether a threshold of blood Pb levels (Pb-B) for biological fractionation was present. The variation of Pb-IRs in goats decreased with an increase in Pb-B, and the ratios became fixed at values close to those of the dominant source of Pb exposure at Pb-B > 5 μg/dL. However, chickens did not show a clear relationship for Pb-IRs against Pb-B, or a fractionation threshold. Given these findings, biological fractionation of Pb isotopes appears to occur in goats but not in chickens, and the threshold for triggering biological fractionation is around 5 μg/dL of Pb-B in goats. - Highlights: • Presence of Pb isotope fractionation in goat and chicken was studied. • The variation of Pb-IRs in goat decreased with an increase in Pb-B. • Chickens did not show a clear relationship for Pb-IRs against Pb-B. • The biological fractionation of Pb isotopes should not occur in chickens but in goats. • Threshold for triggering biological fractionation is at 5 μg/dL of Pb-B in goats. - Biological fractionation and its threshold for stable Pb isotope ratio in goats and chickens were examined.

  13. Reasoning about knowledge: Children’s evaluations of generality and verifiability

    Science.gov (United States)

    Koenig, Melissa A.; Cole, Caitlin A.; Meyer, Meredith; Ridge, Katherine E.; Kushnir, Tamar; Gelman, Susan A.

    2015-01-01

    In a series of experiments, we examined 3- to 8-year-old children’s (N = 223) and adults’ (N = 32) use of two properties of testimony to estimate a speaker’s knowledge: generality and verifiability. Participants were presented with a “Generic speaker” who made a series of 4 general claims about “pangolins” (a novel animal kind), and a “Specific speaker” who made a series of 4 specific claims about “this pangolin” as an individual. To investigate the role of verifiability, we systematically varied whether the claim referred to a perceptually-obvious feature visible in a picture (e.g., “has a pointy nose”) or a non-evident feature that was not visible (e.g., “sleeps in a hollow tree”). Three main findings emerged: (1) Young children showed a pronounced reliance on verifiability that decreased with age. Three-year-old children were especially prone to credit knowledge to speakers who made verifiable claims, whereas 7- to 8-year-olds and adults credited knowledge to generic speakers regardless of whether the claims were verifiable; (2) Children’s attributions of knowledge to generic speakers were not detectable until age 5, and only when those claims were also verifiable; (3) Children often generalized speakers’ knowledge outside of the pangolin domain, indicating a belief that a person’s knowledge about pangolins likely extends to new facts. Findings indicate that young children may be inclined to doubt speakers who make claims they cannot verify themselves, and suggest a developmentally increasing appreciation for speakers who make general claims. PMID:26451884

  14. Verifying Digital Components of Physical Systems: Experimental Evaluation of Test Quality

    Science.gov (United States)

    Laputenko, A. V.; López, J. E.; Yevtushenko, N. V.

    2018-03-01

    This paper continues the study of high-quality test derivation for verifying digital components used in various physical systems, such as sensors, data transfer components, etc. We have used logic circuits b01-b010 of the package of ITC'99 benchmarks (Second Release) for the experimental evaluation; as stated before, these circuits describe digital components of physical systems designed for various applications. Test sequences are derived for detecting the best-known faults of the reference logic circuit using three different approaches to test derivation. Three widely used fault types (stuck-at faults, bridges, and faults that slightly modify the behavior of one gate) are considered as possible faults of the reference behavior. The most interesting test sequences are short test sequences that can provide appropriate guarantees after testing, and thus we experimentally study various approaches to the derivation of so-called complete test suites which detect all fault types. In the first series of experiments, we compare two approaches for deriving complete test suites. In the first approach, a shortest test sequence is derived for testing each fault. In the second approach, a test sequence is pseudo-randomly generated using appropriate software for logic synthesis and verification (the ABC system in our study) and thus can be longer. However, after deleting sequences detecting the same set of faults, the test suite returned by the second approach is shorter. The latter underlines the fact that in many cases it is useless to spend "time and effort" deriving a shortest distinguishing sequence; it is better to apply test minimization afterwards. The performed experiments also show that the use of only randomly generated test sequences is not very efficient, since such sequences do not detect all the faults of any type. After reaching a fault coverage of around 70%, saturation is observed, and the fault coverage cannot be increased anymore. For

  15. Experimental evaluation of the exposure level onboard Czech Airlines aircraft - measurements verified the routine method

    International Nuclear Information System (INIS)

    Ploc, O.; Spurny, F.; Turek, K.; Kovar, I.

    2008-01-01

    Air-crew members are exposed to ionizing radiation due to their work on board aircraft. In 1990 the International Commission on Radiological Protection (ICRP) recommended that exposure to cosmic radiation in the operation of jet aircraft be recognised as occupational exposure. Czech air transport operators are therefore obliged to ensure that: air-crew members are well informed about the exposure level and health risks; the complete exposure level of aircraft crew is analysed and continuously monitored in cases exceeding the informative value of 1 mSv; and the limit of 1 mSv during pregnancy is complied with. Since 1998, after receiving proper accreditation, the Department of Radiation Dosimetry of the Nuclear Physics Institute of the Czech Academy of Sciences (DRD) has been the competent dosimetric service implementing the requirements of Notice No. 307 of the State Office for Nuclear Safety concerning air-crew exposure (paragraphs 87-90). The DRD developed a routine method of personal dosimetry for aircraft crew in 1998, which has been applied since receiving accreditation in the same year. The DRD therefore helps Czech Airlines a.s. (CSA) with the legislative obligations mentioned above, and in return, once every four years under a business contract, CSA allows scientific measurements to be performed by the DRD on board its aircraft with the aim of verifying the routine method of individual monitoring of aircraft crew exposure. (authors)

  16. Verified Gaming

    DEFF Research Database (Denmark)

    Kiniry, Joseph Roland; Zimmerman, Daniel

    2011-01-01

    In recent years, several Grand Challenges (GCs) of computing have been identified and expounded upon by various professional organizations in the U.S. and England. These GCs are typically very difficult problems that will take many hundreds, or perhaps thousands, of man-years to solve. Researchers... ...falls every year, and any mention of mathematics in the classroom seems to frighten students away. So the question is: How do we attract new students in computing to the area of dependable software systems? Over the past several years at three universities we have experimented with the use of computer games...

  17. Evaluation of the ability of rod drop tests to verify the stability margins in FTR

    International Nuclear Information System (INIS)

    Harris, R.A.; Sevenich, R.A.

    1976-01-01

    Predictions of the stability characteristics of FTR indicate that the reactor can be easily controlled even under the worst possible conditions. Nevertheless, experimental verification and monitoring of these characteristics will be performed during operation of the reactor. An initial evaluation of rod drop experiments which could possibly provide this verification is presented

  18. Evaluation of puberty by verifying spontaneous and stimulated gonadotropin values in girls.

    Science.gov (United States)

    Chin, Vivian L; Cai, Ziyong; Lam, Leslie; Shah, Bina; Zhou, Ping

    2015-03-01

    Changes in pharmacological agents and advancements in laboratory assays have changed the gonadotropin-releasing hormone analog stimulation test. The aim was to determine the best predictive model for detecting puberty in girls. Thirty-five girls, aged 2 years 7 months to 9 years 3 months, with central precocious puberty (CPP) (n=20) or premature thelarche/premature adrenarche (n=15) were studied. Diagnoses were based on clinical information, baseline hormones, bone age, and pelvic sonogram. Gonadotropins and E2 were analyzed using an immunochemiluminometric assay. Logistic regression for CPP was performed. The best predictor of CPP is the E2-change model based on 3- to 24-h values, providing 80% sensitivity and 87% specificity. Three-hour luteinizing hormone (LH) provided 75% sensitivity and 87% specificity. Basal LH lowered sensitivity to 65% and specificity to 53%. The E2-change model provided the best predictive power; however, 3-h LH was more practical and convenient when evaluating puberty in girls.

  19. Process to identify and evaluate restoration options

    International Nuclear Information System (INIS)

    Strand, J.; Senner, S.; Weiner, A.; Rabinowitch, S.; Brodersen, M.; Rice, K.; Klinge, K.; MacMullin, S.; Yender, R.; Thompson, R.

    1993-01-01

    The restoration planning process has yielded a number of possible alternatives for restoring resources and services injured by the Exxon Valdez oil spill. They were developed by resource managers, scientists, and the public, taking into consideration the results of damage assessment and restoration studies and information from the scientific literature. The alternatives thus far identified include no action (natural recovery), management of human uses, manipulation of resources, habitat protection and acquisition, acquisition of equivalent resources, and combinations of the above. Each alternative consists of a different mix of resource- or service-specific restoration options. To decide whether it was appropriate to spend restoration funds on a particular resource or service, criteria first had to be developed that evaluated available evidence for consequential injury and the adequacy and rate of natural recovery. Then, recognizing the range of effective restoration options, a second set of criteria was applied to determine which restoration options were the most beneficial. These criteria included technical feasibility, potential to improve the rate or degree of recovery, the relationship of expected costs to benefits, cost effectiveness, and the potential to restore the ecosystem as a whole. The restoration options considered to be most beneficial will be grouped together in several or more of the above alternatives and presented in a draft restoration plan. They will be further evaluated in a companion draft environmental impact statement.

  20. Evaluation of food emergency response laboratories' capability for 210Po analysis using proficiency test material with verifiable traceability

    International Nuclear Information System (INIS)

    Zhongyu Wu; Zhichao Lin; Mackill, P.; Cong Wei; Noonan, J.; Cherniack, J.; Gillis-Landrum, D.

    2009-01-01

    Measurement capability and data comparability are essential for emergency response when analytical data from cooperative laboratories are used for risk assessment and post-incident decision making. In this study, the current capability of food emergency response laboratories for the analysis of 210Po in water was evaluated using a proficiency test scheme in compliance with ISO-43 and ILAC G13 guidelines, which comprises a test sample preparation and verification protocol and an insightful statistical data evaluation. The results of performance evaluations on relative bias, value trueness, precision, false positive detection, minimum detection limit, and limit of quantification are presented. (author)
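
    Two of the performance measures listed above, relative bias and a z-score against the assigned value, can be illustrated with a short sketch. The numbers and the acceptance criterion are hypothetical; this is not the study's evaluation code.

```python
# Hedged sketch (hypothetical numbers): relative bias and a z-score against the
# assigned value of a proficiency-test sample, as commonly used in ISO-style
# proficiency testing.
def evaluate_result(reported, assigned_value, sigma_pt):
    relative_bias = (reported - assigned_value) / assigned_value * 100.0   # percent
    z_score = (reported - assigned_value) / sigma_pt
    return relative_bias, z_score

# Hypothetical 210Po activity concentrations in Bq/L
bias, z = evaluate_result(reported=0.92, assigned_value=1.00, sigma_pt=0.10)
print(f"relative bias = {bias:.1f}%, z = {z:.1f}")   # -8.0%, -0.8 (|z| <= 2 often taken as acceptable)
```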

  1. Status of personnel identity verifiers

    International Nuclear Information System (INIS)

    Maxwell, R.L.

    1985-01-01

    Identity verification devices based on the interrogation of six different human biometric features or actions now exist and in general have been in development for about ten years. The capability of these devices to meet the cost and operational requirements of speed, accuracy, ease of use and reliability has generally increased although the verifier industry is still immature. Sandia Laboratories makes a continuing effort to stay abreast of identity verifier developments and to assess the capabilities and improvements of each device. Operating environment and procedures more typical of field use can often reveal performance results substantially different from laboratory tests. An evaluation of several recently available verifiers is herein reported

  2. Identifying, Preparing and Evaluating Army Instructors

    Science.gov (United States)

    2016-04-01

    is perhaps the most prominent and widely-used framework for evaluating training courses and programs (Hilbert, Preskill & Russ-Eft, 1997; Hoole...Glazerman, S., & Seifullah, A. (2012). An evaluation of the Chicago Teacher Advancement Program (Chicago TAP) after four years. (Report prepared for...H., & Russ-Eft, D. (1997). Evaluating training. In L. J. Bassi & D. Russ-Eft (Eds.), What works: Assessment, development, and measurement (pp. 109

  3. Identifying Anomalous Citations for Objective Evaluation of Scholarly Article Impact.

    Directory of Open Access Journals (Sweden)

    Xiaomei Bai

    Full Text Available Evaluating the impact of a scholarly article is of great significance and has attracted great attention. Although citation-based evaluation approaches have been widely used, these approaches face limitations, e.g., in identifying anomalous citation patterns. This negligence would inevitably cause unfairness and inaccuracy in article impact evaluation. In this study, in order to discover anomalous citations and ensure the fairness and accuracy of research outcome evaluation, we investigate the citation relationships between articles using the following factors: collaboration times, the time span of collaboration, citing times and the time span of citing, to weaken the relationship of Conflict of Interest (COI) in the citation network. Meanwhile, we study a special kind of COI, namely the suspected COI relationship. Based on the COI relationship, we further bring forward the COIRank algorithm, an innovative scheme for accurately assessing the impact of an article. Our method distinguishes citation strength, and utilizes the PageRank and HITS algorithms to rank scholarly articles comprehensively. The experiments are conducted on the American Physical Society (APS) dataset. We find that about 80.88% of the 26,366 articles examined contain citations contributed by co-authors, and 75.55% of these articles are cited by authors belonging to the same affiliation, indicating that COI and suspected COI should not be ignored when evaluating the impact of scientific papers objectively. Moreover, our experimental results demonstrate that the COIRank algorithm significantly outperforms state-of-the-art solutions. The validity of our approach is verified using the probability of Recommendation Intensity.

  4. Identifying Anomalous Citations for Objective Evaluation of Scholarly Article Impact.

    Science.gov (United States)

    Bai, Xiaomei; Xia, Feng; Lee, Ivan; Zhang, Jun; Ning, Zhaolong

    2016-01-01

    Evaluating the impact of a scholarly article is of great significance and has attracted great attention. Although citation-based evaluation approaches have been widely used, these approaches face limitations, e.g., in identifying anomalous citation patterns. This negligence would inevitably cause unfairness and inaccuracy in article impact evaluation. In this study, in order to discover anomalous citations and ensure the fairness and accuracy of research outcome evaluation, we investigate the citation relationships between articles using the following factors: collaboration times, the time span of collaboration, citing times and the time span of citing, to weaken the relationship of Conflict of Interest (COI) in the citation network. Meanwhile, we study a special kind of COI, namely the suspected COI relationship. Based on the COI relationship, we further bring forward the COIRank algorithm, an innovative scheme for accurately assessing the impact of an article. Our method distinguishes citation strength, and utilizes the PageRank and HITS algorithms to rank scholarly articles comprehensively. The experiments are conducted on the American Physical Society (APS) dataset. We find that about 80.88% of the 26,366 articles examined contain citations contributed by co-authors, and 75.55% of these articles are cited by authors belonging to the same affiliation, indicating that COI and suspected COI should not be ignored when evaluating the impact of scientific papers objectively. Moreover, our experimental results demonstrate that the COIRank algorithm significantly outperforms state-of-the-art solutions. The validity of our approach is verified using the probability of Recommendation Intensity.
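
    The core idea, down-weighting suspected COI citations before applying a link-analysis ranking, can be sketched as follows. This is a simplified illustration rather than the published COIRank algorithm; the edge data, the penalty factor, and the use of plain weighted PageRank (without the HITS component) are assumptions of the example.

```python
# Simplified sketch of COI-aware ranking (not the published COIRank algorithm):
# citation edges suspected of Conflict of Interest are down-weighted before
# running PageRank. Edge data and the weighting scheme are hypothetical.
import networkx as nx

def coi_weighted_pagerank(citations, coi_penalty=0.2):
    """citations: iterable of (citing, cited, is_coi) triples."""
    g = nx.DiGraph()
    for citing, cited, is_coi in citations:
        # weaken citations between frequent co-authors / same-affiliation authors
        g.add_edge(citing, cited, weight=coi_penalty if is_coi else 1.0)
    return nx.pagerank(g, alpha=0.85, weight="weight")

citations = [
    ("A", "B", False),
    ("A", "C", True),   # e.g. frequent co-authors -> suspected COI
    ("D", "C", False),
    ("E", "B", False),
]
print(coi_weighted_pagerank(citations))
```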

  5. Externally Verifiable Oblivious RAM

    Directory of Open Access Journals (Sweden)

    Gancher Joshua

    2017-04-01

    Full Text Available We present the idea of externally verifiable oblivious RAM (ORAM). Our goal is to allow a client and server carrying out an ORAM protocol to have disputes adjudicated by a third party, allowing for the enforcement of penalties against an unreliable or malicious server. We give a security definition that guarantees protection not only against a malicious server but also against a client making false accusations. We then give modifications of the Path ORAM [15] and Ring ORAM [9] protocols that meet this security definition. These protocols both have the same asymptotic runtimes as the semi-honest original versions and require the external verifier to be involved only when the client or server deviates from the protocol. Finally, we implement externally verified ORAM, along with an automated cryptocurrency contract to use as the external verifier.

  6. Verifier Theory and Unverifiability

    OpenAIRE

    Yampolskiy, Roman V.

    2016-01-01

    Despite significant developments in Proof Theory, surprisingly little attention has been devoted to the concept of proof verifier. In particular, the mathematical community may be interested in studying different types of proof verifiers (people, programs, oracles, communities, superintelligences) as mathematical objects. Such an effort could reveal their properties, their powers and limitations (particularly in human mathematicians), minimum and maximum complexity, as well as self-verificati...

  7. USCIS E-Verify Program Reports

    Data.gov (United States)

    Department of Homeland Security — The report builds on the last comprehensive evaluation of the E-Verify Program and demonstrates that E-Verify produces accurate results and that accuracy rates have...

  8. Verifiably Truthful Mechanisms

    DEFF Research Database (Denmark)

    Branzei, Simina; Procaccia, Ariel D.

    2015-01-01

    the computational sense). Our approach involves three steps: (i) specifying the structure of mechanisms, (ii) constructing a verification algorithm, and (iii) measuring the quality of verifiably truthful mechanisms. We demonstrate this approach using a case study: approximate mechanism design without money...

  9. Comparison of VerifyNow-P2Y12 test and Flow Cytometry for monitoring individual platelet response to clopidogrel. What is the cut-off value for identifying patients who are low responders to clopidogrel therapy?

    Directory of Open Access Journals (Sweden)

    Castelli Alfredo

    2009-05-01

    Full Text Available Abstract Background Dual anti-platelet therapy with aspirin and a thienopyridine (DAT) is used to prevent stent thrombosis after percutaneous coronary intervention (PCI). Low response to clopidogrel therapy (LR) occurs, but laboratory tests have a controversial role in the identification of this condition. Methods We studied LR in patients with stable angina undergoing elective PCI, all on DAT for at least 7 days, by comparing: 1) flow cytometry (FC) to measure platelet membrane expression of P-selectin (CD62P) and PAC-1 binding following double stimulation with ADP and collagen type I in the presence of prostaglandin (PG) E1; 2) the VerifyNow-P2Y12 test, in which results are reported as absolute P2Y12-Reaction-Units (PRU) or % of inhibition (% inhibition). Results Thirty controls and 52 patients were analyzed. The median percentage of platelets exhibiting CD62P expression and PAC-1 binding by FC evaluation after stimulation in the presence of PG E1 was 25.4% (IQR: 21.4–33.1%) and 3.5% (1.7–9.4%), respectively. Only 6 patients receiving DAT (11.5%) had both values above the 1st quartile of controls, and were defined as LR. Evaluation of the same patients with the VerifyNow-P2Y12 test revealed that the area under the receiver-operating-characteristic (ROC) curve was 0.94 (95% CI: 0.84–0.98); a cut-off value of ≤ 15% inhibition or > 213 PRU gave the maximum accuracy for the detection of patients defined as having LR by FC. Conclusion Our findings show that a cut-off value of ≤ 15% inhibition or > 213 PRU in the VerifyNow-P2Y12 test may provide the best accuracy for the identification of patients with LR.
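
    The kind of cut-off selection reported above can be illustrated with a small sketch that scans candidate PRU thresholds and keeps the one with the highest accuracy against the flow-cytometry classification. The patient values below are synthetic and are not the study data.

```python
# Illustrative sketch (synthetic numbers, not study data): choosing the PRU cut-off
# that maximizes classification accuracy against the flow-cytometry definition of
# low response, in the spirit of the ROC analysis described above.
import numpy as np

def best_pru_cutoff(pru, is_low_responder):
    pru = np.asarray(pru, dtype=float)
    y = np.asarray(is_low_responder, dtype=bool)
    best = None
    for threshold in np.unique(pru):
        predicted_lr = pru > threshold          # higher PRU = less P2Y12 inhibition
        accuracy = np.mean(predicted_lr == y)
        if best is None or accuracy > best[1]:
            best = (threshold, accuracy)
    return best

# Hypothetical PRU values for 10 patients; True = low responder by flow cytometry
pru = [120, 150, 180, 200, 210, 220, 230, 250, 270, 300]
lr  = [False, False, False, False, False, True, False, True, True, True]
print(best_pru_cutoff(pru, lr))   # (210.0, 0.9) with these made-up numbers
```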

  10. Verifying versus falsifying banknotes

    Science.gov (United States)

    van Renesse, Rudolf L.

    1998-04-01

    A series of counterfeit Dutch, German, English, and U.S. banknotes was examined with respect to the various modi operandi to imitate paper-based, printed and post-printed security features. These features provide positive evidence (verifiability) as well as negative evidence (falsifiability). It appears that the positive evidence provided in most cases is insufficiently convincing: banknote inspection mainly rests on negative evidence. The act of falsifying (to prove to be false), however, is an inefficacious procedure. Ergonomic verificatory security features are demanded. This demand is increasingly met by security features based on nano-technology. The potential of nano-security has a twofold base: (1) the unique optical effects displayed allow simple, fast and unambiguous inspection, and (2) the nano-technology they are based on makes successful counterfeit or simulation extremely improbable.

  11. Verified scientific findings

    International Nuclear Information System (INIS)

    Bullinger, M.G.

    1982-01-01

    In this essay, the author attempts to enlighten the reader as to the meaning of the term "verified scientific findings" in section 13, sub-section 1, sentence 2 of the new Chemicals Control Law. The examples given here are the generally accepted regulations with regard to technology (that is, sections 7a and 18b of the WHG (law on water economy), section 3, sub-section 1 of the machine- and engine protection laws), to the status of technology (section 3, sub-section 6 of the BImSchG (Fed. law on prevention of air-borne pollution)), and to the status of science (section 5, sub-section 2 of the AMG (drug legislation)). The "status of science and technology" as defined in sections 4 ff of the Atomic Energy Law (AtomG) and in sections 3, 4, 12, 2) of the First Radiation Protection Ordinance (1.StrlSch. VO) is also discussed. The author defines this, in his opinion, "dynamic" term as the generally recognized results of scientific research and the respective possibilities of the practical utilization of technology. (orig.)

  12. Guidance for Identifying, Selecting and Evaluating Open Literature Studies

    Science.gov (United States)

    This guidance for Office of Pesticide Program staff will assist in their evaluation of open literature studies of pesticides. It also describes how we identify, select, and ensure that data we use in risk assessments is of sufficient scientific quality.

  13. Identifying and Evaluating External Validity Evidence for Passing Scores

    Science.gov (United States)

    Davis-Becker, Susan L.; Buckendahl, Chad W.

    2013-01-01

    A critical component of the standard setting process is collecting evidence to evaluate the recommended cut scores and their use for making decisions and classifying students based on test performance. Kane (1994, 2001) proposed a framework by which practitioners can identify and evaluate evidence of the results of the standard setting from (1)…

  14. The status of personnel identity verifiers

    International Nuclear Information System (INIS)

    Maxwell, R.L.

    1985-01-01

    Identity verification devices based on the interrogation of six different human biometric features or actions now exist and in general have been in development for about ten years. The capability of these devices to meet the cost and operational requirements of speed, accuracy, ease of use and reliability has generally increased although the verifier industry is still immature. Sandia Laboratories makes a continuing effort to stay abreast of identity verifier developments and to assess the capabilities and improvements of each device. Operating environment and procedures more typical of field use can often reveal performance results substantially different from laboratory tests. An evaluation of several recently available verifiers is herein reported

  15. Personal identifiers in medical research networks: evaluation of the personal identifier generator in the Competence Network Paediatric Oncology and Haematology

    Directory of Open Access Journals (Sweden)

    Pommerening, Klaus

    2006-06-01

    Full Text Available The Society for Paediatric Oncology and Haematology (GPOH) and the corresponding Competence Network Paediatric Oncology and Haematology conduct various clinical trials. The comprehensive analysis requires reliable identification of the recruited patients. Therefore, a personal identifier (PID) generator is used to assign unambiguous, pseudonymous, non-reversible PIDs to participants in those trials. We tested the matching algorithm of the PID generator using a configuration specific to the GPOH. False data was used to verify the correct processing of PID requests (functionality tests), while test data was used to evaluate the matching outcome. We also assigned PIDs to more than 44,000 data records from the German Childhood Cancer Registry (GCCR) and assessed the status of the associated patient list which contains the PIDs, partly encrypted data items and information on the PID generation process for each data record. All the functionality tests showed the expected results. Neither 14,915 test data records nor the GCCR data records yielded any homonyms. Six synonyms were found in the test data, due to erroneous birth dates, and 22 synonyms were found when the GCCR data was run against the actual patient list of 2579 records. In the resulting patient list of 45,693 entries, duplicate record submissions were found for about 7% of all listed patients, while more frequent submissions occurred in less than 1% of cases. The synonym error rate depends mainly on the quality of the input data and on the frequency of multiple submissions. Depending on the requirements on maximally tolerable synonym and homonym error rates, additional measures for securing input data quality might be necessary. The results demonstrate that the PID generator is an appropriate tool for reliably identifying trial participants in medical research networks.

  16. Verified OS Interface Code Synthesis

    Science.gov (United States)

    2016-12-01

    results into the larger proof framework of the seL4 microkernel to be directly usable in practice. Beyond the stated project goals, the solution...were used for the verified ML compiler CakeML, can now also be used in the Isabelle/HOL system that was used for the verified seL4 microkernel. This combination increases proof productivity...

  17. Identifying and Evaluating Chaotic Behavior in Hydro-Meteorological Processes

    Directory of Open Access Journals (Sweden)

    Soojun Kim

    2015-01-01

    Full Text Available The aim of this study is to identify and evaluate chaotic behavior in hydro-meteorological processes. The study poses two hypotheses to identify chaotic behavior of the processes: first, that the input data are the significant factor providing chaotic characteristics to the output data; and second, that the system itself is the significant factor providing chaotic characteristics to the output data. To investigate this issue, hydro-meteorological time series such as precipitation, air temperature, discharge, and storage volume were collected in the Great Salt Lake and Bear River Basin, USA. Time series with a period of approximately one year were extracted from the original series using the wavelet transform. Time series generated from a summation of sine functions were fitted to each series and used for investigating the hypotheses. Artificial neural networks were then built for modeling the reservoir system, and the correlation dimension was analyzed for the evaluation of chaotic behavior between inputs and outputs. From the results, we found that the chaotic characteristic of the storage volume (the output) is likely a byproduct of the chaotic behavior of the reservoir system itself rather than that of the input data.
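
    The correlation-dimension analysis mentioned above rests on a correlation sum of the kind sketched below (a Grassberger-Procaccia-style estimate on a delay embedding). The embedding parameters, radii, and the stand-in signal are assumptions for illustration, not the paper's settings or data.

```python
# Minimal sketch of a Grassberger-Procaccia-style correlation sum, which underlies
# correlation-dimension analysis. Parameters and the test signal are illustrative only.
import numpy as np

def correlation_sum(series, radius, embed_dim=3, delay=1):
    x = np.asarray(series, dtype=float)
    n = len(x) - (embed_dim - 1) * delay
    # delay embedding: each row is one reconstructed state vector
    vectors = np.column_stack([x[i * delay: i * delay + n] for i in range(embed_dim)])
    dists = np.linalg.norm(vectors[:, None, :] - vectors[None, :, :], axis=-1)
    pairs = np.triu_indices(n, k=1)
    return np.mean(dists[pairs] < radius)

# The correlation dimension is estimated from the slope of log C(r) versus log r.
t = np.linspace(0, 60, 1500)
signal = np.sin(t) + 0.5 * np.sin(2.7 * t)          # stand-in quasi-periodic series
radii = np.logspace(-0.5, 0.3, 8)
c = np.array([correlation_sum(signal, r) for r in radii])
slope = np.polyfit(np.log(radii), np.log(np.maximum(c, 1e-12)), 1)[0]
print(round(slope, 2))   # rough slope of log C(r) vs log r (correlation-dimension estimate)
```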

  18. Methodology to identify, review, and evaluate components for license renewal

    International Nuclear Information System (INIS)

    Carlson, D.D.; Gregor, F.E.; Walker, R.S.

    1988-01-01

    A methodology has been developed to systematically identify, review, and evaluate plant equipment for license renewal. The method builds upon the existing licensing basis, operating history, and accepted deterministic and probabilistic techniques. Use of these approaches provides a focus for license renewal upon those safety-significant systems and components that are not routinely replaced, refurbished, or subject to detailed inspection as part of the plant's existing test, maintenance, and surveillance programs. Application of the method identified the PWR and BWR systems that should be subjected to detailed license renewal review. Detailed examination of two example systems demonstrates the approach. The review and evaluation of plant equipment for license renewal differ from the initial licensing of the plant. A substantial operating history has been established, the licensing basis has evolved from the original one, and plant equipment has been subject to periodic maintenance and surveillance throughout its life. In consideration of these differences, a basis for license renewal is needed. License renewal should be based upon continuation of the existing licensing basis and recognition of existing programs and operating history

  19. Preclinical Evaluations To Identify Optimal Linezolid Regimens for Tuberculosis Therapy

    Science.gov (United States)

    Drusano, George L.; Adams, Jonathan R.; Rodriquez, Jaime L.; Jambunathan, Kalyani; Baluya, Dodge L.; Brown, David L.; Kwara, Awewura; Mirsalis, Jon C.; Hafner, Richard; Louie, Arnold

    2015-01-01

    Linezolid is an oxazolidinone with potent activity against Mycobacterium tuberculosis. Linezolid toxicity in patients correlates with the dose and duration of therapy. These toxicities are attributable to the inhibition of mitochondrial protein synthesis. Clinically relevant linezolid regimens were simulated in the in vitro hollow-fiber infection model (HFIM) system to identify the linezolid therapies that minimize toxicity, maximize antibacterial activity, and prevent drug resistance. Linezolid inhibited mitochondrial proteins in an exposure-dependent manner, with toxicity being driven by trough concentrations. Once-daily linezolid killed M. tuberculosis in an exposure-dependent manner. Further, 300 mg linezolid given every 12 hours generated more bacterial kill but more toxicity than 600 mg linezolid given once daily. None of the regimens prevented linezolid resistance. These findings show that with linezolid monotherapy, a clear tradeoff exists between antibacterial activity and toxicity. By identifying the pharmacokinetic parameters linked with toxicity and antibacterial activity, these data can provide guidance for clinical trials evaluating linezolid in multidrug antituberculosis regimens. PMID:26530386

  20. Unconditionally verifiable blind quantum computation

    Science.gov (United States)

    Fitzsimons, Joseph F.; Kashefi, Elham

    2017-07-01

    Blind quantum computing (BQC) allows a client to have a server carry out a quantum computation for them such that the client's input, output, and computation remain private. A desirable property for any BQC protocol is verification, whereby the client can verify with high probability whether the server has followed the instructions of the protocol or if there has been some deviation resulting in a corrupted output state. A verifiable BQC protocol can be viewed as an interactive proof system leading to consequences for complexity theory. We previously proposed [A. Broadbent, J. Fitzsimons, and E. Kashefi, in Proceedings of the 50th Annual Symposium on Foundations of Computer Science, Atlanta, 2009 (IEEE, Piscataway, 2009), p. 517] a universal and unconditionally secure BQC scheme where the client only needs to be able to prepare single qubits in separable states randomly chosen from a finite set and send them to the server, who has the balance of the required quantum computational resources. In this paper we extend that protocol with additional functionality allowing blind computational basis measurements, which we use to construct another verifiable BQC protocol based on a different class of resource states. We rigorously prove that the probability of failing to detect an incorrect output is exponentially small in a security parameter, while resource overhead remains polynomial in this parameter. This resource state allows entangling gates to be performed between arbitrary pairs of logical qubits with only constant overhead. This is a significant improvement on the original scheme, which required that all computations to be performed must first be put into a nearest-neighbor form, incurring linear overhead in the number of qubits. Such an improvement has important consequences for efficiency and fault-tolerance thresholds.

  1. Two statistics for evaluating parameter identifiability and error reduction

    Science.gov (United States)

    Doherty, John; Hunt, Randall J.

    2009-01-01

    Two statistics are presented that can be used to rank input parameters utilized by a model in terms of their relative identifiability based on a given or possible future calibration dataset. Identifiability is defined here as the capability of model calibration to constrain parameters used by a model. Both statistics require that the sensitivity of each model parameter be calculated for each model output for which there are actual or presumed field measurements. Singular value decomposition (SVD) of the weighted sensitivity matrix is then undertaken to quantify the relation between the parameters and observations that, in turn, allows selection of calibration solution and null spaces spanned by unit orthogonal vectors. The first statistic presented, "parameter identifiability", is quantitatively defined as the direction cosine between a parameter and its projection onto the calibration solution space. This varies between zero and one, with zero indicating complete non-identifiability and one indicating complete identifiability. The second statistic, "relative error reduction", indicates the extent to which the calibration process reduces error in estimation of a parameter from its pre-calibration level where its value must be assigned purely on the basis of prior expert knowledge. This is more sophisticated than identifiability, in that it takes greater account of the noise associated with the calibration dataset. Like identifiability, it has a maximum value of one (which can only be achieved if there is no measurement noise). Conceptually it can fall to zero; and even below zero if a calibration problem is poorly posed. An example, based on a coupled groundwater/surface-water model, is included that demonstrates the utility of the statistics. © 2009 Elsevier B.V.
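
    The identifiability statistic defined above can be computed directly from a weighted sensitivity (Jacobian) matrix, as in the sketch below. The Jacobian and the chosen dimensionality of the solution space are hypothetical; this is an illustration of the statistic, not the authors' implementation.

```python
# Illustrative sketch (not the authors' code): the "parameter identifiability"
# statistic computed from a weighted sensitivity (Jacobian) matrix. The example
# Jacobian below is made up purely for demonstration.
import numpy as np

def parameter_identifiability(jacobian, n_singular):
    """Identifiability of each parameter as the direction cosine between the
    parameter axis and its projection onto the calibration solution space
    spanned by the first `n_singular` right singular vectors."""
    # SVD of the weighted sensitivity matrix (rows = observations, cols = parameters)
    _, _, vt = np.linalg.svd(jacobian, full_matrices=False)
    v_solution = vt[:n_singular, :]          # basis of the solution space
    # squared projections of each unit parameter vector onto that basis
    return np.sqrt((v_solution ** 2).sum(axis=0))

# Hypothetical weighted Jacobian: 5 observations, 3 parameters
J = np.array([[1.0, 0.2, 0.0],
              [0.8, 0.1, 0.0],
              [0.9, 0.3, 0.0],
              [1.1, 0.2, 0.1],
              [1.0, 0.1, 0.0]])
print(parameter_identifiability(J, n_singular=2))  # one value in [0, 1] per parameter
```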

  2. Verifying Architectural Design Rules of the Flight Software Product Line

    Science.gov (United States)

    Ganesan, Dharmalingam; Lindvall, Mikael; Ackermann, Chris; McComas, David; Bartholomew, Maureen

    2009-01-01

    This paper presents experiences of verifying architectural design rules of the NASA Core Flight Software (CFS) product line implementation. The goal of the verification is to check whether the implementation is consistent with the CFS architectural rules derived from the developer's guide. The results indicate that consistency checking helps in a) identifying architecturally significant deviations that eluded code reviews, b) clarifying the design rules to the team, and c) assessing the overall implementation quality. Furthermore, it helps connect business goals to architectural principles, and to the implementation. This paper is the first step in the definition of a method for analyzing and evaluating product line implementations from an architecture-centric perspective.

  3. Instructor and course evaluation based on student-identified criteria.

    Science.gov (United States)

    Jackson, M O

    1977-02-01

    Students have come to school for an education and it is their right to evaluate the quality of the education they are receiving. They should not have to demand or even ask for the privilege of saying what they think. Instructors should be providing the opportunity for evaluation by requesting that information from the students. No value judgment can be totally objective, but an instrument composed of mutually agreed upon statements should encourage the greatest possible degree of objectivity. Using one accepted form throughout the school, all students would be considering the same characteristics and traits for every instructor and course evaluated. Each instructor would receive similar information about personal performance and about the course presented. Students would be free to talk to the faculty or to add comments if they so desired; but, a questionnaire used in every course would allow and even encourage responses from every student enrolled. Faculty responsibility would not end with the preparation and implementation of an evaluation instrument. Instructors would have to let the students know their opinions are important and will be considered in curricular and instructional decisions. Faculty and students would be communicating and hopefully fulfilling the needs of and responsibilities to each other.

  4. Further Evaluation of Methods to Identify Matched Stimulation

    OpenAIRE

    Rapp, John T

    2007-01-01

    The effects of preferred stimulation on the vocal stereotypy of 2 individuals were evaluated in two experiments. The results of Experiment 1 showed that (a) the vocal stereotypy of both participants persisted in the absence of social consequences, (b) 1 participant manipulated toys that did and did not produce auditory stimulation, but only sound-producing toys decreased his vocal stereotypy, and (c) only noncontingent music decreased vocal stereotypy for the other participant, but stereotypy ...

  5. Bottom-up communication. Identifying opportunities and limitations through an exploratory field-based evaluation

    Energy Technology Data Exchange (ETDEWEB)

    Wilson, C.; Irvine, K.N. [Institute of Energy and Sustainable Development, De Montfort University, Leicester, LE1 9BH (United Kingdom)

    2013-02-15

    Communication to promote behaviours like energy saving can use significant resources. What is less clear is the comparative value of different approaches available to communicators. While it is generally agreed that 'bottom-up' approaches, where individuals are actively involved rather than passive, are preferable to 'top-down' authority-led projects, there is a dearth of evidence that verifies why this should be. Additionally, while the literature has examined the mechanics of the different approaches, there has been less attention paid to the associated psychological implications. This paper reports on an exploratory comparative study that examined the effects of six distinct communication activities. The activities used different communication approaches, some participative and others more top-down informational. Two theories, from behavioural studies and communication, were used to identify key variables for consideration in this field-based evaluation. The evaluation aimed to assess not just which activity might be most successful, as this has limited generalisability, but to also gain insight into what psychological impacts might contribute to success. Analysis found support for the general hypothesis that bottom-up approaches have more impact on behaviour change than top-down. The study also identified that, in this instance, the difference in reported behaviour across the activities related partly to the extent to which intentions to change behaviour were implemented. One possible explanation for the difference in reported behaviour change across the activities is that a bottom-up approach may offer a supportive environment where participants can discuss progress with like-minded individuals. A further possible explanation is that despite controlling for intention at an individual level, the pre-existence of strong intentions may have an effect on group success. These suggestive findings point toward the critical need for additional and larger-scale studies

  6. Automated measurement and control of concrete properties in a ready mix truck with VERIFI.

    Science.gov (United States)

    2014-02-01

    In this research, twenty batches of concrete with six different mixture proportions were tested with VERIFI to evaluate 1) accuracy and repeatability of VERIFI measurements, 2) ability of VERIFI to adjust slump automatically with water and admixtur...

  7. Verifying design patterns in Hoare Type Theory

    DEFF Research Database (Denmark)

    Svendsen, Kasper; Buisse, Alexandre; Birkedal, Lars

    In this technical report we document our experiments formally verifying three design patterns in Hoare Type Theory.

  8. Verifying pronunciation dictionaries using conflict analysis

    CSIR Research Space (South Africa)

    Davel, MH

    2010-09-01

    Full Text Available The authors describe a new language-independent technique for automatically identifying errors in an electronic pronunciation dictionary by analyzing the source of conflicting patterns directly. They evaluate the effectiveness of the technique in two...

  9. Appraising the value of independent EIA follow-up verifiers

    Energy Technology Data Exchange (ETDEWEB)

    Wessels, Jan-Albert, E-mail: janalbert.wessels@nwu.ac.za [School of Geo and Spatial Sciences, Department of Geography and Environmental Management, North-West University, C/O Hoffman and Borcherd Street, Potchefstroom, 2520 (South Africa); Retief, Francois, E-mail: francois.retief@nwu.ac.za [School of Geo and Spatial Sciences, Department of Geography and Environmental Management, North-West University, C/O Hoffman and Borcherd Street, Potchefstroom, 2520 (South Africa); Morrison-Saunders, Angus, E-mail: A.Morrison-Saunders@murdoch.edu.au [School of Geo and Spatial Sciences, Department of Geography and Environmental Management, North-West University, C/O Hoffman and Borcherd Street, Potchefstroom, 2520 (South Africa); Environmental Assessment, School of Environmental Science, Murdoch University, Australia. (Australia)

    2015-01-15

    Independent Environmental Impact Assessment (EIA) follow-up verifiers such as monitoring agencies, checkers, supervisors and control officers are active on various construction sites across the world. There are, however, differing views on the value that these verifiers add and very limited learning in EIA has been drawn from independent verifiers. This paper aims to appraise how and to what extent independent EIA follow-up verifiers add value in major construction projects in the developing country context of South Africa. A framework for appraising the role of independent verifiers was established and four South African case studies were examined through a mixture of site visits, project document analysis, and interviews. Appraisal results were documented in the performance areas of: planning, doing, checking, acting, public participating and integration with other programs. The results indicate that independent verifiers add most value to major construction projects when involved with screening EIA requirements of new projects, allocation of financial and human resources, checking legal compliance, influencing implementation, reporting conformance results, community and stakeholder engagement, integration with self-responsibility programs such as environmental management systems (EMS), and controlling records. It was apparent that verifiers could be more creatively utilized in pre-construction preparation, providing feedback of knowledge into assessment of new projects, giving input to the planning and design phase of projects, and performance evaluation. The study confirms the benefits of proponent and regulator follow-up, specifically in having independent verifiers that disclose information, facilitate discussion among stakeholders, are adaptable and proactive, aid in the integration of EIA with other programs, and instill trust in EIA enforcement by conformance evaluation. Overall, the study provides insight on how to harness the learning opportunities

  10. Appraising the value of independent EIA follow-up verifiers

    International Nuclear Information System (INIS)

    Wessels, Jan-Albert; Retief, Francois; Morrison-Saunders, Angus

    2015-01-01

    Independent Environmental Impact Assessment (EIA) follow-up verifiers such as monitoring agencies, checkers, supervisors and control officers are active on various construction sites across the world. There are, however, differing views on the value that these verifiers add and very limited learning in EIA has been drawn from independent verifiers. This paper aims to appraise how and to what extent independent EIA follow-up verifiers add value in major construction projects in the developing country context of South Africa. A framework for appraising the role of independent verifiers was established and four South African case studies were examined through a mixture of site visits, project document analysis, and interviews. Appraisal results were documented in the performance areas of: planning, doing, checking, acting, public participating and integration with other programs. The results indicate that independent verifiers add most value to major construction projects when involved with screening EIA requirements of new projects, allocation of financial and human resources, checking legal compliance, influencing implementation, reporting conformance results, community and stakeholder engagement, integration with self-responsibility programs such as environmental management systems (EMS), and controlling records. It was apparent that verifiers could be more creatively utilized in pre-construction preparation, providing feedback of knowledge into assessment of new projects, giving input to the planning and design phase of projects, and performance evaluation. The study confirms the benefits of proponent and regulator follow-up, specifically in having independent verifiers that disclose information, facilitate discussion among stakeholders, are adaptable and proactive, aid in the integration of EIA with other programs, and instill trust in EIA enforcement by conformance evaluation. Overall, the study provides insight on how to harness the learning opportunities

  11. Verifying FreeRTOS; a feasibility study

    NARCIS (Netherlands)

    Pronk, C.

    2010-01-01

    This paper presents a study on modeling and verifying the kernel of Real-Time Operating Systems (RTOS). The study will show advances in formally verifying such an RTOS both by refinement and by model checking approaches. This work fits in the context of Hoare’s verification challenge. Several

  12. USCIS E-Verify Self-Check

    Data.gov (United States)

    Department of Homeland Security — E-Verify is an internet based system that contains datasets to compare information from an employee's Form I-9, Employment Eligibility Verification, to data from the...

  13. Auto-identification fiberoptical seal verifier

    International Nuclear Information System (INIS)

    Yamamoto, Yoichi; Mukaiyama, Takehiro

    1998-08-01

    An auto COBRA seal verifier was developed by Japan Atomic Energy Research Institute (JAERI) to provide more efficient and simpler inspection measures for IAEA safeguards. The verifier is designed to provide means of a simple, quantitative and objective judgment on in-situ verification for the COBRA seal. The equipment is a portable unit with hand-held weight and size. It can be operated by battery or AC power. The verifier reads a COBRA seal signature by using a built-in CCD camera and carries out the signature comparison procedure automatically on digital basis. The result of signature comparison is given as a YES/NO answer. The production model of the verifier was completed in July 1996. The development was carried out in collaboration with Mitsubishi Heavy Industries, Ltd. This report describes the design and functions of the COBRA seal verifier and the results of environmental and functional tests. The development of the COBRA seal verifier was carried out in the framework of Japan Support Programme for Agency Safeguards (JASPAS) as a project, JD-4 since 1981. (author)

  14. An IBM 370 assembly language program verifier

    Science.gov (United States)

    Maurer, W. D.

    1977-01-01

    The paper describes a program written in SNOBOL which verifies the correctness of programs written in assembly language for the IBM 360 and 370 series of computers. The motivation for using assembly language as a source language for a program verifier was the realization that many errors in programs are caused by misunderstanding or ignorance of the characteristics of specific computers. The proof of correctness of a program written in assembly language must take these characteristics into account. The program has been compiled and is currently running at the Center for Academic and Administrative Computing of The George Washington University.

  15. Identifying Knowledge Gaps in Clinicians Who Evaluate and Treat Vocal Performing Artists in College Health Settings.

    Science.gov (United States)

    McKinnon-Howe, Leah; Dowdall, Jayme

    2018-05-01

    The goal of this study was to identify knowledge gaps in clinicians who evaluate and treat performing artists for illnesses and injuries that affect vocal function in college health settings. This pilot study utilized a web-based cross-sectional survey design incorporating common clinical scenarios to test knowledge of evaluation and management strategies in the vocal performing artist. A web-based survey was administered to a purposive sample of 28 clinicians to identify the approach utilized to evaluate and treat vocal performing artists in college health settings, and factors that might affect knowledge gaps and influence referral patterns to voice specialists. Twenty-eight clinicians were surveyed, with 36% of respondents incorrectly identifying appropriate vocal hygiene measures, 56% of respondents failing to identify symptoms of vocal fold hemorrhage, 84% failing to identify other indications for referral to a voice specialist, 96% of respondents acknowledging unfamiliarity with the Voice Handicap Index and the Singers Voice Handicap Index, and 68% acknowledging unfamiliarity with the Reflux Symptom Index. The data elucidated specific knowledge gaps in college health providers who are responsible for evaluating and treating common illnesses that affect vocal function, and triaging and referring students experiencing symptoms of potential vocal emergencies. Future work is needed to improve the standard of care for this population. Copyright © 2018 The Voice Foundation. Published by Elsevier Inc. All rights reserved.

  16. Classroom Experiment to Verify the Lorentz Force

    Indian Academy of Sciences (India)

    Somnath Basu, Anindita Bose, Sumit Kumar Sinha, Pankaj Vishe and S Chatterjee. Resonance – Journal of Science Education, Volume 8, Issue 3, March 2003, pp 81-86.

  17. On alternative approach for verifiable secret sharing

    OpenAIRE

    Kulesza, Kamil; Kotulski, Zbigniew; Pieprzyk, Joseph

    2002-01-01

    Secret sharing allows split/distributed control over a secret (e.g. a master key). Verifiable secret sharing (VSS) is secret sharing extended with a verification capability. Usually verification comes at a price. We propose a "free lunch" approach that overcomes this inconvenience.

  18. Verified compilation of Concurrent Managed Languages

    Science.gov (United States)

    2017-11-01

    Final technical report, Purdue University, November 2017; approved for public release. The report is published by the Information Directorate in the interest of scientific and technical information exchange.

  19. A Verifiable Secret Shuffle of Homomorphic Encryptions

    DEFF Research Database (Denmark)

    Groth, Jens

    2003-01-01

    We show how to prove in honest-verifier zero-knowledge the correctness of a shuffle of homomorphic encryptions (or homomorphic commitments). A shuffle consists of a rearrangement of the input ciphertexts and a re-encryption of them so that the permutation is not revealed.
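    For intuition, the toy Python sketch below shows only the shuffle operation itself, re-randomizing ElGamal ciphertexts and permuting them, using deliberately tiny, insecure parameters; the honest-verifier zero-knowledge proof that is the paper's actual contribution is not reproduced here.

```python
# Toy ElGamal re-encryption shuffle (illustration only; parameters are insecure and the
# zero-knowledge correctness proof is omitted).
import random

p, g = 467, 2                      # tiny prime-order-style group parameters, for illustration only
x = 127                            # secret key
h = pow(g, x, p)                   # public key

def encrypt(m, r):
    return (pow(g, r, p), (m * pow(h, r, p)) % p)

def reencrypt(ct, s):
    a, b = ct
    return ((a * pow(g, s, p)) % p, (b * pow(h, s, p)) % p)   # multiply by an encryption of 1

def shuffle(cts, rng=random):
    perm = list(range(len(cts)))
    rng.shuffle(perm)              # hidden permutation
    return [reencrypt(cts[i], rng.randrange(1, p - 1)) for i in perm]

def decrypt(ct):
    a, b = ct
    return (b * pow(a, p - 1 - x, p)) % p   # b / a^x via Fermat's little theorem

votes = [encrypt(m, random.randrange(1, p - 1)) for m in (12, 34, 56)]
mixed = shuffle(votes)
assert sorted(decrypt(c) for c in mixed) == [12, 34, 56]   # same plaintexts, unlinkable order
```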

  20. Evaluation of potential regulatory elements identified as DNase I hypersensitive sites in the CFTR gene

    DEFF Research Database (Denmark)

    Phylactides, M.; Rowntree, R.; Nuthall, H.

    2002-01-01

    hypersensitive sites (DHS) within the locus. We previously identified at least 12 clusters of DHS across the CFTR gene and here further evaluate DHS in introns 2,3,10,16,17a, 18, 20 and 21 to assess their functional importance in regulation of CFTR gene expression. Transient transfections of enhancer/reporter...

  1. Evaluating the Atrial Myopathy Underlying Atrial Fibrillation: Identifying the Arrhythmogenic and Thrombogenic Substrate

    Science.gov (United States)

    Goldberger, Jeffrey J.; Arora, Rishi; Green, David; Greenland, Philip; Lee, Daniel C.; Lloyd-Jones, Donald M.; Markl, Michael; Ng, Jason; Shah, Sanjiv J.

    2015-01-01

    Atrial disease or myopathy forms the substrate for atrial fibrillation (AF) and underlies the potential for atrial thrombus formation and subsequent stroke. Current diagnostic approaches in patients with AF focus on identifying clinical predictors with evaluation of left atrial size by echocardiography serving as the sole measure specifically evaluating the atrium. Although the atrial substrate underlying AF is likely developing for years prior to the onset of AF, there is no current evaluation to identify the pre-clinical atrial myopathy. Atrial fibrosis is one component of the atrial substrate that has garnered recent attention based on newer MRI techniques that have been applied to visualize atrial fibrosis in humans with prognostic implications regarding success of treatment. Advanced ECG signal processing, echocardiographic techniques, and MRI imaging of fibrosis and flow provide up-to-date approaches to evaluate the atrial myopathy underlying AF. While thromboembolic risk is currently defined by clinical scores, their predictive value is mediocre. Evaluation of stasis via imaging and biomarkers associated with thrombogenesis may provide enhanced approaches to assess risk for stroke in patients with AF. Better delineation of the atrial myopathy that serves as the substrate for AF and thromboembolic complications might improve treatment outcomes. Furthermore, better delineation of the pathophysiologic mechanisms underlying the development of the atrial substrate for AF, particularly in its earlier stages, could help identify blood and imaging biomarkers that could be useful to assess risk for developing new onset AF and suggest specific pathways that could be targeted for prevention. PMID:26216085

  2. Optimised resource construction for verifiable quantum computation

    International Nuclear Information System (INIS)

    Kashefi, Elham; Wallden, Petros

    2017-01-01

    Recent developments have brought the possibility of achieving scalable quantum networks and quantum devices closer. From the computational point of view these emerging technologies become relevant when they are no longer classically simulatable. Hence a pressing challenge is the construction of practical methods to verify the correctness of the outcome produced by universal or non-universal quantum devices. A promising approach that has been extensively explored is the scheme of verification via encryption through blind quantum computation. We present here a new construction that simplifies the required resources for any such verifiable protocol. We obtain an overhead that is linear in the size of the input (computation), while the security parameter remains independent of the size of the computation and can be made exponentially small (with a small extra cost). Furthermore our construction is generic and could be applied to any universal or non-universal scheme with a given underlying graph. (paper)

  3. A Practical Voter-Verifiable Election Scheme.

    OpenAIRE

    Chaum, D; Ryan, PYA; Schneider, SA

    2005-01-01

    We present an election scheme designed to allow voters to verify that their vote is accurately included in the count. The scheme provides a high degree of transparency whilst ensuring the secrecy of votes. Assurance is derived from close auditing of all the steps of the vote recording and counting process with minimal dependence on the system components. Thus, assurance arises from verification of the election rather than having to place trust in the correct behaviour of components of the vot...

  4. Identifying and assessing strategies for evaluating the impact of mobile eye health units on health outcomes.

    Science.gov (United States)

    Fu, Shiwan; Turner, Angus; Tan, Irene; Muir, Josephine

    2017-12-01

    To identify and assess strategies for evaluating the impact of mobile eye health units on health outcomes. Systematic literature review. Worldwide. Peer-reviewed journal articles that included the use of a mobile eye health unit. Journal articles were included if outcome measures reflected an assessment of the impact of a mobile eye health unit on health outcomes. Six studies were identified with mobile services offering diabetic retinopathy screening (three studies), optometric services (two studies) and orthoptic services (one study). This review identified and assessed strategies in existing literature used to evaluate the impact of mobile eye health units on health outcomes. Studies included in this review used patient outcomes (i.e. disease detection, vision impairment, treatment compliance) and/or service delivery outcomes (i.e. cost per attendance, hospital transport use, inappropriate referrals, time from diabetic retinopathy photography to treatment) to evaluate the impact of mobile eye health units. Limitations include difficulty proving causation of specific outcome measures and the overall shortage of impact evaluation studies. Variation in geographical location, service population and nature of eye care providers limits broad application. © 2017 National Rural Health Alliance Inc.

  5. Identifying the critical financial ratios for stocks evaluation: A fuzzy delphi approach

    Science.gov (United States)

    Mokhtar, Mazura; Shuib, Adibah; Mohamad, Daud

    2014-12-01

    Stocks evaluation has always been an interesting and challenging problem for both researchers and practitioners. Generally, the evaluation can be made based on a set of financial ratios. Nevertheless, there are a variety of financial ratios that can be considered and if all ratios in the set are placed into the evaluation process, data collection would be more difficult and time consuming. Thus, the objective of this paper is to identify the most important financial ratios upon which to focus in order to evaluate the stock's performance. For this purpose, a survey was carried out using an approach which is based on an expert judgement, namely the Fuzzy Delphi Method (FDM). The results of this study indicated that return on equity, return on assets, net profit margin, operating profit margin, earnings per share and debt to equity are the most important ratios.
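    The sketch below illustrates the general Fuzzy Delphi screening idea referred to above; the linguistic scale, aggregation rule, and acceptance threshold are common textbook choices and may differ from the paper's exact setup.

```python
# Hedged Fuzzy Delphi sketch: map expert ratings to triangular fuzzy numbers, aggregate,
# defuzzify, and keep criteria that clear an illustrative threshold.
import numpy as np

SCALE = {  # linguistic terms mapped to triangular fuzzy numbers (l, m, u)
    "very low": (0.0, 0.0, 0.25), "low": (0.0, 0.25, 0.5),
    "medium": (0.25, 0.5, 0.75), "high": (0.5, 0.75, 1.0), "very high": (0.75, 1.0, 1.0),
}

def screen(ratings, threshold=0.7):
    """ratings: {criterion: [linguistic rating per expert]} -> criteria passing the threshold."""
    kept = {}
    for criterion, terms in ratings.items():
        tfns = np.array([SCALE[t] for t in terms])
        l, m, u = tfns[:, 0].min(), tfns[:, 1].mean(), tfns[:, 2].max()  # one common aggregation
        score = (l + m + u) / 3.0                                        # simple centroid defuzzification
        if score >= threshold:
            kept[criterion] = round(score, 3)
    return kept

print(screen({"return on equity": ["high", "very high", "very high"],
              "inventory turnover": ["medium", "low", "medium"]}))
```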

  6. Identifying and evaluating E-procurement in supply chain risk by Fuzzy MADM

    Directory of Open Access Journals (Sweden)

    Mostafa Memarzade

    2012-08-01

    Full Text Available E-procurement risk has emerged as an important issue for researchers and practitioners because mitigating supply chain risk helps improve firms' as well as supply chains' performance. E-marketplaces have been growing steadily and there has been significant interest in e-business research. There are different risks and uncertainties involved with E-marketplaces, which jeopardize the sector, yet despite a large amount of hype the business continues to grow. The primary aim of this study is to identify E-procurement risks and evaluate them using a fuzzy AHP framework. We contribute by identifying 13 critical E-procurement risk criteria and determine four important ones, including the extent of acceptable information, interrelationship risk, lack of honesty in relationships, and product quality and safety, for evaluating suppliers' risk.

  7. Verified Subtyping with Traits and Mixins

    Directory of Open Access Journals (Sweden)

    Asankhaya Sharma

    2014-07-01

    Full Text Available Traits allow decomposing programs into smaller parts, and mixins are a form of composition that resembles multiple inheritance. Unfortunately, in the presence of traits, programming languages like Scala give up on the subtyping relation between objects. In this paper, we present a method to check subtyping between objects based on entailment in separation logic. We implement our method as a domain-specific language in Scala and apply it to the Scala standard library. We have verified that 67% of the mixins used in the Scala standard library do indeed conform to subtyping between the traits that are used to build them.

  8. Unary self-verifying symmetric difference automata

    CSIR Research Space (South Africa)

    Marais, Laurette

    2016-07-01

    Full Text Available 18th International Workshop on Descriptional Complexity of Formal Systems, 5 - 8 July 2016, Bucharest, Romania. Unary self-verifying symmetric difference automata. Laurette Marais (1,2) and Lynette van Zijl (1). 1 Department of Computer Science, Stellenbosch...

  9. Technical evaluation of methods for identifying chemotherapy-induced febrile neutropenia in healthcare claims databases

    OpenAIRE

    Weycker Derek; Sofrygin Oleg; Seefeld Kim; Deeter Robert G; Legg Jason; Edelsberg John

    2013-01-01

    Abstract Background Healthcare claims databases have been used in several studies to characterize the risk and burden of chemotherapy-induced febrile neutropenia (FN) and effectiveness of colony-stimulating factors against FN. The accuracy of methods previously used to identify FN in such databases has not been formally evaluated. Methods Data comprised linked electronic medical records from Geisinger Health System and healthcare claims data from Geisinger Health Plan. Subjects were classifie...

  10. Vehicle systems and payload requirements evaluation. [computer programs for identifying launch vehicle system requirements

    Science.gov (United States)

    Rea, F. G.; Pittenger, J. L.; Conlon, R. J.; Allen, J. D.

    1975-01-01

    Techniques developed for identifying launch vehicle system requirements for NASA automated space missions are discussed. Emphasis is placed on development of computer programs and investigation of astrionics for OSS missions and Scout. The Earth Orbit Mission Program - 1, which performs linear error analysis of launch vehicle dispersions for both vehicle and navigation system factors, is described along with the Interactive Graphic Orbit Selection program, which allows the user to select orbits that satisfy mission requirements and to evaluate the necessary injection accuracy.

  11. Verifying a nuclear weapon's response to radiation environments

    Energy Technology Data Exchange (ETDEWEB)

    Dean, F.F.; Barrett, W.H.

    1998-05-01

    The process described in the paper is being applied as part of the design verification of a replacement component designed for a nuclear weapon currently in the active stockpile. This process is an adaptation of the process successfully used in nuclear weapon development programs. The verification process concentrates on evaluating system response to radiation environments, verifying system performance during and after exposure to radiation environments, and assessing system survivability.

  12. Identifying and evaluating electronic learning resources for use in adult-gerontology nurse practitioner education.

    Science.gov (United States)

    Thompson, Hilaire J; Belza, Basia; Baker, Margaret; Christianson, Phyllis; Doorenbos, Ardith; Nguyen, Huong

    2014-01-01

    Enhancing existing curricula to meet newly published adult-gerontology advanced practice registered nurse (APRN) competencies in an efficient manner presents a challenge to nurse educators. Incorporating shared, published electronic learning resources (ELRs) in existing or new courses may be appropriate in order to assist students in achieving competencies. The purposes of this project were to (a) identify relevant available ELR for use in enhancing geriatric APRN education and (b) to evaluate the educational utility of identified ELRs based on established criteria. A multilevel search strategy was used. Two independent team members reviewed identified ELR against established criteria to ensure utility. Only resources meeting all criteria were retained. Resources were found for each of the competency areas and included formats such as podcasts, Web casts, case studies, and teaching videos. In many cases, resources were identified using supplemental strategies and not through traditional search or search of existing geriatric repositories. Resources identified have been useful to advanced practice educators in improving lecture and seminar content in a particular topic area and providing students and preceptors with additional self-learning resources. Addressing sustainability within geriatric APRN education is critical for sharing of best practices among educators and for sustainability of teaching and related resources. © 2014.

  13. Evaluation of CT in identifying colorectal carcinoma in the frail and disabled patient

    International Nuclear Information System (INIS)

    Ng, C.S.; Dixon, A.K.; Doyle, T.C.; Courtney, H.M.; Bull, R.K.; Freeman, A.H.; Pinto, E.M.; Prevost, A.T.; Campbell, G.A.

    2002-01-01

    Frail and physically or mentally disabled patients frequently have difficulty in tolerating formal colonic investigations. The aims of this study were to evaluate the accuracy of minimal-preparation CT in identifying colorectal carcinoma in this population and to determine the clinical indications and radiological signs with the highest yield for tumour. The CT technique involved helical acquisition (10-mm collimation, 1.5 pitch) following 2 days of preparation with oral contrast medium only. The outcome of 4 years of experience was retrospectively reviewed. The gold standards were pathological and cancer registration records, together with colonoscopy and barium enema when undertaken, with a minimum of 15 months follow-up. One thousand seventy-seven CT studies in 1031 patients (median age 80 years) were evaluated. CT correctly identified 83 of the 98 colorectal carcinomas in this group but missed 15 cases; sensitivity and specificity (with 95% confidence interval) 85% (78-92%) and 91% (90-93%), respectively. Multivariate analysis identified: (a) a palpable abdominal mass and anaemia to be the strongest clinical indications, particularly in combination (p<0.0025); and (b) lesion width and blurring of the serosal margin of lesions to be associated with tumours (p<0.0001). Computed tomography has a valuable role in the investigation of frail and otherwise disabled patients with symptoms suspicious for a colonic neoplasm. Although interpretation can be difficult, the technique is able to exclude malignancy with good accuracy. (orig.)

  14. Verifiable process monitoring through enhanced data authentication

    International Nuclear Information System (INIS)

    Goncalves, Joao G.M.; Schwalbach, Peter; Schoeneman, Barry Dale; Ross, Troy D.; Baldwin, George Thomas

    2010-01-01

    To ensure the peaceful intent for production and processing of nuclear fuel, verifiable process monitoring of the fuel production cycle is required. As part of a U.S. Department of Energy (DOE)-EURATOM collaboration in the field of international nuclear safeguards, the DOE Sandia National Laboratories (SNL), the European Commission Joint Research Centre (JRC) and Directorate General-Energy (DG-ENER) developed and demonstrated a new concept in process monitoring, enabling the use of operator process information by branching a second, authenticated data stream to the Safeguards inspectorate. This information would be complementary to independent safeguards data, improving the understanding of the plant's operation. The concept is called the Enhanced Data Authentication System (EDAS). EDAS transparently captures, authenticates, and encrypts communication data that is transmitted between operator control computers and connected analytical equipment utilized in nuclear processes controls. The intent is to capture information as close to the sensor point as possible to assure the highest possible confidence in the branched data. Data must be collected transparently by the EDAS: Operator processes should not be altered or disrupted by the insertion of the EDAS as a monitoring system for safeguards. EDAS employs public key authentication providing 'jointly verifiable' data and private key encryption for confidentiality. Timestamps and data source are also added to the collected data for analysis. The core of the system hardware is in a security enclosure with both active and passive tamper indication. Further, the system has the ability to monitor seals or other security devices in close proximity. This paper will discuss the EDAS concept, recent technical developments, intended application philosophy and the planned future progression of this system.
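    A highly simplified sketch of the branching idea is shown below. It substitutes an HMAC tag for the public-key authentication described above and uses symmetric Fernet encryption from the third-party cryptography package for confidentiality; the keys, field names, and record layout are placeholders rather than the EDAS design.

```python
# Toy authenticate-then-encrypt branch of a process data stream, with the timestamp and data-source
# fields mentioned in the abstract. Keys and layout are illustrative only.
import hmac, hashlib, json, time
from cryptography.fernet import Fernet

AUTH_KEY = b"shared-authentication-key"      # placeholder; not a real key-management scheme
fernet = Fernet(Fernet.generate_key())       # placeholder confidentiality key

def branch_record(sensor_id, payload):
    record = {"source": sensor_id, "timestamp": time.time(), "data": payload}
    blob = json.dumps(record, sort_keys=True).encode()
    tag = hmac.new(AUTH_KEY, blob, hashlib.sha256).hexdigest()   # authenticate close to the sensor
    return fernet.encrypt(json.dumps({"record": blob.decode(), "tag": tag}).encode())

def verify_record(token):
    outer = json.loads(fernet.decrypt(token))
    expected = hmac.new(AUTH_KEY, outer["record"].encode(), hashlib.sha256).hexdigest()
    if not hmac.compare_digest(expected, outer["tag"]):
        raise ValueError("authentication failed")
    return json.loads(outer["record"])
```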

  15. [Diagnostic evaluation of the developmental level in children identified at risk of delay through the Child Development Evaluation Test].

    Science.gov (United States)

    Rizzoli-Córdoba, Antonio; Campos-Maldonado, Martha Carmen; Vélez-Andrade, Víctor Hugo; Delgado-Ginebra, Ismael; Baqueiro-Hernández, César Iván; Villasís-Keever, Miguel Ángel; Reyes-Morales, Hortensia; Ojeda-Lara, Lucía; Davis-Martínez, Erika Berenice; O'Shea-Cuevas, Gabriel; Aceves-Villagrán, Daniel; Carrasco-Mendoza, Joaquín; Villagrán-Muñoz, Víctor Manuel; Halley-Castillo, Elizabeth; Sidonio-Aguayo, Beatriz; Palma-Tavera, Josuha Alexander; Muñoz-Hernández, Onofre

    The Child Development Evaluation (or CDE Test) was developed in Mexico as a screening tool for child developmental problems. It yields three possible results: normal, slow development or risk of delay. The modified version was developed using the information obtained during the validation study, but its properties in the base population are not known. The objective of this work was to establish diagnostic confirmation of developmental delay in children 16 to 59 months of age previously identified as having risk of delay through the CDE Test in primary care facilities. A population-based cross-sectional study was conducted in one Mexican state. The CDE Test was administered to 11,455 children 16 to 59 months of age from December/2013 to March/2014. The eligible population represented 6.2% of the children (n=714) who were identified at risk of delay through the CDE Test. For inclusion in the study, a block randomization stratified by sex and age group was performed. Each participant included in the study had a diagnostic evaluation using the Battelle Development Inventory, 2nd edition. Of the 355 participants included with risk of delay, 65.9% were male and 80.2% were from rural areas; 6.5% were false positives (Total Development Quotient >90) and 6.8% did not have any domain with delay (Domain Developmental Quotient <80). The proportion of delay for each domain was as follows: communication 82.5%; cognitive 80.8%; social-personal 33.8%; motor 55.5%; and adaptive 41.7%. There were significant differences in the percentages of delay both by age and by domain/subdomain evaluated. In 93.2% of the participants, developmental delay was corroborated in at least one domain evaluated. Copyright © 2015 Hospital Infantil de México Federico Gómez. Published by Masson Doyma México S.A. All rights reserved.

  16. Business rescue decision making through verifier determinants – ask the specialists

    Directory of Open Access Journals (Sweden)

    Marius Pretorius

    2013-11-01

    Full Text Available Orientation: Business rescue has become a critical part of business strategy decision making, especially during economic downturns and recessions. Past legislation has generally supported creditor-friendly regimes, and its mind-set still applies which increases the difficulty of such turnarounds. There are many questions and critical issues faced by those involved in rescue. Despite extensive theory in the literature on failure, there is a void regarding practical verifiers of the signs and causes of venture decline, as specialists are not forthcoming about what they regard as their “competitive advantage”. Research purpose: This article introduces the concept and role of “verifier determinants” of early warning signs, as a tool to confirm the causes of decline in order to direct rescue strategies and, most importantly, reduce time between the first observation and the implementation of the rescue. Motivation for the study: Knowing how specialist practitioners confirm causes of business decline could assist in deciding on strategies for the rescue earlier than can be done using traditional due diligence which is time consuming. Reducing time is a crucial element of a successful rescue. Research design and approach: The researchers interviewed specialist practitioners with extensive experience in rescue and turnaround. An experimental design was used to ensure the specialists evaluated the same real cases to extract their experiences and base their decisions on. Main findings: The specialists confirmed the use of verifier determinants and identified such determinants as they personally used them to confirm causes of decline. These verifier determinants were classified into five categories; namely, management, finance, strategic, banking and operations and marketing of the ventures under investigation. The verifier determinants and their use often depend heavily on subconscious (non-factual information based on previous experiences

  17. Technical evaluation of methods for identifying chemotherapy-induced febrile neutropenia in healthcare claims databases.

    Science.gov (United States)

    Weycker, Derek; Sofrygin, Oleg; Seefeld, Kim; Deeter, Robert G; Legg, Jason; Edelsberg, John

    2013-02-13

    Healthcare claims databases have been used in several studies to characterize the risk and burden of chemotherapy-induced febrile neutropenia (FN) and effectiveness of colony-stimulating factors against FN. The accuracy of methods previously used to identify FN in such databases has not been formally evaluated. Data comprised linked electronic medical records from Geisinger Health System and healthcare claims data from Geisinger Health Plan. Subjects were classified into subgroups based on whether or not they were hospitalized for FN per the presumptive "gold standard" (an ANC-based definition) and per the claims-based definition (diagnosis codes for neutropenia, fever, and/or infection). Accuracy was evaluated principally based on positive predictive value (PPV) and sensitivity. Among 357 study subjects, 82 (23%) met the gold standard for hospitalized FN. For the claims-based definition including diagnosis codes for neutropenia plus fever in any position (n=28), PPV was 100% and sensitivity was 34% (95% CI: 24-45). For the definition including neutropenia in the primary position (n=54), PPV was 87% (78-95) and sensitivity was 57% (46-68). For the definition including neutropenia in any position (n=71), PPV was 77% (68-87) and sensitivity was 67% (56-77). Patients hospitalized for chemotherapy-induced FN can be identified in healthcare claims databases--with an acceptable level of mis-classification--using diagnosis codes for neutropenia, or neutropenia plus fever.
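    The accuracy figures quoted above follow from a standard 2x2 comparison of the claims-based definition against the gold standard. The short sketch below shows the computation; the counts are reconstructed to be roughly consistent with the third definition reported (n=71), not taken directly from the paper.

```python
# Positive predictive value and sensitivity from true/false positive and false negative counts.
def ppv_and_sensitivity(true_pos, false_pos, false_neg):
    ppv = true_pos / (true_pos + false_pos) if (true_pos + false_pos) else float("nan")
    sensitivity = true_pos / (true_pos + false_neg) if (true_pos + false_neg) else float("nan")
    return ppv, sensitivity

# e.g. a definition flagging 71 subjects, about 55 of whom truly met the gold standard out of 82:
ppv, sens = ppv_and_sensitivity(true_pos=55, false_pos=16, false_neg=27)
print(f"PPV={ppv:.0%}, sensitivity={sens:.0%}")   # -> PPV=77%, sensitivity=67%
```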

  18. The evaluation of trustworthiness to identify health insurance fraud in dentistry.

    Science.gov (United States)

    Wang, Shu-Li; Pai, Hao-Ting; Wu, Mei-Fang; Wu, Fan; Li, Chen-Lin

    2017-01-01

    According to the investigations of the U.S. Government Accountability Office (GAO), health insurance fraud has caused an enormous pecuniary loss in the U.S. In Taiwan, the problem in dentistry is getting worse when dentists (authorized entities) file fraudulent claims. Several methods have been developed to address health insurance fraud; however, these methods amount to rule-based mechanisms. Because they do not explore behavior patterns, these methods are time-consuming and ineffective; in addition, they are inadequate for managing fraudulent dentists. Based on social network theory, we develop an evaluation approach to solve the problem of cross-dentist fraud. The trustworthiness score of a dentist is calculated based upon the amount and type of dental operations performed on the same patient and the same tooth by that dentist and other dentists. The simulation provides the following evidence. (1) This specific type of fraud can be identified effectively using our evaluation approach. (2) A retrospective study of the claims is also performed. (3) The proposed method is effective in identifying fraudulent dentists. We provide a new direction for investigating the genuineness of claims data. If the insurer can detect fraudulent dentists using the traditional method and the proposed method simultaneously, the detection will be more transparent and ultimately reduce the losses caused by fraudulent claims. Copyright © 2016 Elsevier B.V. All rights reserved.

  19. Analyser Framework to Verify Software Components

    Directory of Open Access Journals (Sweden)

    Rolf Andreas Rasenack

    2009-01-01

    Full Text Available Today, it is important for software companies to build software systems in a short time interval, to reduce costs and to maintain a good market position. Therefore, well organized and systematic development approaches are required. Reusing software components that are well tested can be a good way to develop software applications in an effective manner. The reuse of software components is less expensive and less time consuming than development from scratch. But it is dangerous to assume that software components can be combined without any problems. Software components themselves are well tested, of course, but when they are composed, problems can still occur. Most problems are based on interaction and communication. To avoid such errors, a framework has to be developed for analysing software components. That framework determines the compatibility of corresponding software components. The promising approach discussed here presents a novel technique for analysing software components by applying an Abstract Syntax Language Tree (ASLT). A supportive environment will be designed that checks the compatibility of black-box software components. This article is concerned with the question of how coupled software components can be verified by using an analyser framework, and it determines the usage of the ASLT. Black-box software components and the Abstract Syntax Language Tree are the basis for developing the proposed framework and are discussed here to provide the background knowledge. The practical implementation of this framework is discussed and the results are shown by using a test environment.
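    As a rough analogue of the abstract-syntax-tree idea (the paper's ASLT and black-box component model are not reproduced here), the sketch below uses Python's ast module to extract the call signatures one component provides and to flag calls in another component whose argument counts do not match.

```python
# Toy syntax-tree compatibility check between a "provider" and a "consumer" component.
import ast

def provided_signatures(source):
    """Map public function name -> number of positional parameters."""
    return {node.name: len(node.args.args)
            for node in ast.walk(ast.parse(source))
            if isinstance(node, ast.FunctionDef) and not node.name.startswith("_")}

def call_mismatches(consumer_source, provided):
    """Return calls whose positional-argument count differs from the provider's signature."""
    problems = []
    for node in ast.walk(ast.parse(consumer_source)):
        if isinstance(node, ast.Call) and isinstance(node.func, ast.Name):
            name = node.func.id
            if name in provided and len(node.args) != provided[name]:
                problems.append((name, len(node.args), provided[name]))
    return problems

provider = "def read_sensor(channel, timeout):\n    return 0\n"
consumer = "value = read_sensor(3)\n"
print(call_mismatches(consumer, provided_signatures(provider)))   # -> [('read_sensor', 1, 2)]
```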

  20. Towards Verifying National CO2 Emissions

    Science.gov (United States)

    Fung, I. Y.; Wuerth, S. M.; Anderson, J. L.

    2017-12-01

    With the Paris Agreement, nations around the world have pledged their voluntary reductions in future CO2 emissions. Satellite observations of atmospheric CO2 have the potential to verify self-reported emission statistics around the globe. We present a carbon-weather data assimilation system, wherein raw weather observations together with satellite observations of the mixing ratio of column CO2 from the Orbiting Carbon Observatory-2 are assimilated every 6 hours into the NCAR carbon-climate model CAM5 coupled to the Ensemble Kalman Filter of DART. In an OSSE, we reduced the fossil fuel emissions from a country, and estimated the emissions innovations demanded by the atmospheric CO2 observations. The uncertainties in the innovation are analyzed with respect to the uncertainties in the meteorology to determine the significance of the result. The work follows from "On the use of incomplete historical data to infer the present state of the atmosphere" (Charney et al. 1969), which maps the path for continuous data assimilation for weather forecasting and the five decades of progress since.
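    The analysis step of a generic stochastic Ensemble Kalman Filter, of the kind DART provides, can be written compactly in numpy. The sketch below is the textbook update with illustrative shapes and values, not the CAM5/DART implementation referenced above.

```python
# Generic stochastic EnKF analysis step (illustrative shapes and numbers).
import numpy as np

def enkf_update(X, H, y, R, rng):
    """X: (n_state, n_ens) forecast ensemble; H: (n_obs, n_state); y: (n_obs,); R: (n_obs, n_obs)."""
    n_ens = X.shape[1]
    A = X - X.mean(axis=1, keepdims=True)               # ensemble anomalies
    P = A @ A.T / (n_ens - 1)                            # sample forecast covariance
    K = P @ H.T @ np.linalg.inv(H @ P @ H.T + R)         # Kalman gain
    Y = y[:, None] + rng.multivariate_normal(np.zeros(len(y)), R, n_ens).T  # perturbed observations
    return X + K @ (Y - H @ X)                           # analysis ensemble

rng = np.random.default_rng(1)
X = rng.normal(400.0, 2.0, size=(3, 20))                 # e.g. CO2 mixing ratios at 3 grid points
H = np.array([[1.0, 0.0, 0.0]])                          # observe the first grid point only
Xa = enkf_update(X, H, np.array([402.0]), np.eye(1) * 0.25, rng)
```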

  1. Evaluating predictive models for solar energy growth in the US states and identifying the key drivers

    Science.gov (United States)

    Chakraborty, Joheen; Banerji, Sugata

    2018-03-01

    Driven by a desire to control climate change and reduce the dependence on fossil fuels, governments around the world are increasing the adoption of renewable energy sources. However, among the US states, we observe a wide disparity in renewable penetration. In this study, we have identified and cleaned over a dozen datasets representing solar energy penetration in each US state, and the potentially relevant socioeconomic and other factors that may be driving the growth in solar. We have applied a number of predictive modeling approaches - including machine learning and regression - on these datasets over a 17-year period and evaluated the relative performance of the models. Our goals were: (1) identify the most important factors that are driving the growth in solar, (2) choose the most effective predictive modeling technique for solar growth, and (3) develop a model for predicting next year’s solar growth using this year’s data. We obtained very promising results with random forests (about 90% efficacy) and varying degrees of success with support vector machines and regression techniques (linear, polynomial, ridge). We also identified states with solar growth slower than expected and representing a potential for stronger growth in future.
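    A minimal sketch of the modeling setup described above is given below; the file name, feature columns, and target column are placeholders, and the random forest configuration is illustrative rather than the study's exact specification.

```python
# Hedged sketch: fit a random forest on this year's state-level features to predict next
# year's solar growth, then rank feature importances as candidate key drivers.
import pandas as pd
from sklearn.ensemble import RandomForestRegressor
from sklearn.model_selection import train_test_split
from sklearn.metrics import r2_score

df = pd.read_csv("state_year_features.csv")          # hypothetical file: one row per state-year
features = ["median_income", "electricity_price", "insolation", "policy_score", "current_solar_mw"]
X, y = df[features], df["next_year_solar_growth"]    # hypothetical target column

X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.25, random_state=0)
model = RandomForestRegressor(n_estimators=500, random_state=0).fit(X_train, y_train)

print("R^2 on held-out state-years:", r2_score(y_test, model.predict(X_test)))
print(sorted(zip(model.feature_importances_, features), reverse=True))  # rank the key drivers
```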

  2. Evaluation of Antigen-Conjugated Fluorescent Beads to Identify Antigen-Specific B Cells.

    Science.gov (United States)

    Correa, Isabel; Ilieva, Kristina M; Crescioli, Silvia; Lombardi, Sara; Figini, Mariangela; Cheung, Anthony; Spicer, James F; Tutt, Andrew N J; Nestle, Frank O; Karagiannis, Panagiotis; Lacy, Katie E; Karagiannis, Sophia N

    2018-01-01

    Selection of single antigen-specific B cells to identify their expressed antibodies is of considerable interest for evaluating human immune responses. Here, we present a method to identify single antibody-expressing cells using antigen-conjugated fluorescent beads. To establish this, we selected Folate Receptor alpha (FRα) as a model antigen and a mouse B cell line, expressing both the soluble and the membrane-bound forms of a human/mouse chimeric antibody (MOv18 IgG1) specific for FRα, as test antibody-expressing cells. Beads were conjugated to FRα using streptavidin/avidin-biotin bridges and used to select single cells expressing the membrane-bound form of anti-FRα. Bead-bound cells were single cell-sorted and processed for single cell RNA retrotranscription and PCR to isolate antibody heavy and light chain variable regions. Variable regions were then cloned and expressed as human IgG1/k antibodies. Like the original clone, engineered antibodies from single cells recognized native FRα. To evaluate whether antigen-coated beads could identify specific antibody-expressing cells in mixed immune cell populations, human peripheral blood mononuclear cells (PBMCs) were spiked with test antibody-expressing cells. Antigen-specific cells could comprise up to 75% of cells selected with antigen-conjugated beads when the frequency of the antigen-positive cells was 1:100 or higher. In PBMC pools, beads conjugated to recombinant antigens FRα and HER2 bound antigen-specific anti-FRα MOv18 and anti-HER2 Trastuzumab antibody-expressing cells, respectively. From melanoma patient-derived B cells selected with melanoma cell line-derived protein-coated fluorescent beads, we generated a monoclonal antibody that recognized melanoma antigen-coated beads. This approach may be further developed to facilitate analysis of B cells and their antibody profiles at the single cell level and to help unravel humoral immune repertoires.

  3. Evaluation of Antigen-Conjugated Fluorescent Beads to Identify Antigen-Specific B Cells

    Directory of Open Access Journals (Sweden)

    Isabel Correa

    2018-03-01

    Full Text Available Selection of single antigen-specific B cells to identify their expressed antibodies is of considerable interest for evaluating human immune responses. Here, we present a method to identify single antibody-expressing cells using antigen-conjugated fluorescent beads. To establish this, we selected Folate Receptor alpha (FRα as a model antigen and a mouse B cell line, expressing both the soluble and the membrane-bound forms of a human/mouse chimeric antibody (MOv18 IgG1 specific for FRα, as test antibody-expressing cells. Beads were conjugated to FRα using streptavidin/avidin-biotin bridges and used to select single cells expressing the membrane-bound form of anti-FRα. Bead-bound cells were single cell-sorted and processed for single cell RNA retrotranscription and PCR to isolate antibody heavy and light chain variable regions. Variable regions were then cloned and expressed as human IgG1/k antibodies. Like the original clone, engineered antibodies from single cells recognized native FRα. To evaluate whether antigen-coated beads could identify specific antibody-expressing cells in mixed immune cell populations, human peripheral blood mononuclear cells (PBMCs were spiked with test antibody-expressing cells. Antigen-specific cells could comprise up to 75% of cells selected with antigen-conjugated beads when the frequency of the antigen-positive cells was 1:100 or higher. In PBMC pools, beads conjugated to recombinant antigens FRα and HER2 bound antigen-specific anti-FRα MOv18 and anti-HER2 Trastuzumab antibody-expressing cells, respectively. From melanoma patient-derived B cells selected with melanoma cell line-derived protein-coated fluorescent beads, we generated a monoclonal antibody that recognized melanoma antigen-coated beads. This approach may be further developed to facilitate analysis of B cells and their antibody profiles at the single cell level and to help unravel humoral immune repertoires.

  4. An Evaluation of Algorithms for Identifying Metastatic Breast, Lung, or Colorectal Cancer in Administrative Claims Data.

    Science.gov (United States)

    Whyte, Joanna L; Engel-Nitz, Nicole M; Teitelbaum, April; Gomez Rey, Gabriel; Kallich, Joel D

    2015-07-01

    Administrative health care claims data are used for epidemiologic, health services, and outcomes cancer research and thus play a significant role in policy. Cancer stage, which is often a major driver of cost and clinical outcomes, is not typically included in claims data. Evaluate algorithms used in a dataset of cancer patients to identify patients with metastatic breast (BC), lung (LC), or colorectal (CRC) cancer using claims data. Clinical data on BC, LC, or CRC patients (between January 1, 2007 and March 31, 2010) were linked to a health care claims database. Inclusion required health plan enrollment ≥3 months before initial cancer diagnosis date. Algorithms were used in the claims database to identify patients' disease status, which was compared with physician-reported metastases. Generic and tumor-specific algorithms were evaluated using ICD-9 codes, varying diagnosis time frames, and including/excluding other tumors. Positive and negative predictive values, sensitivity, and specificity were assessed. The linked databases included 14,480 patients; of whom, 32%, 17%, and 14.2% had metastatic BC, LC, and CRC, respectively, at diagnosis and met inclusion criteria. Nontumor-specific algorithms had lower specificity than tumor-specific algorithms. Tumor-specific algorithms' sensitivity and specificity were 53% and 99% for BC, 55% and 85% for LC, and 59% and 98% for CRC, respectively. Algorithms to distinguish metastatic BC, LC, and CRC from locally advanced disease should use tumor-specific primary cancer codes with 2 claims for the specific primary cancer >30-42 days apart to reduce misclassification. These performed best overall in specificity, positive predictive values, and overall accuracy to identify metastatic cancer in a health care claims database.
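    The tumor-specific rule recommended above (two claims bearing the primary cancer code more than 30 days apart) can be expressed as a small pandas routine; the column names and the example ICD-9 code below are illustrative.

```python
# Pandas sketch of a two-claim, >30-day-apart rule for a tumor-specific primary cancer code.
import pandas as pd

def meets_two_claim_rule(claims, primary_codes, min_gap_days=30):
    """claims: DataFrame with patient_id, diagnosis_code, service_date columns.
    Returns patient_ids with >=2 qualifying claims more than `min_gap_days` apart."""
    subset = claims[claims["diagnosis_code"].isin(primary_codes)].copy()
    subset["service_date"] = pd.to_datetime(subset["service_date"])
    qualifying = set()
    for pid, grp in subset.groupby("patient_id"):
        dates = grp["service_date"].sort_values()
        if len(dates) >= 2 and (dates.iloc[-1] - dates.iloc[0]).days > min_gap_days:
            qualifying.add(pid)
    return qualifying

claims = pd.DataFrame({
    "patient_id": [1, 1, 2],
    "diagnosis_code": ["174.9", "174.9", "174.9"],    # illustrative ICD-9 breast cancer code
    "service_date": ["2009-01-05", "2009-03-10", "2009-02-01"],
})
print(meets_two_claim_rule(claims, {"174.9"}))        # -> {1}
```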

  5. Evaluation of Antigen-Conjugated Fluorescent Beads to Identify Antigen-Specific B Cells

    Science.gov (United States)

    Correa, Isabel; Ilieva, Kristina M.; Crescioli, Silvia; Lombardi, Sara; Figini, Mariangela; Cheung, Anthony; Spicer, James F.; Tutt, Andrew N. J.; Nestle, Frank O.; Karagiannis, Panagiotis; Lacy, Katie E.; Karagiannis, Sophia N.

    2018-01-01

    Selection of single antigen-specific B cells to identify their expressed antibodies is of considerable interest for evaluating human immune responses. Here, we present a method to identify single antibody-expressing cells using antigen-conjugated fluorescent beads. To establish this, we selected Folate Receptor alpha (FRα) as a model antigen and a mouse B cell line, expressing both the soluble and the membrane-bound forms of a human/mouse chimeric antibody (MOv18 IgG1) specific for FRα, as test antibody-expressing cells. Beads were conjugated to FRα using streptavidin/avidin-biotin bridges and used to select single cells expressing the membrane-bound form of anti-FRα. Bead-bound cells were single cell-sorted and processed for single cell RNA retrotranscription and PCR to isolate antibody heavy and light chain variable regions. Variable regions were then cloned and expressed as human IgG1/k antibodies. Like the original clone, engineered antibodies from single cells recognized native FRα. To evaluate whether antigen-coated beads could identify specific antibody-expressing cells in mixed immune cell populations, human peripheral blood mononuclear cells (PBMCs) were spiked with test antibody-expressing cells. Antigen-specific cells could comprise up to 75% of cells selected with antigen-conjugated beads when the frequency of the antigen-positive cells was 1:100 or higher. In PBMC pools, beads conjugated to recombinant antigens FRα and HER2 bound antigen-specific anti-FRα MOv18 and anti-HER2 Trastuzumab antibody-expressing cells, respectively. From melanoma patient-derived B cells selected with melanoma cell line-derived protein-coated fluorescent beads, we generated a monoclonal antibody that recognized melanoma antigen-coated beads. This approach may be further developed to facilitate analysis of B cells and their antibody profiles at the single cell level and to help unravel humoral immune repertoires. PMID:29628923

  6. Standardized evaluation of lung congestion during COPD exacerbation better identifies patients at risk of dying

    Directory of Open Access Journals (Sweden)

    Høiseth AD

    2013-12-01

    Full Text Available Arne Didrik Høiseth (1), Torbjørn Omland (1), Bo Daniel Karlsson (2), Pål H Brekke (1), Vidar Søyseth (1). 1 Cardiothoracic Research Group, Division of Medicine, Akershus University Hospital and Institute of Clinical Medicine, University of Oslo, Oslo, Norway; 2 Department of Radiology, Akershus University Hospital, Lørenskog, Norway. Background: Congestive heart failure is underdiagnosed in patients with chronic obstructive pulmonary disease (COPD). Pulmonary congestion on chest radiograph at admission for acute exacerbation of COPD (AECOPD) is associated with an increased risk of mortality. A standardized evaluation of chest radiographs may enhance prognostic accuracy. Purpose: We aimed to evaluate whether a standardized, liberal assessment of pulmonary congestion is superior to the routine assessment in identifying patients at increased risk of long-term mortality, and to investigate the association of heart failure with N-terminal prohormone of brain natriuretic peptide (NT-proBNP) concentrations. Material and methods: This was a prospective cohort study of 99 patients admitted for AECOPD. Chest radiographs obtained on admission were routinely evaluated and then later evaluated by blinded investigators using a standardized protocol looking for Kerley B lines, enlarged vessels in the lung apex, perihilar cuffing, peribronchial haze, and interstitial or alveolar edema, defining the presence of pulmonary congestion. Adjusted associations with long-term mortality and NT-proBNP concentration were calculated. Results: The standardized assessment was positive for pulmonary congestion in 32 of the 195 radiographs (16%) ruled negative in the routine assessment. The standardized assessment was superior in predicting death during a median follow-up of 1.9 years (P=0.022), and in multivariable analysis, only the standardized assessment showed a significant association with mortality (hazard ratio 2.4, 95% confidence interval [CI] 1.2–4.7; P=0.016) and NT-proBNP (relative

  7. Verifying the integrity of hardcopy document using OCR

    CSIR Research Space (South Africa)

    Mthethwa, Sthembile

    2018-03-01

    Full Text Available Verifying the Integrity... of the document to be defined. Each text in the meta-template is labelled with a unique identifier, which makes it easier for the process of validation. The meta-template consists of two types of text: normal text and validation text (important text that must...

  8. Detection of Botnet Command and Control Traffic by the Multistage Trust Evaluation of Destination Identifiers

    Directory of Open Access Journals (Sweden)

    Pieter Burghouwt

    2015-10-01

Full Text Available Network-based detection of botnet Command and Control communication is a difficult task if the traffic has a relatively low volume and if popular protocols, such as HTTP, are used to resemble normal traffic. We present a new network-based detection approach that is capable of detecting this type of Command and Control traffic in an enterprise network by estimating the trustworthiness of the traffic destinations. If the destination identifier of a traffic flow originates directly from human input, prior traffic from a trusted destination, or a defined set of legitimate applications, the destination is trusted and its associated traffic is classified as normal. Advantages of this approach are the ability to detect zero-day malicious traffic, low exposure to malware through passive host-external traffic monitoring, and applicability to real-time filtering. Experimental evaluation demonstrates successful detection of diverse types of Command and Control traffic.
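
    The selection heuristic described in this abstract lends itself to a compact sketch. The following is a minimal illustration of the idea only, not the authors' implementation; the Flow fields, whitelists, and example destinations are assumptions made for the example.

```python
# Minimal sketch of the trust heuristic summarized above: a flow's destination is trusted
# only if its identifier can be traced to human input, prior traffic to an already trusted
# destination, or a defined set of legitimate applications. Field names, whitelists and
# example destinations are illustrative assumptions, not the authors' implementation.
from dataclasses import dataclass

@dataclass
class Flow:
    destination: str        # e.g. a domain name or IP address
    referrer: str | None    # destination that triggered this request, if any
    process: str            # application that opened the connection

human_entered: set[str] = set()          # destinations typed or clicked by the user
trusted_destinations: set[str] = set()   # destinations already classified as trusted
whitelisted_apps = {"update-agent", "time-sync"}  # defined set of legitimate applications

def classify(flow: Flow) -> str:
    if (flow.destination in human_entered
            or (flow.referrer is not None and flow.referrer in trusted_destinations)
            or flow.process in whitelisted_apps):
        trusted_destinations.add(flow.destination)
        return "normal"
    return "suspect"  # candidate Command and Control traffic for closer inspection

human_entered.add("intranet.example.com")
print(classify(Flow("intranet.example.com", None, "browser")))               # normal: human input
print(classify(Flow("cdn.example.net", "intranet.example.com", "browser")))  # normal: trusted referrer
print(classify(Flow("rnd-domain-xyz.biz", None, "svchost")))                 # suspect
```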

  9. Technical evaluation of methods for identifying chemotherapy-induced febrile neutropenia in healthcare claims databases

    Directory of Open Access Journals (Sweden)

    Weycker Derek

    2013-02-01

Full Text Available Abstract Background Healthcare claims databases have been used in several studies to characterize the risk and burden of chemotherapy-induced febrile neutropenia (FN) and the effectiveness of colony-stimulating factors against FN. The accuracy of methods previously used to identify FN in such databases has not been formally evaluated. Methods Data comprised linked electronic medical records from Geisinger Health System and healthcare claims data from Geisinger Health Plan. Subjects were classified into subgroups based on whether or not they were hospitalized for FN per the presumptive “gold standard” (ANC 9/L, and body temperature ≥38.3°C or receipt of antibiotics) and the claims-based definition (diagnosis codes for neutropenia, fever, and/or infection). Accuracy was evaluated principally based on positive predictive value (PPV) and sensitivity. Results Among 357 study subjects, 82 (23%) met the gold standard for hospitalized FN. For the claims-based definition including diagnosis codes for neutropenia plus fever in any position (n=28), PPV was 100% and sensitivity was 34% (95% CI: 24–45). For the definition including neutropenia in the primary position (n=54), PPV was 87% (78–95) and sensitivity was 57% (46–68). For the definition including neutropenia in any position (n=71), PPV was 77% (68–87) and sensitivity was 67% (56–77). Conclusions Patients hospitalized for chemotherapy-induced FN can be identified in healthcare claims databases, with an acceptable level of misclassification, using diagnosis codes for neutropenia or neutropenia plus fever.
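
    The reported sensitivities can be sanity-checked from the counts in the abstract. A minimal arithmetic sketch (true-positive counts are back-calculated from the stated PPVs, so they are approximate):

```python
# Arithmetic check of the reported figures: true positives are back-calculated from each
# definition's PPV, then sensitivity = TP / gold-standard positives (82 patients).
gold_standard_positives = 82

definitions = {
    "neutropenia + fever, any position": {"flagged": 28, "ppv": 1.00},
    "neutropenia, primary position":     {"flagged": 54, "ppv": 0.87},
    "neutropenia, any position":         {"flagged": 71, "ppv": 0.77},
}

for name, d in definitions.items():
    true_positives = d["flagged"] * d["ppv"]                 # PPV = TP / flagged cases
    sensitivity = true_positives / gold_standard_positives   # sensitivity = TP / gold-standard cases
    print(f"{name}: ~{true_positives:.0f} true positives, sensitivity ≈ {sensitivity:.0%}")
# Reproduces the reported sensitivities of roughly 34%, 57% and 67%.
```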

  10. Evaluation of an online family history tool for identifying hereditary and familial colorectal cancer.

    Science.gov (United States)

    Kallenberg, F G J; Aalfs, C M; The, F O; Wientjes, C A; Depla, A C; Mundt, M W; Bossuyt, P M M; Dekker, E

    2017-09-21

Identifying a hereditary colorectal cancer (CRC) syndrome or familial CRC (FCC) in a CRC patient may enable the patient and relatives to enroll in surveillance protocols. As these individuals are insufficiently recognized, we evaluated an online family history tool, consisting of a patient-administered family history questionnaire and an automated genetic referral recommendation, to facilitate the identification of patients with hereditary CRC or FCC. Between 2015 and 2016, all newly diagnosed CRC patients in five Dutch outpatient clinics were included, at their first visit to the clinic, in a trial with a stepped-wedge design. Each hospital continued standard procedures for identifying patients at risk (control strategy) and then, after a predetermined period, switched to offering the family history tool to included patients (intervention strategy). After considering the tool-based recommendation, the health care provider could decide on and arrange the referral. The primary outcome was the relative number of CRC patients who received screening or surveillance recommendations for themselves or relatives because of hereditary CRC or FCC, provided by genetic counseling. The intervention effect was evaluated using a logit-linear model. With the tool, 46/489 (9.4%) patients received a screening or surveillance recommendation, compared to 35/292 (12.0%) in the control group. In the intention-to-treat analysis, accounting for time trends and hospital effects, this difference was not statistically significant (p = 0.58). A family history tool does not necessarily assist in increasing the number of CRC patients and relatives enrolled in screening or surveillance recommendations for hereditary CRC or FCC. Other interventions should be considered.

  11. Evaluation of the WinROP system for identifying retinopathy of prematurity in Czech preterm infants.

    Science.gov (United States)

    Timkovic, Juraj; Pokryvkova, Martina; Janurova, Katerina; Barinova, Denisa; Polackova, Renata; Masek, Petr

    2017-03-01

Retinopathy of Prematurity (ROP) is a potentially serious condition that can afflict preterm infants. Timely and correct identification of individuals at risk of developing a serious form of ROP is therefore of paramount importance. WinROP is an online system for predicting ROP based on birth weight and weight increments. However, its results vary significantly across populations, and it has not previously been evaluated in the Czech population. This study evaluates the test characteristics (specificity, sensitivity, positive and negative predictive values) of the WinROP system in Czech preterm infants. Data on 445 prematurely born infants included in the ROP screening program at the University Hospital Ostrava, Czech Republic, were retrospectively entered into the WinROP system and the outcomes of the WinROP and regular screening were compared. All 24 infants who developed high-risk (Type 1 or Type 2) ROP were correctly identified by the system. The sensitivity and negative predictive values for this group were 100%. However, the specificity and positive predictive values were substantially lower, resulting in a large number of false positives. When the analysis was extended to low-risk ROP, the system did not provide such reliable results. The system is a valuable tool for identifying infants who are not likely to develop high-risk ROP, and this could help to substantially reduce the number of preterm infants in need of regular ROP screening. It is not suitable for predicting the development of less serious forms of ROP, which is, however, in accordance with the declared aims of the WinROP system.

  12. Can surveillance systems identify and avert adverse drug events? A prospective evaluation of a commercial application.

    Science.gov (United States)

    Jha, Ashish K; Laguette, Julia; Seger, Andrew; Bates, David W

    2008-01-01

Computerized monitors can effectively detect and potentially prevent adverse drug events (ADEs). Most monitors have been developed in large academic hospitals and are not readily usable in other settings. We assessed the ability of a commercial program to identify and prevent ADEs in a community hospital. We prospectively evaluated the commercial application in a community-based hospital and examined the frequency and types of alerts produced, how often they were associated with ADEs and potential ADEs, and the potential financial impact of monitoring for ADEs. Among 2,407 patients screened, the application generated 516 high-priority alerts. We were able to review 266 alerts at the time they were generated, and among these, 30 (11.3%) were considered sufficiently important to warrant contacting the physician caring for the patient. These 30 alerts were associated with 4 ADEs and 11 potential ADEs. In all 15 cases, the responsible physician was unaware of the event, leading to a change in clinical care in 14 cases. Overall, 23% of high-priority alerts were associated with an ADE (95% confidence interval [CI] 12% to 34%) and another 15% were associated with a potential ADE (95% CI 6% to 24%). Active surveillance used approximately 1.5 hours of pharmacist time daily. A commercially available, computer-based ADE detection tool was effective at identifying ADEs. When used as part of an active surveillance program, it can have an impact on preventing or ameliorating ADEs.

  13. FACEBOOK for CoP of Researchers: Identifying the Needs and Evaluating the Compatibility

    Directory of Open Access Journals (Sweden)

    Sami Miniaoui

    2011-11-01

Full Text Available Communities of practice (CoPs) are increasingly capturing the interest of many fields, such as business companies, education, and organizations. Many CoPs were developed for people who have a common interest in healthcare, agriculture and environment, and teaching. However, there is a lack of CoPs dedicated to researchers. This research aims to explore the appropriateness of Facebook (FB) as a platform for serving a CoP of researchers. To achieve this goal, we first identify the needs of CoPs for researchers within the UAE context. Consequently, we adopted a qualitative research approach to elicit the needs. We applied the grounded theory method to analyze the data. The results of the analysis showed seven main needs: collaboration, debating, awareness/notification, reference management, cross search, customization, tracking, and user orientation. Secondly, we evaluated the compatibility of FB features with the identified needs. Although we found that FB covers most of the CoPs' needs, there are a few needs that are not met successfully; this raises some technical and practical issues, which are highlighted in the paper.

  14. Verifying competence of operations personnel in nuclear power plants

    International Nuclear Information System (INIS)

    Farber, G.H.

    1986-01-01

    To ensure that only competent people are authorized to fill positions in a nuclear power plant, both the initial competence of personnel and the continuous maintenance of competence have to be verified. Two main methods are normally used for verifying competence, namely evaluation of a person's performance over a period of time, and evaluation of his knowledge and skills at a particular time by means of an examination. Both methods have limitations, and in practice they are often used together to give different and to some extent complementary evaluations of a person's competence. Verification of competence itself is a problem area, because objective judging of human competence is extremely difficult. Formal verification methods, such as tests and examinations, are particularly or exclusively applied for the direct operating personnel in the control room (very rarely for management personnel). Out of the many elements contributing to a person's competence, the knowledge which is needed and the intellectual skills are the main subjects of the formal verification methods. Therefore the presentation will concentrate on the proof of the technical qualification of operators by means of examinations. The examination process in the Federal Republic of Germany for the proof of knowledge and skills will serve as an example to describe and analyze the important aspects. From that recommendations are derived regarding standardization of the procedure as well as validation. (orig./GL)

  15. Can 3D ultrasound identify trochlea dysplasia in newborns? Evaluation and applicability of a technique

    Energy Technology Data Exchange (ETDEWEB)

Kohlhof, Hendrik, E-mail: Hendrik.Kohlhof@ukb.uni-bonn.de [Clinic for Orthopedics and Trauma Surgery, University Hospital Bonn, Sigmund-Freud-Str. 25, 53127 Bonn (Germany); Heidt, Christoph, E-mail: Christoph.heidt@kispi.uzh.ch [Department of Orthopedic Surgery, University Children's Hospital Zurich, Steinwiesstrasse 74, 8032 Switzerland (Switzerland); Bähler, Alexandrine, E-mail: Alexandrine.baehler@insel.ch [Department of Pediatric Radiology, University Children's Hospital Berne, Freiburgstrasse 18, 3010 Berne (Switzerland); Kohl, Sandro, E-mail: sandro.kohl@insel.ch [Department of Orthopedic Surgery, University Hospital Berne, Freiburgstrasse 18, 3010 Berne (Switzerland); Gravius, Sascha, E-mail: sascha.gravius@ukb.uni-bonn.de [Clinic for Orthopedics and Trauma Surgery, University Hospital Bonn, Sigmund-Freud-Str. 25, 53127 Bonn (Germany); Friedrich, Max J., E-mail: Max.Friedrich@ukb.uni-bonn.de [Clinic for Orthopedics and Trauma Surgery, University Hospital Bonn, Sigmund-Freud-Str. 25, 53127 Bonn (Germany); Ziebarth, Kai, E-mail: kai.ziebarth@insel.ch [Department of Orthopedic Surgery, University Hospital Berne, Freiburgstrasse 18, 3010 Berne (Switzerland); Stranzinger, Enno, E-mail: Enno.Stranzinger@insel.ch [Department of Pediatric Radiology, University Children's Hospital Berne, Freiburgstrasse 18, 3010 Berne (Switzerland)

    2015-06-15

Highlights: • We evaluated a possible screening method for trochlea dysplasia. • 3D ultrasound was used to perform the measurements on standardized axial planes. • The evaluation of the technique showed comparable results to other studies. • This technique may be used as a screening technique as it is quick and easy to perform. - Abstract: Femoro-patellar dysplasia is considered a significant risk factor for patellar instability. Different studies suggest that the shape of the trochlea is already developed in early childhood. Therefore, early identification of a dysplastic configuration might be relevant information for the treating physician. An easily applicable routine screening method for the trochlea is not yet available. The purpose of this study was to establish and evaluate a screening method for femoro-patellar dysplasia using 3D ultrasound. From 2012 to 2013 we prospectively imaged 160 consecutive femoro-patellar joints in 80 newborns from the 36th to 61st gestational week who underwent routine hip sonography (Graf). All ultrasounds were performed by a pediatric radiologist with only minimal additional time beyond the routine hip ultrasound. In 30° flexion of the knee, axial, coronal, and sagittal reformats were used to standardize a reconstructed axial plane through the femoral condyle and the mid-patella. The sulcus angle, the lateral-to-medial facet ratio of the trochlea and the shape of the patella (Wiberg Classification) were evaluated. In all examinations reconstruction of the standardized axial plane was achieved, the mean trochlea angle was 149.1° (SD 4.9°), the lateral-to-medial facet ratio of the trochlea was 1.3 (SD 0.22), and a Wiberg type I patella was found in 95% of the newborns. No statistical difference was detected between boys and girls. Using standardized reconstructions of the axial plane allows measurements to be made with lower operator dependency and higher accuracy in a short time. Therefore 3D ultrasound is an easy

  16. Can 3D ultrasound identify trochlea dysplasia in newborns? Evaluation and applicability of a technique

    International Nuclear Information System (INIS)

    Kohlhof, Hendrik; Heidt, Christoph; Bähler, Alexandrine; Kohl, Sandro; Gravius, Sascha; Friedrich, Max J.; Ziebarth, Kai; Stranzinger, Enno

    2015-01-01

Highlights: • We evaluated a possible screening method for trochlea dysplasia. • 3D ultrasound was used to perform the measurements on standardized axial planes. • The evaluation of the technique showed comparable results to other studies. • This technique may be used as a screening technique as it is quick and easy to perform. - Abstract: Femoro-patellar dysplasia is considered a significant risk factor for patellar instability. Different studies suggest that the shape of the trochlea is already developed in early childhood. Therefore, early identification of a dysplastic configuration might be relevant information for the treating physician. An easily applicable routine screening method for the trochlea is not yet available. The purpose of this study was to establish and evaluate a screening method for femoro-patellar dysplasia using 3D ultrasound. From 2012 to 2013 we prospectively imaged 160 consecutive femoro-patellar joints in 80 newborns from the 36th to 61st gestational week who underwent routine hip sonography (Graf). All ultrasounds were performed by a pediatric radiologist with only minimal additional time beyond the routine hip ultrasound. In 30° flexion of the knee, axial, coronal, and sagittal reformats were used to standardize a reconstructed axial plane through the femoral condyle and the mid-patella. The sulcus angle, the lateral-to-medial facet ratio of the trochlea and the shape of the patella (Wiberg Classification) were evaluated. In all examinations reconstruction of the standardized axial plane was achieved, the mean trochlea angle was 149.1° (SD 4.9°), the lateral-to-medial facet ratio of the trochlea was 1.3 (SD 0.22), and a Wiberg type I patella was found in 95% of the newborns. No statistical difference was detected between boys and girls. Using standardized reconstructions of the axial plane allows measurements to be made with lower operator dependency and higher accuracy in a short time. Therefore 3D ultrasound is an easy

  17. Large-scale evaluation of candidate genes identifies associations between VEGF polymorphisms and bladder cancer risk.

    Directory of Open Access Journals (Sweden)

    Montserrat García-Closas

    2007-02-01

Full Text Available Common genetic variation could alter the risk for developing bladder cancer. We conducted a large-scale evaluation of single nucleotide polymorphisms (SNPs) in candidate genes for cancer to identify common variants that influence bladder cancer risk. An Illumina GoldenGate assay was used to genotype 1,433 SNPs within or near 386 genes in 1,086 cases and 1,033 controls in Spain. The most significant finding was in the 5' UTR of VEGF (rs25648; p for likelihood ratio test with 2 degrees of freedom = 1 x 10^(-5)). To further investigate the region, we analyzed 29 additional SNPs in VEGF, selected to saturate the promoter and 5' UTR and to tag common genetic variation in this gene. Three additional SNPs in the promoter region (rs833052, rs1109324, and rs1547651) were associated with increased risk for bladder cancer: odds ratio (95% confidence interval) 2.52 (1.06-5.97), 2.74 (1.26-5.98), and 3.02 (1.36-6.63), respectively; and a polymorphism in intron 2 (rs3024994) was associated with reduced risk: 0.65 (0.46-0.91). Two of the promoter SNPs and the intron 2 SNP showed linkage disequilibrium with rs25648. Haplotype analyses revealed three blocks of linkage disequilibrium with significant associations for two blocks including the promoter and 5' UTR (global p = 0.02 and 0.009, respectively). These findings are biologically plausible since VEGF is critical in angiogenesis, which is important for tumor growth, its elevated expression in bladder tumors correlates with tumor progression, and specific 5' UTR haplotypes have been shown to influence promoter activity. Associations between bladder cancer risk and other genes in this report were not robust based on false discovery rate calculations. In conclusion, this large-scale evaluation of candidate cancer genes has identified common genetic variants in the regulatory regions of VEGF that could be associated with bladder cancer risk.
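
    For readers unfamiliar with how such association estimates are obtained, the following is a generic worked example of an odds ratio with a Wald 95% confidence interval; the 2x2 counts are hypothetical and are not the study's genotype data.

```python
# Generic worked example of an odds ratio with a Wald 95% CI from a 2x2 table.
# The counts below are hypothetical and are NOT the study's genotype data.
import math

a, b = 60, 1026   # cases: risk-allele carriers, non-carriers (hypothetical)
c, d = 25, 1008   # controls: risk-allele carriers, non-carriers (hypothetical)

odds_ratio = (a * d) / (b * c)
se_log_or = math.sqrt(1 / a + 1 / b + 1 / c + 1 / d)
ci_low = math.exp(math.log(odds_ratio) - 1.96 * se_log_or)
ci_high = math.exp(math.log(odds_ratio) + 1.96 * se_log_or)
print(f"OR = {odds_ratio:.2f}, 95% CI ({ci_low:.2f}-{ci_high:.2f})")
```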

  18. External evaluation of the Radiation Therapy Oncology Group brachial plexus contouring protocol: several issues identified

    International Nuclear Information System (INIS)

    Min, Myo; Carruthers, Scott; Zanchetta, Lydia; Roos, Daniel; Keating, Elly; Shakeshaft, John; Baxi, Siddhartha; Penniment, Michael; Wong, Karen

    2014-01-01

The aims of the study were to evaluate interobserver variability in contouring the brachial plexus (BP) using the Radiation Therapy Oncology Group (RTOG)-approved protocol and to analyse BP dosimetries. Seven outliners independently contoured the BPs of 15 consecutive patients. Interobserver variability was reviewed qualitatively (visually by using planning axial computed-tomography images and anteroposterior digitally reconstructed radiographs) and quantitatively (by volumetric and statistical analyses). Dose–volume histograms of BPs were calculated and compared. We found significant interobserver variability among outliners in both qualitative and quantitative analyses. These were most pronounced for the T1 nerve roots on visual inspection and for the BP volume on statistical analysis. The BP volumes were smaller than those described in the RTOG atlas paper, with a mean volume of 20.8 cc (range 11–40.7 cc) compared with 33 ± 4 cc (25.1–39.4 cc). The average values of mean dose, maximum dose, V60Gy, V66Gy and V70Gy for patients treated with conventional radiotherapy and IMRT were 42.2 Gy versus 44.8 Gy, 64.5 Gy versus 68.5 Gy, 6.1% versus 7.6%, 2.9% versus 2.4% and 0.6% versus 0.3%, respectively. This is the first independent external evaluation of the published protocol. We have identified several issues, including significant interobserver variation. Although radiation oncologists should contour BPs to avoid dose dumping, especially when using IMRT, the RTOG atlas should be used with caution. Because BPs are largely radiologically occult on CT, we propose the term brachial-plexus regions (BPRs) to represent regions where BPs are likely to be present. Consequently, BPRs should in principle be contoured generously.

  19. A control system verifier using automated reasoning software

    International Nuclear Information System (INIS)

    Smith, D.E.; Seeman, S.E.

    1985-08-01

    An on-line, automated reasoning software system for verifying the actions of other software or human control systems has been developed. It was demonstrated by verifying the actions of an automated procedure generation system. The verifier uses an interactive theorem prover as its inference engine with the rules included as logical axioms. Operation of the verifier is generally transparent except when the verifier disagrees with the actions of the monitored software. Testing with an automated procedure generation system demonstrates the successful application of automated reasoning software for verification of logical actions in a diverse, redundant manner. A higher degree of confidence may be placed in the verified actions of the combined system

  20. Baccalaureate Nursing Students' Abilities in Critically Identifying and Evaluating the Quality of Online Health Information.

    Science.gov (United States)

    Theron, Maggie; Redmond, Anne; Borycki, Elizabeth M

    2017-01-01

    Both the Internet and social media have become important tools that patients and health professionals, including health professional students, use to obtain information and support their decision-making surrounding health care. Students in the health sciences require increased competence to select, appraise, and use online sources to adequately educate and support patients and advocate for patient needs and best practices. The purpose of this study was to ascertain if second year nursing students have the ability to critically identify and evaluate the quality of online health information through comparisons between student and expert assessments of selected online health information postings using an adapted Trust in Online Health Information scale. Interviews with experts provided understanding of how experts applied the selected criteria and what experts recommend for implementing nursing informatics literacy in curriculums. The difference between student and expert assessments of the quality of the online information is on average close to 40%. Themes from the interviews highlighted several possible factors that may influence informatics competency levels in students, specifically regarding the critical appraisal of the quality of online health information.

  1. Statistical Evaluation of the Identified Structural Parameters of an idling Offshore Wind Turbine

    International Nuclear Information System (INIS)

    Kramers, Hendrik C.; Van der Valk, Paul L.C.; Van Wingerden, Jan-Willem

    2016-01-01

With the increased need for renewable energy, new offshore wind farms are being developed at an unprecedented scale. However, as the cost of offshore wind energy is still too high, design optimization and new innovations are required to lower it. The design of modern-day offshore wind turbines relies on numerical models for estimating the ultimate and fatigue loads of the turbines. The dynamic behavior and the resulting structural loading of the turbines are determined to a large extent by their structural properties, such as the natural frequencies and damping ratios. Hence, it is important to obtain accurate estimates of these modal properties. For this purpose, stochastic subspace identification (SSI), in combination with clustering and statistical evaluation methods, is used to obtain the variance of the identified modal properties of an installed 3.6 MW offshore wind turbine in idling conditions. It is found that one is able to obtain confidence intervals for the means of eigenfrequencies and damping ratios of the fore-aft and side-side modes of the wind turbine. (paper)
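
    The kind of statistical evaluation described here can be illustrated with a small sketch: given repeated SSI estimates of an eigenfrequency, form a t-based confidence interval for its mean. The frequency values below are placeholders, not results from the paper; the same pattern would apply to identified damping ratios.

```python
# Sketch: a t-based confidence interval for the mean of repeatedly identified eigenfrequencies.
# The frequency estimates below are hypothetical placeholders, not results from the paper.
import numpy as np
from scipy import stats

identified_freqs = np.array([0.302, 0.305, 0.299, 0.304, 0.301, 0.303])  # Hz, one SSI estimate per data block

mean = identified_freqs.mean()
sem = stats.sem(identified_freqs)  # standard error of the mean
ci_low, ci_high = stats.t.interval(0.95, df=len(identified_freqs) - 1, loc=mean, scale=sem)
print(f"mean = {mean:.4f} Hz, 95% CI = [{ci_low:.4f}, {ci_high:.4f}] Hz")
```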

  2. Design of a verifiable subset for HAL/S

    Science.gov (United States)

    Browne, J. C.; Good, D. I.; Tripathi, A. R.; Young, W. D.

    1979-01-01

    An attempt to evaluate the applicability of program verification techniques to the existing programming language, HAL/S is discussed. HAL/S is a general purpose high level language designed to accommodate the software needs of the NASA Space Shuttle project. A diversity of features for scientific computing, concurrent and real-time programming, and error handling are discussed. The criteria by which features were evaluated for inclusion into the verifiable subset are described. Individual features of HAL/S with respect to these criteria are examined and justification for the omission of various features from the subset is provided. Conclusions drawn from the research are presented along with recommendations made for the use of HAL/S with respect to the area of program verification.

  3. Radioresponse of thymomas verified with histologic response

    Energy Technology Data Exchange (ETDEWEB)

    Ohara, Kiyoshi; Tatsuzaki, Hideo; Okumura, Toshiyuki; Itai, Yuji [Dept. of Radiology, Tsukuba Univ., Tsukuba City (Japan)]|[Inst. of Clinical Medicine, Tsukuba Univ., Tsukuba City (Japan); Fuji, Hiroshi [Dept. of Radiology, Tsukuba Univ., Tsukuba City (Japan); Sugahara, Shinji [Dept. of Radiology, Hitachi General Hospital, Hitachi City (Japan); Akaogi, Eiichi; Onizuka, Masataka; Ishikawa, Shigemi; Mitsui, Kiyofumi [Dept. of Surgery, Tsukuba Univ., Tsukuba City (Japan)]|[Inst. of Clinical Medicine, Tsukuba Univ., Tsukuba City (Japan)

    1998-12-31

    Patterns of radiologic response of 10 thymomas treated by preoperative radiotherapy (RT) (18-20 Gy/2 weeks) were determined in conjunction with histologic response. Changes in tumor volume were evaluated with CT scans obtained 5 to 36 days before and 14 to 24 days after the initiation of RT and before surgery. The extent of tumor volume reduction (TR) varied widely (40-78%), while the mean daily volume decrement expressed as a percentage of the pre-RT tumor volume correlated significantly with the pre-RT tumor volume. Histologically, the tumors, all of which were resected 17 to 33 days after RT initiation, generally consisted of predominant fibrous tissues, rare necrotic foci, and few epithelial cells. The TR did not correlate with pre-RT tumor volume, observation period, histologic subtype, or quantity of remaining epithelial cells. The TR of thymomas does not predict RT impact on tumor cells but does reflect the quantity of inherent tumor stroma. (orig.)

  4. Radioresponse of thymomas verified with histologic response

    International Nuclear Information System (INIS)

    Ohara, Kiyoshi; Tatsuzaki, Hideo; Okumura, Toshiyuki; Itai, Yuji; Fuji, Hiroshi; Sugahara, Shinji; Akaogi, Eiichi; Onizuka, Masataka; Ishikawa, Shigemi; Mitsui, Kiyofumi

    1998-01-01

    Patterns of radiologic response of 10 thymomas treated by preoperative radiotherapy (RT) (18-20 Gy/2 weeks) were determined in conjunction with histologic response. Changes in tumor volume were evaluated with CT scans obtained 5 to 36 days before and 14 to 24 days after the initiation of RT and before surgery. The extent of tumor volume reduction (TR) varied widely (40-78%), while the mean daily volume decrement expressed as a percentage of the pre-RT tumor volume correlated significantly with the pre-RT tumor volume. Histologically, the tumors, all of which were resected 17 to 33 days after RT initiation, generally consisted of predominant fibrous tissues, rare necrotic foci, and few epithelial cells. The TR did not correlate with pre-RT tumor volume, observation period, histologic subtype, or quantity of remaining epithelial cells. The TR of thymomas does not predict RT impact on tumor cells but does reflect the quantity of inherent tumor stroma. (orig.)

  5. An alternative test for verifying electronic balance linearity

    International Nuclear Information System (INIS)

    Thomas, I.R.

    1998-02-01

    This paper presents an alternative method for verifying electronic balance linearity and accuracy. This method is being developed for safeguards weighings (weighings for the control and accountability of nuclear material) at the Idaho National Engineering and Environmental Laboratory (INEEL). With regard to balance linearity and accuracy, DOE Order 5633.3B, Control and Accountability of Nuclear Materials, Paragraph 2, 4, e, (1), (a) Scales and Balances Program, states: ''All scales and balances used for accountability purposes shall be maintained in good working condition, recalibrated according to an established schedule, and checked for accuracy and linearity on each day that the scale or balance is used for accountability purposes.'' Various tests have been proposed for testing accuracy and linearity. In the 1991 Measurement Science Conference, Dr. Walter E. Kupper presented a paper entitled: ''Validation of High Accuracy Weighing Equipment.'' Dr. Kupper emphasized that tolerance checks for calibrated, state-of-the-art electronic equipment need not be complicated, and he presented four easy steps for verifying that a calibrated balance is operating correctly. These tests evaluate the standard deviation of successive weighings (of the same load), the off-center error, the calibration error, and the error due to nonlinearity. This method of balance validation is undoubtedly an authoritative means of ensuring balance operability, yet it could have two drawbacks: one, the test for linearity is not intuitively obvious, especially from a statistical viewpoint; and two, there is an absence of definitively defined testing limits. Hence, this paper describes an alternative means of verifying electronic balance linearity and accuracy that is being developed for safeguards measurements at the INEEL
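
    Kupper's four checks mentioned above are simple to compute; the sketch below shows one plausible formulation, with hypothetical readings, and makes no claim to match either Kupper's procedure or the INEEL alternative exactly.

```python
# One plausible formulation of the four checks, with hypothetical readings and check weights;
# acceptance limits would come from the balance specification or the facility's procedure.
import numpy as np

repeat_readings = np.array([100.0002, 100.0001, 100.0003, 100.0002, 100.0001])  # g, same load re-weighed
corner_readings = np.array([100.0002, 100.0004, 100.0001, 100.0003, 100.0002])  # g, center + four pan positions
cal_reading, cal_standard = 100.0002, 100.0000             # g, reading of a certified check weight vs. its value
half_load_reading, full_load_reading = 50.0000, 100.0003   # g, readings used for a simple linearity check

repeatability = repeat_readings.std(ddof=1)                       # std. dev. of successive weighings
off_center_error = corner_readings.max() - corner_readings.min()  # spread across pan positions
calibration_error = cal_reading - cal_standard                    # bias against the certified standard
nonlinearity = full_load_reading - 2.0 * half_load_reading        # midpoint departure from proportionality

print(f"repeatability     = {repeatability:.5f} g")
print(f"off-center error  = {off_center_error:.5f} g")
print(f"calibration error = {calibration_error:+.5f} g")
print(f"nonlinearity      = {nonlinearity:+.5f} g")
```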

  6. Identifying hotspots of coastal risk and evaluating DRR measures: results from the RISC-KIT project.

    Science.gov (United States)

    Van Dongeren, A.; Ciavola, P.; Viavattene, C.; Dekleermaeker, S.; Martinez, G.; Ferreira, O.; Costa, C.

    2016-02-01

    High-impact storm events have demonstrated the vulnerability of coastal zones in Europe and beyond. These impacts are likely to increase due to predicted climate change and ongoing coastal development. In order to reduce impacts, disaster risk reduction (DRR) measures need to be taken, which prevent or mitigate the effects of storm events. To drive the DRR agenda, the UNISDR formulated the Sendai Framework for Action, and the EU has issued the Floods Directive. However, neither is specific about the methods to be used to develop actionable DRR measures in the coastal zone. Therefore, there is a need to develop methods, tools and approaches which make it possible to: identify and prioritize the coastal zones which are most at risk through a Coastal Risk Assessment Framework, evaluate the effectiveness of DRR options for these coastal areas, using an Early Warning/Decision Support System, which can be used both in the planning and event-phase. This paper gives an overview of the products and results obtained in the FP7-funded project RISC-KIT, which aims to develop and apply a set of tools with which highly-vulnerable coastal areas (so-called "hotspots") can be identified. The identification is done using the Coastal Risk Assessment Framework, or CRAF, which computes the intensity from multi-hazards, the exposure and the vulnerability, all components of risk, including network and cascading effects. Based on this analysis hot spots of risk which warrant coastal protection investments are selected. For these hotspot areas, high-resolution Early Warning and Decision Support Tools are developed with which it is possible to compute in detail the effectiveness of Disaster Risk Reduction measures in storm event scenarios, which helps decide which measures to implement in the planning phase. The same systems, but now driven with real time data, can also be used for early warning systems. All tools are tested on eleven case study areas, at least one on each EU Regional Sea

  7. USCIS E-Verify Customer Satisfaction Survey, January 2013

    Data.gov (United States)

    Department of Homeland Security — This report focuses on the customer satisfaction of companies currently enrolled in the E-Verify program. Satisfaction with E-Verify remains high and follows up a...

  8. New concepts in nuclear arms control: verified cutoff and verified disposal

    International Nuclear Information System (INIS)

    Donnelly, W.H.

    1990-01-01

    Limiting the numbers of nuclear warheads by reducing military production and stockpiles of fissionable materials has been a constant item on the nuclear arms control agenda for the last 45 years. It has become more salient recently, however, because of two events: the enforced closure for safety reasons of the current United States military plutonium production facilities; and the possibility that the US and USSR may soon conclude an agreement providing for the verified destruction of significant numbers of nuclear warheads and the recovery of the fissionable material they contain with the option of transferring these materials to peaceful uses. A study has been made of the practical problems of verifying the cut off of fissionable material production for military purposes in the nuclear weapon states, as well as providing assurance that material recovered from warheads is not re-used for proscribed military purposes and facilitating its transfer to civil uses. Implementation of such measures would have important implications for non-proliferation. The resultant paper was presented to a meeting of the PPNN Core Group held in Baden, close to Vienna, over the weekend of 18/19th November 1989 and is reprinted in this booklet. (author)

  9. Verified Interval Orbit Propagation in Satellite Collision Avoidance

    NARCIS (Netherlands)

    Römgens, B.A.; Mooij, E.; Naeije, M.C.

    2011-01-01

    Verified interval integration methods enclose a solution set corresponding to interval initial values and parameters, and bound integration and rounding errors. Verified methods suffer from overestimation of the solution, i.e., non-solutions are also included in the solution enclosure. Two verified

  10. 20 CFR 401.45 - Verifying your identity.

    Science.gov (United States)

    2010-04-01

    ... 20 Employees' Benefits 2 2010-04-01 2010-04-01 false Verifying your identity. 401.45 Section 401... INFORMATION The Privacy Act § 401.45 Verifying your identity. (a) When required. Unless you are making a... representative, you must verify your identity in accordance with paragraph (b) of this section if: (1) You make a...

  11. 28 CFR 802.13 - Verifying your identity.

    Science.gov (United States)

    2010-07-01

    ... 28 Judicial Administration 2 2010-07-01 2010-07-01 false Verifying your identity. 802.13 Section... COLUMBIA DISCLOSURE OF RECORDS Privacy Act § 802.13 Verifying your identity. (a) Requests for your own records. When you make a request for access to records about yourself, you must verify your identity. You...

  12. Consequences of discrepancies on verified material balances

    International Nuclear Information System (INIS)

    Jaech, J.L.; Hough, C.G.

    1983-01-01

There exists a gap between the way item discrepancies that are found in an IAEA inspection are treated in practice and how they are treated in the IAEA Safeguards Technical Manual, Part F, Statistics. In the latter case, the existence of even a single item discrepancy is cause for rejection of the facility data. Probabilities of detection for given inspection plans are calculated based on this premise. In fact, although the existence of discrepancies may be so noted in inspection reports, they in no sense of the word lead to rejection of the facility data, i.e., to ''detection''. Clearly, however, discrepancies have an effect on the integrity of the material balance, and in fact, this effect may well be of dominant importance when compared to that of small measurement biases. This paper provides a quantitative evaluation of the effect of item discrepancies on the facility MUF. The Ĝ statistic is introduced. It is analogous to the familiar D̂ statistic used to quantify the effects of small biases. Thus, just as (MUF-D̂) is the facility MUF adjusted for the inspector's variables measurements, so is (MUF-D̂-Ĝ) the MUF adjusted for both the variables and attributes measurements, where it is the attributes inspection that detects item discrepancies. The distribution of (MUF-D̂-Ĝ) is approximated by a Pearson's distribution after finding the first four moments. Both the number of discrepancies and their size and sign distribution are treated as random variables. Assuming, then, that ''detection'' occurs when (MUF-D̂-Ĝ) differs significantly from zero, procedures for calculating effectiveness are derived. Some generic results on effectiveness are included. These results apply either to the case where (MUF-D̂-Ĝ) is treated as the single statistic, or to the two-step procedure in which the facility's data are first examined using (D̂+Ĝ) as

  13. Systematic Evaluation of Pleiotropy Identifies 6 Further Loci Associated With Coronary Artery Disease

    NARCIS (Netherlands)

    Webb, Thomas R.; Erdmann, Jeanette; Stirrups, Kathleen E.; Stitziel, Nathan O.; Masca, Nicholas G. D.; Jansen, Henning; Kanoni, Stavroula; Nelson, Christopher P.; Ferrario, Paola G.; König, Inke R.; Eicher, John D.; Johnson, Andrew D.; Hamby, Stephen E.; Betsholtz, Christer; Ruusalepp, Arno; Franzén, Oscar; Schadt, Eric E.; Björkegren, Johan L. M.; Weeke, Peter E.; Auer, Paul L.; Schick, Ursula M.; Lu, Yingchang; Zhang, He; Dube, Marie-Pierre; Goel, Anuj; Farrall, Martin; Peloso, Gina M.; Won, Hong-Hee; Do, Ron; van Iperen, Erik; Kruppa, Jochen; Mahajan, Anubha; Scott, Robert A.; Willenborg, Christina; Braund, Peter S.; van Capelleveen, Julian C.; Doney, Alex S. F.; Donnelly, Louise A.; Asselta, Rosanna; Merlini, Pier A.; Duga, Stefano; Marziliano, Nicola; Denny, Josh C.; Shaffer, Christian; El-Mokhtari, Nour Eddine; Franke, Andre; Heilmann, Stefanie; Hengstenberg, Christian; Hoffmann, Per; Holmen, Oddgeir L.; Hveem, Kristian; Jansson, Jan-Håkan; Jöckel, Karl-Heinz; Kessler, Thorsten; Kriebel, Jennifer; Laugwitz, Karl L.; Marouli, Eirini; Martinelli, Nicola; McCarthy, Mark I.; van Zuydam, Natalie R.; Meisinger, Christa; Esko, Tõnu; Mihailov, Evelin; Escher, Stefan A.; Alver, Maris; Moebus, Susanne; Morris, Andrew D.; Virtamo, Jarma; Nikpay, Majid; Olivieri, Oliviero; Provost, Sylvie; AlQarawi, Alaa; Robertson, Neil R.; Akinsansya, Karen O.; Reilly, Dermot F.; Vogt, Thomas F.; Yin, Wu; Asselbergs, Folkert W.; Kooperberg, Charles; Jackson, Rebecca D.; Stahl, Eli; Müller-Nurasyid, Martina; Strauch, Konstantin; Varga, Tibor V.; Waldenberger, Melanie; Zeng, Lingyao; Chowdhury, Rajiv; Salomaa, Veikko; Ford, Ian; Jukema, J. Wouter; Amouyel, Philippe; Kontto, Jukka; Nordestgaard, Børge G.; Ferrières, Jean; Saleheen, Danish; Sattar, Naveed; Surendran, Praveen; Wagner, Aline; Young, Robin; Howson, Joanna M. M.; Butterworth, Adam S.; Danesh, John; Ardissino, Diego; Bottinger, Erwin P.; Erbel, Raimund; Franks, Paul W.; Girelli, Domenico; Hall, Alistair S.; Hovingh, G. Kees; Kastrati, Adnan; Lieb, Wolfgang; Meitinger, Thomas; Kraus, William E.; Shah, Svati H.; McPherson, Ruth; Orho-Melander, Marju; Melander, Olle; Metspalu, Andres; Palmer, Colin N. A.; Peters, Annette; Rader, Daniel J.; Reilly, Muredach P.; Loos, Ruth J. F.; Reiner, Alex P.; Roden, Dan M.; Tardif, Jean-Claude; Thompson, John R.; Wareham, Nicholas J.; Watkins, Hugh; Willer, Cristen J.; Samani, Nilesh J.; Schunkert, Heribert; Deloukas, Panos; Kathiresan, Sekar

    2017-01-01

    Genome-wide association studies have so far identified 56 loci associated with risk of coronary artery disease (CAD). Many CAD loci show pleiotropy; that is, they are also associated with other diseases or traits. This study sought to systematically test if genetic variants identified for non-CAD

  14. Developing an Approach for Analyzing and Verifying System Communication

    Science.gov (United States)

    Stratton, William C.; Lindvall, Mikael; Ackermann, Chris; Sibol, Deane E.; Godfrey, Sally

    2009-01-01

This slide presentation reviews a project for developing an approach for analyzing and verifying inter-system communications. The motivation for the study was that software systems in the aerospace domain are inherently complex and operate under tight resource constraints, so that systems of systems must communicate with each other to fulfill their tasks. Such systems of systems require reliable communications. The technical approach was to develop a system, DynSAVE, that detects communication problems among the systems. The project enhanced the proven Software Architecture Visualization and Evaluation (SAVE) tool to create Dynamic SAVE (DynSAVE). The approach monitors and records low-level network traffic, converts the low-level traffic into meaningful messages, and displays the messages in a way that allows issues to be detected.

  15. A New Tool for Identifying Research Standards and Evaluating Research Performance

    Science.gov (United States)

    Bacon, Donald R.; Paul, Pallab; Stewart, Kim A.; Mukhopadhyay, Kausiki

    2012-01-01

    Much has been written about the evaluation of faculty research productivity in promotion and tenure decisions, including many articles that seek to determine the rank of various marketing journals. Yet how faculty evaluators combine journal quality, quantity, and author contribution to form judgments of a scholar's performance is unclear. A…

  16. Identifying the Evaluative Impulse in Local Culture: Insights from West African Proverbs

    Science.gov (United States)

    Easton, Peter B.

    2012-01-01

    Attention to cultural competence has significantly increased in the human services over the last two decades. Evaluators have long had similar concerns and have made a more concentrated effort in recent years to adapt evaluation methodology to varying cultural contexts. Little of this literature, however, has focused on the extent to which local…

  17. Statistical identifiability and convergence evaluation for nonlinear pharmacokinetic models with particle swarm optimization.

    Science.gov (United States)

    Kim, Seongho; Li, Lang

    2014-02-01

    The statistical identifiability of nonlinear pharmacokinetic (PK) models with the Michaelis-Menten (MM) kinetic equation is considered using a global optimization approach, which is particle swarm optimization (PSO). If a model is statistically non-identifiable, the conventional derivative-based estimation approach is often terminated earlier without converging, due to the singularity. To circumvent this difficulty, we develop a derivative-free global optimization algorithm by combining PSO with a derivative-free local optimization algorithm to improve the rate of convergence of PSO. We further propose an efficient approach to not only checking the convergence of estimation but also detecting the identifiability of nonlinear PK models. PK simulation studies demonstrate that the convergence and identifiability of the PK model can be detected efficiently through the proposed approach. The proposed approach is then applied to clinical PK data along with a two-compartmental model. Copyright © 2013 Elsevier Ireland Ltd. All rights reserved.
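
    A minimal sketch of the approach described in the abstract: a plain PSO global search followed by a derivative-free local polish (Nelder-Mead here) to fit a one-compartment model with Michaelis-Menten elimination. The model form, data, bounds, and PSO settings are illustrative assumptions rather than the paper's actual setup.

```python
# Minimal sketch: particle swarm optimization (PSO) followed by a derivative-free local polish
# (Nelder-Mead) to fit a one-compartment model with Michaelis-Menten elimination.
# Model form, data, bounds and PSO settings are illustrative assumptions, not the paper's setup.
import numpy as np
from scipy.integrate import solve_ivp
from scipy.optimize import minimize

rng = np.random.default_rng(0)
t_obs = np.array([0.5, 1, 2, 4, 6, 8, 12])              # h
c_obs = np.array([9.1, 8.3, 6.9, 4.6, 2.9, 1.7, 0.6])   # mg/L, hypothetical concentrations

def simulate(params, t):
    vmax, km, c0 = params
    sol = solve_ivp(lambda _t, c: -vmax * c / (km + c), (0, t[-1]), [c0],
                    t_eval=t, rtol=1e-8, atol=1e-10)
    return sol.y[0]

def sse(params):
    return np.sum((simulate(params, t_obs) - c_obs) ** 2)

# Plain PSO over box bounds (global search).
lb, ub = np.array([0.1, 0.1, 1.0]), np.array([5.0, 20.0, 20.0])   # Vmax, Km, C0
n_particles, n_iter = 30, 100
x = rng.uniform(lb, ub, size=(n_particles, 3))
v = np.zeros_like(x)
pbest, pbest_f = x.copy(), np.array([sse(p) for p in x])
gbest = pbest[np.argmin(pbest_f)].copy()

for _ in range(n_iter):
    r1, r2 = rng.random((2, n_particles, 3))
    v = 0.7 * v + 1.5 * r1 * (pbest - x) + 1.5 * r2 * (gbest - x)
    x = np.clip(x + v, lb, ub)
    f = np.array([sse(p) for p in x])
    improved = f < pbest_f
    pbest[improved], pbest_f[improved] = x[improved], f[improved]
    gbest = pbest[np.argmin(pbest_f)].copy()

# Derivative-free local polish, playing the role of the local optimizer combined with PSO.
result = minimize(sse, gbest, method="Nelder-Mead")
print("Vmax, Km, C0 estimates:", result.x, "SSE:", result.fun)
```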

  18. Ecotoxicological evaluation of leachate from the Limeira sanitary landfill with a view to identifying acute toxicity

    OpenAIRE

    José Euclides Stipp Paterniani; Ronaldo Teixeira Pelegrini; Núbia Natália de Brito Pelegrini

    2007-01-01

    Final disposal of solid waste is still a cause for serious impacts on the environment. In sanitary landfills, waste undergoes physical, chemical, and biological decomposition, generating biogas and leachate. Leachate is a highly toxic liquid with a very high pollution potential. The purpose of this work is to evaluate toxicity of in natura leachate samples collected from Limeira Sanitary Landfill, in Limeira, SP. The ecotoxicological evaluation comprised acute toxicity assays using as test or...

  19. Application of automated reasoning software: procedure generation system verifier

    International Nuclear Information System (INIS)

    Smith, D.E.; Seeman, S.E.

    1984-09-01

    An on-line, automated reasoning software system for verifying the actions of other software or human control systems has been developed. It was demonstrated by verifying the actions of an automated procedure generation system. The verifier uses an interactive theorem prover as its inference engine with the rules included as logic axioms. Operation of the verifier is generally transparent except when the verifier disagrees with the actions of the monitored software. Testing with an automated procedure generation system demonstrates the successful application of automated reasoning software for verification of logical actions in a diverse, redundant manner. A higher degree of confidence may be placed in the verified actions gathered by the combined system

  20. A Finite Equivalence of Verifiable Multi-secret Sharing

    Directory of Open Access Journals (Sweden)

    Hui Zhao

    2012-02-01

    Full Text Available We give an abstraction of verifiable multi-secret sharing schemes that is accessible to a fully mechanized analysis. This abstraction is formalized within the applied pi-calculus by using an equational theory which characterizes the cryptographic semantics of secret share. We also present an encoding from the equational theory into a convergent rewriting system, which is suitable for the automated protocol verifier ProVerif. Based on that, we verify the threshold certificate protocol in ProVerif.

  1. The validation of synthetic spectra used in the performance evaluation of radionuclide identifiers

    International Nuclear Information System (INIS)

    Flynn, A.; Boardman, D.; Reinhard, M.I.

    2013-01-01

    This work has evaluated synthetic gamma-ray spectra created by the RASE sampler using experimental data. The RASE sampler resamples experimental data to create large data libraries which are subsequently available for use in evaluation of radionuclide identification algorithms. A statistical evaluation of the synthetic energy bins has shown the variation to follow a Poisson distribution identical to experimental data. The minimum amount of statistics required in each base spectrum to ensure the subsequent use of the base spectrum in the generation of statistically robust synthetic data was determined. A requirement that the simulated acquisition time of the synthetic spectra was not more than 4% of the acquisition time of the base spectrum was also determined. Further validation of RASE was undertaken using two different radionuclide identification algorithms. - Highlights: • A validation of synthetic data created in order to evaluate radionuclide identification systems has been carried out. • Statistical analysis has shown that the data accurately represents experimental data. • A limit to the amount of data which could be created using this method was evaluated. • Analysis of the synthetic gamma spectra show identical results to analysis carried out with experimental data
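
    The resampling idea being validated can be sketched as follows. This is an illustrative Poisson resampler only, not the actual RASE implementation, and the base spectrum here is simulated rather than measured.

```python
# Illustrative Poisson resampler for generating short synthetic spectra from a high-statistics
# base spectrum (not the actual RASE implementation; the base spectrum here is simulated).
import numpy as np

rng = np.random.default_rng(42)
base_counts = rng.poisson(1000, size=1024).astype(float)  # stand-in for a measured base spectrum
base_live_time = 3600.0                                    # s, acquisition time of the base spectrum

def synthetic_spectrum(base_counts, base_live_time, target_live_time):
    """Scale the base spectrum to the target acquisition time and draw Poisson counts per bin."""
    fraction = target_live_time / base_live_time
    # Keeping this fraction to a few percent is consistent with the 4% limit reported above.
    expected = base_counts * fraction
    return rng.poisson(expected)

spec = synthetic_spectrum(base_counts, base_live_time, target_live_time=60.0)  # 60 s synthetic spectrum
print(spec[:10], spec.sum())
```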

  2. Identifying usability issues for personalization during formative evaluations: a comparisons of three methods

    NARCIS (Netherlands)

    van Velsen, Lex Stefan; van der Geest, Thea; Klaassen, R.F.

    2011-01-01

    A personalized system is one that generates unique output for each individual. As a result, personalization has transformed the interaction between the user and the system, and specific new usability issues have arisen. Methods used for evaluating personalized systems should be able to reveal the

  3. Evaluation of neural networks to identify types of activity using accelerometers

    NARCIS (Netherlands)

    Vries, S.I. de; Garre, F.G.; Engbers, L.H.; Hildebrandt, V.H.; Buuren, S. van

    2011-01-01

    Purpose: To develop and evaluate two artificial neural network (ANN) models based on single-sensor accelerometer data and an ANN model based on the data of two accelerometers for the identification of types of physical activity in adults. Methods: Forty-nine subjects (21 men and 28 women; age range

  4. Permanent Childhood Hearing Impairment: Aetiological Evaluation of Infants identified through the Irish Newborn Hearing Screening Programme

    LENUS (Irish Health Repository)

    Smith, A

    2017-11-01

The Newborn Hearing Screening Programme (NHSP) was established in Cork University Maternity Hospital (CUMH) in April 2011. Between April 2011 and July 2014, 42 infants were identified with a Permanent Childhood Hearing Impairment (PCHI). Following this diagnosis, infants underwent a paediatric assessment according to recognised guidelines with the intention of identifying the underlying aetiology of the PCHI. The aim of this study was to assess the findings of this aetiological workup via retrospective chart review. PCHI data were obtained from the eSP database, a web-based information system used to track each baby through the screening and referral process. A retrospective chart review of these patients was performed. Sixteen (38%) infants were diagnosed with a bilateral sensorineural hearing loss. Two infants had congenital CMV infection. A Connexin 26 gene mutation was detected in one infant. Two infants were diagnosed with Waardenburg syndrome, one with Pendred syndrome, and one with Pfeiffer syndrome. Five babies underwent cochlear implantation. Through adherence to the recommended protocol, a possible cause of PCHI may be determined. This study has identified areas of future improvement for this service in Ireland.

  5. Veterans’ Pensions: Verifying Income with Tax Data Can Identify Significant Payment Problems.

    Science.gov (United States)

    1988-03-01

  6. Developing and Evaluating the HRM Technique for Identifying Cytochrome P450 2D6 Polymorphisms.

    Science.gov (United States)

    Lu, Hsiu-Chin; Chang, Ya-Sian; Chang, Chun-Chi; Lin, Ching-Hsiung; Chang, Jan-Gowth

    2015-05-01

Cytochrome P450 2D6 is one of the important enzymes involved in the metabolism of many widely used drugs. Genetic polymorphisms of CYP2D6 can affect its activity. Therefore, an efficient method for identifying CYP2D6 polymorphisms is clinically important. We developed a high-resolution melting (HRM) analysis to investigate CYP2D6 polymorphisms. Genomic DNA was extracted from peripheral blood samples from 71 healthy individuals. All nine exons of the CYP2D6 gene were sequenced before screening by HRM analysis. This method can detect most of the CYP2D6 genotypes (*1, *2, *4, *10, *14, *21, *39, and *41) found in Chinese individuals. All samples were successfully genotyped. The four most common mutant CYP2D6 alleles (*1, *2, *10, and *41) can be genotyped. The single nucleotide polymorphism (SNP) frequencies of 100C > T (rs1065852), 1039C > T (rs1081003), 1661G > C (rs1058164), 2663G > A (rs28371722), 2850C > T (rs16947), 2988G > A (rs28371725), 3181A > G, and 4180G > C (rs1135840) were 58%, 61%, 73%, 1%, 13%, 3%, 1%, and 73%, respectively. We identified 100% of all heterozygotes without any errors. The two homozygous genotypes (1661G > C and 4180G > C) can be distinguished by mixing with a known genotype sample to generate an artificial heterozygote for HRM analysis. Therefore, all samples could be identified using our HRM method, and the results of HRM analysis are identical to those obtained by sequencing. Our method achieved 100% sensitivity, specificity, positive predictive value, and negative predictive value. HRM analysis is a non-gel resolution method that is faster and less expensive than direct sequencing. Our study shows that it is an efficient tool for typing CYP2D6 polymorphisms. © 2014 Wiley Periodicals, Inc.

  7. Assessing School Wellness Policies and Identifying Priorities for Action: Results of a Bi-State Evaluation.

    Science.gov (United States)

    Harvey, Susan P; Markenson, Deborah; Gibson, Cheryl A

    2018-05-01

    Obesity is a complex health problem affecting more than one-third of school-aged youth. The increasing obesity rates in Kansas and Missouri has been particularly concerning, with efforts being made to improve student health through the implementation of school wellness policies (SWPs). The primary purpose of this study was to conduct a rigorous assessment of SWPs in the bi-state region. SWPs were collected from 46 school districts. The Wellness School Assessment Tool (WellSAT) was used to assess comprehensiveness and strength. Additionally, focus group discussions and an online survey were conducted with school personnel to identify barriers and supports needed. Assessment of the SWPs indicated that most school districts failed to provide strong and specific language. Due to these deficiencies, districts reported lack of enforcement of policies. Several barriers to implementing the policies were reported by school personnel; supports needed for effective implementation were identified. To promote a healthful school environment, significant improvements are warranted in the strength and comprehensiveness of the SWPs. The focus group discussions provided insight as to where we need to bridge the gap between the current state of policies and the desired beneficial practices to support a healthy school environment. © 2018, American School Health Association.

  8. Experience with in vivo diode dosimetry for verifying radiotherapy dose delivery: Practical implementation of cost-effective approaches

    International Nuclear Information System (INIS)

    Thwaites, D.I.; Blyth, C.; Carruthers, L.; Elliott, P.A.; Kidane, G.; Millwater, C.J.; MacLeod, A.S.; Paolucci, M.; Stacey, C.

    2002-01-01

    A systematic programme of in vivo dosimetry using diodes to verify radiotherapy delivered doses began in Edinburgh in 1992. The aims were to investigate the feasibility of routine systematic use of diodes as part of a comprehensive QA programme, to carry out clinical pilot studies to assess the accuracy of dose delivery on each machine and for each site and technique, to identify and rectify systematic deviations, to assess departmental dosimetric precision and to compare to clinical requirements. A further aim was to carry out a cost-benefit evaluation based on the results from the pilot studies to consider how best to use diodes routinely

  9. Evaluation of Computational Docking to Identify Pregnane X Receptor Agonists in the ToxCast Database

    OpenAIRE

    Kortagere, Sandhya; Krasowski, Matthew D.; Reschly, Erica J.; Venkatesh, Madhukumar; Mani, Sridhar; Ekins, Sean

    2010-01-01

    Background The pregnane X receptor (PXR) is a key transcriptional regulator of many genes [e.g., cytochrome P450s (CYP2C9, CYP3A4, CYP2B6), MDR1] involved in xenobiotic metabolism and excretion. Objectives As part of an evaluation of different approaches to predict compound affinity for nuclear hormone receptors, we used the molecular docking program GOLD and a hybrid scoring scheme based on similarity weighted GoldScores to predict potential PXR agonists in the ToxCast database of pesticides...

  10. Systematic evaluation of drug-disease relationships to identify leads for novel drug uses.

    Science.gov (United States)

    Chiang, A P; Butte, A J

    2009-11-01

    Drug repositioning refers to the discovery of alternative uses for drugs--uses that are different from those for which the drugs were originally intended. One challenge in this effort lies in choosing the indication for which a drug of interest could be prospectively tested. We systematically evaluated a drug treatment-based view of diseases in order to address this challenge. Suggestions for novel drug uses were generated using a "guilt by association" approach. When compared with a control group of drug uses, the suggested novel drug uses generated by this approach were significantly enriched with respect to previous and ongoing clinical trials.

  11. Confronting Oahu's Water Woes: Identifying Scenarios for a Robust Evaluation of Policy Alternatives

    Science.gov (United States)

    van Rees, C. B.; Garcia, M. E.; Alarcon, T.; Sixt, G.

    2013-12-01

    The Pearl Harbor aquifer is the most important freshwater resource on Oahu (Hawaii, U.S.A), providing water to nearly half a million people. Recent studies show that current water use is reaching or exceeding sustainable yield. Climate change and increasing resident and tourist populations are predicted to further stress the aquifer. The island has lost huge tracts of freshwater and estuarine wetlands since human settlement; the dependence of many endemic, endangered species on these wetlands, as well as ecosystem benefits from wetlands, link humans and wildlife through water management. After the collapse of the sugar industry on Oahu (mid-1990s), the Waiahole ditch--a massive stream diversion bringing water from the island's windward to the leeward side--became a hotly disputed resource. Commercial interests and traditional farmers have clashed over the water, which could also serve to support the Pearl Harbor aquifer. Considering competing interests, impending scarcity, and uncertain future conditions, how can groundwater be managed most effectively? Complex water networks like this are characterized by conflicts between stakeholders, coupled human-natural systems, and future uncertainty. The Water Diplomacy Framework offers a model for analyzing such complex issues by integrating multiple disciplinary perspectives, identifying intervention points, and proposing sustainable solutions. The Water Diplomacy Framework is a theory and practice of implementing adaptive water management for complex problems by shifting the discussion from 'allocation of water' to 'benefit from water resources'. This is accomplished through an interactive process that includes stakeholder input, joint fact finding, collaborative scenario development, and a negotiated approach to value creation. Presented here are the results of the initial steps in a long term project to resolve water limitations on Oahu. We developed a conceptual model of the Pearl Harbor Aquifer system and identified

  12. Mechanisms of change in psychotherapy for depression : An empirical update and evaluation of research aimed at identifying psychological mediators

    NARCIS (Netherlands)

    Lemmens, L.H.J.M.; Müller, V.N.L.S.; Arntz, A.; Huibers, M.J.H.

    2016-01-01

    We present a systematic empirical update and critical evaluation of the current status of research aimed at identifying a variety of psychological mediators in various forms of psychotherapy for depression. We summarize study characteristics and results of 35 relevant studies, and discuss the extent

  13. The Single Item Literacy Screener: Evaluation of a brief instrument to identify limited reading ability

    Directory of Open Access Journals (Sweden)

    Chew Lisa D

    2006-03-01

    Background Reading skills are important for accessing health information, using health care services, managing one's health and achieving desirable health outcomes. Our objective was to assess the diagnostic accuracy of the Single Item Literacy Screener (SILS) to identify limited reading ability, one component of health literacy, as measured by the S-TOFHLA. Methods Cross-sectional interview with 999 adults with diabetes residing in Vermont and bordering states. Participants were randomly recruited from primary care practices in the Vermont Diabetes Information System June 2003 – December 2004. The main outcome was limited reading ability. The primary predictor was the SILS. Results Of the 999 persons screened, 169 (17%) had limited reading ability. The sensitivity of the SILS in detecting limited reading ability was 54% [95% CI: 47%, 61%] and the specificity was 83% [95% CI: 81%, 86%], with an area under the receiver operating characteristic (ROC) curve of 0.73 [95% CI: 0.69, 0.78]. Seven hundred seventy (77%) screened negative on the SILS and 692 of these subjects had adequate reading skills (negative predictive value = 0.90 [95% CI: 0.88, 0.92]). Of the 229 who scored positive on the SILS, 92 had limited reading ability (positive predictive value = 0.40 [95% CI: 0.34, 0.47]). Conclusion The SILS is a simple instrument designed to identify patients with limited reading ability who need help reading health-related materials. The SILS performs moderately well at ruling out limited reading ability in adults and allows providers to target additional assessment of health literacy skills to those most in need. Further study of the use of the SILS in clinical settings and with more diverse populations is warranted.

  14. Evaluation of algorithms to identify incident cancer cases by using French health administrative databases.

    Science.gov (United States)

    Ajrouche, Aya; Estellat, Candice; De Rycke, Yann; Tubach, Florence

    2017-08-01

    Administrative databases are increasingly being used in cancer observational studies. Identifying incident cancer in these databases is crucial. This study aimed to develop algorithms to estimate cancer incidence by using health administrative databases and to examine the accuracy of the algorithms in terms of national cancer incidence rates estimated from registries. We identified a cohort of 463 033 participants on 1 January 2012 in the Echantillon Généraliste des Bénéficiaires (EGB; a representative sample of the French healthcare insurance system). The EGB contains data on long-term chronic disease (LTD) status, reimbursed outpatient treatments and procedures, and hospitalizations (including discharge diagnoses, and costly medical procedures and drugs). After excluding cases of prevalent cancer, we applied 15 algorithms to estimate the cancer incidence rates separately for men and women in 2012 and compared them to the national cancer incidence rates estimated from French registries by indirect age and sex standardization. The most accurate algorithm for men combined information from LTD status, outpatient anticancer drugs, radiotherapy sessions and primary or related discharge diagnosis of cancer, although it underestimated the cancer incidence (standardized incidence ratio (SIR) 0.85 [0.80-0.90]). For women, the best algorithm used the same definition as the algorithm for men but restricted hospital discharge to only primary or related diagnosis with an additional inpatient procedure or drug reimbursement related to cancer, and gave comparable estimates to those from registries (SIR 1.00 [0.94-1.06]). The algorithms proposed could be used for cancer incidence monitoring and for future etiological cancer studies involving French healthcare databases. Copyright © 2017 John Wiley & Sons, Ltd.
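
    The standardized incidence ratios quoted above follow the usual observed/expected construction. As a rough, hypothetical illustration (not the authors' code, and with invented counts), a SIR with an approximate log-normal confidence interval can be computed as sketched below.

        import math

        def sir_with_ci(observed, expected, z=1.96):
            # Standardized incidence ratio (observed/expected cases) with an
            # approximate log-normal confidence interval (Poisson assumption).
            sir = observed / expected
            if observed > 0:
                se_log = math.sqrt(1.0 / observed)    # SE of log(SIR)
                return sir, sir * math.exp(-z * se_log), sir * math.exp(z * se_log)
            return sir, 0.0, float("inf")

        # Hypothetical counts: 850 observed incident cancers vs. 1000 expected
        # from registry rates (numbers are invented for illustration).
        print(sir_with_ci(850, 1000.0))   # roughly (0.85, 0.79, 0.91)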

  15. A credit card verifier structure using diffraction and spectroscopy concepts

    Science.gov (United States)

    Sumriddetchkajorn, Sarun; Intaravanne, Yuttana

    2008-04-01

    We propose and experimentally demonstrate an angle-multiplexing-based optical structure for verifying a credit card. Our key idea comes from the fact that the fine detail of the embossed hologram stamped on the credit card is hard to duplicate, and therefore its key color features can be used for distinguishing between real and counterfeit cards. As the embossed hologram is a diffractive optical element, we shine a number of broadband light sources, one at a time and each at a different incident angle, on the embossed hologram of the credit card in such a way that a different color spectrum is diffracted and separated in space for each incident angle. In this way, the number of pixels in each color plane is investigated. Then we apply a feed-forward back-propagation neural network configuration to separate the counterfeit credit cards from the real ones. Our experimental demonstration using two off-the-shelf broadband white light-emitting diodes, one digital camera, a 3-layer neural network, and a notebook computer can identify all 69 counterfeit credit cards from eight real credit cards.

  16. Changing Climate, Challenging Choices: Identifying and Evaluating Climate Change Adaptation Options for Protected Areas Management in Ontario, Canada

    Science.gov (United States)

    Lemieux, Christopher J.; Scott, Daniel J.

    2011-10-01

    Climate change will pose increasingly significant challenges to managers of parks and other forms of protected areas around the world. Over the past two decades, numerous scientific publications have identified potential adaptations, but their suitability from legal, policy, financial, internal capacity, and other management perspectives has not been evaluated for any protected area agency or organization. In this study, a panel of protected area experts applied a Policy Delphi methodology to identify and evaluate climate change adaptation options across the primary management areas of a protected area agency in Canada. The panel identified and evaluated one hundred and sixty five (165) adaptation options for their perceived desirability and feasibility. While the results revealed a high level of agreement with respect to the desirability of adaptation options and a moderate level of capacity pertaining to policy formulation and management direction, a perception of low capacity for implementation in most other program areas was identified. A separate panel of senior park agency decision-makers used a multiple criterion decision-facilitation matrix to further evaluate the institutional feasibility of the 56 most desirable adaptation options identified by the initial expert panel and to prioritize them for consideration in a climate change action plan. Critically, only two of the 56 adaptation options evaluated by senior decision-makers were deemed definitely implementable, due largely to fiscal and internal capacity limitations. These challenges are common to protected area agencies in developed countries and pervade those in developing countries, revealing that limited adaptive capacity represents a substantive barrier to biodiversity conservation and other protected area management objectives in an era of rapid climate change.

  17. Evaluation of ICD-10 algorithms to identify hypopituitary patients in the Danish National Patient Registry

    DEFF Research Database (Denmark)

    Berglund, Agnethe; Olsen, Morten; Andersen, Marianne

    2017-01-01

    : Patients with International Classification of Diseases (10th edition [ICD-10]) diagnoses of hypopituitarism, or other diagnoses of pituitary disorders assumed to be associated with an increased risk of hypopituitarism, recorded in the DNPR during 2000-2012 were identified. Medical records were reviewed...... to confirm or disprove hypopituitarism. RESULTS: Hypopituitarism was confirmed in 911 patients. In a candidate population of 1,661, this yielded an overall positive predictive value (PPV) of 54.8% (95% confidence interval [CI]: 52.4-57.3). Using algorithms searching for patients recorded at least one, three...... or five times with a diagnosis of hypopituitarism (E23.0x) and/or at least once with a diagnosis of postprocedural hypopituitarism (E89.3x), PPVs gradually increased from 73.3% (95% CI: 70.6-75.8) to 83.3% (95% CI: 80.7-85.7). Completeness for the same algorithms, however, decreased from 90.8% (95% CI: 88...

  18. Identifying obstructive sleep apnea after stroke/TIA: evaluating four simple screening tools.

    Science.gov (United States)

    Boulos, Mark I; Wan, Anthony; Im, James; Elias, Sara; Frankul, Fadi; Atalla, Mina; Black, Sandra E; Basile, Vincenzo S; Sundaram, Arun; Hopyan, Julia J; Boyle, Karl; Gladstone, David J; Murray, Brian J; Swartz, Richard H

    2016-05-01

    Despite its high prevalence and unfavorable clinical consequences, obstructive sleep apnea (OSA) often remains underappreciated after cerebrovascular events. The purpose of our study was to evaluate the clinical utility of four simple paper-based screening tools for excluding OSA after stroke or transient ischemic attack (TIA). Sixty-nine inpatients and outpatients with stroke or TIA within the past 180 days completed the 4-Variable screening tool (4V), the STOP-BAG questionnaire (i.e., the STOP-BANG questionnaire without the neck circumference measurement), the Berlin questionnaire, and the Sleep Obstructive apnea score optimized for Stroke (SOS). They subsequently underwent objective testing using a portable sleep monitoring device. Cutoffs were selected to maximize sensitivity and exclude OSA (AHI ≥ 10) in ≥10% of the cohort. The mean age was 68.3 ± 14.2 years and 47.8% were male. Thirty-two patients (46.4%) were found to have OSA. Male sex, body mass index (BMI), and atrial fibrillation were independent predictors of OSA. Among the screening tools, the 4V had the greatest area under the curve (AUC), 0.688 (p = 0.007), with a sensitivity of 96.9% at the selected cutoff for excluding OSA after stroke/TIA. Owing to the atypical presentation of post-stroke/TIA OSA, these tools are only moderately predictive; objective testing should still be used for OSA diagnosis in this population. Copyright © 2016 Elsevier B.V. All rights reserved.

  19. Evaluation of Quality and Readability of Health Information Websites Identified through India's Major Search Engines.

    Science.gov (United States)

    Raj, S; Sharma, V L; Singh, A J; Goel, S

    2016-01-01

    Background. The health information available on websites should be reliable and accurate in order for the community to make informed decisions. This study was done to assess the quality and readability of health information websites on the World Wide Web in India. Methods. This cross-sectional study was carried out in June 2014. The key words "Health" and "Information" were used on the search engines Google and Yahoo. Out of 50 websites (25 from each search engine), after exclusion, 32 websites were evaluated. The LIDA tool was used to assess quality, whereas readability was assessed using the Flesch Reading Ease Score (FRES), Flesch-Kincaid Grade Level (FKGL), and SMOG. Results. Forty percent of the websites (n = 13) were sponsored by government. Health On the Net Code of Conduct (HONcode) certification was present on 50% (n = 16) of the websites. The mean LIDA score (74.31) was average. Only 3 websites scored high on the LIDA score. Only five had readability scores at the recommended sixth-grade level. Conclusion. Most health information websites were of average quality, especially in terms of usability and reliability, and were written at high reading levels. Efforts are needed to develop health information websites that can help the general population in informed decision making.
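
    As a rough illustration of the readability metrics named above, the sketch below applies the standard FRES, FKGL, and SMOG formulas to a piece of text. The crude vowel-group syllable counter and the sample sentences are assumptions for demonstration only; the study itself used established readability tools.

        import math, re

        def count_syllables(word):
            # Very rough heuristic: count vowel groups; real readability tools
            # use dictionaries or better rules, so treat results as approximate.
            return max(1, len(re.findall(r"[aeiouy]+", word.lower())))

        def readability(text):
            sentences = max(1, len(re.findall(r"[.!?]+", text)))
            words = re.findall(r"[A-Za-z]+", text)
            n_words = max(1, len(words))
            syllables = sum(count_syllables(w) for w in words)
            polysyllables = sum(1 for w in words if count_syllables(w) >= 3)
            fres = 206.835 - 1.015 * (n_words / sentences) - 84.6 * (syllables / n_words)
            fkgl = 0.39 * (n_words / sentences) + 11.8 * (syllables / n_words) - 15.59
            smog = 1.0430 * math.sqrt(polysyllables * 30.0 / sentences) + 3.1291
            return {"FRES": round(fres, 1), "FKGL": round(fkgl, 1), "SMOG": round(smog, 1)}

        print(readability("Take one tablet twice a day with food. "
                          "Call your doctor if symptoms persist or worsen."))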

  20. Identifying and Evaluating the Relationships that Control a Land Surface Model's Hydrological Behavior

    Science.gov (United States)

    Koster, Randal D.; Mahanama, Sarith P.

    2012-01-01

    The inherent soil moisture-evaporation relationships used in today's land surface models (LSMs) arguably reflect a lot of guesswork given the lack of contemporaneous evaporation and soil moisture observations at the spatial scales represented by regional and global models. The inherent soil moisture-runoff relationships used in the LSMs are also of uncertain accuracy. Evaluating these relationships is difficult but crucial, given that they have a major impact on how the land component contributes to hydrological and meteorological variability within the climate system. The relationships, it turns out, can be examined efficiently and effectively with a simple water balance model framework. The simple water balance model, driven with multi-decadal observations covering the conterminous United States, shows how different prescribed relationships lead to different manifestations of hydrological variability, some of which can be compared directly to observations. Through the testing of a wide suite of relationships, the simple model provides estimates for the underlying relationships that operate in nature and that should be operating in LSMs. We examine the relationships currently used in a number of different LSMs in the context of the simple water balance model results and make recommendations for potential first-order improvements to these LSMs.
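
    For readers unfamiliar with the idea of a simple water balance model, the toy single-bucket sketch below illustrates the kind of prescribed soil moisture-evaporation and soil moisture-runoff relationships being discussed. The functional forms, parameter names (s_max, runoff_exp), and forcing values are illustrative assumptions, not the relationships evaluated by the authors.

        def bucket_model(precip, pet, s_max=200.0, runoff_exp=2.0, s0=100.0):
            # Toy daily water balance: storage gains precipitation and loses
            # evaporation and runoff, both scaled by relative soil wetness.
            s, runoff, evap = s0, [], []
            for p, e_pot in zip(precip, pet):
                wetness = s / s_max
                q = p * wetness ** runoff_exp      # wetter soil -> more runoff
                et = e_pot * wetness               # wetter soil -> more evaporation
                s = min(s_max, max(0.0, s + p - q - et))
                runoff.append(q)
                evap.append(et)
            return runoff, evap

        # Hypothetical five days of forcing (mm/day).
        q, et = bucket_model(precip=[0, 12, 3, 0, 25], pet=[4, 3, 4, 5, 3])
        print([round(v, 1) for v in q], [round(v, 1) for v in et])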

  1. Evaluation of PCB sources and releases for identifying priorities to reduce PCBs in Washington State (USA).

    Science.gov (United States)

    Davies, Holly; Delistraty, Damon

    2016-02-01

    Polychlorinated biphenyls (PCBs) are ubiquitously distributed in the environment and produce multiple adverse effects in humans and wildlife. As a result, the purpose of our study was to characterize PCB sources in anthropogenic materials and releases to the environment in Washington State (USA) in order to formulate recommendations to reduce PCB exposures. Methods included review of relevant publications (e.g., open literature, industry studies and reports, federal and state government databases), scaling of PCB sources from national or county estimates to state estimates, and communication with industry associations and private and public utilities. Recognizing high associated uncertainty due to incomplete data, we strove to provide central tendency estimates for PCB sources. In terms of mass (high to low), PCB sources include lamp ballasts, caulk, small capacitors, large capacitors, and transformers. For perspective, these sources (200,000-500,000 kg) overwhelm PCBs estimated to reside in the Puget Sound ecosystem (1500 kg). Annual releases of PCBs to the environment (high to low) are attributed to lamp ballasts (400-1500 kg), inadvertent generation by industrial processes (900 kg), caulk (160 kg), small capacitors (3-150 kg), large capacitors (10-80 kg), pigments and dyes (0.02-31 kg), and transformers. Recommendations to characterize PCB distribution and decrease exposures include assessment of PCBs in buildings (e.g., schools) and replacement of these materials, development of Best Management Practices (BMPs) to contain PCBs, reduction of inadvertent generation of PCBs in consumer products, expansion of environmental monitoring and public education, and research to identify specific PCB congener profiles in human tissues.

  2. Hombres Sanos: evaluation of a social marketing campaign for heterosexually identified Latino men who have sex with men and women.

    Science.gov (United States)

    Martínez-Donate, Ana P; Zellner, Jennifer A; Sañudo, Fernando; Fernandez-Cerdeño, Araceli; Hovell, Melbourne F; Sipan, Carol L; Engelberg, Moshe; Carrillo, Hector

    2010-12-01

    We evaluated the effectiveness of Hombres Sanos [Healthy Men], a social marketing campaign to increase condom use and HIV testing among heterosexually identified Latino men, especially among heterosexually identified Latino men who have sex with men and women (MSMW). Hombres Sanos was implemented in northern San Diego County, California, from June 2006 through December 2006. Every other month we conducted cross-sectional surveys with independent samples of heterosexually identified Latino men before (n = 626), during (n = 752), and after (n = 385) the campaign. Respondents were randomly selected from 12 targeted community venues to complete an anonymous, self-administered survey on sexual practices and testing for HIV and other sexually transmitted infections. About 5.6% of respondents (n = 98) were heterosexually identified Latino MSMW. The intervention was associated with reduced rates of recent unprotected sex with both females and males among heterosexually identified Latino MSMW. The campaign was also associated with increases in perception of HIV risk, knowledge of testing locations, and condom carrying among heterosexual Latinos. Social marketing represents a promising approach for abating HIV transmission among heterosexually identified Latinos, particularly for heterosexually identified Latino MSMW. Given the scarcity of evidence-based HIV prevention interventions for these populations, this prevention strategy warrants further investigation.

  3. Systematic Correlation Matrix Evaluation (SCoMaE) - a bottom-up, science-led approach to identifying indicators

    Science.gov (United States)

    Mengis, Nadine; Keller, David P.; Oschlies, Andreas

    2018-01-01

    This study introduces the Systematic Correlation Matrix Evaluation (SCoMaE) method, a bottom-up approach which combines expert judgment and statistical information to systematically select transparent, nonredundant indicators for a comprehensive assessment of the state of the Earth system. The method consists of two basic steps: (1) the calculation of a correlation matrix among variables relevant for a given research question and (2) the systematic evaluation of the matrix, to identify clusters of variables with similar behavior and respective mutually independent indicators. Optional further analysis steps include (3) the interpretation of the identified clusters, enabling a learning effect from the selection of indicators, (4) testing the robustness of identified clusters with respect to changes in forcing or boundary conditions, (5) enabling a comparative assessment of varying scenarios by constructing and evaluating a common correlation matrix, and (6) the inclusion of expert judgment, for example, to prescribe indicators or to allow for considerations other than statistical consistency. The example application of the SCoMaE method to Earth system model output forced by different CO2 emission scenarios reveals the necessity of reevaluating indicators identified in a historical scenario simulation for an accurate assessment of an intermediate-high, as well as a business-as-usual, climate change scenario simulation. This necessity arises from changes in prevailing correlations in the Earth system under varying climate forcing. For a comparative assessment of the three climate change scenarios, we construct and evaluate a common correlation matrix, in which we identify robust correlations between variables across the three considered scenarios.
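
    A minimal sketch of steps (1) and (2) is given below: it computes a correlation matrix, clusters variables with similar behavior using the distance 1 - |r|, and keeps one representative indicator per cluster. The variable names, the 0.8 correlation threshold, and the synthetic data are assumptions for illustration and are not taken from the SCoMaE study.

        import numpy as np
        from scipy.cluster.hierarchy import linkage, fcluster
        from scipy.spatial.distance import squareform

        def select_indicators(data, names, r_threshold=0.8):
            # Cluster variables whose |correlation| exceeds r_threshold and keep
            # the first member of each cluster as its representative indicator.
            corr = np.corrcoef(data, rowvar=False)
            dist = 1.0 - np.abs(corr)              # similar behavior -> small distance
            np.fill_diagonal(dist, 0.0)
            tree = linkage(squareform(dist, checks=False), method="average")
            labels = fcluster(tree, t=1.0 - r_threshold, criterion="distance")
            representatives = {}
            for label, name in zip(labels, names):
                representatives.setdefault(label, name)
            return list(representatives.values()), labels

        # Synthetic example: two strongly correlated variables and one independent one.
        rng = np.random.default_rng(0)
        t = rng.normal(size=200)
        data = np.column_stack([t, t + 0.1 * rng.normal(size=200), rng.normal(size=200)])
        print(select_indicators(data, ["SAT", "OHC", "precip"]))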

  4. Techniques and methodologies to identify potential generated industries of NORM in Angola Republic and evaluate its impacts

    International Nuclear Information System (INIS)

    Diogo, José Manuel Sucumula

    2017-01-01

    Numerous steps have been taken worldwide to identify and quantify the radiological risks associated with the mining of ores containing Naturally Occurring Radioactive Material (NORM), which often results in unnecessary exposures to individuals and high environmental damage, with devastating consequences for the health of workers and damage to the economy of many countries due to a lack of regulations or inadequate regulations. For these and other reasons, the objective of this work was to identify industries with the potential to generate NORM in the Republic of Angola and to estimate their radiological environmental impacts. To achieve this objective, we studied the theoretical aspects and identified the main industrial activities internationally recognized as generators of NORM. The Brazilian regulatory experience was taken as a reference for the criteria used to classify industries that generate NORM, the methods of mining and their radiological environmental impacts, as well as the main techniques applied to evaluate the concentrations of radionuclides in a specific environmental matrix and/or a NORM sample. The study approach allowed the elaboration of a NORM map for the main provinces of Angola, establishing the evaluation criteria for implementing the Radiation Protection Plan in the extractive industry, establishing measures to control ionizing radiation in mining, and identifying and quantifying radionuclides present in samples of oil lees. However, in order to adequately assess the radiological environmental impact of the NORM industry, it is not enough to identify the industries; it is important to know the origin, quantify the radioactive material released as liquid and gaseous effluents, identify the main routes of exposure, and examine how this material spreads into the environment until it reaches man. (author)

  5. NOS CO-OPS Water Level Data, Verified, High Low

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — This dataset has verified (quality-controlled), daily, high low water level (tide) data from NOAA NOS Center for Operational Oceanographic Products and Services...

  6. NOS CO-OPS Water Level Data, Verified, 6-Minute

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — This dataset has verified (quality-controlled), 6-minute, water level (tide) data from NOAA NOS Center for Operational Oceanographic Products and Services (CO-OPS)....

  7. Verifying Correct Usage of Atomic Blocks and Typestate: Technical Companion

    National Research Council Canada - National Science Library

    Beckman, Nels E; Aldrich, Jonathan

    2008-01-01

    In this technical report, we present a static and dynamic semantics as well as a proof of soundness for a programming language presented in the paper entitled, 'Verifying Correct Usage of Atomic Blocks and Typestate...

  8. NOS CO-OPS Water Level Data, Verified, Hourly

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — This dataset has verified (quality-controlled), hourly, water level (tide) data from NOAA NOS Center for Operational Oceanographic Products and Services (CO-OPS)....

  9. Superfund TIO videos. Set A. Identifying PRPS. Removal process: Removal site evaluation. Part 2. Audio-Visual

    International Nuclear Information System (INIS)

    1990-01-01

    The videotape is divided into three sections. Section 1 details the liability of Potentially Responsible Parties (PRPs) and describes the four classes of PRPs: current owners and operators, former owners and operators, generators, and transporters (if they selected the site). Section 2 lists the goals of the Potentially Responsible Party (PRP) search and explains how to identify key players during the PRP search. How to plan and conduct the PRP search is also outlined. Section 3 outlines the steps involved in conducting a removal site evaluation. A discussion of when to conduct a removal preliminary assessment, a removal site inspection, and an Engineering Evaluation/Cost Analysis (EE/CA) is also covered

  10. Why so many "rigorous" evaluations fail to identify unintended consequences of development programs: How mixed methods can contribute.

    Science.gov (United States)

    Bamberger, Michael; Tarsilla, Michele; Hesse-Biber, Sharlene

    2016-04-01

    Many widely used impact evaluation designs, including randomized control trials (RCTs) and quasi-experimental designs (QEDs), frequently fail to detect what are often quite serious unintended consequences of development programs. This seems surprising, as experienced planners and evaluators are well aware that unintended consequences frequently occur. Most evaluation designs are intended to determine whether there is credible evidence (statistical, theory-based or narrative) that programs have achieved their intended objectives, and the logic of many evaluation designs, even those that are considered the most "rigorous," does not permit the identification of outcomes that were not specified in the program design. We take the example of RCTs as they are considered by many to be the most rigorous evaluation designs. We present a number of cases to illustrate how infusing RCTs with a mixed-methods approach (sometimes called an "RCT+" design) can strengthen the credibility of these designs and can also capture important unintended consequences. We provide a Mixed Methods Evaluation Framework that identifies nine ways in which unintended consequences can occur, and we apply this framework to two of the case studies. Copyright © 2016 Elsevier Ltd. All rights reserved.

  11. Incentivizing Verifiable Privacy-Protection Mechanisms for Offline Crowdsensing Applications.

    Science.gov (United States)

    Sun, Jiajun; Liu, Ningzhong

    2017-09-04

    Incentive mechanisms for crowdsensing have recently been intensively explored. Most of these mechanisms focus mainly on standard economic goals such as truthfulness and utility maximization. However, substantial privacy and security challenges, such as the privacy of users' costs, need to be faced directly in real-life environments. In this paper, we investigate offline verifiable privacy-protection crowdsensing issues. We first present a general verifiable privacy-protection incentive mechanism for the offline homogeneous and heterogeneous sensing job model. In addition, we propose a more complex verifiable privacy-protection incentive mechanism for the offline submodular sensing job model. The two mechanisms not only address the privacy-protection issues of users and the platform, but also ensure the verifiable correctness of payments between the platform and users. Finally, we demonstrate that the two mechanisms satisfy privacy protection and verifiable correctness of payments while achieving the same revenue as the generic mechanism without privacy protection. Our experiments also validate that the two mechanisms are both scalable and efficient, and applicable for mobile devices in auction-based crowdsensing applications, where the main incentive for the user is the remuneration.

  12. Verifiable Distribution of Material Goods Based on Cryptology

    Directory of Open Access Journals (Sweden)

    Radomír Palovský

    2015-12-01

    Counterfeiting of material goods is a general problem. In this paper, an architecture for the verifiable distribution of material goods is presented. The distribution is based on printing on the goods a QR code that contains a digitally signed serial number of the product, so that the validity of the digital signature can be verified by a customer. An extension consisting of adding digital signatures to the revenue stamps used for state-controlled goods is also presented. A discussion of the possibilities for making copies leads to the conclusion that cryptographic security needs to be complemented by technical barriers to copying.
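
    To make the signed-serial-number idea concrete, the sketch below signs a product serial number and verifies it on the customer side using the Python cryptography package and Ed25519 signatures. The choice of Ed25519, the payload layout, and the example serial number are assumptions for illustration; the paper does not prescribe a particular algorithm, and the QR encoding/decoding step is omitted here.

        import base64
        from cryptography.hazmat.primitives.asymmetric.ed25519 import Ed25519PrivateKey
        from cryptography.exceptions import InvalidSignature

        # Producer side: sign the product serial number.
        private_key = Ed25519PrivateKey.generate()
        public_key = private_key.public_key()
        serial = b"PRODUCT-0001234567"            # hypothetical serial number
        signature = private_key.sign(serial)

        # Payload that would be rendered as a QR code (QR generation omitted).
        payload = base64.b64encode(serial) + b"." + base64.b64encode(signature)

        # Customer side: decode the payload and verify against the published public key.
        enc_serial, enc_sig = payload.split(b".")
        try:
            public_key.verify(base64.b64decode(enc_sig), base64.b64decode(enc_serial))
            print("Signature valid for serial:", base64.b64decode(enc_serial).decode())
        except InvalidSignature:
            print("Counterfeit or tampered label")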

  13. Quality Issues Identified During the Evaluation of Biosimilars by the European Medicines Agency's Committee for Medicinal Products for Human Use.

    Science.gov (United States)

    Cilia, Mark; Ruiz, Sol; Richardson, Peter; Salmonson, Tomas; Serracino-Inglott, Anthony; Wirth, Francesca; Borg, John Joseph

    2018-02-01

    The aim of this study was to identify trends in deficiencies raised during the EU evaluation of the quality part of dossiers for marketing authorisation applications of biosimilar medicinal products. All adopted day-120 lists of questions on the quality module of the 22 marketing authorisation applications for biosimilars submitted to the European Medicines Agency and concluded by the end of October 2015 were analysed. Frequencies of common deficiencies identified were calculated and summarised descriptions included. Frequencies and trends of quality deficiencies were recorded and presented for the 22 biosimilar applications. Thirty-two 'major objections' for 9 products were identified from 14 marketing authorisation applications, with 15 raised for drug substance and 17 for drug product. In addition, 547 'other concerns' for drug substance and 495 for drug product were also adopted. The frequencies and trends of the identified deficiencies, together with their impact on key manufacturing processes and key materials used in the production of biosimilars, were discussed from a regulatory perspective. This study provides an insight into the regulatory challenges prospective companies need to consider when developing biosimilars; it also helps elucidate common pitfalls in the development and production of biosimilars and in the submission of dossiers for their marketing authorisations. The results are expected to be of interest to pharmaceutical companies but also to regulators, to obtain consistent information on medicinal products based on transparent rules safeguarding the necessary pharmaceutical quality of medicinal products.

  14. Review of Ground Systems Development and Operations (GSDO) Tools for Verifying Command and Control Software

    Science.gov (United States)

    Aguilar, Michael L.; Bonanne, Kevin H.; Favretto, Jeffrey A.; Jackson, Maddalena M.; Jones, Stephanie L.; Mackey, Ryan M.; Sarrel, Marc A.; Simpson, Kimberly A.

    2014-01-01

    The Exploration Systems Development (ESD) Standing Review Board (SRB) requested the NASA Engineering and Safety Center (NESC) conduct an independent review of the plan developed by Ground Systems Development and Operations (GSDO) for identifying models and emulators to create a tool(s) to verify their command and control software. The NESC was requested to identify any issues or weaknesses in the GSDO plan. This document contains the outcome of the NESC review.

  15. Protocol: a systematic review of studies developing and/or evaluating search strategies to identify prognosis studies.

    Science.gov (United States)

    Corp, Nadia; Jordan, Joanne L; Hayden, Jill A; Irvin, Emma; Parker, Robin; Smith, Andrea; van der Windt, Danielle A

    2017-04-20

    Prognosis research is on the rise, its importance recognised because chronic health conditions and diseases are increasingly common and costly. Prognosis systematic reviews are needed to collate and synthesise these research findings, especially to help inform effective clinical decision-making and healthcare policy. A detailed, comprehensive search strategy is central to any systematic review. However, within prognosis research, this is challenging due to poor reporting and inconsistent use of available indexing terms in electronic databases. Whilst many published search filters exist for finding clinical trials, this is not the case for prognosis studies. This systematic review aims to identify and compare existing methodological filters developed and evaluated to identify prognosis studies of any of the three main types: overall prognosis, prognostic factors, and prognostic [risk prediction] models. Primary studies reporting the development and/or evaluation of methodological search filters to retrieve any type of prognosis study will be included in this systematic review. Multiple electronic bibliographic databases will be searched, grey literature will be sought from relevant organisations and websites, experts will be contacted, and citation tracking of key papers and reference list checking of all included papers will be undertaken. Titles will be screened by one person, and abstracts and full articles will be reviewed for inclusion independently by two reviewers. Data extraction and quality assessment will also be undertaken independently by two reviewers with disagreements resolved by discussion or by a third reviewer if necessary. Filters' characteristics and performance metrics reported in the included studies will be extracted and tabulated. To enable comparisons, filters will be grouped according to database, platform, type of prognosis study, and type of filter for which it was intended. This systematic review will identify all existing validated

  16. Testing and evaluation of existing techniques for identifying uptakes and measuring retention of uranium in mill workers

    International Nuclear Information System (INIS)

    1983-03-01

    Preliminary tests and evaluations of existing bio-analytical techniques for identifying uptakes and measuring retention of uranium in mill workers were made at two uranium mills. Urinalysis tests were found to be more reliable indicators of uranium uptakes than personal air sampling. Static air samples were not found to be good indicators of personal uptakes. In vivo measurements of uranium in the lung were successfully carried out in the presence of high and fluctuating background radiation. Interference from external contamination was common during end-of-shift measurements. A full-scale study to evaluate model parameters for the uptake, retention and elimination of uranium should include, in addition to the above techniques, particle size determination of airborne uranium, solubility in simulated lung fluid, uranium analysis in faeces and bone, and minute volume measurements for each subject.

  17. Identifying and Evaluating Options for Improving Sediment Management and Fish Passage at Hydropower Dams in the Lower Mekong River Basin

    Science.gov (United States)

    Wild, T. B.; Reed, P. M.; Loucks, D. P.

    2015-12-01

    The Mekong River basin in Southeast Asia is undergoing intensive and pervasive hydropower development to satisfy demand for increased energy and income to support its growing population of 60 million people. Just 20 years ago this river flowed freely. Today some 30 large dams exist in the basin, and over 100 more are being planned for construction. These dams will alter the river's natural water, sediment and nutrient flows, thereby impacting river morphology and ecosystems, and will fragment fish migration pathways. In doing so, they will degrade one of the world's most valuable and productive freshwater fish habitats. For those dams that have not yet been constructed, there still exist opportunities to modify their siting, design and operation (SDO) to potentially achieve a more balanced set of tradeoffs among hydropower production, sediment/nutrient passage and fish passage. We introduce examples of such alternative SDO opportunities for Sambor Dam in Cambodia, planned to be constructed on the main stem of the Mekong River. To evaluate the performance of such alternatives, we developed a Python-based simulation tool called PySedSim. PySedSim is a daily time step mass balance model that identifies the relative tradeoffs among hydropower production, and flow and sediment regime alteration, associated with reservoir sediment management techniques such as flushing, sluicing, bypassing, density current venting and dredging. To date, there has been a very limited acknowledgement or evaluation of the significant uncertainties that impact the evaluation of SDO alternatives. This research is formalizing a model diagnostic assessment of the key assumptions and parametric uncertainties that strongly influence PySedSim SDO evaluations. Using stochastic hydrology and sediment load data, our diagnostic assessment evaluates and compares several Sambor Dam alternatives using several performance measures related to energy production, sediment trapping and regime alteration, and
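
    The sketch below is not PySedSim; it is a toy daily mass balance illustrating the kind of bookkeeping such a model performs, with a constant trapping-efficiency parameter and a periodic drawdown-flushing operation. All parameter names and values (trap_eff, flush_every, flush_frac, the constant inflow load) are made-up assumptions.

        def sediment_balance(inflow_load, trap_eff=0.7, flush_every=90, flush_frac=0.5):
            # Toy daily reservoir sediment mass balance: each day a fraction
            # trap_eff of the incoming load is trapped; every flush_every days a
            # flushing operation releases flush_frac of the stored sediment.
            stored, released = 0.0, []
            for day, load in enumerate(inflow_load, start=1):
                trapped = trap_eff * load
                out = load - trapped                  # load passing the dam that day
                stored += trapped
                if day % flush_every == 0:            # scheduled drawdown flushing
                    out += flush_frac * stored
                    stored *= (1.0 - flush_frac)
                released.append(out)
            return stored, released

        # Hypothetical constant inflow load of 1000 t/day for one year.
        stored, released = sediment_balance([1000.0] * 365)
        print(round(stored, 1), "t trapped;", round(sum(released), 1), "t passed downstream")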

  18. Verifying different-modality properties for concepts produces switching costs.

    Science.gov (United States)

    Pecher, Diane; Zeelenberg, René; Barsalou, Lawrence W

    2003-03-01

    According to perceptual symbol systems, sensorimotor simulations underlie the representation of concepts. It follows that sensorimotor phenomena should arise in conceptual processing. Previous studies have shown that switching from one modality to another during perceptual processing incurs a processing cost. If perceptual simulation underlies conceptual processing, then verifying the properties of concepts should exhibit a switching cost as well. For example, verifying a property in the auditory modality (e.g., BLENDER-loud) should be slower after verifying a property in a different modality (e.g., CRANBERRIES-tart) than after verifying a property in the same modality (e.g., LEAVES-rustling). Only words were presented to subjects, and there were no instructions to use imagery. Nevertheless, switching modalities incurred a cost, analogous to the cost of switching modalities in perception. A second experiment showed that this effect was not due to associative priming between properties in the same modality. These results support the hypothesis that perceptual simulation underlies conceptual processing.

  19. Elements of a system for verifying a Comprehensive Test Ban

    International Nuclear Information System (INIS)

    Hannon, W.J.

    1987-01-01

    The paper discusses the goals of a monitoring system for a CTB, its functions, the challenges to verification, discrimination techniques, and some recent developments. It is concluded that technical, military and political efforts are required to establish and verify test ban treaties which will contribute to stability in the long term. It currently appears that there will be a significant number of unidentified events

  20. An experiment designed to verify the general theory of relativity

    International Nuclear Information System (INIS)

    Surdin, Maurice

    1960-01-01

    The paper presents the project for an experiment which uses the effect of gravitation on maser-type clocks placed on the ground at two different heights and which is designed to verify the general theory of relativity. Reprint of a paper published in Comptes rendus des seances de l'Academie des Sciences, t. 250, p. 299-301, sitting of 11 January 1960 [fr]
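
    For orientation, the quantity such a two-height clock experiment probes is the standard weak-field gravitational frequency shift; a worked example with an assumed height difference of 10 m is sketched below (the 10 m figure is illustrative and is not taken from the paper).

        \frac{\Delta \nu}{\nu} \approx \frac{g\,\Delta h}{c^{2}}
          = \frac{9.81\ \mathrm{m\,s^{-2}} \times 10\ \mathrm{m}}{\left(3.0 \times 10^{8}\ \mathrm{m\,s^{-1}}\right)^{2}}
          \approx 1.1 \times 10^{-15}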

  1. Building Program Verifiers from Compilers and Theorem Provers

    Science.gov (United States)

    2015-05-14

    Slide excerpt (extraction residue): UFO, an LLVM-based front-end partially reused in SeaHorn, combines abstract interpretation with interpolation-based model checking ("checking with SMT"); counter-examples are long, and it is hard to determine from main what is relevant to an assertion (Gurfinkel, 2015).

  2. Verifying a smart design of TCAP : a synergetic experience

    NARCIS (Netherlands)

    T. Arts; I.A. van Langevelde

    1999-01-01

    An optimisation of the SS No. 7 Transport Capabilities Procedures is verified by specifying both the original and the optimised TCAP in µCRL, generating transition systems for both using the µCRL tool set, and checking weak bisimulation

  3. A Trustworthy Internet Auction Model with Verifiable Fairness.

    Science.gov (United States)

    Liao, Gen-Yih; Hwang, Jing-Jang

    2001-01-01

    Describes an Internet auction model achieving verifiable fairness, a requirement aimed at enhancing the trust of bidders in auctioneers. Analysis results demonstrate that the proposed model satisfies various requirements regarding fairness and privacy. Moreover, in the proposed model, the losing bids remain sealed. (Author/AEF)

  4. The Guided System Development Framework: Modeling and Verifying Communication Systems

    DEFF Research Database (Denmark)

    Carvalho Quaresma, Jose Nuno; Probst, Christian W.; Nielson, Flemming

    2014-01-01

    the verified specification. The refinement process thus carries security properties from the model to the implementation. Our approach also supports verification of systems previously developed and deployed. Internally, the reasoning in our framework is based on the Beliefs and Knowledge tool, a verification...... tool based on belief logics and explicit attacker knowledge....

  5. Making Digital Artifacts on the Web Verifiable and Reliable

    NARCIS (Netherlands)

    Kuhn, T.; Dumontier, M.

    2015-01-01

    The current Web has no general mechanisms to make digital artifacts - such as datasets, code, texts, and images - verifiable and permanent. For digital artifacts that are supposed to be immutable, there is moreover no commonly accepted method to enforce this immutability. These shortcomings have a

  6. Analyzing Interaction Patterns to Verify a Simulation/Game Model

    Science.gov (United States)

    Myers, Rodney Dean

    2012-01-01

    In order for simulations and games to be effective for learning, instructional designers must verify that the underlying computational models being used have an appropriate degree of fidelity to the conceptual models of their real-world counterparts. A simulation/game that provides incorrect feedback is likely to promote misunderstanding and…

  7. VISION User Guide - VISION (Verifiable Fuel Cycle Simulation) Model

    International Nuclear Information System (INIS)

    Jacobson, Jacob J.; Jeffers, Robert F.; Matthern, Gretchen E.; Piet, Steven J.; Baker, Benjamin A.; Grimm, Joseph

    2009-01-01

    The purpose of this document is to provide a guide for using the current version of the Verifiable Fuel Cycle Simulation (VISION) model. This is a complex model with many parameters; the user is strongly encouraged to read this user guide before attempting to run the model. This model is an R&D work in progress and may contain errors and omissions. It is based upon numerous assumptions. This model is intended to assist in evaluating 'what if' scenarios and in comparing fuel, reactor, and fuel processing alternatives at a systems level for U.S. nuclear power. The model is not intended as a tool for process flow and design modeling of specific facilities nor for tracking individual units of fuel or other material through the system. The model is intended to examine the interactions among the components of a fuel system as a function of time-varying system parameters; this model represents a dynamic rather than steady-state approximation of the nuclear fuel system. VISION models the nuclear fuel cycle at the system level, not individual facilities, e.g., 'reactor types' not individual reactors and 'separation types' not individual separation plants. Natural uranium can be enriched, which produces enriched uranium, which goes into fuel fabrication, and depleted uranium (DU), which goes into storage. Fuel is transformed (transmuted) in reactors and then goes into a storage buffer. Used fuel can be pulled from storage into either separations or disposal. If sent to separations, fuel is transformed (partitioned) into fuel products, recovered uranium, and various categories of waste. Recycled material is stored until used by its assigned reactor type. Note that recovered uranium is itself often partitioned: some RU flows with recycled transuranic elements, some flows with wastes, and the rest is designated RU. RU comes out of storage if needed to correct the U/TRU ratio in new recycled fuel. Neither RU nor DU is designated as waste. VISION comprises several Microsoft
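
    To illustrate the kind of system-level, time-stepped stock-and-flow bookkeeping described above (and only that; this is in no way the VISION model itself), the sketch below steps a single reactor fleet through fabrication, irradiation, cooling storage, and either separations or disposal. Every rate, factor, and fleet size in it is a made-up assumption.

        def fuel_cycle(years=20, reactors=100, fuel_per_reactor=20.0,
                       cooling_years=5, recycle=False):
            # Heavily simplified annual stock-and-flow bookkeeping (tonnes/yr).
            stocks = {"natural_U_used": 0.0, "cooling_storage": [],
                      "recycled_products": 0.0, "wastes": 0.0, "disposal": 0.0}
            for year in range(years):
                fresh_fuel = reactors * fuel_per_reactor
                stocks["natural_U_used"] += fresh_fuel * 10.0     # crude enrichment feed factor
                stocks["cooling_storage"].append(fresh_fuel)      # discharged after irradiation
                if len(stocks["cooling_storage"]) > cooling_years:
                    batch = stocks["cooling_storage"].pop(0)      # oldest cooled batch
                    if recycle:                                   # send to separations
                        stocks["recycled_products"] += 0.95 * batch
                        stocks["wastes"] += 0.05 * batch
                    else:                                         # send to disposal
                        stocks["disposal"] += batch
            return stocks

        print(fuel_cycle(recycle=True))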

  8. Evaluation of multiple approaches to identify genome-wide polymorphisms in closely related genotypes of sweet cherry (Prunus avium L.)

    Directory of Open Access Journals (Sweden)

    Seanna Hewitt

    Identification of genetic polymorphisms and subsequent development of molecular markers is important for marker-assisted breeding of superior cultivars of economically important species. Sweet cherry (Prunus avium L.) is an economically important non-climacteric tree fruit crop in the Rosaceae family and has undergone a genetic bottleneck due to breeding, resulting in limited genetic diversity in the germplasm that is utilized for breeding new cultivars. Therefore, it is critical to recognize the best platforms for identifying genome-wide polymorphisms that can help identify, and consequently preserve, the diversity in a genetically constrained species. For the identification of polymorphisms in five closely related genotypes of sweet cherry, a gel-based approach (TRAP), reduced representation sequencing (TRAPseq), a 6k cherry SNP array, and whole genome sequencing (WGS) were evaluated. All platforms facilitated detection of polymorphisms among the genotypes with variable efficiency. In assessing multiple SNP detection platforms, this study has demonstrated that a combination of appropriate approaches is necessary for efficient polymorphism identification, especially between closely related cultivars of a species. The information generated in this study provides a valuable resource for future genetic and genomic studies in sweet cherry, and the insights gained from the evaluation of multiple approaches can be utilized for other closely related species with limited genetic diversity in the breeding germplasm. Keywords: Polymorphisms, Prunus avium, Next-generation sequencing, Target region amplification polymorphism (TRAP), Genetic diversity, SNP array, Reduced representation sequencing, Whole genome sequencing (WGS)

  9. Clinical implications of nonspecific pulmonary nodules identified during the initial evaluation of patients with head and neck squamous cell carcinoma

    Energy Technology Data Exchange (ETDEWEB)

    Kwon, Minsu [Eulji University School of Medicine, Department of Otorhinolaryngology, Eulji Medical Center, Seoul (Korea, Republic of); Lee, Sang Hoon; Lee, Yoon Se; Roh, Jong-Lyel; Choi, Seung-Ho; Nam, Soon Yuhl; Kim, Sang Yoon [Asan Medical Center, University of Ulsan College of Medicine, Department of Otolaryngology, Songpa-gu, Seoul (Korea, Republic of); Lee, Choong Wook [Asan Medical Center, University of Ulsan College of Medicine, Department of Radiology, Seoul (Korea, Republic of)

    2017-09-15

    We aimed to identify the clinical implications of nonspecific pulmonary nodules (NPNs) detected in the initial staging workup of patients with head and neck squamous cell carcinoma (HNSCC). Medical records of patients who had been diagnosed and treated in our hospital were retrospectively analysed. After definitive treatment, changes in NPNs detected on initial evaluation were monitored via serial chest computed tomography. The associations between NPNs and the clinicopathological characteristics of primary HNSCC were evaluated. Survival analyses were performed according to the presence of NPNs. The study consisted of 158 (49.4%) patients without NPNs and 162 (50.6%) patients with NPNs. The cumulative incidence probabilities of pulmonary malignancy (PM) development at 2 years after treatment were 9.0% and 6.2% in NPN-negative and NPN-positive patients, respectively. Overall and PM-free survival rates were not significantly different according to NPN status. Cervical lymph node (LN) involvement and a platelet-lymphocyte ratio (PLR) ≥126 increased the risk of PMs (both P <0.05). NPNs detected in the initial evaluation of patients with HNSCC did not predict the risk of pulmonary malignancies. Cervical LN involvement and PLR ≥126 may be independent prognostic factors affecting PM-free survival regardless of NPN status. (orig.)

  10. Evaluation of an inpatient fall risk screening tool to identify the most critical fall risk factors in inpatients.

    Science.gov (United States)

    Hou, Wen-Hsuan; Kang, Chun-Mei; Ho, Mu-Hsing; Kuo, Jessie Ming-Chuan; Chen, Hsiao-Lien; Chang, Wen-Yin

    2017-03-01

    To evaluate the accuracy of the inpatient fall risk screening tool and to identify the most critical fall risk factors in inpatients. Variations exist among the several screening tools applied in acute care hospitals for examining risk factors for falls and identifying high-risk inpatients. Secondary data analysis. A subset of inpatient data for the period from June 2011-June 2014 was extracted from the nursing information system and adverse event reporting system of an 818-bed teaching medical centre in Taipei. Data were analysed using descriptive statistics, receiver operating characteristic curve analysis and logistic regression analysis. During the study period, 205 fallers and 37,232 nonfallers were identified. The results revealed that the inpatient fall risk screening tool (cut-off point of ≥3) had a low sensitivity level (60%), satisfactory specificity (87%), a positive predictive value of 2·0% and a negative predictive value of 99%. The receiver operating characteristic curve analysis revealed an area under the curve of 0·805 (sensitivity, 71·8%; specificity, 78%). To increase the sensitivity values, the Youden index suggested at least 1·5 points as the most suitable cut-off point for the inpatient fall risk screening tool. Multivariate logistic regression analysis revealed a considerably increased fall risk in patients with impaired balance and impaired elimination. Fall risk was also significantly associated with days of hospital stay and with admission to surgical wards. The findings can raise awareness among future clinical nurses and other healthcare professionals of the two most critical risk factors for falls and thus facilitate the development of fall prevention interventions. This study highlights the need for redefining the cut-off points of the inpatient fall risk screening tool to effectively identify inpatients at a high risk of falls. Furthermore, inpatients with impaired balance and impaired elimination should be closely
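
    As an illustration of how a cut-off such as the 1·5-point value above can be selected, the sketch below computes sensitivity, specificity, and Youden's J for every candidate cut-off of a screening score. The scores and fall outcomes in the example are synthetic and have no connection to the study's data.

        import numpy as np

        def youden_cutoff(scores, fell):
            # Pick the screening-score cut-off maximizing Youden's J
            # (sensitivity + specificity - 1); fell is a 0/1 fall outcome.
            scores, fell = np.asarray(scores, float), np.asarray(fell, int)
            best = None
            for cut in np.unique(scores):
                pred = scores >= cut
                tp = np.sum(pred & (fell == 1)); fn = np.sum(~pred & (fell == 1))
                tn = np.sum(~pred & (fell == 0)); fp = np.sum(pred & (fell == 0))
                sens = tp / (tp + fn) if (tp + fn) else 0.0
                spec = tn / (tn + fp) if (tn + fp) else 0.0
                j = sens + spec - 1.0
                if best is None or j > best[0]:
                    best = (j, cut, sens, spec)
            return best  # (J, cut-off, sensitivity, specificity)

        # Synthetic example: risk scores 0-6 for 12 patients, 1 = fell.
        scores = [0, 1, 1, 2, 2, 3, 3, 4, 4, 5, 5, 6]
        fell = [0, 0, 0, 0, 1, 0, 1, 0, 1, 1, 1, 1]
        print(youden_cutoff(scores, fell))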

  11. Evaluation of unique identifiers used for citation linking [version 1; referees: 1 approved, 2 approved with reservations

    Directory of Open Access Journals (Sweden)

    Heidi Holst Madsen

    2016-06-01

    Unique identifiers (UIDs) are seen as an effective tool to create links between identical publications in databases or to identify duplicates in a database. The purpose of the present study is to investigate how well UIDs work for citation linking. We have two objectives: to explore the coverage, precision, and characteristics of publications matched versus not matched with UIDs as the match key; and to illustrate how publication sets formed by using UIDs as the match key may affect the bibliometric indicators number of publications, number of citations, and average number of citations per publication. The objectives are addressed in a literature review and a case study. The literature review shows that only a few studies evaluate how well UIDs work as a match key. From the literature we identify four error types: duplicate digital object identifiers (DOIs), incorrect DOIs in reference lists and databases, DOIs not registered by the database where a bibliometric analysis is performed, and erroneous optical or special character recognition. The case study explores the use of UIDs in the integration between the databases Pure and SciVal. Specifically, journal publications in English are matched between the two databases. We find all error types except erroneous optical or special character recognition in our publication sets. In particular, the duplicate DOIs constitute a problem for the calculation of bibliometric indicators, as both keeping the duplicates to improve the reliability of citation counts and deleting them to improve the reliability of publication counts will distort the calculation of the average number of citations per publication. The use of UIDs as a match key in citation linking is implemented in many settings, and the availability of UIDs may become critical for the inclusion of a publication or a database in a bibliometric analysis.
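
    A toy sketch of DOI-based matching between two record sets is given below; it normalizes DOIs before comparison and flags duplicate DOIs, one of the error types listed above. The record structure ('id' and 'doi' fields) and the example values are assumptions for illustration and do not reflect the Pure or SciVal data models.

        from collections import defaultdict

        def normalize_doi(doi):
            # Lowercase and strip common URL prefixes so equivalent DOIs compare equal.
            doi = (doi or "").strip().lower()
            for prefix in ("https://doi.org/", "http://dx.doi.org/", "doi:"):
                if doi.startswith(prefix):
                    doi = doi[len(prefix):]
            return doi

        def match_on_doi(records_a, records_b):
            # Link records from two sources on DOI and report duplicate DOIs.
            index = defaultdict(list)
            for rec in records_b:
                index[normalize_doi(rec["doi"])].append(rec["id"])
            matches, duplicates = {}, {}
            for rec in records_a:
                key = normalize_doi(rec["doi"])
                if key and key in index:
                    matches[rec["id"]] = index[key]
                    if len(index[key]) > 1:
                        duplicates[key] = index[key]   # same DOI on several target records
            return matches, duplicates

        a = [{"id": "pure-1", "doi": "10.1000/xyz123"}]
        b = [{"id": "scival-9", "doi": "https://doi.org/10.1000/XYZ123"},
             {"id": "scival-10", "doi": "10.1000/xyz123"}]
        print(match_on_doi(a, b))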

  12. Verifiable Measurement-Only Blind Quantum Computing with Stabilizer Testing.

    Science.gov (United States)

    Hayashi, Masahito; Morimae, Tomoyuki

    2015-11-27

    We introduce a simple protocol for verifiable measurement-only blind quantum computing. Alice, a client, can perform only single-qubit measurements, whereas Bob, a server, can generate and store entangled many-qubit states. Bob generates copies of a graph state, which is a universal resource state for measurement-based quantum computing, and sends Alice each qubit of them one by one. Alice adaptively measures each qubit according to her program. If Bob is honest, he generates the correct graph state, and, therefore, Alice can obtain the correct computation result. Regarding the security, whatever Bob does, Bob cannot get any information about Alice's computation because of the no-signaling principle. Furthermore, malicious Bob does not necessarily send the copies of the correct graph state, but Alice can check the correctness of Bob's state by directly verifying the stabilizers of some copies.
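
    The stabilizer check at the heart of the protocol can be illustrated numerically on the smallest case: the graph state for a single edge, CZ|+>|+>, should be a +1 eigenstate of its stabilizers X⊗Z and Z⊗X. The sketch below verifies this with plain linear algebra; it illustrates only the stabilizer property, not the blind-computing protocol itself.

        import numpy as np

        # Single-qubit operators and the |+> state.
        X = np.array([[0, 1], [1, 0]])
        Z = np.array([[1, 0], [0, -1]])
        plus = np.array([1, 1]) / np.sqrt(2)

        # Two-qubit graph state for a single edge: CZ applied to |+>|+>.
        CZ = np.diag([1, 1, 1, -1])
        graph_state = CZ @ np.kron(plus, plus)

        # Its stabilizers are K1 = X (x) Z and K2 = Z (x) X.
        for name, K in [("X(x)Z", np.kron(X, Z)), ("Z(x)X", np.kron(Z, X))]:
            print(name, "stabilizes the state:", np.allclose(K @ graph_state, graph_state))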

  13. Verifying the gravitational shift due to the earth's rotation

    International Nuclear Information System (INIS)

    Briatore, L.; Leschiutta, S.

    1976-01-01

    Data on various independent time scales kept in different laboratories are elaborated in order to verify the gravitational shift due to the earth's rotation. It is shown that the state of the art in the measurement of time is just now making it possible to measure Δt/t of approximately 10⁻¹³. Moreover, experimental evidence of relativistic effects due to the earth's rotation is shown

  14. Building and Verifying a Predictive Model of Interruption Resumption

    Science.gov (United States)

    2012-03-01

    Invited paper on building and verifying a predictive model of interruption resumption, motivated by a robot that helps a human storyteller continue after an interruption. The surviving fragments of the abstract indicate that a speaker who expects a listener (such as the robot, or a camera acting as an external memory) to remember what was last said will not commit their own resources to remembering that information, and that when such help was available the storyteller needed help much less often.

  15. TrustGuard: A Containment Architecture with Verified Output

    Science.gov (United States)

    2017-01-01

    Doctoral dissertation (Soumyadeep Ghosh, Princeton University) presenting TrustGuard, a containment architecture in which a separate monitoring chip verifies the output of an untrusted host processor. The surviving fragments of the abstract indicate that a key constraint on the design is the bandwidth and latency available between the host and the monitoring chip, that 3-D integration provides an alternative way of supplying such a high-bandwidth, low-delay link, and that the TrustGuard system nonetheless shows minimal performance decline despite high communication latency and limited available bandwidth.

  16. Large test rigs verify Clinch River control rod reliability

    International Nuclear Information System (INIS)

    Michael, H.D.; Smith, G.G.

    1983-01-01

    The purpose of the Clinch River control test programme was to use multiple full-scale prototypic control rod systems for verifying the system's ability to perform reliably during simulated reactor power control and emergency shutdown operations. Two major facilities, the Shutdown Control Rod and Maintenance (Scram) facility and the Dynamic and Seismic Test (Dast) facility, were constructed. The test programme of each facility is described. (UK)

  17. Verifying Temporal Properties of Reactive Systems by Transformation

    OpenAIRE

    Hamilton, Geoff

    2015-01-01

    We show how program transformation techniques can be used for the verification of both safety and liveness properties of reactive systems. In particular, we show how the program transformation technique distillation can be used to transform reactive systems specified in a functional language into a simplified form that can subsequently be analysed to verify temporal properties of the systems. Example systems which are intended to model mutual exclusion are analysed using these techniques with...

  18. Robustness and device independence of verifiable blind quantum computing

    International Nuclear Information System (INIS)

    Gheorghiu, Alexandru; Kashefi, Elham; Wallden, Petros

    2015-01-01

    Recent advances in theoretical and experimental quantum computing bring us closer to scalable quantum computing devices. This makes the need for protocols that verify the correct functionality of quantum operations timely and has led to the field of quantum verification. In this paper we address key challenges to make quantum verification protocols applicable to experimental implementations. We prove the robustness of the single server verifiable universal blind quantum computing protocol of Fitzsimons and Kashefi (2012 arXiv:1203.5217) in the most general scenario. This includes the case where the purification of the deviated input state is in the hands of an adversarial server. The proved robustness property allows the composition of this protocol with a device-independent state tomography protocol that we give, which is based on the rigidity of CHSH games as proposed by Reichardt et al (2013 Nature 496 456–60). The resulting composite protocol has lower round complexity for the verification of entangled quantum servers with a classical verifier and, as we show, can be made fault tolerant. (paper)

  19. In silico analysis to identify vaccine candidates common to multiple serotypes of Shigella and evaluation of their immunogenicity

    KAUST Repository

    Pahil, Sapna

    2017-08-02

    Shigellosis or bacillary dysentery is an important cause of diarrhea, with the majority of the cases occurring in developing countries. Considering the high disease burden, increasing antibiotic resistance, serotype-specific immunity and the post-infectious sequelae associated with shigellosis, there is a pressing need for an effective vaccine against multiple serotypes of the pathogen. In the present study, we used a bioinformatics approach to identify antigens shared among multiple serotypes of Shigella spp. This approach led to the identification of many immunogenic peptides. The five most promising peptides based on MHC binding efficiency were a putative lipoprotein (EL PGI I), a putative heat shock protein (EL PGI II), Spa32 (EL PGI III), IcsB (EL PGI IV) and a hypothetical protein (EL PGI V). These peptides were synthesized and their immunogenicity was evaluated in BALB/c mice by ELISA and cytokine assays. The putative heat shock protein (HSP) and the hypothetical protein elicited a good humoral response, whereas the putative lipoprotein, Spa32 and IcsB elicited a good T-cell response, as revealed by increased IFN-γ and TNF-α cytokine levels. Sera from patients with confirmed shigellosis were also evaluated for the presence of peptide-specific antibodies, showing significant IgG and IgA antibodies against the HSP and the hypothetical protein and marking these two antigens as potential future vaccine candidates. The antigens reported in this study are novel and have not been tested as vaccine candidates against Shigella. This study offers a time- and cost-effective way of identifying unprecedented immunogenic antigens to be used as potential vaccine candidates. Moreover, this approach should easily be extendable to find new potential vaccine candidates for other pathogenic bacteria.

  20. In silico analysis to identify vaccine candidates common to multiple serotypes of Shigella and evaluation of their immunogenicity

    KAUST Repository

    Pahil, Sapna; Taneja, Neelam; Ansari, Hifzur Rahman; Raghava, G. P. S.

    2017-01-01

    Shigellosis or bacillary dysentery is an important cause of diarrhea, with the majority of the cases occurring in developing countries. Considering the high disease burden, increasing antibiotic resistance, serotype-specific immunity and the post-infectious sequelae associated with shigellosis, there is a pressing need for an effective vaccine against multiple serotypes of the pathogen. In the present study, we used a bioinformatics approach to identify antigens shared among multiple serotypes of Shigella spp. This approach led to the identification of many immunogenic peptides. The five most promising peptides based on MHC binding efficiency were a putative lipoprotein (EL PGI I), a putative heat shock protein (EL PGI II), Spa32 (EL PGI III), IcsB (EL PGI IV) and a hypothetical protein (EL PGI V). These peptides were synthesized and their immunogenicity was evaluated in BALB/c mice by ELISA and cytokine assays. The putative heat shock protein (HSP) and the hypothetical protein elicited a good humoral response, whereas the putative lipoprotein, Spa32 and IcsB elicited a good T-cell response, as revealed by increased IFN-γ and TNF-α cytokine levels. Sera from patients with confirmed shigellosis were also evaluated for the presence of peptide-specific antibodies, showing significant IgG and IgA antibodies against the HSP and the hypothetical protein and marking these two antigens as potential future vaccine candidates. The antigens reported in this study are novel and have not been tested as vaccine candidates against Shigella. This study offers a time- and cost-effective way of identifying unprecedented immunogenic antigens to be used as potential vaccine candidates. Moreover, this approach should easily be extendable to find new potential vaccine candidates for other pathogenic bacteria.

  1. In silico analysis to identify vaccine candidates common to multiple serotypes of Shigella and evaluation of their immunogenicity.

    Science.gov (United States)

    Pahil, Sapna; Taneja, Neelam; Ansari, Hifzur Rahman; Raghava, G P S

    2017-01-01

    Shigellosis or bacillary dysentery is an important cause of diarrhea, with the majority of the cases occurring in developing countries. Considering the high disease burden, increasing antibiotic resistance, serotype-specific immunity and the post-infectious sequelae associated with shigellosis, there is a pressing need for an effective vaccine against multiple serotypes of the pathogen. In the present study, we used a bioinformatics approach to identify antigens shared among multiple serotypes of Shigella spp. This approach led to the identification of many immunogenic peptides. The five most promising peptides based on MHC binding efficiency were a putative lipoprotein (EL PGI I), a putative heat shock protein (EL PGI II), Spa32 (EL PGI III), IcsB (EL PGI IV) and a hypothetical protein (EL PGI V). These peptides were synthesized and their immunogenicity was evaluated in BALB/c mice by ELISA and cytokine assays. The putative heat shock protein (HSP) and the hypothetical protein elicited a good humoral response, whereas the putative lipoprotein, Spa32 and IcsB elicited a good T-cell response, as revealed by increased IFN-γ and TNF-α cytokine levels. Sera from patients with confirmed shigellosis were also evaluated for the presence of peptide-specific antibodies, showing significant IgG and IgA antibodies against the HSP and the hypothetical protein and marking these two antigens as potential future vaccine candidates. The antigens reported in this study are novel and have not been tested as vaccine candidates against Shigella. This study offers a time- and cost-effective way of identifying unprecedented immunogenic antigens to be used as potential vaccine candidates. Moreover, this approach should easily be extendable to find new potential vaccine candidates for other pathogenic bacteria.

  2. In silico analysis to identify vaccine candidates common to multiple serotypes of Shigella and evaluation of their immunogenicity.

    Directory of Open Access Journals (Sweden)

    Sapna Pahil

    Full Text Available Shigellosis or bacillary dysentery is an important cause of diarrhea, with the majority of the cases occurring in developing countries. Considering the high disease burden, increasing antibiotic resistance, serotype-specific immunity and the post-infectious sequelae associated with shigellosis, there is a pressing need for an effective vaccine against multiple serotypes of the pathogen. In the present study, we used a bioinformatics approach to identify antigens shared among multiple serotypes of Shigella spp. This approach led to the identification of many immunogenic peptides. The five most promising peptides based on MHC binding efficiency were a putative lipoprotein (EL PGI I), a putative heat shock protein (EL PGI II), Spa32 (EL PGI III), IcsB (EL PGI IV) and a hypothetical protein (EL PGI V). These peptides were synthesized and their immunogenicity was evaluated in BALB/c mice by ELISA and cytokine assays. The putative heat shock protein (HSP) and the hypothetical protein elicited a good humoral response, whereas the putative lipoprotein, Spa32 and IcsB elicited a good T-cell response, as revealed by increased IFN-γ and TNF-α cytokine levels. Sera from patients with confirmed shigellosis were also evaluated for the presence of peptide-specific antibodies, showing significant IgG and IgA antibodies against the HSP and the hypothetical protein and marking these two antigens as potential future vaccine candidates. The antigens reported in this study are novel and have not been tested as vaccine candidates against Shigella. This study offers a time- and cost-effective way of identifying unprecedented immunogenic antigens to be used as potential vaccine candidates. Moreover, this approach should easily be extendable to find new potential vaccine candidates for other pathogenic bacteria.

  3. Mechanisms of change in psychotherapy for depression: An empirical update and evaluation of research aimed at identifying psychological mediators.

    Science.gov (United States)

    Lemmens, Lotte H J M; Müller, Viola N L S; Arntz, Arnoud; Huibers, Marcus J H

    2016-12-01

    We present a systematic empirical update and critical evaluation of the current status of research aimed at identifying a variety of psychological mediators in various forms of psychotherapy for depression. We summarize the study characteristics and results of 35 relevant studies, and discuss the extent to which these studies meet several important requirements for mechanism research. Our review indicates that in spite of increased attention to the topic, advances in theoretical consensus about the necessities for mechanism research, and the growing sophistication of study designs, research in this field is still heterogeneous and methodologically unsatisfactory. Probably the biggest challenge in the field is demonstrating the causal relation between change in the mediator and change in depressive symptoms. The field would benefit from a further refinement of research methods to identify processes of therapeutic change. Recommendations for future research are discussed. However, even with the most optimal research designs, explaining psychotherapeutic change remains a challenge. Psychotherapy is a multi-dimensional phenomenon that might work through the interplay of multiple mechanisms at several levels. As a result, it might be too complex to be explained by relatively simple causal models of psychological change. Copyright © 2016 Elsevier Ltd. All rights reserved.

  4. Design and Evaluation of Educational Socio-environmental Games to Identify Attitudes, Motivations and Decisions of Smallholder Contemporary Rural Youth

    Directory of Open Access Journals (Sweden)

    Amayrani Meza-Jiménez

    2016-05-01

    Full Text Available The current and potential relationship of contemporary rural youth with the agricultural and natural patrimony (PAN, according to its Spanish initials) that they will inherit is little known, but vitally important. In this study, we designed, adapted, and evaluated a variety of socio-environmental learning tools in order to identify and reflect on the opinions, actions, and motivations of 14- to 17-year-olds in an area of the Sepultura Biosphere Reserve in Chiapas, Mexico, regarding the future use of their PAN. The methodological approach consisted of exploring discourses using the Q method and three original table games (Mi territorio ideal, El carga palito and Manantiales de la Sierra). Forty-six teens were shown how to use these four tools, their use was monitored in workshops, and the results were recorded and statistically analyzed. These tools allowed us (a) to identify at least four discourses of the teens regarding the use of their PAN, and (b) to reveal to the teens their preferences for land use, their levels of diversification and intensification, and their disposition toward behaviors of dominance/subordination, competition, cooperation, coordination, equity, and solidarity that emerge from their decision making regarding the PAN. Participants said they understood and enjoyed these tools, and that they learned about their own motivations. Together, these materials constitute a dynamic educational approach that allows teachers and students to identify external and internal motivations, conservation behavior, intensification and diversification in managing the PAN, attitudes of dominance and equity among teens, and preferences for individual or collective work. This proposal is innovative, participatory, dynamic, and contextualized, and has great potential to be incorporated into the middle school curriculum in the study area and in similar rural regions of Mexico, as well as in the rest of Latin America and the world.

  5. Identifying Armed Respondents to Domestic Violence Restraining Orders and Recovering Their Firearms: Process Evaluation of an Initiative in California

    Science.gov (United States)

    Frattaroli, Shannon; Claire, Barbara E.; Vittes, Katherine A.; Webster, Daniel W.

    2014-01-01

    Objectives. We evaluated a law enforcement initiative to screen respondents to domestic violence restraining orders for firearm ownership or possession and recover their firearms. Methods. The initiative was implemented in San Mateo and Butte counties in California from 2007 through 2010. We used descriptive methods to evaluate the screening process and recovery effort in each county, relying on records for individual cases. Results. Screening relied on an archive of firearm transactions, court records, and petitioner interviews; no single source was adequate. Screening linked 525 respondents (17.7%) in San Mateo County to firearms; 405 firearms were recovered from 119 (22.7%) of them. In Butte County, 88 (31.1%) respondents were linked to firearms; 260 firearms were recovered from 45 (51.1%) of them. Nonrecovery occurred most often when orders were never served or respondents denied having firearms. There were no reports of serious violence or injury. Conclusions. Recovering firearms from persons subject to domestic violence restraining orders is possible. We have identified design and implementation changes that may improve the screening process and the yield from recovery efforts. Larger implementation trials are needed. PMID:24328660

  6. Evaluation of bentonite alteration due to interactions with iron. Sensitivity analyses to identify the important factors for the bentonite alteration

    International Nuclear Information System (INIS)

    Sasamoto, Hiroshi; Wilson, James; Sato, Tsutomu

    2013-01-01

    Performance assessment of geological disposal systems for high-level radioactive waste requires a consideration of long-term systems behaviour. It is possible that the alteration of swelling clay present in bentonite buffers might have an impact on buffer functions. In the present study, iron (as a candidate overpack material)-bentonite (I-B) interactions were evaluated as the main buffer alteration scenario. Existing knowledge on alteration of bentonite during I-B interactions was first reviewed, then the evaluation methodology was developed considering modeling techniques previously used overseas. A conceptual model for smectite alteration during I-B interactions was produced. The following reactions and processes were selected: 1) release of Fe²⁺ due to overpack corrosion; 2) diffusion of Fe²⁺ in compacted bentonite; 3) sorption of Fe²⁺ on smectite edges and ion exchange in interlayers; 4) dissolution of primary phases and formation of alteration products. Sensitivity analyses were performed to identify the most important factors for the alteration of bentonite by I-B interactions. (author)

  7. Association between cotinine-verified smoking status and hypertension in 167,868 Korean adults.

    Science.gov (United States)

    Kim, Byung Jin; Han, Ji Min; Kang, Jung Gyu; Kim, Bum Soo; Kang, Jin Ho

    2017-10-01

    Previous studies showed inconsistent results concerning the relationship between chronic smoking and blood pressure. Most of the studies involved self-reported smoking status. This study was performed to evaluate the association of urinary cotinine or self-reported smoking status with hypertension and blood pressure in Korean adults. Among individuals enrolled in the Kangbuk Samsung Health Study and Kangbuk Samsung Cohort Study, 167,868 participants (men, 55.7%; age, 37.5 ± 6.9 years) between 2011 and 2013 who had urinary cotinine measurements were included. Individuals with urinary cotinine levels ≥50 ng/mL were defined as cotinine-verified current smokers. The prevalence of hypertension and of cotinine-verified current smoking in the overall population was 6.8% and 22.7%, respectively (10.0% in men and 2.8% in women for hypertension; 37.7% in men and 3.9% in women for cotinine-verified current smoking). In a multivariate regression analysis adjusted for age, sex, body mass index, waist circumference, alcohol drinking, vigorous exercise, and diabetes, cotinine-verified current smoking was associated with a lower prevalence of hypertension compared with cotinine-verified never smoking (OR [95% CI], 0.79 [0.75, 0.84]). Log-transformed cotinine levels and unobserved smoking were each negatively associated with hypertension (0.96 [0.96, 0.97] and 0.55 [0.39, 0.79], respectively). In a multivariate linear regression analysis, cotinine-verified current smoking was inversely associated with systolic and diastolic blood pressure (BP) (regression coefficient [95% CI], -1.23 [-1.39, -1.07] for systolic BP and -0.71 [-0.84, -0.58] for diastolic BP). In subgroup analyses according to sex, the inverse associations between cotinine-verified current smoking and hypertension were observed only in men. This large observational study showed that cotinine-verified current smoking and unobserved smoking were inversely associated with hypertension in Korean adults, especially in men.

  8. From Operating-System Correctness to Pervasively Verified Applications

    Science.gov (United States)

    Daum, Matthias; Schirmer, Norbert W.; Schmidt, Mareike

    Though program verification is known and has been used for decades, the verification of a complete computer system still remains a grand challenge. Part of this challenge is the interaction of application programs with the operating system, which is usually entrusted with retrieving input data from and transferring output data to peripheral devices. In this scenario, the correct operation of the applications inherently relies on operating-system correctness. Based on the formal correctness of our real-time operating system Olos, this paper describes an approach to pervasively verify applications running on top of the operating system.

  9. Psychological and social aspects verified after the Goiania's radioactive accident

    International Nuclear Information System (INIS)

    Helou, Suzana

    1995-01-01

    Psychological and social aspects verified after the radioactive accident that occurred in 1987 in Goiania, a Brazilian city, are discussed. To this end, a public opinion survey was carried out in order to portray the residual psychological effects of the Goiania radioactive accident. Data obtained in 1,126 interviews were consolidated. Four groups with different levels of involvement in the accident are compared with regard to the event. The survey allowed the conclusion that the accident affected the whole population of Goiania psychologically in some way. In addition, the survey allowed an analysis of the quality standard of the professionals' performance in dealing with the accident

  10. Flux wire measurements in Cavalier for verifying computer code applications

    International Nuclear Information System (INIS)

    Fehr, M.; Stubbs, J.; Hosticka, B.

    1988-01-01

    The Cavalier and UVAR research reactors are to be converted from high-enrichment uranium (HEU) to low-enrichment uranium (LEU) fuel. As a first step, an extensive set of gold wire activation measurements has been taken on the Cavalier reactor. Axial traverses show internal consistency to the order of ±5%, while horizontal traverses show somewhat larger deviations. The activation measurements will be converted to flux measurements via the Thermos code and will then be used to verify the Leopard-2DB codes. The codes will ultimately be used to design an upgraded LEU core for the UVAR

  11. Verifying Galileo's discoveries: telescope-making at the Collegio Romano

    Science.gov (United States)

    Reeves, Eileen; van Helden, Albert

    The Jesuits of the Collegio Romano in Rome, especially the mathematicians Clavius and Grienberger, were very interested in Galilei's discoveries. After they had failed to observe the celestial phenomena with telescopes of their own construction, they expressed serious doubts. But from November 1610 onward, after they had built a better telescope and had also obtained another one from Venice, and could verify Galilei's observations, they accepted them completely. Clavius, who adhered to the Ptolemaic system until his death in 1612, even pointed out these facts in his last edition of Sacrobosco's Sphaera. He as well as his confreres, however, avoided any conclusions with respect to the planetary system.

  12. ASTUS system for verifying the transport seal TITUS 1

    International Nuclear Information System (INIS)

    Barillaux; Monteil, D.; Destain, G.D.

    1991-01-01

    ASTUS, a system for acquiring and processing the ultrasonic signatures of TITUS 1 seals, has been developed. TITUS seals are used to verify, after transport, the integrity of the sealing of a fissile material container. An autonomous portable reading case makes it possible to take seal signatures at the point of departure and to transmit these reference signatures to a central safeguards computer over a telephone modem. At the destination, using a similar reading case, an authority takes the seal signatures again and immediately transmits them to the central safeguards computer. The central computer processes the data in real time by autocorrelation and returns its verdict to the destination

  13. Verifying real-time systems against scenario-based requirements

    DEFF Research Database (Denmark)

    Larsen, Kim Guldstrand; Li, Shuhao; Nielsen, Brian

    2009-01-01

    We propose an approach to automatic verification of real-time systems against scenario-based requirements. A real-time system is modeled as a network of Timed Automata (TA), and a scenario-based requirement is specified as a Live Sequence Chart (LSC). We define a trace-based semantics for a kernel subset of the LSC language. By equivalently translating an LSC chart into an observer TA and then non-intrusively composing this observer with the original system model, the problem of verifying a real-time system against a scenario-based requirement reduces to a classical real-time model checking problem.

  14. Spin temperature concept verified by optical magnetometry of nuclear spins

    Science.gov (United States)

    Vladimirova, M.; Cronenberger, S.; Scalbert, D.; Ryzhov, I. I.; Zapasskii, V. S.; Kozlov, G. G.; Lemaître, A.; Kavokin, K. V.

    2018-01-01

    We develop a method of nonperturbative optical control over adiabatic remagnetization of the nuclear spin system and apply it to verify the spin temperature concept in GaAs microcavities. The nuclear spin system is shown to exactly follow the predictions of the spin temperature theory, despite the quadrupole interaction that was earlier reported to disrupt nuclear spin thermalization. These findings open a way for the deep cooling of nuclear spins in semiconductor structures, with the prospect of realizing nuclear spin-ordered states for high-fidelity spin-photon interfaces.

  15. Identifying MRI markers to evaluate early treatment-related changes post-laser ablation for cancer pain management

    Science.gov (United States)

    Tiwari, Pallavi; Danish, Shabbar; Madabhushi, Anant

    2014-03-01

    by correcting for intensity drift in order to examine tissue-specific response, and (3) quantification of MRI maps via texture and intensity features to evaluate changes in MR markers pre- and post-LITT. A total of 78 texture features comprising of non-steerable and steerable gradient and second order statistical features were extracted from pre- and post-LITT MP-MRI on a per-voxel basis. Quantitative, voxel-wise comparison of the changes in MRI texture features between pre-, and post-LITT MRI indicate that (a) steerable and non-steerable gradient texture features were highly sensitive as well as specific in predicting subtle micro-architectural changes within and around the ablation zone pre- and post-LITT, (b) FLAIR was identified as the most sensitive MRI protocol in identifying early treatment changes yielding a normalized percentage change of 360% within the ablation zone relative to its pre-LITT value, and (c) GRE was identified as the most sensitive MRI protocol in quantifying changes outside the ablation zone post-LITT. Our preliminary results thus indicate great potential for non-invasive computerized MRI features in determining localized micro-architectural focal treatment related changes post-LITT.

  16. Evaluating genome-wide association study-identified breast cancer risk variants in African-American women.

    Directory of Open Access Journals (Sweden)

    Jirong Long

    Full Text Available Genome-wide association studies (GWAS), conducted mostly in European or Asian descendants, have identified approximately 67 genetic susceptibility loci for breast cancer. Given the large differences in genetic architecture between the African-ancestry genome and genomes of Asians and Europeans, it is important to investigate these loci in African-ancestry populations. We evaluated index SNPs in all 67 breast cancer susceptibility loci identified to date in our study including up to 3,300 African-American women (1,231 cases and 2,069 controls), recruited in the Southern Community Cohort Study (SCCS) and the Nashville Breast Health Study (NBHS). Seven SNPs were statistically significant (P ≤ 0.05) with the risk of overall breast cancer in the same direction as previously reported: rs10069690 (5p15/TERT), rs999737 (14q24/RAD51L1), rs13387042 (2q35/TNP1), rs1219648 (10q26/FGFR2), rs8170 (19p13/BABAM1), rs17817449 (16q12/FTO), and rs13329835 (16q23/DYL2). A marginally significant association (P < 0.10) was found for three additional SNPs: rs1045485 (2q33/CASP8), rs4849887 (2q14/INHBB), and rs4808801 (19p13/ELL). Three additional SNPs, including rs1011970 (9p21/CDKN2A/2B), rs941764 (14q32/CCDC88C), and rs17529111 (6q14/FAM46A), showed a significant association in analyses conducted by breast cancer subtype. The risk of breast cancer was elevated with an increasing number of risk variants, as measured by quintile of the genetic risk score, from 1.00 (reference) to 1.75 (1.30-2.37), 1.56 (1.15-2.11), 2.02 (1.50-2.74) and 2.63 (1.96-3.52), respectively (P = 7.8 × 10⁻¹⁰). Results from this study highlight the need for large genetic studies in AAs to identify risk variants impacting this population.

  17. People consider reliability and cost when verifying their autobiographical memories.

    Science.gov (United States)

    Wade, Kimberley A; Nash, Robert A; Garry, Maryanne

    2014-02-01

    Because memories are not always accurate, people rely on a variety of strategies to verify whether the events that they remember really did occur. Several studies have examined which strategies people tend to use, but none to date has asked why people opt for certain strategies over others. Here we examined the extent to which people's beliefs about the reliability and the cost of different strategies would determine their strategy selection. Subjects described a childhood memory and then suggested strategies they might use to verify the accuracy of that memory. Next, they rated the reliability and cost of each strategy, and the likelihood that they might use it. Reliability and cost each predicted strategy selection, but a combination of the two ratings provided even greater predictive value. Cost was significantly more influential than reliability, which suggests that a tendency to seek and to value "cheap" information more than reliable information could underlie many real-world memory errors. Copyright © 2013 Elsevier B.V. All rights reserved.

  18. A record and verify system for radiotherapy treatment

    International Nuclear Information System (INIS)

    Koens, M.L.; Vroome, H. de

    1984-01-01

    The Record and Verify system developed for the radiotherapy department of the Leiden University Hospital is described. The system has been in use since 1980 and will now be installed in at least four of the Dutch University Hospitals. The system provides the radiographer with a powerful tool for checking the set-up of the linear accelerator preceding the irradiation of a field. After the irradiation of a field the machine settings are registered in the computer system together with the newly calculated cumulative dose. These registrations are used by the system to produce a daily report which provides the management of the department with insight into the established differences between treatment and treatment planning. Buying a record and verify system from the manufacturer of the linear accelerator is not an optimal solution, especially for a department with more than one accelerator from different manufacturers. Integration in a Hospital Information System (HIS) has important advantages over the development of a dedicated departmental system. (author)

  19. Characterizing Verified Head Impacts in High School Girls' Lacrosse.

    Science.gov (United States)

    Caswell, Shane V; Lincoln, Andrew E; Stone, Hannah; Kelshaw, Patricia; Putukian, Margot; Hepburn, Lisa; Higgins, Michael; Cortes, Nelson

    2017-12-01

    Girls' high school lacrosse players have higher rates of head and facial injuries than boys. Research indicates that these injuries are caused by stick, player, and ball contacts. Yet, no studies have characterized head impacts in girls' high school lacrosse. To characterize girls' high school lacrosse game-related impacts by frequency, magnitude, mechanism, player position, and game situation. Descriptive epidemiology study. Thirty-five female participants (mean age, 16.2 ± 1.2 years; mean height, 1.66 ± 0.05 m; mean weight, 61.2 ± 6.4 kg) volunteered during 28 games in the 2014 and 2015 lacrosse seasons. Participants wore impact sensors affixed to the right mastoid process before each game. All game-related impacts recorded by the sensors were verified using game video. Data were summarized for all verified impacts in terms of frequency, peak linear acceleration (PLA), and peak rotational acceleration (PRA). Descriptive statistics and impact rates were calculated. Fifty-eight verified game-related impacts ≥20 g were recorded (median PLA, 33.8 g; median PRA, 6151.1 rad/s²) during 467 player-games. The impact rate for all game-related verified impacts was 0.12 per athlete-exposure (AE) (95% CI, 0.09-0.16), equivalent to 2.1 impacts per team game, indicating that each athlete suffered fewer than 2 head impacts per season ≥20 g. Of these impacts, 28 (48.3%) were confirmed to directly strike the head, corresponding with an impact rate of 0.05 per AE (95% CI, 0.00-0.10). Overall, midfielders (n = 28, 48.3%) sustained the most impacts, followed by defenders (n = 12, 20.7%), attackers (n = 11, 19.0%), and goalies (n = 7, 12.1%). Goalies demonstrated the highest median PLA and PRA (38.8 g and 8535.0 rad/s², respectively). The most common impact mechanisms were contact with a stick (n = 25, 43.1%) and a player (n = 17, 29.3%), followed by the ball (n = 7, 12.1%) and the ground (n = 7, 12.1%). One hundred percent of ball impacts occurred to goalies. Most impacts
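
    For readers who want to reproduce the headline figure, the sketch below re-derives the reported impact rate and its 95% confidence interval from the counts in the abstract, treating impact counts as Poisson; this is my own back-calculation, not the authors' analysis code.

```python
# Re-derivation of the reported impact rate and its 95% CI from the abstract's
# counts, under a simple Poisson assumption; not the authors' own computation.
import math

impacts = 58          # verified game-related impacts >= 20 g
exposures = 467       # player-games (athlete-exposures)

rate = impacts / exposures
se = math.sqrt(impacts) / exposures          # Poisson standard error of the rate
lo, hi = rate - 1.96 * se, rate + 1.96 * se

print(f"rate = {rate:.2f} per AE, 95% CI ({lo:.2f}, {hi:.2f})")
# -> rate = 0.12 per AE, 95% CI (0.09, 0.16), matching the abstract approximately
```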

  20. Taking a "Snapshot": Evaluation of a Conversation Aid for Identifying Psychosocial Needs in Young Adults with Cancer.

    Science.gov (United States)

    Poort, Hanneke; Souza, Phoebe M; Malinowski, Paige K; MacDougall, Katelyn M; Barysauskas, Constance M; Lau Greenberg, Teresa; Tulsky, James A; Fasciano, Karen M

    2018-05-21

    Young adults (YAs) aged 18-35 years with cancer often experience unmet psychosocial needs. We aimed to evaluate a conversation aid ("Snapshot") that offered a framework for discussing YA-specific psychosocial concerns between patients and clinicians. We developed and implemented Snapshot between 2014 and 2016 as part of a quality improvement initiative at Dana-Farber Cancer Institute. We extracted pre- and postimplementation data from chart documentation of psychosocial concerns. YAs and social workers provided qualitative feedback on the use of Snapshot in clinical care. Postintervention chart reviews revealed a significant increase in the median number of topics documented in charts after implementation of Snapshot (preintervention median = 9 [range: 1-15] vs. postintervention median = 11 [range 6-15]; p = 0.003). Overall, YAs and social workers reported that using Snapshot improved communication and consistency of psychosocial care, with documented improvement in the following domains: understanding illness (p psychosocial needs assessment among YAs with cancer. Implementation was successful in reducing variability identified in the preintervention cohort and increasing the number of YA-specific psychosocial topics discussed. A standardized conversation aid has the potential to improve quality of care for YAs by enabling early identification and intervention of psychosocial issues for all patients.

  1. An Approach to Evaluate Comprehensive Plan and Identify Priority Lands for Future Land Use Development to Conserve More Ecological Values

    Directory of Open Access Journals (Sweden)

    Long Zhou

    2018-01-01

    Full Text Available Urbanization has significant impacts on the regional environmental quality through altering natural lands, converting them to urban built-up areas. One common strategy applied by urban planners to manage urbanization and preserve natural resources is to make a comprehensive plan and concentrate future land use in certain areas. However, in practice, planners used to make future land use planning mainly based on their subjective interpretations with limited ecological supporting evidence and analysis. Here, we propose a new approach composed of ecological modelling and land use zoning in the spatial matrix to evaluate the comprehensive plan and identify priority lands for sustainable land use planning. We use the city of Corvallis, OR, as the test bed to demonstrate this new approach. The results indicate that the Corvallis Comprehensive Plan 1998–2020 featured with compact development is not performing efficiently in conserving ecological values, and the land use plan featured with mixed-use spreading development generated by the proposed approach meets the city’s land demands for urban growth, and conserves 103% more ecological value of retaining storm water nitrogen, 270% more ecological value of retaining storm water phosphorus and 19% more ecological value in storing carbon in the whole watershed. This study indicates that if planned with scientific analysis and evidence, spreading urban development does not necessarily result in less sustainable urban environment than the compact development recommended in smart growth.

  2. Evaluation of validity of Integrated Management of Childhood Illness guidelines in identifying edema of nutritional causes among Egyptian children.

    Science.gov (United States)

    El Habashy, Safinaz A; Mohamed, Maha H; Amin, Dina A; Marzouk, Diaa; Farid, Mohammed N

    2015-12-01

    The aim of this study was to assess the validity of the Integrated Management of Childhood Illness (IMCI) algorithm to detect edematous type of malnutrition in Egyptian infants and children ranging in age from 2 months to 5 years. This study was carried out by surveying 23 082 children aged between 2 months and 5 years visiting the pediatric outpatient clinic, Ain Shams University Hospital, over a period of 6 months. Thirty-eight patients with edema of both feet on their primary visit were enrolled in the study. Every child was assessed using the IMCI algorithm 'assess and classify' by the same physician, together with a systematic clinical evaluation with all relevant investigations. Twenty-two patients (57.9%) were proven to have nutritional etiology. 'Weight for age' sign had a sensitivity of 95.5%, a specificity of 56%, and a diagnostic accuracy of 78.95% in the identification of nutritional edema among all cases of bipedal edema. Combinations of IMCI symptoms 'pallor, visible severe wasting, fever, diarrhea', and 'weight for age' increased the sensitivity to 100%, but with a low specificity of 38% and a diagnostic accuracy of 73.68%. Bipedal edema and low weight for age as part of the IMCI algorithm can identify edema because of nutritional etiology with 100% sensitivity, but with 37% specificity. Revisions need to be made to the IMCI guidelines published in 2010 by the Egyptian Ministry of Health in the light of the new WHO guidelines of 2014.
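
    The diagnostic indices quoted for the 'weight for age' sign can be reproduced from a 2x2 table, as in the sketch below; the counts are reconstructed from the reported percentages (22 nutritional and 16 non-nutritional cases among the 38 children with bipedal edema) and are therefore an assumption rather than published raw data.

```python
# Reconstructed 2x2 table for the "weight for age" sign; counts are inferred
# from the percentages reported in the abstract, not published raw data.
tp, fn = 21, 1     # nutritional edema correctly flagged / missed by low weight for age
tn, fp = 9, 7      # non-nutritional edema correctly ruled out / incorrectly flagged

sensitivity = tp / (tp + fn)                     # ~0.955
specificity = tn / (tn + fp)                     # ~0.563
accuracy = (tp + tn) / (tp + fn + tn + fp)       # ~0.789

print(f"sensitivity={sensitivity:.1%}, specificity={specificity:.1%}, accuracy={accuracy:.1%}")
```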

  3. The usefulness and feasibility of a screening instrument to identify psychosocial problems in patients receiving curative radiotherapy: a process evaluation

    International Nuclear Information System (INIS)

    Braeken, Anna PBM; Kempen, Gertrudis IJM; Eekers, Daniëlle; Gils, Francis CJM van; Houben, Ruud MA; Lechner, Lilian

    2011-01-01

    Psychosocial problems in cancer patients are often unrecognized and untreated due to the low awareness of the existence of these problems or pressures of time. The awareness of the need to identify psychosocial problems in cancer patients is growing and has affected the development of screening instruments. This study explored the usefulness and feasibility of using a screening instrument (SIPP: Screening Inventory of Psychosocial Problems) to identify psychosocial problems in cancer patients receiving curative radiotherapy treatment (RT). The study was conducted in a radiation oncology department in the Netherlands. Several methods were used to document the usefulness and feasibility of the SIPP. Data were collected using self-report questionnaires completed by seven radiotherapists and 268 cancer patients. Regarding the screening procedure 33 patients were offered to consult a psychosocial care provider (e.g. social worker, psychologist) during the first consultation with their radiotherapist. Of these patients, 31 patients suffered from at least sub-clinical symptoms and two patients hardly suffered from any symptoms. Patients' acceptance rate 63.6% (21/33) was high. Patients were positive about the content of the SIPP (mean scores vary from 8.00 to 8.88, out of a range between 0 and 10) and about the importance of discussing items of the SIPP with their radiotherapist (mean score = 7.42). Radiotherapists' perspectives about the contribution of the SIPP to discuss the different psychosocial problems were mixed (mean scores varied from 3.17 to 4.67). Patients were more positive about discussing items of the SIPP if the radiotherapists had positive attitudes towards screening and discussing psychosocial problems. The screening procedure appeared to be feasible in a radiotherapy department. In general, patients' perspectives were at least moderate. Radiotherapists considered the usefulness and feasibility of the SIPP generally to be lower, but their

  4. Calling Out Cheaters: Covert Security with Public Verifiability

    DEFF Research Database (Denmark)

    Asharov, Gilad; Orlandi, Claudio

    2012-01-01

    We introduce the notion of covert security with public verifiability, building on the covert security model introduced by Aumann and Lindell (TCC 2007). Protocols that satisfy covert security guarantee that the honest parties involved in the protocol will notice any cheating attempt with some constant probability ε. The idea behind the model is that the fear of being caught cheating will be enough of a deterrent to prevent any cheating attempt. However, in the basic covert security model, the honest parties are not able to persuade any third party (say, a judge) that a cheating occurred. We propose (and formally define) an extension of the model where, when an honest party detects cheating, it also receives a certificate that can be published and used to persuade other parties, without revealing any information about the honest party's input. In addition, malicious parties cannot create fake certificates.

  5. Developing a flexible and verifiable integrated dose assessment capability

    International Nuclear Information System (INIS)

    Parzyck, D.C.; Rhea, T.A.; Copenhaver, E.D.; Bogard, J.S.

    1987-01-01

    A flexible yet verifiable system of computing and recording personnel doses is needed. Recent directions in statutes establish the trend of combining internal and external doses. We are developing a Health Physics Information Management System (HPIMS) that will centralize dosimetry calculations and data storage; integrate health physics records with other health-related disciplines, such as industrial hygiene, medicine, and safety; provide a more auditable system with published algorithms and clearly defined flowcharts of system operation; readily facilitate future changes dictated by new regulations, new dosimetric models, and new systems of units; and address ad-hoc inquiries regarding worker/workplace interactions, including potential synergisms with non-radiation exposures. The system is modular and provides a high degree of isolation from low-level detail, allowing flexibility for changes without adversely affecting other parts of the system. 10 refs., 3 figs

  6. A Formally Verified Conflict Detection Algorithm for Polynomial Trajectories

    Science.gov (United States)

    Narkawicz, Anthony; Munoz, Cesar

    2015-01-01

    In air traffic management, conflict detection algorithms are used to determine whether or not aircraft are predicted to lose horizontal and vertical separation minima within a time interval assuming a trajectory model. In the case of linear trajectories, conflict detection algorithms have been proposed that are both sound, i.e., they detect all conflicts, and complete, i.e., they do not present false alarms. In general, for arbitrary nonlinear trajectory models, it is possible to define detection algorithms that are either sound or complete, but not both. This paper considers the case of nonlinear aircraft trajectory models based on polynomial functions. In particular, it proposes a conflict detection algorithm that precisely determines whether, given a lookahead time, two aircraft flying polynomial trajectories are in conflict. That is, it has been formally verified that, assuming that the aircraft trajectories are modeled as polynomial functions, the proposed algorithm is both sound and complete.
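
    The decision the algorithm makes can be illustrated with a deliberately naive numerical sketch: sample the lookahead interval and test whether the polynomial relative trajectory violates both separation minima at any sampled time. Unlike the paper's formally verified algorithm, sampling is neither sound nor complete; the separation minima, units and example coefficients below are assumptions for illustration only.

```python
# Naive numerical sketch of conflict detection for polynomial relative
# trajectories. The formally verified algorithm described above decides this
# exactly; sampling, by contrast, is neither sound nor complete.
import numpy as np

def in_conflict(px, py, pz, lookahead, D=5.0, H=1000 / 6076.0, samples=10_000):
    """px, py, pz: polynomial coefficients (lowest degree first) of the relative
    position, in nmi; D and H are horizontal/vertical separation minima
    (values assumed for the example); lookahead is in seconds."""
    t = np.linspace(0.0, lookahead, samples)
    x = np.polynomial.polynomial.polyval(t, px)
    y = np.polynomial.polynomial.polyval(t, py)
    z = np.polynomial.polynomial.polyval(t, pz)
    horizontal_loss = x**2 + y**2 < D**2
    vertical_loss = np.abs(z) < H
    return bool(np.any(horizontal_loss & vertical_loss))

# Example: relative motion that closes horizontally while staying co-altitude.
print(in_conflict(px=[10.0, -0.05], py=[0.0], pz=[0.0], lookahead=300.0))  # True
```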

  7. Leveraging Parallel Data Processing Frameworks with Verified Lifting

    Directory of Open Access Journals (Sweden)

    Maaz Bin Safeer Ahmad

    2016-11-01

    Full Text Available Many parallel data frameworks have been proposed in recent years that let sequential programs access parallel processing. To capitalize on the benefits of such frameworks, existing code must often be rewritten to the domain-specific languages that each framework supports. This rewriting–tedious and error-prone–also requires developers to choose the framework that best optimizes performance given a specific workload. This paper describes Casper, a novel compiler that automatically retargets sequential Java code for execution on Hadoop, a parallel data processing framework that implements the MapReduce paradigm. Given a sequential code fragment, Casper uses verified lifting to infer a high-level summary expressed in our program specification language that is then compiled for execution on Hadoop. We demonstrate that Casper automatically translates Java benchmarks into Hadoop. The translated results execute on average 3.3x faster than the sequential implementations and scale better, as well, to larger datasets.
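
    The kind of rewrite that verified lifting automates can be illustrated with a toy example, shown below in Python rather than Java and with no claim of matching Casper's specification language: a sequential aggregation loop and the equivalent map/reduce decomposition that a MapReduce framework such as Hadoop could execute in parallel.

```python
# Illustration (not Casper itself) of lifting a sequential aggregation loop
# into a map step plus an associative reduce step.
from functools import reduce

data = [3, 1, 4, 1, 5, 9, 2, 6]

# Sequential version: a loop that folds the input into one result.
total = 0
for x in data:
    total += x * x

# "Lifted" version: the high-level summary "sum of x*x over the input",
# expressed as a map step and an associative reduce step.
mapped = map(lambda x: x * x, data)
total_mr = reduce(lambda a, b: a + b, mapped, 0)

assert total == total_mr
print(total_mr)   # 173
```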

  8. Verifying atom entanglement schemes by testing Bell's inequality

    International Nuclear Information System (INIS)

    Angelakis, D.G.; Knight, P.L.; Tregenna, B.; Munro, W.J.

    2001-01-01

    Recent experiments to test Bell's inequality using entangled photons and ions aimed at tests of basic quantum mechanical principles. Interesting results have been obtained and many loopholes could be closed. In this paper we want to point out that tests of Bell's inequality also play an important role in verifying atom entanglement schemes. We describe as an example a scheme to prepare arbitrary entangled states of N two-level atoms using a leaky optical cavity and a scheme to entangle atoms inside a photonic crystal. During the state preparation no photons are emitted, and observing a violation of Bell's inequality is the only way to test whether a scheme works with a high precision or not. (orig.)
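
    A concrete version of such a test is the CHSH inequality. The sketch below numerically computes the CHSH value for a maximally entangled two-qubit state at standard measurement angles (the state and the angles are textbook assumptions, not details of the atom entanglement schemes discussed) and confirms the quantum value of about 2.83 against the classical bound of 2.

```python
# Numerical CHSH check for the |Phi+> state; angles are the standard optimal choice.
import numpy as np

Z = np.array([[1, 0], [0, -1]], dtype=complex)
X = np.array([[0, 1], [1, 0]], dtype=complex)

def A(theta):
    """Spin observable measured along angle theta in the X-Z plane."""
    return np.cos(theta) * Z + np.sin(theta) * X

phi_plus = np.array([1, 0, 0, 1], dtype=complex) / np.sqrt(2)   # (|00>+|11>)/sqrt(2)

def E(a, b):
    """Correlation <A(a) (x) A(b)> in the state |Phi+>."""
    return np.real(phi_plus.conj() @ np.kron(A(a), A(b)) @ phi_plus)

a, a2, b, b2 = 0.0, np.pi / 2, np.pi / 4, 3 * np.pi / 4
S = E(a, b) - E(a, b2) + E(a2, b) + E(a2, b2)
print(S)   # ~2.828 > 2, violating the classical CHSH bound
```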

  9. Noninteractive Verifiable Outsourcing Algorithm for Bilinear Pairing with Improved Checkability

    Directory of Open Access Journals (Sweden)

    Yanli Ren

    2017-01-01

    Full Text Available It is well known that the computation of bilinear pairing is the most expensive operation in pairing-based cryptography. In this paper, we propose a noninteractive verifiable outsourcing algorithm of bilinear pairing based on two servers in the one-malicious model. The outsourcer need not execute any expensive operation, such as scalar multiplication and modular exponentiation. Moreover, the outsourcer could detect any failure with a probability close to 1 if one of the servers misbehaves. Therefore, the proposed algorithm improves checkability and decreases communication cost compared with the previous ones. Finally, we utilize the proposed algorithm as a subroutine to achieve an anonymous identity-based encryption (AIBE) scheme with outsourced decryption and an identity-based signature (IBS) scheme with outsourced verification.

  10. Modelling and Verifying Communication Failure of Hybrid Systems in HCSP

    DEFF Research Database (Denmark)

    Wang, Shuling; Nielson, Flemming; Nielson, Hanne Riis

    2016-01-01

    Hybrid systems are dynamic systems with interacting discrete computation and continuous physical processes. They have become ubiquitous in our daily life, e.g. automotive, aerospace and medical systems, and in particular, many of them are safety-critical. For a safety-critical hybrid system, in the presence of communication failure, the expected control from the controller will get lost and as a consequence the physical process cannot behave as expected. In this paper, we mainly consider the communication failure caused by the non-engagement of one party in a communication action, i.e. the communication itself fails to occur. To address this issue, this paper proposes a formal framework by extending HCSP, a formal modeling language for hybrid systems, for modeling and verifying hybrid systems in the absence of receiving messages due to communication failure. We present two inference systems...

  11. A detailed and verified wind resource atlas for Denmark

    Energy Technology Data Exchange (ETDEWEB)

    Mortensen, N G; Landberg, L; Rathmann, O; Nielsen, M N [Risoe National Lab., Roskilde (Denmark); Nielsen, P [Energy and Environmental Data, Aalberg (Denmark)

    1999-03-01

    A detailed and reliable wind resource atlas covering the entire land area of Denmark has been established. Key words of the methodology are wind atlas analysis, interpolation of wind atlas data sets, automated generation of digital terrain descriptions and modelling of local wind climates. The atlas contains wind speed and direction distributions, as well as mean energy densities of the wind, for 12 sectors and four heights above ground level: 25, 45, 70 and 100 m. The spatial resolution is 200 meters in the horizontal. The atlas has been verified by comparison with actual wind turbine power productions from over 1200 turbines. More than 80% of these turbines were predicted to within 10%. The atlas will become available on CD-ROM and on the Internet. (au)
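
    To give a sense of what a mean energy density of the wind entails, the sketch below evaluates the mean power density for a Weibull wind-speed distribution, a standard wind-resource relation; the scale and shape parameters in the example are assumed values and are not taken from the Danish atlas.

```python
# Illustrative (not the atlas methodology itself): mean power density of the
# wind for a Weibull wind-speed distribution with scale A [m/s] and shape k,
# using E[v^3] = A^3 * Gamma(1 + 3/k).
from math import gamma

def mean_power_density(A, k, rho=1.225):
    """Mean wind power density in W/m^2 for Weibull(A, k) wind speeds."""
    return 0.5 * rho * A**3 * gamma(1.0 + 3.0 / k)

# Example: A = 7 m/s, k = 2 (typical flat-terrain values, assumed here)
print(round(mean_power_density(7.0, 2.0)))   # ~279 W/m^2
```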

  12. Verifying reciprocal relations for experimental diffusion coefficients in multicomponent mixtures

    DEFF Research Database (Denmark)

    Medvedev, Oleg; Shapiro, Alexander

    2003-01-01

    The goal of the present study is to verify the agreement of the available data on diffusion in ternary mixtures with the theoretical requirement of linear non-equilibrium thermodynamics consisting in symmetry of the matrix of the phenomenological coefficients. A common set of measured diffusion coefficients for a three-component mixture consists of four Fickian diffusion coefficients, each being reported separately. However, the Onsager theory predicts the existence of only three independent coefficients, as one of them disappears due to the symmetry requirement. Re-calculation of the Fickian coefficients into the phenomenological coefficients was carried out for mixtures where extended sets of experimental data and reliable thermodynamic models were available. The sensitivity of the symmetry property to different thermodynamic parameters of the models was also checked. (C) 2003 Elsevier Science B.V. All rights reserved.
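
    To make the constraint being tested explicit (the notation, reference frame and driving-force conventions here are generic choices of mine, not necessarily those of the paper): writing the fluxes against concentration gradients (Fick) or chemical potential gradients (Onsager),

    \[ J_i = -\sum_j D_{ij}\,\nabla c_j, \qquad J_i = -\sum_j L_{ij}\,\nabla \mu_j, \qquad \Gamma_{jk} = \frac{\partial \mu_j}{\partial c_k}, \]

    so that \([D] = [L]\,[\Gamma]\). For a ternary mixture the reciprocal relation \(L_{12} = L_{21}\) imposes one constraint, which is why the four measured Fickian coefficients correspond to only three independent ones; the test is whether \([L] = [D]\,[\Gamma]^{-1}\), computed from the measured \([D]\) and a thermodynamic model for \([\Gamma]\), is symmetric within experimental uncertainty.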

  13. How to Verify and Manage the Translational Plagiarism?

    Science.gov (United States)

    Wiwanitkit, Viroj

    2016-01-01

    The use of Google Translate as a tool for detecting translational plagiarism is a big challenge. As noted, plagiarism of original papers written in Macedonian and translated into other languages can be verified after computerised translation into those languages. Attempts to screen for translational plagiarism should be supported, and the Google Translate tool might be helpful. Special attention should be paid to any non-English reference that might be the source of plagiarised material, and to any non-English article that might be translated from an original English article, since neither can be detected by a simple plagiarism screening tool. It is a hard job for any journal to detect complex translational plagiarism, but the harder job might be how to manage such a case effectively. PMID:27703588

  14. System for verifiable CT radiation dose optimization based on image quality. Part II. Process control system.

    Science.gov (United States)

    Larson, David B; Malarik, Remo J; Hall, Seth M; Podberesky, Daniel J

    2013-10-01

    To evaluate the effect of an automated computed tomography (CT) radiation dose optimization and process control system on the consistency of estimated image noise and size-specific dose estimates (SSDEs) of radiation in CT examinations of the chest, abdomen, and pelvis. This quality improvement project was determined not to constitute human subject research. An automated system was developed to analyze each examination immediately after completion, and to report individual axial-image-level and study-level summary data for patient size, image noise, and SSDE. The system acquired data for 4 months beginning October 1, 2011. Protocol changes were made by using parameters recommended by the prediction application, and 3 months of additional data were acquired. Preimplementation and postimplementation mean image noise and SSDE were compared by using unpaired t tests and F tests. Common-cause variation was differentiated from special-cause variation by using a statistical process control individual chart. A total of 817 CT examinations, 490 acquired before and 327 acquired after the initial protocol changes, were included in the study. Mean patient age and water-equivalent diameter were 12.0 years and 23.0 cm, respectively. The difference between actual and target noise increased from -1.4 to 0.3 HU (P process control chart identified several special causes of variation. Implementation of an automated CT radiation dose optimization system led to verifiable simultaneous decrease in image noise variation and SSDE. The automated nature of the system provides the opportunity for consistent CT radiation dose optimization on a broad scale. © RSNA, 2013.
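
    The individuals control chart used to separate common-cause from special-cause variation can be sketched in a few lines; the noise values below are made up for the example and are not data from the study.

```python
# Illustrative individuals (XmR) control chart limits, as used to separate
# common-cause from special-cause variation; the noise values are invented.
import numpy as np

noise = np.array([10.2, 9.8, 10.5, 9.9, 10.1, 13.0, 10.0, 9.7, 10.3, 10.4])  # HU

mr = np.abs(np.diff(noise))            # moving ranges between consecutive exams
center = noise.mean()
sigma_est = mr.mean() / 1.128          # d2 = 1.128 for moving ranges of size 2
ucl, lcl = center + 3 * sigma_est, center - 3 * sigma_est

special = np.where((noise > ucl) | (noise < lcl))[0]
print(f"CL={center:.2f} HU, UCL={ucl:.2f}, LCL={lcl:.2f}, special-cause points: {special}")
```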

  15. Evaluation of current prediction models for Lynch syndrome: updating the PREMM5 model to identify PMS2 mutation carriers.

    Science.gov (United States)

    Goverde, A; Spaander, M C W; Nieboer, D; van den Ouweland, A M W; Dinjens, W N M; Dubbink, H J; Tops, C J; Ten Broeke, S W; Bruno, M J; Hofstra, R M W; Steyerberg, E W; Wagner, A

    2018-07-01

    Until recently, no prediction models for Lynch syndrome (LS) had been validated for PMS2 mutation carriers. We aimed to evaluate MMRpredict and PREMM5 in a clinical cohort and for PMS2 mutation carriers specifically. In a retrospective, clinic-based cohort we calculated predictions for LS according to MMRpredict and PREMM5. The area under the receiver operating characteristic curve (AUC) was compared between MMRpredict and PREMM5 for LS patients in general and for the different LS genes specifically. Of 734 index patients, 83 (11%) were diagnosed with LS; 23 MLH1, 17 MSH2, 31 MSH6 and 12 PMS2 mutation carriers. Both prediction models performed well for MLH1 and MSH2 (AUC 0.80 and 0.83 for PREMM5 and 0.79 for MMRpredict) and fair for MSH6 mutation carriers (0.69 for PREMM5 and 0.66 for MMRpredict). MMRpredict performed fair for PMS2 mutation carriers (AUC 0.72), while PREMM5 failed to discriminate PMS2 mutation carriers from non-mutation carriers (AUC 0.51). The only statistically significant difference between PMS2 mutation carriers and non-mutation carriers was the proximal location of colorectal cancer (77 vs. 28%). Adding the location of colorectal cancer to PREMM5 improved discrimination for PMS2 mutation carriers (AUC 0.77) and overall (AUC 0.81 vs. 0.72). We validated these results in an external cohort of 376 colorectal cancer patients, including 158 LS patients. MMRpredict and PREMM5 cannot adequately identify PMS2 mutation carriers. Adding location of colorectal cancer to PREMM5 may improve the performance of this model, which should be validated in larger cohorts.
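
    For readers unfamiliar with the discrimination metric, the sketch below computes an AUC for a model's risk scores against observed carrier status; the data are synthetic and purely illustrative, not the study's cohort.

```python
# Synthetic illustration of computing the AUC used to compare prediction
# models; not the study's data or code.
import numpy as np
from sklearn.metrics import roc_auc_score

rng = np.random.default_rng(0)
carrier = rng.integers(0, 2, size=200)                 # 1 = mutation carrier
# An informative but noisy model: higher scores for carriers on average.
score = carrier * 0.8 + rng.normal(0.0, 1.0, size=200)

print(f"AUC = {roc_auc_score(carrier, score):.2f}")    # ~0.7 for this setup
```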

  16. A critical evaluation of the use of cluster analysis to identify contaminated sediments in the Ria de Vigo

    Energy Technology Data Exchange (ETDEWEB)

    Rubio, B; Nombela, M. A; Vilas, F [Departamento de Geociencias Marinas y Ordenacion del Territorio, Vigo, Espana (Spain)

    2001-06-01

    The indiscriminate use of cluster analysis to distinguish contaminated and non-contaminated sediments has led us to make a comparative evaluation of different cluster analysis procedures as applied to heavy metal concentrations in subtidal sediments from the Ria de Vigo, NW Spain. The use of different cluster algorithms and other transformations of the same initial data set leads to the formation of different clusters, with an inconclusive result about the contamination status of the sediments. The results show that this approach is better suited to identifying groups of samples differing in sedimentological characteristics, such as grain size, rather than in degree of contamination. Our main aim is to call attention to these aspects of cluster analysis and to suggest that researchers should be rigorous with this kind of analysis. Finally, the use of discriminant analysis allows us to find a discriminant function that separates the samples into two clearly differentiated groups, which should not be treated jointly.
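
    The paper's central point, that the clustering outcome depends on the algorithm and data transformation chosen, can be illustrated with the hedged sketch below. It clusters a small, entirely hypothetical matrix of metal concentrations with two linkage methods, on raw and standardized data, and prints the resulting partitions; it is not the authors' analysis.

        import numpy as np
        from scipy.cluster.hierarchy import linkage, fcluster
        from scipy.stats import zscore

        # Hypothetical heavy-metal concentrations (rows: samples, cols: Cu, Pb, Zn).
        X = np.array([
            [12.0,  30.0,  80.0],
            [15.0,  35.0,  95.0],
            [40.0, 120.0, 300.0],
            [38.0, 110.0, 280.0],
            [14.0,  90.0, 260.0],
        ])

        for method in ("single", "ward"):
            for data, label in ((X, "raw"), (zscore(X), "standardized")):
                groups = fcluster(linkage(data, method=method), t=2, criterion="maxclust")
                print(f"{method:>6} linkage on {label:>12} data -> clusters {groups}")
        # Different linkage/transformation choices can assign the same samples to
        # different groups, which is the inconclusiveness the authors warn about.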

  17. Parents' evaluation of developmental status: how well do parents' concerns identify children with behavioral and emotional problems?

    Science.gov (United States)

    Glascoe, Frances Page

    2003-03-01

    This study was undertaken to determine which parental concerns are most associated with significant behavioral/emotional problems and the extent to which parents' concerns can be depended on in the detection of mental health problems. An additional goal is to assess how well a recently published screening test relying on parents' concerns, Parents' Evaluation of Developmental Status (PEDS), detects behavioral and emotional problems. Subjects were a national sample of 472 parents and their children (21 months to 8 years old) who were participants in 1 of 2 test standardization and validation studies. Sites included various pediatric settings, public schools, and Head Start programs in 5 diverse geographic locations. Subjects were representative of U.S. demographics in terms of ethnicity, parental level of education, gender, and socioeconomic status. At each site, psychological examiners, educational diagnosticians, or school psychologists recruited families and obtained informed consent. Examiners disseminated a demographics questionnaire (in English or Spanish) and a developmental screening test that relies on parents' concerns (PEDS). Examiners were blinded to PEDS' scoring and interpretation and administered, either by interview or in writing, the Eyberg Child Behavior Inventory (ECBI) or the Possible Problems Checklist (PPC), a subtest of the Child Development Inventory that includes items measuring emotional well-being and behavioral self-control. PEDS was used to sort children into risk for developmental disabilities according to various types of parental concern. Those identified as having high or moderate risk were nominated for diagnostic testing or screening followed by developmental and mental health services when indicated. Because their emotional and behavioral needs would have been identified and addressed, these groups were removed from the analysis (N = 177). Of the 295 children who would not have been nominated for further scrutiny on PEDS due to their

  18. Identifying motivators and barriers to student completion of instructor evaluations: A multi-faceted, collaborative approach from four colleges of pharmacy.

    Science.gov (United States)

    McAuley, James W; Backo, Jennifer Lynn; Sobota, Kristen Finley; Metzger, Anne H; Ulbrich, Timothy

    To identify motivators and barriers to pharmacy student completion of instructor evaluations, and to develop potential strategies to improve the evaluation process. Completed at four Ohio Colleges of Pharmacy, Phase I consisted of a student/faculty survey and Phase II consisted of joint student/faculty focus groups to discuss Phase I data and to problem solve. In Phase I, the top three student-identified and faculty-perceived motivators to completion of evaluations were to (1) make the course better, (2) earn bonus points, and (3) improve the instructor's teaching. The top three student-identified barriers to completion of evaluations were having to (1) evaluate multiple instructors, (2) complete several evaluations around the same time, and (3) complete lengthy evaluations. Phase II focus groups identified a number of potential ways to enhance the motivators and reduce barriers, including but not limited to making sure faculty convey to students that the feedback they provide is useful and to provide examples of how student feedback has been used to improve their teaching/the course. Students and faculty identified motivators and barriers to completing instructor evaluations and were willing to work together to improve the process. Copyright © 2016 Elsevier Inc. All rights reserved.

  19. Biochemically verified smoking cessation and vaping beliefs among vape store customers.

    Science.gov (United States)

    Tackett, Alayna P; Lechner, William V; Meier, Ellen; Grant, DeMond M; Driskill, Leslie M; Tahirkheli, Noor N; Wagener, Theodore L

    2015-05-01

    To evaluate biochemically verified smoking status and electronic nicotine delivery systems (ENDS) use behaviors and beliefs among a sample of customers from vapor stores (stores specializing in ENDS). A cross-sectional survey of 215 adult vapor store customers at four retail locations in the Midwestern United States; a subset of participants (n = 181) also completed exhaled carbon monoxide (CO) testing to verify smoking status. Outcomes evaluated included ENDS preferences, harm beliefs, use behaviors, smoking history and current biochemically verified smoking status. Most customers reported starting ENDS as a means of smoking cessation (86%), using newer-generation devices (89%), vaping non-tobacco/non-menthol flavors (72%) and using e-liquid with nicotine strengths of ≤20 mg/ml (72%). There was a high rate of switching (91.4%) to newer-generation ENDS among those who started with a first-generation product. Exhaled CO readings confirmed that 66% of the tested sample had quit smoking. Among those who continued to smoke, mean cigarettes per day decreased from 22.1 to 7.5 (P customers in the United States who use electronic nicotine delivery devices to stop smoking, vaping longer, using newer-generation devices and using non-tobacco and non-menthol flavored e-liquid appear to be associated with higher rates of smoking cessation. © 2015 Society for the Study of Addiction.

  20. Construct a procedure to verify radiation protection for apparatus of industrial gamma radiography

    International Nuclear Information System (INIS)

    Nghiem Xuan Long; Trinh Dinh Truong; Dinh Chi Hung; Le Ngoc Hieu

    2013-01-01

    Apparatus for industrial gamma radiography include an exposure container, source guide tube, remote control hand crank assembly and other attached equipment. They are widely used in the inspection and evaluation of engineering projects. In Vietnam, there are now more than 50 companies working in the radiography field and more than 100 apparatus in use on site. Therefore, verification and evaluation of this equipment is necessary and important. This project constructs a procedure to verify radiation protection for apparatus used in industrial gamma radiography, for application in Vietnam. (author)

  1. An evaluation of applicability of seismic refraction method in identifying shallow archaeological features A case study at archaeological site

    Science.gov (United States)

    Jahangardi, Morteza; Hafezi Moghaddas, Naser; Keivan Hosseini, Sayyed; Garazhian, Omran

    2015-04-01

    We applied the seismic refraction method at the archaeological site of Tepe Damghani, located in Sabzevar, NE Iran, in order to detect structures of archaeological interest. This pre-historical site has special conditions with respect to geographical location and geomorphological setting: it is an urban archaeological site, and in recent years it has been used as an agricultural field. In spring and summer of 2012, the third season of archaeological excavation was carried out. Test trenches excavated at this site revealed that the cultural layers were often badly disturbed by recent human activities such as farming and road construction. Conditions of the archaeological cultural layers in the southern and eastern parts of the Tepe are slightly better; for instance, in the 3×3 m² test trench 1S03, the third test trench excavated in the southern part of the Tepe, an in situ adobe architectural structure was discovered that likely belongs to the cultural features of a complex with 5 graves. After conclusion of the third season of archaeological excavation, all of the test trenches were backfilled with their own excavated soil. The seismic refraction method was applied with 12 channels of P-wave geophones along three lines over test trench 1S03, with a geophone interval of 0.5 m and a 1.5 m distance between profiles. The goal of this operation was to evaluate the applicability of the seismic method in identifying archaeological features, especially adobe wall structures. Processing of the seismic data was done with the SeisImager seismic software. Results were presented as a seismic section for every profile, but identification of the adobe wall structures was hardly achieved. This could be because the adobe walls had been built with the same material as the surrounding natural earth. Thus there is low contrast, which has an adverse effect on seismic processing and on identifying archaeological features. Hence the result could be that application of

  2. A Hybrid Verifiable and Delegated Cryptographic Model in Cloud Computing

    Directory of Open Access Journals (Sweden)

    Jaber Ibrahim Naser

    2018-02-01

    Full Text Available Access control is very important in cloud data sharing. Especially in domains like healthcare, it is essential to have access control mechanisms in place for confidentiality and secure data access. Attribute-based encryption has been around for many years to secure data and provide controlled access. In this paper, we propose a framework that supports a circuit- and attribute-based encryption mechanism involving multiple parties: data owner, data user, cloud server and attribute authority. An important feature of the proposed system is the verifiable delegation of the decryption process to the cloud server. The data owner encrypts data and delegates the decryption process to the cloud. The cloud server performs partial decryption, and the final decrypted data are shared with users according to their privileges. The data owner thus reduces computational complexity by delegating the decryption process to the cloud server. We built a prototype application using the Microsoft .NET platform as a proof of concept. The empirical results revealed that there is controlled access with multiple user roles and access control rights for secure and confidential data access in cloud computing.

  3. Verifying the Simulation Hypothesis via Infinite Nested Universe Simulacrum Loops

    Science.gov (United States)

    Sharma, Vikrant

    2017-01-01

    The simulation hypothesis proposes that local reality exists as a simulacrum within a hypothetical computer's dimension. More specifically, Bostrom's trilemma proposes that the number of simulations an advanced 'posthuman' civilization could produce makes the proposition very likely. In this paper a hypothetical method to verify the simulation hypothesis is discussed using infinite regression applied to a new type of infinite loop. Assign dimension n to any computer in our present reality, where the dimension signifies the hierarchical level, within nested simulations, at which our reality exists. A computer simulating known reality would be dimension (n-1), and likewise a computer simulating an artificial reality, such as a video game, would be dimension (n+1). In this method, among others, four key assumptions are made about the nature of the original computer dimension n. Summations show that regressing such a reality infinitely creates convergence, implying that whether local reality is a grand simulation is feasible to detect with adequate computing capability. The action of reaching said convergence point halts the simulation of local reality. Sensitivities to the four assumptions and their implications are discussed.
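
    The convergence claim can be illustrated with a simple, hedged resource model that is not taken from the paper: if each nested simulation at depth k can devote only a fraction r < 1 of its host's computational resources to the level below it, the total demand of an infinite regress is a convergent geometric series,

        \[
          C_{\mathrm{total}} = C_0 \sum_{k=0}^{\infty} r^{k} = \frac{C_0}{1-r},
          \qquad 0 < r < 1,
        \]

    so infinitely nested simulacra sum to a finite bound rather than diverging.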

  4. Verifying operator fitness - an imperative not an option

    International Nuclear Information System (INIS)

    Scott, A.B. Jr.

    1987-01-01

    In the early morning hours of April 26, 1986, whatever credence those who operate nuclear power plants around the world could then muster, suffered a jarring reversal. Through an incredible series of personal errors, the operators at what was later to be termed one of the best operated plants in the USSR systematically stripped away the physical and procedural safeguards inherent to their installation and precipitated the worst reactor accident the world has yet seen. This challenge to the adequacy of nuclear operators comes at a time when many companies throughout the world - not only those that involve nuclear power - are grappling with the problem of how to assure the fitness for duty of those in their employ, specifically those users of substances that have an impact on the ability to function safely and productively in the workplace. In actuality, operator fitness for duty is far more than the lack of impairment from substance abuse, which many today consider it. Full fitness for duty implies mental and moral fitness, as well, and physical fitness in a more general sense. If we are to earn the confidence of the public, credible ways to verify total fitness on an operator-by-operator basis must be considered

  5. AUTOMATIC ESTIMATION OF SIZE PARAMETERS USING VERIFIED COMPUTERIZED STEREOANALYSIS

    Directory of Open Access Journals (Sweden)

    Peter R Mouton

    2011-05-01

    Full Text Available State-of-the-art computerized stereology systems combine high-resolution video microscopy and hardware-software integration with stereological methods to assist users in quantifying multidimensional parameters of importance to biomedical research, including volume, surface area, length, number, their variation and spatial distribution. The requirement for constant interactions between a trained, non-expert user and the targeted features of interest currently limits the throughput efficiency of these systems. To address this issue we developed a novel approach for automatic stereological analysis of 2-D images, Verified Computerized Stereoanalysis (VCS). The VCS approach minimizes the need for user interactions with high contrast [high signal-to-noise ratio (S:N)] biological objects of interest. Performance testing of the VCS approach confirmed dramatic increases in the efficiency of total object volume (size) estimation, without a loss of accuracy or precision compared to conventional computerized stereology. The broad application of high efficiency VCS to high-contrast biological objects on tissue sections could reduce labor costs, enhance hypothesis testing, and accelerate the progress of biomedical research focused on improvements in health and the management of disease.

  6. Verifying large modular systems using iterative abstraction refinement

    International Nuclear Information System (INIS)

    Lahtinen, Jussi; Kuismin, Tuomas; Heljanko, Keijo

    2015-01-01

    Digital instrumentation and control (I&C) systems are increasingly used in the nuclear engineering domain. The exhaustive verification of these systems is challenging, and the usual verification methods such as testing and simulation are typically insufficient. Model checking is a formal method that is able to exhaustively analyse the behaviour of a model against a formally written specification. If the model checking tool detects a violation of the specification, it will give out a counter-example that demonstrates how the specification is violated in the system. Unfortunately, sometimes real life system designs are too big to be directly analysed by traditional model checking techniques. We have developed an iterative technique for model checking large modular systems. The technique uses abstraction based over-approximations of the model behaviour, combined with iterative refinement. The main contribution of the work is the concrete abstraction refinement technique based on the modular structure of the model, the dependency graph of the model, and a refinement sampling heuristic similar to delta debugging. The technique is geared towards proving properties, and outperforms BDD-based model checking, the k-induction technique, and the property directed reachability algorithm (PDR) in our experiments. - Highlights: • We have developed an iterative technique for model checking large modular systems. • The technique uses BDD-based model checking, k-induction, and PDR in parallel. • We have tested our algorithm by verifying two models with it. • The technique outperforms classical model checking methods in our experiments
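
    As a hedged, toy illustration of the abstraction refinement idea (not the authors' tool or algorithm), the sketch below tries to prove an invariant of a tiny modular boolean model using only a subset of module constraints, and adds one more module whenever the abstract counterexample turns out to be spurious. The modules, signals and property are hypothetical.

        from itertools import product

        # Toy modular model: each "module" contributes a constraint over boolean signals.
        SIGNALS = ("a", "b", "c")
        MODULES = {
            "M1": lambda s: (not s["a"]) or s["b"],      # a -> b
            "M2": lambda s: (not s["b"]) or s["c"],      # b -> c
            "M3": lambda s: True,                        # irrelevant module
        }
        PROPERTY = lambda s: (not s["a"]) or s["c"]      # invariant to prove: a -> c

        def assignments():
            for bits in product((False, True), repeat=len(SIGNALS)):
                yield dict(zip(SIGNALS, bits))

        def check(abstraction):
            """Return None if the property holds in the abstract model,
            otherwise an abstract counterexample (a violating assignment)."""
            for s in assignments():
                if all(MODULES[m](s) for m in abstraction) and not PROPERTY(s):
                    return s
            return None

        abstraction = set()                    # start from the coarsest abstraction
        while True:
            cex = check(abstraction)
            if cex is None:
                print("property proved with abstraction", sorted(abstraction))
                break
            violated = [m for m in MODULES if not MODULES[m](cex)]
            if not violated:                   # counterexample is concrete: real bug
                print("real counterexample:", cex)
                break
            abstraction.add(violated[0])       # refine with one more module
            print("spurious counterexample, refining with", violated[0])

    In this toy run the property is proved using only M1 and M2, illustrating how refinement guided by the model's structure can avoid ever including irrelevant modules.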

  7. A Novel Simple Phantom for Verifying the Dose of Radiation Therapy

    Directory of Open Access Journals (Sweden)

    J. H. Lee

    2015-01-01

    Full Text Available A standard protocol of dosimetric measurements is used by the organizations responsible for verifying that the doses delivered in radiation-therapy institutions are within authorized limits. This study evaluated a self-designed simple auditing phantom for use in verifying the dose of radiation therapy; the phantom design, dose audit system, and clinical tests are described. Thermoluminescent dosimeters (TLDs) were used as postal dosimeters, and mailable phantoms were produced for use in postal audits. Correction factors are important for converting TLD readout values from phantoms into the absorbed dose in water. The phantom scatter correction factor was used to quantify the difference in the scattered dose between a solid water phantom and homemade phantoms; its value ranged from 1.084 to 1.031. The energy-dependence correction factor was used to compare the TLD readout of the unit dose irradiated by audit beam energies with 60Co in the solid water phantom; its value was 0.99 to 1.01. The setup-condition factor was used to correct for differences in dose-output calibration conditions. Clinical tests of the device calibrating the dose output revealed that the dose deviation was within 3%. Therefore, our homemade phantoms and dosimetric system can be applied for accurately verifying the doses applied in radiation-therapy institutions.
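
    The chain of correction factors described above amounts to a multiplicative conversion from TLD readout to absorbed dose in water. The sketch below is a hedged illustration of that arithmetic with hypothetical numbers; the factor values are only examples within the ranges quoted in the abstract, not the authors' calibration data.

        def tld_dose_to_water(readout_nC, calibration_Gy_per_nC,
                              phantom_scatter, energy_dependence, setup_condition):
            """Convert a TLD readout to absorbed dose in water (Gy) by applying
            the calibration coefficient and the three correction factors."""
            return (readout_nC * calibration_Gy_per_nC
                    * phantom_scatter * energy_dependence * setup_condition)

        # Hypothetical audit measurement.
        dose = tld_dose_to_water(readout_nC=85.0, calibration_Gy_per_nC=0.0235,
                                 phantom_scatter=1.031, energy_dependence=1.00,
                                 setup_condition=0.99)
        target = 2.0  # Gy prescribed by the audit protocol (hypothetical)
        deviation = 100.0 * (dose - target) / target
        print(f"dose = {dose:.3f} Gy, deviation from target = {deviation:+.1f}%")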

  8. The AutoProof Verifier: Usability by Non-Experts and on Standard Code

    Directory of Open Access Journals (Sweden)

    Carlo A. Furia

    2015-08-01

    Full Text Available Formal verification tools are often developed by experts for experts; as a result, their usability by programmers with little formal methods experience may be severely limited. In this paper, we discuss this general phenomenon with reference to AutoProof: a tool that can verify the full functional correctness of object-oriented software. In particular, we present our experiences of using AutoProof in two contrasting contexts representative of non-expert usage. First, we discuss its usability by students in a graduate course on software verification, who were tasked with verifying implementations of various sorting algorithms. Second, we evaluate its usability in verifying code developed for programming assignments of an undergraduate course. The first scenario represents usability by serious non-experts; the second represents usability on "standard code", developed without full functional verification in mind. We report our experiences and lessons learnt, from which we derive some general suggestions for furthering the development of verification tools with respect to improving their usability.

  9. Scenarios for exercising technical approaches to verified nuclear reductions

    International Nuclear Information System (INIS)

    Doyle, James

    2010-01-01

    Presidents Obama and Medvedev in April 2009 committed to a continuing process of step-by-step nuclear arms reductions beyond the new START treaty that was signed April 8, 2010 and to the eventual goal of a world free of nuclear weapons. In addition, the US Nuclear Posture review released April 6, 2010 commits the US to initiate a comprehensive national research and development program to support continued progress toward a world free of nuclear weapons, including expanded work on verification technologies and the development of transparency measures. It is impossible to predict the specific directions that US-RU nuclear arms reductions will take over the 5-10 years. Additional bilateral treaties could be reached requiring effective verification as indicated by statements made by the Obama administration. There could also be transparency agreements or other initiatives (unilateral, bilateral or multilateral) that require monitoring with a standard of verification lower than formal arms control, but still needing to establish confidence to domestic, bilateral and multilateral audiences that declared actions are implemented. The US Nuclear Posture Review and other statements give some indication of the kinds of actions and declarations that may need to be confirmed in a bilateral or multilateral setting. Several new elements of the nuclear arsenals could be directly limited. For example, it is likely that both strategic and nonstrategic nuclear warheads (deployed and in storage), warhead components, and aggregate stocks of such items could be accountable under a future treaty or transparency agreement. In addition, new initiatives or agreements may require the verified dismantlement of a certain number of nuclear warheads over a specified time period. Eventually procedures for confirming the elimination of nuclear warheads, components and fissile materials from military stocks will need to be established. This paper is intended to provide useful background information

  10. Evaluation of 19,460 Wheat Accessions Conserved in the Indian National Genebank to Identify New Sources of Resistance to Rust and Spot Blotch Diseases

    Science.gov (United States)

    Jacob, Sherry R.; Srinivasan, Kalyani; Radhamani, J.; Parimalan, R.; Sivaswamy, M.; Tyagi, Sandhya; Yadav, Mamata; Kumari, Jyotisna; Deepali; Sharma, Sandeep; Bhagat, Indoo; Meeta, Madhu; Bains, N. S.; Chowdhury, A. K.; Saha, B. C.; Bhattacharya, P. M.; Kumari, Jyoti; Singh, M. C.; Gangwar, O. P.; Prasad, P.; Bharadwaj, S. C.; Gogoi, Robin; Sharma, J. B.; GM, Sandeep Kumar; Saharan, M. S.; Bag, Manas; Roy, Anirban; Prasad, T. V.; Sharma, R. K.; Dutta, M.; Sharma, Indu; Bansal, K. C.

    2016-01-01

    A comprehensive germplasm evaluation study of wheat accessions conserved in the Indian National Genebank was conducted to identify sources of rust and spot blotch resistance. Genebank accessions comprising three species of wheat–Triticum aestivum, T. durum and T. dicoccum were screened sequentially at multiple disease hotspots, during the 2011–14 crop seasons, carrying only resistant accessions to the next step of evaluation. Wheat accessions which were found to be resistant in the field were then assayed for seedling resistance and profiled using molecular markers. In the primary evaluation, 19,460 accessions were screened at Wellington (Tamil Nadu), a hotspot for wheat rusts. We identified 4925 accessions to be resistant and these were further evaluated at Gurdaspur (Punjab), a hotspot for stripe rust and at Cooch Behar (West Bengal), a hotspot for spot blotch. The second round evaluation identified 498 accessions potentially resistant to multiple rusts and 868 accessions potentially resistant to spot blotch. Evaluation of rust resistant accessions for seedling resistance against seven virulent pathotypes of three rusts under artificial epiphytotic conditions identified 137 accessions potentially resistant to multiple rusts. Molecular analysis to identify different combinations of genetic loci imparting resistance to leaf rust, stem rust, stripe rust and spot blotch using linked molecular markers, identified 45 wheat accessions containing known resistance genes against all three rusts as well as a QTL for spot blotch resistance. The resistant germplasm accessions, particularly against stripe rust, identified in this study can be excellent potential candidates to be employed for breeding resistance into the background of high yielding wheat cultivars through conventional or molecular breeding approaches, and are expected to contribute toward food security at national and global levels. PMID:27942031

  11. Evaluation of 19,460 Wheat Accessions Conserved in the Indian National Genebank to Identify New Sources of Resistance to Rust and Spot Blotch Diseases.

    Science.gov (United States)

    Kumar, Sundeep; Archak, Sunil; Tyagi, R K; Kumar, Jagdish; Vk, Vikas; Jacob, Sherry R; Srinivasan, Kalyani; Radhamani, J; Parimalan, R; Sivaswamy, M; Tyagi, Sandhya; Yadav, Mamata; Kumari, Jyotisna; Deepali; Sharma, Sandeep; Bhagat, Indoo; Meeta, Madhu; Bains, N S; Chowdhury, A K; Saha, B C; Bhattacharya, P M; Kumari, Jyoti; Singh, M C; Gangwar, O P; Prasad, P; Bharadwaj, S C; Gogoi, Robin; Sharma, J B; Gm, Sandeep Kumar; Saharan, M S; Bag, Manas; Roy, Anirban; Prasad, T V; Sharma, R K; Dutta, M; Sharma, Indu; Bansal, K C

    2016-01-01

    A comprehensive germplasm evaluation study of wheat accessions conserved in the Indian National Genebank was conducted to identify sources of rust and spot blotch resistance. Genebank accessions comprising three species of wheat-Triticum aestivum, T. durum and T. dicoccum were screened sequentially at multiple disease hotspots, during the 2011-14 crop seasons, carrying only resistant accessions to the next step of evaluation. Wheat accessions which were found to be resistant in the field were then assayed for seedling resistance and profiled using molecular markers. In the primary evaluation, 19,460 accessions were screened at Wellington (Tamil Nadu), a hotspot for wheat rusts. We identified 4925 accessions to be resistant and these were further evaluated at Gurdaspur (Punjab), a hotspot for stripe rust and at Cooch Behar (West Bengal), a hotspot for spot blotch. The second round evaluation identified 498 accessions potentially resistant to multiple rusts and 868 accessions potentially resistant to spot blotch. Evaluation of rust resistant accessions for seedling resistance against seven virulent pathotypes of three rusts under artificial epiphytotic conditions identified 137 accessions potentially resistant to multiple rusts. Molecular analysis to identify different combinations of genetic loci imparting resistance to leaf rust, stem rust, stripe rust and spot blotch using linked molecular markers, identified 45 wheat accessions containing known resistance genes against all three rusts as well as a QTL for spot blotch resistance. The resistant germplasm accessions, particularly against stripe rust, identified in this study can be excellent potential candidates to be employed for breeding resistance into the background of high yielding wheat cultivars through conventional or molecular breeding approaches, and are expected to contribute toward food security at national and global levels.

  12. Evaluation of 19,460 Wheat Accessions Conserved in the Indian National Genebank to Identify New Sources of Resistance to Rust and Spot Blotch Diseases.

    Directory of Open Access Journals (Sweden)

    Sundeep Kumar

    Full Text Available A comprehensive germplasm evaluation study of wheat accessions conserved in the Indian National Genebank was conducted to identify sources of rust and spot blotch resistance. Genebank accessions comprising three species of wheat-Triticum aestivum, T. durum and T. dicoccum were screened sequentially at multiple disease hotspots, during the 2011-14 crop seasons, carrying only resistant accessions to the next step of evaluation. Wheat accessions which were found to be resistant in the field were then assayed for seedling resistance and profiled using molecular markers. In the primary evaluation, 19,460 accessions were screened at Wellington (Tamil Nadu), a hotspot for wheat rusts. We identified 4925 accessions to be resistant and these were further evaluated at Gurdaspur (Punjab), a hotspot for stripe rust and at Cooch Behar (West Bengal), a hotspot for spot blotch. The second round evaluation identified 498 accessions potentially resistant to multiple rusts and 868 accessions potentially resistant to spot blotch. Evaluation of rust resistant accessions for seedling resistance against seven virulent pathotypes of three rusts under artificial epiphytotic conditions identified 137 accessions potentially resistant to multiple rusts. Molecular analysis to identify different combinations of genetic loci imparting resistance to leaf rust, stem rust, stripe rust and spot blotch using linked molecular markers, identified 45 wheat accessions containing known resistance genes against all three rusts as well as a QTL for spot blotch resistance. The resistant germplasm accessions, particularly against stripe rust, identified in this study can be excellent potential candidates to be employed for breeding resistance into the background of high yielding wheat cultivars through conventional or molecular breeding approaches, and are expected to contribute toward food security at national and global levels.

  13. Some Proxy Signature and Designated verifier Signature Schemes over Braid Groups

    OpenAIRE

    Lal, Sunder; Verma, Vandani

    2009-01-01

    Braid groups provide an alternative to number-theoretic public key cryptography and can be implemented quite efficiently. The paper proposes five signature schemes based on braid groups: Proxy Signature, Designated Verifier, Bi-Designated Verifier, Designated Verifier Proxy Signature and Bi-Designated Verifier Proxy Signature. We also discuss the security aspects of each of the proposed schemes.

  14. Evaluation of common genetic variants identified by GWAS for early onset and morbid obesity in population-based samples

    DEFF Research Database (Denmark)

    den Hoed, M; Luan, J; Langenberg, C

    2013-01-01

    BACKGROUND: Meta-analysis of case-control genome-wide association studies (GWAS) for early onset and morbid obesity identified four variants in/near the PRL, PTER, MAF and NPC1 genes. OBJECTIVE: We aimed to validate association of these variants with obesity-related traits in population-based samples. [...] These variants, which were identified in a GWAS for early onset and morbid obesity, do not seem to influence obesity-related traits in the general population.

  15. Verifying cell loss requirements in high-speed communication networks

    Directory of Open Access Journals (Sweden)

    Kerry W. Fendick

    1998-01-01

    Full Text Available In high-speed communication networks it is common to have requirements of very small cell loss probabilities due to buffer overflow. Losses are measured to verify that the cell loss requirements are being met, but it is not clear how to interpret such measurements. We propose methods for determining whether or not cell loss requirements are being met. A key idea is to look at the stream of losses as successive clusters of losses. Often clusters of losses, rather than individual losses, should be regarded as the important “loss events”. Thus we propose modeling the cell loss process by a batch Poisson stochastic process. Successive clusters of losses are assumed to arrive according to a Poisson process. Within each cluster, cell losses do not occur at a single time, but the distance between losses within a cluster should be negligible compared to the distance between clusters. Thus, for the purpose of estimating the cell loss probability, we ignore the spaces between successive cell losses in a cluster of losses. Asymptotic theory suggests that the counting process of losses initiating clusters often should be approximately a Poisson process even though the cell arrival process is not nearly Poisson. The batch Poisson model is relatively easy to test statistically and fit; e.g., the batch-size distribution and the batch arrival rate can readily be estimated from cell loss data. Since batch (cluster sizes may be highly variable, it may be useful to focus on the number of batches instead of the number of cells in a measurement interval. We also propose a method for approximately determining the parameters of a special batch Poisson cell loss with geometric batch-size distribution from a queueing model of the buffer content. For this step, we use a reflected Brownian motion (RBM approximation of a G/D/1/C queueing model. We also use the RBM model to estimate the input burstiness given the cell loss rate. In addition, we use the RBM model to
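
    As a hedged illustration of the fitting step described above (grouping losses into clusters and estimating a batch Poisson model), the sketch below clusters loss timestamps whose gaps fall below a threshold, then estimates the cluster (batch) arrival rate and a geometric batch-size parameter. The threshold and data are hypothetical, and this is not the authors' code.

        import numpy as np

        def fit_batch_poisson(loss_times, gap_threshold, horizon):
            """Estimate batch Poisson parameters from cell-loss timestamps.

            Losses closer together than gap_threshold are treated as one cluster.
            Returns (cluster arrival rate, geometric batch-size parameter p),
            where the mean batch size is 1/p.
            """
            loss_times = np.sort(np.asarray(loss_times, dtype=float))
            # A new cluster starts wherever the gap to the previous loss is large.
            new_cluster = np.concatenate(([True], np.diff(loss_times) > gap_threshold))
            batch_sizes = np.diff(np.flatnonzero(np.concatenate((new_cluster, [True]))))
            rate = len(batch_sizes) / horizon          # clusters per unit time
            p = 1.0 / batch_sizes.mean()               # geometric parameter
            return rate, p

        # Hypothetical loss timestamps (seconds) over a 100 s measurement interval.
        times = [3.1, 3.1001, 3.1003, 41.7, 41.7002, 41.7004, 41.7005, 88.2]
        rate, p = fit_batch_poisson(times, gap_threshold=0.01, horizon=100.0)
        print(f"cluster rate = {rate:.3f} /s, mean batch size = {1.0/p:.2f}")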

  16. Systematic Evaluation of Pleiotropy Identifies 6 Further Loci Associated With Coronary Artery Disease

    NARCIS (Netherlands)

    Webb, Thomas R; Erdmann, Jeanette; Stirrups, Kathleen E; Stitziel, Nathan O; Masca, Nicholas G D; Jansen, Henning; Kanoni, Stavroula; Nelson, Christopher P; Ferrario, Paola G; König, Inke R; Eicher, John D; Johnson, Andrew D; Hamby, Stephen E; Betsholtz, Christer; Ruusalepp, Arno; Franzén, Oscar; Schadt, Eric E; Björkegren, Johan L M; Weeke, Peter E; Auer, Paul L; Schick, Ursula M; Lu, Yingchang; Zhang, He; Dube, Marie-Pierre; Goel, Anuj; Farrall, Martin; Peloso, Gina M; Won, Hong-Hee; Do, Ron; van Iperen, Erik; Kruppa, Jochen; Mahajan, Anubha; Scott, Robert A; Willenborg, Christina; Braund, Peter S; van Capelleveen, Julian C; Doney, Alex S F; Donnelly, Louise A; Asselta, Rosanna; Merlini, Pier A; Duga, Stefano; Marziliano, Nicola; Denny, Josh C; Shaffer, Christian; El-Mokhtari, Nour Eddine; Franke, Andre; Heilmann, Stefanie; Hengstenberg, Christian; Hoffmann, Per; Holmen, Oddgeir L; Hveem, Kristian; Jansson, Jan-Håkan; Jöckel, Karl-Heinz; Kessler, Thorsten; Kriebel, Jennifer; Laugwitz, Karl L; Marouli, Eirini; Martinelli, Nicola; McCarthy, Mark I; Van Zuydam, Natalie R; Meisinger, Christa; Esko, Tõnu; Mihailov, Evelin; Escher, Stefan A; Alver, Maris; Moebus, Susanne; Morris, Andrew D; Virtamo, Jarma; Nikpay, Majid; Olivieri, Oliviero; Provost, Sylvie; AlQarawi, Alaa; Robertson, Neil R; Akinsansya, Karen O; Reilly, Dermot F; Vogt, Thomas F; Yin, Wu; Asselbergs, Folkert W; Kooperberg, Charles; Jackson, Rebecca D; Stahl, Eli; Müller-Nurasyid, Martina; Strauch, Konstantin; Varga, Tibor V; Waldenberger, Melanie; Zeng, Lingyao; Chowdhury, Rajiv; Salomaa, Veikko; Ford, Ian; Jukema, J Wouter; Amouyel, Philippe; Kontto, Jukka; Nordestgaard, Børge G; Ferrières, Jean; Saleheen, Danish; Sattar, Naveed; Surendran, Praveen; Wagner, Aline; Young, Robin; Howson, Joanna M M; Butterworth, Adam S; Danesh, John; Ardissino, Diego; Bottinger, Erwin P; Erbel, Raimund; Franks, Paul W; Girelli, Domenico; Hall, Alistair S; Hovingh, G Kees; Kastrati, Adnan; Lieb, Wolfgang; Meitinger, Thomas; Kraus, William E; Shah, Svati H; McPherson, Ruth; Orho-Melander, Marju; Melander, Olle; Metspalu, Andres; Palmer, Colin N A; Peters, Annette; Rader, Daniel J; Reilly, Muredach P; Loos, Ruth J F; Reiner, Alex P; Roden, Dan M; Tardif, Jean-Claude; Thompson, John R; Wareham, Nicholas J; Watkins, Hugh; Willer, Cristen J; Samani, Nilesh J; Schunkert, Heribert; Deloukas, Panos; Kathiresan, Sekar

    2017-01-01

    BACKGROUND: Genome-wide association studies have so far identified 56 loci associated with risk of coronary artery disease (CAD). Many CAD loci show pleiotropy; that is, they are also associated with other diseases or traits. OBJECTIVES: This study sought to systematically test if genetic variants

  17. Identifying Effective Education Interventions in Sub-Saharan Africa: A Meta-Analysis of Rigorous Impact Evaluations

    Science.gov (United States)

    Conn, Katharine

    2014-01-01

    The aim of this dissertation is to identify effective educational interventions in Sub-Saharan African with an impact on student learning. This is the first meta-analysis in the field of education conducted for Sub-Saharan Africa. This paper takes an in-depth look at twelve different types of education interventions or programs and attempts to not…

  18. Development of various NDA approach for verifying the hold inventory

    International Nuclear Information System (INIS)

    Nakashima, Shinichi; Yamada, Shigeki; Takahashi, Syunya

    1999-01-01

    This report describes the Phase 1 activity to investigate various nondestructive assay methods which proved to be useful for evaluating the amount of uranium holdup inventory. The study was carried out in the Ningyo-Toge demonstration uranium enrichment facility. This feasibility study is part of the JNC/DOE cooperative safeguards agreement. We expect that a combination of neutron counting and gamma-ray measurement methods is required to obtain a quantitative measure of process holdup. As a result of the investigation, the basic measurement approaches were discussed and their applicability to gas centrifuge cascades was evaluated. (author)

  19. Verifying Embedded Systems using Component-based Runtime Observers

    DEFF Research Database (Denmark)

    Guan, Wei; Marian, Nicolae; Angelov, Christo K.

    [...] against formally specified properties. This paper presents a component-based design method for runtime observers, which are configured from instances of prefabricated reusable components: Predicate Evaluator (PE) and Temporal Evaluator (TE). The PE computes atomic propositions for the TE; the latter is a reconfigurable component processing a data structure representing the state transition diagram of a non-deterministic state machine, i.e. a Buchi automaton derived from a system property specified in Linear Temporal Logic (LTL). Observer components have been implemented using design models and design patterns...
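
    A minimal sketch of the PE/TE split described above: a predicate evaluator maps raw samples to atomic propositions, and a temporal evaluator steps a small, hand-written deterministic monitor encoding the safety part of a simple property ("every request is followed by a grant in the next step"). The signal names, transition table and property are hypothetical illustrations, not the reconfigurable components described in the paper.

        # Predicate Evaluator: derives atomic propositions from a raw sample.
        def predicates(sample):
            return {
                "request": sample["req_line"] == 1,
                "grant":   sample["ack_line"] == 1,
            }

        # Temporal Evaluator: a small deterministic monitor for the safety property
        # "every request is followed by a grant in the very next step".
        class TemporalEvaluator:
            OK, PENDING, FAIL = "OK", "PENDING", "FAIL"

            def __init__(self):
                self.state = self.OK

            def step(self, props):
                if self.state == self.FAIL:
                    return self.state
                if self.state == self.PENDING and not props["grant"]:
                    self.state = self.FAIL
                    return self.state
                self.state = self.PENDING if props["request"] else self.OK
                return self.state

        trace = [{"req_line": 1, "ack_line": 0},   # request issued
                 {"req_line": 0, "ack_line": 1},   # granted in the next step: OK
                 {"req_line": 1, "ack_line": 0},   # request issued
                 {"req_line": 0, "ack_line": 0}]   # no grant: property violated

        te = TemporalEvaluator()
        for sample in trace:
            print(te.step(predicates(sample)))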

  20. Evaluation of current prediction models for Lynch syndrome: updating the PREMM5 model to identify PMS2 mutation carriers

    NARCIS (Netherlands)

    A. Goverde (Anne); M.C.W. Spaander (Manon); D. Nieboer (Daan); A.M.W. van den Ouweland (Ans); W.N.M. Dinjens (Winand); H.J. Dubbink (Erik Jan); C. Tops (Cmj); S.W. Ten Broeke (Sanne W.); M.J. Bruno (Marco); R.M.W. Hofstra (Robert); E.W. Steyerberg (Ewout); A. Wagner (Anja)

    2017-01-01

    textabstractUntil recently, no prediction models for Lynch syndrome (LS) had been validated for PMS2 mutation carriers. We aimed to evaluate MMRpredict and PREMM5 in a clinical cohort and for PMS2 mutation carriers specifically. In a retrospective, clinic-based cohort we calculated predictions for

  1. 77 FR 41406 - Evaluation of In Vitro Tests for Identifying Eye Injury Hazard Potential of Chemicals and...

    Science.gov (United States)

    2012-07-13

    ...://www.epa.gov/oppad001/eye-irritation.pdf ). The IRE test is an organotypic test method that evaluates... rabbit corneal epithelial cells following test substance exposure (Takahashi et al., 2008). NICEATM is..., mailing address, phone, fax, email, and sponsoring organization, as applicable). NICEATM prefers that data...

  2. A preliminary evaluation of the generalized likelihood ratio for detecting and identifying control element failures in a transport aircraft

    Science.gov (United States)

    Bundick, W. T.

    1985-01-01

    The application of the Generalized Likelihood Ratio technique to the detection and identification of aircraft control element failures has been evaluated in a linear digital simulation of the longitudinal dynamics of a B-737 aircraft. Simulation results show that the technique has potential but that the effects of wind turbulence and Kalman filter model errors are problems which must be overcome.
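
    A hedged sketch of the Generalized Likelihood Ratio idea for a single failure signature: assuming zero-mean Gaussian residuals with known variance, the statistic for a constant bias appearing at an unknown onset time maximizes the log-likelihood ratio over candidate onset times inside a sliding window. The formulation, window size and threshold below are textbook simplifications with simulated data, not the B-737 simulation setup.

        import numpy as np

        def glr_bias_detector(residuals, sigma, window, threshold):
            """Return indices at which a constant-bias failure is declared.

            For each time n, the statistic maximizes, over candidate onset times k
            inside the window, (sum of residuals from k to n)^2 / (2*sigma^2*(n-k+1)).
            """
            r = np.asarray(residuals, dtype=float)
            alarms = []
            for n in range(len(r)):
                k_min = max(0, n - window + 1)
                best = 0.0
                for k in range(k_min, n + 1):
                    seg = r[k:n + 1]
                    best = max(best, seg.sum() ** 2 / (2.0 * sigma ** 2 * len(seg)))
                if best > threshold:
                    alarms.append(n)
            return alarms

        rng = np.random.default_rng(0)
        residuals = rng.normal(0.0, 1.0, 200)
        residuals[120:] += 1.5          # hypothetical control-element failure at t=120
        alarms = glr_bias_detector(residuals, sigma=1.0, window=20, threshold=12.0)
        print("first alarm at sample", alarms[0] if alarms else None)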

  3. Evaluation of unique identifiers used as keys to match identical publications in Pure and SciVal

    DEFF Research Database (Denmark)

    Madsen, Heidi Holst; Madsen, Dicte; Gauffriau, Marianne

    2016-01-01

    Unique identifiers (UIDs) are seen as an effective key to match identical publications across databases or to identify duplicates in a database. The objective of the present study is to investigate how well UIDs work as match keys in the integration between Pure and SciVal, based on a case study. The objective is addressed in a literature review and a case study. The literature review covers error types affecting UIDs, including erroneous optical or special character recognition. The case study explores the use of UIDs in the integration between the databases Pure and SciVal; specifically, journal publications in English are matched between the two databases. We find all error types except erroneous optical or special character recognition. We also briefly discuss how publication sets formed by using UIDs as the match keys may affect the bibliometric indicators number of publications, number of citations, and the average number of citations per publication.
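
    The matching step studied here can be sketched as a join on a normalized identifier. The snippet below is a hedged illustration with hypothetical records rather than Pure or SciVal exports: it lower-cases and strips DOIs before merging and reports how many publications match on the UID.

        import pandas as pd

        def normalize_doi(doi):
            """Normalize a DOI string so formatting differences do not break the match."""
            if pd.isna(doi):
                return None
            return doi.strip().lower().removeprefix("https://doi.org/").removeprefix("doi:")

        # Hypothetical extracts from two research-information databases.
        pure = pd.DataFrame({"pure_id": [1, 2, 3],
                             "doi": ["10.1000/ABC123", "doi:10.1000/xyz9", None]})
        scival = pd.DataFrame({"scival_id": ["a", "b"],
                               "doi": ["https://doi.org/10.1000/abc123", "10.1000/zzz1"]})

        for df in (pure, scival):
            df["doi_key"] = df["doi"].map(normalize_doi)

        matched = pure.merge(scival, on="doi_key", how="inner", suffixes=("_pure", "_scival"))
        print(matched[["pure_id", "scival_id", "doi_key"]])
        print(f"{len(matched)} of {len(pure)} Pure records matched on the UID")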

  4. Comprehensive evaluation of disease- and trait-specific enrichment for eight functional elements among GWAS-identified variants.

    Science.gov (United States)

    Markunas, Christina A; Johnson, Eric O; Hancock, Dana B

    2017-07-01

    Genome-wide association study (GWAS)-identified variants are enriched for functional elements. However, we have limited knowledge of how functional enrichment may differ by disease/trait and tissue type. We tested a broad set of eight functional elements for enrichment among GWAS-identified SNPs (p Enrichment analyses were conducted using logistic regression, with Bonferroni correction. Overall, a significant enrichment was observed for all functional elements, except sequence motifs. Missense SNPs showed the strongest magnitude of enrichment. eQTLs were the only functional element significantly enriched across all diseases/traits. Magnitudes of enrichment were generally similar across diseases/traits, where enrichment was statistically significant. Blood vs. brain tissue effects on enrichment were dependent on disease/trait and functional element (e.g., cardiovascular disease: eQTLs P_TissueDifference = 1.28 × 10^-6 vs. enhancers P_TissueDifference = 0.94). Identifying disease/trait-relevant functional elements and tissue types could provide new insight into the underlying biology, by guiding a priori GWAS analyses (e.g., brain enhancer elements for psychiatric disease) or facilitating post hoc interpretation.
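
    A hedged sketch of the kind of enrichment test described above: a logistic regression of GWAS-hit status on a binary functional annotation, with a Bonferroni-style threshold across the annotations tested. The data are simulated and the model is deliberately bare (no covariates such as allele frequency), so it only illustrates the shape of the analysis, not the authors' pipeline.

        import numpy as np
        import statsmodels.api as sm

        rng = np.random.default_rng(1)
        n_snps = 5000
        is_eqtl = rng.random(n_snps) < 0.15                       # functional annotation
        # Simulate GWAS-hit status with genuine enrichment among eQTL SNPs.
        p_hit = np.where(is_eqtl, 0.06, 0.02)
        is_gwas_hit = rng.random(n_snps) < p_hit

        X = sm.add_constant(is_eqtl.astype(float))
        fit = sm.Logit(is_gwas_hit.astype(float), X).fit(disp=False)
        odds_ratio = np.exp(fit.params[1])
        n_annotations_tested = 8                                   # Bonferroni denominator
        print(f"OR = {odds_ratio:.2f}, p = {fit.pvalues[1]:.2e}, "
              f"significant after Bonferroni: {fit.pvalues[1] < 0.05 / n_annotations_tested}")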

  5. Comment on “Two statistics for evaluating parameter identifiability and error reduction” by John Doherty and Randall J. Hunt

    Science.gov (United States)

    Hill, Mary C.

    2010-01-01

    Doherty and Hunt (2009) present important ideas for first-order, second-moment sensitivity analysis, but five issues are discussed in this comment. First, considering the composite-scaled sensitivity (CSS) jointly with parameter correlation coefficients (PCC) in a CSS/PCC analysis addresses the difficulties with CSS mentioned in the introduction. Second, their new parameter identifiability statistic is actually likely to do a poor job of evaluating parameter identifiability in common situations. The statistic instead performs the very useful role of showing how model parameters are included in the estimated singular value decomposition (SVD) parameters. Its close relation to CSS is shown. Third, the idea from p. 125 that a suitable truncation point for SVD parameters can be identified using the prediction variance is challenged using results from Moore and Doherty (2005). Fourth, the relative error reduction statistic of Doherty and Hunt is shown to belong to an emerging set of statistics here named perturbed calculated variance statistics. Finally, the perturbed calculated variance statistics OPR and PPR mentioned on p. 121 are shown to explicitly include the parameter null-space component of uncertainty. Indeed, OPR and PPR results that account for null-space uncertainty have appeared in the literature since 2000.
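
    For background, the composite scaled sensitivity referred to throughout this comment is commonly written in the following standard form (stated here only as context, not as text from the original comment), where y'_i is the simulated equivalent of observation i, b_j is the j-th parameter, omega_ii is the weight of observation i, and ND is the number of observations:

        \[
          \mathrm{css}_j =
          \left[ \frac{1}{ND} \sum_{i=1}^{ND}
            \left( \frac{\partial y'_i}{\partial b_j}\, b_j\, \omega_{ii}^{1/2} \right)^{2}
          \right]^{1/2}
        \]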

  6. Pilot study to verify the calibration of electrometers

    International Nuclear Information System (INIS)

    Becker, P.; Meghzifene, A.

    2002-01-01

    Since the National Laboratory for Electrical Measurements has not yet developed its capability for the standardization of the small electrical charges produced by DC, the IRD is trying to verify its standardization procedures for electrical charge through a comparison programme. This subject was discussed with a major electrometer manufacturer, which has offered to provide free of charge three of their electrometer calibration standards for a pilot run. The model to be provided consists of four calibrated resistors and two calibrated capacitors, covering the charge/current range of interest. To produce a charge or current, a standard DC voltage must be applied to these components. Since practically all modern electrometers measure using virtual ground, this methodology is viable. The IRD, in collaboration with the IAEA, wishes to invite interested laboratories to participate in this pilot comparison programme. This exercise is expected to be useful for all participants and will hopefully open the way for the establishment of routine comparisons in this area. The results will be discussed and published in an appropriate journal. Interested institutions should contact directly Mr. Paulo H. B. Becker through e-mail (pbecker at ird.gov.br) or fax +55 21 24421950, informing him of the model and manufacturer of the electrometer to be used for the pilot study and to discuss all practical details. (author)
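
    The calibration principle described above, deriving a known charge or current from calibrated capacitors and resistors plus a standard DC voltage, reduces to Q = C*V and I = V/R. The sketch below uses hypothetical component values to show the arithmetic a participating laboratory would use to compute the reference values an electrometer under test should read.

        def reference_charge_nC(capacitance_nF, voltage_V):
            """Charge injected into the electrometer: Q = C * V (result in nC)."""
            return capacitance_nF * voltage_V

        def reference_current_pA(resistance_Gohm, voltage_V):
            """Current sourced through the standard resistor: I = V / R (result in pA)."""
            return 1e3 * voltage_V / resistance_Gohm   # V / GOhm = nA -> convert to pA

        # Hypothetical calibrated components and applied standard DC voltage.
        print(f"Q = {reference_charge_nC(capacitance_nF=1.0, voltage_V=10.0):.2f} nC")
        print(f"I = {reference_current_pA(resistance_Gohm=100.0, voltage_V=10.0):.2f} pA")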

  7. Using Unmanned Aerial Vehicles to Assess Vegetative Cover and Identify Biotic Resources in Sagebrush Steppe Ecosystems: Preliminary Evaluation

    Energy Technology Data Exchange (ETDEWEB)

    Robert P. Breckenridge

    2006-04-01

    The Idaho National Laboratory (INL), in conjunction with the University of Idaho, is evaluating novel approaches for using unmanned aerial vehicles (UAVs) as a quicker and safer method for monitoring biotic resources. Evaluating vegetative cover is an important factor in understanding the sustainability of many ecosystems. In assessing vegetative cover, methods that improve accuracy and cost efficiency could revolutionize how biotic resources are monitored on western federal lands. Sagebrush steppe ecosystems provide important habitat for a variety of species, some of which are important indicator species (e.g., sage grouse). Improved methods are needed to support monitoring these habitats because there are not enough resource specialists or funds available for comprehensive ground evaluation of these ecosystems. In this project, two types of UAV platforms (fixed wing and helicopter) were used to collect still-frame imagery to assess cover in sagebrush steppe ecosystems. This paper discusses the process for collecting and analyzing imagery from the UAVs to (1) estimate total percent cover, (2) estimate percent cover for six different types of vegetation, and (3) locate sage grouse based on representative decoys. The field plots were located on the INL site west of Idaho Falls, Idaho, in areas with varying amounts and types of vegetative cover. A software program called SamplePoint developed by the U.S. Department of Agriculture, Agricultural Research Service was used to evaluate the imagery for percent cover for the six vegetation types (bare ground, litter, shrubs, dead shrubs, grasses, and forbs). Results were compared against standard field measurements to assess accuracy.

  8. EVALUATION OF THE COMPUTED TOMOGRAPHIC "SENTINEL CLOT SIGN" TO IDENTIFY BLEEDING ABDOMINAL ORGANS IN DOGS WITH HEMOABDOMEN.

    Science.gov (United States)

    Specchi, Swan; Auriemma, Edoardo; Morabito, Simona; Ferri, Filippo; Zini, Eric; Piola, Valentina; Pey, Pascaline; Rossi, Federica

    2017-01-01

    The CT "sentinel clot sign" has been defined as the highest attenuation hematoma adjacent to a bleeding organ in humans with hemoabdomen. The aims of this retrospective descriptive multicenter study were to describe CT findings in a sample of dogs with surgically or necropsy confirmed intra-abdominal bleeding and determine prevalence of the "sentinel clot sign" adjacent to the location of bleeding. Medical records between 2012 and 2014 were searched for dogs with hemoabdomen and in which the origin of the bleeding was confirmed either with surgery or necropsy. Retrieved CT images were reviewed for the presence and localization of the "sentinel clot sign," HU measurements of the "sentinel clot sign" and hemoabdomen, and presence of extravasation of contrast media within the abdominal cavity. Nineteen dogs were included. Three dogs were excluded due to the low amount of blood that did not allow the identification of a "sentinel clot sign." A "sentinel clot sign" was detected in the proximity of the confirmed bleeding organ in 14/16 (88%) of the patients. The mean HU of the "sentinel clot sign" was 56 (range: 43-70) while that of the hemoabdomen was 34 (range: 20-45). Active hemorrhage was identified as extravasation of contrast medium within the peritoneal cavity from the bleeding organ in three dogs. In conclusion, the CT "sentinel clot sign" may be helpful for identifying the source of bleeding in dogs with hemoabdomen. © 2016 American College of Veterinary Radiology.

  9. Validating the TeleStroke Mimic Score: A Prediction Rule for Identifying Stroke Mimics Evaluated Over Telestroke Networks.

    Science.gov (United States)

    Ali, Syed F; Hubert, Gordian J; Switzer, Jeffrey A; Majersik, Jennifer J; Backhaus, Roland; Shepard, L Wylie; Vedala, Kishore; Schwamm, Lee H

    2018-03-01

    Up to 30% of acute stroke evaluations are deemed stroke mimics, and these are common in telestroke as well. We recently published a risk prediction score for use during telestroke encounters to differentiate stroke mimics from ischemic cerebrovascular disease derived and validated in the Partners TeleStroke Network. Using data from 3 distinct US and European telestroke networks, we sought to externally validate the TeleStroke Mimic (TM) score in a broader population. We evaluated the TM score in 1930 telestroke consults from the University of Utah, Georgia Regents University, and the German TeleMedical Project for Integrative Stroke Care Network. We report the area under the curve in receiver-operating characteristic curve analysis with 95% confidence interval for our previously derived TM score in which lower TM scores correspond with a higher likelihood of being a stroke mimic. Based on final diagnosis at the end of the telestroke consultation, there were 630 of 1930 (32.6%) stroke mimics in the external validation cohort. All 6 variables included in the score were significantly different between patients with ischemic cerebrovascular disease versus stroke mimics. The TM score performed well (area under curve, 0.72; 95% confidence interval, 0.70-0.73; P mimic during telestroke consultation in these diverse cohorts was similar to its performance in our original cohort. Predictive decision-support tools like the TM score may help highlight key clinical differences between mimics and patients with stroke during complex, time-critical telestroke evaluations. © 2018 American Heart Association, Inc.

  10. Systematic Evaluation of Pleiotropy Identifies 6 Further Loci Associated With Coronary Artery Disease

    DEFF Research Database (Denmark)

    Webb, Thomas R; Erdmann, Jeanette; Stirrups, Kathleen E

    2017-01-01

    [...] single nucleotide polymorphisms available on the exome array, which included a substantial proportion of known or suspected single nucleotide polymorphisms associated with common diseases or traits as of 2011. Suggestive association signals were replicated in an additional 30,533 cases and 42,530 control subjects. To evaluate pleiotropy, we tested CAD loci for association with cardiovascular risk factors (lipid traits, blood pressure phenotypes, body mass index, diabetes, and smoking behavior), as well as with other diseases/traits through interrogation of currently available genome-wide association study...

  11. Non-destructive technique to verify clearance of pipes

    Directory of Open Access Journals (Sweden)

    Savidou Anastasia

    2010-01-01

    Full Text Available A semi-empirical, non-destructive technique to evaluate the activity of gamma ray emitters in contaminated pipes is discussed. The technique is based on in-situ measurements by a portable NaI gamma ray spectrometer. The efficiency of the detector for the pipe and detector configuration was evaluated by Monte Carlo calculations performed using the MCNP code. Gamma ray detector full-energy peak efficiency was predicted assuming a homogeneous activity distribution over the internal surface of the pipe for 344 keV, 614 keV, 662 keV, and 1332 keV photons, representing Eu-152, Ag-118m, Cs-137, and Co-60 contamination, respectively. The effect of inhomogeneity on the accuracy of the technique was also examined. The model was validated against experimental measurements performed using a Cs-137 volume calibration source representing a contaminated pipe and good agreement was found between the calculated and experimental results. The technique represents a sensitive and cost-effective technology for calibrating portable gamma ray spectrometry systems and can be applied in a range of radiation protection and waste management applications.
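
    The final conversion in such a measurement, from a net full-energy peak count to activity on the pipe's internal surface, is a one-line calculation once the Monte Carlo efficiency is known: A = N_net / (efficiency * gamma yield * counting time). The numbers below are hypothetical and only illustrate the arithmetic; they are not the paper's results.

        def surface_activity_Bq(net_counts, live_time_s, peak_efficiency, gamma_yield):
            """Activity (Bq) from a net peak area, counting time, Monte Carlo
            full-energy peak efficiency and gamma emission probability."""
            return net_counts / (live_time_s * peak_efficiency * gamma_yield)

        # Hypothetical Cs-137 (662 keV) measurement on a contaminated pipe section.
        activity = surface_activity_Bq(net_counts=12500, live_time_s=600,
                                       peak_efficiency=2.1e-3, gamma_yield=0.851)
        print(f"estimated activity: {activity:.0f} Bq")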

  12. Evaluating motives: Two simple tests to identify and avoid entanglement in legally dubious urine drug testing schemes.

    Science.gov (United States)

    Barnes, Michael C; Worthy, Stacey L

    2015-01-01

    This article educates healthcare practitioners on the legal framework prohibiting abusive practices in urine drug testing (UDT) in medical settings, discusses several profit-driven UDT schemes that have resulted in enforcement actions, and provides recommendations for best practices in UDT to comply with state and federal fraud and anti-kickback statutes. The authors carefully reviewed and analyzed statutes, regulations, advisory opinions, case law, court documents, articles from legal journals, and news articles. Certain facts-driven UDT arrangements tend to violate federal and state healthcare laws and regulations, including the Stark law, the anti-kickback statute, the criminal health care fraud statute, and the False Claims Act. Healthcare practitioners who use UDT can help ensure that they are in compliance with applicable federal and state laws by evaluating whether their actions are motivated by providing proper care to their patients rather than by profits. They must avoid schemes that violate the spirit of the law while appearing to comply with the letter of the law. Such a simple self-evaluation of motive can reduce a practitioner's likelihood of civil fines and criminal liability.

  13. Multilingual Validation of the Questionnaire for Verifying Stroke-Free Status in West Africa.

    Science.gov (United States)

    Sarfo, Fred; Gebregziabher, Mulugeta; Ovbiagele, Bruce; Akinyemi, Rufus; Owolabi, Lukman; Obiako, Reginald; Akpa, Onoja; Armstrong, Kevin; Akpalu, Albert; Adamu, Sheila; Obese, Vida; Boa-Antwi, Nana; Appiah, Lambert; Arulogun, Oyedunni; Mensah, Yaw; Adeoye, Abiodun; Tosin, Aridegbe; Adeleye, Osimhiarherhuo; Tabi-Ajayi, Eric; Phillip, Ibinaiye; Sani, Abubakar; Isah, Suleiman; Tabari, Nasir; Mande, Aliyu; Agunloye, Atinuke; Ogbole, Godwin; Akinyemi, Joshua; Laryea, Ruth; Melikam, Sylvia; Uvere, Ezinne; Adekunle, Gregory; Kehinde, Salaam; Azuh, Paschal; Dambatta, Abdul; Ishaq, Naser; Saulson, Raelle; Arnett, Donna; Tiwari, Hemnant; Jenkins, Carolyn; Lackland, Dan; Owolabi, Mayowa

    2016-01-01

    The Questionnaire for Verifying Stroke-Free Status (QVSFS), a method for verifying stroke-free status in participants of clinical, epidemiological, and genetic studies, has not been validated in low-income settings where populations have limited knowledge of stroke symptoms. We aimed to validate QVSFS in 3 languages, Yoruba, Hausa and Akan, for ascertainment of stroke-free status of control subjects enrolled in an on-going stroke epidemiological study in West Africa. Data were collected using a cross-sectional study design where 384 participants were consecutively recruited from neurology and general medicine clinics of 5 tertiary referral hospitals in Nigeria and Ghana. Ascertainment of stroke status was by neurologists using structured neurological examination, review of case records, and neuroimaging (gold standard). Relative performance of QVSFS without and with pictures of stroke symptoms (pictograms) was assessed using sensitivity, specificity, positive predictive value, and negative predictive value. The overall median age of the study participants was 54 years and 48.4% were males. Of 165 stroke cases identified by gold standard, 98% were determined to have had stroke, whereas of 219 without stroke 87% were determined to be stroke-free by QVSFS. Negative predictive value of the QVSFS across the 3 languages was 0.97 (range, 0.93-1.00), sensitivity, specificity, and positive predictive value were 0.98, 0.82, and 0.80, respectively. Agreement between the questionnaire with and without the pictogram was excellent/strong with Cohen k=0.92. QVSFS is a valid tool for verifying stroke-free status across culturally diverse populations in West Africa. © 2015 American Heart Association, Inc.
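    The accuracy figures quoted above all come from a standard 2x2 comparison of the questionnaire against the gold standard. The sketch below is a generic illustration of those calculations, not the study's analysis code; the cell counts are hypothetical values of roughly the right magnitude.

      # Illustrative 2x2 diagnostic accuracy metrics (hypothetical counts).
      def diagnostic_metrics(tp, fp, fn, tn):
          """Sensitivity, specificity, PPV and NPV for a screening tool vs. a gold standard."""
          return {
              "sensitivity": tp / (tp + fn),  # true stroke cases flagged by the questionnaire
              "specificity": tn / (tn + fp),  # truly stroke-free participants classified as such
              "ppv": tp / (tp + fp),          # positive predictive value
              "npv": tn / (tn + fn),          # negative predictive value
          }

      print(diagnostic_metrics(tp=162, fp=39, fn=3, tn=180))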

  14. Trends in the incidence rate, type and treatment of surgically verified endometriosis - a nationwide cohort study.

    Science.gov (United States)

    Saavalainen, Liisu; Tikka, Tuulia; But, Anna; Gissler, Mika; Haukka, Jari; Tiitinen, Aila; Härkki, Päivi; Heikinheimo, Oskari

    2018-01-01

    To study the trends in the incidence rate, type and surgical treatment, and patient characteristics of surgically verified endometriosis during 1987-2012. This is a register-based cohort study. We identified women receiving their first diagnosis of endometriosis in surgery from the Finnish Hospital Discharge Register (FHDR). The quality of the FHDR records was assessed bidirectionally. The age-standardized incidence rate of first surgically verified endometriosis was assessed by calendar year. The cohort comprised 49 956 women. The quality assessment suggested that the FHDR data were of good quality. The most common diagnosis, ovarian endometriosis (46%), was associated with the highest median age, 38.5 years (interquartile range 31.0-44.8), and the second most common diagnosis, peritoneal endometriosis (40%), with a median age of 34.9 years (28.6-41.7). Between 1987 and 2012, a decrease was observed in the median age, from 38.8 (32.3-43.6) to 34.0 (28.9-41.0) years, and in the age-standardized incidence rate, from 116 [95% confidence interval (CI) 112-121] to 45 (42-48) per 100 000 women. The proportion of hysterectomy as a first surgical treatment decreased from 38 to 19%, whereas that of laparoscopy increased from 42 to 73% when comparing 1987-1995 with 1996-2012. This nationwide cohort of surgically verified endometriosis showed a decrease in the incidence rate and in the patient age at the time of first diagnosis, even though the proportion of laparoscopy has increased. The number of hysterectomies has decreased. These changes are likely to reflect evolving diagnostics, increasing awareness of endometriosis, and effective use of medical treatment before surgery. © 2017 Nordic Federation of Societies of Obstetrics and Gynecology.
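    The headline figures above are direct age-standardized incidence rates. Purely as an illustration of how such a rate is formed (this is not the registry's code, and the age bands, counts and weights below are invented), the calculation weights each age band's crude rate by a standard population:

      # Illustrative direct age standardisation; all numbers are hypothetical.
      def age_standardised_rate(cases, person_years, std_weights):
          """Age-standardised rate per 100 000 person-years (weights sum to 1)."""
          assert len(cases) == len(person_years) == len(std_weights)
          return 1e5 * sum(w * c / py for c, py, w in zip(cases, person_years, std_weights))

      print(age_standardised_rate(cases=[120, 310, 90],
                                  person_years=[250_000, 400_000, 300_000],
                                  std_weights=[0.40, 0.35, 0.25]))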

  15. Development of measurement standards for verifying functional performance of surface texture measuring instruments

    Energy Technology Data Exchange (ETDEWEB)

    Fujii, A [Life and Industrial Product Development Department Olympus Corporation, 2951 Ishikawa-machi, Hachiouji-shi, Tokyo (Japan); Suzuki, H [Industrial Marketing and Planning Department Olympus Corporation, Shinjyuku Monolith, 3-1 Nishi-Shinjyuku 2-chome, Tokyo (Japan); Yanagi, K, E-mail: a_fujii@ot.olympus.co.jp [Department of Mechanical Engineering, Nagaoka University of Technology, 1603-1 Kamitomioka-machi, Nagaoka-shi, Niigata (Japan)

    2011-08-19

    A new measurement standard is proposed for verifying the overall functional performance of surface texture measuring instruments. Its surface is composed of sinusoidal chirp-signal waveforms along horizontal cross sections of the material measure. One notable feature is that the amplitude of each cycle of the chirp signal is geometrically modulated so that the maximum slope is kept constant. The maximum slope of the chirp-like signal is then gradually decreased with movement in the lateral direction. We fabricated the measurement standard by focused ion beam (FIB) processing and calibrated it by atomic force microscopy (AFM). We evaluated the functional performance of a laser scanning microscope with this standard in terms of its amplitude response at varying slope angles. It was concluded that the proposed standard allows straightforward evaluation of the performance of surface texture measuring instruments.
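    A profile of the kind described, where each cycle's amplitude is scaled to hold the maximum slope constant, can be sketched as below. This is only an illustration of the geometry (for a local sinusoid of amplitude a and wavelength L, the maximum slope is 2*pi*a/L, so a = slope*L/(2*pi)); the wavelengths and slope value are arbitrary, and this is not the authors' design code.

      # Illustrative constant-maximum-slope chirp profile (all parameters made up).
      import numpy as np

      def constant_slope_chirp(wavelengths_um, max_slope_deg, points_per_cycle=200):
          slope = np.tan(np.radians(max_slope_deg))
          x_parts, z_parts, x0 = [], [], 0.0
          for lam in wavelengths_um:
              amp = slope * lam / (2 * np.pi)  # keeps the peak slope of this cycle equal to 'slope'
              x = np.linspace(0.0, lam, points_per_cycle, endpoint=False)
              x_parts.append(x0 + x)
              z_parts.append(amp * np.sin(2 * np.pi * x / lam))
              x0 += lam
          return np.concatenate(x_parts), np.concatenate(z_parts)

      # Wavelengths shrinking from 100 um to 10 um at a 15 degree maximum slope:
      x_um, z_um = constant_slope_chirp(np.linspace(100, 10, 19), max_slope_deg=15)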

  16. 76 FR 14678 - Communications Unit Leader Prerequisite and Evaluation

    Science.gov (United States)

    2011-03-17

    ... evaluation form. OEC will use the evaluation form to identify course attendees, verify satisfaction of course... and evaluation of OEC events. Evaluation forms will be available in hard copy at each training session... Prerequisite and Evaluation. OMB Number: 1670--NEW. COML Prerequisites Verification Frequency: On occasion...

  17. An Evaluation of Talent 4 . . . : A Programme to Identify Talent and Skills for Prisoners, Disadvantaged, Unemployed, and Vulnerable Groups.

    Science.gov (United States)

    McGuire-Snieckus, Rebecca; Caulfield, Laura

    2017-11-01

    Previous research suggests that the relationship between employment and recidivism is complex, with more support needed to facilitate employability motivation for sustained change. An arts-based programme designed to facilitate vocational self-determinism among prisoners with evidence of impact across three prisons in the United Kingdom was replicated and delivered to 234 prisoners and long-term unemployed participants from six European countries, to explore whether the findings from the previous evaluation would be replicated on a much larger scale. The research presented in this article found that supporting prisoners and the long-term unemployed to articulate employability goals had a positive effect on personal growth as well as understanding of individual strengths and weaknesses with respect to work, employment, problem solving, and thinking styles. Future research might explore the longer term impact on employment and recidivism.

  18. Evaluation of geophysical techniques for identifying fractures in program wells in Deaf Smith County, Texas: Revision 1, Topical report

    International Nuclear Information System (INIS)

    Gillespie, R.P.; Siminitz, P.C.

    1987-08-01

    Quantitative information about the presence and orientation of fractures is essential for the understanding of the geomechanical and geohydrological behavior of rocks. This report evaluates various borehole geophysical techniques for characterizing fractures in three Civilian Radioactive Waste Management (CRWM) Program test wells in the Palo Duro Basin in Deaf Smith County, Texas. Emphasis has been placed on the Schlumberger Fracture Identification Log (FIL) which detects vertical fractures and provides data for calculation of orientation. Depths of FIL anomalies were compared to available core. It was found that the application of FIL results to characterize fracture frequency or orientation is inappropriate at this time. The uncertainties associated with the FIL information render the information unreliable. No geophysical logging tool appears to unequivocally determine the location and orientation of fractures in a borehole. Geologic mapping of the exploratory shafts will ultimately provide the best data on fracture frequency and orientation at the proposed repository site. 22 refs., 6 figs., 3 tabs

  19. Evaluation of SNP Data from the Malus Infinium Array Identifies Challenges for Genetic Analysis of Complex Genomes of Polyploid Origin.

    Directory of Open Access Journals (Sweden)

    Michela Troggio

    Full Text Available High throughput arrays for the simultaneous genotyping of thousands of single-nucleotide polymorphisms (SNPs) have made the rapid genetic characterisation of plant genomes and the development of saturated linkage maps a realistic prospect for many plant species of agronomic importance. However, the correct calling of SNP genotypes in divergent polyploid genomes using array technology can be problematic due to paralogy, and to divergence in probe sequences causing changes in probe binding efficiencies. An Illumina Infinium II whole-genome genotyping array was recently developed for the cultivated apple and used to develop a molecular linkage map for an apple rootstock progeny (M432), but a large proportion of segregating SNPs were not mapped in the progeny, due to unexpected genotype clustering patterns. To investigate the causes of this unexpected clustering we performed BLAST analysis of all probe sequences against the 'Golden Delicious' genome sequence and discovered evidence for paralogous annealing sites and probe sequence divergence for a high proportion of probes contained on the array. Following visual re-evaluation of the genotyping data generated for 8,788 SNPs for the M432 progeny using the array, we manually re-scored genotypes at 818 loci and mapped a further 797 markers to the M432 linkage map. The newly mapped markers included the majority of those that could not be mapped previously, as well as loci that were previously scored as monomorphic, but which segregated due to divergence leading to heterozygosity in probe annealing sites. An evaluation of the 8,788 probes in a diverse collection of Malus germplasm showed that more than half the probes returned genotype clustering patterns that were difficult or impossible to interpret reliably, highlighting implications for the use of the array in genome-wide association studies.

  20. A two-dimensional deformable phantom for quantitatively verifying deformation algorithms

    Energy Technology Data Exchange (ETDEWEB)

    Kirby, Neil; Chuang, Cynthia; Pouliot, Jean [Department of Radiation Oncology, University of California San Francisco, San Francisco, California 94143-1708 (United States)

    2011-08-15

    Purpose: The incorporation of deformable image registration into the treatment planning process is rapidly advancing. For this reason, the methods used to verify the underlying deformation algorithms must evolve equally fast. This manuscript proposes a two-dimensional deformable phantom, which can objectively verify the accuracy of deformation algorithms, as the next step for improving these techniques. Methods: The phantom represents a single plane of the anatomy for a head and neck patient. Inflation of a balloon catheter inside the phantom simulates tumor growth. CT and camera images of the phantom are acquired before and after its deformation. Nonradiopaque markers reside on the surface of the deformable anatomy and are visible through an acrylic plate, which enables an optical camera to measure their positions; thus, establishing the ground-truth deformation. This measured deformation is directly compared to the predictions of deformation algorithms, using several similarity metrics. The ratio of the number of points with more than a 3 mm deformation error over the number that are deformed by more than 3 mm is used for an error metric to evaluate algorithm accuracy. Results: An optical method of characterizing deformation has been successfully demonstrated. For the tests of this method, the balloon catheter deforms 32 out of the 54 surface markers by more than 3 mm. Different deformation errors result from the different similarity metrics. The most accurate deformation predictions had an error of 75%. Conclusions: The results presented here demonstrate the utility of the phantom for objectively verifying deformation algorithms and determining which is the most accurate. They also indicate that the phantom would benefit from more electron density heterogeneity. The reduction of the deformable anatomy to a two-dimensional system allows for the use of nonradiopaque markers, which do not influence deformation algorithms. This is the fundamental advantage of this
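    The error metric described above is a simple ratio over the marker set. The sketch below is an illustrative reimplementation of that bookkeeping, not the authors' analysis code; the marker displacements are invented two-dimensional values in millimetres.

      # Illustrative metric: points whose predicted displacement is off by more than 3 mm,
      # divided by the points that the ground truth says moved by more than 3 mm.
      import numpy as np

      def deformation_error_ratio(measured_disp_mm, predicted_disp_mm, threshold_mm=3.0):
          measured = np.asarray(measured_disp_mm, dtype=float)
          predicted = np.asarray(predicted_disp_mm, dtype=float)
          error_count = np.count_nonzero(np.linalg.norm(predicted - measured, axis=1) > threshold_mm)
          moved_count = np.count_nonzero(np.linalg.norm(measured, axis=1) > threshold_mm)
          return error_count / moved_count

      measured = [[4.0, 1.0], [0.5, 0.2], [6.0, -2.0], [3.5, 0.0]]   # optical ground truth
      predicted = [[4.2, 0.8], [0.4, 0.1], [1.0, -0.5], [0.2, 0.1]]  # deformation algorithm output
      print(deformation_error_ratio(measured, predicted))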

  1. Trust, but verify: social media models for disaster management.

    Science.gov (United States)

    Mehta, Amisha M; Bruns, Axel; Newton, Judith

    2017-07-01

    A lack of trust in the information exchanged via social media may significantly hinder decision-making by community members and emergency services during disasters. The need for timely information at such times, though, challenges traditional ways of establishing trust. This paper, building on a multi-year research project that combined social media data analysis and participant observation within an emergency management organisation and in-depth engagement with stakeholders across the sector, pinpoints and examines assumptions governing trust and trusting relationships in social media disaster management. It assesses three models for using social media in disaster management (information gathering, quasi-journalistic verification, and crowdsourcing) in relation to the guardianship of trust, to highlight the verification process for content and source and to identify the role of power and responsibilities. The conclusions contain important implications for emergency management organisations seeking to enhance their mechanisms for incorporating user-generated information from social media sources in their disaster response efforts. © 2017 The Author(s). Disasters © Overseas Development Institute, 2017.

  2. The Identifying, Evaluating and Prioritizing the Factors Affecting Customers’ Satisfaction with E-service Centers of Iran's Police

    Directory of Open Access Journals (Sweden)

    Seyed Ali Ziaee Azimi

    2016-11-01

    Full Text Available The present research is an applied study employing a descriptive survey design to describe the factors affecting customers' satisfaction with the e-service centers of Iran's police, known as 10+ police centers. The research population comprises all customers of the 10+ police centers, from which 420 individuals were chosen through simple random sampling; in addition, 45 10+ police service centers were selected with probability proportional to size. After the validity and reliability of the researcher-made questionnaire were established, it was used to collect the required data. A conceptual model was then developed from the theoretical framework and background literature, and SPSS software was used to analyse the research hypotheses. The findings indicate that all of the identified indices of customer satisfaction with the 10+ police e-service centers (trust and confidence, staff performance, system facilities, environmental facilities, basic amenities, sufficient notification, time and cost, and easy access to the office) have an effect on customer satisfaction. Finally, some practical suggestions were made for improving customer satisfaction with the 10+ police e-service centers.

  3. Privacy-Preserving Verifiability: A Case for an Electronic Exam Protocol

    DEFF Research Database (Denmark)

    Giustolisi, Rosario; Iovino, Vincenzo; Lenzini, Gabriele

    2017-01-01

    We introduce the notion of privacy-preserving verifiability for security protocols. It holds when a protocol admits a verifiability test that does not reveal, to the verifier that runs it, more pieces of information about the protocol’s execution than those required to run the test. Our definition of privacy-preserving verifiability is general and applies to cryptographic protocols as well as to human security protocols. In this paper we exemplify it in the domain of e-exams. We prove that the notion is meaningful by studying an existing exam protocol that is verifiable but whose verifiability tests are not privacy-preserving. We prove that the notion is applicable: we review the protocol using functional encryption so that it admits a verifiability test that preserves privacy according to our definition. We analyse, in ProVerif, that the verifiability holds despite malicious parties and that the new

  4. Evaluation of Quality and Readability of Health Information Websites Identified through India’s Major Search Engines

    Directory of Open Access Journals (Sweden)

    S. Raj

    2016-01-01

    Full Text Available Background. Health information available on websites should be reliable and accurate so that the community can make informed decisions. This study assessed the quality and readability of health information websites on the World Wide Web in India. Methods. This cross-sectional study was carried out in June 2014. The key words "Health" and "Information" were entered into the search engines Google and Yahoo. Of 50 websites (25 from each search engine), 32 were evaluated after exclusions. The LIDA tool was used to assess quality, whereas readability was assessed using the Flesch Reading Ease Score (FRES), the Flesch-Kincaid Grade Level (FKGL), and SMOG. Results. Forty percent of the websites (n=13) were sponsored by government. Health On the Net Code of Conduct (HONcode) certification was present on 50% (n=16) of websites. The mean LIDA score (74.31) was average; only 3 websites scored high on the LIDA score, and only five had readability scores at the recommended sixth-grade level. Conclusion. Most health information websites were of average quality, especially in terms of usability and reliability, and were written at high reading-grade levels. Efforts are needed to develop health information websites that can help the general population make informed decisions.
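    For reference, the two Flesch measures used above are closed-form formulas over average sentence length and average syllables per word. The sketch below illustrates them with a crude vowel-group syllable counter; it is not the instrument the authors used, and validated readability tools count syllables more carefully.

      # Illustrative Flesch Reading Ease (FRES) and Flesch-Kincaid Grade Level (FKGL).
      import re

      def _syllables(word):
          # Crude heuristic: count groups of vowels; at least one syllable per word.
          return max(1, len(re.findall(r"[aeiouy]+", word.lower())))

      def readability(text):
          sentences = max(1, len(re.findall(r"[.!?]+", text)))
          words = re.findall(r"[A-Za-z]+", text)
          syllables = sum(_syllables(w) for w in words)
          asl = len(words) / sentences        # average sentence length
          asw = syllables / len(words)        # average syllables per word
          fres = 206.835 - 1.015 * asl - 84.6 * asw
          fkgl = 0.39 * asl + 11.8 * asw - 15.59
          return fres, fkgl

      print(readability("Drink plenty of water. Rest at home. See a doctor if the fever persists."))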

  5. Evaluation of storing hepatitis B vaccine outside the cold chain in the Solomon Islands: Identifying opportunities and barriers to implementation.

    Science.gov (United States)

    Breakwell, Lucy; Anga, Jenniffer; Dadari, Ibrahim; Sadr-Azodi, Nahad; Ogaoga, Divinal; Patel, Minal

    2017-05-15

    Monovalent Hepatitis B vaccine (HepB) is heat stable, making it suitable for storage outside cold chain (OCC) at 37°C for 1 month. We conducted an OCC project in the Solomon Islands to determine the feasibility of and barriers to national implementation and to evaluate impact on coverage. Healthcare workers at 13 facilities maintained monovalent HepB birth dose (HepB-BD) OCC for up to 28 days over 7 months. Vaccination data were recorded for children born during the project and those born during the 7 months before the project. Timely HepB-BD coverage among facility and home births increased from 30% to 68% and from 4% to 24%, respectively. Temperature excursions above 37°C were rare, but vaccine wastage was high and shortages common. Storing HepB OCC can increase HepB-BD coverage in countries with insufficient cold chain capacity or numerous home births. High vaccine wastage and unreliable vaccine supply must be addressed for successful implementation. Published by Elsevier Ltd.

  6. Evaluation of BioFM liquid medium for culture of cerebrospinal fluid in tuberculous meningitis to identify Mycobacterium tuberculosis.

    Science.gov (United States)

    Kashyap, R S; Ramteke, S S; Gaherwar, H M; Deshpande, P S; Purohit, H J; Taori, G M; Daginawala, H

    2010-01-01

    The present study was designed to evaluate the sensitivity and specificity of liquid culture medium (BioFM broth) for the diagnosis of tuberculous meningitis (TBM) in cerebrospinal fluid (CSF). CSF samples from 200 patients (TBM group = 150 and non-TBM group = 50) were tested for culture of Mycobacterium tuberculosis in BioFM liquid culture medium. Out of 150 TBM cases, 120 were found to be culture positive, indicating a sensitivity of 80% in BioFM broth within 2-3 weeks of inoculation. Positive cultures were also observed for CSF from 32 (64%) out of 50 non-TBM patients in BioFM liquid culture medium within 4 days of sample inoculation. Therefore, according to our study, BioFM broth system yielded 80% sensitivity [95% confidence interval (CI): 67-93%] and 36% specificity (95% CI: 57-98%) for TBM diagnosis. Our results indicate that although BioFM broth allows the detection of positive cultures within a shorter time, it has a high potential for contamination or for the coexistence of M. tuberculosis and non-tuberculous mycobacteria (NTM). This coexistence may go undetected or potentially lead to erroneous reporting of results.

  7. Evaluation of BioFM liquid medium for culture of cerebrospinal fluid in tuberculous meningitis to identify Mycobacterium tuberculosis

    Directory of Open Access Journals (Sweden)

    Kashyap R

    2010-01-01

    Full Text Available The present study was designed to evaluate the sensitivity and specificity of liquid culture medium (BioFM broth) for the diagnosis of tuberculous meningitis (TBM) in cerebrospinal fluid (CSF). CSF samples from 200 patients (TBM group = 150 and non-TBM group = 50) were tested for culture of Mycobacterium tuberculosis in BioFM liquid culture medium. Out of 150 TBM cases, 120 were found to be culture positive, indicating a sensitivity of 80% in BioFM broth within 2-3 weeks of inoculation. Positive cultures were also observed for CSF from 32 (64%) out of 50 non-TBM patients in BioFM liquid culture medium within 4 days of sample inoculation. Therefore, according to our study, BioFM broth system yielded 80% sensitivity [95% confidence interval (CI): 67-93%] and 36% specificity (95% CI: 57-98%) for TBM diagnosis. Our results indicate that although BioFM broth allows the detection of positive cultures within a shorter time, it has a high potential for contamination or for the coexistence of M. tuberculosis and non-tuberculous mycobacteria (NTM). This coexistence may go undetected or potentially lead to erroneous reporting of results.

  8. Identifying barriers and facilitators to participation in pressure ulcer prevention in allied healthcare professionals: a mixed methods evaluation.

    Science.gov (United States)

    Worsley, Peter R; Clarkson, Paul; Bader, Dan L; Schoonhoven, Lisette

    2017-09-01

    To evaluate the barriers and facilitators for allied health professionals' participation in pressure ulcer prevention. Mixed-method cohort study. Single-centre study in an acute university hospital trust. Five physiotherapists and four occupational therapists were recruited from the hospital trust. Therapists had been working in the National Health Service (NHS) for a minimum of one year. Therapists' views and experiences were collated using an audio-recorded focus group. This recording was analysed using constant comparison analysis. Secondary outcomes included assessment of attitudes and knowledge of pressure ulcer prevention using questionnaires. Key themes surrounding barriers to participation in pressure ulcer prevention included resources (staffing and equipment), education and professional boundaries. Fewer facilitators were described, with new training opportunities and communication being highlighted. Results from the questionnaires showed the therapists had a positive attitude towards pressure ulcer prevention, with a median score of 81% (range 50 to 83%). However, there were gaps in knowledge, with a median score of 69% (range 50 to 77%). The therapists reported several barriers to pressure ulcer prevention and few facilitators. The primary barriers were resources, equipment and education. Attitudes and knowledge in AHPs were comparable to data previously reported from experienced nursing staff. Copyright © 2016 Chartered Society of Physiotherapy. Published by Elsevier Ltd. All rights reserved.

  9. Multidimensional single cell based STAT phosphorylation profiling identifies a novel biosignature for evaluation of systemic lupus erythematosus activity.

    Directory of Open Access Journals (Sweden)

    Xinfang Huang

    Full Text Available INTRODUCTION: Dysregulated cytokine action on immune cells plays an important role in the initiation and progression of systemic lupus erythematosus (SLE), a complex autoimmune disease. Comprehensively quantifying basal STAT phosphorylation and the signaling response to cytokines should help us to better understand the etiology of SLE. METHODS: Phospho-specific flow cytometry was used to measure basal STAT signaling activation in three immune cell types of peripheral-blood mononuclear cells from 20 lupus patients, 9 rheumatoid arthritis (RA) patients and 13 healthy donors (HDs). A panel of 27 cytokines, including inflammatory cytokines, was measured with Bio-Plex™ Human Cytokine Assays. Serum prolactin levels were measured with an immunoradiometric assay. STAT signaling responses to inflammatory cytokines (interferon α [IFNα], IFNγ, interleukin 2 [IL2], IL6, and IL10) were also monitored. RESULTS: We observed the basal activation of STAT3 in SLE T cells and monocytes, and the basal activation of STAT5 in SLE T cells and B cells. The SLE samples clustered into two main groups, which were associated with the SLE Disease Activity Index 2000, their erythrocyte sedimentation rate, and their hydroxychloroquine use. The phosphorylation of STAT5 in B cells was associated with the cytokines IL2, granulocyte colony-stimulating factor (G-CSF), and IFNγ, whereas serum prolactin affected STAT5 activation in T cells. The responses of STAT1, STAT3, and STAT5 to IFNα were greatly reduced in SLE T cells, B cells, and monocytes, except for the STAT1 response to IFNα in monocytes. The response of STAT3 to IL6 was reduced in SLE T cells. CONCLUSIONS: The basal activation of STAT signaling and the reduced response to cytokines may help to identify the activity and severity of SLE.

  10. Evaluating a satellite-based seasonal evapotranspiration product and identifying its relationship with other satellite-derived products and crop yield: A case study for Ethiopia

    Science.gov (United States)

    Tadesse, Tsegaye; Senay, Gabriel B.; Berhan, Getachew; Regassa, Teshome; Beyene, Shimelis

    2015-08-01

    Satellite-derived evapotranspiration anomalies and normalized difference vegetation index (NDVI) products from Moderate Resolution Imaging Spectroradiometer (MODIS) data are currently used for African agricultural drought monitoring and food security status assessment. In this study, a process to evaluate satellite-derived evapotranspiration (ETa) products with a geospatial statistical exploratory technique that uses NDVI, satellite-derived rainfall estimate (RFE), and crop yield data has been developed. The main goal of this study was to evaluate the ETa using the NDVI and RFE, and identify a relationship between the ETa and Ethiopia's cereal crop (i.e., teff, sorghum, corn/maize, barley, and wheat) yields during the main rainy season. Since crop production is one of the main factors affecting food security, the evaluation of remote sensing-based seasonal ETa was done to identify the appropriateness of this tool as a proxy for monitoring vegetation condition in drought vulnerable and food insecure areas to support decision makers. The results of this study showed that the comparison between seasonal ETa and RFE produced strong correlation (R2 > 0.99) for all 41 crop growing zones in Ethiopia. The results of the spatial regression analyses of seasonal ETa and NDVI using Ordinary Least Squares and Geographically Weighted Regression showed relatively weak yearly spatial relationships (R2 products have a good predictive potential for these 31 identified zones in Ethiopia. Decision makers may potentially use ETa products for monitoring cereal crop yields and early warning of food insecurity during drought years for these identified zones.

  11. Evaluating bronchodilator response in pediatric patients with post-infectious bronchiolitis obliterans: use of different criteria for identifying airway reversibility.

    Science.gov (United States)

    Mattiello, Rita; Vidal, Paula Cristina; Sarria, Edgar Enrique; Pitrez, Paulo Márcio; Stein, Renato Tetelbom; Mocelin, Helena Teresinha; Fischer, Gilberto Bueno; Jones, Marcus Herbert; Pinto, Leonardo Araújo

    2016-01-01

    Post-infectious bronchiolitis obliterans (PIBO) is a clinical entity that has been classified as constrictive, fixed obstruction of the lumen by fibrotic tissue. However, recent studies using impulse oscillometry have reported bronchodilator responses in PIBO patients. The objective of this study was to evaluate bronchodilator responses in pediatric PIBO patients, comparing different criteria to define the response. We evaluated pediatric patients diagnosed with PIBO and treated at one of two pediatric pulmonology outpatient clinics in the city of Porto Alegre, Brazil. Spirometric parameters were measured in accordance with international recommendations. We included a total of 72 pediatric PIBO patients. The mean pre- and post-bronchodilator values were clearly lower than the reference values for all parameters, especially FEF25-75%. There were post-bronchodilator improvements. When measured as mean percent increases, FEV1 and FEF25-75% improved by 11% and 20%, respectively. However, when the absolute values were calculated, the mean FEV1 and FEF25-75% both increased by only 0.1 L. We found that age at the time of the viral insult, a family history of asthma, and allergy had no significant effects on bronchodilator responses. Pediatric patients with PIBO have peripheral airway obstruction that is responsive to treatment but is not completely reversible with a bronchodilator. The concept of PIBO as fixed, irreversible obstruction does not seem to apply to this population. Our data suggest that airway obstruction is variable in PIBO patients, a finding that could have major clinical implications.

  12. Using information communication technology to identify deficits in rural health care: a mixed-methods evaluation from Guatemala.

    Science.gov (United States)

    Wahedi, Katharina; Flores, Walter; Beiersmann, Claudia; Bozorgmehr, Kayvan; Jahn, Albrecht

    2018-01-01

    In August 2014, the Centre for the Studies of Equity and Governance in Health Systems (CEGSS) in Guatemala launched an online platform, which facilitates complaints about health services via text messages. The aim is to collect, systemise and forward such complaints to relevant institutions, and to create a data pool on perceived deficits of health care in rural Guatemala. To evaluate if the online platform is an accepted, user-friendly and efficient medium to engage citizens in the reporting of health care deficiencies in Guatemala. The general study design of this research was a mixed-method approach including a quantitative analysis of complaints received and a qualitative exploration of the attitude of community leaders towards the platform. User statistics showed that a total of N = 228 messages were sent to the platform in the period August 2014-March 2015. The majority of complaints (n = 162, 71%) fell under the 'lack of drugs, equipment or supplies' category. The community leaders welcomed the platform, describing it as modern and progressive. Despite feedback mechanisms and methods to respond to complaints not yet being fully developed, many users showed a high intrinsic motivation to use the new tool. Others, however, were restrained by fear of personal consequences and distrust of the state's judicial system. Access to mobile phones, reception, and phone credit or battery life did not pose major obstacles, but the producing and sending of correctly formatted messages was observed to be difficult. The online platform paired with SMS technology appears to be a viable approach to collect citizens' complaints in health care and connect citizens with relevant institutions. Further studies should be conducted to quantify follow-up activities and the impact on local health care provision.

  13. 77 FR 70484 - Preoperational Testing of Onsite Electric Power Systems To Verify Proper Load Group Assignments...

    Science.gov (United States)

    2012-11-26

    ...-1294, ``Preoperational Testing of On-Site Electric Power Systems to Verify Proper Load Group... entitled ``Preoperational Testing of On- Site Electric Power Systems to Verify Proper Load Group... Electric Power Systems to Verify Proper Load Group Assignments, Electrical Separation, and Redundancy...

  14. 31 CFR 363.14 - How will you verify my identity?

    Science.gov (United States)

    2010-07-01

    ... 31 Money and Finance: Treasury 2 2010-07-01 2010-07-01 false How will you verify my identity? 363... you verify my identity? (a) Individual. When you establish an account, we may use a verification service to verify your identity using information you provide about yourself on the online application. At...

  15. Is Dose Deformation–Invariance Hypothesis Verified in Prostate IGRT?

    Energy Technology Data Exchange (ETDEWEB)

    Simon, Antoine, E-mail: antoine.simon@univ-rennes1.fr [INSERM, U1099, 35000 Rennes (France); Laboratoire Traitement du Signal et de l' Image, Université de Rennes 1, 35000 Rennes (France); Le Maitre, Amandine; Nassef, Mohamed; Rigaud, Bastien [INSERM, U1099, 35000 Rennes (France); Laboratoire Traitement du Signal et de l' Image, Université de Rennes 1, 35000 Rennes (France); Castelli, Joël [INSERM, U1099, 35000 Rennes (France); Laboratoire Traitement du Signal et de l' Image, Université de Rennes 1, 35000 Rennes (France); Department of Radiotherapy, Centre Eugène Marquis, 35000 Rennes (France); Acosta, Oscar; Haigron, Pascal [INSERM, U1099, 35000 Rennes (France); Laboratoire Traitement du Signal et de l' Image, Université de Rennes 1, 35000 Rennes (France); Lafond, Caroline; Crevoisier, Renaud de [INSERM, U1099, 35000 Rennes (France); Laboratoire Traitement du Signal et de l' Image, Université de Rennes 1, 35000 Rennes (France); Department of Radiotherapy, Centre Eugène Marquis, 35000 Rennes (France)

    2017-03-15

    Purpose: To assess dose uncertainties resulting from the dose deformation–invariance hypothesis in prostate cone beam computed tomography (CT)–based image guided radiation therapy (IGRT), namely to evaluate whether rigidly propagated planned dose distribution enables good estimation of fraction dose distributions. Methods and Materials: Twenty patients underwent a CT scan for planning intensity modulated radiation therapy–IGRT delivering 80 Gy to the prostate, followed by weekly CT scans. Two methods were used to obtain the dose distributions on the weekly CT scans: (1) recalculating the dose using the original treatment plan; and (2) rigidly propagating the planned dose distribution. The cumulative doses were then estimated in the organs at risk for each dose distribution by deformable image registration. The differences between recalculated and propagated doses were finally calculated for the fraction and the cumulative dose distributions, by use of per-voxel and dose-volume histogram (DVH) metrics. Results: For the fraction dose, the mean per-voxel absolute dose difference was <1 Gy for 98% and 95% of the fractions for the rectum and bladder, respectively. The maximum dose difference within 1 voxel reached, however, 7.4 Gy in the bladder and 8.0 Gy in the rectum. The mean dose differences were correlated with gas volume for the rectum and patient external contour variations for the bladder. The mean absolute differences for the considered volume receiving greater than or equal to dose x (V_x) of the DVH were between 0.37% and 0.70% for the rectum and between 0.53% and 1.22% for the bladder. For the cumulative dose, the mean differences in the DVH were between 0.23% and 1.11% for the rectum and between 0.55% and 1.66% for the bladder. The largest dose difference was 6.86%, for bladder V_80Gy. The mean dose differences were <1.1 Gy for the rectum and <1 Gy for the bladder. Conclusions: The deformation–invariance hypothesis was
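    As a generic illustration of the two comparison quantities used above (the per-voxel absolute dose difference, and a DVH metric V_x understood as the percentage of an organ's voxels receiving at least x Gy), the sketch below computes both on synthetic dose arrays. It is not the study's code, and the dose values are random numbers.

      # Illustrative per-voxel and DVH comparisons between two dose distributions.
      import numpy as np

      def mean_abs_dose_difference(dose_a_gy, dose_b_gy):
          return float(np.mean(np.abs(np.asarray(dose_a_gy) - np.asarray(dose_b_gy))))

      def v_x(dose_gy, x_gy):
          dose = np.asarray(dose_gy)
          return 100.0 * np.count_nonzero(dose >= x_gy) / dose.size  # percent of voxels

      rng = np.random.default_rng(0)
      recalculated = rng.normal(45.0, 10.0, size=10_000)              # dose recomputed on the weekly CT
      propagated = recalculated + rng.normal(0.0, 0.8, size=10_000)   # rigidly propagated planned dose

      print(mean_abs_dose_difference(recalculated, propagated))
      print(v_x(recalculated, 40.0) - v_x(propagated, 40.0))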

  16. Evaluation of the use of shock index in identifying acute blood loss in healthy blood donor dogs.

    Science.gov (United States)

    McGowan, Erin E; Marryott, Kimberly; Drobatz, Kenneth J; Reineke, Erica L

    2017-09-01

    To determine if shock index (SI) would increase following blood donation and if it would be a more sensitive assessment of acute blood loss as compared with heart rate (HR), blood pressure, and plasma lactate. Prospective study. University teaching hospital. Twenty client-owned clinically normal dogs. Peripheral venous blood measurements and blood donation. Data were collected at 3 time points: prior to donation (T_pre), immediately after donation (T_0), and 10 minutes following completion of donation (T_10). HR and systolic blood pressure (SBP) were recorded and used to calculate SI at time points T_pre, T_0, and T_10. Packed cell volume (PCV), total plasma protein (TPP), and plasma lactate were evaluated from a peripheral venous blood sample at T_pre and T_10. The mean SI was significantly increased at both time points following blood donation as compared to baseline (SI_pre = 0.88 ± 0.19 vs SI_0 = 1.17 ± 0.21 vs SI_10 = 1.12 ± 0.25; P = 0.0002 and 0.0003, respectively). Following blood donation, the mean SBP was significantly lower (SBP_pre = 149 ± 24 mm Hg, SBP_0 = 118 ± 20 mm Hg; P = 0.0001, SBP_10 = 133 ± 21 mm Hg; P = 0.011). The mean HR was not significantly different at T_0 but was significantly increased at T_10 (HR_pre = 128 ± 21/min, HR_0 = 136 ± 25/min, P = 0.193; HR_10 = 146 ± 29/min, P = 0.003). There was no significant difference in mean PCV (PCV_pre = 50 ± 4%, PCV_10 = 48 ± 4%, P = 0.08). The mean TPP and plasma lactate were significantly different following donation but still within the reference interval (TPP_pre = 6.8 ± 0.4 g/dL, TPP_10 = 6.4 ± 0.4 g/dL, P = 0.0014; Lac_pre = 1.7 ± 0.7 mmol/L, Lac_10 = 1.9 ± 0.8 mmol/L, P = 0.04). A receiver operating characteristic (ROC) analysis comparing the area under the curve (AUC) for SI, HR, and SBP at T_0 and T_10 against T_pre found that SI (AUC at T_0: 0.858, CI: 0.730-0.984; AUC at T_10: 0.769, CI: 0.617-0.921) was a better indicator of blood loss.
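    Shock index itself is just the ratio of heart rate to systolic blood pressure. The short sketch below only illustrates that definition with values close to the group means reported above; it is not the study's analysis, which compared ROC curves across time points.

      # Illustrative only: shock index = heart rate / systolic blood pressure.
      def shock_index(heart_rate_bpm, systolic_bp_mmhg):
          return heart_rate_bpm / systolic_bp_mmhg

      pre_donation = shock_index(128, 149)   # approximately the pre-donation group means
      post_donation = shock_index(136, 118)  # approximately the immediate post-donation means
      print(round(pre_donation, 2), round(post_donation, 2))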

  17. Verifying the agreed framework between the United States and North Korea

    International Nuclear Information System (INIS)

    May, M.M.

    2001-01-01

    Under the 1994 Agreed Framework (AF) between the United States and the Democratic People's Republic of Korea (DPRK), the US and its allies will provide two nuclear-power reactors and other benefits to the DPRK in exchange for an agreement by the DPRK to declare how much nuclear-weapon material it has produced; to identify, freeze, and eventually dismantle specified facilities for producing this material; and to remain a party to the nuclear Non-Proliferation Treaty (NPT) and allow the implementation of its safeguards agreement. This study assesses the verifiability of these provisions. The study concludes verification can be accomplished, given cooperation and openness from the DPRK. Special effort will be needed from the IAEA, as well as support from the US and the Republic of Korea. (author)

  18. Getting What We Paid for: a Script to Verify Full Access to E-Resources

    Directory of Open Access Journals (Sweden)

    Kristina M. Spurgin

    2014-07-01

    Full Text Available Libraries regularly pay for packages of e-resources containing hundreds to thousands of individual titles. Ideally, library patrons could access the full content of all titles in such packages. In reality, library staff and patrons inevitably stumble across inaccessible titles, but no library has the resources to manually verify full access to all titles, and basic URL checkers cannot check for access. This article describes the E-Resource Access Checker—a script that automates the verification of full access. With the Access Checker, library staff can identify all inaccessible titles in a package and bring these problems to content providers’ attention to ensure we get what we pay for.
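    As a rough sketch of the general approach such a tool takes (this is not the E-Resource Access Checker itself, and the URL and denial phrases below are invented), a script can fetch each title's landing page and flag pages containing wording that a platform is known to show when full text is denied:

      # Illustrative access check: flag pages containing a known access-denial phrase.
      import urllib.request

      DENIAL_PHRASES = ["purchase this article", "sign in to access"]  # platform-specific, assumed

      def has_full_access(url, timeout=15):
          """Return False if the landing page contains a known access-denial phrase."""
          with urllib.request.urlopen(url, timeout=timeout) as response:
              page = response.read().decode("utf-8", errors="replace").lower()
          return not any(phrase in page for phrase in DENIAL_PHRASES)

      # Hypothetical usage over a list of title URLs:
      # for url in title_urls:
      #     if not has_full_access(url):
      #         print("No full text:", url)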

  19. Using the forced oscillation technique to evaluate bronchodilator response in healthy volunteers and in asthma patients presenting a verified positive response

    Directory of Open Access Journals (Sweden)

    Juliana Veiga Cavalcanti

    2006-04-01

    Full Text Available OBJECTIVE: To use the forced oscillation technique to evaluate asthma patients presenting positive bronchodilator responses (confirmed through spirometry) and to compare the results with those obtained in healthy individuals. METHODS: The study sample consisted of 53 non-smoking volunteers: 24 healthy subjects with no history of pulmonary disease and 29 asthmatics presenting a positive bronchodilator response, as determined through analysis of spirometry findings. All of the subjects were submitted to the forced oscillation technique and spirometry immediately before and 20 minutes after the administration of salbutamol spray (300 µg). The parameters derived from the forced oscillation technique were total respiratory resistance, total respiratory reactance, resistance extrapolated to the y axis, the slope of the resistance curve, and dynamic compliance. The parameters measured in the spirometry evaluation were forced expiratory volume in one second (FEV1) and forced vital capacity (FVC). RESULTS: In the control group, the use of the bronchodilator produced a significant change in the resistance extrapolated to the y axis.

  20. Sustainability in Health care by Allocating Resources Effectively (SHARE) 6: investigating methods to identify, prioritise, implement and evaluate disinvestment projects in a local healthcare setting.

    Science.gov (United States)

    Harris, Claire; Allen, Kelly; Brooke, Vanessa; Dyer, Tim; Waller, Cara; King, Richard; Ramsey, Wayne; Mortimer, Duncan

    2017-05-25

    This is the sixth in a series of papers reporting Sustainability in Health care by Allocating Resources Effectively (SHARE) in a local healthcare setting. The SHARE program was established to investigate a systematic, integrated, evidence-based approach to disinvestment within a large Australian health service. This paper describes the methods employed in undertaking pilot disinvestment projects. It draws a number of lessons regarding the strengths and weaknesses of these methods; particularly regarding the crucial first step of identifying targets for disinvestment. Literature reviews, survey, interviews, consultation and workshops were used to capture and process the relevant information. A theoretical framework was adapted for evaluation and explication of disinvestment projects, including a taxonomy for the determinants of effectiveness, process of change and outcome measures. Implementation, evaluation and costing plans were developed. Four literature reviews were completed, surveys were received from 15 external experts, 65 interviews were conducted, 18 senior decision-makers attended a data gathering workshop, 22 experts and local informants were consulted, and four decision-making workshops were undertaken. Mechanisms to identify disinvestment targets and criteria for prioritisation and decision-making were investigated. A catalogue containing 184 evidence-based opportunities for disinvestment and an algorithm to identify disinvestment projects were developed. An Expression of Interest process identified two potential disinvestment projects. Seventeen additional projects were proposed through a non-systematic nomination process. Four of the 19 proposals were selected as pilot projects but only one reached the implementation stage. Factors with potential influence on the outcomes of disinvestment projects are discussed and barriers and enablers in the pilot projects are summarised. This study provides an in-depth insight into the experience of disinvestment

  1. Procedures for measuring and verifying gastric tube placement in newborns: an integrative review.

    Science.gov (United States)

    Dias, Flávia de Souza Barbosa; Emidio, Suellen Cristina Dias; Lopes, Maria Helena Baena de Moraes; Shimo, Antonieta Keiko Kakuda; Beck, Ana Raquel Medeiros; Carmona, Elenice Valentim

    2017-07-10

    To investigate evidence in the literature on procedures for measuring gastric tube insertion in newborns and verifying its placement, using procedures alternative to radiological examination. An integrative review of the literature was carried out in the Cochrane, LILACS, CINAHL, EMBASE, MEDLINE and Scopus databases using the descriptors "Intubation, gastrointestinal" and "newborns" in original articles. Seventeen publications were included and categorized as "measuring method" or "technique for verifying placement". Regarding measuring methods, the measurement of two morphological distances and the application of two formulas, one based on weight and another based on height, were found. Regarding the techniques for assessing placement, the following were found: electromagnetic tracing, diaphragm electrical activity, CO2 detection, indigo carmine solution, epigastrium auscultation, gastric secretion aspiration, color inspection, and evaluation of pH, enzymes and bilirubin. The measuring method based on the distance from the nose to the earlobe to a point midway between the xiphoid process and the umbilicus presents the best evidence. Equations based on weight and height need to be experimentally tested. The return of secretion on tube aspiration, color assessment and secretion pH are reliable indicators to identify gastric tube placement, and are the currently indicated techniques.

  2. Measuring reporting verifying. A primer on MRV for nationally appropriate mitigation actions

    Energy Technology Data Exchange (ETDEWEB)

    Hinostroza, M. (ed.); Luetken, S.; Holm Olsen, K. (Technical Univ. of Denmark. UNEP Risoe Centre, Roskilde (Denmark)); Aalders, E.; Pretlove, B.; Peters, N. (Det Norske Veritas, Hellerup (Denmark))

    2012-03-15

    The requirements for measurement, reporting and verification (MRV) of nationally appropriate mitigation actions (NAMAs) are one of the crucial topics on the agenda of international negotiations to address climate change mitigation. According to the agreements reached so far, the general guidelines for domestic MRV are to be developed by the Subsidiary Body for Scientific and Technological Advice (SBSTA). Further, the Subsidiary Body for Implementation (SBI) will be conducting international consultations and analysis (ICA) of biennial update reports (BUR) to improve the transparency of mitigation actions, which should be measured, reported and verified. What is clear from the ongoing discussions both at SBSTA and at SBI is that MRV for NAMAs should not be a burden for controlling greenhouse gas (GHG) emissions connected to economic activities. Instead, the MRV process should facilitate mitigation actions, encourage the redirection of investments, and address concerns regarding the carbon content of emission-intensive operations of private and public companies and enterprises worldwide. While MRV requirements are being shaped within the Convention, there are a number of initiatives supporting developing countries moving forward with NAMA development and demonstration activities. How these actions shall be measured, reported and verified, however, remains unanswered. MRV is not new: it is present in most existing policies and frameworks related to climate change mitigation. With an aim to contribute to the international debate and capacity building on this crucial issue, the UNEP Risoe Centre, in cooperation with UNDP, is pleased to present this publication, which, through direct collaboration with Det Norske Veritas (DNV), builds on existing MRV practices in current carbon markets, provides insights on how MRV for NAMAs can be performed, and identifies elements and drivers to be considered when designing adequate MRV systems for NAMAs in developing countries. This primer is the second

  3. Reliability of coded data to identify earliest indications of cognitive decline, cognitive evaluation and Alzheimer's disease diagnosis: a pilot study in England.

    Science.gov (United States)

    Dell'Agnello, Grazia; Desai, Urvi; Kirson, Noam Y; Wen, Jody; Meiselbach, Mark K; Reed, Catherine C; Belger, Mark; Lenox-Smith, Alan; Martinez, Carlos; Rasmussen, Jill

    2018-03-22

    Evaluate the reliability of using diagnosis codes and prescription data to identify the timing of symptomatic onset, cognitive assessment and diagnosis of Alzheimer's disease (AD) among patients diagnosed with AD. This was a retrospective cohort study using the UK Clinical Practice Research Datalink (CPRD). The study cohort consisted of a random sample of 50 patients with first AD diagnosis in 2010-2013. Additionally, patients were required to have a valid text-field code and a hospital episode or a referral in the 3 years before the first AD diagnosis. The earliest indications of cognitive impairment, cognitive assessment and AD diagnosis were identified using two approaches: (1) using an algorithm based on diagnostic codes and prescription drug information and (2) using information compiled from manual review of both text-based and coded data. The reliability of the code-based algorithm for identifying the earliest dates of the three measures described earlier was evaluated relative to the comprehensive second approach. Additionally, common cognitive assessments (with and without results) were described for both approaches. The two approaches identified the same first dates of cognitive symptoms in 33 (66%) of the 50 patients, first cognitive assessment in 29 (58%) patients and first AD diagnosis in 43 (86%) patients. Allowing for the dates from the two approaches to be within 30 days, the code-based algorithm's success rates increased to 74%, 70% and 94%, respectively. Mini-Mental State Examination was the most commonly observed cognitive assessment in both approaches; however, of the 53 tests performed, only 19 results were observed in the coded data. The code-based algorithm shows promise for identifying the first AD diagnosis. However, the reliability of using coded data to identify earliest indications of cognitive impairment and cognitive assessments is questionable. Additionally, CPRD is not a recommended data source to identify results of cognitive
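    The code-based approach described above boils down to taking, per patient and per event category, the earliest coded date and checking whether it falls within a tolerance window of the date established by manual review. The sketch below only illustrates that logic; the field names, categories and dates are assumptions, not CPRD codes or the study's algorithm.

      # Illustrative earliest-date extraction and 30-day agreement check.
      from datetime import date

      def earliest_dates(coded_events):
          """coded_events: iterable of (patient_id, category, event_date) tuples."""
          earliest = {}
          for patient_id, category, event_date in coded_events:
              key = (patient_id, category)
              if key not in earliest or event_date < earliest[key]:
                  earliest[key] = event_date
          return earliest

      def agrees_within(code_date, review_date, window_days=30):
          return abs((code_date - review_date).days) <= window_days

      events = [(1, "cognitive_symptom", date(2011, 5, 2)),
                (1, "cognitive_symptom", date(2010, 11, 20)),
                (1, "ad_diagnosis", date(2012, 3, 1))]
      first = earliest_dates(events)
      print(agrees_within(first[(1, "cognitive_symptom")], date(2010, 12, 5)))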

  4. Reaching young women who sell sex: Methods and results of social mapping to describe and identify young women for DREAMS impact evaluation in Zimbabwe

    Science.gov (United States)

    Chiyaka, Tarisai; Mushati, Phillis; Hensen, Bernadette; Chabata, Sungai; Hargreaves, James R.; Floyd, Sian; Birdthistle, Isolde J.; Cowan, Frances M.; Busza, Joanna R.

    2018-01-01

    Young women (aged 15–24) who exchange sex for money or other support are among the highest risk groups for HIV acquisition, particularly in high prevalence settings. To prepare for introduction and evaluation of the DREAMS programme in Zimbabwe, which provides biomedical and social interventions to reduce adolescent girls’ and young women’s HIV vulnerability, we conducted a rapid needs assessment in 6 towns using a “social mapping” approach. In each site, we talked to adult sex workers and other key informants to identify locations where young women sell sex, followed by direct observation, group discussions and interviews. We collected data on socio-demographic characteristics of young women who sell sex, the structure and organisation of their sexual exchanges, interactions with each other and adult sex workers, and engagement with health services. Over a two-week period, we developed a “social map” for each study site, identifying similarities and differences across contexts and their implications for programming and research. Similarities include the concentration of younger women in street-based venues in town centres, their conflict with older sex workers due to competition for clients and acceptance of lower payments, and reluctance to attend existing services. Key differences were found in the 4 university towns included in our sample, where female students participate in diverse forms of sexual exchange but do not identify themselves as selling sex. In smaller towns where illegal gold panning or trucking routes were found, young women migrated in from surrounding rural areas specifically to sell sex. Young women who sell sex are different from each other, and do not work with or attend the same services as adult sex workers. Our findings are being used to inform appropriate intervention activities targeting these vulnerable young women, and to identify effective strategies for recruiting them into the DREAMS process and impact evaluations

  5. Reaching young women who sell sex: Methods and results of social mapping to describe and identify young women for DREAMS impact evaluation in Zimbabwe.

    Science.gov (United States)

    Chiyaka, Tarisai; Mushati, Phillis; Hensen, Bernadette; Chabata, Sungai; Hargreaves, James R; Floyd, Sian; Birdthistle, Isolde J; Cowan, Frances M; Busza, Joanna R

    2018-01-01

    Young women (aged 15-24) who exchange sex for money or other support are among the highest risk groups for HIV acquisition, particularly in high prevalence settings. To prepare for introduction and evaluation of the DREAMS programme in Zimbabwe, which provides biomedical and social interventions to reduce adolescent girls' and young women's HIV vulnerability, we conducted a rapid needs assessment in 6 towns using a "social mapping" approach. In each site, we talked to adult sex workers and other key informants to identify locations where young women sell sex, followed by direct observation, group discussions and interviews. We collected data on socio-demographic characteristics of young women who sell sex, the structure and organisation of their sexual exchanges, interactions with each other and adult sex workers, and engagement with health services. Over a two-week period, we developed a "social map" for each study site, identifying similarities and differences across contexts and their implications for programming and research. Similarities include the concentration of younger women in street-based venues in town centres, their conflict with older sex workers due to competition for clients and acceptance of lower payments, and reluctance to attend existing services. Key differences were found in the 4 university towns included in our sample, where female students participate in diverse forms of sexual exchange but do not identify themselves as selling sex. In smaller towns where illegal gold panning or trucking routes were found, young women migrated in from surrounding rural areas specifically to sell sex. Young women who sell sex are different from each other, and do not work with or attend the same services as adult sex workers. Our findings are being used to inform appropriate intervention activities targeting these vulnerable young women, and to identify effective strategies for recruiting them into the DREAMS process and impact evaluations.

  6. Reaching young women who sell sex: Methods and results of social mapping to describe and identify young women for DREAMS impact evaluation in Zimbabwe.

    Directory of Open Access Journals (Sweden)

    Tarisai Chiyaka

    Full Text Available Young women (aged 15-24) who exchange sex for money or other support are among the highest risk groups for HIV acquisition, particularly in high prevalence settings. To prepare for introduction and evaluation of the DREAMS programme in Zimbabwe, which provides biomedical and social interventions to reduce adolescent girls' and young women's HIV vulnerability, we conducted a rapid needs assessment in 6 towns using a "social mapping" approach. In each site, we talked to adult sex workers and other key informants to identify locations where young women sell sex, followed by direct observation, group discussions and interviews. We collected data on socio-demographic characteristics of young women who sell sex, the structure and organisation of their sexual exchanges, interactions with each other and adult sex workers, and engagement with health services. Over a two-week period, we developed a "social map" for each study site, identifying similarities and differences across contexts and their implications for programming and research. Similarities include the concentration of younger women in street-based venues in town centres, their conflict with older sex workers due to competition for clients and acceptance of lower payments, and reluctance to attend existing services. Key differences were found in the 4 university towns included in our sample, where female students participate in diverse forms of sexual exchange but do not identify themselves as selling sex. In smaller towns where illegal gold panning or trucking routes were found, young women migrated in from surrounding rural areas specifically to sell sex. Young women who sell sex are different from each other, and do not work with or attend the same services as adult sex workers. Our findings are being used to inform appropriate intervention activities targeting these vulnerable young women, and to identify effective strategies for recruiting them into the DREAMS process and impact evaluations.

  7. Isotope correlation techniques for verifying input accountability measurements at a reprocessing plant

    International Nuclear Information System (INIS)

    Umezawa, H.; Nakahara, Y.

    1983-01-01

    Isotope correlation techniques were studied to verify input accountability measurements at a reprocessing plant. On the basis of a historical data bank, correlation between plutonium-to-uranium ratio and isotopic variables was derived as a function of burnup. The burnup was determined from the isotopic ratios of uranium and plutonium, too. Data treatment was therefore made in an iterative manner. The isotopic variables were defined to cover a wide spectrum of isotopes of uranium and plutonium. The isotope correlation techniques evaluated important parameters such as the fuel burnup, the most probable ratio of plutonium to uranium, and the amounts of uranium and plutonium in reprocessing batches in connection with fresh fuel fabrication data. In addition, the most probable values of isotope abundance of plutonium and uranium could be estimated from the plutonium-to-uranium ratio determined, being compared with the reported data for verification. A pocket-computer-based system was developed to enable inspectors to collect and evaluate data in a timely fashion at the input accountability measurement point by the isotope correlation techniques. The device is supported by battery power and completely independent of the operator's system. The software of the system was written in BASIC. The data input can be stored in a cassette tape and transferred into a higher level computer. The correlations used for the analysis were given as a form of analytical function. Coefficients for the function were provided relevant to the type of reactor and the initial enrichment of fuel. (author)
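
    As an illustration of the kind of check described above, the sketch below estimates burnup from measured isotope ratios, predicts the plutonium-to-uranium ratio from it, and flags a reprocessing batch whose declared value deviates too far. The correlation functions and every coefficient are hypothetical placeholders; the real correlations depend on reactor type and initial enrichment and, as the abstract notes, are evaluated iteratively.

```python
# Minimal sketch of an isotope-correlation consistency check on input
# accountability data. All coefficients are illustrative placeholders, not the
# correlations used in the paper.

def burnup_from_isotopics(u235_ratio, pu240_ratio, a=40.0, b=25.0):
    """Hypothetical correlation: burnup (GWd/t) from measured isotope ratios."""
    return a * (1.0 - u235_ratio) + b * pu240_ratio

def pu_to_u_from_burnup(burnup, c=0.0095, d=1.0e-5):
    """Hypothetical correlation: Pu/U mass ratio as a function of burnup."""
    return c + d * burnup

def verify_batch(u235_ratio, pu240_ratio, declared_pu_to_u, tol=0.05):
    """Compare a batch's declared Pu/U ratio with the correlation prediction;
    flag the batch if the relative deviation exceeds tol."""
    bu = burnup_from_isotopics(u235_ratio, pu240_ratio)
    predicted = pu_to_u_from_burnup(bu)
    rel_dev = abs(declared_pu_to_u - predicted) / predicted
    return bu, predicted, rel_dev <= tol

if __name__ == "__main__":
    bu, pred, ok = verify_batch(u235_ratio=0.008, pu240_ratio=0.24,
                                declared_pu_to_u=0.0098)
    print(f"estimated burnup ~{bu:.1f} GWd/t, predicted Pu/U {pred:.4f}, consistent: {ok}")
```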

  8. Qualification, training, licensing/authorization and retraining of operating personnel in nuclear power plants. Noteworthy topics identified by evaluation of the practices in countries of the European Communities

    International Nuclear Information System (INIS)

    Kraut, A.; Pfeffer, W.

    1987-01-01

    In the report EUR 10118, "Qualification, training, licensing and retraining of operating shift personnel in nuclear power plants", the current practice in the countries of the European Communities, as well as the procedures and programmes applied in Sweden, Switzerland and the USA, is outlined and evaluated. The intent was to derive fundamental and generally valid concepts concerning shift-staff training and other relevant aspects. Items were identified that seemed noteworthy because they give guidance on how to achieve and maintain the qualification of the shift staff of NPPs or how to improve the staffing of the control room. These noteworthy topics, identified by evaluation of the practice in countries of the European Communities and elsewhere, are presented in the publication at hand. The report addresses the following topics: tasks of the shift personnel; nomenclature for different grades of the personnel; shift staffing and staffing of the control room; criteria for personnel selection when recruiting new shift staff; personnel qualification necessary for recruitment; training of shift personnel; retraining and preservation of qualification standards; training facilities, especially simulators; responsibility for training; licensing/authorization; and retirement from shift work. Consideration of these more general aspects and concepts may lead to improvements in training. The job descriptions given in the Annex to the document are only intended to give a general understanding of the typical designations, tasks and responsibilities of shift staff.

  9. Validating the use of the evaluation tool of children's handwriting-manuscript to identify handwriting difficulties and detect change in school-age children.

    Science.gov (United States)

    Brossard-Racine, Marie; Mazer, Barbara; Julien, Marilyse; Majnemer, Annette

    2012-01-01

    In this study we sought to validate the discriminant ability of the Evaluation Tool of Children's Handwriting-Manuscript in identifying children in Grades 2-3 with handwriting difficulties and to determine the percentage of change in handwriting scores that is consistently detected by occupational therapists. Thirty-four therapists judged and compared 35 pairs of handwriting samples. Receiver operating characteristic (ROC) analyses were performed to determine (1) the optimal cutoff values for word and letter legibility scores that identify children with handwriting difficulties who should be seen in rehabilitation and (2) the minimal clinically important difference (MCID) in handwriting scores. Cutoff scores of 75.0% for total word legibility and 76.0% for total letter legibility were found to provide excellent levels of accuracy. Differences of 10.0%-12.5% for total word legibility and 6.0%-7.0% for total letter legibility were found to be the MCID. Study findings enable therapists to quantitatively support clinical judgment when evaluating handwriting. Copyright © 2012 by the American Occupational Therapy Association, Inc.
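
    The cutoff-selection step can be illustrated with a standard ROC analysis. The sketch below uses synthetic legibility scores and scikit-learn's roc_curve to pick the threshold that maximises Youden's J; the study's actual data, criterion standard and chosen operating point differ.

```python
# Sketch of deriving a legibility cutoff from dichotomous judgements via ROC
# analysis (Youden's J). The data below are synthetic, not study data.
import numpy as np
from sklearn.metrics import roc_curve

rng = np.random.default_rng(0)
# 1 = judged to have handwriting difficulties, 0 = typical
difficulty = rng.integers(0, 2, size=200)
# synthetic word-legibility scores (%): lower for children judged as having difficulties
legibility = np.where(difficulty == 1,
                      rng.normal(65, 10, 200),
                      rng.normal(85, 8, 200)).clip(0, 100)

# roc_curve expects higher score = positive class, so pass the negated score
fpr, tpr, thresholds = roc_curve(difficulty, -legibility)
youden_j = tpr - fpr
best = np.argmax(youden_j)
print(f"suggested cutoff: legibility < {-thresholds[best]:.1f}% "
      f"(sensitivity={tpr[best]:.2f}, specificity={1 - fpr[best]:.2f})")
```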

  10. Evaluating a satellite-based seasonal evapotranspiration product and identifying its relationship with other satellite-derived products and crop yield: A case study for Ethiopia

    Science.gov (United States)

    Tadesse, Tsegaye; Senay, Gabriel B.; Berhan, Getachew; Regassa, Teshome; Beyene, Shimelis

    2015-01-01

    Satellite-derived evapotranspiration anomalies and normalized difference vegetation index (NDVI) products from Moderate Resolution Imaging Spectroradiometer (MODIS) data are currently used for African agricultural drought monitoring and food security status assessment. In this study, a process to evaluate satellite-derived evapotranspiration (ETa) products with a geospatial statistical exploratory technique that uses NDVI, satellite-derived rainfall estimate (RFE), and crop yield data has been developed. The main goal of this study was to evaluate the ETa using the NDVI and RFE, and identify a relationship between the ETa and Ethiopia’s cereal crop (i.e., teff, sorghum, corn/maize, barley, and wheat) yields during the main rainy season. Since crop production is one of the main factors affecting food security, the evaluation of remote sensing-based seasonal ETa was done to identify the appropriateness of this tool as a proxy for monitoring vegetation condition in drought vulnerable and food insecure areas to support decision makers. The results of this study showed that the comparison between seasonal ETa and RFE produced strong correlation (R2 > 0.99) for all 41 crop growing zones in Ethiopia. The results of the spatial regression analyses of seasonal ETa and NDVI using Ordinary Least Squares and Geographically Weighted Regression showed relatively weak yearly spatial relationships (R2 < 0.7) for all cropping zones. However, for each individual crop zone, the correlation between NDVI and ETa ranged between 0.3 and 0.84 for about 44% of the cropping zones. Similarly, for each individual crop zone, the correlation (R2) between the seasonal ETa anomaly and de-trended cereal crop yield was between 0.4 and 0.82 for 76% (31 out of 41) of the crop growing zones. The preliminary results indicated that the ETa products have a good predictive potential for these 31 identified zones in Ethiopia. Decision makers may potentially use ETa products for monitoring cereal

  11. Eysenbach, Tuische and Diepgen’s Evaluation of Web Searching for Identifying Unpublished Studies for Systematic Reviews: An Innovative Study Which is Still Relevant Today.

    Directory of Open Access Journals (Sweden)

    Simon Briscoe

    2016-09-01

    Full Text Available A Review of: Eysenbach, G., Tuische, J. & Diepgen, T.L. (2001). Evaluation of the usefulness of Internet searches to identify unpublished clinical trials for systematic reviews. Medical Informatics and the Internet in Medicine, 26(3), 203-218. http://dx.doi.org/10.1080/14639230110075459 Objective – To consider whether web searching is a useful method for identifying unpublished studies for inclusion in systematic reviews. Design – Retrospective web searches using the AltaVista search engine were conducted to identify unpublished studies – specifically, clinical trials – for systematic reviews which did not use a web search engine. Setting – The Department of Clinical Social Medicine, University of Heidelberg, Germany. Subjects – n/a Methods – Pilot testing of 11 web search engines was carried out to determine which could handle complex search queries. Pre-specified search requirements included the ability to handle Boolean and proximity operators, and truncation searching. A total of seven Cochrane systematic reviews were randomly selected from the Cochrane Library Issue 2, 1998, and their bibliographic database search strategies were adapted for the web search engine, AltaVista. Each adaptation combined search terms for the intervention, problem, and study type in the systematic review. Hints to planned, ongoing, or unpublished studies retrieved by the search engine, which were not cited in the systematic reviews, were followed up by visiting websites and contacting authors for further details when required. The authors of the systematic reviews were then contacted and asked to comment on the potential relevance of the identified studies. Main Results – Hints to 14 unpublished and potentially relevant studies, corresponding to 4 of the 7 randomly selected Cochrane systematic reviews, were identified. Out of the 14 studies, 2 were considered irrelevant to the corresponding systematic review by the systematic review authors. The

  12. An approach for verifying biogenic greenhouse gas emissions inventories with atmospheric CO2 concentration data

    Science.gov (United States)

    Stephen M Ogle; Kenneth Davis; Thomas Lauvaux; Andrew Schuh; Dan Cooley; Tristram O West; Linda S Heath; Natasha L Miles; Scott Richardson; F Jay Breidt; James E Smith; Jessica L McCarty; Kevin R Gurney; Pieter Tans; A Scott. Denning

    2015-01-01

    Verifying national greenhouse gas (GHG) emissions inventories is a critical step to ensure that reported emissions data to the United Nations Framework Convention on Climate Change (UNFCCC) are accurate and representative of a country's contribution to GHG concentrations in the atmosphere. Furthermore, verifying biogenic fluxes provides a check on estimated...

  13. A new concept and a comprehensive evaluation of SYSMEX UF-1000i flow cytometer to identify culture-negative urine specimens in patients with UTI.

    Science.gov (United States)

    Monsen, T; Ryden, P

    2017-09-01

    Urinary tract infections (UTIs) are among the most common bacterial infections in men and urine culture is the gold standard for diagnosis. Considering the high prevalence of culture-negative specimens, any method that identifies such specimens is of interest. The aim was to evaluate a new screening concept for flow cytometry analysis (FCA). The outcomes were evaluated against urine culture, uropathogen species and three conventional screening methods. A prospective, consecutive study examined 1,312 urine specimens, collected during January and February 2012. The specimens were analyzed using the Sysmex UF-1000i FCA. Based on the FCA data, culture-negative specimens were identified in a new model by use of linear discriminant analysis (FCA-LDA). In total 1,312 patients were included. In- and outpatients represented 19.6% and 79.4%, respectively; 68.3% of the specimens originated from women. Of the 610 culture-positive specimens, Escherichia coli represented 64%, enterococci 8% and Klebsiella spp. 7%. Screening with FCA-LDA at 95% sensitivity identified 42% (552/1312) as culture-negative specimens when UTI was defined according to European guidelines. The proposed screening method was either superior or similar in comparison to the three conventional screening methods. In conclusion, the proposed FCA-LDA screening method was superior or similar to three conventional screening methods. We recommend that the proposed screening method be used in the clinic to exclude culture-negative specimens and to reduce workload, costs and turnaround time. In addition, the FCA data may add information that enhances handling and supports diagnosis of patients with suspected UTI pending urine culture.
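
    A minimal sketch of the screening idea, assuming synthetic flow-cytometry channels: fit a linear discriminant, set the decision threshold so that 95% of culture-positive specimens remain above it, and report how many specimens would be screened out as culture negative. The channel names, distributions and the way the 95% target is applied are illustrative, not the study's algorithm.

```python
# Sketch of an FCA-LDA style screen on synthetic flow-cytometry data.
import numpy as np
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis

rng = np.random.default_rng(1)
n = 1000
culture_pos = rng.random(n) < 0.45
# two illustrative FCA channels: log bacteria count and log WBC count
bacteria = np.where(culture_pos, rng.normal(5.0, 1.0, n), rng.normal(3.0, 1.0, n))
wbc = np.where(culture_pos, rng.normal(2.5, 0.8, n), rng.normal(1.5, 0.8, n))
X = np.column_stack([bacteria, wbc])

lda = LinearDiscriminantAnalysis().fit(X, culture_pos)
p_pos = lda.predict_proba(X)[:, 1]

# threshold chosen so that 95% of culture-positive specimens stay above it
threshold = np.quantile(p_pos[culture_pos], 0.05)
screened_negative = p_pos < threshold
print(f"threshold={threshold:.3f}, "
      f"screened out as culture-negative: {screened_negative.mean():.0%}, "
      f"sensitivity retained: {(p_pos[culture_pos] >= threshold).mean():.0%}")
```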

  14. Verifying Safety Messages Using Relative-Time and Zone Priority in Vehicular Ad Hoc Networks

    Science.gov (United States)

    Banani, Sam; Thiemjarus, Surapa; Kittipiyakul, Somsak

    2018-01-01

    In high-density road networks, with each vehicle broadcasting multiple messages per second, the arrival rate of safety messages can easily exceed the rate at which digital signatures can be verified. Since not all messages can be verified, algorithms for selecting which messages to verify are required to ensure that each vehicle receives appropriate awareness about neighbouring vehicles. This paper presents a novel scheme to select important safety messages for verification in vehicular ad hoc networks (VANETs). The proposed scheme uses the location and direction of the sender, as well as proximity and relative-time between vehicles, to reduce the number of irrelevant messages verified (i.e., messages from vehicles that are unlikely to cause an accident). Compared with other existing schemes, the analysis results show that the proposed scheme can verify messages from nearby vehicles with lower inter-message delay and reduced packet loss and thus provides a high level of awareness of the nearby vehicles. PMID:29652840
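
    A toy sketch of a verification-queue policy in this spirit: score each buffered message by sender proximity and message age, then verify the top-scoring messages within the per-interval budget. The scoring form, weights and message fields are invented for illustration; the scheme in the paper additionally uses sender direction and zone priority.

```python
# Illustrative priority-based selection of safety messages for signature verification.
import math
from dataclasses import dataclass

@dataclass
class SafetyMessage:
    sender_x: float      # sender position (m) relative to the receiver
    sender_y: float
    recv_time: float     # local receive timestamp (s)
    gen_time: float      # generation timestamp carried in the message (s)

def priority(msg: SafetyMessage, now: float,
             w_dist: float = 1.0, w_age: float = 2.0) -> float:
    """Higher score = verify sooner: closer senders and fresher messages win."""
    dist = math.hypot(msg.sender_x, msg.sender_y)
    age = max(now - msg.gen_time, 0.0)
    return w_dist / (1.0 + dist) - w_age * age

def select_for_verification(queue, now, budget):
    """Pick the 'budget' messages the verification unit can handle this interval."""
    return sorted(queue, key=lambda m: priority(m, now), reverse=True)[:budget]

if __name__ == "__main__":
    now = 10.0
    queue = [SafetyMessage(30, 5, 9.98, 9.97),
             SafetyMessage(250, 40, 9.99, 9.60),
             SafetyMessage(12, -3, 9.95, 9.94)]
    for m in select_for_verification(queue, now, budget=2):
        print(m)
```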

  15. Analytic solution to verify code predictions of two-phase flow in a boiling water reactor core channel

    International Nuclear Information System (INIS)

    Chen, K.F.; Olson, C.A.

    1983-01-01

    One reliable method that can be used to verify the solution scheme of a computer code is to compare the code prediction to a simplified problem for which an analytic solution can be derived. An analytic solution for the axial pressure drop as a function of the flow was obtained for the simplified problem of homogeneous equilibrium two-phase flow in a vertical, heated channel with a cosine axial heat flux shape. This analytic solution was then used to verify the predictions of the CONDOR computer code, which is used to evaluate the thermal-hydraulic performance of boiling water reactors. The results show excellent agreement between the analytic solution and the CONDOR prediction.
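
    To make the benchmark idea concrete, the sketch below evaluates a homogeneous-equilibrium quality profile for a vertical heated channel with a cosine (chopped-sine from the inlet) axial heat-flux shape and integrates the gravitational component of the pressure drop. The geometry and property values are illustrative, not the CONDOR test case, and the friction and acceleration terms of the full pressure-drop balance are omitted.

```python
# Homogeneous-equilibrium quality profile and gravitational pressure drop for a
# heated channel with a cosine-shaped axial heat flux (illustrative values only).
import numpy as np

L = 3.7          # heated length (m)
G = 1200.0       # mass flux (kg/m^2/s)
A = 1.0e-4       # flow area (m^2)
q_avg = 1.5e4    # channel-average linear heat rate (W/m)
h_in = 1.20e6    # inlet enthalpy (J/kg)
h_f, h_g = 1.274e6, 2.771e6     # sat. liquid/vapour enthalpy (J/kg), ~7 MPa
rho_f, rho_g = 740.0, 36.5      # sat. liquid/vapour densities (kg/m^3)
g = 9.81

z = np.linspace(0.0, L, 200)
# cosine shape about the mid-plane, written as sin(pi z / L) from the inlet,
# scaled so that its axial average equals q_avg
q_lin = q_avg * (np.pi / 2.0) * np.sin(np.pi * z / L)
m_dot = G * A

# energy balance: enthalpy rise = integral of linear heat rate / mass flow
heat_integral = np.concatenate(([0.0],
    np.cumsum(np.diff(z) * 0.5 * (q_lin[1:] + q_lin[:-1]))))
h = h_in + heat_integral / m_dot
x = np.clip((h - h_f) / (h_g - h_f), 0.0, 1.0)      # equilibrium quality
rho_m = 1.0 / (x / rho_g + (1.0 - x) / rho_f)       # homogeneous mixture density

dp_gravity = np.trapz(rho_m * g, z)                 # Pa
print(f"exit quality {x[-1]:.3f}, gravitational pressure drop {dp_gravity/1e3:.1f} kPa")
```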

  16. Verifying the functional ability of microstructured surfaces by model-based testing

    Science.gov (United States)

    Hartmann, Wito; Weckenmann, Albert

    2014-09-01

    Micro- and nanotechnology enables the use of new product features such as improved light absorption, self-cleaning or protection, which are based, on the one hand, on the size of functional nanostructures and, on the other hand, on material-specific properties. With the need to reliably measure progressively smaller geometric features, coordinate and surface-measuring instruments have been refined and now allow high-resolution topography and structure measurements down to the sub-nanometre range. Nevertheless, in many cases it is not possible to make a clear statement about the functional ability of the workpiece or its topography because conventional concepts of dimensioning and tolerancing are solely geometry oriented and standardized surface parameters are not sufficient to consider interaction with non-geometric parameters, which are dominant for functions such as sliding, wetting, sealing and optical reflection. To verify the functional ability of microstructured surfaces, a method was developed based on a parameterized mathematical-physical model of the function. From this model, function-related properties can be identified and geometric parameters can be derived, which may be different for the manufacturing and verification processes. With this method it is possible to optimize the definition of the shape of the workpiece regarding the intended function by applying theoretical and experimental knowledge, as well as modelling and simulation. Advantages of this approach will be discussed and demonstrated by the example of a microstructured inking roll.

  17. Evaluation of the feasibility and performance of early warning scores to identify patients at risk of adverse outcomes in a low-middle income country setting

    Science.gov (United States)

    Beane, Abi; De Silva, Ambepitiyawaduge Pubudu; De Silva, Nirodha; Sujeewa, Jayasingha A; Rathnayake, R M Dhanapala; Sigera, P Chathurani; Athapattu, Priyantha Lakmini; Mahipala, Palitha G; Rashan, Aasiyah; Munasinghe, Sithum Bandara; Jayasinghe, Kosala Saroj Amarasiri; Dondorp, Arjen M; Haniffa, Rashan

    2018-01-01

    Objective This study describes the availability of core parameters for Early Warning Scores (EWS), evaluates the ability of selected EWS to identify patients at risk of death or other adverse outcome and describes the burden of triggering that front-line staff would experience if implemented. Design Longitudinal observational cohort study. Setting District General Hospital Monaragala. Participants All adult (age >17 years) admitted patients. Main outcome measures Existing physiological parameters, adverse outcomes and survival status at hospital discharge were extracted daily from existing paper records for all patients over an 8-month period. Statistical analysis Discrimination for selected aggregate weighted track and trigger systems (AWTTS) was assessed by the area under the receiver operating characteristic (AUROC) curve. Performance of EWS is further evaluated at time points during admission and across diagnostic groups. The burden of triggering required to correctly identify patients who died was evaluated using positive predictive value (PPV). Results Of the 16 386 patients included, 502 (3.06%) had one or more adverse outcomes (cardiac arrests, unplanned intensive care unit admissions and transfers). Availability of physiological parameters on admission ranged from 90.97% (95% CI 90.52% to 91.40%) for heart rate to 23.94% (95% CI 23.29% to 24.60%) for oxygen saturation. Ability to discriminate death on admission was less than 0.81 (AUROC) for all selected EWS. Performance of the best performing of the EWS varied depending on admission diagnosis, and was diminished at 24 hours prior to event. PPV was low (10.44%). Conclusion There is limited observation reporting in this setting. Indiscriminate application of EWS to all patients admitted to wards in this setting may result in an unnecessary burden of monitoring and may detract from clinician care of sicker patients. Physiological parameters in combination with diagnosis may have a place when applied on admission to
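
    The discrimination and burden metrics reported here are straightforward to reproduce on any dataset. The sketch below uses synthetic admissions to compute an AUROC for an aggregate score against death, and the PPV and trigger rate at an illustrative trigger threshold; the event rate, score distribution and threshold are all made up.

```python
# Discrimination (AUROC) and trigger burden (PPV, trigger rate) for a synthetic
# early-warning score; illustrative only, not the study data.
import numpy as np
from sklearn.metrics import roc_auc_score

rng = np.random.default_rng(7)
n = 5000
died = rng.random(n) < 0.03                     # ~3% adverse outcome rate
# synthetic admission EWS: slightly higher on average for patients who died
ews = np.where(died, rng.poisson(5, n), rng.poisson(3, n))

auroc = roc_auc_score(died, ews)

trigger = ews >= 5                              # illustrative trigger threshold
ppv = died[trigger].mean()                      # proportion of triggers that are 'true'
trigger_rate = trigger.mean()
print(f"AUROC={auroc:.2f}, trigger rate={trigger_rate:.1%}, PPV={ppv:.1%}")
```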

  18. Epidemiology of Injuries Identified at the NFL Scouting Combine and Their Impact on Performance in the National Football League: Evaluation of 2203 Athletes From 2009 to 2015.

    Science.gov (United States)

    Beaulieu-Jones, Brendin R; Rossy, William H; Sanchez, George; Whalen, James M; Lavery, Kyle P; McHale, Kevin J; Vopat, Bryan G; Van Allen, Joseph J; Akamefula, Ramesses A; Provencher, Matthew T

    2017-07-01

    At the annual National Football League (NFL) Scouting Combine, the medical staff of each NFL franchise performs a comprehensive medical evaluation of all athletes potentially entering the NFL. Currently, little is known regarding the overall epidemiology of injuries identified at the combine and their impact on NFL performance. To determine the epidemiology of injuries identified at the combine and their impact on initial NFL performance. Cohort study; Level of evidence, 3. All previous musculoskeletal injuries identified at the NFL Combine from 2009 to 2015 were retrospectively reviewed. Medical records and imaging reports were examined. Game statistics for the first 2 seasons of NFL play were obtained for all players from 2009 to 2013. Analysis of injury prevalence and overall impact on the draft status and position-specific performance metrics of each injury was performed and compared with a position-matched control group with no history of injury or surgery. A total of 2203 athletes over 7 years were evaluated, including 1490 (67.6%) drafted athletes and 1040 (47.2%) who ultimately played at least 2 years in the NFL. The most common sites of injury were the ankle (1160, 52.7%), shoulder (1143, 51.9%), knee (1128, 51.2%), spine (785, 35.6%), and hand (739, 33.5%). Odds ratios (ORs) demonstrated that quarterbacks were most at risk of shoulder injury (OR, 2.78; P = .001), while running backs most commonly sustained ankle (OR, 1.39; P = .040) and shoulder injuries (OR, 1.55; P = .020) when compared with all other players. Ultimately, defensive players demonstrated a greater negative impact due to injury than offensive players, with multiple performance metrics significantly affected for each defensive position analyzed, whereas skilled offensive players (eg, quarterbacks, running backs) demonstrated only 1 metric significantly affected at each position. The most common sites of injury identified at the combine were (1) ankle, (2) shoulder, (3) knee, (4) spine, and

  19. Informing decision makers and identifying niche opportunities for windpower: use of multiattribute trade off analysis to evaluate non-dispatchable resources

    International Nuclear Information System (INIS)

    Connors, S.R.

    1996-01-01

    The operational and flexibility characteristics of renewable energy technologies are often overlooked in traditional head to head technology comparisons. This impedes their adoption since identification of environmental and risk mitigation advantages requires evaluation of such non-dispatchable technologies in a systemwide context. Use of multiattribute resource planning tools in a trade off analysis framework identifies the complementary emissions reduction and fuel diversification characteristics of renewables. Data visualization using trade off analysis communicates electric resource interactions and the risks of following various strategies to diverse stakeholder audiences, promoting acceptance. This paper provides an overview of the multiattribute trade off approach and applies it to resource strategies incorporating windpower in the New England regional power system. Examples focus on the interaction of wind resources with demand-side management and supply-side options under fuel cost uncertainty. (Author)

  20. Evaluation of the angular response of LaBr₃(Ce) and NaI(Tl) radiological identifiers for emergency situations attendance

    Energy Technology Data Exchange (ETDEWEB)

    Izidório, Ana C.A.C.; Cardoso, Domingos D’O.; Oliveira, Luciano S.R.; Balthar, Mario C.V.; Amorim, Aneuri S. de; Santos, Avelino dos; Guimarães Junior, Walter J.; Arbach, Mayara N., E-mail: carolizidorio@hotmail.com, E-mail: domingos.oliveiralvr71@gmail.com, E-mail: walter_guimaraes@ime.eb.br, E-mail: lucianosantarita@gmail.com, E-mail: mariobalthar@gmail.com, E-mail: aneurideamorim@gmail.com, E-mail: hiperav@gmail.com, E-mail: mayaraarbach@gmail.com [Instituto Militar de Engenharia (IME), Rio de Janeiro, RJ (Brazil); Instituto de Defesa Química, Biológica, Radiológica e Nuclear (IDQBRN/CTEx), Barra de Guaratiba, RJ (Brazil); Instituto de Radioprotecao e Dosimetria (IRD/CNEN-RJ), Rio de Janeiro, RJ (Brazil)

    2017-11-01

    The Institute of Chemical, Biological, Radiological and Nuclear Defense (IDQBRN) of the Brazilian Army has been developing activities aimed at characterizing radiological detectors for use during emergency situations and radiological incidents and also for research and academic activities. This work describes the experiments performed in order to evaluate the angular response of LaBr₃(Ce) and NaI(Tl) portable radiological identifiers (PRI) with scintillator crystal detectors measuring 1.5” x 1.5” and 3.0” x 1.5”, respectively. A ¹³⁷Cs source with corrected activity of 2.623 GBq (July 29, 2017) supplied the beam for the experiments. It was positioned at a distance of 1.00 m from the PRIs, together with attenuators, in order to yield different ambient dose equivalent rates, H*(10), thus allowing the adjustment of the counting statistics and the analysis of the responses obtained. The objective of this work was to evaluate the angular dependence of the response of LaBr₃(Ce) and NaI(Tl) PRIs exposed to a ¹³⁷Cs source by azimuthally varying the angle of incidence of the primary beam within the 0° ± 30° range, measured from the center of the sensitive volume of the scintillators. The PRIs were programmed to reach a maximum count of 10⁶ in order to ensure that the relative uncertainty of the measured data would be negligible, which led to improved angular response data, in addition to higher correlation factors and greater reliability in the responses obtained with the LaBr₃(Ce) and NaI(Tl) portable radiological identifiers. (author)
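
    The 10⁶-count stopping criterion can be read as a counting-statistics choice: under the usual Poisson assumption, the relative uncertainty of a gross count N is about 1/√N, as the short calculation below shows.

```python
# Poisson counting statistics: relative uncertainty of a gross count N ~ 1/sqrt(N).
for n_counts in (1e4, 1e5, 1e6):
    print(f"N = {n_counts:.0e}: relative uncertainty ~ {1 / n_counts**0.5:.2%}")
```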

  1. Learning to identify Protected Health Information by integrating knowledge- and data-driven algorithms: A case study on psychiatric evaluation notes.

    Science.gov (United States)

    Dehghan, Azad; Kovacevic, Aleksandar; Karystianis, George; Keane, John A; Nenadic, Goran

    2017-11-01

    De-identification of clinical narratives is one of the main obstacles to making healthcare free text available for research. In this paper we describe our experience in expanding and tailoring two existing tools as part of the 2016 CEGS N-GRID Shared Tasks Track 1, which evaluated de-identification methods on a set of psychiatric evaluation notes for up to 25 different types of Protected Health Information (PHI). The methods we used rely on machine learning on either a large or small feature space, with additional strategies, including two-pass tagging and multi-class models, which both proved to be beneficial. The results show that the integration of the proposed methods can identify Health Insurance Portability and Accountability Act (HIPAA)-defined PHIs with overall F1-scores of ~90% and above. Yet, some classes (Profession, Organization) proved again to be challenging given the variability of expressions used to reference given information. Copyright © 2017. Published by Elsevier Inc.

  2. Use of models and mockups in verifying man-machine interfaces

    International Nuclear Information System (INIS)

    Seminara, J.L.

    1985-01-01

    The objective of Human Factors Engineering is to tailor the design of facilities and equipment systems to match the capabilities and limitations of the personnel who will operate and maintain the system. This optimization of the man-machine interface is undertaken to enhance the prospects for safe, reliable, timely, and error-free human performance in meeting system objectives. To ensure the eventual success of a complex man-machine system it is important to systematically and progressively test and verify the adequacy of man-machine interfaces from initial design concepts to system operation. Human factors specialists employ a variety of methods to evaluate the quality of the human-system interface. These methods include: (1) reviews of two-dimensional drawings using appropriately scaled transparent overlays of personnel spanning the anthropometric range, considering clothing and protective gear encumbrances; (2) use of articulated, scaled, plastic templates or manikins that are overlaid on equipment or facility drawings; (3) development of computerized manikins in computer-aided design approaches; (4) use of three-dimensional scale models to better conceptualize work stations, control rooms or maintenance facilities; (5) full or half-scale mockups of system components to evaluate operator/maintainer interfaces; (6) part- or full-task dynamic simulation of operator or maintainer tasks and interactive system responses; and (7) laboratory and field research to establish human performance capabilities with alternative system design concepts or configurations. Of the design verification methods listed above, this paper will only consider the use of models and mockups in the design process.

  3. Back to the basics: identifying positive youth development as the theoretical framework for a youth drug prevention program in rural Saskatchewan, Canada amidst a program evaluation.

    Science.gov (United States)

    Dell, Colleen Anne; Duncan, Charles Randy; DesRoches, Andrea; Bendig, Melissa; Steeves, Megan; Turner, Holly; Quaife, Terra; McCann, Chuck; Enns, Brett

    2013-10-22

    Despite endorsement by the Saskatchewan government to apply empirically-based approaches to youth drug prevention services in the province, programs are sometimes delivered prior to the establishment of evidence-informed goals and objectives. This paper shares the 'preparatory' outcomes of our team's program evaluation of the Prince Albert Parkland Health Region Mental Health and Addiction Services' Outreach Worker Service (OWS) in eight rural community schools three years following its implementation. Before our independent evaluation team could assess whether expectations of the OWS were being met, we had to assist with establishing its overarching program goals and objectives and 'at-risk' student population, alongside its alliance with an empirically-informed theoretical framework. A mixed-methods approach was applied, beginning with in-depth focus groups with the OWS staff to identify the program's goals and objectives and targeted student population. These were supplemented with OWS and school administrator interviews and focus groups with school staff. Alignment with a theoretical focus was determined through a review of the OWS's work to date and explored in focus groups between our evaluation team and the OWS staff and validated with the school staff and OWS and school administration. With improved understanding of the OWS's goals and objectives, our evaluation team and the OWS staff aligned the program with the Positive Youth Development theoretical evidence-base, emphasizing the program's universality, systems focus, strength base, and promotion of assets. Together we also gained clarity about the OWS's definition of and engagement with its 'at-risk' student population. It is important to draw on expert knowledge to develop youth drug prevention programming, but attention must also be paid to aligning professional health care services with a theoretically informed evidence-base for evaluation purposes. If time does not permit for the establishment of

  4. An experiment designed to verify the general theory of relativity

    Energy Technology Data Exchange (ETDEWEB)

    Surdin, Maurice [Commissariat a l' energie atomique et aux energies alternatives - CEA (France)

    1960-07-01

    A project for an experiment that uses the effect of gravitation on Maser-type clocks placed on the ground at two different heights, designed to verify the general theory of relativity. Reprint of a paper published in Comptes rendus des seances de l'Academie des Sciences, t. 250, p. 299-301, sitting of 11 January 1960.

  5. Can hospital audit teams identify case management problems, analyse their causes, identify and implement improvements? A cross-sectional process evaluation of obstetric near-miss case reviews in Benin

    Directory of Open Access Journals (Sweden)

    Borchert Matthias

    2012-10-01

    Full Text Available Abstract Background Obstetric near-miss case reviews are being promoted as a quality assurance intervention suitable for hospitals in low income countries. We introduced such reviews in five district, regional and national hospitals in Benin, West Africa. In a cross-sectional study we analysed the extent to which the hospital audit teams were able to identify case management problems (CMPs), analyse their causes, agree on solutions and put these solutions into practice. Methods We analysed case summaries, women’s interview transcripts and audit minutes produced by the audit teams for 67 meetings concerning one woman with near-miss complications each. We compared the proportion of CMPs identified by an external assessment team to the number found by the audit teams. For the latter, we described the CMP causes identified, solutions proposed and implemented by the audit teams. Results Audit meetings were conducted regularly and were well attended. Audit teams identified half of the 714 CMPs; they were more likely to find managerial ones (71%) than the ones relating to treatment (30%). Most identified CMPs were valid. Almost all causes of CMPs were plausible, but often too superficial to be of great value for directing remedial action. Audit teams suggested solutions, most of them promising ones, for 38% of the CMPs they had identified, but recorded their implementation only for a minority (8.5%). Conclusions The importance of following-up and documenting the implementation of solutions should be stressed in future audit interventions. Tools facilitating the follow-up should be made available. Near-miss case reviews hold promise, but their effectiveness to improve the quality of care sustainably and on a large scale still needs to be established.

  6. Can hospital audit teams identify case management problems, analyse their causes, identify and implement improvements? A cross-sectional process evaluation of obstetric near-miss case reviews in Benin

    Science.gov (United States)

    2012-01-01

    Background Obstetric near-miss case reviews are being promoted as a quality assurance intervention suitable for hospitals in low income countries. We introduced such reviews in five district, regional and national hospitals in Benin, West Africa. In a cross-sectional study we analysed the extent to which the hospital audit teams were able to identify case management problems (CMPs), analyse their causes, agree on solutions and put these solutions into practice. Methods We analysed case summaries, women’s interview transcripts and audit minutes produced by the audit teams for 67 meetings concerning one woman with near-miss complications each. We compared the proportion of CMPs identified by an external assessment team to the number found by the audit teams. For the latter, we described the CMP causes identified, solutions proposed and implemented by the audit teams. Results Audit meetings were conducted regularly and were well attended. Audit teams identified half of the 714 CMPs; they were more likely to find managerial ones (71%) than the ones relating to treatment (30%). Most identified CMPs were valid. Almost all causes of CMPs were plausible, but often too superficial to be of great value for directing remedial action. Audit teams suggested solutions, most of them promising ones, for 38% of the CMPs they had identified, but recorded their implementation only for a minority (8.5%). Conclusions The importance of following-up and documenting the implementation of solutions should be stressed in future audit interventions. Tools facilitating the follow-up should be made available. Near-miss case reviews hold promise, but their effectiveness to improve the quality of care sustainably and on a large scale still needs to be established. PMID:23057707

  7. Constructing the informatics and information technology foundations of a medical device evaluation system: a report from the FDA unique device identifier demonstration.

    Science.gov (United States)

    Drozda, Joseph P; Roach, James; Forsyth, Thomas; Helmering, Paul; Dummitt, Benjamin; Tcheng, James E

    2018-02-01

    The US Food and Drug Administration (FDA) has recognized the need to improve the tracking of medical device safety and performance, with implementation of Unique Device Identifiers (UDIs) in electronic health information as a key strategy. The FDA funded a demonstration by Mercy Health wherein prototype UDIs were incorporated into its electronic information systems. This report describes the demonstration's informatics architecture. Prototype UDIs for coronary stents were created and implemented across a series of information systems, resulting in UDI-associated data flow from manufacture through point of use to long-term follow-up, with barcode scanning linking clinical data with UDI-associated device attributes. A reference database containing device attributes and the UDI Research and Surveillance Database (UDIR) containing the linked clinical and device information were created, enabling longitudinal assessment of device performance. The demonstration included many stakeholders: multiple Mercy departments, manufacturers, health system partners, the FDA, professional societies, the National Cardiovascular Data Registry, and information system vendors. The resulting system of systems is described in detail, including entities, functions, linkage between the UDIR and proprietary systems using UDIs as the index key, data flow, roles and responsibilities of actors, and the UDIR data model. The demonstration provided proof of concept that UDIs can be incorporated into provider and enterprise electronic information systems and used as the index key to combine device and clinical data in a database useful for device evaluation. Keys to success and challenges to achieving this goal were identified. Fundamental informatics principles were central to accomplishing the system of systems model. © The Author 2017. Published by Oxford University Press on behalf of the American Medical Informatics Association. All rights reserved. For Permissions, please email: journals.permissions@oup.com
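
    A minimal sketch of the linkage pattern described, with invented identifiers and fields: the device identifier portion of the UDI acts as the index key joining a device-attribute reference table to point-of-use scan records, producing rows for a research and surveillance database.

```python
# Illustrative UDI-keyed join of device attributes to clinical point-of-use records.
device_reference = {  # DI -> attributes (reference/GUDID-style table); values are made up
    "00812345678906": {"brand": "StentX 3.0x18", "model": "SX-3018", "lot_controlled": True},
}

clinical_scans = [  # point-of-use barcode scans captured in the clinical system (toy data)
    {"patient_id": "P-1042", "di": "00812345678906", "lot": "L2301", "procedure": "PCI"},
]

# join on the DI to build rows for a UDI research & surveillance database
udir_rows = [
    {**scan, **device_reference.get(scan["di"], {})}
    for scan in clinical_scans
]
print(udir_rows[0])
```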

  8. Evaluation of a medication intensity screening tool used in malignant hematology and bone marrow transplant services to identify patients at risk for medication-related problems.

    Science.gov (United States)

    Lucena, Mariana; Bondarenka, Carolyn; Luehrs-Hayes, Genevieve; Perez, Andy

    2018-06-01

    Background In 2014, a screening tool was implemented at Medical University of South Carolina (MUSC) Health to identify patients who are at risk for medication-related events. Patients are classified as high-risk if they meet one of the following criteria: receiving anticoagulation therapy, taking more than 10 scheduled medications upon admission, or readmission within the past 30 days. The goal of this study was to determine risk criteria specific to malignant hematology (MH) and bone marrow transplant (BMT) patients. Methods A retrospective chart review of 114 patients admitted and discharged from the MH/BMT services between 1 September 2015 and 31 October 2015 was performed. A pharmacist-conducted medication history was completed and documented, and all interventions at admission and throughout hospitalization were categorized by severity and by value of service. The primary objective was to evaluate if patients in the MH/BMT services have more medication-related interventions documented upon admission compared with patients who are not screened as high risk. The secondary objectives were to evaluate the different types and severities of interventions made by pharmacists during the entire hospital stay, and to determine if there are certain characteristics that can help identify hematology/oncology high-risk patients. Results More interventions were documented upon admission in the high-risk group as a whole when compared with the not high-risk group (73 vs. 31), but when normalized per patient in each group, there was an equal number of interventions (1.0). The most common interventions were to modify regimen (36%) and discontinue therapy (16%). The patient characteristics associated with high risk included neutropenia, lower average platelet counts on admission, and longer length of stay. Conclusion The screening tool does not further differentiate an already complex MH/BMT patient population. Pharmacists may be more useful at capturing errors or changes during

  9. Evaluation of testing strategies to identify infected animals at a single round of testing within dairy herds known to be infected with Mycobacterium avium ssp. paratuberculosis.

    Science.gov (United States)

    More, S J; Cameron, A R; Strain, S; Cashman, W; Ezanno, P; Kenny, K; Fourichon, C; Graham, D

    2015-08-01

    As part of a broader control strategy within herds known to be infected with Mycobacterium avium ssp. paratuberculosis (MAP), individual animal testing is generally conducted to identify infected animals for action, usually culling. Opportunities are now available to quantitatively compare different testing strategies (combinations of tests) in known infected herds. This study evaluates the effectiveness, cost, and cost-effectiveness of different testing strategies to identify infected animals at a single round of testing within dairy herds known to be MAP infected. A model was developed, taking account of both within-herd infection dynamics and test performance, to simulate the use of different tests at a single round of testing in a known infected herd. Model inputs included the number of animals at different stages of infection, the sensitivity and specificity of each test, and the costs of testing and culling. Testing strategies included either milk or serum ELISA alone or with fecal culture in series. Model outputs included effectiveness (detection fraction, the proportion of truly infected animals in the herd that are successfully detected by the testing strategy), cost, and cost-effectiveness (testing cost per true positive detected, total cost per true positive detected). Several assumptions were made: MAP was introduced with a single animal and no management interventions were implemented to limit within-herd transmission of MAP before this test. In medium herds, between 7 and 26% of infected animals are detected at a single round of testing, the former using the milk ELISA and fecal culture in series 5 yr after MAP introduction and the latter using fecal culture alone 15 yr after MAP introduction. The combined costs of testing and culling at a single round of testing increases with time since introduction of MAP infection, with culling costs being much greater than testing costs. The cost-effectiveness of testing varied by testing strategy. It was also
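
    The series-testing arithmetic behind such comparisons can be sketched directly. Assuming conditional independence of the two tests, the snippet below combines an ELISA with faecal culture in series and computes a simple detection fraction and cost per true positive; the herd size, prevalence, test characteristics and costs are illustrative, not the model's inputs.

```python
# Series-testing sensitivity/specificity, detection fraction and cost per true
# positive at a single round of testing (illustrative figures only).

def series(se1, sp1, se2, sp2):
    """Se/Sp when animals positive to test 1 must also be positive to test 2
    (assumes the two tests are conditionally independent)."""
    return se1 * se2, 1.0 - (1.0 - sp1) * (1.0 - sp2)

def cost_per_true_positive(n, prev, se, sp, test_cost_per_animal, cull_cost):
    """Expected testing + culling cost divided by expected true positives; also
    returns the detection fraction (here simply the combined sensitivity)."""
    infected = n * prev
    true_pos = infected * se
    false_pos = (n - infected) * (1.0 - sp)
    total_cost = n * test_cost_per_animal + (true_pos + false_pos) * cull_cost
    return true_pos / infected, total_cost / true_pos

se_s, sp_s = series(se1=0.30, sp1=0.99, se2=0.60, sp2=0.98)   # illustrative ELISA + culture
df, cptp = cost_per_true_positive(n=120, prev=0.10, se=se_s, sp=sp_s,
                                  test_cost_per_animal=15.0, cull_cost=1500.0)
print(f"series Se={se_s:.2f}, Sp={sp_s:.4f}; detection fraction={df:.2f}; "
      f"cost per true positive ~ {cptp:,.0f}")
```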

  10. Using the serious mental illness health improvement profile [HIP] to identify physical problems in a cohort of community patients: a pragmatic case series evaluation.

    Science.gov (United States)

    Shuel, Francis; White, Jacquie; Jones, Martin; Gray, Richard

    2010-02-01

    The physical health of people with serious mental illness is a cause of growing concern to clinicians. Life expectancy in this population may be reduced by up to 25 years and patients often live with considerable physical morbidity that can dramatically reduce quality of life and contribute to social exclusion. This study sought to determine whether the serious mental illness health improvement profile [HIP], facilitated by mental health nurses [MHNs], has the clinical potential to identify physical morbidity and inform future evidence-based care. Retrospective documentation audit and qualitative evaluation of patients' and clinicians' views about the use of the HIP in practice. A nurse-led outpatient medication management clinic, for community adult patients with serious mental illness in Scotland. 31 community patients with serious mental illness seen in the clinic by 2 MHNs trained to use the HIP. All 31 patients, 9 MHNs, 4 consultant psychiatrists and 12 general practitioners [GPs] (primary care physicians) participated in the qualitative evaluation. A retrospective documentation audit of case notes for all patients where the HIP had been implemented. Semi-structured interviews with patients and their secondary care clinicians. Postal survey of GPs. 189 physical health issues were identified (mean 6.1 per patient). Items most frequently flagged 'red', suggesting that intervention was required, were body mass index [BMI] (n=24), breast self-examination (n=23), waist circumference (n=21), pulse (n=14) and diet (n=13). Some rates of physical health problems observed were broadly similar to those reported in studies of patients receiving antipsychotics in primary care but much lower than those reported in epidemiological studies. Individualised care was planned and delivered with each patient based on the profile. 28 discrete interventions that included providing advice, promoting health behavioural change, performing an electrocardiogram and making a referral to

  11. Identifying undiagnosed HIV in men who have sex with men (MSM) by offering HIV home sampling via online gay social media: a service evaluation.

    Science.gov (United States)

    Elliot, E; Rossi, M; McCormack, S; McOwan, A

    2016-09-01

    An estimated one in eight men who have sex with men (MSM) in London lives with HIV, of which 16% are undiagnosed. It is a public health priority to minimise time spent undiagnosed and reduce morbidity, mortality and onward HIV transmission. 'Dean Street at Home' provided an online HIV risk self-assessment and postal home HIV sampling service aimed at hard-to-reach, high-risk MSM. This 2-year service evaluation aims to determine the HIV risk behaviour of users, the uptake of offer of home sampling and the acceptability of the service. Users were invited to assess their HIV risk anonymously through messages or promotional banners on several gay social networking websites. Regardless of risk, they were offered a free postal HIV oral fluid or blood self-sampling kit. Reactive results were confirmed in clinic. A user survey was sent to first year respondents. 17 361 respondents completed the risk self-assessment. Of these, half had an 'identifiable risk' for HIV and a third was previously untested. 5696 test kits were returned. 121 individuals had a reactive sample; 82 (1.4% of returned samples) confirmed as new HIV diagnoses linked to care; 14 (0.25%) already knew their diagnosis; and 14 (0.25%) were false reactives. The median age at diagnosis was 38; median CD4 505 cells/µL and 20% were recent infections. 61/82 (78%) were confirmed on treatment at the time of writing. The post-test email survey revealed a high service acceptability rate. The service was the first of its kind in the UK. This evaluation provides evidence to inform the potential roll-out of further online strategies to enhance community HIV testing. Published by the BMJ Publishing Group Limited. For permission to use (where not already granted under a licence) please go to http://www.bmj.com/company/products-services/rights-and-licensing/

  12. Tuberculosis Prevention in the Private Sector: Using Claims-Based Methods to Identify and Evaluate Latent Tuberculosis Infection Treatment With Isoniazid Among the Commercially Insured.

    Science.gov (United States)

    Stockbridge, Erica L; Miller, Thaddeus L; Carlson, Erin K; Ho, Christine

    Targeted identification and treatment of people with latent tuberculosis infection (LTBI) are key components of the US tuberculosis elimination strategy. Because of recent policy changes, some LTBI treatment may shift from public health departments to the private sector. To (1) develop methodology to estimate initiation and completion of treatment with isoniazid for LTBI using claims data, and (2) estimate treatment completion rates for isoniazid regimens from commercial insurance claims. Medical and pharmacy claims data representing insurance-paid services rendered and prescriptions filled between January 2011 and March 2015 were analyzed. Four million commercially insured individuals 0 to 64 years of age. Six-month and 9-month treatment completion rates for isoniazid LTBI regimens. There was an annual isoniazid LTBI treatment initiation rate of 12.5/100 000 insured persons. Of 1074 unique courses of treatment with isoniazid for which treatment completion could be assessed, almost half (46.3%; confidence interval, 43.3-49.3) completed 6 or more months of therapy. Of those, approximately half (48.9%; confidence interval, 44.5-53.3) completed 9 months or more. Claims data can be used to identify and evaluate LTBI treatment with isoniazid occurring in the commercial sector. Completion rates were in the range of those found in public health settings. These findings suggest that the commercial sector may be a valuable adjunct to more traditional venues for tuberculosis prevention. In addition, these newly developed claims-based methods offer a means to gain important insights and open new avenues to monitor, evaluate, and coordinate tuberculosis prevention.
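
    A sketch of the claims-based measure under simple assumptions: sum the days' supply on isoniazid pharmacy claims per person and classify 6-month (>=180 days) and 9-month (>=270 days) completion. The field names, toy records and the absence of a gap rule are illustrative; the study's actual algorithm is more involved.

```python
# Toy claims-based completion measure for isoniazid LTBI treatment.
from collections import defaultdict
from datetime import date

fills = [  # (person_id, fill_date, days_supply) -- toy pharmacy claims
    ("A", date(2013, 1, 5), 30), ("A", date(2013, 2, 4), 30),
    ("A", date(2013, 3, 6), 90), ("A", date(2013, 6, 10), 90),
    ("B", date(2013, 4, 1), 30), ("B", date(2013, 5, 15), 30),
]

supply = defaultdict(int)
for person, _fill_date, days in fills:
    supply[person] += days

for person, days in sorted(supply.items()):
    print(f"{person}: {days} days supplied, "
          f"6-month completion={days >= 180}, 9-month completion={days >= 270}")
```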

  13. Applying the Water Vapor Radiometer to Verify the Precipitable Water Vapor Measured by GPS

    Directory of Open Access Journals (Sweden)

    Ta-Kang Yeh

    2014-01-01

    Full Text Available Taiwan is located at the land-sea interface in a subtropical region. Because the climate is warm and moist year round, there is a large and highly variable amount of water vapor in the atmosphere. In this study, we calculated the Zenith Wet Delay (ZWD) of the troposphere using the ground-based Global Positioning System (GPS). The ZWD measured by two Water Vapor Radiometers (WVRs) was then used to verify the ZWD that had been calculated using GPS. We also analyzed the correlation between the ZWD and the precipitation data of these two types of station. Moreover, we used the observational data from 14 GPS and rainfall stations to evaluate three cases. The offset between the GPS-ZWD and the WVR-ZWD ranged from 1.31 to 2.57 cm. The correlation coefficient ranged from 0.89 to 0.93. The results calculated from GPS and those measured using the WVR were very similar. Moreover, when there was no rain, light rain, moderate rain, or heavy rain, the flatland station ZWD was 0.31, 0.36, 0.38, or 0.40 m, respectively. The mountain station ZWD exhibited the same trend. Therefore, these results have demonstrated that the potential and strength of precipitation in a region can be estimated according to its ZWD values. Now that the precision of GPS-ZWD has been confirmed, this method can eventually be expanded to the more than 400 GPS stations in Taiwan and its surrounding islands. The near real-time ZWD data with improved spatial and temporal resolution can be provided to the city and countryside weather-forecasting system that is currently under development. Such an exchange would fundamentally improve the resources used to generate weather forecasts.
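
    The two summary statistics reported, the offset and the correlation coefficient between the GPS-derived and WVR-measured ZWD series, are straightforward to compute; the sketch below does so on synthetic series with an assumed ~2 cm bias.

```python
# Mean offset (bias) and correlation coefficient between two ZWD series (synthetic data).
import numpy as np

rng = np.random.default_rng(3)
zwd_gps = rng.uniform(0.25, 0.45, 500)                  # GPS-derived ZWD (m)
zwd_wvr = zwd_gps + 0.02 + rng.normal(0, 0.01, 500)     # WVR series with ~2 cm offset + noise

offset_cm = np.mean(zwd_wvr - zwd_gps) * 100.0
r = np.corrcoef(zwd_gps, zwd_wvr)[0, 1]
print(f"mean offset {offset_cm:.2f} cm, correlation coefficient {r:.2f}")
```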

  14. Thoughts on identifiers

    CERN Multimedia

    CERN. Geneva

    2005-01-01

    As business processes and information transactions have become an inextricably intertwined with the Web, the importance of assignment, registration, discovery, and maintenance of identifiers has increased. In spite of this, integrated frameworks for managing identifiers have been slow to emerge. Instead, identification systems arise (quite naturally) from immediate business needs without consideration for how they fit into larger information architectures. In addition, many legacy identifier systems further complicate the landscape, making it difficult for content managers to select and deploy identifier systems that meet both the business case and long term information management objectives. This presentation will outline a model for evaluating identifier applications and the functional requirements of the systems necessary to support them. The model is based on a layered analysis of the characteristics of identifier systems, including: * Functional characteristics * Technology * Policy * Business * Social T...

  15. Informing Antibiotic Treatment Decisions: Evaluating Rapid Molecular Diagnostics To Identify Susceptibility and Resistance to Carbapenems against Acinetobacter spp. in PRIMERS III.

    Science.gov (United States)

    Evans, Scott R; Hujer, Andrea M; Jiang, Hongyu; Hill, Carol B; Hujer, Kristine M; Mediavilla, Jose R; Manca, Claudia; Tran, Thuy Tien T; Domitrovic, T Nicholas; Higgins, Paul G; Seifert, Harald; Kreiswirth, Barry N; Patel, Robin; Jacobs, Michael R; Chen, Liang; Sampath, Rangarajan; Hall, Thomas; Marzan, Christine; Fowler, Vance G; Chambers, Henry F; Bonomo, Robert A

    2017-01-01

    The widespread dissemination of carbapenem-resistant Acinetobacter spp. has created significant therapeutic challenges. At present, rapid molecular diagnostics (RMDs) that can identify this phenotype are not commercially available. Two RMD platforms, PCR combined with electrospray ionization mass spectrometry (PCR/ESI-MS) and molecular beacons (MB), for detecting genes conferring resistance/susceptibility to carbapenems in Acinetobacter spp. were evaluated. An archived collection of 200 clinical Acinetobacter sp. isolates was tested. Predictive values for susceptibility (SPV) and resistance (RPV) were estimated as a function of susceptibility prevalence and were based on the absence or presence of beta-lactamase (bla) NDM, VIM, IMP, KPC, and OXA carbapenemase genes (e.g., blaOXA-23, blaOXA-24/40, and blaOXA-58 found in this study) against the reference standard of MIC determinations. According to the interpretation of MICs, 49% (n = 98) of the isolates were carbapenem resistant (as defined by either resistance or intermediate resistance to imipenem). The susceptibility sensitivities (95% confidence interval [CI]) for imipenem were 82% (74%, 89%) and 92% (85%, 97%) for PCR/ESI-MS and MB, respectively. Resistance sensitivities (95% CI) for imipenem were 95% (88%, 98%) and 88% (80%, 94%) for PCR/ESI-MS and MB, respectively. PRIMERS III establishes that RMDs can discriminate between carbapenem resistance and susceptibility in Acinetobacter spp. In the context of a known prevalence of resistance, SPVs and RPVs can inform clinicians regarding the best choice for empiric antimicrobial therapy against this multidrug-resistant pathogen. Copyright © 2016 American Society for Microbiology.
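
    The dependence of predictive values on the local prevalence of susceptibility follows from Bayes' rule. The sketch below applies it with the molecular-beacon sensitivities quoted above (92% for susceptibility, 88% for resistance); treating these as fixed across prevalence and assuming a binary susceptible/resistant call are simplifying assumptions, so the numbers are illustrative rather than the study's SPV/RPV estimates.

```python
# Prevalence-dependent susceptibility/resistance predictive values via Bayes' rule.

def predictive_values(sens_susc, sens_res, prevalence_susc):
    """Return (SPV, RPV) for a given susceptibility prevalence.
    sens_susc = P(call susceptible | truly susceptible),
    sens_res  = P(call resistant | truly resistant); calls are assumed binary."""
    p_s = prevalence_susc
    p_r = 1.0 - p_s
    spv = (sens_susc * p_s) / (sens_susc * p_s + (1.0 - sens_res) * p_r)
    rpv = (sens_res * p_r) / (sens_res * p_r + (1.0 - sens_susc) * p_s)
    return spv, rpv

for prev in (0.3, 0.5, 0.7):
    spv, rpv = predictive_values(sens_susc=0.92, sens_res=0.88, prevalence_susc=prev)
    print(f"susceptibility prevalence {prev:.0%}: SPV={spv:.2f}, RPV={rpv:.2f}")
```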

  16. Comparative evaluation of extracellular β-D-fructofuranosidase in submerged and solid-state fermentation produced by newly identified Bacillus subtilis strain.

    Science.gov (United States)

    Lincoln, Lynette; More, Sunil S

    2018-04-17

    The aim was to screen and identify a potential extracellular β-D-fructofuranosidase (invertase) producing bacterium from soil, and to comparatively evaluate the enzyme biosynthesis under submerged and solid-state fermentation. Extracellular invertase-producing bacteria were screened from soil. Identification of the most potent bacterium was performed based on microscopic examinations and 16S rDNA molecular sequencing. Bacillus subtilis LYN12 invertase secretion was abundant with wheat bran humidified with molasses medium (70%), with elevated activity at 48 h and 37 °C under solid-state fermentation, whereas under submerged conditions increased activity was observed at 24 h and 45 °C in the molasses medium. The study revealed a simple fermentative medium for elevated production of extracellular invertase from a fast-growing Bacillus strain. Bacterial invertases are scarce and limited reports are available. To date, this is the first report on the comparative analysis of optimization of extracellular invertase synthesis from a Bacillus subtilis strain by submerged and solid-state fermentation. The use of agricultural residues increased yields, resulting in the development of a cost-effective and stable approach. Bacillus subtilis LYN12 invertase possesses excellent fermenting capability to utilize agro-industrial residues under submerged and solid-state conditions. This could be a beneficial candidate in food and beverage processing industries. This article is protected by copyright. All rights reserved.

  17. Evaluation of two recombinant Leishmania proteins identified by an immunoproteomic approach as tools for the serodiagnosis of canine visceral and human tegumentary leishmaniasis.

    Science.gov (United States)

    Coelho, Eduardo Antonio Ferraz; Costa, Lourena Emanuele; Lage, Daniela Pagliara; Martins, Vívian Tamietti; Garde, Esther; de Jesus Pereira, Nathália Cristina; Lopes, Eliane Gonçalves Paiva; Borges, Luiz Felipe Nunes Menezes; Duarte, Mariana Costa; Menezes-Souza, Daniel; de Magalhães-Soares, Danielle Ferreira; Chávez-Fumagalli, Miguel Angel; Soto, Manuel; Tavares, Carlos Alberto Pereira

    2016-01-15

    Serological diagnostic tests for canine and human leishmaniasis present problems related to their sensitivity and/or specificity. Recently, an immunoproteomic approach performed with Leishmania infantum proteins identified new parasite antigens. In the present study, the diagnostic properties of two of these proteins, cytochrome c oxidase and IgE-dependent histamine-releasing factor, were evaluated for the serodiagnosis of canine visceral (CVL) and human tegumentary (HTL) leishmaniasis. For the CVL diagnosis, serum samples from non-infected dogs living in an endemic or non-endemic area of leishmaniasis, sera from asymptomatic or symptomatic visceral leishmaniasis (VL) dogs, from Leish-Tec®-vaccinated dogs, and sera from animals experimentally infected by Trypanosoma cruzi or Ehrlichia canis were used. For the HTL diagnosis, sera from non-infected subjects living in an endemic area of leishmaniasis, sera from active cutaneous or mucosal leishmaniasis patients, as well as those from T. cruzi-infected patients were employed. ELISA assays using the recombinant proteins showed both sensitivity and specificity values of 100% for the serodiagnosis of both forms of disease, with high positive and negative predictive values, showing better diagnostic properties than the parasite recombinant A2 protein or a soluble Leishmania antigen extract. In this context, the two new recombinant proteins could be considered for use in the serodiagnosis of CVL and HTL. Copyright © 2015 Elsevier B.V. All rights reserved.

  18. Evaluation of 5.8S rRNA to identify Penaeus semisulcatus and its subspecies, Penaeus semisulcatus persicus (Penaeidae), and some Decapoda species

    Directory of Open Access Journals (Sweden)

    Zahra Noroozi

    2015-10-01

    Full Text Available The green tiger prawn, Penaeus semisulcatus, is one of the most important members of the family Penaeidae in the Persian Gulf. Based on morphological characteristics, two groups, P. semisulcatus and its subspecies P. s. persicus, are recognized. This study was conducted to investigate the genetic distance between P. semisulcatus and P. s. persicus by analyzing a partial sequence of the 5.8S rRNA. Another objective of this study was to evaluate the ability of 5.8S rRNA to identify species of Decapoda. The results indicated that the 5.8S rRNA genes of P. semisulcatus and P. s. persicus were exactly identical, and no sequence variation was observed. The results also indicated that 5.8S rRNA sequences between species of the same genus are conserved in the analysed species of Decapoda, and no genetic distance was observed at the species level. The low evolutionary rate and efficient conservation of the 5.8S rRNA can be attributed to its role in the translation process.

  19. The development of an ergonomics training program to identify, evaluate, and control musculoskeletal disorders among nursing assistants at a state-run veterans' home.

    Science.gov (United States)

    Peterson, Erica L; McGlothlin, James D; Blue, Carolyn L

    2004-01-01

    Nursing assistants (NAs) who work in nursing and personal care facilities are twice and five times more likely to suffer a musculoskeletal disorder than workers in service industries and other health care facilities, respectively. The purpose of this study was to develop an ergonomics training program for selected NAs at a state-run veterans' home to decrease musculoskeletal disorders by 1) developing questionnaires to assess musculoskeletal stress, 2) evaluating the work environment, 3) developing and using a training package, and 4) determining the application of the information from the training package by NAs on the floor. Results show two new risk factors not previously identified for nursing personnel in the peer-reviewed literature. Quizzes given to the nursing personnel before and after training indicated a significant improvement in understanding the principles of ergonomics and patient-handling techniques. Statistical analysis comparing the pre-training and post-training questionnaires indicated no significant decrease in musculoskeletal risk factors, no significant reduction in pain or discomfort, and no significant change in overall mental or physical health.
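
    The pre/post quiz comparison described above is a standard paired design. A minimal sketch of such an analysis is shown below; the scores are invented, and scipy's paired t-test (with a Wilcoxon signed-rank test as a nonparametric check) stands in for whatever test the study actually applied.

```python
import numpy as np
from scipy import stats

# Hypothetical quiz scores (percent correct) for the same nursing
# assistants before and after the ergonomics training.
pre  = np.array([55, 60, 48, 70, 62, 58, 65, 50, 72, 61])
post = np.array([78, 74, 66, 85, 80, 71, 88, 69, 90, 77])

t, p = stats.ttest_rel(post, pre)       # paired t-test on the same subjects
w, p_w = stats.wilcoxon(post, pre)      # nonparametric alternative
print(f"paired t = {t:.2f}, p = {p:.4f}; Wilcoxon p = {p_w:.4f}")
```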

  20. Establishment of the Pediatric Obesity Weight Evaluation Registry: A National Research Collaborative for Identifying the Optimal Assessment and Treatment of Pediatric Obesity.

    Science.gov (United States)

    Kirk, Shelley; Armstrong, Sarah; King, Eileen; Trapp, Christine; Grow, Mollie; Tucker, Jared; Joseph, Madeline; Liu, Lenna; Weedn, Ashley; Sweeney, Brooke; Fox, Claudia; Fathima, Samreen; Williams, Ronald; Kim, Roy; Stratbucker, William

    2017-02-01

    Prospective patient registries have been successfully utilized in several disease states with a goal of improving treatment approaches through multi-institutional collaboration. The prevalence of youth with severe obesity is at a historic high in the United States, yet evidence to guide effective weight management is limited. The Pediatric Obesity Weight Evaluation Registry (POWER) was established in 2013 to identify and promote effective intervention strategies for pediatric obesity. Sites in POWER provide multicomponent pediatric weight management (PWM) care for youth with obesity and collect a defined set of demographic and clinical parameters, which they regularly submit to the POWER Data Coordinating Center. A program profile survey was completed by sites to describe characteristics of the respective PWM programs. From January 2014 through December 2015, 26 US sites were enrolled in POWER and had submitted data on 3643 youth with obesity. Ninety-five percent were 6-18 years of age, 54% female, 32% nonwhite, 32% Hispanic, and 59% publicly insured. Over two-thirds had severe obesity. All sites included a medical provider and used weight status in their referral criteria. Other program characteristics varied widely between sites. POWER is an established national registry representing a diverse sample of youth with obesity participating in multicomponent PWM programs across the United States. Using high-quality data collection and a collaborative research infrastructure, POWER aims to contribute to the development of evidence-based guidelines for multicomponent PWM programs.

  1. Digging Deeper: Development and evaluation of an untargeted metabolomics approach to identify biogeochemical hotspots with depth and by vegetation type in Arctic tundra soils

    Science.gov (United States)

    Ladd, M.; Wullschleger, S.; Hettich, R.

    2017-12-01

    Elucidating the chemical composition of low molecular weight (LMW) dissolved organic matter (DOM), and monitoring how this bioavailable pool varies over space and time, is critical to understanding the controlling mechanisms that underlie carbon release and storage in Arctic systems. Due to analytical challenges however, relatively little is known about how this complex mixture of small molecules varies with soil depth or how it may be influenced by vegetation. In this study, we evaluated an untargeted metabolomics approach for the characterization of LMW DOM in water extracts, and applied this approach in soil cores (10-cm diam., 30-cm depth), obtained near Barrow, Alaska (71° 16' N) from the organic-rich active layer where the aboveground vegetation was primarily either Carex aquatilis or Eriophorum angustifolium, two species commonly found in tundra systems. We hypothesized that by using a discovery-based approach, spatial patterns of chemical diversity could be identified, enabling the detection of biogeochemical hotspots across scales. LMW DOM profiles from triplicate water extracts were characterized using dual-separation, nano-liquid chromatography (LC) coupled to an electrospray Orbitrap mass spectrometer in positive and negative ion modes. Both LC separations—reversed-phase and hydrophilic interaction chromatography—were achieved with gradient elutions in 15 minutes. Using a precursor and fragment mass measurement accuracy of nutrients) impact carbon fluxes in the Arctic at the landscape-scale.

  2. The Method of a Standalone Functional Verifying Operability of Sonar Control Systems

    Directory of Open Access Journals (Sweden)

    A. A. Sotnikov

    2014-01-01

    Full Text Available This article describes a method for the standalone verification of a sonar control system, based on functional checking of the control system's operability. The main features of the realized method are the development of a valid mathematical model for simulating sonar signals at the hydroacoustic antenna, a valid representation of the sonar control system modes as a discrete Markov model, and functional verification of the object in real-time mode. Ways are proposed to control computational complexity when the computing resources of the simulation equipment are insufficient, namely reducing model functionality or reducing model adequacy. Experiments were carried out using testing equipment developed by a department of the Research Institute of Information Control System at Bauman Moscow State Technical University to verify the technical validity of industrial sonar complexes. During the verification process, the on-board software was artificially modified to create malfunctions in the sonar control system in order to estimate the performance of the verifying system. The efficiency of the method was demonstrated both theoretically and experimentally in comparison with the basic methodology for verifying technical systems. The method could also be used for debugging the on-board software of sonar complexes and for developing new, promising algorithms for sonar signal processing.
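
    To illustrate the idea of representing the control system's operating modes as a discrete Markov model, the sketch below simulates mode sequences from an invented three-mode transition matrix; the mode names and probabilities are assumptions, not the paper's model.

```python
import numpy as np

rng = np.random.default_rng(0)

modes = ["SEARCH", "TRACK", "SELF_TEST"]      # illustrative mode set
P = np.array([[0.90, 0.08, 0.02],             # invented transition matrix;
              [0.15, 0.80, 0.05],             # each row sums to 1
              [0.50, 0.00, 0.50]])

def simulate(steps, start=0):
    """Generate a mode sequence from the discrete Markov model."""
    state, seq = start, []
    for _ in range(steps):
        seq.append(modes[state])
        state = rng.choice(len(modes), p=P[state])
    return seq

# A verification harness could replay such sequences against the real
# control system and flag any transition the on-board software mishandles.
print(simulate(10))
```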

  3. Dynamic Symmetric Key Mobile Commerce Scheme Based on Self-Verified Mechanism

    Directory of Open Access Journals (Sweden)

    Jiachen Yang

    2014-01-01

    Full Text Available In terms of the security and efficiency of mobile e-commerce, the authors summarized the advantages and disadvantages of several related schemes, especially the self-verified mobile payment scheme based on the elliptic curve cryptosystem (ECC), and then proposed a new type of dynamic symmetric key mobile commerce scheme based on a self-verified mechanism. The authors analyzed the basic algorithm based on self-verified mechanisms and detailed the complete transaction process of the proposed scheme. The payment scheme was then analyzed with respect to security and efficiency. The analysis shows that the proposed scheme not only meets the premise of highly efficient mobile electronic payment but also takes security into account. The user confirmation mechanism at the end of the proposed scheme further strengthens its security. In brief, the proposed scheme is more efficient and practical than most of the existing schemes.
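
    The paper's scheme is not reproduced here; as a rough illustration of the underlying idea (deriving a fresh, per-transaction symmetric session key from an ECC key agreement), the sketch below uses ECDH plus HKDF from the pyca/cryptography package. The curve choice, nonce handling, and info label are illustrative assumptions.

```python
import os
from cryptography.hazmat.primitives import hashes
from cryptography.hazmat.primitives.asymmetric import ec
from cryptography.hazmat.primitives.kdf.hkdf import HKDF

# Each party holds an EC key pair (long-term or ephemeral).
client_priv = ec.generate_private_key(ec.SECP256R1())
server_priv = ec.generate_private_key(ec.SECP256R1())

# ECDH agreement gives both sides the same shared secret.
shared_c = client_priv.exchange(ec.ECDH(), server_priv.public_key())
shared_s = server_priv.exchange(ec.ECDH(), client_priv.public_key())
assert shared_c == shared_s

# A per-transaction nonce keeps the derived symmetric key dynamic.
nonce = os.urandom(16)
session_key = HKDF(algorithm=hashes.SHA256(), length=32,
                   salt=nonce, info=b"m-commerce session").derive(shared_c)
print(session_key.hex())
```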

  4. Evolution of optically nondestructive and data-non-intrusive credit card verifiers

    Science.gov (United States)

    Sumriddetchkajorn, Sarun; Intaravanne, Yuttana

    2010-04-01

    Since the deployment of the credit card, the number of credit card fraud cases has grown rapidly, with losses amounting to millions of US dollars. Instead of asking for more information from the credit card's holder or taking on risk through payment approval, a nondestructive and data-non-intrusive credit card verifier is highly desirable before a transaction begins. In this paper, we review optical techniques that have been proposed and invented in order to make the genuine credit card distinguishable from the counterfeit credit card. Several optical approaches for the implementation of credit card verifiers are also included. In particular, we highlight our invention of a hyperspectral-imaging-based portable credit card verifier structure that offers a very low false error rate of 0.79%. Other key features include low cost, simplicity in design and implementation, no moving parts, no need for an additional decoding key, and adaptive learning.

  5. What are the ultimate limits to computational techniques: verifier theory and unverifiability

    International Nuclear Information System (INIS)

    Yampolskiy, Roman V

    2017-01-01

    Despite significant developments in proof theory, surprisingly little attention has been devoted to the concept of proof verifiers. In particular, the mathematical community may be interested in studying different types of proof verifiers (people, programs, oracles, communities, superintelligences) as mathematical objects. Such an effort could reveal their properties, their powers and limitations (particularly in human mathematicians), minimum and maximum complexity, as well as self-verification and self-reference issues. We propose an initial classification system for verifiers and provide some rudimentary analysis of solved and open problems in this important domain. Our main contribution is a formal introduction of the notion of unverifiability, for which the paper could serve as a general citation in domains of theorem proving, as well as software and AI verification. (invited comment)

  6. What are the ultimate limits to computational techniques: verifier theory and unverifiability

    Science.gov (United States)

    Yampolskiy, Roman V.

    2017-09-01

    Despite significant developments in proof theory, surprisingly little attention has been devoted to the concept of proof verifiers. In particular, the mathematical community may be interested in studying different types of proof verifiers (people, programs, oracles, communities, superintelligences) as mathematical objects. Such an effort could reveal their properties, their powers and limitations (particularly in human mathematicians), minimum and maximum complexity, as well as self-verification and self-reference issues. We propose an initial classification system for verifiers and provide some rudimentary analysis of solved and open problems in this important domain. Our main contribution is a formal introduction of the notion of unverifiability, for which the paper could serve as a general citation in domains of theorem proving, as well as software and AI verification.

  7. Scoring of the radiological picture of idiopathic interstitial pneumonia: a study to verify the reliability of the method

    International Nuclear Information System (INIS)

    Kocova, Eva; Vanasek, Jiri; Koblizek, Vladimir; Novosad, Jakub; Elias, Pavel; Bartos, Vladimir; Sterclova, Martina

    2015-01-01

    Idiopathic pulmonary fibrosis (IPF) is a clinical form of usual interstitial pneumonia (UIP). Chest computed tomography (CT) has a fundamental role in the multidisciplinary diagnostics. However, it has not been verified if and how the subjective opinion of radiologists or pneumologists can influence the assessment and overall diagnostic summary. The aim was to verify the reliability of the scoring system. Assessment of conformity of the radiological score of high-resolution CT (HRCT) of the lungs in patients with IPF was performed by a group of radiologists and pneumologists. Personal data were blinded and the assessment was performed independently using the Dutka/Vasakova scoring system (a modification of the Gay system). The final scores of the individual assessors were then evaluated by means of paired Spearman's correlations and principal component analysis. Two principal components, cumulatively explaining 62% or 73% of the variability of the individual assessors' scores, were extracted during the analysis. The groups did not differ in terms of either specialty or experience with the assessment of HRCT findings. According to our study, scoring of a radiological image using the Dutka/Vasakova system is a reliable method in the hands of experienced radiologists. Significant differences occurred in the assessments performed by pneumologists, especially in the evaluation of alveolar changes
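
    A minimal sketch of the kind of agreement analysis described (pairwise Spearman correlations between assessors followed by principal-component extraction) is given below; the score matrix is invented, and scipy/scikit-learn stand in for whatever software the authors used.

```python
import numpy as np
from scipy.stats import spearmanr
from sklearn.decomposition import PCA

# Hypothetical HRCT scores: rows = patients, columns = assessors.
scores = np.array([[12, 14, 11, 13],
                   [20, 22, 19, 25],
                   [ 8,  9,  7, 10],
                   [15, 18, 14, 20],
                   [25, 27, 24, 30]])

rho, p = spearmanr(scores)                 # pairwise Spearman correlation matrix
print(np.round(rho, 2))

pca = PCA().fit(scores)                    # principal components of assessor scores
print(np.round(pca.explained_variance_ratio_.cumsum(), 2))
```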

  8. Consumer participation in quality improvements for chronic disease care: development and evaluation of an interactive patient-centered survey to identify preferred service initiatives.

    Science.gov (United States)

    Fradgley, Elizabeth A; Paul, Christine L; Bryant, Jamie; Roos, Ian A; Henskens, Frans A; Paul, David J

    2014-12-19

    With increasing attention given to the quality of chronic disease care, a measurement approach that empowers consumers to participate in improving quality of care and enables health services to systematically introduce patient-centered initiatives is needed. A Web-based survey with complex adaptive questioning and interactive survey items would allow consumers to easily identify and prioritize detailed service initiatives. The aim was to develop and test a Web-based survey capable of identifying and prioritizing patient-centered initiatives in chronic disease outpatient services. Testing included (1) test-retest reliability, (2) patient-perceived acceptability of the survey content and delivery mode, and (3) average completion time, completion rates, and Flesch-Kincaid reading score. In Phase I, the Web-based Consumer Preferences Survey was developed based on a structured literature review and iterative feedback from expert groups of service providers and consumers. The touchscreen survey contained 23 general initiatives, 110 specific initiatives available through adaptive questioning, and a relative prioritization exercise. In Phase II, a pilot study was conducted within 4 outpatient clinics to evaluate the reliability properties, patient-perceived acceptability, and feasibility of the survey. Eligible participants were approached to complete the survey while waiting for an appointment or receiving intravenous therapy. The age and gender of nonconsenters was estimated to ascertain consent bias. Participants with a subsequent appointment within 14 days were asked to complete the survey for a second time. A total of 741 of 1042 individuals consented to participate (71.11% consent), 529 of 741 completed all survey content (78.9% completion), and 39 of 68 completed the test-retest component. Substantial or moderate reliability (Cohen's kappa>0.4) was reported for 16 of 20 general initiatives with observed percentage agreement ranging from 82.1%-100.0%. The majority of
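
    A minimal sketch of the test-retest statistics reported (Cohen's kappa and observed percentage agreement) is shown below; the yes/no responses are invented, and scikit-learn's cohen_kappa_score is an assumed implementation choice, not one documented by the study.

```python
import numpy as np
from sklearn.metrics import cohen_kappa_score

# Hypothetical endorsements (1 = selected) of one general initiative at the
# first and second survey completion by the test-retest participants.
first  = np.array([1, 1, 0, 1, 0, 1, 1, 0, 1, 1, 0, 1])
second = np.array([1, 1, 0, 1, 1, 1, 1, 0, 1, 0, 0, 1])

kappa = cohen_kappa_score(first, second)
percent_agreement = 100 * np.mean(first == second)
print(f"kappa = {kappa:.2f}, agreement = {percent_agreement:.1f}%")
```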

  9. Consumer Participation in Quality Improvements for Chronic Disease Care: Development and Evaluation of an Interactive Patient-Centered Survey to Identify Preferred Service Initiatives

    Science.gov (United States)

    Paul, Christine L; Bryant, Jamie; Roos, Ian A; Henskens, Frans A; Paul, David J

    2014-01-01

    Background With increasing attention given to the quality of chronic disease care, a measurement approach that empowers consumers to participate in improving quality of care and enables health services to systematically introduce patient-centered initiatives is needed. A Web-based survey with complex adaptive questioning and interactive survey items would allow consumers to easily identify and prioritize detailed service initiatives. Objective The aim was to develop and test a Web-based survey capable of identifying and prioritizing patient-centered initiatives in chronic disease outpatient services. Testing included (1) test-retest reliability, (2) patient-perceived acceptability of the survey content and delivery mode, and (3) average completion time, completion rates, and Flesch-Kincaid reading score. Methods In Phase I, the Web-based Consumer Preferences Survey was developed based on a structured literature review and iterative feedback from expert groups of service providers and consumers. The touchscreen survey contained 23 general initiatives, 110 specific initiatives available through adaptive questioning, and a relative prioritization exercise. In Phase II, a pilot study was conducted within 4 outpatient clinics to evaluate the reliability properties, patient-perceived acceptability, and feasibility of the survey. Eligible participants were approached to complete the survey while waiting for an appointment or receiving intravenous therapy. The age and gender of nonconsenters was estimated to ascertain consent bias. Participants with a subsequent appointment within 14 days were asked to complete the survey for a second time. Results A total of 741 of 1042 individuals consented to participate (71.11% consent), 529 of 741 completed all survey content (78.9% completion), and 39 of 68 completed the test-retest component. Substantial or moderate reliability (Cohen’s kappa>0.4) was reported for 16 of 20 general initiatives with observed percentage agreement

  10. Identifying potential areas for biofuel production and evaluating the environmental effects: a case study of the James River Basin in the Midwestern United States

    Science.gov (United States)

    Wu, Yiping; Liu, Shu-Guang; Li, Zhengpeng

    2012-01-01

    Biofuels are now an important resource in the United States because of the Energy Independence and Security Act of 2007. Both increased corn growth for ethanol production and perennial dedicated energy crop growth for cellulosic feedstocks are potential sources to meet the rising demand for biofuels. However, these measures may cause adverse environmental consequences that are not yet fully understood. This study 1) evaluates the long-term impacts of increased frequency of corn in the crop rotation system on water quantity and quality as well as soil fertility in the James River Basin and 2) identifies potential grasslands for cultivating bioenergy crops (e.g. switchgrass), estimating the water quality impacts. We selected the soil and water assessment tool, a physically based multidisciplinary model, as the modeling approach to simulate a series of biofuel production scenarios involving crop rotation and land cover changes. The model simulations with different crop rotation scenarios indicate that decreases in water yield and soil nitrate nitrogen (NO3-N) concentration along with an increase in NO3-N load to stream water could justify serious concerns regarding increased corn rotations in this basin. Simulations with land cover change scenarios helped us spatially classify the grasslands in terms of biomass productivity and nitrogen loads, and we further derived the relationship of biomass production targets and the resulting nitrogen loads against switchgrass planting acreages. The suggested economically efficient (planting acreage) and environmentally friendly (water quality) planting locations and acreages can be a valuable guide for cultivating switchgrass in this basin. This information, along with the projected environmental costs (i.e. reduced water yield and increased nitrogen load), can contribute to decision support tools for land managers to seek the sustainability of biofuel development in this region.

  11. The Systematic Evaluation of Identifying the Infarct Related Artery Utilizing Cardiac Magnetic Resonance in Patients Presenting with ST-Elevation Myocardial Infarction.

    Directory of Open Access Journals (Sweden)

    Carine E Hamo

    Full Text Available Identification of the infarct-related artery (IRA) in patients with STEMI using coronary angiography (CA) is often based on the ECG and can be challenging in patients with severe multi-vessel disease. The current study aimed to determine how often percutaneous intervention (PCI) is performed in a coronary artery different from the artery supplying the territory of acute infarction on cardiac magnetic resonance imaging (CMR). We evaluated 113 patients from the Reduction of infarct Expansion and Ventricular remodeling with Erythropoetin After Large myocardial infarction (REVEAL) trial, who underwent CMR within 4±2 days of revascularization. Blinded reviewers interpreted CA to determine the IRA and CMR to determine the location of infarction on a 17-segment model. In patients with multiple infarcts on CMR, acuity was determined with T2-weighted imaging and/or evidence of microvascular obstruction. A total of 5 (4%) patients were found to have a mismatch between the IRA identified on CMR and CA. In 4/5 cases, there were multiple infarcts noted on CMR. Thirteen patients (11.5%) had multiple infarcts in separate territories on CMR, with 4 patients (3.5%) having multiple acute infarcts and 9 patients (8%) having both acute and chronic infarcts. In this select population of patients, the identification of the IRA by CA was incorrect in 4% of patients presenting with STEMI. Four patients with a mismatch had an acute infarction in more than one coronary artery territory on CMR. The role of CMR in patients presenting with STEMI with multi-vessel disease on CA deserves further investigation.

  12. Development of material measures for performance verifying surface topography measuring instruments

    International Nuclear Information System (INIS)

    Leach, Richard; Giusca, Claudiu; Rickens, Kai; Riemer, Oltmann; Rubert, Paul

    2014-01-01

    The development of two irregular-geometry material measures for performance verifying surface topography measuring instruments is described. The material measures are designed to be used to performance verify tactile and optical areal surface topography measuring instruments. The manufacture of the material measures using diamond turning followed by nickel electroforming is described in detail. Measurement results are then obtained using a traceable stylus instrument and a commercial coherence scanning interferometer, and the results are shown to agree to within the measurement uncertainties. The material measures are now commercially available as part of a suite of material measures aimed at the calibration and performance verification of areal surface topography measuring instruments

  13. Identifying Method of Drunk Driving Based on Driving Behavior

    Directory of Open Access Journals (Sweden)

    Xiaohua Zhao

    2011-05-01

    Full Text Available Drunk driving is one of the leading causes of traffic crashes. There are numerous issues that need to be resolved with the current method of identifying drunk driving. Driving behavior, which can be monitored in real time, has been extensively researched as a way to identify impaired driving. In this paper, drives with BACs above 0.05% were defined as the drunk driving state. A detailed comparison was made between normal driving and drunk driving. An experiment in a driving simulator was designed to collect the driving performance data of the two groups. Based on an analysis of the effects of alcohol on driving performance, seven significant indicators were extracted and drunk driving was identified by the Fisher discriminant method. The discriminant function demonstrated a high classification accuracy. The optimal critical score to differentiate the normal from the drinking state was found to be 0. The evaluation result verifies the accuracy of the classification method.
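
    As a rough illustration of Fisher discriminant classification of driving state from performance indicators, the sketch below applies scikit-learn's LinearDiscriminantAnalysis to simulated data; the seven-feature matrix and the shift between groups are assumptions, not the study's simulator measurements.

```python
import numpy as np
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(1)

# Simulated values for seven driving-performance indicators
# (e.g. lane-position SD, speed SD, steering reversal rate, ...).
normal = rng.normal(0.0, 1.0, size=(60, 7))
drunk  = rng.normal(0.8, 1.2, size=(60, 7))    # shifted, noisier distribution
X = np.vstack([normal, drunk])
y = np.array([0] * 60 + [1] * 60)              # 1 = BAC above 0.05%

lda = LinearDiscriminantAnalysis()
print(cross_val_score(lda, X, y, cv=5).mean())  # cross-validated accuracy

# The fitted decision function plays the role of the discriminant score;
# a critical score of 0 separates the two predicted groups.
lda.fit(X, y)
print(lda.decision_function(X[:3]))
```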

  14. Accuracy of self-reported length of coma and posttraumatic amnesia in persons with medically verified traumatic brain injury.

    Science.gov (United States)

    Sherer, Mark; Sander, Angelle M; Maestas, Kacey Little; Pastorek, Nicholas J; Nick, Todd G; Li, Jingyun

    2015-04-01

    To determine the accuracy of self-reported length of coma and posttraumatic amnesia (PTA) in persons with medically verified traumatic brain injury (TBI) and to investigate factors that affect self-report of length of coma and PTA duration. Prospective cohort study. Specialized rehabilitation center with inpatient and outpatient programs. Persons (N=242) with medically verified TBI who were identified from a registry of persons who had previously participated in TBI-related research. Not applicable. Self-reported length of coma and self-reported PTA duration. Review of medical records revealed that the mean medically documented length of coma and PTA duration was 6.9±12 and 19.2±22 days, respectively, and the mean self-reported length of coma and PTA duration was 16.7±22 and 106±194 days, respectively. The average discrepancy between self-report and medical record for length of coma and PTA duration was 8.2±21 and 64±176 days, respectively. Multivariable regression models revealed that time since injury, performance on cognitive tests, and medical record values were associated with self-reported values for both length of coma and PTA duration. In this investigation, persons with medically verified TBI showed poor accuracy in their self-report of length of coma and PTA duration. Discrepancies were large enough to affect injury severity classification. Caution should be exercised when considering self-report of length of coma and PTA duration. Copyright © 2015 American Congress of Rehabilitation Medicine. Published by Elsevier Inc. All rights reserved.

  15. Regressive transgressive cycle of Devonian sea in Uruguay verified by Palynology

    International Nuclear Information System (INIS)

    Da Silva, J.

    1990-01-01

    This work presents the results and conclusions of a study of palynomorph populations carried out in Devonian formations in central Uruguay. The existence of a regressive-transgressive cycle is verified by analyzing the vertical distribution of palynomorphs; the presence of chitinozoans (hoesphaeridium and Cyathochitina types) in the studied section is also noted.

  16. Die verifiëring, verfyning en toepassing van leksikografiese liniale ...

    African Journals Online (AJOL)

    Lexicographic rulers for Afrikaans and the African languages are a decade old and are generally used in the compilation of dictionaries. The compilers have so far not considered it necessary to verify or refine these rulers. Criticism has, however, been expressed of the compilation of the Afrikaans ruler, and it is ...

  17. Verifiable Outsourced Decryption of Attribute-Based Encryption with Constant Ciphertext Length

    Directory of Open Access Journals (Sweden)

    Jiguo Li

    2017-01-01

    Full Text Available An outsourced-decryption ABE system largely reduces the computation cost for users who intend to access encrypted files stored in the cloud. However, the correctness of the transformed ciphertext cannot be guaranteed because the user does not have the original ciphertext. Lai et al. provided an ABE scheme with verifiable outsourced decryption which helps the user to check whether the transformation done by the cloud is correct. In order to improve computation performance and reduce communication overhead, we propose a new verifiable outsourcing scheme with constant ciphertext length. To be specific, our scheme achieves the following goals. (1) Our scheme is verifiable, which ensures that the user can efficiently check whether the transformation is done correctly by the CSP. (2) The size of the ciphertext and the number of expensive pairing operations are constant and do not grow with the complexity of the access structure. (3) The access structure in our scheme is AND gates on multivalued attributes, and we prove that our scheme is verifiable and secure against selectively chosen-plaintext attack in the standard model. (4) We give a performance analysis which indicates that our scheme is suitable for various bandwidth-limited and computation-constrained devices, such as mobile phones.

  18. Methods for verifying compliance with low-level radioactive waste acceptance criteria

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    1993-09-01

    This report summarizes the methods that are currently employed and those that can be used to verify compliance with low-level radioactive waste (LLW) disposal facility waste acceptance criteria (WAC). This report presents the applicable regulations representing the Federal, State, and site-specific criteria for accepting LLW. Typical LLW generators are summarized, along with descriptions of their waste streams and final waste forms. General procedures and methods used by the LLW generators to verify compliance with the disposal facility WAC are presented. The report was written to provide an understanding of how a regulator could verify compliance with a LLW disposal facility's WAC. A comprehensive study of the methodology used to verify waste generator compliance with the disposal facility WAC is presented in this report. The study involved compiling the relevant regulations to define the WAC, reviewing regulatory agency inspection programs, and summarizing waste verification technology and equipment. The results of the study indicate that waste generators conduct verification programs that include packaging, classification, characterization, and stabilization elements. The current LLW disposal facilities perform waste verification steps on incoming shipments. A model inspection and verification program, which includes an emphasis on the generator's waste application documentation of their waste verification program, is recommended. The disposal facility verification procedures primarily involve the use of portable radiological survey instrumentation. The actual verification of generator compliance to the LLW disposal facility WAC is performed through a combination of incoming shipment checks and generator site audits.

  19. Descriptional complexity of non-unary self-verifying symmetric difference automata

    CSIR Research Space (South Africa)

    Marais, Laurette

    2017-09-01

    Full Text Available Previously, self-verifying symmetric difference automata were defined and a tight bound of 2^(n-1) - 1 was shown for state complexity in the unary case. We now consider the non-unary case and show that, for every n at least 2, there is a regular...

  20. 13 CFR 127.403 - What happens if SBA verifies the concern's eligibility?

    Science.gov (United States)

    2010-01-01

    ... 13 Business Credit and Assistance 1 2010-01-01 2010-01-01 false What happens if SBA verifies the concern's eligibility? 127.403 Section 127.403 Business Credit and Assistance SMALL BUSINESS ADMINISTRATION WOMEN-OWNED SMALL BUSINESS FEDERAL CONTRACT ASSISTANCE PROCEDURES Eligibility Examinations § 127...

  1. 13 CFR 127.404 - What happens if SBA is unable to verify a concern's eligibility?

    Science.gov (United States)

    2010-01-01

    ... 13 Business Credit and Assistance 1 2010-01-01 2010-01-01 false What happens if SBA is unable to verify a concern's eligibility? 127.404 Section 127.404 Business Credit and Assistance SMALL BUSINESS ADMINISTRATION WOMEN-OWNED SMALL BUSINESS FEDERAL CONTRACT ASSISTANCE PROCEDURES Eligibility Examinations § 127...

  2. 40 CFR 8.9 - Measures to assess and verify environmental impacts.

    Science.gov (United States)

    2010-07-01

    ... environmental impacts. 8.9 Section 8.9 Protection of Environment ENVIRONMENTAL PROTECTION AGENCY GENERAL ENVIRONMENTAL IMPACT ASSESSMENT OF NONGOVERNMENTAL ACTIVITIES IN ANTARCTICA § 8.9 Measures to assess and verify environmental impacts. (a) The operator shall conduct appropriate monitoring of key environmental indicators as...

  3. Eddy-Current Testing of Welded Stainless Steel Storage Containers to Verify Integrity and Identity

    International Nuclear Information System (INIS)

    Tolk, Keith M.; Stoker, Gerald C.

    1999-01-01

    An eddy-current scanning system is being developed to allow the International Atomic Energy Agency (IAEA) to verify the integrity of nuclear material storage containers. Such a system is necessary to detect attempts to remove material from the containers in facilities where continuous surveillance of the containers is not practical. Initial tests have shown that the eddy-current system is also capable of verifying the identity of each container using the electromagnetic signature of its welds. The DOE-3013 containers proposed for use in some US facilities are made of an austenitic stainless steel alloy, which is nonmagnetic in its normal condition. When the material is cold worked by forming or by local stresses experienced in welding, it loses its austenitic grain structure and its magnetic permeability increases. This change in magnetic permeability can be measured using an eddy-current probe specifically designed for this purpose. Initial tests have shown that variations of magnetic permeability and material conductivity in and around welds can be detected, and form a pattern unique to the container. The changes in conductivity that are present around a mechanically inserted plug can also be detected. Further development of the system is currently underway to adapt the system to verifying the integrity and identity of sealable, tamper-indicating enclosures designed to prevent unauthorized access to measurement equipment used to verify international agreements
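
    A minimal sketch of signature-based identity verification is given below: a newly acquired eddy-current trace is compared with a stored reference signature using peak normalized cross-correlation. The synthetic signals and the correlation-based matching are illustrative assumptions, not the algorithm of the system described above.

```python
import numpy as np

def signature_match(reference, scan):
    """Peak normalized cross-correlation between a stored weld signature
    and a new scan (about 1.0 = same shape, near 0 = unrelated)."""
    r = (reference - reference.mean()) / reference.std()
    s = (scan - scan.mean()) / scan.std()
    return (np.correlate(r, s, mode="full") / len(r)).max()

rng = np.random.default_rng(2)
reference = rng.normal(size=500)                 # stands in for a weld scan profile
same_can  = reference + rng.normal(0, 0.3, 500)  # re-scan of the same container
other_can = rng.normal(size=500)                 # scan of a different container

print(signature_match(reference, same_can))      # close to 1
print(signature_match(reference, other_can))     # much lower
```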

  4. Methods for verifying compliance with low-level radioactive waste acceptance criteria

    International Nuclear Information System (INIS)

    1993-09-01

    This report summarizes the methods that are currently employed and those that can be used to verify compliance with low-level radioactive waste (LLW) disposal facility waste acceptance criteria (WAC). This report presents the applicable regulations representing the Federal, State, and site-specific criteria for accepting LLW. Typical LLW generators are summarized, along with descriptions of their waste streams and final waste forms. General procedures and methods used by the LLW generators to verify compliance with the disposal facility WAC are presented. The report was written to provide an understanding of how a regulator could verify compliance with a LLW disposal facility's WAC. A comprehensive study of the methodology used to verify waste generator compliance with the disposal facility WAC is presented in this report. The study involved compiling the relevant regulations to define the WAC, reviewing regulatory agency inspection programs, and summarizing waste verification technology and equipment. The results of the study indicate that waste generators conduct verification programs that include packaging, classification, characterization, and stabilization elements. The current LLW disposal facilities perform waste verification steps on incoming shipments. A model inspection and verification program, which includes an emphasis on the generator's waste application documentation of their waste verification program, is recommended. The disposal facility verification procedures primarily involve the use of portable radiological survey instrumentation. The actual verification of generator compliance to the LLW disposal facility WAC is performed through a combination of incoming shipment checks and generator site audits

  5. Verifying mapping, monitoring and modeling of fine sediment pollution sources in West Maui, Hawai'i, USA

    Science.gov (United States)

    Cerovski-Darriau, C.; Stock, J. D.

    2017-12-01

    Coral reef ecosystems, and the fishing and tourism industries they support, depend on clean waters. Fine sediment pollution from nearshore watersheds threatens these enterprises in West Maui, Hawai'i. To effectively mitigate sediment pollution, we first have to know where the sediment is coming from, and how fast it erodes. In West Maui, we know that nearshore sediment plumes originate from erosion of fine sand- to silt-sized air fall deposits where they are exposed by grazing, agriculture, or other disturbances. We identified and located these sediment sources by mapping watershed geomorphological processes using field traverses, historic air photos, and modern orthophotos. We estimated bank lowering rates using erosion pins, and other surface erosion rates were extrapolated from data collected elsewhere on the Hawaiian Islands. These measurements and mapping led to a reconnaissance sediment budget which showed that annual loads are dominated by bank erosion of legacy terraces. Field observations during small storms confirm that nearshore sediment plumes are sourced from bank erosion of in-stream, legacy agricultural deposits. To further verify this sediment budget, we used geochemical fingerprinting to uniquely identify each potential source (e.g. stream banks, agricultural fields, roads, other human modified soils, and hillslopes) from the Wahikuli watershed (10 km²) and analyzed the fine fraction using ICP-MS for elemental geochemistry. We propose to apply the fingerprinting results to nearshore suspended-sediment samples taken during storms to identify the proportion of sediment coming from each source. By combining traditional geomorphic mapping, monitoring and geochemistry, we hope to provide a powerful tool to verify the primary source of sediment reaching the nearshore.
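
    Sediment-fingerprinting studies commonly un-mix a sample into source proportions by least squares; the sketch below uses non-negative least squares from scipy as a stand-in, with invented tracer concentrations. It is a generic illustration, not the study's mixing model.

```python
import numpy as np
from scipy.optimize import nnls

# Columns = candidate sources (banks, fields, roads, hillslopes);
# rows = tracer concentrations (invented values for five elements, mg/kg).
sources = np.array([[120.,  80., 200.,  60.],
                    [ 15.,  40.,  10.,  25.],
                    [300., 150., 400., 100.],
                    [  5.,  20.,   8.,  12.],
                    [ 45.,  90.,  30.,  70.]])

# Tracer concentrations measured in a storm suspended-sediment sample.
mixture = np.array([130., 22., 310., 8., 55.])

weights, resid = nnls(sources, mixture)
proportions = weights / weights.sum()     # normalize to source fractions
print(np.round(proportions, 2), round(resid, 2))
```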

  6. An approach for verifying biogenic greenhouse gas emissions inventories with atmospheric CO2 concentration data

    International Nuclear Information System (INIS)

    Ogle, Stephen M; Davis, Kenneth; Lauvaux, Thomas; Miles, Natasha L; Richardson, Scott; Schuh, Andrew; Cooley, Dan; Breidt, F Jay; West, Tristram O; Heath, Linda S; Smith, James E; McCarty, Jessica L; Gurney, Kevin R; Tans, Pieter; Denning, A Scott

    2015-01-01

    Verifying national greenhouse gas (GHG) emissions inventories is a critical step to ensure that reported emissions data to the United Nations Framework Convention on Climate Change (UNFCCC) are accurate and representative of a country’s contribution to GHG concentrations in the atmosphere. Furthermore, verifying biogenic fluxes provides a check on estimated emissions associated with managing lands for carbon sequestration and other activities, which often have large uncertainties. We report here on the challenges and results associated with a case study using atmospheric measurements of CO2 concentrations and inverse modeling to verify nationally-reported biogenic CO2 emissions. The biogenic CO2 emissions inventory was compiled for the Mid-Continent region of United States based on methods and data used by the US government for reporting to the UNFCCC, along with additional sources and sinks to produce a full carbon balance. The biogenic emissions inventory produced an estimated flux of −408 ± 136 Tg CO2 for the entire study region, which was not statistically different from the biogenic flux of −478 ± 146 Tg CO2 that was estimated using the atmospheric CO2 concentration data. At sub-regional scales, the spatial density of atmospheric observations did not appear sufficient to verify emissions in general. However, a difference between the inventory and inversion results was found in one isolated area of West-central Wisconsin. This part of the region is dominated by forestlands, suggesting that further investigation may be warranted into the forest C stock or harvested wood product data from this portion of the study area. The results suggest that observations of atmospheric CO2 concentration data and inverse modeling could be used to verify biogenic emissions, and provide more confidence in biogenic GHG emissions reporting to the UNFCCC. (letter)
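
    The comparison reported above amounts to checking whether the two flux estimates differ by more than their combined uncertainty. A minimal sketch, assuming the quoted ranges are symmetric ~95% intervals that can be combined in quadrature, is shown below.

```python
import math

def fluxes_differ(est1, ci1, est2, ci2):
    """True if two flux estimates differ beyond their combined 95% intervals.

    est: central estimate (Tg CO2); ci: half-width of its 95% interval.
    """
    diff = abs(est1 - est2)
    combined = math.sqrt(ci1**2 + ci2**2)   # quadrature combination
    return diff > combined, diff, combined

# Inventory vs atmospheric-inversion estimates from the abstract.
print(fluxes_differ(-408, 136, -478, 146))   # -> (False, 70, ~199.5): not distinguishable
```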

  7. Evaluation of Changes in Ghanaian Students' Attitudes Towards Science Following Neuroscience Outreach Activities: A Means to Identify Effective Ways to Inspire Interest in Science Careers.

    Science.gov (United States)

    Yawson, Nat Ato; Amankwaa, Aaron Opoku; Tali, Bernice; Shang, Velma Owusua; Batu, Emmanuella Nsenbah; Asiemoah, Kwame; Fuseini, Ahmed Denkeri; Tene, Louis Nana; Angaandi, Leticia; Blewusi, Isaac; Borbi, Makafui; Aduku, Linda Nana Esi; Badu, Pheonah; Abbey, Henrietta; Karikari, Thomas K

    2016-01-01

    The scientific capacity in many African countries is low. Ghana, for example, is estimated to have approximately twenty-three researchers per million inhabitants. In order to improve interest in science among future professionals, appropriate techniques should be developed and employed to identify barriers and correlates of science education among pre-university students. Young students' attitudes towards science may affect their future career choices. However, these attitudes may change with new experiences. It is, therefore, important to evaluate potential changes in students' attitudes towards science after their exposure to experiences such as science outreach activities. Through this, more effective means of inspiring and mentoring young students to choose science subjects can be developed. This approach would be particularly beneficial in countries such as Ghana, where: (i) documented impacts of outreach activities are lacking; and (ii) effective means to develop scientist-school educational partnerships are needed. We have established an outreach scheme, aimed at helping to improve interaction between scientists and pre-university students (and their teachers). Outreach activities are designed and implemented by undergraduate students and graduate teaching assistants, with support from faculty members and technical staff. Through this, we aim to build a team of trainee scientists and graduates who will become ambassadors of science in their future professional endeavors. Here, we describe an approach for assessing changes in junior high school students' attitudes towards science following classroom neuroscience outreach activities. We show that while students tended to agree more with questions concerning their perceptions about science learning after the delivery of outreach activities, significant improvements were obtained for only two questions, namely "I enjoy science lessons" and "I want to be a scientist in the future." Furthermore, there was a

  8. A framework for verifying the dismantlement and abandonment of nuclear weapons. A policy implication for the denuclearization of Korea Peninsula

    International Nuclear Information System (INIS)

    Ichimasa, Sukeyuki

    2011-01-01

    Denuclearization of the Korean Peninsula has been a serious security issue in the Northeast Asian region. Although the Six-Party Talks have been suspended since North Korea declared a boycott in 2008, the aim of denuclearizing North Korea is still being discussed. For instance, the recent Japan-U.S. '2+2' dialogue affirmed the importance of achieving complete and verifiable denuclearization of North Korea, including scrutinizing its uranium enrichment program, through irreversible steps under the Six-Party process. In order to identify an effective and efficient framework for the denuclearization of North Korea, this paper examines five major denuclearization methods, including (1) the Nunn-Lugar Method, (2) the Iraqi Method, (3) the South African Method, (4) the Libyan Method and (5) the denuclearization method set out in the Nuclear Weapons Convention (NWC), while referring to recent developments in verification studies for nuclear disarmament, such as the joint research conducted by the United Kingdom and Norway, and other arguments made by disarmament experts. Moreover, this paper discusses what political and security conditions will be required for North Korea to accept intrusive verification of its denuclearization. Conditions for successful denuclearization talks among the Six-Party member states and a realistic approach to verifiable denuclearization are also examined. (author)

  9. Identifying and evaluating functional connectivity for building urban ecological networks

    Institute of Scientific and Technical Information of China (English)

    陈春娣; Meurk D. Colin; Ignatieva E. Maria; Stewart H. Glenn; 吴胜军

    2015-01-01

    Urban ecological networks are a focal topic in applied landscape ecology, and identifying and evaluating the connections between habitats is the key step in building an ecological network. Building on a review of existing connectivity-identification methods, this study combined least-cost modeling with graph-theoretic analysis to identify functional connections and to explore priorities for their restoration. Taking Christchurch, New Zealand as a case study, a resistance surface was built from a landscape development intensity index, and the maximum seed dispersal distance of the New Zealand kahikatea (Dacrycarpus dacrydioides) was used as the connection threshold to simulate and evaluate network connectivity. The results show that, at a 1200 m distance threshold, there are 408 connections, whose importance falls into 10 classes. Among them, the Richmond-Petrie Park, Hansons-Auburn Reserve, and Centaurus Park-King George Reserve links are the key connections of the whole ecological network; if they were removed, overall landscape connectivity would decline by 31.73%. In addition, the study found no significant correlation between a connection's importance value and the combined area of the source patches at its two ends; that is, connections between large source patches do not necessarily play a key role in network construction, a conclusion that still requires further confirmation. For urban environments lacking data on animal movement, some parameters of the least-cost model and the network connectivity analysis were adapted; the approach is practical and easy to apply, and offers a useful reference for ecological restoration and habitat selection in urban regions of China. With rapid urbanization and industrialization, habitat fragmentation and loss are inevitable. Under these circumstances, landscape connectivity and ecological networks have become a focus of applied landscape ecology. A well-connected ecological network is believed to facilitate energy and resource fluxes, species dispersal, genetic exchange and multiple other ecological processes, and to contribute to the maintenance of ecosystem stability and integrity. Identifying and evaluating functional connectivity between habitat patches is the key step in designing and building well-connected ecological networks. Based on a review of literature on linkage identification approaches, our study combined least-cost path modeling with graph-theory based network analysis to simulate, identify
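
    As a rough illustration of ranking links by their contribution to network connectivity, the sketch below removes each edge of a small networkx graph and reports the drop in a simple reachable-pairs metric; the topology is invented (the node names merely echo the abstract) and the metric is a simplification of the connectivity indices used in such studies.

```python
from itertools import combinations
import networkx as nx

# Hypothetical habitat network: nodes = patches, edges = functional links
# within the dispersal threshold (unweighted for simplicity).
G = nx.Graph()
G.add_edges_from([("Richmond", "Petrie"), ("Petrie", "Hansons"),
                  ("Hansons", "Auburn"), ("Auburn", "Centaurus"),
                  ("Centaurus", "KingGeorge"), ("Richmond", "Auburn")])

def connectivity(g):
    """Share of patch pairs that remain mutually reachable."""
    pairs = g.number_of_nodes() * (g.number_of_nodes() - 1) / 2
    reachable = sum(1 for u, v in combinations(g.nodes, 2) if nx.has_path(g, u, v))
    return reachable / pairs

base = connectivity(G)
for u, v in G.edges:
    H = G.copy()
    H.remove_edge(u, v)
    print(f"{u}-{v}: {100 * (base - connectivity(H)) / base:.1f}% connectivity loss")
```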

  10. A synthesis of evaluation monitoring projects by the forest health monitoring program (1998-2007)

    Science.gov (United States)

    William A. Bechtold; Michael J. Bohne; Barbara L. Conkling; Dana L. Friedman

    2012-01-01

    The national Forest Health Monitoring Program of the Forest Service, U.S. Department of Agriculture, has funded over 200 Evaluation Monitoring projects. Evaluation Monitoring is designed to verify and define the extent of deterioration in forest ecosystems where potential problems have been identified. This report is a synthesis of results from over 150 Evaluation...

  11. Evaluation of the CropSyst Model during Wheat-Maize Rotations on the North China Plain for Identifying Soil Evaporation Losses

    Directory of Open Access Journals (Sweden)

    Muhammad Umair

    2017-09-01

    Full Text Available The North China Plain (NCP) is a major grain production zone that plays a critical role in ensuring China's food supply. Irrigation is commonly used during grain production; however, the high annual water deficit [precipitation (P) minus evapotranspiration (ET)] in typical irrigated cropland does not support double cropping systems (such as maize and wheat) and this has resulted in the steep decline in the water table (~0.8 m year−1 at the Luancheng station) that has taken place since the 1970s. The current study aimed to adapt and check the ability of the CropSyst model (Suite-4) to simulate actual evapotranspiration (ETa), biomass, and grain yield, and to identify major evaporation (E) losses from winter wheat (WW) and summer maize (SM) rotations. Field experiments were conducted at the Luancheng Agro-ecosystem station, NCP, in 2010–2011 to 2012–2013. The CropSyst model was calibrated on wheat/maize (from weekly leaf area/biomass data available for 2012–2013) and validated onto measured ETa, biomass, and grain yield at the experimental station from 2010–2011 to 2011–2012, by using model calibration parameters. The revalidation was performed with the ETa, biomass, grain yield, and simulated ETa partition for 2008–2009 WW [ETa partition was measured by the Micro-lysimeter (MLM) and isotopes approach available for this year]. For the WW crop, E was 30% of total ETa; but from 2010–11 to 2013, the annual average E was ~40% of ETa for the WW and SM rotation. Furthermore, the WW and SM rotation from 2010–2011 to 2012–2013 was divided into three growth periods; (i) pre-sowing irrigation (PSI; sowing at field capacity) to emergence period (EP), (ii) EP to canopy cover period (CC), and (iii) CC to harvesting period (HP), and E from each growth period was ~10, 60, and 30%, respectively. In general, error statistics such as RMSE, Willmott's d, and NRMSE in the model evaluation for wheat ETa (maize ETa) were 38.3 mm, 0.81, and 9.24% (31.74 mm, 0.73, and 11
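
    A minimal sketch of the evaluation statistics quoted (RMSE, NRMSE as a percentage of the observed mean, and Willmott's index of agreement d), computed under their standard definitions, is given below; the observed and simulated values are invented.

```python
import numpy as np

def evaluate(obs, sim):
    """RMSE, NRMSE (% of observed mean) and Willmott's index of agreement d."""
    obs, sim = np.asarray(obs, float), np.asarray(sim, float)
    rmse = np.sqrt(np.mean((sim - obs) ** 2))
    nrmse = 100 * rmse / obs.mean()
    d = 1 - np.sum((sim - obs) ** 2) / np.sum(
        (np.abs(sim - obs.mean()) + np.abs(obs - obs.mean())) ** 2)
    return rmse, nrmse, d

# Invented seasonal ETa values (mm), for illustration only.
observed  = [410, 435, 398, 420]
simulated = [392, 450, 410, 405]
print(evaluate(observed, simulated))
```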

  12. Efficient Verifiable Range and Closest Point Queries in Zero-Knowledge

    Directory of Open Access Journals (Sweden)

    Ghosh Esha

    2016-10-01

    Full Text Available We present an efficient method for answering one-dimensional range and closest-point queries in a verifiable and privacy-preserving manner. We consider a model where a data owner outsources a dataset of key-value pairs to a server, who answers range and closest-point queries issued by a client and provides proofs of the answers. The client verifies the correctness of the answers while learning nothing about the dataset besides the answers to the current and previous queries. Our work yields for the first time a zero-knowledge privacy assurance to authenticated range and closest-point queries. Previous work leaked the size of the dataset and used an inefficient proof protocol. Our construction is based on hierarchical identity-based encryption. We prove its security and analyze its efficiency both theoretically and with experiments on synthetic and real data (Enron email and Boston taxi datasets.

  13. A Secure and Verifiable Outsourced Access Control Scheme in Fog-Cloud Computing.

    Science.gov (United States)

    Fan, Kai; Wang, Junxiong; Wang, Xin; Li, Hui; Yang, Yintang

    2017-07-24

    With the rapid development of big data and Internet of things (IOT), the number of networking devices and data volume are increasing dramatically. Fog computing, which extends cloud computing to the edge of the network can effectively solve the bottleneck problems of data transmission and data storage. However, security and privacy challenges are also arising in the fog-cloud computing environment. Ciphertext-policy attribute-based encryption (CP-ABE) can be adopted to realize data access control in fog-cloud computing systems. In this paper, we propose a verifiable outsourced multi-authority access control scheme, named VO-MAACS. In our construction, most encryption and decryption computations are outsourced to fog devices and the computation results can be verified by using our verification method. Meanwhile, to address the revocation issue, we design an efficient user and attribute revocation method for it. Finally, analysis and simulation results show that our scheme is both secure and highly efficient.

  14. A Secure and Verifiable Outsourced Access Control Scheme in Fog-Cloud Computing

    Science.gov (United States)

    Fan, Kai; Wang, Junxiong; Wang, Xin; Li, Hui; Yang, Yintang

    2017-01-01

    With the rapid development of big data and Internet of things (IOT), the number of networking devices and data volume are increasing dramatically. Fog computing, which extends cloud computing to the edge of the network can effectively solve the bottleneck problems of data transmission and data storage. However, security and privacy challenges are also arising in the fog-cloud computing environment. Ciphertext-policy attribute-based encryption (CP-ABE) can be adopted to realize data access control in fog-cloud computing systems. In this paper, we propose a verifiable outsourced multi-authority access control scheme, named VO-MAACS. In our construction, most encryption and decryption computations are outsourced to fog devices and the computation results can be verified by using our verification method. Meanwhile, to address the revocation issue, we design an efficient user and attribute revocation method for it. Finally, analysis and simulation results show that our scheme is both secure and highly efficient. PMID:28737733

  15. Election Verifiability: Cryptographic Definitions and an Analysis of Helios and JCJ

    Science.gov (United States)

    2015-08-06

    Computer Society, 2014. To appear. [26] David Chaum. Untraceable electronic mail, return addresses, and digital pseudonyms. Communications of the ACM, 24(2):84–88, 1981. [27] David Chaum. Secret-ballot receipts: True voter-verifiable elections. IEEE Security and Privacy, 2(1):38–47, 2004. [28] David Chaum, Richard Carback, Jeremy Clark, Aleksander Essex, Stefan Popoveniuc, Ronald L. Rivest, Peter Y. A. Ryan, Emily Shen, and Alan T. Sherman

  16. Using Concept Space to Verify Hyponymy in Building a Hyponymy Lexicon

    Science.gov (United States)

    Liu, Lei; Zhang, Sen; Diao, Lu Hong; Yan, Shu Ying; Cao, Cun Gen

    Verification of hyponymy relations is a basic problem in knowledge acquisition. We present a method of hyponymy verification based on concept space. First, we define the concept space for a group of candidate hyponymy relations. Second, we analyze the concept space and define a set of hyponymy features based on the space structure. We then use these features to verify the candidate hyponymy relations. Experimental results show that the method provides adequate verification of hyponymy.

  17. A method to verify period signals based on a data acquisition card

    International Nuclear Information System (INIS)

    Zeng Shaoli

    2005-01-01

    This paper introduces a method to verify the index voltage of a period signal generator using a data acquisition card, with an error of less than 0.5%. A corresponding Win32 program, which uses a purpose-built VxD driver for direct I/O control of the data acquisition card and multithreading to obtain the best time-scale precision, was developed on the Windows platform. The program collects the index voltage data in real time and automatically measures the period. (authors)
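    The abstract describes measuring a signal's period from sampled voltage data; the sketch below illustrates one common way to do that (timing rising threshold crossings) and is not the paper's Win32/VxD implementation. The test signal and names are illustrative.

```python
# Sketch: estimate the period of a sampled periodic voltage by timing rising
# crossings of a threshold. This only illustrates the measurement idea; the
# paper's actual implementation uses a VxD driver and multithreading on Win32.
import math

def measure_period(samples, sample_rate_hz, threshold=0.0):
    """Return the mean period (s) between successive rising threshold crossings."""
    crossings = []
    for i in range(1, len(samples)):
        if samples[i - 1] < threshold <= samples[i]:   # rising edge
            crossings.append(i / sample_rate_hz)
    if len(crossings) < 2:
        raise ValueError("not enough crossings to estimate a period")
    intervals = [b - a for a, b in zip(crossings, crossings[1:])]
    return sum(intervals) / len(intervals)

# Synthetic test signal: 5 Hz sine sampled at 1 kHz -> expected period 0.2 s
rate = 1000.0
signal = [math.sin(2 * math.pi * 5 * n / rate) for n in range(2000)]
print(round(measure_period(signal, rate), 3))   # ~0.2
```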

  18. A Benchmark for Comparing Different Approaches for Specifying and Verifying Real-Time Systems

    Science.gov (United States)

    1993-01-01

    To be considered correct or useful, real-time systems must deliver results within specified time intervals, either without exception or with high probability. Recently, a large number of formal methods have been invented for specifying and verifying real-time systems. It has been suggested that these formal methods need to be tested out on actual real-time systems. Such testing will allow the scalability of the methods to be assessed and also

  19. Identifying and overcoming barriers to technology implementation

    International Nuclear Information System (INIS)

    Bailey, M.; Warren, S.; McCune, M.

    1996-01-01

    In a recent General Accounting Office report, the Department of Energy's (DOE) Office of Environmental Management was found to be ineffective in integrating their environmental technology development efforts with the cleanup actions. As a result of these findings, a study of remediation documents was performed by the Technology Applications Team within DOE's Office of Environmental Restoration (EM-40) to validate this finding and to understand why it was occurring. A second initiative built on the foundation of the remediation document study and evaluated solutions to the ineffective implementation of improved technologies. The Technology Applications Team examined over 50 remediation documents (17 projects) which included nearly 600 proposed remediation technologies. It was determined that very few technologies are reaching the Records of Decision documents. In fact, most are eliminated in the early stages of consideration. These observations stem from regulators' and stakeholders' uncertainties in cost and performance of the technology and the inability of the technology to meet site specific conditions. The Technology Applications Team also set out to identify and evaluate solutions to barriers to implementing innovative technology into the DOE's environmental management activities. Through the combined efforts of DOE and the Hazardous Waste Action Coalition (HWAC), a full day workshop was conducted at the annual HWAC meeting in June 1995 to solve barriers to innovative technology implementation. Three barriers were identified as widespread throughout the DOE complex and industry. Identified barriers included a lack of verified or certified cost and performance data for innovative technologies; risk of failure to reach cleanup goals using innovative technologies; and communication barriers that are present at virtually every stage of the characterization/remediation process from development through implementation

  20. Evaluating Advertising Effectiveness of Parsian Bank and Identifying the Factors with Most Influence on Its Improvement in the City of Tehran

    OpenAIRE

    Ali Rabiee; Mahmood Mohammadian; Bita Baradaran Jamili

    2011-01-01

    Abstract: Nowadays, organizations try to send effective messages through different media to persuade audiences to buy their products and services and to inform them of what sets these offerings apart, while people are surrounded by advertising everywhere and at all times; the most important issue in advertising is therefore evaluating its effectiveness. In this study, an attempt has been made to evaluate Parsian Bank's advertising effectiveness in attracting...

  1. Middle-aged patients with an MRI-verified medial meniscal tear report symptoms commonly associated with knee osteoarthritis

    DEFF Research Database (Denmark)

    Hare, Kristoffer B.; Stefan Lohmander, L.; Kise, Nina Jullum

    2017-01-01

    Background and purpose — No consensus exists on when to perform arthroscopic partial meniscectomy in patients with a degenerative meniscal tear. Since MRI and clinical tests are not accurate in detecting a symptomatic meniscal lesion, the patient’s symptoms often play a large role when deciding when to perform surgery. We determined the prevalence and severity of self-reported knee symptoms in patients eligible for arthroscopic partial meniscectomy due to a degenerative meniscal tear. We investigated whether symptoms commonly considered to be related to meniscus injury were associated with early radiographic signs of knee osteoarthritis. Patients and methods — We included individual baseline items from the Knee injury and Osteoarthritis Outcome Score collected in 2 randomized controlled trials evaluating treatment for an MRI-verified degenerative medial meniscal tear in 199 patients aged…

  2. The McGill Interactive Pediatric OncoGenetic Guidelines: An approach to identifying pediatric oncology patients most likely to benefit from a genetic evaluation.

    Science.gov (United States)

    Goudie, Catherine; Coltin, Hallie; Witkowski, Leora; Mourad, Stephanie; Malkin, David; Foulkes, William D

    2017-08-01

    Identifying cancer predisposition syndromes in children with tumors is crucial, yet few clinical guidelines exist to identify children at high risk of having germline mutations. The McGill Interactive Pediatric OncoGenetic Guidelines project aims to create a validated pediatric guideline in the form of a smartphone/tablet application using algorithms to process clinical data and help determine whether to refer a child for genetic assessment. This paper discusses the initial stages of the project, focusing on its overall structure, the methodology underpinning the algorithms, and the upcoming algorithm validation process. © 2017 Wiley Periodicals, Inc.

  3. A novel two-stage evaluation system based on a Group-G1 approach to identify appropriate emergency treatment technology schemes in sudden water source pollution accidents.

    Science.gov (United States)

    Qu, Jianhua; Meng, Xianlin; Hu, Qi; You, Hong

    2016-02-01

    Sudden water source pollution resulting from hazardous materials has gradually become a major threat to the safety of the urban water supply. Over the past years, various treatment techniques have been proposed to remove such pollutants and minimize the threat of these incidents. Given the diversity of techniques available, the current challenge is how to scientifically select the most desirable alternative for different threat degrees. Therefore, a novel two-stage evaluation system was developed based on a circulation-correction improved Group-G1 method to determine the optimal emergency treatment technology scheme, considering contaminant elimination both in drinking water sources and in water treatment plants. In stage 1, the threat degree caused by the pollution was predicted using a threat evaluation index system and was subdivided into four levels. Then, a technique evaluation index system containing four sets of criteria weights was constructed in stage 2 to obtain the optimum treatment schemes corresponding to the different threat levels. The applicability of the established evaluation system was tested on an actual cadmium contamination accident that occurred in 2012. The results show that this system is capable of facilitating scientific analysis in the evaluation and selection of emergency treatment technologies for drinking water source security.
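    The paper's circulation-correction improved Group-G1 weighting is not reproduced here; the minimal sketch below only illustrates how a set of criteria weights (one set per threat level) ranks candidate treatment schemes by a weighted sum of normalized scores. All criteria names, weights, and scores are hypothetical.

```python
# Minimal sketch of stage 2: rank candidate treatment schemes with a weighted
# sum of normalized criterion scores. The criteria, weights and scores below
# are hypothetical; the paper derives its weights with an improved Group-G1
# method and uses a different weight set for each of the four threat levels.
def rank_schemes(schemes, weights):
    """schemes: {name: {criterion: score in [0, 1]}}; weights: {criterion: w}, sum(w) == 1."""
    totals = {
        name: sum(weights[c] * scores[c] for c in weights)
        for name, scores in schemes.items()
    }
    return sorted(totals.items(), key=lambda kv: kv[1], reverse=True)

weights_level3 = {"removal_efficiency": 0.4, "response_time": 0.3,
                  "cost": 0.2, "operability": 0.1}
candidates = {
    "activated_carbon_dosing": {"removal_efficiency": 0.8, "response_time": 0.9,
                                "cost": 0.6, "operability": 0.8},
    "chemical_precipitation":  {"removal_efficiency": 0.9, "response_time": 0.6,
                                "cost": 0.7, "operability": 0.6},
}
print(rank_schemes(candidates, weights_level3))   # highest-scoring scheme first
```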

  4. A network biology approach evaluating the anticancer effects of bortezomib identifies SPARC as a therapeutic target in adult T-cell leukemia cells

    Directory of Open Access Journals (Sweden)

    Yu Zhang

    2008-10-01

    Full Text Available Junko H Ohyashiki1, Ryoko Hamamura2, Chiaki Kobayashi2, Yu Zhang2, Kazuma Ohyashiki2; 1Intractable Immune System Disease Research Center, Tokyo Medical University, Tokyo, Japan; 2First Department of Internal Medicine, Tokyo Medical University, Tokyo, Japan. Abstract: There is a need to identify the regulatory gene interactions of anticancer drugs in target cancer cells. Whole-genome expression profiling offers promise in this regard, but can be complicated by the challenge of identifying the genes affected by the drug among the hundreds to thousands of genes that change expression. A proteasome inhibitor, bortezomib, could be a potential therapeutic agent for treating adult T-cell leukemia (ATL) patients; however, the underlying mechanism by which bortezomib induces cell death in ATL cells via the gene regulatory network has not been fully elucidated. Here we show that a Bayesian statistical framework implemented in VoyaGene® identified the secreted protein acidic and rich in cysteine (SPARC) gene, a tumor-invasiveness-related gene, as a possible modulator of bortezomib-induced cell death in ATL cells. Functional analysis using RNAi experiments revealed that inhibition of SPARC expression by siRNA enhanced the apoptotic effect of bortezomib on ATL cells, in accordance with an increase in cleaved caspase 3. Targeting SPARC may help to treat ATL patients in combination with bortezomib. This work shows that a network biology approach can be used advantageously to identify genetic interactions related to anticancer effects. Keywords: network biology, adult T cell leukemia, bortezomib, SPARC

  5. Comparative evaluation of low-molecular-mass proteins from Mycobacterium tuberculosis identifies members of the ESAT-6 family as immunodominant T-cell antigens

    DEFF Research Database (Denmark)

    Skjøt, R L; Oettinger, T; Rosenkrands, I

    2000-01-01

    Culture filtrate from Mycobacterium tuberculosis contains protective antigens of relevance for the generation of a new antituberculosis vaccine. We have identified two previously uncharacterized M. tuberculosis proteins (TB7.3 and TB10.4) from the highly active low-mass fraction of culture filtrate.

  6. Science evaluation of the environmental impact statement for the lower Churchill hydroelectric generation project to identify deficiencies with respect to fish and fish habitat

    International Nuclear Information System (INIS)

    Clarke, K.

    2009-01-01

    This report evaluated an environmental impact statement (EIS) submitted by a company proposing to develop a hydroelectric generation project on the lower Churchill River in Labrador. Construction of the facilities will alter the aquatic environment of the river as well as the receiving environment of lakes. The alterations are expected to have an impact on fish and fish habitats. The study evaluated the methods used to describe and predict impacts in the aquatic environment and examined the models used for predictions in order to assess uncertainty levels. Results of the evaluation demonstrated that additional efforts are needed to document local knowledge of fish use and fish habitat, and that the magnitude of expected changes to fish habitat must be considered relative to the loss of fish habitat. The study also highlighted areas within the EIS that will require further clarification. A number of the studies used in the EIS had small sample sizes, which increased the uncertainty of predictions made using the data. Further information on uncertainties related to potential changes in flushing rates and morphological features was also needed. The impact of direct fish mortality from turbine operations was not addressed in a population context, and further information is needed to evaluate potential project-related effects on a species-by-species basis. 3 refs., 4 tabs.

  7. Evaluation of candidate stromal epithelial cross-talk genes identifies association between risk of serous ovarian cancer and TERT, a cancer susceptibility "hot-spot"

    DEFF Research Database (Denmark)

    Johnatty, Sharon E; Beesley, Jonathan; Chen, Xiaoqing

    2010-01-01

    We hypothesized that variants in genes expressed as a consequence of interactions between ovarian cancer cells and the host micro-environment could contribute to cancer susceptibility. We therefore used a two-stage approach to evaluate common single nucleotide polymorphisms (SNPs) in 173 genes...

  8. Evaluation of candidate stromal epithelial cross-talk genes identifies association between risk of serous ovarian cancer and TERT, a cancer susceptibility "hot-spot"

    DEFF Research Database (Denmark)

    Johnatty, Sharon E; Beesley, Jonathan; Chen, Xiaoqing

    2010-01-01

    We hypothesized that variants in genes expressed as a consequence of interactions between ovarian cancer cells and the host micro-environment could contribute to cancer susceptibility. We therefore used a two-stage approach to evaluate common single nucleotide polymorphisms (SNPs) in 173 genes in...

  9. Comparative evaluation of low-molecular-mass proteins from Mycobacterium tuberculosis identifies members of the ESAT-6 family as immunodominant T-cell antigens

    DEFF Research Database (Denmark)

    Skjøt, R L; Oettinger, T; Rosenkrands, I

    2000-01-01

    Culture filtrate from Mycobacterium tuberculosis contains protective antigens of relevance for the generation of a new antituberculosis vaccine. We have identified two previously uncharacterized M. tuberculosis proteins (TB7.3 and TB10.4) from the highly active low-mass fraction of culture filtrate. The molecules were characterized, mapped in a two-dimensional electrophoresis reference map of short-term culture filtrate, and compared with another recently identified low-mass protein, CFP10 (F. X. Berthet, P. B. Rasmussen, I. Rosenkrands, P. Andersen, and B. Gicquel. Microbiology 144:3195-3203, 1998), and the well-described ESAT-6 antigen. Genetic analyses demonstrated that TB10.4 as well as CFP10 belongs to the ESAT-6 family of low-mass proteins, whereas TB7.3 is a low-molecular-mass protein outside this family. The proteins were expressed in Escherichia coli, and their immunogenicity was tested…

  10. Evaluation of the Reliability of Electronic Medical Record Data in Identifying Comorbid Conditions among Patients with Advanced Non-Small Cell Lung Cancer

    International Nuclear Information System (INIS)

    Muehlenbein, C. E.; Lawson, A.; Pohl, G.; Hoverman, R.; Gruschkus, S. K.; Forsyth, M.; Chen, C.; Lopez, W.; Hartnett, H. J.

    2011-01-01

    Traditional methods for identifying comorbidity data in EMRs have relied primarily on costly and time-consuming manual chart review. The purpose of this study was to validate a strategy of electronically searching EMR data to identify comorbidities among cancer patients. Methods. Advanced-stage NSCLC patients (N = 2,513) who received chemotherapy from 7/1/2006 to 6/30/2008 were identified using iKnowMed, US Oncology's proprietary oncology-specific EMR system. EMR data were searched for documentation of comorbidities common to advanced-stage cancer patients. The search was conducted by a series of programmatic queries on standardized information including concomitant illnesses, patient history, review of systems, and diagnoses other than cancer. The validity of the comorbidity information derived from the EMR search was compared to the chart-review gold standard in a random sample of 450 patients for whom the EMR search yielded no indication of comorbidities. Negative predictive values were calculated. Results. The overall prevalence of comorbidities was 22%. The overall negative predictive value was 0.92 in the 450 randomly sampled patients (36 of 450 were found to have evidence of comorbidities on chart review). Conclusion. Results of this study suggest that efficient queries/text searches of EMR data may provide reliable data on comorbid conditions among cancer patients.
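    As a quick check of the reported figure, the negative predictive value follows directly from the numbers in the abstract: of the 450 sampled patients whose EMR search showed no comorbidities, 36 had comorbidities on chart review, leaving 414 true negatives.

```latex
\mathrm{NPV} \;=\; \frac{\mathrm{TN}}{\mathrm{TN} + \mathrm{FN}}
             \;=\; \frac{450 - 36}{450}
             \;=\; \frac{414}{450}
             \;\approx\; 0.92
```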

  11. Building a Laboratory-Scale Biogas Plant and Verifying its Functionality

    Science.gov (United States)

    Boleman, Tomáš; Fiala, Jozef; Blinová, Lenka; Gerulová, Kristína

    2011-01-01

    The paper deals with the process of building a laboratory-scale biogas plant and verifying its functionality. The laboratory-scale prototype was constructed in the Department of Safety and Environmental Engineering at the Faculty of Materials Science and Technology in Trnava, of the Slovak University of Technology. The Department has already built a solar laboratory to promote and utilise solar energy, and has designed the SETUR hydro engine. The biogas laboratory is the next step in the Department's activities in the field of renewable energy sources and biomass. The Department is also involved in a European Union project whose goal is to upgrade all existing renewable energy sources used in the Department.

  12. Verifying Quality of Service of ARCnet Based ATOMOS Communication System for Integrated Ship Control

    DEFF Research Database (Denmark)

    Nielsen, N.N.; Nielsen, Jens Frederik Dalsgaard; Schiøler, Henrik

    As part of the ATOMOS project (Funded by EU, DG VII) a reliable communication system with predictable behaviour has been designed. The selected solution is a network based on redundant ARCnet segments extended with an EN50170 compliant fieldbus based layer on top of an ARCnet SAP (service access point) layer. An important characteristic of the communication system is that the functionality and timing must be verifiable in order to satisfy requirements from classification companies like Lloyds and Norsk Veritas. By including Service Categories, Traffic Descriptors and Quality of Service concepts…

  13. Verifying Quality of Service of ARCnet Based ATOMOS Communication System for Integrated Ship Control

    DEFF Research Database (Denmark)

    Nielsen, N.N.; Nielsen, Jens Frederik Dalsgaard; Schiøler, Henrik

    1999-01-01

    As part of the ATOMOS project (Funded by EU, DG VII) a reliable communication system with predictable behaviour has been designed. The selected solution is a network based on redundant ARCnet segments extended with an EN50170 compliant fieldbus based layer on top of an ARCnet SAP (service access point) layer. An important characteristic of the communication system is that the functionality and timing must be verifiable in order to satisfy requirements from classification companies like Lloyds and Norsk Veritas. By including Service Categories, Traffic Descriptors and Quality of Service concepts…

  14. National, Regional and Global Certification Bodies for Polio Eradication: A Framework for Verifying Measles Elimination.

    Science.gov (United States)

    Deblina Datta, S; Tangermann, Rudolf H; Reef, Susan; William Schluter, W; Adams, Anthony

    2017-07-01

    The Global Certification Commission (GCC), Regional Certification Commissions (RCCs), and National Certification Committees (NCCs) provide a framework of independent bodies to assist the Global Polio Eradication Initiative (GPEI) in certifying and maintaining polio eradication in a standardized, ongoing, and credible manner. Their members meet regularly to comprehensively review population immunity, surveillance, laboratory, and other data to assess polio status in the country (NCC), World Health Organization (WHO) region (RCC), or globally (GCC). These highly visible bodies provide a framework to be replicated to independently verify measles and rubella elimination in the regions and globally. © The Author 2017. Published by Oxford University Press for the Infectious Diseases Society of America.

  15. RETRANS - A tool to verify the functional equivalence of automatically generated source code with its specification

    International Nuclear Information System (INIS)

    Miedl, H.

    1998-01-01

    According to the relevant technical standards (e.g., IEC 880), it is necessary to verify each step in the development process of safety-critical software. This also holds for the verification of automatically generated source code. To avoid human errors during this verification step and to limit the cost, a tool should be used that is developed independently of the code generator. For this purpose, ISTec has developed the tool RETRANS, which demonstrates the functional equivalence of automatically generated source code with its underlying specification. (author)

  16. On the safe use of verify-and-record systems in external beam radiation therapy

    International Nuclear Information System (INIS)

    Seelantag, W.W.; Davis, J.B.

    1997-01-01

    Verify-and-record (V and R) systems are being used increasingly, not only for verification, but also for computer-aided setup and chart printing. The close interrelation between the V and R system and the treatment routine requires new ideas for quality assurance (QA): pure "machine checking", as with treatment units, is no longer sufficient. The level of QA obviously depends on the tasks of the V and R system: the most advanced case, in which the system is used for computer-aided setup and for chart printing, is discussed; both are indispensable for efficient use of V and R systems. Seven propositions are defined to make this use not only efficient but also safe. (author)

  17. Experimental observation of G banding verifying X-ray workers' chromosome translocation detected by FISH

    International Nuclear Information System (INIS)

    Sun Yuanming; Li Jin; Wang Qin; Tang Weisheng; Wang Zhiquan

    2002-01-01

    Objective: FISH is the most effective way of detecting chromosome aberrations, but many factors affect its accuracy. G-banding was used to verify the chromosome translocation results of early X-ray workers examined by FISH. Methods: The chromosome translocations of early X-ray workers were analysed by FISH (fluorescence in situ hybridization) and G-banding, and the translocation yields were analysed statistically. Results: The chromosome aberration frequencies obtained by the two methods are closely correlated. Conclusion: FISH is a feasible way to analyse the chromosome aberrations of X-ray workers and to reconstruct dose.

  18. Verifying the competition between haloperidol and biperiden in serum albumin through a model based on spectrofluorimetry

    Science.gov (United States)

    Muniz da Silva Fragoso, Viviane; Patrícia de Morais e Coura, Carla; Paulino, Erica Tex; Valdez, Ethel Celene Narvaez; Silva, Dilson; Cortez, Celia Martins

    2017-11-01

    The aim of this work was to apply mathematical-computational modeling to study the interactions of haloperidol (HLP) and biperiden (BPD) with human (HSA) and bovine (BSA) serum albumin, in order to verify the competition of these drugs for binding sites in HSA, using intrinsic tryptophan fluorescence quenching data. The association constants estimated at 37 °C were 2.17(±0.05) × 10⁷ M⁻¹ for HLP-HSA and 2.01(±0.03) × 10⁸ M⁻¹ for BPD-HSA. The results show that the drugs do not compete for the same binding sites in albumin.

  19. A new (k,n) verifiable secret image sharing scheme (VSISS)

    Directory of Open Access Journals (Sweden)

    Amitava Nag

    2014-11-01

    Full Text Available In this paper, a new (k,n) verifiable secret image sharing scheme (VSISS) is proposed in which a third-order LFSR (linear-feedback shift register)-based public key cryptosystem is applied for cheating prevention and preview before decryption. In the proposed scheme the secret image is first partitioned into several non-overlapping blocks of k pixels. Each block of k pixels is then used to form m=⌈k/4⌉+1 pixels of one encrypted share. The original secret image can be reconstructed by gathering any k or more encrypted shared images. The experimental results show that the proposed VSISS is an efficient and safe method.

  20. Force10 networks performance in world's first transcontinental 10 gigabit ethernet network verified by Ixia

    CERN Multimedia

    2003-01-01

    Force10 Networks, Inc., today announced that the performance of the Force10 E-Series switch/routers deployed in a transcontinental network has been verified as line-rate 10 GE throughput by Ixia, a leading provider of high-speed, network performance and conformance analysis systems. The network, the world's first transcontinental 10 GE wide area network, consists of a SURFnet OC-192 lambda between Geneva and the StarLight facility in Chicago via Amsterdam and another OC-192 lambda between this same facility in Chicago and Carleton University in Ottawa, Canada provided by CANARIE and ORANO (1/2 page).

  1. Verifying Elimination Programs with a Special Emphasis on Cysticercosis Endpoints and Postelimination Surveillance

    Directory of Open Access Journals (Sweden)

    Sukwan Handali

    2012-01-01

    Full Text Available Methods are needed for determining program endpoints or postprogram surveillance for any elimination program. Cysticercosis has the necessary effective strategies and diagnostic tools for establishing an elimination program; however, tools to verify program endpoints have not been determined. Using a statistical approach, the present study proposed that taeniasis and porcine cysticercosis antibody assays could be used to determine with a high statistical confidence whether an area is free of disease. Confidence would be improved by using secondary tests such as the taeniasis coproantigen assay and necropsy of the sentinel pigs.

  2. Comparative evaluation of low-molecular-mass proteins from Mycobacterium tuberculosis identifies members of the ESAT-6 family as immunodominant T-cell antigens

    DEFF Research Database (Denmark)

    Skjøt, Rikke L. V.; Oettinger, Thomas; Rosenkrands, Ida

    2000-01-01

    The molecules were characterized, mapped in a two-dimensional electrophoresis reference map of short-term culture filtrate, and compared with another recently identified low-mass protein, CFP10 (F. X. Berthet, P. B. Rasmussen, I. Rosenkrands, P. Andersen, and B. Gicquel. Microbiology 144:3195-3203, 1998), and the well-described ESAT-6 antigen. Genetic analyses demonstrated that TB10.4 as well as CFP10 belongs to the ESAT-6 family of low-mass proteins, whereas TB7.3 is a low-molecular-mass protein outside this family. The proteins were expressed in Escherichia coli, and their immunogenicity was tested…

  3. Using Integer Clocks to Verify the Timing-Sync Sensor Network Protocol

    Science.gov (United States)

    Huang, Xiaowan; Singh, Anu; Smolka, Scott A.

    2010-01-01

    We use the UPPAAL model checker for Timed Automata to verify the Timing-Sync time-synchronization protocol for sensor networks (TPSN). The TPSN protocol seeks to provide network-wide synchronization of the distributed clocks in a sensor network. Clock-synchronization algorithms for sensor networks such as TPSN must be able to perform arithmetic on clock values to calculate clock drift and network propagation delays. They must be able to read the value of a local clock and assign it to another local clock. Such operations are not directly supported by the theory of Timed Automata. To overcome this formal-modeling obstacle, we augment the UPPAAL specification language with the integer clock derived type. Integer clocks, which are essentially integer variables that are periodically incremented by a global pulse generator, greatly facilitate the encoding of the operations required to synchronize clocks as in the TPSN protocol. With this integer-clock-based model of TPSN in hand, we use UPPAAL to verify that the protocol achieves network-wide time synchronization and is devoid of deadlock. We also use the UPPAAL Tracer tool to illustrate how integer clocks can be used to capture clock drift and resynchronization during protocol execution
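    TPSN synchronizes a child node to its parent with a two-way timestamp exchange; under the usual symmetric-delay assumption, the clock offset and propagation delay follow from simple arithmetic on the four timestamps, which is exactly the kind of clock arithmetic the integer-clock encoding makes expressible. The sketch below uses plain integer tick values with illustrative numbers; it is not the UPPAAL model itself.

```python
# Illustration of the clock arithmetic TPSN performs during its two-way
# timestamp exchange, using plain integer "clock" values (ticks) in the spirit
# of the integer-clock encoding described above. T1/T4 are read on the child's
# clock, T2/T3 on the parent's clock; symmetric link delay is assumed.
def tpsn_offset_and_delay(t1, t2, t3, t4):
    offset = ((t2 - t1) - (t4 - t3)) // 2      # parent_clock - child_clock
    delay = ((t2 - t1) + (t4 - t3)) // 2       # one-way propagation delay
    return offset, delay

# Example: the child clock lags the parent by 40 ticks; one-way delay is 3 ticks.
t1 = 1000            # child sends sync request (child clock)
t2 = 1000 + 40 + 3   # parent receives it (parent clock)
t3 = 1050            # parent sends ack (parent clock)
t4 = 1050 - 40 + 3   # child receives ack (child clock)

offset, delay = tpsn_offset_and_delay(t1, t2, t3, t4)
print(offset, delay)               # 40 3
child_clock_corrected = t4 + offset   # resynchronize the child's integer clock
```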

  4. An improved system to verify CANDU spent fuel elements in dry storage silos

    International Nuclear Information System (INIS)

    Almeida, Gevaldo L. de; Soares, Milton G.; Filho, Anizio M.; Martorelli, Daniel S.; Fonseca, Manoel

    2000-01-01

    An improved system to verify CANDU spent fuel elements stored in dry storage silos was developed. It consists of a mechanical device that moves a semiconductor detector along a vertical verification pipe incorporated into the silo, and a modified portable multi-channel analyzer. The mechanical device contains a winding drum accommodating a cable from which the detector hangs, in such a way that the drum rotates as the detector descends under its own weight. The detector is coupled to the multi-channel analyzer operating in multi-scaler mode, generating a spectrum of total counts against time. To assure a linear transformation of time into detector position, the mechanical device dictating the detector speed is controlled by the multi-channel analyzer. This control is performed via a clock-type escapement device activated by a solenoid. Whenever the multi-channel analyzer shifts to the next channel, the associated pulse is amplified, powering the solenoid and causing the drum to rotate by a fixed angle. Spectra taken in the laboratory using radioactive sources have shown good reproducibility. This qualifies the system to be used as equipment to obtain a fingerprint of the overall distribution of the fuel elements along the silo axis and, hence, to verify possible diversion of the nuclear material by comparing spectra taken at consecutive safeguards inspections. The whole system is battery operated and is thus capable of operating in the field where no power supply is available. (author)

  5. MUSE: An Efficient and Accurate Verifiable Privacy-Preserving Multikeyword Text Search over Encrypted Cloud Data

    Directory of Open Access Journals (Sweden)

    Zhu Xiangyang

    2017-01-01

    Full Text Available With the development of cloud computing, the outsourcing of services to clouds has become a popular business model. However, because data storage and computing are completely outsourced to the cloud service provider, sensitive data of data owners are exposed, which could lead to serious privacy disclosure. In addition, some unexpected events, such as software bugs and hardware failures, could cause incomplete or incorrect results to be returned from clouds. In this paper, we propose an efficient and accurate verifiable privacy-preserving multikeyword text search over encrypted cloud data based on hierarchical agglomerative clustering, named MUSE. In order to improve the efficiency of text searching, we propose a novel index structure, the HAC-tree, which is based on a hierarchical agglomerative clustering method and tends to gather high-relevance documents in clusters. Based on the HAC-tree, a noncandidate-pruning depth-first search algorithm is proposed, which can filter out unqualified subtrees and thus accelerate the search process. The secure inner product algorithm is used to encrypt the HAC-tree index and the query vector. Meanwhile, a completeness verification algorithm is given to verify the search results. Experimental results demonstrate that the proposed method outperforms the existing works, DMRS and MRSE-HCI, in efficiency and accuracy, respectively.
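    MUSE performs its search over encrypted vectors with secure inner products; the plaintext sketch below, under assumed data structures and names, only illustrates the pruning idea: each internal node of a cluster tree keeps a per-dimension upper bound over its subtree, so any subtree whose bound cannot reach the relevance threshold is skipped during the depth-first search (assuming non-negative query weights).

```python
# Plaintext sketch of pruned depth-first search over a hierarchical cluster
# tree: each internal node stores a per-dimension upper bound over its subtree,
# so subtrees whose bound (dotted with a non-negative query) cannot reach the
# threshold are skipped. MUSE performs comparable steps over encrypted vectors.
from dataclasses import dataclass, field
from typing import List, Optional

@dataclass
class Node:
    bound_vector: List[float]                 # per-dimension max over the subtree
    children: List["Node"] = field(default_factory=list)
    doc_id: Optional[str] = None              # set only on leaves
    doc_vector: Optional[List[float]] = None  # set only on leaves

def dot(u, v):
    return sum(a * b for a, b in zip(u, v))

def pruned_dfs(node, query, threshold, results):
    if dot(node.bound_vector, query) < threshold:
        return                                # whole subtree cannot qualify
    if node.doc_id is not None:               # leaf: score the document itself
        score = dot(node.doc_vector, query)
        if score >= threshold:
            results.append((node.doc_id, score))
        return
    for child in node.children:
        pruned_dfs(child, query, threshold, results)

leaf1 = Node([0.9, 0.1], doc_id="d1", doc_vector=[0.9, 0.1])
leaf2 = Node([0.2, 0.8], doc_id="d2", doc_vector=[0.2, 0.8])
root = Node([0.9, 0.8], children=[leaf1, leaf2])
hits = []
pruned_dfs(root, [1.0, 0.0], 0.5, hits)
print(hits)   # [('d1', 0.9)]; the d2 subtree is pruned (bound 0.2 < 0.5)
```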

  6. Study and survey of assembling parameters to a radioactive source production laboratory used to verify equipment

    International Nuclear Information System (INIS)

    Gauglitz, Erica

    2010-01-01

    This paper presents a survey of parameters for the proper and safe design of flooring, doors, windows, fume hoods and other items in a radiochemical laboratory. The layout of each item follows the guidelines and national standards of the National Commission of Nuclear Energy (CNEN) and the International Atomic Energy Agency (IAEA), aiming to ensure the radiological protection of workers and the environment. The adequate arrangement of items in the radiochemical laboratory ensures quality and safety in the production of 57Co, 137Cs and 133Ba sealed radioactive sources, with activities of 185, 9.3 and 5.4 MBq, respectively. These sources are used to verify activity-measuring equipment and should be available throughout the Nuclear Medicine Center, following the recommendations of standard CNEN-NN-3.05, "Requirements for Radiation Protection and Safety Services for Nuclear Medicine", to verify the activity of the radiopharmaceuticals administered to patients for diagnosis and therapy. Verification of the activity-measuring equipment comprises accuracy, reproducibility and linearity tests, whose results should be within the limits specified in standard CNEN-NN-3.05. (author)

  7. An improved system to verify CANDU spent fuel elements in dry storage silos

    Energy Technology Data Exchange (ETDEWEB)

    Almeida, Gevaldo L. de; Soares, Milton G.; Filho, Anizio M.; Martorelli, Daniel S.; Fonseca, Manoel [Instituto de Engenharia Nuclear (IEN), Rio de Janeiro, RJ (Brazil)

    2000-07-01

    An improved system to verify CANDU spent fuel elements stored in dry storage silos was developed. It is constituted by a mechanical device which moves a semi-conductor detector along a vertical verification pipe incorporated to the silo, and a modified portable multi-channel analyzer. The mechanical device contains a winding drum accommodating a cable hanging the detector, in such a way that the drum rotates as the detector goes down due to its own weight. The detector is coupled to the multi-channel analyzer operating in the multi-scaler mode, generating therefore a spectrum of total counts against time. To assure a linear transformation of time into detector position, the mechanical device dictating the detector speed is controlled by the multi-channel analyzer. This control is performed via a clock type escapement device activated by a solenoid. Whenever the multi-channel analyzer shifts to the next channel, the associated pulse is amplified, powering the solenoid causing the drum to rotate a fixed angle. Spectra taken in laboratory, using radioactive sources, have shown a good reproducibility. This qualify the system to be used as an equipment to get a fingerprint of the overall distribution of the fuel elements along the silo axis, and hence, to verify possible diversion of the nuclear material by comparing spectra taken at consecutive safeguards inspections. All the system is battery operated, being thus capable to operate in the field where no power supply is available. (author)

  8. Independent technique of verifying high-dose rate (HDR) brachytherapy treatment plans

    International Nuclear Information System (INIS)

    Saw, Cheng B.; Korb, Leroy J.; Darnell, Brenda; Krishna, K. V.; Ulewicz, Dennis

    1998-01-01

    Purpose: An independent technique for verifying high-dose rate (HDR) brachytherapy treatment plans has been formulated and validated clinically. Methods and Materials: In HDR brachytherapy, dwell times at respective dwell positions are computed, using an optimization algorithm in a HDR treatment-planning system to deliver a specified dose to many target points simultaneously. Because of the variability of dwell times, concerns have been expressed regarding the ability of the algorithm to compute the correct dose. To address this concern, a commercially available low-dose rate (LDR) algorithm was used to compute the doses at defined distances, based on the dwell times obtained from the HDR treatment plans. The percent deviation between doses computed using the HDR and LDR algorithms were reviewed for HDR procedures performed over the last year. Results: In this retrospective study, the difference between computed doses using the HDR and LDR algorithms was found to be within 5% for about 80% of the HDR procedures. All of the reviewed procedures have dose differences of less than 10%. Conclusion: An independent technique for verifying HDR brachytherapy treatment plans has been validated based on clinical data. Provided both systems are available, this technique is universal in its applications and not limited to either a particular implant applicator, implant site, or implant type
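    The check described above recomputes point doses from the planned dwell positions and times with an independent (LDR) algorithm and reports the percent deviation. The sketch below uses a bare inverse-square point-source approximation (ignoring the full brachytherapy dose formalism) purely to illustrate the comparison step; all numbers are hypothetical.

```python
# Simplified sketch of the independent check: recompute the dose at a point
# from the planned dwell positions/times with a second, simpler model and
# report the percent deviation from the planning system's value. The
# inverse-square point-source approximation below ignores anisotropy and
# radial dose functions and is for illustration only.
def point_dose(dwells, point, source_strength):
    """dwells: [(x, y, z, dwell_time_s)]; returns a relative dose value."""
    dose = 0.0
    for x, y, z, t in dwells:
        r2 = (x - point[0]) ** 2 + (y - point[1]) ** 2 + (z - point[2]) ** 2
        dose += source_strength * t / r2          # inverse-square fall-off
    return dose

def percent_deviation(independent, planned):
    return 100.0 * (independent - planned) / planned

dwells = [(0.0, 0.0, z, 12.5) for z in (0.0, 0.5, 1.0, 1.5)]   # hypothetical plan (cm, s)
planned_dose = 465.0                              # hypothetical value from the planning system
check = point_dose(dwells, point=(2.0, 0.0, 0.75), source_strength=40.7)
print(round(percent_deviation(check, planned_dose), 1))   # ~1.8; flag if it exceeds ~5-10%
```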

  9. How to Verify Plagiarism of the Paper Written in Macedonian and Translated in Foreign Language?

    Science.gov (United States)

    Spiroski, Mirko

    2016-03-15

    The aim of this study was to show how to verify plagiarism of a paper written in Macedonian and translated into a foreign language. The original article "Ethics in Medical Research Involving Human Subjects", written in Macedonian, was submitted as Essay 2 for the subject Ethics and published by Ilina Stefanovska, PhD candidate from the Iustinianus Primus Faculty of Law, Ss Cyril and Methodius University of Skopje (UKIM), Skopje, Republic of Macedonia, in February 2013. The article suspected of plagiarism was published by Prof. Dr. Gordana Panova from the Faculty of Medical Sciences, University Goce Delchev, Shtip, Republic of Macedonia, in English, with an identical title and identical content, in the international scientific on-line journal "SCIENCE & TECHNOLOGIES", published by the "Union of Scientists - Stara Zagora". The original document (written in Macedonian) was translated with Google Translate; the suspected article (published as an English PDF file) was converted into a Word document; and the two documents were compared with several programs for plagiarism detection. The two documents were found to be 71%, 78% and 82% identical, respectively, depending on the computer program used for plagiarism detection. It was obvious that the original paper was entirely plagiarised by Prof. Dr. Gordana Panova, including six references from the original paper. Plagiarism of original papers written in Macedonian and translated into other languages can thus be verified after computerised translation. The original and translated documents can then be compared with available software for plagiarism detection.
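    The comparison step in the study was done with several dedicated plagiarism-detection programs; as a tool-agnostic illustration, a rough similarity ratio between the machine-translated original and the suspect text can be computed with Python's standard difflib. The texts below are placeholders, not the documents from the study.

```python
# Minimal, tool-agnostic illustration of the comparison step: after the
# Macedonian original has been machine-translated to English, compute a
# word-level similarity ratio between the translated text and the suspect
# paper. difflib is a standard-library approximation, not one of the
# plagiarism detectors used in the study; the texts below are placeholders.
import difflib

def similarity_percent(text_a: str, text_b: str) -> float:
    a = text_a.lower().split()
    b = text_b.lower().split()
    return 100.0 * difflib.SequenceMatcher(None, a, b).ratio()

translated_original = "ethics in medical research involving human subjects requires informed consent"
suspect_paper = "ethics in medical research involving human subjects requires informed consent"
print(f"overlap: {similarity_percent(translated_original, suspect_paper):.0f}%")
```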

  10. An experimental method to verify soil conservation by check dams on the Loess Plateau, China.

    Science.gov (United States)

    Xu, X Z; Zhang, H W; Wang, G Q; Chen, S C; Dang, W Q

    2009-12-01

    A successful experiment with a physical model requires the necessary conditions of similarity. This study presents an experimental method using a semi-scale physical model. The model is used to monitor and verify soil conservation by check dams in a small watershed on the Loess Plateau of China. During the experiments, the model-prototype ratio of geomorphic variables was kept constant under each rainfall event. Consequently, experimental data are available for verification of soil erosion processes in the field and for predicting soil loss in a model watershed with check dams. Thus, it can predict the amount of soil loss in a catchment. This study also sets out four criteria: similarities of watershed geometry, of grain size and bare land, of the Froude number (Fr) for the rainfall event, and of soil erosion in the downscaled models. The efficacy of the proposed method was confirmed using these criteria in two different downscaled model experiments. The B-Model, a large-scale model, simulates the watershed prototype. The two small-scale models, D(a) and D(b), have different erosion rates but are the same size. These two models simulate the hydraulic processes in the B-Model. Experimental results show that when the soil loss in the small-scale models was converted by multiplying it by the soil-loss scale number, it was very close to that of the B-Model. Thus, with a semi-scale physical model, experiments can be used to verify and predict soil loss in a small watershed area with a check dam system on the Loess Plateau, China.

  11. Alternate approaches to verifying the structural adequacy of the Defense High Level Waste Shipping Cask

    International Nuclear Information System (INIS)

    Zimmer, A.; Koploy, M.

    1991-12-01

    In the early 1980s, the US Department of Energy/Defense Programs (DOE/DP) initiated a project to develop a safe and efficient transportation system for defense high level waste (DHLW). A long-standing objective of the DHLW transportation project is to develop a truck cask that represents the leading edge of cask technology as well as one that fully complies with all applicable DOE, Nuclear Regulatory Commission (NRC), and Department of Transportation (DOT) regulations. General Atomics (GA) designed the DHLW Truck Shipping Cask using state-of-the-art analytical techniques verified by model testing performed by Sandia National Laboratories (SNL). The analytical techniques include two approaches, inelastic analysis and elastic analysis. This topical report presents the results of the two analytical approaches and the model testing results. The purpose of this work is to show that there are two viable analytical alternatives to verify the structural adequacy of a Type B package and to obtain an NRC license. In addition, these data will help to support the future acceptance by the NRC of inelastic analysis as a tool in packaging design and licensing

  12. Evaluation of a Three-Dimensional Stereophotogrammetric Method to Identify and Measure the Palatal Surface Area in Children With Unilateral Cleft Lip and Palate.

    Science.gov (United States)

    De Menezes, Marcio; Cerón-Zapata, Ana Maria; López-Palacio, Ana Maria; Mapelli, Andrea; Pisoni, Luca; Sforza, Chiarella

    2016-01-01

    To assess a three-dimensional (3D) stereophotogrammetric method for area delimitation and evaluation of the dental arches of children with unilateral cleft lip and palate (UCLP). The obtained data were also used to assess the 3D changes occurring in the maxillary arch with the use of orthopedic therapy prior to rhinocheiloplasty and before the first year of life. Within the collaboration between the Università degli Studi di Milano (Italy) and the University CES of Medellin (Colombia), 96 palatal cast models obtained from neonatal patients with UCLP were analyzed using a 3D stereophotogrammetric imaging system. The areas of the minor and greater cleft segments on the digital dental cast surface were delineated with the visualization tool of the stereophotogrammetric software and then examined. “Trueness” of the measurements, as well as systematic and random errors between operators' tracings (“precision”), were calculated. The method gave area measurements close to true values (errors lower than 2%), without systematic measurement errors for either interoperator or intraoperator tracings (P > .05). Statistically significant differences (P < .05) … digital dental casts and area measurements by the 3D stereophotogrammetric system revealed an accurate (true and precise) method for evaluating the stone casts of newborn patients with UCLP.

  13. Sleep Deprivation in Young and Healthy Subjects Is More Sensitively Identified by Higher Frequencies of Electrodermal Activity than by Skin Conductance Level Evaluated in the Time Domain

    Directory of Open Access Journals (Sweden)

    Hugo F. Posada-Quintero

    2017-06-01

    Full Text Available We analyzed multiple measures of the autonomic nervous system (ANS) based on electrodermal activity (EDA) and heart rate variability (HRV) for young healthy subjects undergoing 24-h sleep deprivation. In this study, we utilized the error awareness test (EAT) every 2 h (13 runs total) to evaluate the deterioration of performance. The EAT consists of trials in which the subject is presented with words representing colors. Subjects are instructed to press a button (“Go” trials) or withhold the response if the word presented and the color of the word mismatch (“Stroop No-Go” trials) or the screen is repeated (“Repeat No-Go” trials). We measured the subjects' (N = 10) reaction time on the “Go” trials, and accuracy on the “Stroop No-Go” and “Repeat No-Go” trials. Simultaneously, changes in EDA and HRV indices were evaluated. Furthermore, the relationship between reactiveness and vigilance measures and indices of sympathetic control based on HRV was analyzed. We found that performance improved to a stable level from 6 through 16 h of deprivation, with a subsequently sustained impairment after 18 h. Indices of the higher frequencies of EDA related more to vigilance measures, whereas the lower-frequency index (skin conductance level, SCL) measured the reactiveness of the subject. We conclude that indices of EDA, including those of the higher frequencies, termed TVSymp, EDASymp, and NSSCRs, provide information to better understand the effect of sleep deprivation on subjects' autonomic response and performance.

  14. Evaluation of serological and molecular tests used to identify Toxoplasma gondii infection in pregnant women attended in a public health service in São Paulo state, Brazil.

    Science.gov (United States)

    Murata, Fernando Henrique Antunes; Ferreira, Marina Neves; Pereira-Chioccola, Vera Lucia; Spegiorin, Lígia Cosentino Junqueira Franco; Meira-Strejevitch, Cristina da Silva; Gava, Ricardo; Silveira-Carvalho, Aparecida Perpétuo; de Mattos, Luiz Carlos; Brandão de Mattos, Cinara Cássia

    2017-09-01

    Toxoplasmosis during pregnancy can have severe consequences. The use of sensitive and specific serological and molecular methods is extremely important for the correct diagnosis of the disease. We compared the ELISA and ELFA serological methods, conventional PCR (cPCR), nested PCR and quantitative PCR (qPCR) in the diagnosis of Toxoplasma gondii infection in pregnant women without clinical suspicion of toxoplasmosis (G1=94) and with clinical suspicion of toxoplasmosis (G2=53). The results were compared using the Kappa index, and the sensitivity, specificity, positive predictive value and negative predictive value were calculated. The serological methods showed concordance between ELISA and ELFA, even though ELFA identified more positive cases than ELISA. The molecular methods were discrepant, with cPCR using B22/23 primers having greater sensitivity and lower specificity than the other molecular methods. Copyright © 2017 Elsevier Inc. All rights reserved.

  15. Development and evaluation of a decision-supporting model for identifying the source location of microbial intrusions in real gravity sewer systems.

    Science.gov (United States)

    Kim, Minyoung; Choi, Christopher Y; Gerba, Charles P

    2013-09-01

    Assuming a scenario of a hypothetical pathogenic outbreak, this study aimed to develop a decision-support model for identifying the location of a pathogenic intrusion as a means of facilitating rapid detection and efficient containment. The developed model was applied to a real sewer system (the Campbell wash basin in Tucson, AZ) in order to validate its feasibility. The basin under investigation was divided into 14 sub-basins. The geometric information associated with the sewer network was digitized using GIS (Geographic Information System) and imported into an urban sewer network simulation model to generate microbial breakthrough curves at the outlet. A pre-defined amount of Escherichia coli (E. coli), an indicator of fecal coliform bacteria, was hypothetically introduced into 56 manholes (four in each sub-basin, chosen at random), and a total of 56 breakthrough curves of E. coli were generated with the simulation model at the outlet. Transport patterns were classified depending upon the location of the injection site (manhole), various known characteristics (peak concentration and time, pipe length, travel time, etc.) extracted from each E. coli breakthrough curve, and the layout of the sewer network. Using this information, we back-predicted the injection location once an E. coli intrusion was detected at a monitoring site, using Artificial Neural Networks (ANNs). The results showed that ANNs identified the location of the injection sites with 57% accuracy; ANNs correctly recognized eight out of fourteen sub-basins when relying on data from a single detection sensor. Increasing the number of available sensors within the basin significantly improved the accuracy of the simulation results (from 57% to 100%). Copyright © 2013 Elsevier Ltd. All rights reserved.
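    A hedged sketch of the back-prediction step follows: features extracted from simulated breakthrough curves (such as peak concentration, time to peak, and travel time) labeled with their injection sub-basin train a small neural network, which then predicts the sub-basin for a newly observed curve. scikit-learn stands in for the ANN tooling actually used, and all feature values below are synthetic.

```python
# Sketch of the back-prediction step: features extracted from simulated
# E. coli breakthrough curves (peak concentration, time to peak, travel time)
# train a small neural network that predicts the injection sub-basin.
# The feature matrix is synthetic and purely illustrative.
import numpy as np
from sklearn.neural_network import MLPClassifier
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(0)
n_subbasins, per_basin = 14, 4                     # 14 sub-basins, 4 manholes each
X, y = [], []
for basin in range(n_subbasins):
    for _ in range(per_basin):
        peak_conc = 100 - 5 * basin + rng.normal(0, 2)
        time_to_peak = 10 + 3 * basin + rng.normal(0, 1)
        travel_time = 8 + 2.5 * basin + rng.normal(0, 1)
        X.append([peak_conc, time_to_peak, travel_time])
        y.append(basin)

model = make_pipeline(StandardScaler(),
                      MLPClassifier(hidden_layer_sizes=(32,), max_iter=5000, random_state=0))
model.fit(np.array(X), np.array(y))

observed_curve_features = [[62.0, 33.0, 27.0]]     # hypothetical detection at the outlet
print("predicted sub-basin:", model.predict(observed_curve_features)[0])
```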

  16. Athletic groin pain (part 2): a prospective cohort study on the biomechanical evaluation of change of direction identifies three clusters of movement patterns

    Science.gov (United States)

    Franklyn-Miller, A; Richter, C; King, E; Gore, S; Moran, K; Strike, S; Falvey, E C

    2017-01-01

    Background Athletic groin pain (AGP) is prevalent in sports involving repeated accelerations, decelerations, kicking and change-of-direction movements. Clinical and radiological examinations lack the ability to assess pathomechanics of AGP, but three-dimensional biomechanical movement analysis may be an important innovation. Aim The primary aim was to describe and analyse movements used by patients with AGP during a maximum effort change-of-direction task. The secondary aim was to determine if specific anatomical diagnoses were related to a distinct movement strategy. Methods 322 athletes with a current symptom of chronic AGP participated. Structured and standardised clinical assessments and radiological examinations were performed on all participants. Additionally, each participant performed multiple repetitions of a planned maximum effort change-of-direction task during which whole body kinematics were recorded. Kinematic and kinetic data were examined using continuous waveform analysis techniques in combination with a subgroup design that used gap statistic and hierarchical clustering. Results Three subgroups (clusters) were identified. Kinematic and kinetic measures of the clusters differed strongly in patterns observed in thorax, pelvis, hip, knee and ankle. Cluster 1 (40%) was characterised by increased ankle eversion, external rotation and knee internal rotation and greater knee work. Cluster 2 (15%) was characterised by increased hip flexion, pelvis contralateral drop, thorax tilt and increased hip work. Cluster 3 (45%) was characterised by high ankle dorsiflexion, thorax contralateral drop, ankle work and prolonged ground contact time. No correlation was observed between movement clusters and clinically palpated location of the participant's pain. Conclusions We identified three distinct movement strategies among athletes with long-standing groin pain during a maximum effort change-of-direction task. These movement strategies were not related to clinical

  17. Improving the identification of people with dementia in primary care: evaluation of the impact of primary care dementia coding guidance on identified prevalence.

    Science.gov (United States)

    Russell, Paul; Banerjee, Sube; Watt, Jen; Adleman, Rosalyn; Agoe, Belinda; Burnie, Nerida; Carefull, Alex; Chandan, Kiran; Constable, Dominie; Daniels, Mark; Davies, David; Deshmukh, Sid; Huddart, Martin; Jabin, Ashrafi; Jarrett, Penelope; King, Jenifer; Koch, Tamar; Kumar, Sanjoy; Lees, Stavroula; Mir, Sinan; Naidoo, Dominic; Nyame, Sylvia; Sasae, Ryuichiro; Sharma, Tushar; Thormod, Clare; Vedavanam, Krish; Wilton, Anja; Flaherty, Breda

    2013-12-23

    Improving dementia care is a policy priority nationally and internationally; there is a 'diagnosis gap' with less than half of the cases of dementia ever diagnosed. The English Health Department's Quality and Outcomes Framework (QOF) encourages primary care recognition and recording of dementia. The codes for dementia are complex, with the possibility of underidentification through miscoding. We developed guidance on the coding of dementia; we report the impact of applying this to 'clean up' dementia coding and records at a practice level. The guidance had five elements: (1) identify Read Codes for dementia; (2) access the QOF dementia register; (3) generate lists of patients who may have dementia; (4) compare the search with QOF data and (5) review cases. In each practice, one general practitioner conducted the exercise. The number of patients on the dementia QOF register before and after the exercise was recorded, along with the hours taken to complete the exercise. London primary care. 23 (85%) of 27 practices participated, covering 79 312 (19 562 over-65s) participants. The number of people on dementia QOF registers; time taken. The number of people with dementia on QOF registers increased from 1007 to 1139 (χ(2)=8.17, p=0.004), raising identification rates by 8.8%. It took 4.7 h per practice, on average. These data demonstrate the potential of a simple primary care coding exercise, requiring no specific training, to increase the dementia identification rate. An improvement of 8.8% between 2011 and 2012 is equivalent to that of the fourth most improved primary care trust in the UK. In absolute terms, if this effect were mirrored across UK primary care, the number of cases with dementia identified would rise by over 70 000, from 364 329 to 434 488, raising the recognition rate from 46% to 54.8%. Implementing this exercise appears to be a simple and effective way to improve recognition rates in primary care.

  18. Evaluation of sequence ambiguities of the HIV-1 pol gene as a method to identify recent HIV-1 infection in transmitted drug resistance surveys.

    Science.gov (United States)

    Andersson, Emmi; Shao, Wei; Bontell, Irene; Cham, Fatim; Cuong, Do Duy; Wondwossen, Amogne; Morris, Lynn; Hunt, Gillian; Sönnerborg, Anders; Bertagnolio, Silvia; Maldarelli, Frank; Jordan, Michael R

    2013-08-01

    Identification of recent HIV infection within populations is a public health priority for accurate estimation of HIV incidence rates and transmitted drug resistance at population level. Determining HIV incidence rates by prospective follow-up of HIV-uninfected individuals is challenging and serological assays have important limitations. HIV diversity within an infected host increases with duration of infection. We explore a simple bioinformatics approach to assess viral diversity by determining the percentage of ambiguous base calls in sequences derived from standard genotyping of HIV-1 protease and reverse transcriptase. Sequences from 691 recently infected (≤1 year) and chronically infected (>1 year) individuals from Sweden, Vietnam and Ethiopia were analyzed for ambiguity. A significant difference (p<0.0001) in the proportion of ambiguous bases was observed between sequences from individuals with recent and chronic infection in both HIV-1 subtype B and non-B infection, consistent with previous studies. In our analysis, a cutoff of <0.47% ambiguous base calls identified recent infection with a sensitivity and specificity of 88.8% and 74.6% respectively. 1,728 protease and reverse transcriptase sequences from 36 surveys of transmitted HIV drug resistance performed following World Health Organization guidance were analyzed for ambiguity. The 0.47% ambiguity cutoff was applied and survey sequences were classified as likely derived from recently or chronically infected individuals. 71% of patients were classified as likely to have been infected within one year of genotyping but results varied considerably amongst surveys. This bioinformatics approach may provide supporting population-level information to identify recent infection but its application is limited by infection with more than one viral variant, decreasing viral diversity in advanced disease and technical aspects of population based sequencing. Standardization of sequencing techniques and base calling
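    A small sketch of the measure described above: count the IUPAC ambiguity codes in a protease/RT genotype sequence, express them as a percentage of called bases, and apply the <0.47% cutoff reported in the abstract to flag a likely recent infection. Whether N counts as ambiguous is an implementation choice assumed here; the toy sequence is illustrative.

```python
# Sketch of the ambiguity measure described above: the fraction of IUPAC
# mixed-base (ambiguity) calls in a protease/RT genotype sequence, with the
# <0.47% cutoff from the abstract used to flag likely recent infection.
AMBIGUOUS = set("RYSWKMBDHVN")        # IUPAC codes for mixed bases; counting N is an assumption
UNAMBIGUOUS = set("ACGT")

def ambiguity_percent(sequence: str) -> float:
    seq = sequence.upper()
    counted = [b for b in seq if b in AMBIGUOUS or b in UNAMBIGUOUS]
    ambiguous = sum(1 for b in counted if b in AMBIGUOUS)
    return 100.0 * ambiguous / len(counted)

def likely_recent(sequence: str, cutoff_percent: float = 0.47) -> bool:
    return ambiguity_percent(sequence) < cutoff_percent

pol_sequence = "CCTCAAATCACTCTTTGGCARCGACCC" + "ACGT" * 300   # toy sequence with one mixed base (R)
print(round(ambiguity_percent(pol_sequence), 2), likely_recent(pol_sequence))   # 0.08 True
```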

  19. Preparation and in vivo evaluation of radioiodinated closo-decaborate(2-) derivatives to identify structural components that provide low retention in tissues

    International Nuclear Information System (INIS)

    Wilbur, D. Scott; Chyan, M.-K.; Hamlin, Donald K.; Perry, Matthew A.

    2010-01-01

    Introduction: In vivo deastatination of 211At-labeled biomolecules can severely limit their use in endoradiotherapy. Our studies have shown that the use of a closo-decaborate(2-) moiety for 211At-labeling of biomolecules provides high in vivo stability towards deastatination. However, data from those studies have also suggested that some astatinated closo-decaborate(2-) catabolites may be retained in tissues. In this study, we investigated the in vivo distributions of several structurally simple closo-decaborate(2-) derivatives to gain information on the effects of functional groups if catabolites are released into the blood system from the carrier biomolecule. Methods: Thirteen closo-decaborate(2-) derivatives were synthesized and radioiodinated for evaluation. Tissue concentrations of the radioiodinated compounds were obtained in groups of five mice at 1 and 4 h postinjection (pi). Dual-label (125I and 131I) experiments permitted evaluation of two compounds in each set of mice. Results: All of the target compounds were readily synthesized. Radioiodination reactions were conducted with chloramine-T and Na[125/131I]I in water to give high yields (75-96%) of the desired compounds. Biodistribution data at 1 and 4 h pi (representing catabolites released into the blood system) showed small differences in tissue concentrations for some compounds, but large differences for others. The results indicate that formal (overall) charge on the compounds could not be used as a predictor of tissue localization or retention. However, derivatives containing carboxylate groups generally had lower tissue concentrations. Acid-cleavable hydrazone functionalities appeared to be the best candidates for further study. Conclusions: Further studies incorporating hydrazone functionalities into pendant groups for biomolecule radiohalogenation are warranted.

  20. AN EVALUATION TO IDENTIFY THE BARRIERS TO THE FEASIBILITY OF URBAN DEVELOPMENT PLANS: FIVE DECADES OF EXPERIENCE IN URBAN PLANNING IN IRAN

    Directory of Open Access Journals (Sweden)

    Mohammad Javad Maghsoodi Tilaki

    2014-01-01

    Full Text Available The rapid urbanization in many developing countries has indicated several challenges in different aspects. This is due to inefficient urban planning approaches towards managing the development process. Similar to many other developing countries, Iran has experienced rapid urbanization in recent decades. Although over the last few decades, urban planning processes have been applied to develop Iranian cities, urban planning has failed to tackle the challenges facing the cities. This paper seeks to identify the barriers that have prevented Iranian cities from achieving the goals of urban planning. The purpose of this paper is to provide a comprehensive review of the current literature on the concept of urban planning and to assess the urban development plan process in Iranian cities. The required data were collected through a review of international theoretical studies, Iranian experimental research and governmental reports. The findings of this study reveal five major barriers to the feasibility of the urban planning process, including the urban plans context, structure of urban planning, related law and regulations, public participation, and financial resources.

  1. Lightweight ECC based RFID authentication integrated with an ID verifier transfer protocol.

    Science.gov (United States)

    He, Debiao; Kumar, Neeraj; Chilamkurti, Naveen; Lee, Jong-Hyouk

    2014-10-01

    The radio frequency identification (RFID) technology has been widely adopted and is being deployed as a dominant identification technology in health care domains such as medical information authentication, patient tracking, blood transfusion medicine, etc. With more and more stringent security and privacy requirements for RFID-based authentication schemes, elliptic curve cryptography (ECC) based RFID authentication schemes have been proposed to meet the requirements. However, many recently published ECC-based RFID authentication schemes have serious security weaknesses. In this paper, we propose a new ECC-based RFID authentication scheme integrated with an ID verifier transfer protocol that overcomes the weaknesses of the existing schemes. A comprehensive security analysis has been conducted to show the strong security properties provided by the proposed authentication scheme. Moreover, the performance of the proposed authentication scheme is analyzed in terms of computational cost, communication cost, and storage requirement.
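
    As a hedged illustration of how ECC primitives can support tag authentication in general (this is a generic challenge-response sketch using the pyca/cryptography package, not the protocol proposed in the paper, and the key-provisioning step is assumed):

```python
# Generic ECC challenge-response sketch; NOT the scheme from the paper.
import os
from cryptography.exceptions import InvalidSignature
from cryptography.hazmat.primitives import hashes
from cryptography.hazmat.primitives.asymmetric import ec

# Assumed provisioning: the tag holds an EC private key and the reader/back-end
# stores the matching public key under the tag's identifier.
tag_private_key = ec.generate_private_key(ec.SECP256R1())
stored_public_key = tag_private_key.public_key()

challenge = os.urandom(16)  # reader sends a fresh random challenge to the tag

# Tag responds with an ECDSA signature over the challenge.
response = tag_private_key.sign(challenge, ec.ECDSA(hashes.SHA256()))

# Reader verifies the response against the stored public key.
try:
    stored_public_key.verify(response, challenge, ec.ECDSA(hashes.SHA256()))
    print("tag authenticated")
except InvalidSignature:
    print("authentication failed")
```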

  2. New verifiable stationarity concepts for a class of mathematical programs with disjunctive constraints.

    Science.gov (United States)

    Benko, Matúš; Gfrerer, Helmut

    2018-01-01

    In this paper, we consider a sufficiently broad class of non-linear mathematical programs with disjunctive constraints, which include, e.g., mathematical programs with complementarity/vanishing constraints. We present an extension of the concept of [Formula: see text]-stationarity which can be easily combined with the well-known notion of M-stationarity to obtain the stronger property of so-called [Formula: see text]-stationarity. We show how the property of [Formula: see text]-stationarity (and thus also of M-stationarity) can be efficiently verified for the considered problem class by computing [Formula: see text]-stationary solutions of a certain quadratic program. We further consider the situation in which the point to be tested for [Formula: see text]-stationarity is not known exactly but is approximated by some convergent sequence, as is usually the case when applying a numerical method.
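
    For orientation, a standard form of a mathematical program with complementarity constraints, one member of the disjunctive class discussed above (generic notation, not taken from the paper):

    $$\min_{x \in \mathbb{R}^n} f(x) \quad \text{s.t.} \quad g(x) \le 0, \; h(x) = 0, \; 0 \le G(x) \perp H(x) \ge 0,$$

    where the complementarity condition means that for each component $i$ either $G_i(x) = 0$ and $H_i(x) \ge 0$, or $G_i(x) \ge 0$ and $H_i(x) = 0$; each index thus selects one of two branches, which is what makes the feasible set disjunctive.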

  3. The anterior choroidal artery syndrome. Pt. 2. CT and/or MR in angiographically verified cases

    International Nuclear Information System (INIS)

    Takahashi, S.; Ishii, K.; Matsumoto, K.; Higano, S.; Ishibashi, T.; Suzuki, M.; Sakamoto, K.

    1994-01-01

    We reviewed 12 cases of infarcts in the territory of the anterior choroidal artery (AChA) on CT and/or MRI. In each case vascular occlusion in the region was verified angiographically. Although the extent of the lesion on CT/MR images was variable, all were located on the axial images within an arcuate zone between the striatum anterolaterally and the thalamus posteromedially. The distribution of the lesions on multiplanar MRI conformed well to the territory of the AChA demonstrated microangiographically. The variability of the extent of the infarcts may be explained by variations in the degree of occlusive changes in the AChA or the development of collateral circulation through anastomoses between the AChA and the posterior communicating and posterior cerebral arteries. The extent of the lesion appeared to be closely related to the degree of neurological deficit. (orig.)

  4. Model Checking Artificial Intelligence Based Planners: Even the Best Laid Plans Must Be Verified

    Science.gov (United States)

    Smith, Margaret H.; Holzmann, Gerard J.; Cucullu, Gordon C., III; Smith, Benjamin D.

    2005-01-01

    Automated planning systems (APS) are gaining acceptance for use on NASA missions, as evidenced by APS flown on missions such as Orbiter and Deep Space 1, both of which were commanded by onboard planning systems. The planning system takes high-level goals and expands them onboard into a detailed plan of actions that the spacecraft executes. The system must be verified to ensure that the automatically generated plans achieve the goals as expected and do not generate actions that would harm the spacecraft or mission. These systems are typically tested using empirical methods. Formal methods, such as model checking, offer exhaustive or measurable test coverage, which leads to much greater confidence in correctness. This paper describes a formal method based on the SPIN model checker. This method guarantees that possible plans meet certain desirable properties. We express the input model in Promela, the language of SPIN, and express the properties of desirable plans formally.
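
    As a hypothetical example of the kind of property such a verification might check (not taken from the paper), a liveness requirement that every activated goal is eventually achieved can be stated in linear temporal logic as

    $$\Box\bigl(\mathit{goal\_activated} \rightarrow \Diamond\, \mathit{goal\_achieved}\bigr),$$

    typically alongside safety properties of the form $\Box\,\neg\,\mathit{unsafe\_command}$ that rule out actions which could harm the spacecraft.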

  5. Is it possible to verify directly a proton-treatment plan using positron emission tomography?

    International Nuclear Information System (INIS)

    Vynckier, S.; Derreumaux, S.; Richard, F.; Wambersie, A.; Bol, A.; Michel, C.

    1993-01-01

    A PET camera is used to visualize the positron activity induced during proton-beam therapy in order to verify directly the proton-treatment plans. The positron emitters created are predominantly 15O and 11C, whose total activity amounts to 12 MBq after an irradiation with 85 MeV protons delivering 3 Gy in a volume of approximately 300 cm3. Although this method is a useful verification of patient setup, care must be taken when deriving dose distributions from activity distributions. Correlation between the two quantities is difficult; moreover, in the last few millimeters of their range, protons no longer activate tissue. Due to the short half-lives, the PET camera must be located close to the treatment facility. (author) 17 refs

  6. Verifying detailed fluctuation relations for discrete feedback-controlled quantum dynamics

    Science.gov (United States)

    Camati, Patrice A.; Serra, Roberto M.

    2018-04-01

    Discrete quantum feedback control consists of a dynamics managed according to the information acquired by a previous measurement. Energy fluctuations along such dynamics satisfy generalized fluctuation relations, which are useful tools to study the thermodynamics of systems far away from equilibrium. Due to the practical challenge of assessing energy fluctuations in the quantum scenario, the experimental verification of detailed fluctuation relations in the presence of feedback control remains elusive. We present a feasible method to experimentally verify detailed fluctuation relations for discrete feedback control quantum dynamics. Two detailed fluctuation relations are developed and employed. The method is based on a quantum interferometric strategy that allows the verification of fluctuation relations in the presence of feedback control. An analytical example to illustrate the applicability of the method is discussed. The comprehensive technique introduced here can be experimentally implemented at a microscale with the current technology in a variety of experimental platforms.
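
    For orientation, the most widely cited integral fluctuation relation with discrete feedback (due to Sagawa and Ueda) reads

    $$\bigl\langle e^{-\beta (W - \Delta F) - I} \bigr\rangle = 1,$$

    where $W$ is the work, $\Delta F$ the free-energy difference, $\beta$ the inverse temperature and $I$ the information gained by the measurement; the detailed relations developed in the paper refine such equalities trajectory by trajectory and are not reproduced here.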

  7. Error prevention in radiotherapy treatments using a record and verify system

    International Nuclear Information System (INIS)

    Navarrete Campos, S.; Hernandez Vitoria, A.; Canellas Anoz, M.; Millan Cebrian, E.; Garcia Romero, A.

    2001-01-01

    Computerized record-and-verify systems (RVS) are being used increasingly to improve the precision of radiotherapy treatments. With the introduction of new treatment devices, such as multileaf or asymmetric collimators and virtual wedges, the responsibility to ensure correct treatment has increased. The purpose of this paper is to present the method that we are following to prevent some potential radiotherapy errors and to point out some errors that can be easily detected using an RVS, through a check of the daily recorded treatment information. We conclude that an RVS prevents the occurrence of many errors, when the settings of the treatment machine do not match the intended parameters within some maximal authorized deviation, and makes it easy to detect other potential errors related to an incorrect selection of the patient treatment data. A quality assurance program, including a check of all beam data and a weekly control of the manual and electronic chart, has helped reduce errors. (author)
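
    A minimal sketch, with invented parameter names and tolerances, of the kind of record-and-verify comparison described above: each machine setting is checked against the planned value within a maximal authorized deviation before treatment proceeds.

```python
# Illustrative only: planned vs. delivered parameter check with per-parameter
# tolerances. Parameter names, values and tolerances are hypothetical.
PLANNED   = {"gantry_angle_deg": 180.0, "field_x_cm": 10.0, "field_y_cm": 8.0, "mu": 120.0}
DELIVERED = {"gantry_angle_deg": 180.4, "field_x_cm": 10.0, "field_y_cm": 8.6, "mu": 120.0}
TOLERANCE = {"gantry_angle_deg": 0.5,   "field_x_cm": 0.2,  "field_y_cm": 0.2, "mu": 1.0}

def verify_settings(planned, delivered, tolerance):
    """Return the parameters whose deviation exceeds the authorized limit."""
    violations = []
    for name, plan_value in planned.items():
        deviation = abs(delivered[name] - plan_value)
        if deviation > tolerance[name]:
            violations.append((name, plan_value, delivered[name], deviation))
    return violations

errors = verify_settings(PLANNED, DELIVERED, TOLERANCE)
if errors:
    for name, plan, actual, dev in errors:
        print(f"BLOCK treatment: {name} planned {plan}, set {actual} (deviation {dev:.2f})")
else:
    print("All settings within authorized deviations; treatment may proceed.")
```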

  8. Reference Material Properties and Standard Problems to Verify the Fuel Performance Models Ver 1.0

    International Nuclear Information System (INIS)

    Yang, Yong Sik; Kim, Jae Yong; Koo, Yang Hyun

    2010-12-01

    All fuel performance models must be validated by in-pile and out-pile tests. However, model validation requires much effort and time to confirm the models' accuracy. In many fields, new performance models and codes are confirmed by a code-to-code benchmarking process based on simplified standard problem analysis. At present, the development project for DUOS, the steady-state fuel performance analysis code for dual-cooled annular fuel, is in progress, and a new FEM module has been developed to analyze fuel performance during transient periods. In addition, a verification process is planned to examine the correctness of the new models and module by comparison with commercial finite element analysis codes such as ADINA, ABAQUS and ANSYS. This report contains the results of the unification of material properties and the establishment of standard problems to verify the newly developed models against commercial FEM codes.

  9. How to verify lightning protection efficiency for electrical systems? Testing procedures and practical applications

    Energy Technology Data Exchange (ETDEWEB)

    Birkl, Josef; Zahlmann, Peter [DEHN and SOEHNE, Neumarkt (Germany)], Emails: Josef.Birkl@technik.dehn.de, Peter.Zahlmann@technik.dehn.de

    2007-07-01

    There is an increasing number of applications in which Surge Protective Devices (SPDs), through which partial lightning currents flow, are installed close to the highly sensitive electronic devices to be protected, because the design of electrical distribution systems and switchgear installations is becoming more and more compact. In these cases, the protective function of the SPDs has to be co-ordinated with the individual immunity of the equipment against energetic, conductive impulse voltages and impulse currents. In order to verify the immunity of the complete system against partial lightning currents, laboratory tests on a system level are a suitable approach. The proposed test schemes for complete systems have been successfully performed on various applications. Examples will be presented. (author)

  10. Evaluation of candidate stromal epithelial cross-talk genes identifies association between risk of serous ovarian cancer and TERT, a cancer susceptibility "hot-spot".

    Directory of Open Access Journals (Sweden)

    Sharon E Johnatty

    2010-07-01

    Full Text Available We hypothesized that variants in genes expressed as a consequence of interactions between ovarian cancer cells and the host micro-environment could contribute to cancer susceptibility. We therefore used a two-stage approach to evaluate common single nucleotide polymorphisms (SNPs) in 173 genes involved in stromal epithelial interactions in the Ovarian Cancer Association Consortium (OCAC). In the discovery stage, cases with epithelial ovarian cancer (n=675) and controls (n=1,162) were genotyped at 1,536 SNPs using an Illumina GoldenGate assay. Based on Positive Predictive Value estimates, three SNPs (PODXL rs1013368, ITGA6 rs13027811, and MMP3 rs522616) were selected for replication using TaqMan genotyping in up to 3,059 serous invasive cases and 8,905 controls from 16 OCAC case-control studies. An additional 18 SNPs with Pper-alleleor=0.5. However genotypes at TERT rs7726159 were associated with ovarian cancer risk in the smaller, five-study replication study (Pper-allele=0.03). Combined analysis of the discovery and replication sets for this TERT SNP showed an increased risk of serous ovarian cancer among non-Hispanic whites [adj. OR per-allele 1.14 (1.04-1.24), p=0.003]. Our study adds to the growing evidence that, like the 8q24 locus, the telomerase reverse transcriptase locus at 5p15.33 is a general cancer susceptibility locus.

  11. Evaluation of scoring models for identifying the need for therapeutic intervention of upper gastrointestinal bleeding: A new prediction score model for Japanese patients.

    Science.gov (United States)

    Iino, Chikara; Mikami, Tatsuya; Igarashi, Takasato; Aihara, Tomoyuki; Ishii, Kentaro; Sakamoto, Jyuichi; Tono, Hiroshi; Fukuda, Shinsaku

    2016-11-01

    Multiple scoring systems have been developed to predict outcomes in patients with upper gastrointestinal bleeding. We determined how well these and a newly established scoring model predict the need for therapeutic intervention, excluding transfusion, in Japanese patients with upper gastrointestinal bleeding. We reviewed data from 212 consecutive patients with upper gastrointestinal bleeding. Patients requiring endoscopic intervention, operation, or interventional radiology were allocated to the therapeutic intervention group. Firstly, we compared areas under the curve for the Glasgow-Blatchford, Clinical Rockall, and AIMS65 scores. Secondly, the scores and factors likely associated with upper gastrointestinal bleeding were analyzed with a logistic regression analysis to form a new scoring model. Thirdly, the new model and the existing model were investigated to evaluate their usefulness. Therapeutic intervention was required in 109 patients (51.4%). The Glasgow-Blatchford score was superior to both the Clinical Rockall and AIMS65 scores for predicting therapeutic intervention need (area under the curve, 0.75 [95% confidence interval, 0.69-0.81] vs 0.53 [0.46-0.61] and 0.52 [0.44-0.60], respectively). Multivariate logistic regression analysis retained seven significant predictors in the model: systolic blood pressure upper gastrointestinal bleeding. © 2016 Japan Gastroenterological Endoscopy Society.
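
    A brief sketch of the kind of discrimination comparison reported above, computing the area under the ROC curve for each score against the need for intervention with scikit-learn; the arrays are placeholders, not study data.

```python
# Illustrative only: comparing scoring systems by AUC for predicting the need
# for therapeutic intervention. Scores and outcomes below are made-up placeholders.
import numpy as np
from sklearn.metrics import roc_auc_score

needs_intervention = np.array([1, 0, 1, 1, 0, 0, 1, 0, 1, 0])  # 1 = intervention required
glasgow_blatchford = np.array([12, 3, 10, 9, 2, 9, 14, 1, 8, 4])
clinical_rockall   = np.array([4, 2, 3, 5, 1, 3, 4, 0, 2, 2])
aims65             = np.array([2, 0, 1, 2, 0, 1, 3, 0, 1, 1])

for name, score in [("Glasgow-Blatchford", glasgow_blatchford),
                    ("Clinical Rockall", clinical_rockall),
                    ("AIMS65", aims65)]:
    print(f"{name}: AUC = {roc_auc_score(needs_intervention, score):.2f}")
```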

  12. Quantitative evaluation of subchondral bone injury of the plantaro-lateral condyles of the third metatarsal bone in Thoroughbred horses identified using nuclear scintigraphy: 48 cases.

    Science.gov (United States)

    Parker, R A; Bladon, B M; Parkin, T D H; Fraser, B S L

    2010-09-01

    Increased radio-isotope uptake (IRU) in the subchondral bone of the plantaro-lateral condyle of the third metatarsus (MTIII) is a commonly reported scintigraphic finding and potential cause of lameness in UK Thoroughbred racehorses in training, but has not been fully documented. To characterise lameness attributable to IRU of the subchondral bone of MTIII, compare the scintigraphic findings of these horses with a normal population and evaluate the use of scintigraphy as an indicator of prognosis. IRU will be significantly higher in horses with subchondral bone injury and will be related to prognosis and future racing performance. Data were analysed from 48 horses in which subchondral bone injury of the plantaro-lateral condyle of MTIII had been diagnosed using nuclear scintigraphy and that met the inclusion criteria. Data recorded included age, sex, trainer, racing discipline, lameness assessment, treatment regimes, radiographic and scintigraphic findings, response to diagnostic analgesia where performed and racing performance pre- and post diagnosis. Region of interest (ROI) counts were obtained for the plantar condyle and the mid diaphysis from the latero-medial view, the ratio calculated and then compared with a control group of clinically unaffected horses. The mean condyle mid-diaphysis ROI ratio was significantly (PThoroughbred racehorses. Nuclear scintigraphy is a useful diagnostic imaging modality in the detection of affected horses but is a poor indicator of prognosis for the condition. Better understanding of the clinical manifestations, diagnosis of and prognosis for subchondral bone injury will benefit the Thoroughbred industry in the UK.

  13. Quality Assurance with Plan Veto: reincarnation of a record and verify system and its potential value.

    Science.gov (United States)

    Noel, Camille E; Gutti, Veerarajesh; Bosch, Walter; Mutic, Sasa; Ford, Eric; Terezakis, Stephanie; Santanam, Lakshmi

    2014-04-01

    To quantify the potential impact of the Integrating the Healthcare Enterprise-Radiation Oncology Quality Assurance with Plan Veto (QAPV) on patient safety of external beam radiation therapy (RT) operations. An institutional database of events (errors and near-misses) was used to evaluate the ability of QAPV to prevent clinically observed events. We analyzed reported events that were related to Digital Imaging and Communications in Medicine RT plan parameter inconsistencies between the intended treatment (on the treatment planning system) and the delivered treatment (on the treatment machine). Critical Digital Imaging and Communications in Medicine RT plan parameters were identified. Each event was scored for importance using the Failure Mode and Effects Analysis methodology. Potential error occurrence (frequency) was derived according to the collected event data, along with the potential event severity, and the probability of detection with and without the theoretical implementation of the QAPV plan comparison check. Failure Mode and Effects Analysis Risk Priority Numbers (RPNs) with and without QAPV were compared to quantify the potential benefit of clinical implementation of QAPV. The implementation of QAPV could reduce the RPN values for 15 of 22 (71%) of evaluated parameters, with an overall average reduction in RPN of 68 (range, 0-216). For the 6 high-risk parameters (>200), the average reduction in RPN value was 163 (range, 108-216). The RPN value reduction for the intermediate-risk (200 > RPN > 100) parameters was (0-140). With QAPV, the largest RPN value for "Beam Meterset" was reduced from 324 to 108. The maximum reduction in RPN value was for Beam Meterset (216, 66.7%), whereas the maximum percentage reduction was for Cumulative Meterset Weight (80, 88.9%). This analysis quantifies the value of the Integrating the Healthcare Enterprise-Radiation Oncology QAPV implementation in clinical workflow. We demonstrate that although QAPV does not provide a
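
    A small sketch of the Failure Mode and Effects Analysis arithmetic used above: the Risk Priority Number is the product of severity, occurrence and detection scores, and the benefit of QAPV appears as a drop in RPN when the detection score improves. The individual scores below are hypothetical, chosen so that the product reproduces the Beam Meterset figures quoted in the abstract (324 to 108).

```python
# Illustrative FMEA arithmetic; the severity/occurrence/detection scores are invented.
def rpn(severity: int, occurrence: int, detection: int) -> int:
    """Risk Priority Number = severity x occurrence x detection (each typically 1-10)."""
    return severity * occurrence * detection

# Example failure mode: wrong "Beam Meterset" transferred to the treatment machine.
severity, occurrence = 9, 4
detection_without_qapv = 9  # unlikely to be caught by manual checks alone
detection_with_qapv = 3     # automated plan-parameter comparison catches most cases

before = rpn(severity, occurrence, detection_without_qapv)
after = rpn(severity, occurrence, detection_with_qapv)
print(f"RPN without QAPV: {before}")   # 324
print(f"RPN with QAPV:    {after}")    # 108
print(f"Reduction: {before - after} ({100 * (before - after) / before:.1f}%)")  # 216 (66.7%)
```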

  14. Quality Assurance With Plan Veto: Reincarnation of a Record and Verify System and Its Potential Value

    Energy Technology Data Exchange (ETDEWEB)

    Noel, Camille E. [Department of Radiation Oncology, Washington University School of Medicine, St. Louis, Missouri (United States); Gutti, VeeraRajesh [Department of Radiation Oncology, Scott and White Healthcare, Temple, Texas (United States); Bosch, Walter; Mutic, Sasa [Department of Radiation Oncology, Washington University School of Medicine, St. Louis, Missouri (United States); Ford, Eric [Department of Radiation Oncology, University of Washington Medical Center, Seattle, Washington (United States); Terezakis, Stephanie [Department of Radiation Oncology, Johns Hopkins University, Baltimore, Maryland (United States); Santanam, Lakshmi, E-mail: lsantanam@radonc.wustl.edu [Department of Radiation Oncology, Washington University School of Medicine, St. Louis, Missouri (United States)

    2014-04-01

    Purpose: To quantify the potential impact of the Integrating the Healthcare Enterprise–Radiation Oncology Quality Assurance with Plan Veto (QAPV) on patient safety of external beam radiation therapy (RT) operations. Methods and Materials: An institutional database of events (errors and near-misses) was used to evaluate the ability of QAPV to prevent clinically observed events. We analyzed reported events that were related to Digital Imaging and Communications in Medicine RT plan parameter inconsistencies between the intended treatment (on the treatment planning system) and the delivered treatment (on the treatment machine). Critical Digital Imaging and Communications in Medicine RT plan parameters were identified. Each event was scored for importance using the Failure Mode and Effects Analysis methodology. Potential error occurrence (frequency) was derived according to the collected event data, along with the potential event severity, and the probability of detection with and without the theoretical implementation of the QAPV plan comparison check. Failure Mode and Effects Analysis Risk Priority Numbers (RPNs) with and without QAPV were compared to quantify the potential benefit of clinical implementation of QAPV. Results: The implementation of QAPV could reduce the RPN values for 15 of 22 (71%) of evaluated parameters, with an overall average reduction in RPN of 68 (range, 0-216). For the 6 high-risk parameters (>200), the average reduction in RPN value was 163 (range, 108-216). The RPN value reduction for the intermediate-risk (200 > RPN > 100) parameters was (0-140). With QAPV, the largest RPN value for “Beam Meterset” was reduced from 324 to 108. The maximum reduction in RPN value was for Beam Meterset (216, 66.7%), whereas the maximum percentage reduction was for Cumulative Meterset Weight (80, 88.9%). Conclusion: This analysis quantifies the value of the Integrating the Healthcare Enterprise–Radiation Oncology QAPV implementation in clinical workflow

  15. Quality Assurance With Plan Veto: Reincarnation of a Record and Verify System and Its Potential Value

    International Nuclear Information System (INIS)

    Noel, Camille E.; Gutti, VeeraRajesh; Bosch, Walter; Mutic, Sasa; Ford, Eric; Terezakis, Stephanie; Santanam, Lakshmi

    2014-01-01

    Purpose: To quantify the potential impact of the Integrating the Healthcare Enterprise–Radiation Oncology Quality Assurance with Plan Veto (QAPV) on patient safety of external beam radiation therapy (RT) operations. Methods and Materials: An institutional database of events (errors and near-misses) was used to evaluate the ability of QAPV to prevent clinically observed events. We analyzed reported events that were related to Digital Imaging and Communications in Medicine RT plan parameter inconsistencies between the intended treatment (on the treatment planning system) and the delivered treatment (on the treatment machine). Critical Digital Imaging and Communications in Medicine RT plan parameters were identified. Each event was scored for importance using the Failure Mode and Effects Analysis methodology. Potential error occurrence (frequency) was derived according to the collected event data, along with the potential event severity, and the probability of detection with and without the theoretical implementation of the QAPV plan comparison check. Failure Mode and Effects Analysis Risk Priority Numbers (RPNs) with and without QAPV were compared to quantify the potential benefit of clinical implementation of QAPV. Results: The implementation of QAPV could reduce the RPN values for 15 of 22 (71%) of evaluated parameters, with an overall average reduction in RPN of 68 (range, 0-216). For the 6 high-risk parameters (>200), the average reduction in RPN value was 163 (range, 108-216). The RPN value reduction for the intermediate-risk (200 > RPN > 100) parameters was (0-140). With QAPV, the largest RPN value for “Beam Meterset” was reduced from 324 to 108. The maximum reduction in RPN value was for Beam Meterset (216, 66.7%), whereas the maximum percentage reduction was for Cumulative Meterset Weight (80, 88.9%). Conclusion: This analysis quantifies the value of the Integrating the Healthcare Enterprise–Radiation Oncology QAPV implementation in clinical workflow

  16. Land Suitability Evaluation for Blueberry Crop by Determining the Qualitative Properties of the Identified Soil Type Related with the Antioxidant Capacity of Fruits

    Directory of Open Access Journals (Sweden)

    Amalia Ioana BOT

    2017-12-01

    Full Text Available Organic and inorganic forms of nitrogen and carbon were measured in order to determine soil fertility. The amount of total nitrogen ranged between 0.849 g/kg and 1.755 g/kg in the samples gathered from soil in the modified state and between 0.961 g/kg and 2.427 g/kg in the samples collected from the soil in the natural state. Based on these results, it could be concluded that, compared with the previous year, plants used the soil nutrients for their development. The activities of different enzymes were measured as well. Nitrate reductase activity was also higher in samples collected from soil in the modified state (from bilon) than in the samples collected near plantations (control samples), and the values ranged between 0.055 ± 0.012 μmol⋅h⁻¹⋅g⁻¹ and 1.018 ± 0.117 μmol⋅h⁻¹⋅g⁻¹ in samples from soil in the natural state and between 0.013 ± 0.002 μmol⋅h⁻¹⋅g⁻¹ and 0.447 ± 0.083 μmol⋅h⁻¹⋅g⁻¹ in bilons. Using GIS techniques of spatial analysis to determine the exact type of soil in each studied blueberry plantation from the Northwest Region of Development, and based on the soil bio-chemical analyses, it was possible to achieve a qualitative characterization of the territory, taking into account the requirements of blueberries for cultivation, and to assess land suitability for the blueberry crop. Combining a laboratory approach, consisting of soil bio-chemical and physico-chemical analyses and chemical analyses of blueberry fruits, with the techniques used to determine the soil type and land suitability, the study conducted in the Northwest Region of Development identified the best conditions for the blueberry crop, based on the qualitative characterization of the land.

  17. Evaluation of yield and identifying potential regions for Saffron (Crocus sativus L.) cultivation in Khorasan Razavi province according to temperature parameters

    Directory of Open Access Journals (Sweden)

    Moein Tosan

    2015-04-01

    Full Text Available Saffron is cultivated in most parts of Iran because of its low water requirement and good adaptation to diverse environmental conditions. In recent years, for many reasons such as its low water requirement, the area under saffron cultivation has increased, especially in Khorasan Razavi province. Temperature is one of the most important factors in the saffron flowering phenomenon. The aim of this research was to evaluate the response of saffron to temperature in the counties of Khorasan Razavi province (Torbat-e-Heydarieh, Gonabad, Nishabour, Sabzevar and Ghoochan). Climatic data (monthly minimum, average and maximum temperatures and diurnal temperature range) and saffron yield data were collected for the past 20-year period. A stepwise regression method was used to remove extra parameters and keep only the most important ones. Using these equations and ArcGIS software, the Spline method was found to be the best for saffron crop zoning. The results of linear regression in Gonabad showed that minimum, maximum and average temperature as well as diurnal temperature range in March and April had the greatest impact on saffron yield. For each of the four indices (minimum, maximum and average temperature and diurnal temperature range), the best area for saffron cultivation was the southern part of the province (particularly Gonabad); with increasing distance from this area towards the northern areas (such as Kashmar, Torbat-e-Heydarieh, Sabzevar, Nishabour, Mashhad and finally Ghoochan), saffron yield decreased by 30 to 50 percent. Therefore, the northern areas of the province had relatively low saffron yield. According to the results of this research, saffron yield in Khorasan Razavi province was significantly influenced by temperature parameters. Flowering, which is basically the most important stage of plant growth, is directly regulated by temperature.

  18. Retrospective evaluation of focal hypermetabolic thyroid nodules incidentally identified by 18F-FDG PET/CT in a large population

    International Nuclear Information System (INIS)

    Guan Zhiwei; Xu Baixuan; Chen Yingmao; Zhang Jinming; Tian Jiahe

    2012-01-01

    Objective: To investigate the prevalence of focal hypermetabolic thyroid nodules incidentally detected by 18F-FDG PET/CT in a relatively large population and to explore its value in differentiating malignant from benign thyroid nodules. Methods: From August 2007 to March 2010, 8463 patients with no history of thyroid cancer or thyroidectomy underwent 18F-FDG PET/CT. Among them, 145 patients were found to have abnormal hypermetabolic thyroid nodules. Sixty-eight patients were confirmed by histopathology or clinical follow-up, including 37 with malignant and 31 with benign nodules (21 male, 47 female; average age (53.66 ± 10.85) years). The SUVmax, nodule size, single or multiple nodules, presence or absence of calcification and patient's age were chosen as the parameters for predicting malignancy in hypermetabolic thyroid nodules. Univariate analysis was performed using the t test, χ2 test and Fisher exact test. Binary logistic regression was performed for multivariate analysis. The AUCs of SUVmax and the logistic regression analysis were compared. Results: The incidence of focal hypermetabolic thyroid nodules was 1.71% (145/8463), with a malignancy rate of 54.41% (37/68). The SUVmax of benign and malignant nodules were 5.13 ± 4.02 and 7.61 ± 4.78, respectively (t=2.235, P=0.029). Logistic regression indicated that SUVmax, presence or absence of calcification, single or multiple nodules, nodule size and patient's age were all predictors of malignancy in hypermetabolic thyroid nodules. The AUC of the logistic regression model (AUC-L) and of SUVmax (AUC-S) were 0.878 ± 0.043 (95% CI: 0.793-0.962, P<0.05) and 0.694 ± 0.067 (95% CI: 0.562-0.825, P<0.05), respectively (P<0.05). Conclusions: Focal hypermetabolic thyroid nodules incidentally identified by 18F-FDG PET/CT carry a high rate of thyroid malignancy. Differential diagnosis could be improved significantly using SUVmax and the logistic regression model aided by other parameters from 18F-FDG PET/CT as well as patient

  19. The impact and applicability of critical experiment evaluations

    Energy Technology Data Exchange (ETDEWEB)

    Brewer, R. [Los Alamos National Lab., NM (United States)

    1997-06-01

    This paper very briefly describes a project to evaluate previously performed critical experiments. The evaluation is intended for use by criticality safety engineers to verify calculations, and may also be used to identify data which need further investigation. The evaluation process is briefly outlined; the accepted benchmark critical experiments will be used as a standard for verification and validation. The end result of the project will be a comprehensive reference document.

  20. Simplified clinical algorithm for identifying patients eligible for immediate initiation of antiretroviral therapy for HIV (SLATE): protocol for a randomised evaluation.

    Science.gov (United States)

    Rosen, Sydney; Fox, Matthew P; Larson, Bruce A; Brennan, Alana T; Maskew, Mhairi; Tsikhutsu, Isaac; Bii, Margaret; Ehrenkranz, Peter D; Venter, Wd Francois

    2017-05-28

    African countries are rapidly adopting guidelines to offer antiretroviral therapy (ART) to all HIV-infected individuals, regardless of CD4 count. For this policy of 'treat all' to succeed, millions of new patients must be initiated on ART as efficiently as possible. Studies have documented high losses of treatment-eligible patients from care before they receive their first dose of antiretrovirals (ARVs), due in part to a cumbersome, resource-intensive process for treatment initiation, requiring multiple clinic visits over a several-week period. The Simplified Algorithm for Treatment Eligibility (SLATE) study is an individually randomised evaluation of a simplified clinical algorithm for clinicians to reliably determine a patient's eligibility for immediate ART initiation without waiting for laboratory results or additional clinic visits. SLATE will enrol and randomise (1:1) 960 adult, HIV-positive patients who present for HIV testing or care and are not yet on ART in South Africa and Kenya. Patients randomised to the standard arm will receive routine, standard of care ART initiation from clinic staff. Patients randomised to the intervention arm will be administered a symptom report, medical history, brief physical exam and readiness assessment. Patients who have positive (satisfactory) results for all four components of SLATE will be dispensed ARVs immediately, at the same clinic visit. Patients who have any negative results will be referred for further clinical investigation, counselling, tests or other services prior to being dispensed ARVs. After the initial visit, follow-up will be by passive medical record review. The primary outcomes will be ART initiation ≤28 days and retention in care 8 months after study enrolment. Ethics approval has been provided by the Boston University Institutional Review Board, the University of the Witwatersrand Human Research Ethics Committee (Medical) and the KEMRI Scientific and Ethics Review Unit. Results will be published in

  1. Quality assurance for high dose rate brachytherapy treatment planning optimization: using a simple optimization to verify a complex optimization

    International Nuclear Information System (INIS)

    Deufel, Christopher L; Furutani, Keith M

    2014-01-01

    As dose optimization for high dose rate brachytherapy becomes more complex, it becomes increasingly important to have a means of verifying that optimization results are reasonable. A method is presented for using a simple optimization as quality assurance for the more complex optimization algorithms typically found in commercial brachytherapy treatment planning systems. Quality assurance tests may be performed during commissioning, at regular intervals, and/or on a patient specific basis. A simple optimization method is provided that optimizes conformal target coverage using an exact, variance-based, algebraic approach. Metrics such as dose volume histogram, conformality index, and total reference air kerma agree closely between simple and complex optimizations for breast, cervix, prostate, and planar applicators. The simple optimization is shown to be a sensitive measure for identifying failures in a commercial treatment planning system that are possibly due to operator error or weaknesses in planning system optimization algorithms. Results from the simple optimization are surprisingly similar to the results from a more complex, commercial optimization for several clinical applications. This suggests that there are only modest gains to be made from making brachytherapy optimization more complex. The improvements expected from sophisticated linear optimizations, such as PARETO methods, will largely be in making systems more user friendly and efficient, rather than in finding dramatically better source strength distributions. (paper)
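
    A hedged sketch of what such a simple optimization could look like in practice: given a precomputed dose-rate matrix mapping dwell times to dose at target surface points, non-negative least squares recovers dwell times that reproduce the prescription. The matrix and prescription below are random placeholders, and this generic least-squares formulation is not claimed to be the exact algebraic method of the paper.

```python
# Illustrative only: conformal dwell-time optimization as non-negative least squares.
# dose_rate[i, j] = dose to target point i per unit dwell time at source position j.
import numpy as np
from scipy.optimize import nnls

rng = np.random.default_rng(0)
n_points, n_dwells = 40, 12
dose_rate = rng.uniform(0.05, 1.0, size=(n_points, n_dwells))  # placeholder kernel
prescription = np.full(n_points, 5.0)                          # uniform 5 Gy target dose

dwell_times, residual_norm = nnls(dose_rate, prescription)     # t >= 0 minimizing ||A t - d||
achieved = dose_rate @ dwell_times

print("dwell times:", np.round(dwell_times, 2))
print(f"mean target dose {achieved.mean():.2f} Gy, "
      f"rms error {residual_norm / np.sqrt(n_points):.3f} Gy")
```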

  2. Evaluation of plasma cytokines in patients with cocaine use disorders in abstinence identifies transforming growth factor alpha (TGFα) as a potential biomarker of consumption and dual diagnosis

    Directory of Open Access Journals (Sweden)

    Rosa Maza-Quiroga

    2017-10-01

    Full Text Available Background Cocaine use disorder (CUD is a complex health condition, especially when it is accompanied by comorbid psychiatric disorders (dual diagnosis. Dual diagnosis is associated with difficulties in the stratification and treatment of patients. One of the major challenges in clinical practice of addiction psychiatry is the lack of objective biological markers that indicate the degree of consumption, severity of addiction, level of toxicity and response to treatment in patients with CUD. These potential biomarkers would be fundamental players in the diagnosis, stratification, prognosis and therapeutic orientation in addiction. Due to growing evidence of the involvement of the immune system in addiction and psychiatric disorders, we tested the hypothesis that patients with CUD in abstinence might have altered circulating levels of signaling proteins related to systemic inflammation. Methods The study was designed as a cross-sectional study of CUD treatment-seeking patients. These patients were recruited from outpatient programs in the province of Malaga (Spain. The study was performed with a total of 160 white Caucasian subjects, who were divided into the following groups: patients diagnosed with CUD in abstinence (N = 79, cocaine group and matched control subjects (N = 81, control group. Participants were clinically evaluated with the diagnostic interview PRISM according to the DSM-IV-TR, and blood samples were collected for the determination of chemokine C-C motif ligand 11 (CCL11, eotaxin-1, interferon gamma (IFNγ, interleukin-4 (IL-4, interleukin-8 (IL-8, interleukin-17α (IL-17α, macrophage inflammatory protein 1α (MIP-1α and transforming growth factor α (TGFα levels in the plasma. Clinical and biochemical data were analyzed in order to find relationships between variables. Results While 57% of patients with CUD were diagnosed with dual diagnosis, approximately 73% of patients had other substance use disorders. Cocaine patients

  3. Robust Approach to Verifying the Weak Form of the Efficient Market Hypothesis

    Science.gov (United States)

    Střelec, Luboš

    2011-09-01

    The weak form of the efficient markets hypothesis states that prices incorporate only past information about the asset. An implication of this form of the efficient markets hypothesis is that one cannot detect mispriced assets and consistently outperform the market through technical analysis of past prices. One possible formulation of the efficient market hypothesis used for weak form tests is that share prices follow a random walk. This means that returns are realizations of an IID sequence of random variables. Consequently, for verifying the weak form of the efficient market hypothesis, we can use distribution tests, among others, i.e. some tests of normality and/or some graphical methods. Many procedures for testing the normality of univariate samples have been proposed in the literature [7]. Today the most popular omnibus test of normality for general use is the Shapiro-Wilk test. The Jarque-Bera test is the most widely adopted omnibus test of normality in econometrics and related fields. In particular, the Jarque-Bera test (i.e. a test based on the classical measures of skewness and kurtosis) is frequently used when one is more concerned about heavy-tailed alternatives. As these measures are based on moments of the data, this test has a zero breakdown value [2]. In other words, a single outlier can make the test worthless. The reason so many classical procedures are nonrobust to outliers is that the parameters of the model are expressed in terms of moments, and their classical estimators are expressed in terms of sample moments, which are very sensitive to outliers. Another approach to robustness is to concentrate on the parameters of interest suggested by the problem under study. Consequently, novel robust procedures for testing normality are presented in this paper to overcome the shortcomings of classical normality tests for financial data, which are typically marked by the occurrence of remote data points and additional types of deviations from
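
    For concreteness, a short sketch of the classical Jarque-Bera statistic, JB = n/6 (S^2 + (K-3)^2/4), applied to a synthetic heavy-tailed return series, with SciPy's built-in version printed for comparison.

```python
# Illustrative only: classical Jarque-Bera test on a synthetic heavy-tailed return series.
import numpy as np
from scipy import stats

rng = np.random.default_rng(42)
returns = rng.standard_t(df=3, size=1000) * 0.01  # heavy-tailed, non-normal "returns"

n = returns.size
s = stats.skew(returns)                    # sample skewness
k = stats.kurtosis(returns, fisher=False)  # sample kurtosis (normal distribution -> 3)
jb = n / 6.0 * (s**2 + (k - 3.0)**2 / 4.0)
p_value = 1.0 - stats.chi2.cdf(jb, df=2)   # JB is asymptotically chi-squared with 2 dof

print(f"hand-computed JB = {jb:.1f}, p = {p_value:.4f}")
print("scipy jarque_bera:", stats.jarque_bera(returns))
```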

  4. Defining and Verifying Research Grade Airborne Laser Swath Mapping (ALSM) Observations

    Science.gov (United States)

    Carter, W. E.; Shrestha, R. L.; Slatton, C. C.

    2004-12-01

    The first and primary goal of the National Science Foundation (NSF) supported Center for Airborne Laser Mapping (NCALM), operated jointly by the University of Florida and the University of California, Berkeley, is to make "research grade" ALSM data widely available at affordable cost to the national scientific community. Cost aside, researchers need to know what NCALM considers research grade data and how the quality of the data is verified, to be able to determine the likelihood that the data they receive will meet their project-specific requirements. Given the current state of the technology it is reasonable to expect a well planned and executed survey to produce surface elevations with uncertainties less than 10 centimeters and horizontal uncertainties of a few decimeters. Various components of the total error are generally associated with the aircraft trajectory, aircraft orientation, or laser vectors. Aircraft trajectory error is dependent largely on the Global Positioning System (GPS) observations, aircraft orientation on Inertial Measurement Unit (IMU) observations, and laser vectors on the scanning and ranging instrumentation. In addition to the issue of the precision or accuracy of the coordinates of the surface points, consideration must also be given to the point-to-point spacing and voids in the coverage. The major sources of error produce distinct artifacts in the data set. For example, aircraft trajectory errors tend to change slowly as the satellite constellation geometry varies, producing slopes within swaths and offsets between swaths. Roll, pitch and yaw biases in the IMU observations tend to persist through whole flights, and create distinctive artifacts in the swath overlap areas. Errors in the zero-point and scale of the laser scanner cause the edges of swaths to turn up or down. Range walk errors cause offsets between bright and dark surfaces, causing paint stripes to float above the dark surfaces of roads. The three keys to producing

  5. Evaluating the effectiveness of a training program that builds teachers' capability to identify and appropriately refer middle and high school students with mental health problems in Brazil: an exploratory study.

    Science.gov (United States)

    Vieira, Marlene A; Gadelha, Ary A; Moriyama, Taís S; Bressan, Rodrigo A; Bordin, Isabel A

    2014-02-28

    In Brazil, like many countries, there has been a failure to identify mental health problems (MHP) in young people and refer them to appropriate care and support. The school environment provides an ideal setting to do this. Therefore, effective programs need to be developed to train teachers to identify and appropriately refer children with possible MHP. We aimed to evaluate teachers' ability to identify and appropriately refer students with possible MHP, and the effectiveness of a psychoeducational strategy to build teachers' capability in this area. To meet the first objective, we conducted a case-control study using a student sample. To meet the second, we employed a longitudinal design with repeated measures before and after introducing the psychoeducational strategy using a teacher sample. In the case-control study, the Youth Self-Report was used to investigate internalizing and externalizing problems. Before training, teachers selected 26 students who they thought were likely to have MHP. Twenty-six non-selected students acted as controls and were matched by gender, age and grade. The underlying principle was that if teachers could identify abnormal behaviors among their actual students, those with some MHP would likely be among the case group and those without among the control group. In the longitudinal study, 32 teachers were asked to evaluate six vignettes that highlighted behaviors indicating a high risk for psychosis, depression, conduct disorder, hyperactivity, mania, and normal adolescent behavior. We calculated the rates of correct answers for identifying the existence of some MHP and the need for referral before and after training; teachers were not asked to identify the individual conditions. Teachers were already able to identify the most symptomatic students, who had both internalizing and externalizing problems, as possibly having MHP, but teachers had difficulty in identifying students with internalizing problems alone. At least 50.0% of teachers

  6. Toward verifying fossil fuel CO2 emissions with the CMAQ model: motivation, model description and initial simulation.

    Science.gov (United States)

    Liu, Zhen; Bambha, Ray P; Pinto, Joseph P; Zeng, Tao; Boylan, Jim; Huang, Maoyi; Lei, Huimin; Zhao, Chun; Liu, Shishi; Mao, Jiafu; Schwalm, Christopher R; Shi, Xiaoying; Wei, Yaxing; Michelsen, Hope A

    2014-04-01

    Motivated by the question of whether and how a state-of-the-art regional chemical transport model (CTM) can facilitate characterization of CO2 spatiotemporal variability and verify CO2 fossil-fuel emissions, we for the first time applied the Community Multiscale Air Quality (CMAQ) model to simulate CO2. This paper presents methods, input data, and initial results for CO2 simulation using CMAQ over the contiguous United States in October 2007. Modeling experiments have been performed to understand the roles of fossil-fuel emissions, biosphere-atmosphere exchange, and meteorology in regulating the spatial distribution of CO2 near the surface over the contiguous United States. Three sets of net ecosystem exchange (NEE) fluxes were used as input to assess the impact of uncertainty of NEE on CO2 concentrations simulated by CMAQ. Observational data from six tall tower sites across the country were used to evaluate model performance. In particular, at the Boulder Atmospheric Observatory (BAO), a tall tower site that receives urban emissions from Denver CO, the CMAQ model using hourly varying, high-resolution CO2 fossil-fuel emissions from the Vulcan inventory and Carbon Tracker optimized NEE reproduced the observed diurnal profile of CO2 reasonably well but with a low bias in the early morning. The spatial distribution of CO2 was found to correlate with NO(x), SO2, and CO, because of their similar fossil-fuel emission sources and common transport processes. These initial results from CMAQ demonstrate the potential of using a regional CTM to help interpret CO2 observations and understand CO2 variability in space and time. The ability to simulate a full suite of air pollutants in CMAQ will also facilitate investigations of their use as tracers for CO2 source attribution. This work serves as a proof of concept and the foundation for more comprehensive examinations of CO2 spatiotemporal variability and various uncertainties in the future. Atmospheric CO2 has long been modeled

  7. Building and verifying a severity prediction model of acute pancreatitis (AP) based on BISAP, MEWS and routine test indexes.

    Science.gov (United States)

    Ye, Jiang-Feng; Zhao, Yu-Xin; Ju, Jian; Wang, Wei

    2017-10-01

    To discuss the value of the Bedside Index for Severity in Acute Pancreatitis (BISAP), the Modified Early Warning Score (MEWS), serum Ca2+ and red cell distribution width (RDW) for predicting the severity grade of acute pancreatitis, and to develop and verify a more accurate scoring system to predict the severity of AP. In 302 patients with AP, we calculated BISAP and MEWS scores and conducted regression analyses on the relationships of BISAP score, RDW, MEWS, and serum Ca2+ with the severity of AP using single-factor logistic regression. The variables with statistical significance in the single-factor logistic regression were used in a multi-factor logistic regression model; forward stepwise regression was used to screen variables and build a multi-factor prediction model. A receiver operating characteristic curve (ROC curve) was constructed, and the significance of the multi- and single-factor prediction models in predicting the severity of AP was evaluated using the area under the ROC curve (AUC). The internal validity of the model was verified through bootstrapping. Among the 302 patients with AP, 209 had mild acute pancreatitis (MAP) and 93 had severe acute pancreatitis (SAP). According to the single-factor logistic regression analysis, we found that BISAP, MEWS and serum Ca2+ are prediction indexes of the severity of AP (P-value<0.05). The multi-factor logistic regression analysis showed that BISAP and serum Ca2+ are independent prediction indexes of AP severity (P-value<0.05); BISAP is negatively related to serum Ca2+ (r=-0.330, P-value<0.05). The model is as follows: ln(p/(1-p)) = 7.306 + 1.151*BISAP - 4.516*serum Ca2+. The predictive ability of each model for SAP follows the order: the combined BISAP and serum Ca2+ prediction model > serum Ca2+ > BISAP. There is no statistically significant difference between the predictive abilities of BISAP and serum Ca2+ (P-value>0.05); however, there is remarkable statistical significance for the predictive ability using the newly built prediction model as well as BISAP
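
    Assuming the reconstructed model above takes the usual logit form ln(p/(1-p)), the predicted probability of severe acute pancreatitis follows directly. A small sketch (the patient inputs are illustrative and the calcium unit of mmol/L is an assumption):

```python
# Illustrative use of the reported coefficients; the logit form and Ca2+ unit are assumed.
import math

def probability_sap(bisap: int, serum_ca: float) -> float:
    """Predicted probability of severe acute pancreatitis from the combined model."""
    logit = 7.306 + 1.151 * bisap - 4.516 * serum_ca
    return 1.0 / (1.0 + math.exp(-logit))

# Hypothetical patients: (BISAP score, serum calcium in mmol/L)
for bisap, ca in [(0, 2.40), (2, 2.10), (4, 1.80)]:
    print(f"BISAP={bisap}, Ca2+={ca:.2f} mmol/L -> P(SAP) = {probability_sap(bisap, ca):.2f}")
```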

  8. Verifying three-dimensional skull model reconstruction using cranial index of symmetry.

    Science.gov (United States)

    Kung, Woon-Man; Chen, Shuo-Tsung; Lin, Chung-Hsiang; Lu, Yu-Mei; Chen, Tzu-Hsuan; Lin, Muh-Shi

    2013-01-01

    Difficulty exists in scalp adaptation for cranioplasty with customized computer-assisted design/manufacturing (CAD/CAM) implants in situations of excessive wound tension and sub-cranioplasty dead space. To solve this clinical problem, the CAD/CAM technique should include algorithms to reconstruct a depressed contour to cover the skull defect. Satisfactory CAM-derived alloplastic implants are based on highly accurate three-dimensional (3-D) CAD modeling. Thus, it is quite important to establish a symmetrically regular CAD/CAM reconstruction prior to depressing the contour. The purpose of this study is to verify the aesthetic outcomes of CAD models with regular contours using the cranial index of symmetry (CIS). From January 2011 to June 2012, decompressive craniectomy (DC) was performed for 15 consecutive patients in our institute. 3-D CAD models of the skull defects were reconstructed using commercial software. These models were checked for symmetry using CIS scores. CIS scores of the CAD reconstructions were 99.24±0.004% (range 98.47-99.84). The CIS scores of these CAD models were statistically significantly greater than 95%, identical to 99.5%, but lower than 99.6% (matched-pairs signed rank test). These data demonstrate the highly accurate symmetry of these CAD models with regular contours. CIS calculation is beneficial for assessing the aesthetic outcomes of CAD-reconstructed skulls in terms of cranial symmetry. This enables more accurate CAD models and CAM cranial implants with depressed contours, which are essential in patients with difficult scalp adaptation.

  9. Method for verifying the pressure in a nuclear reactor fuel rod

    International Nuclear Information System (INIS)

    Jones, W.J.

    1979-01-01

    Disclosed is a method of accurately verifying the pressure contained in a sealed pressurized fuel rod by utilizing a pressure balance measurement technique wherein an end of the fuel rod extends through and is sealed in a wall of a small chamber. The chamber is pressurized to the nominal (desired) fuel rod pressure and the fuel rod is then pierced to interconnect the chamber and fuel rod. The deviation of chamber pressure is noted. The final combined pressure of the fuel rod and drill chamber is substantially equal to the nominal rod pressure; departure of the combined pressure from nominal is in direct proportion to departure of rod pressure from nominal. The maximum error in computing the rod pressure from the deviation of the combined pressure from nominal is estimated at plus or minus 3.0 psig for rod pressures within the specified production limits. If the rod pressure is corrected for rod void volume using a digital printer data record, the accuracy improves to about plus or minus 2.0 psig
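
    The proportionality claimed above follows from a simple isothermal ideal-gas balance (a sketch in assumed notation, not the patent's own derivation): with rod pressure $P_r$ and void volume $V_r$, and the chamber preset to the nominal pressure $P_0$ with volume $V_c$, the combined pressure after piercing is

    $$P_f = \frac{P_r V_r + P_0 V_c}{V_r + V_c}, \qquad P_f - P_0 = \frac{V_r}{V_r + V_c}\,(P_r - P_0),$$

    so the departure of the combined pressure from nominal is proportional to the departure of the rod pressure from nominal, with the rod-to-total volume ratio as the constant of proportionality; this is also why correcting for the rod void volume improves the accuracy.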

  10. Could hypomanic traits explain selective migration? Verifying the hypothesis by the surveys on sardinian migrants.

    Science.gov (United States)

    Giovanni, Carta Mauro; Francesca, Moro Maria; Viviane, Kovess; Brasesco, Maria Veronica; Bhat, Krishna M; Matthias, Angermeyer C; Akiskal, Hagop S

    2012-01-01

    A recent survey put forward the hypothesis that the emigration that occurred from Sardinia from the 1960s to the 1980s selected people with a hypomanic temperament. This paper aims to verify whether the people who migrated from Sardinia in that period have shown a high risk of mood disorders in the surveys carried out in their host countries, and whether the results are consistent with this hypothesis. This is a systematic review. In the 1970s, when examining the attitudes towards migration in Sardinian couples waiting to emigrate, Rudas found that the decision to emigrate was principally taken by males. Female emigrants showed lower self-esteem than male emigrants. A study of Sardinian immigrants in Argentina carried out in 2001-02, at the peak of the economic crisis, found a high risk of depressive disorders in women only. These results were the opposite of the findings recorded ten years earlier in a survey of Sardinian immigrants in Paris, where the risk of a Depressive Episode was higher in young men only. The data point to a bipolar disorder risk for young (probably hypomanic) male migrants in competitive, challenging conditions, and a different kind of depressive episode for women in trying economic conditions. The results of the surveys on Sardinian migrants are partially in agreement with the hypothesis of a selective migration of people with a hypomanic temperament. Early motivations and self-esteem seem related to the ways mood disorders are expressed, and to the vulnerability to specific triggering situations in the host country.

  11. Experimentally verified inductance extraction and parameter study for superconductive integrated circuit wires crossing ground plane holes

    International Nuclear Information System (INIS)

    Fourie, Coenrad J; Wetzstein, Olaf; Kunert, Juergen; Meyer, Hans-Georg; Toepfer, Hannes

    2013-01-01

    As the complexity of rapid single flux quantum (RSFQ) circuits increases, both current and power consumption of the circuits become important design criteria. Various new concepts such as inductive biasing for energy efficient RSFQ circuits and inductively coupled RSFQ cells for current recycling have been proposed to overcome increasingly severe design problems. Both of these techniques use ground plane holes to increase the inductance or coupling factor of superconducting integrated circuit wires. New design tools are consequently required to handle the new topographies. One important issue in such circuit design is the accurate calculation of networks of inductances even in the presence of finite holes in the ground plane. We show how a fast network extraction method using InductEx, which is a pre- and post-processor for the magnetoquasistatic field solver FastHenry, is used to calculate the inductances of a set of SQUIDs (superconducting quantum interference devices) with ground plane holes of different sizes. The results are compared to measurements of physical structures fabricated with the IPHT Jena 1 kA cm⁻² RSFQ niobium process to verify accuracy. We then do a parameter study and derive empirical equations for fast and useful estimation of the inductance of wires surrounded by ground plane holes. We also investigate practical circuits and show excellent accuracy. (paper)

  12. K∞-meter concept verified via subcritical-critical TRIGA experiments

    International Nuclear Information System (INIS)

    Ocampo Mansilla, H.

    1983-01-01

    This work presents a technique for building a device to measure the k∞ of a spent nuclear fuel assembly discharged from the core of a nuclear power plant. The device, called a k∞-meter, consists of a cross-shaped subcritical assembly, two artificial neutron sources, and two separate neutron counting systems. The central position of the subcritical assembly is used to measure the k∞ of the spent fuel assembly. The initial subcritical assembly is calibrated to determine its k-eff and verify the assigned k∞ of a selected fuel assembly placed in the central position. Count rates are taken with a fuel assembly of known k∞ placed in the central position and then repeated with a fuel assembly of unknown k∞ placed in the central position. The count rate ratio of the unknown fuel assembly to the known fuel assembly is used to determine the k∞ of the unknown fuel assembly. The k∞ of the unknown fuel assembly is represented as a polynomial function of the count rate ratios. The coefficients of the polynomial equation are determined using the neutronic codes LEOPARD and EXTERMINATOR-II. The analytical approach has been validated by performing several subcritical/critical experiments, using the Penn State Breazeale TRIGA Reactor (PSBR), and comparing the experimental results with the calculations
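
    A schematic of the calibration step described above, in Python: the k∞ of the unknown assembly is evaluated as a polynomial in the count-rate ratio. The coefficients below are placeholders; in the study they would come from the LEOPARD and EXTERMINATOR-II calculations.

```python
import numpy as np

# Schematic of the k-infinity-meter calibration: k_inf of the unknown assembly is
# expressed as a polynomial in the count-rate ratio R = C_unknown / C_known.
# The coefficients below are placeholders, not values from the study.

coeffs = np.array([0.15, 0.95, -0.08])         # illustrative a0, a1, a2

def k_inf_from_ratio(count_rate_unknown, count_rate_known, coeffs=coeffs):
    r = count_rate_unknown / count_rate_known
    return np.polyval(coeffs[::-1], r)          # evaluates a0 + a1*r + a2*r**2

print(f"k_inf ~ {k_inf_from_ratio(1250.0, 1180.0):.3f}")
```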

  13. Measurement of Deformations by MEMS Arrays, Verified at Sub-millimetre Level Using Robotic Total Stations

    Directory of Open Access Journals (Sweden)

    Tomas Beran

    2014-06-01

    Full Text Available Measurement of sub-millimetre-level deformations of structures in the presence of ambient temperature changes can be challenging. This paper describes the measurement of a structure moving due to temperature changes, using two ShapeAccelArray (SAA) instruments, and verified by a geodetic monitoring system. SAA is a geotechnical instrument often used for monitoring of displacements in soil. SAA uses micro-electro-mechanical system (MEMS) sensors to measure tilt in the gravity field. The geodetic monitoring system, which uses ALERT software, senses the displacements of targets relative to control points, using a robotic total station (RTS). The test setup consists of a central four-metre free-standing steel tube with other steel tubes welded to most of its length. The central tube is anchored in a concrete foundation. This composite "pole" is equipped with two SAAs as well as three geodetic prisms mounted on the top, in the middle, and in the foundation. The geodetic system uses multiple control targets mounted in concrete foundations of nearby buildings, and at the base of the pole. Long-term observations using two SAAs indicate that the pole is subject to deformations due to cyclical ambient temperature variations causing the pole to move by a few millimetres each day. In a multiple-day experiment, it was possible to track this movement using SAA as well as the RTS system. This paper presents data comparing the measurements of the two instruments and provides a good example of the detection of two-dimensional movements of seemingly rigid objects due to temperature changes.
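
    A minimal sketch of how a chain of MEMS tilt readings can be converted into a deflection profile, which is the core of the SAA measurement principle described above. The segment length and tilt values are illustrative assumptions, not data from the experiment.

```python
import math

# Sketch: a ShapeAccelArray-style chain of rigid segments, each reporting a tilt
# angle from vertical measured in the gravity field. The horizontal deflection of
# each joint is the cumulative sum of L * sin(tilt) along the chain.
# Segment length and tilt values are illustrative only.

segment_length_m = 0.305                     # assumed segment length (about 1 ft)
tilts_deg = [0.02, 0.05, 0.11, 0.16, 0.21]   # made-up tilts from base to top

deflection = 0.0
profile = []
for tilt in tilts_deg:
    deflection += segment_length_m * math.sin(math.radians(tilt))
    profile.append(deflection)

for i, d in enumerate(profile, start=1):
    print(f"joint {i}: {d * 1000:.2f} mm horizontal deflection")
```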

  14. Risks of Using Bedside Tests to Verify Nasogastric Tube Position in Adult Patients

    Directory of Open Access Journals (Sweden)

    Melody Ni

    2014-12-01

    Full Text Available Nasogastric (NG) tubes are commonly used for enteral feeding. Complications of feeding tube misplacement include malnutrition, pulmonary aspiration, and even death. We built a Bayesian network (BN) to analyse the risks associated with available bedside tests to verify tube position. Evidence on test validity (sensitivity and specificity) was retrieved from a systematic review. Likelihood ratios were used to select the best tests for detecting tubes misplaced in the lung or oesophagus. Five bedside tests were analysed including magnetic guidance, aspirate pH, auscultation, aspirate appearance, and capnography/colourimetry. Among these, auscultation and appearance are non-diagnostic towards lung or oesophagus placements. Capnography/colourimetry can confirm but cannot rule out lung placement. Magnetic guidance can rule out both lung and oesophageal placement. However, as a relatively new technology, further validation studies are needed. The pH test with a cut-off at 5.5 or lower can rule out lung intubation. Lowering the cut-off to 4 not only minimises oesophageal intubation but also provides extra safety as the sensitivity of pH measurement is reduced by feeding, antacid medication, or the use of less accurate pH paper. BN is an effective tool for representing and analysing multi-layered uncertainties in test validity and reliability for the verification of NG tube position. Aspirate pH with a cut-off of 4 is the safest bedside method to minimise lung and oesophageal misplacement.
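
    The likelihood-ratio reasoning behind the test comparison can be illustrated with a short sketch. The sensitivity, specificity, and prior used below are hypothetical placeholders, not the values from the systematic review feeding the Bayesian network.

```python
# Sketch of the likelihood-ratio arithmetic used to compare bedside tests.
# Sensitivity/specificity/prior values below are illustrative placeholders.

def post_test_probability(prior, sensitivity, specificity, test_positive=True):
    """Update P(misplacement) after a test result using likelihood ratios."""
    lr = sensitivity / (1 - specificity) if test_positive else (1 - sensitivity) / specificity
    prior_odds = prior / (1 - prior)
    post_odds = prior_odds * lr
    return post_odds / (1 + post_odds)

prior_lung_misplacement = 0.02               # assumed prior probability
sens, spec = 0.90, 0.95                      # hypothetical test performance

p_pos = post_test_probability(prior_lung_misplacement, sens, spec, test_positive=True)
p_neg = post_test_probability(prior_lung_misplacement, sens, spec, test_positive=False)
print(f"P(misplaced | test positive) = {p_pos:.3f}")   # a large LR+ helps rule in
print(f"P(misplaced | test negative) = {p_neg:.4f}")   # a small LR- helps rule out
```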

  15. The Mitochondrial Protein Atlas: A Database of Experimentally Verified Information on the Human Mitochondrial Proteome.

    Science.gov (United States)

    Godin, Noa; Eichler, Jerry

    2017-09-01

    Given its central role in various biological systems, as well as its involvement in numerous pathologies, the mitochondrion is one of the best-studied organelles. However, although the mitochondrial genome has been extensively investigated, protein-level information remains partial, and in many cases, hypothetical. The Mitochondrial Protein Atlas (MPA; URL: lifeserv.bgu.ac.il/wb/jeichler/MPA ) is a database that provides a complete, manually curated inventory of only experimentally validated human mitochondrial proteins. The MPA presently contains 911 unique protein entries, each of which is associated with at least one experimentally validated and referenced mitochondrial localization. The MPA also contains experimentally validated and referenced information defining function, structure, involvement in pathologies, interactions with other MPA proteins, as well as the method(s) of analysis used in each instance. Connections to relevant external data sources are offered for each entry, including links to NCBI Gene, PubMed, and Protein Data Bank. The MPA offers a prototype for other information sources that allow for a distinction between what has been confirmed and what remains to be verified experimentally.

  16. Dynamic simulation platform to verify the performance of the reactor regulating system for a research reactor

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    2015-07-01

    Digital instrumentation and control (I&C) system techniques are being introduced in newly constructed research reactors and in the life extension of older research reactors. Digital systems are easy to change and optimize, but a validated process for them is required. Also, to reduce project risk and cost, we have to make sure that the configuration and control functions are correct before the commissioning phase of the research reactor. For this purpose, simulators have been widely used in developing control systems in the automotive and aerospace industries. In the literature, however, very few examples can be found of testing the control system of a research reactor with a simulator. Therefore, this paper proposes a simulation platform to verify the performance of the RRS (Reactor Regulating System) for a research reactor. This simulation platform consists of the reactor simulation model and the interface module. The platform was applied to the I&C upgrade project of a TRIGA reactor, and many problems of the RRS configuration were found and solved. It also proved that dynamic performance testing based on a simulator enables significant time saving and improves economics and quality for the RRS in the system test phase. (authors)

  17. A Pilot Study Verifying How the Curve Information Impacts on the Driver Performance with Cognition Model

    Directory of Open Access Journals (Sweden)

    Xiaohua Zhao

    2013-01-01

    Full Text Available Drivers' misjudgment is a significant issue for curve safety. It is considered a more influential factor in inducing risk than other traffic environmental conditions. Previous research suggested that cognition theory could explain the process of drivers' behavior at curves. In this simulator experiment, a principal cognition model was built to examine the rationality of this explanation. The core of this pilot study was to use one of the driving decision strategies for braking at curves to verify the accuracy of the cognition model. The experiment therefore included three treatments of information-providing modes. The results showed that advance warning information about curves moves the position of first braking further away from the curve. This phenomenon is consistent with the model's inference. Thus, the conclusion of this study is that the process of drivers' behavior at curves can be explained by cognition theory and represented by the cognition model. In addition, the model's characteristics and working parameters can be acquired through further research. Based on the model, advice can then be given on appropriate warning information that may prevent driver error.

  18. Verifying Real-Time Systems using Explicit-time Description Methods

    Directory of Open Access Journals (Sweden)

    Hao Wang

    2009-12-01

    Full Text Available Timed model checking has been extensively researched in recent years. Many new formalisms with time extensions and tools based on them have been presented. On the other hand, Explicit-Time Description Methods aim to verify real-time systems with general untimed model checkers. Lamport presented an explicit-time description method using a clock-ticking process (Tick) to simulate the passage of time together with a group of global variables for time requirements. This paper proposes a new explicit-time description method with no reliance on global variables. Instead, it uses rendezvous synchronization steps between the Tick process and each system process to simulate time. This new method achieves better modularity and facilitates usage of more complex timing constraints. The two explicit-time description methods are implemented in DIVINE, a well-known distributed-memory model checker. Preliminary experiment results show that our new method, with better modularity, is comparable to Lamport's method with respect to time and memory efficiency.

  19. On verifying magnetic dipole moment of a magnetic torquer by experiments

    Science.gov (United States)

    Kuyyakanont, Aekjira; Kuntanapreeda, Suwat; Fuengwarodsakul, Nisai H.

    2018-01-01

    Magnetic torquers are used for the attitude control of small satellites, such as CubeSats in Low Earth Orbit (LEO). During the design of magnetic torquers, it is necessary to confirm whether the magnetic dipole moment is sufficient to control the satellite attitude. The magnetic dipole moment can affect the detumbling time and the satellite rotation time. In addition, it is also necessary to understand how to design the magnetic torquer for operation in a CubeSat under the space environment at LEO. This paper reports an investigation of the magnetic dipole moment and the magnetic field generated by a circular air-core coil magnetic torquer using experimental measurements. The experiment testbed was built on an air-bearing under a magnetic field generated by a Helmholtz coil. This paper also describes the procedure to determine and verify the magnetic dipole moment value of the designed circular air-core magnetic torquer. The experimental results are compared with the design calculations. According to the comparison results, the designed magnetic torquer reaches the required magnetic dipole moment. This designed magnetic torquer will be applied to the attitude control systems of a 1U CubeSat satellite in the project “KNACKSAT.”
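
    For an air-core coil, the design value referred to above reduces to the textbook relation m = N·I·A. The sketch below uses illustrative coil parameters, not those of the KNACKSAT torquer.

```python
import math

# Quick check of the design value of an air-core magnetic torquer:
# magnetic dipole moment m = N * I * A for a circular coil of N turns.
# Coil parameters are illustrative assumptions.

turns = 200                      # number of turns (assumption)
current_a = 0.05                 # coil current in amperes (assumption)
radius_m = 0.04                  # coil radius in metres (assumption)

area_m2 = math.pi * radius_m ** 2
moment_am2 = turns * current_a * area_m2
print(f"design magnetic dipole moment: {moment_am2:.4f} A*m^2")
```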

  20. Verifying Identities of Plant-Based Multivitamins Using Phytochemical Fingerprinting in Combination with Multiple Bioassays.

    Science.gov (United States)

    Lim, Yeni; Ahn, Yoon Hee; Yoo, Jae Keun; Park, Kyoung Sik; Kwon, Oran

    2017-09-01

    Sales of multivitamins have been growing rapidly and the concept of natural multivitamin, plant-based multivitamin, or both has been introduced in the market, leading consumers to anticipate additional health benefits from phytochemicals that accompany the vitamins. However, the lack of labeling requirements might lead to fraudulent claims. Therefore, the objective of this study was to develop a strategy to verify the identity of plant-based multivitamins. Phytochemical fingerprinting was used to discriminate identities. In addition, multiple bioassays were performed to determine total antioxidant capacity. A statistical computation model was then used to measure contributions of phytochemicals and vitamins to antioxidant activities. Fifteen multivitamins were purchased from the local markets in Seoul, Korea and classified into three groups according to the number of plant ingredients. Pearson correlation analysis among antioxidant capacities, amount of phenols, and number of plant ingredients revealed that ferric reducing antioxidant power (FRAP) and 2,2-diphenyl-1-picrylhydrazyl (DPPH) assay results had the highest correlation with total phenol content. This suggests that FRAP and DPPH assays are useful for characterizing plant-derived multivitamins. Furthermore, net effect linear regression analysis confirmed that the contribution of phytochemicals to total antioxidant capacities was always relatively higher than that of vitamins. Taken together, the results suggest that phytochemical fingerprinting in combination with multiple bioassays could be used as a strategy to determine whether plant-derived multivitamins could provide additional health benefits beyond their nutritional value.
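
    The correlation step described above can be sketched as follows; the phenol and assay values are illustrative placeholders used only to show the computation, not data from the fifteen products.

```python
import numpy as np
from scipy.stats import pearsonr

# Sketch of the correlation analysis: relate total phenol content to antioxidant
# capacity measured by FRAP and DPPH across products. All values are placeholders.

total_phenol = np.array([12.1, 18.4, 25.0, 31.2, 40.5, 47.9])    # illustrative units
frap         = np.array([0.30, 0.45, 0.62, 0.71, 0.95, 1.10])
dpph         = np.array([15.0, 22.0, 30.5, 37.0, 49.0, 55.5])

for name, assay in (("FRAP", frap), ("DPPH", dpph)):
    r, p = pearsonr(total_phenol, assay)
    print(f"{name}: Pearson r = {r:.3f}, p = {p:.4f}")
```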

  1. Verified by Visa and MasterCard SecureCode: Or, How Not to Design Authentication

    Science.gov (United States)

    Murdoch, Steven J.; Anderson, Ross

    Banks worldwide are starting to authenticate online card transactions using the '3-D Secure' protocol, which is branded as Verified by Visa and MasterCard SecureCode. This has been partly driven by the sharp increase in online fraud that followed the deployment of EMV smart cards for cardholder-present payments in Europe and elsewhere. 3-D Secure has so far escaped academic scrutiny; yet it might be a textbook example of how not to design an authentication protocol. It ignores good design principles and has significant vulnerabilities, some of which are already being exploited. Also, it provides a fascinating lesson in security economics. While other single sign-on schemes such as OpenID, InfoCard and Liberty came up with decent technology they got the economics wrong, and their schemes have not been adopted. 3-D Secure has lousy technology, but got the economics right (at least for banks and merchants); it now boasts hundreds of millions of accounts. We suggest a path towards more robust authentication that is technologically sound and where the economics would work for banks, merchants and customers - given a gentle regulatory nudge.

  2. Verifying the model of predicting entrepreneurial intention among students of business and non-business orientation

    Directory of Open Access Journals (Sweden)

    Zoran Sušanj

    2015-01-01

    Full Text Available This study aims to verify whether certain entrepreneurial characteristics, like entrepreneurial potential and entrepreneurial propensity, affect the level of entrepreneurial self-efficacy and desirability of entrepreneurship, and further have direct and indirect effect on entrepreneurial intentions. Furthermore, this study seeks to compare the strength of the relationship between these variables among groups of students who receive some entrepreneurship education and students outside the business sphere. Data was collected from a sample of undergraduate students of business and non-business orientation and analyzed with multi-group analysis within SEM. Results of the multi-group analysis indicate that indeed, the strength of the relationship among tested variables is more pronounced when it comes to business students. That is, the mediating effect of perceived entrepreneurial self-efficacy and desirability of entrepreneurship in the relationship between entrepreneurial characteristics and intent is significantly stronger for the business-oriented groups, in comparison to the non-business orientation group. The amount of explained variance of all constructs (except entrepreneurial propensity) is also larger in business students in comparison to non-business students. Educational implications of obtained results are discussed.

  3. Verified spider bites in Oregon (USA) with the intent to assess hobo spider venom toxicity.

    Science.gov (United States)

    McKeown, Nathanael; Vetter, Richard S; Hendrickson, Robert G

    2014-06-01

    This study compiled 33 verified spider bites from the state of Oregon (USA). The initial goal was to amass a series of bites by the hobo spider to assess whether it possesses toxic venom, a supposition which is currently in a contested state. None of the 33 bites from several spider species developed significant medical symptoms nor did dermonecrosis occur. The most common biters were the yellow sac spider, Cheiracanthium mildei (N = 10) and orb-weavers of the genus Araneus (N = 6). There were 10 bites from three genera of funnel web spiders of the family Agelenidae including one hobo spider bite and one from the congeneric giant house spider which is readily confused as a hobo spider. The hobo spider bite resulted in pain, redness, twitching in the calf muscle and resolved in 12 h. Also generated from this study were possibly the first records of bites from spiders of the genera Callobius (Amaurobiidae) and Antrodiaetus (Antrodiaetidae), both with minor manifestations. Published by Elsevier Ltd.

  4. 78 FR 69871 - Agency Information Collection Activities: myE-Verify, Revision of a Currently Approved Collection

    Science.gov (United States)

    2013-11-21

    ... Collection (1) Type of Information Collection: Revision of a Currently Approved Collection. (2) Title of the... respond: E-Verify Self Check--Identity Authentication 2,900,000 responses at 0.0833 hours (5 minutes) per...

  5. Proposed procedure and analysis of results to verify the dose-area product indicator in radiology equipment

    International Nuclear Information System (INIS)

    Garcia Marcos, R.; Gallego Franco, P.; Sierra Diaz, F.; Gonzalez Ruiz, C.; Rodriguez Checa, M.; Brasa Estevez, M.; Gomez Calvar, R.

    2013-01-01

    The aim of this work is to establish a procedure to verify the dose-area product value displayed by certain radiology units, as an alternative to the use of external transmission chambers. (Author)

  6. Combining of both RPAS and GPR methods for documentation and verifying of archaeological objects

    Science.gov (United States)

    Pavelka, Karel; Šedina, Jaroslav

    2015-04-01

    UAVs (unmanned aerial vehicles) or RPAS (remotely piloted aircraft systems) are a modern technology for non-contact mapping and monitoring of small areas. Nowadays, for control and piloting, RPAS are equipped with sophisticated micro-instruments such as IMUs, gyroscopes, GNSS receivers, wireless image insights, wireless controls, automatic stabilization, flight planners, etc. RPAS can provide not only photographic data, but also other data types such as multispectral (with NDVI capability) and thermal data (depending on sensors and type). Bigger RPAS can be equipped with more complex and expensive instruments like laser scanners or hyperspectral scanners. The RPAS method of acquisition combines the benefits of close-range and aerial photogrammetry. As a result, a higher resolution and mapping precision can be obtained over compact and possibly less accessible areas (e.g. mountains, moors, swamps, dumps, small natural reserves, archaeological areas and dangerous or restricted areas). In our project, many small archaeological sites are monitored. The approach is low cost, simple, and fast. From these photos, a DSM (digital surface model) and orthophoto can be derived, which are useful for archaeologists (the DSM is often used in shaded relief form). Depending on the processing software, a textured virtual model can also be obtained. Near-infrared photos from heights of 100-200 m open new possibilities in archaeology. We used both RPAS and GPR methods in three case projects in the Czech Republic in 2014. 1. Historical field fortification: In the neighbourhood of the town of Litoměřice, ramparts from the Prussian-Austrian war in the 19th century are still visible. This was a forward field fortification, but it was never used in battle and later disappeared because of agricultural activities. Some parts are detectable by their terrain signatures, visible on shaded DSMs. During the documentation and research of these relics, we measured profiles with GPR to verify parts, which were

  7. Verifying the hypothesis of disconnection syndrome in patients with conduction aphasia using diffusion tensor imaging

    Institute of Scientific and Technical Information of China (English)

    Yanqin Guo; Jing Xu; Yindong Yang

    2007-01-01

    BACKGROUND: According to disconnection theory, disruption of the connection between the anterior and posterior language areas, i.e. a lesion of the arcuate fasciculus, causes conduction aphasia. OBJECTIVE: To verify the disconnection theory, elicited by the repetition disorder in patients with conduction aphasia, by comparing the characteristics of diffusion tensor imaging between healthy persons and patients with conduction aphasia. DESIGN: Case-control observation. SETTING: Department of Neurology, Hongqi Hospital Affiliated to Mudanjiang Medical College. PARTICIPANTS: Five male patients with conduction aphasia due to cerebral infarction involving the arcuate fasciculus, averaging (43±2) years of age, who were hospitalized in the Department of Neurology, Hongqi Hospital Affiliated to Mudanjiang Medical College from February 2004 to February 2005, were involved in this experiment. The involved patients were all confirmed as having cerebral infarction by skull CT and MRI, and met the diagnostic criteria revised at the 1995 4th Cerebrovascular Conference. They were examined with the Aphasia Battery of Chinese (ABC) edited by Surong Gao. Repetition was disproportionately poorer than auditory comprehension, consistent with the pattern of conduction aphasia. Another 5 healthy males, averaging (43±1) years of age, who were physicians receiving further training in the Department of Neurology, Beijing Tiantan Hospital, were also involved in this experiment. Informed consent for the examined items was obtained from all subjects. METHODS: All subjects underwent handedness assessment using the criteria formulated by the Department of Neurology, First Hospital Affiliated to Beijing Medical University. The arcuate fasciculus of the involved patients and healthy controls was analyzed with diffusion tensor imaging (DTI) and divided into 3 parts (anterior, middle and posterior segments) for determining FA values (mean values were obtained from three measurements), and a comparison of FA values was

  8. Can EC and UK national methane emission inventories be verified using high precision stable isotope data?

    International Nuclear Information System (INIS)

    Lowry, D.; Holmes, C.W.; Nisbet, E.G.; Rata, N.D.

    2002-01-01

    The main anthropogenic sources of methane in industrialised countries (landfill/waste treatment, gas storage and distribution, coal) are far easier to reduce than CO2 sources and the implementation of reduction strategies is potentially profitable. Statistical databases of methane emissions need independent external verification and carbon isotope data provide one way of estimating the expected source mix for each country if the main source types have been characterised isotopically. Using this method each country participating in the CORINAIR 94 database has been assigned an expected isotopic value for its emissions. The averaged δ13C of methane emitted from the CORINAIR region of Europe, based on total emissions of each country is -55.4 per mille for 1994. This European source mix can be verified using trajectory analysis for air samples collected at background stations. Methane emissions from the UK, and particularly the London region, have undergone more detailed analysis using data collected at the Royal Holloway site on the western fringe of London. If the latest emissions inventory figures are correct then the modelled isotopic change in the UK source mix is from -48.4 per mille in 1990 to -50.7 per mille in 1997. This represents a reduction in emissions of 25% over a 7-year period, important in meeting proposed UK greenhouse gas reduction targets. These changes can be tested by the isotopic analysis of air samples at carefully selected coastal background and interior sites. Regular sampling and isotopic analysis coupled with back trajectory analysis from a range of sites could provide an important tool for monitoring and verification of EC and UK methane emissions in the run-up to 2010. (author)
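
    The country-level source signature used in this kind of verification is an emission-weighted mean of the δ13C values of the individual source categories. A sketch with illustrative emission totals and signatures (not the CORINAIR figures):

```python
# Sketch of the emission-weighted source signature used to compare inventories
# with atmospheric measurements: delta13C_mix = sum(E_i * d_i) / sum(E_i).
# Emission totals (kt CH4/yr) and source signatures (per mille) are illustrative.

sources = {
    "landfill/waste": (1800.0, -55.0),
    "gas leakage":    (600.0,  -36.0),
    "coal":           (400.0,  -50.0),
    "ruminants":      (1100.0, -62.0),
}

total = sum(emission for emission, _ in sources.values())
d13c_mix = sum(emission * d13c for emission, d13c in sources.values()) / total
print(f"emission-weighted source mix: {d13c_mix:.1f} per mille")
```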

  9. Unmaking the bomb: Verifying limits on the stockpiles of nuclear weapons

    Science.gov (United States)

    Glaser, Alexander

    2017-11-01

    Verifying limits on the stockpiles of nuclear weapons may require the ability for international inspectors to account for individual warheads, even when non-deployed, and to confirm the authenticity of nuclear warheads prior to dismantlement. These are fundamentally new challenges for nuclear verification, and they have been known for some time; unfortunately, due to a lack of sense of urgency, research in this area has not made substantial progress over the past 20 years. This chapter explores the central outstanding issues and offers a number of possible paths forward. In the case of confirming numerical limits, these include innovative tagging techniques and approaches solely based on declarations using modern cryptographic escrow schemes; with regard to warhead confirmation, there has recently been increasing interest in developing fundamentally new measurement approaches where, in one form or another, sensitive information is not acquired in the first place. Overall, new international R&D efforts could more usefully focus on non-intrusive technologies and approaches, which may show more promise for early demonstration and adoption. In the meantime, while warhead dismantlements remain unverified, nuclear weapon states ought to begin to document warhead assembly, refurbishment, and dismantlement activities and movements of warheads and warhead components through the weapons complex in ways that international inspectors will find credible at a later time. Again, such a process could be enabled by modern cryptographic techniques such as blockchaining. Finally, and perhaps most importantly, it is important to recognize that the main reason for the complexity of technologies and approaches needed for nuclear disarmament verification is the requirement to protect information that nuclear weapon states consider sensitive. Ultimately, if information security concerns cannot be resolved to the satisfaction of all stakeholders, an alternative would be to "reveal the

  10. Trust, but verify – accuracy of clinical commercial radiation treatment planning systems

    International Nuclear Information System (INIS)

    Lehmann, J; Kenny, J; Lye, J; Dunn, L; Williams, I

    2014-01-01

    Computer based Treatment Planning Systems (TPS) are used worldwide to design and calculate treatment plans for treating radiation therapy patients. TPS are generally well designed and thoroughly tested by their developers and local physicists prior to clinical use. However, the wide-reaching impact of their accuracy warrants ongoing vigilance. This work reviews the findings of the Australian national audit system and provides recommendations for checks of TPS. The Australian Clinical Dosimetry Service (ACDS) has designed and implemented a national system of audits, currently in a three year test phase. The Level III audits verify the accuracy of a beam model of a facility's TPS through a comparison of measurements with calculation at selected points in an anthropomorphic phantom. The plans are prescribed by the ACDS and all measurement equipment is brought in for independent onsite measurements. In this first version of audits, plans are comparatively simple, involving asymmetric fields, wedges and inhomogeneities. The ACDS has performed 14 Level III audits to-date. Six audits returned at least one measurement at Action Level, indicating that the measured dose differed more than 3.3% (but less than 5%) from the planned dose. Two audits failed (difference >5%). One fail was caused by a data transmission error coupled with quality assurance (QA) not being performed. The second fail was investigated and reduced to Action Level with the onsite audit team finding phantom setup at treatment a contributing factor. The Action Level results are attributed to small dose calculation deviations within the TPS, which are investigated and corrected by the facilities. Small deviations exist in clinical TPS which can add up and can combine with output variations to result in unacceptable variations. Ongoing checks and independent audits are recommended.
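
    The audit scoring described above reduces to comparing measured and planned point doses against the 3.3% Action Level and the 5% fail threshold. A small sketch with illustrative doses:

```python
# Sketch of the Level III audit scoring: each measured point is compared with the
# planned dose; deviations beyond 3.3% are at Action Level, beyond 5% are a fail.
# Dose values are illustrative.

ACTION_LEVEL = 3.3   # percent
FAIL_LEVEL = 5.0     # percent

def audit_point(planned_gy, measured_gy):
    dev = 100.0 * (measured_gy - planned_gy) / planned_gy
    if abs(dev) > FAIL_LEVEL:
        status = "FAIL"
    elif abs(dev) > ACTION_LEVEL:
        status = "Action Level"
    else:
        status = "Pass"
    return dev, status

for planned, measured in [(2.00, 2.03), (2.00, 2.08), (2.00, 2.12)]:
    dev, status = audit_point(planned, measured)
    print(f"planned {planned:.2f} Gy, measured {measured:.2f} Gy: {dev:+.1f}% -> {status}")
```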

  11. Rigidity of quantum steering and one-sided device-independent verifiable quantum computation

    International Nuclear Information System (INIS)

    Gheorghiu, Alexandru; Wallden, Petros; Kashefi, Elham

    2017-01-01

    The relationship between correlations and entanglement has played a major role in understanding quantum theory since the work of Einstein et al (1935 Phys. Rev. 47 777–80). Tsirelson proved that Bell states, shared among two parties, when measured suitably, achieve the maximum non-local correlations allowed by quantum mechanics (Cirel’son 1980 Lett. Math. Phys. 4 93–100). Conversely, Reichardt et al showed that observing the maximal correlation value over a sequence of repeated measurements, implies that the underlying quantum state is close to a tensor product of maximally entangled states and, moreover, that it is measured according to an ideal strategy (Reichardt et al 2013 Nature 496 456–60). However, this strong rigidity result comes at a high price, requiring a large number of entangled pairs to be tested. In this paper, we present a significant improvement in terms of the overhead by instead considering quantum steering where the device of the one side is trusted. We first demonstrate a robust one-sided device-independent version of self-testing, which characterises the shared state and measurement operators of two parties up to a certain bound. We show that this bound is optimal up to constant factors and we generalise the results for the most general attacks. This leads us to a rigidity theorem for maximal steering correlations. As a key application we give a one-sided device-independent protocol for verifiable delegated quantum computation, and compare it to other existing protocols, to highlight the cost of trust assumptions. Finally, we show that under reasonable assumptions, the states shared in order to run a certain type of verification protocol must be unitarily equivalent to perfect Bell states. (paper)

  12. Trust, but verify - Accuracy of clinical commercial radiation Treatment Planning Systems

    Science.gov (United States)

    Lehmann, J.; Kenny, J.; Lye, J.; Dunn, L.; Williams, I.

    2014-03-01

    Computer based Treatment Planning Systems (TPS) are used worldwide to design and calculate treatment plans for treating radiation therapy patients. TPS are generally well designed and thoroughly tested by their developers and local physicists prior to clinical use. However, the wide-reaching impact of their accuracy warrants ongoing vigilance. This work reviews the findings of the Australian national audit system and provides recommendations for checks of TPS. The Australian Clinical Dosimetry Service (ACDS) has designed and implemented a national system of audits, currently in a three year test phase. The Level III audits verify the accuracy of a beam model of a facility's TPS through a comparison of measurements with calculation at selected points in an anthropomorphic phantom. The plans are prescribed by the ACDS and all measurement equipment is brought in for independent onsite measurements. In this first version of audits, plans are comparatively simple, involving asymmetric fields, wedges and inhomogeneities. The ACDS has performed 14 Level III audits to-date. Six audits returned at least one measurement at Action Level, indicating that the measured dose differed more than 3.3% (but less than 5%) from the planned dose. Two audits failed (difference >5%). One fail was caused by a data transmission error coupled with quality assurance (QA) not being performed. The second fail was investigated and reduced to Action Level with the onsite audit team finding phantom setup at treatment a contributing factor. The Action Level results are attributed to small dose calculation deviations within the TPS, which are investigated and corrected by the facilities. Small deviations exist in clinical TPS which can add up and can combine with output variations to result in unacceptable variations. Ongoing checks and independent audits are recommended.

  13. Verifying three-dimensional skull model reconstruction using cranial index of symmetry.

    Directory of Open Access Journals (Sweden)

    Woon-Man Kung

    Full Text Available BACKGROUND: Difficulty exists in scalp adaptation for cranioplasty with customized computer-assisted design/manufacturing (CAD/CAM) implant in situations of excessive wound tension and sub-cranioplasty dead space. To solve this clinical problem, the CAD/CAM technique should include algorithms to reconstruct a depressed contour to cover the skull defect. Satisfactory CAM-derived alloplastic implants are based on highly accurate three-dimensional (3-D) CAD modeling. Thus, it is quite important to establish a symmetrically regular CAD/CAM reconstruction prior to depressing the contour. The purpose of this study is to verify the aesthetic outcomes of CAD models with regular contours using cranial index of symmetry (CIS). MATERIALS AND METHODS: From January 2011 to June 2012, decompressive craniectomy (DC) was performed for 15 consecutive patients in our institute. 3-D CAD models of skull defects were reconstructed using commercial software. These models were checked in terms of symmetry by CIS scores. RESULTS: CIS scores of CAD reconstructions were 99.24±0.004% (range 98.47-99.84). CIS scores of these CAD models were statistically significantly greater than 95%, identical to 99.5%, but lower than 99.6% (p<0.001, p = 0.064, p = 0.021 respectively, Wilcoxon matched pairs signed rank test). These data evidenced the highly accurate symmetry of these CAD models with regular contours. CONCLUSIONS: CIS calculation is beneficial to assess aesthetic outcomes of CAD-reconstructed skulls in terms of cranial symmetry. This enables further accurate CAD models and CAM cranial implants with depressed contours, which are essential in patients with difficult scalp adaptation.
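
    The statistical comparison reported above (CIS scores tested against fixed reference values with a one-sample Wilcoxon signed-rank test) can be sketched as follows. The CIS scores below are placeholders in the reported range, not the study's data, and the CIS formula itself is not reproduced.

```python
import numpy as np
from scipy.stats import wilcoxon

# Sketch of the symmetry test: compare CIS scores of the CAD reconstructions
# against fixed reference values with a one-sample Wilcoxon signed-rank test
# applied to the differences. CIS values below are placeholders.

cis_scores = np.array([99.1, 99.3, 98.9, 99.5, 99.6, 98.5, 99.4, 99.2,
                       99.7, 99.0, 99.8, 99.3, 99.2, 99.4, 99.1])

for reference in (95.0, 99.5, 99.6):
    stat, p = wilcoxon(cis_scores - reference)
    print(f"CIS vs {reference}%: W = {stat:.1f}, p = {p:.4f}")
```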

  14. Methods, software and datasets to verify DVH calculations against analytical values: Twenty years late(r)

    Energy Technology Data Exchange (ETDEWEB)

    Nelms, Benjamin [Canis Lupus LLC, Merrimac, Wisconsin 53561 (United States); Stambaugh, Cassandra [Department of Physics, University of South Florida, Tampa, Florida 33612 (United States); Hunt, Dylan; Tonner, Brian; Zhang, Geoffrey; Feygelman, Vladimir, E-mail: vladimir.feygelman@moffitt.org [Department of Radiation Oncology, Moffitt Cancer Center, Tampa, Florida 33612 (United States)

    2015-08-15

    Purpose: The authors designed data, methods, and metrics that can serve as a standard, independent of any software package, to evaluate dose-volume histogram (DVH) calculation accuracy and detect limitations. The authors use simple geometrical objects at different orientations combined with dose grids of varying spatial resolution with linear 1D dose gradients; when combined, ground truth DVH curves can be calculated analytically in closed form to serve as the absolute standards. Methods: DICOM RT structure sets containing a small sphere, cylinder, and cone were created programmatically with axial plane spacing varying from 0.2 to 3 mm. Cylinders and cones were modeled in two different orientations with respect to the IEC 1217 Y axis. The contours were designed to stringently but methodically test voxelation methods required for DVH. Synthetic RT dose files were generated with 1D linear dose gradient and with grid resolution varying from 0.4 to 3 mm. Two commercial DVH algorithms—PINNACLE (Philips Radiation Oncology Systems) and PlanIQ (Sun Nuclear Corp.)—were tested against analytical values using custom, noncommercial analysis software. In Test 1, axial contour spacing was constant at 0.2 mm while dose grid resolution varied. In Tests 2 and 3, the dose grid resolution was matched to varying subsampled axial contours with spacing of 1, 2, and 3 mm, and difference analysis and metrics were employed: (1) histograms of the accuracy of various DVH parameters (total volume, Dmax, Dmin, and doses to % volume: D99, D95, D5, D1, D0.03 cm³) and (2) volume errors extracted along the DVH curves were generated and summarized in tabular and graphical forms. Results: In Test 1, PINNACLE produced 52 deviations (15%) while PlanIQ produced 5 (1.5%). In Test 2, PINNACLE and PlanIQ differed from analytical by >3% in 93 (36%) and 18 (7%) times, respectively. Excluding Dmin and Dmax as least clinically relevant would result in 32 (15%) vs 5 (2

  15. Methods, software and datasets to verify DVH calculations against analytical values: Twenty years late(r).

    Science.gov (United States)

    Nelms, Benjamin; Stambaugh, Cassandra; Hunt, Dylan; Tonner, Brian; Zhang, Geoffrey; Feygelman, Vladimir

    2015-08-01

    The authors designed data, methods, and metrics that can serve as a standard, independent of any software package, to evaluate dose-volume histogram (DVH) calculation accuracy and detect limitations. The authors use simple geometrical objects at different orientations combined with dose grids of varying spatial resolution with linear 1D dose gradients; when combined, ground truth DVH curves can be calculated analytically in closed form to serve as the absolute standards. DICOM RT structure sets containing a small sphere, cylinder, and cone were created programmatically with axial plane spacing varying from 0.2 to 3 mm. Cylinders and cones were modeled in two different orientations with respect to the IEC 1217 Y axis. The contours were designed to stringently but methodically test voxelation methods required for DVH. Synthetic RT dose files were generated with 1D linear dose gradient and with grid resolution varying from 0.4 to 3 mm. Two commercial DVH algorithms - PINNACLE (Philips Radiation Oncology Systems) and PlanIQ (Sun Nuclear Corp.) - were tested against analytical values using custom, noncommercial analysis software. In Test 1, axial contour spacing was constant at 0.2 mm while dose grid resolution varied. In Tests 2 and 3, the dose grid resolution was matched to varying subsampled axial contours with spacing of 1, 2, and 3 mm, and difference analysis and metrics were employed: (1) histograms of the accuracy of various DVH parameters (total volume, Dmax, Dmin, and doses to % volume: D99, D95, D5, D1, D0.03 cm³) and (2) volume errors extracted along the DVH curves were generated and summarized in tabular and graphical forms. In Test 1, PINNACLE produced 52 deviations (15%) while PlanIQ produced 5 (1.5%). In Test 2, PINNACLE and PlanIQ differed from analytical by >3% in 93 (36%) and 18 (7%) times, respectively. Excluding Dmin and Dmax as least clinically relevant would result in 32 (15%) vs 5 (2%) scored deviations for PINNACLE vs PlanIQ in Test 1, while Test 2
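
    The closed-form benchmark idea is easy to reproduce for the simplest case: a sphere in a linear 1D dose gradient has an analytical cumulative DVH given by spherical-cap volumes, which can be checked against a naive voxel-counting DVH. The geometry, gradient, and grid spacing below are illustrative, not the published test conditions.

```python
import numpy as np

# Sketch of a closed-form DVH check: a sphere placed in a dose grid with a linear
# 1D gradient has an analytical cumulative DVH (spherical caps), which can be
# compared against simple voxel counting. Parameters are illustrative.

R = 10.0                         # sphere radius, mm
d0, g = 10.0, 0.5                # dose at sphere centre (Gy) and gradient (Gy/mm) along z
spacing = 1.0                    # voxel size, mm

def analytic_volume_fraction(dose_level):
    """Fraction of the sphere receiving >= dose_level for dose = d0 + g*z."""
    z = (dose_level - d0) / g
    if z <= -R:
        return 1.0
    if z >= R:
        return 0.0
    h = R - z                    # height of the spherical cap above the iso-dose plane
    return (np.pi * h**2 * (3 * R - h) / 3) / (4 * np.pi * R**3 / 3)

# Voxel-counting DVH on a regular grid.
ax = np.arange(-R, R + spacing, spacing)
x, y, z = np.meshgrid(ax, ax, ax, indexing="ij")
inside = x**2 + y**2 + z**2 <= R**2
dose = d0 + g * z

for level in (6.0, 10.0, 14.0):
    numeric = np.mean(dose[inside] >= level)
    print(f"D >= {level:4.1f} Gy: analytic {analytic_volume_fraction(level):.3f}, "
          f"voxelized {numeric:.3f}")
```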

  16. Experimentally verifiable Yang-Mills spin 2 gauge theory of gravity with group U(1) x SU(2)

    International Nuclear Information System (INIS)

    Peng, H.

    1988-01-01

    In this work, a Yang-Mills spin 2 gauge theory of gravity is proposed. Based on both the verification of the helicity 2 property of the SU(2) gauge bosons of the theory and the agreement of the theory with most observational and experimental evidence, the author argues that the theory is truly a gravitational theory. An internal symmetry group, the eigenvalues of whose generators are identical with quantum numbers, characterizes the interactions of a given class. The author demonstrates that the 4-momentum Pμ of a fermion field generates the U(1) x SU(2) internal symmetry group for gravity, but not the transformation group T4. That particles are classified by mass and spin implies that the U(1) x SU(2), instead of the Poincare group, is a symmetry group of gravity. It is shown that the U(1) x SU(2) group represents the time displacement and rotation in ordinary space. Thereby the internal space associated with gravity is identical with Minkowski spacetime, so a gauge potential of gravity carries two space-time indices. The author then verifies that the SU(2) gravitational boson has helicity 2. It is this fact, spin from internal spin, that explains alternatively why the gravitational field is the only field which is characterized by spin 2. The physical meaning of gauge potentials of gravity is determined by comparing theory with the results of experiments, such as the Colella-Overhauser-Werner (COW) experiment and the Newtonian limit, etc. The gauge potentials thus must be identified with ordinary gravitational potentials

  17. Governor stability simulations of Svartisen power plant verified by the installed monitoring system on site

    International Nuclear Information System (INIS)

    Nielsen, T K; Kjeldsen, M

    2010-01-01

    plant has performed during start-ups, close-downs and steady-state operation. The data have, for instance, been used to verify the simulations of governor stability of the existing turbine. This will also support the simulations done with an additional turbine installed.

  18. Governor stability simulations of Svartisen power plant verified by the installed monitoring system on site

    Science.gov (United States)

    Nielsen, T. K.; Kjeldsen, M.

    2010-08-01

    plant has performed during start-ups, close-downs and steady-state operation. The data have, for instance, been used to verify the simulations of governor stability of the existing turbine. This will also support the simulations done with an additional turbine installed.

  19. Verifying Parentage and Confirming Identity in Blackberry with a Fingerprinting Set

    Science.gov (United States)

    Parentage and identity confirmation is an important aspect of clonally propagated, outcrossing crops. Potential errors resulting in misidentification include off-type pollination events, labeling errors, or sports of clones. DNA fingerprinting sets are an excellent solution to quickly identify off-type ...

  20. 40 CFR 63.2994 - How do I verify the performance of monitoring equipment?

    Science.gov (United States)

    2010-07-01

    ... equipment? (a) Before conducting the performance test, you must take the steps listed in paragraphs (a)(1) and (2) of this section: (1) Install and calibrate all process equipment, control devices, and... evaluation results. (b) If you use a thermal oxidizer, the temperature monitoring device must meet the...

  1. Verifying compliance with nuclear non-proliferation undertakings: IAEA safeguards agreements and additional protocols

    International Nuclear Information System (INIS)

    2008-06-01

    commonly used, for instance, in shielding on radioactive sources used in hospitals. Other radioactive material, such as most radioactive sources and isotopes used in medicine, industry, agriculture, and water resource management, are not the subject of safeguards and need not be reported to the IAEA under safeguards agreements. Reporting depends on the level of nuclear activity in the country. Declarations pursuant to safeguards agreements and additional protocols for States that do not have nuclear facilities are expected to be short and simple. The IAEA has prepared a document, available upon request, which provides guidance on the reporting requirements for such States. More elaborate guidelines have been prepared for States that do have nuclear facilities subject to routine safeguards inspections. Through its activities in the field, the IAEA seeks to verify the correctness and completeness of States' reports and declarations regarding nuclear material. Each State with a comprehensive safeguards agreement is required to establish and maintain a State system of accounting for and control of nuclear material (SSAC), which is the national authority formally designated to keep track of nuclear material and activities in the country. For all States with safeguards agreements in force, the IAEA draws an annual conclusion on the non-diversion of nuclear material and other items placed under safeguard. The IAEA's focal point for the negotiation of safeguards agreements and additional protocols, and the amendment of SQPs, is the Office of External Relations and Policy Coordination. Once a State has decided to conclude such an agreement and/or protocol, or amend its SQP, the IAEA can help the country with the implementation of related legal and technical requirements. The appendix of this publication informs how to conclude a comprehensive Safeguards Agreement and/or an Additional Protocol and provides 3 model notification letters for (a) conclusion of a safeguards agreement, a

  2. Verifying compliance with nuclear non-proliferation undertakings: IAEA safeguards agreements and additional protocols

    International Nuclear Information System (INIS)

    2008-04-01

    commonly used, for instance, in shielding on radioactive sources used in hospitals. Other radioactive material, such as most radioactive sources and isotopes used in medicine, industry, agriculture, and water resource management, are not the subject of safeguards and need not be reported to the IAEA under safeguards agreements. Reporting depends on the level of nuclear activity in the country. Declarations pursuant to safeguards agreements and additional protocols for States that do not have nuclear facilities are expected to be short and simple. The IAEA has prepared a document, available upon request, which provides guidance on the reporting requirements for such States. More elaborate guidelines have been prepared for States that do have nuclear facilities subject to routine safeguards inspections. Through its activities in the field, the IAEA seeks to verify the correctness and completeness of States' reports and declarations regarding nuclear material. Each State with a comprehensive safeguards agreement is required to establish and maintain a State system of accounting for and control of nuclear material (SSAC), which is the national authority formally designated to keep track of nuclear material and activities in the country. For all States with safeguards agreements in force, the IAEA draws an annual conclusion on the non-diversion of nuclear material and other items placed under safeguard. The IAEA's focal point for the negotiation of safeguards agreements and additional protocols, and the amendment of SQPs, is the Office of External Relations and Policy Coordination. Once a State has decided to conclude such an agreement and/or protocol, or amend its SQP, the IAEA can help the country with the implementation of related legal and technical requirements. The appendix of this publication informs how to conclude a comprehensive Safeguards Agreement and/or an Additional Protocol and provides 3 model notification letters for (a) conclusion of a safeguards agreement, a

  3. Model-based strategy for cell culture seed train layout verified at lab scale.

    Science.gov (United States)

    Kern, Simon; Platas-Barradas, Oscar; Pörtner, Ralf; Frahm, Björn

    2016-08-01

    Cell culture seed trains (the generation of a sufficient viable cell number for the inoculation of the production scale bioreactor, starting from incubator scale) are time- and cost-intensive. Accordingly, a seed train offers potential for optimization regarding its layout and the corresponding proceedings. A tool has been developed to determine the optimal points in time for cell passaging from one scale into the next, and it has been applied to two different cell lines at lab scale, AGE1.HN AAT and CHO-K1. For evaluation, the experimental seed train realization was compared to its layout. In the case of the AGE1.HN AAT cell line, the results have also been compared to the formerly manually designed seed train. The tool provides the same seed train layout based on the data of only two batches.
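
    The scheduling decision such a tool automates can be sketched with simple exponential-growth arithmetic; the growth rate and cell densities below are illustrative assumptions, not parameters of the AGE1.HN AAT or CHO-K1 cultures.

```python
import math

# Sketch of the seed-train scheduling arithmetic behind such a layout tool:
# with exponential growth X(t) = X0 * exp(mu * t), the time to reach the viable
# cell density needed to inoculate the next scale is t = ln(X_target/X0) / mu.
# Growth rate and densities are illustrative, not cell-line-specific values.

mu_per_h = 0.029                      # specific growth rate, 1/h (assumption)
x0 = 0.3e6                            # viable cells/mL after inoculation (assumption)
x_target = 2.0e6                      # density needed to seed the next vessel (assumption)

t_passage_h = math.log(x_target / x0) / mu_per_h
print(f"passage after ~{t_passage_h:.0f} h ({t_passage_h / 24:.1f} days)")
```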

  4. The use of curium neutrons to verify plutonium in spent fuel and reprocessing wastes

    International Nuclear Information System (INIS)

    Miura, N.

    1994-05-01

    For safeguards verification of spent fuel, leached hulls, and reprocessing wastes, it is necessary to determine the plutonium content in these items. We have evaluated the use of passive neutron multiplicity counting to determine the plutonium content directly and also to measure the 240Pu/244Cm ratio for the indirect verification of the plutonium. Neutron multiplicity counting of the singles, doubles, and triples neutrons has been evaluated for measuring 240Pu, 244Cm, and 252Cf. We have proposed a method to establish the plutonium to curium ratio using the hybrid k-edge densitometer x-ray fluorescence instrument plus a neutron coincidence counter for the reprocessing dissolver solution. This report presents the concepts, experimental results, and error estimates for typical spent fuel applications
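
    The indirect verification route mentioned above can be sketched as a two-step calculation: passive neutron counting yields the 244Cm content, and the 240Pu/244Cm ratio established on the dissolver solution, together with declared isotopics, converts it to total plutonium. All numbers below are illustrative placeholders.

```python
# Sketch of the indirect plutonium verification: neutron counting gives the 244Cm
# content of the item; the 240Pu/244Cm ratio measured on the dissolver solution
# and the declared 240Pu fraction convert it to total plutonium.
# All values are illustrative placeholders.

cm244_g = 0.85                       # 244Cm inferred from neutron counting (illustrative)
pu240_to_cm244 = 90.0                # 240Pu / 244Cm mass ratio from dissolver solution (illustrative)
pu240_fraction = 0.22                # 240Pu share of total Pu from declared isotopics (illustrative)

pu240_g = cm244_g * pu240_to_cm244
pu_total_g = pu240_g / pu240_fraction
print(f"240Pu: {pu240_g:.1f} g, total Pu: {pu_total_g:.1f} g")
```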

  5. Ecological improvements to hydroelectric power plants under EEG. Guidance to environmental verifiers and water rights authorities

    International Nuclear Information System (INIS)

    Meyr, Christoph; Pfeifer, Hansjoerg; Schnell, Johannes; Hanfland, Sebastian

    2011-11-01

    The use of hydropower as a renewable form of energy is experiencing a renaissance due to the energy transition in Bavaria. The fishery evaluate not uncritically this development, because hydroelectric plants generally normally represent a considerable intervention in water and therefore in the habitat of the fish. In this case it should be noted that just often not even the minimum requirements of ecology are fulfilled at existing plants according to the Federal Water Act. [de

  6. Cardiac Magnetic Resonance-Verified Myocardial Fibrosis in Chagas Disease: Clinical Correlates and Risk Stratification

    Directory of Open Access Journals (Sweden)

    Marly Uellendahl

    Full Text Available Abstract Background: Chagas disease (CD) is an important cause of heart failure and mortality, mainly in Latin America. This study evaluated the morphological and functional characteristics of the heart as well as the extent of myocardial fibrosis (MF) in patients with CD by cardiac magnetic resonance (CMR). The prognostic value of MF evaluated by myocardial delayed enhancement (MDE) was compared with that via the Rassi score. Methods: This study assessed 39 patients divided into 2 groups: 28 asymptomatic patients as the indeterminate form group (IND), and symptomatic patients as the Chagas Heart Disease (CHD) group. All patients underwent CMR using the techniques of cine-MRI and MDE, and the amount of MF was compared with the Rassi score. Results: Regarding the morphological and functional analysis, significant differences were observed between both groups (p < 0.001). Furthermore, there was a strong correlation between the extent of MF and the Rassi score (r = 0.76). Conclusions: CMR is an important technique for evaluating patients with CD, stressing morphological and functional differences in all clinical presentations. The strong correlation with the Rassi score and the extent of MF detected by CMR emphasizes its role in the prognostic stratification of patients with CD.

  7. Evaluation of Chemical, Spectroscopic and Chromatographic Methods Used to Identify Offshore Oil Pollutants

    Directory of Open Access Journals (Sweden)

    Albaigés J.

    2006-11-01

    Full Text Available This article reviews the different methods that can be used to identify the leading petroleum pollutants of the sea by quantitative analysis of their "passive markers" (sulfur, nitrogen, nickel, vanadium, paraffins, asphaltenes) and by determining other intrinsic properties. These methods are chemical, spectroscopic (infrared, ultraviolet) and chromatographic (high-resolution gas chromatography with flame ionization detection, flame photometry and electron capture). Measurements were made on a great variety of products capable of polluting the Spanish Mediterranean coast: crude oil from the offshore Amposta and Castellón fields; imported crude oils processed in coastal refineries (Boscan, Es Sider, Kuwait, Arabian Light, etc.); heavy fractions from these refineries (fuel oils, asphalts, lubricants); actual pollutants; and samples artificially weathered in the laboratory to reveal the progressive action of natural elements. The most useful methods were found to be chemical determination of sulfur, nickel and vanadium; infrared spectroscopy; and high-resolution gas chromatography with flame ionization and flame photometric detection.

  8. Effect of glucocorticosteroid injections in tennis elbow verified on colour Doppler ultrasonography: evidence of inflammation

    DEFF Research Database (Denmark)

    Torp-Pedersen, T.E.; Torp-Pedersen, S.T.; Qvistgaard, E.

    2008-01-01

    were evaluated at baseline before the injection and at 2 weeks of follow-up. Outcome measures were changes in pain score and US parameters (resistive index (RI) and the amount of colour within the CEO). Prognosticators for outcome were: use of computer mouse, symptom duration, elbow strain, RI, colour...... fraction, Likert pain score, pain at rest, pain during activity, age, height, weight, disease in dominant versus nondominant arm. RESULTS: All but one patient experienced improvement of general elbow pain perception at follow-up at 2 weeks. In parallel, Doppler US showed significant reduction in colour...

  9. DMPL: Programming and Verifying Distributed Mixed Synchrony and Mixed Critical Software

    Science.gov (United States)

    2016-06-16

    Report contents include: Program Instance Semantics; Property Specification; Concrete Syntax; Code Generation; Verification via Sequentialization (Implementing Sequentialization; Bug Finding and Full Verification); Evaluation (Reconnaissance example; Other examples); and Future work. The authors thank Gabriel Moreno for help with zsrm, madara, and self-adaptation, and the rest of the dart team members for many helpful comments and discussions. CMU/SEI

  10. Development of a system to verify the programs used for planning of photon beams teletherapy

    International Nuclear Information System (INIS)

    Ocariz Ayala, Victor Daniel

    2004-12-01

    The main objective of radiotherapy is to deliver the radiation dose prescribed by the physician to the tumor as accurately as possible, while sparing, as much as possible, the healthy tissues located in the neighborhood of the tumor. To reach these objectives, treatment planning must be carried out, and the more sophisticated the technologies and therapeutic procedures used, the more sophisticated the planning becomes. The most sophisticated planning systems use computer programs and are able to determine dose distributions in three dimensions. However, since they work with mathematical models, they may fail, and their performance must be evaluated before they can be considered reliable. Therefore, a system capable of evaluating the performance of planning systems employed in oncological teletherapy with ionizing radiation becomes important. In this work, a data file for quality control of radiotherapy planning systems (algorithm accuracy and dose distribution) was developed; it can be sent by mail to radiotherapy services that work with photon beams. (author)

  11. THE RISKS’ ASSESSMENT IN INNOVATIVE PROJECTS BY THE METHOD OF VERIFIED EQUIVALENTS

    Directory of Open Access Journals (Sweden)

    Анатолій Валентинович ШАХОВ

    2017-03-01

    Full Text Available The article describes the concept of "risk of innovation", identifies the causes of risk, and discusses methods for eliminating the negative manifestations of risk situations in innovative projects. The advantages and disadvantages of the discount-rate correction method and the method of equivalent annuities are considered. A methodical approach to assessing the expected effect of an innovative project, based on the concept of probability-interval uncertainty, is proposed. It was established that the analyzed approaches can be used to account for the risk of innovative projects. The project manager chooses a risk assessment method individually, depending on the extent and characteristics of the project, the degree of novelty and scale of introduction of the innovative product, the number of participants, the level of requirements for substantiating project efficiency, and other factors.
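
    As a purely illustrative aside, the Python sketch below contrasts the two approaches discussed above: correcting the discount rate for risk versus discounting certainty ("verified") equivalents of the expected cash flows at the risk-free rate. The cash flows, rates and certainty coefficients are invented for the example and are not taken from the article.

        # Toy comparison of two ways to account for risk in an innovative project.
        # All figures below are made up for the sketch.
        cash_flows = [-1000.0, 400.0, 500.0, 600.0]    # expected flows, years 0..3
        risk_free = 0.05
        risk_premium = 0.07                            # added to the discount rate
        certainty_coeffs = [1.0, 0.9, 0.8, 0.7]        # confidence in each year's flow

        def npv(flows, rate):
            return sum(cf / (1.0 + rate) ** t for t, cf in enumerate(flows))

        npv_risk_adjusted = npv(cash_flows, risk_free + risk_premium)
        npv_certainty_equiv = npv([a * cf for a, cf in zip(certainty_coeffs, cash_flows)],
                                  risk_free)

        print(f"NPV with risk-adjusted discount rate: {npv_risk_adjusted:8.1f}")
        print(f"NPV with certainty equivalents:       {npv_certainty_equiv:8.1f}")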

  12. On Verifying Currents and Other Features in the Hawaiian Islands Region Using Fully Coupled Ocean/Atmosphere Mesoscale Prediction System Compared to Global Ocean Model and Ocean Observations

    Science.gov (United States)

    Jessen, P. G.; Chen, S.

    2014-12-01

    This poster introduces and evaluates features of the Hawaii, USA, region using the U.S. Navy's fully Coupled Ocean/Atmosphere Mesoscale Prediction System (COAMPS-OS™) coupled to the Navy Coastal Ocean Model (NCOM). It also outlines some challenges in verifying ocean currents in the open ocean. The system is evaluated using in situ ocean data and initial forcing fields from the operational global Hybrid Coordinate Ocean Model (HYCOM). Verification shows difficulties in modelling the downstream currents off the Hawaiian islands (Hawaii's wake). Comparison of the HYCOM and NCOM current fields shows some displacement of small features such as eddies. Generally, there is fair agreement between HYCOM and NCOM in the salinity and temperature fields, and good agreement in the SSH fields.

  13. 222Radon Concentration Measurements biased to Cerro Prieto Fault for Verify its Continuity to the Northwest of the Mexicali Valley.

    Science.gov (United States)

    Lazaro-Mancilla, O.; Lopez, D. L.; Reyes-Lopez, J. A.; Carreón-Diazconti, C.; Ramirez-Hernandez, J.

    2009-05-01

    The need to know the exact field location of fault traces in Mexicali is an important issue because the topography of the valley is almost flat and fault traces are hidden by the plow zone; for this reason, the southern and northern ends of the San Jacinto and Cerro Prieto fault zones, respectively, are not well defined beneath the thick sequence of late Holocene Lake Cahuilla deposits. The purpose of this study was to verify whether the Cerro Prieto fault is the southeastern continuation of the San Jacinto Fault proposed by Hogan in 2002, who based his analysis on pre-agriculture geomorphology, relocation and analysis of regional microseismicity, and trench exposures from a paleoseismic site in Laguna Xochimilco, Mexicali. In this study, four radon (222Rn) profiles were carried out in the Mexicali Valley: the first SW-NE of Cerro Prieto Volcano, the second W-E along the highway Libramiento San Luis Río Colorado-Tecate, the third W-E of Laguna Xochimilco, and the fourth W-E of Colonia Progreso. The radon results allow us to identify in the Cerro Prieto profile four regions where values exceed 100 picocuries per liter (pCi/L); these regions can be associated with fault traces, one of them with the Cerro Prieto Fault (200 pCi/L) and another with the Michoacán de Ocampo Fault (450 pCi/L). The Libramiento San Luis Río Colorado-Tecate profile shows three regions above 100 pCi/L, two of them related to the same faults. At Laguna Xochimilco, the site used by Hogan (2002), the profile shows three regions above 100 pCi/L, but only one of them can be associated with the Michoacán de Ocampo Fault and none with the Cerro Prieto Fault. Finally, although the Colonia Progreso profile is the shortest, with only five stations, it shows one region with a value of 270 pCi/L that can be correlated with the Cerro Prieto Fault. The results of this study allow us to think in the
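
    For illustration, a minimal Python sketch of the screening step described above: flagging contiguous stretches of a radon profile whose concentration exceeds the 100 pCi/L level used in the study as candidate fault traces. The station spacings and readings are invented, and the simple threshold grouping is only an assumption, not the authors' procedure.

        # Flag contiguous regions of a 222Rn profile above a threshold as candidate
        # fault traces. Distances and readings are invented for the example.
        from itertools import groupby

        profile = [  # (distance_km, radon_pCi_per_L)
            (0.0, 40), (0.5, 60), (1.0, 210), (1.5, 450),
            (2.0, 80), (2.5, 90), (3.0, 120), (3.5, 30),
        ]
        THRESHOLD = 100.0

        def anomalous_regions(stations, threshold):
            regions = []
            for is_anomalous, group in groupby(stations, key=lambda s: s[1] > threshold):
                group = list(group)
                if is_anomalous:
                    regions.append((group[0][0], group[-1][0], max(v for _, v in group)))
            return regions

        for start, end, peak in anomalous_regions(profile, THRESHOLD):
            print(f"Possible fault trace between km {start} and km {end} "
                  f"(peak {peak} pCi/L)")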

  14. An Evaluation of the Technical Adequacy of a Revised Measure of Quality Indicators of Transition

    Science.gov (United States)

    Morningstar, Mary E.; Lee, Hyunjoo; Lattin, Dana L.; Murray, Angela K.

    2016-01-01

    This study confirmed the reliability and validity of the Quality Indicators of Exemplary Transition Programs Needs Assessment-2 (QI-2). Quality transition program indicators were identified through a systematic synthesis of transition research, policies, and program evaluation measures. To verify reliability and validity of the QI-2, we…

  15. Perspectives on understanding and verifying the safety terrain of modular high temperature gas-cooled reactors

    Energy Technology Data Exchange (ETDEWEB)

    Carlson, Donald E., E-mail: donald@carlsonperin.net [11221 Empire Lane, Rockville, MD 20852 (United States); Ball, Sydney J., E-mail: beckysyd@comcast.net [100 Greywood Place, Oak Ridge, TN 37830 (United States)

    2016-09-15

    The passive safety characteristics of modular high temperature gas-cooled reactors (HTGRs) are conceptually well known and are largely supported by insights from past and ongoing research. This paper offers perspectives on selected issues in areas where further analysis and testing achievable within existing research and demonstration programs could help address residual uncertainties and better support the analysis of safety performance and the regulatory assessment of defense in depth. Areas considered include the evaluation of normal and anomalous core operating conditions and the analysis of accidents involving loss of forced cooling, coolant depressurization, air ingress, moisture ingress, and reactivity events. In addition to discussing associated uncertainties and potential measures to address them, this paper also proposes supplemental “safety terrain” studies that would use realistic assessments of postulated extreme event sequences to establish a more comprehensive understanding of the inherent behaviors and ultimate safety capabilities of modular HTGRs.

  16. Is Mc Leod's Patent Pending Naturoptic Method for Restoring Healthy Vision Easy and Verifiable?

    Science.gov (United States)

    Niemi, Paul; McLeod, David; McLeod, Roger

    2006-10-01

    RDM asserts that he and people he has trained can assign visual tasks from standard vision assessment charts, or better replacements, proceeding through incremental changes and such rapid improvements that healthy vision can be restored. Mc Leod predicts that in visual tasks with pupil diameter changes, wavelengths change proportionally. A longer, quasimonochromatic wavelength interval is coincident with foveal cones, and rods. A shorter, partially overlapping interval separately aligns with extrafoveal cones. Wavelengths follow the Airy disk radius formula. Niemi can evaluate if it is true that visual health merely requires triggering and facilitating the demands of possibly overridden feedback signals. The method and process are designed so that potential Naturopathic and other select graduate students should be able to self-fund their higher-level educations from preferential franchising arrangements of earnings while they are in certain programs.
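
    For reference, the Airy disk radius formula mentioned above is the standard diffraction result for a circular aperture (quoted here as textbook background, not as the abstract's own derivation): for a pupil of diameter D at wavelength lambda,

        \theta_{\min} \approx 1.22\,\frac{\lambda}{D},
        \qquad
        r_{\min} \approx 1.22\,\frac{\lambda f}{D},

    where theta_min is the angular radius of the first dark ring, f is the focal length, and r_min the corresponding linear radius on the image plane.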

  17. Can experimental data in humans verify the finite element-based bone remodeling algorithm?

    DEFF Research Database (Denmark)

    Wong, C.; Gehrchen, P.M.; Kiaer, T.

    2008-01-01

    STUDY DESIGN: A finite element analysis-based bone remodeling study in human was conducted in the lumbar spine operated on with pedicle screws. Bone remodeling results were compared to prospective experimental bone mineral content data of patients operated on with pedicle screws. OBJECTIVE......: The validity of 2 bone remodeling algorithms was evaluated by comparing against prospective bone mineral content measurements. Also, the potential stress shielding effect was examined using the 2 bone remodeling algorithms and the experimental bone mineral data. SUMMARY OF BACKGROUND DATA: In previous studies...... operated on with pedicle screws between L4 and L5. The stress shielding effect was also examined. The bone remodeling results were compared with prospective bone mineral content measurements of 4 patients. They were measured after surgery, 3-, 6- and 12-months postoperatively. RESULTS: After 1 year...

  18. European wind turbine testing procedure developments. Task 1: Measurement method to verify wind turbine performance characteristics

    DEFF Research Database (Denmark)

    Hunter, R.; Friis Pedersen, Troels; Dunbabin, P.

    2001-01-01

    There is currently significant standardisation work ongoing in the context of wind farm energy yield warranty assessment and wind turbine power performance testing. A standards maintenance team is revising the current IEC (EN) 61400-12 Ed 1 standard for wind turbine power performance testing. The standard is being divided into four documents. Two of them are drafted for evaluation and verification of complete wind farms and of individual wind turbines within wind farms. This document, and the project it describes, has been designed to help provide a solid technical foundation for this revised standard. The work was wide ranging and addressed 'grey' areas of knowledge, regarding existing methodologies or to carry out basic research in support of fundamentally new procedures. The work has given rise to recommendations in all areas of the work, including site calibration procedures, nacelle...

  19. Perspectives on Understanding and Verifying the Safety Terrain of Modular High Temperature Gas-Cooled Reactors

    International Nuclear Information System (INIS)

    Carlson, Donald E.

    2014-01-01

    The inherent safety characteristics of modular high temperature gas-cooled reactors (HTGRs) are conceptually well known and are largely supported by insights from past and ongoing research. This paper offers perspectives on selected issues in areas where further analysis and testing achievable within existing research and demonstration programs could help address residual uncertainties and better support the analysis of safety performance and the regulatory assessment of defense in depth. Areas considered include the evaluation of normal and anomalous core operating conditions and the analysis of accidents involving coolant depressurization, air ingress, moisture ingress, and reactivity insertion. In addition to discussing associated uncertainties and potential measures to address them, the paper also proposes supplemental “safety terrain” studies that would use realistic assessments of postulated extreme event sequences to establish a more comprehensive understanding of the inherent behaviors and ultimate safety capabilities of modular HTGRs. (author)

  20. European wind turbine testing procedure developments. Task 1: Measurement method to verify wind turbine performance characteristics

    Energy Technology Data Exchange (ETDEWEB)

    Hunter, R.; Friis Pedersen, T.; Dunbabin, P.; Antoniou, I.; Frandsen, S.; Klug, H.; Albers, A.; Lee, W.K.

    2001-01-01

    There is currently significant standardisation work ongoing in the context of wind farm energy yield warranty assessment and wind turbine power performance testing. A standards maintenance team is revising the current IEC (EN) 61400-12 Ed 1 standard for wind turbine power performance testing. The standard is being divided into four documents. Two of them are drafted for evaluation and verification of complete wind farms and of individual wind turbines within wind farms. This document, and the project it describes, has been designed to help provide a solid technical foundation for this revised standard. The work was wide ranging and addressed 'grey' areas of knowledge, regarding existing methodologies or to carry out basic research in support of fundamentally new procedures. The work has given rise to recommendations in all areas of the work, including site calibration procedures, nacelle anemometry, multi-variate regression analysis and density normalisation. (au)

  1. Evaluation of unique identifiers used as keys to match identical publications in Pure and SciVal – a case study from health science [version 2; referees: 1 approved, 2 approved with reservations

    Directory of Open Access Journals (Sweden)

    Heidi Holst Madsen

    2016-09-01

    Full Text Available Unique identifiers (UIDs) are seen as an effective key to match identical publications across databases or to identify duplicates in a database. The objective of the present study is to investigate how well UIDs work as match keys in the integration between Pure and SciVal, based on a case with publications from the health sciences. We evaluate the matching process based on information about coverage, precision, and characteristics of publications matched versus not matched with UIDs as the match keys. We analyze this information to detect errors, if any, in the matching process. As an example we also briefly discuss how publication sets formed by using UIDs as the match keys may affect the bibliometric indicators: number of publications, number of citations, and the average number of citations per publication. The objective is addressed in a literature review and a case study. The literature review shows that only a few studies evaluate how well UIDs work as a match key. From the literature we identify four error types: duplicate digital object identifiers (DOIs), incorrect DOIs in reference lists and databases, DOIs not registered by the database where a bibliometric analysis is performed, and erroneous optical or special character recognition. The case study explores the use of UIDs in the integration between the databases Pure and SciVal. Specifically, journal publications in English are matched between the two databases. We find all error types except erroneous optical or special character recognition in our publication sets. In particular, the duplicate DOIs constitute a problem for the calculation of bibliometric indicators, as both keeping the duplicates to improve the reliability of citation counts and deleting them to improve the reliability of publication counts will distort the calculation of the average number of citations per publication. The use of UIDs as a match key in citation linking is implemented in many settings, and the availability of
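
    To make the matching process concrete, here is a small Python sketch of DOI-based matching between two publication sets, including detection of duplicate DOIs and their effect on the average number of citations per publication. The record contents and field names are illustrative assumptions; they are not the actual Pure or SciVal data structures.

        # Use normalized DOIs as match keys between two publication sets and flag
        # duplicate DOIs, which distort citations-per-publication. Illustrative data.
        from collections import Counter

        def normalize_doi(doi):
            """Lower-case and strip common resolver prefixes before comparison."""
            doi = doi.strip().lower()
            for prefix in ("https://doi.org/", "http://dx.doi.org/", "doi:"):
                if doi.startswith(prefix):
                    doi = doi[len(prefix):]
            return doi

        pure_records = [
            {"doi": "10.1000/jmb.2016.01", "title": "Paper A"},
            {"doi": "DOI:10.1000/jmb.2016.02", "title": "Paper B"},
            {"doi": "10.1000/jmb.2016.02", "title": "Paper B (duplicate record)"},
        ]
        scival_citations = {"10.1000/jmb.2016.01": 12, "10.1000/jmb.2016.02": 3}

        keys = [normalize_doi(r["doi"]) for r in pure_records]
        duplicates = [doi for doi, n in Counter(keys).items() if n > 1]
        matched = [doi for doi in keys if doi in scival_citations]

        print("Duplicate DOIs:", duplicates)
        print("Matched:", len(matched), "of", len(keys))
        # The duplicate inflates the publication count while reusing one citation
        # record, biasing the average citations per publication downward here.
        print("Average citations per matched publication:",
              sum(scival_citations[d] for d in matched) / len(matched))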

  2. 30 CFR 253.24 - When I submit audited annual financial statements to verify my net worth, what standards must...

    Science.gov (United States)

    2010-07-01

    ... statements to verify my net worth, what standards must they meet? (a) Your audited annual financial statements must be bound. (b) Your audited annual financial statements must include the unqualified opinion of an independent accountant that states: (1) The financial statements are free from material...

  3. 30 CFR 253.27 - When I submit audited annual financial statements to verify my unencumbered assets, what...

    Science.gov (United States)

    2010-07-01

    ... financial statements to verify my unencumbered assets, what standards must they meet? Any audited annual financial statements that you submit must: (a) Meet the standards in § 253.24; and (b) Include a certification by the independent accountant who audited the financial statements that states: (1) The value of...

  4. 24 CFR 5.218 - Penalties for failing to disclose and verify Social Security and Employer Identification Numbers.

    Science.gov (United States)

    2010-04-01

    ... and verify Social Security and Employer Identification Numbers. 5.218 Section 5.218 Housing and Urban... REQUIREMENTS; WAIVERS Disclosure and Verification of Social Security Numbers and Employer Identification Numbers; Procedures for Obtaining Income Information Disclosure and Verification of Social Security...

  5. 49 CFR 40.137 - On what basis does the MRO verify test results involving marijuana, cocaine, amphetamines, or PCP?

    Science.gov (United States)

    2010-10-01

    ... involving marijuana, cocaine, amphetamines, or PCP? 40.137 Section 40.137 Transportation Office of the... results involving marijuana, cocaine, amphetamines, or PCP? (a) As the MRO, you must verify a confirmed positive test result for marijuana, cocaine, amphetamines, and/or PCP unless the employee presents a...

  6. Identifiability in stochastic models

    CERN Document Server

    1992-01-01

    The problem of identifiability is basic to all statistical methods and data analysis, occurring in such diverse areas as Reliability Theory, Survival Analysis, and Econometrics, where stochastic modeling is widely used. Mathematics dealing with identifiability per se is closely related to the so-called branch of "characterization problems" in Probability Theory. This book brings together relevant material on identifiability as it occurs in these diverse fields.

  7. Verifying the geographic origin of mahogany (Swietenia macrophylla King) with DNA-fingerprints.

    Science.gov (United States)

    Degen, B; Ward, S E; Lemes, M R; Navarro, C; Cavers, S; Sebbenn, A M

    2013-01-01

    Illegal logging is one of the main causes of ongoing worldwide deforestation and needs to be eradicated. The trade in illegal timber and wood products creates market disadvantages for products from sustainable forestry. Although various measures have been established to counter illegal logging and the subsequent trade, there is a lack of practical mechanisms for identifying the origin of timber and wood products. In this study, six nuclear microsatellites were used to generate DNA fingerprints for a genetic reference database characterising the populations of origin of a large set of mahogany (Swietenia macrophylla King, Meliaceae) samples. For the database, leaves and/or cambium from 1971 mahogany trees sampled in 31 stands from Mexico to Bolivia were genotyped. A total of 145 different alleles were found, showing strong genetic differentiation (δ(Gregorious)=0.52, F(ST)=0.18, G(ST(Hedrick))=0.65) and clear correlation between genetic and spatial distances among stands (r=0.82, P<0.05). We used the genetic reference database and Bayesian assignment testing to determine the geographic origins of two sets of mahogany wood samples, based on their multilocus genotypes. In both cases the wood samples were assigned to the correct country of origin. We discuss the overall applicability of this methodology to tropical timber trading. Copyright © 2012 Elsevier Ireland Ltd. All rights reserved.
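
    As a simplified illustration of the assignment step, the Python sketch below computes, for each reference population, the log-likelihood of a multilocus genotype from that population's allele frequencies (assuming Hardy-Weinberg proportions and independent loci) and assigns the sample to the most likely origin. The allele frequencies and locus names are invented; real Bayesian assignment software also handles priors, missing data and many more loci.

        # Simplified likelihood-based assignment of a multilocus genotype to one of
        # several reference populations. Frequencies and loci are invented.
        import math

        allele_freqs = {  # population -> locus -> {allele: frequency}
            "Mexico":  {"L1": {"a": 0.7, "b": 0.3}, "L2": {"x": 0.6, "y": 0.4}},
            "Bolivia": {"L1": {"a": 0.2, "b": 0.8}, "L2": {"x": 0.1, "y": 0.9}},
        }

        def log_likelihood(genotype, freqs, floor=1e-4):
            """Sum of log genotype probabilities under Hardy-Weinberg proportions."""
            ll = 0.0
            for locus, (a1, a2) in genotype.items():
                p = freqs[locus].get(a1, floor)
                q = freqs[locus].get(a2, floor)
                ll += math.log(p * q * (2.0 if a1 != a2 else 1.0))
            return ll

        wood_sample = {"L1": ("a", "b"), "L2": ("x", "x")}
        scores = {pop: log_likelihood(wood_sample, f) for pop, f in allele_freqs.items()}
        print("Most likely origin:", max(scores, key=scores.get), scores)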

  8. Verifying elementary ITER maintenance actions with the MS2 benchmark product

    International Nuclear Information System (INIS)

    Heemskerk, C.J.M.; Elzendoorn, B.S.Q.; Magielsen, A.J.; Schropp, G.Y.R.

    2011-01-01

    A new facility has been taken into operation to investigate the influence of visual and haptic feedback on the performance of remotely executed ITER RH maintenance tasks. A reference set of representative ITER remote handling maintenance tasks was included in the master slave manipulator system (MS2) benchmark product. The benchmark product was used in task performance tests in a representative two-handed dexterous manipulation test bed at NRG. In the setup, the quality of visual feedback was varied by exchanging direct view with indirect view setups in which visual feedback is provided via video cameras. Interaction forces were measured via an integrated force sensor. The impact of feedback quality on the performance of maintenance tasks at the level of handling individual parts was measured and analysed. Remote execution of the maintenance actions took roughly 3-5 times more time than hands-on execution. Visual feedback was identified as the dominant factor, including aspects like (lack of) operator control over camera placement, pan, tilt and zoom, lack of 3D perception, image quality, and latency. Haptic feedback was found to be important, but only in specific contact transition and constrained motion tasks.

  9. 'Trust but verify'--five approaches to ensure safe medical apps.

    Science.gov (United States)

    Wicks, Paul; Chiauzzi, Emil

    2015-09-25

    Mobile health apps are health and wellness programs available on mobile devices such as smartphones or tablets. In three systematic assessments published in BMC Medicine, Huckvale and colleagues demonstrate that widely available health apps meant to help patients calculate their appropriate insulin dosage, educate themselves about asthma, or perform other important functions are methodologically weak. Insulin dose calculators lacked user input validation and made inappropriate dose recommendations, with a lack of documentation throughout. Since 2011, asthma apps have become more interactive, but have not improved in quality; peak flow calculators have the same issues as the insulin calculators. A review of the accredited National Health Service Health Apps Library found poor and inconsistent implementation of privacy and security, with 28% of apps lacking a privacy policy and one even transmitting personally identifying data the policy claimed would be anonymous. Ensuring patient safety might require a new approach, whether that be a consumer education program at one extreme or government regulation at the other. App store owners could ensure transparency of algorithms (whiteboxing), data sharing, and data quality. While a proper balance must be struck between innovation and caution, patient safety must be paramount.Please see related articles: http://dx.doi.org/10.1186/s12916-015-0444-y , http://www.biomedcentral.com/1741-7015/13/106 and http://www.biomedcentral.com/1741-7015/13/58.

  10. Identification of 3-MCPD esters to verify the adulteration of extra virgin olive oil.

    Science.gov (United States)

    Hung, Wei-Ching; Peng, Guan-Jhih; Tsai, Wen-Ju; Chang, Mei-Hua; Liao, Chia-Ding; Tseng, Su-Hsiang; Kao, Ya-Min; Wang, Der-Yuan; Cheng, Hwei-Fang

    2017-09-01

    The adulteration of olive oil is an important issue around the world. This paper reports an indirect method by which to identify 3-monochloropropane-1,2-diol (3-MCPD) esters in olive oils. Following sample preparation, the samples were spiked with 1,2-bis-palmitoyl-3-chloropropanediol standard for analysis using gas chromatograph-tandem mass spectrometry. The total recovery ranged from 102.8% to 105.5%, the coefficient of variation ranged from 1.1% to 10.1%, and the limit of quantification was 0.125 mg/kg. The content of 3-MCPD esters in samples of refined olive oil (0.97-20.53 mg/kg) exceeded those of extra virgin olive oil (non-detected to 0.24 mg/kg). These results indicate that the oil refining process increased the content of 3-MCPD esters, which means that they could be used as a target compound for the differentiation of extra virgin olive oil from refined olive oil in order to prevent adulteration.

  11. Cardiac Magnetic Resonance-Verified Myocardial Fibrosis in Chagas Disease: Clinical Correlates and Risk Stratification.

    Science.gov (United States)

    Uellendahl, Marly; Siqueira, Maria Eduarda Menezes de; Calado, Eveline Barros; Kalil-Filho, Roberto; Sobral, Dário; Ribeiro, Clébia; Oliveira, Wilson; Martins, Silvia; Narula, Jagat; Rochitte, Carlos Eduardo

    2016-11-01

    Chagas disease (CD) is an important cause of heart failure and mortality, mainly in Latin America. This study evaluated the morphological and functional characteristics of the heart, as well as the extent of myocardial fibrosis (MF), in patients with CD by cardiac magnetic resonance (CMR). The prognostic value of MF evaluated by myocardial-delayed enhancement (MDE) was compared with that of the Rassi score. The study assessed 39 patients divided into 2 groups: 28 asymptomatic patients as the indeterminate form group (IND), and symptomatic patients as the Chagas Heart Disease (CHD) group. All patients underwent CMR using the techniques of cine-MRI and MDE, and the amount of MF was compared with the Rassi score. Regarding the morphological and functional analysis, significant differences were observed between both groups (p < 0.001). There was also a strong correlation between the extent of MF and the Rassi score (r = 0.76). CMR is an important technique for evaluating patients with CD, highlighting morphological and functional differences across all clinical presentations. The strong correlation between the extent of MF detected by CMR and the Rassi score emphasizes its role in the prognostic stratification of patients with CD.

  12. Use of tiling array data and RNA secondary structure predictions to identify noncoding RNA genes

    DEFF Research Database (Denmark)

    Weile, Christian; Gardner, Paul P; Hedegaard, Mads M

    2007-01-01

    neuroblastoma cell line SK-N-AS. Using this strategy, we identify thousands of human candidate RNA genes. To further verify the expression of these genes, we focused on candidate genes that had a stable hairpin structure or a high level of covariance. Using northern blotting, we verify the expression of 2 out...

  13. Digital tomosynthesis for verifying spine position during radiotherapy: a phantom study

    International Nuclear Information System (INIS)

    Gurney-Champion, Oliver J; Dahele, Max; Slotman, Ben J; Verbakel, Wilko F A R; Mostafavi, Hassan

    2013-01-01

    Monitoring the stability of patient position is essential during high-precision radiotherapy such as spine stereotactic body radiotherapy (SBRT). We evaluated the combination of digital tomosynthesis (DTS) and triangulation for spine position detection, using non-clinical DTS software and an anthropomorphic pelvic phantom that includes a bone-like spine structure. Kilovoltage cone beam CT projection images over 2–16° gantry rotation were used to generate single-slice DTS images. Each DTS slice was registered to a digitally reconstructed DTS derived from the planning CT scan to determine 2D shifts between the actual phantom position and the treatment plan position. Two or more DTS registrations, with central axes 4–22° apart, were triangulated to determine the 3D phantom position. Using sequentially generated DTS images, the phantom position can be updated every degree with a small latency of the DTS and triangulation angle. The precision of position determination was investigated as a function of DTS and triangulation angle. To mimic the scenario of spine SBRT, the effect on the standard deviation of megavoltage radiation delivery during kV image acquisition was tested. In addition, the ability of the system to detect different types of movement was investigated for a variety of small sudden and gradual movements during kV image acquisition. (paper)
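
    To illustrate the triangulation idea, the Python sketch below recovers a 3D shift from the 2D shifts reported by two DTS registrations whose central axes are a known angle apart, using a least-squares solve. The geometry (rotation about the longitudinal axis, each registration measuring the shift projected onto the plane perpendicular to its central axis) is a simplifying assumption, not the actual software implementation.

        # Triangulate a 3D shift from 2D DTS shifts at two central-axis angles.
        # Simplified geometry; not the vendor or study implementation.
        import math
        import numpy as np

        def measurement_matrix(gantry_deg):
            """Rows: in-plane lateral and longitudinal axes for this central axis."""
            t = math.radians(gantry_deg)
            u = [math.cos(t), math.sin(t), 0.0]   # in-plane lateral direction
            v = [0.0, 0.0, 1.0]                   # longitudinal (superior-inferior)
            return np.array([u, v])

        true_shift = np.array([2.0, -1.0, 0.5])   # mm; unknown in practice
        angles = [0.0, 10.0]                      # central axes 10 degrees apart
        A = np.vstack([measurement_matrix(a) for a in angles])
        b = A @ true_shift                        # the 2D shifts each DTS would report

        estimate, *_ = np.linalg.lstsq(A, b, rcond=None)
        print("Triangulated 3D shift (mm):", np.round(estimate, 3))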

  14. A joint FED watermarking system using spatial fusion for verifying the security issues of teleradiology.

    Science.gov (United States)

    Viswanathan, P; Krishna, P Venkata

    2014-05-01

    Teleradiology allows transmission of medical images for clinical data interpretation to provide improved e-health care access, delivery, and standards. Remote transmission raises various ethical and legal issues such as image retention, fraud, privacy, and malpractice liability. A joint FED watermarking system, that is, a joint fingerprint/encryption/dual watermarking system, is proposed to address these issues. The system combines a region-based substitution dual watermarking algorithm using spatial fusion, a stream cipher algorithm using a symmetric key, and a fingerprint verification algorithm using invariants. The aim is to give access to the outcomes of medical images with confidentiality, availability, integrity, and verified origin. Watermarking, encryption, and fingerprint enrollment are conducted jointly in the protection stage, so that extraction, decryption, and verification can be applied independently. The dual watermarking system, introducing two different embedding schemes, one for patient data and the other for fingerprint features, reduces the difficulty of maintaining multiple documents such as authentication data, personnel and diagnosis data, and medical images. The spatial fusion algorithm, which determines the region of embedding using a threshold derived from the image in which the encrypted patient data are embedded, follows the exact rules of fusion, resulting in better quality than other fusion techniques. The four-step stream cipher algorithm using a symmetric key for encrypting the patient data, combined with a fingerprint verification system using algebraic invariants, improves the robustness of the medical information. The proposed scheme was evaluated for security and quality on DICOM medical images and performed well in terms of attacks, quality index, and imperceptibility.

  15. On a Test of Hypothesis to Verify the Operating Risk Due to Accountancy Errors

    Directory of Open Access Journals (Sweden)

    Paola Maddalena Chiodini

    2014-12-01

    Full Text Available According to the Statement on Auditing Standards (SAS) No. 39 (AU 350.01), audit sampling is defined as "the application of an audit procedure to less than 100% of the items within an account balance or class of transactions for the purpose of evaluating some characteristic of the balance or class". The audit proceeds in different steps: some are not susceptible to sampling procedures, while others may be carried out using sampling techniques. The auditor may also be interested in two types of accounting error: the number of incorrect records in the sample that exceed a given threshold (the natural error rate), which may be indicative of possible fraud, and the mean amount of monetary errors found in incorrect records. The aim of this study is to monitor both types of errors jointly through an appropriate system of hypotheses, with particular attention to the second type of error, which reflects the risk of failing to report errors that exceed the upper precision limits.
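
    For illustration, a minimal Python sketch of one ingredient of such a test: a one-sided exact binomial test of whether the population error rate exceeds a tolerable rate, given the number of incorrect records found in the sample. It covers only the error-rate half of the joint hypothesis discussed above, and the sample size, error count and tolerable rate are invented.

        # Exact one-sided binomial test for the audit error rate. Figures invented.
        from math import comb

        def binom_sf(k, n, p):
            """P(X >= k) for X ~ Binomial(n, p)."""
            return sum(comb(n, i) * p**i * (1 - p)**(n - i) for i in range(k, n + 1))

        n_sampled, n_errors = 200, 9
        tolerable_rate = 0.02          # H0: true error rate <= 2%

        p_value = binom_sf(n_errors, n_sampled, tolerable_rate)
        print(f"P(at least {n_errors} errors | rate = {tolerable_rate}) = {p_value:.4f}")
        # A small p-value suggests the error rate exceeds the tolerable threshold,
        # prompting extended testing of the account balance or class of transactions.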

  16. Reducing and verifying haloacetic acids in treated drinking water using a biological filter system.

    Science.gov (United States)

    Lou, Jie C; Chan, Hung Y; Yang, Chih Y; Tseng, Wei B; Han, Jia Y

    2014-01-01

    This study focused on reducing the haloacetic acid (HAA) concentrations in treated drinking water. HAA has been thought to be one possible nutrient supporting heterotrophic bacteria regrowth in drinking water. In this study, experiments were conducted using a pilot-scale system to evaluate the efficiency of biological filters (BF) for reducing excess HAA concentrations in water. The BF system reduced the total HAA concentration and the concentrations of five HAA species in the water. Dichloroacetic acid (DCAA), monobromoacetic acid (MBAA) and dibromoacetic acid (DBAA) were the three main HAA5 species that were present in the treated drinking water in this investigation. Combined, these three species represent approximately 77% of the HAA5 in the finished water after BF. The verification of the empirical HAA equation for the outlet in the BF system indicated linear relationships with high correlation coefficients. The empirical equation for the HAA5 concentrations in the finished water was established by examining other nutrients (e.g., dissolved organic carbon (DOC), ultraviolet absorbance at 254 nm wavelength (UV254), and ammonia nitrogen) that can reduce pathogenic contamination. These findings may be useful for designing advanced processes for conventional water treatment plants or for managing water treatment and distribution systems for providing high-quality drinking water.
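
    As a rough illustration of the kind of empirical relationship verified in the study, the Python sketch below fits a linear equation predicting finished-water HAA5 from DOC, UV254 and ammonia nitrogen and reports the correlation between fitted and observed values. The data points are invented and the chosen predictors are assumptions based on the nutrients named above, not the study's actual equation.

        # Fit an illustrative linear HAA5 equation from surrogate water-quality
        # parameters. All data points are invented.
        import numpy as np

        # columns: DOC (mg/L), UV254 (1/cm), NH3-N (mg/L); target: HAA5 (ug/L)
        X = np.array([[1.8, 0.030, 0.05],
                      [2.2, 0.041, 0.07],
                      [2.9, 0.055, 0.06],
                      [3.5, 0.068, 0.09],
                      [4.1, 0.080, 0.11]])
        y = np.array([18.0, 24.0, 31.0, 40.0, 47.0])

        A = np.column_stack([X, np.ones(len(X))])       # add an intercept term
        coeffs, *_ = np.linalg.lstsq(A, y, rcond=None)
        fitted = A @ coeffs
        r = np.corrcoef(fitted, y)[0, 1]
        print("Coefficients (DOC, UV254, NH3-N, intercept):", np.round(coeffs, 2))
        print("Correlation between fitted and observed HAA5:", round(r, 3))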

  17. Field Observations of Precursors to Large Earthquakes: Interpreting and Verifying Their Causes

    Science.gov (United States)

    Suyehiro, K.; Sacks, S. I.; Rydelek, P. A.; Smith, D. E.; Takanami, T.

    2017-12-01

    Many reports of precursory anomalies before large earthquakes exist. However, it has proven elusive to even identify these signals before their actual occurrence; they often become evident only in retrospect. A probabilistic cellular automaton model (Sacks and Rydelek, 1995) explains many of the statistical and dynamic features of earthquakes, including the observed b-value decrease towards a large earthquake and the effect of a small stress perturbation on the earthquake occurrence pattern. It also reproduces the dynamic character of each earthquake rupture. The model is useful for gaining insight into the causal relationships behind such complexities. For example, some reported cases of background seismicity quiescence before a main shock, seen only for events larger than about M = 3-4 on a timescale of years, can be reproduced by this model if only a small fraction (about 2%) of the component cells are strengthened by a small amount. Such an enhancement may physically occur if a tiny and scattered portion of the seismogenic crust undergoes dilatancy hardening. Whether such a process occurs depends on fluid migration and microcrack development under tectonic loading. Eventual large earthquake faulting will be promoted by the intrusion of excess water from surrounding rocks into a zone capable of cascading slip over a large area. We propose that this process manifests itself at the surface as hydrologic, geochemical, or macroscopic anomalies, for which many reports exist. We infer from seismicity that the eastern Nankai Trough (Tokai) area of central Japan is already in the stage of M-dependent seismic quiescence. Therefore, we advocate that new observations sensitive to detecting water migration should be implemented in Tokai. In particular, vertical-component strain, gravity, and/or electrical conductivity should be observed for verification.
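
    For reference, the b-value mentioned above is the slope of the Gutenberg-Richter frequency-magnitude relation; a standard way to estimate it from a catalogue complete above magnitude M_c is Aki's maximum-likelihood formula (quoted as textbook background, not from the poster itself):

        \log_{10} N(\ge M) = a - bM,
        \qquad
        \hat{b} = \frac{\log_{10} e}{\bar{M} - M_c},

    where N(>= M) is the number of events with magnitude at least M and M-bar is the mean magnitude of events above the completeness threshold M_c.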

  18. The ability to identify the intraparotid facial nerve for locating parotid gland lesions in comparison to other indirect landmark methods: evaluation by 3.0 T MR imaging with surface coils

    Energy Technology Data Exchange (ETDEWEB)

    Ishibashi, Mana; Fujii, Shinya; Nishihara, Keisuke; Matsusue, Eiji; Kodani, Kazuhiko; Kaminou, Toshio; Ogawa, Toshihide [Tottori University, Division of Radiology, Department of Pathophysiological and Therapeutic Science, Faculty of Medicine, Tottori (Japan); Kawamoto, Katsuyuki [Tottori University, Division of Otolaryngology, Head and Neck Surgery, Department of Medicine of Sensory and Motor Organs, Faculty of Medicine, Tottori (Japan)

    2010-11-15

    It is important to know whether a parotid gland lesion is in the superficial or deep lobe for preoperative planning. We aimed to investigate the ability of 3.0 T magnetic resonance (MR) imaging with surface coils to identify the intraparotid facial nerve and locate parotid gland lesions, in comparison to other indirect landmark methods. We retrospectively evaluated 50 consecutive patients with primary parotid gland lesions. The position of the facial nerve was determined by tracing the nerve in the stylomastoid foramen and then following it on sequential MR sections through the parotid gland. The retromandibular vein and the facial nerve line (FN line) were also identified. For each radiologist and each method, we determined the diagnostic ability for deep lobe lesions and superficial lobe lesions, as well as accuracy. These abilities were compared among the three methods using the Chi-square test with Yates' correction. Mean diagnostic ability for deep lobe lesions, the diagnostic ability for superficial lobe lesions, and accuracy were 92%, 86%, 87%, respectively, for the direct identification method; 67%, 89%, 86%, respectively, for the retromandibular vein method; and 25%, 99%, 90%, respectively, for the FN line method. The direct identification method had significantly higher diagnostic ability for deep lesions than the FN line method (P < 0.01), but significantly lower diagnostic ability for superficial lobe lesions than the FN line method (P < 0.01). Direct identification of the intraparotid facial nerve enables parotid gland lesions to be correctly located, particularly those in the deep lobes. (orig.)

  19. Potential benefits of dosimetric VMAT tracking verified with 3D film measurements

    Energy Technology Data Exchange (ETDEWEB)

    Crijns, Wouter, E-mail: wouter.crijns@uzleuven.be; Depuydt, Tom; Haustermans, Karin [Laboratory of Experimental Radiotherapy, KU Leuven Department of Oncology, Herestraat 49, 3000 Leuven (Belgium); Radiation Oncology, University Hospitals Leuven, Herestraat 49, 3000 Leuven (Belgium); Defraene, Gilles [Laboratory of Experimental Radiotherapy, KU Leuven Department of Oncology, Herestraat 49, 3000 Leuven, Belgium and KU Leuven Medical Imaging Research Center, Herestraat 49, 3000 Leuven (Belgium); Van Herck, Hans [KU Leuven Medical Imaging Research Center, Herestraat 49, 3000 Leuven, Belgium and KU Leuven Department of Electrical Engineering (ESAT)–PSI, Center for Processing Speech and Images, 3000 Leuven (Belgium); Maes, Frederik [KU Leuven Medical Imaging Research Center, Herestraat 49, 3000 Leuven (Belgium); KU Leuven Department of Electrical Engineering (ESAT)–PSI, Center for Processing Speech and Images, 3000 Leuven (Belgium); Medical IT Department, KU Leuven iMinds, 3000 Leuven (Belgium); Van den Heuvel, Frank [Department of Oncology, MRC-CR-UK Gray Institute of Radiation Oncology and Biology, University of Oxford, Oxford OX1 2JD (United Kingdom)

    2016-05-15

    Purpose: To evaluate three different plan adaptation strategies using 3D film-stack dose measurements of both focal boost and hypofractionated prostate VMAT treatments. The adaptation strategies (a couch shift, geometric tracking, and dosimetric tracking) were applied for three realistic intrafraction prostate motions. Methods: A focal boost (35 × 2.2 and 35 × 2.7 Gy) and a hypofractionated (5 × 7.25 Gy) prostate VMAT plan were created for a heterogeneous phantom that allows for internal prostate motion. For these plans, geometric tracking and dosimetric tracking were evaluated by ionization chamber (IC) point dose measurements (zero-D) and measurements using a stack of EBT3 films (3D). The geometric tracking applied translations, rotations, and scaling of the MLC aperture in response to realistic prostate motions. The dosimetric tracking additionally corrected the monitor units to resolve variations due to differences in depth, tissue heterogeneity, and MLC aperture. The tracking was based on the positions of four fiducial points only. The film measurements were compared to the gold standard (i.e., IC measurements) and the planned dose distribution. Additionally, the 3D measurements were converted to dose volume histograms, tumor control probability, and normal tissue complication probability parameters (DVH/TCP/NTCP) as a direct estimate of the clinical relevance of the proposed tracking. Results: Compared to the planned dose distribution, measurements without prostate motion and tracking already showed reduced homogeneity of the dose distribution. Adding prostate motion further blurs the DVHs for all treatment approaches. The clinical practice (no tracking) delivered the dose distribution inside the PTV but off target (CTV), resulting in boost dose errors up to 10%. The geometric and dosimetric tracking corrected the dose distribution's position. Moreover, the dosimetric tracking could achieve the planned boost DVH, but not the DVH of the more homogeneously
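
    For illustration, the Python sketch below estimates a translation, rotation and isotropic scale mapping planned fiducial positions to their measured positions (a Procrustes-style least-squares fit), which is one plausible way such an MLC-aperture translation/rotation/scaling correction could be driven from four fiducial points. It is a simplified assumption, not the authors' implementation.

        # Procrustes-style fit of scale, rotation and translation from four
        # fiducial points: measured ~= s * R @ planned + t. Illustrative only.
        import numpy as np

        def similarity_fit(planned, measured):
            """Least-squares (scale, rotation, translation) mapping planned to measured."""
            p_mean, m_mean = planned.mean(axis=0), measured.mean(axis=0)
            P, M = planned - p_mean, measured - m_mean
            U, S, Vt = np.linalg.svd(M.T @ P)
            d = np.ones(len(S))
            if np.linalg.det(U @ Vt) < 0:          # guard against an improper rotation
                d[-1] = -1.0
            R = U @ np.diag(d) @ Vt
            scale = (S * d).sum() / (P ** 2).sum()
            t = m_mean - scale * R @ p_mean
            return scale, R, t

        theta = np.radians(3.0)                    # simulated intrafraction rotation
        R_true = np.array([[np.cos(theta), -np.sin(theta), 0.0],
                           [np.sin(theta),  np.cos(theta), 0.0],
                           [0.0, 0.0, 1.0]])
        planned = np.array([[0, 0, 0], [10, 0, 0], [0, 10, 0], [0, 0, 10]], float)
        measured = 1.02 * planned @ R_true.T + np.array([2.0, -1.0, 0.5])

        s, R, t = similarity_fit(planned, measured)
        print("scale:", round(s, 3), "translation (mm):", np.round(t, 2))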

  20. Novel penalised likelihood reconstruction of PET in the assessment of histologically verified small pulmonary nodules

    International Nuclea