WorldWideScience

Sample records for order test verified

  1. A CAREM fuel assembly prototype construction in order to verify its mechanical design using hydrodynamic testing

    International Nuclear Information System (INIS)

    Aparicio, Gaspar; Di Marco, Agustin; Falcone, Jose M.; Giorgis, Miguel A.; Mathot, Sergio R.; Migliori, Julio; Orlando, Oscar S.; Restelli, Miguel A.; Ruggirello, Gabriel; Sapia, Gustavo C.; Zinzallari, Fausto; Bianchi, Daniel R.; Volpi, Ricardo M.

    2000-01-01

    The scope of this paper is to describe the activities of several Groups from three Atomic Centers (C. A. Bariloche, C. A. Ezeiza and C. A. Constituyentes), involved in the manufacturing of a CAREM fuel assembly prototype. The Design Group (UAIN-CAB) carried out the fuel assembly engineering. Cladding components were constructed by the Special Alloys Pilot Factory (UAMCN-CAE). Engineering Group (UACN-CAC) manufactured the parts to be processed, resorting to qualified suppliers. Elastic spacers were completely designed and constructed by this Group, and fuel rods, control rods, guide tubes and spacers were also welded here. Research Reactors Fuels Group (UACN-CAC) carried out the dimensional control of the elaborated parts, while Postirradiation Testing Group (UACN-CAC) performed the assembling of the fuel element. This paper also refers to the design and development of special equipment and devices, all of them required for the prototype construction. (author)

  2. An alternative test for verifying electronic balance linearity

    International Nuclear Information System (INIS)

    Thomas, I.R.

    1998-02-01

This paper presents an alternative method for verifying electronic balance linearity and accuracy. This method is being developed for safeguards weighings (weighings for the control and accountability of nuclear material) at the Idaho National Engineering and Environmental Laboratory (INEEL). With regard to balance linearity and accuracy, DOE Order 5633.3B, Control and Accountability of Nuclear Materials, Paragraph 2, 4, e, (1), (a), Scales and Balances Program, states: ''All scales and balances used for accountability purposes shall be maintained in good working condition, recalibrated according to an established schedule, and checked for accuracy and linearity on each day that the scale or balance is used for accountability purposes.'' Various tests have been proposed for testing accuracy and linearity. At the 1991 Measurement Science Conference, Dr. Walter E. Kupper presented a paper entitled ''Validation of High Accuracy Weighing Equipment.'' Dr. Kupper emphasized that tolerance checks for calibrated, state-of-the-art electronic equipment need not be complicated, and he presented four easy steps for verifying that a calibrated balance is operating correctly. These tests evaluate the standard deviation of successive weighings (of the same load), the off-center error, the calibration error, and the error due to nonlinearity. This method of balance validation is undoubtedly an authoritative means of ensuring balance operability, yet it has two possible drawbacks: first, the test for linearity is not intuitively obvious, especially from a statistical viewpoint; and second, there is an absence of definitively defined testing limits. Hence, this paper describes an alternative means of verifying electronic balance linearity and accuracy that is being developed for safeguards measurements at the INEEL.
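The four checks the abstract attributes to Kupper can be sketched in a few lines of Python. All readings and tolerance limits below are hypothetical placeholders; in practice the limits would come from the balance's specifications and the applicable calibration procedure:

```python
import statistics

# Hypothetical repeated weighings of the same 100 g standard (grams).
readings = [100.0002, 100.0001, 100.0003, 100.0002, 100.0001,
            100.0002, 100.0003, 100.0002, 100.0001, 100.0002]

# 1. Repeatability: standard deviation of successive weighings.
repeatability = statistics.stdev(readings)

# 2. Off-center error: spread of readings with the load placed at the
#    center and at the four corners of the pan (hypothetical values).
corner_readings = [100.0002, 100.0004, 100.0001, 100.0003, 100.0002]
off_center_error = max(corner_readings) - min(corner_readings)

# 3. Calibration error: mean reading minus the standard's certified mass.
certified_mass = 100.0000
calibration_error = statistics.mean(readings) - certified_mass

# 4. Nonlinearity: at half load, compare the reading against the straight
#    line through the zero and full-load readings.
zero_reading, half_reading, full_reading = 0.0001, 50.0004, 100.0002
expected_half = (zero_reading + full_reading) / 2
nonlinearity = half_reading - expected_half

# Hypothetical tolerance of 0.5 mg applied to every check.
for name, value in [("repeatability", repeatability),
                    ("off-center", off_center_error),
                    ("calibration", calibration_error),
                    ("nonlinearity", nonlinearity)]:
    print(f"{name}: {value:.6f} g -> {'PASS' if abs(value) <= 0.0005 else 'FAIL'}")
```

Each check reduces to a single arithmetic comparison against a tolerance, which illustrates the abstract's point that tolerance checking of a calibrated balance need not be complicated.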

  3. Large test rigs verify Clinch River control rod reliability

    International Nuclear Information System (INIS)

    Michael, H.D.; Smith, G.G.

    1983-01-01

    The purpose of the Clinch River control test programme was to use multiple full-scale prototypic control rod systems for verifying the system's ability to perform reliably during simulated reactor power control and emergency shutdown operations. Two major facilities, the Shutdown Control Rod and Maintenance (Scram) facility and the Dynamic and Seismic Test (Dast) facility, were constructed. The test programme of each facility is described. (UK)

  4. Elements of a system for verifying a Comprehensive Test Ban

    International Nuclear Information System (INIS)

    Hannon, W.J.

    1987-01-01

The paper discusses the goals of a monitoring system for a CTB, its functions, the challenges to verification, discrimination techniques, and some recent developments. It is concluded that technical, military and political efforts are required to establish and verify test ban treaties which will contribute to stability in the long term. It currently appears that there will be a significant number of unidentified events.

  5. Verifying atom entanglement schemes by testing Bell's inequality

    International Nuclear Information System (INIS)

    Angelakis, D.G.; Knight, P.L.; Tregenna, B.; Munro, W.J.

    2001-01-01

Recent experiments testing Bell's inequality with entangled photons and ions have aimed at tests of basic quantum mechanical principles. Interesting results have been obtained and many loopholes could be closed. In this paper we point out that tests of Bell's inequality also play an important role in verifying atom entanglement schemes. We describe as examples a scheme to prepare arbitrary entangled states of N two-level atoms using a leaky optical cavity, and a scheme to entangle atoms inside a photonic crystal. During the state preparation no photons are emitted, and observing a violation of Bell's inequality is the only way to test whether a scheme works with high precision. (orig.)

  6. Verifiable Measurement-Only Blind Quantum Computing with Stabilizer Testing.

    Science.gov (United States)

    Hayashi, Masahito; Morimae, Tomoyuki

    2015-11-27

    We introduce a simple protocol for verifiable measurement-only blind quantum computing. Alice, a client, can perform only single-qubit measurements, whereas Bob, a server, can generate and store entangled many-qubit states. Bob generates copies of a graph state, which is a universal resource state for measurement-based quantum computing, and sends Alice each qubit of them one by one. Alice adaptively measures each qubit according to her program. If Bob is honest, he generates the correct graph state, and, therefore, Alice can obtain the correct computation result. Regarding the security, whatever Bob does, Bob cannot get any information about Alice's computation because of the no-signaling principle. Furthermore, malicious Bob does not necessarily send the copies of the correct graph state, but Alice can check the correctness of Bob's state by directly verifying the stabilizers of some copies.

  7. Various verifying tests using full size partial models of PCCV

    International Nuclear Information System (INIS)

    Nagata, Kaoru; Fukihara, Masaaki; Takemoto, Yasushi.

    1987-01-01

The prestressed concrete containment vessel (PCCV) for the Tsuruga No. 2 plant of the Japan Atomic Power Co. was the first adopted in Japan, and the need for experimental verification of a number of items in its design and construction was pointed out. This report describes the various tests carried out with full size models. The tendon system adopted for this PCCV is the BBRV type, in which PC wires are bundled in parallel to make cables, and it involves many matters without precedent in Japan: the stretching capacity is as large as the 1000 t class, the longest cable is 160 m, and it is an unbonded system with rust-inhibitor injection. Testing was required to confirm the propriety of the small coefficient of friction assumed at the time of stretching the tendons. For the tests, the materials, equipment and their sizes were all prepared as for the actual works, so the test works became a rehearsal of the actual prestressing works. In addition, by utilizing these full size test beds, the workability test on concrete at the time of construction, the confirmation test on tendon strength and on the safety of concrete at the fixing part at the time of the friction test, and thereafter the greasing test, the simulation test of in-service inspection, and the thermal loading test on liners were carried out. The results of these tests are briefly reported. (Kako, I.)

  8. 77 FR 70484 - Preoperational Testing of Onsite Electric Power Systems To Verify Proper Load Group Assignments...

    Science.gov (United States)

    2012-11-26

    ...-1294, ``Preoperational Testing of On-Site Electric Power Systems to Verify Proper Load Group... entitled ``Preoperational Testing of On- Site Electric Power Systems to Verify Proper Load Group... Electric Power Systems to Verify Proper Load Group Assignments, Electrical Separation, and Redundancy...

  9. Testing hypotheses in order

    OpenAIRE

    Paul R. Rosenbaum

    2008-01-01

    In certain circumstances, one wishes to test one hypothesis only if certain other hypotheses have been rejected. This ordering of hypotheses simplifies the task of controlling the probability of rejecting any true hypothesis. In an example from an observational study, a treated group is shown to be further from both of two control groups than the two control groups are from each other. Copyright 2008, Oxford University Press.
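The ordered-testing idea the abstract describes, namely testing each hypothesis only after all earlier hypotheses in a pre-specified order have been rejected, can be sketched as follows (the p-values are illustrative only, not from the paper's observational study):

```python
def fixed_sequence_test(p_values, alpha=0.05):
    """Test hypotheses in a pre-specified order, each at full level alpha.
    Stop at the first non-rejection; later hypotheses are not tested,
    which is what keeps the familywise error rate controlled."""
    decisions = []
    testing = True
    for p in p_values:
        if testing and p <= alpha:
            decisions.append("rejected")
        elif testing:
            decisions.append("not rejected")
            testing = False
        else:
            decisions.append("not tested")
    return decisions

# Illustrative p-values for three hypotheses in their agreed order.
print(fixed_sequence_test([0.001, 0.030, 0.200]))
# -> ['rejected', 'rejected', 'not rejected']
```

Because a hypothesis is only reached after every earlier true-null "gate" has been passed, no multiplicity adjustment of alpha is needed within the sequence.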

  10. How to verify lightning protection efficiency for electrical systems? Testing procedures and practical applications

    Energy Technology Data Exchange (ETDEWEB)

    Birkl, Josef; Zahlmann, Peter [DEHN and SOEHNE, Neumarkt (Germany)], Emails: Josef.Birkl@technik.dehn.de, Peter.Zahlmann@technik.dehn.de

    2007-07-01

In an increasing number of applications, Surge Protective Devices (SPDs) through which partial lightning currents flow are installed close to the highly sensitive electronic devices they protect, because the design of electric distribution systems and switchgear installations is becoming more and more compact. In these cases, the protective function of the SPDs has to be coordinated with the individual immunity of the equipment against energetic conducted impulse voltages and impulse currents. To verify the immunity of the complete system against partial lightning currents, laboratory tests on the system level are a suitable approach. The proposed test schemes for complete systems have been successfully performed on various applications, and examples will be presented. (author)

  11. Verifying object-oriented programs with higher-order separation logic in Coq

    DEFF Research Database (Denmark)

    Bengtson, Jesper; Jensen, Jonas Braband; Sieczkowski, Filip

    2011-01-01

    We present a shallow Coq embedding of a higher-order separation logic with nested triples for an object-oriented programming language. Moreover, we develop novel specification and proof patterns for reasoning in higher-order separation logic with nested triples about programs that use interfaces...... and interface inheritance. In particular, we show how to use the higher-order features of the Coq formalisation to specify and reason modularly about programs that (1) depend on some unknown code satisfying a specification or that (2) return objects conforming to a certain specification. All of our results have...

  12. Specifying and Verifying Organizational Security Properties in First-Order Logic

    Science.gov (United States)

    Brandt, Christoph; Otten, Jens; Kreitz, Christoph; Bibel, Wolfgang

In certain critical cases the data flow between business departments in banking organizations has to respect security policies known as Chinese Wall or Bell-La Padula. We show that these policies can be represented by formal requirements and constraints in first-order logic. By additionally providing a formal model for the flow of data between business departments, we demonstrate how security policies can be applied to a concrete organizational setting and checked with a first-order theorem prover. Our approach can be applied without deep formal expertise and therefore promises a high potential of usability in business.
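The Chinese Wall rule mentioned above, under which a subject may not access data from two competing institutions, can be illustrated procedurally. This is not the paper's first-order formalisation, just a small executable sketch of the access rule it encodes, with hypothetical conflict-of-interest classes:

```python
# Hypothetical conflict-of-interest classes: each set groups the
# datasets of directly competing institutions.
CONFLICT_CLASSES = [{"bank_a", "bank_b"}, {"fund_x", "fund_y"}]

def may_access(history, dataset):
    """Chinese Wall rule: access is allowed unless the subject has
    already accessed a *different* dataset in the same conflict class."""
    for cls in CONFLICT_CLASSES:
        if dataset in cls and any(d in cls and d != dataset for d in history):
            return False
    return True

history = {"bank_a"}                     # datasets accessed so far
print(may_access(history, "bank_b"))     # competitor of bank_a -> False
print(may_access(history, "fund_x"))     # different conflict class -> True
print(may_access(history, "bank_a"))     # re-accessing the same data -> True
```

In the paper's setting the same constraint would be stated as a first-order formula and discharged by a theorem prover rather than evaluated imperatively.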

  13. Programming and Verifying a Declarative First-Order Prover in Isabelle/HOL

    DEFF Research Database (Denmark)

    Jensen, Alexander Birch; Larsen, John Bruntse; Schlichtkrull, Anders

    2018-01-01

    We certify in the proof assistant Isabelle/HOL the soundness of a declarative first-order prover with equality. The LCF-style prover is a translation we have made, to Standard ML, of a prover in John Harrison’s Handbook of Practical Logic and Automated Reasoning. We certify it by replacing its ke......’s ML environment as an interactive application or can be used standalone in OCaml or Standard ML (or in other functional programming languages like Haskell and Scala with some additional work)....

  14. Eddy-Current Testing of Welded Stainless Steel Storage Containers to Verify Integrity and Identity

    International Nuclear Information System (INIS)

    Tolk, Keith M.; Stoker, Gerald C.

    1999-01-01

    An eddy-current scanning system is being developed to allow the International Atomic Energy Agency (IAEA) to verify the integrity of nuclear material storage containers. Such a system is necessary to detect attempts to remove material from the containers in facilities where continuous surveillance of the containers is not practical. Initial tests have shown that the eddy-current system is also capable of verifying the identity of each container using the electromagnetic signature of its welds. The DOE-3013 containers proposed for use in some US facilities are made of an austenitic stainless steel alloy, which is nonmagnetic in its normal condition. When the material is cold worked by forming or by local stresses experienced in welding, it loses its austenitic grain structure and its magnetic permeability increases. This change in magnetic permeability can be measured using an eddy-current probe specifically designed for this purpose. Initial tests have shown that variations of magnetic permeability and material conductivity in and around welds can be detected, and form a pattern unique to the container. The changes in conductivity that are present around a mechanically inserted plug can also be detected. Further development of the system is currently underway to adapt the system to verifying the integrity and identity of sealable, tamper-indicating enclosures designed to prevent unauthorized access to measurement equipment used to verify international agreements

  15. Verifying seismic design of nuclear reactors by testing. Volume 1: test plan

    Energy Technology Data Exchange (ETDEWEB)

    1979-07-20

    This document sets forth recommendations for a verification program to test the ability of operational nuclear power plants to achieve safe shutdown immediately following a safe-shutdown earthquake. The purpose of the study is to develop a program plan to provide assurance by physical demonstration that nuclear power plants are earthquake resistant and to allow nuclear power plant operators to (1) decide whether tests should be conducted on their facilities, (2) specify the tests that should be performed, and (3) estimate the cost of the effort to complete the recommended test program.

  16. Verifying seismic design of nuclear reactors by testing. Volume 1: test plan

    International Nuclear Information System (INIS)

    1979-01-01

    This document sets forth recommendations for a verification program to test the ability of operational nuclear power plants to achieve safe shutdown immediately following a safe-shutdown earthquake. The purpose of the study is to develop a program plan to provide assurance by physical demonstration that nuclear power plants are earthquake resistant and to allow nuclear power plant operators to (1) decide whether tests should be conducted on their facilities, (2) specify the tests that should be performed, and (3) estimate the cost of the effort to complete the recommended test program

  17. Preliminary results of an attempt to provide soil moisture datasets in order to verify numerical weather prediction models

    International Nuclear Information System (INIS)

    Cassardo, C.; Loglisci, N.

    2005-01-01

In recent years, there has been significant growth in the recognition of the importance of soil moisture in large-scale hydrology and climate modelling. Soil moisture is a lower boundary condition, which rules the partitioning of energy into sensible and latent heat flux. Wrong estimations of soil moisture lead to wrong simulations of the surface layer evolution, and hence precipitation and cloud cover forecasts could be affected. This is true for large-scale medium-range weather forecasts as well as for local-scale short-range weather forecasts, particularly in situations in which local convection is well developed. Unfortunately, despite the importance of this physical parameter, there are only a few soil moisture datasets, sparse in time and space, around the world. Owing to this scarcity of soil moisture observations, we developed an alternative method to provide soil moisture datasets in order to verify numerical weather prediction models. This paper presents the preliminary results of an attempt to verify soil moisture fields predicted by a mesoscale model. The data for the comparison were provided by simulations of the diagnostic land surface scheme LSPM (Land Surface Process Model), widely used at the Piedmont Regional Weather Service for agro-meteorological purposes. To this end, LSPM was initialized and driven by Synop observations, while the surface (vegetation and soil) parameter values were initialized from the ECOCLIMAP global dataset at 1 km² resolution.

  18. Risks of Using Bedside Tests to Verify Nasogastric Tube Position in Adult Patients

    Directory of Open Access Journals (Sweden)

    Melody Ni

    2014-12-01

Nasogastric (NG) tubes are commonly used for enteral feeding. Complications of feeding tube misplacement include malnutrition, pulmonary aspiration, and even death. We built a Bayesian network (BN) to analyse the risks associated with available bedside tests to verify tube position. Evidence on test validity (sensitivity and specificity) was retrieved from a systematic review. Likelihood ratios were used to select the best tests for detecting tubes misplaced in the lung or oesophagus. Five bedside tests were analysed: magnetic guidance, aspirate pH, auscultation, aspirate appearance, and capnography/colourimetry. Among these, auscultation and appearance are non-diagnostic towards lung or oesophagus placements. Capnography/colourimetry can confirm but cannot rule out lung placement. Magnetic guidance can rule out both lung and oesophageal placement; however, as a relatively new technology, further validation studies are needed. The pH test with a cut-off at 5.5 or lower can rule out lung intubation. Lowering the cut-off to 4 not only minimises oesophageal intubation but also provides extra safety, as the sensitivity of pH measurement is reduced by feeding, antacid medication, or the use of less accurate pH paper. The BN is an effective tool for representing and analysing multi-layered uncertainties in test validity and reliability for the verification of NG tube position. Aspirate pH with a cut-off of 4 is the safest bedside method to minimise lung and oesophageal misplacement.
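The way likelihood ratios let a bedside test "confirm" or "rule out" a misplacement can be shown with a small Bayesian update. The prior, sensitivity and specificity below are hypothetical placeholders, not the values from the systematic review:

```python
def post_test_probability(prior, sensitivity, specificity, positive):
    """Update the probability of misplacement after a bedside test,
    using the positive or negative likelihood ratio on the odds scale."""
    lr = (sensitivity / (1 - specificity) if positive
          else (1 - sensitivity) / specificity)
    prior_odds = prior / (1 - prior)
    post_odds = prior_odds * lr
    return post_odds / (1 + post_odds)

# Hypothetical: 2% prior probability of lung placement, and a test with
# 95% sensitivity and 90% specificity for detecting lung placement.
prior = 0.02
p_pos = post_test_probability(prior, 0.95, 0.90, positive=True)
p_neg = post_test_probability(prior, 0.95, 0.90, positive=False)
print(f"after a positive result: {p_pos:.3f}")   # risk rises well above prior
print(f"after a negative result: {p_neg:.4f}")   # risk falls below prior
```

A test "rules out" misplacement when its negative likelihood ratio drives the post-test probability close to zero; the paper's BN composes such updates across several tests and layers of uncertainty.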

  19. Verifying Digital Components of Physical Systems: Experimental Evaluation of Test Quality

    Science.gov (United States)

    Laputenko, A. V.; López, J. E.; Yevtushenko, N. V.

    2018-03-01

This paper continues the study of high-quality test derivation for verifying digital components used in various physical systems, such as sensors and data transfer components. For the experimental evaluation we have used logic circuits b01–b10 of the ITC'99 benchmark package (Second Release), which, as stated before, describe digital components of physical systems designed for various applications. Test sequences are derived for detecting the best-known faults of the reference logic circuit using three different approaches to test derivation. Three widely used fault types, namely stuck-at faults, bridges, and faults which slightly modify the behavior of one gate, are considered as possible faults of the reference behavior. The most interesting test sequences are short test sequences that can provide appropriate guarantees after testing, and thus we experimentally study various approaches to the derivation of so-called complete test suites, which detect all fault types. In the first series of experiments, we compare two approaches for deriving complete test suites. In the first approach, a shortest test sequence is derived for testing each fault. In the second approach, a test sequence is pseudo-randomly generated using appropriate software for logic synthesis and verification (the ABC system in our study) and thus can be longer. However, after deleting sequences that detect the same set of faults, the test suite returned by the second approach is shorter. The latter underlines the fact that in many cases it is useless to spend time and effort deriving a shortest distinguishing sequence; it is better to minimize the test suite afterwards. The experiments also show that using only randomly generated test sequences is not very efficient, since such sequences do not detect all the faults of any type: after reaching a fault coverage of around 70%, saturation is observed and the fault coverage cannot be increased anymore.
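The minimization step mentioned above, deleting test sequences that detect no faults beyond those already covered, can be sketched as a simple greedy pass. The sequence names and fault sets are hypothetical, and greedy selection does not guarantee a minimum-size suite, only a redundancy-free one:

```python
def minimize_suite(suite):
    """Greedily keep only test sequences whose detected-fault set adds
    at least one fault not already covered; try sequences that detect
    many faults first so redundant subsets are dropped."""
    kept, covered = [], set()
    for name, faults in sorted(suite.items(), key=lambda kv: -len(kv[1])):
        if not faults <= covered:     # contributes at least one new fault
            kept.append(name)
            covered |= faults
    return kept, covered

# Hypothetical sequences and the faults each one detects.
suite = {
    "t1": {"f1", "f2", "f3"},
    "t2": {"f2"},                # subset of t1 -> dropped
    "t3": {"f3", "f4"},
    "t4": {"f1", "f4"},          # already covered by t1 and t3 -> dropped
}
kept, covered = minimize_suite(suite)
print(kept, covered)             # two sequences cover all four faults
```

This mirrors the paper's observation: generating longer pseudo-random sequences and minimizing afterwards can beat deriving a shortest sequence per fault up front.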

  20. 49 CFR 40.137 - On what basis does the MRO verify test results involving marijuana, cocaine, amphetamines, or PCP?

    Science.gov (United States)

    2010-10-01

    ... involving marijuana, cocaine, amphetamines, or PCP? 40.137 Section 40.137 Transportation Office of the... results involving marijuana, cocaine, amphetamines, or PCP? (a) As the MRO, you must verify a confirmed positive test result for marijuana, cocaine, amphetamines, and/or PCP unless the employee presents a...

  1. Verifying the functional ability of microstructured surfaces by model-based testing

    Science.gov (United States)

    Hartmann, Wito; Weckenmann, Albert

    2014-09-01

Micro- and nanotechnology enables new product features such as improved light absorption, self-cleaning or protection, which are based, on the one hand, on the size of functional nanostructures and, on the other hand, on material-specific properties. With the need to reliably measure progressively smaller geometric features, coordinate and surface-measuring instruments have been refined and now allow high-resolution topography and structure measurements down to the sub-nanometre range. Nevertheless, in many cases it is not possible to make a clear statement about the functional ability of the workpiece or its topography, because conventional concepts of dimensioning and tolerancing are solely geometry-oriented, and standardized surface parameters are not sufficient to account for interaction with non-geometric parameters, which are dominant for functions such as sliding, wetting, sealing and optical reflection. To verify the functional ability of microstructured surfaces, a method was developed based on a parameterized mathematical-physical model of the function. From this model, function-related properties can be identified and geometric parameters can be derived, which may differ between the manufacturing and verification processes. With this method it is possible to optimize the definition of the shape of the workpiece with regard to the intended function by applying theoretical and experimental knowledge, as well as modelling and simulation. Advantages of this approach are discussed and demonstrated using the example of a microstructured inking roll.

  2. Verifying seismic design of nuclear reactors by testing. Volume 2: appendix, theoretical discussions

    International Nuclear Information System (INIS)

    1979-01-01

    Theoretical discussions on seismic design testing are presented under the following appendix headings: system functions, pulse optimization program, system identification, and motion response calculations from inertance measurements of a nuclear power plant

  3. Verifying the nuclear-test ban. CTBTO: For a safer and more secure world [videorecording

    International Nuclear Information System (INIS)

    1999-01-01

    The Comprehensive Nuclear-Test-Ban Treaty (CTBT) was opened for signature in September 1996. In March 1997, the Preparatory Commission for the Comprehensive Nuclear-Test-Ban Treaty Organization started work in Vienna, Austria. This film depicts the Commission's activities in establishing the Treaty's verification regime to monitor adherence to the global ban on nuclear explosions. It presents the challenging work at some of the global monitoring facilities, and at the International Data Centre in Vienna, where the data generated by the facilities are received, processed and analysed

  4. European wind turbine testing procedure developments. Task 1: Measurement method to verify wind turbine performance characteristics

    DEFF Research Database (Denmark)

    Hunter, R.; Friis Pedersen, Troels; Dunbabin, P.

    2001-01-01

There is currently significant standardisation work ongoing in the context of wind farm energy yield warranty assessment and wind turbine power performance testing. A standards maintenance team is revising the current IEC (EN) 61400-12 Ed 1 standard for wind turbine power performance testing....... The standard is being divided into four documents. Two of them are drafted for evaluation and verification of complete wind farms and of individual wind turbines within wind farms. This document, and the project it describes, has been designed to help provide a solid technical foundation for this revised...... standard. The work was wide ranging and addressed 'grey' areas of knowledge, regarding existing methodologies or to carry out basic research in support of fundamentally new procedures. The work has given rise to recommendations in all areas of the work, including site calibration procedures, nacelle...

  5. European wind turbine testing procedure developments. Task 1: Measurement method to verify wind turbine performance characteristics

    Energy Technology Data Exchange (ETDEWEB)

    Hunter, R.; Friis Pedersen, T.; Dunbabin, P.; Antoniou, I.; Frandsen, S.; Klug, H.; Albers, A.; Lee, W.K.

    2001-01-01

    There is currently significant standardisation work ongoing in the context of wind farm energy yield warranty assessment and wind turbine power performance testing. A standards maintenance team is revising the current IEC (EN) 61400-12 Ed 1 standard for wind turbine power performance testing. The standard is being divided into four documents. Two of them are drafted for evaluation and verification of complete wind farms and of individual wind turbines within wind farms. This document, and the project it describes, has been designed to help provide a solid technical foundation for this revised standard. The work was wide ranging and addressed 'grey' areas of knowledge, regarding existing methodologies or to carry out basic research in support of fundamentally new procedures. The work has given rise to recommendations in all areas of the work, including site calibration procedures, nacelle anemometry, multi-variate regression analysis and density normalisation. (au)

  6. Standard test method for verifying the alignment of X-Ray diffraction instrumentation for residual stress measurement

    CERN Document Server

    American Society for Testing and Materials. Philadelphia

    2010-01-01

    1.1 This test method covers the preparation and use of a flat stress-free test specimen for the purpose of checking the systematic error caused by instrument misalignment or sample positioning in X-ray diffraction residual stress measurement, or both. 1.2 This test method is applicable to apparatus intended for X-ray diffraction macroscopic residual stress measurement in polycrystalline samples employing measurement of a diffraction peak position in the high-back reflection region, and in which the θ, 2θ, and ψ rotation axes can be made to coincide (see Fig. 1). 1.3 This test method describes the use of iron powder which has been investigated in round-robin studies for the purpose of verifying the alignment of instrumentation intended for stress measurement in ferritic or martensitic steels. To verify instrument alignment prior to stress measurement in other metallic alloys and ceramics, powder having the same or lower diffraction angle as the material to be measured should be prepared in similar fashion...

  7. Testing Consent Order for Sodium Cyanide

    Science.gov (United States)

    This document announces that EPA has signed an enforceable testing Consent Order with E.I. du Pont de Nemours and Company (DuPont), FMC Corporation (FMC), Degussa Corporation (Degussa), ICI Americas Incorporated (ICI), and Cyanco Company (Cyanco).

  8. Testing Consent Order on Refractory Ceramic Fibers

    Science.gov (United States)

This notice announces that EPA has signed an enforceable testing consent order under the Toxic Substances Control Act (TSCA), 15 U.S.C. section 2601 et seq., with three of the primary producers of refractory ceramic fibers (RCF).

  9. Permutation Tests for Stochastic Ordering and ANOVA

    CERN Document Server

    Basso, Dario; Salmaso, Luigi; Solari, Aldo

    2009-01-01

    Permutation testing for multivariate stochastic ordering and ANOVA designs is a fundamental issue in many scientific fields such as medicine, biology, pharmaceutical studies, engineering, economics, psychology, and social sciences. This book presents advanced methods and related R codes to perform complex multivariate analyses

  10. Building and verifying a severity prediction model of acute pancreatitis (AP) based on BISAP, MEWS and routine test indexes.

    Science.gov (United States)

    Ye, Jiang-Feng; Zhao, Yu-Xin; Ju, Jian; Wang, Wei

    2017-10-01

To discuss the value of the Bedside Index for Severity in Acute Pancreatitis (BISAP), the Modified Early Warning Score (MEWS), serum Ca2+ (similarly hereinafter) and red cell distribution width (RDW) for predicting the severity grade of acute pancreatitis, and to develop and verify a more accurate scoring system to predict the severity of AP. In 302 patients with AP, we calculated BISAP and MEWS scores and conducted single-factor logistic regression analyses on the relationships of BISAP, RDW, MEWS and serum Ca2+ with the severity of AP. The variables with statistical significance in the single-factor logistic regression were used in a multi-factor logistic regression model; forward stepwise regression was used to screen variables and build a multi-factor prediction model. A receiver operating characteristic (ROC) curve was constructed, and the significance of the multi- and single-factor prediction models in predicting the severity of AP was evaluated using the area under the ROC curve (AUC). The internal validity of the model was verified through bootstrapping. Among the 302 patients with AP, 209 had mild acute pancreatitis (MAP) and 93 had severe acute pancreatitis (SAP). The single-factor logistic regression analysis showed that BISAP, MEWS and serum Ca2+ are prediction indexes of the severity of AP (P-value < 0.05). The multi-factor logistic regression analysis showed that BISAP and serum Ca2+ are independent prediction indexes of AP severity (P-value < 0.05); BISAP is negatively related to serum Ca2+ (r = -0.330, P-value < 0.05). The prediction model is as follows: ln() = 7.306 + 1.151*BISAP - 4.516*serum Ca2+. The predictive ability of each model for SAP follows the order: combined BISAP and serum Ca2+ prediction model > serum Ca2+ > BISAP. There is no statistically significant difference between the predictive abilities of BISAP and serum Ca2+ (P-value > 0.05); however, there is remarkable statistical significance for the predictive ability using the newly built prediction model as well as BISAP
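Taking the fitted coefficients in the abstract at face value, and assuming the left-hand side of the (partly garbled) equation is the usual logit link of logistic regression, the predicted probability of SAP can be computed as:

```python
import math

def sap_probability(bisap, serum_ca):
    """Predicted probability of severe acute pancreatitis from the
    abstract's combined model: logit(p) = 7.306 + 1.151*BISAP - 4.516*Ca.
    Assumes a standard logit link; serum Ca is taken in mmol/L."""
    logit = 7.306 + 1.151 * bisap - 4.516 * serum_ca
    return 1 / (1 + math.exp(-logit))

# Illustrative inputs (not patient data): BISAP score 3, serum Ca 2.1 mmol/L.
p = sap_probability(3, 2.1)
print(f"predicted SAP probability: {p:.3f}")
```

The signs match the abstract's finding: a higher BISAP raises the predicted risk, while a higher serum Ca2+ lowers it.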

  11. Standard practice of calibration of force-measuring instruments for verifying the force indication of testing machines

    CERN Document Server

    American Society for Testing and Materials. Philadelphia

    2006-01-01

    1.1 The purpose of this practice is to specify procedures for the calibration of force-measuring instruments. Procedures are included for the following types of instruments: 1.1.1 Elastic force-measuring instruments, and 1.1.2 Force-multiplying systems, such as balances and small platform scales. Note 1—Verification by deadweight loading is also an acceptable method of verifying the force indication of a testing machine. Tolerances for weights for this purpose are given in Practices E 4; methods for calibration of the weights are given in NIST Technical Note 577, Methods of Calibrating Weights for Piston Gages. 1.2 The values stated in SI units are to be regarded as the standard. Other metric and inch-pound values are regarded as equivalent when required. 1.3 This practice is intended for the calibration of static force-measuring instruments. It is not applicable for dynamic or high-speed force calibrations, nor can the results of calibrations performed in accordance with this practice be assumed valid for...

  12. Finger Length Ratio (2D:4D) in Central India and an Attempt to Verify Fraternal Birth Order Effect: A Population Based Cross-Sectional Study.

    Science.gov (United States)

    Maitra, Arjun; Maitra, Chaitali; Jha, Dilip Kumar; Biswas, Rakesh

    2016-12-01

    A normal physiology of a human being is not merely a series of functions occurring with specific intensities and timing. There are a lot of factors that may change normal physiological activity within normal limits. Finger length ratio is one of the markers of intrauterine androgen exposure, and it is debated and contradicted by many authors. Digit ratio varies among ethnicities. Many Indian studies show that there is considerable difference in finger length ratio in different populations. Data regarding Central India were not found on extensive search. We aimed to find out the finger length ratio and explore the birth order effect on finger length ratio among the first two successive born in the said population. We conducted a survey of 1500 volunteers (800 male and 700 female) over two years. We measured the length of the index finger (2D) and ring finger (4D) of both hands and asked about birth order history to find the digit ratio for the Central India population and any existing correlation of the same with birth order. The t-test and Analysis of Variance (ANOVA) were used to measure significance and differences among the groups. No significant (p>0.05) fraternal birth order effect was found among the eldest, second born with elder brother, and second born with elder sister groups, and no significant (p>0.05) variation in the finger length ratio of the right and left hands was observed in either the male or female population. Our study reports the finger length ratio (2D:4D) for the Central India population and finds no significant association between finger length ratio and fraternal birth order among the first two successive born.

  13. Evaluation of food emergency response laboratories' capability for 210Po analysis using proficiency test material with verifiable traceability

    International Nuclear Information System (INIS)

    Zhongyu Wu; Zhichao Lin; Mackill, P.; Cong Wei; Noonan, J.; Cherniack, J.; Gillis-Landrum, D.

    2009-01-01

    Measurement capability and data comparability are essential for emergency response when analytical data from cooperative laboratories are used for risk assessment and post-incident decision making. In this study, the current capability of food emergency response laboratories for the analysis of 210Po in water was evaluated using a proficiency test scheme in compliance with ISO-43 and ILAC G13 guidelines, which comprises a test sample preparation and verification protocol and an insightful statistical data evaluation. The results of performance evaluations on relative bias, value trueness, precision, false positive detection, minimum detection limit, and limit of quantification are presented. (author)

  14. A Test of High-Order Thinking

    Science.gov (United States)

    Libresco, Andrea S.

    2007-01-01

    In this article, the author presents a case study of three fourth grade teachers in New York, a state that administers an elementary social studies test, constructed by teachers, that relies on the use of documents in the majority of its questions. Throughout the 2002-2003 school year, the three teachers were observed during every unit of social…

  15. 49 CFR 40.139 - On what basis does the MRO verify test results for codeine and morphine?

    Science.gov (United States)

    2010-10-01

    ... PROCEDURES FOR TRANSPORTATION WORKPLACE DRUG AND ALCOHOL TESTING PROGRAMS Medical Review Officers and the... unauthorized use of any opium, opiate, or opium derivative (i.e., morphine, heroin, or codeine). (1) As an MRO... needle tracks; (ii) Behavioral and psychological signs of acute opiate intoxication or withdrawal; (iii...

  16. Improving follow-up of abnormal cancer screens using electronic health records: trust but verify test result communication

    Directory of Open Access Journals (Sweden)

    Reis Brian

    2009-12-01

    Full Text Available Abstract Background Early detection of colorectal cancer through timely follow-up of positive Fecal Occult Blood Tests (FOBTs) remains a challenge. In our previous work, we found 40% of positive FOBT results eligible for colonoscopy had no documented response by a treating clinician at two weeks despite procedures for electronic result notification. We determined if technical and/or workflow-related aspects of automated communication in the electronic health record could lead to the lack of response. Methods Using both qualitative and quantitative methods, we evaluated positive FOBT communication in the electronic health record of a large, urban facility between May 2008 and March 2009. We identified the source of test result communication breakdown, and developed an intervention to fix the problem. Explicit medical record reviews measured timely follow-up (defined as response within 30 days of a positive FOBT) pre- and post-intervention. Results Data from 11 interviews and tracking information from 490 FOBT alerts revealed that the software intended to alert primary care practitioners (PCPs) of positive FOBT results was not configured correctly and over a third of positive FOBTs were not transmitted to PCPs. Upon correction of the technical problem, lack of timely follow-up decreased immediately from 29.9% to 5.4%. Conclusion Electronic communication of positive FOBT results should be monitored to avoid limiting colorectal cancer screening benefits. Robust quality assurance and oversight systems are needed to achieve this. Our methods may be useful for others seeking to improve follow-up of FOBTs in their systems.

  17. Status of personnel identity verifiers

    International Nuclear Information System (INIS)

    Maxwell, R.L.

    1985-01-01

    Identity verification devices based on the interrogation of six different human biometric features or actions now exist and in general have been in development for about ten years. The capability of these devices to meet the cost and operational requirements of speed, accuracy, ease of use and reliability has generally increased although the verifier industry is still immature. Sandia Laboratories makes a continuing effort to stay abreast of identity verifier developments and to assess the capabilities and improvements of each device. Operating environment and procedures more typical of field use can often reveal performance results substantially different from laboratory tests. An evaluation of several recently available verifiers is herein reported

  18. Post-upgrade testing on a radiotherapy oncology information system with an embedded record and verify system following the IAEA Human Health Report No. 7 recommendations.

    Science.gov (United States)

    Nyathi, Thulani; Colyer, Christopher; Bhardwaj, Anup Kumar; Rijken, James; Morton, Jason

    2016-06-01

    Record and verify (R&V) systems have proven that their application in radiotherapy clinics leads to a significant reduction in mis-treatments of patients. The purpose of this technical note is to share our experience of acceptance testing, commissioning and setting up a quality assurance programme for the MOSAIQ® oncology information system and R&V system after upgrading from software version 2.41 to 2.6 in a multi-vendor, multi-site environment. Testing was guided primarily by the IAEA Human Health Report No. 7 recommendations, but complemented by other department-specific workflow tests. To the best of our knowledge, this is the first time successful implementation of the IAEA Human Health Report Series No. 7 recommendations has been reported in the literature. Copyright © 2016 Associazione Italiana di Fisica Medica. Published by Elsevier Ltd. All rights reserved.

  19. HHF test with 80x80x1 Be/Cu/SS Mock-ups for verifying the joining technology of the ITER blanket First Wall

    Energy Technology Data Exchange (ETDEWEB)

    Lee, Dong Won; Bae, Young Dug; Kim, Suk Kwon; Hong, Bong Guen; Jeong, Yong Hwan; Park, Jeong Yong; Choi, Byung Kwon; Jung, Hyun Kyu

    2008-11-15

    Through the fabrication of Cu/SS and Be/Cu joint specimens, a fabrication procedure covering material preparation, canning, degassing, HIP (Hot Isostatic Pressing) and PHHT (post-HIP heat treatment) was established. The HIP conditions (1050 °C, 100 MPa, 2 hr for Cu/SS; 580 °C, 100 MPa, 2 hr for Be/Cu) were developed through the investigation of joint specimens fabricated under various HIP conditions; the destructive tests of the joints included microstructure observation of the interface with examination of the elemental distribution, tension tests, bend tests, Charpy impact tests and fracture toughness tests. However, since the joints should be tested under High Heat Flux (HHF) conditions like those of ITER operation to verify their integrity, several HHF tests were performed, like the previous HHF tests with Cu/SS, Be/Cu and Be/Cu/SS mock-ups. In the present study, HHF tests were performed with Be/Cu/SS mock-ups that have a single 80 mm x 80 mm Be tile, with the depth of each material kept the same as in the ITER blanket FW. The mock-ups were fabricated with three kinds of interlayers (Cr/Ti/Cu, Ti/Cr/Cu and Ti/Cu), which differ from the previously developed interlayer (Cr/Cu); six mock-ups in total were fabricated. Preliminary analyses were performed to decide the test conditions; the mock-ups were tested with heat fluxes up to 2.5 MW/m2 and 20 cycles for each mock-up at a given heat flux, using JUDITH-1 at FZJ in Germany. During the tests, all mock-ups showed delamination or full detachment of the Be tile, so it can be concluded that the joints with these interlayers bond poorly; nevertheless, the results provide useful data for developing the Be/Cu joint with HIP.

  20. 49 CFR 655.61 - Action when an employee has a verified positive drug test result or has a confirmed alcohol test...

    Science.gov (United States)

    2010-10-01

    ... drug test result or has a confirmed alcohol test result of 0.04 or greater, or refuses to submit to a... drug test result or has a confirmed alcohol test result of 0.04 or greater, or refuses to submit to a... performing a safety-sensitive function. (3) If an employee refuses to submit to a drug or alcohol test...

  1. Externally Verifiable Oblivious RAM

    Directory of Open Access Journals (Sweden)

    Gancher Joshua

    2017-04-01

    Full Text Available We present the idea of externally verifiable oblivious RAM (ORAM). Our goal is to allow a client and server carrying out an ORAM protocol to have disputes adjudicated by a third party, allowing for the enforcement of penalties against an unreliable or malicious server. We give a security definition that guarantees protection not only against a malicious server but also against a client making false accusations. We then give modifications of the Path ORAM [15] and Ring ORAM [9] protocols that meet this security definition. These protocols both have the same asymptotic runtimes as the semi-honest original versions and require the external verifier to be involved only when the client or server deviates from the protocol. Finally, we implement externally verified ORAM, along with an automated cryptocurrency contract to use as the external verifier.

  2. Verifier Theory and Unverifiability

    OpenAIRE

    Yampolskiy, Roman V.

    2016-01-01

    Despite significant developments in Proof Theory, surprisingly little attention has been devoted to the concept of proof verifier. In particular, the mathematical community may be interested in studying different types of proof verifiers (people, programs, oracles, communities, superintelligences) as mathematical objects. Such an effort could reveal their properties, their powers and limitations (particularly in human mathematicians), minimum and maximum complexity, as well as self-verificati...

  3. Construction of testing facilities and verifying tests of a 22.9 kV/630 A class superconducting fault current limiter

    Science.gov (United States)

    Yim, S.-W.; Yu, S.-D.; Kim, H.-R.; Kim, M.-J.; Park, C.-R.; Yang, S.-E.; Kim, W.-S.; Hyun, O.-B.; Sim, J.; Park, K.-B.; Oh, I.-S.

    2010-11-01

    We have constructed and completed the preparation for a long-term operation test of a superconducting fault current limiter (SFCL) in a Korea Electric Power Corporation (KEPCO) test grid. The SFCL, with a rating of 22.9 kV/630 A, 3 phases, has been connected to the 22.9 kV test grid equipped with reclosers and other protection devices in the Gochang Power Testing Center of KEPCO. The main goals of the test are the verification of SFCL performance and protection coordination studies. A line-commutation type SFCL was fabricated and installed for this project, and the superconducting components were cooled by a cryo-cooler to 77 K in sub-cooled liquid nitrogen pressurized by 3 bar of helium gas. The verification test includes un-manned long-term operation with and without loads, and fault tests. Since the test site is 170 km away from the laboratory, we will adopt un-manned operation with real-time remote monitoring and control using high-speed internet. For the fault tests, we will apply fault currents up to around 8 kA rms to the SFCL using an artificial fault generator. The fault tests may allow us not only to confirm the current limiting capability of the SFCL, but also to adjust the SFCL-recloser coordination, such as resetting over-current relay parameters. This paper describes the construction of the testing facilities and discusses the plans for the verification tests.

  4. Construction of testing facilities and verifying tests of a 22.9 kV/630 A class superconducting fault current limiter

    International Nuclear Information System (INIS)

    Yim, S.-W.; Yu, S.-D.; Kim, H.-R.; Kim, M.-J.; Park, C.-R.; Yang, S.-E.; Kim, W.-S.; Hyun, O.-B.; Sim, J.; Park, K.-B.; Oh, I.-S.

    2010-01-01

    We have constructed and completed the preparation for a long-term operation test of a superconducting fault current limiter (SFCL) in a Korea Electric Power Corporation (KEPCO) test grid. The SFCL, with a rating of 22.9 kV/630 A, 3 phases, has been connected to the 22.9 kV test grid equipped with reclosers and other protection devices in the Gochang Power Testing Center of KEPCO. The main goals of the test are the verification of SFCL performance and protection coordination studies. A line-commutation type SFCL was fabricated and installed for this project, and the superconducting components were cooled by a cryo-cooler to 77 K in sub-cooled liquid nitrogen pressurized by 3 bar of helium gas. The verification test includes un-manned long-term operation with and without loads, and fault tests. Since the test site is 170 km away from the laboratory, we will adopt un-manned operation with real-time remote monitoring and control using high-speed internet. For the fault tests, we will apply fault currents up to around 8 kA rms to the SFCL using an artificial fault generator. The fault tests may allow us not only to confirm the current limiting capability of the SFCL, but also to adjust the SFCL-recloser coordination, such as resetting over-current relay parameters. This paper describes the construction of the testing facilities and discusses the plans for the verification tests.

  5. A controllability test for general first-order representations

    NARCIS (Netherlands)

    U. Helmke; J. Rosenthal; J.M. Schumacher (Hans)

    1995-01-01

    In this paper we derive a new controllability rank test for general first-order representations. The criterion generalizes the well-known controllability rank test for linear input-state systems as well as a controllability rank test by Mertzios et al. for descriptor systems.
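    The classical input-state criterion that this record generalizes is the Kalman rank test: a system x' = Ax + Bu with n states is controllable iff the block matrix [B, AB, ..., A^(n-1)B] has rank n. A minimal, library-free Python sketch (function names are mine, for illustration only):

```python
def controllability_columns(A, B):
    """Columns of the Kalman controllability matrix [B, AB, ..., A^(n-1)B].

    A is an n x n matrix and B an n x m matrix, both as nested lists.
    """
    n = len(A)
    current = [list(col) for col in zip(*B)]  # columns of B
    cols = [c[:] for c in current]
    for _ in range(n - 1):
        # multiply each current column by A to get the next block
        current = [[sum(A[i][k] * c[k] for k in range(n)) for i in range(n)]
                   for c in current]
        cols.extend(c[:] for c in current)
    return cols

def rank(vectors, tol=1e-9):
    """Rank of the matrix whose rows are the given vectors (Gaussian elimination)."""
    rows = [v[:] for v in vectors]
    r = 0
    width = len(rows[0]) if rows else 0
    for j in range(width):
        pivot = next((i for i in range(r, len(rows)) if abs(rows[i][j]) > tol), None)
        if pivot is None:
            continue
        rows[r], rows[pivot] = rows[pivot], rows[r]
        for i in range(len(rows)):
            if i != r and abs(rows[i][j]) > tol:
                f = rows[i][j] / rows[r][j]
                rows[i] = [a - f * b for a, b in zip(rows[i], rows[r])]
        r += 1
    return r

def is_controllable(A, B):
    return rank(controllability_columns(A, B)) == len(A)
```

    A double integrator (A = [[0,1],[0,0]], B = [[0],[1]]) passes the test, while two decoupled identical modes driven through only one input direction (A = I, B = [[1],[0]]) fail it — the generalization in the record extends this rank idea to descriptor and general first-order representations.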

  6. Neuropsychological Testing in Pathologically Verified Alzheimer Disease and Frontotemporal Dementia: How Well Do the Uniform Data Set Measures Differentiate Between Diseases?

    Science.gov (United States)

    Ritter, Aaron R; Leger, Gabriel C; Miller, Justin B; Banks, Sarah J

    2017-01-01

    Differences in cognition between frontotemporal dementia (FTD) and Alzheimer disease (AD) are well described in clinical cohorts, but have rarely been confirmed in studies with pathologic verification. For emerging therapeutics to succeed, determining the underlying pathology early in the disease course is increasingly important. Neuropsychological evaluation is an important component of the diagnostic workup for AD and FTD. Patients with FTD are thought to have greater deficits in language and executive function, while patients with AD are more likely to have deficits in memory. We asked whether performance on initial cognitive testing can reliably distinguish between patients with frontotemporal lobar degeneration (FTLD) and AD neuropathology, and whether other factors of the neuropsychological assessment can be used to enhance the accuracy of determining underlying pathology. Using logistic regression, we retrospectively compared neurocognitive performance on initial evaluation of 106 patients with pathologically verified FTLD (pvFTLD) with 558 pathologically verified AD (pvAD) patients from the National Alzheimer's Coordinating Center, using data from the Uniform Data Set (UDS) and the neuropathology data set. As expected, pvFTLD patients were younger, demonstrated better memory performance, and had more neuropsychiatric symptoms than pvAD patients. Other results were less predictable: pvFTLD patients performed better on one test of executive function (Trail Making Test Part B) but worse on another (digit span backward). Performance on language testing did not strongly distinguish the 2 groups. To determine what factors led to a misdiagnosis of AD in patients with FTLD, we further analyzed a small group of pvFTLD patients. These patients demonstrated older age and lower Neuropsychiatric Inventory Questionnaire counts compared with accurately diagnosed cases. Other than memory, numerical scores of neurocognitive performance on the UDS are of limited value in

  7. Testing First-Order Logic Axioms in AutoCert

    Science.gov (United States)

    Ahn, Ki Yung; Denney, Ewen

    2009-01-01

    AutoCert [2] is a formal verification tool for machine-generated code in safety-critical domains, such as aerospace control code generated from MathWorks Real-Time Workshop. AutoCert uses Automated Theorem Provers (ATPs) [5] based on First-Order Logic (FOL) to formally verify safety and functional correctness properties of the code. These ATPs try to build proofs based on user-provided domain-specific axioms, which can be arbitrary First-Order Formulas (FOFs). These axioms are the most crucial part of the trusted base, since proofs can be submitted to a proof checker, removing the need to trust the prover, and AutoCert itself plays the part of checking the code generator. However, formulating axioms correctly (i.e. precisely as the user had really intended) is non-trivial in practice. The challenge of axiomatization arises along several dimensions. First, the domain knowledge has its own complexity. AutoCert has been used to verify mathematical requirements on navigation software that carries out various geometric coordinate transformations involving matrices and quaternions. Axiomatic theories for such constructs are complex enough that mistakes are not uncommon. Second, adjusting axioms for ATPs can add even more complexity. The axioms frequently need to be modified in order to have them in a form suitable for use with ATPs. Such modifications tend to obscure the axioms further. Third, judging the validity of the axioms from the output of existing ATPs is very hard, since theorem provers typically do not give any examples or counterexamples.

  8. On the Estimation of Disease Prevalence by Latent Class Models for Screening Studies Using Two Screening Tests with Categorical Disease Status Verified in Test Positives Only

    Science.gov (United States)

    Chu, Haitao; Zhou, Yijie; Cole, Stephen R.; Ibrahim, Joseph G.

    2010-01-01

    Summary To evaluate the probabilities of a disease state, ideally all subjects in a study should be diagnosed by a definitive diagnostic or gold standard test. However, since definitive diagnostic tests are often invasive and expensive, it is generally unethical to apply them to subjects whose screening tests are negative. In this article, we consider latent class models for screening studies with two imperfect binary diagnostic tests and a definitive categorical disease status measured only for those with at least one positive screening test. Specifically, we discuss a conditional independent and three homogeneous conditional dependent latent class models and assess the impact of misspecification of the dependence structure on the estimation of disease category probabilities using frequentist and Bayesian approaches. Interestingly, the three homogeneous dependent models can provide identical goodness-of-fit but substantively different estimates for a given study. However, the parametric form of the assumed dependence structure itself is not “testable” from the data, and thus the dependence structure modeling considered here can only be viewed as a sensitivity analysis concerning a more complicated non-identifiable model potentially involving heterogeneous dependence structure. Furthermore, we discuss Bayesian model averaging together with its limitations as an alternative way to partially address this particularly challenging problem. The methods are applied to two cancer screening studies, and simulations are conducted to evaluate the performance of these methods. In summary, further research is needed to reduce the impact of model misspecification on the estimation of disease prevalence in such settings. PMID:20191614

  9. Verifiably Truthful Mechanisms

    DEFF Research Database (Denmark)

    Branzei, Simina; Procaccia, Ariel D.

    2015-01-01

    the computational sense). Our approach involves three steps: (i) specifying the structure of mechanisms, (ii) constructing a verification algorithm, and (iii) measuring the quality of verifiably truthful mechanisms. We demonstrate this approach using a case study: approximate mechanism design without money...

  10. Testing static tradeoff theory against pecking order models of capital ...

    African Journals Online (AJOL)

    We test two models with the purpose of finding the best empirical explanation for the corporate financing choices of a cross-section of 27 Nigerian quoted companies. The models were developed to represent the Static Tradeoff Theory and the Pecking Order Theory of capital structure with a view to making a comparison between ...

  11. Testing Library Specifications by Verifying Conformance Tests

    DEFF Research Database (Denmark)

    Kiniry, Joseph Roland; Zimmerman, Daniel; Hyland, Ralph

    2012-01-01

    of client programs. Specification and verification researchers regularly face the question of whether the library specifications we use are correct and useful, and we have collectively provided no good answers. Over the past few years we have created and refined a software engineering process, which we call...

  12. Testing for one Generalized Linear Single Order Parameter

    DEFF Research Database (Denmark)

    Ellegaard, Niels Langager; Christensen, Tage Emil; Dyre, Jeppe

    We examine a linear single order parameter model for thermoviscoelastic relaxation in viscous liquids, allowing for a distribution of relaxation times. In this model the relaxation of volume and enthalpy is completely described by the relaxation of one internal order parameter. In contrast to prior work, the order parameter may be chosen to have a non-exponential relaxation. The model predictions contradict the general consensus of the properties of viscous liquids in two ways: (i) The model predicts that following a linear isobaric temperature step, the normalized volume and enthalpy relaxation... responses or extrapolate from measurements of a glassy state away from equilibrium. Starting from a master equation description of inherent dynamics, we calculate the complex thermodynamic response functions. We devise a way of testing for the generalized single order parameter model by measuring 3 complex...

  13. Verifying versus falsifying banknotes

    Science.gov (United States)

    van Renesse, Rudolf L.

    1998-04-01

    A series of counterfeit Dutch, German, English, and U.S. banknotes was examined with respect to the various modi operandi to imitate paper-based, printed and post-printed security features. These features provide positive evidence (verifiability) as well as negative evidence (falsifiability). It appears that the positive evidence provided in most cases is insufficiently convincing: banknote inspection mainly rests on negative evidence. The act of falsifying (to prove to be false), however, is an inefficacious procedure. Ergonomic verificatory security features are demanded. This demand is increasingly met by security features based on nano-technology. The potential of nano-security has a twofold base: (1) the unique optical effects displayed allow simple, fast and unambiguous inspection, and (2) the nano-technology they are based on makes successful counterfeit or simulation extremely improbable.

  14. Verified scientific findings

    International Nuclear Information System (INIS)

    Bullinger, M.G.

    1982-01-01

    In this essay, the author attempts to enlighten the reader as to the meaning of the term ''verified scientific findings'' in section 13, sub-section 1, sentence 2 of the new Chemicals Control Law. The examples given here are the generally accepted regulations with regard to technology (that is, sections 7a and 18b of the WHG (law on water economy), section 3, sub-section 1 of the machine- and engine protection laws), to the status of technology (section 3, sub-section 6 of the BImSchG (Fed. law on prevention of air-borne pollution)), and to the status of science (section 5, sub-section 2 of the AMG (drug legislation)). The ''status of science and technology'' as defined in sections 4 ff of the Atomic Energy Law (AtomG) and in sections 3, 4, 12, 2) of the First Radiation Protection Ordinance (1.StrlSch. VO) is also being discussed. The author defines the in his opinion ''dynamic term'' as the generally recognized result of scientific research, and the respective possibilities of practical utilization of technology. (orig.) [de

  15. Interventions to Educate Family Physicians to Change Test Ordering

    Directory of Open Access Journals (Sweden)

    Roger Edmund Thomas MD, PhD, CCFP, MRCGP

    2016-03-01

    Full Text Available The purpose is to systematically review randomised controlled trials (RCTs) to change family physicians' laboratory test-ordering. We searched 15 electronic databases (no language/date limitations). We identified 29 RCTs (4,111 physicians, 175,563 patients). Six studies specifically focused on reducing unnecessary tests, 23 on increasing screening tests. Using Cochrane methodology, 48.5% of studies were low risk-of-bias for randomisation, 7% for concealment of randomisation, 17% for blinding of participants/personnel, 21% for blinding of outcome assessors, 27.5% for attrition, and 93% for selective reporting. Only six studies were low risk for both randomisation and attrition. Twelve studies performed a power computation, three an intention-to-treat analysis and 13 statistically controlled for clustering. Unweighted averages were computed to compare intervention/control groups for tests assessed by >5 studies. The results were that fourteen studies assessed lipids (average 10% more tests than control), 14 diabetes (average 8% > control), 5 cervical smears, 2 INR, and one each thyroid, fecal occult-blood, cotinine, throat-swabs, testing after prescribing, and urine-cultures. Six studies aimed to decrease test groups (average decrease 18%), and two to increase test groups. Intervention strategies: one study used education (no change), two feedback (one 5% increase, one 27% desired decrease), eight education + feedback (average increase in desired direction > control 4.9%), ten system change (average increase 14.9%), one system change + feedback (increases 5-44%), three education + system change (average increase 6%), three education + system change + feedback (average 7.7% increase), one delayed testing. The conclusions are that only six RCTs were assessed at low risk of bias from both randomisation and attrition. Nevertheless, despite methodological shortcomings, studies that found large changes (e.g., >20%) probably obtained real change.

  16. Improving Molecular Genetic Test Utilization through Order Restriction, Test Review, and Guidance.

    Science.gov (United States)

    Riley, Jacquelyn D; Procop, Gary W; Kottke-Marchant, Kandice; Wyllie, Robert; Lacbawan, Felicitas L

    2015-05-01

    The ordering of molecular genetic tests by health providers not well trained in genetics may have a variety of untoward effects. These include the selection of inappropriate tests, the ordering of panels when the assessment of individual or fewer genes would be more appropriate, inaccurate result interpretation and inappropriate patient guidance, and significant unwarranted cost expenditure. We sought to improve the utilization of molecular genetic tests by requiring providers without specialty training in genetics to use genetic counselors and molecular genetic pathologists to assist in test selection. We used a genetic and genomic test review process wherein the laboratory-based genetic counselor performed the preanalytic assessment of test orders and test triage. Test indication and clinical findings were evaluated against the test panel composition, methods, and test limitations under the supervision of the molecular genetic pathologist. These test utilization management efforts resulted in a decrease in genetic test ordering and a gross cost savings of $1,531,913 since the inception of these programs in September 2011 through December 2013. The combination of limiting the availability of complex genetic tests and providing guidance regarding appropriate test strategies is an effective way to improve genetic test utilization, contributing to judicious use of limited health care resources. Copyright © 2015 American Society for Investigative Pathology and the Association for Molecular Pathology. Published by Elsevier Inc. All rights reserved.

  17. Testing second order cyclostationarity in the squared envelope spectrum of non-white vibration signals

    Science.gov (United States)

    Borghesani, P.; Pennacchi, P.; Ricci, R.; Chatterton, S.

    2013-10-01

    Cyclostationary models for the diagnostic signals measured on faulty rotating machinery have proved successful in many laboratory tests and industrial applications. The squared envelope spectrum has been pointed out as the most efficient indicator for the assessment of second order cyclostationary symptoms of damage, which are typical, for instance, of rolling element bearing faults. In an attempt to foster the spread of rotating machinery diagnostics, the current trend in the field is to reach higher levels of automation of condition monitoring systems. For this purpose, statistical tests for the presence of cyclostationarity have been proposed in recent years. The statistical thresholds proposed in the past for the identification of cyclostationary components were obtained under the hypothesis of a white noise signal when the component is healthy. This requirement, coupled with the non-white nature of real signals, implies the necessity of pre-whitening the signal or filtering it in optimal narrow bands, increasing the complexity of the algorithm and the risk of losing diagnostic information or introducing biases in the result. In this paper, the authors introduce an original analytical derivation of the statistical tests for cyclostationarity in the squared envelope spectrum, dropping the hypothesis of white noise from the beginning. The effect of first order and second order cyclostationary components on the distribution of the squared envelope spectrum is quantified and the effectiveness of the newly proposed threshold verified, providing a sound theoretical basis and a practical starting point for efficient automated diagnostics of machine components such as rolling element bearings. The analytical results are verified by means of numerical simulations and by using experimental vibration data of rolling element bearings.
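
    The squared envelope spectrum discussed above can be sketched with a generic Hilbert-transform construction. This is not the authors' implementation, and the amplitude-modulated noise below is an invented stand-in for a bearing fault signal:

```python
import numpy as np
from scipy.signal import hilbert

def squared_envelope_spectrum(x, fs):
    """Frequency-domain view of |analytic signal|^2 -- the quantity the
    cyclostationarity thresholds are applied to. Generic sketch only."""
    env2 = np.abs(hilbert(x)) ** 2        # squared envelope
    env2 = env2 - env2.mean()             # drop the DC component
    ses = np.abs(np.fft.rfft(env2)) / len(x)
    freqs = np.fft.rfftfreq(len(x), d=1.0 / fs)
    return freqs, ses

# Amplitude-modulated white noise: a crude stand-in for a fault signal
rng = np.random.default_rng(0)
fs, n, f_mod = 1000.0, 4096, 50.0
t = np.arange(n) / fs
x = (1.0 + 0.8 * np.cos(2 * np.pi * f_mod * t)) * rng.standard_normal(n)
freqs, ses = squared_envelope_spectrum(x, fs)
peak_freq = freqs[np.argmax(ses[1:]) + 1]  # modulation frequency dominates
```

    With a strong second-order cyclostationary component, the spectral line at the modulation frequency stands well above the broadband floor, which is exactly what a statistical threshold on this spectrum is meant to formalize.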

  18. Backfilling and closure of the deep repository. Phase 3 - pilot tests to verify engineering feasibility. Geotechnical investigations made on unsaturated backfill materials

    International Nuclear Information System (INIS)

    Johannesson, Lars-Erik

    2008-12-01

    The investigations described in this report are part of the third phase of the joint SKB-Posiva project 'Backfilling and Closure of the Deep Repository, BACLO'. The overall objective of the BACLO project is to develop a backfilling concept for the deep repository that can be configured to meet SKB's and Posiva's requirements in the chosen repository sites. The project was divided into four phases, of which two have already been performed. The second phase of the BACLO project consisted of laboratory tests and deepened analyses of the investigated backfill materials and methods and resulted in a recommendation to focus on the development and testing of the block placement concept with three alternative backfill materials. The third phase investigations comprise laboratory and large-scale experiments aiming at testing the engineering feasibility of the concept. In addition, how site-specific constraints, backfilling method and materials affect the long-term functions of the barriers will be described and analysed in order to set design specifications for the backfill. The third phase of the BACLO project is divided into several subprojects. The work described in this report belongs to subproject 1, concerning processes during installation and saturation of the backfill that may affect the long-term function of the bentonite buffer and the backfill itself. One of the main functions of backfill is to restrict buffer expansion, which can lead to a decrease in buffer density in the deposition hole. The criterion used as a basis for the Baclo investigations was that the buffer density at saturation should not be below 1,950 kg/m3 at the level of the canister. The same criterion was applied for the work described in this report. The upward swelling of the buffer and the enclosed compression of the backfill were first studied assuming that both the buffer and the backfill were saturated. The main objective of this work was to study a case where the buffer is fully saturated.

  19. Backfilling and closure of the deep repository. Phase 3 - pilot tests to verify engineering feasibility. Geotechnical investigations made on unsaturated backfill materials

    Energy Technology Data Exchange (ETDEWEB)

    Johannesson, Lars-Erik (Clay Technology AB, Lund (Sweden))

    2008-12-15

    The investigations described in this report are part of the third phase of the joint SKB-Posiva project 'Backfilling and Closure of the Deep Repository, BACLO'. The overall objective of the BACLO project is to develop a backfilling concept for the deep repository that can be configured to meet SKB's and Posiva's requirements in the chosen repository sites. The project was divided into four phases, of which two have already been performed. The second phase of the BACLO project consisted of laboratory tests and deepened analyses of the investigated backfill materials and methods and resulted in a recommendation to focus on the development and testing of the block placement concept with three alternative backfill materials. The third phase investigations comprise laboratory and large-scale experiments aiming at testing the engineering feasibility of the concept. In addition, how site-specific constraints, backfilling method and materials affect the long-term functions of the barriers will be described and analysed in order to set design specifications for the backfill. The third phase of the BACLO project is divided into several subprojects. The work described in this report belongs to subproject 1, concerning processes during installation and saturation of the backfill that may affect the long-term function of the bentonite buffer and the backfill itself. One of the main functions of backfill is to restrict buffer expansion, which can lead to a decrease in buffer density in the deposition hole. The criterion used as a basis for the Baclo investigations was that the buffer density at saturation should not be below 1,950 kg/m3 at the level of the canister. The same criterion was applied for the work described in this report. The upward swelling of the buffer and the enclosed compression of the backfill were first studied assuming that both the buffer and the backfill were saturated. The main objective of this work was to study a case where the buffer is

  20. Stochastic order in dichotomous item response models for fixed tests, research adaptive tests, or multiple abilities

    NARCIS (Netherlands)

    van der Linden, Willem J.

    1995-01-01

    Dichotomous item response theory (IRT) models can be viewed as families of stochastically ordered distributions of responses to test items. This paper explores several properties of such distributions. The focus is on the conditions under which stochastic order in families of conditional

  1. A Highest Order Hypothesis Compatibility Test for Monocular SLAM

    Directory of Open Access Journals (Sweden)

    Edmundo Guerra

    2013-08-01

    Full Text Available Simultaneous Location and Mapping (SLAM is a key problem to solve in order to build truly autonomous mobile robots. SLAM with a unique camera, or monocular SLAM, is probably one of the most complex SLAM variants, based entirely on a bearing-only sensor working over six DOF. The monocular SLAM method developed in this work is based on the Delayed Inverse-Depth (DI-D Feature Initialization, with the contribution of a new data association batch validation technique, the Highest Order Hypothesis Compatibility Test, HOHCT. The Delayed Inverse-Depth technique is used to initialize new features in the system and defines a single hypothesis for the initial depth of features with the use of a stochastic technique of triangulation. The introduced HOHCT method is based on the evaluation of statistically compatible hypotheses and a search algorithm designed to exploit the strengths of the Delayed Inverse-Depth technique to achieve good performance results. This work presents the HOHCT with a detailed formulation of the monocular DI-D SLAM problem. The performance of the proposed HOHCT is validated with experimental results, in both indoor and outdoor environments, while its costs are compared with other popular approaches.

  2. Testing quantum mechanics using third-order correlations

    International Nuclear Information System (INIS)

    Kinsler, P.

    1996-01-01

    Semiclassical theories similar to stochastic electrodynamics are widely used in optics. The distinguishing feature of such theories is that the quantum uncertainty is represented by random statistical fluctuations. They can successfully predict some quantum-mechanical phenomena; for example, the squeezing of the quantum uncertainty in the parametric oscillator. However, since such theories are not equivalent to quantum mechanics, they will not always be useful. Complex number representations can be used to exactly model the quantum uncertainty, but care has to be taken that approximations do not reduce the description to a hidden variable one. This paper helps show the limitations of "semiclassical theories," and helps show where a true quantum-mechanical treatment needs to be used. Third-order correlations are a test that provides a clear distinction between quantum and hidden variable theories in a way analogous to that provided by the "all or nothing" Greenberger-Horne-Zeilinger test of local hidden variable theories. Copyright © 1996 The American Physical Society.

  3. The status of personnel identity verifiers

    International Nuclear Information System (INIS)

    Maxwell, R.L.

    1985-01-01

    Identity verification devices based on the interrogation of six different human biometric features or actions now exist and in general have been in development for about ten years. The capability of these devices to meet the cost and operational requirements of speed, accuracy, ease of use and reliability has generally increased although the verifier industry is still immature. Sandia Laboratories makes a continuing effort to stay abreast of identity verifier developments and to assess the capabilities and improvements of each device. Operating environment and procedures more typical of field use can often reveal performance results substantially different from laboratory tests. An evaluation of several recently available verifiers is herein reported

  4. Verified OS Interface Code Synthesis

    Science.gov (United States)

    2016-12-01

    results into the larger proof framework of the seL4 microkernel to be directly usable in practice. Beyond the stated project goals, the solution...CakeML, can now also be used in the Isabelle/HOL system that was used for the verified seL4 microkernel. This combination increases proof productivity...

  5. A Highest Order Hypothesis Compatibility Test for Monocular SLAM

    OpenAIRE

    Edmundo Guerra; Rodrigo Munguia; Yolanda Bolea; Antoni Grau

    2013-01-01

    Simultaneous Location and Mapping (SLAM) is a key problem to solve in order to build truly autonomous mobile robots. SLAM with a unique camera, or monocular SLAM, is probably one of the most complex SLAM variants, based entirely on a bearing-only sensor working over six DOF. The monocular SLAM method developed in this work is based on the Delayed Inverse-Depth (DI-D) Feature Initialization, with the contribution of a new data association batch validation technique, the Highest Order Hyp...

  6. Analysis of Daily Laboratory Orders at a Large Urban Academic Center: A Multifaceted Approach to Changing Test Ordering Patterns.

    Science.gov (United States)

    Rudolf, Joseph W; Dighe, Anand S; Coley, Christopher M; Kamis, Irina K; Wertheim, Bradley M; Wright, Douglas E; Lewandrowski, Kent B; Baron, Jason M

    2017-08-01

    We sought to address concerns regarding recurring inpatient laboratory test order practices (daily laboratory tests) through a multifaceted approach to changing ordering patterns. We engaged in an interdepartmental collaboration to foster mindful test ordering through clinical policy creation, electronic clinical decision support, and continuous auditing and feedback. Annualized daily order volumes decreased from approximately 25,000 to 10,000 during a 33-month postintervention review. This represented a significant change from preintervention order volumes (95% confidence interval, 0.61-0.64; P < 10^-16). Total inpatient test volumes were not affected. Durable changes to inpatient order practices can be achieved through a collaborative approach to utilization management that includes shared responsibility for establishing clinical guidelines and electronic decision support. Our experience suggests auditing and continued feedback are additional crucial components to changing ordering behavior. Curtailing daily orders alone may not be a sufficient strategy to reduce in-laboratory costs. © American Society for Clinical Pathology, 2017. All rights reserved. For permissions, please e-mail: journals.permissions@oup.com
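
    The reported decrease (with a 95% CI of 0.61-0.64) reads as a rate ratio of post- versus pre-intervention ordering. A standard two-rate Poisson comparison with a log-scale Wald interval can be sketched as follows; the order counts echo the abstract, but the patient-day denominators are invented to make the arithmetic concrete:

```python
import math

def rate_ratio_ci(events_pre, days_pre, events_post, days_post, z=1.96):
    """Post/pre rate ratio with a log-scale Wald 95% CI, assuming
    Poisson-distributed counts. Illustrative sketch, not the study's model."""
    rr = (events_post / days_post) / (events_pre / days_pre)
    se = math.sqrt(1.0 / events_pre + 1.0 / events_post)
    return rr, rr * math.exp(-z * se), rr * math.exp(z * se)

# 25,000 orders pre and 10,000 post; denominators are hypothetical
rr, lo, hi = rate_ratio_ci(25000, 100000, 10000, 64000)
```

    With these invented denominators the ratio is 0.625 with a CI of roughly (0.61, 0.64), matching the shape of the reported interval.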

  7. 78 FR 78694 - Orders: Supplemental Orders on Reporting by Regulated Entities of Stress Testing Results as of...

    Science.gov (United States)

    2013-12-27

    ... reporting under section 165(i)(2) of the Dodd-Frank Wall Street Reform and Consumer Protection Act (Dodd... testing. II. Orders For the convenience of the affected parties, the text of the Supplemental Orders... Reform and Consumer Protection Act (``Dodd-Frank Act'') requires certain financial companies with total...

  8. Auto-identification fiberoptical seal verifier

    International Nuclear Information System (INIS)

    Yamamoto, Yoichi; Mukaiyama, Takehiro

    1998-08-01

    An auto COBRA seal verifier was developed by the Japan Atomic Energy Research Institute (JAERI) to provide more efficient and simpler inspection measures for IAEA safeguards. The verifier is designed to provide a means of simple, quantitative and objective judgment for in-situ verification of the COBRA seal. The equipment is a portable unit of hand-held weight and size. It can be operated by battery or AC power. The verifier reads a COBRA seal signature by using a built-in CCD camera and carries out the signature comparison procedure automatically on a digital basis. The result of signature comparison is given as a YES/NO answer. The production model of the verifier was completed in July 1996. The development was carried out in collaboration with Mitsubishi Heavy Industries, Ltd. This report describes the design and functions of the COBRA seal verifier and the results of environmental and functional tests. The development of the COBRA seal verifier was carried out in the framework of the Japan Support Programme for Agency Safeguards (JASPAS) as project JD-4, since 1981. (author)

  9. Direct test of the Gaussian auxiliary field ansatz in nonconserved order parameter phase ordering dynamics

    Science.gov (United States)

    Yeung, Chuck

    2018-06-01

    The assumption that the local order parameter is related to an underlying spatially smooth auxiliary field, u(r⃗, t), is a common feature in theoretical approaches to non-conserved order parameter phase separation dynamics. In particular, the ansatz that u(r⃗, t) is a Gaussian random field leads to predictions for the decay of the autocorrelation function which are consistent with observations, but distinct from predictions using alternative theoretical approaches. In this paper, the auxiliary field is obtained directly from simulations of the time-dependent Ginzburg-Landau equation in two and three dimensions. The results show that u(r⃗, t) is equivalent to the distance to the nearest interface. In two dimensions, the probability distribution, P(u), is well approximated as Gaussian except for small values of u/L(t), where L(t) is the characteristic length scale of the patterns. The behavior of P(u) in three dimensions is more complicated; the non-Gaussian region for small u/L(t) is much larger than that in two dimensions, but the tails of P(u) begin to approach a Gaussian form at intermediate times. However, at later times, the tails of the probability distribution appear to decay faster than a Gaussian distribution.
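
    The non-conserved (model A) time-dependent Ginzburg-Landau dynamics underlying such simulations can be sketched with an explicit Euler scheme; the grid size, time step, and initial noise level below are arbitrary choices for illustration, not the paper's parameters:

```python
import numpy as np

def tdgl_step(phi, dt=0.05):
    """One explicit Euler step of the non-conserved (model A) TDGL
    equation dphi/dt = phi - phi**3 + laplacian(phi), periodic BCs.
    Minimal sketch only."""
    lap = (np.roll(phi, 1, 0) + np.roll(phi, -1, 0) +
           np.roll(phi, 1, 1) + np.roll(phi, -1, 1) - 4.0 * phi)
    return phi + dt * (phi - phi ** 3 + lap)

rng = np.random.default_rng(1)
phi = 0.1 * rng.standard_normal((64, 64))  # small random initial condition
for _ in range(400):
    phi = tdgl_step(phi)
# Coarsening drives the field toward the +/-1 wells, except at interfaces
mean_abs = float(np.abs(phi).mean())
```

    After a few hundred steps the order parameter sits near ±1 almost everywhere, and the auxiliary field u(r⃗, t) can then be extracted as the signed distance to the nearest zero crossing of phi.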

  10. Comparison of VerifyNow-P2Y12 test and Flow Cytometry for monitoring individual platelet response to clopidogrel. What is the cut-off value for identifying patients who are low responders to clopidogrel therapy?

    Directory of Open Access Journals (Sweden)

    Castelli Alfredo

    2009-05-01

    Full Text Available Abstract Background Dual anti-platelet therapy with aspirin and a thienopyridine (DAT) is used to prevent stent thrombosis after percutaneous coronary intervention (PCI). Low response to clopidogrel therapy (LR) occurs, but laboratory tests have a controversial role in the identification of this condition. Methods We studied LR in patients with stable angina undergoing elective PCI, all on DAT for at least 7 days, by comparing: 1) flow cytometry (FC) to measure platelet membrane expression of P-selectin (CD62P) and PAC-1 binding following double stimulation with ADP and collagen type I in the presence of prostaglandin (PG) E1; 2) the VerifyNow-P2Y12 test, in which results are reported as absolute P2Y12-Reaction-Units (PRU) or % of inhibition (% inhibition). Results Thirty controls and 52 patients were analyzed. The median percentage of platelets exhibiting CD62P expression and PAC-1 binding by FC evaluation after stimulation in the presence of PG E1 was 25.4% (IQR: 21.4–33.1%) and 3.5% (1.7–9.4%), respectively. Only 6 patients receiving DAT (11.5%) had both values above the 1st quartile of controls, and were defined as LR. Evaluation of the same patients with the VerifyNow-P2Y12 test revealed that the area under the receiver-operating-characteristic (ROC) curve was 0.94 (95% CI: 0.84–0.98); a cut-off value of > 213 PRU gave the maximum accuracy for the detection of patients defined as having LR by FC. Conclusion Our findings show that a cut-off value of ≤ 15% inhibition or > 213 PRU in the VerifyNow-P2Y12 test may provide the best accuracy for the identification of patients with LR.
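
    Choosing a cut-off such as > 213 PRU amounts to scanning candidate thresholds for the one that best separates low responders from the rest. A minimal accuracy-maximizing scan, on invented toy data rather than the study's measurements, looks like this:

```python
import numpy as np

def best_cutoff(values, labels):
    """Scan every observed value as a candidate threshold and keep the
    one maximizing classification accuracy (positive when value > t)."""
    best_t, best_acc = None, -1.0
    for t in np.unique(values):
        acc = float(((values > t) == labels).mean())
        if acc > best_acc:
            best_t, best_acc = t, acc
    return best_t, best_acc

# Invented PRU readings; True marks patients flagged as low responders by FC
pru = np.array([120, 150, 180, 200, 210, 220, 240, 260, 280, 300])
low = np.array([False, False, False, False, False,
                True, True, True, True, True])
cutoff, acc = best_cutoff(pru, low)
```

    In practice the threshold is usually read off a ROC curve (maximizing accuracy or the Youden index) rather than a brute-force scan, but the idea is the same.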

  11. Test-retest reliability and task order effects of emotional cognitive tests in healthy subjects.

    Science.gov (United States)

    Adams, Thomas; Pounder, Zoe; Preston, Sally; Hanson, Andy; Gallagher, Peter; Harmer, Catherine J; McAllister-Williams, R Hamish

    2016-11-01

    Little is known of the retest reliability of emotional cognitive tasks or the impact of using different tasks employing similar emotional stimuli within a battery. We investigated this in healthy subjects. We found improved overall performance in an emotional attentional blink task (EABT) with repeat testing at one hour and one week compared to baseline, but the impact of an emotional stimulus on performance was unchanged. Similarly, performance on a facial expression recognition task (FERT) was better one week after a baseline test, though the relative effect of specific emotions was unaltered. There was no effect of repeat testing on an emotional word categorising, recall and recognition task. We found no difference in performance in the FERT and EABT irrespective of task order. We concluded that it is possible to use emotional cognitive tasks in longitudinal studies and combine tasks using emotional facial stimuli in a single battery.

  12. Optical nonclassicality test based on third-order intensity correlations

    Science.gov (United States)

    Rigovacca, L.; Kolthammer, W. S.; Di Franco, C.; Kim, M. S.

    2018-03-01

    We develop a nonclassicality criterion for the interference of three delayed, but otherwise identical, light fields in a three-mode Bell interferometer. We do so by comparing the prediction of quantum mechanics with those of a classical framework in which independent sources emit electric fields with random phases. In particular, we evaluate third-order correlations among output intensities as a function of the delays, and show how the presence of a correlation revival for small delays cannot be explained by the classical model of light. The observation of a revival is thus a nonclassicality signature, which can be achieved only by sources with a photon-number statistics that is highly sub-Poissonian. Our analysis provides strong evidence for the nonclassicality of the experiment discussed by Menssen et al. [Phys. Rev. Lett. 118, 153603 (2017), 10.1103/PhysRevLett.118.153603], and shows how a collective "triad" phase affects the interference of any three or more light fields, irrespective of their quantum or classical character.

  13. A control system verifier using automated reasoning software

    International Nuclear Information System (INIS)

    Smith, D.E.; Seeman, S.E.

    1985-08-01

    An on-line, automated reasoning software system for verifying the actions of other software or human control systems has been developed. It was demonstrated by verifying the actions of an automated procedure generation system. The verifier uses an interactive theorem prover as its inference engine with the rules included as logical axioms. Operation of the verifier is generally transparent except when the verifier disagrees with the actions of the monitored software. Testing with an automated procedure generation system demonstrates the successful application of automated reasoning software for verification of logical actions in a diverse, redundant manner. A higher degree of confidence may be placed in the verified actions of the combined system

  14. Impact of providing fee data on laboratory test ordering: a controlled clinical trial.

    Science.gov (United States)

    Feldman, Leonard S; Shihab, Hasan M; Thiemann, David; Yeh, Hsin-Chieh; Ardolino, Margaret; Mandell, Steven; Brotman, Daniel J

    2013-05-27

    Inpatient care providers often order laboratory tests without any appreciation for the costs of the tests. To determine whether we could decrease the number of laboratory tests ordered by presenting providers with test fees at the time of order entry in a tertiary care hospital, without adding extra steps to the ordering process. Controlled clinical trial. Tertiary care hospital. All providers, including physicians and nonphysicians, who ordered laboratory tests through the computerized provider order entry system at The Johns Hopkins Hospital. We randomly assigned 61 diagnostic laboratory tests to an "active" arm (fee displayed) or to a control arm (fee not displayed). During a 6-month baseline period (November 10, 2008, through May 9, 2009), we did not display any fee data. During a 6-month intervention period 1 year later (November 10, 2009, through May 9, 2010), we displayed fees, based on the Medicare allowable fee, for active tests only. We examined changes in the total number of orders placed, the frequency of ordered tests (per patient-day), and total charges associated with the orders according to the time period (baseline vs intervention period) and by study group (active test vs control). For the active arm tests, rates of test ordering were reduced from 3.72 tests per patient-day in the baseline period to 3.40 tests per patient-day in the intervention period (8.59% decrease; 95% CI, -8.99% to -8.19%). For control arm tests, ordering increased from 1.15 to 1.22 tests per patient-day from the baseline period to the intervention period (5.64% increase; 95% CI, 4.90% to 6.39%). Displaying fee data to providers at the time of order entry resulted in a modest decrease in test ordering. Adoption of this intervention may reduce the number of inappropriately ordered diagnostic tests.
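
    The arm-specific percentage changes can be recomputed naively from the reported rates (3.72 to 3.40 and 1.15 to 1.22 tests per patient-day). The abstract's figures (8.59% decrease, 5.64% increase) come from the study's statistical model, so this back-of-envelope version differs slightly:

```python
def pct_change(before, after):
    """Percent change in an ordering rate (tests per patient-day)."""
    return (after - before) / before * 100.0

active = pct_change(3.72, 3.40)   # active arm: fee displayed, ~-8.6%
control = pct_change(1.15, 1.22)  # control arm: fee hidden, ~+6.1%
net_effect = active - control     # naive difference-in-differences
```

    The negative net effect is what supports the conclusion that displaying fees shifted ordering behavior relative to the control tests.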

  15. Diagnostic and laboratory test ordering in Northern Portuguese Primary Health Care: a cross-sectional study

    Science.gov (United States)

    Sá, Luísa; Teixeira, Andreia Sofia Costa; Tavares, Fernando; Costa-Santos, Cristina; Couto, Luciana; Costa-Pereira, Altamiro; Hespanhol, Alberto Pinto; Santos, Paulo

    2017-01-01

    Objectives To characterise the test ordering pattern in Northern Portugal and to investigate the influence of context-related factors, analysing the test ordered at the level of geographical groups of family physicians and at the level of different healthcare organisations. Design Cross-sectional study. Setting Northern Primary Health Care, Portugal. Participants Records about diagnostic and laboratory tests ordered from 2035 family physicians working at the Northern Regional Health Administration, who served approximately 3.5 million Portuguese patients, in 2014. Outcomes To determine the 20 most ordered diagnostic and laboratory tests in the Northern Regional Health Administration; to identify the presence and extent of variations in the 20 most ordered diagnostic and laboratory tests between the Groups of Primary Care Centres and between health units; and to study factors that may explain these variations. Results The 20 most ordered diagnostic and laboratory tests almost entirely comprise laboratory tests and account for 70.9% of the total tests requested. We can trace a major pattern of test ordering for haemogram, glucose, lipid profile, creatinine and urinalysis. There was a significant difference (P<0.001) in test orders for all tests between Groups of Primary Care Centres and for all tests, except glycated haemoglobin (P=0.06), between health units. Generally, the Personalised Healthcare Units ordered more than Family Health Units. Conclusions The results from this study show that the most commonly ordered tests in Portugal are laboratory tests, that there is a tendency for overtesting and that there is a large variability in diagnostic and laboratory test ordering in different geographical and organisational Portuguese primary care practices, suggesting that there may be considerable potential for the rationalisation of test ordering. The existence of Family Health Units seems to be a strong determinant in decreasing test ordering by Portuguese family

  16. Unconditionally verifiable blind quantum computation

    Science.gov (United States)

    Fitzsimons, Joseph F.; Kashefi, Elham

    2017-07-01

    Blind quantum computing (BQC) allows a client to have a server carry out a quantum computation for them such that the client's input, output, and computation remain private. A desirable property for any BQC protocol is verification, whereby the client can verify with high probability whether the server has followed the instructions of the protocol or if there has been some deviation resulting in a corrupted output state. A verifiable BQC protocol can be viewed as an interactive proof system leading to consequences for complexity theory. We previously proposed [A. Broadbent, J. Fitzsimons, and E. Kashefi, in Proceedings of the 50th Annual Symposium on Foundations of Computer Science, Atlanta, 2009 (IEEE, Piscataway, 2009), p. 517] a universal and unconditionally secure BQC scheme where the client only needs to be able to prepare single qubits in separable states randomly chosen from a finite set and send them to the server, who has the balance of the required quantum computational resources. In this paper we extend that protocol with additional functionality allowing blind computational basis measurements, which we use to construct another verifiable BQC protocol based on a different class of resource states. We rigorously prove that the probability of failing to detect an incorrect output is exponentially small in a security parameter, while resource overhead remains polynomial in this parameter. This resource state allows entangling gates to be performed between arbitrary pairs of logical qubits with only constant overhead. This is a significant improvement on the original scheme, which required that all computations to be performed first be put into a nearest-neighbor form, incurring linear overhead in the number of qubits. Such an improvement has important consequences for efficiency and fault-tolerance thresholds.

  17. Test ordering by GP trainees: Effects of an educational intervention on attitudes and intended practice.

    Science.gov (United States)

    Morgan, Simon; Morgan, Andy; Kerr, Rohan; Tapley, Amanda; Magin, Parker

    2016-09-01

    To assess the effectiveness of an educational intervention on test-ordering attitudes and intended practice of GP trainees, and any associations between changes in test ordering and trainee characteristics. Preworkshop and postworkshop survey of attitudes to test ordering, intended test-ordering practices for 3 clinical scenarios (fatigue, screening, and shoulder pain), and tolerance for uncertainty. Three Australian regional general practice training providers. General practice trainees (N = 167). A 2-hour workshop session and an online module. Proportion of trainees who agreed with attitudinal statements before and after the workshop; proportion of trainees who would order tests, mean number of tests ordered, and number of appropriate and inappropriate tests ordered for each scenario before and after the workshop. Of 167 trainees, 132 (79.0%) completed both the preworkshop and postworkshop questionnaires. A total of 122 trainees attended the workshop. At baseline, 88.6% thought that tests can harm patients, 84.8% believed overtesting was a problem, 72.0% felt pressured by patients, 52.3% believed that tests would reassure patients, and 50.8% thought that they were less likely to be sued if they ordered tests. There were desirable changes in all attitudes after the workshop. Before the workshop, the mean number of tests that trainees would have ordered was 4.4, 4.8, and 1.5 for the fatigue, screening, and shoulder pain scenarios, respectively. After the workshop there were decreases in the mean number of both appropriate tests (decrease of 0.94) and inappropriate tests (decrease of 0.24) in the fatigue scenario; there was no change in the mean number of appropriate tests and a decrease in inappropriate tests (decrease of 0.76) in the screening scenario; and there was an increase in the proportion of trainees who would appropriately not order tests in the shoulder pain scenario. There were no significant associations between changes in test ordering and trainee

  18. A performance evaluation of personnel identity verifiers

    International Nuclear Information System (INIS)

    Maxwell, R.L.; Wright, L.J.

    1987-01-01

    Personnel identity verification devices, which are based on the examination and assessment of a body feature or a unique repeatable personal action, are steadily improving. These biometric devices are becoming more practical with respect to accuracy, speed, user compatibility, reliability and cost, but more development is necessary to satisfy the varied and sometimes ill-defined future requirements of the security industry. In an attempt to maintain an awareness of the availability and the capabilities of identity verifiers for the DOE security community, Sandia Laboratories continues to comparatively evaluate the capabilities and improvements of developing devices. An evaluation of several recently available verifiers is discussed in this paper. Operating environments and procedures more typical of physical access control use can reveal performance substantially different from the basic laboratory tests

  19. Selective otolith dysfunctions objectively verified.

    Science.gov (United States)

    Manzari, Leonardo; MacDougall, Hamish G; Burgess, Ann M; Curthoys, Ian S

    2014-01-01

    Vertigo and vigorous horizontal spontaneous nystagmus in a presenting patient is usually taken to indicate unilaterally reduced horizontal canal function. However here we report results which question that presumption. In three such patients with an acute vestibular syndrome, complete testing of all peripheral vestibular sense organs using new tests of canal and otolith function (vHIT and VEMPs) showed that semicircular canal function was normal, but that there were unilateral otolithic deficits which probably caused their acute syndrome.

  20. 77 FR 15101 - Results From Inert Ingredient Test Orders Issued Under EPA's Endocrine Disruptor Screening...

    Science.gov (United States)

    2012-03-14

    ... the selection criteria for endocrine testing under the Safe Drinking Water Act (SDWA). EPA has no...) because the chemicals meet the selection criteria. EPA has no plans to issue further test orders for the... Screening Program (EDSP) and the Federal Food, Drug, and Cosmetic Act (FFDCA). In response to the test...

  1. Automated measurement and control of concrete properties in a ready mix truck with VERIFI.

    Science.gov (United States)

    2014-02-01

In this research, twenty batches of concrete with six different mixture proportions were tested with VERIFI to evaluate 1) accuracy and repeatability of VERIFI measurements, 2) ability of VERIFI to adjust slump automatically with water and admixtur...

  2. Benchmarking to Identify Practice Variation in Test Ordering: A Potential Tool for Utilization Management.

    Science.gov (United States)

    Signorelli, Heather; Straseski, Joely A; Genzen, Jonathan R; Walker, Brandon S; Jackson, Brian R; Schmidt, Robert L

    2015-01-01

Appropriate test utilization is usually evaluated by adherence to published guidelines. In many cases, medical guidelines are not available. Benchmarking has been proposed as a method to identify practice variations that may represent inappropriate testing. This study investigated the use of benchmarking to identify sites with inappropriate utilization of testing for a particular analyte. We used a Web-based survey to compare 2 measures of vitamin D utilization: overall testing intensity (ratio of total vitamin D orders to blood-count orders) and relative testing intensity (ratio of 1,25(OH)2D to 25(OH)D test orders). A total of 81 facilities contributed data. The average overall testing intensity index was 0.165, or approximately 1 vitamin D test for every 6 blood-count tests. The average relative testing intensity index was 0.055, or one 1,25(OH)2D test for every 18 of the 25(OH)D tests. Both indexes varied considerably. Benchmarking can be used as a screening tool to identify outliers that may be associated with inappropriate test utilization.
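As a rough illustration of the two indexes defined in this abstract, the sketch below computes overall and relative testing intensity from synthetic per-facility counts. Only the two ratio definitions come from the abstract; the facility names and all counts are invented.

```python
# Two testing-intensity indexes from the benchmarking abstract,
# computed over synthetic (illustrative) per-facility order counts.

def overall_intensity(vitd_total, blood_count_orders):
    """Overall testing intensity: total vitamin D orders per blood-count order."""
    return vitd_total / blood_count_orders

def relative_intensity(orders_125, orders_25):
    """Relative testing intensity: 1,25(OH)2D orders per 25(OH)D order."""
    return orders_125 / orders_25

facilities = {
    "A": {"vitd": 160, "cbc": 1000, "d125": 5, "d25": 155},
    "B": {"vitd": 700, "cbc": 1000, "d125": 90, "d25": 610},  # candidate outlier
}

for name, c in facilities.items():
    oti = overall_intensity(c["vitd"], c["cbc"])
    rti = relative_intensity(c["d125"], c["d25"])
    print(name, round(oti, 3), round(rti, 3))
```

In a benchmarking setting, facilities whose indexes sit far from the survey averages (0.165 and 0.055 in the study) would be flagged for utilization review.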

  3. Application of automated reasoning software: procedure generation system verifier

    International Nuclear Information System (INIS)

    Smith, D.E.; Seeman, S.E.

    1984-09-01

    An on-line, automated reasoning software system for verifying the actions of other software or human control systems has been developed. It was demonstrated by verifying the actions of an automated procedure generation system. The verifier uses an interactive theorem prover as its inference engine with the rules included as logic axioms. Operation of the verifier is generally transparent except when the verifier disagrees with the actions of the monitored software. Testing with an automated procedure generation system demonstrates the successful application of automated reasoning software for verification of logical actions in a diverse, redundant manner. A higher degree of confidence may be placed in the verified actions gathered by the combined system

  4. 1,2-Ethylene Dichloride; Final Enforceable Consent Agreement and Testing Consent Order

    Science.gov (United States)

    This document announces that EPA has signed an enforceable testing Consent Order with the Dow Chemical Co, Vulcan Materials Co, Occidental Chemical Corp, Oxy Vinyls, LP, Georgia Gulf Corp, Westlake Chemical Corp, PPG Industries, Inc., and Formosa Plastics.

  5. Examination of changes in pathology tests ordered by Diagnosis-Related Group (DRGs) following CPOE introduction.

    Science.gov (United States)

    Vecellio, Elia; Georgiou, Andrew; Toouli, George; Eigenstetter, Alex; Li, Ling; Wilson, Roger; Westbrook, Johanna I

    2013-01-01

    Electronic test ordering, via the Electronic Medical Record (EMR), which incorporates computerised provider order entry (CPOE), is widely considered as a useful tool to support appropriate pathology test ordering. Diagnosis-related groups (DRGs) are clinically meaningful categories that allow comparisons in pathology utilisation by patient groups by controlling for many potentially confounding variables. This study used DRG data linked to pathology test data to examine changes in rates of test ordering across four years coinciding with the introduction of an EMR in six hospitals in New South Wales, Australia. This method generated a list of high pathology utilisation DRGs. We investigated patients with a Chest pain DRG to examine whether tests rates changed for specific test groups by hospital emergency department (ED) pre- and post-EMR. There was little change in testing rates between EDs or between time periods pre- and post-EMR. This is a valuable method for monitoring the impact of EMR and clinical decision support on test order rates.

  6. Verifying design patterns in Hoare Type Theory

    DEFF Research Database (Denmark)

    Svendsen, Kasper; Buisse, Alexandre; Birkedal, Lars

In this technical report we document our experiments formally verifying three design patterns in Hoare Type Theory.

  7. USCIS E-Verify Program Reports

    Data.gov (United States)

    Department of Homeland Security — The report builds on the last comprehensive evaluation of the E-Verify Program and demonstrates that E-Verify produces accurate results and that accuracy rates have...

  8. Test order in teacher-rated behavior assessments: Is counterbalancing necessary?

    Science.gov (United States)

    Kooken, Janice; Welsh, Megan E; McCoach, D Betsy; Miller, Faith G; Chafouleas, Sandra M; Riley-Tillman, T Chris; Fabiano, Gregory

    2017-01-01

Counterbalancing treatment order in experimental research design is well established as an option to reduce threats to internal validity, but in educational and psychological research, the effect of varying the order of multiple tests to a single rater has not been examined and is rarely adhered to in practice. The current study examines the effect of test order on measures of student behavior by teachers as raters utilizing data from a behavior measure validation study. Using multilevel modeling to control for students nested within teachers, the effect of rating an earlier measure on the intercept or slope of a later behavior assessment was statistically significant in 22% of predictor main effects for the spring test period. Test order effects had potential for high stakes consequences with differences large enough to change risk classification. Results suggest that researchers and practitioners in classroom settings using multiple measures evaluate the potential impact of test order. Where possible, they should counterbalance when the risk of an order effect exists and report justification for the decision to not counterbalance.

  9. The Implications of Family Size and Birth Order for Test Scores and Behavioral Development

    Science.gov (United States)

    Silles, Mary A.

    2010-01-01

    This article, using longitudinal data from the National Child Development Study, presents new evidence on the effects of family size and birth order on test scores and behavioral development at age 7, 11 and 16. Sibling size is shown to have an adverse causal effect on test scores and behavioral development. For any given family size, first-borns…

  10. Testing the Suitability of Mediation of Child Support Orders in Title IV-D Cases

    Science.gov (United States)

    Schraufnagel, Scot; Li, Quan

    2010-01-01

    Objectives: The purpose of this study is to test mediation versus a traditional court process for the establishment or modification of child support orders. The intention is to determine which dispute resolution process is associated with greater client satisfaction and compliance. An auxiliary objective is to test the type of cases which are most…

  11. Using computerized provider order entry to enforce documentation of tests with pending results at hospital discharge.

    Science.gov (United States)

    Cadwallader, J; Asirwa, C; Li, X; Kesterson, J; Tierney, W M; Were, M C

    2012-01-01

Small numbers of tests with pending results are documented in hospital discharge summaries, leading to breakdowns in communication and medical errors due to inadequate follow-up. We evaluated the effect of using a computerized provider order entry (CPOE) system to enforce documentation of tests with pending results in hospital discharge summaries. We assessed the percent of all tests with pending results and those with actionable results that were documented before (n = 182 discharges) and after (n = 203 discharges) implementing the CPOE-enforcement tool. We also surveyed providers (n = 52) about the enforcement functionality. Documentation of all tests with pending results improved from 12% (87/701 tests) before to 22% (178/812 tests) (p = 0.02) after implementation. Documentation of tests with eventual actionable results increased from 0% (0/24) to 50% (14/28). Enforcing entry of tests with pending results into discharge summaries significantly increased documentation rates, especially of actionable tests. However, gaps in documentation still exist.

  12. Changes in pathology test ordering by early career general practitioners: a longitudinal study.

    Science.gov (United States)

    Magin, Parker J; Tapley, Amanda; Morgan, Simon; Henderson, Kim; Holliday, Elizabeth G; Davey, Andrew R; Ball, Jean; Catzikiris, Nigel F; Mulquiney, Katie J; van Driel, Mieke L

    2017-07-17

To assess the number of pathology tests ordered by general practice registrars during their first 18-24 months of clinical general practice. Longitudinal analysis of ten rounds of data collection (2010-2014) for the Registrar Clinical Encounters in Training (ReCEnT) study, an ongoing, multicentre, cohort study of general practice registrars in Australia. The principal analysis employed negative binomial regression in a generalised estimating equations framework (to account for repeated measures on registrars). Setting, participants: General practice registrars in training posts with five of 17 general practice regional training providers in five Australian states. The registrar participation rate was 96.4%. Number of pathology tests requested per consultation. The time unit for analysis was the registrar training term (the 6-month full-time equivalent component of clinical training); registrars contributed data for up to four training terms. 876 registrars contributed data for 114 584 consultations. The number of pathology tests requested increased by 11% (95% CI, 8-15%). Pathology test ordering by general practice registrars increased significantly during their first 2 years of clinical practice, which raises concerns about overtesting. As established general practitioners order fewer tests than registrars, test ordering may peak during late vocational training and early career practice. Registrars need support during this difficult period in the development of their clinical practice patterns.

  13. Rapid Detection of the Chlamydiaceae and Other Families in the Order Chlamydiales: Three PCR Tests

    Science.gov (United States)

    Everett, Karin D. E.; Hornung, Linda J.; Andersen, Arthur A.

    1999-01-01

    Few identification methods will rapidly or specifically detect all bacteria in the order Chlamydiales, family Chlamydiaceae. In this study, three PCR tests based on sequence data from over 48 chlamydial strains were developed for identification of these bacteria. Two tests exclusively recognized the Chlamydiaceae: a multiplex test targeting the ompA gene and the rRNA intergenic spacer and a TaqMan test targeting the 23S ribosomal DNA. The multiplex test was able to detect as few as 200 inclusion-forming units (IFU), while the TaqMan test could detect 2 IFU. The amplicons produced in these tests ranged from 132 to 320 bp in length. The third test, targeting the 23S rRNA gene, produced a 600-bp amplicon from strains belonging to several families in the order Chlamydiales. Direct sequence analysis of this amplicon has facilitated the identification of new chlamydial strains. These three tests permit ready identification of chlamydiae for diagnostic and epidemiologic study. The specificity of these tests indicates that they might also be used to identify chlamydiae without culture or isolation. PMID:9986815

  14. Analyser Framework to Verify Software Components

    Directory of Open Access Journals (Sweden)

    Rolf Andreas Rasenack

    2009-01-01

Today it is important for software companies to build software systems in a short time interval, to reduce costs, and to maintain a good market position. Well-organized, systematic development approaches are therefore required. Reusing software components that are already well tested can be an effective way to develop software applications: reuse is less expensive and less time-consuming than development from scratch. But it is dangerous to assume that software components can be combined without problems. The components themselves are well tested, of course, yet problems can still occur when they are composed, and most of these problems arise from interaction and communication. To avoid such errors, a framework for analysing software components has to be developed; this framework determines the compatibility of corresponding software components. The approach discussed here presents a novel technique for analysing software components by applying an Abstract Syntax Language Tree (ASLT). A supportive environment is designed that checks the compatibility of black-box software components. This article addresses the question of how coupled software components can be verified using an analyser framework and describes the role of the ASLT. Black-box software components and the Abstract Syntax Language Tree are the basis for developing the proposed framework and are discussed here to provide the background knowledge. The practical implementation of the framework is discussed, and results are shown using a test environment.
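The ASLT framework itself is not reproduced in the abstract, but its core idea — parsing components into syntax trees and checking their interfaces for compatibility — can be sketched with Python's `ast` module. Everything below (the component sources, the parameter-count matching rule) is an invented, simplified analogue, not the authors' implementation.

```python
# Simplified syntax-tree compatibility check between two "components":
# every external function the consumer calls must be provided by the
# provider with a matching positional-parameter count.
import ast

provider_src = """
def read_sensor(channel, timeout):
    return 0
"""

consumer_src = """
def poll():
    return read_sensor(1, 0.5)
"""

def exported_signatures(src):
    """Map each top-level function name to its positional-arg count."""
    tree = ast.parse(src)
    return {n.name: len(n.args.args)
            for n in ast.walk(tree) if isinstance(n, ast.FunctionDef)}

def external_calls(src):
    """Map each called-but-not-locally-defined name to its arg count."""
    tree = ast.parse(src)
    local = set(exported_signatures(src))
    calls = {}
    for n in ast.walk(tree):
        if isinstance(n, ast.Call) and isinstance(n.func, ast.Name):
            if n.func.id not in local:
                calls[n.func.id] = len(n.args)
    return calls

def compatible(provider, consumer):
    sigs = exported_signatures(provider)
    return all(name in sigs and sigs[name] == argc
               for name, argc in external_calls(consumer).items())

print(compatible(provider_src, consumer_src))  # True: call matches signature
```

A real component analyser would also compare types, exceptions, and protocol ordering; this sketch only shows the black-box, tree-based flavour of the check.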

  15. Ordering molecular genetic tests and reporting results: practices in laboratory and clinical settings.

    Science.gov (United States)

    Lubin, Ira M; Caggana, Michele; Constantin, Carolyn; Gross, Susan J; Lyon, Elaine; Pagon, Roberta A; Trotter, Tracy L; Wilson, Jean Amos; McGovern, Margaret M

    2008-09-01

    Previous studies have suggested that patient care may be compromised as a consequence of poor communication between clinicians and laboratory professionals in cases in which molecular genetic test results are reported. To understand better the contributing factors to such compromised care, we investigated both pre- and postanalytical processes using cystic fibrosis mutation analysis as our model. We found that although the majority of test requisition forms requested patient/family information that was necessary for the proper interpretation of test results, in many cases, these data were not provided by the individuals filling out the forms. We found instances in which result reports for simulated diagnostic testing described individuals as carriers where only a single mutation was found with no comment pertaining to a diagnosis of cystic fibrosis. Similarly, reports based on simulated scenarios for carrier testing were problematic when no mutations were identified, and the patient's race/ethnicity and family history were not discussed in reference to residual risk of disease. Remarkably, a pilot survey of obstetrician-gynecologists revealed that office staff, including secretaries, often helped order genetic tests and reported test results to patients, raising questions about what efforts are undertaken to ensure personnel competency. These findings are reviewed in light of what efforts should be taken to improve the quality of test-ordering and result-reporting practices.

  16. Intelligence Test Scores and Birth Order among Young Norwegian Men (Conscripts) Analyzed within and between Families

    Science.gov (United States)

    Bjerkedal, Tor; Kristensen, Petter; Skjeret, Geir A.; Brevik, John I.

    2007-01-01

    The present paper reports the results of a within and between family analysis of the relation between birth order and intelligence. The material comprises more than a quarter of a million test scores for intellectual performance of Norwegian male conscripts recorded during 1984-2004. Conscripts, mostly 18-19 years of age, were born to women for…

  17. A unified framework for testing in the linear regression model under unknown order of fractional integration

    DEFF Research Database (Denmark)

    Christensen, Bent Jesper; Kruse, Robinson; Sibbertsen, Philipp

    We consider hypothesis testing in a general linear time series regression framework when the possibly fractional order of integration of the error term is unknown. We show that the approach suggested by Vogelsang (1998a) for the case of integer integration does not apply to the case of fractional...

  18. Verifying FreeRTOS; a feasibility study

    NARCIS (Netherlands)

    Pronk, C.

    2010-01-01

    This paper presents a study on modeling and verifying the kernel of Real-Time Operating Systems (RTOS). The study will show advances in formally verifying such an RTOS both by refinement and by model checking approaches. This work fits in the context of Hoare’s verification challenge. Several

  19. Identification of reduced-order model for an aeroelastic system from flutter test data

    Directory of Open Access Journals (Sweden)

    Wei Tang

    2017-02-01

Recently, flutter active control using the linear parameter varying (LPV) framework has attracted a lot of attention. LPV control synthesis usually generates controllers that are at least of the same order as the aeroelastic models. A reduced-order model is therefore required by the synthesis to avoid large computation cost and a high-order controller. This paper proposes a new procedure for generating accurate reduced-order linear time-invariant (LTI) models by using system identification from flutter test data. The proposed approach has two steps. The well-known poly-reference least squares complex frequency (p-LSCF) algorithm is first employed for modal parameter identification from frequency response measurements. After parameter identification, the dominant physical modes are determined by clear stabilization diagrams and a clustering technique. In the second step, with prior knowledge of the physical poles, an improved frequency-domain maximum likelihood (ML) estimator is presented for building an accurate reduced-order model. Before ML estimation, an improved subspace identification considering the pole constraint is also proposed for initializing the iterative procedure. Finally, the performance of the proposed procedure is validated by real flight flutter test data.

  20. Evaluating a mobile application for improving clinical laboratory test ordering and diagnosis.

    Science.gov (United States)

    Meyer, Ashley N D; Thompson, Pamela J; Khanna, Arushi; Desai, Samir; Mathews, Benji K; Yousef, Elham; Kusnoor, Anita V; Singh, Hardeep

    2018-04-20

    Mobile applications for improving diagnostic decision making often lack clinical evaluation. We evaluated if a mobile application improves generalist physicians' appropriate laboratory test ordering and diagnosis decisions and assessed if physicians perceive it as useful for learning. In an experimental, vignette study, physicians diagnosed 8 patient vignettes with normal prothrombin times (PT) and abnormal partial thromboplastin times (PTT). Physicians made test ordering and diagnosis decisions for 4 vignettes using each resource: a mobile app, PTT Advisor, developed by the Centers for Disease Control and Prevention (CDC)'s Clinical Laboratory Integration into Healthcare Collaborative (CLIHC); and usual clinical decision support. Then, physicians answered questions regarding their perceptions of the app's usefulness for diagnostic decision making and learning using a modified Kirkpatrick Training Evaluation Framework. Data from 368 vignettes solved by 46 physicians at 7 US health care institutions show advantages for using PTT Advisor over usual clinical decision support on test ordering and diagnostic decision accuracy (82.6 vs 70.2% correct; P < .001), confidence in decisions (7.5 vs 6.3 out of 10; P < .001), and vignette completion time (3:02 vs 3:53 min.; P = .06). Physicians reported positive perceptions of the app's potential for improved clinical decision making, and recommended it be used to address broader diagnostic challenges. A mobile app, PTT Advisor, may contribute to better test ordering and diagnosis, serve as a learning tool for diagnostic evaluation of certain clinical disorders, and improve patient outcomes. Similar methods could be useful for evaluating apps aimed at improving testing and diagnosis for other conditions.

  1. Inappropriate emergency laboratory test ordering: defensive or peer evidence shared based medicine?

    Directory of Open Access Journals (Sweden)

    C. Descovich

    2013-05-01

BACKGROUND Laboratory overuse is widely prevalent in hospital practice, particularly in emergency care. Reasons for excessive and inappropriate test-ordering include defensive behaviour and fear or uncertainty, lack of experience, the misuse of protocols and guidelines, "routine" and local attitudes, inadequate educational feedback, and clinicians' unawareness of the cost of examinations and their related implications. AIM OF THE STUDY AND METHODS The primary target of our working group was to reduce inappropriate ordering of tests on an urgent basis while implementing further examinations not yet included in the hospital panel of available urgent tests, in accordance with the concept of evidence-based diagnosis. The secondary goal was to indicate strategies for re-engineering the processes, improving turnaround time in the laboratory management of emergencies. After evaluating, as a first intervention, the most reliable sources of practice guidelines, systematic reviews and RCTs, the committee discussed the main topics with in-hospital stakeholders selected from the Emergency, Internal Medicine and Surgery Departments. The working group, in many subsequent audits, sought systematic feedback from all the professionals involved. RESULTS After reviewing the evidence in the literature, the board constrained testing options by defining the basic emergency laboratory panel of tests (blood type, hemogram, blood urea nitrogen, plasma creatinine, glucose, sodium, potassium, chloride, osmolarity, CRP, bicarbonate, CPK, creatine phosphokinase-MB, myoglobin, troponin, BNP and NT-proBNP, PT-INR, PTT, D-dimer, beta-HCG, biochemical urinalysis, etc.). As a final result, the proposed panel reduced the overall number of inappropriate investigations and extended, with newer and updated tests, the panel available for critical patients. DISCUSSION A collegiate review of data reporting, in-hospital deepening of problems and the inter-professional discussion of the evidence

  2. A Modified Jonckheere Test Statistic for Ordered Alternatives in Repeated Measures Design

    Directory of Open Access Journals (Sweden)

    Hatice Tül Kübra AKDUR

    2016-09-01

In this article, a new test based on the Jonckheere test [1] for randomized blocks with dependent observations within blocks is presented. A weighted sum of the per-block statistics, rather than the unweighted sum proposed by Jonckheere, is used. For Jonckheere-type statistics, the main assumption is independence of observations within a block; in repeated measures designs, this assumption is violated. The weighted Jonckheere-type statistic is studied under dependence for different variance-covariance structures and under the ordered alternative hypothesis structure of each block in the design. The proposed statistic is also compared with the existing Jonckheere-based test in terms of type I error rates by Monte Carlo simulation. For strong correlations, the circular bootstrap version of the proposed Jonckheere test provides lower type I error rates.
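A minimal sketch of the block-wise statistic this abstract builds on, assuming one observation per ordered treatment level within each block (e.g. one subject measured under increasing doses). The equal weights used here are placeholders for the article's weighting scheme, and no null distribution or bootstrap calibration is implemented.

```python
# Block-wise Jonckheere-type statistic for an ordered alternative:
# within each block, count pairs concordant with the hypothesized
# increasing order, then combine blocks with (illustrative) weights.
from itertools import combinations

def block_statistic(obs):
    """Count pairs (i < j) with obs[i] < obs[j], i.e. pairs concordant
    with the hypothesized increasing treatment effect."""
    return sum(1 for i, j in combinations(range(len(obs)), 2)
               if obs[i] < obs[j])

def weighted_jonckheere(blocks, weights=None):
    """Weighted sum of per-block concordance counts."""
    if weights is None:
        weights = [1.0] * len(blocks)   # placeholder: equal weights
    return sum(w * block_statistic(b) for w, b in zip(weights, blocks))

blocks = [
    [1.2, 1.9, 2.5],   # fully monotone block: 3 concordant pairs
    [0.8, 1.1, 0.9],   # partly concordant block: 2 concordant pairs
]
print(weighted_jonckheere(blocks))  # 5.0
```

Large values support the ordered alternative; in practice the statistic would be referred to a permutation or bootstrap null distribution that respects the within-block dependence.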

  3. From guidelines to hospital practice: reducing inappropriate ordering of thyroid hormone and antibody tests.

    Science.gov (United States)

    Toubert, M E; Chevret, S; Cassinat, B; Schlageter, M H; Beressi, J P; Rain, J D

    2000-06-01

Because of major technical improvements and conscious care about cost effectiveness, limiting the inadequate use of thyroid biological tests appears to be a major issue. To (i) estimate the ordering prevalence of each thyroid test, (ii) assess the prevalence of relevant thyroid tests, and (iii) evaluate the impact of expressing justification for tests during a 2-month intervention period on these prevalences. During a prospective 2-month survey (June-July 1997), all the request forms were divided into four groups of prescription: (1) investigation of thyroid function, (2) taking drugs affecting the thyroid, (3) monitoring of nodule and cancer, and (4) investigation of thyroid autoimmunity. Their appropriateness was thus determined according to consensus in our hospital and previously published recommendations. Results were compared with those of retrospective similar 2-month periods in 1996 and 1998. Combinations of thyroid function tests and thyroid antibodies were analyzed during the 1996, 1997 and 1998 periods. The overall estimated rate of appropriate ordering between 1996 and 1997 increased from 42.5% to 72.4% (P < 10^-4), with a significant improvement in each group of main diagnosis referral, except in group 3 where suitability was always over 85%. However, in group 4, appropriateness remained low (36%). Combinations of thyroid tests revealed an increase in single TSH order forms and single autoantibodies to thyroperoxidase (TPOAb) ones, while TSH + free thyroxine + free tri-iodothyronine and TPOAb + autoantibodies to thyroglobulin ones decreased significantly. Interestingly, all these changes were maintained 1 year later (June-July 1998) even though physicians were not aware of this new study. Persistent change in medical practice was thus assessed.

  4. Medical Devices; Clinical Chemistry and Clinical Toxicology Devices; Classification of the Organophosphate Test System. Final order.

    Science.gov (United States)

    2017-10-18

    The Food and Drug Administration (FDA or we) is classifying the organophosphate test system into class II (special controls). The special controls that apply to the device type are identified in this order and will be part of the codified language for the organophosphate test system's classification. We are taking this action because we have determined that classifying the device into class II (special controls) will provide a reasonable assurance of safety and effectiveness of the device. We believe this action will also enhance patients' access to beneficial innovative devices, in part by reducing regulatory burdens.

  5. Medical Devices; Hematology and Pathology Devices; Classification of a Cervical Intraepithelial Neoplasia Test System. Final order.

    Science.gov (United States)

    2018-01-03

    The Food and Drug Administration (FDA or we) is classifying the cervical intraepithelial neoplasia (CIN) test system into class II (special controls). The special controls that apply to the device type are identified in this order and will be part of the codified language for the CIN test system's classification. We are taking this action because we have determined that classifying the device into class II (special controls) will provide a reasonable assurance of safety and effectiveness of the device. We believe this action will also enhance patients' access to beneficial innovative devices, in part by reducing regulatory burdens.

  6. Avoidance of anticipated regret: the ordering of prostate-specific antigen tests.

    Science.gov (United States)

    Sorum, Paul C; Mullet, Etienne; Shim, Junseop; Bonnin-Scaon, Sylvie; Chasseigne, Gérard; Cogneau, Joël

    2004-01-01

When making decisions, people are known to try to minimize the regret that would be provoked by unwanted consequences of these decisions. The authors explored the strength and determinants of such anticipated regret in a study of physicians' decisions to order prostate-specific antigen (PSA) tests. 32 US and 33 French primary care physicians indicated the likelihood they would order a PSA for 32 hypothetical men presenting for routine physical exams. They then indicated how much regret they would feel if they found advanced prostate cancer in 12 other patients for whom they had chosen not to order PSAs several years before. The latter patients differed according to age (55, 65, or 75 years), a prior request or not for PSA testing, and no or some irregularity of the prostate on the earlier rectal exam. ANOVA found that regret was higher when the patient had requested a PSA, the prostate was irregular, and the patient was younger. Shape had less effect when the patient had requested a PSA. US physicians had more regret than the French, patient request had a greater impact on the Americans, and increasing patient age reduced regret more among the French. In a 1-way correlation, the regret score was associated with the likelihood of ordering PSAs for both the French (r = 0.64) and the US physicians. The regret score was the most important predictor of the likelihood of ordering a PSA (beta = 0.37). Regret over failing to diagnose aggressive prostate cancer is associated with a policy of ordering PSAs. This regret appears to be culturally sensitive.

  7. Mean Abnormal Result Rate: Proof of Concept of a New Metric for Benchmarking Selectivity in Laboratory Test Ordering.

    Science.gov (United States)

    Naugler, Christopher T; Guo, Maggie

    2016-04-01

There is a need to develop and validate new metrics to assess the appropriateness of laboratory test requests. The mean abnormal result rate (MARR) is a proposed measure of ordering selectivity, the premise being that higher mean abnormal rates represent more selective test ordering. As a validation of this metric, we compared the abnormal rate of lab tests with the number of tests ordered on the same requisition. We hypothesized that requisitions with larger numbers of requested tests represent less selective test ordering and therefore would have a lower overall abnormal rate. We examined 3,864,083 tests ordered on 451,895 requisitions and found that the MARR decreased from about 25% if one test was ordered to about 7% if nine or more tests were ordered, consistent with less selectivity when more tests were ordered. We then examined the MARR for community-based testing for 1,340 family physicians and found both a wide variation in MARR as well as an inverse relationship between the total tests ordered per year per physician and the physician-specific MARR. The proposed metric represents a new utilization metric for benchmarking relative selectivity of test orders among physicians.
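The metric itself is straightforward to compute. The sketch below derives a per-physician MARR from synthetic counts; the physician labels and all numbers are illustrative, with only the definition (abnormal results divided by total tests) taken from the abstract.

```python
# Proof-of-concept mean abnormal result rate (MARR): per physician,
# the fraction of ordered tests that return an abnormal result.
# Higher MARR suggests more selective ordering.

def marr(abnormal, total):
    return abnormal / total

physicians = {
    "dr_a": {"abnormal": 250, "total": 1000},   # selective orderer
    "dr_b": {"abnormal": 280, "total": 4000},   # broad panels, lower MARR
}

for name, c in physicians.items():
    print(name, round(marr(c["abnormal"], c["total"]), 3))
```

In a benchmarking application, physician-specific MARR values would be plotted against annual test volume to surface the inverse relationship the study reports.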

  8. The Method of a Standalone Functional Verifying Operability of Sonar Control Systems

    Directory of Open Access Journals (Sweden)

    A. A. Sotnikov

    2014-01-01

This article describes a method for standalone verification of sonar control systems based on functional checking of control-system operability. The main features of the realized method are the development of a valid mathematical model for simulating sonar signals at the hydroacoustic antenna, a valid representation of the sonar control system modes as a discrete Markov model, and functional object verification in real-time mode. Two ways of controlling computational complexity when the simulation equipment has insufficient computing resources are proposed: reducing model functionality and reducing model adequacy. Experiments were performed using testing equipment developed by a department of the Research Institute of Information Control Systems at Bauman Moscow State Technical University to verify the technical validity of industrial sonar complexes. On-board software was artificially modified to create malfunctions in the sonar control systems during the verification process in order to estimate the performance of the verifying system. The efficiency of the method was demonstrated theoretically and experimentally in comparison with the basic methodology for verifying technical systems. The method could also be used in debugging the on-board software of sonar complexes and in developing new promising sonar signal processing algorithms.
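One ingredient this abstract mentions, representing the control system's modes as a discrete Markov model, can be sketched as a transition table plus a trace check: an observed sequence of modes is valid only if every step follows an allowed transition. The mode names and the transition table below are invented for illustration; the actual sonar mode set is not given in the abstract.

```python
# Discrete-model check of control-system mode transitions: a recorded
# mode trace is functionally plausible only if each consecutive pair
# appears in the allowed-transition table (table is illustrative).
allowed = {
    "IDLE":   {"SEARCH"},
    "SEARCH": {"TRACK", "IDLE"},
    "TRACK":  {"SEARCH", "IDLE"},
}

def trace_is_valid(trace):
    """True iff every consecutive mode pair is an allowed transition."""
    return all(b in allowed.get(a, set()) for a, b in zip(trace, trace[1:]))

print(trace_is_valid(["IDLE", "SEARCH", "TRACK", "SEARCH", "IDLE"]))  # True
print(trace_is_valid(["IDLE", "TRACK"]))                              # False
```

A full verifier would attach transition probabilities and timing to each edge, but even this boolean reachability check catches gross mode-sequencing faults injected into on-board software.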

  9. Validity testing of third-order nonlinear models for synchronous generators

    Energy Technology Data Exchange (ETDEWEB)

    Arjona, M.A. [Division de Estudios de Posgrado e Investigacion, Instituto Tecnologico de La Laguna Torreon, Coah. (Mexico); Escarela-Perez, R. [Universidad Autonoma Metropolitana - Azcapotzalco, Departamento de Energia, Av. San Pablo 180, Col. Reynosa, C.P. 02200 (Mexico); Espinosa-Perez, G. [Division de Estudios Posgrado de la Facultad de Ingenieria Universidad Nacional Autonoma de Mexico (Mexico); Alvarez-Ramirez, J. [Universidad Autonoma Metropolitana -Iztapalapa, Division de Ciencias Basicas e Ingenieria (Mexico)

    2009-06-15

    Third-order nonlinear models are commonly used in control theory for the analysis of the stability of both open-loop and closed-loop synchronous machines. However, the ability of these models to describe the electrical machine dynamics has not been tested experimentally. This work focuses on this issue by addressing the parameter identification problem for third-order models of synchronous generators. For a third-order model describing the dynamics of the power angle δ, rotor speed ω and quadrature-axis transient EMF E'_q, it is shown that the parameters cannot be identified because of the effects of the unknown initial condition of E'_q. To avoid this situation, a model that incorporates the measured electrical power dynamics is considered, showing that state measurements guarantee the identification of the model parameters. Data obtained from a 7 kVA lab-scale synchronous generator and from a 150 MVA finite-element simulation were used to show that, at least for the worked examples, the estimated parameters display only moderate variations over the operating region. This suggests that third-order models can suffice to describe the main dynamical features of synchronous generators, and that they can be used to design and tune power system stabilizers and voltage regulators. (author)

  10. USCIS E-Verify Self-Check

    Data.gov (United States)

    Department of Homeland Security — E-Verify is an internet based system that contains datasets to compare information from an employee's Form I-9, Employment Eligibility Verification, to data from the...

  11. Analyzing Interaction Patterns to Verify a Simulation/Game Model

    Science.gov (United States)

    Myers, Rodney Dean

    2012-01-01

    In order for simulations and games to be effective for learning, instructional designers must verify that the underlying computational models being used have an appropriate degree of fidelity to the conceptual models of their real-world counterparts. A simulation/game that provides incorrect feedback is likely to promote misunderstanding and…

  12. A low-order coupled chemistry meteorology model for testing online and offline data assimilation schemes

    Science.gov (United States)

    Haussaire, J.-M.; Bocquet, M.

    2015-08-01

    Bocquet and Sakov (2013) have introduced a low-order model based on the coupling of the chaotic Lorenz-95 model which simulates winds along a mid-latitude circle, with the transport of a tracer species advected by this zonal wind field. This model, named L95-T, can serve as a playground for testing data assimilation schemes with an online model. Here, the tracer part of the model is extended to a reduced photochemistry module. This coupled chemistry meteorology model (CCMM), the L95-GRS model, mimics continental and transcontinental transport and the photochemistry of ozone, volatile organic compounds and nitrogen oxides. Its numerical implementation is described. The model is shown to reproduce the major physical and chemical processes being considered. L95-T and L95-GRS are specifically designed and useful for testing advanced data assimilation schemes, such as the iterative ensemble Kalman smoother (IEnKS) which combines the best of ensemble and variational methods. These models provide useful insights prior to the implementation of data assimilation methods on larger models. We illustrate their use with data assimilation schemes on preliminary, yet instructive numerical experiments. In particular, online and offline data assimilation strategies can be conveniently tested and discussed with this low-order CCMM. The impact of observed chemical species concentrations on the wind field can be quantitatively estimated. The impacts of the wind chaotic dynamics and of the chemical species non-chaotic but highly nonlinear dynamics on the data assimilation strategies are illustrated.
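The wind component of the L95-T model above is the standard Lorenz-95 (Lorenz-96) system, dx_i/dt = (x_{i+1} − x_{i−2}) x_{i−1} − x_i + F with cyclic indices. A minimal integration sketch follows; the forcing F = 8 and the RK4 step are the conventional choices, not values taken from this record.

```python
import numpy as np

def lorenz95_tendency(x, forcing=8.0):
    """Right-hand side of the Lorenz-95 wind model,
    dx_i/dt = (x_{i+1} - x_{i-2}) * x_{i-1} - x_i + F,
    with cyclic indices (sites along a mid-latitude circle)."""
    return (np.roll(x, -1) - np.roll(x, 2)) * np.roll(x, 1) - x + forcing

def rk4_step(x, dt=0.05, forcing=8.0):
    """One fourth-order Runge-Kutta step of length dt."""
    k1 = lorenz95_tendency(x, forcing)
    k2 = lorenz95_tendency(x + 0.5 * dt * k1, forcing)
    k3 = lorenz95_tendency(x + 0.5 * dt * k2, forcing)
    k4 = lorenz95_tendency(x + dt * k3, forcing)
    return x + dt / 6.0 * (k1 + 2 * k2 + 2 * k3 + k4)

# Spin up from a slightly perturbed resting state; the perturbation
# grows chaotically while the trajectory stays bounded.
x = np.full(40, 8.0)
x[19] += 0.01
for _ in range(500):
    x = rk4_step(x)
```

Coupling a tracer (or, in L95-GRS, a reduced photochemistry module) to this wind field is what turns the toy model into a testbed for online versus offline assimilation.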

  13. Privacy-Preserving Verifiability: A Case for an Electronic Exam Protocol

    DEFF Research Database (Denmark)

    Giustolisi, Rosario; Iovino, Vincenzo; Lenzini, Gabriele

    2017-01-01

    We introduce the notion of privacy-preserving verifiability for security protocols. It holds when a protocol admits a verifiability test that does not reveal, to the verifier that runs it, more pieces of information about the protocol’s execution than those required to run the test. Our definition...... of privacy-preserving verifiability is general and applies to cryptographic protocols as well as to human security protocols. In this paper we exemplify it in the domain of e-exams. We prove that the notion is meaningful by studying an existing exam protocol that is verifiable but whose verifiability tests...... are not privacy-preserving. We prove that the notion is applicable: we review the protocol using functional encryption so that it admits a verifiability test that preserves privacy according to our definition. We analyse, in ProVerif, that the verifiability holds despite malicious parties and that the new...

  14. The scope for first-order tests of light speed anisotropy

    International Nuclear Information System (INIS)

    Aspden, H.

    1983-01-01

    New optical experiments aimed at testing light speed anisotropy to first order in v/c are proposed on the basis of an intuitive enquiry into the physical processes by which the vacuum exhibits zero dispersion when regulating the propagation of light waves. Such experiments can be justified because standing waves are present in experiments of the Michelson-Morley type and these may have a disturbing influence on light propagation speed in the standing-wave region. Though a null result from an initial experiment is reported, the outcome of a second experiment yet to be performed is needed to reach a conclusion from this investigation.

  15. Order of Administration of Math and Verbal Tests: An Ecological Intervention to Reduce Stereotype Threat on Girls' Math Performance

    Science.gov (United States)

    Smeding, Annique; Dumas, Florence; Loose, Florence; Régner, Isabelle

    2013-01-01

    In 2 field experiments, we relied on the very features of real testing situations--where both math and verbal tests are administered--to examine whether order of test administration can, by itself, create vs. alleviate stereotype threat (ST) effects on girls' math performance. We predicted that taking the math test before the verbal test would be…

  16. An IBM 370 assembly language program verifier

    Science.gov (United States)

    Maurer, W. D.

    1977-01-01

    The paper describes a program written in SNOBOL which verifies the correctness of programs written in assembly language for the IBM 360 and 370 series of computers. The motivation for using assembly language as a source language for a program verifier was the realization that many errors in programs are caused by misunderstanding or ignorance of the characteristics of specific computers. The proof of correctness of a program written in assembly language must take these characteristics into account. The program has been compiled and is currently running at the Center for Academic and Administrative Computing of The George Washington University.

  17. BloodLink: Computer-based Decision Support for Blood Test Ordering; Assessment of the effect on physicians' test-ordering behavior

    NARCIS (Netherlands)

    M.A.M. van Wijk (Marc)

    2000-01-01

    Requesting blood tests is an important aspect of the health care delivered by the general practitioner in The Netherlands. About three to four percent of patient encounters with Dutch general practitioners result in the physician requesting blood tests, which is lower than in many

  18. Classroom Experiment to Verify the Lorentz Force

    Indian Academy of Sciences (India)

    Classroom Experiment to Verify the Lorentz Force. Somnath Basu, Anindita Bose, Sumit Kumar Sinha, Pankaj Vishe, S. Chatterjee. Resonance – Journal of Science Education, Volume 8, Issue 3, March 2003, pp. 81-86.

  19. On alternative approach for verifiable secret sharing

    OpenAIRE

    Kulesza, Kamil; Kotulski, Zbigniew; Pieprzyk, Joseph

    2002-01-01

    Secret sharing allows split/distributed control over a secret (e.g. a master key). Verifiable secret sharing (VSS) is secret sharing extended with a verification capability. Usually verification comes at a price. We propose a "free lunch" approach that overcomes this inconvenience.
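For context on what the verification capability adds, the classic Feldman construction is the textbook example of VSS (it is not the "free lunch" scheme of this record): Shamir shares are checked against public commitments g^{a_j} without revealing the secret. The tiny group parameters below are for illustration only and offer no security.

```python
import random

# Toy Feldman-style verifiable secret sharing (illustrative parameters).
q = 11   # prime order of the subgroup; polynomial coefficients live in Z_q
p = 23   # p = 2q + 1, so Z_p* contains a subgroup of order q
g = 2    # generator of that order-q subgroup (2^11 = 1 mod 23)

def share(secret, threshold, n):
    """Shamir-share `secret` and publish Feldman commitments g^{a_j}."""
    coeffs = [secret] + [random.randrange(q) for _ in range(threshold - 1)]
    commitments = [pow(g, a, p) for a in coeffs]
    shares = [(i, sum(a * pow(i, j, q) for j, a in enumerate(coeffs)) % q)
              for i in range(1, n + 1)]
    return shares, commitments

def verify(i, s, commitments):
    """Check g^s == prod_j C_j^(i^j) mod p; holds iff share (i, s) is
    consistent with the committed polynomial."""
    rhs = 1
    for j, c in enumerate(commitments):
        rhs = rhs * pow(c, pow(i, j, q), p) % p
    return pow(g, s, p) == rhs

shares, comm = share(secret=7, threshold=3, n=5)
ok = all(verify(i, s, comm) for i, s in shares)
```

Any participant can run `verify` on their own share, which is exactly the capability plain Shamir sharing lacks; the cost is publishing the commitments, the "price" the abstract alludes to.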

  20. Verified compilation of Concurrent Managed Languages

    Science.gov (United States)

    2017-11-01

    VERIFIED COMPILATION OF CONCURRENT MANAGED LANGUAGES. Purdue University, November 2017. Final technical report, approved for public release. This report is published by the Information Directorate in the interest of scientific and technical information exchange.

  1. A Verifiable Secret Shuffle of Homomorphic Encryptions

    DEFF Research Database (Denmark)

    Groth, Jens

    2003-01-01

    We show how to prove in honest verifier zero-knowledge the correctness of a shuffle of homomorphic encryptions (or homomorphic commitments.) A shuffle consists in a rearrangement of the input ciphertexts and a reencryption of them so that the permutation is not revealed....

  2. The Fragility of Individual-Based Explanations of Social Hierarchies: A Test Using Animal Pecking Orders

    Science.gov (United States)

    2016-01-01

    The standard approach in accounting for hierarchical differentiation in biology and the social sciences considers a hierarchy as a static distribution of individuals possessing differing amounts of some valued commodity, assumes that the hierarchy is generated by micro-level processes involving individuals, and attempts to reverse engineer the processes that produced the hierarchy. However, sufficient experimental and analytical results are available to evaluate this standard approach in the case of animal dominance hierarchies (pecking orders). Our evaluation using evidence from hierarchy formation in small groups of both hens and cichlid fish reveals significant deficiencies in the three tenets of the standard approach in accounting for the organization of dominance hierarchies. In consequence, we suggest that a new approach is needed to explain the organization of pecking orders and, very possibly, by implication, for other kinds of social hierarchies. We develop an example of such an approach that considers dominance hierarchies to be dynamic networks, uses dynamic sequences of interaction (dynamic network motifs) to explain the organization of dominance hierarchies, and derives these dynamic sequences directly from observation of hierarchy formation. We test this dynamical explanation using computer simulation and find a good fit with actual dynamics of hierarchy formation in small groups of hens. We hypothesize that the same dynamic sequences are used in small groups of many other animal species forming pecking orders, and we discuss the data required to evaluate our hypothesis. Finally, we briefly consider how our dynamic approach may be generalized to other kinds of social hierarchies using the example of the distribution of empty gastropod (snail) shells occupied in populations of hermit crabs. PMID:27410230

  3. Verifying the gravitational shift due to the earth's rotation

    International Nuclear Information System (INIS)

    Briatore, L.; Leschiutta, S.

    1976-01-01

    Data on various independent time scales kept in different laboratories are elaborated in order to verify the gravitational shift due to the earth's rotation. It is shown that the state of the art in the measurement of time now makes it possible to measure Δt/t ≈ 10⁻¹³. Moreover, experimental evidence of the relativistic effects of the earth's rotation is shown.
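A back-of-the-envelope estimate (not taken from the paper) shows why 10⁻¹³ is the right scale: two laboratories at different latitudes move at different rotational speeds v = ωR cos(latitude), giving a fractional clock-rate difference of roughly (v₁² − v₂²)/(2c²). The latitudes below are illustrative.

```python
import math

C = 299_792_458.0   # speed of light, m/s
OMEGA = 7.292e-5    # earth's angular velocity, rad/s
R_EARTH = 6.371e6   # mean earth radius, m

def rotational_rate_shift(lat1_deg, lat2_deg):
    """Fractional clock-rate difference between two laboratories due
    only to their different rotational speeds (special-relativistic
    time dilation): (v1^2 - v2^2) / (2 c^2)."""
    v1 = OMEGA * R_EARTH * math.cos(math.radians(lat1_deg))
    v2 = OMEGA * R_EARTH * math.cos(math.radians(lat2_deg))
    return (v1 ** 2 - v2 ** 2) / (2 * C ** 2)

# e.g. a lab at 45 degrees N compared with one near 10 degrees N
shift = rotational_rate_shift(45.0, 10.0)
```

The result is a few parts in 10¹³ (negative, because the higher-latitude clock moves more slowly and so runs fast relative to the other), matching the Δt/t ≈ 10⁻¹³ measurement capability the abstract cites.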

  4. Optimised resource construction for verifiable quantum computation

    International Nuclear Information System (INIS)

    Kashefi, Elham; Wallden, Petros

    2017-01-01

    Recent developments have brought the possibility of achieving scalable quantum networks and quantum devices closer. From the computational point of view these emerging technologies become relevant when they are no longer classically simulatable. Hence a pressing challenge is the construction of practical methods to verify the correctness of the outcome produced by universal or non-universal quantum devices. A promising approach that has been extensively explored is the scheme of verification via encryption through blind quantum computation. We present here a new construction that simplifies the required resources for any such verifiable protocol. We obtain an overhead that is linear in the size of the input (computation), while the security parameter remains independent of the size of the computation and can be made exponentially small (with a small extra cost). Furthermore our construction is generic and could be applied to any universal or non-universal scheme with a given underlying graph. (paper)

  5. A Practical Voter-Verifiable Election Scheme.

    OpenAIRE

    Chaum, D; Ryan, PYA; Schneider, SA

    2005-01-01

    We present an election scheme designed to allow voters to verify that their vote is accurately included in the count. The scheme provides a high degree of transparency whilst ensuring the secrecy of votes. Assurance is derived from close auditing of all the steps of the vote recording and counting process with minimal dependence on the system components. Thus, assurance arises from verification of the election rather than having to place trust in the correct behaviour of components of the vot...

  6. Analytic treatment of leading-order parton evolution equations: Theory and tests

    International Nuclear Information System (INIS)

    Block, Martin M.; Durand, Loyal; McKay, Douglas W.

    2009-01-01

    We recently derived an explicit expression for the gluon distribution function G(x,Q²) = xg(x,Q²) in terms of the proton structure function F₂^γp(x,Q²) in leading-order (LO) QCD by solving the LO Dokshitzer-Gribov-Lipatov-Altarelli-Parisi (DGLAP) equation for the Q² evolution of F₂^γp(x,Q²) analytically, using a differential-equation method. We showed that accurate experimental knowledge of F₂^γp(x,Q²) in a region of Bjorken x and virtuality Q² is all that is needed to determine the gluon distribution in that region. We rederive and extend the results here using a Laplace-transform technique, and show that the singlet quark structure function F_S(x,Q²) can be determined directly in terms of G from the DGLAP gluon evolution equation. To illustrate the method and check the consistency of existing LO quark and gluon distributions, we used the published values of the LO quark distributions from the CTEQ5L and MRST2001 LO analyses to form F₂^γp(x,Q²), and then solved analytically for G(x,Q²). We find that the analytic and fitted gluon distributions from MRST2001LO agree well with each other for all x and Q², while those from CTEQ5L differ significantly from each other for large x values, x ≳ 0.03-0.05, at all Q². We conclude that the published CTEQ5L distributions are incompatible in this region. Using a nonsinglet evolution equation, we obtain a sensitive test of quark distributions which holds in both LO and next-to-leading order perturbative QCD. We find in either case that the CTEQ5 quark distributions satisfy the tests numerically for small x, but fail the tests for x ≳ 0.03-0.05; their use could potentially lead to significant shifts in predictions of quantities sensitive to large x. We encountered no problems with the MRST2001LO distributions or later CTEQ distributions. We suggest caution in the use of the CTEQ5 distributions.
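Schematically, the LO Q² evolution linking the structure function and the gluon distribution has the convolution form below (written from the standard DGLAP structure; the precise coefficient and splitting-function bookkeeping used in the paper may differ in detail):

```latex
\frac{\partial F_2^{\gamma p}(x,Q^2)}{\partial \ln Q^2}
  = \frac{\alpha_s(Q^2)}{2\pi}
    \left[\, P_{qq}\otimes F_2^{\gamma p}
         \;+\; 2\sum_q e_q^2\, P_{qg}\otimes G \,\right](x,Q^2)
```

Because the gluon enters only through the second convolution, accurate knowledge of F₂^γp and its Q² derivative in a region suffices, in principle, to invert for G(x,Q²) there, which is the strategy the abstract describes.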

  7. Niche evolution and adaptive radiation: Testing the order of trait divergence

    Science.gov (United States)

    Ackerly, D.D.; Schwilk, D.W.; Webb, C.O.

    2006-01-01

    In the course of an adaptive radiation, the evolution of niche parameters is of particular interest for understanding modes of speciation and the consequences for coexistence of related species within communities. We pose a general question: In the course of an evolutionary radiation, do traits related to within-community niche differences (α niche) evolve before or after differentiation of macrohabitat affinity or climatic tolerances (β niche)? Here we introduce a new test to address this question, based on a modification of the method of independent contrasts. The divergence order test (DOT) is based on the average age of the nodes on a tree, weighted by the absolute magnitude of the contrast at each node for a particular trait. The comparison of these weighted averages reveals whether large divergences for one trait have occurred earlier or later in the course of diversification, relative to a second trait; significance is determined by bootstrapping from maximum-likelihood ancestral state reconstructions. The method is applied to the evolution of Ceanothus, a woody plant group in California, in which co-occurring species exhibit significant differences in a key leaf trait (specific leaf area) associated with contrasting physiological and life history strategies. Co-occurring species differ more for this trait than expected under a null model of community assembly. This α niche difference evolved early in the divergence of two major subclades within Ceanothus, whereas climatic distributions (β niche traits) diversified later within each of the subclades. However, rapid evolution of climate parameters makes inferences of early divergence events highly uncertain, and differentiation of the β niche might have taken place throughout the evolution of the group, without leaving a clear phylogenetic signal. Similar patterns observed in several plant and animal groups suggest that early divergence of α niche traits might be a common feature of niche evolution in
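The core of the DOT statistic described above, the average node age weighted by absolute contrast magnitude, can be sketched directly. This is a simplified illustration on precomputed ages and contrasts; the published test adds maximum-likelihood ancestral-state reconstruction and bootstrapping for significance, which are omitted here.

```python
def dot_weighted_age(node_ages, contrasts):
    """Average age of a tree's internal nodes, weighted by the absolute
    independent-contrast magnitude of a trait at each node. A larger
    value means the trait's big divergences happened deeper in the tree."""
    weights = [abs(c) for c in contrasts]
    return sum(a * w for a, w in zip(node_ages, weights)) / sum(weights)

# Toy tree with three internal nodes (ages in Myr): trait A diverged
# mostly at the oldest node, trait B mostly near the tips.
ages = [10.0, 5.0, 1.0]
trait_a = [4.0, 0.5, 0.5]   # large contrast deep in the tree
trait_b = [0.5, 0.5, 4.0]   # large contrast near the tips
age_a = dot_weighted_age(ages, trait_a)
age_b = dot_weighted_age(ages, trait_b)
```

Here `age_a > age_b`, so trait A would be inferred to have diverged earlier, which is the comparison the test makes between α-niche and β-niche traits.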

  8. Verified Subtyping with Traits and Mixins

    Directory of Open Access Journals (Sweden)

    Asankhaya Sharma

    2014-07-01

    Full Text Available Traits allow decomposing programs into smaller parts, and mixins are a form of composition that resembles multiple inheritance. Unfortunately, in the presence of traits, programming languages like Scala give up on the subtyping relation between objects. In this paper, we present a method to check subtyping between objects based on entailment in separation logic. We implement our method as a domain-specific language in Scala and apply it to the Scala standard library. We have verified that 67% of mixins used in the Scala standard library do indeed conform to subtyping between the traits that are used to build them.

  9. Unary self-verifying symmetric difference automata

    CSIR Research Space (South Africa)

    Marais, Laurette

    2016-07-01

    Full Text Available Presented at the 18th International Workshop on Descriptional Complexity of Formal Systems, 5-8 July 2016, Bucharest, Romania: Unary self-verifying symmetric difference automata. Laurette Marais (1,2) and Lynette van Zijl (1). (1) Department of Computer Science, Stellenbosch...

  10. Task Order 22 – Engineering and Technical Support, Deep Borehole Field Test. AREVA Summary Review Report

    Energy Technology Data Exchange (ETDEWEB)

    Denton, Mark A. [AREVA Federal Services, Charlotte, NC (United States)

    2016-01-19

    Under Task Order 22 of the industry Advisory and Assistance Services (A&AS) Contract to the Department of Energy (DOE) DE-NE0000291, AREVA has been tasked with providing assistance with engineering, analysis, cost estimating, and design support of a system for disposal of radioactive wastes in deep boreholes (without the use of radioactive waste). As part of this task order, AREVA was requested, through a letter of technical direction, to evaluate Sandia National Laboratory’s (SNL’s) waste package borehole emplacement system concept recommendation using input from DOE and SNL. This summary review report (SRR) documents this evaluation, with its focus on the primary input document titled: “Deep Borehole Field Test Specifications/M2FT-15SN0817091” Rev. 1 [1], hereafter referred to as the “M2 report.” The M2 report focuses on the conceptual design development for the Deep Borehole Field Test (DBFT), mainly the test waste packages (WPs) and the system for demonstrating emplacement and retrieval of those packages in the Field Test Borehole (FTB). This SRR follows the same outline as the M2 report, which allows for easy correlation between AREVA’s review comments, discussion, potential proposed alternatives, and path forward with information established in the M2 report. AREVA’s assessment focused on three primary elements of the M2 report: the conceptual design of the WPs proposed for deep borehole disposal (DBD), the mode of emplacement of the WP into DBD, and the conceptual design of the DBFT. AREVA concurs with the M2 report’s selection of the wireline emplacement mode specifically over the drill-string emplacement mode and generically over alternative emplacement modes. Table 5-1 of this SRR compares the pros and cons of each emplacement mode considered viable for DBD. The primary positive characteristics of the wireline emplacement mode include: (1) considered a mature technology; (2) operations are relatively simple; (3) probability of a

  11. Testing SUSY at the LHC: Electroweak and Dark matter fine tuning at two-loop order

    CERN Document Server

    Cassel, S; Ross, G G

    2010-01-01

    In the framework of the Constrained Minimal Supersymmetric Standard Model (CMSSM) we evaluate the electroweak fine tuning measure that provides a quantitative test of supersymmetry as a solution to the hierarchy problem. Taking account of current experimental constraints we compute the fine tuning at two-loop order and determine the limits on the CMSSM parameter space and the measurements at the LHC most relevant in covering it. Without imposing the LEPII bound on the Higgs mass, it is shown that the fine tuning computed at two-loop has a minimum $\\Delta=8.8$ corresponding to a Higgs mass $m_h=114\\pm 2$ GeV. Adding the constraint that the SUSY dark matter relic density should be within present bounds we find $\\Delta=15$ corresponding to $m_h=114.7\\pm 2$ GeV and this rises to $\\Delta=17.8$ ($m_h=115.9\\pm 2$ GeV) for SUSY dark matter abundance within 3$\\sigma$ of the WMAP constraint. We extend the analysis to include the contribution of dark matter fine tuning. In this case the overall fine tuning and Higgs mas...
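The electroweak fine-tuning measure Δ quoted above is conventionally the Barbieri-Giudice sensitivity of the Z mass to the fundamental parameters; the form below is the standard definition with the usual CMSSM parameter set (the paper's exact conventions may differ in detail):

```latex
\Delta \;=\; \max_{p}\,\left|\frac{\partial \ln m_Z^2}{\partial \ln p}\right|,
\qquad p \in \{\, m_0,\; m_{1/2},\; \mu_0,\; A_0,\; B_0 \,\}
```

A value such as Δ = 8.8 thus means that at the best point a 1% change in the most sensitive input parameter shifts m_Z² by roughly 9%.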

  12. Next-to-leading-order tests of NRQCD factorization with J/{psi} yield and polarization

    Energy Technology Data Exchange (ETDEWEB)

    Butenschoen, Mathias [Wien Univ. (Austria). Fakultaet fuer Physik; Kniehl, Bernd A. [Hamburg Univ. (Germany). 2. Inst. fuer Theoretische Physik

    2012-12-15

    We report on recent progress in testing the factorization formalism of nonrelativistic quantum chromodynamics (NRQCD) at next-to-leading order (NLO) for J/ψ yield and polarization. We demonstrate that it is possible to unambiguously determine the leading color-octet long-distance matrix elements (LDMEs) in compliance with the velocity scaling rules through a global fit to experimental data of unpolarized J/ψ production in pp, pp̄, ep, γγ, and e⁺e⁻ collisions. Three data sets not included in the fit, from hadroproduction and from photoproduction in the fixed-target and colliding-beam modes, are nicely reproduced. The polarization observables measured in different frames at DESY HERA and CERN LHC reasonably agree with NLO NRQCD predictions obtained using the LDMEs extracted from the global fit, while measurements from the FNAL Tevatron exhibit severe disagreement. We demonstrate that alternative LDME sets recently obtained in two other NLO NRQCD analyses of J/ψ yield and polarization, with different philosophies, also fail to reconcile the Tevatron polarization data with the other available world data.

  13. Testing the Item-Order Account of Design Effects Using the Production Effect

    Science.gov (United States)

    Jonker, Tanya R.; Levene, Merrick; MacLeod, Colin M.

    2014-01-01

    A number of memory phenomena evident in recall in within-subject, mixed-lists designs are reduced or eliminated in between-subject, pure-list designs. The item-order account (McDaniel & Bugg, 2008) proposes that differential retention of order information might underlie this pattern. According to this account, order information may be encoded…

  14. Going beyond audit and feedback: towards behaviour-based interventions to change physician laboratory test ordering behaviour.

    Science.gov (United States)

    Meidani, Z; Mousavi, G A; Kheirkhah, D; Benar, N; Maleki, M R; Sharifi, M; Farrokhian, A

    2017-12-01

    Studies indicate there are a variety of contributing factors affecting physician test-ordering behaviour; identifying these factors allows the development of behaviour-based interventions. Methods: Through a pilot study, the list of contributing factors in laboratory test ordering, and the most-ordered tests, were identified and given in questionnaire form to 50 medical students, interns, residents and paediatricians. The results showed routine tests and peer or supervisor pressure to be the most influential factors affecting physician ordering behaviour. An audit and feedback mechanism was selected as an appropriate intervention to improve physician ordering behaviour. The intervention was carried out at two intervals over a three-month period. Findings: There was a large reduction in the number of laboratory tests ordered, from 908 before the intervention to 389 and 361 after the first and second interventions, respectively. There was a significant relationship between audit and feedback and a meaningful reduction in 7 of 15 laboratory tests, including complete blood count (p = 0.002), erythrocyte sedimentation rate (p = 0.01), C-reactive protein (p = 0.01), venous blood gas (p = 0.016), urine analysis (p = 0.005), blood culture (p = 0.045) and stool examination (p = 0.001). Conclusion: The audit and feedback intervention, even of short duration, affects physician ordering behaviour. It should be designed as a behaviour-based intervention informed by diagnosis of the contributing factors in physicians' behaviour. Further studies are required to substantiate the effectiveness of such behaviour-based intervention strategies in changing physician behaviour.

  15. A new first-order turbulence mixing model for the stable atmospheric boundary-layer: development and testing in large-eddy and single column models

    Science.gov (United States)

    Huang, J.; Bou-Zeid, E.; Golaz, J.

    2011-12-01

    Parameterization of the stably stratified atmospheric boundary layer is of crucial importance to different aspects of numerical weather prediction at regional scales and climate modeling at global scales, such as land-surface temperature forecasts, fog and frost prediction, and polar climate. It is well known that most operational climate models require excessive turbulence mixing of the stable boundary layer to prevent decoupling of the atmospheric component from the land component under strong stability, but the performance of such models is unlikely to be satisfactory under weakly and moderately stable conditions. In this study we develop and test a general turbulence mixing model of the stable boundary layer which works under different stabilities and for steady as well as unsteady conditions. A priori large-eddy simulation (LES) tests are presented to motivate and verify the new parameterization. Subsequently, an assessment of this model using the GFDL single-column model (SCM) is performed. Idealized test cases, including continuously varying stability as well as stability discontinuity, are used to test the new SCM against LES results. A good match of mean and flux profiles is found when the new parameterization is used, while traditional first-order turbulence models based on stability functions perform poorly. SCM spatial resolution is also found to have little impact on the performance of the new turbulence closure, but temporal resolution is important, and a numerical stability criterion based on the model time step is presented.
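The "traditional first-order turbulence models using the concept of stability function" that the abstract contrasts against typically close the turbulent momentum flux with an eddy viscosity built from a mixing length l and a stability function of the gradient Richardson number. The textbook form is shown below for context (this is the baseline approach, not the new model of the abstract):

```latex
\overline{u'w'} \;=\; -K_m\,\frac{\partial U}{\partial z},
\qquad
K_m \;=\; l^{2}\,\left|\frac{\partial U}{\partial z}\right|\, f_m(Ri)
```

The choice of f_m(Ri) is what forces the trade-off the abstract describes: functions that cut mixing off sharply at large Ri behave well in weak stability but let the atmosphere decouple from the surface, while long-tailed functions keep the coupling at the cost of excessive mixing.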

  16. Computerised decision support systems in order communication for diagnostic, screening or monitoring test ordering: systematic reviews of the effects and cost-effectiveness of systems.

    Science.gov (United States)

    Main, C; Moxham, T; Wyatt, J C; Kay, J; Anderson, R; Stein, K

    2010-10-01

    Order communication systems (OCS) are computer applications used to enter diagnostic and therapeutic patient care orders and to view test results. Many potential benefits of OCS have been identified including improvements in clinician ordering patterns, optimisation of clinical time, and aiding communication processes between clinicians and different departments. Many OCS now include computerised decision support systems (CDSS), which are information systems designed to improve clinical decision-making. CDSS match individual patient characteristics to a computerised knowledge base, and software algorithms generate patient-specific recommendations. To investigate which CDSS in OCS are in use within the UK and the impact of CDSS in OCS for diagnostic, screening or monitoring test ordering compared to OCS without CDSS. To determine what features of CDSS are associated with clinician or patient acceptance of CDSS in OCS and what is known about the cost-effectiveness of CDSS in diagnostic, screening or monitoring test OCS compared to OCS without CDSS. A generic search to identify potentially relevant studies for inclusion was conducted using MEDLINE, EMBASE, Cochrane Controlled Trials Register (CCTR), CINAHL (Cumulative Index to Nursing and Allied Health Literature), DARE (Database of Abstracts of Reviews of Effects), Health Technology Assessment (HTA) database, IEEE (Institute of Electrical and Electronic Engineers) Xplore digital library, NHS Economic Evaluation Database (NHS EED) and EconLit, searched between 1974 and 2009 with a total of 22,109 titles and abstracts screened for inclusion. CDSS for diagnostic, screening and monitoring test ordering OCS in use in the UK were identified through contact with the 24 manufacturers/suppliers currently contracted by the National Project for Information Technology (NpfIT) to provide either national or specialist decision support. A generic search to identify potentially relevant studies for inclusion in the review was

  17. 40 CFR 799.5000 - Testing consent orders for substances and mixtures with Chemical Abstract Service Registry Numbers.

    Science.gov (United States)

    2010-07-01

    ... section sets forth a list of substances and mixtures which are the subject of testing consent orders... the substances and mixtures which are the subject of these orders and the Federal Register citations... Crotonaldehyde Environmental effects November 9, 1989. Chemical fate November 9, 1989. 4675-54-3 Bisphenol A...

  18. Verifiable process monitoring through enhanced data authentication

    International Nuclear Information System (INIS)

    Goncalves, Joao G.M.; Schwalbach, Peter; Schoeneman, Barry Dale; Ross, Troy D.; Baldwin, George Thomas

    2010-01-01

    To ensure the peaceful intent for production and processing of nuclear fuel, verifiable process monitoring of the fuel production cycle is required. As part of a U.S. Department of Energy (DOE)-EURATOM collaboration in the field of international nuclear safeguards, the DOE Sandia National Laboratories (SNL), the European Commission Joint Research Centre (JRC) and Directorate General-Energy (DG-ENER) developed and demonstrated a new concept in process monitoring, enabling the use of operator process information by branching a second, authenticated data stream to the Safeguards inspectorate. This information would be complementary to independent safeguards data, improving the understanding of the plant's operation. The concept is called the Enhanced Data Authentication System (EDAS). EDAS transparently captures, authenticates, and encrypts communication data that is transmitted between operator control computers and connected analytical equipment utilized in nuclear process control. The intent is to capture information as close to the sensor point as possible to assure the highest possible confidence in the branched data. Data must be collected transparently by the EDAS: operator processes should not be altered or disrupted by the insertion of the EDAS as a monitoring system for safeguards. EDAS employs public key authentication, providing 'jointly verifiable' data, and private key encryption for confidentiality. Timestamps and data source are also added to the collected data for analysis. The core of the system hardware is in a security enclosure with both active and passive tamper indication. Further, the system has the ability to monitor seals or other security devices in close proximity. This paper will discuss the EDAS concept, recent technical developments, intended application philosophy and the planned future progression of this system.

  19. Testing the Processing Hypothesis of word order variation using a probabilistic language model

    NARCIS (Netherlands)

    Bloem, J.

    2016-01-01

    This work investigates the application of a measure of surprisal to modeling a grammatical variation phenomenon between near-synonymous constructions. We investigate a particular variation phenomenon, word order variation in Dutch two-verb clusters, where it has been established that word order

  20. Not All Order Memory Is Equal: Test Demands Reveal Dissociations in Memory for Sequence Information

    Science.gov (United States)

    Jonker, Tanya R.; MacLeod, Colin M.

    2017-01-01

    Remembering the order of a sequence of events is a fundamental feature of episodic memory. Indeed, a number of formal models represent temporal context as part of the memory system, and memory for order has been researched extensively. Yet, the nature of the code(s) underlying sequence memory is still relatively unknown. Across 4 experiments that…

  1. The Impact of Financing Surpluses and Large Financing Deficits on Tests of the Pecking Order Theory

    NARCIS (Netherlands)

    de Jong, Abe; Verbeek, Marno; Verwijmeren, Patrick

    2010-01-01

    This paper extends the basic pecking order model of Shyam-Sunder and Myers by separating the effects of financing surpluses, normal deficits, and large deficits. Using a panel of US firms over the period 1971-2005, we find that the estimated pecking order coefficient is highest for surpluses (0.90),

  2. ON TESTING OF CRYPTOGRAPHYC GENERATORS OUTPUT SEQUENCES USING MARKOV CHAINS OF CONDITIONAL ORDER

    Directory of Open Access Journals (Sweden)

    M. V. Maltsev

    2013-01-01

    The paper deals with the Markov chain of conditional order, which is used for statistical testing of cryptographic generators. Statistical estimations of model parameters are given. Consistency of the order estimator is proved. Results of computer experiments are presented.
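The estimation problem described in this record can be illustrated with a much simpler surrogate. The sketch below is hypothetical and is not Maltsev's conditional-order estimator: it picks the order of a plain binary Markov chain by BIC over transition-frequency counts, with invented generator examples (an order-0 i.i.d. source and an order-1 source with strong one-step memory).

```python
import math
import random
from collections import Counter

def bic_order(bits, max_order=4):
    """Return the BIC-selected Markov order for a 0/1 sequence."""
    n = len(bits)
    best_order, best_bic = 0, float("inf")
    for r in range(max_order + 1):
        counts = Counter()      # (context, next bit) transition frequencies
        ctx_totals = Counter()  # how often each length-r context occurs
        for i in range(r, n):
            ctx = tuple(bits[i - r:i])
            counts[(ctx, bits[i])] += 1
            ctx_totals[ctx] += 1
        loglik = sum(c * math.log(c / ctx_totals[ctx])
                     for (ctx, _), c in counts.items())
        k = 2 ** r              # free parameters: one per binary context
        bic = -2 * loglik + k * math.log(n)
        if bic < best_bic:
            best_order, best_bic = r, bic
    return best_order

random.seed(1)
iid = [random.randint(0, 1) for _ in range(5000)]   # order-0 source
markov1 = [0]
for _ in range(4999):                               # order-1 source:
    p = 0.9 if markov1[-1] == 0 else 0.1            # strong one-step memory
    markov1.append(0 if random.random() < p else 1)
print(bic_order(iid), bic_order(markov1))
```

In this simplified setting the log(n) penalty per free parameter is what keeps the estimator from over-fitting an independent sequence with a higher-order model.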

  3. Reduction in unnecessary red blood cell folate testing by restricting computerized physician order entry in the electronic health record.

    Science.gov (United States)

    MacMillan, Thomas E; Gudgeon, Patrick; Yip, Paul M; Cavalcanti, Rodrigo B

    2018-05-02

    Red blood cell folate is a laboratory test with limited clinical utility. Previous attempts to reduce physician ordering of unnecessary laboratory tests, including folate, have resulted in only modest success. The objective of this study was to assess the effectiveness and impacts of restricting red blood cell folate ordering in the electronic health record. This was a retrospective observational study from January 2010 to December 2016 at a large academic healthcare network in Toronto, Canada. All inpatients and outpatients who underwent at least 1 red blood cell folate or vitamin B12 test during the study period were included. Red blood cell folate ordering was restricted to clinicians in gastroenterology and hematology and was removed from other physicians' computerized order entry screen in the electronic health record in June 2013. Red blood cell folate testing decreased by 94.4% during the study, from a mean of 493.0 (SD 48.0) tests/month before intervention to 27.6 (SD 10.3) tests/month after intervention (P<.001). Restricting red blood cell folate ordering in the electronic health record resulted in a large and sustained reduction in red blood cell folate testing. Significant cost savings estimated at over a quarter-million dollars (CAD) over three years were achieved. There was no significant clinical impact of the intervention on the diagnosis of folate deficiency. Copyright © 2018. Published by Elsevier Inc.

  4. Concreteness effects in short-term memory: a test of the item-order hypothesis.

    Science.gov (United States)

    Roche, Jaclynn; Tolan, G Anne; Tehan, Gerald

    2011-12-01

    The following experiments explore word length and concreteness effects in short-term memory within an item-order processing framework. This framework asserts that order memory is better for those items that are relatively easy to process at the item level; however, words that are difficult to process benefit at the item level from the increased attention/resources applied to them. The prediction of the model is that differential item and order processing can be detected in episodic tasks that differ in the degree to which item or order memory is required by the task. The item-order account has been applied to the word length effect such that there is a short word advantage in serial recall but a long word advantage in item recognition. The current experiment considered the possibility that concreteness effects might be explained within the same framework. In two experiments, word length (Experiment 1) and concreteness (Experiment 2) are examined using forward serial recall, backward serial recall, and item recognition. The results for word length replicate previous studies showing the dissociation in item and order tasks. The same was not true for the concreteness effect. In all three tasks concrete words were better remembered than abstract words. The concreteness effect cannot be explained in terms of an item-order trade-off. PsycINFO Database Record (c) 2011 APA, all rights reserved.

  5. Blood test ordering for unexplained complaints in general practice: the VAMPIRE randomised clinical trial protocol. [ISRCTN55755886

    NARCIS (Netherlands)

    van Bokhoven, Marloes A.; Koch, Hèlen; van der Weijden, Trudy; Grol, Richard P. T. M.; Bindels, Patrick J. E.; Dinant, Geert-Jan

    2006-01-01

    BACKGROUND: General practitioners (GPs) frequently order blood tests when they see patients presenting with unexplained complaints. Due to the low prevalence of serious pathology in general practice, the risk of false-positive test results is relatively high. This may result in unnecessary further

  6. Psychological and social aspects verified after the Goiania's radioactive accident

    International Nuclear Information System (INIS)

    Helou, Suzana

    1995-01-01

    Psychological and social aspects verified after the radioactive accident that occurred in 1987 in Goiania, a Brazilian city, are discussed. To this end, a public opinion survey was conducted to portray the residual psychological effects of the Goiania radioactive accident. Data obtained from 1,126 interviews were consolidated. Four groups with different levels of involvement in the accident are compared with regard to the event. The survey allowed the conclusion that the accident affected, to some degree, the entire population of Goiania psychologically. The survey also allowed an analysis of the quality standard of the professionals' performance during the accident.

  7. Flux wire measurements in Cavalier for verifying computer code applications

    International Nuclear Information System (INIS)

    Fehr, M.; Stubbs, J.; Hosticka, B.

    1988-01-01

    The Cavalier and UVAR research reactors are to be converted from high-enrichment uranium (HEU) to low-enrichment uranium (LEU) fuel. As a first step, an extensive set of gold wire activation measurements has been taken on the Cavalier reactor. Axial traverses show internal consistency to the order of ±5%, while horizontal traverses show somewhat larger deviations. The activation measurements will be converted to flux measurements via the Thermos code and will then be used to verify the Leopard-2DB codes. The codes will ultimately be used to design an upgraded LEU core for the UVAR

  8. Spin temperature concept verified by optical magnetometry of nuclear spins

    Science.gov (United States)

    Vladimirova, M.; Cronenberger, S.; Scalbert, D.; Ryzhov, I. I.; Zapasskii, V. S.; Kozlov, G. G.; Lemaître, A.; Kavokin, K. V.

    2018-01-01

    We develop a method of nonperturbative optical control over adiabatic remagnetization of the nuclear spin system and apply it to verify the spin temperature concept in GaAs microcavities. The nuclear spin system is shown to exactly follow the predictions of the spin temperature theory, despite the quadrupole interaction that was earlier reported to disrupt nuclear spin thermalization. These findings open a way for the deep cooling of nuclear spins in semiconductor structures, with the prospect of realizing nuclear spin-ordered states for high-fidelity spin-photon interfaces.

  9. Reflex test reminders in required cancer synoptic templates decrease order entry error: An analysis of mismatch repair immunohistochemical orders to screen for Lynch syndrome

    Directory of Open Access Journals (Sweden)

    Mark R Kilgore

    2016-01-01

    being before S/O PostImp. Conclusion: This algorithm ensures MMR IHC ordering in women ≤60 with EC and can be applied to similar scenarios. Ancillary tests for management are increasing, especially genetic and molecular-based methods. The burden of managing orders and results remains with the pathologist and relying on human intervention alone is ineffective. Ordering IHC before or at S/O prevents oversight and the additional work of retrospective ordering and reporting.

  10. Reflex test reminders in required cancer synoptic templates decrease order entry error: An analysis of mismatch repair immunohistochemical orders to screen for Lynch syndrome.

    Science.gov (United States)

    Kilgore, Mark R; McIlwain, Carrie A; Schmidt, Rodney A; Norquist, Barbara M; Swisher, Elizabeth M; Garcia, Rochelle L; Rendi, Mara H

    2016-01-01

    ensures MMR IHC ordering in women ≤60 with EC and can be applied to similar scenarios. Ancillary tests for management are increasing, especially genetic and molecular-based methods. The burden of managing orders and results remains with the pathologist and relying on human intervention alone is ineffective. Ordering IHC before or at S/O prevents oversight and the additional work of retrospective ordering and reporting.

  11. Towards Verifying National CO2 Emissions

    Science.gov (United States)

    Fung, I. Y.; Wuerth, S. M.; Anderson, J. L.

    2017-12-01

    With the Paris Agreement, nations around the world have pledged their voluntary reductions in future CO2 emissions. Satellite observations of atmospheric CO2 have the potential to verify self-reported emission statistics around the globe. We present a carbon-weather data assimilation system, wherein raw weather observations together with satellite observations of the mixing ratio of column CO2 from the Orbiting Carbon Observatory-2 are assimilated every 6 hours into the NCAR carbon-climate model CAM5 coupled to the Ensemble Kalman Filter of DART. In an OSSE, we reduced the fossil fuel emissions from a country, and estimated the emissions innovations demanded by the atmospheric CO2 observations. The uncertainties in the innovation are analyzed with respect to the uncertainties in the meteorology to determine the significance of the result. The work follows from "On the use of incomplete historical data to infer the present state of the atmosphere" (Charney et al. 1969), which maps the path for continuous data assimilation for weather forecasting and the five decades of progress since.

  12. Not All the Bots Are Created Equal: The Ordering Turing Test for the Labeling of Bots in MMORPGs

    Directory of Open Access Journals (Sweden)

    Stefano De Paoli

    2017-11-01

    This article contributes to the research on bots in Social Media. It takes as its starting point an emerging perspective which proposes that we should abandon the investigation of the Turing Test and the functional aspects of bots in favor of studying the authentic and cooperative relationship between humans and bots. Contrary to this view, this article argues that Turing Tests are one of the ways in which authentic relationships between humans and bots take place. To understand this, this article introduces the concept of Ordering Turing Tests: these are a sort of Turing Test proposed by social actors for purposes of achieving social order when bots produce deviant behavior. An Ordering Turing Test is a method for labeling deviance, whereby social actors can use this test to tell apart rule-abiding humans and rule-breaking bots. Using examples from Massively Multiplayer Online Role-Playing Games, this article illustrates how Ordering Turing Tests are proposed and justified by players and service providers. Data for the research comes from scientific literature on Machine Learning proposed for the identification of bots and from game forums and other player-produced paratexts from the case study of the game Runescape.

  13. A novel strategy for evaluating the effects of an electronic test ordering alert message: Optimizing cardiac marker use

    Directory of Open Access Journals (Sweden)

    Jason M Baron

    2012-01-01

    Background: Laboratory ordering functions within computerized provider order entry (CPOE) systems typically support the display of electronic alert messages to improve test utilization or implement new ordering policies. However, alert strategies have been shown to vary considerably in their success and the characteristics contributing to an alert's success are poorly understood. Improved methodologies are needed to evaluate alerts and their mechanisms of action. Materials and Methods: Clinicians order inpatient and emergency department laboratory tests using our institutional CPOE system. We analyzed user interaction data captured by our CPOE system to evaluate how clinicians responded to an alert. We evaluated an alert designed to implement an institutional policy restricting the indications for ordering creatine kinase-MB (CKMB). Results: Within 2 months of alert implementation, CKMB-associated searches declined by 79% with a corresponding decline in CKMB orders. Furthermore, while prior to alert implementation, clinicians searching for CKMB ultimately ordered this test 99% of the time, following implementation, only 60% of CKMB searches ultimately led to CKMB test orders. This difference presumably represents clinicians who reconsidered the need for CKMB in response to the alert, demonstrating the alert's just-in-time advisory capability. In addition, as clinicians repeatedly viewed the alert, there was a "dose-dependent" decrease in the fraction of searches without orders. This presumably reflects the alerting strategy's long-term educational component, as clinicians aware of the new policy will not search for CKMB when not indicated. Conclusions: Our analytic approach provides insight into the mechanism of a CPOE alert and demonstrates that alerts may act through a combination of just-in-time advice and longer term education. Use of this approach when implementing alerts may prove useful to improve the success of a given alerting

  14. A novel strategy for evaluating the effects of an electronic test ordering alert message: Optimizing cardiac marker use.

    Science.gov (United States)

    Baron, Jason M; Lewandrowski, Kent B; Kamis, Irina K; Singh, Balaji; Belkziz, Sidi M; Dighe, Anand S

    2012-01-01

    Laboratory ordering functions within computerized provider order entry (CPOE) systems typically support the display of electronic alert messages to improve test utilization or implement new ordering policies. However, alert strategies have been shown to vary considerably in their success and the characteristics contributing to an alert's success are poorly understood. Improved methodologies are needed to evaluate alerts and their mechanisms of action. Clinicians order inpatient and emergency department laboratory tests using our institutional CPOE system. We analyzed user interaction data captured by our CPOE system to evaluate how clinicians responded to an alert. We evaluated an alert designed to implement an institutional policy restricting the indications for ordering creatine kinase-MB (CKMB). Within 2 months of alert implementation, CKMB-associated searches declined by 79% with a corresponding decline in CKMB orders. Furthermore, while prior to alert implementation, clinicians searching for CKMB ultimately ordered this test 99% of the time, following implementation, only 60% of CKMB searches ultimately led to CKMB test orders. This difference presumably represents clinicians who reconsidered the need for CKMB in response to the alert, demonstrating the alert's just-in-time advisory capability. In addition, as clinicians repeatedly viewed the alert, there was a "dose-dependent" decrease in the fraction of searches without orders. This presumably reflects the alerting strategy's long-term educational component, as clinicians aware of the new policy will not search for CKMB when not indicated. Our analytic approach provides insight into the mechanism of a CPOE alert and demonstrates that alerts may act through a combination of just-in-time advice and longer term education. Use of this approach when implementing alerts may prove useful to improve the success of a given alerting strategy.

  15. Visibility-Based Hypothesis Testing Using Higher-Order Optical Interference

    Science.gov (United States)

    Jachura, Michał; Jarzyna, Marcin; Lipka, Michał; Wasilewski, Wojciech; Banaszek, Konrad

    2018-03-01

    Many quantum information protocols rely on optical interference to compare data sets with efficiency or security unattainable by classical means. Standard implementations exploit first-order coherence between signals whose preparation requires a shared phase reference. Here, we analyze and experimentally demonstrate the binary discrimination of visibility hypotheses based on higher-order interference for optical signals with a random relative phase. This provides a robust protocol implementation primitive when a phase lock is unavailable or impractical. With the primitive cost quantified by the total detected optical energy, optimal operation is typically reached in the few-photon regime.

  16. 78 FR 59165 - Orders: Information Reporting With Respect to Stress Testing of Regulated Entities

    Science.gov (United States)

    2013-09-26

    ... Instructions and Guidance is effective on October 28, 2013. FOR FURTHER INFORMATION CONTACT: Naa Awaa Tagoe, Senior Associate Director, Office of Financial Analysis, Modeling and Simulations, (202) 649-3140... Guidance accompanying each Order provides to the regulated entities general advice concerning the content...

  17. A High-Order Test for Optimality of Bang-Bang Controls.

    Science.gov (United States)

    1983-11-01

    Systems. * Istituto di Matematica Applicata, Università di Padova, Italy. Sponsored by the United States Army under Contract No. DAAG29-80-C-0041. ... the first-order variation at the terminal point of the trajectory, lim_{ε→0} [x(T, u_ε) − x(T, u)]/ε (1.1). * Istituto di Matematica Applicata, Università di Padova

  18. Construction of experimental HMA test sections in order to monitor the compaction process

    NARCIS (Netherlands)

    ter Huerne, Henderikus L.; Molenaar, A.A.A.; van de Ven, M.F.C.

    2003-01-01

    To gain a better understanding of the process of HMA compaction, a test section was constructed while the governing process parameters, such as compaction progress, the temperature of the material at which activities were carried out, equipment properties and meteorological circumstances, were

  19. 40 CFR 1068.405 - What is in a test order?

    Science.gov (United States)

    2010-07-01

    ... CONTROLS GENERAL COMPLIANCE PROVISIONS FOR ENGINE PROGRAMS Selective Enforcement Auditing § 1068.405 What.../equipment for testing. The information would apply only for a single model year so it would be best to...

  20. A note on power and sample size calculations for the Kruskal-Wallis test for ordered categorical data.

    Science.gov (United States)

    Fan, Chunpeng; Zhang, Donghui

    2012-01-01

    Although the Kruskal-Wallis test has been widely used to analyze ordered categorical data, power and sample size methods for this test have been investigated to a much lesser extent when the underlying multinomial distributions are unknown. This article generalizes the power and sample size procedures proposed by Fan et al. (2011) for continuous data to ordered categorical data, when estimates from a pilot study are used in place of knowledge of the true underlying distribution. Simulations show that the proposed power and sample size formulas perform well. A myelin oligodendrocyte glycoprotein (MOG)-induced experimental autoimmune encephalomyelitis (EAE) mouse study is used to demonstrate the application of the methods.
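The pilot-plug-in idea can also be checked by brute force. The sketch below is an illustrative Monte Carlo power estimate, not the closed-form formulas of Fan et al.: pilot multinomial estimates (invented here) define each group's distribution over ordered categories, repeated samples are drawn, and scipy.stats.kruskal supplies the test itself.

```python
import numpy as np
from scipy.stats import kruskal

def kw_power(pilot_probs, n_per_group, alpha=0.05, n_sim=2000, seed=0):
    """Monte Carlo power of the Kruskal-Wallis test for ordered categories."""
    rng = np.random.default_rng(seed)
    n_cat = len(pilot_probs[0])
    rejections = 0
    for _ in range(n_sim):
        # Draw one ordered-categorical sample per group from the pilot estimates.
        samples = [rng.choice(n_cat, size=n_per_group, p=p) for p in pilot_probs]
        _, pval = kruskal(*samples)
        rejections += pval < alpha
    return rejections / n_sim

# Invented pilot estimates over 4 ordered severity categories:
pilot = [[0.4, 0.3, 0.2, 0.1],   # group 1, concentrated in low categories
         [0.1, 0.2, 0.3, 0.4]]   # group 2, shifted toward high categories
power = kw_power(pilot, n_per_group=50)
print(power)
```

The same loop can be wrapped in a search over n_per_group to find the smallest sample size reaching a target power, which is the practical use case the abstract describes.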

  1. Verifying cell loss requirements in high-speed communication networks

    Directory of Open Access Journals (Sweden)

    Kerry W. Fendick

    1998-01-01

    In high-speed communication networks it is common to have requirements of very small cell loss probabilities due to buffer overflow. Losses are measured to verify that the cell loss requirements are being met, but it is not clear how to interpret such measurements. We propose methods for determining whether or not cell loss requirements are being met. A key idea is to look at the stream of losses as successive clusters of losses. Often clusters of losses, rather than individual losses, should be regarded as the important “loss events”. Thus we propose modeling the cell loss process by a batch Poisson stochastic process. Successive clusters of losses are assumed to arrive according to a Poisson process. Within each cluster, cell losses do not occur at a single time, but the distance between losses within a cluster should be negligible compared to the distance between clusters. Thus, for the purpose of estimating the cell loss probability, we ignore the spaces between successive cell losses in a cluster of losses. Asymptotic theory suggests that the counting process of losses initiating clusters often should be approximately a Poisson process even though the cell arrival process is not nearly Poisson. The batch Poisson model is relatively easy to test statistically and fit; e.g., the batch-size distribution and the batch arrival rate can readily be estimated from cell loss data. Since batch (cluster) sizes may be highly variable, it may be useful to focus on the number of batches instead of the number of cells in a measurement interval. We also propose a method for approximately determining the parameters of a special batch Poisson cell loss process with geometric batch-size distribution from a queueing model of the buffer content. For this step, we use a reflected Brownian motion (RBM) approximation of a G/D/1/C queueing model. We also use the RBM model to estimate the input burstiness given the cell loss rate. In addition, we use the RBM model to
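The fitting step described above (estimating the batch arrival rate and batch-size distribution from cell loss data) can be sketched directly. The gap threshold, timestamps and function below are illustrative assumptions, not values from the paper: losses closer together than a threshold are treated as one cluster, the cluster rate estimates the Poisson batch rate, and a geometric batch-size parameter is read off the mean cluster size.

```python
def fit_batch_poisson(loss_times, horizon, gap=1e-3):
    """Return (batch_rate, geometric_p, batch_sizes) from raw loss timestamps."""
    batch_sizes = []
    current = 1
    for prev, cur in zip(loss_times, loss_times[1:]):
        if cur - prev <= gap:          # same cluster: losses nearly back-to-back
            current += 1
        else:                          # a large gap starts a new cluster
            batch_sizes.append(current)
            current = 1
    batch_sizes.append(current)
    batch_rate = len(batch_sizes) / horizon        # clusters per unit time
    mean_size = sum(batch_sizes) / len(batch_sizes)
    geometric_p = 1.0 / mean_size                  # geometric(p) has mean 1/p
    return batch_rate, geometric_p, batch_sizes

# Toy trace: two tight clusters of losses and one isolated loss over 10 s.
times = [1.0000, 1.0002, 1.0004, 4.0, 7.0000, 7.0001]
rate, p, sizes = fit_batch_poisson(times, horizon=10.0)
print(sizes, rate, p)
```

With real measurements the key modeling decision is the gap threshold, which encodes the paper's assumption that within-cluster spacings are negligible relative to between-cluster spacings.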

  2. Middleborns disadvantaged? Testing birth-order effects on fitness in pre-industrial Finns.

    Directory of Open Access Journals (Sweden)

    Charlotte Faurie

    Parental investment is a limited resource for which offspring compete in order to increase their own survival and reproductive success. However, parents might be selected to influence the outcome of sibling competition through differential investment. While evidence for this is widespread in egg-laying species, whether or not this may also be the case in viviparous species is more difficult to determine. We use pre-industrial Finns as our model system and an equal investment model as our null hypothesis, which predicts that (all else being equal) middleborns should be disadvantaged through competition. We found no overall evidence to suggest that middleborns in a family are disadvantaged in terms of their survival, age at first reproduction or lifetime reproductive success. However, when considering birth-order only among same-sexed siblings, first-, middle- and lastborn sons significantly differed in the number of offspring they were able to rear to adulthood, although there was no similar effect among females. Middleborn sons appeared to produce significantly fewer offspring than first- or lastborn sons, but they did not significantly differ from lastborn sons in the number of offspring reared to adulthood. Our results thus show that taking sex differences into account is important when modelling birth-order effects. We found clear evidence of firstborn sons being advantaged over other sons in the family, and over firstborn daughters. Therefore, our results suggest that parents invest differentially in their offspring in order to both preferentially favour particular offspring or reduce offspring inequalities arising from sibling competition.

  3. Bayesian mixture models for partially verified data

    DEFF Research Database (Denmark)

    Kostoulas, Polychronis; Browne, William J.; Nielsen, Søren Saxmose

    2013-01-01

    for some individuals, in order to minimize this loss in the discriminatory power. The distribution of the continuous antibody response against MAP has been obtained for healthy, MAP-infected and MAP-infectious cows of different age groups. The overall power of the milk-ELISA to discriminate between healthy...... and MAP-infected cows was extremely poor but was high between healthy and MAP-infectious. The discriminatory ability increased with increasing age. The great overlap between the distributions of the different infection stages would have hampered our ability to discriminate between the different infection...

  4. Test the Overall Significance of p-values by Using Joint Tail Probability of Ordered p-values as Test Statistic

    OpenAIRE

    Fang, Yongxiang; Wit, Ernst

    2008-01-01

    Fisher’s combined probability test is the most commonly used method to test the overall significance of a set of independent p-values. However, it is obvious that Fisher’s statistic is more sensitive to smaller p-values than to larger ones, and a small p-value may overrule the other p-values and decide the test result. This is, in some cases, viewed as a flaw. In order to overcome this flaw and improve the power of the test, the joint tail probability of a set of p-values is proposed as a ...
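The sensitivity the authors describe is easy to demonstrate. The sketch below is an illustrative example with invented p-value sets; it uses scipy.stats.combine_pvalues to show that swapping one tiny p-value into an otherwise unremarkable set flips Fisher's combined verdict on its own.

```python
from scipy.stats import combine_pvalues

unremarkable = [0.4, 0.5, 0.6, 0.7, 0.5]
with_outlier = [1e-8, 0.5, 0.6, 0.7, 0.5]   # same set, one tiny p-value swapped in

stat_a, p_a = combine_pvalues(unremarkable, method="fisher")
stat_b, p_b = combine_pvalues(with_outlier, method="fisher")
print(p_a, p_b)   # the single small p-value drives the combined result
```

Because Fisher's statistic is −2 Σ ln p_i, one near-zero p-value contributes an arbitrarily large term, which is exactly the behavior the proposed ordered-p-value statistic is meant to temper.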

  5. DEA models equivalent to general Nth order stochastic dominance efficiency tests

    Czech Academy of Sciences Publication Activity Database

    Branda, Martin; Kopa, Miloš

    2016-01-01

    Roč. 44, č. 2 (2016), s. 285-289 ISSN 0167-6377 R&D Projects: GA ČR GA13-25911S; GA ČR GA15-00735S Grant - others:GA ČR(CZ) GA15-02938S Institutional support: RVO:67985556 Keywords : Nth order stochastic dominance efficiency * Data envelopment analysis * Convex NSD efficiency * NSD portfolio efficiency Subject RIV: BB - Applied Statistics, Operational Research Impact factor: 0.657, year: 2016 http://library.utia.cas.cz/separaty/2016/E/branda-0458120.pdf

  6. Higher-Order Asymptotics and Its Application to Testing the Equality of the Examinee Ability Over Two Sets of Items.

    Science.gov (United States)

    Sinharay, Sandip; Jensen, Jens Ledet

    2018-06-27

    In educational and psychological measurement, researchers and/or practitioners are often interested in examining whether the ability of an examinee is the same over two sets of items. Such problems can arise in measurement of change, detection of cheating on unproctored tests, erasure analysis, detection of item preknowledge, etc. Traditional frequentist approaches that are used in such problems include the Wald test, the likelihood ratio test, and the score test (e.g., Fischer, Appl Psychol Meas 27:3-26, 2003; Finkelman, Weiss, & Kim-Kang, Appl Psychol Meas 34:238-254, 2010; Glas & Dagohoy, Psychometrika 72:159-180, 2007; Guo & Drasgow, Int J Sel Assess 18:351-364, 2010; Klauer & Rettig, Br J Math Stat Psychol 43:193-206, 1990; Sinharay, J Educ Behav Stat 42:46-68, 2017). This paper shows that approaches based on higher-order asymptotics (e.g., Barndorff-Nielsen & Cox, Inference and asymptotics. Springer, London, 1994; Ghosh, Higher order asymptotics. Institute of Mathematical Statistics, Hayward, 1994) can also be used to test for the equality of the examinee ability over two sets of items. The modified signed likelihood ratio test (e.g., Barndorff-Nielsen, Biometrika 73:307-322, 1986) and the Lugannani-Rice approximation (Lugannani & Rice, Adv Appl Prob 12:475-490, 1980), both of which are based on higher-order asymptotics, are shown to provide some improvement over the traditional frequentist approaches in three simulations. Two real data examples are also provided.

  7. 34 CFR 462.41 - How must tests be administered in order to accurately measure educational gain?

    Science.gov (United States)

    2010-07-01

    ... measure educational gain? 462.41 Section 462.41 Education Regulations of the Offices of the Department of... EDUCATIONAL GAIN IN THE NATIONAL REPORTING SYSTEM FOR ADULT EDUCATION What Requirements Must States and Local Eligible Providers Follow When Measuring Educational Gain? § 462.41 How must tests be administered in order...

  8. A Test of Macromolecular Crystallization in Microgravity: Large, Well-Ordered Insulin Crystals

    Science.gov (United States)

    Borgstahl, Gloria E. O.; Vahedi-Faridi, Ardeschir; Lovelace, Jeff; Bellamy, Henry D.; Snell, Edward H.; Whitaker, Ann F. (Technical Monitor)

    2001-01-01

    Crystals of insulin grown in microgravity on space shuttle mission STS-95 were extremely well-ordered and unusually large (many > 2 mm). The physical characteristics of six microgravity and six earth-grown crystals were examined by X-ray analysis employing superfine φ slicing and unfocused synchrotron radiation. This experimental setup allowed hundreds of reflections to be precisely examined for each crystal in a short period of time. The microgravity crystals were on average 34 times larger, had 7 times lower mosaicity, had 54 times higher reflection peak heights and diffracted to significantly higher resolution than their earth grown counterparts. A single mosaic domain model could account for reflections in microgravity crystals whereas reflections from earth crystals required a model with multiple mosaic domains. This statistically significant and unbiased characterization indicates that the microgravity environment was useful for the improvement of crystal growth and resultant diffraction quality in insulin crystals and may be similarly useful for macromolecular crystals in general.

  9. Breath Tests Application in Order to Improve the Outcomes of Treatment for Celiac Disease

    Directory of Open Access Journals (Sweden)

    Ye.Yu. Gubskaya

    2014-02-01

    The article presents data from our own study of modern opportunities to improve the effectiveness of treatment of patients with celiac disease (n = 41). All patients were on a gluten-free diet, yet the effectiveness of treatment was regarded as unsatisfactory. Using modern carbon and hydrogen breath tests, we diagnosed bacterial overgrowth syndrome, lactase deficiency and exocrine pancreatic insufficiency as the causes of the persistent clinical symptoms, which gave grounds for their correction and led to complete remission of the underlying disease.

  10. Astrophysical Tests of Kinematical Conformal Cosmology in Fourth-Order Conformal Weyl Gravity

    Directory of Open Access Journals (Sweden)

    Gabriele U. Varieschi

    2014-12-01

    In this work we analyze kinematical conformal cosmology (KCC), an alternative cosmological model based on conformal Weyl gravity (CG), and test it against current type Ia supernova (SNIa) luminosity data and other astrophysical observations. Expanding upon previous work on the subject, we revise the analysis of SNIa data, confirming that KCC can explain the evidence for an accelerating expansion of the Universe without using dark energy or other exotic components. We obtain an independent evaluation of the Hubble constant, H0 = 67.53 km s-1 Mpc-1, very close to the current best estimates. The main KCC and CG parameters are re-evaluated and their revised values are found to be close to previous estimates. We also show that available data for the Hubble parameter as a function of redshift can be fitted using KCC and that this model does not suffer from any apparent age problem.

  11. Consequences of discrepancies on verified material balances

    International Nuclear Information System (INIS)

    Jaech, J.L.; Hough, C.G.

    1983-01-01

    the test statistic, and this is followed by a test on an optimized MUF, taking into account both the facility's and the inspector's data. (author)

  12. USCIS E-Verify Customer Satisfaction Survey, January 2013

    Data.gov (United States)

    Department of Homeland Security — This report focuses on the customer satisfaction of companies currently enrolled in the E-Verify program. Satisfaction with E-Verify remains high and follows up a...

  13. New concepts in nuclear arms control: verified cutoff and verified disposal

    International Nuclear Information System (INIS)

    Donnelly, W.H.

    1990-01-01

    Limiting the numbers of nuclear warheads by reducing military production and stockpiles of fissionable materials has been a constant item on the nuclear arms control agenda for the last 45 years. It has become more salient recently, however, because of two events: the enforced closure for safety reasons of the current United States military plutonium production facilities; and the possibility that the US and USSR may soon conclude an agreement providing for the verified destruction of significant numbers of nuclear warheads and the recovery of the fissionable material they contain with the option of transferring these materials to peaceful uses. A study has been made of the practical problems of verifying the cutoff of fissionable material production for military purposes in the nuclear weapon states, as well as providing assurance that material recovered from warheads is not re-used for proscribed military purposes and facilitating its transfer to civil uses. Implementation of such measures would have important implications for non-proliferation. The resultant paper was presented to a meeting of the PPNN Core Group held in Baden, close to Vienna, over the weekend of 18/19th November 1989 and is reprinted in this booklet. (author)

  14. Verifying competence of operations personnel in nuclear power plants

    International Nuclear Information System (INIS)

    Farber, G.H.

    1986-01-01

    To ensure that only competent people are authorized to fill positions in a nuclear power plant, both the initial competence of personnel and the continuous maintenance of competence have to be verified. Two main methods are normally used for verifying competence, namely evaluation of a person's performance over a period of time, and evaluation of his knowledge and skills at a particular time by means of an examination. Both methods have limitations, and in practice they are often used together to give different and to some extent complementary evaluations of a person's competence. Verification of competence is itself a problem area, because objectively judging human competence is extremely difficult. Formal verification methods, such as tests and examinations, are applied mainly or exclusively to the direct operating personnel in the control room (very rarely to management personnel). Of the many elements contributing to a person's competence, the required knowledge and intellectual skills are the main subjects of the formal verification methods. Therefore the presentation concentrates on the proof of the technical qualification of operators by means of examinations. The examination process in the Federal Republic of Germany for the proof of knowledge and skills serves as an example to describe and analyze the important aspects. From this, recommendations are derived regarding standardization and validation of the procedure. (orig./GL)

  15. Verified Interval Orbit Propagation in Satellite Collision Avoidance

    NARCIS (Netherlands)

    Römgens, B.A.; Mooij, E.; Naeije, M.C.

    2011-01-01

    Verified interval integration methods enclose a solution set corresponding to interval initial values and parameters, and bound integration and rounding errors. Verified methods suffer from overestimation of the solution, i.e., non-solutions are also included in the solution enclosure. Two verified

  16. 20 CFR 401.45 - Verifying your identity.

    Science.gov (United States)

    2010-04-01

    ... 20 Employees' Benefits 2 2010-04-01 2010-04-01 false Verifying your identity. 401.45 Section 401... INFORMATION The Privacy Act § 401.45 Verifying your identity. (a) When required. Unless you are making a... representative, you must verify your identity in accordance with paragraph (b) of this section if: (1) You make a...

  17. 28 CFR 802.13 - Verifying your identity.

    Science.gov (United States)

    2010-07-01

    ... 28 Judicial Administration 2 2010-07-01 2010-07-01 false Verifying your identity. 802.13 Section... COLUMBIA DISCLOSURE OF RECORDS Privacy Act § 802.13 Verifying your identity. (a) Requests for your own records. When you make a request for access to records about yourself, you must verify your identity. You...

  18. Verifying black hole orbits with gravitational spectroscopy

    International Nuclear Information System (INIS)

    Drasco, Steve

    2009-01-01

    Gravitational waves from test masses bound to geodesic orbits of rotating black holes are simulated, using Teukolsky's black hole perturbation formalism, for about ten thousand generic orbital configurations. Each binary radiates power exclusively in modes with frequencies that are integer-linear combinations of the orbit's three fundamental frequencies. General spectral properties are found with a survey of orbits about a black hole taken to be rotating at 80% of the maximal spin. The orbital eccentricity is varied from 0.1 to 0.9. Inclination ranges from 20 deg. to 160 deg. and comes to within 20 deg. of polar. Semilatus rectum is varied from 1.2 to 3 times the value at the innermost stable circular orbits. The following general spectral properties are found: (i) 99% of the radiated power is typically carried by a few hundred modes, and at most by about a thousand modes, (ii) the dominant frequencies can be grouped into a small number of families defined by fixing two of the three integer frequency multipliers, and (iii) the specifics of these trends can be qualitatively inferred from the geometry of the orbit under consideration. Detections using triperiodic analytic templates modeled on these general properties would constitute a verification of radiation from an adiabatic sequence of black hole orbits and would recover the evolution of the fundamental orbital frequencies. In an analogy with ordinary spectroscopy, this would compare to observing the Bohr model's atomic hydrogen spectrum without being able to rule out alternative atomic theories or nuclei. The suitability of such a detection technique is demonstrated using snapshots computed at 12-hour intervals throughout the last three years before merger of a kludged inspiral. The system chosen is typical of those thought to occur in galactic nuclei and to be observable with space-based gravitational wave detectors like LISA. Because of circularization, the number of excited modes decreases as the binary

  19. TESTING OF PECKING ORDER THEORY THROUGH THE RELATIONSHIP: EARNINGS, CAPITAL STRUCTURE, DIVIDEND POLICY, AND FIRM’S VALUE

    Directory of Open Access Journals (Sweden)

    Harmono Harmono

    2017-03-01

    This study aimed to test the pecking order theory through the correlations among the earnings dimension, capital structure, dividend policy, and firm's value. The correlations between dimensions indicate whether management behavior tends toward accumulating retained earnings or toward borrowing in financing the firm's operations. The pecking order theory is supported when management prefers accumulating retained earnings as the source of equity funds rather than borrowing liabilities from creditors; rationally, once the capital structure is optimal, management turns to external financing until earnings and debt financing trade off. Based on the hypothesis tests, the capital structure dimension acts as a significant intervening variable between the earnings dimension and firm's value, whereas the dividend policy is not a significant intervening variable. Empirically, it can be concluded that management behavior in Indonesia tends toward leverage rather than retained-earnings accumulation, in support of the pecking order theory. Furthermore, the capital structure dimension, especially the debt-to-assets and debt-to-equity ratios, serves to differentiate the characteristics of industries.

  20. An Audit of Repeat Testing at an Academic Medical Center: Consistency of Order Patterns With Recommendations and Potential Cost Savings.

    Science.gov (United States)

    Hueth, Kyle D; Jackson, Brian R; Schmidt, Robert L

    2018-05-31

    To evaluate the prevalence of potentially unnecessary repeat testing (PURT) and the associated economic burden for an inpatient population at a large academic medical facility. We evaluated all inpatient test orders during 2016 for PURT by comparing the intertest times to published recommendations. Potential cost savings were estimated using the Centers for Medicare & Medicaid Services maximum allowable reimbursement rate. We evaluated result positivity as a determinant of PURT through logistic regression. Of the evaluated 4,242 repeated target tests, 1,849 (44%) were identified as PURT, representing an estimated cost-savings opportunity of $37,376. Collectively, the association of result positivity and PURT was statistically significant (relative risk, 1.2; 95% confidence interval, 1.1-1.3; P < .001). PURT contributes to unnecessary health care costs. We found that a small percentage of providers account for the majority of PURT, and PURT is positively associated with result positivity.
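
The relative risk reported above comes from a standard 2x2 comparison of positivity between repeated and non-repeated orders. A minimal sketch of computing a relative risk and its Wald log-scale confidence interval (the counts below are made up for illustration, not the study's data):

```python
import math

def relative_risk(a, b, c, d, z=1.96):
    """Relative risk of an outcome in an exposed group (a positives, b negatives)
    vs an unexposed group (c positives, d negatives), with a Wald log-scale CI."""
    rr = (a / (a + b)) / (c / (c + d))
    se = math.sqrt(1 / a - 1 / (a + b) + 1 / c - 1 / (c + d))  # SE of log(RR)
    half = z * se
    return rr, math.exp(math.log(rr) - half), math.exp(math.log(rr) + half)

# Hypothetical counts: 60/100 positive among repeats vs 50/100 among non-repeats
rr, lo, hi = relative_risk(60, 40, 50, 50)
```

With these illustrative counts the point estimate is RR = 1.2, matching the magnitude reported in the abstract; whether the interval excludes 1 depends on the sample sizes.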

  1. A Finite Equivalence of Verifiable Multi-secret Sharing

    Directory of Open Access Journals (Sweden)

    Hui Zhao

    2012-02-01

    We give an abstraction of verifiable multi-secret sharing schemes that is accessible to a fully mechanized analysis. This abstraction is formalized within the applied pi-calculus using an equational theory that characterizes the cryptographic semantics of secret sharing. We also present an encoding from the equational theory into a convergent rewriting system, which is suitable for the automated protocol verifier ProVerif. Based on that, we verify the threshold certificate protocol in ProVerif.

  2. TESTING ON PECKING ORDER THEORY AND ANALYSIS OF COMPANY’S CHARACTERISTIC EFFECTS ON EMITTEN’S CAPITAL STRUCTURE

    Directory of Open Access Journals (Sweden)

    Tajudin Noor

    2015-05-01

    Pecking Order Theory (POT) states that capital structure is determined by a financing hierarchy based on the cheapest cost, starting with internal funds and followed by external funds. The research objectives were to examine the concept of POT in agriculture companies listed on the Indonesia Stock Exchange in order to decide on capital structure policies, as well as to analyse the effects of company characteristics on the issuers' capital structure. The research used regression analysis with the pooled least squares (PLS) method to test POT, while a fixed effects model (FEM) was applied to analyze the effect of company characteristics on capital structure. The regression analysis evaluating the pecking order concept shows that the internal funding deficit has a significant positive influence on the change in long-term debt. The regression on company characteristics (profitability, size, growth, tangibility and liquidity) shows that company size and growth have significant positive effects on capital structure (leverage), whereas profitability and liquidity have significant negative effects. By contrast, asset structure (tangibility) has no significant influence on capital structure (leverage) at the 10% level of significance. The research shows that issuers in the agricultural sector have implemented the concept of POT through a hierarchical use of the cheapest financing, prioritizing internal funds and then external financing (debt). Keywords: Pecking Order Theory, capital structure, company characteristics, PLS, FEM

  3. AUTOMATIC ESTIMATION OF SIZE PARAMETERS USING VERIFIED COMPUTERIZED STEREOANALYSIS

    Directory of Open Access Journals (Sweden)

    Peter R Mouton

    2011-05-01

    State-of-the-art computerized stereology systems combine high-resolution video microscopy and hardware-software integration with stereological methods to assist users in quantifying multidimensional parameters of importance to biomedical research, including volume, surface area, length, number, their variation and spatial distribution. The requirement for constant interaction between a trained, non-expert user and the targeted features of interest currently limits the throughput efficiency of these systems. To address this issue we developed a novel approach for automatic stereological analysis of 2-D images, Verified Computerized Stereoanalysis (VCS). The VCS approach minimizes the need for user interaction with high-contrast [high signal-to-noise ratio (S:N)] biological objects of interest. Performance testing of the VCS approach confirmed dramatic increases in the efficiency of total object volume (size) estimation, without a loss of accuracy or precision compared to conventional computerized stereology. The broad application of high-efficiency VCS to high-contrast biological objects on tissue sections could reduce labor costs, enhance hypothesis testing, and accelerate the progress of biomedical research focused on improvements in health and the management of disease.
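
The total object volume estimation that VCS accelerates is classically done with the Cavalieri point-counting estimator: volume is section thickness times the grid area associated with each point, times the total number of grid points hitting the object. A minimal sketch (the section thickness, area per point, and point counts below are invented for illustration, not taken from the paper):

```python
def cavalieri_volume(points_per_section, section_thickness, area_per_point):
    """Cavalieri volume estimate: V = t * (a/p) * sum of points hitting the object.

    points_per_section: grid-point hit counts on each systematic section
    section_thickness:  distance t between sections (same length unit as a/p)
    area_per_point:     grid area a/p associated with one test point
    """
    return section_thickness * area_per_point * sum(points_per_section)

# Hypothetical object sampled on three sections 2.0 units apart,
# with a point grid of 0.5 square units per point:
v = cavalieri_volume([10, 20, 10], 2.0, 0.5)
```

The estimate is unbiased regardless of object shape as long as the sections are systematic with a random start, which is why it underlies volume estimation in computerized stereology.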

  4. Verifying large modular systems using iterative abstraction refinement

    International Nuclear Information System (INIS)

    Lahtinen, Jussi; Kuismin, Tuomas; Heljanko, Keijo

    2015-01-01

    Digital instrumentation and control (I&C) systems are increasingly used in the nuclear engineering domain. The exhaustive verification of these systems is challenging, and the usual verification methods such as testing and simulation are typically insufficient. Model checking is a formal method that is able to exhaustively analyse the behaviour of a model against a formally written specification. If the model checking tool detects a violation of the specification, it will give out a counter-example that demonstrates how the specification is violated in the system. Unfortunately, sometimes real life system designs are too big to be directly analysed by traditional model checking techniques. We have developed an iterative technique for model checking large modular systems. The technique uses abstraction based over-approximations of the model behaviour, combined with iterative refinement. The main contribution of the work is the concrete abstraction refinement technique based on the modular structure of the model, the dependency graph of the model, and a refinement sampling heuristic similar to delta debugging. The technique is geared towards proving properties, and outperforms BDD-based model checking, the k-induction technique, and the property directed reachability algorithm (PDR) in our experiments. - Highlights: • We have developed an iterative technique for model checking large modular systems. • The technique uses BDD-based model checking, k-induction, and PDR in parallel. • We have tested our algorithm by verifying two models with it. • The technique outperforms classical model checking methods in our experiments
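
The refinement sampling heuristic above is described as similar to delta debugging. As a hedged illustration of that underlying idea (Zeller's ddmin algorithm in its basic form, not the paper's actual implementation), the following sketch shrinks a failing input to a smaller input that still fails:

```python
def ddmin(items, fails):
    """Basic ddmin sketch: given a list for which fails() is True, return a
    smaller sublist that still fails, by repeatedly removing chunks
    (complements) at increasing granularity."""
    assert fails(items)
    n = 2  # current number of chunks
    while len(items) >= 2:
        subset_len = len(items) // n
        reduced = False
        start = 0
        while start < len(items):
            # Try the complement: everything except one chunk.
            complement = items[:start] + items[start + subset_len:]
            if fails(complement):
                items = complement
                n = max(n - 1, 2)
                reduced = True
                break
            start += subset_len
        if not reduced:
            if n >= len(items):
                break
            n = min(n * 2, len(items))  # refine granularity
    return items

# Toy failure: the "model checker" reports a violation iff both 3 and 7 are present.
minimal = ddmin(list(range(10)), lambda xs: 3 in xs and 7 in xs)
```

For the toy failure condition the ten-element input shrinks to the two elements that jointly trigger it; the paper's heuristic applies the same shrink-the-complement idea to modules of the model rather than list elements.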

  5. Measurements of higher-order mode damping in the PEP-II low-power test cavity

    International Nuclear Information System (INIS)

    Rimmer, R.A.; Goldberg, D.A.

    1993-05-01

    The paper describes the results of measurements of the Higher-Order Mode (HOM) spectrum of the low-power test model of the PEP-II RF cavity and the reduction in the Q's of the modes achieved by the addition of dedicated damping waveguides. All the longitudinal (monopole) and deflecting (dipole) modes below the beam pipe cut-off are identified by comparing their measured frequencies and field distributions with calculations using the URMEL code. Field configurations were determined using a perturbation method with an automated bead positioning system. The loaded Q's agree well with the calculated values reported previously, and the strongest HOMs are damped by more than three orders of magnitude. This is sufficient to reduce the coupled-bunch growth rates to within the capability of a reasonable feedback system. A high power test cavity will now be built to validate the thermal design at the 150 kW nominal operating level, as described elsewhere at this conference

  6. Appraising the value of independent EIA follow-up verifiers

    Energy Technology Data Exchange (ETDEWEB)

    Wessels, Jan-Albert, E-mail: janalbert.wessels@nwu.ac.za [School of Geo and Spatial Sciences, Department of Geography and Environmental Management, North-West University, C/O Hoffman and Borcherd Street, Potchefstroom, 2520 (South Africa); Retief, Francois, E-mail: francois.retief@nwu.ac.za [School of Geo and Spatial Sciences, Department of Geography and Environmental Management, North-West University, C/O Hoffman and Borcherd Street, Potchefstroom, 2520 (South Africa); Morrison-Saunders, Angus, E-mail: A.Morrison-Saunders@murdoch.edu.au [School of Geo and Spatial Sciences, Department of Geography and Environmental Management, North-West University, C/O Hoffman and Borcherd Street, Potchefstroom, 2520 (South Africa); Environmental Assessment, School of Environmental Science, Murdoch University, Australia. (Australia)

    2015-01-15

    Independent Environmental Impact Assessment (EIA) follow-up verifiers such as monitoring agencies, checkers, supervisors and control officers are active on various construction sites across the world. There are, however, differing views on the value that these verifiers add and very limited learning in EIA has been drawn from independent verifiers. This paper aims to appraise how and to what extent independent EIA follow-up verifiers add value in major construction projects in the developing country context of South Africa. A framework for appraising the role of independent verifiers was established and four South African case studies were examined through a mixture of site visits, project document analysis, and interviews. Appraisal results were documented in the performance areas of: planning, doing, checking, acting, public participating and integration with other programs. The results indicate that independent verifiers add most value to major construction projects when involved with screening EIA requirements of new projects, allocation of financial and human resources, checking legal compliance, influencing implementation, reporting conformance results, community and stakeholder engagement, integration with self-responsibility programs such as environmental management systems (EMS), and controlling records. It was apparent that verifiers could be more creatively utilized in pre-construction preparation, providing feedback of knowledge into assessment of new projects, giving input to the planning and design phase of projects, and performance evaluation. The study confirms the benefits of proponent and regulator follow-up, specifically in having independent verifiers that disclose information, facilitate discussion among stakeholders, are adaptable and proactive, aid in the integration of EIA with other programs, and instill trust in EIA enforcement by conformance evaluation. Overall, the study provides insight on how to harness the learning opportunities

  7. Appraising the value of independent EIA follow-up verifiers

    International Nuclear Information System (INIS)

    Wessels, Jan-Albert; Retief, Francois; Morrison-Saunders, Angus

    2015-01-01

    Independent Environmental Impact Assessment (EIA) follow-up verifiers such as monitoring agencies, checkers, supervisors and control officers are active on various construction sites across the world. There are, however, differing views on the value that these verifiers add and very limited learning in EIA has been drawn from independent verifiers. This paper aims to appraise how and to what extent independent EIA follow-up verifiers add value in major construction projects in the developing country context of South Africa. A framework for appraising the role of independent verifiers was established and four South African case studies were examined through a mixture of site visits, project document analysis, and interviews. Appraisal results were documented in the performance areas of: planning, doing, checking, acting, public participating and integration with other programs. The results indicate that independent verifiers add most value to major construction projects when involved with screening EIA requirements of new projects, allocation of financial and human resources, checking legal compliance, influencing implementation, reporting conformance results, community and stakeholder engagement, integration with self-responsibility programs such as environmental management systems (EMS), and controlling records. It was apparent that verifiers could be more creatively utilized in pre-construction preparation, providing feedback of knowledge into assessment of new projects, giving input to the planning and design phase of projects, and performance evaluation. The study confirms the benefits of proponent and regulator follow-up, specifically in having independent verifiers that disclose information, facilitate discussion among stakeholders, are adaptable and proactive, aid in the integration of EIA with other programs, and instill trust in EIA enforcement by conformance evaluation. Overall, the study provides insight on how to harness the learning opportunities

  8. Test of the Flavour Independence of $\\alpha_{s}$ using Next-to-Leading Order Calculations for Heavy Quarks

    CERN Document Server

    Abbiendi, G.; Alexander, G.; Allison, John; Altekamp, N.; Anderson, K.J.; Anderson, S.; Arcelli, S.; Asai, S.; Ashby, S.F.; Axen, D.; Azuelos, G.; Ball, A.H.; Barberio, E.; Barlow, Roger J.; Batley, J.R.; Baumann, S.; Bechtluft, J.; Behnke, T.; Bell, Kenneth Watson; Bella, G.; Bellerive, A.; Bentvelsen, S.; Bethke, S.; Betts, S.; Biebel, O.; Biguzzi, A.; Bloodworth, I.J.; Bock, P.; Bohme, J.; Bonacorsi, D.; Boutemeur, M.; Braibant, S.; Bright-Thomas, P.; Brigliadori, L.; Brown, Robert M.; Burckhart, H.J.; Capiluppi, P.; Carnegie, R.K.; Carter, A.A.; Carter, J.R.; Chang, C.Y.; Charlton, David G.; Chrisman, D.; Ciocca, C.; Clarke, P.E.L.; Clay, E.; Cohen, I.; Conboy, J.E.; Cooke, O.C.; Couchman, J.; Couyoumtzelis, C.; Coxe, R.L.; Cuffiani, M.; Dado, S.; Dallavalle, G.Marco; Davis, R.; De Jong, S.; de Roeck, A.; Dervan, P.; Desch, K.; Dienes, B.; Dixit, M.S.; Dubbert, J.; Duchovni, E.; Duckeck, G.; Duerdoth, I.P.; Estabrooks, P.G.; Etzion, E.; Fabbri, F.; Fanfani, A.; Fanti, M.; Faust, A.A.; Fiedler, F.; Fierro, M.; Fleck, I.; Frey, A.; Furtjes, A.; Futyan, D.I.; Gagnon, P.; Gary, J.W.; Gascon-Shotkin, S.M.; Gaycken, G.; Geich-Gimbel, C.; Giacomelli, G.; Giacomelli, P.; Gibson, V.; Gibson, W.R.; Gingrich, D.M.; Glenzinski, D.; Goldberg, J.; Gorn, W.; Grandi, C.; Graham, K.; Gross, E.; Grunhaus, J.; Gruwe, M.; Hajdu, C.; Hanson, G.G.; Hansroul, M.; Hapke, M.; Harder, K.; Harel, A.; Hargrove, C.K.; Harin-Dirac, M.; Hauschild, M.; Hawkes, C.M.; Hawkings, R.; Hemingway, R.J.; Herndon, M.; Herten, G.; Heuer, R.D.; Hildreth, M.D.; Hill, J.C.; Hobson, P.R.; Hocker, James Andrew; Hoffman, Kara Dion; Homer, R.J.; Honma, A.K.; Horvath, D.; Hossain, K.R.; Howard, R.; Huntemeyer, P.; Igo-Kemenes, P.; Imrie, D.C.; Ishii, K.; Jacob, F.R.; Jawahery, A.; Jeremie, H.; Jimack, M.; Jones, C.R.; Jovanovic, P.; Junk, T.R.; Kanaya, N.; Kanzaki, J.; Karlen, D.; Kartvelishvili, V.; Kawagoe, K.; Kawamoto, T.; Kayal, P.I.; Keeler, R.K.; Kellogg, R.G.; Kennedy, B.W.; Kim, D.H.; Klier, A.; 
Kobayashi, T.; Kobel, M.; Kokott, T.P.; Kolrep, M.; Komamiya, S.; Kowalewski, Robert V.; Kress, T.; Krieger, P.; von Krogh, J.; Kuhl, T.; Kyberd, P.; Lafferty, G.D.; Landsman, H.; Lanske, D.; Lauber, J.; Lawson, I.; Layter, J.G.; Lellouch, D.; Letts, J.; Levinson, L.; Liebisch, R.; List, B.; Littlewood, C.; Lloyd, A.W.; Lloyd, S.L.; Loebinger, F.K.; Long, G.D.; Losty, M.J.; Lu, J.; Ludwig, J.; Lui, D.; Macchiolo, A.; Macpherson, A.; Mader, W.; Mannelli, M.; Marcellini, S.; Martin, A.J.; Martin, J.P.; Martinez, G.; Mashimo, T.; Mattig, Peter; McDonald, W.John; McKenna, J.; Mckigney, E.A.; McMahon, T.J.; McPherson, R.A.; Meijers, F.; Mendez-Lorenzo, P.; Merritt, F.S.; Mes, H.; Michelini, A.; Mihara, S.; Mikenberg, G.; Miller, D.J.; Mohr, W.; Montanari, A.; Mori, T.; Nagai, K.; Nakamura, I.; Neal, H.A.; Nisius, R.; O'Neale, S.W.; Oakham, F.G.; Odorici, F.; Ogren, H.O.; Okpara, A.; Oreglia, M.J.; Orito, S.; Pasztor, G.; Pater, J.R.; Patrick, G.N.; Patt, J.; Perez-Ochoa, R.; Petzold, S.; Pfeifenschneider, P.; Pilcher, J.E.; Pinfold, J.; Plane, David E.; Poffenberger, P.; Poli, B.; Polok, J.; Przybycien, M.; Quadt, A.; Rembser, C.; Rick, H.; Robertson, S.; Robins, S.A.; Rodning, N.; Roney, J.M.; Rosati, S.; Roscoe, K.; Rossi, A.M.; Rozen, Y.; Runge, K.; Runolfsson, O.; Rust, D.R.; Sachs, K.; Saeki, T.; Sahr, O.; Sang, W.M.; Sarkisian, E.K.G.; Sbarra, C.; Schaile, A.D.; Schaile, O.; Scharff-Hansen, P.; Schieck, J.; Schmitt, S.; Schoning, A.; Schroder, Matthias; Schumacher, M.; Schwick, C.; Scott, W.G.; Seuster, R.; Shears, T.G.; Shen, B.C.; Shepherd-Themistocleous, C.H.; Sherwood, P.; Siroli, G.P.; Sittler, A.; Skuja, A.; Smith, A.M.; Snow, G.A.; Sobie, R.; Soldner-Rembold, S.; Spagnolo, S.; Sproston, M.; Stahl, A.; Stephens, K.; Steuerer, J.; Stoll, K.; Strom, David M.; Strohmer, R.; Surrow, B.; Talbot, S.D.; Taras, P.; Tarem, S.; Teuscher, R.; Thiergen, M.; Thomas, J.; Thomson, M.A.; Torrence, E.; Towers, S.; Trigger, I.; Trocsanyi, Z.; Tsur, E.; Turner-Watson, M.F.; 
Ueda, I.; Van Kooten, Rick J.; Vannerem, P.; Verzocchi, M.; Voss, H.; Wackerle, F.; Wagner, A.; Ward, C.P.; Ward, D.R.; Watkins, P.M.; Watson, A.T.; Watson, N.K.; Wells, P.S.; Wermes, N.; Wetterling, D.; White, J.S.; Wilson, G.W.; Wilson, J.A.; Wyatt, T.R.; Yamashita, S.; Zacek, V.; Zer-Zion, D.

    1999-01-01

    We present a test of the flavour independence of the strong coupling constant for charm and bottom quarks with respect to light (uds) quarks, based on a hadronic event sample obtained with the OPAL detector at LEP. Five observables related to global event shapes were used to measure alpha_s in three flavour-tagged samples (uds, c and b). The event shape distributions were fitted by O(alpha_s**2) calculations of jet production taking into account mass effects for the c and b quarks. We find: alpha_s(charm)/alpha_s(uds) = 0.997 +- 0.038(stat.) +- 0.030(syst.) +- 0.012(theory) and alpha_s(b)/alpha_s(uds) = 0.993 +- 0.008(stat.) +- 0.006(syst.) +- 0.011(theory).

  9. Towards the ’Verified Verifier’. Theory and Practice

    Directory of Open Access Journals (Sweden)

    D. A. Kondratyev

    2014-01-01

    As opposed to traditional testing, deductive verification is a formal way to examine program correctness. But what about the correctness of the verification system itself? The theoretical foundations of Hoare's logic were examined in classical works, and soundness/completeness theorems are well known. However, we are aware of practically no implementations of those theoretical methods that have been subjected to anything more than testing. In other words, our ultimate goal is a verification system that can be self-applicable (at least partially). In our recent studies we have turned to the metageneration approach in order to make such a task more feasible.

  10. When is rational to order a diagnostic test, or prescribe treatment: the threshold model as an explanation of practice variation.

    Science.gov (United States)

    Djulbegovic, Benjamin; van den Ende, Jef; Hamm, Robert M; Mayrhofer, Thomas; Hozo, Iztok; Pauker, Stephen G

    2015-05-01

    The threshold model represents an important advance in the field of medical decision-making. It is a linchpin between evidence (which exists on a continuum of credibility) and decision-making (which is a categorical exercise - we decide to act or not act). The threshold concept is closely related to the question of rational decision-making. When should the physician act, that is, order a diagnostic test or prescribe treatment? The threshold model embodies the decision-theoretic rationality that says the most rational decision is to prescribe treatment when the expected treatment benefit outweighs its expected harms. However, the well-documented large variation in the way physicians order diagnostic tests or decide to administer treatments is consistent with the notion that physicians' individual action thresholds vary. We present a narrative review summarizing the existing literature on physicians' use of a threshold strategy for decision-making. We found that the observed variation in decision action thresholds is partially due to the way people integrate benefits and harms. That is, explanation of variation in clinical practice can be reduced to a consideration of thresholds. Limited evidence suggests that non-expected utility threshold (non-EUT) models, such as regret-based and dual-processing models, may explain current medical practice better. However, inclusion of costs and recognition of risk attitudes towards uncertain treatment effects and comorbidities may improve the explanatory and predictive value of the EUT-based threshold models. The decision when to act is closely related to the question of rational choice. We conclude that the medical community has not yet fully defined criteria for rational clinical decision-making. The traditional notion of rationality rooted in EUT may need to be supplemented by reflective rationality, which strives to integrate all aspects of medical practice - medical, humanistic and socio-economic - within a coherent
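
The decision-theoretic threshold the abstract describes can be stated in one line: treat when the probability of disease exceeds the ratio of expected harm to expected benefit plus harm. A minimal sketch of the classic Pauker-Kassirer form (the benefit and harm utilities below are illustrative assumptions, not values from the review):

```python
def treatment_threshold(benefit, harm):
    """Pauker-Kassirer action threshold: treat when P(disease) > harm / (benefit + harm).

    benefit: net utility gained by treating a diseased patient
    harm:    net utility lost by treating a healthy patient
    """
    return harm / (benefit + harm)

# Hypothetical utilities: treatment benefit 9x its harm -> treat above 10% probability.
p_star = treatment_threshold(benefit=9.0, harm=1.0)
```

Under this model, two physicians who weigh the same evidence but assess benefit and harm differently will act at different disease probabilities, which is exactly the practice variation the abstract attributes to varying individual thresholds.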

  11. POSSIBILITIES TO EVALUATE THE QUALITY OF EDUCATION BY VERIFYING THE DISTRIBUTION OF MARKS

    Directory of Open Access Journals (Sweden)

    Alexandru BOROIU

    2015-05-01

    Full Text Available In higher education, the evaluation of the teaching process benefits from numeric indicators computed from the database of final results obtained by students in the exam session. The following numeric indicators can be used: the proportion of students absent from the final evaluation, the proportion of students who did not pass, and the degree of normality of the distribution of passing marks. For this purpose we developed an Excel calculation program that can be applied to each discipline. The inputs are concrete (total number of students, number of students present at the final evaluation, absolute frequencies of the marks) and the outputs for the three indicators are binary (conforming or non-conforming), the verdict in the latter case being: "Give explanations. Propose an action plan, with actions, responsible persons and deadlines". To verify the required degree of normality we developed a calculation program based on the Kolmogorov-Smirnov concordance test. In this way the objectivity of the analysis was increased and the opportunity was created to apply corrective measures in order to improve the education process.
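    The Kolmogorov-Smirnov concordance test used above can be sketched in a few lines of Python. The marks and the decision threshold below are illustrative assumptions, not the authors' Excel program; note also that the classical 1.36/sqrt(n) critical value is only approximate when the mean and standard deviation are estimated from the sample (the Lilliefors correction would be stricter).

```python
import math

def normal_cdf(x, mu, sigma):
    """CDF of the normal distribution, via the error function."""
    return 0.5 * (1.0 + math.erf((x - mu) / (sigma * math.sqrt(2.0))))

def ks_statistic(marks):
    """One-sample Kolmogorov-Smirnov statistic against a normal
    distribution whose mean and std are estimated from the sample."""
    n = len(marks)
    xs = sorted(marks)
    mu = sum(xs) / n
    sigma = math.sqrt(sum((x - mu) ** 2 for x in xs) / (n - 1))
    d = 0.0
    for i, x in enumerate(xs):
        cdf = normal_cdf(x, mu, sigma)
        # compare the fitted CDF with the empirical CDF just below
        # and just at each ordered observation
        d = max(d, abs((i + 1) / n - cdf), abs(i / n - cdf))
    return d

# Passing marks (5..10 scale) from one exam session -- illustrative data only
marks = [5, 6, 6, 7, 7, 7, 8, 8, 8, 8, 9, 9, 9, 10, 10]
d = ks_statistic(marks)
critical = 1.36 / math.sqrt(len(marks))  # approximate 5% critical value
print(f"D = {d:.3f}, critical = {critical:.3f}, normal enough: {d < critical}")
```

    A mark distribution whose D statistic exceeds the critical value would trigger the corrective-action verdict described above.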

  12. Age related neuromuscular changes in sEMG of m. Tibialis Anterior using higher order statistics (Gaussianity & linearity test).

    Science.gov (United States)

    Siddiqi, Ariba; Arjunan, Sridhar P; Kumar, Dinesh K

    2016-08-01

    Age-associated changes in the surface electromyogram (sEMG) of the Tibialis Anterior (TA) muscle can be attributed to neuromuscular alterations that precede strength loss. We used our sEMG model of the Tibialis Anterior to interpret the age-related changes and compared the simulations with the experimental sEMG. Eighteen young (20-30 years) and 18 older (60-85 years) participants performed isometric dorsiflexion at 6 different percentage levels of maximum voluntary contraction (MVC), and their sEMG from the TA muscle was recorded. Six different age-related changes in the neuromuscular system were simulated using the sEMG model at the same MVCs as the experiment. The maximal power of the spectrum, and the Gaussianity and Linearity Test Statistics, were computed from the simulated and experimental sEMG. A correlation analysis at α=0.05 was performed between the simulated and experimental age-related changes in the sEMG features. The results show that the loss of motor units was distinguished by the Gaussianity and Linearity test statistics, while the maximal power of the PSD distinguished between the muscular factors. The simulated condition of 40% loss of motor units combined with a halved number of fast fibers correlated best with the age-related change observed in the experimental sEMG higher order statistical features. The simulated aging condition found by this study corresponds with the moderate motor unit remodelling and negligible strength loss reported in the literature for cohorts aged 60-70 years.
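    The role of higher-order statistics here can be illustrated with a toy simulation (not the authors' model: the pulse shape, firing probability and unit counts below are invented for illustration). A surface EMG is approximated as a superposition of motor-unit spike trains; by the central limit theorem, many active units yield a near-Gaussian signal, while a loss of units makes the signal spikier, which shows up as a larger excess kurtosis (a fourth-order statistic).

```python
import random
import statistics

def excess_kurtosis(x):
    """Fourth standardized moment minus 3 (approximately zero for a Gaussian signal)."""
    mu = statistics.fmean(x)
    sd = statistics.pstdev(x)
    return sum(((v - mu) / sd) ** 4 for v in x) / len(x) - 3.0

def simulated_semg(n_units, n_samples=20000, firing_prob=0.01, rng=None):
    """Toy sEMG: superposition of biphasic spikes from n_units motor units.
    More active units -> closer to Gaussian (central limit theorem)."""
    rng = rng or random.Random(42)
    sig = [0.0] * n_samples
    for _ in range(n_units):
        amp = rng.uniform(0.5, 1.5)
        for t in range(1, n_samples - 1):
            if rng.random() < firing_prob:
                sig[t - 1] += amp      # simple biphasic action-potential shape
                sig[t] -= 2 * amp
                sig[t + 1] += amp
    return sig

many = excess_kurtosis(simulated_semg(200))  # many motor units ("young")
few = excess_kurtosis(simulated_semg(10))    # few motor units ("aged")
print(f"excess kurtosis, 200 units: {many:.2f}   10 units: {few:.2f}")
```

    The few-unit signal departs much further from Gaussianity, which is the qualitative effect the Gaussianity test statistic is designed to pick up.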

  13. NOS CO-OPS Water Level Data, Verified, High Low

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — This dataset has verified (quality-controlled), daily, high low water level (tide) data from NOAA NOS Center for Operational Oceanographic Products and Services...

  14. NOS CO-OPS Water Level Data, Verified, 6-Minute

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — This dataset has verified (quality-controlled), 6-minute, water level (tide) data from NOAA NOS Center for Operational Oceanographic Products and Services (CO-OPS)....

  15. Verifying Correct Usage of Atomic Blocks and Typestate: Technical Companion

    National Research Council Canada - National Science Library

    Beckman, Nels E; Aldrich, Jonathan

    2008-01-01

    In this technical report, we present a static and dynamic semantics as well as a proof of soundness for a programming language presented in the paper entitled, 'Verifying Correct Usage of Atomic Blocks and Typestate...

  16. NOS CO-OPS Water Level Data, Verified, Hourly

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — This dataset has verified (quality-controlled), hourly, water level (tide) data from NOAA NOS Center for Operational Oceanographic Products and Services (CO-OPS)....

  17. Reasoning about knowledge: Children's evaluations of generality and verifiability.

    Science.gov (United States)

    Koenig, Melissa A; Cole, Caitlin A; Meyer, Meredith; Ridge, Katherine E; Kushnir, Tamar; Gelman, Susan A

    2015-12-01

    In a series of experiments, we examined 3- to 8-year-old children's (N=223) and adults' (N=32) use of two properties of testimony to estimate a speaker's knowledge: generality and verifiability. Participants were presented with a "Generic speaker" who made a series of 4 general claims about "pangolins" (a novel animal kind), and a "Specific speaker" who made a series of 4 specific claims about "this pangolin" as an individual. To investigate the role of verifiability, we systematically varied whether the claim referred to a perceptually obvious feature visible in a picture (e.g., "has a pointy nose") or a non-evident feature that was not visible (e.g., "sleeps in a hollow tree"). Three main findings emerged: (1) young children showed a pronounced reliance on verifiability that decreased with age; three-year-old children were especially prone to credit knowledge to speakers who made verifiable claims, whereas 7- to 8-year-olds and adults credited knowledge to generic speakers regardless of whether the claims were verifiable; (2) children's attributions of knowledge to generic speakers were not detectable until age 5, and only when those claims were also verifiable; (3) children often generalized speakers' knowledge outside of the pangolin domain, indicating a belief that a person's knowledge about pangolins likely extends to new facts. The findings indicate that young children may be inclined to doubt speakers who make claims they cannot verify themselves, and reveal a developmentally increasing appreciation for speakers who make general claims. Copyright © 2015 Elsevier Inc. All rights reserved.

  18. Non-genetic health professionals' attitude towards, knowledge of and skills in discussing and ordering genetic testing for hereditary cancer.

    Science.gov (United States)

    Douma, Kirsten F L; Smets, Ellen M A; Allain, Dawn C

    2016-04-01

    Non-genetic health professionals (NGHPs) have insufficient knowledge of cancer genetics, express educational needs and are unprepared to counsel their patients regarding their genetic test results. So far, it is unclear how NGHPs perceive their own communication skills. This study was undertaken to gain insight into their perceptions, attitudes and knowledge. Two publicly accessible databases were used to invite NGHPs providing cancer genetic services to complete a questionnaire. The survey assessed: sociodemographic attributes, experience in ordering hereditary cancer genetic testing, attitude, knowledge, perception of communication skills (e.g. information giving, decision-making) and educational needs. Of all respondents (N = 49, response rate 11%), most have a positive view of their own information-giving (mean = 53.91, range 13-65) and decision-making skills (64-77% depending on topic). NGHPs feel responsible for enabling disease- and treatment-related behavior (89-91%). However, 20-30% reported difficulties managing patients' emotions and did not see management of long-term emotions as their responsibility. Correct answers on knowledge questions ranged between 41 and 96%. Higher knowledge was associated with more confidence in NGHPs' own communication skills (rs = .33, p = 0.03). Although NGHPs have a positive view of their communication skills, they perceive more difficulties managing emotions. The association between less confidence in communication skills and lower knowledge level suggests that awareness of knowledge gaps affects confidence. NGHPs might benefit from education about managing client emotions. Further research using observation of actual counselling consultations is needed to investigate the skills of this specific group of providers.

  19. Diagnostic utility of carotid artery duplex ultrasonography in the evaluation of syncope: a good test ordered for the wrong reason.

    Science.gov (United States)

    Kadian-Dodov, Daniella; Papolos, Alexander; Olin, Jeffrey W

    2015-06-01

    Syncope refers to a transient loss of consciousness and postural tone secondary to cerebral hypoperfusion. Guidelines recommend against neurovascular testing in cases of syncope without neurologic symptoms; however, many pursue carotid artery duplex ultrasonography (CUS) due to the prognostic implications of identified cerebrovascular disease. Our objective was to determine the diagnostic utility of CUS in the evaluation of syncope and the identification of new or severe atherosclerosis with the potential to change patient management. We reviewed records of 569 patients with CUS ordered for the primary indication of syncope through an accredited vascular laboratory at an academic, urban medical centre. Findings on CUS, patient demographic, clinical and laboratory information, and medications within 6 months of the CUS exam were reviewed. Bivariate relationships between key medical history characteristics and atherosclerosis status (known vs. new disease) were examined. Among 495 patients with complete information, cerebrovascular findings could potentially explain syncope in 2% (10 patients). Optimization of cardiovascular risk factors would benefit patients with known (56.6%) and new atherosclerosis (33.5%) with suboptimal lipid control (LDL > 70 in 42.2% and 34.9%, respectively; LDL > 100 in 15.7% and 20.4%), and those not on high-intensity statin therapy (80% and 87.5%) or antiplatelet medications (13.2% and 50.6%). CUS is a low-yield diagnostic test in the evaluation of syncope, but it is useful in the diagnosis of atherosclerosis and identification of subjects who would benefit from optimal medical therapy. Published on behalf of the European Society of Cardiology. All rights reserved. © The Author 2015. For permissions please email: journals.permissions@oup.com.

  20. Exploration of Bacillus thuringiensis Berl. from soil and screening test its toxicity on insects of Lepidoptera order

    Science.gov (United States)

    Astuti, DT; Pujiastuti, Y.; Suparman, SHK; Damiri, N.; Nugraha, S.; Sembiring, ER; Mulawarman

    2018-01-01

    Bacillus thuringiensis is a gram-positive bacterium that produces crystal protein toxins (δ-endotoxins) specific to the target insect but not toxic to humans and non-target organisms. This study aimed to explore B. thuringiensis from soil of Sekayu sub-district, Banyuasin, South Sumatra, and to test its toxicity to lepidopteran larvae. Fifty soil samples were taken from Musi Banyuasin District: 15 from Kayuare strip 2, 20 from Kayuare and 15 from Lumpatan. Isolation, characterization, identification and screening tests were conducted in the laboratory of Pest and Disease, Agricultural Faculty, Sriwijaya University. Isolate codes were assigned based on the area of origin of the samples. The results showed that of the 50 bacterial isolates obtained, 15 had the same morphological and physiological characteristics as B. thuringiensis: round white colonies with wrinkled edges, smooth, raised elevation, aerobic and gram-positive. The isolates positive for B. thuringiensis received the following codes: KJ2D5, KJ2N1, KJ2N4, KJ2B3, KJ3R1, KJ3R2, KJ3R3, KJ3R5, KJ3J3, KJ3J4, KJ3P1, DLM5, DLKK12, and DLKK23. Screening tests on insects of the Lepidoptera order showed that six isolates were toxic to Plutella xylostella and Spodoptera litura, namely isolates DLM5, KJ3R3, KJ3R5, KJ3J4, KJ3P1, and DLKK23.

  1. A Low-order Coupled Chemistry Meteorology Model for Testing Online and Offline Advanced Data Assimilation Schemes

    Science.gov (United States)

    Bocquet, M.; Haussaire, J. M.

    2015-12-01

    Bocquet and Sakov have recently introduced a low-order model based on the coupling of the chaotic Lorenz-95 model, which simulates winds along a mid-latitude circle, with the transport of a tracer species advected by this wind field. It has been used to test advanced data assimilation methods with an online model that couples meteorology and tracer transport. In the present study, the tracer subsystem of the model is replaced with a reduced photochemistry module meant to emulate reactive air pollution. This coupled chemistry meteorology model, the L95-GRS model, mimics continental and transcontinental transport and photochemistry of ozone, volatile organic compounds and nitrogen dioxide. The L95-GRS model is especially useful for testing advanced data assimilation schemes, such as the iterative ensemble Kalman smoother (IEnKS), which combines the best of ensemble and variational methods. The model provides useful insights prior to any implementation of a data assimilation method on larger models. For instance, online and offline data assimilation strategies based on the ensemble Kalman filter or the IEnKS can easily be evaluated with it. It allows one to document the impact of species concentration observations on the wind estimation. The model also illustrates a long-standing issue in atmospheric chemistry forecasting: the impact of the chaotic wind dynamics and of the non-chaotic but highly nonlinear chemical species dynamics on the selected data assimilation approach.
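    The wind component of the L95-GRS model is the standard Lorenz-95/96 system, which is compact enough to sketch directly. The chemistry module is omitted here, and the 40-variable size, forcing F = 8 and step size are the usual textbook choices, assumed rather than taken from this abstract:

```python
import numpy as np

def lorenz96_tendency(x, forcing=8.0):
    """Tendency of the Lorenz-95/96 model:
    dx_i/dt = (x_{i+1} - x_{i-2}) * x_{i-1} - x_i + F, with cyclic indices."""
    return (np.roll(x, -1) - np.roll(x, 2)) * np.roll(x, 1) - x + forcing

def rk4_step(x, dt=0.05, forcing=8.0):
    """One fourth-order Runge-Kutta integration step."""
    k1 = lorenz96_tendency(x, forcing)
    k2 = lorenz96_tendency(x + 0.5 * dt * k1, forcing)
    k3 = lorenz96_tendency(x + 0.5 * dt * k2, forcing)
    k4 = lorenz96_tendency(x + dt * k3, forcing)
    return x + dt * (k1 + 2 * k2 + 2 * k3 + k4) / 6.0

# 40 "wind" variables around a latitude circle, started from a
# slightly perturbed equilibrium so the chaos can develop
n = 40
x = 8.0 * np.ones(n)
x[0] += 0.01
for _ in range(500):  # ~25 model time units
    x = rk4_step(x)
print("state mean after spin-up:", float(x.mean()))
```

    In the full L95-GRS setup, tracer or chemical concentrations would be advected by this wind field, which is what lets concentration observations inform the wind estimate during assimilation.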

  2. 78 FR 78165 - Orders: Reporting by Regulated Entities of Stress Testing Results as of September 30, 2013...

    Science.gov (United States)

    2013-12-26

    ... Instructions and Guidance AGENCY: Federal Housing Finance Agency. ACTION: Orders. SUMMARY: In this document... is amending the Summary Instructions and Guidance, which accompanied the Orders. DATES: Each Order is effective November 26, 2013. FOR FURTHER INFORMATION CONTACT: Naa Awaa Tagoe, Senior Associate Director...

  3. Incentivizing Verifiable Privacy-Protection Mechanisms for Offline Crowdsensing Applications.

    Science.gov (United States)

    Sun, Jiajun; Liu, Ningzhong

    2017-09-04

    Incentive mechanisms for crowdsensing have recently been intensively explored. Most of these mechanisms mainly focus on standard economic goals such as truthfulness and utility maximization. However, enormous privacy and security challenges need to be faced directly in real-life environments, such as cost privacy. In this paper, we investigate offline verifiable privacy-protection crowdsensing issues. We first present a general verifiable privacy-protection incentive mechanism for the offline homogeneous and heterogeneous sensing job model. In addition, we propose a more complex verifiable privacy-protection incentive mechanism for the offline submodular sensing job model. The two mechanisms not only address the privacy-protection issues of users and platform, but also ensure the verifiable correctness of payments between platform and users. Finally, we demonstrate that the two mechanisms satisfy privacy protection, verifiable correctness of payments and the same revenue as the generic mechanism without privacy protection. Our experiments also validate that the two mechanisms are both scalable and efficient, and applicable for mobile devices in crowdsensing applications based on auctions, where the main incentive for the user is the remuneration.

  4. The assessment of stat laboratory test ordering practice and impact of targeted individual feedback in an urban teaching hospital.

    Science.gov (United States)

    Sorita, Atsushi; Steinberg, Daniel I; Leitman, Michael; Burger, Alfred; Husk, Gregg; Sivaprasad, Latha

    2014-01-01

    Overuse of inpatient stat laboratory orders ("stat" is an abbreviation of the Latin word "statim," meaning immediately, without delay) is a major problem in the modern healthcare system. To understand patterns of stat laboratory ordering practices at our institution and to assess the effectiveness of individual feedback in reducing these orders. Medicine and General Surgery residents were given a teaching session about appropriate stat ordering practice in January 2010. Individual feedback was given to providers who were the highest utilizers of stat laboratory orders by their direct supervisors from February through June of 2010. The proportion of stat orders out of total laboratory orders per provider was the main outcome measure. All inpatient laboratory orders from September 2009 to June 2010 were analyzed. The median proportion of stat orders out of total laboratory orders was 41.6% for nontrainee providers (N = 500), 38.7% for Medicine residents (N = 125), 80.2% for General Surgery residents (N = 32), and 24.2% for other trainee providers (N = 150). Among 27 providers who received feedback (7 nontrainees, 16 Medicine residents, and 4 General Surgery residents), the proportion of stat laboratory orders per provider decreased by 15.7% (95% confidence interval: 5.6%-25.9%, P = 0.004) after feedback, whereas the decrease among providers who were high utilizers but did not receive feedback (N = 39) was not significant (4.5%; 95% confidence interval: 2.1%-11.0%, P = 0.18). Monthly trends showed reduction in the proportion of stat orders among Medicine and General Surgery residents, but not among other trainee providers. The frequency of stat ordering was highly variable among providers. Individual feedback to the highest utilizers of stat orders was effective in decreasing these orders. © 2013 Society of Hospital Medicine.

  5. Evolution of optically nondestructive and data-non-intrusive credit card verifiers

    Science.gov (United States)

    Sumriddetchkajorn, Sarun; Intaravanne, Yuttana

    2010-04-01

    Since the deployment of the credit card, the number of credit card fraud cases has grown rapidly, with losses of millions of US dollars. Instead of asking the cardholder for more information or taking a risk by approving the payment, a nondestructive and data-non-intrusive credit card verifier is highly desirable before a transaction begins. In this paper, we review optical techniques that have been proposed and invented in order to make the genuine credit card more distinguishable from the counterfeit credit card. Several optical approaches for the implementation of credit card verifiers are also included. In particular, we highlight our invention of a hyperspectral-imaging-based portable credit card verifier structure that offers a very low false error rate of 0.79%. Other key features include low cost, simplicity in design and implementation, no moving parts, no need for an additional decoding key, and adaptive learning.

  6. Verifiable Distribution of Material Goods Based on Cryptology

    Directory of Open Access Journals (Sweden)

    Radomír Palovský

    2015-12-01

    Full Text Available Counterfeiting of material goods is a general problem. In this paper an architecture for verifiable distribution of material goods is presented. The distribution is based on printing on the goods a QR code that contains a digitally signed serial number of the product, the validity of this digital signature being verifiable by the customer. An extension is also presented that adds digital signatures to the revenue stamps used for state-controlled goods. A discussion of the possibilities of making copies leads to the conclusion that cryptographic security needs to be complemented by technical barriers to copying.
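    The verification flow can be sketched as follows. The paper proposes an asymmetric digital signature so that any customer can check the QR code with a public key; the sketch below substitutes an HMAC (a shared-secret authentication tag from the Python standard library) purely to keep the example self-contained, and the serial number and key are invented:

```python
import hashlib
import hmac

# Hypothetical demo key. The paper's scheme would instead use an asymmetric
# signature (private signing key at the manufacturer, public verification key
# available to customers); HMAC is a symmetric stand-in for illustration only.
SECRET_KEY = b"manufacturer-demo-key"

def make_qr_payload(serial: str) -> str:
    """Serial number plus its authentication tag, as the QR code text."""
    tag = hmac.new(SECRET_KEY, serial.encode(), hashlib.sha256).hexdigest()
    return f"{serial}|{tag}"

def verify_qr_payload(payload: str) -> bool:
    """Recompute the tag for the embedded serial and compare in constant time."""
    serial, _, tag = payload.partition("|")
    expected = hmac.new(SECRET_KEY, serial.encode(), hashlib.sha256).hexdigest()
    return hmac.compare_digest(tag, expected)

payload = make_qr_payload("SN-000123")
print(verify_qr_payload(payload))               # genuine label -> True
print(verify_qr_payload("SN-000123|deadbeef"))  # forged tag -> False
```

    As the abstract's closing discussion notes, a verifiable code does not by itself prevent copying a genuine label onto counterfeit goods; that requires additional technical barriers to copying.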

  7. Raising test scores vs. teaching higher order thinking (HOT): senior science teachers' views on how several concurrent policies affect classroom practices

    Science.gov (United States)

    Zohar, Anat; Alboher Agmon, Vered

    2018-04-01

    This study investigates how senior science teachers viewed the effects of a Raising Test Scores policy and its implementation on the instruction of higher order thinking (HOT), and on teaching thinking to students with low academic achievement.

  8. Exploratory analysis of methods for automated classification of laboratory test orders into syndromic groups in veterinary medicine.

    Directory of Open Access Journals (Sweden)

    Fernanda C Dórea

    Full Text Available BACKGROUND: Recent focus on earlier detection of pathogen introduction in human and animal populations has led to the development of surveillance systems based on automated monitoring of health data. Real- or near real-time monitoring of pre-diagnostic data requires automated classification of records into syndromes--syndromic surveillance--using algorithms that incorporate medical knowledge in a reliable and efficient way, while remaining comprehensible to end users. METHODS: This paper describes the application of two machine learning methods (Naïve Bayes and Decision Trees) and rule-based methods to extract syndromic information from laboratory test requests submitted to a veterinary diagnostic laboratory. RESULTS: High performance (F1-macro = 0.9995) was achieved through the use of a rule-based syndrome classifier, based on rule induction followed by manual modification during the construction phase, which also resulted in clear interpretability of the resulting classification process. An unmodified rule induction algorithm achieved an F1-micro score of 0.979, though this fell to 0.677 when performance for individual classes was averaged in an unweighted manner (F1-macro), due to the fact that the algorithm failed to learn 3 of the 16 classes from the training set. Decision Trees showed equal interpretability to the rule-based approaches, but achieved an F1-micro score of 0.923 (falling to 0.311 when classes are given equal weight). A Naïve Bayes classifier learned all classes and achieved high performance (F1-micro = 0.994 and F1-macro = 0.955); however, the classification process is not transparent to the domain experts. CONCLUSION: The use of a manually customised rule set allowed for the development of a system for classification of laboratory tests into syndromic groups with very high performance, and high interpretability by the domain experts. Further research is required to develop internal validation rules in order to establish
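    A rule-based syndrome classifier of the kind that performed best here can be sketched as a small keyword-rule table. The rules, syndrome names and order texts below are invented for illustration; the paper's actual rule set was induced from real laboratory submissions and then manually tuned:

```python
# Hypothetical keyword rules, checked in order; first match wins.
RULES = [
    ("respiratory",      ("cough", "pneumonia", "nasal swab", "lung")),
    ("gastrointestinal", ("diarrhea", "scours", "fecal", "salmonella")),
    ("neurological",     ("ataxia", "seizure", "rabies", "brain")),
]

def classify_order(text: str) -> str:
    """Assign a laboratory test order to the first matching syndromic group."""
    lowered = text.lower()
    for syndrome, keywords in RULES:
        if any(keyword in lowered for keyword in keywords):
            return syndrome
    return "unclassified"

orders = [
    "Nasal swab submitted, chronic cough in adult dairy cow",
    "Fecal sample, calf with severe scours",
    "Serum chemistry panel, routine",
]
for order in orders:
    print(classify_order(order), "<-", order)
```

    The appeal of this approach, as the abstract notes, is that domain experts can read and hand-edit every rule, which is exactly what the manual-modification step exploited.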

  9. EFFECTS ON THE PERFORMANCE DURING A MATCHING-TO-SAMPLE TASK DUE TO THE TYPE AND ORDER OF EXPOSITION TO THE TRANSFERENCE TESTS

    Directory of Open Access Journals (Sweden)

    CAMILO HURTADO-PARRADO

    2007-08-01

    Full Text Available This study evaluated the effects of manipulating the type and order of presentation of transference tests. Twenty-eight undergraduate students divided into 4 groups were exposed to a second-order matching-to-sample procedure. The conditions of exposition were: ascending difficulty/complexity order of the tests, descending order, and two randomly assigned orders. Results are discussed in terms of percentages of effectiveness; additionally, latency is proposed as an alternative measure sensitive to the level of difficulty of this kind of task. Findings showed heterogeneity in the velocity of acquisition of the conditional discriminations during the training phase, even though the conditions of the task were equal for all the subjects. Exposition to the ascending and descending orders seemed to affect effective behavioral adjustment negatively, whereas one of the randomly assigned sequences seemed to be the best condition. The order of exposition to transference tests, in interaction with a history of early acquisition in the training phase, served to understand the findings of this study and to discuss the necessity of a systematic evaluation of the factors implied in transference tests. It is suggested to assess the validity of different kinds of transference tests and the convenience of using some of them in the investigation of the phenomena related to effective and variable behavior.

  10. Verifying different-modality properties for concepts produces switching costs.

    Science.gov (United States)

    Pecher, Diane; Zeelenberg, René; Barsalou, Lawrence W

    2003-03-01

    According to perceptual symbol systems, sensorimotor simulations underlie the representation of concepts. It follows that sensorimotor phenomena should arise in conceptual processing. Previous studies have shown that switching from one modality to another during perceptual processing incurs a processing cost. If perceptual simulation underlies conceptual processing, then verifying the properties of concepts should exhibit a switching cost as well. For example, verifying a property in the auditory modality (e.g., BLENDER-loud) should be slower after verifying a property in a different modality (e.g., CRANBERRIES-tart) than after verifying a property in the same modality (e.g., LEAVES-rustling). Only words were presented to subjects, and there were no instructions to use imagery. Nevertheless, switching modalities incurred a cost, analogous to the cost of switching modalities in perception. A second experiment showed that this effect was not due to associative priming between properties in the same modality. These results support the hypothesis that perceptual simulation underlies conceptual processing.

  11. An experiment designed to verify the general theory of relativity

    International Nuclear Information System (INIS)

    Surdin, Maurice

    1960-01-01

    Presents the project for an experiment that uses the effect of gravitation on maser-type clocks placed on the ground at two different heights, designed to verify the general theory of relativity. Reprint of a paper published in Comptes rendus des seances de l'Academie des Sciences, t. 250, p. 299-301, sitting of 11 January 1960 [fr

  12. Building Program Verifiers from Compilers and Theorem Provers

    Science.gov (United States)

    2015-05-14

    Checking with SMT: UFO • LLVM-based front-end (partially reused in SeaHorn) • Combines Abstract Interpretation with Interpolation-Based Model Checking • Counter-examples are long; hard to determine (from main) what is relevant (slide excerpts; Gurfinkel, 2015)

  13. Verifying a smart design of TCAP : a synergetic experience

    NARCIS (Netherlands)

    T. Arts; I.A. van Langevelde

    1999-01-01

    An optimisation of the SS No. 7 Transport Capabilities Procedures is verified by specifying both the original and the optimised TCAP in μCRL, generating transition systems for both using the μCRL tool set, and checking weak bisimulation

  14. A Trustworthy Internet Auction Model with Verifiable Fairness.

    Science.gov (United States)

    Liao, Gen-Yih; Hwang, Jing-Jang

    2001-01-01

    Describes an Internet auction model achieving verifiable fairness, a requirement aimed at enhancing the trust of bidders in auctioneers. Analysis results demonstrate that the proposed model satisfies various requirements regarding fairness and privacy. Moreover, in the proposed model, the losing bids remain sealed. (Author/AEF)

  15. The Guided System Development Framework: Modeling and Verifying Communication Systems

    DEFF Research Database (Denmark)

    Carvalho Quaresma, Jose Nuno; Probst, Christian W.; Nielson, Flemming

    2014-01-01

    the verified specification. The refinement process thus carries security properties from the model to the implementation. Our approach also supports verification of systems previously developed and deployed. Internally, the reasoning in our framework is based on the Beliefs and Knowledge tool, a verification tool based on belief logics and explicit attacker knowledge.

  16. Making Digital Artifacts on the Web Verifiable and Reliable

    NARCIS (Netherlands)

    Kuhn, T.; Dumontier, M.

    2015-01-01

    The current Web has no general mechanisms to make digital artifacts - such as datasets, code, texts, and images - verifiable and permanent. For digital artifacts that are supposed to be immutable, there is moreover no commonly accepted method to enforce this immutability. These shortcomings have a

  17. VISION User Guide - VISION (Verifiable Fuel Cycle Simulation) Model

    International Nuclear Information System (INIS)

    Jacobson, Jacob J.; Jeffers, Robert F.; Matthern, Gretchen E.; Piet, Steven J.; Baker, Benjamin A.; Grimm, Joseph

    2009-01-01

    The purpose of this document is to provide a guide for using the current version of the Verifiable Fuel Cycle Simulation (VISION) model. This is a complex model with many parameters; the user is strongly encouraged to read this user guide before attempting to run the model. This model is an R and D work in progress and may contain errors and omissions. It is based upon numerous assumptions. This model is intended to assist in evaluating 'what if' scenarios and in comparing fuel, reactor, and fuel processing alternatives at a systems level for U.S. nuclear power. The model is not intended as a tool for process flow and design modeling of specific facilities nor for tracking individual units of fuel or other material through the system. The model is intended to examine the interactions among the components of a fuel system as a function of time-varying system parameters; this model represents a dynamic rather than steady-state approximation of the nuclear fuel system. VISION models the nuclear cycle at the system level, not individual facilities, e.g., 'reactor types' not individual reactors and 'separation types' not individual separation plants. Natural uranium can be enriched, which produces enriched uranium, which goes into fuel fabrication, and depleted uranium (DU), which goes into storage. Fuel is transformed (transmuted) in reactors and then goes into a storage buffer. Used fuel can be pulled from storage and sent either to separations or to disposal. If sent to separations, fuel is transformed (partitioned) into fuel products, recovered uranium, and various categories of waste. Recycled material is stored until used by its assigned reactor type. Note that recovered uranium is itself often partitioned: some RU flows with recycled transuranic elements, some flows with wastes, and the rest is designated RU. RU comes out of storage if needed to correct the U/TRU ratio in new recycled fuel. Neither RU nor DU are designated as wastes.
VISION is comprised of several Microsoft

  18. Tests of QED [Quantum Electrodynamics] to fourth order in alpha in electron-positron collisions at 29 GeV

    International Nuclear Information System (INIS)

    Hawkins, C.A.

    1989-02-01

    Tests of Quantum Electrodynamics to order α⁴ in e⁺e⁻ collisions using the ASP detector at PEP (√s = 29 GeV) are presented. Measurements are made of e⁺e⁻ → γγγγ, e⁺e⁻ → e⁺e⁻γγ and e⁺e⁻ → e⁺e⁻e⁺e⁻, where all four final-state particles are separated from the beam line and each other. These are the most precise and highest-statistics measurements yet reported for these processes. The ratios of measured to predicted cross sections are γγγγ: 0.97 ± 0.04 ± 0.14; e⁺e⁻γγ: 0.94 ± 0.03 ± 0.03; e⁺e⁻e⁺e⁻: 1.01 ± 0.02 ± 0.04, where the first uncertainty is the systematic uncertainty, and the second is the statistical uncertainty. All measurements show good agreement with theoretical predictions. A Monte Carlo method for simulating multi-pole processes is also presented, along with applications to the e⁺e⁻ → e⁺e⁻γγ and e⁺e⁻ → γγγγ processes. The first measurements of five-body α⁵ events (5γ, e⁺e⁻γγγ and e⁺e⁻e⁺e⁻γ) and one candidate six-body α⁶ event (e⁺e⁻4γ) are reported. Both the α⁵ and α⁶ measurements agree with estimates of their cross sections. 20 refs., 34 figs., 14 tabs

  19. A Benchmark for Comparing Different Approaches for Specifying and Verifying Real-Time Systems

    Science.gov (United States)

    1993-01-01

To be considered correct or useful, real-time systems must deliver results within specified time intervals, either without exception or with high... probability. Recently, a large number of formal methods have been invented for specifying and verifying real-time systems. It has been suggested that... these formal methods need to be tested out on actual real-time systems. Such testing will allow the scalability of the methods to be assessed and also

  20. Software Platform Evaluation - Verifiable Fuel Cycle Simulation (VISION) Model

    International Nuclear Information System (INIS)

    J. J. Jacobson; D. E. Shropshire; W. B. West

    2005-01-01

The purpose of this Software Platform Evaluation (SPE) is to document the top-level evaluation of potential software platforms on which to construct a simulation model that satisfies the requirements for a Verifiable Fuel Cycle Simulation Model (VISION) of the Advanced Fuel Cycle (AFC). See the Software Requirements Specification for Verifiable Fuel Cycle Simulation (VISION) Model (INEEL/EXT-05-02643, Rev. 0) for a discussion of the objective and scope of the VISION model. VISION is intended to serve as a broad systems analysis and study tool applicable to work conducted as part of the AFCI (including cost estimates) and Generation IV reactor development studies. This document will serve as a guide for selecting the most appropriate software platform for VISION. This is a "living document" that will be modified over the course of the execution of this work

  1. Building and Verifying a Predictive Model of Interruption Resumption

    Science.gov (United States)

    2012-03-01

the gardener to remember those plants (and whether they need to be removed), and so will not commit resources to remember that information. The overall... camera), the storyteller needed help much less often. This result suggests that when there is no one to help them remember the last thing they said... INVITED PAPER Building and Verifying a Predictive Model of Interruption Resumption Help from a robot, to allow a human storyteller to continue

  2. Verifying a nuclear weapon's response to radiation environments

    Energy Technology Data Exchange (ETDEWEB)

    Dean, F.F.; Barrett, W.H.

    1998-05-01

    The process described in the paper is being applied as part of the design verification of a replacement component designed for a nuclear weapon currently in the active stockpile. This process is an adaptation of the process successfully used in nuclear weapon development programs. The verification process concentrates on evaluating system response to radiation environments, verifying system performance during and after exposure to radiation environments, and assessing system survivability.

  3. TrustGuard: A Containment Architecture with Verified Output

    Science.gov (United States)

    2017-01-01

that the TrustGuard system has minimal performance decline, despite restrictions such as high communication latency and limited available bandwidth... design are the availability of high bandwidth and low delays between the host and the monitoring chip. 3-D integration provides an alternate way of... TRUSTGUARD: A CONTAINMENT ARCHITECTURE WITH VERIFIED OUTPUT SOUMYADEEP GHOSH A DISSERTATION PRESENTED TO THE FACULTY OF PRINCETON UNIVERSITY IN

  4. Verifying Temporal Properties of Reactive Systems by Transformation

    OpenAIRE

    Hamilton, Geoff

    2015-01-01

    We show how program transformation techniques can be used for the verification of both safety and liveness properties of reactive systems. In particular, we show how the program transformation technique distillation can be used to transform reactive systems specified in a functional language into a simplified form that can subsequently be analysed to verify temporal properties of the systems. Example systems which are intended to model mutual exclusion are analysed using these techniques with...

  5. Robustness and device independence of verifiable blind quantum computing

    International Nuclear Information System (INIS)

    Gheorghiu, Alexandru; Kashefi, Elham; Wallden, Petros

    2015-01-01

    Recent advances in theoretical and experimental quantum computing bring us closer to scalable quantum computing devices. This makes the need for protocols that verify the correct functionality of quantum operations timely and has led to the field of quantum verification. In this paper we address key challenges to make quantum verification protocols applicable to experimental implementations. We prove the robustness of the single server verifiable universal blind quantum computing protocol of Fitzsimons and Kashefi (2012 arXiv:1203.5217) in the most general scenario. This includes the case where the purification of the deviated input state is in the hands of an adversarial server. The proved robustness property allows the composition of this protocol with a device-independent state tomography protocol that we give, which is based on the rigidity of CHSH games as proposed by Reichardt et al (2013 Nature 496 456–60). The resulting composite protocol has lower round complexity for the verification of entangled quantum servers with a classical verifier and, as we show, can be made fault tolerant. (paper)

  6. Rigidity of quantum steering and one-sided device-independent verifiable quantum computation

    International Nuclear Information System (INIS)

    Gheorghiu, Alexandru; Wallden, Petros; Kashefi, Elham

    2017-01-01

    The relationship between correlations and entanglement has played a major role in understanding quantum theory since the work of Einstein et al (1935 Phys. Rev. 47 777–80). Tsirelson proved that Bell states, shared among two parties, when measured suitably, achieve the maximum non-local correlations allowed by quantum mechanics (Cirel’son 1980 Lett. Math. Phys. 4 93–100). Conversely, Reichardt et al showed that observing the maximal correlation value over a sequence of repeated measurements, implies that the underlying quantum state is close to a tensor product of maximally entangled states and, moreover, that it is measured according to an ideal strategy (Reichardt et al 2013 Nature 496 456–60). However, this strong rigidity result comes at a high price, requiring a large number of entangled pairs to be tested. In this paper, we present a significant improvement in terms of the overhead by instead considering quantum steering where the device of the one side is trusted. We first demonstrate a robust one-sided device-independent version of self-testing, which characterises the shared state and measurement operators of two parties up to a certain bound. We show that this bound is optimal up to constant factors and we generalise the results for the most general attacks. This leads us to a rigidity theorem for maximal steering correlations. As a key application we give a one-sided device-independent protocol for verifiable delegated quantum computation, and compare it to other existing protocols, to highlight the cost of trust assumptions. Finally, we show that under reasonable assumptions, the states shared in order to run a certain type of verification protocol must be unitarily equivalent to perfect Bell states. (paper)

  7. Verifiable Outsourced Decryption of Attribute-Based Encryption with Constant Ciphertext Length

    Directory of Open Access Journals (Sweden)

    Jiguo Li

    2017-01-01

Full Text Available An outsourced-decryption ABE system largely reduces the computation cost for users who intend to access encrypted files stored in the cloud. However, the correctness of the transformation ciphertext cannot be guaranteed because the user does not have the original ciphertext. Lai et al. provided an ABE scheme with verifiable outsourced decryption which helps the user to check whether the transformation done by the cloud is correct. In order to improve the computation performance and reduce communication overhead, we propose a new verifiable outsourcing scheme with constant ciphertext length. To be specific, our scheme achieves the following goals. (1) Our scheme is verifiable, which ensures that the user efficiently checks whether the transformation is done correctly by the CSP. (2) The size of the ciphertext and the number of expensive pairing operations are constant, and do not grow with the complexity of the access structure. (3) The access structure in our scheme is AND gates on multivalued attributes, and we prove our scheme is verifiable and secure against selectively chosen-plaintext attack in the standard model. (4) We give some performance analysis which indicates that our scheme is adaptable for various limited-bandwidth and computation-constrained devices, such as mobile phones.

  8. Anesthesiologists’ and surgeons’ perceptions about routine pre-operative testing in low-risk patients: application of the Theoretical Domains Framework (TDF to identify factors that influence physicians’ decisions to order pre-operative tests

    Directory of Open Access Journals (Sweden)

    Patey Andrea M

    2012-06-01

Full Text Available Abstract Background Routine pre-operative tests for anesthesia management are often ordered by both anesthesiologists and surgeons for healthy patients undergoing low-risk surgery. The Theoretical Domains Framework (TDF) was developed to investigate determinants of behaviour and identify potential behaviour change interventions. In this study, the TDF is used to explore anaesthesiologists’ and surgeons’ perceptions of ordering routine tests for healthy patients undergoing low-risk surgery. Methods Sixteen clinicians (eleven anesthesiologists and five surgeons) throughout Ontario were recruited. An interview guide based on the TDF was developed to identify beliefs about pre-operative testing practices. Content analysis of physicians’ statements into the relevant theoretical domains was performed. Specific beliefs were identified by grouping similar utterances of the interview participants. Relevant domains were identified by noting the frequencies of the beliefs reported, presence of conflicting beliefs, and perceived influence on the performance of the behaviour under investigation. Results Seven of the twelve domains were identified as likely relevant to changing clinicians’ behaviour about pre-operative test ordering for anesthesia management. Key beliefs were identified within these domains including: conflicting comments about who was responsible for the test-ordering (Social/professional role and identity); inability to cancel tests ordered by fellow physicians (Beliefs about capabilities and social influences); and the problem with tests being completed before the anesthesiologists see the patient (Beliefs about capabilities and Environmental context and resources). Often, tests were ordered by an anesthesiologist based on who may be the attending anesthesiologist on the day of surgery while surgeons ordered tests they thought anesthesiologists may need (Social influences). There were also conflicting comments about the potential

  9. Anesthesiologists' and surgeons' perceptions about routine pre-operative testing in low-risk patients: application of the Theoretical Domains Framework (TDF) to identify factors that influence physicians' decisions to order pre-operative tests.

    Science.gov (United States)

    Patey, Andrea M; Islam, Rafat; Francis, Jill J; Bryson, Gregory L; Grimshaw, Jeremy M

    2012-06-09

    Routine pre-operative tests for anesthesia management are often ordered by both anesthesiologists and surgeons for healthy patients undergoing low-risk surgery. The Theoretical Domains Framework (TDF) was developed to investigate determinants of behaviour and identify potential behaviour change interventions. In this study, the TDF is used to explore anaesthesiologists' and surgeons' perceptions of ordering routine tests for healthy patients undergoing low-risk surgery. Sixteen clinicians (eleven anesthesiologists and five surgeons) throughout Ontario were recruited. An interview guide based on the TDF was developed to identify beliefs about pre-operative testing practices. Content analysis of physicians' statements into the relevant theoretical domains was performed. Specific beliefs were identified by grouping similar utterances of the interview participants. Relevant domains were identified by noting the frequencies of the beliefs reported, presence of conflicting beliefs, and perceived influence on the performance of the behaviour under investigation. Seven of the twelve domains were identified as likely relevant to changing clinicians' behaviour about pre-operative test ordering for anesthesia management. Key beliefs were identified within these domains including: conflicting comments about who was responsible for the test-ordering (Social/professional role and identity); inability to cancel tests ordered by fellow physicians (Beliefs about capabilities and social influences); and the problem with tests being completed before the anesthesiologists see the patient (Beliefs about capabilities and Environmental context and resources). Often, tests were ordered by an anesthesiologist based on who may be the attending anesthesiologist on the day of surgery while surgeons ordered tests they thought anesthesiologists may need (Social influences). There were also conflicting comments about the potential consequences associated with reducing testing, from negative

  10. Test the Overall Significance of p-values by Using Joint Tail Probability of Ordered p-values as Test Statistic

    NARCIS (Netherlands)

    Fang, Yongxiang; Wit, Ernst

    2008-01-01

Fisher’s combined probability test is the most commonly used method to test the overall significance of a set of independent p-values. However, Fisher’s statistic is clearly more sensitive to smaller p-values than to larger ones, and a single small p-value may overrule the other p-values
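Fisher's method itself is simple to state: the statistic is X = −2 Σ ln pᵢ, which under the overall null hypothesis follows a chi-square distribution with 2k degrees of freedom (k = number of p-values). The sketch below is illustrative only: it implements the textbook Fisher combination, not the ordered-p-value joint-tail statistic this record proposes, and it demonstrates the sensitivity to a single small p-value that the abstract criticizes. For even degrees of freedom the chi-square tail probability has a closed form, so no stats library is needed.

```python
import math

def fisher_combined(pvalues):
    """Fisher's combined probability test: X = -2 * sum(ln p_i) ~ chi2(2k).

    For 2k degrees of freedom the chi-square survival function has the
    closed form  P(X >= x) = exp(-x/2) * sum_{i=0}^{k-1} (x/2)^i / i!,
    so the combined p-value can be computed with the standard library only.
    """
    k = len(pvalues)
    x = -2.0 * sum(math.log(p) for p in pvalues)
    half = x / 2.0
    tail = math.exp(-half) * sum(half**i / math.factorial(i) for i in range(k))
    return x, tail

# One very small p-value dominates the combination, even though the
# other two p-values are close to 1:
stat, combined = fisher_combined([0.001, 0.8, 0.9])
print(round(combined, 4))  # 0.0248 -- significant at the 5% level
```

This is exactly the behaviour the record objects to: the overall test rejects on the strength of one component p-value, regardless of how unsupportive the others are.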

  11. Low-order model of the Loss-of-Fluid Test (LOFT) reactor plant for use in Kalman filter-based optimal estimators

    International Nuclear Information System (INIS)

    Tylee, J.L.

    1980-01-01

    A low-order, nonlinear model of the Loss-of-Fluid Test (LOFT) reactor plant, for use in Kalman filter estimators, is developed, described, and evaluated. This model consists of 31 differential equations and represents all major subsystems of both the primary and secondary sides of the LOFT plant. Comparisons between model calculations and available LOFT power range testing transients demonstrate the accuracy of the low-order model. The nonlinear model is numerically linearized for future implementation in Kalman filter and optimal control algorithms. The linearized model is shown to be an adequate representation of the nonlinear plant dynamics
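The numerical linearization step mentioned in this record can be sketched generically: given a nonlinear state equation dx/dt = f(x, u), the Jacobians A = ∂f/∂x and B = ∂f/∂u at an operating point are estimated by central finite differences, yielding the linear model a Kalman filter needs. The two-state plant below is a made-up illustration, not the 31-equation LOFT model.

```python
import numpy as np

def linearize(f, x0, u0, eps=1e-6):
    """Numerically linearize dx/dt = f(x, u) about the operating point (x0, u0).

    Returns A = df/dx and B = df/du estimated with central differences, so that
    near the operating point dx/dt ~= f(x0, u0) + A (x - x0) + B (u - u0).
    """
    n, m = len(x0), len(u0)
    A = np.zeros((n, n))
    B = np.zeros((n, m))
    for j in range(n):
        dx = np.zeros(n); dx[j] = eps
        A[:, j] = (f(x0 + dx, u0) - f(x0 - dx, u0)) / (2 * eps)
    for j in range(m):
        du = np.zeros(m); du[j] = eps
        B[:, j] = (f(x0, u0 + du) - f(x0, u0 - du)) / (2 * eps)
    return A, B

# Toy two-state nonlinear plant (purely illustrative):
def f(x, u):
    return np.array([-x[0] * x[1] + u[0], x[0] - x[1] ** 2])

A, B = linearize(f, np.array([1.0, 2.0]), np.array([0.5]))
print(np.round(A, 3))  # analytic Jacobian at (1, 2): [[-2, -1], [1, -4]]
```

The same recipe scales to any state dimension, which is why a 31-equation model can be linearized mechanically once the nonlinear right-hand side is coded.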

  12. Implementation of a Computerized Order Entry Tool to Reduce the Inappropriate and Unnecessary Use of Cardiac Stress Tests With Imaging in Hospitalized Patients.

    Science.gov (United States)

    Gertz, Zachary M; O'Donnell, William; Raina, Amresh; Balderston, Jessica R; Litwack, Andrew J; Goldberg, Lee R

    2016-10-15

    The rising use of imaging cardiac stress tests has led to potentially unnecessary testing. Interventions designed to reduce inappropriate stress testing have focused on the ambulatory setting. We developed a computerized order entry tool intended to reduce the use of imaging cardiac stress tests and improve appropriate use in hospitalized patients. The tool was evaluated using preimplementation and postimplementation cohorts at a single urban academic teaching hospital. All hospitalized patients referred for testing were included. The co-primary outcomes were the use of imaging stress tests as a percentage of all stress tests and the percentage of inappropriate tests, compared between the 2 cohorts. There were 478 patients in the precohort and 463 in the postcohort. The indication was chest pain in 66% and preoperative in 18% and was not significantly different between groups. The use of nonimaging stress tests increased from 4% in the pregroup to 15% in the postgroup (p nonimaging stress tests increased from 7% to 25% (p nonimaging cardiac stress tests and reduced the use of imaging tests yet was not able to reduce inappropriate use. Our study highlights the differences in cardiac stress testing between hospitalized and ambulatory patients. Copyright © 2016 Elsevier Inc. All rights reserved.

  13. Verifying the integrity of hardcopy document using OCR

    CSIR Research Space (South Africa)

    Mthethwa, Sthembile

    2018-03-01

Full Text Available Verifying the Integrity...) of the document to be defined. Each text in the meta-template is labelled with a unique identifier, which makes it easier for the process of validation. The meta-template consists of two types of text: normal text and validation text (important text that must...

  14. Verifying Architectural Design Rules of the Flight Software Product Line

    Science.gov (United States)

    Ganesan, Dharmalingam; Lindvall, Mikael; Ackermann, Chris; McComas, David; Bartholomew, Maureen

    2009-01-01

This paper presents experiences of verifying architectural design rules of the NASA Core Flight Software (CFS) product line implementation. The goal of the verification is to check whether the implementation is consistent with the CFS architectural rules derived from the developer's guide. The results indicate that consistency checking helps a) identify architecturally significant deviations that eluded code reviews, b) clarify the design rules to the team, and c) assess the overall implementation quality. Furthermore, it helps connect business goals to architectural principles, and to the implementation. This paper is the first step in the definition of a method for analyzing and evaluating product line implementations from an architecture-centric perspective.

  15. From Operating-System Correctness to Pervasively Verified Applications

    Science.gov (United States)

    Daum, Matthias; Schirmer, Norbert W.; Schmidt, Mareike

    Though program verification is known and has been used for decades, the verification of a complete computer system still remains a grand challenge. Part of this challenge is the interaction of application programs with the operating system, which is usually entrusted with retrieving input data from and transferring output data to peripheral devices. In this scenario, the correct operation of the applications inherently relies on operating-system correctness. Based on the formal correctness of our real-time operating system Olos, this paper describes an approach to pervasively verify applications running on top of the operating system.

  16. Verifying Galileo's discoveries: telescope-making at the Collegio Romano

    Science.gov (United States)

    Reeves, Eileen; van Helden, Albert

The Jesuits of the Collegio Romano in Rome, especially the mathematicians Clavius and Grienberger, were very interested in Galilei's discoveries. After they had failed to observe the celestial phenomena with telescopes of their own construction, they expressed serious doubts. But from November 1610 onward, after they had built a better telescope and had obtained another one from Venice, and could verify Galilei's observations, they completely accepted them. Clavius, who stuck to the Ptolemaic system until his death in 1612, even pointed out these facts in his last edition of Sacrobosco's Sphaera. He as well as his confreres, however, avoided any conclusions with respect to the planetary system.

  17. ASTUS system for verifying the transport seal TITUS 1

    International Nuclear Information System (INIS)

    Barillaux; Monteil, D.; Destain, G.D.

    1991-01-01

ASTUS, a system for acquiring and processing ultrasonic signatures of TITUS 1 seals, has been developed. TITUS seals are used to verify that the sealing of a fissile material container has remained intact after transport. An autonomous portable reading case permits taking seal signatures at the starting point and transmitting these reference signatures to a central safeguards computer by phonic modem. At the terminal point, an authority uses a similar reading case to take the seal signatures again and immediately transmits them to the central safeguards computer. The central computer processes the data in real time by autocorrelation and returns its verdict to the terminal point

  18. Verifying real-time systems against scenario-based requirements

    DEFF Research Database (Denmark)

    Larsen, Kim Guldstrand; Li, Shuhao; Nielsen, Brian

    2009-01-01

We propose an approach to automatic verification of real-time systems against scenario-based requirements. A real-time system is modeled as a network of Timed Automata (TA), and a scenario-based requirement is specified as a Live Sequence Chart (LSC). We define a trace-based semantics for a kernel subset of the LSC language. By equivalently translating an LSC chart into an observer TA and then non-intrusively composing this observer with the original system model, the problem of verifying a real-time system against a scenario-based requirement reduces to a classical real-time model checking...

  19. Organization of Proficiency Testing for Dairy Laboratories in Croatia, Bosnia and Herzegovina and Macedonia in Order to Improve Quality Assurance

    Directory of Open Access Journals (Sweden)

    Nataša Mikulec

    2009-06-01

Full Text Available Participation in proficiency testing is not only an obligation for all analytical laboratories that wish to be credible, but also an opportunity to check how their results agree with the reference or assigned value. The Reference Laboratory for Milk and Dairy Products of the Dairy Science Department, Faculty of Agriculture, University of Zagreb, is itself incorporated in proficiency testing organized by dairy laboratories from Germany, Italy, France, Switzerland and Slovenia. The aim is to determine its own accuracy and reliability in particular milk and dairy product analyses. On the basis of seven years' experience of participating in proficiency testing, five years ago the Reference Laboratory started organizing its own proficiency testing for dairy laboratories in Croatia, Bosnia and Herzegovina and Macedonia for milk components such as milk fat, protein, lactose and somatic cell count. The results of the analyses have been statistically analyzed and, on the basis of the Z-score, the successful measurements have been estimated. The aim of this paper is to demonstrate the organisation and data processing of proficiency testing for milk fat, protein, lactose and somatic cell count in milk for the involved dairy laboratories.
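The Z-score evaluation this record mentions is conventionally z = (x − x_assigned)/σ_pt, with |z| ≤ 2 rated satisfactory, 2 < |z| < 3 questionable, and |z| ≥ 3 unsatisfactory (the usual ISO 13528 convention). The sketch below uses these conventional cut-offs with hypothetical numbers; neither the values nor the thresholds are taken from the paper.

```python
def z_score(result, assigned_value, sigma_pt):
    """Proficiency-testing z-score: (participant result - assigned value) / sigma_pt."""
    return (result - assigned_value) / sigma_pt

def rating(z):
    """Conventional interpretation (e.g. ISO 13528): |z| <= 2 satisfactory,
    2 < |z| < 3 questionable, |z| >= 3 unsatisfactory."""
    z = abs(z)
    if z <= 2:
        return "satisfactory"
    if z < 3:
        return "questionable"
    return "unsatisfactory"

# Hypothetical milk-fat result: lab reports 4.02 g/100 g against an
# assigned value of 3.90 g/100 g with sigma_pt = 0.05 g/100 g.
z = z_score(4.02, 3.90, 0.05)
print(round(z, 2), rating(z))  # 2.4 questionable
```

Because σ_pt fixes what counts as an acceptable deviation, choosing it (from participant data or a fitness-for-purpose target) is the substantive decision in organizing such a scheme.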

  20. Hypothesis Tests for Bernoulli Experiments: Ordering the Sample Space by Bayes Factors and Using Adaptive Significance Levels for Decisions

    Directory of Open Access Journals (Sweden)

    Carlos A. de B. Pereira

    2017-12-01

Full Text Available The main objective of this paper is to find the relation between the adaptive significance level presented here and the sample size. We statisticians know of the inconsistency, or paradox, in the current classical tests of significance that are based on p-value statistics that are compared to the canonical significance levels (10%, 5%, and 1%): “Raise the sample to reject the null hypothesis” is the recommendation of some ill-advised scientists! This paper will show that it is possible to eliminate this problem of significance tests. We present here the beginning of a larger research project. The intention is to extend its use to more complex applications such as survival analysis, reliability tests, and other areas. The main tools used here are the Bayes factor and the extended Neyman–Pearson Lemma.
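For a Bernoulli experiment, the Bayes factor this record builds on can be illustrated with the textbook comparison of H0: θ = 0.5 against H1: θ ~ Uniform(0, 1). This generic sketch is not the adaptive-significance-level procedure of the paper; it only shows how a Bayes factor weighs the two hypotheses' marginal likelihoods for x successes in n trials.

```python
import math

def bayes_factor_01(x, n):
    """Bayes factor BF01 for a Bernoulli experiment:
    H0: theta = 0.5 versus H1: theta ~ Uniform(0, 1).

    Under H0 the probability of x successes in n trials is C(n, x) * 0.5^n;
    under H1 the marginal likelihood integrates to 1 / (n + 1).
    BF01 > 1 favours H0, BF01 < 1 favours H1.
    """
    like_h0 = math.comb(n, x) * 0.5 ** n
    like_h1 = 1.0 / (n + 1)
    return like_h0 / like_h1

print(round(bayes_factor_01(5, 10), 3))  # 2.707 -- balanced data favours H0
print(round(bayes_factor_01(9, 10), 3))  # 0.107 -- lopsided data favours H1
```

Unlike a fixed-level p-value test, the evidence scale here adapts naturally with n, which is the property the paper exploits when relating its adaptive significance level to sample size.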

  1. Molecular phylogeny of selected species of the order Dinophysiales (Dinophyceae) - testing the hypothesis of a Dinophysioid radiation

    DEFF Research Database (Denmark)

    Jensen, Maria Hastrup; Daugbjerg, Niels

    2009-01-01

    additional information on morphology and ecology to these evolutionary lineages. We have for the first time combined morphological information with molecular phylogenies to test the dinophysioid radiation hypothesis in a modern context. Nuclear-encoded LSU rDNA sequences including domains D1-D6 from 27...

  2. Non-genetic health professionals' attitude towards, knowledge of and skills in discussing and ordering genetic testing for hereditary cancer

    NARCIS (Netherlands)

    Douma, Kirsten F. L.; Smets, Ellen M. A.; Allain, Dawn C.

    2016-01-01

    Non-genetic health professionals (NGHPs) have insufficient knowledge of cancer genetics, express educational needs and are unprepared to counsel their patients regarding their genetic test results. So far, it is unclear how NGHPs perceive their own communication skills. This study was undertaken to

  3. Ordering blood tests for patients with unexplained fatigue in general practice: what does it yield? Results of the VAMPIRE trial.

    NARCIS (Netherlands)

    Koch, H.; Bokhoven, M.A. van; Riet, G. ter; Alphen-Jager, J.T. van; Weijden, T.T. van der; Dinant, G.J.; Bindels, P.J.

    2009-01-01

    BACKGROUND: Unexplained fatigue is frequently encountered in general practice. Because of the low prior probability of underlying somatic pathology, the positive predictive value of abnormal (blood) test results is limited in such patients. AIM: The study objectives were to investigate the

  4. Ordering blood tests for patients with unexplained fatigue in general practice: what does it yield? Results of the VAMPIRE trial

    NARCIS (Netherlands)

    Koch, Hèlen; van Bokhoven, Marloes A.; ter Riet, Gerben; van Alphen-Jager, Jm Tineke; van der Weijden, Trudy; Dinant, Geert-Jan; Bindels, Patrick Je

    2009-01-01

    Background Unexplained fatigue is frequently encountered in general practice. Because of the low prior probability of underlying somatic pathology, the positive predictive value of abnormal (blood) test results is limited in such patients. Aim The study objectives were to investigate the

  5. People consider reliability and cost when verifying their autobiographical memories.

    Science.gov (United States)

    Wade, Kimberley A; Nash, Robert A; Garry, Maryanne

    2014-02-01

    Because memories are not always accurate, people rely on a variety of strategies to verify whether the events that they remember really did occur. Several studies have examined which strategies people tend to use, but none to date has asked why people opt for certain strategies over others. Here we examined the extent to which people's beliefs about the reliability and the cost of different strategies would determine their strategy selection. Subjects described a childhood memory and then suggested strategies they might use to verify the accuracy of that memory. Next, they rated the reliability and cost of each strategy, and the likelihood that they might use it. Reliability and cost each predicted strategy selection, but a combination of the two ratings provided even greater predictive value. Cost was significantly more influential than reliability, which suggests that a tendency to seek and to value "cheap" information more than reliable information could underlie many real-world memory errors. Copyright © 2013 Elsevier B.V. All rights reserved.

  6. A record and verify system for radiotherapy treatment

    International Nuclear Information System (INIS)

    Koens, M.L.; Vroome, H. de

    1984-01-01

The Record and Verify system developed for the radiotherapy department of the Leiden University Hospital is described. The system has been in use since 1980 and will now be installed in at least four of the Dutch University Hospitals. The system provides the radiographer with a powerful tool for checking the set-up of the linear accelerator preceding the irradiation of a field. After the irradiation of a field, the machine settings are registered in the computer system together with the newly calculated cumulative dose. These registrations are used by the system to produce a daily report which provides the management of the department with insight into the established differences between treatment and treatment planning. Buying a record and verify system from the manufacturer of the linear accelerator is not an optimal solution, especially for a department with more than one accelerator from different manufacturers. Integration in a Hospital Information System (HIS) has important advantages over the development of a dedicated departmental system. (author)

  7. Characterizing Verified Head Impacts in High School Girls' Lacrosse.

    Science.gov (United States)

    Caswell, Shane V; Lincoln, Andrew E; Stone, Hannah; Kelshaw, Patricia; Putukian, Margot; Hepburn, Lisa; Higgins, Michael; Cortes, Nelson

    2017-12-01

Girls' high school lacrosse players have higher rates of head and facial injuries than boys. Research indicates that these injuries are caused by stick, player, and ball contacts. Yet, no studies have characterized head impacts in girls' high school lacrosse. To characterize girls' high school lacrosse game-related impacts by frequency, magnitude, mechanism, player position, and game situation. Descriptive epidemiology study. Thirty-five female participants (mean age, 16.2 ± 1.2 years; mean height, 1.66 ± 0.05 m; mean weight, 61.2 ± 6.4 kg) volunteered during 28 games in the 2014 and 2015 lacrosse seasons. Participants wore impact sensors affixed to the right mastoid process before each game. All game-related impacts recorded by the sensors were verified using game video. Data were summarized for all verified impacts in terms of frequency, peak linear acceleration (PLA), and peak rotational acceleration (PRA). Descriptive statistics and impact rates were calculated. Fifty-eight verified game-related impacts ≥20 g were recorded (median PLA, 33.8 g; median PRA, 6151.1 rad/s²) during 467 player-games. The impact rate for all game-related verified impacts was 0.12 per athlete-exposure (AE) (95% CI, 0.09-0.16), equivalent to 2.1 impacts per team game, indicating that each athlete suffered fewer than 2 head impacts ≥20 g per season. Of these impacts, 28 (48.3%) were confirmed to directly strike the head, corresponding with an impact rate of 0.05 per AE (95% CI, 0.00-0.10). Overall, midfielders (n = 28, 48.3%) sustained the most impacts, followed by defenders (n = 12, 20.7%), attackers (n = 11, 19.0%), and goalies (n = 7, 12.1%). Goalies demonstrated the highest median PLA and PRA (38.8 g and 8535.0 rad/s², respectively). The most common impact mechanisms were contact with a stick (n = 25, 43.1%) and a player (n = 17, 29.3%), followed by the ball (n = 7, 12.1%) and the ground (n = 7, 12.1%). One hundred percent of ball impacts occurred to goalies. Most impacts

  8. Business rescue decision making through verifier determinants – ask the specialists

    Directory of Open Access Journals (Sweden)

    Marius Pretorius

    2013-11-01

Full Text Available Orientation: Business rescue has become a critical part of business strategy decision making, especially during economic downturns and recessions. Past legislation has generally supported creditor-friendly regimes, and its mind-set still applies, which increases the difficulty of such turnarounds. There are many questions and critical issues faced by those involved in rescue. Despite extensive theory in the literature on failure, there is a void regarding practical verifiers of the signs and causes of venture decline, as specialists are not forthcoming about what they regard as their “competitive advantage”. Research purpose: This article introduces the concept and role of “verifier determinants” of early warning signs, as a tool to confirm the causes of decline in order to direct rescue strategies and, most importantly, reduce the time between the first observation and the implementation of the rescue. Motivation for the study: Knowing how specialist practitioners confirm causes of business decline could assist in deciding on rescue strategies earlier than is possible with traditional due diligence, which is time consuming. Reducing time is a crucial element of a successful rescue. Research design and approach: The researchers interviewed specialist practitioners with extensive experience in rescue and turnaround. An experimental design was used to ensure that the specialists evaluated the same real cases, so that their experiences and the bases of their decisions could be extracted. Main findings: The specialists confirmed the use of verifier determinants and identified the determinants they personally used to confirm causes of decline. These verifier determinants were classified into five categories, namely management, finance, strategic, banking, and operations and marketing of the ventures under investigation. The verifier determinants and their use often depend heavily on subconscious (non-factual information based on previous experiences

  9. Raising Test Scores vs. Teaching Higher Order Thinking (HOT): Senior Science Teachers' Views on How Several Concurrent Policies Affect Classroom Practices

    Science.gov (United States)

    Zohar, Anat; Alboher Agmon, Vered

    2018-01-01

    Purpose: This study investigates how senior science teachers viewed the effects of a Raising Test Scores policy and its implementation on instruction of higher order thinking (HOT), and on teaching thinking to students with low academic achievements. Background: The study was conducted in the context of three concurrent policies advocating: (a)…

  10. General practitioner views on the determinants of test ordering: a theory-based qualitative approach to the development of an intervention to improve immunoglobulin requests in primary care.

    Science.gov (United States)

    Cadogan, S L; McHugh, S M; Bradley, C P; Browne, J P; Cahill, M R

    2016-07-19

    Research suggests that variation in laboratory requesting patterns may indicate unnecessary test use. Requesting patterns for serum immunoglobulins vary significantly between general practitioners (GPs). This study aims to explore GPs' views on testing to identify the determinants of behaviour and recommend feasible intervention strategies for improving immunoglobulin test use in primary care. Qualitative semi-structured interviews were conducted with GPs requesting laboratory tests at Cork University Hospital or University Hospital Kerry in the South of Ireland. GPs were identified using a Health Service Executive laboratory list of GPs in the Cork-Kerry region. A random sample of GPs (stratified by GP requesting patterns) was generated from this list. GPs were purposively sampled based on the criteria of location (urban/rural); length of time qualified; and practice size (single-handed/group). Interviews were carried out between December 2014 and February 2015. Interviews were transcribed verbatim and analysed in NVivo 10 software using the framework analysis method. Emerging themes were mapped to the theoretical domains framework (TDF), which outlines 12 domains that can enable or inhibit behaviour change. The behaviour change wheel and behaviour change technique (BCT) taxonomy were then used to identify potential intervention strategies. Sixteen GPs were interviewed (ten males and six females). Findings suggest that intervention strategies should specifically target the key barriers to effective test ordering, while considering the context of primary care practice. Seven domains from the TDF were perceived to influence immunoglobulin test ordering behaviours and were identified as 'mechanisms for change' (knowledge; environmental context and resources; social/professional role and identity; beliefs about capabilities; beliefs about consequences; memory, attention and decision-making processes; and behavioural regulation). Using these TDF domains, seven BCTs

  11. [The development and evaluation of software to verify diagnostic accuracy].

    Science.gov (United States)

    Jensen, Rodrigo; de Moraes Lopes, Maria Helena Baena; Silveira, Paulo Sérgio Panse; Ortega, Neli Regina Siqueira

    2012-02-01

    This article describes the development and evaluation of software that verifies the accuracy of diagnoses made by nursing students. The software was based on a model using fuzzy logic concepts, implemented in Perl with a MySQL database for Internet accessibility, and built on the NANDA-I 2007-2008 classification system. The software was evaluated in terms of its technical quality and usability through specific instruments. The activity proposed in the software involves four stages in which students establish the relationship values between nursing diagnoses, defining characteristics/risk factors and clinical cases. The relationship values determined by students are compared to those of specialists, generating performance scores for the students. In the evaluation, the software demonstrated satisfactory outcomes regarding technical quality and, according to the students, helped in their learning and may become an educational tool to teach the process of nursing diagnosis.
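The comparison step in the abstract above — students' relationship values scored against specialists' — can be illustrated with a toy scoring rule. The 1 − mean-absolute-difference score below is an illustrative stand-in, not the fuzzy-logic model the software actually uses, and the function name is ours:

```python
def performance_score(student, expert):
    """Score a student's diagnosis-relationship values in [0, 1]
    against expert values; 1.0 means perfect agreement."""
    if len(student) != len(expert):
        raise ValueError("value lists must align item for item")
    # Mean absolute difference between the two sets of relationship values.
    mad = sum(abs(s - e) for s, e in zip(student, expert)) / len(expert)
    return 1.0 - mad
```

For example, `performance_score([0.8, 0.2, 1.0], [1.0, 0.0, 1.0])` yields about 0.867: the student's values deviate from the specialist's by 0.2 on two of three items.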

  12. Calling Out Cheaters: Covert Security with Public Verifiability

    DEFF Research Database (Denmark)

    Asharov, Gilad; Orlandi, Claudio

    2012-01-01

    We introduce the notion of covert security with public verifiability, building on the covert security model introduced by Aumann and Lindell (TCC 2007). Protocols that satisfy covert security guarantee that the honest parties involved in the protocol will notice any cheating attempt with some...... constant probability ε. The idea behind the model is that the fear of being caught cheating will be enough of a deterrent to prevent any cheating attempt. However, in the basic covert security model, the honest parties are not able to persuade any third party (say, a judge) that a cheating occurred. We...... propose (and formally define) an extension of the model where, when an honest party detects cheating, it also receives a certificate that can be published and used to persuade other parties, without revealing any information about the honest party’s input. In addition, malicious parties cannot create fake...

  13. Developing a flexible and verifiable integrated dose assessment capability

    International Nuclear Information System (INIS)

    Parzyck, D.C.; Rhea, T.A.; Copenhaver, E.D.; Bogard, J.S.

    1987-01-01

    A flexible yet verifiable system of computing and recording personnel doses is needed. Recent directions in statutes establish the trend of combining internal and external doses. We are developing a Health Physics Information Management System (HPIMS) that will centralize dosimetry calculations and data storage; integrate health physics records with other health-related disciplines, such as industrial hygiene, medicine, and safety; provide a more auditable system with published algorithms and clearly defined flowcharts of system operation; readily facilitate future changes dictated by new regulations, new dosimetric models, and new systems of units; and address ad-hoc inquiries regarding worker/workplace interactions, including potential synergisms with non-radiation exposures. The system is modular and provides a high degree of isolation from low-level detail, allowing flexibility for changes without adversely affecting other parts of the system. 10 refs., 3 figs

  14. Design of a verifiable subset for HAL/S

    Science.gov (United States)

    Browne, J. C.; Good, D. I.; Tripathi, A. R.; Young, W. D.

    1979-01-01

    An attempt to evaluate the applicability of program verification techniques to the existing programming language, HAL/S is discussed. HAL/S is a general purpose high level language designed to accommodate the software needs of the NASA Space Shuttle project. A diversity of features for scientific computing, concurrent and real-time programming, and error handling are discussed. The criteria by which features were evaluated for inclusion into the verifiable subset are described. Individual features of HAL/S with respect to these criteria are examined and justification for the omission of various features from the subset is provided. Conclusions drawn from the research are presented along with recommendations made for the use of HAL/S with respect to the area of program verification.

  15. A Formally Verified Conflict Detection Algorithm for Polynomial Trajectories

    Science.gov (United States)

    Narkawicz, Anthony; Munoz, Cesar

    2015-01-01

    In air traffic management, conflict detection algorithms are used to determine whether or not aircraft are predicted to lose horizontal and vertical separation minima within a time interval assuming a trajectory model. In the case of linear trajectories, conflict detection algorithms have been proposed that are both sound, i.e., they detect all conflicts, and complete, i.e., they do not present false alarms. In general, for arbitrary nonlinear trajectory models, it is possible to define detection algorithms that are either sound or complete, but not both. This paper considers the case of nonlinear aircraft trajectory models based on polynomial functions. In particular, it proposes a conflict detection algorithm that precisely determines whether, given a lookahead time, two aircraft flying polynomial trajectories are in conflict. That is, it has been formally verified that, assuming that the aircraft trajectories are modeled as polynomial functions, the proposed algorithm is both sound and complete.
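The soundness-and-completeness claim above rests on the fact that, for polynomial trajectories, the times at which separation is lost are roots of polynomials and can therefore be found exactly rather than sampled. A minimal numpy-based sketch of that root-isolation idea (names, the root tolerance, and the interface are ours; the paper's formally verified algorithm is more careful numerically):

```python
import numpy as np

def in_conflict(dx, dy, dz, D, H, T):
    """Check whether aircraft with polynomial relative position
    (dx(t), dy(t), dz(t)) lose horizontal separation D and vertical
    separation H simultaneously at some time t in [0, T].
    dx, dy, dz are np.polynomial.Polynomial objects."""
    horiz = dx**2 + dy**2 - D**2   # negative inside horizontal loss
    vert = dz**2 - H**2            # negative inside vertical loss

    # The sign pattern can only change at interval endpoints or at
    # real roots of the two boundary polynomials.
    candidates = [0.0, float(T)]
    for p in (horiz, vert):
        for r in p.roots():
            if abs(r.imag) < 1e-9 and 0.0 <= r.real <= T:
                candidates.append(float(r.real))
    candidates = sorted(set(candidates))

    # Testing the midpoint of every subinterval between candidate
    # times then decides conflict exactly (up to root-finding error).
    for a, b in zip(candidates, candidates[1:]):
        t = 0.5 * (a + b)
        if horiz(t) < 0 and vert(t) < 0:
            return True
    return False
```

For instance, a head-on encounter `dx(t) = 10 − 5t` at the same altitude is flagged against `D = 5`, while a constant 10-unit gap is not; no lookahead grid or sampling step is involved.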

  16. Leveraging Parallel Data Processing Frameworks with Verified Lifting

    Directory of Open Access Journals (Sweden)

    Maaz Bin Safeer Ahmad

    2016-11-01

    Many parallel data frameworks have been proposed in recent years that let sequential programs access parallel processing. To capitalize on the benefits of such frameworks, existing code must often be rewritten to the domain-specific languages that each framework supports. This rewriting, which is tedious and error-prone, also requires developers to choose the framework that best optimizes performance given a specific workload. This paper describes Casper, a novel compiler that automatically retargets sequential Java code for execution on Hadoop, a parallel data processing framework that implements the MapReduce paradigm. Given a sequential code fragment, Casper uses verified lifting to infer a high-level summary expressed in our program specification language that is then compiled for execution on Hadoop. We demonstrate that Casper automatically translates Java benchmarks into Hadoop. The translated results execute on average 3.3x faster than the sequential implementations and scale better, as well, to larger datasets.

  17. Developing an Approach for Analyzing and Verifying System Communication

    Science.gov (United States)

    Stratton, William C.; Lindvall, Mikael; Ackermann, Chris; Sibol, Deane E.; Godfrey, Sally

    2009-01-01

    This slide presentation reviews a project for developing an approach for analyzing and verifying inter-system communications. The motivation for the study was that software systems in the aerospace domain are inherently complex and operate under tight resource constraints, so systems of systems must communicate with each other to fulfill their tasks, and such systems of systems require reliable communication. The technical approach was to develop a system, DynSAVE, that detects communication problems among the systems. The project enhanced the proven Software Architecture Visualization and Evaluation (SAVE) tool to create Dynamic SAVE (DynSAVE). The approach monitors and records low-level network traffic, converts the low-level traffic into meaningful messages, and displays the messages in a way that allows issues to be detected.

  18. Noninteractive Verifiable Outsourcing Algorithm for Bilinear Pairing with Improved Checkability

    Directory of Open Access Journals (Sweden)

    Yanli Ren

    2017-01-01

    It is well known that the computation of bilinear pairing is the most expensive operation in pairing-based cryptography. In this paper, we propose a noninteractive verifiable outsourcing algorithm for bilinear pairing based on two servers in the one-malicious model. The outsourcer need not execute any expensive operation, such as scalar multiplication or modular exponentiation. Moreover, the outsourcer can detect any failure with a probability close to 1 if one of the servers misbehaves. Therefore, the proposed algorithm improves checkability and decreases communication cost compared with previous ones. Finally, we utilize the proposed algorithm as a subroutine to achieve an anonymous identity-based encryption (AIBE) scheme with outsourced decryption and an identity-based signature (IBS) scheme with outsourced verification.

  19. Modelling and Verifying Communication Failure of Hybrid Systems in HCSP

    DEFF Research Database (Denmark)

    Wang, Shuling; Nielson, Flemming; Nielson, Hanne Riis

    2016-01-01

    Hybrid systems are dynamic systems with interacting discrete computation and continuous physical processes. They have become ubiquitous in our daily life, e.g. automotive, aerospace and medical systems, and in particular, many of them are safety-critical. For a safety-critical hybrid system......, in the presence of communication failure, the expected control from the controller will get lost and as a consequence the physical process cannot behave as expected. In this paper, we mainly consider the communication failure caused by the non-engagement of one party in communication action, i.e. the communication itself fails to occur. To address this issue, this paper proposes a formal framework by extending HCSP, a formal modeling language for hybrid systems, for modeling and verifying hybrid systems in the absence of receiving messages due to communication failure. We present two inference systems...

  20. A detailed and verified wind resource atlas for Denmark

    Energy Technology Data Exchange (ETDEWEB)

    Mortensen, N G; Landberg, L; Rathmann, O; Nielsen, M N [Risoe National Lab., Roskilde (Denmark); Nielsen, P [Energy and Environmental Data, Aalberg (Denmark)

    1999-03-01

    A detailed and reliable wind resource atlas covering the entire land area of Denmark has been established. Key words of the methodology are wind atlas analysis, interpolation of wind atlas data sets, automated generation of digital terrain descriptions and modelling of local wind climates. The atlas contains wind speed and direction distributions, as well as mean energy densities of the wind, for 12 sectors and four heights above ground level: 25, 45, 70 and 100 m. The spatial resolution is 200 meters in the horizontal. The atlas has been verified by comparison with actual wind turbine power productions from over 1200 turbines. More than 80% of these turbines were predicted to within 10%. The atlas will become available on CD-ROM and on the Internet. (au)

  1. Verifying reciprocal relations for experimental diffusion coefficients in multicomponent mixtures

    DEFF Research Database (Denmark)

    Medvedev, Oleg; Shapiro, Alexander

    2003-01-01

    The goal of the present study is to verify the agreement of the available data on diffusion in ternary mixtures with the theoretical requirement of linear non-equilibrium thermodynamics consisting in symmetry of the matrix of the phenomenological coefficients. A common set of measured diffusion...... coefficients for a three-component mixture consists of four Fickian diffusion coefficients, each being reported separately. However, the Onsager theory predicts the existence of only three independent coefficients, as one of them disappears due to the symmetry requirement. Re-calculation of the Fickian...... extended sets of experimental data and reliable thermodynamic models were available. The sensitivity of the symmetry property to different thermodynamic parameters of the models was also checked. (C) 2003 Elsevier Science B.V. All rights reserved....

  2. How to Verify and Manage the Translational Plagiarism?

    Science.gov (United States)

    Wiwanitkit, Viroj

    2016-01-01

    The use of Google translator as a tool for determining translational plagiarism is a big challenge. As noted, plagiarism of the original papers written in Macedonian and translated into other languages can be verified after computerised translation in other languages. Attempts to screen the translational plagiarism should be supported. The use of Google Translate tool might be helpful. Special focus should be on any non-English reference that might be the source of plagiarised material and non-English article that might translate from an original English article, which cannot be detected by simple plagiarism screening tool. It is a hard job for any journal to detect the complex translational plagiarism but the harder job might be how to effectively manage the case. PMID:27703588

  3. Biochemically verified smoking cessation and vaping beliefs among vape store customers.

    Science.gov (United States)

    Tackett, Alayna P; Lechner, William V; Meier, Ellen; Grant, DeMond M; Driskill, Leslie M; Tahirkheli, Noor N; Wagener, Theodore L

    2015-05-01

    To evaluate biochemically verified smoking status and electronic nicotine delivery systems (ENDS) use behaviors and beliefs among a sample of customers from vapor stores (stores specializing in ENDS). A cross-sectional survey of 215 adult vapor store customers at four retail locations in the Midwestern United States; a subset of participants (n = 181) also completed exhaled carbon monoxide (CO) testing to verify smoking status. Outcomes evaluated included ENDS preferences, harm beliefs, use behaviors, smoking history and current biochemically verified smoking status. Most customers reported starting ENDS as a means of smoking cessation (86%), using newer-generation devices (89%), vaping non-tobacco/non-menthol flavors (72%) and using e-liquid with nicotine strengths of ≤20 mg/ml (72%). There was a high rate of switching (91.4%) to newer-generation ENDS among those who started with a first-generation product. Exhaled CO readings confirmed that 66% of the tested sample had quit smoking. Among those who continued to smoke, mean cigarettes per day decreased from 22.1 to 7.5. Among customers in the United States who use electronic nicotine delivery devices to stop smoking, vaping longer, using newer-generation devices and using non-tobacco and non-menthol flavored e-liquid appear to be associated with higher rates of smoking cessation. © 2015 Society for the Study of Addiction.

  4. A Novel Simple Phantom for Verifying the Dose of Radiation Therapy

    Directory of Open Access Journals (Sweden)

    J. H. Lee

    2015-01-01

    A standard protocol of dosimetric measurements is used by the organizations responsible for verifying that the doses delivered in radiation-therapy institutions are within authorized limits. This study evaluated a self-designed simple auditing phantom for use in verifying the dose of radiation therapy; the phantom design, dose audit system, and clinical tests are described. Thermoluminescent dosimeters (TLDs) were used as postal dosimeters, and mailable phantoms were produced for use in postal audits. Correction factors are important for converting TLD readout values from phantoms into the absorbed dose in water. The phantom scatter correction factor was used to quantify the difference in the scattered dose between a solid water phantom and homemade phantoms; its value ranged from 1.084 to 1.031. The energy-dependence correction factor was used to compare the TLD readout of the unit dose irradiated by audit beam energies with ⁶⁰Co in the solid water phantom; its value was 0.99 to 1.01. The setup-condition factor was used to correct for differences in dose-output calibration conditions. Clinical tests of the device calibrating the dose output revealed that the dose deviation was within 3%. Therefore, our homemade phantoms and dosimetric system can be applied for accurately verifying the doses applied in radiation-therapy institutions.
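The audit arithmetic described above is a chain of multiplicative corrections applied to the raw TLD readout. A sketch of that bookkeeping (the function and factor names are ours, not the paper's notation, and the calibration coefficient is a placeholder):

```python
def absorbed_dose(tld_readout, n_cal, k_scatter, k_energy, k_setup):
    """Convert a raw TLD readout into absorbed dose in water (Gy).

    n_cal     -- calibration coefficient (Gy per readout unit)
    k_scatter -- phantom scatter correction (1.031..1.084 in the study)
    k_energy  -- energy-dependence correction (0.99..1.01 in the study)
    k_setup   -- setup-condition correction
    """
    return tld_readout * n_cal * k_scatter * k_energy * k_setup

def deviation(measured, stated):
    """Relative deviation between the audited dose and the dose the
    institution stated; the clinical acceptance criterion was 3%."""
    return (measured - stated) / stated
```

So a readout of 100 units with `n_cal = 0.01` Gy/unit and `k_scatter = 1.05` audits to 1.05 Gy, and the audit passes if `abs(deviation(...))` stays below 0.03.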

  5. Two tests of electric fields, second-order in source-velocity terms of closed, steady currents: (1) an electron beam; (2) a superconducting coil

    International Nuclear Information System (INIS)

    Kenyon, C.S.

    1980-01-01

    One particular prediction of Maxwell's theory that has been previously neglected is that the motion of charges traveling in closed loops produces no constant electric fields. This study presents and analyzes the results of two new experiments designed to test for second-order, source-velocity electric fields from steady, closed currents and analyzes another experiment in light of these fields. The first experiment employed an electron beam. The second used a niobium-titanium coil designed so that the voltage measurement configuration could be easily switched from a Faraday to a non-Faraday configuration between sets of runs. The implications of the observation of a null charge on magnetically suspended superconducting spheres vis-à-vis the second-order, source-velocity fields were discussed as the third case. The observation of a null potential corresponding to a null effective charge from a hypothetical velocity-squared field in both the beam and the coil experiments placed an upper bound on a field term at 0.02 with respect to a Coulomb term. An observed null charge on the suspended spheres reduced this bound to 0.001. Such an upper bound is strong evidence against alternative theories predicting a relative contribution of the order of unity for a simple velocity-squared term. A simple velocity-squared electric field would be indistinguishable from a velocity-squared charge variation. The latter test limits such a charge variation to 0.001 of the total charge. The suspended-spheres test allowed the previously neglected issue of a general second-order, source-velocity electric field to be addressed. The observed null charge in this test contradicts and thus eliminates a hypothesized, general, electric field expression containing three second-order, source-velocity terms

  6. Development of a hardware-in-the-loop-test rig to verify the reliability of oil burner pumps. Application by the use of biocide in domestic heating oil; Entwicklung eines Hardware-in-the-loop Pruefstands zum Nachweis der Betriebssicherheit von Oelbrennerpumpen. Anwendungen bei Einsatz von Biozidadditiven

    Energy Technology Data Exchange (ETDEWEB)

    Rheinberg, Oliver van; Lukito, Jayadi; Liska, Martin [Oel-Waerme-Institut gGmbH (OWI), Aachen-Herzogenrath (Germany)

    2009-09-15

    Within this project, a hardware-in-the-loop test rig has been developed to investigate the influence of different fuels on the reliability of oil burner pumps. The test rig is constructed with commercial burner components. One test rig consists of four pump cycles, where the fuel recirculates for max. 2000 h. Low-powered electric motors of 90 W were used deliberately, so that the apparatus is more sensitive to failure due to an increase in pump load. A practice-relevant intermittent operating mode has been implemented for the simulation of real operating characteristics. The measured variables and evaluation parameters are start-up torque, intake pressure, fuel pump pressure and temperature. Operational failures of oil burner pumps in the field, due to an over-additisation of biocides, have been observed. These failures could be reproducibly simulated on the pump test stands. The results of the project are a redefinition of limits for biocide concentration and the development of new biocides, which are suitable for use in domestic heating oil with a content of up to 20 % Fatty-Acid-Methyl-Ester. (orig.)

  7. Verifying Quality of Service of ARCnet Based ATOMOS Communication System for Integrated Ship Control

    DEFF Research Database (Denmark)

    Nielsen, N.N.; Nielsen, Jens Frederik Dalsgaard; Schiøler, Henrik

    1999-01-01

    As part of the ATOMOS project (Funded by EU, DG VII) a reliable communication system with predictable behaviour has been designed. The selected solution is a network based on redundant ARCnet segments extended with an EN50170 compliant fieldbus based layer on top of an ARCnet SAP (service access point) layer. An important characteristic of the communication system is that the functionality and timing must be verifiable in order to satisfy requirements from classification companies like Lloyds and Norsk Veritas. By including Service Categories, Traffic Descriptors and Quality of Service concepts...

  9. Verifying the competition between haloperidol and biperiden in serum albumin through a model based on spectrofluorimetry

    Science.gov (United States)

    Muniz da Silva Fragoso, Viviane; Patrícia de Morais e Coura, Carla; Paulino, Erica Tex; Valdez, Ethel Celene Narvaez; Silva, Dilson; Cortez, Celia Martins

    2017-11-01

    The aim of this work was to apply mathematical-computational modeling to study the interactions of haloperidol (HLP) and biperiden (BPD) with human (HSA) and bovine (BSA) serum albumin, in order to verify the competition of these drugs for binding sites in HSA, using intrinsic tryptophan fluorescence quenching data. The association constants estimated at 37 °C were 2.17(±0.05) × 10⁷ M⁻¹ for HLP-HSA and 2.01(±0.03) × 10⁸ M⁻¹ for BPD-HSA. Results have shown that the drugs do not compete for the same binding sites in albumin.

  10. A new (k,n) verifiable secret image sharing scheme (VSISS)

    Directory of Open Access Journals (Sweden)

    Amitava Nag

    2014-11-01

    In this paper, a new (k,n) verifiable secret image sharing scheme (VSISS) is proposed, in which a third-order LFSR (linear-feedback shift register)-based public key cryptosystem is applied for cheating prevention and preview before decryption. In the proposed scheme the secret image is first partitioned into several non-overlapping blocks of k pixels. Every block of k pixels is then used to form m = ⌈k/4⌉ + 1 pixels of one encrypted share. The original secret image can be reconstructed by gathering any k or more encrypted shared images. The experimental results show that the proposed VSISS is an efficient and safe method.
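The size arithmetic in the abstract above determines how much each share expands relative to the secret image: every k-pixel block yields m = ⌈k/4⌉ + 1 share pixels. A sketch of that bookkeeping (function name ours; the LFSR-based encryption of each block is not shown):

```python
import math

def share_size(num_pixels, k):
    """Number of pixels in one encrypted share when a secret image of
    num_pixels pixels is split into non-overlapping blocks of k pixels
    and each block yields m = ceil(k/4) + 1 share pixels."""
    m = math.ceil(k / 4) + 1          # share pixels per block
    blocks = math.ceil(num_pixels / k)  # blocks covering the image
    return blocks * m
```

For k = 4 each block of 4 pixels produces 2 share pixels, so a 16-pixel image yields 8-pixel shares; larger k reduces the expansion ratio m/k.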

  11. Tests of Quantum electrodynamics at the α³ and α⁴ orders and search for excited leptons with the CELLO detector at PETRA

    International Nuclear Information System (INIS)

    Janot, P.

    1987-06-01

    Single- and double-bremsstrahlung processes in e⁺e⁻ annihilation have been studied in order to perform QED tests up to the 3rd and 4th orders of perturbation theory on one hand, and to detect possible excited leptonic states on the other hand. An integrated luminosity of about 130 pb⁻¹, accumulated with the CELLO detector at PETRA at center-of-mass energies ranging from 35 to 46.8 GeV, has been analysed. In order to compare data with the QED predictions, simulation programs had to be developed, in particular for the annihilation into 4 photons. For all the processes, good agreement with QED is observed and new limits are derived for excited leptonic states [fr]

  12. A Hybrid Verifiable and Delegated Cryptographic Model in Cloud Computing

    Directory of Open Access Journals (Sweden)

    Jaber Ibrahim Naser

    2018-02-01

    Access control is very important in cloud data sharing. Especially in domains like healthcare, it is essential to have access control mechanisms in place for confidentiality and secure data access. Attribute-based encryption has been used for many years to secure data and provide controlled access. In this paper, we propose a framework that supports a circuit- and attribute-based encryption mechanism involving multiple parties: data owner, data user, cloud server and attribute authority. An important feature of the proposed system is the verifiable delegation of the decryption process to the cloud server. The data owner encrypts data and delegates the decryption process to the cloud. The cloud server performs partial decryption, and the final decrypted data are shared with users as per their privileges. The data owner thus reduces computational complexity by delegating the decryption process to the cloud server. We built a prototype application using the Microsoft .NET platform as a proof of concept. The empirical results revealed that there is controlled access with multiple user roles and access control rights for secure and confidential data access in cloud computing.

  13. Verifying the Simulation Hypothesis via Infinite Nested Universe Simulacrum Loops

    Science.gov (United States)

    Sharma, Vikrant

    2017-01-01

    The simulation hypothesis proposes that local reality exists as a simulacrum within a hypothetical computer's dimension. More specifically, Bostrom's trilemma proposes that the number of simulations an advanced 'posthuman' civilization could produce makes the proposition very likely. In this paper a hypothetical method to verify the simulation hypothesis is discussed using infinite regression applied to a new type of infinite loop. Assign dimension n to any computer in our present reality, where dimension signifies the hierarchical level in nested simulations our reality exists in. A computer simulating known reality would be dimension (n-1), and likewise a computer simulating an artificial reality, such as a video game, would be dimension (n +1). In this method, among others, four key assumptions are made about the nature of the original computer dimension n. Summations show that regressing such a reality infinitely will create convergence, implying that the verification of whether local reality is a grand simulation is feasible to detect with adequate compute capability. The action of reaching said convergence point halts the simulation of local reality. Sensitivities to the four assumptions and implications are discussed.

  14. Verifying operator fitness - an imperative not an option

    International Nuclear Information System (INIS)

    Scott, A.B. Jr.

    1987-01-01

    In the early morning hours of April 26, 1986, whatever credence those who operate nuclear power plants around the world could then muster, suffered a jarring reversal. Through an incredible series of personal errors, the operators at what was later to be termed one of the best operated plants in the USSR systematically stripped away the physical and procedural safeguards inherent to their installation and precipitated the worst reactor accident the world has yet seen. This challenge to the adequacy of nuclear operators comes at a time when many companies throughout the world - not only those that involve nuclear power - are grappling with the problem of how to assure the fitness for duty of those in their employ, specifically those users of substances that have an impact on the ability to function safely and productively in the workplace. In actuality, operator fitness for duty is far more than the lack of impairment from substance abuse, which many today consider it. Full fitness for duty implies mental and moral fitness, as well, and physical fitness in a more general sense. If we are to earn the confidence of the public, credible ways to verify total fitness on an operator-by-operator basis must be considered

  15. A credit card verifier structure using diffraction and spectroscopy concepts

    Science.gov (United States)

    Sumriddetchkajorn, Sarun; Intaravanne, Yuttana

    2008-04-01

    We propose and experimentally demonstrate an angle-multiplexing-based optical structure for verifying a credit card. Our key idea comes from the fact that the fine detail of the embossed hologram stamped on the credit card is hard to duplicate, and therefore its key color features can be used to distinguish between real and counterfeit cards. As the embossed hologram is a diffractive optical element, we shine, one at a time, a number of broadband light sources, each at a different incident angle, on the embossed hologram of the credit card, in such a way that a different color spectrum per incident angle is diffracted and separated in space. In this way, the number of pixels of each color plane is investigated. Then we apply a feed-forward back-propagation neural network configuration to separate the counterfeit credit card from the real one. Our experimental demonstration using two off-the-shelf broadband white light emitting diodes, one digital camera, a 3-layer neural network, and a notebook computer can identify all 69 counterfeit credit cards from eight real credit cards.

  16. Preliminary results from ASP on tests of QED to order α⁴ in e⁺e⁻ annihilation at √s = 29 GeV

    International Nuclear Information System (INIS)

    Hawkins, C.A.

    1988-11-01

    Tests of QED to order α⁴ performed with the ASP detector at PEP are presented. Measurements have been made of exclusive e⁺e⁻e⁺e⁻, e⁺e⁻γγ, and γγγγ final states with all particles above 50 milliradians with respect to the e⁺e⁻ beam line. These measurements represent a significant increase in statistics over previous measurements. All measurements agree well with theoretical predictions. 5 refs., 1 tab

  17. Robust Approach to Verifying the Weak Form of the Efficient Market Hypothesis

    Science.gov (United States)

    Střelec, Luboš

    2011-09-01

    The weak form of the efficient markets hypothesis states that prices incorporate only past information about the asset. An implication of this form is that one cannot detect mispriced assets and consistently outperform the market through technical analysis of past prices. One possible formulation of the efficient market hypothesis used for weak-form tests is that share prices follow a random walk, meaning that returns are realizations of an IID sequence of random variables. Consequently, for verifying the weak form of the efficient market hypothesis, we can use, among others, distribution tests, i.e. some tests of normality and/or some graphical methods. Many procedures for testing the normality of univariate samples have been proposed in the literature [7]. Today the most popular omnibus test of normality for general use is the Shapiro-Wilk test. The Jarque-Bera test is the most widely adopted omnibus test of normality in econometrics and related fields. In particular, the Jarque-Bera test (i.e. a test based on the classical measures of skewness and kurtosis) is frequently used when one is more concerned about heavy-tailed alternatives. As these measures are based on moments of the data, this test has a zero breakdown value [2]. In other words, a single outlier can make the test worthless. The reason so many classical procedures are nonrobust to outliers is that the parameters of the model are expressed in terms of moments, and their classical estimators are expressed in terms of sample moments, which are very sensitive to outliers. Another approach to robustness is to concentrate on the parameters of interest suggested by the problem under study. Consequently, novel robust procedures for testing normality are presented in this paper to overcome the shortcomings of classical normality tests in the field of financial data, which typically exhibit remote data points and additional types of deviations from
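
    The moment-based construction that makes the Jarque-Bera test fragile is easy to see in code. A minimal sketch of the classical statistic, JB = n/6 · (S² + (K − 3)²/4), with S and K the sample skewness and kurtosis (function and variable names are ours):

```python
def jarque_bera(x):
    """Classical Jarque-Bera statistic: JB = n/6 * (S^2 + (K - 3)^2 / 4)."""
    n = len(x)
    mean = sum(x) / n
    m2 = sum((v - mean) ** 2 for v in x) / n   # 2nd central moment
    m3 = sum((v - mean) ** 3 for v in x) / n   # 3rd central moment
    m4 = sum((v - mean) ** 4 for v in x) / n   # 4th central moment
    skew = m3 / m2 ** 1.5
    kurt = m4 / m2 ** 2
    return n / 6.0 * (skew ** 2 + (kurt - 3.0) ** 2 / 4.0)
```

    Because every term is a sample moment, a single large outlier inflates m3 and m4 without bound, which is exactly the zero-breakdown behavior criticized above.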

  18. Scenarios for exercising technical approaches to verified nuclear reductions

    International Nuclear Information System (INIS)

    Doyle, James

    2010-01-01

    Presidents Obama and Medvedev in April 2009 committed to a continuing process of step-by-step nuclear arms reductions beyond the new START treaty that was signed April 8, 2010, and to the eventual goal of a world free of nuclear weapons. In addition, the US Nuclear Posture Review released April 6, 2010 commits the US to initiate a comprehensive national research and development program to support continued progress toward a world free of nuclear weapons, including expanded work on verification technologies and the development of transparency measures. It is impossible to predict the specific directions that US-RU nuclear arms reductions will take over the next 5-10 years. Additional bilateral treaties could be reached requiring effective verification, as indicated by statements made by the Obama administration. There could also be transparency agreements or other initiatives (unilateral, bilateral or multilateral) that require monitoring with a standard of verification lower than formal arms control, but that still need to establish confidence among domestic, bilateral and multilateral audiences that declared actions are implemented. The US Nuclear Posture Review and other statements give some indication of the kinds of actions and declarations that may need to be confirmed in a bilateral or multilateral setting. Several new elements of the nuclear arsenals could be directly limited. For example, it is likely that both strategic and nonstrategic nuclear warheads (deployed and in storage), warhead components, and aggregate stocks of such items could be accountable under a future treaty or transparency agreement. In addition, new initiatives or agreements may require the verified dismantlement of a certain number of nuclear warheads over a specified time period. Eventually, procedures for confirming the elimination of nuclear warheads, components and fissile materials from military stocks will need to be established. This paper is intended to provide useful background information

  19. Some Proxy Signature and Designated verifier Signature Schemes over Braid Groups

    OpenAIRE

    Lal, Sunder; Verma, Vandani

    2009-01-01

    Braid groups provide an alternative to number-theoretic public-key cryptography and can be implemented quite efficiently. The paper proposes five signature schemes based on braid groups: proxy signature, designated verifier, bi-designated verifier, designated verifier proxy signature, and bi-designated verifier proxy signature. We also discuss the security aspects of each of the proposed schemes.

  20. Verifying Elimination Programs with a Special Emphasis on Cysticercosis Endpoints and Postelimination Surveillance

    Directory of Open Access Journals (Sweden)

    Sukwan Handali

    2012-01-01

    Methods are needed for determining program endpoints and for postprogram surveillance in any elimination program. Cysticercosis has the effective strategies and diagnostic tools necessary for establishing an elimination program; however, tools to verify program endpoints have not been determined. Using a statistical approach, the present study proposes that taeniasis and porcine cysticercosis antibody assays could be used to determine with high statistical confidence whether an area is free of disease. Confidence would be improved by using secondary tests such as the taeniasis coproantigen assay and necropsy of sentinel pigs.

  1. Calculation of neutron flux and reactivity by perturbation theory at high order

    International Nuclear Information System (INIS)

    Silva, W.L.P. da; Silva, F.C. da; Thome Filho, Z.D.

    1982-01-01

    A high-order, time-independent perturbation theory applied to the calculation of integral parameters of a nuclear reactor is studied. A perturbative formulation based on the flux difference technique, which gives directly the reactivity and neutron flux up to the required approximation order, is presented. As an application of the method, global perturbations represented by fuel temperature variations are used. Tests were done to verify the relevance of the approximation order for several intensities of the perturbations considered. (E.G.) [pt

  2. A proposal of verifying of an inductive voltage transformers precision class; Uma proposta de verificacao da classe de exatidao de transformadores de potencial indutivos

    Energy Technology Data Exchange (ETDEWEB)

    Silva, Izael Pereira da

    1997-07-01

    The possibility of verifying the inductive voltage transformer (IVT) precision class during its operational life by means of simple excitation and short-circuit tests is shown. In this way, the transportation of such equipment to laboratories or factories for new calibrations is avoided. As IVT precision is a function of geometric parameters and material characteristics such as winding resistivity, core permeability and others, if these do not change, there is no reason to expect modifications in the precision of the equipment. After a critical analysis, it is intended to demonstrate and experimentally verify that excitation and short-circuit tests are sufficient to detect any modification in the above parameters. If such alterations are not detected, it is possible to assure that the equipment maintains its specified precision class. The Moellinger and Gewecke method is used to determine the actual value of the turns ratio and the separate leakage reactance of the primary winding. These parameters are not easily obtainable in practice, and the present Brazilian standard for IVTs does not mention any method that permits their determination. Comparison of the errors obtained by the present method with those found with the Schering-Alberti bridge showed that this methodology is effective and useful, mainly when precision test equipment (such as an AC ratio bridge, standard transformer, or standard burdens) is not available and it is necessary to verify a transformer's condition, or even to decide about a disagreement between two results found in laboratories through conventional methods. Several discussions about transformer models are also included, in particular the decomposition method, which, being essentially different from the equivalent-T model, shows interesting aspects of transformer analysis. Two other points of special interest are the critical analysis of the Moellinger and Gewecke method described in Section 3.5 and its possible insertion in NBR 6820

  3. A low-order coupled chemistry meteorology model for testing online and offline data assimilation schemes: L95-GRS (v1.0)

    Science.gov (United States)

    Haussaire, J.-M.; Bocquet, M.

    2016-01-01

    Bocquet and Sakov (2013) introduced a low-order model based on the coupling of the chaotic Lorenz-95 (L95) model, which simulates winds along a mid-latitude circle, with the transport of a tracer species advected by this zonal wind field. This model, named L95-T, can serve as a playground for testing data assimilation schemes with an online model. Here, the tracer part of the model is extended to a reduced photochemistry module. This coupled chemistry meteorology model (CCMM), the L95-GRS (generic reaction set) model, mimics continental and transcontinental transport and the photochemistry of ozone, volatile organic compounds and nitrogen oxides. Its numerical implementation is described. The model is shown to reproduce the major physical and chemical processes being considered. L95-T and L95-GRS are specifically designed and useful for testing advanced data assimilation schemes, such as the iterative ensemble Kalman smoother (IEnKS), which combines the best of ensemble and variational methods. These models provide useful insights prior to the implementation of data assimilation methods into larger models. We illustrate their use with data assimilation schemes on preliminary yet instructive numerical experiments. In particular, online and offline data assimilation strategies can be conveniently tested and discussed with this low-order CCMM. The impact of observed chemical species concentrations on the wind field estimate can be quantitatively assessed. The impacts of the wind chaotic dynamics and of the chemical species non-chaotic but highly nonlinear dynamics on the data assimilation strategies are illustrated.
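
    The dynamical core of L95 is compact enough to sketch. The following is a generic reconstruction of the standard Lorenz-95 equations, dx_i/dt = (x_{i+1} − x_{i−2})x_{i−1} − x_i + F, with a fourth-order Runge-Kutta step; it is not the authors' L95-GRS code, and the tracer and chemistry couplings are omitted.

```python
def l95_tendency(x, forcing=8.0):
    """Lorenz-95 tendencies on a periodic ring of n variables."""
    n = len(x)
    return [(x[(i + 1) % n] - x[i - 2]) * x[i - 1] - x[i] + forcing
            for i in range(n)]

def rk4_step(x, dt, forcing=8.0):
    """One classical fourth-order Runge-Kutta step of size dt."""
    f = lambda y: l95_tendency(y, forcing)
    k1 = f(x)
    k2 = f([xi + dt / 2 * ki for xi, ki in zip(x, k1)])
    k3 = f([xi + dt / 2 * ki for xi, ki in zip(x, k2)])
    k4 = f([xi + dt * ki for xi, ki in zip(x, k3)])
    return [xi + dt / 6 * (a + 2 * b + 2 * c + d)
            for xi, a, b, c, d in zip(x, k1, k2, k3, k4)]
```

    Perturbing the uniform state x_i = F and integrating such a model forward is the usual way to generate the chaotic "truth" trajectories on which data assimilation schemes are tested.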

  4. HOM (higher-order mode) test of the storage ring single-cell cavity with a 20-MeV e- beam for the Advanced Photon Source (APS)

    International Nuclear Information System (INIS)

    Song, J.; Kang, Y.W.; Kustom, R.

    1993-01-01

    To test the effectiveness of damping techniques for the APS storage ring single-cell cavity, a beamline has been designed and assembled to use the ANL Chemistry Division linac beam (20 MeV, FWHM of 20 ps). A single-cell cavity will be excited by the electron beam to investigate the effect on higher-order modes (HOMs) with and without coaxial dampers (H-loop damper, E-probe damper) and wideband aperture dampers. In order for the beam to propagate on- and off-center of the cavity, the beamline consists of two sections -- a beam collimating section and a cavity measurement section -- separated by two double aluminum-foil windows. RF cavity measurements were made with coupling loops and E-probes. The results are compared with both TBCI calculations and 'cold' measurements using the bead-perturbation method. The data acquisition system and beam diagnostics will be described in a separate paper

  5. Properties of hypothesis testing techniques and (Bayesian) model selection for exploration-based and theory-based (order-restricted) hypotheses.

    Science.gov (United States)

    Kuiper, Rebecca M; Nederhoff, Tim; Klugkist, Irene

    2015-05-01

    In this paper, the performance of six types of techniques for comparisons of means is examined. These six emerge from the distinction between the method employed (hypothesis testing, model selection using information criteria, or Bayesian model selection) and the set of hypotheses that is investigated (a classical, exploration-based set of hypotheses containing equality constraints on the means, or a theory-based limited set of hypotheses with equality and/or order restrictions). A simulation study is conducted to examine the performance of these techniques. We demonstrate that, if one has specific, a priori specified hypotheses, confirmation (i.e., investigating theory-based hypotheses) has advantages over exploration (i.e., examining all possible equality-constrained hypotheses). Furthermore, examining reasonable order-restricted hypotheses has more power to detect the true effect/non-null hypothesis than evaluating only equality restrictions. Additionally, when investigating more than one theory-based hypothesis, model selection is preferred over hypothesis testing. Because of the first two results, we further examine the techniques that are able to evaluate order restrictions in a confirmatory fashion by examining their performance when the homogeneity of variance assumption is violated. Results show that the techniques are robust to heterogeneity when the sample sizes are equal. When the sample sizes are unequal, the performance is affected by heterogeneity. The size and direction of the deviations from the baseline, where there is no heterogeneity, depend on the effect size (of the means) and on the trend in the group variances with respect to the ordering of the group sizes. Importantly, the deviations are less pronounced when the group variances and sizes exhibit the same trend (e.g., are both increasing with group number). © 2014 The British Psychological Society.

  6. Pilot study to verify the calibration of electrometers

    International Nuclear Information System (INIS)

    Becker, P.; Meghzifene, A.

    2002-01-01

    Since the National Laboratory for Electrical Measurements has not yet developed its capability for the standardization of the small electrical charges produced by DC, the IRD is trying to verify its standardization procedures for electrical charge through a comparison programme. This subject was discussed with a major electrometer manufacturer, which has offered to provide, free of charge, three of its electrometer calibration standards for a pilot run. The model to be provided consists of four calibrated resistors and two calibrated capacitors, covering the charge/current range of interest. For producing charge or current, a standard DC voltage must be applied to these components. Since practically all modern electrometers measure using a virtual ground, this methodology is viable. The IRD, in collaboration with the IAEA, wishes to invite interested laboratories to participate in this pilot comparison programme. This exercise is expected to be useful for all participants and will hopefully open the way for the establishment of routine comparisons in this area. The results will be discussed and published in an appropriate journal. Interested institutions should contact Mr. Paulo H. B. Becker directly through e-mail (pbecker at ird.gov.br) or fax +55 21 24421950, informing him of the model and manufacturer of the electrometer to be used for the pilot study, and discuss all practical details. (author)

  7. Orbitally invariant internally contracted multireference unitary coupled cluster theory and its perturbative approximation: theory and test calculations of second order approximation.

    Science.gov (United States)

    Chen, Zhenhua; Hoffmann, Mark R

    2012-07-07

    A unitary wave operator, exp(G), with G† = −G, is considered to transform a multiconfigurational reference wave function Φ to the potentially exact, within the basis set limit, wave function Ψ = exp(G)Φ. To obtain a useful approximation, the Hausdorff expansion of the similarity-transformed effective Hamiltonian, exp(−G)H exp(G), is truncated at second order and the excitation manifold is limited; an additional separate perturbation approximation can also be made. In the perturbation approximation, which we refer to as multireference unitary second-order perturbation theory (MRUPT2), the Hamiltonian operator in the highest-order commutator is approximated by a Møller-Plesset-type one-body zero-order Hamiltonian. If a complete active space self-consistent field wave function is used as reference, then the energy is invariant under orbital rotations within the inactive, active, and virtual orbital subspaces for both the second-order unitary coupled cluster method and its perturbative approximation. Furthermore, the redundancies of the excitation operators are addressed in a novel way, which is potentially more efficient compared to the usual full diagonalization of the metric of the excited configurations. Despite the loss of rigorous size-extensivity, possibly due to the use of a variational approach rather than a projective one in the solution of the amplitudes, test calculations show that the size-extensivity errors are very small. Compared to other internally contracted multireference perturbation theories, MRUPT2 only needs reduced density matrices up to three-body terms, even with a non-complete active space reference wave function, when two-body excitations within the active orbital subspace are involved in the wave operator, exp(G). Both the coupled cluster and perturbation theory variants are amenable to large, incomplete model spaces. Applications to some widely studied model systems that can be problematic because of geometry-dependent quasidegeneracy, H4, P4

  8. Radiative Transfer Theory Verified by Controlled Laboratory Experiments

    Science.gov (United States)

    Mishchenko, Michael I.; Goldstein, Dennis H.; Chowdhary, Jacek; Lompado, Arthur

    2013-01-01

    We report the results of high-accuracy controlled laboratory measurements of the Stokes reflection matrix for suspensions of submicrometer-sized latex particles in water and compare them with the results of a numerically exact computer solution of the vector radiative transfer equation (VRTE). The quantitative performance of the VRTE is monitored by increasing the volume packing density of the latex particles from 2% to 10%. Our results indicate that the VRTE can be applied safely to random particulate media with packing densities up to 2%. VRTE results for packing densities of the order of 5% should be taken with caution, whereas the polarized bidirectional reflectivity of suspensions with larger packing densities cannot be accurately predicted. We demonstrate that a simple modification of the phase matrix entering the VRTE based on the so-called static structure factor can be a promising remedy that deserves further examination.

  9. Alternate approaches to verifying the structural adequacy of the Defense High Level Waste Shipping Cask

    International Nuclear Information System (INIS)

    Zimmer, A.; Koploy, M.

    1991-12-01

    In the early 1980s, the US Department of Energy/Defense Programs (DOE/DP) initiated a project to develop a safe and efficient transportation system for defense high level waste (DHLW). A long-standing objective of the DHLW transportation project is to develop a truck cask that represents the leading edge of cask technology as well as one that fully complies with all applicable DOE, Nuclear Regulatory Commission (NRC), and Department of Transportation (DOT) regulations. General Atomics (GA) designed the DHLW Truck Shipping Cask using state-of-the-art analytical techniques verified by model testing performed by Sandia National Laboratories (SNL). The analytical techniques include two approaches, inelastic analysis and elastic analysis. This topical report presents the results of the two analytical approaches and the model testing results. The purpose of this work is to show that there are two viable analytical alternatives to verify the structural adequacy of a Type B package and to obtain an NRC license. In addition, these data will help to support the future acceptance by the NRC of inelastic analysis as a tool in packaging design and licensing

  10. Tests of Enhanced Leading Order QCD in W Boson plus Jet Production in 1.96-TeV Proton-Antiproton Collisions

    Energy Technology Data Exchange (ETDEWEB)

    Tsuno, Soushi [Univ. of Tsukuba (Japan)

    2004-01-01

    The authors have studied the W + ≥ n jets process in the Tevatron Run II experiment. The data used correspond to a total integrated luminosity of 72 pb⁻¹ taken from March 2002 through January 2003. The lowest-order QCD predictions have been tested with a new prescription for the parton-jet matching, which allows one to construct the enhanced LO phase space. With this procedure, one gets unique results that do not depend on the unphysical bias of the kinematical cuts introduced to avoid the collinear/infrared divergences in the calculation; that is, one can obtain meaningful results in the lowest-order prediction. Controlling the W boson plus jets event samples with the enhanced lowest-order prediction will lead to smaller systematic uncertainties than the naive prediction without any treatment of the collinear/infrared divergences. They expect their method will also be useful for producing samples with small systematic uncertainties for the background estimates in the top quark analysis. They found good agreement between data and theory in typical kinematic distributions. The number of events for each inclusive sample up to 3 jets is compared with Monte Carlo calculations. A comparison with Run I results is also presented. This is the first such result for the CDF Run II experiment.

  11. MUSE: An Efficient and Accurate Verifiable Privacy-Preserving Multikeyword Text Search over Encrypted Cloud Data

    Directory of Open Access Journals (Sweden)

    Zhu Xiangyang

    2017-01-01

    With the development of cloud computing, outsourcing services to clouds has become a popular business model. However, because data storage and computing are completely outsourced to the cloud service provider, sensitive data of data owners are exposed, which could lead to serious privacy disclosure. In addition, unexpected events, such as software bugs and hardware failures, could cause incomplete or incorrect results to be returned from clouds. In this paper, we propose an efficient and accurate verifiable privacy-preserving multikeyword text search over encrypted cloud data based on hierarchical agglomerative clustering, named MUSE. To improve the efficiency of text searching, we propose a novel index structure, the HAC-tree, which is based on a hierarchical agglomerative clustering method and tends to gather high-relevance documents in clusters. Based on the HAC-tree, a noncandidate-pruning depth-first search algorithm is proposed, which filters out unqualified subtrees and thus accelerates the search process. The secure inner product algorithm is used to encrypt the HAC-tree index and the query vector. Meanwhile, a completeness verification algorithm is given to verify search results. Experimental results demonstrate that the proposed method outperforms the existing works, DMRS and MRSE-HCI, in efficiency and accuracy, respectively.
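
    The noncandidate-pruning idea — skip any subtree whose stored relevance upper bound cannot beat the current k-th best score — can be sketched generically. This is our plaintext illustration of the pruning principle only, not MUSE's actual HAC-tree, node layout, or encrypted scoring:

```python
import heapq

def topk_pruned(node, k, heap=None):
    """Collect the k highest leaf scores via depth-first search,
    skipping any subtree whose precomputed upper bound ("bound")
    cannot beat the current k-th best score (a min-heap root)."""
    if heap is None:
        heap = []
    if len(heap) == k and node["bound"] <= heap[0]:
        return heap                      # prune: subtree cannot qualify
    if "score" in node:                  # leaf = one document
        if len(heap) < k:
            heapq.heappush(heap, node["score"])
        elif node["score"] > heap[0]:
            heapq.heapreplace(heap, node["score"])
    for child in node.get("children", ()):
        topk_pruned(child, k, heap)
    return heap
```

    Because clustering gathers high-relevance documents under shared internal nodes, tight bounds let whole clusters be discarded in one comparison, which is where the speedup comes from.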

  12. A Study on Dismantling and Verifying North Korea's Nuclear Capabilities

    International Nuclear Information System (INIS)

    Kim, Young Jae; Cheon, Seong Whun

    2007-10-01

    North Korea's nuclear weapon development is a serious threat to South Korea's national security and could become a trigger to change the status quo on the Korean peninsula. Having dominated security dynamics in Northeast Asia for the last 20 years, North Korea's nuclear problem reached a key turning point when Pyongyang tested its first nuclear weapon on October 9, 2006. Despite this test, however, diplomatic efforts to resolve the nuclear issue were never abandoned, resulting in a so-called initial agreement signed at the Six-Party Talks in February 2007. With the Six-Party Talks having been held for more than four years, the six countries have had sufficient time to discuss the principal and political matters regarding the dismantlement of North Korea's nuclear weapons. Under these circumstances, this report studies the practical and detailed issues involved in dismantling the North's nuclear weapons. Specifically, in light of historical experience, the report investigates possible problems to be faced in the course of dismantlement and proposes policy measures to overcome these problems

  13. Verifying generator waste certification: NTS waste characterization QA requirements

    International Nuclear Information System (INIS)

    Williams, R.E.; Brich, R.F.

    1988-01-01

    Waste management activities managed by the US Department of Energy (DOE) at the Nevada Test Site (NTS) include the disposal of low-level wastes (LLW) and mixed waste (MW), waste which is both radioactive and hazardous. A majority of the packaged LLW is received from offsite DOE generators. Interim status for receipt of MW at the NTS Area 5 Radioactive Waste Management Site (RWMS) was received from the state of Nevada in 1987. The RWMS Mixed Waste Management Facility (MWMF) is expected to be operational in 1988 for approved DOE MW generators. The Nevada Test Site Defense Waste Acceptance Criteria and Certification Requirements (NVO-185, Revision 5) delineates waste acceptance criteria for waste disposal at the NTS. Regulation of the hazardous component of mixed waste requires the implementation of US Environmental Protection Agency (EPA) requirements pursuant to the Resource Conservation and Recovery Act (RCRA). Waste generators must implement a waste certification program to provide assurance that the disposal site waste acceptance criteria are met. The DOE/Nevada Operations Office (NV) developed guidance for generator waste certification program plans. Periodic technical audits are conducted by DOE/NV to assess performance of the waste certification programs. The audit scope is patterned from the waste certification program plan guidance as it integrates and provides a common format for the applicable criteria. The criteria focus on items and activities critical to processing, characterizing, packaging, certifying, and shipping waste

  14. Verifying a computational method for predicting extreme ground motion

    Science.gov (United States)

    Harris, R.A.; Barall, M.; Andrews, D.J.; Duan, B.; Ma, S.; Dunham, E.M.; Gabriel, A.-A.; Kaneko, Y.; Kase, Y.; Aagaard, Brad T.; Oglesby, D.D.; Ampuero, J.-P.; Hanks, T.C.; Abrahamson, N.

    2011-01-01

    In situations where seismological data is rare or nonexistent, computer simulations may be used to predict ground motions caused by future earthquakes. This is particularly practical in the case of extreme ground motions, where engineers of special buildings may need to design for an event that has not been historically observed but which may occur in the far-distant future. Once the simulations have been performed, however, they still need to be tested. The SCEC-USGS dynamic rupture code verification exercise provides a testing mechanism for simulations that involve spontaneous earthquake rupture. We have performed this examination for the specific computer code that was used to predict maximum possible ground motion near Yucca Mountain. Our SCEC-USGS group exercises have demonstrated that the specific computer code that was used for the Yucca Mountain simulations produces similar results to those produced by other computer codes when tackling the same science problem. We also found that the 3D ground motion simulations produced smaller ground motions than the 2D simulations.

  15. On a Test of Hypothesis to Verify the Operating Risk Due to Accountancy Errors

    Directory of Open Access Journals (Sweden)

    Paola Maddalena Chiodini

    2014-12-01

    According to Statement on Auditing Standards (SAS) No. 39 (AU 350.01), audit sampling is defined as “the application of an audit procedure to less than 100% of the items within an account balance or class of transactions for the purpose of evaluating some characteristic of the balance or class”. The audit process develops in different steps: some are not susceptible to sampling procedures, while others may be carried out using sampling techniques. The auditor may also be interested in two types of accounting error: the number of incorrect records in the sample, which may be indicative of possible fraud when it exceeds a given threshold (the natural error rate), and the mean amount of monetary error found in incorrect records. The aim of this study is to monitor both types of error jointly through an appropriate system of hypotheses, with particular attention to the second type of error, which indicates the risk of failing to report errors that exceed the upper precision limits.
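
    The first type of error above — a count of incorrect sample records exceeding the natural error rate — is naturally monitored with a binomial tail probability. A minimal stdlib sketch of that tail (our own illustration, not the paper's test procedure):

```python
from math import comb

def p_at_least(k, n, p):
    """P(X >= k) for X ~ Binomial(n, p): the chance of finding k or
    more erroneous records in a sample of n when the true per-record
    error rate is p."""
    return sum(comb(n, j) * p ** j * (1 - p) ** (n - j)
               for j in range(k, n + 1))
```

    If this tail probability, evaluated at the natural error rate, falls below the chosen significance level, the observed count of incorrect records is evidence of an elevated error rate and hence of possible fraud.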

  16. Evaluation of the ability of rod drop tests to verify the stability margins in FTR

    International Nuclear Information System (INIS)

    Harris, R.A.; Sevenich, R.A.

    1976-01-01

    Predictions of the stability characteristics of FTR indicate that the reactor can be easily controlled even under the worst possible conditions. Nevertheless, experimental verification and monitoring of these characteristics will be performed during operation of the reactor. An initial evaluation of rod drop experiments which could possibly provide this verification is presented

  17. An Initial Examination for Verifying Separation Algorithms by Simulation

    Science.gov (United States)

    White, Allan L.; Neogi, Natasha; Herencia-Zapana, Heber

    2012-01-01

    An open question in algorithms for aircraft is what can be validated by simulation where the simulation shows that the probability of undesirable events is below some given level at some confidence level. The problem is including enough realism to be convincing while retaining enough efficiency to run the large number of trials needed for high confidence. The paper first proposes a goal based on the number of flights per year in several regions. The paper examines the probabilistic interpretation of this goal and computes the number of trials needed to establish it at an equivalent confidence level. Since any simulation is likely to consider the algorithms for only one type of event and there are several types of events, the paper examines under what conditions this separate consideration is valid. This paper is an initial effort, and as such, it considers separation maneuvers, which are elementary but include numerous aspects of aircraft behavior. The scenario includes decisions under uncertainty since the position of each aircraft is only known to the other by broadcasting where GPS believes each aircraft to be (ADS-B). Each aircraft operates under feedback control with perturbations. It is shown that a scenario three or four orders of magnitude more complex is feasible. The question of what can be validated by simulation remains open, but there is reason to be optimistic.
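
    The trial-count computation sketched above follows from requiring that n consecutive failure-free trials be improbable unless the per-trial failure probability is below the target. A sketch under that zero-observed-failure assumption (the function name is ours):

```python
import math

def trials_needed(p_bound, confidence):
    """Smallest n with (1 - p_bound)**n <= 1 - confidence: if all n
    independent trials succeed, one can claim the per-trial failure
    probability is below p_bound at the stated confidence level."""
    return math.ceil(math.log(1.0 - confidence) / math.log(1.0 - p_bound))
```

    For example, bounding the failure probability per encounter near 10⁻⁶ at 95% confidence already requires on the order of three million simulated trials, which is why the trade-off between realism and simulation efficiency dominates the discussion above.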

  18. Model Checking Artificial Intelligence Based Planners: Even the Best Laid Plans Must Be Verified

    Science.gov (United States)

    Smith, Margaret H.; Holzmann, Gerard J.; Cucullu, Gordon C., III; Smith, Benjamin D.

    2005-01-01

    Automated planning systems (APS) are gaining acceptance for use on NASA missions, as evidenced by the planners flown on missions such as Orbiter and Deep Space 1, both of which were commanded by onboard planning systems. The planning system takes high-level goals and expands them onboard into a detailed sequence of actions that the spacecraft executes. The system must be verified to ensure that the automatically generated plans achieve the goals as expected and do not generate actions that would harm the spacecraft or mission. These systems are typically tested using empirical methods. Formal methods, such as model checking, offer exhaustive or measurable test coverage, which leads to much greater confidence in correctness. This paper describes a formal method based on the SPIN model checker. This method guarantees that possible plans meet certain desirable properties. We express the input model in Promela, the language of SPIN, and express the properties of desirable plans formally.
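    The exhaustive style of checking that SPIN performs can be illustrated with a toy brute-force search; the spacecraft model, actions, and safety property below are invented for illustration (the paper's actual method uses Promela models, not Python):

    ```python
    from itertools import product

    # Hypothetical toy model: state is (battery_level, instrument_on).
    ACTIONS = ("charge", "power_on", "observe")

    def step(state, action):
        battery, inst_on = state
        if action == "charge":
            return (min(battery + 2, 5), inst_on)
        if action == "power_on":
            return (battery - 1, True)
        if action == "observe":
            return (battery - 2, inst_on)
        raise ValueError(action)

    def safe(state):
        battery, _ = state
        return battery >= 0          # property: never deplete the battery

    def check_all_plans(start, depth):
        """Exhaustively explore every action sequence up to `depth`,
        returning the first plan that violates the property, or None."""
        for plan in product(ACTIONS, repeat=depth):
            state = start
            for action in plan:
                state = step(state, action)
                if not safe(state):
                    return plan
        return None

    counterexample = check_all_plans((3, False), depth=3)
    ```

    A model checker does the same exploration far more efficiently, with state hashing and temporal-logic properties instead of a plain predicate.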

  19. Spot: A Programming Language for Verified Flight Software

    Science.gov (United States)

    Bocchino, Robert L., Jr.; Gamble, Edward; Gostelow, Kim P.; Some, Raphael R.

    2014-01-01

    The C programming language is widely used for programming space flight software and other safety-critical real time systems. C, however, is far from ideal for this purpose: as is well known, it is both low-level and unsafe. This paper describes Spot, a language derived from C for programming space flight systems. Spot aims to maintain compatibility with existing C code while improving the language and supporting verification with the SPIN model checker. The major features of Spot include actor-based concurrency, distributed state with message passing and transactional updates, and annotations for testing and verification. Spot also supports domain-specific annotations for managing spacecraft state, e.g., communicating telemetry information to the ground. We describe the motivation and design rationale for Spot, give an overview of the design, provide examples of Spot's capabilities, and discuss the current status of the implementation.

  20. A suite of exercises for verifying dynamic earthquake rupture codes

    Science.gov (United States)

    Harris, Ruth A.; Barall, Michael; Aagaard, Brad T.; Ma, Shuo; Roten, Daniel; Olsen, Kim B.; Duan, Benchun; Liu, Dunyu; Luo, Bin; Bai, Kangchen; Ampuero, Jean-Paul; Kaneko, Yoshihiro; Gabriel, Alice-Agnes; Duru, Kenneth; Ulrich, Thomas; Wollherr, Stephanie; Shi, Zheqiang; Dunham, Eric; Bydlon, Sam; Zhang, Zhenguo; Chen, Xiaofei; Somala, Surendra N.; Pelties, Christian; Tago, Josue; Cruz-Atienza, Victor Manuel; Kozdon, Jeremy; Daub, Eric; Aslam, Khurram; Kase, Yuko; Withers, Kyle; Dalguer, Luis

    2018-01-01

    We describe a set of benchmark exercises that are designed to test if computer codes that simulate dynamic earthquake rupture are working as intended. These types of computer codes are often used to understand how earthquakes operate, and they produce simulation results that include earthquake size, amounts of fault slip, and the patterns of ground shaking and crustal deformation. The benchmark exercises examine a range of features that scientists incorporate in their dynamic earthquake rupture simulations. These include implementations of simple or complex fault geometry, off‐fault rock response to an earthquake, stress conditions, and a variety of formulations for fault friction. Many of the benchmarks were designed to investigate scientific problems at the forefronts of earthquake physics and strong ground motions research. The exercises are freely available on our website for use by the scientific community.

  1. A two-dimensional deformable phantom for quantitatively verifying deformation algorithms

    Energy Technology Data Exchange (ETDEWEB)

    Kirby, Neil; Chuang, Cynthia; Pouliot, Jean [Department of Radiation Oncology, University of California San Francisco, San Francisco, California 94143-1708 (United States)

    2011-08-15

    Purpose: The incorporation of deformable image registration into the treatment planning process is rapidly advancing. For this reason, the methods used to verify the underlying deformation algorithms must evolve equally fast. This manuscript proposes a two-dimensional deformable phantom, which can objectively verify the accuracy of deformation algorithms, as the next step for improving these techniques. Methods: The phantom represents a single plane of the anatomy for a head and neck patient. Inflation of a balloon catheter inside the phantom simulates tumor growth. CT and camera images of the phantom are acquired before and after its deformation. Nonradiopaque markers reside on the surface of the deformable anatomy and are visible through an acrylic plate, which enables an optical camera to measure their positions; thus, establishing the ground-truth deformation. This measured deformation is directly compared to the predictions of deformation algorithms, using several similarity metrics. The ratio of the number of points with more than a 3 mm deformation error over the number that are deformed by more than 3 mm is used for an error metric to evaluate algorithm accuracy. Results: An optical method of characterizing deformation has been successfully demonstrated. For the tests of this method, the balloon catheter deforms 32 out of the 54 surface markers by more than 3 mm. Different deformation errors result from the different similarity metrics. The most accurate deformation predictions had an error of 75%. Conclusions: The results presented here demonstrate the utility of the phantom for objectively verifying deformation algorithms and determining which is the most accurate. They also indicate that the phantom would benefit from more electron density heterogeneity. The reduction of the deformable anatomy to a two-dimensional system allows for the use of nonradiopaque markers, which do not influence deformation algorithms. 
This is the fundamental advantage of this phantom.
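    The error metric described above can be sketched as follows; this is a minimal illustration, and the function and variable names are ours, not the authors':

    ```python
    import math

    def deformation_error_ratio(true_disp, pred_disp, threshold_mm=3.0):
        """Ratio of markers whose predicted displacement is off by more
        than `threshold_mm` to markers the ground truth actually deforms
        by more than `threshold_mm` (2D displacement vectors in mm)."""
        errors = [math.dist(t, p) for t, p in zip(true_disp, pred_disp)]
        magnitudes = [math.hypot(*t) for t in true_disp]
        wrong = sum(e > threshold_mm for e in errors)
        deformed = sum(m > threshold_mm for m in magnitudes)
        return wrong / deformed
    ```

    With the study's numbers (32 of 54 markers deformed by more than 3 mm), the denominator would be 32 for every algorithm evaluated.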

  2. ICG Utilities (Manitoba) Ltd. application for an order determining rate base, rate of return and rates based on a 1989 mid-year historic test year and confirmation of Board orders 1/90 and 2/90

    Energy Technology Data Exchange (ETDEWEB)

    1990-08-28

    In the fall of 1989, ICG Utilities (Manitoba) Ltd. and Greater Winnipeg Gas Company (collectively referred to as the Company) applied to the Public Utilities Board for Interim Refundable Rate increases effective January 1, 1990. The Board was satisfied that the Company had proved a prima facie case for financial need and, accordingly, the Board approved rates on an interim basis to become effective on January 1, 1990. On March 1, 1990, the Company filed its General Rate Application with the Board on the basis of a 1989 mid-year historic test year with rates to be effective September 1, 1990. In the amended application, the Company sought the Board's approval to recover a total revenue deficiency of $17.0 million and earn an overall rate of return of 12.9%, which includes a requested return on shareholders' equity of 14%, an increase from the 13% previously approved. The Company stated that absent confirmation of the interim rates and approval of increased rates effective September 1, 1990, the overall rate of return would be 8.2%, which would yield a return on shareholders' equity of 2.27%. On an overall basis, the Board reduced the Company's requested increase in revenue requirement from 7.12% to 5.24%. Of the 5.24% increase allowed by the Board, 3.07% is currently being recovered through rates which took effect on January 1, 1990, and the remaining 2.17% increase is accommodated by this order. The Board also dealt with several other matters including the following: a change in the due date for bill payment; the Company's request for Board endorsement of a lock-off policy; the Company's capital expenditure program; the efficiencies in the Company's operations and capital program; and retroactive recovery of the revenue deficiency from January 1, 1990 to August 31, 1990. 33 tabs.

  3. Is it possible to verify directly a proton-treatment plan using positron emission tomography?

    International Nuclear Information System (INIS)

    Vynckier, S.; Derreumaux, S.; Richard, F.; Wambersie, A.; Bol, A.; Michel, C.

    1993-01-01

    A PET camera is used to visualize the positron activity induced during proton-beam therapy in order to verify the proton-treatment plans directly. The positron emitters created are predominantly ¹⁵O and ¹¹C, whose total activity amounts to 12 MBq after an irradiation with 85 MeV protons delivering 3 Gy in a volume of approximately 300 cm³. Although this method is a useful verification of patient setup, care must be taken when deriving dose distributions from activity distributions. Correlating the two quantities is difficult; moreover, in the last few millimeters of their range, protons no longer activate tissue. Because of the short half-lives, the PET camera must be located close to the treatment facility. (author) 17 refs
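    The practical consequence of the short half-lives can be illustrated with a simple decay calculation; the half-life values below are standard literature approximations, not figures from the paper:

    ```python
    import math

    # Approximate half-lives in seconds (standard literature values).
    HALF_LIFE_S = {"O-15": 122.2, "C-11": 1221.8}

    def remaining_activity(a0_mbq: float, isotope: str, delay_s: float) -> float:
        """Activity left after `delay_s` seconds of pure radioactive decay,
        A(t) = A0 * 2**(-t / T_half)."""
        return a0_mbq * 2.0 ** (-delay_s / HALF_LIFE_S[isotope])

    # A 10-minute transfer to a distant PET camera loses almost all of the
    # O-15 component of a 12 MBq induced activity:
    o15_left = remaining_activity(12.0, "O-15", 600.0)
    ```

    After 600 s, roughly five ¹⁵O half-lives have elapsed, which is why the abstract insists the camera be close to the treatment room.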

  4. Validating and Verifying Biomathematical Models of Human Fatigue

    Science.gov (United States)

    Martinez, Siera Brooke; Quintero, Luis Ortiz; Flynn-Evans, Erin

    2015-01-01

    Airline pilots experience acute and chronic sleep deprivation, sleep inertia, and circadian desynchrony due to the need to schedule flight operations around the clock. This sleep loss and circadian desynchrony give rise to cognitive impairments, reduced vigilance and inconsistent performance. Several biomathematical models, based principally on patterns observed in circadian rhythms and homeostatic drive, have been developed to predict a pilot's level of fatigue or alertness. These models allow the Federal Aviation Administration (FAA) and commercial airlines to make decisions about pilot capabilities and flight schedules. Although these models have been validated in a laboratory setting, they have not been thoroughly tested in operational environments where uncontrolled factors, such as environmental sleep disrupters, caffeine use and napping, may impact actual pilot alertness and performance. We will compare the predictions of three prominent biomathematical fatigue models (the McCauley Model, the Harvard Model, and the privately sold SAFTE-FAST Model) to actual measures of alertness and performance. We collected sleep logs, movement and light recordings, psychomotor vigilance task (PVT) data, and urinary melatonin (a marker of circadian phase) from 44 pilots in a short-haul commercial airline over one month. We will statistically compare the model predictions to lapses on the PVT and to circadian phase. We will calculate the sensitivity and specificity of each model prediction under different scheduling conditions. Our findings will aid operational decision-makers in determining the reliability of each model under real-world scheduling situations.

  5. Verifying Stability of Dynamic Soft-Computing Systems

    Science.gov (United States)

    Wen, Wu; Napolitano, Marcello; Callahan, John

    1997-01-01

    Soft computing is a general term for algorithms that learn from human knowledge and mimic human skills. Examples of such algorithms are fuzzy inference systems and neural networks. Many applications, especially in control engineering, have demonstrated their appropriateness for building intelligent systems that are flexible and robust. Although recent research has shown that a certain class of neuro-fuzzy controllers can be proven bounded and stable, these results are implementation dependent and difficult to apply to the design and validation process. Many practitioners adopt a trial-and-error approach to system validation or resort to exhaustive testing using prototypes. In this paper, we describe our ongoing research toward establishing the necessary theoretical foundations as well as building practical tools for the verification and validation of soft-computing systems. A unified model for general neuro-fuzzy systems is adopted. Classic nonlinear control theory and recent results on its application to neuro-fuzzy systems are incorporated and applied to the unified model. It is hoped that general tools can be developed to help designers visualize and manipulate the regions of stability and boundedness, much the same way Bode plots and root locus plots have aided conventional control design and validation.

  6. Maximizing Statistical Power When Verifying Probabilistic Forecasts of Hydrometeorological Events

    Science.gov (United States)

    DeChant, C. M.; Moradkhani, H.

    2014-12-01

    Hydrometeorological events (i.e. floods, droughts, precipitation) are increasingly being forecasted probabilistically, owing to the uncertainties in the underlying causes of the phenomenon. In these forecasts, the probability of the event, over some lead time, is estimated based on some model simulations or predictive indicators. By issuing probabilistic forecasts, agencies may communicate the uncertainty in the event occurring. Assuming that the assigned probability of the event is correct, which is referred to as a reliable forecast, the end user may perform some risk management based on the potential damages resulting from the event. Alternatively, an unreliable forecast may give false impressions of the actual risk, leading to improper decision making when protecting resources from extreme events. Due to this requisite for reliable forecasts to perform effective risk management, this study takes a renewed look at reliability assessment in event forecasts. Illustrative experiments will be presented, showing deficiencies in the commonly available approaches (Brier Score, Reliability Diagram). Overall, it is shown that the conventional reliability assessment techniques do not maximize the ability to distinguish between a reliable and unreliable forecast. In this regard, a theoretical formulation of the probabilistic event forecast verification framework will be presented. From this analysis, hypothesis testing with the Poisson-Binomial distribution is the most exact model available for the verification framework, and therefore maximizes one's ability to distinguish between a reliable and unreliable forecast. Application of this verification system was also examined within a real forecasting case study, highlighting the additional statistical power provided with the use of the Poisson-Binomial distribution.
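    A minimal sketch of the Poisson-Binomial verification idea follows, assuming independent event forecasts; the implementation details are ours, not the authors'. The exact distribution of the number of events under the stated forecast probabilities is built by dynamic programming, and an upper-tail p-value then tests whether events occurred more often than a reliable forecast would imply:

    ```python
    def poisson_binomial_pmf(probs):
        """Exact distribution of the number of successes among independent
        Bernoulli trials with heterogeneous probabilities `probs`."""
        pmf = [1.0]
        for p in probs:
            nxt = [0.0] * (len(pmf) + 1)
            for k, mass in enumerate(pmf):
                nxt[k] += mass * (1.0 - p)      # trial fails
                nxt[k + 1] += mass * p          # trial succeeds
            pmf = nxt
        return pmf

    def upper_tail_p_value(probs, observed):
        """P(X >= observed) under the forecast probabilities: a small value
        suggests the forecasts understated the true event frequency."""
        pmf = poisson_binomial_pmf(probs)
        return sum(pmf[observed:])

    # Forecasts said 10%, 20% and 30% for three events; all three occurred:
    p = upper_tail_p_value([0.1, 0.2, 0.3], 3)
    ```

    The same machinery supports two-sided tests; the point of the abstract is that this exact distribution is sharper than aggregate scores such as the Brier score.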

  7. German Children’s Use of Word Order and Case Marking to Interpret Simple and Complex Sentences: Testing Differences Between Constructions and Lexical Items

    Science.gov (United States)

    Brandt, Silke; Lieven, Elena; Tomasello, Michael

    2016-01-01

    ABSTRACT Children and adults follow cues such as case marking and word order in their assignment of semantic roles in simple transitives (e.g., the dog chased the cat). It has been suggested that the same cues are used for the interpretation of complex sentences, such as transitive relative clauses (RCs) (e.g., that’s the dog that chased the cat) (Bates, Devescovi, & D’Amico, 1999). We used a pointing paradigm to test German-speaking 3-, 4-, and 6-year-old children’s sensitivity to case marking and word order in their interpretation of simple transitives and transitive RCs. In Experiment 1, case marking was ambiguous. The only cue available was word order. In Experiment 2, case was marked on lexical NPs or demonstrative pronouns. In Experiment 3, case was marked on lexical NPs or personal pronouns. Whereas the younger children mainly followed word order, the older children were more likely to base their interpretations on the more reliable case-marking cue. In most cases, children from both age groups were more likely to use these cues in their interpretation of simple transitives than in their interpretation of transitive RCs. Finally, children paid more attention to nominative case when it was marked on first-person personal pronouns than when it was marked on third-person lexical NPs or demonstrative pronouns, such as der Löwe ‘the-NOM lion’ or der ‘he-NOM.’ They were able to successfully integrate this case-marking cue in their sentence processing even when it appeared late in the sentence. We discuss four potential reasons for these differences across development, constructions, and lexical items. (1) Older children are relatively more sensitive to cue reliability. (2) Word order is more reliable in simple transitives than in transitive RCs. (3) The processing of case marking might initially be item-specific. (4) The processing of case marking might depend on its saliency and position in the sentence. PMID:27019652

  8. Study and survey of assembling parameters to a radioactive source production laboratory used to verify equipment

    International Nuclear Information System (INIS)

    Gauglitz, Erica

    2010-01-01

    This paper presents a survey of parameters for the proper and safe design of flooring, doors, windows, fume hoods and other items in a radiochemical laboratory. The layout of each item follows guidelines and national standards of the National Commission of Nuclear Energy (CNEN) and the International Atomic Energy Agency (IAEA), aiming to ensure the radiological protection of workers and the environment. The adequate arrangement of items in the radiochemical laboratory ensures quality and safety in the production of ⁵⁷Co, ¹³⁷Cs and ¹³³Ba sealed radioactive sources, with activities of 185, 9.3 and 5.4 MBq, respectively. These sources are used to verify activity meter equipment and should be available throughout the Nuclear Medicine Center, following the recommendations of standard CNEN-NN-3.05, "Requirements for Radiation Protection and Safety Services for Nuclear Medicine", to verify the activity of radiopharmaceuticals administered to patients for diagnosis and therapy. Verification of activity measuring equipment will include accuracy, reproducibility and linearity tests, whose results should fall within the limits specified in standard CNEN-NN-3.05. (author)

  9. 31 CFR 363.14 - How will you verify my identity?

    Science.gov (United States)

    2010-07-01

    31 CFR 363.14 (Money and Finance: Treasury, revised as of 2010-07-01). How will you verify my identity? (a) Individual. When you establish an account, we may use a verification service to verify your identity using information you provide about yourself on the online application. …

  10. The influence of age, sex, bulb position, visual feedback, and the order of testing on maximum anterior and posterior tongue strength and endurance in healthy Belgian adults.

    Science.gov (United States)

    Vanderwegen, Jan; Guns, Cindy; Van Nuffelen, Gwen; Elen, Rik; De Bodt, Marc

    2013-06-01

    This study collected data on the maximum anterior and posterior tongue strength and endurance in 420 healthy Belgians across the adult life span to explore the influence of age, sex, bulb position, visual feedback, and order of testing. Measures were obtained using the Iowa Oral Performance Instrument (IOPI). Older participants (more than 70 years old) demonstrated significantly lower strength than younger persons at the anterior and the posterior tongue. Endurance remains stable throughout the major part of life. Gender influence remains significant but minor throughout life, with males showing higher pressures and longer endurance. The anterior part of the tongue has both higher strength and longer endurance than the posterior part. Mean maximum tongue pressures in this European population seem to be lower than American values and are closer to Asian results. The normative data can be used for objective assessment of tongue weakness and subsequent therapy planning of dysphagic patients.
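    Normative data of this kind is typically used to standardize an individual patient's measurement; the sketch below shows the general pattern, with placeholder normative values (the study's actual tables and units, kPa for the IOPI, are not reproduced here):

    ```python
    def tongue_strength_z(pressure_kpa: float, norm_mean: float, norm_sd: float) -> float:
        """Standardized score of a patient's maximum tongue pressure
        against age- and sex-matched normative data.  The mean and SD
        passed in are placeholders, not numbers from the study."""
        return (pressure_kpa - norm_mean) / norm_sd

    def flag_weakness(pressure_kpa: float, norm_mean: float, norm_sd: float,
                      cutoff: float = -2.0) -> bool:
        """Flag tongue weakness when the patient falls more than two
        standard deviations below the normative mean (an illustrative
        cutoff, not a clinical recommendation)."""
        return tongue_strength_z(pressure_kpa, norm_mean, norm_sd) < cutoff
    ```

    The abstract's point that European norms differ from American values is exactly why the matched normative table, not a universal constant, must supply `norm_mean` and `norm_sd`.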

  11. Female PFP patients present alterations in eccentric muscle activity but not the temporal order of activation of the vastus lateralis muscle during the single leg triple hop test.

    Science.gov (United States)

    Kalytczak, Marcelo Martins; Lucareli, Paulo Roberto Garcia; Dos Reis, Amir Curcio; Bley, André Serra; Biasotto-Gonzalez, Daniela Aparecida; Correa, João Carlos Ferrari; Politti, Fabiano

    2018-04-07

    This study aimed to compare the concentric and eccentric activity and the temporal order of peak activity of the hip and knee muscles between women with patellofemoral pain (PFP) and healthy women during the single leg triple hop test (SLTHT). Electromyographic (EMG) and kinematic data were collected from 14 healthy women (CG) and 14 women diagnosed with PFP (PFG) during a single session of the single leg triple hop test. Integrated surface electromyography (iEMG) data of the hip and knee muscles in the eccentric and concentric phases, and the length of time each muscle needed to reach its maximal peak of activity, were calculated. The iEMG in the eccentric phase was significantly higher (p < 0.05) than in the concentric phase for the gluteus maximus and gluteus medius muscles (CG and PFG) and for the vastus lateralis muscle (PFG). The vastus lateralis muscle was the first muscle to reach its highest peak of activity in the PFG, and the third to reach this peak in the CG. In the present study, the activity of the vastus lateralis muscle during the eccentric phase of the jump was greater than during the concentric phase, and there was a temporal anticipation of its peak in activity among women with PFP. Copyright © 2018 Elsevier B.V. All rights reserved.

  12. New verifiable stationarity concepts for a class of mathematical programs with disjunctive constraints.

    Science.gov (United States)

    Benko, Matúš; Gfrerer, Helmut

    2018-01-01

    In this paper, we consider a sufficiently broad class of non-linear mathematical programs with disjunctive constraints, which, e.g., include mathematical programs with complementarity/vanishing constraints. We present an extension of the concept of [Formula: see text]-stationarity which can be easily combined with the well-known notion of M-stationarity to obtain the stronger property of so-called [Formula: see text]-stationarity. We show how the property of [Formula: see text]-stationarity (and thus also of M-stationarity) can be efficiently verified for the considered problem class by computing [Formula: see text]-stationary solutions of a certain quadratic program. We further consider the situation in which the point to be tested for [Formula: see text]-stationarity is not known exactly but is approximated by some convergent sequence, as is usually the case when applying a numerical method.

  13. Reference Material Properties and Standard Problems to Verify the Fuel Performance Models Ver 1.0

    International Nuclear Information System (INIS)

    Yang, Yong Sik; Kim, Jae Yong; Koo, Yang Hyun

    2010-12-01

    All fuel performance models must be validated against in-pile and out-pile tests. However, model validation requires much effort and time to confirm correctness. In many fields, new performance models and codes are confirmed by a code-to-code benchmarking process using simplified standard problems. At present, the development project for DUOS, a steady-state fuel performance analysis code for dual-cooled annular fuel, is in progress, and a new FEM module has been developed to analyze fuel performance during transient periods. In addition, a verification process is planned to examine the correctness of the new models and module by comparison with commercial finite element analysis codes such as ADINA, ABAQUS and ANSYS. This report contains the results of the unification of material properties and the establishment of standard problems to verify the newly developed models against commercial FEM codes.

  14. Verified Gaming

    DEFF Research Database (Denmark)

    Kiniry, Joseph Roland; Zimmerman, Daniel

    2011-01-01

    In recent years, several Grand Challenges (GCs) of computing have been identified and expounded upon by various professional organizations in the U.S. and England. These GCs are typically very difficult problems that will take many hundreds, or perhaps thousands, of man-years to solve. Researchers… …falls every year and any mention of mathematics in the classroom seems to frighten students away. So the question is: How do we attract new students in computing to the area of dependable software systems? Over the past several years at three universities we have experimented with the use of computer games…

  15. Dynamic simulation platform to verify the performance of the reactor regulating system for a research reactor

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    2015-07-01

    Digital instrumentation and control (I&C) techniques are being introduced in newly constructed research reactors and in the life extension of older ones. Digital systems are easy to change and optimize, but a process for validating them is required. Also, to reduce project risk and cost, we have to make sure that the configuration and control functions are correct before the commissioning phase of a research reactor. For this purpose, simulators have been widely used in developing control systems in the automotive and aerospace industries. In the literature, however, very few examples can be found of testing the control system of a research reactor with a simulator. Therefore, this paper proposes a simulation platform to verify the performance of the RRS (Reactor Regulating System) for a research reactor. The platform consists of a reactor simulation model and an interface module. It was applied to the I&C upgrade project of a TRIGA reactor, where many problems in the RRS configuration were found and solved. This experience showed that simulator-based dynamic performance testing enables significant time savings and improves economics and quality for the RRS in the system test phase. (authors)

  16. Scheduling system for test automation framework

    NARCIS (Netherlands)

    Wahyudi, Djohan

    2014-01-01

    An Interventional X-ray (iXR) system provides real-time X-ray imaging with high image clarity and low X-ray dose. After several years of development, the iXR system has become complex. In order to verify the correct operation of the system, the system integration and test group performs extensive testing.

  17. Second-order asymptotics for quantum hypothesis testing in settings beyond i.i.d.—quantum lattice systems and more

    International Nuclear Information System (INIS)

    Datta, Nilanjana; Rouzé, Cambyse; Pautrat, Yan

    2016-01-01

    Quantum Stein's lemma is a cornerstone of quantum statistics and concerns the problem of correctly identifying a quantum state, given the knowledge that it is one of two specific states (ρ or σ). It was originally derived in the asymptotic i.i.d. setting, in which arbitrarily many (say, n) identical copies of the state (ρ^{⊗n} or σ^{⊗n}) are considered to be available. In this setting, the lemma states that, for any given upper bound on the probability α_n of erroneously inferring the state to be σ, the probability β_n of erroneously inferring the state to be ρ decays exponentially in n, with the rate of decay converging to the relative entropy of the two states. The second-order asymptotics for quantum hypothesis testing, which establishes the speed of convergence of this rate of decay to its limiting value, was derived in the i.i.d. setting independently by Tomamichel and Hayashi, and by Li. We extend this result to settings beyond i.i.d. Examples of these include Gibbs states of quantum spin systems (with finite-range, translation-invariant interactions) at high temperatures, and quasi-free states of fermionic lattice gases.
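    For reference, the second-order expansion established by Tomamichel and Hayashi, and by Li, takes the following standard form (written here in conventional notation, not copied from the paper):

    ```latex
    % Optimal type-II error at fixed type-I error bound \varepsilon (i.i.d. case):
    -\log \beta_n(\varepsilon)
      = n\, D(\rho\|\sigma)
      + \sqrt{n\, V(\rho\|\sigma)}\; \Phi^{-1}(\varepsilon)
      + O(\log n),
    ```

    where D(ρ‖σ) is the quantum relative entropy, V(ρ‖σ) the quantum information variance, and Φ⁻¹ the inverse of the standard normal cumulative distribution function; the paper's contribution is extending this expansion beyond the i.i.d. setting.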

  18. Evaluation of verifiability in HAL/S. [programming language for aerospace computers

    Science.gov (United States)

    Young, W. D.; Tripathi, A. R.; Good, D. I.; Browne, J. C.

    1979-01-01

    Many of the features of HAL/S do not lend themselves to existing verification techniques, so the ability to write verifiable programs in HAL/S, a characteristic highly desirable in aerospace applications, is lacking. The methods of language evaluation are described, along with the means by which language features are evaluated for verifiability. These methods are applied in this study to various features of HAL/S to identify specific areas in which the language falls short with respect to verifiability. Some conclusions are drawn for the design of programming languages for aerospace applications, and ongoing work to identify a verifiable subset of HAL/S is described.

  19. Aperiodic order

    CERN Document Server

    Grimm, Uwe

    2017-01-01

    Quasicrystals are non-periodic solids that were discovered in 1982 by Dan Shechtman, Nobel Prize Laureate in Chemistry 2011. The mathematics that underlies this discovery or that proceeded from it, known as the theory of Aperiodic Order, is the subject of this comprehensive multi-volume series. This second volume begins to develop the theory in more depth. A collection of leading experts, among them Robert V. Moody, cover various aspects of crystallography, generalising appropriately from the classical case to the setting of aperiodically ordered structures. A strong focus is placed upon almost periodicity, a central concept of crystallography that captures the coherent repetition of local motifs or patterns, and its close links to Fourier analysis. The book opens with a foreword by Jeffrey C. Lagarias on the wider mathematical perspective and closes with an epilogue on the emergence of quasicrystals, written by Peter Kramer, one of the founders of the field.

  20. Test

    DEFF Research Database (Denmark)

    Bendixen, Carsten

    2014-01-01

    A brief, introductory, perspective-setting and concept-clarifying account of the concept of testing in the educational field.

  1. An approach for verifying biogenic greenhouse gas emissions inventories with atmospheric CO2 concentration data

    Science.gov (United States)

    Stephen M Ogle; Kenneth Davis; Thomas Lauvaux; Andrew Schuh; Dan Cooley; Tristram O West; Linda S Heath; Natasha L Miles; Scott Richardson; F Jay Breidt; James E Smith; Jessica L McCarty; Kevin R Gurney; Pieter Tans; A Scott. Denning

    2015-01-01

    Verifying national greenhouse gas (GHG) emissions inventories is a critical step to ensure that reported emissions data to the United Nations Framework Convention on Climate Change (UNFCCC) are accurate and representative of a country's contribution to GHG concentrations in the atmosphere. Furthermore, verifying biogenic fluxes provides a check on estimated...

  2. Order in Chaos

    DEFF Research Database (Denmark)

    Hansen, Bertel Teilfeldt; Olsen, Asmus Leth

    2014-01-01

    Ballot order effects are well documented in established democracies, but less so in fragile post-conflict settings. We test for the presence of ballot order effects in the 2010 parliamentary election in Afghanistan. Turning out for the 2010 election was a potentially life-threatening endeavor for...

  3. Verifying Safety Messages Using Relative-Time and Zone Priority in Vehicular Ad Hoc Networks

    Science.gov (United States)

    Banani, Sam; Thiemjarus, Surapa; Kittipiyakul, Somsak

    2018-01-01

    In high-density road networks, with each vehicle broadcasting multiple messages per second, the arrival rate of safety messages can easily exceed the rate at which digital signatures can be verified. Since not all messages can be verified, algorithms for selecting which messages to verify are required to ensure that each vehicle receives appropriate awareness about neighbouring vehicles. This paper presents a novel scheme to select important safety messages for verification in vehicular ad hoc networks (VANETs). The proposed scheme uses location and direction of the sender, as well as proximity and relative-time between vehicles, to reduce the number of irrelevant messages verified (i.e., messages from vehicles that are unlikely to cause an accident). Compared with other existing schemes, the analysis results show that the proposed scheme can verify messages from nearby vehicles with lower inter-message delay and reduced packet loss and thus provides high level of awareness of the nearby vehicles. PMID:29652840
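    A priority-based selection of this general kind can be sketched as follows; the scoring function and its weights are illustrative assumptions, not the scheme proposed in the paper:

    ```python
    import heapq
    import math

    def priority(msg, own_pos, now):
        """Higher score = more urgent to verify.  Combines proximity and
        message freshness; the 1/(1+x) weighting is purely illustrative."""
        distance = math.dist(msg["pos"], own_pos)
        age = now - msg["t"]
        return 1.0 / (1.0 + distance) + 1.0 / (1.0 + age)

    def select_for_verification(messages, own_pos, now, budget):
        """When the arrival rate exceeds the signature-verification rate,
        verify only the `budget` highest-priority messages."""
        return heapq.nlargest(budget, messages,
                              key=lambda m: priority(m, own_pos, now))
    ```

    Messages from distant or stale senders drop to the bottom of the queue, which is the intuition behind discarding "irrelevant" messages while preserving awareness of nearby vehicles.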

  4. What do physicians tell laboratories when requesting tests? A multi-method examination of information supplied to the microbiology laboratory before and after the introduction of electronic ordering.

    Science.gov (United States)

    Georgiou, Andrew; Prgomet, Mirela; Toouli, George; Callen, Joanne; Westbrook, Johanna

    2011-09-01

    The provision of relevant clinical information on pathology requests is an important part of facilitating appropriate laboratory utilization and accurate results interpretation and reporting. (1) To determine the quantity and importance of handwritten clinical information provided by physicians to the Microbiology Department of a hospital pathology service; and (2) to examine the impact of a Computerized Provider Order Entry (CPOE) system on the nature of clinical information communication to the laboratory. A multi-method and multi-stage investigation which included: (a) a retrospective audit of all handwritten Microbiology requests received over a 1-month period in the Microbiology Department of a large metropolitan teaching hospital; (b) the administration of a survey to laboratory professionals to investigate the impact of different clinical information on the processing and/or interpretation of tests; (c) an expert panel consisting of medical staff and senior scientists to assess the survey findings and their impact on pathology practice and patient care; and (d) a comparison of the provision and value of clinical information before CPOE, and across 3 years after its implementation. The audit of handwritten requests found that 43% (n=4215) contained patient-related clinical information. The laboratory survey showed that 97% (84/86) of the different types of clinical information provided for wound specimens and 86% (43/50) for stool specimens were shown to have an effect on the processing or interpretation of the specimens by one or more laboratory professionals. The evaluation of the impact of CPOE revealed a significant improvement in the provision of useful clinical information from 2005 to 2008, rising from 90.1% (n=749) to 99.8% (n=915) (p<.0001) for wound specimens and 34% (n=129) to 86% (n=422) (p<.0001) for stool specimens. This study showed that the CPOE system provided an integrated platform to access and exchange valuable patient-related information.

  5. General practitioner views on the determinants of test ordering: a theory-based qualitative approach to the development of an intervention to improve immunoglobulin requests in primary care.

    LENUS (Irish Health Repository)

    Cadogan, S L

    2016-07-19

    Research suggests that variation in laboratory requesting patterns may indicate unnecessary test use. Requesting patterns for serum immunoglobulins vary significantly between general practitioners (GPs). This study aims to explore GPs' views on testing to identify the determinants of behaviour and recommend feasible intervention strategies for improving immunoglobulin test use in primary care.

  6. A verified and efficient approach towards fatigue validation of safety parts

    Energy Technology Data Exchange (ETDEWEB)

    Weihe, Stefan; Weigel, Nicolas [Daimler AG, Stuttgart (Germany); Dressler, Klaus; Speckert, Michael; Feth, Sascha

    2011-07-01

    In the automotive industry, safety parts must be designed according to the state of the art of science and technology such that they do not fail as long as the vehicle is used according to its purpose and misuse of the vehicle does not exceed a reasonably expectable degree. Due to scatter in customer loads and component properties, fatigue validation needs to be based on statistical methods. Mathematically sound methods are devised in order to make the validation process as efficient as possible. They allow all test results to be considered, including censored test data (e.g. tests suspended due to premature failure of components which are not under consideration). Furthermore, these methods permit the success run criterion to be adapted successively to the testing process. (orig.)
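The success run criterion mentioned above is, in its classical zero-failure form, a binomial demonstration test: after n consecutive successes, reliability R is demonstrated with confidence C = 1 - R^n. A minimal sketch of that textbook relation (not the authors' extended method, which also handles censored data):

```python
import math

def success_run_sample_size(reliability: float, confidence: float) -> int:
    """Number of zero-failure tests needed to demonstrate `reliability`
    at `confidence` under the classical success-run criterion:
    confidence = 1 - reliability**n  =>  n = ln(1 - C) / ln(R)."""
    return math.ceil(math.log(1.0 - confidence) / math.log(reliability))

# The familiar "90/90" case: 22 consecutive successes demonstrate
# 90% reliability with 90% confidence.
print(success_run_sample_size(0.90, 0.90))  # -> 22
```

Raising the reliability target quickly inflates the required test count, which is why methods that also exploit censored data make validation cheaper.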

  7. Verifying three-dimensional skull model reconstruction using cranial index of symmetry.

    Science.gov (United States)

    Kung, Woon-Man; Chen, Shuo-Tsung; Lin, Chung-Hsiang; Lu, Yu-Mei; Chen, Tzu-Hsuan; Lin, Muh-Shi

    2013-01-01

    Difficulty exists in scalp adaptation for cranioplasty with customized computer-assisted design/manufacturing (CAD/CAM) implant in situations of excessive wound tension and sub-cranioplasty dead space. To solve this clinical problem, the CAD/CAM technique should include algorithms to reconstruct a depressed contour to cover the skull defect. Satisfactory CAM-derived alloplastic implants are based on highly accurate three-dimensional (3-D) CAD modeling. Thus, it is quite important to establish a symmetrically regular CAD/CAM reconstruction prior to depressing the contour. The purpose of this study is to verify the aesthetic outcomes of CAD models with regular contours using cranial index of symmetry (CIS). From January 2011 to June 2012, decompressive craniectomy (DC) was performed for 15 consecutive patients in our institute. 3-D CAD models of skull defects were reconstructed using commercial software. These models were checked in terms of symmetry by CIS scores. CIS scores of CAD reconstructions were 99.24±0.004% (range 98.47-99.84). CIS scores of these CAD models were statistically significantly greater than 95%, identical to 99.5%, but lower than 99.6% (p<0.001, p = 0.064, and p = 0.021, respectively; Wilcoxon matched pairs signed rank test). These data evidenced the highly accurate symmetry of these CAD models with regular contours. CIS calculation is beneficial to assess aesthetic outcomes of CAD-reconstructed skulls in terms of cranial symmetry. This enables further accurate CAD models and CAM cranial implants with depressed contours, which are essential in patients with difficult scalp adaptation.

  8. Measurement of Deformations by MEMS Arrays, Verified at Sub-millimetre Level Using Robotic Total Stations

    Directory of Open Access Journals (Sweden)

    Tomas Beran

    2014-06-01

    Measurement of sub-millimetre-level deformations of structures in the presence of ambient temperature changes can be challenging. This paper describes the measurement of a structure moving due to temperature changes, using two ShapeAccelArray (SAA) instruments, and verified by a geodetic monitoring system. SAA is a geotechnical instrument often used for monitoring of displacements in soil. SAA uses micro-electro-mechanical system (MEMS) sensors to measure tilt in the gravity field. The geodetic monitoring system, which uses ALERT software, senses the displacements of targets relative to control points, using a robotic total station (RTS). The test setup consists of a central four-metre free-standing steel tube with other steel tubes welded to most of its length. The central tube is anchored in a concrete foundation. This composite “pole” is equipped with two SAAs as well as three geodetic prisms mounted on the top, in the middle, and in the foundation. The geodetic system uses multiple control targets mounted in concrete foundations of nearby buildings, and at the base of the pole. Long-term observations using two SAAs indicate that the pole is subject to deformations due to cyclical ambient temperature variations causing the pole to move by a few millimetres each day. In a multiple-day experiment, it was possible to track this movement using SAA as well as the RTS system. This paper presents data comparing the measurements of the two instruments and provides a good example of the detection of two-dimensional movements of seemingly rigid objects due to temperature changes.
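The tilt-to-displacement principle behind an SAA-style instrument can be sketched simply: each rigid segment of known length reports a tilt angle, and the horizontal offsets accumulate along the chain. This is an idealized single-axis illustration (the real instrument resolves two axes and calibrates each MEMS sensor); segment length and angles below are invented numbers.

```python
import math

def horizontal_displacement(segment_length_m: float, tilts_rad) -> float:
    """Sum the horizontal offsets of a chain of rigid segments, each of
    length `segment_length_m`, tilted by the measured angles in
    `tilts_rad` (radians). Idealized single-axis tilt-to-shape
    integration for an SAA-style array of MEMS tilt sensors."""
    return sum(segment_length_m * math.sin(t) for t in tilts_rad)

# Eight 0.5 m segments each tilted by 1 milliradian: about 4 mm of
# horizontal movement at the top of the array.
offset_m = horizontal_displacement(0.5, [0.001] * 8)
print(round(offset_m * 1000, 3), "mm")
```

This shows why sub-milliradian tilt resolution is enough to track the few-millimetre daily thermal movement reported in the record.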

  9. Verifying the model of predicting entrepreneurial intention among students of business and non-business orientation

    Directory of Open Access Journals (Sweden)

    Zoran Sušanj

    2015-01-01

    This study aims to verify whether certain entrepreneurial characteristics, like entrepreneurial potential and entrepreneurial propensity, affect the level of entrepreneurial self-efficacy and desirability of entrepreneurship, and further have direct and indirect effects on entrepreneurial intentions. Furthermore, this study seeks to compare the strength of the relationship between these variables among groups of students who receive some entrepreneurship education and students outside the business sphere. Data were collected from a sample of undergraduate students of business and non-business orientation and analyzed with multi-group analysis within SEM. Results of the multi-group analysis indicate that, indeed, the strength of the relationship among the tested variables is more pronounced when it comes to business students. That is, the mediating effect of perceived entrepreneurial self-efficacy and desirability of entrepreneurship in the relationship between entrepreneurial characteristics and intent is significantly stronger for the business-oriented group, in comparison to the non-business orientation group. The amount of explained variance of all constructs (except entrepreneurial propensity) is also larger in business students in comparison to non-business students. Educational implications of the obtained results are discussed.

  10. 40 CFR 205.168-11 - Order to cease distribution.

    Science.gov (United States)

    2010-07-01

    ... order will not be issued if the manufacturer has made a good faith attempt to properly production verify the category and can establish such good faith. (b) Any such order shall be issued after notice and...

  11. An experiment designed to verify the general theory of relativity; Une experience destinee a verifier la theorie de la relativite generalisee

    Energy Technology Data Exchange (ETDEWEB)

    Surdin, Maurice [Commissariat a l' energie atomique et aux energies alternatives - CEA (France)

    1960-07-01

    A project for an experiment that uses the effect of gravitation on maser-type clocks placed on the ground at two different heights, designed to verify the general theory of relativity. Reprint of a paper published in Comptes rendus des seances de l'Academie des Sciences, t. 250, p. 299-301, sitting of 11 January 1960.

  12. Trust, but verify – accuracy of clinical commercial radiation treatment planning systems

    International Nuclear Information System (INIS)

    Lehmann, J; Kenny, J; Lye, J; Dunn, L; Williams, I

    2014-01-01

    Computer based Treatment Planning Systems (TPS) are used worldwide to design and calculate treatment plans for treating radiation therapy patients. TPS are generally well designed and thoroughly tested by their developers and local physicists prior to clinical use. However, the wide-reaching impact of their accuracy warrants ongoing vigilance. This work reviews the findings of the Australian national audit system and provides recommendations for checks of TPS. The Australian Clinical Dosimetry Service (ACDS) has designed and implemented a national system of audits, currently in a three year test phase. The Level III audits verify the accuracy of a beam model of a facility's TPS through a comparison of measurements with calculation at selected points in an anthropomorphic phantom. The plans are prescribed by the ACDS and all measurement equipment is brought in for independent onsite measurements. In this first version of audits, plans are comparatively simple, involving asymmetric fields, wedges and inhomogeneities. The ACDS has performed 14 Level III audits to-date. Six audits returned at least one measurement at Action Level, indicating that the measured dose differed more than 3.3% (but less than 5%) from the planned dose. Two audits failed (difference >5%). One fail was caused by a data transmission error coupled with quality assurance (QA) not being performed. The second fail was investigated and reduced to Action Level with the onsite audit team finding phantom setup at treatment a contributing factor. The Action Level results are attributed to small dose calculation deviations within the TPS, which are investigated and corrected by the facilities. Small deviations exist in clinical TPS which can add up and can combine with output variations to result in unacceptable variations. Ongoing checks and independent audits are recommended.
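The audit bands quoted in this record (more than 3.3% but less than 5% is "Action Level", more than 5% is a fail) can be expressed as a small classifier. How the exact boundary values are binned is an assumption here; the ACDS protocol defines the details.

```python
def audit_result(measured_dose: float, planned_dose: float) -> str:
    """Classify a point-dose comparison using the tolerance bands quoted
    in this record: within 3.3% passes, between 3.3% and 5% is at
    'Action Level', and beyond 5% is a fail. (Handling of the exact
    boundary values is an assumption, not taken from the audit protocol.)"""
    deviation_pct = abs(measured_dose / planned_dose - 1.0) * 100.0
    if deviation_pct > 5.0:
        return "Fail"
    if deviation_pct > 3.3:
        return "Action Level"
    return "Pass"

print(audit_result(2.02, 2.00))   # 1.0% deviation
print(audit_result(2.08, 2.00))   # 4.0% deviation
print(audit_result(2.12, 2.00))   # 6.0% deviation
```

A dose pair at 4% deviation lands at Action Level, matching the six audits the record describes; beyond 5% it becomes one of the two fails.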

  13. System for verifiable CT radiation dose optimization based on image quality. part II. process control system.

    Science.gov (United States)

    Larson, David B; Malarik, Remo J; Hall, Seth M; Podberesky, Daniel J

    2013-10-01

    To evaluate the effect of an automated computed tomography (CT) radiation dose optimization and process control system on the consistency of estimated image noise and size-specific dose estimates (SSDEs) of radiation in CT examinations of the chest, abdomen, and pelvis. This quality improvement project was determined not to constitute human subject research. An automated system was developed to analyze each examination immediately after completion, and to report individual axial-image-level and study-level summary data for patient size, image noise, and SSDE. The system acquired data for 4 months beginning October 1, 2011. Protocol changes were made by using parameters recommended by the prediction application, and 3 months of additional data were acquired. Preimplementation and postimplementation mean image noise and SSDE were compared by using unpaired t tests and F tests. Common-cause variation was differentiated from special-cause variation by using a statistical process control individual chart. A total of 817 CT examinations, 490 acquired before and 327 acquired after the initial protocol changes, were included in the study. Mean patient age and water-equivalent diameter were 12.0 years and 23.0 cm, respectively. The difference between actual and target noise increased from -1.4 to 0.3 HU, and the statistical process control chart identified several special causes of variation. Implementation of an automated CT radiation dose optimization system led to a verifiable simultaneous decrease in image noise variation and SSDE. The automated nature of the system provides the opportunity for consistent CT radiation dose optimization on a broad scale. © RSNA, 2013.
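The individuals chart the record uses for separating common-cause from special-cause variation has a standard construction: control limits at the mean plus or minus 2.66 times the average moving range (the Shewhart constant for subgroup size 1). A minimal sketch, with invented noise values:

```python
def individuals_chart_signals(values):
    """Flag special-cause points on an individuals (I) control chart:
    limits are the mean +/- 2.66 * average moving range, the standard
    Shewhart constant for subgroup size 1. Returns indices of points
    outside the control limits."""
    n = len(values)
    mean = sum(values) / n
    moving_ranges = [abs(values[i] - values[i - 1]) for i in range(1, n)]
    mr_bar = sum(moving_ranges) / len(moving_ranges)
    ucl, lcl = mean + 2.66 * mr_bar, mean - 2.66 * mr_bar
    return [i for i, v in enumerate(values) if v > ucl or v < lcl]

noise_hu = [10, 11, 10, 9, 10, 11, 10, 9, 10, 20]  # last scan is anomalous
print(individuals_chart_signals(noise_hu))  # -> [9]
```

Points inside the limits are treated as common-cause noise; the flagged index would trigger the kind of special-cause investigation the study describes.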

  14. Trust, but verify - Accuracy of clinical commercial radiation Treatment Planning Systems

    Science.gov (United States)

    Lehmann, J.; Kenny, J.; Lye, J.; Dunn, L.; Williams, I.

    2014-03-01

    Computer based Treatment Planning Systems (TPS) are used worldwide to design and calculate treatment plans for treating radiation therapy patients. TPS are generally well designed and thoroughly tested by their developers and local physicists prior to clinical use. However, the wide-reaching impact of their accuracy warrants ongoing vigilance. This work reviews the findings of the Australian national audit system and provides recommendations for checks of TPS. The Australian Clinical Dosimetry Service (ACDS) has designed and implemented a national system of audits, currently in a three year test phase. The Level III audits verify the accuracy of a beam model of a facility's TPS through a comparison of measurements with calculation at selected points in an anthropomorphic phantom. The plans are prescribed by the ACDS and all measurement equipment is brought in for independent onsite measurements. In this first version of audits, plans are comparatively simple, involving asymmetric fields, wedges and inhomogeneities. The ACDS has performed 14 Level III audits to-date. Six audits returned at least one measurement at Action Level, indicating that the measured dose differed more than 3.3% (but less than 5%) from the planned dose. Two audits failed (difference >5%). One fail was caused by a data transmission error coupled with quality assurance (QA) not being performed. The second fail was investigated and reduced to Action Level with the onsite audit team finding phantom setup at treatment a contributing factor. The Action Level results are attributed to small dose calculation deviations within the TPS, which are investigated and corrected by the facilities. Small deviations exist in clinical TPS which can add up and can combine with output variations to result in unacceptable variations. Ongoing checks and independent audits are recommended.

  15. Role of serial order in the impact of talker variability on short-term memory: testing a perceptual organization-based account.

    Science.gov (United States)

    Hughes, Robert W; Marsh, John E; Jones, Dylan M

    2011-11-01

    In two experiments, we examined the impact of the degree of match between sequential auditory perceptual organization processes and the demands of a short-term memory task (memory for order vs. item information). When a spoken sequence of digits was presented so as to promote its perceptual partitioning into two distinct streams by conveying it in alternating female (F) and male (M) voices (FMFMFMFM)--thereby disturbing the perception of true temporal order--recall of item order was greatly impaired (as compared to recall of item identity). Moreover, an order error type consistent with the formation of voice-based streams was committed more quickly in the alternating-voice condition (Exp. 1). In contrast, when the perceptual organization of the sequence mapped well onto an optimal two-group serial rehearsal strategy--by presenting the two voices in discrete clusters (FFFFMMMM)--order, but not item, recall was enhanced (Exp. 2). The results are consistent with the view that the degree of compatibility between perceptual and deliberate sequencing processes is a key determinant of serial short-term memory performance. Alternative accounts of talker variability effects in short-term memory, based on the concept of a dedicated phonological short-term store and a capacity-limited focus of attention, are also reviewed.

  16. Inverse biomimetics: how robots can help to verify concepts concerning sensorimotor control of human arm and leg movements.

    Science.gov (United States)

    Kalveram, Karl Theodor; Seyfarth, André

    2009-01-01

    Simulation test, hardware test and behavioral comparison test are proposed to experimentally verify whether a technical control concept for limb movements is logically precise, physically sound, and biologically relevant. Thereby, robot test-beds may play an integral part by mimicking functional limb movements. The procedure is exemplarily demonstrated for human aiming movements with the forearm: when comparing competitive control concepts, these movements are described best by a spring-like operating muscular-skeletal device which is assisted by feedforward control through an inverse internal model of the limb, without recourse to a forward model of the limb. In a perspective on hopping, the concept of exploitive control is addressed, and its comparison to concepts derived from classical control theory is advised.

  17. German Children's Use of Word Order and Case Marking to Interpret Simple and Complex Sentences: Testing Differences between Constructions and Lexical Items

    Science.gov (United States)

    Brandt, Silke; Lieven, Elena; Tomasello, Michael

    2016-01-01

    Children and adults follow cues such as case marking and word order in their assignment of semantic roles in simple transitives (e.g., "the dog chased the cat"). It has been suggested that the same cues are used for the interpretation of complex sentences, such as transitive relative clauses (RCs) (e.g., "that's the dog that chased…

  18. The design decisions of breeding zone sub-module for testing in ITER in order to validate the CHC TBM concept

    International Nuclear Information System (INIS)

    Leshukov, A.Yu.; Kapyshev, V.K.; Kartashev, I.A.; Kovalenko, V.G.; Razmerov, A.V.; Sviridenko, M.N.; Strebkov, Yu.S.

    2010-01-01

    The Russian Federation has adopted a strategy of participating in the TBM Program as a 'Partner' in the development of the ceramic helium-cooled (CHC) test blanket module (TBM) concept. In this connection, one possible collaboration scenario is to integrate a characteristic design element of the RF concept into the structure of the 'Leader's' TBM and to test it in the ITER environment. Under the collaboration in the framework of the Test Blanket Working Group (TBWG), the 'Leader' and 'Partner' should jointly develop the selected (DEMO-relevant) TBM concept so that it will not disturb ITER operation. Because of the analogy in design principles, testing objectives and parameters between the EU CHC TBM concept ('Leader') and the RF one, RF specialists have developed design options for a breeding zone sub-module (BZSM) to be integrated in one of the EU TBM cells for further testing in ITER. Four BZSM design options (corresponding to the four types of TBM to be tested) have been developed. A brief explanation of the RF strategy in the partnership for the development of the CHC blanket concept is presented in this paper, together with descriptions of all four BZSM designs and some technological features.

  19. Main Test Floor (MTF)

    Data.gov (United States)

    Federal Laboratory Consortium — Purpose: The MTF is employed to validate advanced structural concepts and verify new analytical methodologies. Test articles range in size from subcomponent to full...

  20. The Influence of Antiobesity Media Content on Intention to Eat Healthily and Exercise: A Test of the Ordered Protection Motivation Theory

    OpenAIRE

    Raeann Ritland; Lulu Rodriguez

    2014-01-01

    This study extended the ordered protection motivation framework to determine whether exposure and attention to antiobesity media content increases people's appraisals of threat and their ability to cope with it. It also assesses whether these cognitive processes, in turn, affected people's intention to abide by the practices recommended to prevent obesity. The results of a national online survey using a nonprobability sample indicate that attention to mediated obesity and related information ...

  1. Reasoning about knowledge: Children’s evaluations of generality and verifiability

    Science.gov (United States)

    Koenig, Melissa A.; Cole, Caitlin A.; Meyer, Meredith; Ridge, Katherine E.; Kushnir, Tamar; Gelman, Susan A.

    2015-01-01

    In a series of experiments, we examined 3- to 8-year-old children’s (N = 223) and adults’ (N = 32) use of two properties of testimony to estimate a speaker’s knowledge: generality and verifiability. Participants were presented with a “Generic speaker” who made a series of 4 general claims about “pangolins” (a novel animal kind), and a “Specific speaker” who made a series of 4 specific claims about “this pangolin” as an individual. To investigate the role of verifiability, we systematically varied whether the claim referred to a perceptually-obvious feature visible in a picture (e.g., “has a pointy nose”) or a non-evident feature that was not visible (e.g., “sleeps in a hollow tree”). Three main findings emerged: (1) Young children showed a pronounced reliance on verifiability that decreased with age. Three-year-old children were especially prone to credit knowledge to speakers who made verifiable claims, whereas 7- to 8-year-olds and adults credited knowledge to generic speakers regardless of whether the claims were verifiable; (2) Children’s attributions of knowledge to generic speakers were not detectable until age 5, and only when those claims were also verifiable; (3) Children often generalized speakers’ knowledge outside of the pangolin domain, indicating a belief that a person’s knowledge about pangolins likely extends to new facts. Findings indicate that young children may be inclined to doubt speakers who make claims they cannot verify themselves, and reveal a developmentally increasing appreciation for speakers who make general claims. PMID:26451884

  2. Can EC and UK national methane emission inventories be verified using high precision stable isotope data?

    International Nuclear Information System (INIS)

    Lowry, D.; Holmes, C.W.; Nisbet, E.G.; Rata, N.D.

    2002-01-01

    The main anthropogenic sources of methane in industrialised countries (landfill/waste treatment, gas storage and distribution, coal) are far easier to reduce than CO2 sources and the implementation of reduction strategies is potentially profitable. Statistical databases of methane emissions need independent external verification and carbon isotope data provide one way of estimating the expected source mix for each country if the main source types have been characterised isotopically. Using this method each country participating in the CORINAIR 94 database has been assigned an expected isotopic value for its emissions. The averaged δ13C of methane emitted from the CORINAIR region of Europe, based on total emissions of each country, is -55.4 per mille for 1994. This European source mix can be verified using trajectory analysis for air samples collected at background stations. Methane emissions from the UK, and particularly the London region, have undergone more detailed analysis using data collected at the Royal Holloway site on the western fringe of London. If the latest emissions inventory figures are correct then the modelled isotopic change in the UK source mix is from -48.4 per mille in 1990 to -50.7 per mille in 1997. This represents a reduction in emissions of 25% over a 7-year period, important in meeting proposed UK greenhouse gas reduction targets. These changes can be tested by the isotopic analysis of air samples at carefully selected coastal background and interior sites. Regular sampling and isotopic analysis coupled with back trajectory analysis from a range of sites could provide an important tool for monitoring and verification of EC and UK methane emissions in the run-up to 2010. (author)
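The expected isotopic signature of a country's source mix is a simple flux-weighted mean of the individual source signatures. A minimal sketch of that mass-balance calculation; the source names, fluxes and δ13C values below are illustrative assumptions, not the CORINAIR inventory figures.

```python
def mixed_delta13c(sources):
    """Flux-weighted mean δ13C (per mille) of a methane source mix.
    `sources` maps a source name to a (flux, δ13C) pair; fluxes can be
    in any common unit since only their ratios matter."""
    total_flux = sum(flux for flux, _ in sources.values())
    return sum(flux * delta for flux, delta in sources.values()) / total_flux

# Illustrative (not inventory) numbers: relative fluxes and typical
# per-source signatures.
sources = {
    "gas leakage": (2.0, -40.0),
    "landfill":    (3.0, -58.0),
    "ruminants":   (5.0, -62.0),
}
print(mixed_delta13c(sources))  # flux-weighted national signature
```

Shifting flux from isotopically heavy fossil sources to lighter biogenic ones drives the mixed signature more negative, which is exactly the 1990-to-1997 shift the record models for the UK.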

  3. Verifying three-dimensional skull model reconstruction using cranial index of symmetry.

    Directory of Open Access Journals (Sweden)

    Woon-Man Kung

    BACKGROUND: Difficulty exists in scalp adaptation for cranioplasty with customized computer-assisted design/manufacturing (CAD/CAM) implant in situations of excessive wound tension and sub-cranioplasty dead space. To solve this clinical problem, the CAD/CAM technique should include algorithms to reconstruct a depressed contour to cover the skull defect. Satisfactory CAM-derived alloplastic implants are based on highly accurate three-dimensional (3-D) CAD modeling. Thus, it is quite important to establish a symmetrically regular CAD/CAM reconstruction prior to depressing the contour. The purpose of this study is to verify the aesthetic outcomes of CAD models with regular contours using cranial index of symmetry (CIS). MATERIALS AND METHODS: From January 2011 to June 2012, decompressive craniectomy (DC) was performed for 15 consecutive patients in our institute. 3-D CAD models of skull defects were reconstructed using commercial software. These models were checked in terms of symmetry by CIS scores. RESULTS: CIS scores of CAD reconstructions were 99.24±0.004% (range 98.47-99.84). CIS scores of these CAD models were statistically significantly greater than 95%, identical to 99.5%, but lower than 99.6% (p<0.001, p = 0.064, p = 0.021 respectively, Wilcoxon matched pairs signed rank test). These data evidenced the highly accurate symmetry of these CAD models with regular contours. CONCLUSIONS: CIS calculation is beneficial to assess aesthetic outcomes of CAD-reconstructed skulls in terms of cranial symmetry. This enables further accurate CAD models and CAM cranial implants with depressed contours, which are essential in patients with difficult scalp adaptation.
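The statistical comparison in this record, CIS scores against fixed benchmarks, is a one-sample Wilcoxon signed-rank test. A minimal pure-Python sketch using the normal approximation (no tie correction); in practice one would use a library routine such as scipy.stats.wilcoxon, and the CIS values below are invented, not the study's data.

```python
import math

def wilcoxon_signed_rank_z(values, mu):
    """One-sample Wilcoxon signed-rank z statistic (normal
    approximation, no tie correction): rank |value - mu|, sum the ranks
    of positive differences, and standardize against the null mean and
    variance. Zero differences are dropped; tied ranks are averaged."""
    diffs = [v - mu for v in values if v != mu]
    n = len(diffs)
    order = sorted(range(n), key=lambda i: abs(diffs[i]))
    ranks = [0.0] * n
    i = 0
    while i < n:
        j = i
        while j + 1 < n and abs(diffs[order[j + 1]]) == abs(diffs[order[i]]):
            j += 1
        avg = (i + j) / 2 + 1  # average 1-based rank of positions i..j
        for k in range(i, j + 1):
            ranks[order[k]] = avg
        i = j + 1
    w_plus = sum(r for d, r in zip(diffs, ranks) if d > 0)
    mean_w = n * (n + 1) / 4
    var_w = n * (n + 1) * (2 * n + 1) / 24
    return (w_plus - mean_w) / math.sqrt(var_w)

# Fifteen illustrative CIS scores tested against a 95% benchmark: every
# difference is positive, so the z statistic is large and positive.
cis = [99.2, 98.9, 99.5, 99.1, 98.5, 99.7, 99.3, 99.0,
       99.6, 98.8, 99.4, 99.2, 99.8, 98.6, 99.1]
print(round(wilcoxon_signed_rank_z(cis, 95.0), 2))
```

A strongly positive z here corresponds to the record's p<0.001 finding that the CIS scores exceed 95%.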

  4. Use of models and mockups in verifying man-machine interfaces

    International Nuclear Information System (INIS)

    Seminara, J.L.

    1985-01-01

    The objective of Human Factors Engineering is to tailor the design of facilities and equipment systems to match the capabilities and limitations of the personnel who will operate and maintain the system. This optimization of the man-machine interface is undertaken to enhance the prospects for safe, reliable, timely, and error-free human performance in meeting system objectives. To ensure the eventual success of a complex man-machine system it is important to systematically and progressively test and verify the adequacy of man-machine interfaces from initial design concepts to system operation. Human factors specialists employ a variety of methods to evaluate the quality of the human-system interface. These methods include: (1) Reviews of two-dimensional drawings using appropriately scaled transparent overlays of personnel spanning the anthropometric range, considering clothing and protective gear encumbrances (2) Use of articulated, scaled, plastic templates or manikins that are overlayed on equipment or facility drawings (3) Development of computerized manikins in computer aided design approaches (4) Use of three-dimensional scale models to better conceptualize work stations, control rooms or maintenance facilities (5) Full or half-scale mockups of system components to evaluate operator/maintainer interfaces (6) Part- or full-task dynamic simulation of operator or maintainer tasks and interactive system responses (7) Laboratory and field research to establish human performance capabilities with alternative system design concepts or configurations. Of the design verification methods listed above, this paper will only consider the use of models and mockups in the design process.

  5. Properties of hypothesis testing techniques and (Bayesian) model selection for exploration-based and theory-based (order-restricted) hypotheses

    NARCIS (Netherlands)

    Kuiper, Rebecca M.; Nederhoff, Tim; Klugkist, Irene

    2015-01-01

    In this paper, the performance of six types of techniques for comparisons of means is examined. These six emerge from the distinction between the method employed (hypothesis testing, model selection using information criteria, or Bayesian model selection) and the set of hypotheses that is

  6. The use of screening tests in spacecraft lubricant evaluation

    Science.gov (United States)

    Kalogeras, Chris; Hilton, Mike; Carre, David; Didziulis, Stephen; Fleischauer, Paul

    1993-01-01

    A lubricant screening test fixture has been devised in order to satisfy the need to obtain lubricant performance data in a timely manner. This fixture has been used to perform short-term tests on potential lubricants for several spacecraft applications. The results of these tests have saved time by producing qualitative performance rankings of lubricant selections prior to life testing. To date, this test fixture has been used to test lubricants for 3 particular applications. The qualitative results from these tests have been verified by life test results and have provided insight into the function of various anti-wear additives.

  7. A Jeziorski-Monkhorst fully uncontracted multi-reference perturbative treatment. I. Principles, second-order versions, and tests on ground state potential energy curves

    Science.gov (United States)

    Giner, Emmanuel; Angeli, Celestino; Garniron, Yann; Scemama, Anthony; Malrieu, Jean-Paul

    2017-06-01

    The present paper introduces a new multi-reference perturbation approach developed at second order, based on a Jeziorski-Monkhorst expansion using individual Slater determinants as perturbers. Thanks to this choice of perturbers, an effective Hamiltonian may be built, allowing for the dressing of the Hamiltonian matrix within the reference space, assumed here to be a CAS-CI. Such a formulation accounts then for the coupling between the static and dynamic correlation effects. With our new definition of zeroth-order energies, these two approaches are strictly size-extensive provided that local orbitals are used, as numerically illustrated here and formally demonstrated in the Appendix. Also, the present formalism allows for the factorization of all double excitation operators, just as in internally contracted approaches, strongly reducing the computational cost of these two approaches with respect to other determinant-based perturbation theories. The accuracy of these methods has been investigated on ground-state potential curves up to full dissociation limits for a set of six molecules involving single, double, and triple bond breaking together with an excited state calculation. The spectroscopic constants obtained with the present methods are found to be in very good agreement with the full configuration interaction results. As the present formalism does not use any parameter or numerically unstable operation, the curves obtained with the two methods are smooth all along the dissociation path.
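The flavour of a determinant-based second-order correction can be shown on a toy matrix: take one determinant as the reference and sum the usual second-order terms over the remaining determinants, with diagonal (Epstein-Nesbet-like) zeroth-order energies. This is a generic textbook sketch, not the paper's Jeziorski-Monkhorst formalism, and the 2x2 Hamiltonian is an invented example.

```python
import math

def second_order_energy(h):
    """Second-order perturbative estimate of the lowest eigenvalue of a
    small CI-style Hamiltonian matrix: determinant 0 is the reference,
    the remaining determinants are perturbers, and the zeroth-order
    energies are the diagonal elements (an Epstein-Nesbet-like choice)."""
    e0 = h[0][0]
    e2 = sum(h[0][k] ** 2 / (e0 - h[k][k]) for k in range(1, len(h)))
    return e0 + e2

# Weakly coupled 2x2 model: perturbation theory lands close to the
# exact lowest eigenvalue (half the trace minus half the discriminant).
h = [[0.0, 0.1], [0.1, 1.0]]
exact = 0.5 * (h[0][0] + h[1][1]) - 0.5 * math.sqrt(
    (h[0][0] - h[1][1]) ** 2 + 4 * h[0][1] ** 2)
print(second_order_energy(h), exact)
```

With the small off-diagonal coupling the second-order estimate differs from the exact eigenvalue by under 10^-4, illustrating why such corrections capture most of the correlation energy cheaply.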

  8. Dynamic Symmetric Key Mobile Commerce Scheme Based on Self-Verified Mechanism

    Directory of Open Access Journals (Sweden)

    Jiachen Yang

    2014-01-01

    In terms of the security and efficiency of mobile e-commerce, the authors summarized the advantages and disadvantages of several related schemes, especially the self-verified mobile payment scheme based on the elliptic curve cryptosystem (ECC), and then proposed a new type of dynamic symmetric key mobile commerce scheme based on a self-verified mechanism. The authors analyzed the basic algorithm based on self-verified mechanisms, detailed the complete transaction process of the proposed scheme, and analyzed the payment scheme with respect to security and efficiency. The analysis shows that the proposed scheme meets the efficiency requirements of mobile electronic payment while also taking security into account. The user confirmation mechanism at the end of the proposed scheme further strengthens its security. In brief, the proposed scheme is more efficient and practical than most existing schemes.

  9. What are the ultimate limits to computational techniques: verifier theory and unverifiability

    International Nuclear Information System (INIS)

    Yampolskiy, Roman V

    2017-01-01

    Despite significant developments in proof theory, surprisingly little attention has been devoted to the concept of proof verifiers. In particular, the mathematical community may be interested in studying different types of proof verifiers (people, programs, oracles, communities, superintelligences) as mathematical objects. Such an effort could reveal their properties, their powers and limitations (particularly in human mathematicians), minimum and maximum complexity, as well as self-verification and self-reference issues. We propose an initial classification system for verifiers and provide some rudimentary analysis of solved and open problems in this important domain. Our main contribution is a formal introduction of the notion of unverifiability, for which the paper could serve as a general citation in domains of theorem proving, as well as software and AI verification. (invited comment)

  10. What are the ultimate limits to computational techniques: verifier theory and unverifiability

    Science.gov (United States)

    Yampolskiy, Roman V.

    2017-09-01

    Despite significant developments in proof theory, surprisingly little attention has been devoted to the concept of proof verifiers. In particular, the mathematical community may be interested in studying different types of proof verifiers (people, programs, oracles, communities, superintelligences) as mathematical objects. Such an effort could reveal their properties, their powers and limitations (particularly in human mathematicians), minimum and maximum complexity, as well as self-verification and self-reference issues. We propose an initial classification system for verifiers and provide some rudimentary analysis of solved and open problems in this important domain. Our main contribution is a formal introduction of the notion of unverifiability, for which the paper could serve as a general citation in domains of theorem proving, as well as software and AI verification.

  11. Governor stability simulations of Svartisen power plant verified by the installed monitoring system on site

    International Nuclear Information System (INIS)

    Nielsen, T K; Kjeldsen, M

    2010-01-01

    Many Norwegian hydro power plants have a complex layout, with several reservoirs, brook intakes, surge shafts and even air cushion chambers. There are kilometers of excavated tunnels as well as long tailwater systems. The stations are often equipped with multiple turbines, both in series and in parallel, so a number of operating modes are possible. When doing transient and governor stability simulations in the design phase, the problem is to find the worst-case scenario among these operating modes. Svartisen power plant has been of particular interest recently. The plant was originally designed for two 350 MW Francis turbines; however, only one turbine was installed. At the design stage, governor stability was regarded as problematic because the long penstock gives a high time constant for the hydraulic inertia. The main problem, however, is that the water hammer frequency interferes with governor performance: it lies in the same range as the crossover frequency, so the governor reacts to the water hammer waves, which are by nature notoriously unstable. The common remedy is to build an air cushion chamber and thereby raise the water hammer frequency above the crossover frequency. The expense was, however, deemed too high, and other solutions had to be sought. A pressure feedback on the governor was introduced in order to obtain stable operation, at least with two turbines. With only one turbine installed, the pressure feedback has not been activated because, based on the simulations, it was regarded as unnecessary. Even though the original simulations show good stability margins with only one turbine running, there have been some indications that the unit has suffered from instability. In 2004 Svartisen power plant was equipped with a comprehensive monitoring system, and both turbine and generator performance have been observed. This gives valuable information on how the hydropower

  12. Governor stability simulations of Svartisen power plant verified by the installed monitoring system on site

    Science.gov (United States)

    Nielsen, T. K.; Kjeldsen, M.

    2010-08-01

    Many Norwegian hydro power plants have a complex layout, with several reservoirs, brook intakes, surge shafts and even air cushion chambers. There are kilometers of excavated tunnels as well as long tailwater systems. The stations are often equipped with multiple turbines, both in series and in parallel, so a number of operating modes are possible. When doing transient and governor stability simulations in the design phase, the problem is to find the worst-case scenario among these operating modes. Svartisen power plant has been of particular interest recently. The plant was originally designed for two 350 MW Francis turbines; however, only one turbine was installed. At the design stage, governor stability was regarded as problematic because the long penstock gives a high time constant for the hydraulic inertia. The main problem, however, is that the water hammer frequency interferes with governor performance: it lies in the same range as the crossover frequency, so the governor reacts to the water hammer waves, which are by nature notoriously unstable. The common remedy is to build an air cushion chamber and thereby raise the water hammer frequency above the crossover frequency. The expense was, however, deemed too high, and other solutions had to be sought. A pressure feedback on the governor was introduced in order to obtain stable operation, at least with two turbines. With only one turbine installed, the pressure feedback has not been activated because, based on the simulations, it was regarded as unnecessary. Even though the original simulations show good stability margins with only one turbine running, there have been some indications that the unit has suffered from instability. In 2004 Svartisen power plant was equipped with a comprehensive monitoring system, and both turbine and generator performance have been observed. This gives valuable information on how the hydropower

  13. Development of material measures for performance verifying surface topography measuring instruments

    International Nuclear Information System (INIS)

    Leach, Richard; Giusca, Claudiu; Rickens, Kai; Riemer, Oltmann; Rubert, Paul

    2014-01-01

    The development of two irregular-geometry material measures for performance verifying surface topography measuring instruments is described. The material measures are designed to be used to performance verify tactile and optical areal surface topography measuring instruments. Their manufacture, by diamond turning followed by nickel electroforming, is described in detail. Measurement results are then obtained using a traceable stylus instrument and a commercial coherence scanning interferometer, and the results are shown to agree to within the measurement uncertainties. The material measures are now commercially available as part of a suite of material measures aimed at the calibration and performance verification of areal surface topography measuring instruments.

  14. Association between cotinine-verified smoking status and hypertension in 167,868 Korean adults.

    Science.gov (United States)

    Kim, Byung Jin; Han, Ji Min; Kang, Jung Gyu; Kim, Bum Soo; Kang, Jin Ho

    2017-10-01

    Previous studies showed inconsistent results concerning the relationship between chronic smoking and blood pressure, and most involved self-reported smoking status. This study was performed to evaluate the association of urinary cotinine or self-reported smoking status with hypertension and blood pressure in Korean adults. Among individuals enrolled in the Kangbuk Samsung Health Study and Kangbuk Samsung Cohort Study, 167,868 participants (men, 55.7%; age, 37.5 ± 6.9 years) between 2011 and 2013 who had urinary cotinine measurements were included. Individuals with urinary cotinine levels ≥50 ng/mL were defined as cotinine-verified current smokers. The prevalence of hypertension and of cotinine-verified current smoking in the overall population was 6.8% and 22.7%, respectively (10.0% in men and 2.8% in women for hypertension; 37.7% in men and 3.9% in women for cotinine-verified current smoking). In a multivariate regression analysis adjusted for age, sex, body mass index, waist circumference, alcohol drinking, vigorous exercise, and diabetes, cotinine-verified current smoking was associated with a lower prevalence of hypertension compared with cotinine-verified never smoking (OR [95% CI], 0.79 [0.75, 0.84]). Log-transformed cotinine levels and unobserved smoking were each negatively associated with hypertension (0.96 [0.96, 0.97] and 0.55 [0.39, 0.79], respectively). In a multivariate linear regression analysis, cotinine-verified current smoking was inversely associated with systolic and diastolic blood pressure (BP) (regression coefficient [95% CI], -1.23 [-1.39, -1.07] for systolic BP and -0.71 [-0.84, -0.58] for diastolic BP). In subgroup analyses according to sex, the inverse associations between cotinine-verified current smoking and hypertension were observed only in men. This large observational study showed that cotinine-verified current smoking and unobserved smoking were inversely associated with hypertension in Korean adults, especially only in
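    Odds ratios of the kind quoted above (e.g. OR [95% CI], 0.79 [0.75, 0.84]) can be computed from a 2x2 exposure-outcome table with a Wald confidence interval. The sketch below uses hypothetical counts, not the study's data, and the function name and interface are assumptions:

    ```python
    import math

    def odds_ratio_ci(a: int, b: int, c: int, d: int, z: float = 1.96):
        """Odds ratio and Wald 95% CI from a 2x2 table:
        a = exposed cases, b = exposed non-cases,
        c = unexposed cases, d = unexposed non-cases."""
        or_ = (a * d) / (b * c)
        # Standard error of log(OR) for a 2x2 table
        se = math.sqrt(1 / a + 1 / b + 1 / c + 1 / d)
        lo = math.exp(math.log(or_) - z * se)
        hi = math.exp(math.log(or_) + z * se)
        return or_, lo, hi

    # Hypothetical counts, NOT taken from the study:
    or_, lo, hi = odds_ratio_ci(a=120, b=880, c=150, d=850)
    ```

    An OR below 1 with an upper confidence limit below 1, as reported in the abstract, indicates a lower prevalence in the exposed group.
    
    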

  15. The influence of antiobesity media content on intention to eat healthily and exercise: a test of the ordered protection motivation theory.

    Science.gov (United States)

    Ritland, Raeann; Rodriguez, Lulu

    2014-01-01

    This study extended the ordered protection motivation framework to determine whether exposure and attention to antiobesity media content increases people's appraisals of threat and their ability to cope with it. It also assessed whether these cognitive processes, in turn, affected people's intention to abide by the practices recommended to prevent obesity. The results of a national online survey using a nonprobability sample indicate that attention to mediated obesity and related information significantly increased people's intention to exercise as well as their overall coping appraisals (the perceived effectiveness of the recommended behaviors and their ability to perform them). Likewise, increased threat and coping appraisals were both found to significantly influence people's intention to exercise and diet. Coping (rather than threat) appraisals more strongly predicted behavioral intent. Following the attitude-behavior literature, behavioral intention was used as the most proximate predictor of actual behavior (i.e., stronger intentions increase the likelihood of behavior change).

  16. A framework for verifying the dismantlement and abandonment of nuclear weapons. A policy implication for the denuclearization of Korea Peninsula

    International Nuclear Information System (INIS)

    Ichimasa, Sukeyuki

    2011-01-01

    The denuclearization of the Korean Peninsula has been a serious security issue in Northeast Asia. Although the Six-Party Talks have been suspended since North Korea declared a boycott in 2008, the aim of denuclearizing North Korea is still being discussed. For instance, the recent Japan-U.S. '2+2' dialogue affirmed the importance of achieving the complete and verifiable denuclearization of North Korea, including scrutinizing its uranium enrichment program, through irreversible steps under the Six-Party process. In order to identify an effective and efficient framework for the denuclearization of North Korea, this paper examines five major denuclearization methods: (1) the Nunn-Lugar method, (2) the Iraqi method, (3) the South African method, (4) the Libyan method, and (5) the denuclearization method set out in the Nuclear Weapons Convention (NWC), while referring to recent developments in verification studies for nuclear disarmament, such as the joint research conducted by the United Kingdom and Norway, and other arguments made by disarmament experts. Moreover, this paper discusses what political and security conditions would be required for North Korea to accept intrusive verification of its denuclearization. Conditions for successful denuclearization talks among the Six-Party member states and a realistic approach to verifiable denuclearization are also examined. (author)

  17. The Influence of Antiobesity Media Content on Intention to Eat Healthily and Exercise: A Test of the Ordered Protection Motivation Theory

    Directory of Open Access Journals (Sweden)

    Raeann Ritland

    2014-01-01

    This study extended the ordered protection motivation framework to determine whether exposure and attention to antiobesity media content increases people’s appraisals of threat and their ability to cope with it. It also assessed whether these cognitive processes, in turn, affected people’s intention to abide by the practices recommended to prevent obesity. The results of a national online survey using a nonprobability sample indicate that attention to mediated obesity and related information significantly increased people’s intention to exercise as well as their overall coping appraisals (the perceived effectiveness of the recommended behaviors and their ability to perform them). Likewise, increased threat and coping appraisals were both found to significantly influence people’s intention to exercise and diet. Coping (rather than threat) appraisals more strongly predicted behavioral intent. Following the attitude-behavior literature, behavioral intention was used as the most proximate predictor of actual behavior (i.e., stronger intentions increase the likelihood of behavior change).

  18. Regressive transgressive cycle of Devonian sea in Uruguay verified by Palynology

    International Nuclear Information System (INIS)

    Da Silva, J.

    1990-01-01

    This work presents the results and conclusions of a study of palynomorph populations carried out in Devonian formations in central Uruguay. The existence of a regressive-transgressive cycle is verified by analyzing the vertical distribution of palynomorphs, and the presence of chitinozoans (including the genus Cyathochitina) is noted for the studied section.

  19. The verification, refinement and application of lexicographic rulers ...

    African Journals Online (AJOL)

    Lexicographic rulers for Afrikaans and the African languages are a decade old and are widely used in the compilation of dictionaries. The compilers have thus far not considered it necessary to verify or refine these rulers. Criticism has, however, been expressed of the compilation of the Afrikaans ruler, and this is ...

  20. Methods for verifying compliance with low-level radioactive waste acceptance criteria

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    1993-09-01

    This report summarizes the methods that are currently employed and those that can be used to verify compliance with low-level radioactive waste (LLW) disposal facility waste acceptance criteria (WAC). This report presents the applicable regulations representing the Federal, State, and site-specific criteria for accepting LLW. Typical LLW generators are summarized, along with descriptions of their waste streams and final waste forms. General procedures and methods used by the LLW generators to verify compliance with the disposal facility WAC are presented. The report was written to provide an understanding of how a regulator could verify compliance with a LLW disposal facility's WAC. A comprehensive study of the methodology used to verify waste generator compliance with the disposal facility WAC is presented in this report. The study involved compiling the relevant regulations to define the WAC, reviewing regulatory agency inspection programs, and summarizing waste verification technology and equipment. The results of the study indicate that waste generators conduct verification programs that include packaging, classification, characterization, and stabilization elements. The current LLW disposal facilities perform waste verification steps on incoming shipments. A model inspection and verification program, which includes an emphasis on the generator's waste application documentation of their waste verification program, is recommended. The disposal facility verification procedures primarily involve the use of portable radiological survey instrumentation. The actual verification of generator compliance to the LLW disposal facility WAC is performed through a combination of incoming shipment checks and generator site audits.

  1. Descriptional complexity of non-unary self-verifying symmetric difference automata

    CSIR Research Space (South Africa)

    Marais, Laurette

    2017-09-01

    Previously, self-verifying symmetric difference automata were defined and a tight bound of 2^n-1-1 was shown for state complexity in the unary case. We now consider the non-unary case and show that, for every n at least 2, there is a regular...

  2. 13 CFR 127.403 - What happens if SBA verifies the concern's eligibility?

    Science.gov (United States)

    2010-01-01

    ... 13 Business Credit and Assistance 1 2010-01-01 2010-01-01 false What happens if SBA verifies the concern's eligibility? 127.403 Section 127.403 Business Credit and Assistance SMALL BUSINESS ADMINISTRATION WOMEN-OWNED SMALL BUSINESS FEDERAL CONTRACT ASSISTANCE PROCEDURES Eligibility Examinations § 127...

  3. 13 CFR 127.404 - What happens if SBA is unable to verify a concern's eligibility?

    Science.gov (United States)

    2010-01-01

    ... 13 Business Credit and Assistance 1 2010-01-01 2010-01-01 false What happens if SBA is unable to verify a concern's eligibility? 127.404 Section 127.404 Business Credit and Assistance SMALL BUSINESS ADMINISTRATION WOMEN-OWNED SMALL BUSINESS FEDERAL CONTRACT ASSISTANCE PROCEDURES Eligibility Examinations § 127...

  4. 40 CFR 8.9 - Measures to assess and verify environmental impacts.

    Science.gov (United States)

    2010-07-01

    ... environmental impacts. 8.9 Section 8.9 Protection of Environment ENVIRONMENTAL PROTECTION AGENCY GENERAL ENVIRONMENTAL IMPACT ASSESSMENT OF NONGOVERNMENTAL ACTIVITIES IN ANTARCTICA § 8.9 Measures to assess and verify environmental impacts. (a) The operator shall conduct appropriate monitoring of key environmental indicators as...

  5. Methods for verifying compliance with low-level radioactive waste acceptance criteria

    International Nuclear Information System (INIS)

    1993-09-01

    This report summarizes the methods that are currently employed and those that can be used to verify compliance with low-level radioactive waste (LLW) disposal facility waste acceptance criteria (WAC). This report presents the applicable regulations representing the Federal, State, and site-specific criteria for accepting LLW. Typical LLW generators are summarized, along with descriptions of their waste streams and final waste forms. General procedures and methods used by the LLW generators to verify compliance with the disposal facility WAC are presented. The report was written to provide an understanding of how a regulator could verify compliance with a LLW disposal facility's WAC. A comprehensive study of the methodology used to verify waste generator compliance with the disposal facility WAC is presented in this report. The study involved compiling the relevant regulations to define the WAC, reviewing regulatory agency inspection programs, and summarizing waste verification technology and equipment. The results of the study indicate that waste generators conduct verification programs that include packaging, classification, characterization, and stabilization elements. The current LLW disposal facilities perform waste verification steps on incoming shipments. A model inspection and verification program, which includes an emphasis on the generator's waste application documentation of their waste verification program, is recommended. The disposal facility verification procedures primarily involve the use of portable radiological survey instrumentation. The actual verification of generator compliance to the LLW disposal facility WAC is performed through a combination of incoming shipment checks and generator site audits

  6. Architecture optimization at IPEN animal facility in order to improve the welfare and the quality of the animals employed at radiopharmaceutical tests

    Energy Technology Data Exchange (ETDEWEB)

    Lainetti, Elizabeth Brigagao de Faria; Nascimento, Nanci do [Instituto de Pesquisas Energeticas e Nucleares (IPEN-CNEN/SP), Sao Paulo, SP (Brazil)], e-mail: eblainet@ipen.br; Passos, Luiz Augusto Correa [Universidade Estadual de Campinas, SP (Brazil). Centro Multidisciplinar para a Investigacao Biologica (CEMIB/UNICAMP)

    2009-07-01

    The production and supply of high-quality laboratory animals are essential for the accomplishment of vanguard scientific research with reproducibility and universality. The quality of those animals depends largely on the facilities available for their production and housing, which must assure the required sanitary control and animal welfare, in agreement with the ethical principles that govern the activity. The facilities also have to fulfil other requirements, such as: functional environments that enable suitable and efficient handling of the animals and facilitate the execution of routine activities; and respect for ergonomic principles, to provide a safe environment and operator well-being. The facility design is of vital importance for these requirements to be met. The design of the Nuclear and Energy Research Institute (IPEN) Animal House Facilities was completed in 1964. However, at that time the current recommendations with respect to sanitary, genetic and environmental controls did not yet exist. The facility was planned as a production unit and a place for keeping animals defined from the sanitary, genetic and environmental points of view. Nevertheless, the original layout presents an unsuitable distribution of the areas where animals are kept and different activities are performed. The Animal House Facilities occupies an area of 840 m² on a single floor, housing the production areas and the stock of animal models original to the institution itself, as well as animals maintained for other national or foreign institutions. It supplies rats and mice for biological tests of radiopharmaceutical lots produced at IPEN before they are sent to hospitals and clinics throughout Brazil for use in Nuclear Medicine. It also supplies rats and mice for tests of odontologic materials, for tests with growth hormones and for

  7. Architecture optimization at IPEN animal facility in order to improve the welfare and the quality of the animals employed at radiopharmaceutical tests

    International Nuclear Information System (INIS)

    Lainetti, Elizabeth Brigagao de Faria; Nascimento, Nanci do; Passos, Luiz Augusto Correa

    2009-01-01

    The production and supply of high-quality laboratory animals are essential for the accomplishment of vanguard scientific research with reproducibility and universality. The quality of those animals depends largely on the facilities available for their production and housing, which must assure the required sanitary control and animal welfare, in agreement with the ethical principles that govern the activity. The facilities also have to fulfil other requirements, such as: functional environments that enable suitable and efficient handling of the animals and facilitate the execution of routine activities; and respect for ergonomic principles, to provide a safe environment and operator well-being. The facility design is of vital importance for these requirements to be met. The design of the Nuclear and Energy Research Institute (IPEN) Animal House Facilities was completed in 1964. However, at that time the current recommendations with respect to sanitary, genetic and environmental controls did not yet exist. The facility was planned as a production unit and a place for keeping animals defined from the sanitary, genetic and environmental points of view. Nevertheless, the original layout presents an unsuitable distribution of the areas where animals are kept and different activities are performed. The Animal House Facilities occupies an area of 840 m² on a single floor, housing the production areas and the stock of animal models original to the institution itself, as well as animals maintained for other national or foreign institutions. It supplies rats and mice for biological tests of radiopharmaceutical lots produced at IPEN before they are sent to hospitals and clinics throughout Brazil for use in Nuclear Medicine. It also supplies rats and mice for tests of odontologic materials, for tests with growth hormones and for research of

  8. Accuracy of self-reported length of coma and posttraumatic amnesia in persons with medically verified traumatic brain injury.

    Science.gov (United States)

    Sherer, Mark; Sander, Angelle M; Maestas, Kacey Little; Pastorek, Nicholas J; Nick, Todd G; Li, Jingyun

    2015-04-01

    To determine the accuracy of self-reported length of coma and posttraumatic amnesia (PTA) in persons with medically verified traumatic brain injury (TBI) and to investigate factors that affect self-report of length of coma and PTA duration. Prospective cohort study. Specialized rehabilitation center with inpatient and outpatient programs. Persons (N=242) with medically verified TBI who were identified from a registry of persons who had previously participated in TBI-related research. Not applicable. Self-reported length of coma and self-reported PTA duration. Review of medical records revealed that the mean medically documented length of coma and PTA duration were 6.9±12 and 19.2±22 days, respectively, whereas the mean self-reported length of coma and PTA duration were 16.7±22 and 106±194 days, respectively. The average discrepancies between self-report and medical record for length of coma and PTA duration were 8.2±21 and 64±176 days, respectively. Multivariable regression models revealed that time since injury, performance on cognitive tests, and medical record values were associated with self-reported values for both length of coma and PTA duration. In this investigation, persons with medically verified TBI showed poor accuracy in their self-report of length of coma and PTA duration. Discrepancies were large enough to affect injury severity classification. Caution should be exercised when considering self-report of length of coma and PTA duration.

  9. Loss on Ignition Furnace Acceptance and Operability Test Procedure

    International Nuclear Information System (INIS)

    JOHNSTON, D.C.

    2000-01-01

    The purpose of this Acceptance Test Procedure and Operability Test Procedure (ATP/OTP) is to verify the operability of newly installed Loss on Ignition (LOI) equipment, including a model 1608FL CM™ furnace, a desiccator, and a balance. The operability of the furnace will be verified, as will the arrangement of the equipment placed in Glovebox 157-3/4 to perform LOI testing on samples supplied from the Thermal Stabilization line. In addition to verifying proper operation of the furnace, this ATP/OTP will also verify the air flow through the filters, verify a damper setting to establish and maintain the required differential pressure between the glovebox and the room, and test the integrity of the newly installed HEPA filter. In order to provide objective evidence of proper performance of the furnace, the furnace must heat 15 crucibles, mounted on a crucible rack, to 1000 °C, according to a program entered into the furnace controller located outside the glovebox. The glovebox differential pressure will be set to provide 0.5 to 2.0 inches of water (gauge) negative pressure inside the glovebox, with an expected airflow of 100 to 125 cubic feet per minute (cfm) through the inlet filter. The glovebox inlet G1 filter will be flow tested to ensure the integrity of the filter connections and the efficiency of the filter medium. The newly installed windows and glovebox extension, as well as all disturbed joints, will be sonically tested via ultra probe to verify that no leaks are present. The procedure for DOS testing of the filter is found in Appendix A.
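    The acceptance bands quoted in the procedure (0.5 to 2.0 inches of water gauge negative pressure; 100 to 125 cfm inlet airflow) amount to simple range checks. A minimal illustrative helper, not part of the actual ATP/OTP (the function name and interface are assumptions):

    ```python
    def glovebox_within_limits(neg_dp_inches_wg: float, airflow_cfm: float) -> bool:
        """Return True when both readings fall inside the quoted acceptance bands:
        negative differential pressure 0.5-2.0 in. w.g., inlet airflow 100-125 cfm.
        (Illustrative sketch only; not taken from the procedure itself.)"""
        return 0.5 <= neg_dp_inches_wg <= 2.0 and 100.0 <= airflow_cfm <= 125.0
    ```
    
    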

  10. Loss on Ignition Furnace Acceptance and Operability Test Procedure

    Energy Technology Data Exchange (ETDEWEB)

    JOHNSON, D.C.

    2000-06-01

    The purpose of this Acceptance Test Procedure and Operability Test Procedure (ATP/OTP) is to verify the operability of newly installed Loss on Ignition (LOI) equipment, including a model 1608FL CM™ furnace, a desiccator, and a balance. The operability of the furnace will be verified, as will the arrangement of the equipment placed in Glovebox 157-3/4 to perform LOI testing on samples supplied from the Thermal Stabilization line. In addition to verifying proper operation of the furnace, this ATP/OTP will also verify the air flow through the filters, verify a damper setting to establish and maintain the required differential pressure between the glovebox and the room, and test the integrity of the newly installed HEPA filter. In order to provide objective evidence of proper performance of the furnace, the furnace must heat 15 crucibles, mounted on a crucible rack, to 1000 °C, according to a program entered into the furnace controller located outside the glovebox. The glovebox differential pressure will be set to provide 0.5 to 2.0 inches of water (gauge) negative pressure inside the glovebox, with an airflow of 100 to 125 cubic feet per minute (cfm) through the inlet filter. The glovebox inlet G1 filter will be flow tested to ensure the integrity of the filter connections and the efficiency of the filter medium. The newly installed windows and glovebox extension, as well as all disturbed joints, will be sonically tested via ultra probe to verify that no leaks are present. The procedure for DOS testing of the filter is found in Appendix A.

  11. [Sequencing and analysis of the resistome of Streptomyces fradiae ATCC19609 in order to develop a test system for screening of new antimicrobial agents].

    Science.gov (United States)

    Vatlin, A A; Bekker, O B; Lysenkova, L N; Korolev, A M; Shchekotikhin, A E; Danilenko, V N

    2016-06-01

    The paper provides the annotation and data on sequencing the antibiotic resistance genes in Streptomyces fradiae strain ATCC19609, highly sensitive to different antibiotics. Genome analysis revealed four groups of genes that determined the resistome of the tested strain. These included classical antibiotic resistance genes (nine aminoglycoside phosphotransferase genes, two beta-lactamase genes, and the genes of puromycin N-acetyltransferase, phosphinothricin N-acetyltransferase, and aminoglycoside acetyltransferase); the genes of ATP-dependent ABC transporters, involved in the efflux of antibiotics from the cell (MacB-2, BcrA, two-subunit MDR1); the genes of positive and negative regulation of transcription (whiB and padR families); and the genes of post-translational modification (serine-threonine protein kinases). A comparative characteristic of aminoglycoside phosphotransferase genes in S. fradiae ATCC19609, S. lividans TK24, and S. albus J1074, the causative agent of actinomycosis, is provided. The possibility of using the S. fradiae strain ATCC19609 as the test system for selection of the macrolide antibiotic oligomycin A derivatives with different levels of activity is demonstrated. Analysis of more than 20 semisynthetic oligomycin A derivatives made it possible to divide them into three groups according to the level of activity: inactive (>1 nmol/disk), 10 substances; with medium activity level (0.05–1 nmol/disk), 12 substances; and more active (0.01–0.05 nmol/disk), 2 substances. A change at the position of the 33rd carbon atom in the oligomycin A molecule proved important for the activity of the semisynthetic derivatives.

  12. An approach for verifying biogenic greenhouse gas emissions inventories with atmospheric CO2 concentration data

    International Nuclear Information System (INIS)

    Ogle, Stephen M; Davis, Kenneth; Lauvaux, Thomas; Miles, Natasha L; Richardson, Scott; Schuh, Andrew; Cooley, Dan; Breidt, F Jay; West, Tristram O; Heath, Linda S; Smith, James E; McCarty, Jessica L; Gurney, Kevin R; Tans, Pieter; Denning, A Scott

    2015-01-01

    Verifying national greenhouse gas (GHG) emissions inventories is a critical step to ensure that reported emissions data to the United Nations Framework Convention on Climate Change (UNFCCC) are accurate and representative of a country’s contribution to GHG concentrations in the atmosphere. Furthermore, verifying biogenic fluxes provides a check on estimated emissions associated with managing lands for carbon sequestration and other activities, which often have large uncertainties. We report here on the challenges and results associated with a case study using atmospheric measurements of CO2 concentrations and inverse modeling to verify nationally-reported biogenic CO2 emissions. The biogenic CO2 emissions inventory was compiled for the Mid-Continent region of the United States based on methods and data used by the US government for reporting to the UNFCCC, along with additional sources and sinks to produce a full carbon balance. The biogenic emissions inventory produced an estimated flux of −408 ± 136 Tg CO2 for the entire study region, which was not statistically different from the biogenic flux of −478 ± 146 Tg CO2 that was estimated using the atmospheric CO2 concentration data. At sub-regional scales, the spatial density of atmospheric observations did not appear sufficient to verify emissions in general. However, a difference between the inventory and inversion results was found in one isolated area of West-central Wisconsin. This part of the region is dominated by forestlands, suggesting that further investigation may be warranted into the forest C stock or harvested wood product data from this portion of the study area. The results suggest that observations of atmospheric CO2 concentration data and inverse modeling could be used to verify biogenic emissions, and provide more confidence in biogenic GHG emissions reporting to the UNFCCC. (letter)
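
    The headline comparison in this record (inventory −408 ± 136 vs. inversion −478 ± 146 Tg CO2) reduces to testing whether two uncertain estimates differ. A minimal sketch of that check, assuming (our assumption, not stated in the abstract) that the ± values are independent 95% confidence half-widths of approximately normal estimates:

```python
import math

def fluxes_differ(est1, half1, est2, half2, z_crit=1.96):
    """Two-sided z-test on the difference of two independent estimates.

    half1/half2 are 95% CI half-widths, converted to standard errors
    by dividing by 1.96 (assumes approximate normality).
    """
    se1, se2 = half1 / 1.96, half2 / 1.96
    se_diff = math.sqrt(se1 ** 2 + se2 ** 2)  # standard error of the difference
    z = abs(est1 - est2) / se_diff
    return z > z_crit                          # True -> statistically different

# Inventory: -408 +/- 136 Tg CO2; inversion: -478 +/- 146 Tg CO2
print(fluxes_differ(-408, 136, -478, 146))     # False
```

    With the reported values the test returns False, matching the abstract's conclusion that the two flux estimates are not statistically different.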

  13. Order Aggressiveness and Order Book Dynamics

    OpenAIRE

    Anthony D. Hall; Nikolaus Hautsch

    2004-01-01

    In this paper, we study the determinants of order aggressiveness and traders' order submission strategy in an open limit order book market. Using order book data from the Australian Stock Exchange, we model traders' aggressiveness in market trading, limit order trading as well as in order cancellations on both sides of the market using a six-dimensional autoregressive intensity model. The information revealed by the open order book plays an important role in explaining the degree of order agg...

  14. Using Participatory System Dynamics Modeling to Examine the Local HIV Test and Treatment Care Continuum in Order to Reduce Community Viral Load.

    Science.gov (United States)

    Weeks, Margaret R; Li, Jianghong; Lounsbury, David; Green, Helena Danielle; Abbott, Maryann; Berman, Marcie; Rohena, Lucy; Gonzalez, Rosely; Lang, Shawn; Mosher, Heather

    2017-12-01

    Achieving community-level goals to eliminate the HIV epidemic requires coordinated efforts through community consortia with a common purpose to examine and critique their own HIV testing and treatment (T&T) care system and build effective tools to guide their efforts to improve it. Participatory system dynamics (SD) modeling offers conceptual, methodological, and analytical tools to engage diverse stakeholders in systems conceptualization and visual mapping of dynamics that undermine community-level health outcomes and identify those that can be leveraged for systems improvement. We recruited and engaged a 25-member multi-stakeholder Task Force, whose members provide or utilize HIV-related services, to participate in SD modeling to examine and address problems of their local HIV T&T service system. Findings from the iterative model building sessions indicated Task Force members' increasingly complex understanding of the local HIV care system and demonstrated their improved capacity to visualize and critique multiple models of the HIV T&T service system and identify areas of potential leverage. Findings also showed members' enhanced communication and consensus in seeking deeper systems understanding and options for solutions. We discuss implications of using these visual SD models for subsequent simulation modeling of the T&T system and for other community applications to improve system effectiveness. © Society for Community Research and Action 2017.

  15. Evaluation of zero-order controlled release preparations of nifedipine tablet on dissolution test, together with cost benefit point of view.

    Science.gov (United States)

    Sakurai, Miyuki; Naruto, Ikue; Matsuyama, Kenji

    2008-05-01

    Many generic drugs have been released to decrease medical expenses, but some problems have been reported with regard to bioavailability and safety. In this study, we compared three once-a-day controlled-release preparations of nifedipine by the dissolution test (one branded and two generic preparations). Although the two generic drugs were equivalent to the branded drug according to the criteria listed in the Japanese "Guideline for Bioequivalence Studies of Generic Products", there was still a possibility of problems arising. For example, side effects could be caused by a rapid increase in the blood level of nifedipine with one generic drug, while bioavailability might be inadequate with the other due to its small area under the concentration vs. time curve. When each drug was prescribed at a dosage of 20 mg once daily for two weeks, the difference in the copayment for the patient was only 10 yen. Accordingly, it is important for doctors and pharmacists to carefully consider whether such a slight difference in price is really a benefit for the patient.

  16. Order Theory in Environmental Sciences

    DEFF Research Database (Denmark)

    Sørensen, P. B.; Brüggemann, R.; Lerche, D. B.

    This is the proceedings from the fifth workshop on Order Theory in Environmental Sciences. In this workshop series the concept of Partial Order Theory is developed in relation to applications, and its use is tested on specific problems. The Partial Order Theory will have a potential use...

  17. A high-order SPH method by introducing inverse kernels

    Directory of Open Access Journals (Sweden)

    Le Fang

    2017-02-01

    The smoothed particle hydrodynamics (SPH) method is usually expected to be an efficient numerical tool for calculating the fluid-structure interactions in compressors; however, an inherent restriction is the problem of low-order consistency. A high-order SPH method based on introducing inverse kernels, which is easy to implement yet efficient, is proposed to resolve this restriction. The basic inverse method and the special treatment near boundaries are introduced, together with a discussion of combining the Least-Square (LS) and Moving-Least-Square (MLS) methods. A detailed analysis in spectral space is then presented to clarify the method's behavior. Finally, three test examples are shown to verify the method.
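
    For context, here is a bare-bones 1D SPH summation-density sketch using the standard cubic spline (M4) kernel. This is the uncorrected baseline, not the paper's inverse-kernel method; particle spacing, masses, and smoothing length are illustrative choices:

```python
def cubic_spline_1d(r, h):
    """Standard M4 cubic spline SPH kernel in 1D (normalization 2/(3h))."""
    q = abs(r) / h
    sigma = 2.0 / (3.0 * h)
    if q < 1.0:
        return sigma * (1.0 - 1.5 * q ** 2 + 0.75 * q ** 3)
    elif q < 2.0:
        return sigma * 0.25 * (2.0 - q) ** 3
    return 0.0

def sph_density(x, masses, h):
    """Summation density: rho_i = sum_j m_j * W(x_i - x_j, h)."""
    return [sum(m * cubic_spline_1d(xi - xj, h) for xj, m in zip(x, masses))
            for xi in x]

# Uniform unit-mass particles with unit spacing: interior density should be ~1
xs = [float(i) for i in range(20)]
rho = sph_density(xs, [1.0] * 20, h=1.2)
```

    Interior particles recover a density close to 1, while particles near the domain ends underestimate it because part of the kernel support is empty; that loss of consistency near boundaries is the kind of defect the inverse-kernel and MLS corrections discussed in the paper target.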

  18. Trend analysis of δ18O composition of precipitation in Germany: Combining Mann-Kendall trend test and ARIMA models to correct for higher order serial correlation

    Science.gov (United States)

    Klaus, Julian; Pan Chun, Kwok; Stumpp, Christine

    2015-04-01

    Spatio-temporal dynamics of stable oxygen (18O) and hydrogen (2H) isotopes in precipitation can be used as proxies for changing hydro-meteorological conditions and for regional and global climate patterns. While spatial patterns and distributions have gained much attention in recent years, temporal trends in stable isotope time series are rarely investigated and our understanding of them is still limited. This may result from a lack of proper trend detection tools and of effort spent exploring trend processes. Here we make use of an extensive data set of stable isotopes in German precipitation. In this study we investigate temporal trends of δ18O in precipitation at 17 observation stations in Germany between 1978 and 2009. To that end we test different approaches for proper trend detection, accounting for first and higher order serial correlation, and test whether significant trends in the isotope time series can be observed under the different models. We apply the Mann-Kendall trend test to the isotope series, using general multiplicative seasonal autoregressive integrated moving average (ARIMA) models that account for first and higher order serial correlations. With this approach we can also account for the effects of temperature and precipitation amount on the trend. Further, we investigate the role of geographic parameters in isotope trends. To benchmark our proposed approach, the ARIMA results are compared to a trend-free prewhitening (TFPW) procedure, the state-of-the-art method for removing first order autocorrelation in environmental trend studies. Moreover, we explore whether higher order serial correlations in the isotope series affect our trend results. The results show that three out of the 17 stations have significant changes when higher order autocorrelations are adjusted for, and four stations show a significant trend when temperature and precipitation effects are considered. Significant trends in the isotope time series are generally observed at low elevation stations (≤315 m a
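
    The base statistic used in the study is simple to state. A minimal sketch of the classical Mann-Kendall test follows (normal approximation, no tie or serial-correlation correction, so none of the ARIMA/TFPW machinery the abstract describes; the sample series is invented):

```python
import math

def mann_kendall(x):
    """Classical Mann-Kendall trend test (no tie correction).

    Returns (S, z): S is the sum of pairwise signs of increments,
    z the normal-approximation score (|z| > 1.96 -> significant at 5%).
    """
    n = len(x)
    s = sum((x[j] > x[i]) - (x[j] < x[i])
            for i in range(n - 1) for j in range(i + 1, n))
    var_s = n * (n - 1) * (2 * n + 5) / 18.0   # variance of S without ties
    if s > 0:
        z = (s - 1) / math.sqrt(var_s)
    elif s < 0:
        z = (s + 1) / math.sqrt(var_s)
    else:
        z = 0.0
    return s, z

s, z = mann_kendall([1.0, 2.1, 1.8, 3.2, 3.0, 4.5, 4.9, 5.4])  # s = 24, z > 1.96
```

    Serially correlated data inflate |z| beyond its nominal distribution, which is exactly why the study pairs the test with ARIMA models or a prewhitening step.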

  19. Verifying compliance with nuclear non-proliferation undertakings: IAEA safeguards agreements and additional protocols

    International Nuclear Information System (INIS)

    2008-06-01

    This report provides background information on safeguards and explains procedures for States to conclude Additional Protocols to comprehensive Safeguards Agreements with the IAEA. Since the IAEA was founded in 1957, its safeguards system has been an indispensable component of the nuclear non-proliferation regime and has facilitated peaceful nuclear cooperation. In recognition of this, the Treaty on the Non-Proliferation of Nuclear Weapons (NPT) makes it mandatory for all non-nuclear-weapon States (NNWS) party to the Treaty to conclude comprehensive safeguards agreements with the IAEA, and thus allow for the application of safeguards to all their nuclear material. Under Article III of the NPT, all NNWS undertake to accept safeguards, as set forth in agreements to be negotiated and concluded with the IAEA, for the exclusive purpose of verification of the fulfilment of the States' obligations under the NPT. In May 1997, the IAEA Board of Governors approved the Model Additional Protocol to Safeguards Agreements (reproduced in INFCIRC/540(Corr.)) which provided for an additional legal authority. In States that have both a comprehensive safeguards agreement and an additional protocol in force, the IAEA is able to optimize the implementation of all safeguards measures available. In order to simplify certain procedures under comprehensive safeguards agreements for States with little or no nuclear material and no nuclear material in a facility, the IAEA began making available, in 1971, a 'small quantities protocol' (SQP), which held in abeyance the implementation of most of the detailed provisions of comprehensive safeguards agreements for so long as the State concerned satisfied these criteria. The safeguards system aims at detecting and deterring the diversion of nuclear material. Such material includes enriched uranium, plutonium and uranium-233, which could be used directly in nuclear weapons. 
It also includes natural uranium and depleted uranium, the latter of which is

  20. Verifying compliance with nuclear non-proliferation undertakings: IAEA safeguards agreements and additional protocols

    International Nuclear Information System (INIS)

    2008-04-01

    This report provides background information on safeguards and explains procedures for States to conclude Additional Protocols to comprehensive Safeguards Agreements with the IAEA. Since the IAEA was founded in 1957, its safeguards system has been an indispensable component of the nuclear non-proliferation regime and has facilitated peaceful nuclear cooperation. In recognition of this, the Treaty on the Non-Proliferation of Nuclear Weapons (NPT) makes it mandatory for all non-nuclear-weapon States (NNWS) party to the Treaty to conclude comprehensive safeguards agreements with the IAEA, and thus allow for the application of safeguards to all their nuclear material. Under Article III of the NPT, all NNWS undertake to accept safeguards, as set forth in agreements to be negotiated and concluded with the IAEA, for the exclusive purpose of verification of the fulfilment of the States' obligations under the NPT. In May 1997, the IAEA Board of Governors approved the Model Additional Protocol to Safeguards Agreements (reproduced in INFCIRC/540(Corr.)) which provided for an additional legal authority. In States that have both a comprehensive safeguards agreement and an additional protocol in force, the IAEA is able to optimize the implementation of all safeguards measures available. In order to simplify certain procedures under comprehensive safeguards agreements for States with little or no nuclear material and no nuclear material in a facility, the IAEA began making available, in 1971, a 'small quantities protocol' (SQP), which held in abeyance the implementation of most of the detailed provisions of comprehensive safeguards agreements for so long as the State concerned satisfied these criteria. The safeguards system aims at detecting and deterring the diversion of nuclear material. Such material includes enriched uranium, plutonium and uranium-233, which could be used directly in nuclear weapons. 
It also includes natural uranium and depleted uranium, the latter of which is

  1. Stability analysis of distributed order fractional chen system.

    Science.gov (United States)

    Aminikhah, H; Refahi Sheikhani, A; Rezazadeh, H

    2013-01-01

    We first investigate sufficient and necessary conditions of stability of nonlinear distributed order fractional system and then we generalize the integer-order Chen system into the distributed order fractional domain. Based on the asymptotic stability theory of nonlinear distributed order fractional systems, the stability of distributed order fractional Chen system is discussed. In addition, we have found that chaos exists in the double fractional order Chen system. Numerical solutions are used to verify the analytical results.
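
    For reference, the integer-order Chen system that the paper generalizes can be sketched with a plain forward-Euler integrator. The parameter values a = 35, b = 3, c = 28 are the standard chaotic choice; the step size and initial state are illustrative, and the distributed-order fractional memory terms are deliberately omitted:

```python
def chen_step(state, dt, a=35.0, b=3.0, c=28.0):
    """One forward-Euler step of the integer-order Chen system."""
    x, y, z = state
    dx = a * (y - x)
    dy = (c - a) * x - x * z + c * y
    dz = x * y - b * z
    return (x + dt * dx, y + dt * dy, z + dt * dz)

def simulate(state=(1.0, 1.0, 1.0), dt=1e-3, steps=5000):
    """Integrate the Chen system and return the trajectory."""
    traj = [state]
    for _ in range(steps):
        state = chen_step(state, dt)
        traj.append(state)
    return traj

traj = simulate()  # trajectory stays bounded on the chaotic attractor
```

    A fractional or distributed-order version would replace the single-step update with a weighted sum over the state history, which is where the stability conditions analyzed in the paper come into play.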

  2. Stability Analysis of Distributed Order Fractional Chen System

    Science.gov (United States)

    Aminikhah, H.; Refahi Sheikhani, A.; Rezazadeh, H.

    2013-01-01

    We first investigate sufficient and necessary conditions of stability of nonlinear distributed order fractional system and then we generalize the integer-order Chen system into the distributed order fractional domain. Based on the asymptotic stability theory of nonlinear distributed order fractional systems, the stability of distributed order fractional Chen system is discussed. In addition, we have found that chaos exists in the double fractional order Chen system. Numerical solutions are used to verify the analytical results. PMID:24489508

  3. Efficient Verifiable Range and Closest Point Queries in Zero-Knowledge

    Directory of Open Access Journals (Sweden)

    Ghosh Esha

    2016-10-01

    We present an efficient method for answering one-dimensional range and closest-point queries in a verifiable and privacy-preserving manner. We consider a model where a data owner outsources a dataset of key-value pairs to a server, who answers range and closest-point queries issued by a client and provides proofs of the answers. The client verifies the correctness of the answers while learning nothing about the dataset besides the answers to the current and previous queries. Our work yields for the first time a zero-knowledge privacy assurance to authenticated range and closest-point queries. Previous work leaked the size of the dataset and used an inefficient proof protocol. Our construction is based on hierarchical identity-based encryption. We prove its security and analyze its efficiency both theoretically and with experiments on synthetic and real data (Enron email and Boston taxi datasets.

  4. A Secure and Verifiable Outsourced Access Control Scheme in Fog-Cloud Computing.

    Science.gov (United States)

    Fan, Kai; Wang, Junxiong; Wang, Xin; Li, Hui; Yang, Yintang

    2017-07-24

    With the rapid development of big data and Internet of things (IOT), the number of networking devices and data volume are increasing dramatically. Fog computing, which extends cloud computing to the edge of the network can effectively solve the bottleneck problems of data transmission and data storage. However, security and privacy challenges are also arising in the fog-cloud computing environment. Ciphertext-policy attribute-based encryption (CP-ABE) can be adopted to realize data access control in fog-cloud computing systems. In this paper, we propose a verifiable outsourced multi-authority access control scheme, named VO-MAACS. In our construction, most encryption and decryption computations are outsourced to fog devices and the computation results can be verified by using our verification method. Meanwhile, to address the revocation issue, we design an efficient user and attribute revocation method for it. Finally, analysis and simulation results show that our scheme is both secure and highly efficient.

  5. A Secure and Verifiable Outsourced Access Control Scheme in Fog-Cloud Computing

    Science.gov (United States)

    Fan, Kai; Wang, Junxiong; Wang, Xin; Li, Hui; Yang, Yintang

    2017-01-01

    With the rapid development of big data and Internet of things (IOT), the number of networking devices and data volume are increasing dramatically. Fog computing, which extends cloud computing to the edge of the network can effectively solve the bottleneck problems of data transmission and data storage. However, security and privacy challenges are also arising in the fog-cloud computing environment. Ciphertext-policy attribute-based encryption (CP-ABE) can be adopted to realize data access control in fog-cloud computing systems. In this paper, we propose a verifiable outsourced multi-authority access control scheme, named VO-MAACS. In our construction, most encryption and decryption computations are outsourced to fog devices and the computation results can be verified by using our verification method. Meanwhile, to address the revocation issue, we design an efficient user and attribute revocation method for it. Finally, analysis and simulation results show that our scheme is both secure and highly efficient. PMID:28737733

  6. Agreement between self-reported and physically verified male circumcision status in Nyanza region, Kenya: Evidence from the TASCO study.

    Science.gov (United States)

    Odoyo-June, Elijah; Agot, Kawango; Mboya, Edward; Grund, Jonathan; Musingila, Paul; Emusu, Donath; Soo, Leonard; Otieno-Nyunya, Boaz

    2018-01-01

    Self-reported male circumcision (MC) status is widely used to estimate community prevalence of circumcision, although its accuracy varies in different settings depending on the extent of misreporting. Despite this challenge, self-reported MC status remains essential because it is the most feasible method of collecting MC status data in community surveys. Therefore, its accuracy is an important determinant of the reliability of MC prevalence estimates based on such surveys. We measured the concurrence between self-reported and physically verified MC status among men aged 25-39 years during a baseline household survey for a study to test strategies for enhancing MC uptake by older men in the Nyanza region of Kenya. The objective was to determine the accuracy of self-reported MC status in communities where MC for HIV prevention is being rolled out. Agreement between self-reported and physically verified MC status was measured among 4,232 men. A structured questionnaire was used to collect data on MC status, followed by physical examination to verify the actual MC status, whose outcome was recorded as fully circumcised (no foreskin), partially circumcised (foreskin is past corona sulcus but covers less than half of the glans) or uncircumcised (foreskin covers half or more of the glans). The sensitivity and specificity of self-reported MC status were calculated using physically verified MC status as the gold standard. Out of 4,232 men, 2,197 (51.9%) reported being circumcised, of whom 99.0% were confirmed to be fully circumcised on physical examination. Among 2,035 men who reported being uncircumcised, 93.7% (1,907/2,035) were confirmed uncircumcised on physical examination. Agreement between self-reported and physically verified MC status was almost perfect, kappa (k) = 98.6% (95% CI, 98.1%-99.1%).
The sensitivity of self-reporting being circumcised was 99.6% (95% CI, 99.2-99.8) while specificity of self-reporting uncircumcised was 99.0% (95% CI, 98.4-99.4) and did not differ
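
    The agreement measures quoted in this record (sensitivity, specificity, Cohen's kappa) all derive from a standard 2x2 table. A small sketch with hypothetical counts (not the study's data, whose partially-circumcised category complicates a plain 2x2 layout):

```python
def binary_agreement(tp, fn, fp, tn):
    """Sensitivity, specificity and Cohen's kappa from a 2x2 table.

    tp = reported positive and verified positive
    fn = reported negative but verified positive
    fp = reported positive but verified negative
    tn = reported negative and verified negative
    """
    n = tp + fn + fp + tn
    sensitivity = tp / (tp + fn)
    specificity = tn / (tn + fp)
    po = (tp + tn) / n                        # observed agreement
    # chance agreement from the marginals of the two raters
    pe = ((tp + fp) * (tp + fn) + (fn + tn) * (fp + tn)) / n ** 2
    kappa = (po - pe) / (1 - pe)
    return sensitivity, specificity, kappa

# Hypothetical counts for illustration only
sens, spec, kappa = binary_agreement(95, 5, 2, 98)  # -> (0.95, 0.98, 0.93)
```

    Kappa discounts the agreement expected by chance from the marginal totals, which is why it is preferred over raw percent agreement for self-report validation studies like this one.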

  7. ACCURATE ESTIMATES OF CHARACTERISTIC EXPONENTS FOR SECOND ORDER DIFFERENTIAL EQUATION

    Institute of Scientific and Technical Information of China (English)

    2009-01-01

    In this paper, a second order linear differential equation is considered, and an accurate method for estimating its characteristic exponent is presented. Finally, we give some examples to verify the feasibility of our result.

  8. Election Verifiability: Cryptographic Definitions and an Analysis of Helios and JCJ

    Science.gov (United States)

    2015-08-06

    Computer Society, 2014. To appear. [26] David Chaum. Untraceable electronic mail, return addresses, and digital pseudonyms. Communications of the ACM, 24(2):84–88, 1981. [27] David Chaum. Secret-ballot receipts: True voter-verifiable elections. IEEE Security and Privacy, 2(1):38–47, 2004. [28] David Chaum, Richard Carback, Jeremy Clark, Aleksander Essex, Stefan Popoveniuc, Ronald L. Rivest, Peter Y. A. Ryan, Emily Shen, and Alan T. Sherman

  9. Using Concept Space to Verify Hyponymy in Building a Hyponymy Lexicon

    Science.gov (United States)

    Liu, Lei; Zhang, Sen; Diao, Lu Hong; Yan, Shu Ying; Cao, Cun Gen

    Verification of hyponymy relations is a basic problem in knowledge acquisition. We present a method of hyponymy verification based on concept space. Firstly, we give the definition of the concept space of a group of candidate hyponymy relations. Secondly, we analyze the concept space and define a set of hyponymy features based on the space structure. Then we use these features to verify candidate hyponymy relations. Experimental results show that the method provides adequate verification of hyponymy.

  10. Review of Ground Systems Development and Operations (GSDO) Tools for Verifying Command and Control Software

    Science.gov (United States)

    Aguilar, Michael L.; Bonanne, Kevin H.; Favretto, Jeffrey A.; Jackson, Maddalena M.; Jones, Stephanie L.; Mackey, Ryan M.; Sarrel, Marc A.; Simpson, Kimberly A.

    2014-01-01

    The Exploration Systems Development (ESD) Standing Review Board (SRB) requested the NASA Engineering and Safety Center (NESC) conduct an independent review of the plan developed by Ground Systems Development and Operations (GSDO) for identifying models and emulators to create a tool(s) to verify their command and control software. The NESC was requested to identify any issues or weaknesses in the GSDO plan. This document contains the outcome of the NESC review.

  11. A method of verifying period signals based on a data acquisition card

    International Nuclear Information System (INIS)

    Zeng Shaoli

    2005-01-01

    This paper introduces a method to verify the index voltage of a period signal generator using a data acquisition card, with an error of less than 0.5%. A corresponding Win32 program, which uses a custom-developed VxD driver to control the data acquisition card's direct I/O and uses multithreading to obtain the best time-scale precision, was developed on the Windows platform. The program collects index voltage data in real time and automatically measures the period. (authors)

  12. Fractional Order Generalized Information

    Directory of Open Access Journals (Sweden)

    José Tenreiro Machado

    2014-04-01

    This paper formulates a novel expression for entropy inspired by the properties of fractional calculus. The characteristics of the generalized fractional entropy are tested both on standard probability distributions and on real-world data series. The results reveal that tuning the fractional order allows a high sensitivity to the signal evolution, which is useful in describing the dynamics of complex systems. The concepts are also extended to relative distances and tested with several sets of data, confirming the goodness of the generalization.

  13. Procedures for measuring and verifying gastric tube placement in newborns: an integrative review.

    Science.gov (United States)

    Dias, Flávia de Souza Barbosa; Emidio, Suellen Cristina Dias; Lopes, Maria Helena Baena de Moraes; Shimo, Antonieta Keiko Kakuda; Beck, Ana Raquel Medeiros; Carmona, Elenice Valentim

    2017-07-10

    to investigate evidence in the literature on procedures for measuring gastric tube insertion in newborns and verifying its placement, using alternative procedures to radiological examination. an integrative review of the literature carried out in the Cochrane, LILACS, CINAHL, EMBASE, MEDLINE and Scopus databases using the descriptors "Intubation, gastrointestinal" and "newborns" in original articles. seventeen publications were included and categorized as "measuring method" or "technique for verifying placement". Regarding measuring methods, the measurements of two morphological distances and the application of two formulas, one based on weight and another based on height, were found. Regarding the techniques for assessing placement, the following were found: electromagnetic tracing, diaphragm electrical activity, CO2 detection, indigo carmine solution, epigastrium auscultation, gastric secretion aspiration, color inspection, and evaluation of pH, enzymes and bilirubin. the measuring method using the distance from the nose to the earlobe to a point midway between the xiphoid process and the umbilicus presents the best evidence. Equations based on weight and height need to be experimentally tested. The return of secretion on tube aspiration, color assessment and secretion pH are reliable indicators for identifying gastric tube placement, and are the currently indicated techniques.

  14. The AutoProof Verifier: Usability by Non-Experts and on Standard Code

    Directory of Open Access Journals (Sweden)

    Carlo A. Furia

    2015-08-01

    Formal verification tools are often developed by experts for experts; as a result, their usability by programmers with little formal methods experience may be severely limited. In this paper, we discuss this general phenomenon with reference to AutoProof: a tool that can verify the full functional correctness of object-oriented software. In particular, we present our experiences of using AutoProof in two contrasting contexts representative of non-expert usage. First, we discuss its usability by students in a graduate course on software verification, who were tasked with verifying implementations of various sorting algorithms. Second, we evaluate its usability in verifying code developed for programming assignments of an undergraduate course. The first scenario represents usability by serious non-experts; the second represents usability on "standard code", developed without full functional verification in mind. We report our experiences and lessons learnt, from which we derive some general suggestions for furthering the development of verification tools with respect to improving their usability.

  15. Use of the Neutron Die-Away Technique to Test Control Rod Effectiveness Theories; Emploi de la Methode d'Absorption des Neutrons pour Verifier les Theories sur l'Efficacite des Barres de Commande; Ispol'zovanie metoda spada potoka nejtronov dlya proverki teorij ehffektivnosti reguliruyushchikh sterzhnej; Aplicacion de la Tecnica de Extincion Neutronica a la Verificacion de las Teorias sobre la Eficacia de las Barras de Control

    Energy Technology Data Exchange (ETDEWEB)

    Perez, R. B. [University of Florida, Gainesville, FL (United States); De Saussure, G.; Silver, E. G. [University of Florida, Gainesville, FL (United States); Oak Ridge National Laboratory, Oak Ridge, TN (United States)

    1964-04-15

    The calculation of control rod effectiveness is complicated by its dependence on both the neutron energy distribution and the geometry of the assembly. When one compares theory with experimental results obtained from either reactors or subcritical systems, difficulties arise because of the intrinsic complexity of such systems. The neutron die-away technique affords the possibility of an all-thermal neutron model, in which the neutron energy distribution can be separated from spatial effects. Hence, the geometrical factor of the control rod effectiveness can be studied without regard to the details of the neutron spectrum, and the results compared with a clean, simple experimental set-up. The method is based on the fact that in a neutron die-away experiment of the type described here, the buckling of the assembly is related to the decay constant of the fundamental mode by B² = (λ - λ_a)/D, where λ_a is the inverse lifetime of the neutrons in the moderator (s⁻¹) and D is the diffusion constant (cm²/s). The moderating assemblies used for these experiments were rectangular prisms of beryllium, built in several sizes from small (2.54 cm high, 7.3 cm square) blocks. Three types of cadmium control rods were used: thin rods 0.476 cm in diameter; a rod of cruciform section; and hollow "thick" rods of 7.3 cm x 7.3 cm cross-section. The theoretical schemes tested were: (1) Nordheim-Scalettar; (2) Hurwitz-Roe; (3) a numerical diffusion code. The effect of a cruciform absorber was computed using the Hurwitz-Roe conformal transformation technique, and a value of 0.0188 cm⁻² was found for the buckling, which compares with the experimental result of 0.0187 ± 0.0006 cm⁻². For the thick rods, both Nordheim-Scalettar and the diffusion code overestimated the experimental results by about 10%. However, the interaction between thick rods was correctly predicted by both methods. 
For thin rods, the Nordheim-Scalettar technique was
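
The die-away relation B{sup 2} = ({lambda} - {lambda}{sub a})/D can be applied directly to a measured fundamental-mode decay constant. A minimal sketch; the numerical values below are illustrative, not the paper's beryllium data.

```python
# Buckling from the fundamental-mode decay constant in a die-away experiment:
# B^2 = (lambda - lambda_a) / D, where lambda_a is the inverse neutron
# lifetime in the moderator (1/s) and D is the diffusion constant (cm^2/s).
# All numbers are illustrative, not the paper's measured beryllium values.

def buckling(decay_const, inv_lifetime, diffusion_const):
    """B^2 (cm^-2) from lambda (1/s), lambda_a (1/s), and D (cm^2/s)."""
    return (decay_const - inv_lifetime) / diffusion_const

b2 = buckling(decay_const=4500.0, inv_lifetime=280.0, diffusion_const=2.2e5)
print(f"B^2 = {b2:.4f} cm^-2")
```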

  16. Middle-aged patients with an MRI-verified medial meniscal tear report symptoms commonly associated with knee osteoarthritis

    DEFF Research Database (Denmark)

    Hare, Kristoffer B.; Stefan Lohmander, L.; Kise, Nina Jullum

    2017-01-01

    Background and purpose — No consensus exists on when to perform arthroscopic partial meniscectomy in patients with a degenerative meniscal tear. Since MRI and clinical tests are not accurate in detecting a symptomatic meniscal lesion, the patient’s symptoms often play a large role when deciding...... when to perform surgery. We determined the prevalence and severity of self-reported knee symptoms in patients eligible for arthroscopic partial meniscectomy due to a degenerative meniscal tear. We investigated whether symptoms commonly considered to be related to meniscus injury were associated...... with early radiographic signs of knee osteoarthritis. Patients and methods — We included individual baseline items from the Knee injury and Osteoarthritis Outcome Score collected in 2 randomized controlled trials evaluating treatment for an MRI-verified degenerative medial meniscal tear in 199 patients aged...

  17. Static Loads Test Facility

    Data.gov (United States)

    Federal Laboratory Consortium — FUNCTION: Provides the capability to perform large-scale structural loads testing on spacecraft and other structures. Results from these tests can be used to verify...

  18. Validation of a noninvasive diagnostic tool to verify neuter status in dogs: The urinary FSH to creatinine ratio.

    Science.gov (United States)

    Albers-Wolthers, C H J; de Gier, J; Oei, C H Y; Schaefers-Okkens, A C; Kooistra, H S

    2016-09-15

    Determining the presence of functional gonadal tissue in dogs can be challenging, especially in bitches during anestrus or not known to have been ovariectomized, or in male dogs with nonscrotal testes. Furthermore, in male dogs treated with deslorelin, a slow-release GnRH agonist implant for reversible chemical castration, the verification of complete downregulation of the hypothalamic-pituitary-gonadal (HPG) axis can be difficult, especially if pretreatment parameters such as the size of the testes or prostate gland are not available. The aims of this study were to validate an immunoradiometric assay for measurement of FSH in canine urine, to determine if the urinary FSH to creatinine ratio can be used to verify the neuter status in bitches and male dogs, as an alternative to the plasma FSH concentration, and to determine if downregulation of the HPG axis is achieved in male dogs during deslorelin treatment. Recovery of added canine FSH and serial dilutions of urine showed that the immunoradiometric assay measures urinary FSH concentration accurately and with high precision. Plasma FSH concentrations (the mean of two samples, taken 40 minutes apart) and the urinary FSH to creatinine ratio were determined before gonadectomy and 140 days (median, range 121-225 days) and 206 days (median, range 158-294 days) after gonadectomy of 13 bitches and five male dogs, respectively, and in 13 male dogs before and 132 days (median, range 117-174 days) after administration of a deslorelin implant. In both bitches and male dogs, the plasma FSH concentration and the urinary FSH to creatinine ratio were significantly higher after gonadectomy, with no overlapping of their ranges. Receiver operating characteristic analysis of the urinary FSH to creatinine ratio revealed a cut-off value of 2.9 in bitches and 6.5 in males to verify the presence or absence of functional gonadal tissue. In male dogs treated with deslorelin, the plasma FSH concentrations and urinary FSH to
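
Applying the abstract's ROC-derived cut-off values for the urinary FSH to creatinine ratio (2.9 in bitches, 6.5 in male dogs) reduces to a threshold test: a ratio above the cut-off indicates absence of functional gonadal tissue, since FSH rises after gonadectomy with the loss of negative feedback. Function and key names below are illustrative.

```python
# Cut-off values reported in the abstract; a ratio above the cut-off
# suggests that functional gonadal tissue is absent (neutered / fully
# downregulated), a ratio below suggests it is present.
CUTOFF = {"bitch": 2.9, "male": 6.5}

def gonadal_tissue_absent(fsh_creatinine_ratio, sex):
    """True if the urinary FSH:creatinine ratio exceeds the cut-off."""
    return fsh_creatinine_ratio > CUTOFF[sex]

print(gonadal_tissue_absent(8.0, "bitch"))  # True  (consistent with neutered)
print(gonadal_tissue_absent(1.2, "male"))   # False (consistent with intact)
```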

  19. Verifying the performance of instrumentation under adverse environmental conditions in nuclear power plants

    International Nuclear Information System (INIS)

    Navorro, S.M.; Gonzalez-Granda, C.

    1983-01-01

    The current standards concerning the environmental qualification of electrical equipment and instrumentation, although extensive and consistent, are likely to be modified or improved in the short term, but will certainly not undergo any fundamental changes. At present, there is a requirement that the condition of equipment in plants in operation or approaching operational status should be checked and monitored for compliance with the relevant standards. One method of checking and monitoring electrical equipment and instrumentation basically consists in determining the environmental conditions in the various areas where safety-related equipment is being installed and then carrying out a study, component by component, using a pre-established form which summarizes the qualification requirements. The form consists of three different columns: the first contains information on the component; the second, information on the environmental conditions for which the component is to be certified or has been certified; and the third, information on the reference documents relating to those conditions. This form makes it possible to determine deficiencies, which are then collated in a table. Once the criteria for acceptance or refusal have been established, the necessary justification or proposal for corrective action is drawn up. Tolerances, accessories and subsequent tests are examples of grounds for justifying requalification, a change of an instrument or of its position, protection of the instrument and additional analyses. These are the possible corrective measures, and a careful study has to be made in order to determine which is the most appropriate measure in each case. A study of this type calls for experts in various fields. Co-operation between the organizations dealing with environmental qualification is desirable in order to facilitate the gathering of data and the adoption of uniform approaches. (author)

  20. Usefulness of Tinker Toy Test for Schizophrenic Patients: A Pilot Study

    OpenAIRE

    中村, 泰久; 穴水, 幸子; 山中, 武彦; 石井, 文康; 三村, 將

    2017-01-01

    This pilot study was conducted to verify the usability of the Tinker Toy Test (TTT). Participants were assigned to schizophrenia and control groups based on propensity scores computed from confounding factors. Neuropsychological testing for basic information, the TTT, and other measures was performed to compare the two groups, and logistic regression analysis was used to assess the difference between them according to the items which showed significant differences in the neuropsych...

  1. Design and experiment of pneumatic EPB test platform

    OpenAIRE

    Jianshi GONG; Tianle JIA; Dali TIAN; Hongliang WANG; Di HUANG

    2017-01-01

    In order to verify the accuracy and reliability of the function and control strategy of the pneumatic electronic parking brake (EPB) system, a test platform of the pneumatic EPB system is designed. The working principle of the air pressure type EPB test platform is introduced, and the composition of the platform is confirmed, including the air pressure storage module, braking module, man-machine interaction module, signal imitation module, data collection module, and fault diagnosis module, and the funct...

  2. Order aggressiveness and order book dynamics

    DEFF Research Database (Denmark)

    Hall, Anthony D.; Hautsch, Nikolaus

    2006-01-01

    In this paper, we study the determinants of order aggressiveness and traders’ order submission strategy in an open limit order book market. Applying an order classification scheme, we model the most aggressive market orders, limit orders as well as cancellations on both sides of the market...... employing a six-dimensional autoregressive conditional intensity model. Using order book data from the Australian Stock Exchange, we find that market depth, the queued volume, the bid-ask spread, recent volatility, as well as recent changes in both the order flow and the price play an important role...... in explaining the determinants of order aggressiveness. Overall, our empirical results broadly confirm theoretical predictions on limit order book trading. However, we also find evidence for behavior that can be attributed to particular liquidity and volatility effects...

  3. Multilingual Validation of the Questionnaire for Verifying Stroke-Free Status in West Africa.

    Science.gov (United States)

    Sarfo, Fred; Gebregziabher, Mulugeta; Ovbiagele, Bruce; Akinyemi, Rufus; Owolabi, Lukman; Obiako, Reginald; Akpa, Onoja; Armstrong, Kevin; Akpalu, Albert; Adamu, Sheila; Obese, Vida; Boa-Antwi, Nana; Appiah, Lambert; Arulogun, Oyedunni; Mensah, Yaw; Adeoye, Abiodun; Tosin, Aridegbe; Adeleye, Osimhiarherhuo; Tabi-Ajayi, Eric; Phillip, Ibinaiye; Sani, Abubakar; Isah, Suleiman; Tabari, Nasir; Mande, Aliyu; Agunloye, Atinuke; Ogbole, Godwin; Akinyemi, Joshua; Laryea, Ruth; Melikam, Sylvia; Uvere, Ezinne; Adekunle, Gregory; Kehinde, Salaam; Azuh, Paschal; Dambatta, Abdul; Ishaq, Naser; Saulson, Raelle; Arnett, Donna; Tiwari, Hemnant; Jenkins, Carolyn; Lackland, Dan; Owolabi, Mayowa

    2016-01-01

    The Questionnaire for Verifying Stroke-Free Status (QVSFS), a method for verifying stroke-free status in participants of clinical, epidemiological, and genetic studies, has not been validated in low-income settings where populations have limited knowledge of stroke symptoms. We aimed to validate QVSFS in 3 languages, Yoruba, Hausa and Akan, for ascertainment of stroke-free status of control subjects enrolled in an on-going stroke epidemiological study in West Africa. Data were collected using a cross-sectional study design where 384 participants were consecutively recruited from neurology and general medicine clinics of 5 tertiary referral hospitals in Nigeria and Ghana. Ascertainment of stroke status was by neurologists using structured neurological examination, review of case records, and neuroimaging (gold standard). Relative performance of QVSFS without and with pictures of stroke symptoms (pictograms) was assessed using sensitivity, specificity, positive predictive value, and negative predictive value. The overall median age of the study participants was 54 years and 48.4% were males. Of 165 stroke cases identified by gold standard, 98% were determined to have had stroke, whereas of 219 without stroke 87% were determined to be stroke-free by QVSFS. Negative predictive value of the QVSFS across the 3 languages was 0.97 (range, 0.93-1.00); sensitivity, specificity, and positive predictive value were 0.98, 0.82, and 0.80, respectively. Agreement between the questionnaire with and without the pictogram was excellent/strong with Cohen's κ = 0.92. QVSFS is a valid tool for verifying stroke-free status across culturally diverse populations in West Africa. © 2015 American Heart Association, Inc.
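
The validation metrics above all derive from a 2x2 confusion matrix against the neurological gold standard. A minimal sketch; the counts are rounded approximations of the abstract's figures (about 98% of 165 stroke cases detected; about 87% of 219 controls classified stroke-free) and are for illustration only, so they need not reproduce the published values exactly.

```python
# Diagnostic accuracy measures from a 2x2 confusion matrix.
# Counts are illustrative approximations, not the study's exact tallies.
tp, fn = 162, 3    # stroke cases classified as stroke / missed by QVSFS
tn, fp = 190, 29   # controls classified stroke-free / misclassified

sensitivity = tp / (tp + fn)
specificity = tn / (tn + fp)
ppv = tp / (tp + fp)   # positive predictive value
npv = tn / (tn + fn)   # negative predictive value

print(f"Se={sensitivity:.2f} Sp={specificity:.2f} PPV={ppv:.2f} NPV={npv:.2f}")
```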

  4. Trends in the incidence rate, type and treatment of surgically verified endometriosis - a nationwide cohort study.

    Science.gov (United States)

    Saavalainen, Liisu; Tikka, Tuulia; But, Anna; Gissler, Mika; Haukka, Jari; Tiitinen, Aila; Härkki, Päivi; Heikinheimo, Oskari

    2018-01-01

    To study the trends in incidence rate, type and surgical treatment, and patient characteristics of surgically verified endometriosis during 1987-2012. This is a register-based cohort study. We identified women receiving their first diagnosis of endometriosis in surgery from the Finnish Hospital Discharge Register (FHDR). Quality of the FHDR records was assessed bidirectionally. The age-standardized incidence rates of the first surgically verified endometriosis was assessed by calendar year. The cohort comprises 49 956 women. The quality assessment suggested the FHDR data to be of good quality. The most common diagnosis, ovarian endometriosis (46%), was associated with highest median age 38.5 years (interquartile range 31.0-44.8) and the second most common diagnosis, peritoneal endometriosis (40%), with median age 34.9 years (28.6-41.7). Between 1987 and 2012, a decrease was observed in the median age, from 38.8 (32.3-43.6) to 34.0 (28.9-41.0) years, and in the age-standardized incidence rate from 116 [95% confidence interval (CI) 112-121] to 45 (42-48) per 100 000 women. The proportion of hysterectomy as a first surgical treatment decreased from 38 to 19%, whereas that of laparoscopy increased from 42 to 73% when comparing 1987-1995 with 1996-2012. This nationwide cohort of surgically verified endometriosis showed a decrease in the incidence rate and in the patient age at the time of first diagnosis, even though the proportion of laparoscopy has increased. The number of hysterectomies has decreased. These changes are likely to reflect the evolving diagnostics, increasing awareness of endometriosis, and effective use of medical treatment before surgery. © 2017 Nordic Federation of Societies of Obstetrics and Gynecology.

  5. Building a Laboratory-Scale Biogas Plant and Verifying its Functionality

    Science.gov (United States)

    Boleman, Tomáš; Fiala, Jozef; Blinová, Lenka; Gerulová, Kristína

    2011-01-01

    The paper deals with the process of building a laboratory-scale biogas plant and verifying its functionality. The laboratory-scale prototype was constructed in the Department of Safety and Environmental Engineering at the Faculty of Materials Science and Technology in Trnava, of the Slovak University of Technology. The Department has already built a solar laboratory to promote and utilise solar energy, and designed the SETUR hydro engine. The biogas laboratory is the next step in the Department's activities in the field of renewable energy sources and biomass. The Department is also involved in a European Union project whose goal is to upgrade all existing renewable energy sources used in the Department.

  6. National, Regional and Global Certification Bodies for Polio Eradication: A Framework for Verifying Measles Elimination.

    Science.gov (United States)

    Deblina Datta, S; Tangermann, Rudolf H; Reef, Susan; William Schluter, W; Adams, Anthony

    2017-07-01

    The Global Certification Commission (GCC), Regional Certification Commissions (RCCs), and National Certification Committees (NCCs) provide a framework of independent bodies to assist the Global Polio Eradication Initiative (GPEI) in certifying and maintaining polio eradication in a standardized, ongoing, and credible manner. Their members meet regularly to comprehensively review population immunity, surveillance, laboratory, and other data to assess polio status in the country (NCC), World Health Organization (WHO) region (RCC), or globally (GCC). These highly visible bodies provide a framework to be replicated to independently verify measles and rubella elimination in the regions and globally. © The Author 2017. Published by Oxford University Press for the Infectious Diseases Society of America.

  7. RETRANS - A tool to verify the functional equivalence of automatically generated source code with its specification

    International Nuclear Information System (INIS)

    Miedl, H.

    1998-01-01

    In accordance with the relevant technical standards (e.g. IEC 880), it is necessary to verify each step in the development process of safety-critical software. This also holds for the verification of automatically generated source code. To avoid human errors during this verification step and to limit the cost, a tool should be used which is developed independently of the code generator itself. For this purpose ISTec has developed the tool RETRANS, which demonstrates the functional equivalence of automatically generated source code with its underlying specification. (author)

  8. On the safe use of verify-and-record systems in external beam radiation therapy

    International Nuclear Information System (INIS)

    Seelantag, W.W.; Davis, J.B.

    1997-01-01

    Verify-and-record (V and R) systems are being used increasingly, not only for verification, but also for computer-aided setup and chart printing. The close interdependence between the V and R system and the treatment routine requires new ideas for quality assurance (QA): pure ''machine checking'', as with treatment units, is no longer sufficient. The level of QA obviously depends on the tasks of the V and R system: the most advanced case, in which the system is used for computer-aided setup and for chart printing, is discussed; both are indispensable for an efficient use of V and R systems. Seven propositions are defined to make this use not only efficient but safe. (author)

  9. Experimental observation of G banding verifying X-ray workers' chromosome translocation detected by FISH

    International Nuclear Information System (INIS)

    Sun Yuanming; Li Jin; Wang Qin; Tang Weisheng; Wang Zhiquan

    2002-01-01

    Objective: FISH is the most effective way of detecting chromosome aberrations, but many factors affect its accuracy. G-banding was used to verify the results of early X-ray workers' chromosome translocations examined by FISH. Methods: The chromosome translocations of early X-ray workers were analysed by FISH (fluorescence in situ hybridization) and G-banding, and the translocation yields were analysed statistically. Results: The chromosome aberration frequencies obtained by the two methods are closely correlated. Conclusion: FISH is a feasible way to analyse chromosome aberrations of X-ray workers and reconstruct dose

  10. Construct a procedure to verify radiation protection for apparatus of industrial gamma radiography

    International Nuclear Information System (INIS)

    Nghiem Xuan Long; Trinh Dinh Truong; Dinh Chi Hung; Le Ngoc Hieu

    2013-01-01

    Apparatus for industrial gamma radiography includes an exposure container, source guide tube, remote control hand crank assembly and other attached equipment. It is widely used in the inspection and evaluation of projects. In Vietnam, there are now more than 50 companies in the radiography field and more than 100 apparatus in use on site. Verification and evaluation are therefore necessary and important. This project constructs a procedure to verify radiation protection for apparatus in industrial gamma radiography for application in Vietnam. (author)

  11. Force10 networks performance in world's first transcontinental 10 gigabit ethernet network verified by Ixia

    CERN Multimedia

    2003-01-01

    Force10 Networks, Inc., today announced that the performance of the Force10 E-Series switch/routers deployed in a transcontinental network has been verified as line-rate 10 GE throughput by Ixia, a leading provider of high-speed, network performance and conformance analysis systems. The network, the world's first transcontinental 10 GE wide area network, consists of a SURFnet OC-192 lambda between Geneva and the StarLight facility in Chicago via Amsterdam and another OC-192 lambda between this same facility in Chicago and Carleton University in Ottawa, Canada provided by CANARIE and ORANO (1/2 page).

  12. Reverse osmosis integrity monitoring in water reuse: The challenge to verify virus removal - A review.

    Science.gov (United States)

    Pype, Marie-Laure; Lawrence, Michael G; Keller, Jurg; Gernjak, Wolfgang

    2016-07-01

    A reverse osmosis (RO) process is often included in the treatment train to produce high quality reuse water from treated effluent for potable purposes because of its high removal efficiency for salinity and many inorganic and organic contaminants, and importantly, it also provides an excellent barrier for pathogens. In order to ensure the continued protection of public health from pathogen contamination, monitoring RO process integrity is necessary. Due to their small size, viruses are the most difficult class of pathogens to remove in physical separation processes and are therefore often considered the most challenging pathogen to monitor. To date, there is a gap between the current log credit assigned to this process (determined by integrity testing approved by regulators) and its actual log removal capability as proven in a variety of laboratory and pilot studies. Hence, there is a challenge to establish a methodology that more closely links to the theoretical performance. In this review, after introducing the notion of risk management in water reuse, we provide an overview of existing and potentially new RO integrity monitoring techniques, highlight their strengths and drawbacks, and debate their applicability to full-scale treatment plants, opening up future research opportunities. Copyright © 2016 Elsevier Ltd. All rights reserved.
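
The gap between assigned log credit and demonstrated capability is quantified as a log removal value (LRV) across the RO barrier, computed from a surrogate measured in feed and permeate. A minimal sketch; the concentrations below are illustrative.

```python
# Log removal value: LRV = log10(feed concentration / permeate concentration).
# Illustrative surrogate concentrations, not data from any plant.
import math

def log_removal_value(feed, permeate):
    """LRV across a treatment barrier such as RO."""
    return math.log10(feed / permeate)

print(log_removal_value(1.0e6, 1.0e2))  # 4.0 -> a 4-log (99.99%) removal
```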

  13. Analysis and Design of High-Order Parallel Resonant Converters

    Science.gov (United States)

    Batarseh, Issa Eid

    1990-01-01

    In this thesis, a special state variable transformation technique has been derived for the analysis of high order dc-to-dc resonant converters. Converters comprised of high order resonant tanks have the advantage of utilizing the parasitic elements by making them part of the resonant tank. A new set of state variables is defined in order to make use of two-dimensional state-plane diagrams in the analysis of high order converters. Such a method has been successfully used for the analysis of the conventional Parallel Resonant Converter (PRC). Consequently, two-dimensional state-plane diagrams are used to analyze the steady state response for third and fourth order PRC's when these converters are operated in the continuous conduction mode. Based on this analysis, a set of control characteristic curves for the LCC-, LLC- and LLCC-type PRC are presented from which various converter design parameters are obtained. Various design curves for component value selection and device ratings are given. This analysis of high order resonant converters shows that the addition of reactive components to the resonant tank results in converters with better performance characteristics when compared with the conventional second order PRC. A complete design procedure, along with design examples for 2nd, 3rd and 4th order converters, is presented. Practical power supply units, normally used for computer applications, were built and tested using the LCC-, LLC- and LLCC-type commutation schemes. In addition, computer simulation results are presented for these converters in order to verify the theoretical results.

  14. Using Integer Clocks to Verify the Timing-Sync Sensor Network Protocol

    Science.gov (United States)

    Huang, Xiaowan; Singh, Anu; Smolka, Scott A.

    2010-01-01

    We use the UPPAAL model checker for Timed Automata to verify the Timing-Sync time-synchronization protocol for sensor networks (TPSN). The TPSN protocol seeks to provide network-wide synchronization of the distributed clocks in a sensor network. Clock-synchronization algorithms for sensor networks such as TPSN must be able to perform arithmetic on clock values to calculate clock drift and network propagation delays. They must be able to read the value of a local clock and assign it to another local clock. Such operations are not directly supported by the theory of Timed Automata. To overcome this formal-modeling obstacle, we augment the UPPAAL specification language with the integer clock derived type. Integer clocks, which are essentially integer variables that are periodically incremented by a global pulse generator, greatly facilitate the encoding of the operations required to synchronize clocks as in the TPSN protocol. With this integer-clock-based model of TPSN in hand, we use UPPAAL to verify that the protocol achieves network-wide time synchronization and is devoid of deadlock. We also use the UPPAAL Tracer tool to illustrate how integer clocks can be used to capture clock drift and resynchronization during protocol execution
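
The integer-clock idea above can be sketched outside of UPPAAL: integer variables incremented on every pulse of a shared generator, which, unlike native Timed Automata clocks, can be read, assigned, and subtracted to compute drift and to resynchronize as TPSN requires. This is a Python illustration of the concept, not UPPAAL syntax; class and variable names are illustrative.

```python
# Integer clocks: plain integer variables advanced by a global pulse
# generator. They support the read / assign / subtract operations needed
# for clock-drift arithmetic that Timed Automata clocks do not allow.

class IntegerClock:
    def __init__(self, drift=1):
        self.value = 0
        self.drift = drift           # increments applied per global pulse

    def tick(self):                  # driven by the global pulse generator
        self.value += self.drift

root, node = IntegerClock(), IntegerClock(drift=2)  # node's clock runs fast
for _ in range(10):                  # ten pulses of the global generator
    root.tick()
    node.tick()

drift = node.value - root.value      # measurable clock drift: 10
node.value = root.value              # resynchronize node to root, as in TPSN
```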

  15. An improved system to verify CANDU spent fuel elements in dry storage silos

    International Nuclear Information System (INIS)

    Almeida, Gevaldo L. de; Soares, Milton G.; Filho, Anizio M.; Martorelli, Daniel S.; Fonseca, Manoel

    2000-01-01

    An improved system to verify CANDU spent fuel elements stored in dry storage silos was developed. It consists of a mechanical device which moves a semiconductor detector along a vertical verification pipe incorporated into the silo, and a modified portable multi-channel analyzer. The mechanical device contains a winding drum accommodating a cable from which the detector hangs, in such a way that the drum rotates as the detector descends under its own weight. The detector is coupled to the multi-channel analyzer operating in multi-scaler mode, generating a spectrum of total counts against time. To assure a linear transformation of time into detector position, the mechanical device dictating the detector speed is controlled by the multi-channel analyzer. This control is performed via a clock-type escapement device activated by a solenoid. Whenever the multi-channel analyzer shifts to the next channel, the associated pulse is amplified, powering the solenoid and causing the drum to rotate a fixed angle. Spectra taken in the laboratory, using radioactive sources, have shown good reproducibility. This qualifies the system for use as equipment to obtain a fingerprint of the overall distribution of the fuel elements along the silo axis, and hence to verify possible diversion of the nuclear material by comparing spectra taken at consecutive safeguards inspections. The entire system is battery operated and is thus capable of operating in the field where no power supply is available. (author)
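
Because the escapement advances the drum a fixed angle each time the analyzer shifts channel, channel number maps linearly onto detector depth along the verification pipe. A minimal sketch of that time-to-position conversion; the step size and start offset are hypothetical, not the instrument's actual values.

```python
# Linear channel-to-depth mapping implied by the fixed-angle escapement.
# STEP_CM and START_CM are assumed values for illustration only.
STEP_CM = 0.5       # detector travel per channel advance (assumed)
START_CM = 0.0      # depth at channel 0 (assumed)

def detector_depth(channel):
    """Depth (cm) of the detector while the MCA counts in this channel."""
    return START_CM + channel * STEP_CM

print([detector_depth(ch) for ch in range(5)])  # [0.0, 0.5, 1.0, 1.5, 2.0]
```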

  16. An improved system to verify CANDU spent fuel elements in dry storage silos

    Energy Technology Data Exchange (ETDEWEB)

    Almeida, Gevaldo L. de; Soares, Milton G.; Filho, Anizio M.; Martorelli, Daniel S.; Fonseca, Manoel [Instituto de Engenharia Nuclear (IEN), Rio de Janeiro, RJ (Brazil)

    2000-07-01

    An improved system to verify CANDU spent fuel elements stored in dry storage silos was developed. It consists of a mechanical device which moves a semiconductor detector along a vertical verification pipe incorporated into the silo, and a modified portable multi-channel analyzer. The mechanical device contains a winding drum accommodating a cable from which the detector hangs, in such a way that the drum rotates as the detector descends under its own weight. The detector is coupled to the multi-channel analyzer operating in multi-scaler mode, generating a spectrum of total counts against time. To assure a linear transformation of time into detector position, the mechanical device dictating the detector speed is controlled by the multi-channel analyzer. This control is performed via a clock-type escapement device activated by a solenoid. Whenever the multi-channel analyzer shifts to the next channel, the associated pulse is amplified, powering the solenoid and causing the drum to rotate a fixed angle. Spectra taken in the laboratory, using radioactive sources, have shown good reproducibility. This qualifies the system for use as equipment to obtain a fingerprint of the overall distribution of the fuel elements along the silo axis, and hence to verify possible diversion of the nuclear material by comparing spectra taken at consecutive safeguards inspections. The entire system is battery operated and is thus capable of operating in the field where no power supply is available. (author)

  17. Independent technique of verifying high-dose rate (HDR) brachytherapy treatment plans

    International Nuclear Information System (INIS)

    Saw, Cheng B.; Korb, Leroy J.; Darnell, Brenda; Krishna, K. V.; Ulewicz, Dennis

    1998-01-01

    Purpose: An independent technique for verifying high-dose rate (HDR) brachytherapy treatment plans has been formulated and validated clinically. Methods and Materials: In HDR brachytherapy, dwell times at respective dwell positions are computed, using an optimization algorithm in a HDR treatment-planning system to deliver a specified dose to many target points simultaneously. Because of the variability of dwell times, concerns have been expressed regarding the ability of the algorithm to compute the correct dose. To address this concern, a commercially available low-dose rate (LDR) algorithm was used to compute the doses at defined distances, based on the dwell times obtained from the HDR treatment plans. The percent deviation between doses computed using the HDR and LDR algorithms were reviewed for HDR procedures performed over the last year. Results: In this retrospective study, the difference between computed doses using the HDR and LDR algorithms was found to be within 5% for about 80% of the HDR procedures. All of the reviewed procedures have dose differences of less than 10%. Conclusion: An independent technique for verifying HDR brachytherapy treatment plans has been validated based on clinical data. Provided both systems are available, this technique is universal in its applications and not limited to either a particular implant applicator, implant site, or implant type
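
The independent check described above reduces to a percent deviation between the dose reported by the HDR planning system and the dose recomputed with the LDR algorithm from the same dwell times, judged against the tolerances cited (5% typical, 10% as the outer bound). A minimal sketch; function names and dose values are illustrative.

```python
# Percent deviation between two independently computed doses, with an
# acceptance check against a tolerance. Values are illustrative.

def percent_deviation(hdr_dose, ldr_dose):
    """Absolute percent difference, referenced to the HDR-computed dose."""
    return 100.0 * abs(hdr_dose - ldr_dose) / hdr_dose

def plan_verified(hdr_dose, ldr_dose, tolerance=10.0):
    """True if the independent LDR recomputation agrees within tolerance."""
    return percent_deviation(hdr_dose, ldr_dose) <= tolerance

print(plan_verified(hdr_dose=5.00, ldr_dose=5.20))  # ~4% deviation -> True
```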

  18. How to Verify Plagiarism of the Paper Written in Macedonian and Translated in Foreign Language?

    Science.gov (United States)

    Spiroski, Mirko

    2016-03-15

    The aim of this study was to show how to verify plagiarism of a paper written in Macedonian and translated into a foreign language. The original article "Ethics in Medical Research Involving Human Subjects", written in Macedonian, was submitted as essay-2 for the subject Ethics and published by Ilina Stefanovska, PhD candidate from the Iustinianus Primus Faculty of Law, Ss Cyril and Methodius University of Skopje (UKIM), Skopje, Republic of Macedonia, in February 2013. The article suspected of plagiarism was published by Prof. Dr. Gordana Panova from the Faculty of Medical Sciences, University Goce Delchev, Shtip, Republic of Macedonia, in English, with the identical title and identical content, in the international scientific on-line journal "SCIENCE & TECHNOLOGIES", publisher "Union of Scientists - Stara Zagora". The original document (written in Macedonian) was translated with Google Translate; the suspected article (published as an English pdf file) was converted into a Word document; and the two documents were compared with several programs for plagiarism detection. The two documents were found to be 71%, 78% and 82% identical, respectively, depending on the computer program used for plagiarism detection. It was obvious that the original paper was entirely plagiarised by Prof. Dr. Gordana Panova, including six references from the original paper. Plagiarism of original papers written in Macedonian and translated into other languages can thus be verified after computerised translation; the original and translated documents can then be compared with available software for plagiarism detection.
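
The comparison step (machine-translate, then measure textual overlap) can be sketched with the standard library's difflib; the study itself used dedicated plagiarism-detection programs, and the strings here are illustrative stand-ins for the full documents.

```python
# Word-level overlap between two texts, as a crude stand-in for the
# plagiarism-detection programs used in the study.
from difflib import SequenceMatcher

def overlap(doc_a, doc_b):
    """Fraction of matching word sequences between two texts (0..1)."""
    return SequenceMatcher(None, doc_a.split(), doc_b.split()).ratio()

original = "ethics in medical research involving human subjects"
suspect = "ethics in medical research involving human subjects"
print(f"{overlap(original, suspect):.0%}")  # identical texts -> 100%
```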

  19. An experimental method to verify soil conservation by check dams on the Loess Plateau, China.

    Science.gov (United States)

    Xu, X Z; Zhang, H W; Wang, G Q; Chen, S C; Dang, W Q

    2009-12-01

    A successful experiment with a physical model requires the necessary conditions of similarity. This study presents an experimental method with a semi-scale physical model. The model is used to monitor and verify soil conservation by check dams in a small watershed on the Loess Plateau of China. During experiments, the model-prototype ratio of geomorphic variables was kept constant under each rainfall event. Consequently, experimental data are available for verification of soil erosion processes in the field and for predicting soil loss in a model watershed with check dams. Thus, it can predict the amount of soil loss in a catchment. This study also sets out four criteria: similarity of watershed geometry, of grain size and bare land, of Froude number (Fr) for the rainfall event, and of soil erosion in downscaled models. The efficacy of the proposed method was confirmed using these criteria in two different downscaled model experiments. The B-Model, a large scale model, simulates the watershed prototype. The two small scale models, D(a) and D(b), have different erosion rates but are the same size. These two models simulate hydraulic processes in the B-Model. Experiment results show that when soil loss in the small scale models was converted by multiplying by the soil loss scale number, it was very close to that of the B-Model. Clearly, with a semi-scale physical model, experiments are available to verify and predict soil loss in a small watershed area with a check dam system on the Loess Plateau, China.
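
With the model-prototype ratio held constant, the predicted field soil loss is simply the model measurement multiplied by the soil-loss scale number. A minimal sketch; the scale number below is hypothetical, not a value from the study.

```python
# Converting model soil loss to a prototype (field) prediction via a
# constant scale number. SOIL_LOSS_SCALE is an assumed illustrative value.
SOIL_LOSS_SCALE = 250.0   # prototype/model soil-loss ratio (assumed)

def predicted_field_loss(model_loss_kg):
    """Soil loss predicted for the prototype watershed (kg)."""
    return model_loss_kg * SOIL_LOSS_SCALE

print(predicted_field_loss(1.2))  # 300.0
```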

  20. Insights from Synthetic Star-forming Regions. II. Verifying Dust Surface Density, Dust Temperature, and Gas Mass Measurements With Modified Blackbody Fitting

    Energy Technology Data Exchange (ETDEWEB)

    Koepferl, Christine M.; Robitaille, Thomas P. [Max Planck Institute for Astronomy, Königstuhl 17, D-69117 Heidelberg (Germany); Dale, James E., E-mail: koepferl@usm.lmu.de [University Observatory Munich, Scheinerstr. 1, D-81679 Munich (Germany)

    2017-11-01

    We use a large data set of realistic synthetic observations (produced in Paper I of this series) to assess how observational techniques affect the measurement of physical properties of star-forming regions. In this part of the series (Paper II), we explore the reliability of the measured total gas mass, dust surface density and dust temperature maps derived from modified blackbody fitting of synthetic Herschel observations. We find from our pixel-by-pixel analysis of the measured dust surface density and dust temperature a worrisome error spread, especially close to star formation sites and low-density regions, where for those “contaminated” pixels the surface densities can be under/overestimated by up to three orders of magnitude. In light of this, we recommend treating the pixel-based results from this technique with caution in regions with active star formation. In regions of high background, typical of the inner Galactic plane, we are not able to recover reliable surface density maps of individual synthetic regions, since low-mass regions are lost in the far-infrared background. When measuring the total gas mass of regions in moderate background, we find that modified blackbody fitting works well (absolute error: +9%; −13%) up to 10 kpc distance (errors increase with distance). Commonly, the initial images are convolved to the largest common beam size, which smears contaminated pixels over large areas. The resulting information loss makes this commonly used technique less verifiable, as χ² values can then no longer be used as a quality indicator for a fitted pixel. Our control measurements of the total gas mass (without the step of convolution to the largest common beam size) produce similar results (absolute error: +20%; −7%) while having much lower median errors, especially for the high-mass stellar feedback phase. In upcoming papers (Paper III; Paper IV) of this series we test the reliability of measured star formation rates with direct and indirect techniques.

  1. Insights from Synthetic Star-forming Regions. II. Verifying Dust Surface Density, Dust Temperature, and Gas Mass Measurements with Modified Blackbody Fitting

    Science.gov (United States)

    Koepferl, Christine M.; Robitaille, Thomas P.; Dale, James E.

    2017-11-01

    We use a large data set of realistic synthetic observations (produced in Paper I of this series) to assess how observational techniques affect the measurement of physical properties of star-forming regions. In this part of the series (Paper II), we explore the reliability of the measured total gas mass, dust surface density and dust temperature maps derived from modified blackbody fitting of synthetic Herschel observations. We find from our pixel-by-pixel analysis of the measured dust surface density and dust temperature a worrisome error spread, especially close to star formation sites and low-density regions, where for those “contaminated” pixels the surface densities can be under/overestimated by up to three orders of magnitude. In light of this, we recommend treating the pixel-based results from this technique with caution in regions with active star formation. In regions of high background, typical of the inner Galactic plane, we are not able to recover reliable surface density maps of individual synthetic regions, since low-mass regions are lost in the far-infrared background. When measuring the total gas mass of regions in moderate background, we find that modified blackbody fitting works well (absolute error: +9%; −13%) up to 10 kpc distance (errors increase with distance). Commonly, the initial images are convolved to the largest common beam size, which smears contaminated pixels over large areas. The resulting information loss makes this commonly used technique less verifiable, as χ² values can then no longer be used as a quality indicator for a fitted pixel. Our control measurements of the total gas mass (without the step of convolution to the largest common beam size) produce similar results (absolute error: +20%; −7%) while having much lower median errors, especially for the high-mass stellar feedback phase. In upcoming papers (Paper III; Paper IV) of this series we test the reliability of measured star formation rates with direct and indirect techniques.
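    The per-pixel modified blackbody (greybody) fit evaluated in this record can be illustrated with a minimal optically thin sketch: a power-law dust opacity times a Planck function, fitted with `scipy.optimize.curve_fit`. The opacity normalization, β = 2, and the band list below are illustrative assumptions, not the values used in the paper.

```python
import numpy as np
from scipy.optimize import curve_fit

H, C, K = 6.626e-34, 2.998e8, 1.381e-23  # Planck, light speed, Boltzmann (SI)

def modified_blackbody(nu, sigma, temp, beta=2.0, kappa0=0.1, nu0=1e12):
    """Optically thin greybody: S_nu proportional to kappa(nu) * Sigma * B_nu(T)."""
    kappa = kappa0 * (nu / nu0) ** beta                       # dust opacity power law
    planck = 2 * H * nu**3 / C**2 / np.expm1(H * nu / (K * temp))
    return kappa * sigma * planck

# Herschel-like band centres (160, 250, 350, 500 micron) as frequencies
nu = C / (np.array([160.0, 250.0, 350.0, 500.0]) * 1e-6)

# Synthetic noiseless "pixel" built from known parameters, then refitted
true_sigma, true_temp = 0.5, 20.0
flux = modified_blackbody(nu, true_sigma, true_temp)
popt, _ = curve_fit(modified_blackbody, nu, flux, p0=[1.0, 15.0])
print(popt)  # close to (0.5, 20.0) for this idealized, noise-free case
```

    The sketch only demonstrates the fitting machinery; as the abstract warns, on contaminated pixels near star formation sites the same fit can be off by orders of magnitude.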

  2. Multiple sclerosis and birth order.

    Science.gov (United States)

    James, W H

    1984-01-01

    Studies on the birth order of patients with multiple sclerosis have yielded contradictory conclusions. Most of the sets of data, however, have been tested by biased tests. Data that have been submitted to unbiased tests seem to suggest that cases are more likely to occur in early birth ranks. This should be tested on further samples and some comments are offered on how this should be done. PMID:6707558

  3. Multiple sclerosis and birth order.

    OpenAIRE

    James, W H

    1984-01-01

    Studies on the birth order of patients with multiple sclerosis have yielded contradictory conclusions. Most of the sets of data, however, have been tested by biased tests. Data that have been submitted to unbiased tests seem to suggest that cases are more likely to occur in early birth ranks. This should be tested on further samples and some comments are offered on how this should be done.

  4. Birth order modifies the effect of IL13 gene polymorphisms on serum IgE at age 10 and skin prick test at ages 4, 10 and 18: a prospective birth cohort study

    Science.gov (United States)

    2010-01-01

    Background Susceptibility to atopy originates from effects of the environment on genes. Birth order has been identified as a risk factor for atopy and evidence for some candidate genes has been accumulated; however, no study has yet assessed a birth order-gene interaction. Objective To investigate the interaction of IL13 polymorphisms with birth order on allergic sensitization at ages 4, 10 and 18 years. Methods Mother-infant dyads were recruited antenatally and followed prospectively to age 18 years. Questionnaire data (at birth, ages 4, 10 and 18); skin prick test (SPT) at ages 4, 10 and 18; total serum IgE and a specific inhalant screen at age 10; and genotyping for IL13 were collected. Three SNPs were selected from IL13: rs20541 (exon 4, nonsynonymous SNP), rs1800925 (promoter region) and rs2066960 (intron 1). Analysis included multivariable log-linear regression analyses using repeated measurements to estimate prevalence ratios (PRs). Results Of the 1456 participants, birth order information was available for 83.2% (1212/1456); SPT was performed on 67.4% at age 4, 71.2% at age 10 and 58.0% at age 18. The prevalence of atopy (sensitization to one or more food or aeroallergens) increased from 19.7% at age 4 to 26.7% at age 10 and 41.1% at age 18. Repeated measurement analysis indicated an interaction between rs20541 and birth order on SPT. The stratified analyses demonstrated that the effect of IL13 on SPT was restricted to first-born children (p = 0.007; adjusted PR = 1.35; 95%CI = 1.09, 1.69). Similar findings were noted for firstborns regarding elevated total serum IgE at age 10 (p = 0.007; PR = 1.73; 1.16, 2.57) and the specific inhalant screen (p = 0.034; PR = 1.48; 1.03, 2.13). Conclusions This is the first study to show an interaction between birth order and IL13 polymorphisms on allergic sensitization. Future functional genetic research needs to determine whether or not birth order is related to altered expression and methylation of the IL13 gene. PMID:20403202

  5. Birth order modifies the effect of IL13 gene polymorphisms on serum IgE at age 10 and skin prick test at ages 4, 10 and 18: a prospective birth cohort study

    Directory of Open Access Journals (Sweden)

    Ogbuanu Ikechukwu U

    2010-04-01

    Background Susceptibility to atopy originates from effects of the environment on genes. Birth order has been identified as a risk factor for atopy and evidence for some candidate genes has been accumulated; however, no study has yet assessed a birth order-gene interaction. Objective To investigate the interaction of IL13 polymorphisms with birth order on allergic sensitization at ages 4, 10 and 18 years. Methods Mother-infant dyads were recruited antenatally and followed prospectively to age 18 years. Questionnaire data (at birth, ages 4, 10 and 18); skin prick test (SPT) at ages 4, 10 and 18; total serum IgE and a specific inhalant screen at age 10; and genotyping for IL13 were collected. Three SNPs were selected from IL13: rs20541 (exon 4, nonsynonymous SNP), rs1800925 (promoter region) and rs2066960 (intron 1). Analysis included multivariable log-linear regression analyses using repeated measurements to estimate prevalence ratios (PRs). Results Of the 1456 participants, birth order information was available for 83.2% (1212/1456); SPT was performed on 67.4% at age 4, 71.2% at age 10 and 58.0% at age 18. The prevalence of atopy (sensitization to one or more food or aeroallergens) increased from 19.7% at age 4 to 26.7% at age 10 and 41.1% at age 18. Repeated measurement analysis indicated an interaction between rs20541 and birth order on SPT. The stratified analyses demonstrated that the effect of IL13 on SPT was restricted to first-born children (p = 0.007; adjusted PR = 1.35; 95%CI = 1.09, 1.69). Similar findings were noted for firstborns regarding elevated total serum IgE at age 10 (p = 0.007; PR = 1.73; 1.16, 2.57) and the specific inhalant screen (p = 0.034; PR = 1.48; 1.03, 2.13). Conclusions This is the first study to show an interaction between birth order and IL13 polymorphisms on allergic sensitization. Future functional genetic research needs to determine whether or not birth order is related to altered expression and methylation of the IL13 gene.

  6. Forward Technology Solar Cell Experiment (FTSCE) for MISSE-5 Verified and Readied for Flight on STS-114

    Science.gov (United States)

    Jenkins, Phillip P.; Krasowski, Michael J.; Greer, Lawrence C.; Flatico, Joseph M.

    2005-01-01

    The Forward Technology Solar Cell Experiment (FTSCE) is a space solar cell experiment built as part of the Fifth Materials on the International Space Station Experiment (MISSE-5): Data Acquisition and Control Hardware and Software. It represents a collaborative effort between the NASA Glenn Research Center, the Naval Research Laboratory, and the U.S. Naval Academy. The purpose of this experiment is to place current and future solar cell technologies on orbit where they will be characterized and validated. This is in response to recent on-orbit and ground test results that raised concerns about the in-space survivability of new solar cell technologies and about current ground test methodology. The various components of the FTSCE are assembled into a passive experiment container--a 2- by 2- by 4-in. folding metal container that will be attached by an astronaut to the outer structure of the International Space Station. Data collected by the FTSCE will be relayed to the ground through a transmitter assembled by the U.S. Naval Academy. Data-acquisition electronics and software were designed to be tolerant of the thermal and radiation effects expected on orbit. The experiment has been verified and readied for flight on STS-114.

  7. Nodal DG-FEM solution of high-order Boussinesq-type equations

    DEFF Research Database (Denmark)

    Engsig-Karup, Allan Peter; Hesthaven, Jan S.; Bingham, Harry B.

    2006-01-01

    We present a discontinuous Galerkin finite element method (DG-FEM) solution to a set of high-order Boussinesq-type equations for modelling highly nonlinear and dispersive water waves in one and two horizontal dimensions. The continuous equations are discretized using nodal polynomial basis functions of arbitrary order in space on each element of an unstructured computational domain. A fourth order explicit Runge-Kutta scheme is used to advance the solution in time. Methods for introducing artificial damping to control mild nonlinear instabilities are also discussed. The accuracy and convergence of the model with both h (grid size) and p (order) refinement are verified for the linearized equations, and calculations are provided for two nonlinear test cases in one horizontal dimension: harmonic generation over a submerged bar; and reflection of a steep solitary wave from a vertical wall.
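    The time integrator named in this record, the classical fourth-order explicit Runge-Kutta scheme, is simple enough to sketch. The scalar test problem below is only an illustration of the scheme's fourth-order convergence, not the Boussinesq system itself.

```python
import math

def rk4_step(f, t, u, dt):
    """One classical 4th-order explicit Runge-Kutta step for u' = f(t, u)."""
    k1 = f(t, u)
    k2 = f(t + dt / 2, u + dt / 2 * k1)
    k3 = f(t + dt / 2, u + dt / 2 * k2)
    k4 = f(t + dt, u + dt * k3)
    return u + dt / 6 * (k1 + 2 * k2 + 2 * k3 + k4)

# Convergence check on u' = u, u(0) = 1, whose exact solution at t = 1 is e:
# halving dt should shrink the global error by roughly 2**4 = 16.
errors = {}
for n in (10, 20):
    u, dt = 1.0, 1.0 / n
    for i in range(n):
        u = rk4_step(lambda t, y: y, i * dt, u, dt)
    errors[n] = abs(u - math.e)
print(errors[10] / errors[20])  # ≈ 16, confirming 4th-order accuracy
```

    In the DG-FEM setting, `u` would be the vector of nodal unknowns and `f` the spatial discretization's right-hand side; the step logic is unchanged.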

  8. Development of measurement standards for verifying functional performance of surface texture measuring instruments

    Energy Technology Data Exchange (ETDEWEB)

    Fujii, A [Life and Industrial Product Development Department Olympus Corporation, 2951 Ishikawa-machi, Hachiouji-shi, Tokyo (Japan); Suzuki, H [Industrial Marketing and Planning Department Olympus Corporation, Shinjyuku Monolith, 3-1 Nishi-Shinjyuku 2-chome, Tokyo (Japan); Yanagi, K, E-mail: a_fujii@ot.olympus.co.jp [Department of Mechanical Engineering, Nagaoka University of Technology, 1603-1 Kamitomioka-machi, Nagaoka-shi, Niigata (Japan)

    2011-08-19

    A new measurement standard is proposed for verifying the overall functional performance of surface texture measuring instruments. Its surface is composed of sinusoidal chirp-signal waveforms along horizontal cross sections of the material measure. A notable feature is that the amplitude of each cycle of the chirp signal is geometrically modulated so that the maximum slope is kept constant, and the maximum slope of the chirp-like signal is gradually decreased with movement in the lateral direction. We fabricated the measurement standard by FIB processing and calibrated it by AFM. We then evaluated the functional performance of a laser scanning microscope with this standard in terms of amplitude response at varying slope angles. We conclude that the proposed standard can easily evaluate the performance of surface texture measuring instruments.
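    The key geometric idea, scaling each cycle's amplitude inversely with the chirp frequency so that the maximum surface slope stays constant, can be sketched numerically. The scan length, frequency sweep, and 30° slope target below are arbitrary illustrative choices, not the specification of the fabricated standard.

```python
import numpy as np

def constant_slope_chirp(x, f_start, f_end, max_slope):
    """Chirp profile whose amplitude is modulated as 1/frequency so the
    peak slope of every cycle is (approximately) the same."""
    f_inst = f_start + (f_end - f_start) * x / x[-1]   # linear spatial-frequency sweep
    phase = 2 * np.pi * np.cumsum(f_inst) * (x[1] - x[0])
    amplitude = max_slope / (2 * np.pi * f_inst)       # a * 2*pi*f = constant slope
    return amplitude * np.sin(phase)

x = np.linspace(0.0, 1e-3, 10_000)                     # 1 mm lateral scan (metres)
z = constant_slope_chirp(x, 5e3, 50e3, max_slope=np.tan(np.radians(30)))
# The numerical peak slope stays near tan(30 deg) across the whole sweep,
# even though the local wavelength shrinks by a factor of ten.
print(np.gradient(z, x).max())
```

    An instrument scanned across such a profile sees the same maximum slope at every spatial frequency, which is what lets a single trace probe amplitude response versus slope angle.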

  9. Verifying the agreed framework between the United States and North Korea

    International Nuclear Information System (INIS)

    May, M.M.

    2001-01-01

    Under the 1994 Agreed Framework (AF) between the United States and the Democratic People's Republic of Korea (DPRK), the US and its allies will provide two nuclear-power reactors and other benefits to the DPRK in exchange for an agreement by the DPRK to declare how much nuclear-weapon material it has produced; to identify, freeze, and eventually dismantle specified facilities for producing this material; and to remain a party to the nuclear Non-Proliferation Treaty (NPT) and allow the implementation of its safeguards agreement. This study assesses the verifiability of these provisions. The study concludes that verification can be accomplished, given cooperation and openness from the DPRK. Special effort will be needed from the IAEA, as well as support from the US and the Republic of Korea. (author)

  10. Lightweight ECC based RFID authentication integrated with an ID verifier transfer protocol.

    Science.gov (United States)

    He, Debiao; Kumar, Neeraj; Chilamkurti, Naveen; Lee, Jong-Hyouk

    2014-10-01

    Radio frequency identification (RFID) technology has been widely adopted and deployed as a dominant identification technology in the health care domain, for applications such as medical information authentication, patient tracking, and blood transfusion medicine. With increasingly stringent security and privacy requirements on RFID-based authentication schemes, elliptic curve cryptography (ECC) based RFID authentication schemes have been proposed to meet those requirements. However, many recently published ECC-based RFID authentication schemes have serious security weaknesses. In this paper, we propose a new ECC-based RFID authentication scheme integrated with an ID verifier transfer protocol that overcomes the weaknesses of the existing schemes. A comprehensive security analysis shows the strong security properties provided by the proposed authentication scheme. Moreover, the performance of the proposed scheme is analyzed in terms of computational cost, communication cost, and storage requirements.

  11. The anterior choroidal artery syndrome. Pt. 2. CT and/or MR in angiographically verified cases

    International Nuclear Information System (INIS)

    Takahashi, S.; Ishii, K.; Matsumoto, K.; Higano, S.; Ishibashi, T.; Suzuki, M.; Sakamoto, K.

    1994-01-01

    We reviewed 12 cases of infarcts in the territory of the anterior choroidal artery (AChA) on CT and/or MRI. In each case, vascular occlusion in the region was verified angiographically. Although the extent of the lesion on CT/MR images was variable, all lesions were located on the axial images within an arcuate zone between the striatum anterolaterally and the thalamus posteromedially. The distribution of the lesions on multiplanar MRI conformed well to the territory of the AChA demonstrated microangiographically. The variability of the extent of the infarcts may be explained by variations in the degree of occlusive changes in the AChA or by the development of collateral circulation through anastomoses between the AChA and the posterior communicating and posterior cerebral arteries. The extent of the lesion appeared to be closely related to the degree of neurological deficit. (orig.)

  12. Verifying detailed fluctuation relations for discrete feedback-controlled quantum dynamics

    Science.gov (United States)

    Camati, Patrice A.; Serra, Roberto M.

    2018-04-01

    Discrete quantum feedback control consists of a managed dynamics according to the information acquired by a previous measurement. Energy fluctuations along such dynamics satisfy generalized fluctuation relations, which are useful tools to study the thermodynamics of systems far away from equilibrium. Due to the practical challenge to assess energy fluctuations in the quantum scenario, the experimental verification of detailed fluctuation relations in the presence of feedback control remains elusive. We present a feasible method to experimentally verify detailed fluctuation relations for discrete feedback control quantum dynamics. Two detailed fluctuation relations are developed and employed. The method is based on a quantum interferometric strategy that allows the verification of fluctuation relations in the presence of feedback control. An analytical example to illustrate the applicability of the method is discussed. The comprehensive technique introduced here can be experimentally implemented at a microscale with the current technology in a variety of experimental platforms.

  13. Getting What We Paid for: a Script to Verify Full Access to E-Resources

    Directory of Open Access Journals (Sweden)

    Kristina M. Spurgin

    2014-07-01

    Libraries regularly pay for packages of e-resources containing hundreds to thousands of individual titles. Ideally, library patrons could access the full content of all titles in such packages. In reality, library staff and patrons inevitably stumble across inaccessible titles, but no library has the resources to manually verify full access to all titles, and basic URL checkers cannot check for access. This article describes the E-Resource Access Checker—a script that automates the verification of full access. With the Access Checker, library staff can identify all inaccessible titles in a package and bring these problems to content providers’ attention to ensure we get what we pay for.
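    The core idea behind such a checker, that an HTTP 200 is not proof of full text because paywalled landing pages also return 200, can be sketched as follows. The denial-marker phrases and the `classify`/`check_access` helpers are hypothetical stand-ins; the actual Access Checker script uses provider-specific checks.

```python
import urllib.request

# Hypothetical phrases that, when present on a landing page, suggest
# the library does NOT actually have full-text access.
DENIAL_MARKERS = ("purchase this article", "sign in to access", "get access")

def classify(status: int, page_html: str) -> str:
    """Classify one title as 'ok', 'no full text', or 'error'."""
    if status != 200:
        return "error"
    page = page_html.lower()
    if any(marker in page for marker in DENIAL_MARKERS):
        return "no full text"
    return "ok"

def check_access(url: str) -> str:
    """Fetch a title URL and classify it; network failures count as 'error'."""
    try:
        with urllib.request.urlopen(url, timeout=30) as resp:
            return classify(resp.status, resp.read().decode("utf-8", errors="replace"))
    except OSError:
        return "error"
```

    Run over every title URL in a package, a loop of `check_access` calls yields the list of inaccessible titles to report to the content provider.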

  14. Error prevention in radiotherapy treatments using a record and verify system

    International Nuclear Information System (INIS)

    Navarrete Campos, S.; Hernandez Vitoria, A.; Canellas Anoz, M.; Millan Cebrian, E.; Garcia Romero, A.

    2001-01-01

    Computerized record-and-verify systems (RVS) are being used increasingly to improve the precision of radiotherapy treatments. With the introduction of new treatment devices, such as multileaf or asymmetric collimators and virtual wedges, the responsibility to ensure correct treatment has increased. The purpose of this paper is to present the method we follow to prevent some potential radiotherapy errors, and to point out some errors that can be easily detected using an RVS through a check of the daily recorded treatment information. We conclude that an RVS prevents the occurrence of many errors in which the settings of the treatment machine do not match the intended parameters within some maximal authorized deviation, and makes it easy to detect other potential errors related to an incorrect selection of the patient treatment data. A quality assurance program, including a check of all beam data and a weekly control of the manual and electronic chart, has helped reduce errors. (author)

  15. Methods to verify absorbed dose of irradiated containers and evaluation of dosimeters

    International Nuclear Information System (INIS)

    Gao Meixu; Wang Chuanyao; Tang Zhangxong; Li Shurong

    2001-01-01

    Research on the dose distribution in irradiated food containers and an evaluation of several methods to verify absorbed dose were carried out. The minimum absorbed dose of the five treated orange containers occurred at the top of the highest or the bottom of the lowest container. The D_max/D_min ratio in this study was 1.45 for irradiation in a commercial ⁶⁰Co facility. The density of the orange containers was about 0.391 g/cm³. The evaluation of dosimeters showed that the PMMA-YL and clear PMMA dosimeters have a linear dose response, and the word NOT in the STERIN-125 and STERIN-300 indicators was covered completely at doses of 125 and 300 Gy, respectively. (author)

  16. Measuring reporting verifying. A primer on MRV for nationally appropriate mitigation actions

    Energy Technology Data Exchange (ETDEWEB)

    Hinostroza, M. (ed.); Luetken, S.; Holm Olsen, K. (Technical Univ. of Denmark. UNEP Risoe Centre, Roskilde (Denmark)); Aalders, E.; Pretlove, B.; Peters, N. (Det Norske Veritas, Hellerup (Denmark))

    2012-03-15

    The requirements for measurement, reporting and verification (MRV) of nationally appropriate mitigation actions (NAMAs) are one of the crucial topics on the agenda of international negotiations to address climate change mitigation. According to the agreements so far, the general guidelines for domestic MRV are to be developed by the Subsidiary Body for Scientific and Technological Advice (SBSTA). Further, the Subsidiary Body for Implementation (SBI) will conduct international consultations and analysis (ICA) of biennial update reports (BURs) to improve the transparency of mitigation actions, which should be measured, reported and verified. What is clear from ongoing discussions at both SBSTA and SBI is that MRV for NAMAs should not be a burden for controlling greenhouse gas (GHG) emissions connected to economic activities. Instead, the MRV process should facilitate mitigation actions, encourage the redirection of investments, and address concerns regarding the carbon content of emission-intensive operations of private and public companies and enterprises worldwide. While MRV requirements are being shaped within the Convention, there are a number of initiatives supporting developing countries moving forward with NAMA development and demonstration activities. How these actions shall be measured, reported and verified, however, remains unanswered. MRV is not new: it is present in most existing policies and frameworks related to climate change mitigation. With the aim of contributing to the international debate and capacity building on this crucial issue, the UNEP Risoe Centre, in cooperation with UNDP, is pleased to present this publication, which, through direct collaboration with Det Norske Veritas (DNV), builds on existing MRV practices in current carbon markets, provides insights on how MRV for NAMAs can be performed, and identifies elements and drivers to be considered when designing adequate MRV systems for NAMAs in developing countries. This primer is the second

  17. Evaluating MC and A effectiveness to verify the presence of nuclear materials

    International Nuclear Information System (INIS)

    Dawson, P.G.; Morzinski, J.A.; Ostenak, Carl A.; Longmire, V.L.; Jewell, D.; Williams, J.D.

    2001-01-01

    Traditional materials accounting is focused exclusively on the material balance area (MBA), and involves periodically closing a material balance based on accountability measurements conducted during a physical inventory. In contrast, the physical inventory for Los Alamos National Laboratory's near-real-time accounting system is established around processes and looks more like an item inventory. That is, the intent is not to measure material for accounting purposes, since materials have already been measured in the normal course of daily operations. A given unit process operates many times over the course of a material balance period. The product of a given unit process may move for processing within another unit process in the same MBA or may be transferred out of the MBA. Since few materials are unmeasured the physical inventory for a near-real-time process area looks more like an item inventory. Thus, the intent of the physical inventory is to locate the materials on the books and verify information about the materials contained in the books. Closing a materials balance for such an area is a matter of summing all the individual mass balances for the batches processed by all unit processes in the MBA. Additionally, performance parameters are established to measure the program's effectiveness. Program effectiveness for verifying the presence of nuclear material is required to be equal to or greater than a prescribed performance level, process measurements must be within established precision and accuracy values, physical inventory results meet or exceed performance requirements, and inventory differences are less than a target/goal quantity. This approach exceeds DOE established accounting and physical inventory program requirements. Hence, LANL is committed to this approach and to seeking opportunities for further improvement through integrated technologies. This paper will provide a detailed description of this evaluation process.

  18. Evaluation of wastewater contaminant transport in surface waters using verified Lagrangian sampling

    Science.gov (United States)

    Antweiler, Ronald C.; Writer, Jeffrey H.; Murphy, Sheila F.

    2014-01-01

    Contaminants released from wastewater treatment plants can persist in surface waters for substantial distances. Much research has gone into evaluating the fate and transport of these contaminants, but this work has often assumed constant flow from wastewater treatment plants. However, effluent discharge commonly varies widely over a 24-hour period, and this variation controls contaminant loading and can profoundly influence interpretations of environmental data. We show that methodologies relying on the normalization of downstream data to conservative elements can give spurious results, and should not be used unless it can be verified that the same parcel of water was sampled. Lagrangian sampling, which in theory samples the same water parcel as it moves downstream (the Lagrangian parcel), links hydrologic and chemical transformation processes so that the in-stream fate of wastewater contaminants can be quantitatively evaluated. However, precise Lagrangian sampling is difficult, and small deviations – such as missing the Lagrangian parcel by less than 1 h – can cause large differences in measured concentrations of all dissolved compounds at downstream sites, leading to erroneous conclusions regarding in-stream processes controlling the fate and transport of wastewater contaminants. Therefore, we have developed a method termed “verified Lagrangian” sampling, which can be used to determine if the Lagrangian parcel was actually sampled, and if it was not, a means for correcting the data to reflect the concentrations which would have been obtained had the Lagrangian parcel been sampled. To apply the method, it is necessary to have concentration data for a number of conservative constituents from the upstream, effluent, and downstream sites, along with upstream and effluent concentrations that are constant over the short-term (typically 2–4 h). These corrections can subsequently be applied to all data, including non-conservative constituents. Finally, we

  19. Evaluation of wastewater contaminant transport in surface waters using verified Lagrangian sampling.

    Science.gov (United States)

    Antweiler, Ronald C; Writer, Jeffrey H; Murphy, Sheila F

    2014-02-01

    Contaminants released from wastewater treatment plants can persist in surface waters for substantial distances. Much research has gone into evaluating the fate and transport of these contaminants, but this work has often assumed constant flow from wastewater treatment plants. However, effluent discharge commonly varies widely over a 24-hour period, and this variation controls contaminant loading and can profoundly influence interpretations of environmental data. We show that methodologies relying on the normalization of downstream data to conservative elements can give spurious results, and should not be used unless it can be verified that the same parcel of water was sampled. Lagrangian sampling, which in theory samples the same water parcel as it moves downstream (the Lagrangian parcel), links hydrologic and chemical transformation processes so that the in-stream fate of wastewater contaminants can be quantitatively evaluated. However, precise Lagrangian sampling is difficult, and small deviations - such as missing the Lagrangian parcel by less than 1h - can cause large differences in measured concentrations of all dissolved compounds at downstream sites, leading to erroneous conclusions regarding in-stream processes controlling the fate and transport of wastewater contaminants. Therefore, we have developed a method termed "verified Lagrangian" sampling, which can be used to determine if the Lagrangian parcel was actually sampled, and if it was not, a means for correcting the data to reflect the concentrations which would have been obtained had the Lagrangian parcel been sampled. To apply the method, it is necessary to have concentration data for a number of conservative constituents from the upstream, effluent, and downstream sites, along with upstream and effluent concentrations that are constant over the short-term (typically 2-4h). These corrections can subsequently be applied to all data, including non-conservative constituents. Finally, we show how data
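    The verification step described in this record rests on conservative constituents behaving as a two-component mixture of upstream water and effluent. A minimal sketch of that mass balance follows; the constituents and concentrations are invented for illustration, and the paper's actual correction procedure is more involved than this two-endmember mixing model.

```python
def mixing_fraction(c_up, c_eff, c_down):
    """Effluent fraction of the downstream parcel implied by one
    conservative constituent (simple two-component mixing)."""
    return (c_down - c_up) / (c_eff - c_up)

def expected_downstream(c_up, c_eff, f):
    """Downstream concentration a constituent would have under pure dilution."""
    return c_up + f * (c_eff - c_up)

# Hypothetical upstream / effluent / downstream values (mg/L) for two
# conservative constituents. Consistent fractions across constituents
# suggest the Lagrangian parcel really was sampled; divergent fractions
# flag a timing miss that the correction must account for.
conservative = {"chloride": (12.0, 180.0, 96.0), "boron": (0.02, 0.30, 0.16)}
fractions = {name: mixing_fraction(*c) for name, c in conservative.items()}
f_mean = sum(fractions.values()) / len(fractions)
print(fractions, f_mean)  # both fractions come out at 0.5 here
```

    Once a consistent effluent fraction is established from the conservative constituents, `expected_downstream` gives the purely dilutive concentration against which measured non-conservative constituents can be compared to infer in-stream losses or gains.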

  20. Ordered delinquency: the "effects" of birth order on delinquency.

    Science.gov (United States)

    Cundiff, Patrick R

    2013-08-01

    Juvenile delinquency has long been associated with birth order in popular culture. While images of the middle child acting out for attention or the rebellious youngest child readily spring to mind, little research has attempted to explain why. Drawing from Adlerian birth order theory and Sulloway's born-to-rebel hypothesis, I examine the relationship between birth order and a variety of delinquent outcomes during adolescence. Following some recent research on birth order and intelligence, I use new methods that allow for the examination of between-individual and within-family differences to better address the potential spurious relationship. My findings suggest that contrary to popular belief, the relationship between birth order and delinquency is spurious. Specifically, I find that birth order effects on delinquency are spurious and largely products of the analytic methods used in previous tests of the relationship. The implications of this finding are discussed.

  1. Default settings of computerized physician order entry system order sets drive ordering habits.

    Science.gov (United States)

    Olson, Jordan; Hollenbeak, Christopher; Donaldson, Keri; Abendroth, Thomas; Castellani, William

    2015-01-01

    Computerized physician order entry (CPOE) systems are quickly becoming ubiquitous, and groups of orders ("order sets") that allow for easy order input are a common feature. This provides a streamlined mechanism to view, modify, and place groups of related orders, often serving as an electronic equivalent of a specialty requisition. A characteristic of these order sets is that specific orders can be predetermined to be "preselected" or "defaulted-on" whenever the order set is used, while others are "optional" or "defaulted-off" (though there is typically the option to "deselect" defaulted-on tests in a given situation). While it seems intuitive that the defaults in an order set are often accepted, additional study is required to understand the impact of these default settings on ordering habits. This study set out to quantify the effect of changing the default settings of an order set. For quality improvement purposes, order sets dealing with transfusions were recently reviewed and modified to improve monitoring of outcomes. Initially, the orders for posttransfusion hematocrits and platelet counts had their default setting changed from "optional" to "preselected." The default setting for the platelet count was later changed back to "optional," allowing for a natural experiment to study the effect of the default selections of an order set on clinician ordering habits. Posttransfusion hematocrit values were ordered for 8.3% of red cell transfusions when the default order set selection was "off" and for 57.4% of transfusions when the default selection was "preselected." The posttransfusion platelet count ordering rate likewise increased to 59.4% when the default was changed to "preselected," and the rates during the two "optional" periods (7.0% versus 7.5%) were not statistically different (P = 0.620). Default settings in CPOE order sets can significantly influence physician selection of tests.
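The comparison of ordering rates between default settings is a two-proportion problem. A minimal sketch of the significance test involved, using hypothetical counts (the study's denominators are not given in the record):

```python
import math

def two_proportion_z(x1, n1, x2, n2):
    """Two-sided two-proportion z-test; returns (z, p_value)."""
    p1, p2 = x1 / n1, x2 / n2
    pooled = (x1 + x2) / (n1 + n2)
    se = math.sqrt(pooled * (1 - pooled) * (1 / n1 + 1 / n2))
    z = (p1 - p2) / se
    p_value = math.erfc(abs(z) / math.sqrt(2))  # two-sided normal tail
    return z, p_value

# Hypothetical counts matching the reported rates: 8.3% of 1000 transfusions
# with the default "off" versus 57.4% of 1000 with the default "preselected".
z, p = two_proportion_z(83, 1000, 574, 1000)
print(round(z, 2), p < 0.001)
```

With rate differences this large, the test is overwhelmingly significant for any plausible sample size, which matches the abstract's contrast with the non-significant 7.0% versus 7.5% comparison.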

  2. Defining and Verifying Research Grade Airborne Laser Swath Mapping (ALSM) Observations

    Science.gov (United States)

    Carter, W. E.; Shrestha, R. L.; Slatton, C. C.

    2004-12-01

    The first and primary goal of the National Science Foundation (NSF) supported Center for Airborne Laser Mapping (NCALM), operated jointly by the University of Florida and the University of California, Berkeley, is to make "research grade" ALSM data widely available at affordable cost to the national scientific community. Cost aside, researchers need to know what NCALM considers research grade data and how the quality of the data is verified, to be able to determine the likelihood that the data they receive will meet their project-specific requirements. Given the current state of the technology, it is reasonable to expect a well-planned and executed survey to produce surface elevations with uncertainties less than 10 centimeters and horizontal uncertainties of a few decimeters. Various components of the total error are generally associated with the aircraft trajectory, aircraft orientation, or laser vectors. Aircraft trajectory error depends largely on the Global Positioning System (GPS) observations, aircraft orientation on Inertial Measurement Unit (IMU) observations, and laser vectors on the scanning and ranging instrumentation. In addition to the precision or accuracy of the coordinates of the surface points, consideration must also be given to the point-to-point spacing and voids in the coverage. The major sources of error produce distinct artifacts in the data set. For example, aircraft trajectory errors tend to change slowly as the satellite constellation geometry varies, producing slopes within swaths and offsets between swaths. Roll, pitch, and yaw biases in the IMU observations tend to persist through whole flights and create distinctive artifacts in the swath overlap areas. Errors in the zero-point and scale of the laser scanner cause the edges of swaths to turn up or down. Range walk errors cause offsets between bright and dark surfaces, causing painted stripes to appear to float above the dark surfaces of roads. The three keys to producing

  3. Characteristic test of initial HTTR core

    International Nuclear Information System (INIS)

    Nojiri, Naoki; Shimakawa, Satoshi; Fujimoto, Nozomu; Goto, Minoru

    2004-01-01

    This paper describes the results of the core physics tests during start-up and power-up of the HTTR. The tests were conducted in order to ensure the performance and safety of the high temperature gas cooled reactor, and were carried out to measure the approach to criticality, the excess reactivity, the shutdown margin, the control rod worth, the reactivity coefficients, the neutron flux distribution, and the power distribution. The expected core performance and the required reactor safety characteristics were verified from the results of measurements and calculations.

  4. Proof tests on reliability

    International Nuclear Information System (INIS)

    Mishima, Yoshitsugu

    1983-01-01

    In order to obtain public understanding of nuclear power plants, tests should be carried out to prove the reliability and safety of present LWR plants. For example, the aseismicity of nuclear power plants must be verified by using a large-scale earthquake simulator. Reliability tests began in fiscal 1975, and the proof tests on steam generators and on PWR support and flexure pins against stress corrosion cracking have already been completed; the results have been internationally highly appreciated. The capacity factor of nuclear power plant operation in Japan rose to 80% in the summer of 1983, which, considering the period of regular inspection, means operation at almost full capacity. Japanese LWR technology has now risen to the top place in the world after having overcome earlier defects. The significance of the reliability tests is to ensure functioning until the age limit is reached, to confirm the correct forecast of deterioration processes, to confirm the effectiveness of remedies for defects, and to confirm the accuracy of predictions of facility behavior. The reliability of nuclear valves, fuel assemblies, heat-affected zones in welding, reactor cooling pumps, and electric instruments has been tested or is being tested. (Kako, I.)

  5. Certified higher-order recursive path ordering

    NARCIS (Netherlands)

    Koprowski, A.; Pfenning, F.

    2006-01-01

    The paper reports on a formalization of a proof of wellfoundedness of the higher-order recursive path ordering (HORPO) in the proof checker Coq. The development is axiom-free and fully constructive. Three substantive parts that could be used also in other developments are the formalizations of the

  6. Automatic data-processing equipment of moon mark of nail for verifying some experiential theory of Traditional Chinese Medicine.

    Science.gov (United States)

    Niu, Renjie; Fu, Chenyu; Xu, Zhiyong; Huang, Jianyuan

    2016-04-29

    Doctors who practice Traditional Chinese Medicine (TCM) diagnose using four methods - inspection, auscultation and olfaction, interrogation, and pulse feeling/palpation. The shape of the moon marks on the nails, and changes in that shape, are an important indication when judging the patient's health. There is a series of classical and experiential theories about moon marks in TCM which lack support from statistical data. The aim was to verify some experiential theories on moon marks in TCM using automatic data-processing equipment. This paper proposes equipment that utilizes image processing technology to collect moon mark data from different target groups conveniently and quickly, building a database that combines this information with that gathered from the health and mental status questionnaire in each test. The equipment has a simple design, a low cost, and an optimized algorithm, and in practice has been shown to quickly complete automatic acquisition and preservation of key data about moon marks. In the future, conclusions will likely be drawn from these data, and changes in moon marks related to specific pathological changes may be established with statistical methods.

  7. Quality assurance for high dose rate brachytherapy treatment planning optimization: using a simple optimization to verify a complex optimization

    International Nuclear Information System (INIS)

    Deufel, Christopher L; Furutani, Keith M

    2014-01-01

    As dose optimization for high dose rate brachytherapy becomes more complex, it becomes increasingly important to have a means of verifying that optimization results are reasonable. A method is presented for using a simple optimization as quality assurance for the more complex optimization algorithms typically found in commercial brachytherapy treatment planning systems. Quality assurance tests may be performed during commissioning, at regular intervals, and/or on a patient specific basis. A simple optimization method is provided that optimizes conformal target coverage using an exact, variance-based, algebraic approach. Metrics such as dose volume histogram, conformality index, and total reference air kerma agree closely between simple and complex optimizations for breast, cervix, prostate, and planar applicators. The simple optimization is shown to be a sensitive measure for identifying failures in a commercial treatment planning system that are possibly due to operator error or weaknesses in planning system optimization algorithms. Results from the simple optimization are surprisingly similar to the results from a more complex, commercial optimization for several clinical applications. This suggests that there are only modest gains to be made from making brachytherapy optimization more complex. The improvements expected from sophisticated linear optimizations, such as PARETO methods, will largely be in making systems more user friendly and efficient, rather than in finding dramatically better source strength distributions. (paper)
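The idea of checking a complex planning system against a simple optimization can be illustrated with an idealized least-squares dwell-time calculation. This is a sketch under strong assumptions (a bare inverse-square point-source kernel rather than the full TG-43 dose formalism, and a hypothetical two-dwell geometry), not necessarily the authors' exact variance-based algebraic method:

```python
def dose_kernel(source_pos, point):
    """Idealized point-source dose contribution ~ 1 / r^2 per unit dwell time."""
    r2 = sum((s - p) ** 2 for s, p in zip(source_pos, point))
    return 1.0 / r2

def solve_2x2(a, b, c, d, e, f):
    """Solve [[a, b], [c, d]] @ [x, y] = [e, f] by Cramer's rule."""
    det = a * d - b * c
    return (e * d - b * f) / det, (a * f - e * c) / det

sources = [(0.0, 0.0), (1.0, 0.0)]               # two dwell positions (cm)
targets = [(0.5, 1.0), (0.0, 1.5), (1.0, 1.5)]   # prescription points (cm)
prescribed = [1.0, 1.0, 1.0]                     # desired relative dose

# Dose matrix A (targets x dwell positions) and normal equations A^T A t = A^T d
# for the least-squares dwell times t.
A = [[dose_kernel(s, p) for s in sources] for p in targets]
ata = [[sum(A[i][r] * A[i][c] for i in range(len(A))) for c in range(2)]
       for r in range(2)]
atd = [sum(A[i][r] * prescribed[i] for i in range(len(A))) for r in range(2)]
t1, t2 = solve_2x2(ata[0][0], ata[0][1], ata[1][0], ata[1][1], atd[0], atd[1])

# Achieved doses from the simple plan, to compare against a complex planner's
# dose metrics at the same points.
achieved = [A[i][0] * t1 + A[i][1] * t2 for i in range(len(A))]
print([round(x, 3) for x in achieved])
```

In a QA setting, dose-volume metrics from such a transparent plan would be compared against the commercial optimizer's output; gross disagreement flags operator error or an optimizer weakness.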

  8. Testing Testing Testing.

    Science.gov (United States)

    Deville, Craig; O'Neill, Thomas; Wright, Benjamin D.; Woodcock, Richard W.; Munoz-Sandoval, Ana; Gershon, Richard C.; Bergstrom, Betty

    1998-01-01

    Articles in this special section consider (1) flow in test taking (Craig Deville); (2) testwiseness (Thomas O'Neill); (3) test length (Benjamin Wright); (4) cross-language test equating (Richard W. Woodcock and Ana Munoz-Sandoval); (5) computer-assisted testing and testwiseness (Richard Gershon and Betty Bergstrom); and (6) Web-enhanced testing…

  9. Isotope correlation techniques for verifying input accountability measurements at a reprocessing plant

    International Nuclear Information System (INIS)

    Umezawa, H.; Nakahara, Y.

    1983-01-01

    Isotope correlation techniques were studied to verify input accountability measurements at a reprocessing plant. On the basis of a historical data bank, correlation between plutonium-to-uranium ratio and isotopic variables was derived as a function of burnup. The burnup was determined from the isotopic ratios of uranium and plutonium, too. Data treatment was therefore made in an iterative manner. The isotopic variables were defined to cover a wide spectrum of isotopes of uranium and plutonium. The isotope correlation techniques evaluated important parameters such as the fuel burnup, the most probable ratio of plutonium to uranium, and the amounts of uranium and plutonium in reprocessing batches in connection with fresh fuel fabrication data. In addition, the most probable values of isotope abundance of plutonium and uranium could be estimated from the plutonium-to-uranium ratio determined, being compared with the reported data for verification. A pocket-computer-based system was developed to enable inspectors to collect and evaluate data in a timely fashion at the input accountability measurement point by the isotope correlation techniques. The device is supported by battery power and completely independent of the operator's system. The software of the system was written in BASIC. The data input can be stored in a cassette tape and transferred into a higher level computer. The correlations used for the analysis were given as a form of analytical function. Coefficients for the function were provided relevant to the type of reactor and the initial enrichment of fuel. (author)
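The iterative treatment described above (burnup estimated from isotopic ratios, and the Pu/U ratio from a burnup-dependent correlation) can be sketched as a fixed-point iteration. Both correlation functions and all coefficients below are hypothetical illustrations, not the historical-data-bank correlations used by the authors:

```python
def burnup_from_ratio(isotopic_ratio, pu_u_ratio):
    """Hypothetical correlation: burnup (GWd/t) from an isotopic variable,
    with a weak dependence on the current Pu/U estimate."""
    return 30.0 * isotopic_ratio + 5.0 * pu_u_ratio

def pu_u_from_burnup(burnup):
    """Hypothetical polynomial correlation for the Pu/U mass ratio."""
    return 1.0e-3 + 2.0e-4 * burnup - 1.0e-6 * burnup ** 2

measured_isotopic_ratio = 1.0
pu_u = 0.0
for _ in range(50):  # iterate until the two correlations are self-consistent
    burnup = burnup_from_ratio(measured_isotopic_ratio, pu_u)
    new_pu_u = pu_u_from_burnup(burnup)
    if abs(new_pu_u - pu_u) < 1e-12:
        break
    pu_u = new_pu_u
print(round(burnup, 3), round(pu_u, 6))
```

The converged pair (burnup, Pu/U ratio) is the "most probable" estimate that an inspector's pocket computer could compare against declared values.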

  10. Method for verifying the pressure in a nuclear reactor fuel rod

    International Nuclear Information System (INIS)

    Jones, W.J.

    1979-01-01

    Disclosed is a method of accurately verifying the pressure contained in a sealed pressurized fuel rod by utilizing a pressure balance measurement technique wherein an end of the fuel rod extends through and is sealed in a wall of a small chamber. The chamber is pressurized to the nominal (desired) fuel rod pressure and the fuel rod is then pierced to interconnect the chamber and fuel rod. The deviation of chamber pressure is noted. The final combined pressure of the fuel rod and drill chamber is substantially equal to the nominal rod pressure; departure of the combined pressure from nominal is in direct proportion to departure of rod pressure from nominal. The maximum error in computing the rod pressure from the deviation of the combined pressure from nominal is estimated at plus or minus 3.0 psig for rod pressures within the specified production limits. If the rod pressure is corrected for rod void volume using a digital printer data record, the accuracy improves to about plus or minus 2.0 psig
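The pressure-balance arithmetic described above can be sketched with an ideal-gas mixing model at constant temperature. The volumes and pressures below are hypothetical, not values from the patent:

```python
def combined_pressure(p_rod, v_rod, p_chamber, v_chamber):
    """Pressure after the rod is pierced into the chamber (ideal-gas mixing)."""
    return (p_rod * v_rod + p_chamber * v_chamber) / (v_rod + v_chamber)

def inferred_rod_pressure(p_combined, p_nominal, v_rod, v_chamber):
    """Invert the balance: the chamber is pre-pressurized to nominal, so the
    departure of the combined pressure from nominal scales directly to the
    departure of the rod pressure from nominal."""
    return p_nominal + (p_combined - p_nominal) * (v_rod + v_chamber) / v_rod

p_nominal = 400.0              # psig, desired rod pressure (hypothetical)
v_rod, v_chamber = 10.0, 1.0   # cm^3; a small chamber keeps sensitivity high
p_rod_true = 415.0             # actual rod pressure (hypothetical)

p_comb = combined_pressure(p_rod_true, v_rod, p_nominal, v_chamber)
print(round(p_comb, 2), round(inferred_rod_pressure(p_comb, p_nominal, v_rod, v_chamber), 2))
```

Because the chamber starts at the nominal pressure, a rod exactly at nominal produces zero deviation; this is why the combined pressure's departure from nominal is directly proportional to the rod's departure.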

  11. Experimental evaluation of the exposure level onboard Czech Airlines aircraft - measurements verified the routine method

    International Nuclear Information System (INIS)

    Ploc, O.; Spurny, F.; Turek, K.; Kovar, I.

    2008-01-01

    Air-crew members are exposed to ionizing radiation due to their work on board aircraft. In 1990, the International Commission on Radiological Protection (ICRP) recommended that exposure to cosmic radiation in the operation of jet aircraft be recognised as occupational exposure. Czech air transport operators are therefore obliged to ensure: - that air-crew members are well informed about the exposure level and health risks; - an analysis of the complete exposure level of aircraft crew and its continuing monitoring in cases exceeding the informative value of 1 mSv; - compliance with the limit of 1 mSv during pregnancy. Since 1998, after receiving a proper accreditation, the Department of Radiation Dosimetry of the Nuclear Physics Institute of the Czech Academy of Sciences (DRD) has been the competent dosimetric service realizing the requirements of Notice No. 307 of the State Office for Nuclear Safety concerning air-crew exposure (paragraphs 87-90). The DRD developed its routine method of personal dosimetry of aircraft crew in 1998. The DRD therefore helps Czech Airlines a.s. (CSA) with the legislative obligations mentioned above; in return, once per four years, under a business contract, CSA allows scientific measurements to be performed by DRD on board its aircraft with the aim of verifying the method of routine individual monitoring of aircraft crew exposure. (authors)

  12. Could hypomanic traits explain selective migration? Verifying the hypothesis by the surveys on sardinian migrants.

    Science.gov (United States)

    Giovanni, Carta Mauro; Francesca, Moro Maria; Viviane, Kovess; Brasesco, Maria Veronica; Bhat, Krishna M; Matthias, Angermeyer C; Akiskal, Hagop S

    2012-01-01

    A recent survey put forward the hypothesis that the emigration from Sardinia from the 1960s to the 1980s selected people with a hypomanic temperament. This paper aims to verify whether the people who migrated from Sardinia in that period have shown a high risk of mood disorders in the surveys carried out in their host countries, and whether the results are consistent with this hypothesis. This is a systematic review. In the 1970s, when examining attitudes towards migration in Sardinian couples waiting to emigrate, Rudas found that the decision to emigrate was principally taken by males; female emigrants showed lower self-esteem than male emigrants. A study on Sardinian immigrants in Argentina carried out in 2001-02, at the peak of the economic crisis, found a high risk of depressive disorders in women only. These results were opposite to the findings recorded ten years earlier in a survey on Sardinian immigrants in Paris, where the risk of a depressive episode was higher in young men only. The data point to a bipolar disorder risk for young (probably hypomanic) male migrants in competitive, challenging conditions, and a different kind of depressive episode for women in trying economic conditions. The results of the surveys on Sardinian migrants are partially in agreement with the hypothesis of a selective migration of people with a hypomanic temperament. Early motivations and self-esteem seem related to the ways mood disorders are expressed, and to the vulnerability to specific triggering situations in the host country.

  13. Experimentally verified inductance extraction and parameter study for superconductive integrated circuit wires crossing ground plane holes

    International Nuclear Information System (INIS)

    Fourie, Coenrad J; Wetzstein, Olaf; Kunert, Juergen; Meyer, Hans-Georg; Toepfer, Hannes

    2013-01-01

    As the complexity of rapid single flux quantum (RSFQ) circuits increases, both current and power consumption of the circuits become important design criteria. Various new concepts such as inductive biasing for energy efficient RSFQ circuits and inductively coupled RSFQ cells for current recycling have been proposed to overcome increasingly severe design problems. Both of these techniques use ground plane holes to increase the inductance or coupling factor of superconducting integrated circuit wires. New design tools are consequently required to handle the new topographies. One important issue in such circuit design is the accurate calculation of networks of inductances even in the presence of finite holes in the ground plane. We show how a fast network extraction method using InductEx, which is a pre- and post-processor for the magnetoquasistatic field solver FastHenry, is used to calculate the inductances of a set of SQUIDs (superconducting quantum interference devices) with ground plane holes of different sizes. The results are compared to measurements of physical structures fabricated with the IPHT Jena 1 kA cm⁻² RSFQ niobium process to verify accuracy. We then do a parameter study and derive empirical equations for fast and useful estimation of the inductance of wires surrounded by ground plane holes. We also investigate practical circuits and show excellent accuracy. (paper)

  14. Identifying the 'right patient': nurse and consumer perspectives on verifying patient identity during medication administration.

    Science.gov (United States)

    Kelly, Teresa; Roper, Cath; Elsom, Stephen; Gaskin, Cadeyrn

    2011-10-01

    Accurate verification of patient identity during medication administration is an important component of medication administration practice. In medical and surgical inpatient settings, the use of identification aids, such as wristbands, is common. In many psychiatric inpatient units in Victoria, Australia, however, standardized identification aids are not used. The present paper outlines the findings of a qualitative research project that employed focus groups to examine mental health nurse and mental health consumer perspectives on the identification of patients during routine medication administration in psychiatric inpatient units. The study identified a range of different methods currently employed to verify patient identity, including technical methods, such as wristbands and photographs, and interpersonal methods, such as patient recognition. There were marked similarities in the perspectives of mental health nurses and mental health consumers regarding their opinions and preferences. Technical aids were seen as important, but not as a replacement for the therapeutic nurse-patient encounter. © 2011 The Authors. International Journal of Mental Health Nursing © 2011 Australian College of Mental Health Nurses Inc.

  15. k∞-meter concept verified via subcritical-critical TRIGA experiments

    International Nuclear Information System (INIS)

    Ocampo Mansilla, H.

    1983-01-01

    This work presents a technique for building a device to measure the k∞ of a spent nuclear fuel assembly discharged from the core of a nuclear power plant. The device, called a k∞-meter, consists of a cross-shaped subcritical assembly, two artificial neutron sources, and two separate neutron counting systems. The central position of the subcritical assembly is used to measure the k∞ of the spent fuel assembly. The initial subcritical assembly is calibrated to determine its k-eff and verify the assigned k∞ of a selected fuel assembly placed in the central position. Count rates are taken with a fuel assembly of known k∞ placed in the central position and then repeated with a fuel assembly of unknown k∞ in the central position. The count rate ratio of the unknown fuel assembly to the known fuel assembly is used to determine the k∞ of the unknown fuel assembly, which is represented as a polynomial function of the count rate ratios. The coefficients of the polynomial equation are determined using the neutronic codes LEOPARD and EXTERMINATOR-II. The analytical approach has been validated by performing several subcritical/critical experiments, using the Penn State Breazeale TRIGA Reactor (PSBR), and comparing the experimental results with the calculations
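The count-rate-ratio calibration can be sketched as a polynomial evaluation. The coefficients below are hypothetical placeholders for those the authors derived with LEOPARD and EXTERMINATOR-II:

```python
# Hypothetical calibration: k-infinity = c0 + c1*R + c2*R^2, where R is the
# ratio of count rates (unknown assembly / known assembly).
COEFFS = [0.20, 0.95, -0.10]

def k_infinity(rate_unknown, rate_known):
    """Evaluate the calibration polynomial in the count-rate ratio R."""
    r = rate_unknown / rate_known
    return sum(c * r ** i for i, c in enumerate(COEFFS))

# A ratio of 1.0 should reproduce roughly the known assembly's k-infinity,
# and a higher count rate implies a more reactive (higher k-infinity) assembly.
print(round(k_infinity(1000.0, 1000.0), 3), round(k_infinity(1100.0, 1000.0), 3))
```

In the actual device, separate coefficient sets would be provided per reactor type and initial enrichment, as the abstract of record 9 above notes for a related correlation approach.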

  16. The Mitochondrial Protein Atlas: A Database of Experimentally Verified Information on the Human Mitochondrial Proteome.

    Science.gov (United States)

    Godin, Noa; Eichler, Jerry

    2017-09-01

    Given its central role in various biological systems, as well as its involvement in numerous pathologies, the mitochondrion is one of the best-studied organelles. However, although the mitochondrial genome has been extensively investigated, protein-level information remains partial, and in many cases, hypothetical. The Mitochondrial Protein Atlas (MPA; URL: lifeserv.bgu.ac.il/wb/jeichler/MPA ) is a database that provides a complete, manually curated inventory of only experimentally validated human mitochondrial proteins. The MPA presently contains 911 unique protein entries, each of which is associated with at least one experimentally validated and referenced mitochondrial localization. The MPA also contains experimentally validated and referenced information defining function, structure, involvement in pathologies, interactions with other MPA proteins, as well as the method(s) of analysis used in each instance. Connections to relevant external data sources are offered for each entry, including links to NCBI Gene, PubMed, and Protein Data Bank. The MPA offers a prototype for other information sources that allow for a distinction between what has been confirmed and what remains to be verified experimentally.

  17. A Pilot Study Verifying How the Curve Information Impacts on the Driver Performance with Cognition Model

    Directory of Open Access Journals (Sweden)

    Xiaohua Zhao

    2013-01-01

    Full Text Available Drivers' misjudgment is a significant issue for curve safety. It is considered a more influential factor than other traffic environmental conditions for inducing risk. Research has suggested that cognition theory can explain the process of drivers' behavior at curves. In this simulator experiment, a basic cognition model was built to examine the rationality of this explanation. The core of this pilot study was using one of the driving decision strategies for braking at curves to verify the accuracy of the cognition model. The experiment therefore included three treatments of information-providing modes. The results showed that advance warning information about curves moves the position of first braking away from the curve, a phenomenon consistent with the model's inference. Thus, this study indicates that the process of drivers' behavior at curves can be explained by cognition theory and represented by the cognition model. The model's characteristics and working parameters can be refined through further research, and the model can then inform the design of appropriate warning information to help drivers avoid misjudgment.

  18. Verifying Real-Time Systems using Explicit-time Description Methods

    Directory of Open Access Journals (Sweden)

    Hao Wang

    2009-12-01

    Full Text Available Timed model checking has been extensively researched in recent years. Many new formalisms with time extensions and tools based on them have been presented. On the other hand, Explicit-Time Description Methods aim to verify real-time systems with general untimed model checkers. Lamport presented an explicit-time description method using a clock-ticking process (Tick to simulate the passage of time together with a group of global variables for time requirements. This paper proposes a new explicit-time description method with no reliance on global variables. Instead, it uses rendezvous synchronization steps between the Tick process and each system process to simulate time. This new method achieves better modularity and facilitates usage of more complex timing constraints. The two explicit-time description methods are implemented in DIVINE, a well-known distributed-memory model checker. Preliminary experiment results show that our new method, with better modularity, is comparable to Lamport's method with respect to time and memory efficiency.
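The explicit-time idea can be sketched in plain Python rather than the DIVINE/Promela setting of the paper: a Tick process advances a discrete clock, and each system process performs a synchronization step with Tick before time may advance, standing in for the rendezvous-based method described above. The process behavior below is hypothetical:

```python
class TimedProcess:
    def __init__(self, name, deadline):
        self.name, self.deadline, self.done = name, deadline, False

    def on_tick(self, now):
        """Rendezvous with Tick: the process observes the new time value."""
        if not self.done and now >= self.deadline:
            self.done = True
            return f"{self.name} fired at t={now}"
        return None

def run(processes, horizon):
    """Tick loop: time advances only after every process has synchronized."""
    clock, events = 0, []
    while clock < horizon and not all(p.done for p in processes):
        clock += 1  # the Tick step
        for p in processes:
            msg = p.on_tick(clock)
            if msg:
                events.append(msg)
    return clock, events

clock, events = run([TimedProcess("watchdog", 3), TimedProcess("sensor", 5)], 10)
print(clock, events)
```

The point of the rendezvous formulation is visible even here: no process reads a shared global time variable; each learns the time only through its synchronization with Tick, which is what improves modularity in the authors' method.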

  19. On verifying magnetic dipole moment of a magnetic torquer by experiments

    Science.gov (United States)

    Kuyyakanont, Aekjira; Kuntanapreeda, Suwat; Fuengwarodsakul, Nisai H.

    2018-01-01

    Magnetic torquers are used for the attitude control of small satellites, such as CubeSats in Low Earth Orbit (LEO). During the design of a magnetic torquer, it is necessary to confirm whether its magnetic dipole moment is sufficient to control the satellite attitude; the magnetic dipole moment affects the detumbling time and the satellite rotation time. It is also necessary to understand how to design the magnetic torquer for operation in a CubeSat under the space environment at LEO. This paper reports an investigation of the magnetic dipole moment and the magnetic field generated by a circular air-core magnetic torquer using experimental measurements. The experimental testbed was built on an air bearing under a magnetic field generated by a Helmholtz coil. This paper also describes the procedure to determine and verify the magnetic dipole moment of the designed circular air-core magnetic torquer. The experimental results are compared with the design calculations; according to the comparison, the designed magnetic torquer reaches the required magnetic dipole moment. This magnetic torquer will be applied to the attitude control system of a 1U CubeSat in the project "KNACKSAT."
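The design calculation being verified can be sketched with the standard coil relations m = N·I·A and, for the on-axis field at the coil center, B = μ0·N·I/(2R). The coil parameters below are hypothetical, not the KNACKSAT values:

```python
import math

MU0 = 4 * math.pi * 1e-7  # vacuum permeability (T*m/A)

def dipole_moment(turns, current, radius):
    """Magnetic dipole moment of a circular coil: m = N * I * pi * r^2 (A*m^2)."""
    return turns * current * math.pi * radius ** 2

def field_at_center(turns, current, radius):
    """On-axis magnetic field at the coil center: B = mu0 * N * I / (2 * r) (T)."""
    return MU0 * turns * current / (2 * radius)

# Hypothetical 1U-sized air-core coil: 200 turns, 50 mA, 4 cm radius.
N, I, R = 200, 0.05, 0.04
print(round(dipole_moment(N, I, R), 4), field_at_center(N, I, R))
```

Comparing such a calculated m against the moment inferred from torque measurements on an air bearing (in a known Helmholtz-coil field) is the kind of design-versus-experiment check the abstract describes.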

  20. Verifying Identities of Plant-Based Multivitamins Using Phytochemical Fingerprinting in Combination with Multiple Bioassays.

    Science.gov (United States)

    Lim, Yeni; Ahn, Yoon Hee; Yoo, Jae Keun; Park, Kyoung Sik; Kwon, Oran

    2017-09-01

    Sales of multivitamins have been growing rapidly, and the concept of natural or plant-based multivitamins has been introduced in the market, leading consumers to anticipate additional health benefits from phytochemicals that accompany the vitamins. However, the lack of labeling requirements might lead to fraudulent claims. Therefore, the objective of this study was to develop a strategy to verify the identity of plant-based multivitamins. Phytochemical fingerprinting was used to discriminate identities. In addition, multiple bioassays were performed to determine total antioxidant capacity. A statistical computation model was then used to measure the contributions of phytochemicals and vitamins to antioxidant activities. Fifteen multivitamins were purchased from local markets in Seoul, Korea and classified into three groups according to the number of plant ingredients. Pearson correlation analysis among antioxidant capacities, amounts of phenols, and numbers of plant ingredients revealed that ferric reducing antioxidant power (FRAP) and 2,2-diphenyl-1-picrylhydrazyl (DPPH) assay results had the highest correlation with total phenol content. This suggests that FRAP and DPPH assays are useful for characterizing plant-derived multivitamins. Furthermore, net effect linear regression analysis confirmed that the contribution of phytochemicals to total antioxidant capacities was always relatively higher than that of vitamins. Taken together, the results suggest that phytochemical fingerprinting in combination with multiple bioassays could be used as a strategy to determine whether plant-derived multivitamins could provide additional health benefits beyond their nutritional value.
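The correlation step described above amounts to computing Pearson's r between total phenol content and an antioxidant-capacity readout. A minimal sketch with hypothetical values (not the fifteen products measured in the study):

```python
import math

def pearson_r(xs, ys):
    """Pearson product-moment correlation coefficient."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = math.sqrt(sum((x - mx) ** 2 for x in xs))
    sy = math.sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

# Hypothetical measurements for five products.
total_phenols = [12.0, 25.0, 31.0, 44.0, 58.0]   # e.g. mg GAE per serving
frap_values   = [0.30, 0.55, 0.70, 0.95, 1.20]   # e.g. FRAP absorbance units
print(round(pearson_r(total_phenols, frap_values), 3))
```

A high r between an assay readout and total phenol content is what led the authors to recommend FRAP and DPPH for characterizing plant-derived multivitamins.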

  1. Verified by Visa and MasterCard SecureCode: Or, How Not to Design Authentication

    Science.gov (United States)

    Murdoch, Steven J.; Anderson, Ross

    Banks worldwide are starting to authenticate online card transactions using the '3-D Secure' protocol, which is branded as Verified by Visa and MasterCard SecureCode. This has been partly driven by the sharp increase in online fraud that followed the deployment of EMV smart cards for cardholder-present payments in Europe and elsewhere. 3-D Secure has so far escaped academic scrutiny; yet it might be a textbook example of how not to design an authentication protocol. It ignores good design principles and has significant vulnerabilities, some of which are already being exploited. Also, it provides a fascinating lesson in security economics. While other single sign-on schemes such as OpenID, InfoCard and Liberty came up with decent technology they got the economics wrong, and their schemes have not been adopted. 3-D Secure has lousy technology, but got the economics right (at least for banks and merchants); it now boasts hundreds of millions of accounts. We suggest a path towards more robust authentication that is technologically sound and where the economics would work for banks, merchants and customers - given a gentle regulatory nudge.

  2. Verified spider bites in Oregon (USA) with the intent to assess hobo spider venom toxicity.

    Science.gov (United States)

    McKeown, Nathanael; Vetter, Richard S; Hendrickson, Robert G

    2014-06-01

    This study compiled 33 verified spider bites from the state of Oregon (USA). The initial goal was to amass a series of bites by the hobo spider to assess whether it possesses toxic venom, a supposition which is currently in a contested state. None of the 33 bites from several spider species developed significant medical symptoms nor did dermonecrosis occur. The most common biters were the yellow sac spider, Cheiracanthium mildei (N = 10) and orb-weavers of the genus Araneus (N = 6). There were 10 bites from three genera of funnel web spiders of the family Agelenidae including one hobo spider bite and one from the congeneric giant house spider which is readily confused as a hobo spider. The hobo spider bite resulted in pain, redness, twitching in the calf muscle and resolved in 12 h. Also generated from this study were possibly the first records of bites from spiders of the genera Callobius (Amaurobiidae) and Antrodiaetus (Antrodiaetidae), both with minor manifestations. Published by Elsevier Ltd.

  3. Wetting transitions: First order or second order

    International Nuclear Information System (INIS)

    Teletzke, G.F.; Scriven, L.E.; Davis, H.T.

    1982-01-01

    A generalization of Sullivan's recently proposed theory of the equilibrium contact angle, the angle at which a fluid interface meets a solid surface, is investigated. The generalized theory admits either a first-order or second-order transition from a nonzero contact angle to perfect wetting as a critical point is approached, in contrast to Sullivan's original theory, which predicts only a second-order transition. The predictions of this computationally convenient theory are in qualitative agreement with a more rigorous theory to be presented in a future publication

  4. A study of beam position diagnostics with beam-excited dipole higher order modes using a downconverter test electronics in third harmonic 3.9 GHz superconducting accelerating cavities at FLASH

    International Nuclear Information System (INIS)

    Zhang, P.; Baboi, N.; Lorbeer, B.; Wamsat, T.; Eddy, N.; Fellenz, B.; Wendt, M.; Jones, R.M.

    2012-08-01

    Beam-excited higher order modes (HOM) in accelerating cavities contain transverse beam position information. Previous studies have narrowed down three modal options for beam position diagnostics in the third harmonic 3.9 GHz cavities at FLASH. Localized modes in the beam pipes at approximately 4.1 GHz and in the fifth cavity dipole band at approximately 9 GHz were found that can provide a local measurement of the beam position. In contrast, propagating modes in the first and second dipole bands between 4.2 and 5.5 GHz can reach a better resolution. All the options were assessed with a specially designed test electronics built by Fermilab. The aim is to define a mode or spectral region suitable for the HOM electronics. Two data analysis techniques are used and compared in extracting beam position information from the dipole HOMs: direct linear regression and singular value decomposition. Current experiments suggest a resolution of approximately 50 μm in predicting local beam position using modes in the fifth dipole band, and a global resolution of approximately 20 μm over the complete module. Based on these results we decided to build HOM electronics for the second dipole band and the fifth dipole band, so that we will have both high-resolution measurements for the whole module and localized measurements for individual cavities. The prototype electronics is being built by Fermilab and is planned to be tested at FLASH by the end of 2012.
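
    The two analysis techniques named above, direct linear regression and singular value decomposition, can be illustrated on synthetic data. The sketch below is illustrative only (array shapes, noise level, and variable names are assumptions, not the FLASH data format): a linear map from transverse position to HOM spectral amplitudes is calibrated by least squares, once directly and once after an SVD-based dimension reduction.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic calibration set: each row is a vector of HOM spectral
# amplitudes recorded for a known transverse beam position (x, y).
# Shapes, noise level, and names are illustrative assumptions.
n_samples, n_channels = 200, 40
true_map = rng.normal(size=(n_channels, 2))        # channel response to (x, y)
positions = rng.uniform(-1.0, 1.0, size=(n_samples, 2))
spectra = positions @ true_map.T + 0.01 * rng.normal(size=(n_samples, n_channels))

# Direct linear regression: least-squares calibration spectrum -> position.
coef_lr, *_ = np.linalg.lstsq(spectra, positions, rcond=None)

# SVD: project spectra onto the dominant singular vectors first, which
# suppresses noise-dominated directions, then regress in the reduced space.
U, s, Vt = np.linalg.svd(spectra, full_matrices=False)
k = 2                                              # keep two dominant modes
coef_svd, *_ = np.linalg.lstsq(spectra @ Vt[:k].T, positions, rcond=None)

# Predict the position for a new (noise-free) spectrum with both calibrations.
test_pos = np.array([[0.3, -0.2]])
test_spec = test_pos @ true_map.T
pred_lr = test_spec @ coef_lr
pred_svd = (test_spec @ Vt[:k].T) @ coef_svd
print(pred_lr, pred_svd)   # both close to [[0.3, -0.2]]
```

    In this toy setting both calibrations recover the position; the SVD route becomes advantageous when the number of spectral channels is large compared to the effective signal rank.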

  5. A study of beam position diagnostics with beam-excited dipole higher order modes using a downconverter test electronics in third harmonic 3.9 GHz superconducting accelerating cavities at FLASH

    Energy Technology Data Exchange (ETDEWEB)

    Zhang, P. [Manchester Univ. (United Kingdom); Deutsches Elektronen-Synchrotron (DESY), Hamburg (Germany); Baboi, N.; Lorbeer, B.; Wamsat, T. [Deutsches Elektronen-Synchrotron (DESY), Hamburg (Germany); Eddy, N.; Fellenz, B.; Wendt, M. [Fermi National Accelerator Lab., Batavia, IL (United States); Jones, R.M. [Manchester Univ. (United Kingdom); The Cockcroft Institute, Daresbury (United Kingdom)

    2012-08-15

    Beam-excited higher order modes (HOM) in accelerating cavities contain transverse beam position information. Previous studies have narrowed down three modal options for beam position diagnostics in the third harmonic 3.9 GHz cavities at FLASH. Localized modes in the beam pipes at approximately 4.1 GHz and in the fifth cavity dipole band at approximately 9 GHz were found that can provide a local measurement of the beam position. In contrast, propagating modes in the first and second dipole bands between 4.2 and 5.5 GHz can reach a better resolution. All the options were assessed with a specially designed test electronics built by Fermilab. The aim is to define a mode or spectral region suitable for the HOM electronics. Two data analysis techniques are used and compared in extracting beam position information from the dipole HOMs: direct linear regression and singular value decomposition. Current experiments suggest a resolution of approximately 50 μm in predicting local beam position using modes in the fifth dipole band, and a global resolution of approximately 20 μm over the complete module. Based on these results we decided to build HOM electronics for the second dipole band and the fifth dipole band, so that we will have both high-resolution measurements for the whole module and localized measurements for individual cavities. The prototype electronics is being built by Fermilab and is planned to be tested at FLASH by the end of 2012.

  6. 78 FR 69871 - Agency Information Collection Activities: myE-Verify, Revision of a Currently Approved Collection

    Science.gov (United States)

    2013-11-21

    ... Collection (1) Type of Information Collection: Revision of a Currently Approved Collection. (2) Title of the... respond: E-Verify Self Check--Identity Authentication 2,900,000 responses at 0.0833 hours (5 minutes) per...

  7. Proposed procedure and analysis of results to verify the dose-area product indicator in radiology equipment

    International Nuclear Information System (INIS)

    Garcia Marcos, R.; Gallego Franco, P.; Sierra Diaz, F.; Gonzalez Ruiz, C.; Rodriguez Checa, M.; Brasa Estevez, M.; Gomez Calvar, R.

    2013-01-01

    The aim of this work is to establish a procedure to verify the dose-area product value displayed by certain radiology equipment, as an alternative to the use of external transmission chambers. (Author)

  8. Combining of both RPAS and GPR methods for documentation and verifying of archaeological objects

    Science.gov (United States)

    Pavelka, Karel; Šedina, Jaroslav

    2015-04-01

    UAVs (unmanned aerial vehicles) or RPAS (remotely piloted aircraft systems) are a modern technology for non-contact mapping and monitoring of small areas. Nowadays, for control and piloting, RPAS are equipped with sophisticated micro-instruments such as IMUs, gyroscopes, GNSS receivers, wireless image insights, wireless controls, automatic stabilization, flight planners, etc. RPAS can provide not only photographic data but also other data types, such as multispectral (with NDVI capability) and thermal data (depending on sensors and type). Bigger RPAS can be equipped with more complex and expensive instruments like laser scanners or hyperspectral scanners. The RPAS method of acquisition combines the benefits of close-range and aerial photogrammetry. As a result, a higher resolution and mapping precision can be obtained over compact and possibly less accessible areas (e.g. mountains, moors, swamps, dumps, small natural reserves, archaeological areas and dangerous or restricted areas). In our project, many small archaeological sites are monitored. It is low cost, simple, and fast. From these photos, a DSM (digital surface model) and orthophoto can be derived, which are useful for archaeologists (the DSM is often used in shaded-relief form). Based on the type of processing software, a textured virtual model can be obtained. Near-infrared photos from heights of 100-200 m give a new possibility in archaeology. We used both RPAS and GPR methods in three case projects in the Czech Republic in 2014. 1. Historical field fortification: In the neighbourhood of the town of Litoměřice, there are still visible ramparts from the Prussian-Austrian war in the 19th century. This was a forward field fortification, but it was never used in battle and later disappeared because of agricultural activities. Some parts are detectable by their terrain signatures, visible on shaded DSMs. While documenting and researching these relics, we measured profiles with GPR to verify parts, which were

  9. Verifying the hypothesis of disconnection syndrome in patients with conduction aphasia using diffusion tensor imaging

    Institute of Scientific and Technical Information of China (English)

    Yanqin Guo; Jing Xu; Yindong Yang

    2007-01-01

    BACKGROUND: Disconnection theory holds that interruption of the connection between the anterior and posterior language areas, i.e. a lesion of the arcuate fasciculus, causes conduction aphasia. OBJECTIVE: To verify the disconnection theory of the repetition disorder in patients with conduction aphasia by comparing the characteristics of diffusion tensor imaging between healthy persons and patients with conduction aphasia. DESIGN: Case-control observation. SETTING: Department of Neurology, Hongqi Hospital Affiliated to Mudanjiang Medical College. PARTICIPANTS: Five male patients with conduction aphasia due to cerebral infarction involving the arcuate fasciculus, averaging (43±2) years of age, who were hospitalized in the Department of Neurology, Hongqi Hospital Affiliated to Mudanjiang Medical College from February 2004 to February 2005, were involved in this experiment. The involved patients were all confirmed as having cerebral infarction by skull CT and MRI, and met the diagnostic criteria revised at the 1995 4th Cerebrovascular Conference. They were examined with the Aphasia Battery of Chinese (ABC) edited by Surong Gao. Repetition was disproportionately poorer than auditory comprehension, consistent with the pattern of conduction aphasia. Another 5 male healthy persons, averaging (43±1) years of age, who were physicians receiving further training in the Department of Neurology, Beijing Tiantan Hospital, were also involved in this experiment. Informed consents for the detected items were obtained from all the subjects. METHODS: All the subjects underwent handedness assessment with the handedness criteria formulated by the Department of Neurology, First Hospital Affiliated to Beijing Medical University. The arcuate fasciculus of the involved patients and healthy controls was analyzed with diffusion tensor imaging (DTI) and divided into 3 parts (anterior, middle and posterior segments) for determining the FA value (the mean value was obtained after three measurements), and a comparison of FA value was

  10. Unmaking the bomb: Verifying limits on the stockpiles of nuclear weapons

    Science.gov (United States)

    Glaser, Alexander

    2017-11-01

    Verifying limits on the stockpiles of nuclear weapons may require the ability for international inspectors to account for individual warheads, even when non-deployed, and to confirm the authenticity of nuclear warheads prior to dismantlement. These are fundamentally new challenges for nuclear verification, and they have been known for some time; unfortunately, due to a lack of sense of urgency, research in this area has not made substantial progress over the past 20 years. This chapter explores the central outstanding issues and offers a number of possible paths forward. In the case of confirming numerical limits, these include innovative tagging techniques and approaches solely based on declarations using modern cryptographic escrow schemes; with regard to warhead confirmation, there has recently been increasing interest in developing fundamentally new measurement approaches where, in one form or another, sensitive information is not acquired in the first place. Overall, new international R&D efforts could more usefully focus on non-intrusive technologies and approaches, which may show more promise for early demonstration and adoption. In the meantime, while warhead dismantlements remain unverified, nuclear weapon states ought to begin to document warhead assembly, refurbishment, and dismantlement activities and movements of warheads and warhead components through the weapons complex in ways that international inspectors will find credible at a later time. Again, such a process could be enabled by modern cryptographic techniques such as blockchaining. Finally, and perhaps most importantly, it is important to recognize that the main reason for the complexity of technologies and approaches needed for nuclear disarmament verification is the requirement to protect information that nuclear weapon states consider sensitive.
Ultimately, if information security concerns cannot be resolved to the satisfaction of all stakeholders, an alternative would be to "reveal the

  11. Applying the Water Vapor Radiometer to Verify the Precipitable Water Vapor Measured by GPS

    Directory of Open Access Journals (Sweden)

    Ta-Kang Yeh

    2014-01-01

    Taiwan is located at the land-sea interface in a subtropical region. Because the climate is warm and moist year round, there is a large and highly variable amount of water vapor in the atmosphere. In this study, we calculated the Zenith Wet Delay (ZWD) of the troposphere using the ground-based Global Positioning System (GPS). The ZWD measured by two Water Vapor Radiometers (WVRs) was then used to verify the ZWD that had been calculated using GPS. We also analyzed the correlation between the ZWD and the precipitation data of these two types of stations. Moreover, we used the observational data from 14 GPS and rainfall stations to evaluate three cases. The offset between the GPS-ZWD and the WVR-ZWD ranged from 1.31 to 2.57 cm. The correlation coefficient ranged from 0.89 to 0.93. The results calculated from GPS and those measured using the WVR were very similar. Moreover, when there was no rain, light rain, moderate rain, or heavy rain, the flatland station ZWD was 0.31, 0.36, 0.38, or 0.40 m, respectively. The mountain station ZWD exhibited the same trend. Therefore, these results demonstrate that the potential and strength of precipitation in a region can be estimated according to its ZWD values. Now that the precision of GPS-ZWD has been confirmed, this method can eventually be expanded to the more than 400 GPS stations in Taiwan and its surrounding islands. The near real-time ZWD data with improved spatial and temporal resolution can be provided to the city and countryside weather-forecasting system that is currently under development. Such an exchange would fundamentally improve the resources used to generate weather forecasts.
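
    The verification statistics quoted above (a mean offset between the GPS-derived and WVR-derived ZWD, and a Pearson correlation coefficient) are straightforward to compute. The sketch below uses made-up paired samples, not the Taiwan station data:

```python
import numpy as np

# Hypothetical paired ZWD samples (metres) from GPS and a collocated WVR;
# values are illustrative, not the measurements reported in the study.
gps_zwd = np.array([0.31, 0.36, 0.38, 0.40, 0.33, 0.29])
wvr_zwd = np.array([0.29, 0.34, 0.37, 0.38, 0.31, 0.28])

offset = np.mean(gps_zwd - wvr_zwd)          # mean bias between techniques
r = np.corrcoef(gps_zwd, wvr_zwd)[0, 1]      # Pearson correlation coefficient

print(f"offset = {offset * 100:.2f} cm, r = {r:.3f}")
```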

  12. Comparison of coaxial higher order mode couplers for the CERN Superconducting Proton Linac study

    CERN Document Server

    AUTHOR|(CDS)2085329; Gerigk, Frank; Van Rienen, Ursula

    2017-01-01

    Higher order modes (HOMs) may affect beam stability and refrigeration requirements of superconducting proton linacs such as the Superconducting Proton Linac, which is studied at CERN. Under certain conditions beam-induced HOMs can accumulate sufficient energy to destabilize the beam or quench the superconducting cavities. In order to limit these effects, CERN considers the use of coaxial HOM couplers on the cutoff tubes of the 5-cell superconducting cavities. These couplers consist of resonant antennas shaped as loops or probes, which are designed to couple to potentially dangerous modes while sufficiently rejecting the fundamental mode. In this paper, the design process is presented and a comparison is made between various designs for the high-beta SPL cavities, which operate at 704.4 MHz. The rf and thermal behavior as well as mechanical aspects are discussed. In order to verify the designs, a rapid prototype for the favored coupler was fabricated and characterized on a low-power test-stand.

  13. FTREX Testing Report (Fault Tree Reliability Evaluation eXpert) Version 1.5

    International Nuclear Information System (INIS)

    Jung, Woo Sik

    2009-07-01

    In order to verify FTREX functions and to confirm the correctness of FTREX 1.5, various tests were performed: 1. fault trees with negates, 2. fault trees with house events, 3. fault trees with multiple tops, 4. fault trees with logical loops, 5. fault trees with initiators, house events, negates, logical loops, and flag events. By using the automated cutset propagation test, the FTREX 1.5 functions are verified. FTREX version 1.3 and later versions have the capability to perform a bottom-up cutset-propagation test in order to check cutset status. FTREX 1.5 always generates the proper minimal cut sets. All the output cutsets of the tested problems are MCSs (Minimal Cut Sets); there are no non-minimal cutsets and no improper cutsets. The improper cutsets are those that have no effect on the top, have multiple initiators, or have disjoint events A * -A
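
    A minimality check of the kind described above can be sketched in a few lines: a cut set is non-minimal if it is a proper superset of another cut set, and a cut set containing disjoint events (an event together with its negation, written 'A' and '-A' here) is improper. This is an illustrative sketch, not FTREX's algorithm:

```python
def minimize_cutsets(cutsets):
    """Illustrative sketch (not FTREX's algorithm): drop improper cut sets
    that contain disjoint events ('A' together with '-A'), then drop any
    cut set that is a proper superset of another (non-minimal)."""
    sets = [frozenset(c) for c in cutsets]
    # Discard logically impossible cut sets such as {D, -D}.
    proper = [c for c in sets
              if not any('-' + e in c for e in c if not e.startswith('-'))]
    # Keep only cut sets with no proper subset among the remaining ones.
    return {c for c in proper
            if not any(other < c for other in proper)}

cuts = [{'A', 'B'}, {'A'}, {'B', 'C'}, {'D', '-D'}]
print(minimize_cutsets(cuts))   # {A} and {B, C} remain
```

    Here {A, B} is discarded because {A} already cuts the tree, and {D, -D} is discarded as a disjoint (impossible) combination.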

  14. Field tests on partial embedment effects (embedment effect tests on soil-structure interaction)

    International Nuclear Information System (INIS)

    Kurimoto, O.; Tsunoda, T.; Inoue, T.; Izumi, M.; Kusakabe, K.; Akino, K.

    1993-01-01

    A series of Model Tests of Embedment Effect on Reactor Buildings has been carried out by the Nuclear Power Engineering Corporation (NUPEC), under the sponsorship of the Ministry of International Trade and Industry (MITI) of Japan. Nuclear reactor buildings in Japan are partially embedded due to construction conditions or building arrangement. It is necessary to verify the partial embedment effects by experiments and analytical studies in order to incorporate the effects in the seismic design. Forced vibration tests, therefore, were performed using a model with several types of embedment. Correlated simulation analyses were also performed and the characteristics of partial embedment effects on soil-structure interaction were evaluated. (author)

  15. Birth-Order Complementarity and Marital Adjustment.

    Science.gov (United States)

    Vos, Cornelia J. Vanderkooy; Hayden, Delbert J.

    1985-01-01

    Tested the influence of birth-order complementarity on marital adjustment among 327 married women using the Spanier Dyadic Adjustment Scale (1976). Birth-order complementarity was found to be unassociated with marital adjustment. (Author/BL)

  16. Endurance test of DUPIC irradiation test rig-003

    Energy Technology Data Exchange (ETDEWEB)

    Moon, J.S; Yang, M.S.; Lee, C.Y.; Ryu, J.S.; Jeon, H.G

    2001-04-01

    This report presents the pressure drop, vibration and endurance test results for DUPIC Irradiation Test Rig-003, which was designed and fabricated by KAERI. The pressure drop and vibration test results verified that DUPIC Irradiation Test Rig-003 satisfies the limit conditions of HANARO. No remarkable wear was observed in DUPIC Irradiation Test Rig-003 during the 40-day endurance test.

  17. Test report - caustic addition system operability test procedure

    International Nuclear Information System (INIS)

    Parazin, R.E.

    1995-01-01

    This Operability Test Report documents the test results of test procedure WHC-SD-WM-OTP-167, ''Caustic Addition System Operability Test Procedure''. The objective of the test was to verify the operability of the 241-AN-107 Caustic Addition System. The objective of the test was met.

  18. Experimental demonstration of fractional-order oscillators of orders 2.6 and 2.7

    KAUST Repository

    Elwakil, A.S.; Agambayev, Agamyrat; Allagui, A.; Salama, Khaled N.

    2017-01-01

    The purpose of this work is to provide an experimental demonstration for the development of sinusoidal oscillations in a fractional-order Hartley-like oscillator. Solid-state fractional-order electric double-layer capacitors were first fabricated using a graphene-percolated P(VDF-TrFE-CFE) composite structure, and then characterized by electrochemical impedance spectroscopy. The devices exhibit fractional orders of 0.6 and 0.74, respectively (using the model Zc = Rs + 1/((jω)^α·Cα)), with corresponding pseudocapacitances of approximately 93 nF·s^−0.4 and 1.5 nF·s^−0.26 over the frequency range 200 kHz–6 MHz (Rs < 15 Ω). We then verified, using these fractional-order devices integrated in a Hartley-like circuit, that the fractional-order oscillatory behaviors are of orders 2.6 and 2.74.
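
    For illustration, the constant-phase-element model quoted above, Zc = Rs + 1/((jω)^α·Cα), can be evaluated numerically; a fractional order of α = 0.6 gives the capacitive branch a constant phase of -α·90° = -54°. The component values below are placeholders, not the measured device parameters:

```python
import numpy as np

def cpe_impedance(f, Rs, C_alpha, alpha):
    """Series resistance plus constant-phase element:
    Z = Rs + 1 / ((j*omega)**alpha * C_alpha)."""
    omega = 2 * np.pi * f
    return Rs + 1.0 / ((1j * omega) ** alpha * C_alpha)

# Placeholder values (alpha = 0.6 as in the abstract; Rs and C_alpha are
# illustrative, not the characterized device parameters).
f = 1.0e6                                   # 1 MHz, inside the quoted range
Z = cpe_impedance(f, Rs=10.0, C_alpha=93e-9, alpha=0.6)

# The CPE branch alone has a constant phase of -alpha * 90 degrees.
cpe_only = cpe_impedance(f, Rs=0.0, C_alpha=93e-9, alpha=0.6)
phase_deg = np.degrees(np.angle(cpe_only))
print(round(phase_deg, 1))                  # -54.0
```

    The frequency-independent phase is what distinguishes a fractional-order capacitor (0 < α < 1) from an ideal one (α = 1, phase -90°).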

  19. Experimental demonstration of fractional-order oscillators of orders 2.6 and 2.7

    KAUST Repository

    Elwakil, A.S.

    2017-02-07

    The purpose of this work is to provide an experimental demonstration for the development of sinusoidal oscillations in a fractional-order Hartley-like oscillator. Solid-state fractional-order electric double-layer capacitors were first fabricated using a graphene-percolated P(VDF-TrFE-CFE) composite structure, and then characterized by electrochemical impedance spectroscopy. The devices exhibit fractional orders of 0.6 and 0.74, respectively (using the model Zc = Rs + 1/((jω)^α·Cα)), with corresponding pseudocapacitances of approximately 93 nF·s^−0.4 and 1.5 nF·s^−0.26 over the frequency range 200 kHz–6 MHz (Rs < 15 Ω). We then verified, using these fractional-order devices integrated in a Hartley-like circuit, that the fractional-order oscillatory behaviors are of orders 2.6 and 2.74.

  20. Passive BWR integral LOCA testing at the Karlstein test facility INKA

    Energy Technology Data Exchange (ETDEWEB)

    Drescher, Robert [AREVA GmbH, Erlangen (Germany); Wagner, Thomas [AREVA GmbH, Karlstein am Main (Germany); Leyer, Stephan [TH University of Applied Sciences, Deggendorf (Germany)

    2014-05-15

    KERENA is an innovative AREVA GmbH boiling water reactor (BWR) with passive safety systems (Generation III+). In order to verify the functionality of the reactor design, an experimental validation program was executed. For this purpose the INKA (Integral Teststand Karlstein) test facility was designed and erected. It is a mockup of the BWR containment with integrated pressure suppression system. While the scaling of the passive components and the levels match the original values, the volume scaling of the containment compartments is approximately 1:24. The storage capacity of the test facility pressure vessel corresponds to approximately 1/6 of the KERENA RPV and is supplied by a Benson boiler with a thermal power of 22 MW. In March 2013 the first integral test - Main Steam Line Break (MSLB) - was executed. The test measured the combined response of the passive safety systems to the postulated initiating event. The main goal was to demonstrate the ability of the passive systems to ensure core coverage and decay heat removal, and to maintain the containment within defined limits. The results of the test showed that the passive safety systems are capable of bringing the plant to stable conditions, meeting all required safety targets with sufficient margins. The test thereby verified the function of those components and the interplay between them. The test proved that INKA is a unique test facility, capable of performing integral tests of passive safety concepts under plant-like conditions. (orig.)

  1. Space Suit Joint Torque Testing

    Science.gov (United States)

    Valish, Dana J.

    2011-01-01

    In 2009 and early 2010, a test was performed to quantify the torque required to manipulate joints in several existing operational and prototype space suits, in an effort to develop joint torque requirements appropriate for a new Constellation Program space suit system. The same test method was levied on the Constellation space suit contractors to verify that their suit designs meet the requirements. However, because the original test was set up and conducted by a single test operator, there was some question as to whether this method was repeatable enough to be considered a standard verification method for Constellation or other future space suits. In order to validate the method itself, a representative subset of the previous test was repeated, using the same information that would be available to space suit contractors, but set up and conducted by someone not familiar with the previous test. The resultant data were compared using graphical and statistical analysis, and a variance in torque values for some of the tested joints was apparent. Potential variables that could have affected the data were identified, and re-testing was conducted in an attempt to eliminate these variables. The results of the retest will be used to determine if further testing and modification are necessary before the method can be validated.

  2. Methods, software and datasets to verify DVH calculations against analytical values: Twenty years late(r)

    Energy Technology Data Exchange (ETDEWEB)

    Nelms, Benjamin [Canis Lupus LLC, Merrimac, Wisconsin 53561 (United States); Stambaugh, Cassandra [Department of Physics, University of South Florida, Tampa, Florida 33612 (United States); Hunt, Dylan; Tonner, Brian; Zhang, Geoffrey; Feygelman, Vladimir, E-mail: vladimir.feygelman@moffitt.org [Department of Radiation Oncology, Moffitt Cancer Center, Tampa, Florida 33612 (United States)

    2015-08-15

    Purpose: The authors designed data, methods, and metrics that can serve as a standard, independent of any software package, to evaluate dose-volume histogram (DVH) calculation accuracy and detect limitations. The authors use simple geometrical objects at different orientations combined with dose grids of varying spatial resolution with linear 1D dose gradients; when combined, ground truth DVH curves can be calculated analytically in closed form to serve as the absolute standards. Methods: DICOM RT structure sets containing a small sphere, cylinder, and cone were created programmatically with axial plane spacing varying from 0.2 to 3 mm. Cylinders and cones were modeled in two different orientations with respect to the IEC 1217 Y axis. The contours were designed to stringently but methodically test voxelation methods required for DVH. Synthetic RT dose files were generated with 1D linear dose gradient and with grid resolution varying from 0.4 to 3 mm. Two commercial DVH algorithms—PINNACLE (Philips Radiation Oncology Systems) and PlanIQ (Sun Nuclear Corp.)—were tested against analytical values using custom, noncommercial analysis software. In Test 1, axial contour spacing was constant at 0.2 mm while dose grid resolution varied. In Tests 2 and 3, the dose grid resolution was matched to varying subsampled axial contours with spacing of 1, 2, and 3 mm, and difference analysis and metrics were employed: (1) histograms of the accuracy of various DVH parameters (total volume, D{sub max}, D{sub min}, and doses to % volume: D99, D95, D5, D1, D0.03 cm{sup 3}) and (2) volume errors extracted along the DVH curves were generated and summarized in tabular and graphical forms. Results: In Test 1, PINNACLE produced 52 deviations (15%) while PlanIQ produced 5 (1.5%). In Test 2, PINNACLE and PlanIQ differed from analytical by >3% in 93 (36%) and 18 (7%) times, respectively. Excluding D{sub min} and D{sub max} as least clinically relevant would result in 32 (15%) vs 5 (2
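
    The ground-truth construction described above admits closed-form DVHs. As an illustrative sketch (not the authors' software), the cumulative DVH of a sphere in a 1D linear dose gradient D(z) = a + b·z reduces to a spherical-cap volume:

```python
import numpy as np

def sphere_dvh(d, R, z0, a, b):
    """Fractional volume of a sphere (radius R, centre z0) receiving dose
    >= d in a 1D linear dose field D(z) = a + b*z with b > 0.  The region
    D >= d is the spherical cap above the plane z = (d - a) / b."""
    t = (d - a) / b - z0          # plane position relative to sphere centre
    if t <= -R:
        return 1.0                # entire sphere is above the threshold
    if t >= R:
        return 0.0
    h = R - t                     # height of the spherical cap
    cap = np.pi * h ** 2 * (3 * R - h) / 3.0
    return cap / (4.0 / 3.0 * np.pi * R ** 3)

# Sanity checks: unit sphere at the origin in the field D(z) = z.
print(sphere_dvh(-1.0, 1.0, 0.0, 0.0, 1.0))   # 1.0 (whole sphere)
print(sphere_dvh(0.0, 1.0, 0.0, 0.0, 1.0))    # ~0.5 (half the sphere)
print(sphere_dvh(1.0, 1.0, 0.0, 0.0, 1.0))    # 0.0
```

    Evaluating such a closed-form curve on a fine dose grid gives the absolute standard against which a voxel-based DVH algorithm can be scored.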

  3. Methods, software and datasets to verify DVH calculations against analytical values: Twenty years late(r).

    Science.gov (United States)

    Nelms, Benjamin; Stambaugh, Cassandra; Hunt, Dylan; Tonner, Brian; Zhang, Geoffrey; Feygelman, Vladimir

    2015-08-01

    The authors designed data, methods, and metrics that can serve as a standard, independent of any software package, to evaluate dose-volume histogram (DVH) calculation accuracy and detect limitations. The authors use simple geometrical objects at different orientations combined with dose grids of varying spatial resolution with linear 1D dose gradients; when combined, ground truth DVH curves can be calculated analytically in closed form to serve as the absolute standards. DICOM RT structure sets containing a small sphere, cylinder, and cone were created programmatically with axial plane spacing varying from 0.2 to 3 mm. Cylinders and cones were modeled in two different orientations with respect to the IEC 1217 Y axis. The contours were designed to stringently but methodically test voxelation methods required for DVH. Synthetic RT dose files were generated with 1D linear dose gradient and with grid resolution varying from 0.4 to 3 mm. Two commercial DVH algorithms, PINNACLE (Philips Radiation Oncology Systems) and PlanIQ (Sun Nuclear Corp.), were tested against analytical values using custom, noncommercial analysis software. In Test 1, axial contour spacing was constant at 0.2 mm while dose grid resolution varied. In Tests 2 and 3, the dose grid resolution was matched to varying subsampled axial contours with spacing of 1, 2, and 3 mm, and difference analysis and metrics were employed: (1) histograms of the accuracy of various DVH parameters (total volume, Dmax, Dmin, and doses to % volume: D99, D95, D5, D1, D0.03 cm^3) and (2) volume errors extracted along the DVH curves were generated and summarized in tabular and graphical forms. In Test 1, PINNACLE produced 52 deviations (15%) while PlanIQ produced 5 (1.5%). In Test 2, PINNACLE and PlanIQ differed from analytical by >3% in 93 (36%) and 18 (7%) times, respectively. Excluding Dmin and Dmax as least clinically relevant would result in 32 (15%) vs 5 (2%) scored deviations for PINNACLE vs PlanIQ in Test 1, while Test 2

  4. Evaluation of STAT medication ordering process in a community hospital.

    Science.gov (United States)

    Abdelaziz, Hani; Richardson, Sandra; Walsh, Kim; Nodzon, Jessica; Schwartz, Barbara

    2016-01-01

    In most health care facilities, problems related to delays in STAT medication order processing time are of common concern. The purpose of this study was to evaluate processing time for STAT orders at Kimball Medical Center. All STAT orders were reviewed to determine processing time; order processing time was also stratified by physician order entry (physician entered (PE) orders vs. non-physician entered (NPE) orders). Collected data included medication ordered, indication, time ordered, time verified by pharmacist, time sent from pharmacy, and time charted as given to the patient. A total of 502 STAT orders were reviewed and 389 orders were included for analysis. Overall, median time was 29 minutes (IQR 16-63). Order processing time may be improved by increasing the availability of medications in ADM and by pharmacy involvement in the verification process.

  5. Tests of the new STIC scintillator ring prototype, the photomultipliers and optic fibers cables of the 40 deg C counters

    International Nuclear Information System (INIS)

    Silva, Tatiana da

    1997-01-01

    This paper reports the tests performed on the semicircular prototype of the new scintillator ring, with readout by WLS optic fibers. The prototype is intended to verify the light collection and to investigate a method for gluing fibers onto a circular surface without the formation of air bubbles, which may impair light transmission. The optic fiber cables and the photomultipliers used in the 40 deg C counters were also tested in order to verify the electromagnetic energy which may leak through failures in the barrel, aiming at hermeticity enhancement, and to detect any damaged cables

  6. 40 CFR 63.2994 - How do I verify the performance of monitoring equipment?

    Science.gov (United States)

    2010-07-01

    ... equipment? (a) Before conducting the performance test, you must take the steps listed in paragraphs (a)(1) and (2) of this section: (1) Install and calibrate all process equipment, control devices, and... evaluation results. (b) If you use a thermal oxidizer, the temperature monitoring device must meet the...

  7. Determination of Chlorine Dioxide and Chlorite in Water Supply Systems by Verified Methods

    Directory of Open Access Journals (Sweden)

    Tkáčová Jana

    2014-07-01

    This work is dedicated to the development and optimization of appropriate analytical methods for the determination of chlorine dioxide and chlorite in drinking water, in order to obtain accurate and correct results in the quality control of drinking water. The work deals with the development and optimization of a method for the determination of chlorine dioxide using chlorophenol red. Furthermore, a new spectrophotometric method for the determination of chlorite via bromometry using methyl orange was developed, optimized and validated. An electrochemical method for the determination of chlorite by flow coulometry was also developed, optimized and validated.

  8. The prevalence of suspected and challenge-verified penicillin allergy in a university hospital population

    DEFF Research Database (Denmark)

    Borch, Jacob Eli; Andersen, Klaus Ejner; Bindslev-Jensen, Carsten

    2006-01-01

    patterns and public economy as a consequence. We performed a cross-sectional case-control study with two visits to all clinical departments of a large university hospital in order to find in-patients with medical files labelled "penicillin allergy" or who reported penicillin allergy upon admission. Patient....... In a cohort of 3642 patients, 96 fulfilled the inclusion criteria giving a point-prevalence of alleged penicillin allergy of 5% in a hospital in-patient population. Mean time elapsed since the alleged first reaction to penicillin was 20 years. The skin was the most frequently affected organ (82.2%), maculo...

  9. First-order inflation

    International Nuclear Information System (INIS)

    Kolb, E.W.

    1991-01-01

In the original proposal, inflation occurred in the process of a strongly first-order phase transition. This model was soon demonstrated to be fatally flawed. Subsequent models for inflation involved phase transitions that were second-order, or perhaps weakly first-order; some even involved no phase transition at all. Recently the possibility of inflation during a strongly first-order phase transition has been revived. In this talk I will discuss some models for first-order inflation, and emphasize unique signatures that result if inflation is realized in a first-order transition. Before discussing first-order inflation, I will briefly review some of the history of inflation to demonstrate how first-order inflation differs from other models. (orig.)

  10. First-order inflation

    International Nuclear Information System (INIS)

    Kolb, E.W.; Chicago Univ., IL

    1990-09-01

In the original proposal, inflation occurred in the process of a strongly first-order phase transition. This model was soon demonstrated to be fatally flawed. Subsequent models for inflation involved phase transitions that were second-order, or perhaps weakly first-order; some even involved no phase transition at all. Recently the possibility of inflation during a strongly first-order phase transition has been revived. In this talk I will discuss some models for first-order inflation, and emphasize unique signatures that result if inflation is realized in a first-order transition. Before discussing first-order inflation, I will briefly review some of the history of inflation to demonstrate how first-order inflation differs from other models. 58 refs., 3 figs

  11. Development of a system to verify the programs used for planning of photon beams teletherapy

    International Nuclear Information System (INIS)

    Ocariz Ayala, Victor Daniel

    2004-12-01

The main objective of radiotherapy is to deliver the radiation dose prescribed by the physician to the tumor as accurately as possible, while sparing, as much as possible, the healthy tissues located in the neighborhood of the tumor. In order to reach these objectives, it is necessary to carry out treatment planning, and the more sophisticated the technologies and therapeutic procedures used, the more sophisticated the planning must be. The most sophisticated planning systems use computer programs and are able to determine dose distributions in three dimensions. However, since they work using mathematical models, they may fail, and it is necessary to evaluate their performance in order for them to be considered reliable. Therefore, the availability of a system capable of evaluating the performance of planning systems employed in oncological teletherapy using ionizing radiation becomes important. In this work, a data file to be used in radiotherapy planning system quality control (algorithm accuracy and dose distribution) was developed; it can be sent by mail to the radiotherapy services that work with photon beams. (author)

  12. Issues to be verified by IFMIF prototype accelerator for engineering validation

    International Nuclear Information System (INIS)

    Sugimoto, M.; Imai, T.; Okumura, Y.; Nakayama, K.; Suzuki, S.; Saigusa, M.

    2002-01-01

The validation of the accelerator technology providing the 250 mA/40 MeV continuous-wave (CW) deuteron beam with the required quality is a key issue in realizing the international fusion materials irradiation facility (IFMIF). As the difficulty of a high-current accelerator generally comes from the low-energy section, due to space-charge effects, a prototype test of that part is planned in the next development phase. The optimal choice for the prototype consists of a full-scale injector, a fully modelled radiofrequency quadrupole, and a short drift-tube linear accelerator associated with beam diagnostics and a beam dump. Through the prototype tests, stable control of the CW accelerator under various operational conditions will be addressed, and the technical risks of IFMIF accelerator construction can be significantly reduced

  13. The Efficiency of a Hybrid Flapping Wing Structure—A Theoretical Model Experimentally Verified

    Directory of Open Access Journals (Sweden)

    Yuval Keren

    2016-07-01

Full Text Available To propel a lightweight structure, a hybrid wing structure was designed; the wing’s geometry resembled a rotor blade, and its flexibility resembled an insect’s flapping wing. The wing was designed to be flexible in twist and spanwise rigid, thus maintaining the aeroelastic advantages of a flexible wing. The use of a relatively “thick” airfoil enabled a higher strength-to-weight ratio to be achieved by increasing the wing’s moment of inertia. The optimal design was based on a simplified quasi-steady inviscid mathematical model that approximately resembles the aerodynamic and inertial behavior of the flapping wing. A flapping mechanism that imitates the insects’ flapping pattern was designed and manufactured, and a set of experiments for various parameters was performed. The simplified analytical model was updated according to the test results, compensating for the viscous increase of drag and decrease of lift that were neglected in the simplified calculations. The propelling efficiency of the hovering wing at various design parameters was calculated using the updated model. It was further validated by testing a smaller wing flapping at a higher frequency. Good and consistent test results were obtained in line with the updated model, yielding a simple yet accurate tool for flapping-wing design.

  14. Linear Matrix Inequality Based Fuzzy Synchronization for Fractional Order Chaos

    Directory of Open Access Journals (Sweden)

    Bin Wang

    2015-01-01

Full Text Available This paper investigates fuzzy synchronization for fractional order chaos via linear matrix inequalities. Based on the generalized Takagi-Sugeno fuzzy model, an efficient stability condition for fractional order chaos synchronization or anti-synchronization is given. The fractional order stability condition is transformed into a set of linear matrix inequalities, and the rigorous proof details are presented. Furthermore, through fractional order linear time-invariant (LTI) interval theory, the approach is developed for fractional order chaos synchronization even when the system has uncertain parameters. Three typical examples, including synchronization between an integer order three-dimensional (3D) chaotic system and a fractional order 3D chaotic system, anti-synchronization of two fractional order hyperchaotic systems, and synchronization between an integer order 3D chaotic system and a fractional order 4D chaotic system, are employed to verify the theoretical results.
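The LMI approach described in the abstract rests on the standard stability criterion for commensurate fractional order linear systems (Matignon's theorem). As background, not taken from the paper itself, the condition can be sketched as:

```latex
% Matignon's stability criterion (background sketch, 0 < \alpha < 1):
% the fractional order LTI system  D^{\alpha} x(t) = A\, x(t)
% is asymptotically stable if and only if
\left|\arg\big(\lambda_i(A)\big)\right| > \frac{\alpha\pi}{2}
\qquad \text{for every eigenvalue } \lambda_i \text{ of } A .
```

An LMI-based synchronization scheme typically recasts this eigenvalue-argument condition as a feasibility problem over the vertex matrices of the fuzzy model.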

  15. Automating Test Activities: Test Cases Creation, Test Execution, and Test Reporting with Multiple Test Automation Tools

    OpenAIRE

    Loke Mun Sei

    2015-01-01

Software testing has become a mandatory process in assuring software product quality. Hence, test management is needed in order to manage the test activities conducted in the software test life cycle. This paper discusses the challenges faced in the software test life cycle, and how the test processes and test activities, mainly test case creation, test execution, and test reporting, are managed and automated using several test automation tools, i.e. Jira, ...

  16. An Approach to Verifying Completeness and Consistency in a Rule-Based Expert System.

    Science.gov (United States)

    1982-08-01

... people with the same knowledge base by observing them. While thorough testing is an essential part of verifying the consistency and completeness of a ... physicians at Stanford's Oncology Day Care Center on the management of patients who are on experimental treatment protocols. These protocols serve to ... for oncology protocol management. Proceedings of 7th IJCAI, pp. 876-881, Vancouver, B.C., August 1981. van Melle, W. A Domain-Independent System

  17. A novel approach to verify the influence of atmospheric parameters in substations concerning lightning

    Energy Technology Data Exchange (ETDEWEB)

    Souza, Andre Nunes de; Silva, Ivan Nunes da; Ulson, Jose Alfredo C.; Zago, Maria Goretti [UNESP, Bauru, SP (Brazil). Dept. de Engenharia Eletrica]. E-mail: andrejau@bauru.unesp.br

    2001-07-01

This paper demonstrates that artificial neural networks can be used effectively to estimate parameters related to the study of atmospheric conditions in high voltage substation design. Specifically, the neural networks are used to compute the variation of electrical field intensity and critical disruptive voltage in substations, taking into account several atmospheric factors such as pressure, temperature, and humidity. Examples of simulated tests are presented to validate the proposed approach. The results obtained from experimental evidence and numerical simulations allowed verification of the influence of atmospheric conditions on the design of substations concerning lightning. (author)

  18. Electromagnetic cloaking in higher order spherical cloaks

    Science.gov (United States)

    Sidhwa, H. H.; Aiyar, R. P. R. C.; Kulkarni, S. V.

    2017-06-01

The inception of transformation optics has led to the realisation of invisibility devices for various applications, one of which is spherical cloaking. In this paper, a formulation for a higher-order spherical cloak is proposed that reduces its physical thickness significantly by introducing a nonlinear relation between the original and transformed coordinate systems, and it has been verified using the ray tracing approach. Analysis has been carried out to observe the anomalies in the variation of refractive index for higher-order cloaks, indicating the presence of poles in the relevant equations. Furthermore, a higher-order spherical cloak with predefined values of the material characteristics on its inner and outer surfaces has been designed for practical application.
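The abstract does not state the transformation itself; one common way to obtain such a nonlinear relation (a purely illustrative sketch, with inner radius R1, outer radius R2, and exponent n as assumptions, not the paper's formulation) is to replace the linear cloak mapping with a power-law one:

```latex
% Linear (first-order) spherical cloak map, r \in [0, R_2] \to r' \in [R_1, R_2]:
r' = R_1 + \frac{R_2 - R_1}{R_2}\, r
% A nonlinear (higher-order) variant raises the radial coordinate to a power n > 1:
r' = R_1 + (R_2 - R_1)\left(\frac{r}{R_2}\right)^{n}
```

For n = 1 the second map reduces to the first; both map the origin to the inner surface R1 and leave the outer surface R2 fixed.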

  19. Order Picking Optimization in Carousels Storage System

    Directory of Open Access Journals (Sweden)

    Xiong-zhi Wang

    2013-01-01

Full Text Available This paper addresses the order picking problem in a material handling system consisting of multiple carousels and one picker. Carousels are rotatable closed-loop storage systems for small items, where items are stored in bins along the loop. An order at the carousels consists of n different items stored there. The objective is to find an optimal picking sequence that minimizes the total order picking time. After proving the problem to be strongly NP-hard and deriving two structural characteristics, we develop a dynamic programming algorithm (DPA) for a special case (a two-carousel storage system) and an improved nearest-items heuristic (INIH) for the general problem. Experimental results verify that solutions are obtained quickly and reliably, and show the algorithms' good performance.
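The abstract names an improved nearest-items heuristic (INIH) without giving details; the following is a minimal single-carousel sketch of the underlying greedy nearest-item idea only. The bin numbering, unit rotation cost, and function name are illustrative assumptions, not the paper's algorithm:

```python
def nearest_item_pick_time(num_bins, item_bins, start=0):
    """Greedy nearest-item heuristic for a single carousel (toy model).

    The carousel is a ring of `num_bins` bins; rotating by one bin in
    either direction costs one time unit.  At each step the carousel
    rotates to the requested bin closest (clockwise or counter-clockwise)
    to the current position.  Returns (total_rotation_time, pick_order).
    """
    pos = start
    remaining = set(item_bins)
    total = 0
    order = []
    while remaining:
        def ring_dist(b):
            # circular distance from the current position to bin b
            d = abs(b - pos) % num_bins
            return min(d, num_bins - d)
        nxt = min(remaining, key=ring_dist)  # nearest requested bin
        total += ring_dist(nxt)
        pos = nxt
        remaining.remove(nxt)
        order.append(nxt)
    return total, order
```

For example, on a 12-bin carousel with items in bins 3, 11, and 6, the heuristic starting from bin 0 first rotates one step to bin 11, then four steps to bin 3, then three steps to bin 6, for a total cost of 8.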

  20. Verifying elementary ITER maintenance actions with the MS2 benchmark product

    International Nuclear Information System (INIS)

    Heemskerk, C.J.M.; Elzendoorn, B.S.Q.; Magielsen, A.J.; Schropp, G.Y.R.

    2011-01-01

A new facility has been taken into operation to investigate the influence of visual and haptic feedback on the performance of remotely executed ITER RH maintenance tasks. A reference set of representative ITER remote handling maintenance tasks was included in the master-slave manipulator system (MS2) benchmark product. The benchmark product was used in task performance tests in a representative two-handed dexterous manipulation test bed at NRG. In the setup, the quality of visual feedback was varied by exchanging direct-view with indirect-view setups in which visual feedback is provided via video cameras. Interaction forces were measured via an integrated force sensor. The impact of feedback quality on the performance of maintenance tasks, at the level of handling individual parts, was measured and analysed. Remote execution of the maintenance actions took roughly 3-5 times longer than hands-on execution. Visual feedback was identified as the dominant factor, including aspects like (lack of) operator control over camera placement, pan, tilt and zoom, lack of 3D perception, image quality, and latency. Haptic feedback was found to be important, but only in specific contact-transition and constrained-motion tasks.