WorldWideScience

Sample records for proved largely ineffective

  1. Ineffective Leadership.

    Science.gov (United States)

    Itri, Jason N; Lawson, Leslie M

    2016-07-01

    Radiology leaders can have a profound impact on the success and working environment of a radiology department, promoting core values and inspiring staff members to achieve the organization's mission. On the other hand, ineffective leaders can have a devastating effect on a radiology department by impairing communication among members, undermining staff commitment to the organization's success, and stifling the development of other staff members and leaders in the organization. One of the most important investments a radiology department can make is in identifying, cultivating, and promoting new leaders. The authors describe 13 habits and characteristics of new leaders that lead these individuals to address situations in both ineffective and counterproductive ways, impeding the performance of a radiology department and its capacity to play a meaningful role in shaping the future of radiology. New leaders must continually learn and improve their leadership skills if they are to avoid the destructive habits of ineffective leaders and successfully overcome the challenges facing radiology today. Senior leaders may also benefit from understanding the pitfalls that make leaders ineffective and should strive to continually improve their leadership skills given the critical role of leadership in the success of radiology departments. Copyright © 2016 American College of Radiology. Published by Elsevier Inc. All rights reserved.

  2. The (in)effectiveness of Global Land Policies on Large-Scale Land Acquisition

    NARCIS (Netherlands)

    Verhoog, S.M.

    2014-01-01

    Due to current crises, large-scale land acquisition (LSLA) is becoming a topic of growing concern. Public data from the ‘Land Matrix Global Observatory’ project (Land Matrix 2014a) demonstrates that since 2000, 1,664 large-scale land transactions in low- and middle-income countries were reported,

  3. Cloning the Professor, an Alternative to Ineffective Teaching in a Large Course

    Science.gov (United States)

    Nelson, Jennifer; Robison, Diane F.; Bell, John D.; Bradshaw, William S.

    2009-01-01

    Pedagogical strategies have been experimentally applied in large-enrollment biology courses in an attempt to amplify what teachers do best in effecting deep learning, thus more closely approximating a one-on-one interaction with students. Carefully orchestrated in-class formative assessments were conducted to provide frequent, high-quality…

  4. Seismic proving tests on the reliability for large components and equipment of nuclear power plants

    International Nuclear Information System (INIS)

    Ohno, Tokue; Tanaka, Nagatoshi

    1988-01-01

    Since Japan frequently experiences destructive earthquakes, the structural reliability of large components and equipment of nuclear power plants is rigorously required. They are designed using sophisticated seismic analyses and have not yet encountered a destructive earthquake. When nuclear power plants are planned, it is very important that the general public understand their structural reliability during and after an earthquake. Seismic Proving Tests have been planned by the Ministry of International Trade and Industry (MITI) to address this public requirement in Japan. A large-scale, high-performance vibration table was constructed at the Tadotsu Engineering Laboratory of the Nuclear Power Engineering Test Center (NUPEC), in order to prove structural reliability by vibrating test models (at full scale or close to actual size) under the conditions of a destructive earthquake. As test models, the following four items were selected from among the large components and equipment important to safety: Reactor Containment Vessel; Primary Coolant Loop or Primary Loop Recirculation System; Reactor Pressure Vessel; and Reactor Core Internals. A brief description is given of the vibration table, the test method, and the results of the tests on the PWR Reactor Containment Vessel and the BWR Primary Loop Recirculation System (author)

  5. Application of proving-ring technology to measure thermally induced displacements in large boreholes in rock

    International Nuclear Information System (INIS)

    Patrick, W.C.; Rector, N.L.; Butkovich, T.R.

    1984-03-01

    A strain-gauged proving-ring transducer was designed and deployed to measure small diametral displacements in 0.61-m-diameter boreholes in rock. The rock surrounding the boreholes had previously been heated by storage of spent nuclear fuel assemblies, and measurements were made during post-retrieval cooling of the rock. To accomplish this, a transducer was designed to measure displacements in the range of 10 to 100 μm, to function in a time-varying temperature regime of 30 to 60 °C at a relative humidity of 100%, to be of low stiffness, and to be easily and quickly installed. 7 references, 6 figures, 1 table
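
    The displacement range above reflects the classical thin-ring elasticity result on which proving-ring transducers are based. As a hedged illustration (not the authors' actual calibration, whose geometry the abstract does not give), the diametral deflection of a thin ring under diametrically opposed loads can be sketched as:

```python
import math

def ring_deflection(force_n, radius_m, youngs_modulus_pa, second_moment_m4):
    """Diametral deflection (m) of a thin circular ring under two
    diametrically opposed point loads, from the classical thin-ring
    result: delta = (pi/4 - 2/pi) * F * R^3 / (E * I)."""
    coeff = math.pi / 4 - 2 / math.pi  # ~0.1488
    return coeff * force_n * radius_m ** 3 / (youngs_modulus_pa * second_moment_m4)
```

    Inverting this relation, i.e. reading the ring's deflection with strain gauges to infer load (or, as here, using a calibrated ring of low stiffness to track borehole diameter changes), is the operating principle of a proving ring.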

  6. PROVING THE CAPABILITY FOR LARGE SCALE REGIONAL LAND-COVER DATA PRODUCTION BY SELF-FUNDED COMMERCIAL OPERATORS

    Directory of Open Access Journals (Sweden)

    M. W. Thompson

    2017-11-01

    For service providers developing commercial value-added data content based on remote sensing technologies, the focus is typically to create commercially appropriate geospatial information that has downstream business value, the primary aim being to link locational intelligence with business intelligence in order to make better-informed decisions. From a geospatial perspective, this locational information must be relevant, informative, and, most importantly, current, with the ability to maintain the information timeously into the future for change-detection purposes. Aligned with this, GeoTerraImage has successfully embarked on the production of land-cover/land-use content over southern Africa. The ability of a private company to successfully implement and complete such an exercise rests on the capability to leverage the combined advantages of cutting-edge data-processing technologies and methodologies, with emphasis on processing repeatability and speed, and the use of a wide range of readily available imagery. These production workflows utilise a wide range of integrated procedures, including machine-learning algorithms, innovative use of non-specialists for sourcing reference data, conventional pixel- and object-based image classification routines, and experienced/expert landscape interpretation. This multi-faceted approach to data product development demonstrates the capability of SMME-level commercial entities such as GeoTerraImage to generate industry-applicable large data content, in this case wide-area-coverage land-cover and land-use data across the sub-continent. Within this development, emphasis has been placed on key land-use information, such as mining, human settlements, and agriculture, given the importance of this geospatial land-use information in business and socio-economic applications and decision making.

  7. Ineffective cough and mechanical mucociliary clearance techniques.

    Science.gov (United States)

    Fernández-Carmona, A; Olivencia-Peña, L; Yuste-Ossorio, M E; Peñas-Maldonado, L

    Cough is a fundamental defense mechanism for keeping the airway free of foreign elements. Life-threatening situations may arise when cough proves ineffective as a result of muscle weakness or altered mucociliary function. When a patient is unable to cough effectively, techniques are required to either reinforce or replace cough capacity. The use of mechanical systems that facilitate or substitute cough function is increasingly common in Intensive Care Units, where it is relatively frequent to find situations of ineffective cough due to different clinical causes. This review examines the current clinical practice recommendations regarding the indication and use of mechanical cough-assist and intrapulmonary percussive ventilation systems. Copyright © 2017 Elsevier España, S.L.U. y SEMICYUC. All rights reserved.

  8. Multiple challenges of antibiotic use in a large hospital in Ethiopia - a ward-specific study showing high rates of hospital-acquired infections and ineffective prophylaxis.

    Science.gov (United States)

    Gutema, Girma; Håkonsen, Helle; Engidawork, Ephrem; Toverud, Else-Lydia

    2018-05-03

    This project aims to study the use of antibiotics in three clinical wards in the largest tertiary teaching hospital in Ethiopia for a period of 1 year. The specific aims were to assess the prevalence of patients on antibiotics, quantify the antibiotic consumption and identify the main indications of use. The material was all the medical charts (n = 2231) retrieved from three clinical wards (internal medicine, gynecology/obstetrics and surgery) in Tikur Anbessa Specialized Hospital (TASH) in Addis Ababa between September 2013 and September 2014. Data collection was performed manually by four pharmacists. Each medical chart represented one patient. About 60% of the patients were admitted to internal medicine, 20% to each of the other two wards. The number of bed days (BD) was on average 16.5. Antibiotics for systemic use were prescribed to 73.7% of the patients (on average: 2.1 antibiotics/patient) of whom 86.6% got a third or fourth generation cephalosporin (mainly ceftriaxone). The average consumption of antibiotics was 81.6 DDD/100BD, varying from 91.8 in internal medicine and 71.6 in surgery to 47.6 in gynecology/obstetrics. The five most frequently occurring infections were pneumonia (26.6%), surgical site infections (21.5%), neutropenic fever (6.9%), sepsis (6.4%) and urinary tract infections (4.7%). About one fourth of the prescriptions were for prophylactic purposes. Hospital acquired infections occurred in 23.5% of the patients (353 cases of surgical site infection). The prescribing was based on empirical treatment and sensitivity testing was reported in only 3.8% of the cases. In the present study from three wards in the largest tertiary teaching hospital in Ethiopia, three out of four patients were prescribed antibiotics, primarily empirically. The mean antibiotic consumption was 81.6 DDD/100BD. Surgical site infections constituted a large burden of the infections treated in the hospital, despite extensive prescribing of prophylaxis. The findings show
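
    The consumption figures above use the WHO DDD per 100 bed-days metric. A minimal sketch of that calculation (the 2-g DDD for parenteral ceftriaxone is the standard WHO value, but treat the numbers as illustrative, not the study's raw data):

```python
def ddd_per_100_bed_days(grams_used: float, who_ddd_grams: float, bed_days: float) -> float:
    """Antibiotic consumption expressed as WHO Defined Daily Doses per 100 bed-days."""
    ddds = grams_used / who_ddd_grams   # how many standard daily doses were consumed
    return 100.0 * ddds / bed_days      # normalised per 100 days of bed occupancy

# e.g. 600 g of ceftriaxone (WHO DDD = 2 g) dispensed over 500 bed-days
rate = ddd_per_100_bed_days(600, 2, 500)  # 60 DDD/100BD
```

    Normalising by bed-days is what makes figures such as 91.8 (internal medicine) and 47.6 (gynecology/obstetrics) comparable across wards of different sizes and lengths of stay.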

  9. Early Identification of Ineffective Cooperative Learning Teams

    Science.gov (United States)

    Hsiung, C. M.; Luo, L. F.; Chung, H. C.

    2014-01-01

    Cooperative learning has many pedagogical benefits. However, if the cooperative learning teams become ineffective, these benefits are lost. Accordingly, this study developed a computer-aided assessment method for identifying ineffective teams at their early stage of dysfunction by using the Mahalanobis distance metric to examine the difference…
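
    The abstract names the Mahalanobis distance as the detection metric. A minimal sketch of flagging an outlying team from per-team interaction vectors (the feature names are hypothetical; this is not the authors' implementation):

```python
import numpy as np

def mahalanobis(x, data):
    """Mahalanobis distance of observation x from the sample distribution
    formed by the rows of data (one row per team, one column per metric)."""
    mu = data.mean(axis=0)
    cov = np.cov(data, rowvar=False)          # sample covariance across teams
    diff = x - mu
    return float(np.sqrt(diff @ np.linalg.inv(cov) @ diff))

# hypothetical per-team metrics: [discussion posts per week, peer-rating mean]
teams = np.array([[0.0, 0.0], [1.0, 1.0], [0.0, 1.0], [1.0, 0.0]])
```

    A team whose distance exceeds a chosen threshold would be flagged for early intervention; unlike Euclidean distance, this accounts for the scale and correlation of the metrics.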

  10. Role of W and Mn for reliable 1X nanometer-node ultra-large-scale integration Cu interconnects proved by atom probe tomography

    Energy Technology Data Exchange (ETDEWEB)

    Shima, K.; Shimizu, H.; Momose, T.; Shimogaki, Y. [Department of Materials Engineering, The University of Tokyo, 7-3-1, Hongo, Bunkyo-ku, Tokyo 113-8656 (Japan); Tu, Y. [The Oarai Center, Institute for Materials Research, Tohoku University, Oarai, Ibaraki 311-1313 (Japan); Key Laboratory of Polar Materials and Devices, Ministry of Education, East China Normal University, Shanghai 200241 (China); Takamizawa, H.; Shimizu, Y.; Inoue, K.; Nagai, Y. [The Oarai Center, Institute for Materials Research, Tohoku University, Oarai, Ibaraki 311-1313 (Japan)

    2014-09-29

    We used atom probe tomography (APT) to study the use of Cu(Mn) as a seed layer for Cu, and a Co(W) single layer as a reliable Cu diffusion barrier, for future interconnects in ultra-large-scale integration. The Co(W) layer enhances the adhesion of Cu to prevent electromigration and stress-induced voiding failures. The use of Cu(Mn) as a seed layer may enhance the diffusion-barrier performance of Co(W) by stuffing the Cu diffusion paths with Mn. APT was used to visualize the distribution of W and Mn in three dimensions with sub-nanometer resolution. W was found to segregate at the grain boundaries of Co, which prevents diffusion of Cu via the grain boundaries. Mn was found to diffuse from the Cu(Mn) layer to the Co(W) layer and selectively segregate at the Co(W) grain boundaries with W, reinforcing the barrier properties of the Co(W) layer. Hence, a Co(W) barrier coupled with a Cu(Mn) seed layer can form a sufficient diffusion barrier with a film that is less than 2.0 nm thick. The diffusion-barrier behavior was preserved following a 1-h annealing at 400 °C. The underlayer of the Cu interconnects requires a large adhesion strength with the Cu, as well as low electrical resistivity. The use of Co(W) has previously been shown to satisfy these requirements, and the addition of Mn is not expected to deteriorate these properties.

  11. Ineffective ADL skills in women with fibromyalgia

    DEFF Research Database (Denmark)

    Von Bülow, Cecilie; Amris, Kirstine; la Cour, Karen

    2016-01-01

    BACKGROUND: Subgroups of women with fibromyalgia likely show different activity of daily living (ADL) skill deficits. Identifying ineffective ADL skills of significance in the 'typical' woman with fibromyalgia will promote the planning of targeted occupational therapy interventions aiming at improving ADL ability. OBJECTIVE: To identify frequently reported ADL skill deficits of significance in subgroups of women with fibromyalgia who have decreased ADL motor ability in combination with decreased or competent ADL process ability. METHOD: Women with fibromyalgia were evaluated with the Assessment of Motor and Process Skills (AMPS). If they demonstrated decreased ADL motor ability, the calibrated AMPS raters identified and reported ineffective ADL skills of significance. Descriptive comparisons were made between subgroups displaying either decreased or competent ADL process ability. RESULTS: Moves...

  12. CAUSES FOR INEFFECTIVE COMMUNICATION BETWEEN MEDICAL SPECIALISTS

    Directory of Open Access Journals (Sweden)

    Stayko I. Spiridonov

    2017-07-01

    Purpose: In recent years, the healthcare system has moved to an inter-professional, cross-disciplinary, multi-person approach in which communication is very important for ensuring patient safety. Communication in health organisations needs to be studied and analysed deeply and comprehensively, because the future of an organisation often depends on good communication. The purpose of this study is to investigate and analyse the reasons for ineffective communication between medical specialists in the teams they work in. Materials and Methods: A questionnaire method was used. Through a survey over a period of 12 months (from 01.12.2014 to 01.12.2015) at the Escullap Hospital in Pazardzhik, DCC 18 - Sofia, St. Mina Hospital in Plovdiv, MHAT – Plovdiv, DCC 1 in Haskovo, UMHAT in Stara Zagora, DCC 3 in Varna and MHAT – Parvomay, the opinion of medical specialists on the effectiveness of communication within the teams they work in was studied and analysed. The survey included 477 medical specialists. Results and conclusions: According to 41.1% of the respondents, communication in the team they work in is insufficiently effective. Most of the respondents (39.8%) find their colleagues responsible for the ineffective communication, followed by those who see the cause of poor communication in the management of the health care facility (27.6%). The leading cause of poor communication in the team, according to the study participants, is personality differences between colleagues (41.9%). According to the majority of respondents (28.3%), improvements in facilities, along with wage increases (27.3%), would be essential to improve communication within the teams they work in. Recommendations have been formulated to improve communication among medical specialists.

  13. An entomopathogenic fungus and nematode prove ineffective for biocontrol of an invasive leaf miner Profenusa thomsoni in Alaska

    Science.gov (United States)

    Robert Progar; J.J. Kruse; John Lundquist; K.P. Zogas; M.J. Rinella

    2015-01-01

    A non-native invasive sawfly, the amber-marked birch leaf miner Profenusa thomsoni (Konow), was first detected in south-central Alaska in 1996 and is now widely distributed throughout urban and wild birch trees in Alaska. Impacts have been considered primarily aesthetic because leaf miners cause leaves of birch trees (Betula...

  14. Elliptic curves and primality proving

    Science.gov (United States)

    Atkin, A. O. L.; Morain, F.

    1993-07-01

    The aim of this paper is to describe the theory and implementation of the Elliptic Curve Primality Proving algorithm. As Gauss wrote in the Disquisitiones Arithmeticae: "The problem of distinguishing prime numbers from composite numbers, and of resolving the latter into their prime factors, is known to be one of the most important and useful in arithmetic, and it has engaged the industry and wisdom of ancient and modern geometers to such an extent that it would be superfluous to discuss the problem at length."
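
    ECPP rests on the Goldwasser-Kilian elliptic curve primality criterion; its standard statement is reproduced here for orientation (a textbook result, not quoted from this paper):

```latex
% Goldwasser--Kilian primality criterion (standard statement)
\textbf{Theorem.} Let $N > 1$ with $\gcd(N, 6) = 1$, let $E$ be an elliptic
curve over $\mathbb{Z}/N\mathbb{Z}$, and let $m$ and $q$ be integers with
$q \mid m$, $q$ prime, and $q > \left(N^{1/4} + 1\right)^2$. If there exists
a point $P$ on $E$ such that
\[
  mP = O \qquad\text{and}\qquad \tfrac{m}{q}\,P \neq O ,
\]
with all curve operations defined (no non-invertible denominators arising
modulo $N$), then $N$ is prime.
```

    ECPP searches for such a tuple (E, m, q, P), then recurses on the primality of the smaller q, yielding a chain of certificates that can be verified independently.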

  15. Geometric inequalities methods of proving

    CERN Document Server

    Sedrakyan, Hayk

    2017-01-01

    This unique collection of new and classical problems provides full coverage of geometric inequalities. Many of the 1,000 exercises are presented with detailed author-prepared solutions, developing creativity and an arsenal of new approaches for solving mathematical problems. This book can serve teachers, high-school students, and mathematical competitors. It may also be used as supplemental reading, providing readers with new and classical methods for proving geometric inequalities.

  16. Ineffective higher derivative black hole hair

    Science.gov (United States)

    Goldstein, Kevin; Mashiyane, James Junior

    2018-01-01

    Inspired by the possibility that the Schwarzschild black hole may not be the unique spherically symmetric vacuum solution to generalizations of general relativity, we consider black holes in pure fourth-order higher derivative gravity treated as an effective theory. Such solutions may be of interest in addressing the issue of higher derivative hair or during the later stages of black hole evaporation. Non-Schwarzschild solutions have been studied, but we have put earlier results on a firmer footing by finding a systematic asymptotic expansion for the black holes and matching them with known numerical solutions obtained by integrating out from the near-horizon region. These asymptotic expansions can be cast in the form of trans-series expansions, which we conjecture will be a generic feature of non-Schwarzschild higher derivative black holes. Excitingly, we find a new branch of solutions with lower free energy than the Schwarzschild solution, but, as found in earlier work, solutions only seem to exist for black holes with large curvatures, meaning that one should not generically neglect even higher derivative corrections. This suggests that one effectively recovers the no-hair theorems in this context.

  17. Immunomodulators in warts: Unexplored or ineffective?

    Directory of Open Access Journals (Sweden)

    Surabhi Sinha

    2015-01-01

    Cutaneous warts are known to be recurrent and often resistant to therapy. Resistant warts may reflect a localized or systemic cell-mediated immune (CMI) deficiency to HPV. Many modalities of treatment are in use; most of the provider-administered therapies, such as cryotherapy, chemical cauterisation, curettage, electrodesiccation and laser removal, are destructive and cause scarring. Most patient-applied agents, like podophyllotoxin, carry the risk of application-site reactions and recurrence. Thus immunotherapy is a promising modality that could lead to resolution of warts without any physical changes or scarring and, in addition, would augment the host response against the causative agent, thereby leading to complete resolution and decreased recurrences. Immunomodulators can be administered systemically, intralesionally or intradermally, and topically. A few agents, such as cimetidine and interferons, have been tried and studied extensively; others are new on the horizon, such as Echinacea, green tea catechins and the quadrivalent HPV vaccine, and their efficacy is yet to be completely established. Though some, like levamisole, have shown no efficacy as monotherapy and are now used only in combination, other more recent agents require large and long-term randomized placebo-controlled trials to clearly establish their efficacy or lack of it. In this review, we focus on the immunomodulators that have been used for the treatment of warts and the studies that have been conducted on them.

  18. Reported Work Emphasis of Effective and Ineffective Counselors.

    Science.gov (United States)

    Wiggins, James D.; Mickle-Askin, Kathleen

    1980-01-01

    A study of counselors in four states showed correlations between personality characteristics and job performance. Counselors rated effective emphasized individual counseling and career work and said they closely follow a theory. They also spent more time on follow-up and consultation than ineffective counselors. (JAC)

  19. Personality Characteristics of Counselors Rated as Effective or Ineffective.

    Science.gov (United States)

    Wiggins, J. D.; Weslander, D. L.

    1979-01-01

    Vocational Preference Inventory (VPI) was used to discriminate counselors rated as highly effective, as average, or as ineffective. Results indicated significant correlations between tested personality characteristics and rated job performances. Employment level, sex, age, certification, and degree status were of no significance in predicting…

  20. The Unreasonable Ineffectiveness of Security Engineering: An Overview

    NARCIS (Netherlands)

    Pavlovic, Dusko

    2010-01-01

    In his 1960 essay, Eugene Wigner raised the question of "the unreasonable effectiveness of mathematics in natural sciences". After several decades of security research, we are tempted to ask the opposite question: Are we not unreasonably ineffective? Why are we not more secure from all the security

  1. Techniques of preoxygenation in patients with ineffective face mask seal

    Directory of Open Access Journals (Sweden)

    Pankaj Kundra

    2013-01-01

    Background: An ineffective face mask seal is the most common cause of suboptimal pre-oxygenation. Room air entrainment can be greater with vital capacity (VC) breaths when the mask is not a tight fit. Aims: This study was designed to compare 5 min of tidal volume (TV) breathing and eight VC breaths in patients with an ineffective face mask seal. Methods: Twenty-eight ASA I adults with an ineffective face mask seal were randomized to breathe 100% oxygen at normal TV for 5 min (Group TV) and to take eight VC breaths (Group VC), in a crossover manner, through a circle system at 10 L/min. End-tidal oxygen concentration (EtO2) and arterial blood gas analysis were used to evaluate oxygenation with each technique. Statistical Analysis: Data were analysed using SPSS statistical software, version 16. Friedman's two-way analysis of variance by ranks was used for non-parametric data. Results: A significant increase in EtO2 (median 90) and PaO2 (228.85) was seen in Group TV when compared to Group VC (EtO2 median 85, PaO2 147.65; P<0.05). Mean total ventilation volume in 1 min in Group VC was 9.4±3.3 L/min, exceeding the fresh gas flow (10 L/min) in seven patients. In Group TV, the fresh gas flow (50 L over 5 min) was sufficient at normal TV (mean total ventilation in 5 min: 36.7±6.3 L). Conclusions: TV breathing for 5 min provides better pre-oxygenation in patients with an ineffective mask seal, with a fresh gas flow of 10 L/min delivered through a circle system.
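
    The comparison above relies on Friedman's two-way analysis of variance by ranks. A self-contained sketch of the Friedman chi-square statistic, with tie-corrected ranks (the sample data are illustrative, not the study's values):

```python
def friedman_statistic(data):
    """Friedman chi-square for related samples.
    data: one row per subject, one column per condition (k treatments)."""
    n, k = len(data), len(data[0])
    rank_sums = [0.0] * k
    for row in data:
        order = sorted(range(k), key=lambda j: row[j])
        ranks = [0.0] * k
        i = 0
        while i < k:                      # assign average ranks to tied values
            j = i
            while j + 1 < k and row[order[j + 1]] == row[order[i]]:
                j += 1
            for t in range(i, j + 1):
                ranks[order[t]] = (i + j) / 2 + 1
            i = j + 1
        for j in range(k):
            rank_sums[j] += ranks[j]
    return 12.0 / (n * k * (k + 1)) * sum(r * r for r in rank_sums) - 3.0 * n * (k + 1)

# illustrative paired EtO2-style readings (condition A vs condition B per subject)
sample = [[90, 85], [92, 88], [89, 87], [88, 90]]
```

    The statistic is compared against a chi-square distribution with k-1 degrees of freedom; ranking within each subject is what makes the test non-parametric and appropriate for crossover data like this study's.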

  2. [Use of ineffective practices in Primary Health Care: professional opinions].

    Science.gov (United States)

    Domínguez Bustillo, L; Barrasa Villar, J I; Castán Ruíz, S; Moliner Lahoz, F J; Aibar Remón, C

    2014-01-01

    To estimate the frequency of ineffective practices in Primary Health Care (PHC) based on the opinions of clinical professionals from the sector, and to assess the significance, implications and factors that may be contributing to their continuance. An online survey of opinion was conducted in a convenience sample of 575 professionals who had published articles over the last years in the Atención Primaria and Semergen medical journals. A total of 212 professionals replied (37%). For 70.6% (95% confidence interval [CI], 64.5 to 73.3) the problem of ineffective practices is frequent or very frequent in PHC, and they rate its importance with an average score of 7.3 (standard deviation [SD]=1.8) out of 10. The main consequences would be endangering the sustainability of the system (48.1%; 95% CI, 41.2 to 54.9) and harming patients (32.1%; 95% CI, 25.7 to 38.5). These ineffective practices are attributed to the behaviour of the patients themselves (28%; 95% CI, 22.6 to 35.0), workload (26.4%; 95% CI, 20.3 to 32.5), and the lack of continuing education (19.3%; 95% CI, 13.9 to 24.7). The most misused clinical procedures are the prescribing of antibiotics for certain infections, the frequency of cervical cancer screening, rigorous pharmacological monitoring of type 2 diabetes in patients over 65 years, the use of psychotropic drugs in the elderly, and the use of analgesics in patients with hypertension or renal failure. The use of ineffective procedures in PHC is considered a very important issue that negatively affects many patients and their treatment, possibly endangering the sustainability of the system and causing harm to patients. Copyright © 2014 SECA. Published by Elsevier Espana. All rights reserved.
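
    The percentages above are reported with 95% confidence intervals. The standard normal-approximation (Wald) interval for a survey proportion can be computed as follows (a generic sketch, not necessarily the authors' exact method):

```python
import math

def proportion_ci(p_hat, n, z=1.96):
    """95% Wald confidence interval for a proportion p_hat observed in n responses."""
    se = math.sqrt(p_hat * (1 - p_hat) / n)  # standard error of the proportion
    return (p_hat - z * se, p_hat + z * se)

# e.g. 70.6% of the 212 respondents
low, high = proportion_ci(0.706, 212)
```

    For small samples or proportions near 0 or 1, a Wilson or exact interval is usually preferred over this approximation.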

  3. Symbolic logic and mechanical theorem proving

    CERN Document Server

    Chang, Chin-Liang

    1969-01-01

    This book contains an introduction to symbolic logic and a thorough discussion of mechanical theorem proving and its applications. The book consists of three major parts. Chapters 2 and 3 constitute an introduction to symbolic logic. Chapters 4-9 introduce several techniques in mechanical theorem proving, and Chapters 10 and 11 show how theorem proving can be applied to various areas such as question answering, problem solving, program analysis, and program synthesis.
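
    To give a flavor of the mechanical theorem-proving techniques such books cover: the resolution rule for propositional clauses can be sketched in a few lines, with clauses as sets of signed integers and a naive saturation loop (a toy illustration, not the book's code):

```python
def resolvents(ci, cj):
    """All clauses obtained by resolving ci against cj on a complementary literal."""
    return [(ci - {lit}) | (cj - {-lit}) for lit in ci if -lit in cj]

def unsatisfiable(clause_list):
    """Saturate the clause set under resolution; True iff the empty clause is derived."""
    clauses = {frozenset(c) for c in clause_list}
    while True:
        new = set()
        for ci in clauses:
            for cj in clauses:
                for r in resolvents(ci, cj):
                    if not r:
                        return True     # empty clause: refutation found
                    new.add(frozenset(r))
        if new <= clauses:
            return False                # saturated without deriving the empty clause
        clauses |= new

# literals are signed ints: 1 means p, -1 means not-p, 2 means q, etc.
```

    Refutation completeness means this loop derives the empty clause exactly when the clause set is unsatisfiable; practical provers add clause ordering, subsumption, and unification for first-order logic.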

  4. Proving productivity in infinite data structures

    NARCIS (Netherlands)

    Zantema, H.; Raffelsieper, M.; Lynch, C.

    2010-01-01

    For a general class of infinite data structures including streams, binary trees, and the combination of finite and infinite lists, we investigate the notion of productivity. This generalizes stream productivity. We develop a general technique to prove productivity based on proving context-sensitive

  5. Proving relations between modular graph functions

    International Nuclear Information System (INIS)

    Basu, Anirban

    2016-01-01

    We consider modular graph functions that arise in the low energy expansion of the four graviton amplitude in type II string theory. The vertices of these graphs are the positions of insertions of vertex operators on the toroidal worldsheet, while the links are the scalar Green functions connecting the vertices. Graphs with four and five links satisfy several non-trivial relations, which have been proved recently. We prove these relations by using elementary properties of Green functions and the details of the graphs. We also prove a relation between modular graph functions with six links. (paper)

  6. Critical defining characteristics for nursing diagnosis about ineffective breastfeeding

    Directory of Open Access Journals (Sweden)

    Sandra Cristina de Alvarenga

    Objective: To investigate nursing diagnostic accuracy measures and to propose a model for using defining characteristics to judge the nursing diagnosis of ineffective breastfeeding. Method: Cross-sectional study with a sample of 73 mother-child pairs hospitalized in the maternity ward of a university hospital from July to August of 2014. Results: The diagnostic predominance rate was 58.9%. The characteristics that best fit the logistic regression model were: interruption of sucking at the breast; the infant's inability to latch onto the areola-nipple region correctly; the infant's crying one hour after breastfeeding; and perceived inadequate milk supply. Conclusion: The breastfeeding process is dynamic; diagnostic judgement may change according to the time data are collected; the defining characteristics are the best predictors if associated with models and rules of use.

  7. Do parents of obese children use ineffective parenting strategies?

    Science.gov (United States)

    Morawska, Alina; West, Felicity

    2013-12-01

    Research has shown mixed findings about the relationship between parenting style and child lifestyle outcomes. This paper describes a cross-sectional study that aimed to clarify the relationship between ineffective parenting and childhood obesity by using multiple measures of child and family functioning. Sixty-two families with an obese child (aged four to 11 years) were matched with 62 families with a healthy weight child on key sociodemographic variables. Significant differences were found on several measures, including general parenting style, domain-specific parenting practices, and parenting self-efficacy (d = .53 to 1.96). Parents of obese children were more likely to use permissive and coercive discipline techniques, and to lack confidence in managing children's lifestyle behaviour. In contrast, parents of healthy weight children were more likely to implement specific strategies for promoting a healthy lifestyle.

  8. Generic Example Proving Criteria for All

    Science.gov (United States)

    Yopp, David; Ely, Rob; Johnson­-Leung, Jennifer

    2015-01-01

    We review literature that discusses generic example proving and highlight ambiguities that pervade our research community's discourse about generic example arguments. We distinguish between pedagogical advice for choosing good examples that can serve as generic examples when teaching and advice for developing generic example arguments. We provide…

  9. On proving syntactic properties of CPS programs

    DEFF Research Database (Denmark)

    Danvy, Olivier; Dzafic, Belmina; Pfenning, Frank

    1999-01-01

    Higher-order program transformations raise new challenges for proving properties of their output, since they resist traditional, first-order proof techniques. In this work, we consider (1) the “one-pass” continuation-passing style (CPS) transformation, which is second-order, and (2) the occurrences of parameters of continuations in its output. To this end, we specify the one-pass CPS transformation relationally and we use the proof technique of logical relations.

  10. SARS – Koch's Postulates proved.

    Indian Academy of Sciences (India)

    SARS – Koch's Postulates proved. A novel coronavirus was identified from fluids of patients. The virus was cultured in a Vero cell line. Sera of patients have antibodies to the virus. The cultured virus produces disease in Macaque monkeys: it produces a specific immune response; the isolated virus is SARS-CoV; and the pathology is similar to that in humans.

  11. Seismic proving test of PWR reactor containment vessel

    International Nuclear Information System (INIS)

    Akiyama, H.; Yoshikawa, T.; Tokumaru, Y.

    1987-01-01

    The seismic reliability proving tests of nuclear power plant facilities are carried out by the Nuclear Power Engineering Test Center (NUPEC), using the large-scale, high-performance vibration table of the Tadotsu Engineering Laboratory, and are sponsored by the Ministry of International Trade and Industry (MITI). In 1982, the seismic reliability proving test of the PWR containment vessel started, using a test component at a reduced scale of 1/3.7, and the test component proved to have structural soundness against earthquakes. Subsequently, detailed analysis and evaluation of these test results were carried out, and analysis methods for evaluating strength against earthquakes were established. Thereupon, seismic analysis and evaluation of the actual containment vessel were performed with these analysis methods, and the safety and reliability of the PWR reactor containment vessel were confirmed

  12. Proving Non-Deterministic Computations in Agda

    Directory of Open Access Journals (Sweden)

    Sergio Antoy

    2017-01-01

    We investigate proving properties of Curry programs using Agda. First, we address the functional correctness of Curry functions that, apart from some syntactic and semantic differences, are in the intersection of the two languages. Second, we use Agda to model non-deterministic functions with two distinct and competitive approaches incorporating the non-determinism. The first approach eliminates non-determinism by considering the set of all non-deterministic values produced by an application. The second approach encodes every non-deterministic choice that the application could perform. We consider our initial experiment a success. Although proving properties of programs is a notoriously difficult task, the functional logic paradigm does not seem to add any significant layer of difficulty or complexity to the task.
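    The two competing encodings of non-determinism described in the abstract can be illustrated outside the Curry/Agda setting. The following Python sketch is only an illustrative analogue (the names `choice`, `coin` and the bit-list "oracle" are our assumptions, not the paper's encoding):

    ```python
    # Approach 1: a non-deterministic expression denotes the set of all the
    # values it can produce; deterministic functions are applied pointwise.
    def choice(*xs):
        return frozenset(xs)

    def apply_nd(f, nd):
        return frozenset(f(x) for x in nd)

    # Approach 2: every choice point is encoded explicitly; a list of bits
    # (an "oracle") selects one concrete computation path.
    def coin(oracle):
        return oracle.pop(0)

    def double_coin(oracle):
        x = coin(oracle)
        return x + x          # both occurrences share the same choice

    set_view = apply_nd(lambda x: x + x, choice(0, 1))       # all outcomes at once
    path_view = sorted(double_coin([b]) for b in (0, 1))     # one outcome per path
    ```

    Both views agree that `double_coin` can only yield 0 or 2, never 1: the choice is made once and shared, which is the call-time-choice behaviour a proof about a Curry program must respect.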

  13. On proving syntactic properties of CPS programs

    DEFF Research Database (Denmark)

    Danvy, Olivier; Dzafic, Belmina; Pfenning, Frank

    1999-01-01

    Higher-order program transformations raise new challenges for proving properties of their output, since they resist traditional, first-order proof techniques. In this work, we consider (1) the “one-pass” continuation-passing style (CPS) transformation, which is second-order, and (2) the occurrences of parameters of continuations in its output. To this end, we specify the one-pass CPS transformation relationally and we use the proof technique of logical relations.
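    The "one-pass" aspect referred to above — contracting administrative redexes at transformation time by treating continuations as meta-level functions — can be sketched concretely. The following is a standard Danvy–Filinski-style call-by-value transformation written in Python rather than in the paper's formal setting; term shapes and helper names are our own:

    ```python
    import itertools

    _fresh = itertools.count()

    def gensym(prefix):
        """Generate a fresh residual variable name."""
        return f"{prefix}{next(_fresh)}"

    def cps(t, k):
        """Transform term t; k is a *meta-level* (Python) continuation, so
        administrative redexes are contracted during the single pass."""
        if isinstance(t, str):                    # variable: already a value
            return k(t)
        if t[0] == 'lam':                         # ('lam', x, body)
            _, x, body = t
            c = gensym('k')
            return k(('lam', x, ('lam', c, cps_tail(body, c))))
        if t[0] == 'app':                         # ('app', f, a)
            _, f, a = t
            v = gensym('v')
            return cps(f, lambda fv:
                   cps(a, lambda av:
                   ('app', ('app', fv, av), ('lam', v, k(v)))))

    def cps_tail(t, c):
        """Serious (tail) position: the continuation is a residual variable c."""
        if isinstance(t, str):
            return ('app', c, t)
        if t[0] == 'lam':
            _, x, body = t
            c2 = gensym('k')
            return ('app', c, ('lam', x, ('lam', c2, cps_tail(body, c2))))
        if t[0] == 'app':
            _, f, a = t
            return cps(f, lambda fv:
                   cps(a, lambda av:
                   ('app', ('app', fv, av), c)))

    # CPS of the identity function; no administrative redexes appear:
    cps_id = cps(('lam', 'x', 'x'), lambda v: v)
    # -> ('lam', 'x', ('lam', 'k0', ('app', 'k0', 'x')))
    ```

    Because `k` is a Python function rather than a residual lambda, the transformation is second-order in exactly the sense the abstract describes, which is what makes first-order proof techniques inapplicable to it.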

  14. Theorem Proving In Higher Order Logics

    Science.gov (United States)

    Carreno, Victor A. (Editor); Munoz, Cesar A.; Tahar, Sofiene

    2002-01-01

    The TPHOLs International Conference serves as a venue for the presentation of work in theorem proving in higher-order logics and related areas in deduction, formal specification, software and hardware verification, and other applications. Fourteen papers were submitted to Track B (Work in Progress) and are included in this volume. Authors of Track B papers gave short introductory talks that were followed by an open poster session. The FCM 2002 Workshop aimed to bring together researchers working on the formalisation of continuous mathematics in theorem proving systems with those needing such libraries for their applications. Many of the major higher order theorem proving systems now have a formalisation of the real numbers and various levels of real analysis support. This work is of interest in a number of application areas, such as formal methods development for hardware and software applications and computer-supported mathematics. The workshop consisted of three papers, presented by their authors at the workshop venue, and one invited talk.

  15. Automated theorem proving theory and practice

    CERN Document Server

    Newborn, Monty

    2001-01-01

    As the 21st century begins, the power of our magical new tool and partner, the computer, is increasing at an astonishing rate. Computers that perform billions of operations per second are now commonplace. Multiprocessors with thousands of little computers - relatively little! - can now carry out parallel computations and solve problems in seconds that only a few years ago took days or months. Chess-playing programs are on an even footing with the world's best players. IBM's Deep Blue defeated world champion Garry Kasparov in a match several years ago. Increasingly computers are expected to be more intelligent, to reason, to be able to draw conclusions from given facts, or abstractly, to prove theorems - the subject of this book. Specifically, this book is about two theorem-proving programs, THEO and HERBY. The first four chapters contain introductory material about automated theorem proving and the two programs. This includes material on the language used to express theorems, predicate calculus, and the rules of...
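    As a concrete taste of the kind of inference such provers automate, here is a hedged Python sketch of a single binary resolution step over predicate-calculus clauses. It is deliberately simplified (no occurs check, no standardizing-apart of variables) and is not THEO's or HERBY's actual machinery:

    ```python
    def is_var(t):
        # Convention for this sketch: lowercase strings are variables,
        # uppercase strings are constants, tuples are compound terms/atoms.
        return isinstance(t, str) and t[0].islower()

    def walk(t, s):
        while is_var(t) and t in s:
            t = s[t]
        return t

    def unify(x, y, s):
        """Syntactic unification; returns a substitution dict or None."""
        x, y = walk(x, s), walk(y, s)
        if x == y:
            return s
        if is_var(x):
            return {**s, x: y}
        if is_var(y):
            return {**s, y: x}
        if isinstance(x, tuple) and isinstance(y, tuple) \
                and len(x) == len(y) and x[0] == y[0]:
            for a, b in zip(x[1:], y[1:]):
                s = unify(a, b, s)
                if s is None:
                    return None
            return s
        return None

    def subst(t, s):
        t = walk(t, s)
        if isinstance(t, tuple):
            return (t[0],) + tuple(subst(a, s) for a in t[1:])
        return t

    def resolve(c1, c2):
        """All binary resolvents of two clauses; a literal is (sign, atom)."""
        out = []
        for i, (s1, a1) in enumerate(c1):
            for j, (s2, a2) in enumerate(c2):
                if s1 != s2:
                    s = unify(a1, a2, {})
                    if s is not None:
                        rest = [l for n, l in enumerate(c1) if n != i] + \
                               [l for n, l in enumerate(c2) if n != j]
                        out.append([(sg, subst(a, s)) for sg, a in rest])
        return out

    # {P(x)} and {~P(A), Q(A)} resolve to {Q(A)}:
    resolvents = resolve([(True, ('P', 'x'))],
                         [(False, ('P', 'A')), (True, ('Q', 'A'))])
    ```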

  16. Theorem Proving in Intel Hardware Design

    Science.gov (United States)

    O'Leary, John

    2009-01-01

    For the past decade, a framework combining model checking (symbolic trajectory evaluation) and higher-order logic theorem proving has been in production use at Intel. Our tools and methodology have been used to formally verify execution cluster functionality (including floating-point operations) for a number of Intel products, including the Pentium® 4 and Core™ i7 processors. Hardware verification in 2009 is much more challenging than it was in 1999 - today's CPU chip designs contain many processor cores and significant firmware content. This talk will attempt to distill the lessons learned over the past ten years, discuss how they apply to today's problems, and outline some future directions.

  17. Reasoning by analogy as an aid to heuristic theorem proving.

    Science.gov (United States)

    Kling, R. E.

    1972-01-01

    When heuristic problem-solving programs are faced with large data bases that contain numbers of facts far in excess of those needed to solve any particular problem, their performance rapidly deteriorates. In this paper, the correspondence between a new unsolved problem and a previously solved analogous problem is computed and invoked to tailor large data bases to manageable sizes. This paper outlines the design of an algorithm for generating and exploiting analogies between theorems posed to a resolution-logic system. These algorithms are believed to be the first computationally feasible development of reasoning by analogy to be applied to heuristic theorem proving.
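    Kling's analogy computation is considerably richer than symbol matching, but the core idea of tailoring a large fact base down to a manageable size for a new problem can be hedged into a few lines of Python. Here relevance is just overlap of predicate symbols with the goal; all names are illustrative:

    ```python
    def tailor(facts, goal_symbols, k=3):
        """Keep the k facts whose predicate symbols overlap the goal's most
        (a crude stand-in for analogy-based tailoring of a fact base)."""
        ranked = sorted(facts.items(),
                        key=lambda item: len(item[1] & goal_symbols),
                        reverse=True)
        return [name for name, syms in ranked[:k] if syms & goal_symbols]

    # Toy fact base: each axiom is summarized by its predicate symbols.
    axioms = {
        'group_assoc':   {'Product', 'Equal'},
        'group_inverse': {'Product', 'Inverse', 'Identity'},
        'ring_distrib':  {'Sum', 'Product'},
        'order_total':   {'Less'},
    }
    relevant = tailor(axioms, {'Product', 'Inverse'}, k=2)
    ```

    Feeding only `relevant` to a resolution prover is the kind of data-base reduction the paper argues is necessary once the fact base far exceeds what any single proof needs.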

  18. Valutazione economica dello studio PROVE-IT

    Directory of Open Access Journals (Sweden)

    Lorenzo G. Mantovani

    2007-10-01

    Introduction: the PROVE-IT study (“Intensive versus moderate lipid lowering with statins after acute coronary syndromes”) was a comparison of pravastatin 40 mg/die versus atorvastatin 80 mg/die in patients with an acute coronary syndrome (ACS). Aim: our aim was to investigate the economic consequences of high-dose atorvastatin vs usual-dose pravastatin in Italian patients with a history of acute coronary syndrome. Methods: the analysis is conducted on the basis of the clinical outcomes of the PROVE-IT study. We conducted a cost-effectiveness analysis, comparing high-dose atorvastatin (80 mg/die) versus usual-dose pravastatin (40 mg/die) from the perspective of the Italian National Health Service. We identified and quantified medical costs: drug costs according to the Italian National Therapeutic Formulary, and hospitalizations quantified on the basis of the Italian National Health Service tariffs (2006). Effects were measured in terms of mortality and morbidity reduction (number of deaths, life years gained and frequency of hospitalizations). We considered an observation period of 24 months. The costs borne after the first 12 months were discounted using an annual rate of 3%. We conducted one- and multi-way sensitivity analyses on unit cost and effectiveness. We also conducted a threshold analysis. Results: the cost of pravastatin or atorvastatin therapy over the 2-year period amounted to approximately 1.3 million euros and 870,000 euros per 1,000 patients, respectively. Atorvastatin was more efficacious than pravastatin, and the overall cost of care per 1,000 patients over 24 months of follow-up was estimated at 3.2 million euros in the pravastatin group and 2.5 million euros in the atorvastatin group, resulting in a cost saving of about 700,000 euros, i.e. 27% of the total costs incurred in the pravastatin group. Discussion: this study demonstrates that high-dose atorvastatin treatment leads to a reduction of direct costs for the National Health System.
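    The discounting step described in the methods (costs beyond the first 12 months discounted at 3% per year) is a present-value sum. A minimal sketch follows; the 50/50 yearly split is invented, since the abstract only reports 24-month totals:

    ```python
    def discounted_total(costs_by_year, rate=0.03):
        """Present value of a yearly cost stream: year 0 at face value,
        later years divided by (1 + rate)**year (the study used a 3%
        annual rate for costs borne after the first 12 months)."""
        return sum(c / (1 + rate) ** t for t, c in enumerate(costs_by_year))

    # Hypothetical two-year cost stream (euro per 1,000 patients):
    pv = discounted_total([1_600_000, 1_600_000])

    # The headline saving reported above, per 1,000 patients over 24 months:
    saving = 3_200_000 - 2_500_000   # 700,000 euro
    ```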

  19. a Test to Prove Cloud Whitening THEORY!

    Science.gov (United States)

    Buttram, J. W.

    2011-12-01

    Climate science researchers believe our planet could possibly tolerate twice the present carbon dioxide levels with no upward temperature change, IF we could increase the amount of energy reflected back out into space by about 2.0%. ©Cloudtec basically alters a blend of seawater by applying heat derived from magma to it at a temperature exceeding 2,000 degrees F. The interaction of seawater and magma displaces the oxygen, causing the volume of water to vaporize and expand over 4,000 times - transforming billions of tons of seawater into thousands of cubic miles of white, maritime, stratocumulus clouds that reflect the incident rays of the Sun back out into space. A 6-month test to prove the Cloud Whitening Theory will cost 6 million dollars (no profit added). This study will give everyone on the planet with a computer the transparency to use satellite imagery and check for themselves if and when Cloud Whitening is occurring. If the Cloud Whitening Theory is validated, ©Cloudtec's innovation can strategically create the clouds we need to reflect the Sun's rays back out into space and help neutralize the projected 3.6 degrees F rise in temperature. Based on reasonable calculations of anthropogenic global warming, this one move alone would be comparable to slashing global carbon dioxide emissions by over 60% over the next 40 years.
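    For orientation, the "about 2.0%" figure can be sanity-checked against standard textbook numbers that are not part of the abstract: the simplified radiative forcing of a CO2 doubling is roughly 5.35·ln 2 ≈ 3.7 W/m², and global-mean top-of-atmosphere insolation is roughly 340 W/m², so the extra reflected fraction needed is on the order of 1%, the same order of magnitude as the claim:

    ```python
    import math

    delta_F = 5.35 * math.log(2)   # W/m^2, simplified CO2-doubling forcing
    insolation = 340.0             # W/m^2, global-mean top-of-atmosphere
    extra_reflected_fraction = delta_F / insolation   # ~0.011, i.e. ~1.1%
    ```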

  20. A pharmacological analysis elucidating why, in contrast to (-)-deprenyl (selegiline), alpha-tocopherol was ineffective in the DATATOP study.

    Science.gov (United States)

    Miklya, I; Knoll, B; Knoll, J

    2003-04-25

    The Parkinson Study Group, which conducted the Deprenyl and Tocopherol Antioxidative Therapy of Parkinsonism (DATATOP) trial, designed their study in the belief that the MAO inhibitor (-)-deprenyl (selegiline), the antioxidant alpha-tocopherol, and the combination of the two compounds would slow the clinical progression of the disease to the extent that MAO activity and the formation of oxygen radicals contribute to the pathogenesis of nigral degeneration. In fact, (-)-deprenyl only delayed the onset of disability associated with early, otherwise untreated Parkinson's disease; however, in contrast to the authors' expectation, alpha-tocopherol proved to be ineffective in the DATATOP study. Enhancer substances - (-)-deprenyl, (-)-1-phenyl-2-propylaminopentane [(-)-PPAP], the (-)-deprenyl analogue free of MAO inhibitory potency, and R-(-)-1-(benzofuran-2-yl)-2-propylaminopentane [(-)-BPAP], the most potent enhancer substance presently known - are peculiar stimulants. They enhance the impulse-propagation-mediated release of the catecholamines in the brain. Due to their enhancer effect, the amount of catecholamines released from selected discrete brain areas (striatum, substantia nigra, tuberculum olfactorium, locus coeruleus) is significantly higher in rats treated with an enhancer substance than in saline-treated rats. We compared the effect of (-)-deprenyl 0.025 and 0.25 mg/kg, (-)-PPAP 0.1 mg/kg, (-)-BPAP 0.0001 mg/kg, and alpha-tocopherol 25 and 50 mg/kg in this test. The doses of (-)-deprenyl and alpha-tocopherol were selected to be in compliance with the doses given in the DATATOP study. Compared to saline-treated rats, the enhancer substances significantly increased the amount of dopamine released from the striatum, substantia nigra and tuberculum olfactorium and the amount of norepinephrine released from the locus coeruleus; alpha-tocopherol was ineffective.
The results indicate that alpha-tocopherol was ineffective because, unlike (-)-deprenyl, it does not enhance

  1. Seismic proving test of BWR primary loop recirculation system

    International Nuclear Information System (INIS)

    Sato, H.; Shigeta, M.; Karasawa, Y.

    1987-01-01

    The seismic proving test of the BWR primary loop recirculation system is the second test to use the large-scale, high-performance vibration table of Tadotsu Engineering Laboratory. The purpose of this test is to prove the seismic reliability of the primary loop recirculation (PLR) system, one of the most important safety components in BWR nuclear plants, and also to confirm the adequacy of the seismic analysis method used in the current seismic design. To achieve this purpose, the test was conducted under conditions and at a scale as near as possible to those of actual systems. The strength proving test was carried out with the test model mounted on the vibration table, in consideration of basic design earthquake ground motions and other conditions, to confirm the soundness of the structure and its strength against earthquakes. Detailed analysis and evaluation of the data obtained from the test were conducted to confirm the adequacy of the seismic analysis method and the earthquake response analysis method used in the current seismic design. On the basis of the results obtained, the seismic safety and reliability of the BWR primary loop recirculation system of actual plants were fully evaluated.
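    The "earthquake response analysis" validated by such tests is, at its core, time-stepping integration of structural equations of motion. As a hedged, minimal stand-in (a single-degree-of-freedom oscillator integrated with the textbook Newmark average-acceleration scheme, nothing like NUPEC's detailed plant models):

    ```python
    import math

    def newmark_sdof(omega, zeta, ag, dt, u0=0.0, v0=0.0):
        """Displacement history of  u'' + 2*zeta*omega*u' + omega**2*u = -ag(t)
        (unit mass) under ground-acceleration samples ag, using the
        average-acceleration Newmark scheme (beta=1/4, gamma=1/2)."""
        beta, gamma = 0.25, 0.5
        c, k = 2 * zeta * omega, omega ** 2
        u, v = u0, v0
        a = -ag[0] - c * v - k * u                  # initial equilibrium
        keff = k + gamma * c / (beta * dt) + 1.0 / (beta * dt ** 2)
        us = [u]
        for n in range(1, len(ag)):
            p = (-ag[n]
                 + u / (beta * dt ** 2) + v / (beta * dt) + (0.5 / beta - 1) * a
                 + c * (gamma * u / (beta * dt) + (gamma / beta - 1) * v
                        + dt * (0.5 * gamma / beta - 1) * a))
            un = p / keff
            vn = (gamma / (beta * dt)) * (un - u) + (1 - gamma / beta) * v \
                 + dt * (1 - 0.5 * gamma / beta) * a
            an = (un - u) / (beta * dt ** 2) - v / (beta * dt) - (0.5 / beta - 1) * a
            u, v, a = un, vn, an
            us.append(u)
        return us

    # Free-vibration check: an undamped oscillator with a 1 s period
    # returns to its initial displacement after one period.
    trace = newmark_sdof(omega=2 * math.pi, zeta=0.0,
                         ag=[0.0] * 1001, dt=0.001, u0=1.0)
    ```

    Design-level analyses apply the same idea to large finite-element models with recorded or design-basis ground motions; the scheme above is only the scalar skeleton.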

  2. A Hybrid Approach to Proving Memory Reference Monotonicity

    KAUST Repository

    Oancea, Cosmin E.

    2013-01-01

    Array references indexed by non-linear expressions or subscript arrays represent a major obstacle to compiler analysis and to automatic parallelization. Most previously proposed solutions either enhance the static analysis repertoire to recognize more patterns, to infer array-value properties, and to refine the mathematical support, or apply expensive run-time analysis of memory reference traces to disambiguate these accesses. This paper presents an automated solution based on static construction of access summaries, in which the reference non-linearity problem can be solved for a large number of reference patterns by extracting arbitrarily-shaped predicates that can (in)validate the reference monotonicity property and thus (dis)prove loop independence. Experiments on six benchmarks show that our general technique for dynamic validation of the monotonicity property can cover a large class of codes, incurs minimal run-time overhead and obtains good speedups. © 2013 Springer-Verlag.
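    The monotonicity property at the heart of the paper can be illustrated in its simplest run-time flavor (the paper itself builds static access summaries and predicates, which subsume this check; the sketch below is ours):

    ```python
    def strictly_monotonic(idx):
        """True if the subscript sequence always increases or always decreases."""
        pairs = list(zip(idx, idx[1:]))
        return all(a < b for a, b in pairs) or all(a > b for a, b in pairs)

    def loop_independent(idx):
        # A strictly monotonic subscript array touches each array element at
        # most once, so the loop carries no dependence through that access
        # and its iterations can safely run in parallel.
        return strictly_monotonic(idx)
    ```

    For example, a loop writing `A[idx[i]]` with `idx = [2, 3, 7, 11]` is provably independent, while `idx = [2, 5, 5, 9]` is not, because two iterations write the same element.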

  3. Ineffective esophageal motility and the vagus: current challenges and future prospects

    Directory of Open Access Journals (Sweden)

    Chen JH

    2016-09-01

    Ji-Hong Chen (Department of Gastroenterology, Renmin Hospital, Wuhan University, Wuhan, People's Republic of China; Division of Gastroenterology, Department of Medicine, Farncombe Family Digestive Health Research Institute, McMaster University, Hamilton, ON, Canada). Abstract: Ineffective esophageal motility (IEM) is characterized by low to very low amplitude propulsive contractions in the distal esophagus, hence primarily affecting the smooth muscle part of the esophagus. IEM is often found in patients with dysphagia or heartburn and is commonly associated with gastroesophageal reflux disease. IEM is assumed to be associated with ineffective bolus transport; however, this can be verified using impedance measurements or evaluation of a barium-coated marshmallow swallow. Furthermore, water swallows may not accurately assess the motor capabilities of the esophagus, since contraction amplitude is strongly determined by the size and consistency of the bolus. The “peristaltic reserve” of the esophagus can be evaluated by multiple rapid swallows that, after a period of deglutitive inhibition, normally give a powerful peristaltic contraction suggestive of the integrity of neural orchestration and smooth muscle action. The amplitude of contraction is determined by a balance between intrinsic excitatory cholinergic, inhibitory nitrergic, and post-inhibition rebound excitatory output to the musculature. This is strongly influenced by vagal efferent motor neurons, which in turn are influenced by vagal afferent neurons that send bolus information to the solitary nucleus, where programmed activation of the vagal motor neurons to the smooth muscle esophagus is initiated. Solitary nucleus activity is influenced by sensory activity from a large number of organs and various areas of the brain, including the hypothalamus and the cerebral cortex. This allows interaction between swallowing activities and respiratory and cardiac activities and allows the

  4. Ineffective Degradation of Immunogenic Gluten Epitopes by Currently Available Digestive Enzyme Supplements

    Science.gov (United States)

    Janssen, George; Christis, Chantal; Kooy-Winkelaar, Yvonne; Edens, Luppo; Smith, Drew

    2015-01-01

    Background: Due to the high proline content of gluten molecules, gastrointestinal proteases are unable to fully degrade them, leaving large proline-rich gluten fragments intact, including an immunogenic 33-mer from α-gliadin and a 26-mer from γ-gliadin. These latter peptides can trigger pro-inflammatory T cell responses resulting in tissue remodeling, malnutrition and a variety of other complications. A strict lifelong gluten-free diet is currently the only available treatment to cope with gluten intolerance. Post-proline cutting enzymes have been shown to effectively degrade the immunogenic gluten peptides and have been proposed as oral supplements. Several existing digestive enzyme supplements also claim to aid in gluten degradation. Here we investigate the effectiveness of such existing enzyme supplements in comparison with a well-characterized post-proline cutting enzyme, Prolyl EndoPeptidase from Aspergillus niger (AN-PEP). Methods: Five commercially available digestive enzyme supplements along with purified digestive enzymes were subjected to 1) enzyme assays and 2) mass spectrometric identification. Gluten epitope degradation was monitored by 1) R5 ELISA, 2) mass spectrometric analysis of the degradation products and 3) T cell proliferation assays. Findings: The digestive enzyme supplements showed comparable proteolytic activities with near-neutral pH optima and modest gluten detoxification properties as determined by ELISA. Mass spectrometric analysis revealed the presence of many different enzymes including amylases and a variety of different proteases with aminopeptidase and carboxypeptidase activity. The enzyme supplements leave the nine immunogenic epitopes of the 26-mer and 33-mer gliadin fragments largely intact. In contrast, the pure enzyme AN-PEP effectively degraded all nine epitopes in the pH range of the stomach at a much lower dose. T cell proliferation assays confirmed the mass spectrometric data. Conclusion: Currently available digestive enzyme

  6. Perceptions of effective and ineffective nurse-physician communication in hospitals.

    Science.gov (United States)

    Robinson, F Patrick; Gorman, Geraldine; Slimmer, Lynda W; Yudkowsky, Rachel

    2010-01-01

    Nurse-physician communication affects patient safety. Such communication has been well studied using a variety of survey and observational methods; however, missing from the literature is an investigation of what constitutes effective and ineffective interprofessional communication from the perspective of the professionals involved. The purpose of this study was to explore nurse and physician perceptions of effective and ineffective communication between the two professions. Using focus group methodology, we asked nurses and physicians with at least 5 years' acute care hospital experience to reflect on effective and ineffective interprofessional communication and to provide examples. Three focus groups were held with 6 participants each (total sample 18). Sessions were audio recorded and transcribed verbatim. Transcripts were coded into categories of effective and ineffective communication. The following themes were found. For effective communication: clarity and precision of message that relies on verification, collaborative problem solving, calm and supportive demeanor under stress, maintenance of mutual respect, and authentic understanding of the unique role. For ineffective communication: making someone less than, dependence on electronic systems, and linguistic and cultural barriers. These themes may be useful in designing learning activities to promote effective interprofessional communication.

  7. Geophysics: Building E5476 decommissioning, Aberdeen Proving Ground

    International Nuclear Information System (INIS)

    Miller, S.F.; Thompson, M.D.; McGinnis, M.G.; McGinnis, L.D.

    1992-11-01

    Building E5476 was one of ten potentially contaminated sites in the Canal Creek and Westwood areas of the Edgewood section of Aberdeen Proving Ground examined by a geophysical team from Argonne National Laboratory in April and May of 1992. Noninvasive geophysical surveys, including magnetics, electrical resistivity, and ground-penetrating radar, were conducted around the perimeter of the building to guide a sampling program prior to decommissioning and dismantling. The large number of magnetic sources surrounding the building are believed to be contained in construction fill. The smaller anomalies, for the most part, were not imaged with ground radar or by electrical profiling. Large magnetic anomalies near the southwest corner of the building are due to aboveground standpipes and steel-reinforced concrete. Two high-resistivity areas, one projecting northeast from the building and another south of the original structure, may indicate the presence of organic pore fluids in the subsurface. A conductive lineament protruding from the south wall that is enclosed by the southern, high-resistivity feature is not associated with an equivalent magnetic anomaly. Magnetic and electrical anomalies south of the old landfill boundary are probably not associated with the building. The boundary is marked by a band of magnetic anomalies and a conductive zone trending northwest to southeast. The cause of high resistivities in a semicircular area in the southwest corner, within the landfill area, is unexplained.
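    For readers unfamiliar with electrical profiling: the measured quantity is an apparent resistivity derived from injected current and measured voltage. The report does not state which electrode array was used, so the textbook Wenner-array formula below is only an illustration:

    ```python
    import math

    def wenner_apparent_resistivity(a, delta_v, current):
        """Apparent resistivity (ohm-m) for a Wenner array with electrode
        spacing a (m), measured voltage delta_v (V) and injected current (A):
        rho_a = 2 * pi * a * dV / I."""
        return 2 * math.pi * a * delta_v / current

    # Hypothetical reading: 10 m spacing, 50 mV across the potential
    # electrodes at 100 mA injected current.
    rho = wenner_apparent_resistivity(a=10.0, delta_v=0.05, current=0.1)
    # -> ~31.4 ohm-m
    ```

    Anomalously high or low apparent resistivities along a profile are what the survey interprets as organic pore fluids, fill, or conductive lineaments.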

  8. National income inequality and ineffective health insurance in 35 low- and middle-income countries.

    Science.gov (United States)

    Alvarez, Francisco N; El-Sayed, Abdulrahman M

    2017-05-01

    Global health policy efforts to improve health and reduce the financial burden of disease in low- and middle-income countries (LMIC) have fuelled interest in expanding access to health insurance coverage to all, a movement known as Universal Health Coverage (UHC). Ineffective insurance is a measure of failure to achieve the intended outcomes of health insurance among those who nominally have insurance. This study aimed to evaluate the relation between national-level income inequality and the prevalence of ineffective insurance. We used Standardized World Income Inequality Database (SWIID) Gini coefficients for 35 LMICs and World Health Survey (WHS) data about insurance from 2002 to 2004 to fit multivariable regression models of the prevalence of ineffective insurance on national Gini coefficients, adjusting for GDP per capita. Greater inequality predicted a higher prevalence of ineffective insurance. When stratifying by individual-level covariates, higher inequality was associated with greater ineffective insurance among sub-groups traditionally considered more privileged: youth, men, higher education, urban residence and the wealthiest quintile. Stratifying by World Bank country income classification, higher inequality was associated with ineffective insurance among upper-middle-income countries but not low- or lower-middle-income countries. We hypothesize that these associations may be due to the imprint of underlying social inequalities as countries approach decreasing marginal returns on improved health insurance by income. Our findings suggest that beyond national income, income inequality may predict differences in the quality of insurance, with implications for efforts to achieve UHC. © The Author 2016. Published by Oxford University Press in association with The London School of Hygiene and Tropical Medicine. All rights reserved. For permissions, please e-mail: journals.permissions@oup.com.
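    The modelling step (prevalence of ineffective insurance regressed on the Gini coefficient, adjusting for GDP per capita) can be sketched in pure Python on toy data. All numbers below are invented for illustration, with the toy prevalences constructed to rise with Gini:

    ```python
    def ols(X, y):
        """Ordinary least squares via normal equations (pure Python):
        solves (X'X) beta = X'y by Gaussian elimination with pivoting."""
        n, k = len(X), len(X[0])
        A = []
        for i in range(k):                      # augmented normal equations
            row = [sum(X[r][i] * X[r][j] for r in range(n)) for j in range(k)]
            row.append(sum(X[r][i] * y[r] for r in range(n)))
            A.append(row)
        for col in range(k):                    # forward elimination
            piv = max(range(col, k), key=lambda r: abs(A[r][col]))
            A[col], A[piv] = A[piv], A[col]
            for r in range(col + 1, k):
                f = A[r][col] / A[col][col]
                for c in range(col, k + 1):
                    A[r][c] -= f * A[col][c]
        beta = [0.0] * k
        for i in reversed(range(k)):            # back substitution
            beta[i] = (A[i][k] - sum(A[i][j] * beta[j]
                                     for j in range(i + 1, k))) / A[i][i]
        return beta

    # Invented country-level data (not SWIID/WHS values).
    gini = [32.0, 38.0, 41.0, 45.0, 51.0, 55.0]
    lgdp = [7.2, 7.8, 8.1, 8.4, 8.9, 9.1]       # log GDP per capita
    prev = [1.0 + 0.5 * g for g in gini]        # prevalence rising with Gini
    X = [[1.0, g, l] for g, l in zip(gini, lgdp)]
    intercept, b_gini, b_gdp = ols(X, prev)     # b_gini > 0: inequality effect
    ```

    A positive, statistically significant `b_gini` after adjustment is the shape of the finding the study reports; the actual analysis of course uses real survey data and proper inference.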

  9. How the dual process model of human cognition can inform efforts to de-implement ineffective and harmful clinical practices: A preliminary model of unlearning and substitution.

    Science.gov (United States)

    Helfrich, Christian D; Rose, Adam J; Hartmann, Christine W; van Bodegom-Vos, Leti; Graham, Ian D; Wood, Suzanne J; Majerczyk, Barbara R; Good, Chester B; Pogach, Leonard M; Ball, Sherry L; Au, David H; Aron, David C

    2018-02-01

    One way to understand medical overuse at the clinician level is in terms of clinical decision-making processes that are normally adaptive but become maladaptive. In psychology, dual process models of cognition propose 2 decision-making processes. Reflective cognition is a conscious process of evaluating options based on some combination of utility, risk, capabilities, and/or social influences. Automatic cognition is a largely unconscious process occurring in response to environmental or emotive cues based on previously learned, ingrained heuristics. De-implementation strategies directed at clinicians may be conceptualized as corresponding to cognition: (1) a process of unlearning based on reflective cognition and (2) a process of substitution based on automatic cognition. We define unlearning as a process in which clinicians consciously change their knowledge, beliefs, and intentions about an ineffective practice and alter their behaviour accordingly. Unlearning has been described as "the questioning of established knowledge, habits, beliefs and assumptions as a prerequisite to identifying inappropriate or obsolete knowledge underpinning and/or embedded in existing practices and routines." We hypothesize that as an unintended consequence of unlearning strategies clinicians may experience "reactance," ie, feel their professional prerogative is being violated and, consequently, increase their commitment to the ineffective practice. We define substitution as replacing the ineffective practice with one or more alternatives. A substitute is a specific alternative action or decision that either precludes the ineffective practice or makes it less likely to occur. Both approaches may work independently, eg, a substitute could displace an ineffective practice without changing clinicians' knowledge, and unlearning could occur even if no alternative exists. For some clinical practice, unlearning and substitution strategies may be most effectively used together. 
By taking into

  10. Death Anxiety in Young Adulthood: Ineffective Ways of Coping with the Terror and the Dread.

    Science.gov (United States)

    Ballard, Mary B.; Halbrook, Bernadette M.

    1992-01-01

    Familiarizes counselors with role of death fear as primary source of anxiety for all individuals. Attempts to define death anxiety and demonstrate how defense mechanisms used to deny it can affect development in young adulthood. Provides three examples of maladaptive modes of behavior resulting from ineffective defense mechanisms (addiction,…

  11. Maladaptive Perfectionism and Ineffective Coping as Mediators between Attachment and Future Depression: A Prospective Analysis

    Science.gov (United States)

    Wei, Meifen; Heppner, P. Paul; Russell, Daniel W.; Young, Shannon K.

    2006-01-01

    This study used a longitudinal design to examine whether maladaptive perfectionism and ineffective coping served as 2 mediators of the relation between adult attachment and future depression. Data were collected from 372 undergraduates at 2 time points. Results indicated that (a) the impact of attachment on future depression was mediated through…

  12. How perfectionism and ineffectiveness influence growth of eating disorder risk in young adolescent girls.

    Science.gov (United States)

    Wade, Tracey D; Wilksch, Simon M; Paxton, Susan J; Byrne, Susan M; Austin, S Bryn

    2015-03-01

    While perfectionism is widely considered to influence risk for eating disorders, results of longitudinal studies are mixed. The goal of the current study was to investigate a more complex model of how baseline perfectionism (both high personal standards and self-critical evaluative concerns) might influence change in risk status for eating disorders in young adolescent girls, through its influence on ineffectiveness. The study was conducted with 926 girls (mean age of 13 years), and involved three waves of data (baseline, 6- and 12-month follow-up). Latent growth curve modelling, incorporating the average rate at which risk changed over time, the intercept (initial status) of ineffectiveness, and baseline perfectionism, was used to explore longitudinal mediation. Personal standards was not supported as contributing to risk but results indicated that the higher mean scores on ineffectiveness over the three waves mediated the relationship between higher baseline self-critical evaluative concerns and both measures of eating disorder risk. The relationship between concern over mistakes and change in risk was small and negative. These results suggest the usefulness of interventions related to self-criticism and ineffectiveness for decreasing risk for developing an eating disorder in young adolescent girls. Copyright © 2015 Elsevier Ltd. All rights reserved.

  13. Therapists' experiences and perceptions of teamwork in neurological rehabilitation: critical happenings in effective and ineffective teamwork.

    Science.gov (United States)

    Suddick, Kitty M; De Souza, Lorraine H

    2007-12-01

    This paper reports the second part of an exploratory study into occupational therapists' and physiotherapists' perceptions and experiences of teamwork in neurological rehabilitation: the factors that were thought to influence effective and ineffective teamwork, and the meaning behind effective and ineffective teamwork in neurological rehabilitation. The study was undertaken through semi-structured interviews of 10 therapists from three different neurological rehabilitation teams based in the United Kingdom, and used the critical incident technique. Through analysis of the data, several main themes emerged regarding the perceived critical happenings in effective and ineffective teamwork. These were: team events and characteristics, team members' characteristics, shared and collaborative working practices, communication, specific organizational structures, environmental, external, and patient- and family-related factors. Effective and ineffective teamwork was perceived to impact on a number of levels: having implications for the team, the patient, individual team members, and the neurological rehabilitation service. The study supported the perceived value of teamwork within neurological rehabilitation. It also indicated the extensive and variable factors that may influence the team-working process, as well as the complex and diverse nature of the process.

  14. Age-related inflammatory bone marrow microenvironment induces ineffective erythropoiesis mimicking del(5q) MDS.

    Science.gov (United States)

    Mei, Y; Zhao, B; Basiorka, A A; Yang, J; Cao, L; Zhang, J; List, A; Ji, P

    2018-04-01

    Anemia is characteristic of myelodysplastic syndromes (MDS). The mechanisms of anemia in MDS are unclear. Using a mouse genetic approach, here we show that dual deficiency of mDia1 and miR-146a, encoded on chromosome 5q and commonly deleted in MDS (del(5q) MDS), causes an age-related anemia and ineffective erythropoiesis mimicking human MDS. We demonstrate that the ageing bone marrow microenvironment is important for the development of ineffective erythropoiesis in these mice. Damage-associated molecular pattern molecules (DAMPs), whose levels increase in ageing bone marrow, induced TNFα and IL-6 upregulation in myeloid-derived suppressor cells (MDSCs) in mDia1/miR-146a double knockout mice. Mechanistically, we reveal that pathologic levels of TNFα and IL-6 inhibit erythroid colony formation and differentially affect terminal erythropoiesis through reactive oxygen species-induced caspase-3 activation and apoptosis. Treatment of the mDia1/miR-146a double knockout mice with all-trans retinoic acid, which promoted the differentiation of MDSCs and ameliorated the inflammatory bone marrow microenvironment, significantly rescued anemia and ineffective erythropoiesis. Our study underscores the dual roles of the ageing microenvironment and genetic abnormalities in the pathogenesis of ineffective erythropoiesis in del(5q) MDS.

  15. Geophysics: Building E5481 decommissioning, Aberdeen Proving Ground

    International Nuclear Information System (INIS)

    Thompson, M.D.; McGinnis, M.G.; McGinnis, L.D.; Miller, S.F.

    1992-11-01

    Building E5481 is one of ten potentially contaminated sites in the Canal Creek and Westwood areas of the Edgewood section of Aberdeen Proving Ground examined by a geophysical team from Argonne National Laboratory in April and May of 1992. Noninvasive geophysical surveys, including magnetics, electrical resistivity, and ground-penetrating radar, were conducted around the perimeter of the building to guide a sampling program prior to decommissioning and dismantling. The building is located on the northern margin of a landfill that was sited in a wetland. The many magnetic sources surrounding the building are believed to be contained in construction fill that had been used to raise the grade. The smaller anomalies, for the most part, are not imaged with ground radar or by electrical profiling. A conductive zone trending northwest to southeast across the site is spatially related to an old roadbed. Higher-resistivity areas in the northeast and east are probably representative of background values. Three high-amplitude, positive, rectangular magnetic anomalies have unknown sources. The features do not have equivalent electrical signatures, nor are they seen with radar imaging.

  16. Preservice Mathematics Teachers' Metaphorical Perceptions towards Proof and Proving

    Science.gov (United States)

    Ersen, Zeynep Bahar

    2016-01-01

    Since mathematical proof and proving are in the center of mathematics; preservice mathematics teachers' perceptions against these concepts have a great importance. Therefore, the study aimed to determine preservice mathematics teachers' perceptions towards proof and proving through metaphors. The participants consisted of 192 preservice…

  17. Proving termination of logic programs with delay declarations

    NARCIS (Netherlands)

    E. Marchiori; F. Teusink (Frank)

    1996-01-01

    textabstractIn this paper we propose a method for proving termination of logic programs with delay declarations. The method is based on the notion of recurrent logic program, which is used to prove programs terminating wrt an arbitrary selection rule. Most importantly, we use the notion of bound

  18. 20 CFR 219.23 - Evidence to prove death.

    Science.gov (United States)

    2010-04-01

    ... 20 Employees' Benefits 1 2010-04-01 2010-04-01 false Evidence to prove death. 219.23 Section 219... EVIDENCE REQUIRED FOR PAYMENT Evidence of Age and Death § 219.23 Evidence to prove death. (a) Preferred evidence of death. The best evidence of a person's death is— (1) A certified copy of or extract from the...

  19. A New Approach for Proving or Generating Combinatorial Identities

    Science.gov (United States)

    Gonzalez, Luis

    2010-01-01

    A new method for proving, in an immediate way, many combinatorial identities is presented. The method is based on a simple recursive combinatorial formula involving n + 1 arbitrary real parameters. Moreover, this formula enables one not only to prove, but also generate many different combinatorial identities (not being required to know them "a…

  20. Reasoning and Proving Opportunities in Textbooks: A Comparative Analysis

    Science.gov (United States)

    Hong, Dae S.; Choi, Kyong Mi

    2018-01-01

    In this study, we analyzed and compared reasoning and proving opportunities in geometry lessons from American standards-based textbooks and Korean textbooks to understand how these textbooks provide students opportunities to engage in reasoning and proving activities. Overall, around 40% of exercise problems in Core Plus Mathematics Project (CPMP)…

  1. The Earth is Flat, and I Can Prove It!

    Science.gov (United States)

    Klinger, Art

    1998-01-01

    Describes an educational program that asks students to attempt to prove that the earth is spherical and that it rotates. Presents tips to pique student interest and charts related to sensing the spin, nonrotation notions, flat earth fallacies, evidence that the earth is spherical and rotates, and the role of watersheds in proving that the earth…

  2. On the problem of proving the existence of ''charmed'' particles

    International Nuclear Information System (INIS)

    Tyapkin, A.A.

    1975-01-01

    In the search for ''charmed'' particles, a possible experiment is discussed in which one could observe a new particle and prove the necessity of introducing for this particle a new quantum number conserved in strong interactions.

  3. Automated Theorem Proving in High-Quality Software Design

    Science.gov (United States)

    Schumann, Johann; Swanson, Keith (Technical Monitor)

    2001-01-01

    The amount and complexity of software developed during the last few years has increased tremendously. In particular, programs are being used more and more in embedded systems (from car brakes to plant control). Many of these applications are safety-relevant, i.e. a malfunction of hardware or software can cause severe damage or loss. Tremendous risks are typically present in the areas of aviation, (nuclear) power plants or (chemical) plant control. Here, even small problems can lead to thousands of casualties and huge financial losses. Large financial risks also exist when computer systems are used in the areas of telecommunication (telephone, electronic commerce) or space exploration. Computer applications in these areas are not only subject to safety considerations; security issues are important as well. All these systems must be designed and developed to guarantee high quality with respect to safety and security. Even in an industrial setting which is (or at least should be) aware of the high requirements of software engineering, many incidents occur. For example, the Warsaw Airbus crash was caused by an incomplete requirements specification. Uncontrolled reuse of an Ariane 4 software module was the reason for the Ariane 5 disaster. Some recent incidents in the telecommunication area, such as the illegal "cloning" of smart-cards of D2 GSM handsets, or the extraction of (secret) passwords from German T-Online users, show that serious flaws can happen in this area as well. Due to the inherent complexity of computer systems, most authors claim that only a rigorous application of formal methods in all stages of the software life cycle can ensure high quality of the software and lead to truly safe and secure systems. In this paper, we examine to what extent automated theorem proving can contribute to a more widespread application of formal methods and their tools, and what automated theorem provers (ATPs) must provide in order to be useful.

  4. Models and Techniques for Proving Data Structure Lower Bounds

    DEFF Research Database (Denmark)

    Larsen, Kasper Green

    In this dissertation, we present a number of new techniques and tools for proving lower bounds on the operational time of data structures. These techniques provide new lines of attack for proving lower bounds in the cell probe model, the group model, the pointer machine model and the I/O-model. In all cases, we push the frontiers further by proving lower bounds higher than what could possibly be proved using previously known techniques. For the cell probe model, our results have the following consequences: the first Ω(lg n) query time lower bound for linear space static data structures … a bound of t_u·t_q = Ω(lg^(d-1) n). For ball range searching, we get a lower bound of t_u·t_q = Ω(n^(1-1/d)). The highest previous lower bound proved in the group model does not exceed Ω((lg n / lg lg n)^2) on the maximum of t_u and t_q. Finally, we present a new technique for proving lower bounds …

  5. La doble ineficacia de la tortura | The Double Ineffectiveness of Torture

    Directory of Open Access Journals (Sweden)

    Jesus Garcia Civico

    2016-12-01

    Full Text Available The purpose of this study is to reflect on the ineffectiveness of the right not to suffer torture, and also on the ineffectiveness of the (illegal) recourse to torture. In the first case, the different forms of ineffectiveness are analyzed: cases of torture, impunity, and the lack of effective investigations. At the same time, the current mechanisms for the effectiveness of this right are reviewed. In the second case, the latest reports on the failure of the recourse to torture in the so-called "fight against terror" are analyzed.

  6. Transcriptional Alterations of Virulence-Associated Genes in Extended Spectrum Beta-Lactamase (ESBL-Producing Uropathogenic Escherichia coli during Morphologic Transitions Induced by Ineffective Antibiotics

    Directory of Open Access Journals (Sweden)

    Isak Demirel

    2017-06-01

    Full Text Available It is known that an ineffective antibiotic treatment can induce morphological shifts in uropathogenic Escherichia coli (UPEC), but the virulence properties during these shifts remain to be studied. The present study examines changes in global gene expression patterns and in virulence factor-associated genes in an extended-spectrum beta-lactamase (ESBL)-producing UPEC (ESBL019) during the morphologic transitions induced by an ineffective antibiotic and in the presence of human primary bladder epithelial cells. Microarray results showed that the different morphological states of ESBL019 had significant transcriptional alterations of a large number of genes (Transition: 7%; Filamentation: 32%; Reverted: 19% of the entities on the array). All three morphological states of ESBL019 were associated with decreased energy metabolism, altered iron acquisition systems and altered adhesion expression. In addition, genes associated with LPS synthesis and bacterial motility were also altered in all morphological states. Furthermore, the transition state induced a significantly higher release of TNF-α from bladder epithelial cells compared to all other morphologies, while the reverted state was unable to induce TNF-α release. Our findings show that the morphological shifts induced by ineffective antibiotics are associated with significant transcriptional virulence alterations in ESBL-producing UPEC, which may affect survival and persistence in the urinary tract.

  7. Three Smoking Guns Prove Falsity of Green house Warming

    Science.gov (United States)

    Fong, P.

    2001-12-01

    Three observed facts: (1) cloud coverage increased 4.1% in 50 years; (2) precipitation increased 7.8% in 100 years; (3) the two rates are the same. Interpretation: (1) By the increased albedo of the clouds, heat dissipation is increased by 3.98 W/m2 by the time of 2×CO2, canceling out the greenhouse warming of 4 W/m2. Thus no global warming. (2) The precipitation increase shows the increased release of latent heat of vaporization, which turns out to be equal to that absorbed by the ocean due to increased evaporation under the greenhouse forcing. Thus all greenhouse heat is used up in evaporation, and the warming of the earth is zero. (3) The identity of the two rates double-checks the two independent proofs. Therefore, experimentally, no greenhouse warming is triply proved. A new branch of science, Pleistocene climatology, is developed to study the theoretical origin of no greenhouse warming. Climatology, like the mechanics of a large number of particles, is of course complex and unwieldy. If totally order-less, then there is no hope. However, if some regularity appears, then a systematic treatment can simplify the complexity. Rigid bodies are subject to a special simplifying condition (the distances between all particles are constant), and only 6 degrees of freedom are significant; all others are sidetracked. To study a spinning top there is no need to study the dynamics of every particle of the top by Newton's laws through a super-computer; it suffices to solve the Euler equations without a computer. In climate study, the use of a super-computer to study all degrees of freedom of the climate is as untenable as the study of the spinning top by super-computer. Yet in spite of the complexity there is strict regularity, as seen in the ice ages, which works as the simplifying condition to establish a new science, Pleistocene climatology. See my book Greenhouse Warming and Nuclear Hazards just published (www.PeterFongBook.com). This time the special condition is the presence of a

  8. Proving test on the reliability for nuclear valves

    International Nuclear Information System (INIS)

    Kajiyama, Yasuo; Tashiro, Hisao; Uga, Takeo; Maeda, Shunichi.

    1986-01-01

    Since valves are the most common components in nuclear power plants, they can also be the most frequent cause of trouble. This proving test therefore serves an important purpose: to examine and verify the reliability of various valves under simulated abnormal and transient operating conditions of a nuclear power plant. The test was performed mainly on valves of the various types and pressure ratings used in the primary and secondary systems of BWR and PWR nuclear power plants, and which had major operating or safety-related functions in those plants. The results of the proving test, confirmed over more than four years, showed relatively favourable performance of the tested valves. It is concluded that valve performance, including operability, seat sealing and structural integrity, was proved under thermal cycling, vibration and pipe reaction load conditions. Operating functions during and after an accident, such as a loss-of-coolant accident, were satisfactory. From these results, it is considered that the purpose of this proving test was satisfactorily fulfilled. The data accumulated by the test would be useful for achieving better reliability if evaluated together with the actual operating experience of valves in nuclear power plants. (Nogami, K.)

  9. Responsibility for proving and defining in abstract algebra class

    Science.gov (United States)

    Fukawa-Connelly, Timothy

    2016-07-01

    There is considerable variety in inquiry-oriented instruction, but what is common is that students assume roles in mathematical activity that in a traditional, lecture-based class are either assumed by the teacher (or text) or are not visible at all in traditional math classrooms. This paper is a case study of the teaching of an inquiry-based undergraduate abstract algebra course. In particular, it gives a theoretical account of the defining and proving processes. The study examines the intellectual responsibility for the processes of defining and proving that the professor devolved to the students. While the professor wanted the students to engage in all aspects of defining and proving, he was only successful at devolving responsibility for certain aspects, and much more successful at devolving responsibility for proving than for conjecturing or defining. This study suggests that even a well-intentioned instructor may not be able to devolve responsibility to students for some aspects of mathematical practice without using a research-based curriculum or further professional development.

  10. Overcoming the Obstacle of Poor Knowledge in Proving Geometry Tasks

    Directory of Open Access Journals (Sweden)

    Zlatan Magajna

    2013-12-01

    Full Text Available Proving in school geometry is not just about validating the truth of a claim. In the school setting, the main function of the proof is to convince someone that a claim is true by providing an explanation. Students consider proving to be difficult; in fact, they find the very concept of proof demanding. Proving a claim in planar geometry involves several processes, the most salient being visual observation and deductive argumentation. These two processes are interwoven, but often poor observation hinders deductive argumentation. In the present article, we consider the possibility of overcoming the obstacle of a student’s poor observation by making use of computer-aided observation with appropriate software. We present the results of two small-scale research projects, both of which indicate that students are able to work out considerably more deductions if computer-aided observation is used. Not all students use computer-aided observation effectively in proving tasks: some find an exhaustive computer-provided list of properties confusing and are not able to choose the properties that are relevant to the task.

  11. Pengembangan Perangkat Pembelajaran Geometri Ruang dengan Model Proving Theorem

    Directory of Open Access Journals (Sweden)

    Bambang Eko Susilo

    2016-03-01

    Full Text Available Students' critical and creative thinking abilities are still weak. This was found among students taking the Space Geometry course, specifically in solving proof problems (problem to proof): students still work algorithmically or procedurally, so the development of competency- and conservation-based Space Geometry learning tools with the Proving Theorem model was required. This is a development study following the 4-D model, modified for the Space Geometry learning tools of the second semester of the 2014/2015 academic year. The instruments used include validation sheets, the learning tools, and a character assessment questionnaire. The course materials developed, namely the Syllabus, Lesson Plans (SAP), Lecture Contract, Learning Media, Teaching Materials, midterm and final tests, and a Conservation Character Questionnaire, were properly implemented, with the criteria that (1) the validation of the competency- and conservation-based Space Geometry learning tools with the Proving Theorem model was categorized as good and feasible to use, and (2) the implementation of the lesson plans in the developed learning was, overall, categorized as good.

  12. Clinical indicators to monitor patients with risk for ineffective cerebral tissue perfusion

    Directory of Open Access Journals (Sweden)

    Miriam de Abreu Almeida

    2015-04-01

    Full Text Available Objective. To select and validate clinical indicators to monitor patients at risk for ineffective cerebral tissue perfusion, according to the Nursing Outcomes Classification (NOC). Methodology. Validation study carried out between November 2012 and August 2013 in a Brazilian hospital. Seventeen nurse judges evaluated the clinical indicators of the nursing outcomes, according to the NOC, for patients at risk for ineffective cerebral tissue perfusion. In the first stage, the nursing outcomes for the assessment of the studied diagnosis were selected; in the second, the nurses assessed the importance of the indicators of the outcomes validated in the previous stage, using a five-point Likert scale (1 = not important to 5 = extremely important). A content validity index (CVI) was used, corresponding to the weighted average of the marks awarded to each indicator, with the following weights: 1 = 0.00; 2 = 0.25; 3 = 0.50; 4 = 0.75; 5 = 1.00. For categorization, a CVI of ≥0.80 was considered critical and ≥0.50 to 0.79 supplementary; indicators with a CVI <0.50 were discarded. Results. Of the 9 nursing outcomes, only Cerebral tissue perfusion obtained 100% consensus. The CVI of the 18 indicators of this outcome showed that five were validated as critical (impaired neurological reflexes, systolic blood pressure, diastolic blood pressure, reduced level of consciousness and mean arterial pressure), 12 were validated as supplementary (agitation, impaired cognition, intracranial pressure, syncope, vomiting, findings of cerebral angiography, headache, restlessness, fever, unexplained anxiety, listlessness and hiccoughs) and one was discarded (carotid bruit). Conclusions. The validation of information about risk conditions may allow early intervention to minimize the consequences of ineffective cerebral tissue perfusion.
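    The CVI computation described in the methodology is simple enough to state directly. A minimal sketch, assuming the weights and cut-offs reported in the abstract (function names and example ratings are illustrative, not from the study):

```python
def cvi(ratings):
    """Content validity index: weighted average of 5-point Likert marks,
    using weights 1 -> 0.00, 2 -> 0.25, 3 -> 0.50, 4 -> 0.75, 5 -> 1.00."""
    weights = {1: 0.00, 2: 0.25, 3: 0.50, 4: 0.75, 5: 1.00}
    return sum(weights[r] for r in ratings) / len(ratings)

def categorize(score):
    """Apply the cut-offs: >= 0.80 critical, 0.50-0.79 supplementary,
    otherwise discarded."""
    if score >= 0.80:
        return "critical"
    if score >= 0.50:
        return "supplementary"
    return "discarded"
```

For example, four judges rating an indicator 5, 5, 4, 4 give a CVI of (1.00 + 1.00 + 0.75 + 0.75) / 4 = 0.875, which falls in the critical band.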

  13. Improving healthcare practice behaviors: an exploratory study identifying effective and ineffective behaviors in healthcare.

    Science.gov (United States)

    Van Fleet, David D; Peterson, Tim O

    2016-01-01

    The purpose of this paper is to present the results of exploratory research designed to develop an awareness of healthcare behaviors, with a view toward improving customer satisfaction with healthcare services. It examines the relationship between healthcare providers and their consumers/patients/clients. The study uses a critical incident methodology, with both effective and ineffective behavioral specimens examined across different provider groups. The effects of these different behaviors on what Berry (1999) identified as the common core values of service organizations are examined, as those values are required to build a lasting service relationship. Also examined are categories of healthcare practice based on the National Quality Strategy priorities. The most obvious limitation is the retrospective nature of the method used. How accurate are patient or consumer memories? Are they capable of making valid judgments of healthcare experiences (Berry and Bendapudi, 2003)? While an obvious limitation, such recollections are clearly important, as they may be paramount in following the healthcare practitioners' instructions, loyalty for repeat business, making recommendations to others and the like. Further, studies have shown retrospective reports to be accurate and useful (Miller et al., 1997). With this information, healthcare educators should be in a better position to improve the training offered in their programs, and practitioners to better serve their customers. The findings indicate that the human values of excellence, innovation, joy, respect and integrity play a significant role in building a strong service relationship between consumer and healthcare provider. Berry (1999) has argued that the overriding importance in building a lasting service business is human values. This exploratory study has shown how critical incident analysis can be used to determine both effective and ineffective practices of different medical providers.
It also provides guidelines as

  14. Formalizing and proving a typing result for security protocols in Isabelle/HOL

    DEFF Research Database (Denmark)

    Hess, Andreas Viktor; Modersheim, Sebastian

    2017-01-01

    or the positive output of a verification tool. However, several of these works have used a typed model, where the intruder is restricted to "well-typed" attacks. There have also been several works showing that this is actually not a restriction for a large class of protocols, but all these results so far are again pen-and-paper proofs. In this work we present a formalization of such a typing result in Isabelle/HOL. We formalize a constraint-based approach that is used in the proof argument of such typing results, and prove its soundness, completeness and termination. We then formalize and prove the typing result itself in Isabelle. Finally, to illustrate the real-world feasibility, we prove that the standard Transport Layer Security (TLS) handshake satisfies the main condition of the typing result.

  15. Unexploded ordnance issues at Aberdeen Proving Ground: Background information

    Energy Technology Data Exchange (ETDEWEB)

    Rosenblatt, D.H.

    1996-11-01

    This document summarizes currently available information about the presence and significance of unexploded ordnance (UXO) in the two main areas of Aberdeen Proving Ground: Aberdeen Area and Edgewood Area. Known UXO in the land ranges of the Aberdeen Area consists entirely of conventional munitions. The Edgewood Area contains, in addition to conventional munitions, a significant quantity of chemical-munition UXO, which is reflected in the presence of chemical agent decomposition products in Edgewood Area ground-water samples. It may be concluded from current information that the UXO at Aberdeen Proving Ground has not adversely affected the environment through release of toxic substances to the public domain, especially not by water pathways, and is not likely to do so in the near future. Nevertheless, modest but periodic monitoring of groundwater and nearby surface waters would be a prudent policy.

  16. Unicorns do exist: a tutorial on "proving" the null hypothesis.

    Science.gov (United States)

    Streiner, David L

    2003-12-01

    Introductory statistics classes teach us that we can never prove the null hypothesis; all we can do is reject or fail to reject it. However, there are times when it is necessary to try to prove the nonexistence of a difference between groups. This most often happens within the context of comparing a new treatment against an established one and showing that the new intervention is not inferior to the standard. This article first outlines the logic of "noninferiority" testing by differentiating between the null hypothesis (that which we are trying to nullify) and the "nil" hypothesis (there is no difference), reversing the role of the null and alternate hypotheses, and defining an interval within which groups are said to be equivalent. We then work through an example and show how to calculate sample sizes for noninferiority studies.
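    The noninferiority logic outlined above can be illustrated with a one-sided z-test on two proportions: the new treatment is declared non-inferior when the standard's advantage can be shown to be smaller than a pre-specified margin. This is a hedged sketch of that logic (my own illustration with made-up numbers, not the article's worked example):

```python
import math

def noninferiority_z(p_new, p_std, n_new, n_std, margin):
    """One-sided noninferiority z-test for two success proportions.

    H0: p_std - p_new >= margin   (new treatment is unacceptably worse)
    H1: p_std - p_new <  margin   (new treatment is non-inferior)
    Returns z; z > 1.645 rejects H0 at the one-sided 0.05 level.
    """
    diff = p_std - p_new
    se = math.sqrt(p_new * (1 - p_new) / n_new + p_std * (1 - p_std) / n_std)
    return (margin - diff) / se

# Hypothetical trial: 78% vs. 80% success, 200 patients per arm, 10-point margin.
z = noninferiority_z(p_new=0.78, p_std=0.80, n_new=200, n_std=200, margin=0.10)
```

Here z ≈ 1.96 > 1.645, so noninferiority would be concluded at the one-sided 0.05 level even though the new treatment's observed success rate is slightly lower.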

  17. Logic for computer science foundations of automatic theorem proving

    CERN Document Server

    Gallier, Jean H

    2015-01-01

    This advanced text for undergraduate and graduate students introduces mathematical logic with an emphasis on proof theory and procedures for algorithmic construction of formal proofs. The self-contained treatment is also useful for computer scientists and mathematically inclined readers interested in the formalization of proofs and basics of automatic theorem proving. Topics include propositional logic and its resolution, first-order logic, Gentzen's cut elimination theorem and applications, and Gentzen's sharpened Hauptsatz and Herbrand's theorem. Additional subjects include resolution in fir

  18. IHSI [Induction Heating Stress Improvement] proves its worth

    International Nuclear Information System (INIS)

    Froehlich, C.H.; Cofie, N.G.; Sheffield, J.R.

    1988-01-01

    Based upon the wealth of experimental test data, extensive and successful in-plant application, and the decreasing cost of applying the process, IHSI is proving itself an important part of overall IGSCC mitigation programmes. Work is ongoing on the development of new temperature sensing systems, more efficient equipment immobilization/demobilization hardware configurations, and craft support management practices to further enhance the cost-effectiveness of IHSI. (author)

  19. Renewable Energy Opportunities at Dugway Proving Ground, Utah

    Energy Technology Data Exchange (ETDEWEB)

    Orrell, Alice C.; Kora, Angela R.; Russo, Bryan J.; Horner, Jacob A.; Williamson, Jennifer L.; Weimar, Mark R.; Gorrissen, Willy J.; Nesse, Ronald J.; Dixon, Douglas R.

    2010-05-31

    This document provides an overview of renewable resource potential at Dugway Proving Ground, based primarily upon analysis of secondary data sources supplemented with limited on-site evaluations. This effort focuses on grid-connected generation of electricity from renewable energy sources and ground source heat pumps (GSHPs). The effort was funded by the U.S. Army Installation Management Command (IMCOM) as follow-on to the 2005 Department of Defense (DoD) Renewables Assessment.

  20. Effective or ineffective: attribute framing and the human papillomavirus (HPV) vaccine.

    Science.gov (United States)

    Bigman, Cabral A; Cappella, Joseph N; Hornik, Robert C

    2010-12-01

    To experimentally test whether presenting logically equivalent, but differently valenced effectiveness information (i.e. attribute framing) affects perceived effectiveness of the human papillomavirus (HPV) vaccine, vaccine-related intentions and policy opinions. A survey-based experiment (N=334) was fielded in August and September 2007 as part of a larger ongoing web-enabled monthly survey, the Annenberg National Health Communication Survey. Participants were randomly assigned to read a short passage about the HPV vaccine that framed vaccine effectiveness information in one of five ways. Afterward, they rated the vaccine and related opinion questions. Main statistical methods included ANOVA and t-tests. On average, respondents exposed to positive framing (70% effective) rated the HPV vaccine as more effective and were more supportive of vaccine mandate policy than those exposed to the negative frame (30% ineffective) or the control frame. Mixed valence frames showed some evidence for order effects; phrasing that ended by emphasizing vaccine ineffectiveness showed similar vaccine ratings to the negative frame. The experiment finds that logically equivalent information about vaccine effectiveness not only influences perceived effectiveness, but can in some cases influence support for policies mandating vaccine use. These framing effects should be considered when designing messages. Copyright © 2010 Elsevier Ireland Ltd. All rights reserved.

  2. [Ineffective sexuality pattern in an adolescent: nursing approach in primary health care].

    Science.gov (United States)

    Martín-García, Angel; Oter-Quintana, Cristina; Brito-Brito, Pedro Ruymán; Martín-Iglesias, Susana; Alcolea-Cosín, M Teresa

    2013-01-01

    Adolescence is a phase of continual physiological, psychological, and social adaptation, during which young people tend to have their first sexual experiences. Sexual dysfunctions are characterized by significant clinical changes in sexual desire and/or by psycho-physiological changes in the sexual response cycle. Premature ejaculation is one of the most frequent sexual dysfunctions among men, with a higher prevalence in the younger population. The clinical case is presented of a 17-year-old male who experienced difficulties during his sexual relations; whether his condition constituted a sexual dysfunction or an ineffective sexuality pattern is discussed. The care plan developed in the nursing consultation for ineffective sexuality pattern is described; the nursing treatment incorporated activities recommended by the scientific evidence. Finally, the role of primary health care nursing professionals in the detection and management of sexual problems in adolescents is highlighted. Copyright © 2013 Elsevier España, S.L. All rights reserved.

  3. A Closer look on Ineffectiveness in Riau Mainland Expenditure: Local Government Budget Case

    Science.gov (United States)

    Yandra, Alexsander; Roserdevi Nasution, Sri; Harsini; Wardi, Jeni

    2018-05-01

    This study discusses the ineffectiveness of expenditure by one Indonesian local government, the province of Riau, whose Local Government Budget (APBD) for 2015 amounted to Rp 10.7 trillion. According to data from the Financial Management and Regional Assets Board (BPKAD), realization of the 2015 Riau APBD stood at approximately 37.58% as of October 2015; data from the Ministry of Home Affairs show that budget realization from January to December 2015 reached only 59.6%, the lowest in Indonesia. These percentages reflect poor budget implementation and indicate that the Riau government was less than optimal in spending its 2015 budget. Viewed through a theoretical approach to government spending, the implementation of public policy shows an ineffective budget with implications for regional development, the regional budget remaining little more than a draft of targets to be achieved. Budget management in 2015 by the provincial administration through the Local Government Units (SKPD) also shows a lack of synchronization between the Medium Term Development Plan and the SKPD work programs.

  4. Ineffective participation: reactions to absentee and incompetent nurse leadership in an intensive care unit.

    Science.gov (United States)

    Rouse, Ruby A

    2009-05-01

    The aim of the present study was to analyse reactions to ineffective leader participation in an intensive care unit (ICU). Critical examination of leadership failures helps identify nurse manager behaviours to avoid. An online survey collected data from 51 interacting healthcare providers who work in an intensive care unit. Participants reported dissatisfaction with nurse leaders who were perceived as absent or ill prepared. Participants categorized intensive care unit productivity and morale as moderate to low. Multiple regression suggested the best predictor of perceived unit productivity was supervisor communication; the best predictor of employee morale was perceived leader mentoring. Intensive care unit nurses reported wanting active participation from their leaders and expressed dissatisfaction when supervisors were perceived as absent or incompetent. Ineffective leader participation significantly correlated with lower employee perceptions of productivity and morale. Senior managers should recruit and develop supervisors with effective participation skills. Organizations primarily concerned about productivity should focus on developing the communication skills of nurse leaders. Units mainly concerned with employee morale should emphasize mentorship and role modelling. Formal assessment of nurse leaders by all intensive care unit team members should also be used to proactively identify opportunities for improvement.
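The regression finding above (supervisor communication as the best predictor of perceived productivity) comes down to a least-squares fit. A minimal single-predictor sketch, using invented ratings rather than the survey's data:

```python
def ols_slope_intercept(x, y):
    """Ordinary least-squares fit y ~ a + b*x for a single predictor."""
    n = len(x)
    mx = sum(x) / n
    my = sum(y) / n
    # Slope = covariance(x, y) / variance(x); intercept from the means.
    b = sum((xi - mx) * (yi - my) for xi, yi in zip(x, y)) / sum((xi - mx) ** 2 for xi in x)
    a = my - b * mx
    return a, b

# Hypothetical scores (1-10): supervisor communication vs. perceived unit
# productivity; illustrative only, not the study's data.
communication = [2, 4, 5, 6, 8, 9]
productivity  = [3, 4, 6, 6, 8, 9]
a, b = ols_slope_intercept(communication, productivity)  # b > 0: better communication, higher perceived productivity
```

The actual study used multiple regression (several predictors at once); the single-predictor version above only illustrates the underlying computation.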

  5. Technical report on the Piping Reliability Proving Tests at the Japan Atomic Energy Research Institute

    International Nuclear Information System (INIS)

    1993-05-01

    The Japan Atomic Energy Research Institute (JAERI) conducted Piping Reliability Proving Tests from 1975 to 1992 under contracts between JAERI and the Science and Technology Agency of Japan (STA), under the auspices of the special account law for electric power development promotion. The purpose of these tests was to prove the structural reliability of the primary cooling piping constituting part of the pressure boundary in light water reactor power plants. The tests with large experimental facilities ended in 1990; at present, piping reliability analysis by the probabilistic fracture mechanics method is being carried out. Until now, annual reports concerning the proving tests were produced and submitted to STA; this report summarizes the test results obtained during these 16 years. The objectives of the piping reliability proving tests were to prove that the primary piping of the light water reactor (1) is reliable throughout the service period, (2) has no possibility of rupture, and (3) brings no detrimental influence on the surrounding instrumentation or equipment near the break location even if it ruptured suddenly. To attain these objectives, (i) pipe fatigue tests, (ii) unstable pipe fracture tests, and (iii) pipe rupture tests were performed, together with analyses by computer codes. These tests verified that the piping is reliable throughout the service period. The authors of this report are T. Isozaki, K. Shibata, S. Ueda, R. Kurihara, K. Onizawa and A. Kohsaka; the parts they wrote are shown in the contents. (author)

  6. Preparing for Mars: The Evolvable Mars Campaign 'Proving Ground' Approach

    Science.gov (United States)

    Bobskill, Marianne R.; Lupisella, Mark L.; Mueller, Rob P.; Sibille, Laurent; Vangen, Scott; Williams-Byrd, Julie

    2015-01-01

    As the National Aeronautics and Space Administration (NASA) prepares to extend human presence beyond Low Earth Orbit, we are in the early stages of planning missions within the framework of an Evolvable Mars Campaign. Initial missions would be conducted in near-Earth cis-lunar space and would eventually culminate in extended duration crewed missions on the surface of Mars. To enable such exploration missions, critical technologies and capabilities must be identified, developed, and tested. NASA has followed a principled approach to identify critical capabilities and a "Proving Ground" approach is emerging to address testing needs. The Proving Ground is a period subsequent to current International Space Station activities wherein exploration-enabling capabilities and technologies are developed and the foundation is laid for sustained human presence in space. The Proving Ground domain essentially includes missions beyond Low Earth Orbit that will provide increasing mission capability while reducing technical risks. Proving Ground missions also provide valuable experience with deep space operations and support the transition from "Earth-dependence" to "Earth-independence" required for sustainable space exploration. A Technology Development Assessment Team identified a suite of critical technologies needed to support the cadence of exploration missions. Discussions among mission planners, vehicle developers, subject-matter-experts, and technologists were used to identify a minimum but sufficient set of required technologies and capabilities. Within System Maturation Teams, known challenges were identified and expressed as specific performance gaps in critical capabilities, which were then refined and activities required to close these critical gaps were identified. Analysis was performed to identify test and demonstration opportunities for critical technical capabilities across the Proving Ground spectrum of missions. 
This suite of critical capabilities is expected to

  7. On proving confluence modulo equivalence for Constraint Handling Rules

    DEFF Research Database (Denmark)

    Christiansen, Henning; Kirkeby, Maja Hanne

    2017-01-01

    Previous results on proving confluence for Constraint Handling Rules are extended in two ways in order to allow a larger and more realistic class of CHR programs to be considered confluent. Firstly, we introduce the relaxed notion of confluence modulo equivalence into the context of CHR: while […] non-logical built-in predicates such as var/1 and incomplete ones such as is/2, which are ignored in previous work on confluence. To this end, a new operational semantics for CHR is developed which includes such predicates. In addition, this semantics differs from earlier approaches by its simplicity without loss […]

  8. Proving Test on the Reliability for Reactor Containment Vessel

    International Nuclear Information System (INIS)

    Takumi, K.; Nonaka, A.

    1988-01-01

    NUPEC (Nuclear Power Engineering Test Center) started an eight-year project, the Proving Test on the Reliability for Reactor Containment Vessel, in June 1987. The objective of this project is to confirm the integrity of containment vessels under severe accident conditions. This paper presents the outline of the project. The test items are (1) a hydrogen mixing and distribution test, (2) a hydrogen burning test, (3) an iodine trapping characteristics test, and (4) a structural behavior test. Computer codes are verified against the test results, and containment integrity is to be confirmed through analysis and evaluation with these codes.

  9. Case report 486: Spondyloepiphyseal dysplasia tarda (SDT) (presumptively proved)

    International Nuclear Information System (INIS)

    Brown, D.D.; Childress, M.H.

    1988-01-01

    A 51-year-old man with severe degenerative joint disease, short stature, barrel chest deformity, platyspondyly, a narrow pelvis, small iliac bones, dysplastic femoral heads and necks, notching of the patellae, and flattening of the femoral intercondylar notches is described as an example of spondyloepiphyseal dysplasia tarda (SDT). The entity is discussed in detail. The notching of the patellae has not, to the authors' knowledge, been reported in association with SDT. Characteristic features of SDT allow it to be differentiated from other arthropathies and dysplasias, and these distinctions are emphasized in the discussion. The diagnosis in this case can only be considered presumptively proved. (orig./MG)

  10. A case of residual inferior sinus venosus defect after ineffective surgical closure.

    Science.gov (United States)

    Uga, Sayuri; Hidaka, Takayuki; Takasaki, Taiichi; Kihara, Yasuki

    2014-10-03

    A 38-year-old woman presented with cyanosis and heart failure 34 years after patch closure of an atrial septal defect and partial anomalous pulmonary venous connection. CT and cardiac catheterisation showed a residual defect that caused right-to-left shunting. The patch almost completely blocked the inferior vena cava from the right atrium, resulting in uncommon drainage of the inferior vena cava into the left atrium. Other anomalies included a coronary-to-pulmonary artery fistula and a duplicate inferior vena cava with a dilated azygos venous system. A second surgery was performed, and we confirmed an inferior sinus venosus defect, which is rare and can be misdiagnosed. The ineffective patch closure had caused a haemodynamic status that rarely occurs. We describe the diagnostic process and emphasise the importance of correctly understanding this entity. © 2014 BMJ Publishing Group Ltd.

  11. Non formal mechanisms for public water allocation and the ineffectiveness of law in arid western Argentina

    Directory of Open Access Journals (Sweden)

    Liber Martin

    2015-04-01

    Full Text Available This work analyzed the informal mechanisms of public water allocation and reallocation in western Argentina from a holistic conception of law. The paper examines informal uses, their logical but ineffective repression, and the continuous regularization processes, using a non-experimental observational method based on qualitative strategies. The research focused on the operation of water allocation mechanisms and management practices developed in the absence of law and against the law, at both the delivery and regulatory levels. The findings highlight the tensions and contradictions of these mechanisms under the formal legal system, demonstrating the crisis of both effectiveness and legitimacy of the law and the State in managing public waters.

  12. An Archeological Overview and Management Plan for the Dugway Proving Ground.

    Science.gov (United States)

    1984-03-29

    niches, particularly the lacustrine environment of the Great Basin (Baumhoff and Heizer 1965, Butler 1978, Heizer and Krieger 1956, Heizer and Harper...power that would capture the animals' souls, rendering them docile and stupid (Steward 1970:34). Other large game was present, but was not numerous...university. Baum, Bernard. 1947. Dugway Proving Ground. Aberdeen: U.S. Army Chemical Corps. Baumhoff, W.A. and R.F. Heizer. 1965. Postglacial

  13. Formal Analysis of Soft Errors using Theorem Proving

    Directory of Open Access Journals (Sweden)

    Sofiène Tahar

    2013-07-01

    Full Text Available Modeling and analysis of soft errors in electronic circuits has traditionally been done using computer simulations. Computer simulations cannot guarantee correctness of analysis because they utilize approximate real number representations and pseudo-random numbers, and thus are not well suited for analyzing safety-critical applications. In this paper, we present a higher-order logic theorem proving based method for modeling and analysis of soft errors in electronic circuits. Our developed infrastructure includes formalized continuous random variable pairs, their Cumulative Distribution Function (CDF) properties, and independent standard uniform and Gaussian random variables. We illustrate the usefulness of our approach by modeling and analyzing soft errors in commonly used dynamic random access memory sense amplifier circuits.
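The CDF properties formalized in the paper (monotonicity and the limiting values 0 and 1) can be spot-checked numerically. The sketch below uses Python's error function for the Gaussian CDF; it is an illustration of the properties being formalized, not the authors' HOL development:

```python
import math

def gaussian_cdf(x, mu=0.0, sigma=1.0):
    """CDF of a Gaussian random variable, expressed via the error function."""
    return 0.5 * (1.0 + math.erf((x - mu) / (sigma * math.sqrt(2.0))))

# Spot-check the CDF properties at a few points: the function should be
# monotone non-decreasing and approach 0 / 1 at large negative / positive x.
xs = [-8, -2, -1, 0, 1, 2, 8]
vals = [gaussian_cdf(x) for x in xs]
assert all(a <= b for a, b in zip(vals, vals[1:]))   # monotonicity
assert vals[0] < 1e-6 and vals[-1] > 1 - 1e-6        # tails near 0 and 1
assert abs(gaussian_cdf(0.0) - 0.5) < 1e-12          # symmetry at the mean
```

Such numerical checks are exactly what the theorem-proving approach replaces: the HOL formalization proves these properties for all reals rather than sampling a handful of points.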

  14. Evaluation of depleted uranium in the environment at Aberdeen Proving Grounds, Maryland and Yuma Proving Grounds, Arizona. Final report

    International Nuclear Information System (INIS)

    Kennedy, P.L.; Clements, W.H.; Myers, O.B.; Bestgen, H.T.; Jenkins, D.G.

    1995-01-01

    This report presents an evaluation of depleted uranium (DU) introduced into the environment at the Aberdeen Proving Grounds (APG), Maryland and the Yuma Proving Grounds (YPG), Arizona. It was a cooperative project between the Environmental Sciences and Statistical Analyses Groups at LANL and the Department of Fishery and Wildlife Biology at Colorado State University, and represents a unique approach to assessing the environmental impact of DU in two dissimilar ecosystems. Ecological exposure models were created for each ecosystem, and sensitivity/uncertainty analyses were conducted to identify the exposure pathways most influential in the fate and transport of DU in the environment. Research included field sampling, field exposure experiments, and laboratory experiments. The first section addresses DU at the APG site. Chapter topics include a bioenergetics-based food web model; field exposure experiments; bioconcentration by phytoplankton and the toxicity of U to zooplankton; physical processes governing the desorption of uranium from sediment to water; transfer of uranium from sediment to benthic invertebrates; speed of adsorption by benthic invertebrates; and uptake of uranium by fish. The final section addresses DU at the YPG site, with chapters covering the DU transport processes and pathway model; field studies of the performance of the exposure model; uptake and elimination rates for kangaroo rats; and chemical toxicity in kangaroo rat kidneys.

  16. Transplacental transfer of macromolecules: proving the efficiency of ...

    African Journals Online (AJOL)

    Background: Smaller substances (below 1000 Da) are able to cross the placenta, whereas those above 1000 Da may not. This may not be consistent, because maternal measles antibodies (MMA), which are large immunoglobulin G molecules with a molecular weight of 150,000 Da, can cross the placenta in mother-infant pairs.

  17. The GOES-R Proving Ground: 2012 Update

    Science.gov (United States)

    Gurka, J.; Goodman, S. J.; Schmit, T.; Demaria, M.; Mostek, A.; Siewert, C.; Reed, B.

    2011-12-01

    The Geostationary Operational Environmental Satellite (GOES)-R will provide a great leap forward in observing capabilities, but will also offer a significant challenge to ensure that users are ready to exploit the vast improvements in spatial, spectral, and temporal resolutions. To ensure user readiness, forecasters and other users must have access to prototype advanced products well before launch, and have the opportunity to provide feedback to product developers and computing and communications managers. The operational assessment is critical to ensure that the end products and NOAA's computing and communications systems truly meet their needs in a rapidly evolving environment. The GOES-R Proving Ground (PG) engages the National Weather Service (NWS) forecast, watch and warning community and other agency users in pre-operational demonstrations of select products with GOES-R attributes (enhanced spectral, spatial, and temporal resolution). In the PG, developers and forecasters test and apply algorithms for new GOES-R satellite data and products using proxy and simulated data sets, including observations from current and future satellite instruments (MODIS, AIRS, IASI, SEVIRI, NAST-I, NPP/VIIRS/CrIS, LIS), lightning networks, and computer simulated products. The complete list of products to be evaluated in 2012 will be determined after evaluating results from experiments in 2011 at the NWS' Storm Prediction Center, National Hurricane Center, Aviation Weather Center, Ocean Prediction Center, Hydrometeorological Prediction Center, and from the six NWS regions. In 2012 and beyond, the PG will test and validate data processing and distribution systems and the applications of these products in operational settings. Additionally developers and forecasters will test and apply display techniques and decision aid tools in operational environments. The PG is both a recipient and a source of training. 
Training materials are developed using various distance training tools in

  18. NASA SPoRT GOES-R Proving Ground Activities

    Science.gov (United States)

    Stano, Geoffrey T.; Fuell, Kevin K.; Jedloec, Gary J.

    2010-01-01

    The NASA Short-term Prediction Research and Transition (SPoRT) program is a partner in the GOES-R Proving Ground (PG), helping forecasters prepare to understand the unique products to come from the GOES-R instrument suite. SPoRT is working collaboratively with other members of the GOES-R PG team and Algorithm Working Group (AWG) scientists to develop and disseminate a suite of proxy products that address specific forecast problems for the WFOs, Regional and National Support Centers, and other NOAA users. These products draw on SPoRT's expertise with the transition and evaluation of products into operations from the MODIS instrument and the North Alabama Lightning Mapping Array (NALMA). The MODIS instrument serves as an excellent proxy for the Advanced Baseline Imager (ABI) that will be aboard GOES-R. SPoRT has transitioned and evaluated several multi-channel MODIS products. The true and false color products are being used in natural hazard detection by several SPoRT partners to provide better observation of land features, such as fires, smoke plumes, and snow cover. Additionally, many of SPoRT's partners are coastal offices and already benefit from the MODIS sea surface temperature composite. This, along with other surface feature observations, will be developed into ABI proxy products for diagnostic use in the forecast process as well as assimilation into forecast models. In addition to the MODIS instrument, the NALMA has proven very valuable to WFOs with access to these total lightning data. These data provide situational awareness and enhance warning decision making to improve lead times for severe thunderstorm and tornado warnings. One effort by SPoRT scientists includes a lightning threat product to create short-term model forecasts of lightning activity. Additionally, SPoRT is working with the AWG to create GLM proxy data from several of the ground-based total lightning networks, such as the NALMA. The evaluation will focus on the vastly improved spatial

  19. Geophysics: Building E5375 decommissioning, Aberdeen Proving Ground

    International Nuclear Information System (INIS)

    McGinnis, M.G.; McGinnis, L.D.; Miller, S.F.; Thompson, M.D.

    1992-08-01

    Building E5375 was one of ten potentially contaminated sites in the Canal Creek area of the Edgewood section of Aberdeen Proving Ground examined by a geophysical team from Argonne National Laboratory in April and May 1992. Noninvasive geophysical surveys, including magnetics, electrical resistivity, and ground-penetrating radar (GPR), were conducted around the perimeter of the building to guide a sampling program prior to decommissioning and dismantling. Several anomalies were noted: (1) An underground storage tank located 25 ft east of Building E5375 was identified with magnetic, resistivity, and GPR profiling. (2) A three-point resistivity anomaly, 12 ft east of the northeast corner of Building E5374 (which borders Building E5375) and 5 ft south of the area surveyed with the magnetometer, may be caused by another underground storage tank. (3) A 2,500-gamma magnetic anomaly near the northeast corner of the site has no equivalent resistivity anomaly, although disruption in GPR reflectors was observed. (4) A one-point magnetic anomaly was located at the northeast corner, but its source cannot be resolved. A chaotic reflective zone to the east represents the radar signature of Building E5375 construction fill

  20. Safety objectives for next generation reactors: proving their achievement

    International Nuclear Information System (INIS)

    Tanguy, P.Y.

    1996-01-01

    Assuming that there is a consensus between regulatory bodies and nuclear operating organizations on safety objectives for future plants, how are we going to demonstrate that they have been achieved, with reasonable certainty? Right from the beginning, I would like to underline the importance of convincing the public that high-level safety objectives will be effectively achieved in future nuclear power plants. The mere fulfillment of administrative requirements might not be sufficient to obtain public acceptance. One has to take into account the changes that have occurred in the public perception of nuclear risks in the wake of the Chernobyl accident. Today public opinion rules out the possibility not only that such a catastrophic accident could recur, but also that any accident with detrimental health consequences off-site could occur. The nuclear industry has to reflect this concern in its safety demonstration, independently of proving the achievement of technical safety goals. The public opinion issue will be readdressed at the end of this paper. (orig.)

  1. Potential Cislunar and Interplanetary Proving Ground Excursion Trajectory Concepts

    Science.gov (United States)

    McGuire, Melissa L.; Strange, Nathan J.; Burke, Laura M.; MacDonald, Mark A.; McElrath, Timothy P.; Landau, Damon F.; Lantoine, Gregory; Hack, Kurt J.; Lopez, Pedro

    2016-01-01

    NASA has been investigating potential translunar excursion concepts to take place in the 2020s that would be used to test and demonstrate long duration life support and other systems needed for eventual Mars missions in the 2030s. These potential trajectory concepts could be conducted in the proving ground, a region of cislunar and near-Earth interplanetary space where international space agencies could cooperate to develop the technologies needed for interplanetary spaceflight. Enabled by high power Solar Electric Propulsion (SEP) technologies, the excursion trajectory concepts studied are grouped into three classes of increasing distance from the Earth and increasing technical difficulty: the first class of excursion trajectory concepts would represent a 90-120 day round trip trajectory with abort to Earth options throughout the entire length, the second class would be a 180-210 day round trip trajectory with periods in which aborts would not be available, and the third would be a 300-400 day round trip trajectory without aborts for most of the length of the trip. This paper provides a top-level summary of the trajectory and mission design of representative example missions of these three classes of excursion trajectory concepts.

  2. Why prove it again? alternative proofs in mathematical practice

    CERN Document Server

    Dawson, Jr., John W.

    2015-01-01

    This monograph considers several well-known mathematical theorems and asks the question, “Why prove it again?” while examining alternative proofs.   It  explores the different rationales mathematicians may have for pursuing and presenting new proofs of previously established results, as well as how they judge whether two proofs of a given result are different.  While a number of books have examined alternative proofs of individual theorems, this is the first that presents comparative case studies of other methods for a variety of different theorems. The author begins by laying out the criteria for distinguishing among proofs and enumerates reasons why new proofs have, for so long, played a prominent role in mathematical practice.  He then outlines various purposes that alternative proofs may serve.  Each chapter that follows provides a detailed case study of alternative proofs for particular theorems, including the Pythagorean Theorem, the Fundamental Theorem of Arithmetic, Desargues’ Theorem, the...

  3. Depleted uranium risk assessment at Aberdeen Proving Ground

    International Nuclear Information System (INIS)

    Ebinger, M.H.; Myers, O.B.; Kennedy, P.L.; Clements, W.H.

    1993-01-01

    The Environmental Science Group at Los Alamos and the Test and Evaluation Command (TECOM) are assessing the risk of depleted uranium (DU) testing at Aberdeen Proving Ground (APG). Conceptual and mathematical models of DU transfer through the APG ecosystem have been developed in order to show the mechanisms by which DU migrates or remains unavailable to different flora and fauna and to humans. The models incorporate actual rates of DU transfer between different ecosystem components as much as possible. Because data on DU transport through different pathways are scarce, some of the usable transfer rates are constrained; where actual transfer rates were unavailable, estimates derived from literature sources were used in the mass-transfer models. The objectives of this risk assessment are (1) to assess whether DU is transported away from impact areas; (2) to estimate how much DU, if any, migrates into Chesapeake Bay; (3) to determine whether there are appreciable risks to the ecosystems due to DU testing; and (4) to estimate the risk to human health as a result of DU testing

  4. Ineffectiveness of rat liver tissues in the screening of connective tissue disease

    International Nuclear Information System (INIS)

    Aziz, Khalil A.

    2004-01-01

    To assess the effectiveness of using rat liver tissue (RLT) for the screening of connective tissue disease (CTD), results of patient samples submitted to the Clinical Immunology Laboratory, Birmingham Heartlands Hospital, Bordesley Green East, Birmingham, United Kingdom for the investigation of CTD between 2001 and 2002 were analyzed. Positive results for anti-double-stranded DNA (dsDNA) antibodies and anti-extractable nuclear antigen (ENA) antibodies were correlated with the results of the corresponding antinuclear antibodies (ANA), obtained by indirect immunofluorescence (IIF) using RLT. In the second part of the study, samples that had previously tested positive for anti-ENA or anti-dsDNA antibodies were investigated prospectively for ANA using both RLT and a human epithelial (Hep-2) cell line. The IIF method employing RLT for the screening of CTD failed to detect ANA patterns in 45% and 25% of patient samples known to contain antibodies to dsDNA and ENA, respectively. The anti-dsDNA antibodies that the RLT failed to detect were of low avidity, and their clinical significance is unknown; the undetected antibodies to ENA were mostly directed against the Ro antigen. In contrast, and like RLT, the Hep-2 cell line failed to detect the low-avidity anti-dsDNA antibodies. The present study has clearly shown that RLT is ineffective for the screening of CTD. It is recommended that laboratories still using these tissues consider replacing them with the Hep-2 cell line. (author)

  5. The ineffectiveness and unintended consequences of the public health war on obesity.

    Science.gov (United States)

    Ramos Salas, Ximena

    2015-02-03

    The public health war on obesity has had little impact on obesity prevalence and has resulted in unintended consequences. Its ineffectiveness has been attributed to: 1) heavy focus on individual-based approaches and lack of scaled-up socio-environmental policies and programs, 2) modest effects of interventions in reducing and preventing obesity at the population level, and 3) inappropriate focus on weight rather than health. An unintended consequence of these policies and programs is excessive weight preoccupation among the population, which can lead to stigma, body dissatisfaction, dieting, disordered eating, and even death from effects of extreme dieting, anorexia, and obesity surgery complications, or from suicide that results from weight-based bullying. Future public health approaches should: a) avoid simplistic obesity messages that focus solely on individuals' responsibility for weight and health, b) focus on health outcomes rather than weight control, and c) address the complexity of obesity and target both individual-level and system-level determinants of health.

  6. Correlation of radiographic and manometric findings in patients with ineffective esophageal motility.

    Science.gov (United States)

    Shakespear, J S; Blom, D; Huprich, J E; Peters, J H

    2004-03-01

    Ineffective esophageal motility disorder (IEM) is a new, manometrically defined esophageal motility disorder, associated with severe gastroesophageal reflux disease (GERD), GERD-associated respiratory symptoms, delayed acid clearance, and mucosal injury. Videoesophagram is an important, inexpensive, and widely available tool in the diagnostic evaluation of patients with esophageal pathologies. The efficacy of videoesophagography has not been rigorously examined in patients with IEM. The aim of this study was to determine the diagnostic value of videoesophagography in patients with IEM. The radiographic and manometric findings of 202 consecutive patients presenting with foregut symptoms were evaluated. IEM was defined by strict manometric criteria. All other named motility disorders such as achalasia were excluded. Videoesophagography was performed according to a standard protocol. Of patients in this cohort, 16% (33/202) had IEM by manometric criteria. Of IEM patients, 55% (18/33) had an abnormal videoesophagram, while in 45% (15/33) this test was read as normal. Only 11% (15/137) of patients with a normal videoesophagram were found to have IEM. Sensitivity of videoesophagram was 54.6%, specificity 72.2%, positive predictive value only 27.7%, and negative predictive value 89.1% in the diagnosis of IEM. These data show that videoesophagram is relatively insensitive in detecting patients with IEM and should not be considered a valid diagnostic test for this disorder. We conclude that esophageal manometry is an indispensable diagnostic modality in the workup of a patient suspected of having IEM.
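    The accuracy figures in this abstract can be reproduced from its raw counts; a short sketch follows (the 2x2 table layout is inferred from the reported numbers, not stated explicitly in the abstract):

```python
# Reconstruct the 2x2 table from the counts reported in the abstract:
# 202 patients, 33 with IEM by manometry, 18 of those with an abnormal
# videoesophagram; 137 patients had a normal videoesophagram, 15 of whom had IEM.
tp = 18                 # IEM present, abnormal videoesophagram
fn = 33 - tp            # IEM present, normal videoesophagram   -> 15
fp = (202 - 137) - tp   # IEM absent, abnormal videoesophagram  -> 47
tn = 137 - fn           # IEM absent, normal videoesophagram    -> 122

sensitivity = tp / (tp + fn)  # 18/33   -> 54.5% (reported as 54.6%, a rounding artifact)
specificity = tn / (tn + fp)  # 122/169 -> 72.2%
ppv = tp / (tp + fp)          # 18/65   -> 27.7%
npv = tn / (tn + fn)          # 122/137 -> 89.1%
print(f"Se={sensitivity:.1%}  Sp={specificity:.1%}  PPV={ppv:.1%}  NPV={npv:.1%}")
```

    Apart from the last decimal of the sensitivity, all four values match the abstract, which confirms the inferred table layout.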

  7. Environmental geophysics at J-Field, Aberdeen Proving Ground, Maryland

    Energy Technology Data Exchange (ETDEWEB)

    Daudt, C.R.; McGinnis, L.D.; Miller, S.F.; Thompson, M.D.

    1994-11-01

    Geophysical data collected at J-Field, Aberdeen Proving Ground, Maryland, were used in the characterization of the natural hydrogeologic framework of the J-Field area and in the identification of buried disturbances (trenches and other evidence of contamination). Seismic refraction and reflection data and electrical resistivity data have aided in the characterization of the leaky confining unit at the base of the surficial aquifer (designated Unit B of the Tertiary Talbot Formation). Excellent reflectors have been observed for both upper and lower surfaces of Unit B that correspond to stratigraphic units observed in boreholes and on gamma logs. Elevation maps of both surfaces and an isopach map of Unit B, created from reflection data at the toxic burning pits site, show a thickening of Unit B to the east. Abnormally low seismic compressional-wave velocities suggest that Unit B consists of gassy sediments whose gases are not being flushed by upward or downward moving groundwater. The presence of gases suggests that Unit B serves as an efficient aquitard that should not be penetrated by drilling or other activities. Electromagnetic, total-intensity magnetic, and ground-penetrating radar surveys have aided in delineating the limits of two buried trenches, the VX burning pit and the liquid smoke disposal pit, both located at the toxic burning pits site. The techniques have also aided in determining the extent of several other disturbed areas where soils and materials were pushed out of disposal pits during trenching activities. Surveys conducted from the Prototype Building west to the Gunpowder River did not reveal any buried trenches.

  8. Depleted uranium human health risk assessment, Jefferson Proving Ground, Indiana

    International Nuclear Information System (INIS)

    Ebinger, M.H.; Hansen, W.R.

    1994-01-01

    The risk to human health from fragments of depleted uranium (DU) at Jefferson Proving Ground (JPG) was estimated using two types of ecosystem pathway models. A steady-state model of the JPG area was developed to examine the effects of DU in soils, water, and vegetation on deer that were hunted and consumed by humans. The RESRAD code was also used to estimate the effects of farming the impact area and consuming the products derived from the farm. The steady-state model showed that minimal doses to humans are expected from consumption of deer that inhabit the impact area. Median values for doses to humans range from about 1 mrem (±2.4) to 0.04 mrem (±0.13) and translate to less than 1 x 10⁻⁶ detriments (excess cancers) in the population. Monte Carlo simulation of the steady-state model was used to derive the probability distributions from which the median values were drawn. Sensitivity analyses of the steady-state model showed that the amount of DU in airborne dust and, therefore, the amount of DU on the vegetation surface, controlled the amount of DU ingested by deer and by humans. Human doses from the RESRAD estimates ranged from less than 1 mrem/y to about 6.5 mrem/y in the hunting and subsistence farming scenarios, respectively. The human doses exceeded the 100 mrem/y dose limit when drinking water for the farming scenario was obtained from the on-site aquifer that was presumably contaminated with DU. The two farming scenarios were unrealistic land uses because the additional risk to humans due to unexploded ordnance in the impact area was not figured into the risk estimate. The doses estimated with RESRAD translated to less than 1 x 10⁻⁶ detriments to about 1 x 10⁻³ detriments. The higher risks were associated only with the farming scenario in which drinking water was obtained on-site.
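    The Monte Carlo step described above can be sketched in a few lines; every distribution, constant, and pathway factor below is illustrative only, not taken from the JPG assessment:

```python
import random
import statistics

random.seed(1)  # reproducible draws

def one_dose():
    """Sample uncertain transfer parameters and propagate them to a human dose (mrem)."""
    soil_du = random.lognormvariate(0, 1.0)   # DU concentration in soil (arbitrary units)
    dust_frac = random.uniform(0.001, 0.01)   # fraction resuspended onto vegetation
    deer_intake = soil_du * dust_frac * 50.0  # vegetation -> deer -> human transfer
    return deer_intake * 0.05                 # dose conversion factor (hypothetical)

# Repeated sampling yields the dose distribution; the median is reported,
# as in the steady-state model above.
doses = sorted(one_dose() for _ in range(10_000))
print(f"median dose ~ {statistics.median(doses):.4f} mrem")
```

    A sensitivity analysis like the one reported falls out naturally from this structure: rerun the simulation holding one input fixed at its nominal value and compare the spread of the resulting dose distribution.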

  9. Bronchoscopic diagnostic procedures and microbiological examinations in proving endobronchial tuberculosis

    Directory of Open Access Journals (Sweden)

    Abdullah Şimşek

    Objective: To determine the proportional distribution of endobronchial tuberculosis (EBTB) subtypes and to evaluate the types of bronchoscopic diagnostic procedures that can prove granulomatous inflammation. Methods: This was a retrospective study of 18 HIV-negative patients with biopsy-proven EBTB treated between 2010 and 2014. Results: The most common EBTB subtypes, as classified by the bronchoscopic features, were tumorous and granular (in 22.2% for both). Sputum smear microscopy was performed in 11 patients and was positive for AFB in 4 (36.3%). Sputum culture was also performed in 11 patients and was positive for Mycobacterium tuberculosis in 10 (90.9%). Smear microscopy of BAL fluid (BALF) was performed in 16 patients and was positive for AFB in 10 (62.5%). Culture of BALF was also performed in 16 patients and was positive for M. tuberculosis in 15 (93.7%). Among the 18 patients with EBTB, granulomatous inflammation was proven by the following bronchoscopic diagnostic procedures: bronchial mucosal biopsy, in 8 (44.4%); bronchial brushing, in 7 (38.8%); fine-needle aspiration biopsy, in 2 (11.1%); and BAL, in 2 (11.1%). Bronchial anthracofibrosis was observed in 5 (27.7%) of the 18 cases evaluated. Conclusions: In our sample of EBTB patients, the most common subtypes were the tumorous and granular subtypes. We recommend that sputum samples and BALF samples be evaluated by smear microscopy for AFB and by culture for M. tuberculosis, which could increase the rates of early diagnosis of EBTB. We also recommend that bronchial brushing be employed together with other bronchoscopic diagnostic procedures in patients suspected of having EBTB.

  10. Ineffectiveness and adverse events of nitrofurantoin in women with urinary tract infection and renal impairment in primary care

    NARCIS (Netherlands)

    Geerts, A.F.; Eppenga, W.L.; Heerdink, R.; Derijks, H.J.; Wensing, M.J.P.; Egberts, T.C.; Smet, P.A.G.M. de

    2013-01-01

    PURPOSE: To determine whether treatment with nitrofurantoin in women with urinary tract infection (UTI) and renal impairment in primary care is associated with a higher risk of ineffectiveness and/or serious adverse events than in women without renal impairment. METHODS: A cohort of 21,317 women

  11. Ineffective acute treatment of episodic migraine is associated with new-onset chronic migraine.

    Science.gov (United States)

    Lipton, Richard B; Fanning, Kristina M; Serrano, Daniel; Reed, Michael L; Cady, Roger; Buse, Dawn C

    2015-02-17

    To test the hypothesis that ineffective acute treatment of episodic migraine (EM) is associated with an increased risk for the subsequent onset of chronic migraine (CM). In the American Migraine Prevalence and Prevention Study, respondents with EM in 2006 who completed the Migraine Treatment Optimization Questionnaire (mTOQ-4) and provided outcome data in 2007 were eligible for analyses. The mTOQ-4 is a validated questionnaire that assesses treatment efficacy based on 4 aspects of response to acute treatment. Total mTOQ-4 scores were used to define categories of acute treatment response: very poor, poor, moderate, and maximum treatment efficacy. Logistic regression models were used to examine the dichotomous outcome of transition from EM in 2006 to CM in 2007 as a function of mTOQ-4 category, adjusting for covariates. Among 5,681 eligible study respondents with EM in 2006, 3.1% progressed to CM in 2007. Only 1.9% of the group with maximum treatment efficacy developed CM. Rates of new-onset CM increased in the moderate treatment efficacy (2.7%), poor treatment efficacy (4.4%), and very poor treatment efficacy (6.8%) groups. In the fully adjusted model, the very poor treatment efficacy group had a more than 2-fold increased risk of new-onset CM (odds ratio = 2.55, 95% confidence interval 1.42-4.61) compared to the maximum treatment efficacy group. Inadequate acute treatment efficacy was associated with an increased risk of new-onset CM over the course of 1 year. Improving acute treatment outcomes might prevent new-onset CM, although reverse causality cannot be excluded. © 2015 American Academy of Neurology.
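    The direction of the reported effect can be illustrated with a crude (unadjusted) odds ratio computed from the two progression rates above; note that the paper's 2.55 is covariate-adjusted, so the crude value differs:

```python
# Crude odds ratio, very poor vs. maximum treatment efficacy groups,
# using the new-onset CM rates reported above (6.8% vs. 1.9%).
def odds(p):
    return p / (1 - p)

p_very_poor = 0.068  # new-onset CM rate, very poor treatment efficacy
p_maximum = 0.019    # new-onset CM rate, maximum treatment efficacy

crude_or = odds(p_very_poor) / odds(p_maximum)
print(f"crude OR = {crude_or:.2f}")  # larger than the adjusted OR of 2.55
```

    That the crude odds ratio exceeds the adjusted one is expected when the covariates (e.g., headache frequency, medication use) are themselves associated with both treatment response and progression.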

  12. Seismic proving test of ultimate piping strength (current status of preliminary tests)

    International Nuclear Information System (INIS)

    Suzuki, K.; Namita, Y.; Abe, H.; Ichihashi, I.; Suzuki, K.; Ishiwata, M.; Fujiwaka, T.; Yokota, H.

    2001-01-01

    In fiscal year 1998, a 6-year program of piping tests was initiated with the following objectives: i) to clarify the elasto-plastic response and ultimate strength of nuclear piping, ii) to ascertain the seismic safety margin of the current seismic design code for piping, and iii) to assess new allowable stress rules. To resolve extensive technical issues before proceeding to the seismic proving test of a large-scale piping system, a series of preliminary tests of materials, piping components, and simplified piping systems is intended. In this paper, the current status of the material tests and the piping component tests is reported. (author)

  13. Users manual on database of the Piping Reliability Proving Tests at the Japan Atomic Energy Research Institute

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    1996-09-01

    The Japan Atomic Energy Research Institute (JAERI) conducted the Piping Reliability Proving Tests from 1975 to 1992 under contracts between JAERI and the Science and Technology Agency of Japan, under the auspices of the special account law for electric power development promotion. The purpose of those tests was to prove the structural reliability of the primary cooling piping constituting part of the pressure boundary in water reactor power plants. The tests using the large experimental facilities ended in 1990; piping reliability analysis by the probabilistic method continued until 1992. This report is the users manual for the databases of the test results obtained with the large experimental facilities. The objectives of the piping reliability proving tests were to prove that the primary piping of the water reactor (1) is reliable throughout the service period, (2) has no possibility of rupture, and (3) brings no detrimental influence on the surrounding instrumentation or equipment near the break location. The research activities using the large-scale piping test facilities are described. The present report provides the database of the test results and pairs with the former report; together, the two reports cover all features of the Piping Reliability Proving Tests. Brief descriptions of the tests are also provided in Japanese or English. (author)

  14. Bronchoscopic diagnostic procedures and microbiological examinations in proving endobronchial tuberculosis.

    Science.gov (United States)

    Şimşek, Abdullah; Yapıcı, İlhami; Babalık, Mesiha; Şimşek, Zekiye; Kolsuz, Mustafa

    2016-01-01

    To determine the proportional distribution of endobronchial tuberculosis (EBTB) subtypes and to evaluate the types of bronchoscopic diagnostic procedures that can prove granulomatous inflammation. This was a retrospective study of 18 HIV-negative patients with biopsy-proven EBTB treated between 2010 and 2014. The most common EBTB subtypes, as classified by the bronchoscopic features, were tumorous and granular (in 22.2% for both). Sputum smear microscopy was performed in 11 patients and was positive for AFB in 4 (36.3%). Sputum culture was also performed in 11 patients and was positive for Mycobacterium tuberculosis in 10 (90.9%). Smear microscopy of BAL fluid (BALF) was performed in 16 patients and was positive for AFB in 10 (62.5%). Culture of BALF was also performed in 16 patients and was positive for M. tuberculosis in 15 (93.7%). Among the 18 patients with EBTB, granulomatous inflammation was proven by the following bronchoscopic diagnostic procedures: bronchial mucosal biopsy, in 8 (44.4%); bronchial brushing, in 7 (38.8%); fine-needle aspiration biopsy, in 2 (11.1%); and BAL, in 2 (11.1%). Bronchial anthracofibrosis was observed in 5 (27.7%) of the 18 cases evaluated. In our sample of EBTB patients, the most common subtypes were the tumorous and granular subtypes. We recommend that sputum samples and BALF samples be evaluated by smear microscopy for AFB and by culture for M. tuberculosis, which could increase the rates of early diagnosis of EBTB. We also recommend that bronchial brushing be employed together with other bronchoscopic diagnostic procedures in patients suspected of having EBTB.

  15. Challenges in Australian policy processes for disinvestment from existing, ineffective health care practices.

    Science.gov (United States)

    Elshaug, Adam G; Hiller, Janet E; Tunis, Sean R; Moss, John R

    2007-10-31

    Internationally, many health care interventions were diffused prior to the standard use of assessments of safety, effectiveness, and cost-effectiveness. Disinvestment from ineffective or inappropriately applied practices is a growing priority for health care systems, for reasons of improved quality of care and sustainability of resource allocation. In this paper we examine key challenges for disinvestment from these interventions and explore potential policy-related avenues to advance a disinvestment agenda. We examine five key challenges in the area of policy-driven disinvestment: 1) lack of resources to support disinvestment policy mechanisms; 2) lack of reliable administrative mechanisms to identify and prioritise technologies and/or practices with uncertain clinical and cost-effectiveness; 3) political, clinical, and social challenges to removing an established technology or practice; 4) lack of published studies with evidence demonstrating that existing technologies/practices provide little or no benefit (highlighting complexity of design); and 5) inadequate resources to support a research agenda to advance disinvestment methods. Partnerships are required involving government, professional colleges, and relevant stakeholder groups to put disinvestment on the agenda. Such partnerships could foster awareness raising, collaboration, and improved health outcome data generation and reporting. Dedicated funds and distinct processes could be established within the Medical Services Advisory Committee and Pharmaceutical Benefits Advisory Committee to a) identify technologies and practices for which there is relative uncertainty that could be the basis for disinvestment analysis, and b) conduct disinvestment assessments of selected items to address existing practices in a manner analogous to the current focus on new and emerging technology. Finally, dedicated funding and cross-disciplinary collaboration are necessary to build health services and policy research capacity.

  16. Breaking Dense Structures: Proving Stability of Densely Structured Hybrid Systems

    Directory of Open Access Journals (Sweden)

    Eike Möhlmann

    2015-06-01

    Abstraction and refinement are widely used in software development. Such techniques are valuable because they make it possible to handle ever more complex systems. One key point is the ability to decompose a large system into subsystems, analyze those subsystems, and deduce properties of the larger system. As cyber-physical systems become more and more complex, such techniques become more appealing. In 2009, Oehlerking and Theel presented a (de)composition technique for hybrid systems. This technique is graph-based and constructs a Lyapunov function for hybrid systems having a complex discrete state space. The technique consists of (1) decomposing the underlying graph of the hybrid system into subgraphs, (2) computing multiple local Lyapunov functions for the subgraphs, and finally (3) composing the local Lyapunov functions into a piecewise Lyapunov function. A Lyapunov function can serve multiple purposes, e.g., it certifies stability or termination of a system or allows the construction of invariant sets, which in turn may be used to certify safety and security. In this paper, we propose an improvement to the decomposition technique that relaxes the graph structure before applying the decomposition. Our relaxation significantly reduces the connectivity of the graph by exploiting super-dense switching. The relaxation makes the decomposition technique more efficient on the one hand, and on the other allows a wider range of graph structures to be decomposed.
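    The objects being composed are piecewise Lyapunov functions; a common set of sufficient conditions, stated here in generic textbook form rather than as quoted from the paper, is:

```latex
% Hybrid system with modes $i$, flows $\dot{x} = f_i(x)$, and switches $i \to j$.
% A family of local functions $V_i$ certifies asymptotic stability if:
\begin{align*}
  \alpha_1(\lVert x\rVert) \le V_i(x) \le \alpha_2(\lVert x\rVert)
    && \text{(positive definiteness, class-$\mathcal{K}$ bounds)} \\
  \nabla V_i(x) \cdot f_i(x) < 0 \quad \text{for } x \ne 0
    && \text{(strict decrease along flows)} \\
  V_j(x) \le V_i(x) \quad \text{at each switch } i \to j
    && \text{(non-increase at jumps)}
\end{align*}
```

    The decomposition computes each local function only over a subgraph and then discharges the jump condition along the edges connecting subgraphs, which is why reducing graph connectivity pays off.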

  17. Proving test on the performance of a Multiple-Excitation Simulator

    International Nuclear Information System (INIS)

    Fujita, Katsuhisa; Ito, Tomohiro; Kojima, Nobuyuki; Sasaki, Yoichi; Abe, Hiroshi; Kuroda, Katsuhiko

    1995-01-01

    Seismic excitation tests on large-scale piping systems are scheduled to be carried out by the Nuclear Power Engineering Corporation (NUPEC) using the large-scale, high-performance vibration table at the Tadotsu Engineering Laboratory, under the sponsorship of the Ministry of International Trade and Industry (MITI). In the tests, the piping systems simulate the main steam piping system and the main feed water piping system in nuclear power plants. In this study, a fundamental test was carried out to prove the performance of the Multiple Excitation Simulator, which consists of the hydraulic actuator and the control system. An L-shaped piping system and a hydraulic actuator were installed on the shaking table. Acceleration and displacement generated by the actuator were measured. The performance of the actuator and the control system was discussed by comparing the measured values with the target values on the time histories and the response spectrum of the acceleration. As a result, it was proved that the actuator and the control system have good performance and will be applicable to the verification test.

  18. Seismic proving test of process computer systems with a seismic floor isolation system

    International Nuclear Information System (INIS)

    Fujimoto, S.; Niwa, H.; Kondo, H.

    1995-01-01

    The authors have carried out seismic proving tests for process computer systems as a Nuclear Power Engineering Corporation (NUPEC) project sponsored by the Ministry of International Trade and Industry (MITI). This paper presents the seismic test results for evaluating the functional capabilities of process computer systems with a seismic floor isolation system. The seismic floor isolation system, designed to isolate horizontal motion, was composed of a floor frame (13 m x 13 m), ball bearing units, and spring-damper units. A series of seismic excitation tests was carried out using a large-scale shaking table of NUPEC. From the test results, the functional capabilities during large earthquakes of computer systems with a seismic floor isolation system were verified.

  19. OPERA and MINOS Experimental Result Prove Big Bang Theory Invalid

    Science.gov (United States)

    Pressler, David E.

    2012-03-01

    The greatest error in the history of science is the misinterpretation of the Michelson-Morley Experiment. The speed of light was measured to travel at the same speed in all three directions (x, y, z axis) in one's own inertial reference system; however, c will always be measured as having a different absolute speed in all other inertial frames at different energy levels. Time slows down due to motion or a gravity field. Time is the rate of physical processes. Speed = Distance/Time. If the time changes, the distance must change. Therefore, BOTH mirrors must move towards the center of the interferometer and space must contract in all three directions; C-Space. Gravity is a C-Space condition, and is the cause of redshift in our universe, not motion. The universe is not expanding. OPERA results are directly indicated; at the surface of earth, the strength of the gravity field is at maximum; below the earth's surface, time and space are less distorted (C-Space); therefore, c is faster. Newtonian mechanics dictates that a spherical shell of matter at greater radii, with uniform density, produces no net force on an observer located centrally. An observer located on a sphere's surface, like our Earth's or a large sphere like one located in a remote galaxy, will construct a picture centered on himself identical to the one centered inside the spherical shell of mass. Both observers will view the incoming radiation, emitted by the other observer, as redshifted, because they lie on each other's radial line. The Universe is static and very old.

  20. ATTEMPT OF OVERCOMING SECONDARY INEFFECTIVENESS OF INFLIXIMAB IN A PATIENT WITH ANKYLOSING SPONDYLITIS USING PLASMAPHERESIS (A CASE REPORT)

    Directory of Open Access Journals (Sweden)

    Oksana Alekseyevna Rumyantseva

    2013-01-01

    The article focuses on the problem of secondary ineffectiveness of the tumor necrosis factor α (TNFα) inhibitor infliximab (INF) and describes an attempt to use plasmapheresis (PF) to eliminate this problem in a patient with ankylosing spondylitis who had received INF treatment at a dose of 5 mg/kg for a long time (over 4 years). After PF, the INF therapy ensured a long-term clinical and laboratory improvement in the patient's condition. One can assume that PF made it possible to overcome the secondary ineffectiveness of INF and can be used in selected patients when INF cannot be replaced with another TNFα inhibitor.

  1. Why is multiple micronutrient powder ineffective at reducing anaemia among 12–24 month olds in Colombia? Evidence from a randomised controlled trial

    Directory of Open Access Journals (Sweden)

    Alison Andrew

    2016-12-01

    In Colombia's bottom socio-economic strata, 46.6% of children under two are anaemic. A prevalence above 20% falls within the WHO guidelines for daily supplementation with multiple micronutrient powder (MNP). To evaluate the effect of daily MNP supplementation on anaemia amongst Colombian children aged 12–24 months, we ran a cluster RCT (n=1440). In previous work, we found the intervention had no impact on haemoglobin or anaemia in this population. In the current paper, we investigate this null result and find it cannot be explained by an underpowered study design, inaccurate measurements, low adoption of and compliance with the intervention, or crowding out through dietary substitution. We conclude that our intervention was ineffective at reducing rates of childhood anaemia because MNP itself was inefficacious in our population, rather than because of poor implementation of or adherence to the planned intervention. Further analysis of our data and secondary data suggests that the evolution with age of childhood anaemia in Colombia, and its causes, appear different from those in settings where MNP has been effective. Firstly, rates of anaemia peak at much earlier ages and then fall rapidly. Secondly, anaemia that remains after the first year of life is relatively, and increasingly as children get older, unrelated to iron deficiency. We suggest that factors during gestation, birth, breastfeeding and early weaning may be important in explaining very high rates of anaemia in early infancy. However, the adverse effects of these factors appear to be largely mitigated by the introduction of solid foods that often include meat. This renders population-wide MNP supplementation, provided after a diet of solid foods has become established, an ineffective instrument with which to target Colombia's childhood anaemia problem. Keywords: Anaemia, Iron-deficiency, Haemoglobin, Colombia, Micronutrients, Multiple micronutrient powder, Child, Nutrition

  2. Structure of Corrective Feedback for Selection of Ineffective Vegetable Parenting Practices for Use in a Simulation Videogame.

    Science.gov (United States)

    Baranowski, Tom; Beltran, Alicia; Chen, Tzu-An; O'Connor, Teresia; Hughes, Sheryl; Buday, Richard; Baranowski, Janice

    2013-02-01

    A serious videogame is being developed to train parents of preschool children in selecting and using parenting practices that are likely to encourage their child to eat more vegetables. The structure of feedback to the parents on their selection may influence what they learn from the game. Feedback Intervention Theory provides some guidance on the design of such messages. The structure of preferred performance feedback statements has not been investigated within serious videogames. Two feedback formats were tested for players' preferences within the context of this videogame. Based on Feedback Intervention Theory, which proposes that threat to self-concept impairs feedback response, three-statement (a nonaffirming comment sandwiched between two affirming comments, called "Oreo" feedback, which should minimize threat to self-concept) and two-statement (a nonaffirming comment followed by an affirming comment) performance feedbacks were tailored to respondents. Tailoring was based on participants' reports of the frequency of use of effective and ineffective vegetable parenting practices and the reasons for use of the ineffective practices. Participants selected their preference between the two forms of feedback for each of eight ineffective vegetable parenting practices. In general, mothers (n=81; no male respondents) slightly preferred the "Oreo" feedback, but the pattern of preferences varied by demographic characteristics. Stronger relationships by income suggest the feedback structure should be tailored to family income. Future research with larger and more diverse samples needs to test whether perceived threat to self-concept mediates the response to feedback and otherwise verify these findings.

  3. Beyond Sex: Likelihood and Predictors of Effective and Ineffective Intervention in Intimate Partner Violence in Bystanders Perceiving an Emergency.

    Science.gov (United States)

    Chabot, Heather Frasier; Gray, Melissa L; Makande, Tariro B; Hoyt, Robert L

    2016-01-06

    Within the framework of the bystander model of intervention, we examined specific correlates and the likelihood of effective and ineffective intervention strategies of bystanders to an instance of intimate partner violence (IPV) identified as an emergency. We measured psychological variables associated with general prosocial behavior (including sex, instrumentality, expressiveness, empathy, personal distress, dispositional anger, and perceived barriers) as influential predictors in four IPV intervention behaviors (i.e., calling 911, talking to the victim, talking to the perpetrator, and physically interacting with the perpetrator). One hundred seventeen college community members completed preintervention measures, watched a film clip of IPV which they identified as an emergency, reported their likelihood of becoming involved and utilizing intervention behaviors, and identified perceived barriers to intervention. Participants were more likely to indicate using effective over ineffective intervention tactics. Lower perceived barriers to intervention predicted greater intervention likelihood. Hierarchical regression indicated that men and individuals higher in anger and instrumental traits were more likely to report that they would engage in riskier ineffective forms of intervention. Implications regarding bystander training and associations to intervention in related forms of violence including sexual assault are discussed. © The Author(s) 2016.

  4. Lactancia materna ineficaz: prevalencia y factores asociados Ineffective Breastfeeding: prevalence and associated factors

    Directory of Open Access Journals (Sweden)

    Elvinia Pinilla Gómez

    2011-12-01

    Objective: to determine the prevalence of, and factors associated with, the nursing diagnosis "ineffective breastfeeding" in infants younger than 6 months hospitalized in a tertiary-care institution. Methodology: cross-sectional study. We selected 108 mother-child pairs hospitalized in a tertiary-care institution in 2009 and applied a validated format to identify the diagnosis. Rasch analysis was performed on the variables representing the defining characteristics of the diagnosis, which yielded a scale of 0 to 100, and a linear regression model was built for the variables associated with the extent of the diagnosis. Results: the prevalence of the diagnosis was 93.5%; the defining characteristic easiest to find was the infant's inability to hold on to the breast, and the hardest was shaking and crying of the infant within the first hour after breastfeeding. In the linear regression, associated factors were female gender, infant weight, and the mother's need to urinate. Conclusion: there is an unfavorable trend in both the prevalence and duration of breastfeeding among hospitalized infants; promotion of breastfeeding should be interdisciplinary, and hospital policies should be modified to favor mother-infant contact and early initiation of breastfeeding. Salud UIS 2011; 43(3): 271-279

  5. Ecological survey of M-Field, Edgewood Area Aberdeen Proving Ground, Maryland

    Energy Technology Data Exchange (ETDEWEB)

    Downs, J.L.; Eberhardt, L.E.; Fitzner, R.E.; Rogers, L.E.

    1991-12-01

    An ecological survey was conducted on M-Field, at the Edgewood Area, Aberdeen Proving Ground, Maryland. M-Field is used routinely to test army smokes and obscurants, including brass flakes, carbon fibers, and fog oils. The field has been used for testing purposes for the past 40 years, but little documented history is available. Under current environmental regulations, the test field must be assessed periodically to document the presence or potential use of the area by threatened and endangered species. The M-Field area is approximately 370 acres and is part of the US Army's Edgewood Area at Aberdeen Proving Ground in Harford County, Maryland. The grass-covered field is primarily lowlands with elevations from about 1.0 to 8 m above sea level, and several buildings and structures are present on the field. The ecological assessment of M-Field was conducted in three stages, beginning with a preliminary site visit in May to assess sampling requirements. Two field site visits were made June 3--7, and August 12--15, 1991, to identify the biota existing on the site. Data were gathered on vegetation, small mammals, invertebrates, birds, large mammals, amphibians, and reptiles.

  7. Research Objectives for Human Missions in the Proving Ground of Cis-Lunar Space

    Science.gov (United States)

    Spann, James; Niles, Paul; Eppler, Dean; Kennedy, Kriss; Lewis, Ruthan; Sullivan, Thomas

    2016-07-01

    Introduction: This talk will introduce the preliminary findings of NASA's Future Capabilities Team, in support of whose ongoing studies we are tasked with collecting research objectives for the Proving Ground activities. The objectives could include but are certainly not limited to: demonstrating crew well-being and performance over long duration missions, characterizing lunar volatiles, Earth monitoring, near Earth object search and identification, support of a far-side radio telescope, and measuring the impact of the deep space environment on biological systems. Beginning as early as 2023, crewed missions beyond low Earth orbit will be enabled by the new capabilities of the SLS and Orion vehicles. This will initiate the "Proving Ground" phase of human exploration, with Mars as an ultimate destination. The primary goal of the Proving Ground is to demonstrate the capability of suitably long duration spaceflight without need of continuous support from Earth, i.e. to become Earth Independent. A major component of the Proving Ground phase is to conduct research activities aimed at accomplishing major objectives selected from a wide variety of disciplines, including but not limited to: Astronomy, Heliophysics, Fundamental Physics, Planetary Science, Earth Science, Human Systems, Fundamental Space Biology, Microgravity, and In Situ Resource Utilization. Mapping and prioritizing the most important objectives from these disciplines will provide a strong foundation for establishing the architecture to be utilized in the Proving Ground. Possible Architectures: Activities and objectives will be accomplished during the Proving Ground phase using a deep space habitat. This habitat will potentially be accompanied by a power/propulsion bus capable of moving the habitat to accomplish different objectives within cis-lunar space. This architecture can also potentially support staging of robotic and tele-robotic assets as well as

  8. Structure of Corrective Feedback for Selection of Ineffective Vegetable Parenting Practices for Use in a Simulation Videogame

    Science.gov (United States)

    Beltran, Alicia; Chen, Tzu-An; O'Connor, Teresia; Hughes, Sheryl; Buday, Richard; Baranowski, Janice

    2013-01-01

    Abstract A serious videogame is being developed to train parents of preschool children in selecting and using parenting practices that are likely to encourage their child to eat more vegetables. The structure of feedback to the parents on their selection may influence what they learn from the game. Feedback Intervention Theory provides some guidance on the design of such messages. The structure of preferred performance feedback statements has not been investigated within serious videogames. Two feedback formats were tested for a player's preferences within the context of this videogame. Based on Feedback Intervention Theory, which proposes that threat to self-concept impairs feedback response, three-statement (a nonaffirming comment sandwiched between two affirming comments, called “Oreo” feedback, which should minimize threat to self-concept) and two-statement (a nonaffirming comment followed by an affirming comment) performance feedbacks were tailored to respondents. Tailoring was based on participants' report of frequency of use of effective and ineffective vegetable parenting practices and the reasons for use of the ineffective practices. Participants selected their preference between the two forms of feedback for each of eight ineffective vegetable parenting practices. In general, mothers (n=81) (no male respondents) slightly preferred the “Oreo” feedback, but the pattern of preferences varied by demographic characteristics. Stronger relationships by income suggest the feedback structure should be tailored to family income. Future research with larger and more diverse samples needs to test whether perceived threat to self-concept mediates the response to feedback and otherwise verify these findings. PMID:24761320

  9. Attitude toward contraception and abortion among Curaçao women. Ineffective contraception due to limited sexual education?

    OpenAIRE

    van den Brink, Maaike J; Boersma, Adriana A; Meyboom-de Jong, Betty; de Bruijn, Jeanne GM

    2011-01-01

    Abstract Background In Curaçao there is a high incidence of unintended pregnancies and induced abortions. Most induced abortions in Curaçao are performed by general practitioners at the woman's request. Induced abortion is strictly prohibited in Curaçao, but since 1999 there has been a policy of connivance. We present data on the relevance of economic and socio-cultural factors for the high abortion rates and the ineffective use of contraception. Methods Structured interviews to invest...

  10. The production tax credit for wind turbine powerplants is an ineffective incentive

    International Nuclear Information System (INIS)

    Kahn, E.; California Univ., Berkeley, CA

    1996-01-01

    The US Energy Policy Act (EPAct) of 1992 created a production tax credit of 1.5¢/kWh available for 10 years to promote certain renewable energy technologies, including wind turbines. This paper argues that the impact of the wind turbine production tax credit will be minimal. The argument depends entirely on the nature of the project finance structure used by the private power industry for wind turbine development. We show that tax credits can only be absorbed by equity investors if there is a large fraction of equity in the project capital structure. This raises the financing cost of wind turbine projects compared to conventional power technology, which relies on a large fraction of low cost debt. If the tax credit were paid as a cash subsidy, the capital structure could be shifted to low cost debt and financing costs could be significantly reduced. (Author)
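The financing argument above can be made concrete with a simple after-tax weighted average cost of capital (WACC) comparison. The rates, tax rate, and capital fractions below are illustrative assumptions, not figures from the paper.

```python
# After-tax WACC sketch: a high-equity structure (needed so investors can
# absorb tax credits) versus the high-debt structure typical of conventional
# power projects. All rates and fractions are illustrative assumptions.
def wacc(equity_frac, cost_equity, cost_debt, tax_rate):
    """After-tax weighted average cost of capital."""
    debt_frac = 1.0 - equity_frac
    return equity_frac * cost_equity + debt_frac * cost_debt * (1.0 - tax_rate)

high_equity = wacc(equity_frac=0.6, cost_equity=0.15, cost_debt=0.08, tax_rate=0.35)
high_debt   = wacc(equity_frac=0.2, cost_equity=0.15, cost_debt=0.08, tax_rate=0.35)
print(f"high-equity WACC: {high_equity:.1%}")  # the costlier structure
print(f"high-debt WACC:   {high_debt:.1%}")
```

Under these assumed inputs the equity-heavy structure carries a materially higher financing cost, which is the paper's core reason for preferring a cash subsidy over a credit.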

  11. The (in)effectiveness of green seals on consumer behavior: an experimental study [doi: 10.21529/RECADM.2017003]

    Directory of Open Access Journals (Sweden)

    Taís Pasquotto Andreoli

    2017-09-01

    The objective of the study was to analyze consumers' perceptions of green seals, according to how well known these seals are and to the consumers' own profile of green consumption. A theoretical review covered sustainable consumption and the application of green seals from the perspective of green marketing. The method adopted was a hypothetical-deductive approach, carried out through two experiments (one online and the other face-to-face), both with a 2 (true and well-known green seal versus false and unknown seal) x 2 (highly sustainable versus low-sustainability consumer profile) factorial design. The results supported four main discussions, including: the possibility of greenwashing due to the explicit mention of environmental aspects, whether in the seal itself or in the statements underlying it; the ineffectiveness of adopting green seals; and the persistence of this ineffectiveness even among consumers who reported a more sustainable consumption profile. Keywords: Sustainable consumption; Green marketing; Green seals; Experimental research; Greenwashing.

  12. Attitude toward contraception and abortion among Curaçao women. Ineffective contraception due to limited sexual education?

    Directory of Open Access Journals (Sweden)

    Meyboom-de Jong Betty

    2011-06-01

    Abstract Background In Curaçao there is a high incidence of unintended pregnancies and induced abortions. Most induced abortions in Curaçao are performed by general practitioners at the woman's request. Induced abortion is strictly prohibited in Curaçao, but since 1999 there has been a policy of connivance. We present data on the relevance of economic and socio-cultural factors for the high abortion rates and the ineffective use of contraception. Methods Structured interviews investigated knowledge of and attitudes toward sexuality, contraception, and abortion, and reasons for ineffective use of contraceptives among women visiting general practitioners. Results Of 158 women, 146 (92%) participated, and 82% reported that their education on sexuality and contraception was of good quality. However, 'knowledge of reliable contraceptive methods' turned out, in almost 50% of the cases, to consist of false information, misjudgements, or erroneous views on the chance of getting pregnant using coitus interruptus and on the reliability and health effects of oral contraceptive pills. Almost half of the interviewed women had incorrect or no knowledge about the reliability of condom use and the IUD. 42% of the respondents risked an unplanned pregnancy through their behavior. Most respondents considered abortion an emergency procedure, not contraception. Almost two-thirds experienced emotional, physical, or social problems after the abortion. Conclusions Respondents had a negative attitude toward reliable contraceptives due to socio-culturally determined ideas about health consequences and limited sexual education. The main economic factor was the cost of contraceptive methods, because most health insurances in Curaçao do not cover contraceptives. To improve the effective use of reliable contraceptives, more adequate information should be given, targeting the wrong beliefs and false information. The government should encourage health insurance companies to reimburse

  13. Environmental geophysics at the Southern Bush River Peninsula, Aberdeen Proving Ground, Maryland

    Energy Technology Data Exchange (ETDEWEB)

    Davies, B.E.; Miller, S.F.; McGinnis, L.D.; and others

    1995-05-01

    Geophysical studies have been conducted at five sites in the southern Bush River Peninsula in the Edgewood Area of Aberdeen Proving Ground, Maryland. The goals of the studies were to identify areas containing buried metallic objects and to provide diagnostic signatures of the hydrogeologic framework of the site. These studies indicate that, during the Pleistocene Epoch, alternating stands of high and low sea level resulted in a complex pattern of channel-fill deposits. Paleochannels of various sizes and orientations have been mapped throughout the study area by means of ground-penetrating radar and EM-31 techniques. The EM-31 paleochannel signatures are represented onshore either by conductivity highs or lows, depending on the depths and facies of the fill sequences. A companion study shows the features as conductivity highs where they extend offshore. This erosional and depositional system is environmentally significant because of the role it plays in the shallow groundwater flow regime beneath the site. Magnetic and electromagnetic anomalies outline surficial and buried debris throughout the areas surveyed. On the basis of geophysical measurements, large-scale (i.e., tens of feet) landfilling has not been found in the southern Bush River Peninsula, though smaller-scale dumping of metallic debris and/or munitions cannot be ruled out.

  14. The existence of propagated sensation along the meridian proved by neuroelectrophysiology

    Science.gov (United States)

    Xu, Jinsen; Zheng, Shuxia; Pan, Xiaohua; Zhu, Xiaoxiang; Hu, Xianglong

    2013-01-01

    Propagated sensation along the meridian can occur when acupoints are stimulated by acupuncture or electrical impulses. In this study, participants with notable propagated sensation along the meridian were given electro-acupuncture at the Jianyu (LI15) acupoint of the large intestine meridian. When participants stated that the sensation reached the back of their hand, regular nervous system action discharge was examined using a physiological recording electrode placed on the superficial branch of the radial nerve. The topographical maps of brain-evoked potential in the primary cortical somatosensory area were also detected. When the Guangming (GB37) acupoint in the lower limb and the Hegu (LI4) acupoint in the upper limb were stimulated, subjects without propagated sensation along the meridian exhibited a high potential reaction in the corresponding area of the brain cortical somatosensory area. For subjects with a notable propagated sensation along the meridian, the reaction area was larger and extended into the face representative area. These electrophysiological measures directly prove the existence of propagated sensation along the meridian, and the peripheral stimulated site is consistent with the corresponding primary cortical somatosensory area, which presents a high potential reaction. PMID:25206574

  15. The mechanism for the formation of boron ineffective zone and its effect on the properties of ultra low carbon bainitic steels

    International Nuclear Information System (INIS)

    Hsieh, Rongiuan; Wang, Shyichin; Liou, Horngyih.

    1993-01-01

    In the manufacturing of Ultra Low Carbon Bainitic (ULCB) steels, boron is a prerequisite alloying element to promote the desired bainitic transformation. In order to obtain this hardenability effect, boron must be in solution and segregate to austenite grain boundaries, thus decreasing the contribution of boundary interfacial energy to ferrite nucleation. During the development of ULCB steels in CSC, a small boron ineffective zone was sometimes found at the center of steel plates. From EPMA and boron autoradiograph analysis, it was found that the formation of this boron ineffective zone was due to center-line segregation of inclusions, which strongly combined with boron and formed a boron-free zone in their vicinity. The microstructure of the boron ineffective zone was conventional ferrite with strength much lower than that of the surrounding bainite. This resulted in the occurrence of separations (splits) in tensile and impact specimens. Also, it was found that hydrogen induced cracking (HIC) has a propensity to propagate along the boron ineffective zone. In welding y-groove tests, a higher cold cracking sensitivity at this boron ineffective zone was also found.

  16. Relevance of mild ineffective oesophageal motility (IOM) and potential pharmacological reversibility of severe IOM in patients with gastro-oesophageal reflux disease.

    Science.gov (United States)

    Fornari, F; Blondeau, K; Durand, L; Rey, E; Diaz-Rubio, M; De Meyer, A; Tack, J; Sifrim, D

    2007-11-15

    Several studies showed high prevalence of ineffective oesophageal motility (IOM) in gastro-oesophageal reflux disease (GERD) and suggested an important role for ineffective oesophageal motility in increased acid exposure. However, impedance-manometric studies proposed that only severe ineffective oesophageal motility might affect oesophageal clearance. (i) To re-assess the relevance of mild IOM in GERD and (ii) to test the reversibility of IOM. Oesophageal motility, clearance and acid exposure were assessed in 191 GERD patients: 99 without IOM; 58 with mild IOM (30-80% ineffective contractions) and 34 with severe IOM (>80% ineffective contractions). In 30 patients with oesophagitis, the potential reversibility of IOM was evaluated with edrophonium intravenously. Patients with mild IOM had identical oesophageal clearance and acid exposure in comparison with those without IOM. Patients with severe IOM had a higher probability of prolonged supine clearance and acid exposure [odds ratio: 2.88 (1.16-7.17); 2.48 (0.99-6.17)]. This effect was independent of the presence of hiatal hernia and male sex. Severe IOM could be transiently reverted in 55% of patients. Mild IOM does not affect oesophageal clearance. Only severe IOM is associated with prolonged clearance and acid exposure, particularly in supine periods. The edrophonium test might be useful to predict severe IOM response to prokinetic medications.
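Odds ratios of the kind reported above (e.g. severe IOM against prolonged supine clearance) come from a 2x2 contingency table; a minimal sketch using the standard Woolf (log) confidence interval follows. The counts are synthetic illustrations, not the study's data.

```python
# Odds ratio with a 95% CI from a 2x2 table (Woolf/log method).
# The cell counts below are synthetic, chosen only for illustration.
import math

def odds_ratio_ci(a, b, c, d, z=1.96):
    """a, b = exposed with/without outcome; c, d = unexposed with/without."""
    or_ = (a * d) / (b * c)
    se = math.sqrt(1/a + 1/b + 1/c + 1/d)   # SE of log(OR)
    lo = math.exp(math.log(or_) - z * se)
    hi = math.exp(math.log(or_) + z * se)
    return or_, lo, hi

# Hypothetical counts: severe-IOM patients vs. others, prolonged clearance or not
or_, lo, hi = odds_ratio_ci(a=18, b=16, c=24, d=75)
print(f"OR = {or_:.2f} (95% CI {lo:.2f}-{hi:.2f})")
```

A CI whose lower bound exceeds 1 (as for the 2.88 estimate in the abstract) indicates a statistically significant association, while the 2.48 estimate with a lower bound of 0.99 just misses it.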

  17. Proving the AGT relation for Nf = 0, 1, 2 antifundamentals

    Science.gov (United States)

    Hadasz, Leszek; Jaskólski, Zbigniew; Suchanek, Paulina

    2010-06-01

    Using recursive relations satisfied by Nekrasov partition functions and by irregular conformal blocks, we prove the AGT correspondence in the case of N = 2 superconformal SU(2) quiver gauge theories with Nf = 0, 1, 2 antifundamental hypermultiplets.

  18. Classification of nonparametric tests: how to apply them in SPSS

    Directory of Open Access Journals (Sweden)

    Vanesa Berlanga-Silvente

    2012-07-01

    Nonparametric tests comprise a family of statistical tests whose common denominator is the absence of assumptions about the probability law followed by the population from which the sample was drawn. For this reason they are commonly referred to as distribution-free tests. The article describes these nonparametric tests, highlighting their rationale and the indications for their use with a single sample (chi-square), two samples with independent data (Mann-Whitney U), two samples with related data (Wilcoxon T), several samples with independent data (Kruskal-Wallis H), and several samples with related data (Friedman).
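The five procedures listed above have direct analogues in Python's scipy.stats, which can serve as a free alternative to SPSS. This sketch assumes SciPy is installed; the samples are small synthetic datasets used only to exercise each test.

```python
# The five nonparametric tests from the article, run via scipy.stats
# on synthetic data (three equal-length samples for the paired tests).
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)
x = rng.normal(0.0, 1.0, 20)
y = rng.normal(0.5, 1.0, 20)
z = rng.normal(1.0, 1.0, 20)

chi2 = stats.chisquare([18, 22, 20])     # one sample (chi-square)
mwu = stats.mannwhitneyu(x, y)           # two independent samples (Mann-Whitney U)
wil = stats.wilcoxon(x, y)               # two related samples (Wilcoxon T)
kw = stats.kruskal(x, y, z)              # several independent samples (Kruskal-Wallis H)
fr = stats.friedmanchisquare(x, y, z)    # several related samples (Friedman)

for name, res in [("chi-square", chi2), ("Mann-Whitney U", mwu),
                  ("Wilcoxon T", wil), ("Kruskal-Wallis H", kw),
                  ("Friedman", fr)]:
    print(f"{name}: statistic = {res.statistic:.3f}, p = {res.pvalue:.3f}")
```

As in SPSS, each call returns a test statistic and a p-value; the paired tests (Wilcoxon, Friedman) require samples of equal length.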

  19. “Deliberate distortion of facts” and the problem of proving bias:

    African Journals Online (AJOL)

    user

    informed observer would reasonably perceive bias on the part of the officer .... represent an excellent illustration of what an Australian Chief Justice once .... the appellants prove that the Justice of Appeal who had no financial or other.

  20. PROVE Land Cover and Leaf Area of Jornada Experimental Range, New Mexico, 1997

    Data.gov (United States)

    National Aeronautics and Space Administration — Field measurement of shrubland ecological properties is important for both site monitoring and validation of remote-sensing information. During the PROVE exercise on...

  2. Environmental radiation monitoring plan for depleted uranium and beryllium areas, Yuma Proving Ground

    International Nuclear Information System (INIS)

    Ebinger, M.H.; Hansen, W.R.

    1994-01-01

    This Environmental Radiation Monitoring Plan (ERM) discusses sampling soils, vegetation, and biota for depleted uranium (DU) and beryllium (Be) at Yuma Proving Ground (YPG). The existing ERM plan was used and modified to more adequately assess the potential for DU and Be migration through the YPG ecosystem. The potential pathways for DU and Be migration are discussed and include soil to vegetation, soil to animals, vegetation to animals, animals to animals, and animals to man. Sample collection will show DU deposition and will be used to estimate DU migration. The number of samples from each area varies and depends on whether the firing range of interest is currently used for DU testing (GP 17A) or not (GP 20). Twenty to thirty-five individual mammals or lizards will be sampled from each transect. Air samples and samples of dust in the airfall will be collected in three locations in the active ranges. Thirty to forty-five sediment samples will be collected from different locations in the arroyos near the impact areas. DU and Be sampling in the Hard Impact and Soft Impact areas changed only slightly from the existing ERM. The modifications are changes in sample locations, addition of two sediment transport locations, addition of vegetation samples, mammal samples, and air sampling from three to five positions on the impact areas. Analysis of samples for DU or total U by inductively coupled plasma mass spectrometry (ICP/MS), alpha spectroscopy, neutron activation analysis (NAA), and kinetic phosphorimetric analysis (KPA) is discussed, and analysis for Be by ICP/MS is recommended. Acquiring total U (no isotope data) from a large number of samples and analyzing those samples with relatively high total U concentrations results in fewer isotopic identifications but more information on U distribution. From previous studies, total U concentrations greater than about 3 times natural background are usually confirmed isotopically to be DU.

  3. Thoracic epidural catheter for postoperative pain control following an ineffective transversus abdominis plane block using liposome bupivacaine

    Directory of Open Access Journals (Sweden)

    Terrien BD

    2017-01-01

    Brian D Terrien (1), David Espinoza (2), Charles C Stehman (3), Gabriel A Rodriguez (1), Nicholas C Connolly (1); (1) Department of Anesthesiology, Naval Medical Center San Diego; (2) Surface Warfare Medical Institute, San Diego; (3) Department of Anesthesiology, Robert E. Bush Naval Hospital, Twenty Nine Palms, CA, USA. Abstract: A 24-year-old female with a history of ulcerative colitis underwent colectomy. The patient received an ineffective transversus abdominis plane (TAP) block with liposome bupivacaine (Exparel) intraoperatively and was started on hydromorphone patient-controlled analgesia 5 hours after the TAP block, which did not relieve her pain. A continuous thoracic epidural (CTE) was then placed after blood levels of bupivacaine were drawn, and the patient immediately experienced significant pain relief. The combined use of liposome bupivacaine and bupivacaine CTE infusion in the postoperative management of this patient demonstrated no safety concerns, provided excellent analgesia, and plasma concentrations of bupivacaine remained far below toxic levels. Keywords: liposome bupivacaine (bupivacaine liposome injectable suspension), plasma bupivacaine levels, transversus abdominis plane (TAP) nerve block, thoracic epidural

  4. Convective Leakage Makes Heparin Locking of Central Venous Catheters Ineffective Within Seconds: Experimental Measurements in a Model Superior Vena Cava.

    Science.gov (United States)

    Barbour, Michael C; McGah, Patrick M; Ng, Chin H; Clark, Alicia M; Gow, Kenneth W; Aliseda, Alberto

    2015-01-01

    Central venous catheters (CVCs), placed in the superior vena cava (SVC) for hemodialysis or chemotherapy, are routinely filled while not in use with heparin, an anticoagulant, to maintain patency and prevent thrombus formation at the catheter tip. The heparin-locking procedure, however, places the patient at risk for systemic bleeding, as heparin is known to leak from the catheter into the blood stream. We provide evidence from detailed in vitro experiments that shows the driving mechanism behind heparin leakage to be convective-diffusive transport due to the pulsatile flow surrounding the catheter. This novel mechanism is supported by experimental planar laser-induced fluorescence (PLIF) and particle image velocimetry (PIV) measurements of flow velocity and heparin transport from a CVC placed inside a model SVC inside a pulsatile flow loop. The results predict an initial, fast (<10 s), convection-dominated phase that rapidly depletes the concentration of heparin in the near-tip region, the region of the catheter with side holes. This is followed by a slow, diffusion-limited phase inside the catheter lumen, where the concentration is still high, that is insufficient at replenishing the lost heparin concentration in the near-tip region. The results presented here, which are consistent with previous in vivo estimates of 24 hour leakage rates, predict that the concentration of heparin in the near-tip region is essentially zero for the majority of the interdialytic phase, rendering the heparin locking procedure ineffective.
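The claim above that the early phase is convection-dominated can be sanity-checked with a Péclet number estimate, which compares convective to diffusive transport rates. The velocity, length scale, and diffusivity below are rough order-of-magnitude assumptions, not measurements from the study.

```python
# Peclet number sketch: Pe = U * L / D. Pe >> 1 means convection dominates
# diffusion at the chosen length scale. All three values are rough
# order-of-magnitude assumptions for flow near a catheter tip.
U = 0.1    # characteristic blood velocity near the tip, m/s (assumed)
L = 1e-3   # side-hole length scale, m (assumed)
D = 1e-9   # diffusivity of heparin in plasma, m^2/s (assumed)

Pe = U * L / D
print(f"Pe = {Pe:.0e}")  # many orders of magnitude above 1
```

With Pe on the order of 10^5 under these assumptions, a fast convective washout of the near-tip heparin followed by a slow diffusion-limited phase inside the lumen is exactly what one would expect.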

  5. Improved recognition of ineffective chest compressions after a brief Crew Resource Management (CRM) training: a prospective, randomised simulation study.

    Science.gov (United States)

    Haffner, Leopold; Mahling, Moritz; Muench, Alexander; Castan, Christoph; Schubert, Paul; Naumann, Aline; Reddersen, Silke; Herrmann-Werner, Anne; Reutershan, Jörg; Riessen, Reimer; Celebi, Nora

    2017-03-03

    Chest compressions are a core element of cardio-pulmonary resuscitation. Despite periodic training, real-life chest compressions have been reported to be overly shallow and/or fast, very likely affecting patient outcomes. We investigated the effect of a brief Crew Resource Management (CRM) training program on the correction rate of improperly executed chest compressions in a simulated cardiac arrest scenario. Final-year medical students (n = 57) were randomised to receive a 10-min computer-based CRM training or a control training on ethics. Acting as team leaders, subjects performed resuscitation in a simulated cardiac arrest scenario before and after the training. Team members performed standardised overly shallow and fast chest compressions. We analysed how often the team leader recognised and corrected improper chest compressions, as well as communication and resuscitation quality. After the CRM training, team leaders corrected improper chest compressions (35.5%) significantly more often compared with those undergoing control training (7.7%, p = 0.03*). Consequently, four students have to be trained (number needed to treat = 3.6) for one improved chest compression scenario. Communication quality assessed by the Leader Behavior Description Questionnaire significantly increased in the intervention group by a mean of 4.5 compared with 2.0 (p = 0.01*) in the control group. A computer-based, 10-min CRM training improved the recognition of ineffective chest compressions. Furthermore, communication quality increased. As guideline-adherent chest compressions have been linked to improved patient outcomes, our CRM training might represent a brief and affordable approach to increase chest compression quality and potentially improve patient outcomes.
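The number-needed-to-treat figure quoted above follows directly from the two correction rates: NNT is the reciprocal of the absolute difference between them. A minimal check:

```python
# Number needed to treat (NNT) from the reported correction rates:
# NNT = 1 / ARR, where ARR is the absolute rate difference.
crm_rate = 0.355       # corrected compressions after CRM training
control_rate = 0.077   # corrected compressions after control training

arr = crm_rate - control_rate
nnt = 1.0 / arr
print(f"ARR = {arr:.3f}, NNT = {nnt:.1f}")  # NNT rounds to 3.6, as reported
```

Rounding 3.6 up to a whole person gives the abstract's statement that four students must be trained per additional corrected scenario.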

  6. Early home treatment of childhood fevers with ineffective antimalarials is deleterious in the outcome of severe malaria

    Directory of Open Access Journals (Sweden)

    Olumese Peter E

    2008-07-01

    Abstract Background Early diagnosis and prompt treatment, including appropriate home-based treatment of malaria, is a major strategy for malaria control. A major determinant of clinical outcome in case management is compliance and adherence to an effective antimalarial regimen. Home-based malaria treatment with inappropriate medicines is ineffective, and there is insufficient evidence on how this contributes to the outcome of severe malaria. This study evaluated the effects of pre-hospital antimalarial drug use on the presentation and outcome of severe malaria in children in Ibadan, Nigeria. Methods Two hundred and sixty-eight children with a median age of 30 months, comprising 114 children with cerebral malaria and 154 with severe malarial anaemia (as defined by WHO), were prospectively enrolled. Socio-demographic data, treatments given at home, clinical course, and outcome of admission were collected and analysed. Results A total of 168 children had been treated with an antimalarial at home before presenting at the hospital when there was no improvement. There were no significant differences in the haematocrit levels, parasite counts, and nutritional status of the pre-hospital treated and untreated groups. The most commonly used antimalarial medicine was chloroquine. Treatment policy was revised to Artemisinin-based Combination Therapy (ACT) in 2005 as a response to unacceptable levels of therapeutic failures with chloroquine; however, chloroquine use remains high. The risk of presenting as cerebral malaria was 1.63 times higher with pre-hospital use of chloroquine for treatment of malaria, with a four-fold increase in the risk of mortality. Controlling for other confounding factors including age and clinical severity, pre-hospital treatment with chloroquine was an independent predictor of mortality. Conclusion This study showed that home treatment with chloroquine significantly impacts the outcome of severe malaria. This finding

  7. Proving the correctness of unfold/fold program transformations using bisimulation

    DEFF Research Database (Denmark)

    Hamilton, Geoff W.; Jones, Neil

    2011-01-01

    This paper shows that a bisimulation approach can be used to prove the correctness of unfold/fold program transformation algorithms. As an illustration, we show how our approach can be used to prove the correctness of positive supercompilation (due to Sørensen et al). Traditional program equivalence is captured by a labelled transition system whose bisimilarity relation is a congruence that coincides with contextual equivalence. Labelled transition systems are well-suited to represent global program behaviour. On the other hand, unfold/fold program transformations use generalization and folding, and neither is easy to describe contextually, due to use of non-local information. We show that weak bisimulation on labelled transition systems gives an elegant framework to prove contextual equivalence of original and transformed programs. One reason is that folds can be seen in the context of corresponding unfolds.
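The bisimulation idea can be illustrated with a small sketch: naive partition refinement computing bisimulation equivalence classes on a toy labelled transition system. Note this is the simpler strong variant, for illustration only; the paper itself relies on weak bisimulation, which additionally abstracts internal steps.

```python
# Strong bisimulation via naive partition refinement on a toy labelled
# transition system (LTS). States are bisimilar iff they end up in the
# same block when no block can be split further.
def bisimulation_classes(states, transitions):
    """transitions: dict mapping each state to a set of (label, successor)."""
    blocks = [set(states)]            # start with one block of all states
    changed = True
    while changed:
        changed = False
        block_of = {s: i for i, blk in enumerate(blocks) for s in blk}

        def signature(s):
            # The set of (label, successor-block) moves a state can make.
            return frozenset((lab, block_of[t]) for lab, t in transitions[s])

        new_blocks = []
        for blk in blocks:            # split each block by signature
            groups = {}
            for s in blk:
                groups.setdefault(signature(s), set()).add(s)
            new_blocks.extend(groups.values())
        if len(new_blocks) != len(blocks):
            changed = True
        blocks = new_blocks
    return blocks

# p and q loop on 'a' forever; r does a single 'a' step and stops.
lts = {"p": {("a", "q")}, "q": {("a", "p")}, "r": {("a", "s")}, "s": set()}
classes = bisimulation_classes(lts.keys(), lts)
print(sorted(sorted(c) for c in classes))  # [['p', 'q'], ['r'], ['s']]
```

Here p and q are bisimilar (each can always match the other's 'a' step), while r is distinguished because its successor is a deadlocked state.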

  8. A practical approach to proving waste metals suitable for consignment as radiologically exempt materials - 59266

    International Nuclear Information System (INIS)

    Carvel, Iain; Gunn, Richard D.; Orr, Christopher H.; Strange, Robin

    2012-01-01

    Building 220 at Harwell was built by the Ministry of Works as a Radiochemical Research and Development facility in the latter part of the 1940s. The facility has been operational since 1949 and has been extended several times, most notably with the Plutonium Glove Box Wing in the 1950s and the Remote Handling Wing in the 1980s. Only the Remote Handling Wing remains operational, processing Historic Waste which is being recovered from storage holes elsewhere on site. The remainder of the facility is undergoing progressive strip-out and decommissioning. In the Plutonium Wing and associated areas the waste 'fingerprint' (nuclide vector) consists predominantly of alpha-emitting radionuclides. Decommissioning and Decontamination (D and D) operations often result in the production of large volumes of scrap metal waste with little or no radioactive contamination. Proving that the waste is clean can be costly and time consuming, as the shape and size of the metallic waste items often means that it is difficult or impossible to monitor all surfaces using conventional hand-held survey meters. This is a particular problem for alpha contamination measurement. Traditional radiological surveying techniques are very labour intensive and involve surveyors checking every surface using hand-held instruments and smear-sampling the hard-to-access areas. Even then 100% monitoring cannot be guaranteed. An alternative to traditional methods is the Long Range Alpha Detection (LRAD) technique, which remotely detects and measures secondary ionization created in air by alpha particle interactions, allowing extremely low levels of alpha contamination to be measured. A survey system, IonSens®, using the LRAD technique, was developed by BNFL Instruments Ltd (now Babcock Nuclear), which allows rapid surveying of scrap metal for alpha contamination at very low levels. Two versions of this system exist but both essentially comprise a measurement chamber into which scrap metal is placed and sealed

  9. The in-pile proving test for fuel assembly of Qinshan nuclear power plant

    International Nuclear Information System (INIS)

    Chen Dianshan; Zhang Shucheng; Kang Rixin; Wang Huarong; Chen Guanghan

    1989-10-01

    The in-pile proving test for fuel assembly of the Qinshan nuclear power plant was conducted in the experimental loop of the HWRR at the IAE (Institute of Atomic Energy) in Beijing, China, from January 1985 to December 1986. An average burnup of 27000 MWd/tU and a peak burnup of 34000 MWd/tU of the fuel rods were reached. The basic status of the experiment is described, with emphasis placed on the discussion of the proving test parameters and the analysis of the experimental results.

  10. The effect of herbal formula PROVE 1 and Stevia levels in diets on diet utilization of growing pigs

    Directory of Open Access Journals (Sweden)

    Kooprasert, S.

    2007-05-01

    Full Text Available The objective of this experiment was to study the effect of 0.2% antibiotic (ascomix-s®; one kilogram contains lincomycin hydrochloride 44 g and sulfamethazine 110 g) or 0.25% herbal formula PROVE 1, combined with five levels of Stevia supplementation in the diets, on digestibility in pigs. Two factors were investigated: (1) type of drug (0.2% antibiotic or 0.25% herbal formula PROVE 1) and (2) five Stevia levels (0, 0.2, 0.4, 0.6 and 0.8%), giving 10 dietary treatments. Ten related growing crossbred (Large White x Landrace) barrow pigs (30±1.5 kg body weight) were raised in individual metabolism cages for three collecting periods (30, 40 and 50 kg body weight); each pig was fed one experimental diet throughout the collecting period. The results showed that pigs fed the diet with either 0.2% antibiotic or 0.25% herbal formula PROVE 1 had similar digestibility of diet, crude protein (CP), fiber, ash and nitrogen-free extract (NFE) (89.01 vs 87.83, 94.96 vs 94.23, 60.73 vs 59.03, 61.22 vs 60.44 and 93.28 vs 92.03%, respectively). Negligible differences were observed between 0 and 0.4% Stevia supplementation, but both levels showed better digestibility than the other levels of Stevia supplementation, and the diet with 0.4% Stevia supplementation had the highest digestibility of diet, CP, fiber, ash and NFE (91.04, 96.43, 69.48, 70.47 and 94.07%, respectively). The diet with antibiotic combined with 0.4% Stevia had better digestibility of diet, CP, fat and fiber than the other levels of Stevia supplementation; in particular, digestibility of ash was significantly higher than that of the diet with 0.2% Stevia, but not significantly different from the other levels of Stevia supplementation. The diet with herbal formula PROVE 1 combined with 0% Stevia had the highest digestibility of ash (72.90%), significantly higher than the other levels of Stevia supplementation, except the diet with herbal formula PROVE 1 combined with 0.4% Stevia supplementation

  11. Origin of choriocarcinoma in previous molar pregnancy proved by DNA analysis

    International Nuclear Information System (INIS)

    Vojtassak, J.; Repiska, V.; Konecna, B.; Zajac, V.; Korbel, M.; Danihel, L.

    1996-01-01

    A 17-year old woman had in a short time period (seven months) a very exciting reproduction history. Molar pregnancy in December 1993, choriocarcinoma in January 1994 and induced abortion in June 1994. DNA analysis proved the origin of the choriocarcinoma in the previous molar pregnancy. (author)

  12. Automatically Proving Termination and Memory Safety for Programs with Pointer Arithmetic

    DEFF Research Database (Denmark)

    Ströder, Thomas; Giesl, Jürgen; Brockschmidt, Marc

    2017-01-01

    While automated verification of imperative programs has been studied intensively, proving termination of programs with explicit pointer arithmetic fully automatically was still an open problem. To close this gap, we introduce a novel abstract domain that can track allocated memory in detail. We use...

  13. Wind tunnel experiments to prove a hydraulic passive torque control concept for variable speed wind turbines

    NARCIS (Netherlands)

    Diepeveen, N.F.B.; Jarquin-Laguna, A.

    2014-01-01

    In this paper the results are presented of experiments to prove an innovative concept for passive torque control of variable speed wind turbines using fluid power technology. It is demonstrated that by correctly configuring the hydraulic drive train, the wind turbine rotor operates at or near

  14. 20 CFR 416.1603 - How to prove you are a resident of the United States.

    Science.gov (United States)

    2010-04-01

    ... 20 Employees' Benefits 2 2010-04-01 2010-04-01 false How to prove you are a resident of the United States. 416.1603 Section 416.1603 Employees' Benefits SOCIAL SECURITY ADMINISTRATION SUPPLEMENTAL... as— (1) Property, income, or other tax forms or receipts; (2) Utility bills, leases or rent payment...

  15. Using eternity variables to specify and prove a serializable database interface

    NARCIS (Netherlands)

    Hesselink, Wim H.

    Eternity variables are introduced to specify and verify serializability of transactions of a distributed database. Eternity variables are a new kind of auxiliary variables. They do not occur in the implementation but are used in specification and verification. Elsewhere it has been proved that

  16. Proving termination of graph transformation systems using weighted type graphs over semirings

    NARCIS (Netherlands)

    Bruggink, H.J.S.; König, B.; Nolte, D.; Zantema, H.; Parisi-Presicce, F.; Westfechtel, B.

    2015-01-01

    We introduce techniques for proving uniform termination of graph transformation systems, based on matrix interpretations for string rewriting. We generalize this technique by adapting it to graph rewriting instead of string rewriting and by generalizing to ordered semirings. In this way we obtain a
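The technique being generalized can be illustrated in miniature. The sketch below is hypothetical and one-dimensional, i.e. linear polynomial interpretations over the semiring (N, +, ×), whereas the paper works with weighted type graphs over general ordered semirings and graph rewriting rather than strings:

```python
import numpy as np

# Toy matrix-interpretation termination check for string rewriting.
# Each letter is interpreted as an affine map x -> A x + b over N^d;
# a string is interpreted by composing its letters' maps. A rule l -> r
# is oriented (strictly decreasing) if A_l >= A_r entrywise and the
# first component of b_l strictly exceeds that of b_r.

def interpret(word, interp):
    # Compose affine maps left to right: applying (A1,b1) after (A2,b2)
    # yields x -> A1 (A2 x + b2) + b1.
    d = next(iter(interp.values()))[0].shape[0]
    A, b = np.eye(d, dtype=int), np.zeros(d, dtype=int)
    for letter in word:
        Al, bl = interp[letter]
        A, b = A @ Al, A @ bl + b
    return A, b

def strictly_decreasing(lhs, rhs, interp):
    Al, bl = interpret(lhs, interp)
    Ar, br = interpret(rhs, interp)
    return bool(np.all(Al >= Ar) and bl[0] > br[0])

# Interpret 'a' as x -> x + 1; then "aa" maps x -> x + 2, so the rule
# aa -> a is strictly decreasing and hence terminating.
interp = {"a": (np.array([[1]]), np.array([1]))}
print(strictly_decreasing("aa", "a", interp))  # -> True
print(strictly_decreasing("a", "aa", interp))  # -> False
```

Higher-dimensional interpretations (genuine matrices) can orient rules, such as permutative ones, that no one-dimensional interpretation handles; the paper's contribution is the generalization of this ordering argument to graphs and semirings.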

  17. Proof and Proving: Logic, Impasses, and the Relationship to Problem Solving

    Science.gov (United States)

    Savic, Milos

    2012-01-01

    Becoming a skillful prover is critical for success in advanced undergraduate and graduate mathematics courses. In this dissertation, I report my investigations of proof and the proving process in three separate studies. In the first study, I examined the amount of logic used in student-constructed proofs to help in the design of…

  18. The Secret Prover : Proving Possession of Arbitrary Files While not Giving Them Away

    NARCIS (Netherlands)

    Teepe, Wouter

    2005-01-01

    The Secret Prover is a Java application which allows a user (A) to prove to another user (B), that A possesses a file. If B also possesses this file B will get convinced, and if B does not possess this file B will gain no information on (the contents of) this file. This is the first implementation

  19. Searching for fixed point combinators by using automated theorem proving: A preliminary report

    International Nuclear Information System (INIS)

    Wos, L.; McCune, W.

    1988-09-01

    In this report, we establish that the use of an automated theorem-proving program to study deep questions from mathematics and logic is indeed an excellent move. Among such problems, we focus mainly on that concerning the construction of fixed point combinators---a problem considered by logicians to be significant and difficult to solve, and often computationally intensive and arduous. To be a fixed point combinator, Θ must satisfy the equation Θx = x(Θx) for all combinators x. The specific questions on which we focus most heavily ask, for each chosen set of combinators, whether a fixed point combinator can be constructed from the members of that set. For answering questions of this type, we present a new, sound, and efficient method, called the kernel method, which can be applied quite easily by hand and very easily by an automated theorem-proving program. For the application of the kernel method by a theorem-proving program, we illustrate the vital role that is played by both paramodulation and demodulation---two of the powerful features frequently offered by an automated theorem-proving program for treating equality as if it is "understood." We also state a conjecture that, if proved, establishes the completeness of the kernel method. From what we can ascertain, this method---which relies on the introduced concepts of kernel and superkernel---offers the first systematic approach for searching for fixed point combinators. We successfully apply the new kernel method to various sets of combinators and, for the set consisting of the combinators B and W, construct an infinite set of fixed point combinators such that no two of the combinators are equal even in the presence of extensionality---a law that asserts that two combinators are equal if they behave the same. 18 refs
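The defining equation Θx = x(Θx) can be demonstrated concretely. The sketch below is an illustration of that equation only, not of the kernel method; it uses the call-by-value variant of Curry's Y combinator, rendered in Python:

```python
# A fixed point combinator Theta satisfies Theta x = x (Theta x) for all x.
# Under Python's strict (call-by-value) evaluation, the eta-expansion
# (lambda v: x(x)(v)) delays evaluation so the definition terminates.

def Y(f):
    return (lambda x: f(lambda v: x(x)(v)))(lambda x: f(lambda v: x(x)(v)))

# A non-recursive "step" functional; Y ties the recursive knot for it,
# i.e. fact = fact_step(fact), an instance of Theta x = x(Theta x).
fact_step = lambda rec: lambda n: 1 if n == 0 else n * rec(n - 1)

fact = Y(fact_step)
print(fact(5))   # -> 120
```

Note that this is the eta-expanded (Z) form; the textbook Y combinator written directly would loop forever under call-by-value evaluation.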

  20. Searching for fixed point combinators by using automated theorem proving: A preliminary report

    Energy Technology Data Exchange (ETDEWEB)

    Wos, L.; McCune, W.

    1988-09-01

    In this report, we establish that the use of an automated theorem-proving program to study deep questions from mathematics and logic is indeed an excellent move. Among such problems, we focus mainly on that concerning the construction of fixed point combinators---a problem considered by logicians to be significant and difficult to solve, and often computationally intensive and arduous. To be a fixed point combinator, THETA must satisfy the equation THETAx = x(THETAx) for all combinators x. The specific questions on which we focus most heavily ask, for each chosen set of combinators, whether a fixed point combinator can be constructed from the members of that set. For answering questions of this type, we present a new, sound, and efficient method, called the kernel method, which can be applied quite easily by hand and very easily by an automated theorem-proving program. For the application of the kernel method by a theorem-proving program, we illustrate the vital role that is played by both paramodulation and demodulation---two of the powerful features frequently offered by an automated theorem-proving program for treating equality as if it is "understood." We also state a conjecture that, if proved, establishes the completeness of the kernel method. From what we can ascertain, this method---which relies on the introduced concepts of kernel and superkernel---offers the first systematic approach for searching for fixed point combinators. We successfully apply the new kernel method to various sets of combinators and, for the set consisting of the combinators B and W, construct an infinite set of fixed point combinators such that no two of the combinators are equal even in the presence of extensionality---a law that asserts that two combinators are equal if they behave the same. 18 refs.

  1. Patient profiling can identify patients with adult spinal deformity (ASD) at risk for conversion from nonoperative to surgical treatment: initial steps to reduce ineffective ASD management.

    Science.gov (United States)

    Passias, Peter G; Jalai, Cyrus M; Line, Breton G; Poorman, Gregory W; Scheer, Justin K; Smith, Justin S; Shaffrey, Christopher I; Burton, Douglas C; Fu, Kai-Ming G; Klineberg, Eric O; Hart, Robert A; Schwab, Frank; Lafage, Virginie; Bess, Shay

    2018-02-01

    Non-operative management is a common initial treatment for patients with adult spinal deformity (ASD) despite reported superiority of surgery with regard to outcomes. Ineffective medical care is a large source of resource drain on the health system. Characterization of patients with ASD likely to elect for operative treatment from non-operative management may allow for more efficient patient counseling and cost savings. This study aimed to identify deformity and disability characteristics of patients with ASD who ultimately convert to operative treatment compared with those who remain non-operative and those who initially choose surgery. A retrospective review was carried out. A total of 510 patients with ASD (189 non-operative, 321 operative) with minimum 2-year follow-up comprised the patient sample. Oswestry Disability Index (ODI), Short-Form 36 Health Assessment (SF-36), Scoliosis Research Society questionnaire (SRS-22r), and spinopelvic radiographic alignment were the outcome measures. Demographic, radiographic, and patient-reported outcome measures (PROMs) from a cohort of patients with ASD prospectively enrolled into a multicenter database were evaluated. Patients were divided into three treatment cohorts: Non-operative (NON=initial non-operative treatment and remained non-operative), Operative (OP=initial operative treatment), and Crossover (CROSS=initial non-operative treatment with subsequent conversion to operative treatment). NON and OP groups were propensity score-matched (PSM) to CROSS for baseline demographics (age, body mass index, Charlson Comorbidity Index). Time to crossover was divided into early (<1 year) and late (>1 year). Outcome measures were compared across and within treatment groups at four time points (baseline, 6 weeks, 1 year, and 2 years). Following PSM, 118 patients were included (NON=39, OP=38, CROSS=41). Crossover rate was 21.7% (41/189). Mean time to crossover was 394 days. All groups had similar baseline sagittal alignment, but CROSS had larger

  2. Correlation of Fault Size, Moment Magnitude, and Tsunami Height to Proved Paleo-tsunami Data in Sulawesi Indonesia

    Science.gov (United States)

    Julius, A. M.; Pribadi, S.

    2016-02-01

    Sulawesi island (Indonesia) is located at the meeting of three large plates, i.e. Indo-Australia, Pacific, and Eurasia. This configuration puts the region at high risk of tsunamis triggered by earthquakes and by sea-floor landslides. The NOAA and Russian Tsunami Laboratory databases show more than 20 tsunamis recorded in Sulawesi since 1820. Based on these data, the correlations between the tsunami parameters need to be determined to confirm the events of the past. The magnitudes, fault sizes and tsunami heights used in this study are sourced from the NOAA and Russian Tsunami databases, completed with the Pacific Tsunami Warning Center (PTWC) catalog. This study aims to find the correlation between fault area, moment magnitude, and tsunami height by simple regression for Sulawesi. The steps of this research are data collection, processing, and regression analysis. Results show very good correlation: moment magnitude, tsunami height, and the fault parameters, i.e. length, width, and slip, correlate linearly. As fault area increases, tsunami height and moment magnitude also increase; as moment magnitude increases, tsunami height also increases. This analysis is sufficient to confirm that the Sulawesi tsunami parameters cataloged by NOAA, the Russian Tsunami Laboratory and PTWC are correct. Keywords: tsunami, magnitude, height, fault
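The simple-regression step described above can be sketched as follows. The numbers below are synthetic placeholders for illustration, not the NOAA/PTWC catalog data:

```python
import numpy as np

# Hypothetical sketch of the regression step: fit a linear relation
# between moment magnitude (Mw) and tsunami height, and report the
# correlation coefficient. Data values are made up for illustration.
mw     = np.array([6.3, 6.8, 7.0, 7.5, 7.9, 8.1])
height = np.array([0.4, 1.1, 1.5, 2.8, 4.2, 5.0])   # metres

slope, intercept = np.polyfit(mw, height, 1)   # least-squares line
r = np.corrcoef(mw, height)[0, 1]              # Pearson correlation

print(f"height ~ {slope:.2f}*Mw + {intercept:.2f}, r = {r:.3f}")
```

The same fit can be repeated for each pair of parameters (fault length, width, slip vs. magnitude or height) to check that the relations are approximately linear, as the abstract reports.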

  3. Mathematical Understanding and Proving Abilities: Experiment With Undergraduate Student By Using Modified Moore Learning Approach

    Directory of Open Access Journals (Sweden)

    Rippi Maya

    2011-07-01

    Full Text Available This paper reports findings of a post-test experimental control-group design conducted to investigate the role of the modified Moore learning approach in improving students' mathematical understanding and proving abilities. Subjects of the study were 56 undergraduate students of one state university in Bandung, who took an advanced abstract algebra course. Instruments of the study were a test of mathematical understanding ability, a test of mathematical proving ability, and a scale of students' opinions on the modified Moore learning approach. Data were analyzed by using two-way ANOVA. The study found that the proof construction process was more difficult than the mathematical understanding task for all students, and that students still faced some difficulties in constructing mathematical proofs. The study also found no differences between students' mathematical understanding and proving abilities in the two classes, and both abilities were classified as mediocre. However, in the modified Moore learning approach class there were more students who got above-average grades on mathematical understanding than in the conventional class. Moreover, students expressed positive opinions toward the modified Moore learning approach. They were active in questioning and solving problems, and in explaining their work in front of the class, while students in conventional teaching preferred to listen to the lecturer's explanation. The study also found no interaction between learning approach and students' prior mathematics ability on mathematical understanding and proving abilities, but a fairly strong association between students' mathematical understanding and proving abilities. Keywords: modified Moore learning approach, mathematical understanding ability, mathematical proving ability. DOI: http://dx.doi.org/10.22342/jme.2.2.751.231-250

  4. Impact of cooking, proving, and baking on the (poly)phenol content of wild blueberry.

    Science.gov (United States)

    Rodriguez-Mateos, Ana; Cifuentes-Gomez, Tania; George, Trevor W; Spencer, Jeremy P E

    2014-05-07

    Accumulating evidence suggests that diets rich in (poly)phenols may have positive effects on human health. Currently there is limited information regarding the effects of processing on the (poly)phenolic content of berries, in particular in processes related to the baking industry. This study investigated the impact of cooking, proving, and baking on the anthocyanin, procyanidin, flavonol, and phenolic acid contents of wild blueberry using HPLC with UV and fluorescence detection. Anthocyanin levels decreased during cooking, proving, and baking, whereas no significant changes were observed for total procyanidins. However, lower molecular weight procyanidins increased and high molecular weight oligomers decreased during the process. Quercetin and ferulic and caffeic acid levels remained constant, whereas increases were found for chlorogenic acid. Due to their possible health benefits, a better understanding of the impact of processing is important to maximize the retention of these phytochemicals in berry-containing products.

  5. Research in advanced formal theorem-proving techniques. [design and implementation of computer languages

    Science.gov (United States)

    Raphael, B.; Fikes, R.; Waldinger, R.

    1973-01-01

    This report summarises the results of a project aimed at the design and implementation of computer languages to aid in expressing problem-solving procedures in several areas of artificial intelligence, including automatic programming, theorem proving, and robot planning. The principal results of the project were the design and implementation of two complete systems, QA4 and QLISP, and their preliminary experimental use. The various applications of both QA4 and QLISP are given.

  6. Remedial investigation report for J-Field, Aberdeen Proving Ground, Maryland. Volume 3: Ecological risk assessment

    International Nuclear Information System (INIS)

    Hlohowskyj, I.; Hayse, J.; Kuperman, R.; Van Lonkhuyzen, R.

    2000-01-01

    The Environmental Management Division of the U.S. Army Aberdeen Proving Ground (APG), Maryland, is conducting a remedial investigation (RI) and feasibility study (FS) of the J-Field area at APG, pursuant to the Comprehensive Environmental Response, Compensation, and Liability Act (CERCLA), as amended. As part of that activity, Argonne National Laboratory (ANL) conducted an ecological risk assessment (ERA) of the J-Field site. This report presents the results of that assessment

  7. Remedial investigation report for J-Field, Aberdeen Proving Ground, Maryland. Volume 3: Ecological risk assessment

    Energy Technology Data Exchange (ETDEWEB)

    Hlohowskyj, I.; Hayse, J.; Kuperman, R.; Van Lonkhuyzen, R.

    2000-02-25

    The Environmental Management Division of the U.S. Army Aberdeen Proving Ground (APG), Maryland, is conducting a remedial investigation (RI) and feasibility study (FS) of the J-Field area at APG, pursuant to the Comprehensive Environmental Response, Compensation, and Liability Act (CERCLA), as amended. As part of that activity, Argonne National Laboratory (ANL) conducted an ecological risk assessment (ERA) of the J-Field site. This report presents the results of that assessment.

  8. JPSS Preparations at the Satellite Proving Ground for Marine, Precipitation, and Satellite Analysis

    Science.gov (United States)

    Folmer, Michael J.; Berndt, E.; Clark, J.; Orrison, A.; Kibler, J.; Sienkiewicz, J.; Nelson, J.; Goldberg, M.; Sjoberg, W.

    2016-01-01

    The Ocean Prediction Center, the National Hurricane Center's Tropical Analysis and Forecast Branch, the Weather Prediction Center and the Satellite Analysis Branch of NESDIS make up the Satellite Proving Ground for Marine, Precipitation and Satellite Analysis. These centers had early exposure to JPSS products using the S-NPP satellite, which was launched in 2011. Forecasters continue to evaluate new products in anticipation of the launch of JPSS-1, expected sometime in 2017.

  9. Conceptualizing reasoning-and-proving opportunities in textbook expositions : Cases from secondary calculus

    OpenAIRE

    Bergwall, Andreas

    2017-01-01

    Several recent textbook studies focus on opportunities to learn reasoning-and-proving. They typically investigate the extent to which justifications are general proofs and what opportunities exist for learning important elements of mathematical reasoning. In this paper, I discuss how a particular analytical framework for this might be refined. Based on an in-depth analysis of certain textbook passages in upper secondary calculus textbooks, I give an account of analytical issues encountered d...

  10. Lessons from the conviction of the L'Aquila seven: The standard probabilistic earthquake hazard and risk assessment is ineffective

    Science.gov (United States)

    Wyss, Max

    2013-04-01

    being incorrect for scientific reasons, and here I argue that it is also ineffective for psychological reasons. Instead of being calmed, or being given hazard underestimates for strongly active areas as with the GSHAP approach, people should be told quantitatively the consequences of the reasonably worst case and be motivated to prepare for it, whether or not it may hit the present or the next generation. In a worst-case scenario for L'Aquila, the number of expected fatalities and injured should have been calculated for an event in the range of M6.5 to M7, as I did for a civil defense exercise in Umbria, Italy. With the prospect that approximately 500 people may die in an earthquake in the immediate or distant future, some residents might have built themselves an earthquake closet (similar to a simple tornado shelter) in a corner of their apartment, into which they might have dashed to safety at the onset of the P-wave, before the destructive S-wave arrived. I conclude that in earthquake-prone areas quantitative loss estimates due to a reasonable worst-case earthquake should replace probabilistic hazard and risk estimates. This is a service which experts owe the community. Insurance companies and academics may still find use for probabilistic estimates of losses, especially in areas of low seismic hazard, where the worst-case scenario approach is less appropriate.

  11. SPoRT's Participation in the GOES-R Proving Ground Activity

    Science.gov (United States)

    Jedlovec, Gary; Fuell, Kevin; Smith, Matthew; Stano, Geoffrey; Molthan, Andrew

    2011-01-01

    The next-generation geostationary satellite, GOES-R, will carry two new instruments with unique atmospheric and surface observing capabilities, the Advanced Baseline Imager (ABI) and the Geostationary Lightning Mapper (GLM), to study short-term weather processes. The ABI will bring enhanced multispectral observing capabilities with frequent refresh rates for regional and full-disk coverage to geostationary orbit to address many existing and new forecast challenges. The GLM will, for the first time, provide continuous monitoring of total lightning flashes over a hemispherical region from space. NOAA established the GOES-R Proving Ground activity several years ago to demonstrate the new capabilities of these instruments and to prepare forecasters for their day-one use. Proving Ground partners work closely with algorithm developers and the end-user community to develop and transition proxy data sets representing GOES-R observing capabilities. This close collaboration helps to refine algorithms, leading to the delivery of products that effectively address forecast challenges. The NASA Short-term Prediction Research and Transition (SPoRT) program has been a participant in the NOAA GOES-R Proving Ground activity, developing and disseminating selected GOES-R proxy products to collaborating WFOs and National Centers. Established in 2002 to demonstrate the weather and forecasting application of real-time EOS measurements, the SPoRT program has grown to be an end-to-end research-to-operations activity focused on the use of advanced NASA modeling and data assimilation approaches, nowcasting techniques, and unique high-resolution multispectral data from EOS satellites to improve short-term weather forecasts on a regional and local scale. Participation in the Proving Ground activities extends SPoRT's activities and taps its experience and expertise in diagnostic weather analysis, short-term weather forecasting, and the transition of research and experimental

  12. 76 FR 50771 - Submission for Review: RI 25-37, Evidence To Prove Dependency of a Child, 3206-0206

    Science.gov (United States)

    2011-08-16

    ... OFFICE OF PERSONNEL MANAGEMENT Submission for Review: RI 25-37, Evidence To Prove Dependency of a...) 3206-0206, Evidence to Prove Dependency of a Child. As required by the Paperwork Reduction Act of 1995... or faxed to (202) 395-6974. SUPPLEMENTARY INFORMATION: Evidence to Prove Dependency of a Child is...

  13. How cytogenetical methods help victims prove radiation exposure and claim right for social support: NCERM experience

    International Nuclear Information System (INIS)

    Aleksanin, S.; Slozina, N.; Neronova, E.; Smoliakov, E.

    2011-01-01

    Russian citizens who were irradiated as a result of radiation disasters, nuclear weapons testing and some other sources have a right to certain social support and financial compensation. In order to receive this compensation, people have to prove that they were irradiated. However, for a variety of reasons, not all victims have formal documents. They therefore apply for cytogenetic investigation to prove irradiation months, years and even decades after exposure. Since 1992, cytogenetic investigations related to radiation exposure have been performed at NRCERM for more than 700 people. At the beginning of this work the FISH method was not certified as a biodosimetry test in Russia; only dicentric analysis was approved as a proof of irradiation. It is known that the rate of dicentrics decreases over time, but a residual level of cytogenetic markers can be revealed long after a radiation accident. Thus dicentric analysis was performed for the people who applied for biological indication of radiation exposure at that time. Rates of dicentrics exceeding control levels were revealed in half of the people who applied for confirmation of irradiation. Now the FISH method is certified in Russia and both cytogenetic biodosimetry tests (dicentrics and FISH) are available to all comers. Increased levels of translocations were found in 8 cases (doses ranging from 0.16 to 0.64 Gy). On the basis of the results of the cytogenetic tests, official documents were supplied to these people and they were entitled to apply for radiation exposure compensation. Thus cytogenetic tests are very effective and in some cases the only possible way for victims to prove radiation exposure and to apply for compensation long after an accident.

  14. How cytogenetical methods help victims prove radiation exposure and claim right for social support: NCERM experience

    Energy Technology Data Exchange (ETDEWEB)

    Aleksanin, S., E-mail: Aleksanin@arcerm.spb.ru [Nikiforov Russian Center of Emergency and Radiation Medicine EMERCOM of Russia, (NRCERM) ul. Akademika Lebedeva 4/2, 194044 St. Petersburg (Russian Federation); Slozina, N., E-mail: NataliaSlozina@peterlink.ru [Nikiforov Russian Center of Emergency and Radiation Medicine EMERCOM of Russia, (NRCERM) ul. Akademika Lebedeva 4/2, 194044 St. Petersburg (Russian Federation); Neronova, E.; Smoliakov, E. [Nikiforov Russian Center of Emergency and Radiation Medicine EMERCOM of Russia, (NRCERM) ul. Akademika Lebedeva 4/2, 194044 St. Petersburg (Russian Federation)

    2011-09-15

    Russian citizens who were irradiated as a result of radiation disasters, nuclear weapons testing and some other sources have a right to certain social support and financial compensation. In order to receive this compensation, people have to prove that they were irradiated. However, for a variety of reasons, not all victims have formal documents. They therefore apply for cytogenetic investigation to prove irradiation months, years and even decades after exposure. Since 1992, cytogenetic investigations related to radiation exposure have been performed at NRCERM for more than 700 people. At the beginning of this work the FISH method was not certified as a biodosimetry test in Russia; only dicentric analysis was approved as a proof of irradiation. It is known that the rate of dicentrics decreases over time, but a residual level of cytogenetic markers can be revealed long after a radiation accident. Thus dicentric analysis was performed for the people who applied for biological indication of radiation exposure at that time. Rates of dicentrics exceeding control levels were revealed in half of the people who applied for confirmation of irradiation. Now the FISH method is certified in Russia and both cytogenetic biodosimetry tests (dicentrics and FISH) are available to all comers. Increased levels of translocations were found in 8 cases (doses ranging from 0.16 to 0.64 Gy). On the basis of the results of the cytogenetic tests, official documents were supplied to these people and they were entitled to apply for radiation exposure compensation. Thus cytogenetic tests are very effective and in some cases the only possible way for victims to prove radiation exposure and to apply for compensation long after an accident.

  15. Initial building investigations at Aberdeen Proving Ground, Maryland: Objectives and methodology

    Energy Technology Data Exchange (ETDEWEB)

    Brubaker, K.L.; Dougherty, J.M.; McGinnis, L.D.

    1994-12-01

    As part of an environmental-contamination source-definition program at Aberdeen Proving Ground, detailed internal and external inspections of 23 potentially contaminated buildings are being conducted to describe and characterize the state of each building as it currently exists and to identify areas potentially contaminated with toxic or other hazardous substances. In addition, a detailed geophysical investigation is being conducted in the vicinity of each target building to locate and identify subsurface structures, associated with former building operations, that are potential sources of contamination. This report describes the objectives of the initial building inspections, including the geophysical investigations, and discusses the methodology that has been developed to achieve these objectives.

  16. Using aetnanova to formally prove that the Davis-Putnam satisfiability test is correct

    Directory of Open Access Journals (Sweden)

    Eugenio G. Omodeo

    2008-05-01

    Full Text Available This paper reports on using the ÆtnaNova/Referee proof-verification system to formalize issues regarding the satisfiability of CNF-formulae of propositional logic. We specify an “archetype” version of the Davis-Putnam-Logemann-Loveland algorithm through the THEORY of recursive functions based on a well-founded relation, and prove it to be correct. Within the same framework, and by resorting to Zorn's lemma, we develop a straightforward proof of the compactness theorem.
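The "archetype" Davis-Putnam-Logemann-Loveland procedure that the paper formalizes can be sketched in a few lines. This is a generic textbook rendering for orientation, not the ÆtnaNova formalization itself:

```python
def simplify(clauses, lit):
    """Drop clauses containing lit; remove -lit from the remaining clauses."""
    return [c - {-lit} for c in clauses if lit not in c]

def dpll(clauses, assignment=None):
    """Archetype DPLL on CNF given as a list of sets of integer literals.

    A positive int is a variable, a negative int its negation; returns a
    satisfying assignment (a set of true literals) or None if unsatisfiable.
    """
    if assignment is None:
        assignment = set()
    clauses = [set(c) for c in clauses]
    # Unit propagation: repeatedly assign literals forced by unit clauses.
    changed = True
    while changed:
        changed = False
        for c in clauses:
            if len(c) == 1:
                (lit,) = c
                clauses = simplify(clauses, lit)
                assignment.add(lit)
                changed = True
                break
    if any(len(c) == 0 for c in clauses):
        return None                     # empty clause: conflict
    if not clauses:
        return assignment               # every clause satisfied
    lit = next(iter(clauses[0]))        # branch on some remaining literal
    for choice in (lit, -lit):
        result = dpll(simplify(clauses, choice), assignment | {choice})
        if result is not None:
            return result
    return None

# (x1 or x2) and (not x1 or x2): any satisfying assignment makes x2 true
print(dpll([{1, 2}, {-1, 2}]))
```

Proving this terminates hinges on a well-founded measure (e.g. the number of unassigned variables), which is exactly the role of the well-founded relation in the paper's THEORY of recursive functions.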

  17. Divide and conquer method for proving gaps of frustration free Hamiltonians

    DEFF Research Database (Denmark)

    Kastoryano, Michael J.; Lucia, Angelo

    2018-01-01

    Providing system-size independent lower bounds on the spectral gap of local Hamiltonians is in general a hard problem. For the case of finite-range, frustration-free Hamiltonians on a spin lattice of arbitrary dimension, we show that a property of the ground state space is sufficient to obtain such a bound. We furthermore show that such a condition is necessary and equivalent to a constant spectral gap. Thanks to this equivalence, we can prove that for gapless models in any dimension, the spectral gap on regions of diameter $n$ is at most $o\left(\frac{\log(n)^{2+\epsilon}}{n}\right)$ for any positive $\epsilon$.

  18. Remediation application strategies for depleted uranium contaminated soils at the US Army Yuma Proving Ground

    International Nuclear Information System (INIS)

    Vandel, D.S.; Medina, S.M.; Weidner, J.R.

    1994-03-01

    The US Army Yuma Proving Ground (YPG), located in the southwestern portion of Arizona, conducts firing of projectiles into the Gunpoint (GP-20) firing range. The penetrators are composed of titanium and depleted uranium (DU). The purpose of this project was to determine feasible cleanup technologies and disposal alternatives for the DU-contaminated soils at YPG. The project was divided into several tasks: (a) collecting and analyzing samples representative of the GP-20 soils, (b) evaluating the data, (c) conducting a literature search of existing proven technologies for soil remediation, and (d) making final recommendations for applying such technology to the site. As a result of this study, several alternatives for separation, treatment, and disposal are identified that would meet the cleanup levels defined by the Nuclear Regulatory Commission for unrestricted use of soils and would result in significant cost savings over the life of the firing range.

  19. Proving Continuity of Coinductive Global Bisimulation Distances: A Never Ending Story

    Directory of Open Access Journals (Sweden)

    David Romero-Hernández

    2015-12-01

    Full Text Available We have developed a notion of global bisimulation distance between processes that goes somewhat beyond the notions of bisimulation distance already existing in the literature, which are mainly based on bisimulation games. Our proposal is based on the cost of transformations: how much we need to modify one of the compared processes to obtain the other. Our original definition covered only finite processes, but a coinductive approach allows us to extend it to infinite but finitary trees. Having shown many interesting properties of our distance, it was our intention to prove continuity with respect to projections, but unfortunately the issue remains open. Nonetheless, we have obtained several partial results that are presented in this paper.

  20. Fractal geometry as a new approach for proving nanosimilarity: a reflection note.

    Science.gov (United States)

    Demetzos, Costas; Pippa, Natassa

    2015-04-10

    Nanosimilars are considered new medicinal products that combine a generic drug with the nanocarrier as an innovative excipient, so that the final product must be evaluated as a whole. They belong to the grey area - concerning the evaluation process - between generic drugs and biosimilar medicinal products. Generic drugs are well documented, and a large number of them are on the market, effectively replacing off-patent drugs. The scientific approach for releasing them to the market is based on bioequivalence studies, which are well documented and accepted by the regulatory agencies. On the other hand, the structural complexity of biological/biotechnology-derived products demands a new approach to the approval process, since bioequivalence studies are not considered sufficient as they are for generic drugs, and new clinical trials are needed to support approval of the product for the market. Likewise, owing to the technological complexity of nanomedicines, the approaches for proving statistical identity or similarity with the prototype, as used for generic and biosimilar products respectively, are not considered effective for nanosimilar products. The aim of this note is to propose a complementary approach, based on fractal analysis, that can provide realistic evidence concerning nanosimilarity. This approach fits well with the structural complexity of nanomedicines and eases the difficulties of proving similarity between off-patent and nanosimilar products. Fractal analysis could be considered an approach that fully characterizes the physicochemical/morphological characteristics of nanosimilar products and could be proposed as a starting point for a deeper discussion on nanosimilarity. Copyright © 2015 Elsevier B.V. All rights reserved.
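One concrete quantity fractal analysis provides is the box-counting dimension of a morphology. The sketch below (our illustration, not the authors' protocol) estimates it for a 2-D point cloud, such as particle positions extracted from microscopy images, by counting occupied boxes at several scales and fitting the log-log slope:

```python
import numpy as np

def box_counting_dimension(points, sizes=(1/2, 1/4, 1/8, 1/16, 1/32)):
    """Estimate the box-counting dimension of a point cloud in the unit square.

    Counts occupied boxes N(s) at each box size s and fits
    log N(s) = -D log s + const by least squares; returns D.
    """
    points = np.asarray(points)
    counts = []
    for s in sizes:
        boxes = np.floor(points / s).astype(int)      # grid cell index per point
        counts.append(len({tuple(b) for b in boxes})) # number of occupied cells
    slope, _ = np.polyfit(np.log(sizes), np.log(counts), 1)
    return -slope

# A uniformly filled square should have dimension close to 2.
rng = np.random.default_rng(0)
pts = rng.random((20000, 2))
print(round(box_counting_dimension(pts), 2))
```

Comparing the fitted dimension (and its confidence interval) between an off-patent product and a candidate nanosimilar is one way such a morphological similarity criterion could be made quantitative.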

  1. Fake News: A Technological Approach to Proving the Origins of Content, Using Blockchains.

    Science.gov (United States)

    Huckle, Steve; White, Martin

    2017-12-01

    In this article, we introduce a prototype of an innovative technology for proving the origins of captured digital media. In an era of fake news, when someone shows us a video or picture of some event, how can we trust its authenticity? It seems that the public no longer believe that traditional media is a reliable reference of fact, perhaps due, in part, to the onset of many diverse sources of conflicting information, via social media. Indeed, the issue of "fake" reached a crescendo during the 2016 U.S. Presidential Election, when the winner, Donald Trump, claimed that The New York Times was trying to discredit him by pushing disinformation. Current research into overcoming the problem of fake news does not focus on establishing the ownership of media resources used in such stories-the blockchain-based application introduced in this article is technology that is capable of indicating the authenticity of digital media. Put simply, using the trust mechanisms of blockchain technology, the tool can show, beyond doubt, the provenance of any source of digital media, including images used out of context in attempts to mislead. Although the application is an early prototype and its capability to find fake resources is somewhat limited, we outline future improvements that would overcome such limitations. Furthermore, we believe that our application (and its use of blockchain technology and standardized metadata) introduces a novel approach to overcoming falsities in news reporting and the provenance of media resources used therein. However, while our application has the potential to be able to verify the originality of media resources, we believe that technology is only capable of providing a partial solution to fake news. That is because it is incapable of proving the authenticity of a news story as a whole. We believe that takes human skills.
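The mechanism described - anchoring a content fingerprint so that later copies can be checked against the registered original - can be sketched with a toy hash chain. Names and structure here are illustrative, not the authors' implementation; a real system would anchor the records to a public blockchain:

```python
import hashlib
import json

def fingerprint(media_bytes):
    """Content address of a media file: any edit or re-encoding changes it."""
    return hashlib.sha256(media_bytes).hexdigest()

class ToyLedger:
    """Toy append-only hash chain of provenance records."""

    def __init__(self):
        self.blocks = []

    def register(self, media_bytes, creator):
        prev = self.blocks[-1]["block_hash"] if self.blocks else "0" * 64
        record = {"media": fingerprint(media_bytes),
                  "creator": creator,
                  "prev": prev}
        # Hash the record (which links to the previous block) to chain it.
        record["block_hash"] = hashlib.sha256(
            json.dumps(record, sort_keys=True).encode()).hexdigest()
        self.blocks.append(record)

    def provenance(self, media_bytes):
        """Return all registered records matching this exact content."""
        h = fingerprint(media_bytes)
        return [b for b in self.blocks if b["media"] == h]

ledger = ToyLedger()
original = b"...raw image bytes..."
ledger.register(original, creator="press agency")
print(ledger.provenance(original)[0]["creator"])   # prints "press agency"
print(ledger.provenance(b"doctored bytes"))        # prints []
```

Note the limitation the authors acknowledge: the chain proves that particular bytes were registered at a particular point, not that the story built around them is true.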

  2. What did all the money do? On the general ineffectiveness of recent West German labour market programmes

    OpenAIRE

    Wunsch, Conny; Lechner, Michael

    2007-01-01

    We provide new evidence on the effectiveness of West German labour market programmes by evaluating training and employment programmes conducted in 2000-2002, after the first large reform of German labour market policy in 1998. We employ exceptionally rich administrative data that allow us to use microeconometric matching methods and to estimate interesting effects for different types of programmes and participants at a rather disaggregated level. We find that, on average, all progr...

  3. When Safe Proved Risky: Commercial Paper during the Financial Crisis of 2007-2009

    OpenAIRE

    Marcin Kacperczyk; Philipp Schnabl

    2010-01-01

    Commercial paper is a short-term debt instrument issued by large corporations. The commercial paper market has long been viewed as a bastion of high liquidity and low risk. But twice during the financial crisis of 2007-2009, the commercial paper market nearly dried up and ceased being perceived as a safe haven. Major interventions by the Federal Reserve, including large outright purchases of commercial paper, were eventually used to support both issuers of and investors in commercial paper. W...

  4. Large field radiotherapy

    International Nuclear Information System (INIS)

    Vanasek, J.; Chvojka, Z.; Zouhar, M.

    1984-01-01

    Calculations may prove that irradiation procedures commonly used in radiotherapy and represented by large-field irradiation techniques do not exceed certain limits of integral doses with favourable radiobiological action on the organism. On the other hand, the integral doses in supralethal whole-body irradiation, used in the therapy of acute leukemia, represent radiobiological values which, without extreme and exceptional further interventions and teamwork, are not compatible with life, and the radiotherapist cannot use such high doses without the backing of a large team. (author)

  5. 76 FR 22938 - Submission for Review: RI 25-37, Evidence To Prove Dependency of a Child

    Science.gov (United States)

    2011-04-25

    ... OFFICE OF PERSONNEL MANAGEMENT Submission for Review: RI 25-37, Evidence To Prove Dependency of a..., Evidence to Prove Dependency of a Child. As required by the Paperwork Reduction Act of 1995 (Pub. L. 104-13... Dependency of a Child, is designed to collect sufficient information for the Office of Personnel Management...

  6. Contamination source review for Building E2370, Edgewood Area, Aberdeen Proving Ground, Maryland

    Energy Technology Data Exchange (ETDEWEB)

    O'Reilly, D.P.; Glennon, M.A.; Draugelis, A.K.; Rueda, J.; Zimmerman, R.E.

    1995-09-01

    The US Army Aberdeen Proving Ground (APG) commissioned Argonne National Laboratory (ANL) to conduct a contamination source review to identify and define areas of toxic or hazardous contaminants and to assess the physical condition and accessibility of APG buildings. The information obtained from this review may be used to assist the US Army in planning for the future use or disposition of the buildings. The contamination source review consisted of the following tasks: historical records search, physical inspection, photographic documentation, and geophysical investigation. This report provides the results of the contamination source review for Building E2370. Many of the APG facilities constructed between 1917 and the 1960s are no longer used because of obsolescence and their poor state of repair. Because many of these buildings were used for research, development, testing, and/or pilot-scale production of chemical warfare agents and other military substances, the potential exists for portions of the buildings to be contaminated with these substances, their degradation products, and other laboratory or industrial chemicals. These buildings and associated structures or appurtenances may contribute to environmental concerns at APG.

  7. Interim progress report -- geophysics: Decommissioning of Buildings E5974 and E5978, Aberdeen Proving Ground

    International Nuclear Information System (INIS)

    McGinnis, M.G.; McGinnis, L.D.; Miller, S.F.; Thompson, M.D.

    1992-11-01

    Buildings E5974 and E5978, located near the mouth of Canal Creek, were among 10 potentially contaminated sites in the Westwood and Canal Creek areas of the Edgewood section of Aberdeen Proving Ground examined by a geophysical team from Argonne National Laboratory in April and May of 1992. Noninvasive geophysical surveys, including the complementary technologies of magnetics, electrical resistivity, and ground-penetrating radar, were conducted around the perimeters of the buildings to guide a sampling program prior to decommissioning and dismantling. The magnetic anomalies and the electrically conductive areas around these buildings have a spatial relationship similar to that observed in low-lying sites in the Canal Creek area; they are probably associated with construction fill. Electrically conductive terrain is dominant on the eastern side of the site, and resistive terrain predominates on the west. The smaller magnetic anomalies are not imaged with ground radar or by electrical profiling. The high resistivities in the northwest quadrant are believed to be caused by a natural sand lens. The causes of three magnetic anomalies in the high-resistivity area are unidentified, but they are probably anthropogenic

  8. A Mechanically Proved and an Incremental Development of the Session Initiation Protocol INVITE Transaction

    Directory of Open Access Journals (Sweden)

    Rajaa Filali

    2014-01-01

    Full Text Available The Session Initiation Protocol (SIP) is an application-layer signaling protocol used to create, manage, and terminate sessions in an IP-based network. SIP is considered a transactional protocol, with two main transactions: the INVITE transaction and the non-INVITE transaction. The SIP INVITE transaction is specified informally in Request for Comments (RFC) 3261 and modified in RFC 6026. In this paper we focus on the INVITE transaction of SIP, over reliable and unreliable transport media, which is used to initiate a session. To ensure the correctness of SIP, the INVITE transaction is modeled and verified using the Event-B method and its Rodin platform. The Event-B refinement concept allows an incremental development by defining the studied system at different levels of abstraction, and Rodin discharges almost all proof obligations at each level. This interaction between modeling and proving reduces the complexity and helps assure that the INVITE transaction specification is correct, unambiguous, and easy to understand.
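For orientation, the RFC 3261 INVITE client transaction that such a model formalizes can be sketched as a small state machine. Timers and retransmissions are omitted, so this is a simplified reading of the RFC, not the verified Event-B model:

```python
class InviteClientTransaction:
    """Simplified RFC 3261 INVITE client transaction state machine.

    States: Calling -> Proceeding -> Completed / Terminated.
    Timers, retransmissions and transport errors are omitted.
    """

    def __init__(self):
        self.state = "Calling"

    def on_response(self, status):
        if self.state in ("Calling", "Proceeding"):
            if 100 <= status < 200:
                self.state = "Proceeding"   # provisional response
            elif 200 <= status < 300:
                self.state = "Terminated"   # 2xx is handed to the transaction user
            elif 300 <= status < 700:
                self.state = "Completed"    # non-2xx final: ACK is sent here
        elif self.state == "Completed" and 300 <= status < 700:
            pass                            # retransmitted final response: re-ACK
        return self.state

t = InviteClientTransaction()
print(t.on_response(180))   # prints "Proceeding"
print(t.on_response(486))   # prints "Completed"
```

The Event-B development refines exactly this kind of abstract machine step by step, adding timers and the unreliable-transport behaviour while Rodin discharges the proof obligations that each refinement preserves the abstract behaviour.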

  9. The written mathematical communication profile of prospective math teacher in mathematical proving

    Science.gov (United States)

    Pantaleon, K. V.; Juniati, D.; Lukito, A.; Mandur, K.

    2018-01-01

    Written mathematical communication is the process of expressing mathematical ideas and understanding in writing. It is one of the important skills that a prospective mathematics teacher must master as a tool of knowledge transfer. This qualitative study aimed to describe the written mathematical communication profile of prospective mathematics teachers in mathematical proving. The research involved 48 students of a Mathematics Education Study Program; one of them, with moderate mathematical skills, was chosen as the main subject. Data were collected through tests, assignments, and task-based interviews. The results indicate that in the geometry proof the subject explains what is understood, presents ideas in the form of drawings and symbols, and explains the content/meaning of a representation accurately and clearly, but cannot convey the argument systematically and logically. In the algebra proof, the subject describes what is understood, explains the method used, and describes the content/meaning of a symbolic representation accurately, systematically, and logically, but the argument presented is unclear because it is insufficiently detailed and complete.

  10. Hydrogeologic and chemical data for the O-Field area, Aberdeen Proving Ground, Maryland

    International Nuclear Information System (INIS)

    Nemoff, P.R.; Vroblesky, D.A.

    1989-01-01

    O-Field, located at the Edgewood area of Aberdeen Proving Ground, Maryland, was periodically used for disposal of munitions, waste chemicals, and chemical-warfare agents from World War II through the 1950's. This report includes various physical, geologic, chemical, and hydrologic data obtained from well-core, groundwater, surface water, and bottom-sediment sampling sites at and near the O-Field disposal area. The data are presented in tables and hydrographs. Three site-location maps are also included. Well-core data include lithologic logs for 11 well-cluster sites, grain-size distributions, various chemical characteristics, and confining unit characteristics. Groundwater data include groundwater chemistry, method blanks for volatile organic carbon, available data on volatile and base/neutral organics, and compilation of corresponding method blanks, chemical-warfare agents, explosive-related products, radionuclides, herbicides, and groundwater levels. Surface-water data include field-measured characteristics; concentrations of various inorganic constituents including arsenic; selected organic constituents with method blanks; detection limits of organics; and a compilation of information on corresponding acids, volatiles, and semivolatiles; and method blanks corresponding to acids, volatiles, and semivolatiles. A set of 15 water-level hydrographs for the period March 1986 through September 1987 also is included in the report. 3 refs., 18 figs., 24 tabs

  11. Modeling exposure to depleted uranium in support of decommissioning at Jefferson Proving Ground, Indiana

    Energy Technology Data Exchange (ETDEWEB)

    Ebinger, M.H. [Los Alamos National Lab., NM (United States); Oxenburg, T.P. [Army Test and Evaluation Command, Aberdeen Proving Ground, MD (United States)

    1997-02-01

    Jefferson Proving Ground (JPG) was used by the US Army Test and Evaluation Command for testing of depleted uranium (DU) munitions and closed in 1995 under the Base Realignment and Closure Act. As part of the closure of JPG, assessments of potential adverse health effects to humans and the ecosystem were conducted. This paper integrates recent information obtained from site characterization surveys at JPG with environmental monitoring data collected from 1983 through 1994 during DU testing. Three exposure scenarios were evaluated for potential adverse effects to human health: an occasional-use scenario and two farming scenarios. Human exposure was minimal under occasional use, but significant risks were predicted for the farming scenarios when contaminated groundwater was used by site occupants. The human health risk assessments do not consider the significant risk posed by accidents with unexploded ordnance. Exposures of white-tailed deer to DU were also estimated in this study; the exposure rates result in no significant increase in either toxicological or radiological risk. The results indicate that remediation of the DU impact area would not substantially reduce already low risks to humans and the ecosystem, and that managed access to JPG is a reasonable model for future land use options.

  12. Contamination source review for Building E3236, Edgewood Area, Aberdeen Proving Ground, Maryland

    Energy Technology Data Exchange (ETDEWEB)

    Zellmer, S.D.; Smits, M.P.; Draugelis, A.K.; Glennon, M.A.; Rueda, J.; Zimmerman, R.E.

    1995-09-01

    The US Army Aberdeen Proving Ground (APG) commissioned Argonne National Laboratory (ANL) to conduct a contamination source review to identify and define areas of toxic or hazardous contaminants and to assess the physical condition and accessibility of APG buildings. The information obtained from the review may be used to assist the US Army in planning for the future use or disposition of the buildings. The contamination source review consisted of the following tasks: historical records search, physical inspection, photographic documentation, geophysical investigation, and review of available records regarding underground storage tanks associated with each building. This report provides the results of the contamination source review for Building E3236. Many of the APG facilities constructed between 1917 and the 1960s are no longer used because of obsolescence and their poor state of repair. Because many of these buildings were used for research, development, testing, and/or pilot- scale production of chemical warfare agents and other military substances, the potential exists for portions of the buildings to be contaminated with these substances, their degradation products, and other laboratory or industrial chemicals. These buildings and associated structures or appurtenances may contribute to environmental concerns at APG.

  13. Remedial investigation report for J-Field, Aberdeen Proving Ground, Maryland. Volume 1: Remedial investigation results

    International Nuclear Information System (INIS)

    Yuen, C. R.; Martino, L. E.; Biang, R. P.; Chang, Y. S.; Dolak, D.; Van Lonkhuyzen, R. A.; Patton, T. L.; Prasad, S.; Quinn, J.; Rosenblatt, D. H.; Vercellone, J.; Wang, Y. Y.

    2000-01-01

    This report presents the results of the remedial investigation (RI) conducted at J-Field in the Edgewood Area of Aberdeen Proving Ground (APG), a U.S. Army installation located in Harford County, Maryland. Since 1917, activities in the Edgewood Area have included the development, manufacture, and testing of chemical agents and munitions and the subsequent destruction of these materials at J-Field by open burning and open detonation. These activities have raised concerns about environmental contamination at J-Field. This RI was conducted by the Environmental Conservation and Restoration Division, Directorate of Safety, Health and Environmental Division of APG, pursuant to requirements outlined under the Comprehensive Environmental Response, Compensation, and Liability Act, as amended (CERCLA). The RI was accomplished according to the procedures developed by the U.S. Environmental Protection Agency (EPA 1988). The RI provides a comprehensive evaluation of the site conditions, nature of contaminants present, extent of contamination, potential release mechanisms and migration pathways, affected populations, and risks to human health and the environment. This information will be used as the basis for the design and implementation of remedial actions to be performed during the remedial action phase, which will follow the feasibility study (FS) for J-Field

  14. How to prove the Earth's daily and annual direction of its spinning

    Directory of Open Access Journals (Sweden)

    Drago Špoljarić

    2014-12-01

    Full Text Available Every day, we can observe the Sun's apparent motion across the sky. It rises in the east, reaches its highest point above the horizon at noon, and sets in the west. The stars appear to be fixed on the sky and to move together with the Sun. We have daytime and night. The apparent annual motion of the Sun results in the seasons, during which we can see different stars. These directly visible daily and annual changes result from real motions of the Earth - its daily and annual spinning (rotation and revolution) - and they are not easily explained without understanding those motions. In order to understand the apparent daily and annual motions and the motion direction of the Sun and stars (the night sky), it is very important to know where we are on the Earth and what our geographic position is, i.e., to know the cardinal points. At the same time, one should also take into consideration the direction of the Earth's rotation and revolution. What is the Earth's daily or annual direction of spinning relative to the direction of clock hands, and how do we prove it?

  15. Remedial investigation report for J-Field, Aberdeen Proving Ground, Maryland. Volume 1: Remedial investigation results

    Energy Technology Data Exchange (ETDEWEB)

    Yuen, C. R.; Martino, L. E.; Biang, R. P.; Chang, Y. S.; Dolak, D.; Van Lonkhuyzen, R. A.; Patton, T. L.; Prasad, S.; Quinn, J.; Rosenblatt, D. H.; Vercellone, J.; Wang, Y. Y.

    2000-03-14

    This report presents the results of the remedial investigation (RI) conducted at J-Field in the Edgewood Area of Aberdeen Proving Ground (APG), a U.S. Army installation located in Harford County, Maryland. Since 1917, activities in the Edgewood Area have included the development, manufacture, and testing of chemical agents and munitions and the subsequent destruction of these materials at J-Field by open burning and open detonation. These activities have raised concerns about environmental contamination at J-Field. This RI was conducted by the Environmental Conservation and Restoration Division, Directorate of Safety, Health and Environmental Division of APG, pursuant to requirements outlined under the Comprehensive Environmental Response, Compensation, and Liability Act, as amended (CERCLA). The RI was accomplished according to the procedures developed by the U.S. Environmental Protection Agency (EPA 1988). The RI provides a comprehensive evaluation of the site conditions, nature of contaminants present, extent of contamination, potential release mechanisms and migration pathways, affected populations, and risks to human health and the environment. This information will be used as the basis for the design and implementation of remedial actions to be performed during the remedial action phase, which will follow the feasibility study (FS) for J-Field.

  16. Transgenic Drosophila simulans strains prove the identity of the speciation gene Lethal hybrid rescue.

    Science.gov (United States)

    Prigent, Stéphane R; Matsubayashi, Hiroshi; Yamamoto, Masa-Toshi

    2009-10-01

    Speciation genes are responsible for genetic incompatibilities in hybrids of incipient species and therefore participate in the reproductive isolation that leads to complete speciation. Hybrid males between Drosophila melanogaster females and D. simulans males die at late larval or prepupal stages due to a failure of chromosome condensation during mitosis. However, a mutant male of D. simulans, named Lethal hybrid rescue (Lhr), produces viable hybrid males when crossed to females of D. melanogaster. Recently the Lhr gene was proposed to correspond to the CG18468 gene of D. melanogaster. However, this identification relied on sequence characteristics more than on precise mapping, and the use of the GAL4/UAS system to drive the transgene in D. melanogaster might have increased the complexity of interaction. Here we therefore propose an independent identification of the Lhr gene based on more precise mapping and on transgenic experiments in D. simulans. We mapped the Lhr gene using Single Nucleotide Polymorphisms (SNPs) and, within the candidate region, identified the gene homologous to CG18468 as the Lhr gene, as previously reported. Transgenic experiments in D. simulans with the native promoter of CG18468 prove that it is the Lhr gene of D. simulans by inducing the lethality of the hybrid males.

  17. Review of analytical results from the proposed agent disposal facility site, Aberdeen Proving Ground

    Energy Technology Data Exchange (ETDEWEB)

    Brubaker, K.L.; Reed, L.L.; Myers, S.W.; Shepard, L.T.; Sydelko, T.G.

    1997-09-01

    Argonne National Laboratory reviewed the analytical results from 57 composite soil samples collected in the Bush River area of Aberdeen Proving Ground, Maryland. A suite of 16 analytical tests involving 11 different SW-846 methods was used to detect a wide range of organic and inorganic contaminants. One method (BTEX) was considered redundant, and two "single-number" methods (TPH and TOX) were found to lack the required specificity to yield unambiguous results, especially in a preliminary investigation. Volatile analytes detected at the site include 1,1,2,2-tetrachloroethane, trichloroethylene, and tetrachloroethylene, all of which probably represent residual site contamination from past activities. Other volatile analytes detected include toluene, tridecane, methylene chloride, and trichlorofluoromethane. These compounds are probably not associated with site contamination but likely represent cross-contamination or, in the case of tridecane, a naturally occurring material. Semivolatile analytes detected include three different phthalates and low part-per-billion amounts of the pesticide DDT and its degradation product DDE. The pesticide could represent residual site contamination from past activities, and the phthalates are likely due, in part, to cross-contamination during sample handling. A number of high-molecular-weight hydrocarbons and hydrocarbon derivatives were detected and were probably naturally occurring compounds. 4 refs., 1 fig., 8 tabs.

  18. Ecological risk assessment of depleted uranium in the environment at Aberdeen Proving Ground

    International Nuclear Information System (INIS)

    Clements, W.H.; Kennedy, P.L.; Myers, O.B.

    1993-01-01

    A preliminary ecological risk assessment was conducted to evaluate the effects of depleted uranium (DU) in the Aberdeen Proving Ground (APG) ecosystem and its potential for human health effects. An ecological risk assessment of DU should include the processes of hazard identification, dose-response assessment, exposure assessment, and risk characterization. Ecological risk assessments also should explicitly examine risks incurred by nonhuman as well as human populations, because risk assessments based only on human health do not always protect other species. To begin to assess the potential ecological risk of DU release to the environment we modeled DU transport through the principal components of the aquatic ecosystem at APG. We focused on the APG aquatic system because of the close proximity of the Chesapeake Bay and concerns about potential impacts on this ecosystem. Our objective in using a model to estimate environmental fate of DU is to ultimately reduce the uncertainty about predicted ecological risks due to DU from APG. The model functions to summarize information on the structure and functional properties of the APG aquatic system, to provide an exposure assessment by estimating the fate of DU in the environment, and to evaluate the sources of uncertainty about DU transport

  19. Experimental study of soil-structure interaction for proving the three dimensional thin layered element method

    International Nuclear Information System (INIS)

    Kuwabara, Y.; Ogiwara, Y.; Suzuki, T.; Tsuchiya, H.; Nakayama, M.

    1981-01-01

    It is generally recognized that the earthquake response of a structure can be significantly affected by dynamic interaction between the structure and the surrounding soil. Dynamic soil-structure interaction effects are usually analyzed using a lumped-mass model or a finite element model. In the lumped-mass model, the soil is represented by springs and dashpots based on elastic half-space theory. Each model has its advantages and limitations. The Three Dimensional Thin Layered Element Theory was developed by Dr. Hiroshi Tajimi based on the combined results of the above-mentioned lumped-mass and finite element models. The main characteristic of this theory is that it can be applied to many problems in soil-structure interaction, such as those involving radiation damping, embedded structures, and multi-layered soil deposits. This paper describes test results on a small-scale model used to prove the validity of the computer program based on the Thin Layered Element Theory. As a numerical example, the response analysis of a PWR nuclear power plant is carried out using this program. The vibration test model is simplified, with a length scale of 1/750. The soil layer of the model is made of congealed gelatine; it is 80 cm long, 35 cm wide, and 10 cm thick. The superstructure is a one-mass model made of a sheet-metal spring and a solid metal mass. As fixed inputs, sinusoidal waves (10 and 20 gal levels) are used. The displacements of the top and base of the superstructure, and the accelerations and displacements of the shaking table, are measured. The main parameter of the test is the shear wave velocity of the soil layer. (orig./RW)

  20. Potential health impacts from range fires at Aberdeen Proving Ground, Maryland

    International Nuclear Information System (INIS)

    Willians, G.P.; Hermes, A.M.; Policastro, A.J.; Hartmann, H.M.; Tomasko, D.

    1998-03-01

    This study uses atmospheric dispersion computer models to evaluate the potential for human health impacts from exposure to contaminants that could be dispersed by fires on the testing ranges at Aberdeen Proving Ground, Maryland. It was designed as a screening study and does not estimate actual human health risks. Considered are five contaminants possibly present in the soil and vegetation from past human activities at APG--lead, arsenic, trichloroethylene (TCE), depleted uranium (DU), and dichlorodiphenyltrichloroethane (DDT); and two chemical warfare agents that could be released from unexploded ordnance rounds heated in a range fire--mustard and phosgene. For comparison, dispersion of two naturally occurring compounds that could be released by burning of uncontaminated vegetation--vinyl acetate and 2-furaldehyde--is also examined. Data from previous studies on soil contamination at APG are used in conjunction with conservative estimates about plant uptake of contaminants, atmospheric conditions, and size and frequency of range fires at APG to estimate dispersion and possible human exposure. The results are compared with US Environmental Protection Agency action levels. The comparisons indicate that for all of the anthropogenic contaminants except arsenic and mustard, exposure levels would be at least an order of magnitude lower than the corresponding action levels. Because of the compoundingly conservative nature of the assumptions made, the authors conclude that the potential for significant human health risks from range fires is low. They recommend that future efforts be directed at fire management and control, rather than at conducting additional studies to more accurately estimate actual human health risk from range fires.

  1. JPSS Preparations at the Satellite Proving Ground for Marine, Precipitation, and Satellite Analysis

    Science.gov (United States)

    Folmer, M. J.; Berndt, E.; Clark, J.; Orrison, A.; Kibler, J.; Sienkiewicz, J. M.; Nelson, J. A., Jr.; Goldberg, M.

    2016-12-01

    The National Oceanic and Atmospheric Administration (NOAA) Satellite Proving Ground (PG) for Marine, Precipitation, and Satellite Analysis (MPS) has been demonstrating and evaluating Suomi National Polar-orbiting Partnership (S-NPP) products along with other polar-orbiting satellite platforms in preparation for the Joint Polar Satellite System-1 (JPSS-1) launch in March 2017. The first S-NPP imagery was made available to the MPS PG during the evolution of Hurricane Sandy in October 2012 and has since been popular in operations. Since this event, the MPS PG Satellite Liaison has been working with forecasters on ways to integrate single-channel and multispectral imagery from the Visible Infrared Imaging Radiometer Suite (VIIRS), the Moderate Resolution Imaging Spectroradiometer (MODIS), and the Advanced Very High Resolution Radiometer (AVHRR) into operations at National Weather Service (NWS) National Centers, complementing numerical weather prediction and geostationary satellite data. Additional unique products have been introduced to operations to address specific forecast challenges, including the Cooperative Institute for Research in the Atmosphere (CIRA) Layered Precipitable Water, the National Environmental Satellite, Data, and Information Service (NESDIS) Snowfall Rate product, NOAA Unique Combined Atmospheric Processing System (NUCAPS) soundings, and ozone products from the Atmospheric Infrared Sounder (AIRS), the Cross-track Infrared Sounder/Advanced Technology Microwave Sounder (CrIS/ATMS), and the Infrared Atmospheric Sounding Interferometer (IASI). In addition, new satellite domains have been created to provide forecasters at the NWS Ocean Prediction Center and Weather Prediction Center with better-quality imagery at high latitudes. This has led to research projects that address forecast challenges such as tropical-to-extratropical transition and explosive cyclogenesis. This presentation will provide examples of how the MPS PG has been introducing and integrating

  2. Work plan for conducting an ecological risk assessment at J-Field, Aberdeen Proving Ground, Maryland

    Energy Technology Data Exchange (ETDEWEB)

    Hlohowskyj, I.; Hayse, J.; Kuperman, R. [Argonne National Lab., IL (United States). Environmental Assessment Div.] [and others]

    1995-03-01

    The Environmental Management Division of Aberdeen Proving Ground (APG), Maryland, is conducting a remedial investigation and feasibility study (RI/FS) of the J-Field area at APG pursuant to the Comprehensive Environmental Response, Compensation, and Liability Act (CERCLA), as amended. J-Field is within the Edgewood Area of APG in Harford County, Maryland, and activities at the Edgewood Area since World War II have included the development, manufacture, testing, and destruction of chemical agents and munitions. The J-Field site was used to destroy chemical agents and munitions by open burning and open detonation. This work plan presents the approach proposed to conduct an ecological risk assessment (ERA) as part of the RI/FS program at J-Field. This work plan identifies the locations and types of field studies proposed for each area of concern (AOC), the laboratory studies proposed to evaluate toxicity of media, and the methodology to be used in estimating doses to ecological receptors and discusses the approach that will be used to estimate and evaluate ecological risks at J-Field. Eight AOCs have been identified at J-Field, and the proposed ERA is designed to evaluate the potential for adverse impacts to ecological receptors from contaminated media at each AOC, as well as over the entire J-Field site. The proposed ERA approach consists of three major phases, incorporating field and laboratory studies as well as modeling. Phase 1 includes biotic surveys of the aquatic and terrestrial habitats, biological tissue sampling and analysis, and media toxicity testing at each AOC and appropriate reference locations. Phase 2 includes definitive toxicity testing of media from areas of known or suspected contamination or of media for which the Phase 1 results indicate toxicity or adverse ecological effects. In Phase 3, the uptake models initially developed in Phase 2 will be finalized, and contaminant dose to each receptor from all complete pathways will be estimated.

  3. Potential health impacts from range fires at Aberdeen Proving Ground, Maryland.

    Energy Technology Data Exchange (ETDEWEB)

    Willians, G.P.; Hermes, A.M.; Policastro, A.J.; Hartmann, H.M.; Tomasko, D.

    1998-03-01

    This study uses atmospheric dispersion computer models to evaluate the potential for human health impacts from exposure to contaminants that could be dispersed by fires on the testing ranges at Aberdeen Proving Ground, Maryland. It was designed as a screening study and does not estimate actual human health risks. Considered are five contaminants possibly present in the soil and vegetation from past human activities at APG--lead, arsenic, trichloroethylene (TCE), depleted uranium (DU), and dichlorodiphenyltrichloroethane (DDT); and two chemical warfare agents that could be released from unexploded ordnance rounds heated in a range fire--mustard and phosgene. For comparison, dispersion of two naturally occurring compounds that could be released by burning of uncontaminated vegetation--vinyl acetate and 2-furaldehyde--is also examined. Data from previous studies on soil contamination at APG are used in conjunction with conservative estimates about plant uptake of contaminants, atmospheric conditions, and size and frequency of range fires at APG to estimate dispersion and possible human exposure. The results are compared with US Environmental Protection Agency action levels. The comparisons indicate that for all of the anthropogenic contaminants except arsenic and mustard, exposure levels would be at least an order of magnitude lower than the corresponding action levels. Because of the compoundingly conservative nature of the assumptions made, the authors conclude that the potential for significant human health risks from range fires is low. They recommend that future efforts be directed at fire management and control, rather than at conducting additional studies to more accurately estimate actual human health risk from range fires.

  4. Generator, mechanical, smoke: For dual-purpose unit, XM56, Yuma Proving Ground, Yuma, Arizona

    Energy Technology Data Exchange (ETDEWEB)

    Driver, C.J.; Ligotke, M.W.; Moore, E.B. Jr. (Pacific Northwest Lab., Richland, WA (United States)); Bowers, J.F. (Dugway Proving Ground, UT (United States))

    1991-10-01

    The US Army Chemical Research, Development and Engineering Center (CRDEC) is planning to perform a field test of the XM56 smoke generator at the US Army Yuma Proving Ground (YPG), Arizona. The XM56, enabling the use of fog oil in combination with other materials, such as graphite flakes, is part of an effort to improve the efficiency of smoke generation and to extend the effectiveness of the resulting obscurant cloud to include the infrared spectrum. The planned field operation includes a road test and concurrent smoke-generation trials. Three M1037 vehicles with operational XM56 generators will be road-tested for 100 h. Smoke will be generated for 30 min from a single stationary XM56 four times during the road test, resulting in a total of 120 min of smoke generation. The total aerial release of obscurant materials during this test is expected to be 556 kg (1,220 lb) of fog oil and 547 kg (1,200 lb) of graphite flakes. This environmental assessment has evaluated the consequences of the proposed action. Air concentrations and surface deposition levels were estimated using an atmospheric dispersion model. Degradation of fog oil and incorporation of graphite in the soil column will limit the residual impacts of the planned action. No significant impacts to air, water, and soil quality are anticipated. Risks to the environment posed by the proposed action were determined to be minimal or below levels previously found to pose measurable impacts. Cultural resources are present on YPG and have been identified in adjacent areas; therefore, off-road activities should be preceded by a cultural resource survey. A Finding of No Significant Impact is recommended. 61 refs., 1 fig.
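The assessment above rests on an atmospheric dispersion estimate of air concentrations downwind of the release. The abstract does not name the specific model used; as a hedged illustration only, the kind of ground-level, centerline Gaussian-plume calculation such screening assessments build on might look like the sketch below (the function name and the sigma power-law coefficients are illustrative assumptions, not values from the study):

```python
import math

def plume_centerline(Q, u, H, x):
    """Ground-level, centerline concentration (g/m^3) of a steady
    Gaussian plume: C = Q / (pi * u * sy * sz) * exp(-H^2 / (2 * sz^2)).

    Q: emission rate (g/s); u: wind speed (m/s);
    H: effective release height (m); x: downwind distance (m).
    The dispersion power laws below are illustrative placeholders,
    not a calibrated atmospheric stability class.
    """
    sigma_y = 0.08 * x ** 0.9   # horizontal spread (m), assumed
    sigma_z = 0.06 * x ** 0.9   # vertical spread (m), assumed
    return (Q / (math.pi * u * sigma_y * sigma_z)
            * math.exp(-H ** 2 / (2 * sigma_z ** 2)))

# For a near-ground release, concentration falls off with distance:
near = plume_centerline(Q=1.0, u=3.0, H=2.0, x=500.0)
far = plume_centerline(Q=1.0, u=3.0, H=2.0, x=2000.0)
```

A screening study would compare such estimated concentrations against regulatory action levels, as the range-fire assessment above does.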

  5. The EU Seal Products Ban – Why Ineffective Animal Welfare Protection Cannot Justify Trade Restrictions under European and International Trade Law

    Directory of Open Access Journals (Sweden)

    Martin Hennig

    2015-03-01

    Full Text Available In this article, the author questions the legitimacy of the general ban on trade in seal products adopted by the European Union. It is submitted that the EU Seal Regime, which permits the marketing of Greenlandic seal products derived from Inuit hunts, but excludes Canadian and Norwegian seal products from the European market, does not ensure a satisfactory degree of animal welfare protection in order to justify the comprehensive trade restriction in place. It is argued that the current ineffective EU ban on seal products, which according to the WTO Appellate Body cannot be reconciled with the objective of protecting animal welfare, has no legal basis in EU Treaties and should be annulled.

  6. Prospective Retinal and Optic Nerve Vitrectomy Evaluation (PROVE) study: findings at 3 months

    Directory of Open Access Journals (Sweden)

    Reddy RK

    2013-09-01

    Full Text Available Rahul K Reddy,1 Maziar Lalezary,1 Stephen J Kim,1 Jeffrey A Kammer,1 Rachel W Kuchtey,1 Edward F Cherney,1 Franco M Recchia,2 Karen M Joos,1 Anita Agarwal,1 Janice C Law1 (1Department of Ophthalmology, Vanderbilt University School of Medicine, Nashville, TN, USA; 2Tennessee Retina, PC, Nashville, TN, USA). Background: The purpose of this paper is to report the 3-month findings of the Prospective Retinal and Optic Nerve Vitrectomy Evaluation (PROVE) study. Methods: Eighty eyes of 40 participants undergoing vitrectomy were enrolled. Participants underwent baseline evaluation of the study (surgical) and fellow (control) eye that included: intraocular pressure, central corneal thickness, gonioscopy, cup-to-disc ratio measurement, color fundus and optic disc photography, automated perimetry, and optical coherence tomography of the macula and optic nerve. Evaluation was repeated at 3 months. Main outcome measures were changes in macula and retinal nerve fiber layer (RNFL) thickness and intraocular pressure. Results: All participants completed follow-up. Mean cup-to-disc ratio of study and fellow eyes at baseline was 0.43 ± 0.2 and 0.46 ± 0.2, respectively, and 13% of participants had undiagnosed narrow angles. There was no significant change in intraocular pressure, cup-to-disc ratio, or pattern standard deviation in study eyes compared with baseline or fellow eyes at 3 months. Vision improved in all study eyes at 3 months compared with baseline (P = 0.013), but remained significantly worse than in fellow eyes (P < 0.001). Central subfield and temporal peripapillary RNFL thickness were significantly greater in eyes with epiretinal membrane (P < 0.05), and resolution after surgery correlated with visual improvement (P < 0.05). Conclusion: The 3-month results do not indicate any increased risk for open-angle glaucoma but suggest that a relatively high percentage of eyes may be at risk of angle closure glaucoma. 
Temporal RNFL thickness and central subfield were increased

  7. GOES-R Proving Ground Activities at the NASA Short-Term Prediction Research and Transition (SPoRT) Center

    Science.gov (United States)

    Molthan, Andrew

    2011-01-01

    SPoRT is actively involved in GOES-R Proving Ground activities in a number of ways: (1) applying the paradigm of product development, user training, and interaction to foster engagement with end users at NOAA forecast offices and national centers; and (2) providing unique capabilities in collaboration with other GOES-R Proving Ground partners: (a) hybrid GOES-MODIS imagery, (b) pseudo-GLM via regional lightning mapping arrays, and (c) new RGB imagery developed from EUMETSAT guidelines.

  8. An Introduction to Programming and Proving with Dependent Types in Coq

    Directory of Open Access Journals (Sweden)

    Adam Chlipala

    2010-01-01

    Full Text Available Computer proof assistants vary along many dimensions. Among the mature implementations, the Coq system is distinguished by two key features. First, we have support for programming with dependent types in the tradition of type theory, based on dependent function types and inductive type families. Second, we have a domain-specific language for coding correct-by-construction proof automation. Though the Coq user community has grown quite large, neither of the aspects I highlight is widely used. In this tutorial, I aim to provide a pragmatic introduction to both, showing how they can bring significant improvements in productivity.

  9. Killing the straw man: Does BICEP prove inflation at the GUT scale?

    Energy Technology Data Exchange (ETDEWEB)

    Dent, James B. [Department of Physics, University of Louisiana at Lafayette, Lafayette, LA 70504 (United States); Krauss, Lawrence M. [Department of Physics and School of Earth and Space Exploration, Arizona State University, Tempe, AZ 85287 (United States); Mount Stromlo Observatory, Research School of Astronomy and Astrophysics, Australian National University, Weston, ACT, 2611 (Australia); Mathur, Harsh [Department of Physics, Case Western Reserve University, Cleveland, OH 44106-7079 (United States)

    2014-09-07

    The surprisingly large value of r, the ratio of power in tensor to scalar density perturbations in the CMB reported by the BICEP2 Collaboration, if confirmed, provides strong evidence for Inflation at the GUT scale. While the Inflationary signal remains the best motivated source, a large value of r alone would still allow for the possibility that a comparable gravitational wave background might result from a self ordering scalar field (SOSF) transition that takes place later at somewhat lower energy. We find that even without detailed considerations of the predicted BICEP signature of such a transition, simple existing limits on the isocurvature contribution to CMB anisotropies would definitively rule out a contribution of more than 5% to r≈0.2. We also present a general relation for the allowed fractional SOSF contribution to r as a function of the ultimate measured value of r. These results point strongly not only to an inflationary origin of the BICEP2 signal, if confirmed, but also to the fact that if the GUT scale is of order 10^16 GeV then either the GUT transition happens before Inflation or the Inflationary transition and the GUT transition must be one and the same.

  10. Killing the straw man: Does BICEP prove inflation at the GUT scale?

    International Nuclear Information System (INIS)

    Dent, James B.; Krauss, Lawrence M.; Mathur, Harsh

    2014-01-01

    The surprisingly large value of r, the ratio of power in tensor to scalar density perturbations in the CMB reported by the BICEP2 Collaboration, if confirmed, provides strong evidence for Inflation at the GUT scale. While the Inflationary signal remains the best motivated source, a large value of r alone would still allow for the possibility that a comparable gravitational wave background might result from a self ordering scalar field (SOSF) transition that takes place later at somewhat lower energy. We find that even without detailed considerations of the predicted BICEP signature of such a transition, simple existing limits on the isocurvature contribution to CMB anisotropies would definitively rule out a contribution of more than 5% to r≈0.2. We also present a general relation for the allowed fractional SOSF contribution to r as a function of the ultimate measured value of r. These results point strongly not only to an inflationary origin of the BICEP2 signal, if confirmed, but also to the fact that if the GUT scale is of order 10^16 GeV then either the GUT transition happens before Inflation or the Inflationary transition and the GUT transition must be one and the same.

  11. When the relative judgment theory proved to be false

    Directory of Open Access Journals (Sweden)

    Levi A.M.

    2016-01-01

    Full Text Available A commonly accepted theory holds that when witnesses can identify the culprit in a lineup, they concentrate their gaze on him; when they cannot, they compare lineup members and choose the person most similar to the culprit, dividing their gaze more equally among the foils. An eye tracker was used with a 48-person lineup (four screens with twelve photos on each) in an attempt to demonstrate the superiority of gaze behavior over the verbal response. Surprisingly, witnesses usually concentrated on some foil as much as they did on the target. Alternate theories are required to explain the reduction of false identifications in sequential lineups. The advantage of large lineups was demonstrated. Police may use them in conjunction with eye trackers to find culprits that witnesses focus on despite saying that they are absent, the only known method of increasing correct identifications.

  12. A hierarchical approach to ecological assessment of contaminated soils at Aberdeen Proving Ground, USA

    Energy Technology Data Exchange (ETDEWEB)

    Kuperman, R.G.

    1995-12-31

    Despite the expansion of environmental toxicology studies over the past decade, soil ecosystems have largely been ignored in ecotoxicological studies in the United States. The objective of this project was to develop and test the efficacy of a comprehensive methodology for assessing ecological impacts of soil contamination. A hierarchical approach that integrates biotic parameters and ecosystem processes was used to give insight into the mechanisms that lead to alterations in the structure and function of soil ecosystems in contaminated areas. This approach involved (1) a thorough survey of the soil biota to determine community structure, (2) laboratory and field tests on critical ecosystem processes, (3) toxicity trials, and (4) the use of spatial analyses to provide input to the decision-making process. This methodology appears to offer an efficient and potentially cost-saving tool for remedial investigations of contaminated sites.

  13. Fingerprinting captured CO2 using natural tracers: Determining CO2 fate and proving ownership

    Science.gov (United States)

    Flude, Stephanie; Gilfillan, Stuart; Johnston, Gareth; Stuart, Finlay; Haszeldine, Stuart

    2016-04-01

    In the long term, captured CO2 will most likely be stored in large saline formations and it is highly likely that CO2 from multiple operators will be injected into a single saline formation. Understanding CO2 behavior within the reservoir is vital for making operational decisions and often uses geochemical techniques. Furthermore, in the event of a CO2 leak, being able to identify the owner of the CO2 is of vital importance in terms of liability and remediation. Addition of geochemical tracers to the CO2 stream is an effective way of tagging the CO2 from different power stations, but may become prohibitively expensive at large scale storage sites. Here we present results from a project assessing whether the natural isotopic composition (C, O and noble gas isotopes) of captured CO2 is sufficient to distinguish CO2 captured using different technologies and from different fuel sources, from likely baseline conditions. Results include analytical measurements of CO2 captured from a number of different CO2 capture plants and a comprehensive literature review of the known and hypothetical isotopic compositions of captured CO2 and baseline conditions. Key findings from the literature review suggest that the carbon isotope composition will be most strongly controlled by that of the feedstock, but significant fractionation is possible during the capture process; oxygen isotopes are likely to be controlled by the isotopic composition of any water used in either the industrial process or the capture technology; and noble gas concentrations will likely be controlled by the capture technique employed. Preliminary analytical results are in agreement with these predictions. Comparison with summaries of likely storage reservoir baseline and shallow or surface leakage reservoir baseline data suggests that C-isotopes are likely to be valuable tracers of CO2 in the storage reservoir, while noble gases may be particularly valuable as tracers of potential leakage.

  14. Way-finding in displaced clock-shifted bees proves bees use a cognitive map.

    Science.gov (United States)

    Cheeseman, James F; Millar, Craig D; Greggers, Uwe; Lehmann, Konstantin; Pawley, Matthew D M; Gallistel, Charles R; Warman, Guy R; Menzel, Randolf

    2014-06-17

    Mammals navigate by means of a metric cognitive map. Insects, most notably bees and ants, are also impressive navigators. The question whether they, too, have a metric cognitive map is important to cognitive science and neuroscience. Experimentally captured and displaced bees often depart from the release site in the compass direction they were bent on before their capture, even though this no longer heads them toward their goal. When they discover their error, however, the bees set off more or less directly toward their goal. This ability to orient toward a goal from an arbitrary point in the familiar environment is evidence that they have an integrated metric map of the experienced environment. We report a test of an alternative hypothesis, which is that all the bees have in memory is a collection of snapshots that enable them to recognize different landmarks and, associated with each such snapshot, a sun-compass-referenced home vector derived from dead reckoning done before and after previous visits to the landmark. We show that a large shift in the sun-compass rapidly induced by general anesthesia does not alter the accuracy or speed of the homeward-oriented flight made after the bees discover the error in their initial postrelease flight. This result rules out the sun-referenced home-vector hypothesis, further strengthening the now extensive evidence for a metric cognitive map in bees.
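The logic of the rejected hypothesis can be made concrete: if the home vector were referenced purely to the sun compass, a clock shift should rotate the predicted departure bearing by roughly the sun's azimuthal rate, about 15° per hour. The sketch below illustrates that prediction (the function name and the simplified constant-rate sun model are illustrative assumptions, not taken from the paper; the real solar azimuth rate varies with time of day and latitude):

```python
def sun_compass_bearing(home_bearing_deg, clock_shift_hours,
                        rate_deg_per_hour=15.0):
    """Departure bearing predicted by a *purely* sun-compass-referenced
    home vector after a clock shift: the remembered sun azimuth is off
    by roughly rate * shift, so the flown bearing rotates by the same
    angle. A metric cognitive map, by contrast, predicts no rotation.
    """
    return (home_bearing_deg + rate_deg_per_hour * clock_shift_hours) % 360.0

# A 6-hour clock shift would misdirect a sun-referenced bee by ~90 degrees:
shifted = sun_compass_bearing(90.0, 6.0)
```

The experiment's finding that homeward flights stayed accurate after a large induced shift is what rules out such a sun-referenced home vector.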

  15. Drug-eluting or bare-metal stents for large coronary vessel stenting? The BASKET-PROVE (PROspective Validation Examination) trial: Study protocol and design

    DEFF Research Database (Denmark)

    Pfisterer, M.; Bertel, O.; Bonetti, P.O.

    2008-01-01

    or refute this hypothesis, we set up an 11-center 4-country prospective trial of 2260 consecutive patients treated with >= 3.0-mm stents only, randomized to receive Cypher (Johnson & Johnson, Miami Lakes, FL), Vision (Abbott Vascular, Abbott Laboratories, IL), or Xience stents (Abbott Vascular). Only...

  16. Used tires prove a major solution to oil and gas spills in Alberta

    Energy Technology Data Exchange (ETDEWEB)

    Anon.

    2009-01-15

    Approximately 5 to 15 per cent of the material recycled from tires cannot be used for any type of application. This waste material is either sent to a landfill or to an incinerator to be burnt for fuel. For every 200 worn out tires, between 10 and 30 of them end up in landfills or will be used in furnaces, leaving a future generation to deal with the residual negative effect of the landfill or how to clean the polluted air. An Alberta-based company, ESSI International, has found a unique use for this unwanted material. The company has used 100 per cent of this unwanted material to clean or re-mediate oil and gas spills, both on land and offshore. ESSI International has 6 patents pending on its processes. It has developed both large scale commercial filtration systems for field use, indoor filtration systems, and packaged products that are proactive, preventive, easy to use, and economically competitive. Once utilized, the material is recycled to recover the collected hydrocarbons and then the material itself is used as a replacement for sand and gravel in the production of ultra light weight concrete. In addition, ESSI International has developed a line of manufactured products that include wellhead and containment bag systems that proactively prevent hydrocarbons from contaminating the earth around the wellhead and adjacent areas. The article also discussed ESSI International's development of the containment system called bag in a barrel. It was concluded that where other products end up in a landfill or used to fuel incinerators, ESSI International uses its used and then recycled filter material in concrete. The filter material replaces sand and gravel and results in a lightweight product that can be used for driveways, side walks, sound barriers, road dividers. 1 ref., 3 figs.

  17. Improving immunization delivery using an electronic health record: the ImmProve project.

    Science.gov (United States)

    Bundy, David G; Persing, Nichole M; Solomon, Barry S; King, Tracy M; Murakami, Peter N; Thompson, Richard E; Engineer, Lilly D; Lehmann, Christoph U; Miller, Marlene R

    2013-01-01

    Though an essential pediatric preventive service, immunizations are challenging to deliver reliably. Our objective was to measure the impact on pediatric immunization rates of providing clinicians with electronic health record-derived immunization prompting. Operating in a large, urban, hospital-based pediatric primary care clinic, we evaluated 2 interventions to improve immunization delivery to children ages 2, 6, and 13 years: point-of-care, patient-specific electronic clinical decision support (CDS) when children overdue for immunizations presented for care, and provider-specific bulletins listing children overdue for immunizations. Overall, the proportion of children up to date for a composite of recommended immunizations at ages 2, 6, and 13 years was not different in the intervention (CDS active) and historical control (CDS not active) periods; historical immunization rates were high. The proportion of children receiving 2 doses of hepatitis A immunization before their second birthday was significantly improved during the intervention period. Human papillomavirus (HPV) immunization delivery was low during both control and intervention periods and was unchanged for 13-year-olds. For 14-year-olds, however, 4 of the 5 highest quarterly rates of complete HPV immunization occurred in the final year of the intervention. Provider-specific bulletins listing children overdue for immunizations increased the likelihood of identified children receiving catch-up hepatitis A immunizations (hazard ratio 1.32; 95% confidence interval 1.12-1.56); results for HPV and the composite of recommended immunizations were of a similar magnitude but not statistically significant. In our patient population, with high baseline uptake of recommended immunizations, electronic health record-derived immunization prompting had a limited effect on immunization delivery. Benefit was more clearly demonstrated for newer immunizations with lower baseline uptake. Copyright © 2013 Academic

  18. E-squared nine do-it-yourself energy experiments that prove your thoughts create your reality

    CERN Document Server

    Grout, Pam

    2013-01-01

    E-Squared is a lab manual with simple experiments to prove once and for all that there really is a good, loving, totally hip force in the universe. Rather than take it on faith, you are invited to conduct ten 48-hour experiments to prove each of the principles in this book. Yes, you read that right. It says prove. The experiments, each of which can be conducted with absolutely no money and very little time expenditure, demonstrate that spiritual principles are as dependable as gravity, as consistent as Newton's 2nd law of motion. For years, you've been hoping and praying that spiritual principles are true. Now, you can know.

  19. The effects of GeoGebra software on pre-service mathematics teachers' attitudes and views toward proof and proving

    Science.gov (United States)

    Zengin, Yılmaz

    2017-11-01

    The purpose of this study is to determine the effect of GeoGebra software on pre-service mathematics teachers' attitudes towards proof and proving and to determine pre-service teachers' pre- and post-views regarding proof. The study lasted nine weeks and the participants of the study consisted of 24 pre-service mathematics teachers. The study used the 'Attitude Scale Towards Proof and Proving' and an open-ended questionnaire that were administered before and after the intervention as data collection tools. Paired samples t-test analysis was used for the analysis of quantitative data and content and descriptive analyses were utilized for the analysis of qualitative data. As a result of the data analysis, it was determined that GeoGebra software was an effective tool in increasing pre-service teachers' attitudes towards proof and proving.
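The quantitative analysis named above is a paired samples t-test on pre- and post-intervention attitude scores. As a hedged sketch only (the helper name and the scores are invented for illustration; the study's actual data are not reproduced here), the test statistic can be computed from standard-library tools:

```python
import math
from statistics import mean, stdev

def paired_t(pre, post):
    """Paired-samples t statistic and degrees of freedom.
    t = mean(d) / (stdev(d) / sqrt(n)), where d are pairwise differences.
    """
    diffs = [b - a for a, b in zip(pre, post)]
    n = len(diffs)
    t = mean(diffs) / (stdev(diffs) / math.sqrt(n))
    return t, n - 1

# Illustrative pre/post attitude scores for 8 participants (not the
# study's data, which came from 24 pre-service teachers):
pre = [2.8, 3.1, 2.5, 3.0, 2.9, 3.2, 2.7, 3.0]
post = [3.4, 3.6, 3.1, 3.5, 3.3, 3.8, 3.2, 3.6]
t, df = paired_t(pre, post)
# Compare |t| against the critical value for df at the chosen alpha
# (e.g. 2.365 for df = 7 at alpha = 0.05, two-tailed).
```

In practice a library routine such as SciPy's `scipy.stats.ttest_rel` would also return the p-value directly.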

  20. Early nodule senescence is activated in symbiotic mutants of pea (Pisum sativum L.) forming ineffective nodules blocked at different nodule developmental stages.

    Science.gov (United States)

    Serova, Tatiana A; Tsyganova, Anna V; Tsyganov, Viktor E

    2018-04-03

    Plant symbiotic mutants are a useful tool to uncover the molecular-genetic mechanisms of nodule senescence. The pea (Pisum sativum L.) mutants SGEFix⁻-1 (sym40), SGEFix⁻-3 (sym26), and SGEFix⁻-7 (sym27) display an early nodule senescence phenotype, whereas the mutant SGEFix⁻-2 (sym33) does not show premature degradation of symbiotic structures, but its nodules show an enhanced immune response. The nodules of these mutants were compared with each other and with those of the wild-type SGE line using seven marker genes that are known to be activated during nodule senescence. In wild-type SGE nodules, transcript levels of all of the senescence-associated genes were highest at 6 weeks after inoculation (WAI). The senescence-associated genes showed higher transcript abundance in mutant nodules than in wild-type nodules at 2 WAI and attained maximum levels in the mutant nodules at 4 WAI. Immunolocalization analyses showed that the ethylene precursor 1-aminocyclopropane-1-carboxylate accumulated earlier in the mutant nodules than in wild-type nodules. Together, these results showed that nodule senescence was activated in ineffective nodules blocked at different developmental stages in pea lines that harbor mutations in four symbiotic genes.

  1. Large deviations

    CERN Document Server

    Varadhan, S R S

    2016-01-01

    The theory of large deviations deals with rates at which probabilities of certain events decay as a natural parameter in the problem varies. This book, which is based on a graduate course on large deviations at the Courant Institute, focuses on three concrete sets of examples: (i) diffusions with small noise and the exit problem, (ii) large time behavior of Markov processes and their connection to the Feynman-Kac formula and the related large deviation behavior of the number of distinct sites visited by a random walk, and (iii) interacting particle systems, their scaling limits, and large deviations from their expected limits. For the most part the examples are worked out in detail, and in the process the subject of large deviations is developed. The book will give the reader a flavor of how large deviation theory can help in problems that are not posed directly in terms of large deviations. The reader is assumed to have some familiarity with probability, Markov processes, and interacting particle systems.
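The "rates at which probabilities decay" that the book studies are conventionally packaged as the large deviation principle. In one standard formulation (generic notation, not tied to this book's examples), a family of probability measures \(\{\mu_{\varepsilon}\}\) satisfies an LDP with rate function \(I\) if, for every Borel set \(A\) with interior \(A^{\circ}\) and closure \(\bar{A}\):

```latex
-\inf_{x \in A^{\circ}} I(x)
  \le \liminf_{\varepsilon \to 0} \varepsilon \log \mu_{\varepsilon}(A)
  \le \limsup_{\varepsilon \to 0} \varepsilon \log \mu_{\varepsilon}(A)
  \le -\inf_{x \in \bar{A}} I(x)
```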

  2. A randomized double-blind placebo-controlled crossover-style trial of buspirone in functional dysphagia and ineffective esophageal motility.

    Science.gov (United States)

    Aggarwal, Nitin; Thota, Prashanthi Nagavenkata; Lopez, Rocio; Gabbard, Scott

    2018-02-01

    Studies suggest that Ineffective Esophageal Motility (IEM) is the manometric correlate of Functional Dysphagia (FD). Currently, there is no accepted therapy for either condition. Buspirone is a serotonin-modulating medication and has been shown to augment esophageal peristaltic amplitude in healthy volunteers. We aimed to determine whether buspirone improves manometric parameters and symptoms in patients with overlapping IEM/FD. We performed a prospective, double-blind, placebo-controlled, crossover-style trial of 10 patients with IEM/FD. The study consisted of two 2-week treatment arms with a 2-week washout period. Outcomes measured at baseline, end of week 2, and week 6 included high-resolution esophageal manometry (HREM), the Mayo Dysphagia Questionnaire-14 (MDQ-14), and the GERD-HRQL. The mean age of our 10 patients was 53 ± 9 years and 70% were female. After treatment with buspirone, 30% of patients had normalization of IEM on manometry; however, there was 30% normalization in the placebo group as well. Comparing buspirone to placebo, there was no statistically significant difference in the HREM parameters measured. There was also no statistically significant difference in symptom outcomes for buspirone compared to placebo. Of note, patients had a statistically significant decrease in the total GERD-HRQL score when treated with placebo compared to baseline levels. Despite previous data demonstrating improved esophageal motility in healthy volunteers, our study shows no difference in HREM parameters or symptom scores in IEM/FD patients treated with buspirone compared to placebo. Further research is necessary to identify novel agents for this condition. © 2017 John Wiley & Sons Ltd.

  3. Meeting on Solute/Solvent Interactions Held in Aberdeen Proving Ground, Maryland on May 29-30, 1991

    Science.gov (United States)

    1992-01-01


  4. Non-linear quenching of current fluctuations in a self-exciting homopolar dynamo, proved by feedback system theory

    Science.gov (United States)

    de Paor, A. M.

    Hide (Nonlinear Processes in Geophysics, 1998) has produced a new mathematical model of a self-exciting homopolar dynamo driving a series-wound motor, as a continuing contribution to the theory of the geomagnetic field. By a process of exact perturbation analysis, followed by combination and partial solution of differential equations, the complete nonlinear quenching of current fluctuations reported by Hide in the case that a parameter ɛ has the value 1 is proved via the Popov theorem from feedback system stability theory.

  5. Non-linear quenching of current fluctuations in a self-exciting homopolar dynamo, proved by feedback system theory

    OpenAIRE

    A. M. de Paor

    1998-01-01

    International audience; Hide (Nonlinear Processes in Geophysics, 1998) has produced a new mathematical model of a self-exciting homopolar dynamo driving a series-wound motor, as a continuing contribution to the theory of the geomagnetic field. By a process of exact perturbation analysis, followed by combination and partial solution of differential equations, the complete nonlinear quenching of current fluctuations reported by Hide in the case that a parameter ɛ has the value 1 is proved via ...

  6. Seismic test facilities at the ENEA Casaccia Research Center; Prove sismiche con le tavole vibranti al centro ricerche Enea Casaccia

    Energy Technology Data Exchange (ETDEWEB)

    De Canio, G. [ENEA, Divisione Servizi Tecnologici, Centro Ricerche Casaccia, Rome (Italy)

    2000-07-01

    The main experimental facilities for seismic tests at the ENEA C.R. Casaccia laboratories consist of two high-performance shake tables for triaxial seismic tests of structures of up to 10 t mass, with 3 g acceleration applied at the center of gravity, 1 m above the base table. The activities are principally devoted to dynamic characterization and vibration tests for mechanical and aerospace structures, and to the experimental analysis of innovative systems for the seismic isolation and retrofitting of civil, industrial, and historical buildings, together with seismic tests of sub-structures and scaled mock-ups to evaluate the isolation/dissipation performance of anti-seismic devices and the failure modes of the structural parts of buildings.

  7. Coronal mass ejections and large geomagnetic storms

    International Nuclear Information System (INIS)

    Gosling, J.T.; Bame, S.J.; McComas, D.J.; Phillips, J.L.

    1990-01-01

    Previous work indicates that coronal mass ejection (CME) events in the solar wind at 1 AU can be identified by the presence of a flux of counterstreaming solar wind halo electrons (above about 80 eV). Using this technique to identify CMEs in 1 AU plasma data, the authors find that most large geomagnetic storms during the interval surrounding the last solar maximum (Aug. 1978-Oct. 1982) were associated with Earth-passage of interplanetary disturbances in which the Earth encountered both a shock and the CME driving the shock. However, only about one CME in six encountered by Earth was effective in causing a large geomagnetic storm. Slow CMEs, which did not interact strongly with the ambient solar wind ahead, were particularly ineffective in a geomagnetic sense.

  8. Depleted uranium risk assessment for Jefferson Proving Ground using data from environmental monitoring and site characterization. Final report

    International Nuclear Information System (INIS)

    Ebinger, M.H.; Hansen, W.R.

    1996-10-01

    This report documents the third risk assessment completed for the depleted uranium (DU) munitions testing range at Jefferson Proving Ground (JPG), Indiana, for the U.S. Army Test and Evaluation Command. Jefferson Proving Ground was closed in 1995 under the Base Realignment and Closure Act and the testing mission was moved to Yuma Proving Ground. As part of the closure of JPG, assessments of potential adverse health effects to humans and the ecosystem were conducted. This report integrates recent information obtained from site characterization surveys at JPG with environmental monitoring data collected from 1983 through 1994 during DU testing. Three exposure scenarios were evaluated for potential adverse effects to human health: an occasional use scenario and two farming scenarios. Human exposure was minimal from occasional use, but significant risks were predicted from the farming scenarios when contaminated groundwater was used by site occupants. The human health risk assessments do not consider the significant risk posed by accidents with unexploded ordnance. Exposures of white-tailed deer to DU were also estimated in this study, and exposure rates result in no significant increase in either toxicological or radiological risks. The results of this study indicate that remediation of the DU impact area would not substantially reduce already low risks to humans and the ecosystem, and that managed access to JPG is a reasonable model for future land use options.

  9. Learn, see, practice, prove, do, maintain: an evidence-based pedagogical framework for procedural skill training in medicine.

    Science.gov (United States)

    Sawyer, Taylor; White, Marjorie; Zaveri, Pavan; Chang, Todd; Ades, Anne; French, Heather; Anderson, JoDee; Auerbach, Marc; Johnston, Lindsay; Kessler, David

    2015-08-01

    Acquisition of competency in procedural skills is a fundamental goal of medical training. In this Perspective, the authors propose an evidence-based pedagogical framework for procedural skill training. The framework was developed based on a review of the literature using a critical synthesis approach and builds on earlier models of procedural skill training in medicine. The authors begin by describing the fundamentals of procedural skill development. Then, a six-step pedagogical framework for procedural skills training is presented: Learn, See, Practice, Prove, Do, and Maintain. In this framework, procedural skill training begins with the learner acquiring requisite cognitive knowledge through didactic education (Learn) and observation of the procedure (See). The learner then progresses to the stage of psychomotor skill acquisition and is allowed to deliberately practice the procedure on a simulator (Practice). Simulation-based mastery learning is employed to allow the trainee to prove competency prior to performing the procedure on a patient (Prove). Once competency is demonstrated on a simulator, the trainee is allowed to perform the procedure on patients with direct supervision, until he or she can be entrusted to perform the procedure independently (Do). Maintenance of the skill is ensured through continued clinical practice, supplemented by simulation-based training as needed (Maintain). Evidence in support of each component of the framework is presented. Implementation of the proposed framework presents a paradigm shift in procedural skill training. However, the authors believe that adoption of the framework will improve procedural skill training and patient safety.

  10. Large deviations

    CERN Document Server

    Deuschel, Jean-Dominique; Stroock, Daniel W.

    2001-01-01

    This is the second printing of the book first published in 1988. The first four chapters of the volume are based on lectures given by Stroock at MIT in 1987. They form an introduction to the basic ideas of the theory of large deviations and make a suitable package on which to base a semester-length course for advanced graduate students with a strong background in analysis and some probability theory. A large selection of exercises presents important material and many applications. The last two chapters present various non-uniform results (Chapter 5) and outline the analytic approach that allow

  11. Tamsulosin or Silodosin Adjuvant Treatment Is Ineffective in Improving Shockwave Lithotripsy Outcome: A Short-Term Follow-Up Randomized, Placebo-Controlled Study.

    Science.gov (United States)

    De Nunzio, Cosimo; Brassetti, Aldo; Bellangino, Mariangela; Trucchi, Alberto; Petta, Stefano; Presicce, Fabrizio; Tubaro, Andrea

    2016-07-01

    The role of α-blockers after shockwave lithotripsy (SWL) is controversial. The aim of our study was to evaluate the effect of tamsulosin and silodosin after SWL for kidney stones. From 2012 onward, a consecutive series of patients undergoing SWL were prospectively enrolled and randomized by closed envelopes into three groups receiving tamsulosin 0.4 mg (A), silodosin 8 mg (B), or placebo (C) daily for 21 days after SWL. Anthropometrics, stone size, and location were recorded before SWL. Visual analogue scale (VAS) scores were collected at 6, 12, and 24 hours after treatment to evaluate patients' discomfort. Stone-free rate was assessed 1 and 3 weeks postoperatively. Complications and medical treatment-related adverse events (AEs) were recorded. Differences in VAS score, stone-free rate, and complications were compared among the groups. Overall, 60 patients were enrolled. Mean stone sizes were 10.28 ± 2.46 mm, 10.45 ± 1.73 mm, and 9.23 ± 2.04 mm in groups A, B, and C, respectively (p = 0.474). There was no significant difference between the three groups with regard to stone location. Comparable energy was used to treat patients from the three groups. The overall 3-week stone-free rate was 53%: 58% in the tamsulosin group, 47% in the silodosin group, and 55% in the placebo group (p = 0.399). No significant differences were observed in the VAS scores reported by the groups at 6 hours (p = 1.254), 12 hours (p = 0.075), and 24 hours (p = 0.490). Overall, 12 complications were reported: 11 patients (7 in group C and 4 in group B) needed analgesics for colic, and 1 patient (group B) was surgically treated for Steinstrasse. Tamsulosin was superior to placebo (p = 0.008) and silodosin (p = 0.021) in preventing complications; no difference between silodosin and placebo (p = 0.629) was noted. Tamsulosin and silodosin are ineffective in increasing the stone-free rate as well as in reducing patients' early discomfort after extracorporeal

  12. Non-linear quenching of current fluctuations in a self-exciting homopolar dynamo, proved by feedback system theory

    Directory of Open Access Journals (Sweden)

    A. M. de Paor

    1998-01-01

    Full Text Available Hide (Nonlinear Processes in Geophysics, 1998) has produced a new mathematical model of a self-exciting homopolar dynamo driving a series-wound motor, as a continuing contribution to the theory of the geomagnetic field. By a process of exact perturbation analysis, followed by combination and partial solution of differential equations, the complete nonlinear quenching of current fluctuations reported by Hide in the case that a parameter ε has the value 1 is proved via the Popov theorem from feedback system stability theory.

  13. The Efficacy and Safety of Chinese Herbal Medicine Jinlida as Add-On Medication in Type 2 Diabetes Patients Ineffectively Managed by Metformin Monotherapy: A Double-Blind, Randomized, Placebo-Controlled, Multicenter Trial

    OpenAIRE

    Lian, Fengmei; Tian, Jiaxing; Chen, Xinyan; Li, Zhibin; Piao, Chunli; Guo, Junjie; Ma, Licheng; Zhao, Lijuan; Xia, Chengdong; Wang, Chong-Zhi; Yuan, Chun-Su; Tong, Xiaolin

    2015-01-01

    Background Metformin plays an important role in diabetes treatment. Studies have shown that the combined use of oral hypoglycemic medications is more effective than metformin monotherapy. In this double-blind, randomized, placebo-controlled, multicenter trial, we evaluated whether Jinlida, a Chinese herbal medicine, enhances the glycemic control of metformin in type 2 diabetes patients whose HbA1c was ineffectively controlled with metformin alone. Methods A total of 186 diabetes patients were...

  14. Filtered air plastic chamber as an experimental facility to prove visible damage of crops due to air pollution

    Energy Technology Data Exchange (ETDEWEB)

    Matsuoka, Y; Yoda, H; Omichi, S; Shiratori, K

    1975-01-01

    An experimental filtered air chamber was constructed to prove the visible damage of crops due to air pollution. The chamber was provided with another room into which non-filtered ambient air was introduced. The purified air was prepared by filtering ambient air with activated carbon. The average content of air pollutants in the purified air chamber was less than 10 to 20% of the ozone and 20% of the sulfur oxides in the ambient air. Cultivated crops such as tobacco and spinach, which are susceptible to oxidants, showed no visible damage in the filtered air chamber, and showed the same damage in the non-filtered air chamber as was seen in fields at the same time.

  15. Historical wildlife dynamics on Dugway Proving Ground: population and disease trends in jack rabbits over two decades. [Lepus californicus

    Energy Technology Data Exchange (ETDEWEB)

    Eberhardt, L.E.; Van Voris, P.

    1986-08-01

    In an effort to determine whether US Army activities on the Dugway Proving Ground (DPG) have had an impact on resident wildlife, intensive studies have been conducted on the biology and ecology of the black-tailed jack rabbit (Lepus californicus) since 1965. In addition, the incidence of endemic diseases in several species of resident wildlife on the DPG has been studied from the late 1950s through the mid-1970s. The objectives of this report are to: (1) compile and summarize the jack rabbit data and some of the disease information that is presently contained only in annual reports; (2) compare the DPG jack rabbit data to data available on other jack rabbit populations; and (3) analyze the data for unusual or unexplained fluctuations in population densities or in incidence of disease.

  16. Methodology of proving the long-term safety of a salt dome repository against the background of existing uncertainties

    International Nuclear Information System (INIS)

    Storck, R.

    1992-01-01

    Existing methods of proving safety can take the uncertainties of input data into account within the framework of probabilistic analyses. The results of application calculations show that, in spite of the considerable bandwidths of the input data, the scattering widths of the radiation exposures are comparatively limited, and the calculated radiation exposures are clearly below acceptable limits. Moreover, it can be demonstrated that in the event of an assumed brine influx into the repository, radionuclides are released only if parameter combinations are unfavourable. Therefore, such an incident in general does not have any radiological consequences. Uncertainties in model approaches can so far be taken into consideration only partly, by using alternative models or indirectly through data uncertainties. (orig./DG) [de

  17. Long-term fate of depleted uranium at Aberdeen and Yuma Proving Grounds: Human health and ecological risk assessments

    International Nuclear Information System (INIS)

    Ebinger, M.H.; Beckman, R.J.; Myers, O.B.; Kennedy, P.L.; Clements, W.; Bestgen, H.T.

    1996-09-01

    The purpose of this study was to evaluate the immediate and long-term consequences of depleted uranium (DU) in the environment at Aberdeen Proving Ground (APG) and Yuma Proving Ground (YPG) for the Test and Evaluation Command (TECOM) of the US Army. Specifically, we examined the potential for adverse radiological and toxicological effects to humans and ecosystems caused by exposure to DU at both installations. We developed contaminant transport models of aquatic and terrestrial ecosystems at APG and terrestrial ecosystems at YPG to assess potential adverse effects from DU exposure. Sensitivity and uncertainty analyses of the initial models showed the portions of the models that most influenced predicted DU concentrations, and the results of the sensitivity analyses were fundamental tools in designing field sampling campaigns at both installations. Results of uranium (U) isotope analyses of field samples provided data to evaluate the source of U in the environment and the toxicological and radiological doses to different ecosystem components and to humans. Probabilistic doses were estimated from the field data, and DU was identified in several components of the food chain at APG and YPG. Dose estimates from APG data indicated that U or DU uptake was insufficient to cause adverse toxicological or radiological effects. Dose estimates from YPG data indicated that U or DU uptake is insufficient to cause radiological effects in ecosystem components or in humans, but toxicological effects in small mammals (e.g., kangaroo rats and pocket mice) may occur from U or DU ingestion. The results of this study were used to modify environmental radiation monitoring plans at APG and YPG to ensure collection of adequate data for ongoing ecological and human health risk assessments

  18. Demonstration of the Military Ecological Risk Assessment Framework (MERAF): Apache Longbow - Hellfire Missile Test at Yuma Proving Ground

    Energy Technology Data Exchange (ETDEWEB)

    Efroymson, R.A.

    2002-05-09

    This ecological risk assessment for a testing program at Yuma Proving Ground, Arizona, is a demonstration of the Military Ecological Risk Assessment Framework (MERAF; Suter et al. 2001). The demonstration is intended to illustrate how risk assessment guidance concerning generic military training and testing activities and guidance concerning a specific type of activity (e.g., low-altitude aircraft overflights) may be implemented at a military installation. MERAF was developed with funding from the Strategic Environmental Research and Development Program (SERDP) of the Department of Defense. Novel aspects of MERAF include: (1) the assessment of risks from physical stressors using an ecological risk assessment framework, (2) the consideration of contingent or indirect effects of stressors (e.g., population-level effects that are derived from habitat or hydrological changes), (3) the integration of risks associated with different component activities or stressors, (4) the emphasis on quantitative risk estimates and estimates of uncertainty, and (5) the modularity of design, permitting components of the framework to be used in various military risk assessments that include similar activities. The particular subject of this report is the assessment of ecological risks associated with a testing program at Cibola Range of Yuma Proving Ground, Arizona. The program involves an Apache Longbow helicopter firing Hellfire missiles at moving targets, i.e., M60-A1 tanks. Thus, the three component activities of the Apache-Hellfire test were: (1) helicopter overflight, (2) missile firing, and (3) tracked vehicle movement. The demonstration was limited to two ecological endpoint entities (i.e., potentially susceptible and valued populations or communities): woody desert wash communities and mule deer populations. The core assessment area is composed of about 126 km² between the Chocolate and Middle Mountains. The core time of the program is a three-week period, including fourteen days of

  19. JPSS Proving Ground Activities with NASA's Short-term Prediction Research and Transition (SPoRT) Center

    Science.gov (United States)

    Schultz, L. A.; Smith, M. R.; Fuell, K.; Stano, G. T.; LeRoy, A.; Berndt, E.

    2015-12-01

    Instruments aboard the Joint Polar Satellite System (JPSS) series of satellites will provide imagery and other data sets relevant to operational weather forecasts. To prepare current and future weather forecasters in application of these data sets, Proving Ground activities have been established that demonstrate future JPSS capabilities through use of similar sensors aboard NASA's Terra and Aqua satellites, and the S-NPP mission. As part of these efforts, NASA's Short-term Prediction Research and Transition (SPoRT) Center in Huntsville, Alabama partners with near real-time providers of S-NPP products (e.g., NASA, UW/CIMSS, UAF/GINA, etc.) to demonstrate future capabilities of JPSS. This includes training materials and product distribution of multi-spectral false color composites of the visible, near-infrared, and infrared bands of MODIS and VIIRS. These are designed to highlight phenomena of interest to help forecasters digest the multispectral data provided by the VIIRS sensor. In addition, forecasters have been trained on the use of the VIIRS day-night band, which provides imagery of moonlit clouds, surface, and lights emitted by human activities. Hyperspectral information from the S-NPP/CrIS instrument provides thermodynamic profiles that aid in the detection of extremely cold air aloft, helping to map specific aviation hazards at high latitudes. Hyperspectral data also support the estimation of ozone concentration, which can highlight the presence of much drier stratospheric air, and map its interaction with mid-latitude or tropical cyclones to improve predictions of their strengthening or decay. Proving Ground activities are reviewed, including training materials and methods that have been provided to forecasters, and forecaster feedback on these products, acquired through formal, detailed assessment of their applicability to a given forecast threat or task. Future opportunities for collaborations around the delivery of training are proposed.

  20. Long-term fate of depleted uranium at Aberdeen and Yuma Proving Grounds: Human health and ecological risk assessments

    Energy Technology Data Exchange (ETDEWEB)

    Ebinger, M.H.; Beckman, R.J.; Myers, O.B. [Los Alamos National Lab., NM (United States); Kennedy, P.L.; Clements, W.; Bestgen, H.T. [Colorado State Univ., Ft. Collins, CO (United States). Dept. of Fishery and Wildlife Biology

    1996-09-01

    The purpose of this study was to evaluate the immediate and long-term consequences of depleted uranium (DU) in the environment at Aberdeen Proving Ground (APG) and Yuma Proving Ground (YPG) for the Test and Evaluation Command (TECOM) of the US Army. Specifically, we examined the potential for adverse radiological and toxicological effects to humans and ecosystems caused by exposure to DU at both installations. We developed contaminant transport models of aquatic and terrestrial ecosystems at APG and terrestrial ecosystems at YPG to assess potential adverse effects from DU exposure. Sensitivity and uncertainty analyses of the initial models showed the portions of the models that most influenced predicted DU concentrations, and the results of the sensitivity analyses were fundamental tools in designing field sampling campaigns at both installations. Results of uranium (U) isotope analyses of field samples provided data to evaluate the source of U in the environment and the toxicological and radiological doses to different ecosystem components and to humans. Probabilistic doses were estimated from the field data, and DU was identified in several components of the food chain at APG and YPG. Dose estimates from APG data indicated that U or DU uptake was insufficient to cause adverse toxicological or radiological effects. Dose estimates from YPG data indicated that U or DU uptake is insufficient to cause radiological effects in ecosystem components or in humans, but toxicological effects in small mammals (e.g., kangaroo rats and pocket mice) may occur from U or DU ingestion. The results of this study were used to modify environmental radiation monitoring plans at APG and YPG to ensure collection of adequate data for ongoing ecological and human health risk assessments.

  1. Demonstration of the Military Ecological Risk Assessment Framework (MERAF): Apache Longbow - Hellfire Missile Test at Yuma Proving Ground

    International Nuclear Information System (INIS)

    Efroymson, R.A.

    2002-01-01

    This ecological risk assessment for a testing program at Yuma Proving Ground, Arizona, is a demonstration of the Military Ecological Risk Assessment Framework (MERAF; Suter et al. 2001). The demonstration is intended to illustrate how risk assessment guidance concerning generic military training and testing activities and guidance concerning a specific type of activity (e.g., low-altitude aircraft overflights) may be implemented at a military installation. MERAF was developed with funding from the Strategic Environmental Research and Development Program (SERDP) of the Department of Defense. Novel aspects of MERAF include: (1) the assessment of risks from physical stressors using an ecological risk assessment framework, (2) the consideration of contingent or indirect effects of stressors (e.g., population-level effects that are derived from habitat or hydrological changes), (3) the integration of risks associated with different component activities or stressors, (4) the emphasis on quantitative risk estimates and estimates of uncertainty, and (5) the modularity of design, permitting components of the framework to be used in various military risk assessments that include similar activities. The particular subject of this report is the assessment of ecological risks associated with a testing program at Cibola Range of Yuma Proving Ground, Arizona. The program involves an Apache Longbow helicopter firing Hellfire missiles at moving targets, i.e., M60-A1 tanks. Thus, the three component activities of the Apache-Hellfire test were: (1) helicopter overflight, (2) missile firing, and (3) tracked vehicle movement. The demonstration was limited to two ecological endpoint entities (i.e., potentially susceptible and valued populations or communities): woody desert wash communities and mule deer populations. The core assessment area is composed of about 126 km² between the Chocolate and Middle Mountains. The core time of the program is a three-week period, including fourteen days of

  2. Ultra-Lightweight Large Aperture Support Structures, Phase I

    Data.gov (United States)

    National Aeronautics and Space Administration — Ultra-lightweight membranes may prove to be very attractive for large aperture systems, but their value will be fully realized only if they are mated with equally...

  3. An explicit local uniform large deviation bound for Brownian bridges

    NARCIS (Netherlands)

    Wittich, O.

    2005-01-01

    By comparing curve length in a manifold and a standard sphere, we prove a local uniform bound for the exponent in the Large Deviation formula that describes the concentration of Brownian bridges to geodesics.
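For orientation, in Schilder-type large deviation statements for scaled Brownian paths (a standard textbook form; the paper's manifold-specific bound refines the exponent), the rate function is the path's energy, so concentration near geodesics corresponds to energy minimization:

```latex
I(\gamma) = \frac{1}{2} \int_{0}^{1} \lVert \dot{\gamma}(t) \rVert^{2} \, dt ,
\qquad
\mathbb{P}\bigl( \sqrt{\varepsilon}\, B \approx \gamma \bigr)
  \asymp \exp\bigl( -\, I(\gamma) / \varepsilon \bigr)
\quad \text{as } \varepsilon \to 0 .
```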

  4. Large ethics.

    Science.gov (United States)

    Chambers, David W

    2008-01-01

    This essay presents an alternative to the traditional view that ethics means judging individual behavior against standards of right and wrong. Instead, ethics is understood as creating ethical communities through the promises we make to each other. The "aim" of ethics is to demonstrate in our own behavior a credible willingness to work to create a mutually better world. The "game" of ethics then becomes searching for strategies that overlap with others' strategies so that we are all better for intending to act on a basis of reciprocal trust. This is a difficult process because we have partial, simultaneous, shifting, and inconsistent views of the world. But despite the reality that we each "frame" ethics in personal terms, it is still possible to create sufficient common understanding to prosper together. Large ethics does not make it a prerequisite for moral behavior that everyone adheres to a universally agreed set of ethical principles; all that is necessary is sufficient overlap in commitment to searching for better alternatives.

  5. Use of large sources and accelerators

    International Nuclear Information System (INIS)

    1969-01-01

    A comprehensive review of applications of large radiation sources and accelerators in industrial processing was made at a symposium held in Munich during August. Reports presented dealt with industrial work already proved to be practical, projects in an advanced stage of development and with others in which there appears to be significant potential. (author)

  6. An Electrosurgical Endoknife with a Water-Jet Function (Flushknife) Proves Its Merits in Colorectal Endoscopic Submucosal Dissection, Especially for Cases Which Should Be Removed En Bloc

    Directory of Open Access Journals (Sweden)

    Yoji Takeuchi

    2013-01-01

    Background. Previously, we reported that the Flushknife (an electrosurgical endoknife with a water-jet function) could reduce the operation time of colorectal endoscopic submucosal dissection (ESD); however, the situations in which the Flushknife is most suitable remained unclear. This subgroup analysis of a prospective randomized controlled trial aimed to identify those situations. Methods. A total of 48 superficial colorectal neoplasms that underwent ESD using either the Flexknife or the Flushknife in a referral center were enrolled. The differences in operation time between the Flexknife and the Flushknife groups in each subgroup (tumor size, location, and macroscopic type) were analyzed. Results. Median (95% CI) operation time calculated using survival curves was significantly shorter in the Flushknife group than in the Flexknife group (55.5 [41, 78] min versus 74.0 [57, 90] min; hazard ratio (HR): 0.53; 95% CI 0.29–0.97). In particular, the HR in patients with laterally spreading tumors-nongranular type (LST-NG) in the Flushknife group was significantly smaller than in the Flexknife group (HR: 0.17; 95% CI 0.04–0.66). HRs tended to decrease with larger lesion size. Conclusions. The Flushknife proved its merits in colorectal ESD, especially for lesions which should be removed en bloc (LST-NG and large lesions).

  7. Heart rate at discharge and long-term prognosis following percutaneous coronary intervention in stable and acute coronary syndromes--results from the BASKET PROVE trial.

    Science.gov (United States)

    Jensen, Magnus Thorsten; Kaiser, Christoph; Sandsten, Karl Erik; Alber, Hannes; Wanitschek, Maria; Iversen, Allan; Jensen, Jan Skov; Pedersen, Sune; Soerensen, Rikke; Rickli, Hans; Zurek, Marzena; Fahrni, Gregor; Bertel, Osmund; De Servi, Stefano; Erne, Paul; Pfisterer, Matthias; Galatius, Søren

    2013-10-09

    Elevated heart rate (HR) is associated with mortality in a number of heart diseases. We examined the long-term prognostic significance of HR at discharge in a contemporary population of patients with stable angina (SAP), non-ST-segment elevation acute coronary syndromes (NSTE-ACS), and ST-segment elevation myocardial infarction (STEMI) revascularized with percutaneous coronary intervention (PCI). Patients from the BASKET-PROVE trial, an 11-center randomized all-comers trial comparing bare-metal and drug-eluting stenting in large coronary vessels, were included. Discharge HR was determined from a resting ECG. Long-term outcomes (7 days to 2 years) were evaluated for all-cause mortality and cardiovascular death and non-fatal myocardial infarction. A total of 2029 patients with sinus rhythm were included, 722 (35.6%) SAP, 647 (31.9%) NSTE-ACS, and 660 (32.5%) STEMI. Elevated discharge HR was significantly associated with all-cause mortality, with risk increasing relative to a reference of <60 bpm. For cardiovascular death/myocardial infarction, a discharge HR >90 bpm was associated with a hazard ratio of 6.2 (2.5-15.5). In patients with acute coronary syndromes, an elevated discharge HR was independently associated with poor prognosis. Conversely, a HR <60 bpm at discharge was associated with a good long-term prognosis irrespective of indication for PCI. © 2013.
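
As a hedged illustration of the kind of quantity behind such a result (all counts below are invented; the trial's HR of 6.2 came from a proper adjusted survival analysis, not from this crude calculation), a raw incidence-rate ratio between discharge-HR groups can be computed directly:

```python
def incidence_rate_ratio(events_a, pyears_a, events_b, pyears_b):
    """Crude ratio of event rates (events per person-year) between two groups."""
    return (events_a / pyears_a) / (events_b / pyears_b)

# Hypothetical counts: 12 deaths over 150 person-years in a >90 bpm group
# versus 10 deaths over 1900 person-years in a <60 bpm reference group.
irr = incidence_rate_ratio(12, 150, 10, 1900)
print(round(irr, 1))  # → 15.2
```

Unlike a Cox model hazard ratio, this ratio is unadjusted for covariates, which is one reason it can differ sharply from a published HR.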

  8. Seeing a Colleague Encourage a Student to Make an Assumption while Proving: What Teachers Put in Play when Casting an Episode of Instruction

    Science.gov (United States)

    Nachlieli, Talli; Herbst, Patricio

    2009-01-01

    This article reports on an investigation of how teachers of geometry perceived an episode of instruction presented to them as a case of engaging students in proving. Confirming what was hypothesized, participants found it remarkable that a teacher would allow a student to make an assumption while proving. But they perceived this episode in various…

  9. 20 CFR 1002.33 - Does the employee have to prove that the employer discriminated against him or her in order to be...

    Science.gov (United States)

    2010-04-01

    ... employer discriminated against him or her in order to be eligible for reemployment? 1002.33 Section 1002.33... have to prove that the employer discriminated against him or her in order to be eligible for reemployment? No. The employee is not required to prove that the employer discriminated against him or her...

  10. An optimized groundwater extraction system for the toxic burning pits area of J-Field, Aberdeen Proving Ground, Maryland

    Energy Technology Data Exchange (ETDEWEB)

    Quinn, J.J.; Johnson, R.L.; Patton, T.L.; Martino, L.E.

    1996-06-01

    Testing and disposal of chemical warfare agents, munitions, and industrial chemicals at the J-Field area of the Aberdeen Proving Ground (APG) have resulted in contamination of soil and groundwater. The discharge of contaminated groundwater to on-site marshes and adjacent estuaries poses a potential risk to ecological receptors. The Toxic Burning Pits (TBP) area is of special concern because of its disposal history. This report describes a groundwater modeling study conducted at J-Field that focused on the TBP area. The goal of this modeling effort was optimization of the groundwater extraction system at the TBP area by applying linear programming techniques. Initially, the flow field in the J-Field vicinity was characterized with a three-dimensional model that uses existing data and several numerical techniques. A user-specified border was set near the marsh and used as a constraint boundary in two modeled remediation scenarios: containment of the groundwater and containment of groundwater with an impermeable cap installed over the TBP area. In both cases, the objective was to extract the minimum amount of water necessary while satisfying the constraints. The smallest number of wells necessary was then determined for each case. This optimization approach provided two benefits: cost savings, in that the water to be treated and the well installation costs were minimized, and minimization of remediation impacts on the ecology of the marsh.
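
The linear-programming formulation described above (minimize total extraction subject to containment constraints at the user-specified border) can be sketched in miniature. The two-well toy problem below uses invented response coefficients rather than values from the J-Field flow model, and finds the minimum-pumping vertex of the feasible region:

```python
from itertools import combinations

# Drawdown response (m per unit pumping rate) at two border points for two
# hypothetical wells, and the required drawdown at each point.  All numbers
# are invented for illustration.
A = [[0.04, 0.01],
     [0.02, 0.03]]
b = [1.0, 1.2]

def feasible(q, tol=1e-9):
    """Non-negative rates that meet both drawdown constraints."""
    return (all(qi >= -tol for qi in q) and
            all(A[i][0]*q[0] + A[i][1]*q[1] >= b[i] - tol for i in range(2)))

# An optimum of a 2-variable LP lies where two constraint lines intersect
# (the drawdown constraints or the axes q1 = 0, q2 = 0), so enumerate those
# vertices and keep the feasible one with the least total pumping.
lines = [(A[0][0], A[0][1], b[0]),
         (A[1][0], A[1][1], b[1]),
         (1.0, 0.0, 0.0),          # axis q1 = 0
         (0.0, 1.0, 0.0)]          # axis q2 = 0
vertices = []
for (a1, b1, c1), (a2, b2, c2) in combinations(lines, 2):
    det = a1*b2 - a2*b1
    if abs(det) < 1e-12:
        continue                   # parallel lines, no vertex
    q = ((c1*b2 - c2*b1) / det, (a1*c2 - a2*c1) / det)
    if feasible(q):
        vertices.append(q)

q_opt = min(vertices, key=sum)     # minimum total extraction rate
```

A production study would solve the same structure with a real LP solver over many wells and constraint points; the vertex enumeration here just makes the geometry of the optimization visible.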

  11. The “incredible” difficulty of proving “incredibility” – Example of fire-induced multiple spurious operations

    International Nuclear Information System (INIS)

    Gallucci, Raymond H.V.

    2016-01-01

    “Risk-informed” regulation is often an alternative to “deterministically-based” regulation that offers relaxation in criteria for acceptability while possibly requiring greater analytical effort. “Risk-informed determinism” is an attempt to meld the best of both worlds by using risk information to set deterministic acceptance criteria a priori. A recent joint effort by the US Nuclear Regulatory Commission’s Office of Nuclear Regulatory Research (RES) and Electric Power Research Institute (EPRI) originally endeavored to do this for several examples involving fire-induced multiple spurious operations (MSOs) in electrical circuits at nuclear power plants. While a noble effort, this did not consider the actual distributions involved in the events, originally limiting the analysis to mean values and, in some cases, qualitative considerations. A much more comprehensive and defensible approach is performed here where the probabilistic distributions for all the factors are considered via simulation to meet quantitative acceptance criteria related to the concept of “incredibility” that is often the figure of merit that must be met in a deterministic world. The effort demonstrates that it can be “incredibly” difficult to prove “incredibility” in this context.
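
A minimal sketch of the simulation approach the abstract describes, under loudly assumed inputs (the lognormal fire frequency, the beta conditional probabilities, and the 1e-6/yr screening threshold are all illustrative, not taken from the RES/EPRI work): propagate whole distributions through the MSO sequence instead of multiplying mean values:

```python
import random

random.seed(42)  # reproducible sketch

def sequence_frequency():
    """One sampled frequency (per year) for a fire-induced MSO sequence."""
    fire_freq = random.lognormvariate(-9.0, 1.0)  # fire frequency in the area
    p_damage = random.betavariate(2.0, 18.0)      # P(circuit damage | fire)
    p_mso = random.betavariate(1.5, 28.5)         # P(spurious operation | damage)
    return fire_freq * p_damage * p_mso

N = 100_000
samples = [sequence_frequency() for _ in range(N)]
mean_freq = sum(samples) / N
frac_above = sum(f > 1e-6 for f in samples) / N   # fraction failing the screen
```

Even when the mean frequency sits near the threshold, a substantial fraction of the sampled distribution can exceed it, which is exactly why proving "incredibility" from mean values alone is difficult.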

  12. The Apache Longbow-Hellfire Missile Test at Yuma Proving Ground: Ecological Risk Assessment for Missile Firing

    International Nuclear Information System (INIS)

    Jones, Daniel Steven; Efroymson, Rebecca Ann; Hargrove, William Walter; Suter, Glenn; Pater, Larry

    2008-01-01

    A multiple stressor risk assessment was conducted at Yuma Proving Ground, Arizona, as a demonstration of the Military Ecological Risk Assessment Framework. The focus was a testing program at Cibola Range, which involved an Apache Longbow helicopter firing Hellfire missiles at moving targets, M60-A1 tanks. This paper describes the ecological risk assessment for the missile launch and detonation. The primary stressor associated with this activity was sound. Other minor stressors included the detonation impact, shrapnel, and fire. Exposure to desert mule deer (Odocoileus hemionus crooki) was quantified using the Army sound contour program BNOISE2, as well as distances from the explosion to deer. Few effects data were available from related studies. Exposure-response models for the characterization of effects consisted of human 'disturbance' and hearing damage thresholds in units of C-weighted decibels (sound exposure level) and a distance-based No Observed Adverse Effects Level for moose and cannonfire. The risk characterization used a weight-of-evidence approach and concluded that risk to mule deer behavior from the missile firing was likely for a negligible number of deer, but that no risk to mule deer abundance and reproduction is expected.

  13. The “incredible” difficulty of proving “incredibility” – Example of fire-induced multiple spurious operations

    Energy Technology Data Exchange (ETDEWEB)

    Gallucci, Raymond H.V., E-mail: Ray.Gallucci@nrc.gov

    2016-11-15

    “Risk-informed” regulation is often an alternative to “deterministically-based” regulation that offers relaxation in criteria for acceptability while possibly requiring greater analytical effort. “Risk-informed determinism” is an attempt to meld the best of both worlds by using risk information to set deterministic acceptance criteria a priori. A recent joint effort by the US Nuclear Regulatory Commission’s Office of Nuclear Regulatory Research (RES) and Electric Power Research Institute (EPRI) originally endeavored to do this for several examples involving fire-induced multiple spurious operations (MSOs) in electrical circuits at nuclear power plants. While a noble effort, this did not consider the actual distributions involved in the events, originally limiting the analysis to mean values and, in some cases, qualitative considerations. A much more comprehensive and defensible approach is performed here where the probabilistic distributions for all the factors are considered via simulation to meet quantitative acceptance criteria related to the concept of “incredibility” that is often the figure of merit that must be met in a deterministic world. The effort demonstrates that it can be “incredibly” difficult to prove “incredibility” in this context.

  14. The large deviations theorem and ergodicity

    International Nuclear Information System (INIS)

    Gu Rongbao

    2007-01-01

    In this paper, some relationships between stochastic and topological properties of dynamical systems are studied. For a continuous map f from a compact metric space X into itself, we show that if f satisfies the large deviations theorem then it is topologically ergodic. Moreover, we introduce the notion of topologically strong ergodicity, and prove that if f is a topologically strongly ergodic map satisfying the large deviations theorem then it is sensitively dependent on initial conditions.

  15. Ex Vivo and In Vivo Mice Models to Study Blastocystis spp. Adhesion, Colonization and Pathology: Closer to Proving Koch's Postulates.

    Directory of Open Access Journals (Sweden)

    Sitara S R Ajjampur

    Blastocystis spp. are widely prevalent extracellular, non-motile anaerobic protists that inhabit the gastrointestinal tract. Although Blastocystis spp. have been associated with gastrointestinal symptoms, irritable bowel syndrome and urticaria, their clinical significance has remained controversial. We established an ex vivo mouse explant model to characterize adhesion in the context of tissue architecture and presence of the mucin layer. Using confocal microscopy with tissue whole mounts and two axenic isolates of Blastocystis spp. subtype 7 with notable differences in adhesion to intestinal epithelial cells (IEC), isolate B (ST7-B) and isolate H (more adhesive, ST7-H), we showed that adhesion is both isolate dependent and tissue trophic. The more adhesive isolate, ST7-H, was found to bind preferentially to colon tissue rather than caecum and terminal ileum. Both isolates were also found to have mucinolytic effects. We then adapted a DSS colitis mouse model as a susceptible model to study colonization and acute infection by intra-caecal inoculation of trophic Blastocystis spp. cells. We found that the more adhesive isolate ST7-H was also a better colonizer, with more mice shedding parasites and for a longer duration than ST7-B. Adhesion and colonization were also associated with increased virulence, as ST7-H-infected mice showed greater tissue damage than ST7-B-infected mice. Both the ex vivo and in vivo models used in this study showed that Blastocystis spp. remain luminal and predominantly associated with mucin. This was further confirmed using colonic loop experiments. We were also successfully able to re-infect a second batch of mice with ST7-H isolates obtained from fecal cultures and demonstrated similar histopathological findings and tissue damage, thereby coming closer to proving Koch's postulates for this parasite.

  16. Remedial investigation sampling and analysis plan for J-Field, Aberdeen Proving Ground, Maryland. Volume 1: Field Sampling Plan

    Energy Technology Data Exchange (ETDEWEB)

    Benioff, P.; Biang, R.; Dolak, D.; Dunn, C.; Martino, L.; Patton, T.; Wang, Y.; Yuen, C.

    1995-03-01

    The Environmental Management Division (EMD) of Aberdeen Proving Ground (APG), Maryland, is conducting a remedial investigation and feasibility study (RI/FS) of the J-Field area at APG pursuant to the Comprehensive Environmental Response, Compensation, and Liability Act (CERCLA), as amended. J-Field is within the Edgewood Area of APG in Harford County, Maryland (Figure 1.1). Since World War II, activities in the Edgewood Area have included the development, manufacture, testing, and destruction of chemical agents and munitions. These materials were destroyed at J-Field by open burning and open detonation (OB/OD). Considerable archival information about J-Field exists as a result of efforts by APG staff to characterize the hazards associated with the site. Contamination of J-Field was first detected during an environmental survey of the Edgewood Area conducted in 1977 and 1978 by the US Army Toxic and Hazardous Materials Agency (USATHAMA) (predecessor to the US Army Environmental Center [AEC]). As part of a subsequent USATHAMA environmental survey, 11 wells were installed and sampled at J-Field. Contamination at J-Field was also detected during a munitions disposal survey conducted by Princeton Aqua Science in 1983. The Princeton Aqua Science investigation involved the installation and sampling of nine wells and the collection and analysis of surficial and deep composite soil samples. In 1986, a Resource Conservation and Recovery Act (RCRA) permit (MD3-21-002-1355) requiring a basewide RCRA Facility Assessment (RFA) and a hydrogeologic assessment of J-Field was issued by the US Environmental Protection Agency (EPA). In 1987, the US Geological Survey (USGS) began a two-phased hydrogeologic assessment in which data were collected to model groundwater flow at J-Field. Soil gas investigations were conducted, several well clusters were installed, a groundwater flow model was developed, and groundwater and surface water monitoring programs were established that continue today.

  17. Work plan for focused feasibility study of the toxic burning pits area at J-Field, Aberdeen Proving Ground, Maryland

    Energy Technology Data Exchange (ETDEWEB)

    Biang, C.; Benioff, P.; Martino, L.; Patton, T.

    1995-03-01

    The Environmental Management Division (EMD) of Aberdeen Proving Ground (APG), Maryland, is conducting a remedial investigation and feasibility study (RI/FS) of the J-Field area at APG pursuant to the Comprehensive Environmental Response, Compensation, and Liability Act, as amended (CERCLA). J-Field is within the Edgewood Area of APG in Harford County, Maryland. Since World War II, activities in the Edgewood Area have included the development, manufacture, testing, and destruction of chemical agents and munitions. These materials were destroyed at J-Field by open burning and open detonation (OB/OD). Considerable archival information about J-Field exists as a result of efforts by APG staff to characterize the hazards associated with the site. Contamination of J-Field was first detected during an environmental survey of the Edgewood Area conducted in 1977 and 1978 by the US Army Toxic and Hazardous Materials Agency (USATHAMA) (predecessor to the US Army Environmental Center). As part of a subsequent USATHAMA environmental survey, 11 wells were installed and sampled at J-Field. Contamination at J-Field was also detected during a munitions disposal survey conducted by Princeton Aqua Science in 1983. The Princeton Aqua Science investigation involved the installation and sampling of nine wells and the collection and analysis of surficial and deep composite soil samples. In 1986, a Resource Conservation and Recovery Act (RCRA) permit (MD3-21-002-1355) requiring a basewide RCRA Facility Assessment (RFA) and a hydrogeologic assessment of J-Field was issued by the US Environmental Protection Agency (EPA). In 1987, the US Geological Survey (USGS) began a two-phased hydrogeologic assessment in which data were collected to model groundwater flow at J-Field. Soil gas investigations were conducted, several well clusters were installed, a groundwater flow model was developed, and groundwater and surface water monitoring programs were established that continue today.

  18. Nuclear Bombs and Coral: Guam Coral Core Reveals Operation-Specific Radiocarbon Signals from the Pacific Proving Grounds

    Science.gov (United States)

    Andrews, A. H.

    2016-12-01

    Radiocarbon (14C) analyses on a coral core extracted from the western Central Pacific (Guam) has revealed a series of early peaks in the marine bomb 14C record. The typical marine bomb 14C signal, one that is phase lagged and attenuated relative to atmospheric bomb 14C, is present in the coral core and is consistent with other North Pacific records. However, 14C levels that are well above what can be explained by air-sea diffusion alone punctuate this pattern. This anomaly has been demonstrated to a limited extent in other coral cores of the Indo-Pacific region, but is unmatched relative to the magnitude and temporal resolution recorded in the Guam coral core. Other records have shown an early Δ14C rise on the order of 40-50‰ above pre-bomb levels, with a subsequent decline before continuing the gradual Δ14C rise that is indicative of air-sea diffusion of 14CO2. The Guam coral Δ14C record provided three strong pulses in 1954-55, 1956-57, and 1958-59 that are superimposed on the pre-bomb to initial Δ14C rise from atmospheric bomb 14C. Each of these peaks can be directly linked to testing of thermonuclear devices in the Pacific Proving Grounds at Eniwetok and Bikini Atoll of the Marshall Islands. The measurable lag in reaching Guam can be tied to ocean surface currents and can be traced to other regional Δ14C records from corals, providing a transport timeline to places as distant as the Indonesian throughflow, Okinawa and Palmyra.

  19. Microstructural characterization and Charpy impact testing of Friction Stir Welding and Linear Friction Welding joints in metal matrix composites

    Directory of Open Access Journals (Sweden)

    M. Merlin

    2010-04-01

    In this study, Friction Stir Welding and Linear Friction Welding joints were characterized in composites with an aluminium-alloy matrix and ceramic particle reinforcement. The FSW process was applied to two composites produced by casting, then extruded and T6 heat treated: AA6061/20%vol.Al2O3p and AA7005/10%vol.Al2O3p. The LFW joints were instead produced on a composite with an aluminium-alloy matrix and silicon carbide particle reinforcement, obtained by powder metallurgy, then forged and T4 heat treated: AA2124/25%vol.SiCp. The effects of welding on the microstructural characteristics of the joints were examined using optical microscopy with image analysis and scanning electron microscopy (SEM) with energy-dispersive spectroscopy (EDS). Charpy impact tests were then carried out with an instrumented pendulum. Damage mechanisms were studied by SEM analysis of the fracture surfaces. Both welding processes produced joints essentially free of defects. The microstructure of the weld beads depended both on the initial microstructural characteristics of the composites and on the type of welding process. For the FSW-welded AA6061/20%Al2O3p and AA7005/10%Al2O3p composites, a substantial increase in impact strength relative to the base material was observed, as a consequence of the matrix grain refinement and of the reduction in average size and angularity of the reinforcement particles induced by the welding process. The LFW-welded AA2124/25%SiCp composite showed impact strength values comparable with those of the base material, mainly because welding had only limited effects on the size and distribution of the reinforcement particles.

  20. MX Siting Investigation. Volume IIB. Geotechnical Report, Yuma Proving Grounds/Luke-Williams Bombing and Gunnery Range (YPG/LWBGR).

    Science.gov (United States)

    1975-06-30

    ... within YPG/LWBGR is largely transient and consists primarily of military, civil service, and contractual personnel totaling approximately 2000...

  1. Simultaneous perineal ultrasound and vaginal pressure measurement prove the action of electrical pudendal nerve stimulation in treating female stress incontinence.

    Science.gov (United States)

    Wang, Siyou; Zhang, Shujing

    2012-11-01

    Study Type - Diagnostic (case series) Level of Evidence 4. What's known on the subject? and What does the study add? Pelvic floor muscle training (PFMT) and transvaginal electrical stimulation (TES) are two commonly used forms of conservative treatment for stress urinary incontinence (SUI). PFMT may build up the structural support of the pelvis, but many SUI patients are unable to perform PFMT effectively and its primary disadvantage is lack of long-term patient compliance. TES is a passive treatment that produces PFM contraction and patient compliance with it is good; however, its effect is not as good as that of PFMT when performed correctly. Electrical pudendal nerve stimulation (EPNS) combines the advantages of PFMT and TES and incorporates the technique of deep insertion of long needles. In this study, simultaneous perineal ultrasound and vaginal pressure measurement prove that EPNS can contract the PFM and simulate PFMT. It is shown that EPNS is an alternative therapy for female SUI patients who fail PFMT and TES and the therapy can also be used for severe SUI. • To prove that electrical pudendal nerve stimulation (EPNS) can contract the pelvic floor muscles (PFM) and simulate pelvic floor muscle training (PFMT). • To show that EPNS is an alternative therapy for female stress urinary incontinence (SUI) that does not respond effectively to PFMT and transvaginal electrical stimulation (TES). • Thirty-five female patients with SUI who did not respond effectively to PFMT and TES (group I) were enrolled and 60 other female patients with SUI were allocated to group II (30 patients) and group III (30 patients). • Long needles were deeply inserted into four sacral points and electrified to stimulate the pudendal nerves. Group I and group II were treated by a doctor skilled in performing EPNS and group III, by a doctor unskilled in performing EPNS. • When EPNS was performed in group I, perineal ultrasonographic PFM movements, vaginal pressure (VP) and PFM

  2. The U.S. Army Occupational and Environmental Medicine Residency at Aberdeen Proving Ground, Maryland: 1960-1996.

    Science.gov (United States)

    Gaydos, Joel C; Mallon, Timothy M; Rice, William A

    2016-11-01

    Reorganization of the Army and critical assessment of Army Graduate Medical Education programs prompted the Occupational and Environmental Medicine (OEM) Consultant to the Army Surgeon General to initiate a review of current Army OEM residency training. Available information indicated the Army OEM residency at Aberdeen Proving Ground, MD, was the first and longest operating Army OEM residency. Describing this residency was identified as the first step in the review, with the objectives of determining why the residency was started and sustained and its relevance to the needs of the Army. Records possibly related to the residency were reviewed, starting with 1954 since certification of physicians as Occupation Medicine specialists began in 1955. Interviews were conducted with selected physicians who had strong affiliations with the Army residency and the practice of Army OEM. The Army OEM residency began in 1960 and closed in 1996 with the transfer of Army OEM residency training to the Uniformed Services University of the Health Sciences, Bethesda, MD. Over 36 years, 47 uniformed residency graduates were identified; 44 were from the Army. Forty graduated between 1982 and 1996. The OEM residency was part of a dynamic cycle. Uniformed OEM leaders identified the knowledge and skills required of military OEM physicians and where these people should be stationed in the global Army. Rotations at military sites to acquire the needed knowledge and skills were integrated into the residency. Residency graduates were assigned to positions where they were needed. Having uniformed residents and preceptors facilitated the development of trust with military leaders and access to areas where OEM physician skills and knowledge could have a positive impact. Early reports indicated the residency was important in recruiting and retaining OEM physicians, with emphasis placed on supporting the Army industrial base. The late 1970s into the 1990s was a more dynamic period. There was

  3. Intelligent Physical Exercise Training proves effective in enhancing muscle strength and reducing musculoskeletal pain in a workplace setting

    DEFF Research Database (Denmark)

    Dalager, Tina; Justesen, Just Bendix; Sjøgaard, Gisela

    The study aimed to implement Intelligent Physical Exercise Training (IPET) for office workers based on health checks and to assess the effect on musculoskeletal health (Sjøgaard G et al. BMC Public Health 2014, 14:652). Methods: Office workers at each of 6 companies were randomized 1:1 to a training group, TG (N=194), or a reference group, REF (N=195). TG received one-hour supervised high-intensity IPET every week within working hours for one year. The training program was based on baseline health check measures of muscle strength, musculoskeletal pain (self-reported on a 0-9 numeric box scale), cardiorespiratory fitness and health risk indicators, as well as functional capacity including... compared with REF (~ 20 %). Discussion: High intensity IPET during working hours significantly reduced musculoskeletal pain in neck and shoulders as well as increased muscle strength among office workers. Of note is the large proportion of employees in TG who had pain reductions of ≥1, which is considered...

  4. Com aplicar les proves paramètriques bivariades t de Student i ANOVA en SPSS. Cas pràctic

    Directory of Open Access Journals (Sweden)

    María-José Rubio-Hurtado

    2012-07-01

    Parametric tests are a type of statistical significance test that quantifies the association or independence between a quantitative variable and a categorical one. Parametric tests impose certain prerequisites for their application: a Normal distribution of the quantitative variable in the groups being compared, homogeneity of variances in the populations from which the groups are drawn, and a sample size n of no less than 30. When these requirements are not met, non-parametric statistical tests must be used instead. Parametric tests fall into two classes: the t test (for one sample, or for two related or independent samples) and ANOVA (for more than two independent samples).
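
Although the record demonstrates these tests in SPSS, the underlying statistics are straightforward to compute by hand. The sketch below, with invented scores for three groups, shows the pooled-variance Student's t statistic for two independent samples and the one-way ANOVA F statistic:

```python
from statistics import mean

def two_sample_t(x, y):
    """Pooled-variance Student's t statistic for two independent samples."""
    nx, ny = len(x), len(y)
    mx, my = mean(x), mean(y)
    ssx = sum((v - mx) ** 2 for v in x)
    ssy = sum((v - my) ** 2 for v in y)
    sp2 = (ssx + ssy) / (nx + ny - 2)            # pooled variance
    return (mx - my) / (sp2 * (1 / nx + 1 / ny)) ** 0.5

def one_way_anova_F(*groups):
    """F statistic: between-group over within-group mean squares."""
    all_vals = [v for g in groups for v in g]
    grand = mean(all_vals)
    ss_between = sum(len(g) * (mean(g) - grand) ** 2 for g in groups)
    ss_within = sum(sum((v - mean(g)) ** 2 for v in g) for g in groups)
    df_between = len(groups) - 1
    df_within = len(all_vals) - len(groups)
    return (ss_between / df_between) / (ss_within / df_within)

# Invented scores for three groups of five observations each.
g1 = [5, 6, 7, 6, 5]
g2 = [7, 8, 8, 9, 7]
g3 = [4, 5, 5, 6, 4]
t = two_sample_t(g1, g2)       # compares g1 against g2 only
F = one_way_anova_F(g1, g2, g3)
```

In practice one would pass these statistics to the t and F distributions for p-values (as SPSS does internally); the point here is only the normality and equal-variance assumptions baked into the pooled variance and the within-group sum of squares.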

  5. Induction of host defences by Rhizobium during ineffective nodulation of pea (Pisum sativum L.) carrying symbiotically defective mutations sym40 (PsEFD), sym33 (PsIPD3/PsCYCLOPS) and sym42.

    Science.gov (United States)

    Ivanova, Kira A; Tsyganova, Anna V; Brewin, Nicholas J; Tikhonovich, Igor A; Tsyganov, Viktor E

    2015-11-01

    Rhizobia are able to establish a beneficial interaction with legumes by forming a new organ, called the symbiotic root nodule, which is a unique ecological niche for rhizobial nitrogen fixation. Rhizobial infection has many similarities with pathogenic infection and induction of defence responses accompanies both interactions, but defence responses are induced to a lesser extent during rhizobial infection. However, strong defence responses may result from incompatible interactions between legumes and rhizobia due to a mutation in either macro- or microsymbiont. The aim of this research was to analyse different plant defence reactions in response to Rhizobium infection for several pea (Pisum sativum) mutants that result in ineffective symbiosis. Pea mutants were examined by histochemical and immunocytochemical analyses, light, fluorescence and transmission electron microscopy and quantitative real-time PCR gene expression analysis. It was observed that mutations in pea symbiotic genes sym33 (PsIPD3/PsCYCLOPS encoding a transcriptional factor) and sym40 (PsEFD encoding a putative negative regulator of the cytokinin response) led to suberin depositions in ineffective nodules, and in the sym42 there were callose depositions in infection thread (IT) and host cell walls. The increase in deposition of unesterified pectin in IT walls was observed for mutants in the sym33 and sym42; for mutant in the sym42, unesterified pectin was also found around degrading bacteroids. In mutants in the genes sym33 and sym40, an increase in the expression level of a gene encoding peroxidase was observed. In the genes sym40 and sym42, an increase in the expression levels of genes encoding a marker of hypersensitive reaction and PR10 protein was demonstrated. Thus, a range of plant defence responses like suberisation, callose and unesterified pectin deposition as well as activation of defence genes can be triggered by different pea single mutations that cause perception of an otherwise

  6. Environmental effects and large space systems

    Science.gov (United States)

    Garrett, H. B.

    1981-01-01

    When planning large-scale operations in space, environmental impact must be considered in addition to radiation, spacecraft charging, contamination, high power and size. Pollution of the atmosphere and space is caused by rocket effluents and by photoelectrons generated by sunlight falling on satellite surfaces; even light pollution may result (the SPS may reflect so much light as to be a nuisance to astronomers). Large (100 km²) structures will also absorb the high-energy particles that impinge on them. Altogether, these effects may drastically alter the Earth's magnetosphere. It is not clear whether these alterations will in any way affect the Earth's surface climate. Large structures will also generate large plasma wakes and waves, which may cause interference with communications to the vehicle. A high-energy microwave beam from the SPS will cause ionospheric turbulence, affecting UHF and VHF communications. Although none of these effects may ultimately prove critical, they must be considered in the design of large structures.

  7. Collaborative networks in the internet of services: 13th IFIP WG 5.5 working conference on virtual enterprises, PRO-VE 2012, Bournemouth, UK, October 2012: proceedings

    NARCIS (Netherlands)

    Camarinha-Matos, L.M.; Xu, L.; Afsarmanesh, H.

    2012-01-01

    This book constitutes the refereed proceedings of the 13th IFIP WG 5.5 Working Conference on Virtual Enterprises, PRO-VE 2012, held in Bournemouth, UK, in October 2012. The 61 revised papers presented were carefully selected from numerous submissions. They provide a comprehensive overview of

  8. The Effects of GeoGebra Software on Pre-Service Mathematics Teachers' Attitudes and Views toward Proof and Proving

    Science.gov (United States)

    Zengin, Yilmaz

    2017-01-01

    The purpose of this study is to determine the effect of GeoGebra software on pre-service mathematics teachers' attitudes towards proof and proving and to determine pre-service teachers' pre- and post-views regarding proof. The study lasted nine weeks and the participants of the study consisted of 24 pre-service mathematics teachers. The study used…

  9. The Expanded Large Scale Gap Test

    Science.gov (United States)

    1987-03-01

    NSWC TR 86-32, The Expanded Large Scale Gap Test, by T. P. Liddiard and D. Price, Research and Technology Department, March 1987. Approved for public release. … to reduce the spread in the LSGT 50% gap value. The worst charges, such as those with the highest or lowest densities, the largest re-pressed …

  10. Glucocorticoids are ineffective in alcoholic hepatitis

    DEFF Research Database (Denmark)

    Christensen, E; Gluud, C

    1995-01-01

    The aim of this study was to perform a meta-analysis of controlled clinical trials of glucocorticoid treatment in clinical alcoholic hepatitis, adjusting for prognostic variables and their possible interaction with therapy, because these trials have given appreciably different results. Weighted logistic regression analysis was applied using the summarised descriptive data (for example, % with encephalopathy, mean bilirubin value) of the treatment and control groups of 12 controlled trials that gave this information. Despite evidence of publication bias favouring glucocorticoid treatment, its overall effect on mortality was not statistically significant (p = 0.20): the relative risk (steroid/control) was 0.78 (95% confidence interval 0.51, 1.18). There was indication of interaction between glucocorticoid therapy and gender, but not encephalopathy. Thus, the effect of glucocorticoid treatment…
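    The pooled effect above is reported as a relative risk with a 95% confidence interval. As a minimal sketch of how such a figure is computed, the following derives a relative risk and a Katz log-based confidence interval from two-group mortality counts; the counts here are purely illustrative, not the trial data.

```python
import math

def relative_risk(events_t, n_t, events_c, n_c, z=1.96):
    """Relative risk (treated vs. control) with a Katz log-based 95% CI."""
    rr = (events_t / n_t) / (events_c / n_c)
    # Standard error of log(RR) for independent binomial groups
    se = math.sqrt(1/events_t - 1/n_t + 1/events_c - 1/n_c)
    lo = math.exp(math.log(rr) - z * se)
    hi = math.exp(math.log(rr) + z * se)
    return rr, lo, hi

# Illustrative counts only: 12/100 deaths on steroids vs. 15/100 on control
rr, lo, hi = relative_risk(12, 100, 15, 100)
# A CI that spans 1.0 indicates no statistically significant effect,
# mirroring the abstract's RR 0.78 (95% CI 0.51, 1.18).
```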

  11. Counterpoint: Special Education--Ineffective? Immoral?

    Science.gov (United States)

    Fuchs, Douglas; Fuchs, Lynn S.

    1995-01-01

    This counterpoint to a critique of the authors' paper, which argued against full inclusion of students with disabilities, offers evidence of the effectiveness of special education and notes court litigation that has recognized that separate is not always unequal. (JDD)

  12. The ineffectiveness of contracts for public services

    Directory of Open Access Journals (Sweden)

    Jorg Pudelka

    2017-03-01

    Full Text Available This article examines the legal nature and application of contracts for public services, which can be viewed, on the one hand, as an exercise of constitutionally guaranteed freedom and, on the other, as subordination to public authorities acting unilaterally through imperative management. The discussion is complemented by a comparative analysis of unilateral public contracts and their types.

  13. The Relative Ineffectiveness of Criminal Network Disruption

    Science.gov (United States)

    Duijn, Paul A. C.; Kashirin, Victor; Sloot, Peter M. A.

    2014-01-01

    Researchers, policymakers and law enforcement agencies across the globe struggle to find effective strategies to control criminal networks. The effectiveness of disruption strategies is known to depend on both network topology and network resilience. However, as these criminal networks operate in secrecy, data-driven knowledge concerning the effectiveness of different criminal network disruption strategies is very limited. By combining computational modeling and social network analysis with unique criminal network intelligence data from the Dutch Police, we discovered, in contrast to common belief, that criminal networks might even become ‘stronger’, after targeted attacks. On the other hand increased efficiency within criminal networks decreases its internal security, thus offering opportunities for law enforcement agencies to target these networks more deliberately. Our results emphasize the importance of criminal network interventions at an early stage, before the network gets a chance to (re-)organize to maximum resilience. In the end disruption strategies force criminal networks to become more exposed, which causes successful network disruption to become a long-term effort. PMID:24577374
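    The resilience effect described above can be illustrated with a toy computation in plain Python (no real police data, and not the paper's model): remove the highest-degree node from a small graph that has redundant ties, and observe that the remaining network stays almost fully connected.

```python
from collections import defaultdict, deque

def largest_component(edges, removed=frozenset()):
    """Size of the largest connected component, ignoring removed nodes."""
    adj, nodes = defaultdict(set), set()
    for u, v in edges:
        if u in removed or v in removed:
            continue
        adj[u].add(v)
        adj[v].add(u)
        nodes.update((u, v))
    best, seen = 0, set()
    for start in nodes:
        if start in seen:
            continue
        size, queue = 0, deque([start])
        seen.add(start)
        while queue:                      # breadth-first traversal
            x = queue.popleft()
            size += 1
            for y in adj[x]:
                if y not in seen:
                    seen.add(y)
                    queue.append(y)
        best = max(best, size)
    return best

# Hub node 0 plus a redundant ring among its five contacts
edges = [(0, i) for i in range(1, 6)] + [(1, 2), (2, 3), (3, 4), (4, 5), (5, 1)]
degree = defaultdict(int)
for u, v in edges:
    degree[u] += 1
    degree[v] += 1
hub = max(degree, key=degree.get)        # targeted attack: remove the hub
before = largest_component(edges)        # all 6 nodes connected
after = largest_component(edges, {hub})  # the ring keeps the other 5 connected
```

Because of the redundant ring, the targeted removal barely fragments the network, which is the kind of resilience the study reports.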

  14. Civil Society in Nigeria: Reasons for Ineffectiveness

    Science.gov (United States)

    2015-03-01

    … commodities such as oil and gas left Nigeria in financial ruin after the 1970s' oil collapse. With the withdrawal of Soviet funds, the World Bank's … standards across Nigeria. The new and more efficient "adjusted" economy that the World Bank and the International Monetary Fund (IMF) projected … comparative advantage for Nigeria and had the potential to industrialize and expand the agricultural industry. Instead, the sector was neglected

  15. Ineffectiveness of anticoagulation in experimental radiation

    International Nuclear Information System (INIS)

    Roettinger, E.M.; Sedlacek, R.; Suit, H.D.

    1975-01-01

    A spontaneous mammary carcinoma, a methylcholanthrene-induced fibrosarcoma, and a methylcholanthrene-induced squamous carcinoma were examined in syngeneic C3Hsub(f)/Sed mice to assess the modification of tumour response to local radiation therapy by the administration of warfarin. For the fibrosarcoma (demonstrably antigenic in this animal system), recipient mice were normal or previously sensitized to the tumour. The transplant take rate and the growth rate of the mammary carcinoma and the fibrosarcoma were determined following subcutaneous injections of single-cell suspensions. The radiation dose required for local control of 50% of the tumours (TCD50) was determined for single and ten equal radiation doses. The dosage of warfarin in the drinking water resulted in a 2- to 3-fold prolongation of the prothrombin time. Warfarin administration before and following inoculation of tumour cells did not alter transplantability. Warfarin administration, either during the course of fractionated irradiation or at and following single doses, did not affect tumour response to irradiation. (author)

  16. Risco para amamentação ineficaz: um diagnóstico de enfermagem Risco para el amamantamiento ineficaz: un diagnostico de enfermería Risk of ineffective breast-feeding: a nursing diagnosis

    Directory of Open Access Journals (Sweden)

    Cláudia Silveira Viera

    2004-12-01

    Full Text Available This study presents the nursing diagnosis 'risk of ineffective breast-feeding' in mothers of premature infants hospitalized in a neonatal intensive care unit. A case-study design guided the methodology, with a sample of 35 mothers. The diagnosis was found in 100% of the sample, with the following risk factors: prematurity; insufficient opportunity for breast-feeding because the newborn (NB) was hospitalized; knowledge deficit regarding maintenance of lactation; maternal fear; inconsistency of sucking at the breast due to separation; and artificial feeding of the NB. Identifying risks of non-breast-feeding during the NB's hospitalization makes it possible to direct nursing care toward preventing a diagnosis of ineffective breast-feeding.

  17. Interactive Theorem Proving and Verification

    Indian Academy of Sciences (India)

    human beings and computers. ... proofs themselves come from humans, the formalisations are meant to be ... real feel for the formalization process, it also addresses some of the central questions ... outside mathematics and computer science.

  18. Combining norms to prove termination

    DEFF Research Database (Denmark)

    Genaim, S.; Codish, M.; Gallagher, John Patrick

    2002-01-01

    Automatic termination analysers typically measure the size of terms by applying norms, which are mappings from terms to the natural numbers. This paper illustrates how to enable the use of size functions defined as tuples of these simpler norm functions. This approach enables us to simplify the probl… of the recursive data-types in the program, is often a suitable choice. We first demonstrate the power of combining norm functions and then the adequacy of combining norms based on regular types.
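    As a rough illustration of what such norms compute (the nested-tuple term encoding below is an assumption for the sketch, not the paper's notation), a term can be measured both by a term-size norm and by a list-length norm, and a tuple of the two serves as the combined measure:

```python
def term_size(t):
    """Term-size norm: total number of (sub)terms, atoms included."""
    if isinstance(t, tuple):                       # compound term f(args...)
        return 1 + sum(term_size(arg) for arg in t[1:])
    return 1                                       # atom or variable

def list_length(t):
    """List-length norm: length of the list spine, 0 for non-lists."""
    length = 0
    while isinstance(t, tuple) and t[0] == '.':    # '.'/2 cons cells
        length += 1
        t = t[2]
    return length

def combined_norm(t):
    # A tuple of simpler norms; an analyser compares such tuples
    # (e.g. lexicographically) to prove that recursive calls decrease.
    return (list_length(t), term_size(t))

# The list [a, b] encoded with '.'/2 cons cells and the atom 'nil'
lst = ('.', 'a', ('.', 'b', 'nil'))
```

Here `list_length(lst)` is 2 while `term_size(lst)` counts all five nodes, showing how the two norms capture different notions of "smaller".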

  19. Test to prove the resistance to incidents of components of electric and control systems in the safety containment of nuclear power plants

    International Nuclear Information System (INIS)

    1982-01-01

    The framework program for proving the suitability of safety-relevant components of electric and control systems in the safety containment during a loss-of-coolant accident is described. Variant test conditions are established in the component-specific test program. Special attention has been paid to the representation of the course of pressure and temperature for the performance test of the valve room of the Nuclear Power Plant Philippsburg 2. (DG) [de

  20. General Large Deviations and Functional Iterated Logarithm Law for Multivalued Stochastic Differential Equations

    OpenAIRE

    Ren, Jiagang; Wu, Jing; Zhang, Hua

    2015-01-01

    In this paper, we prove a large deviation principle of Freidlin-Wentzell type for multivalued stochastic differential equations. As an application, we derive a functional iterated logarithm law for the solutions of multivalued stochastic differential equations.
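    For context, the general shape of the result being proved is standard (this statement is textbook material, not taken from the paper itself): a family $\{X^\varepsilon\}$ satisfies a large deviation principle with rate function $I$ when, for every closed set $F$ and every open set $G$,

```latex
\limsup_{\varepsilon \to 0} \varepsilon \log \mathbb{P}(X^\varepsilon \in F) \le -\inf_{x \in F} I(x),
\qquad
\liminf_{\varepsilon \to 0} \varepsilon \log \mathbb{P}(X^\varepsilon \in G) \ge -\inf_{x \in G} I(x).
```

    In the classical Freidlin-Wentzell setting for $dX^\varepsilon_t = b(X^\varepsilon_t)\,dt + \sqrt{\varepsilon}\,\sigma(X^\varepsilon_t)\,dW_t$, the rate function on paths is $I(\varphi) = \tfrac{1}{2}\int_0^T \lvert \sigma(\varphi_t)^{-1}(\dot{\varphi}_t - b(\varphi_t))\rvert^2\,dt$ for absolutely continuous $\varphi$ (and $+\infty$ otherwise); the multivalued case replaces the single-valued drift with a maximal monotone (multivalued) operator.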

  1. Large number discrimination by mosquitofish.

    Directory of Open Access Journals (Sweden)

    Christian Agrillo

    Full Text Available BACKGROUND: Recent studies have demonstrated that fish display rudimentary numerical abilities similar to those observed in mammals and birds. The mechanisms underlying the discrimination of small quantities (<4) were recently investigated while, to date, no study has examined the discrimination of large numerosities in fish. METHODOLOGY/PRINCIPAL FINDINGS: Subjects were trained to discriminate between two sets of small geometric figures using social reinforcement. In the first experiment mosquitofish were required to discriminate 4 from 8 objects with or without experimental control of the continuous variables that co-vary with number (area, space, density, total luminance). Results showed that fish can use numerical information alone to compare quantities, but that they preferentially use cumulative surface area as a proxy for number when this information is available. A second experiment investigated the influence of the total number of elements on the discrimination of large quantities. Fish proved able to discriminate up to 100 vs. 200 objects, without showing any significant decrease in accuracy compared with the 4 vs. 8 discrimination. The third experiment investigated the influence of the ratio between the numerosities. Performance was found to decrease with decreasing numerical distance. Fish were able to discriminate numbers when ratios were 1:2 or 2:3 but not when the ratio was 3:4. The performance of a sample of undergraduate students, tested non-verbally using the same sets of stimuli, largely overlapped that of fish. CONCLUSIONS/SIGNIFICANCE: Fish are able to use pure numerical information when discriminating between quantities larger than 4 units. As observed in human and non-human primates, the numerical system of fish appears to have virtually no upper limit, while the numerical ratio has a clear effect on performance. These similarities further reinforce the view of a common origin of non-verbal numerical systems in all…
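    The ratio dependence reported above is easy to make explicit: for each pair, the relevant quantity is the ratio of the smaller to the larger numerosity, and the abstract places the fish's discrimination threshold between 2:3 and 3:4. A small sketch (the 0.70 cutoff is an illustrative value between those two ratios, not a figure from the paper):

```python
pairs = [(4, 8), (100, 200), (2, 3), (3, 4)]

def numerical_ratio(a, b):
    """Ratio of smaller to larger numerosity, the predictor of difficulty."""
    return min(a, b) / max(a, b)

# Threshold chosen between 2:3 (~0.67, discriminated) and 3:4 (0.75, failed)
THRESHOLD = 0.70
discriminable = {p: numerical_ratio(*p) <= THRESHOLD for p in pairs}
# 4 vs. 8 and 100 vs. 200 share the same 0.5 ratio, hence similar accuracy
```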

  2. The IEA Large Coil Task

    International Nuclear Information System (INIS)

    Beard, D.S.; Klose, W.; Shimamoto, S.; Vecsey, G.

    1988-01-01

    A multinational program of cooperative research, development, demonstrations, and exchanges of information on superconducting magnets for fusion was initiated in 1977 under an IEA agreement. The first major step in the development of TF magnets was called the Large Coil Task. Participants in LCT were the U.S. DOE, EURATOM, JAERI, and the Departement Federal de l'Interieur of Switzerland. The goals of LCT were to obtain experimental data, to demonstrate reliable operation of large superconducting coils, and to prove design principles and fabrication techniques being considered for the toroidal magnets of thermonuclear reactors. These goals were to be accomplished through coordinated but largely independent design, development, and construction of six test coils, followed by collaborative testing in a compact toroidal test array at fields of 8 T and higher. Under the terms of the IEA Agreement, the United States built and operated the test facility at Oak Ridge and provided three test coils. The other participants provided one coil each. Information on design and manufacturing and all test data were shared by all. The LCT team of each participant included a government laboratory and industrial partners or contractors. The last coil was completed in 1985, and the test assembly was completed in October of that year. Over the next 23 months, the six-coil array was cooled down and extensive testing was performed. Results were gratifying, as tests achieved design-point performance and well beyond. (Each coil reached a peak field of 9 T.) Experiments elucidated coil behavior, delineated limits of operability, and demonstrated coil safety. (orig./KP)

  3. Instantons and Large N

    Science.gov (United States)

    Mariño, Marcos

    2015-09-01

    Preface; Part I. Instantons: 1. Instantons in quantum mechanics; 2. Unstable vacua in quantum field theory; 3. Large order behavior and Borel summability; 4. Non-perturbative aspects of Yang-Mills theories; 5. Instantons and fermions; Part II. Large N: 6. Sigma models at large N; 7. The 1/N expansion in QCD; 8. Matrix models and matrix quantum mechanics at large N; 9. Large N QCD in two dimensions; 10. Instantons at large N; Appendix A. Harmonic analysis on S3; Appendix B. Heat kernel and zeta functions; Appendix C. Effective action for large N sigma models; References; Author index; Subject index.

  4. Large deviations for Gaussian processes in Hoelder norm

    International Nuclear Information System (INIS)

    Fatalov, V R

    2003-01-01

    Some results are proved on the exact asymptotic representation of large deviation probabilities for Gaussian processes in the Hoelder norm. The following classes of processes are considered: the Wiener process, the Brownian bridge, fractional Brownian motion, and stationary Gaussian processes with power-law covariance function. The investigation uses the method of double sums for Gaussian fields

  5. The Cauchy problem for the Pavlov equation with large data

    Science.gov (United States)

    Wu, Derchyi

    2017-08-01

    We prove local solvability of the Cauchy problem for the Pavlov equation with large initial data by the inverse scattering method. The Pavlov equation arises in studies of Einstein-Weyl geometries and dispersionless integrable models. Our theory yields local solvability of Cauchy problems for a quasi-linear wave equation with a characteristic initial hypersurface.

  6. The Efficacy and Safety of Chinese Herbal Medicine Jinlida as Add-On Medication in Type 2 Diabetes Patients Ineffectively Managed by Metformin Monotherapy: A Double-Blind, Randomized, Placebo-Controlled, Multicenter Trial.

    Science.gov (United States)

    Lian, Fengmei; Tian, Jiaxing; Chen, Xinyan; Li, Zhibin; Piao, Chunli; Guo, Junjie; Ma, Licheng; Zhao, Lijuan; Xia, Chengdong; Wang, Chong-Zhi; Yuan, Chun-Su; Tong, Xiaolin

    2015-01-01

    Metformin plays an important role in diabetes treatment. Studies have shown that the combined use of oral hypoglycemic medications is more effective than metformin monotherapy. In this double-blind, randomized, placebo-controlled, multicenter trial, we evaluated whether Jinlida, a Chinese herbal medicine, enhances the glycemic control of metformin in type 2 diabetes patients whose HbA1c was ineffectively controlled with metformin alone. A total of 186 diabetes patients were enrolled in this double-blind, randomized, placebo-controlled, multicenter trial. Subjects were randomly allocated to receive either Jinlida (9 g) or the placebo TID for 12 consecutive weeks. All subjects in both groups also continuously received their metformin without any dose change. During this 12-week period, the HbA1c, FPG, 2 h PG, body weight and BMI were assessed. HOMA insulin resistance (HOMA-IR) and β-cell function (HOMA-β) were also evaluated. At week 12, compared to the HbA1c level at week 0, the level of the Jinlida group was reduced by 0.92 ± 1.09% and that of the placebo group was reduced by 0.53 ± 0.94%. The 95% CI was 0.69-1.14 for the Jinlida group vs. 0.34-0.72 for the placebo group. There was a very significant HbA1c reduction between the two groups after 12 weeks (p < …). The FPG and 2 h PG levels of both the Jinlida group and the placebo group were reduced from week 0, and there were very significant FPG and 2 h PG level reductions between the two groups after 12 weeks (both p < …). The Jinlida group also showed improved β-cell function, with a HOMA-β increase (p < …). Jinlida significantly enhanced the hypoglycemic action of metformin compared with the drug used alone. This Chinese herbal medicine may have clinical value as an add-on medication to metformin monotherapy. Chinese Clinical Trial Register ChiCTR-TRC-13003159.
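    From the reported group means and standard deviations, the between-group difference can be sketched with a simple two-sample z calculation. This assumes, purely for illustration, equal arms of 93 patients each (186 split evenly); the trial's own analysis may well differ.

```python
import math

# Reported HbA1c reductions (mean +/- SD) after 12 weeks
mean_jinlida, sd_jinlida = 0.92, 1.09
mean_placebo, sd_placebo = 0.53, 0.94
n_per_arm = 93  # assumption: 186 patients split evenly between arms

diff = mean_jinlida - mean_placebo                        # 0.39 points of HbA1c
se = math.sqrt(sd_jinlida**2 / n_per_arm + sd_placebo**2 / n_per_arm)
z = diff / se
# z above 1.96 is consistent with the significant between-group
# difference the abstract reports
```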

  7. Compound heterozygous mutations (p.Leu13Pro and p.Tyr294*) associated with factor VII deficiency cause impaired secretion through ineffective translocation and extensive intracellular degradation of factor VII.

    Science.gov (United States)

    Suzuki, Keijiro; Sugawara, Takeshi; Ishida, Yoji; Suwabe, Akira

    2013-02-01

    Congenital coagulation factor VII (FVII) deficiency is a rare coagulation disease. We investigated the molecular mechanisms of this FVII deficiency in a patient with compound heterozygous mutations. A 22-year-old Japanese female was diagnosed with asymptomatic FVII deficiency. The FVII activity and antigen were greatly reduced (activity, 13.0%; antigen, 10.8%). We analyzed the F7 gene of this patient and characterized mutant FVII proteins using in vitro expression studies. Sequence analysis revealed that the patient was compound heterozygous with a point mutation (p.Leu13Pro) in the central hydrophobic core of the signal peptides and a novel non-sense mutation (p.Tyr294*) in the catalytic domain. Expression studies revealed that mutant FVII with p.Leu13Pro (FVII13P) showed less accumulation in the cells (17.5%) and less secretion into the medium (64.8%) than wild type showed. Truncated FVII resulting from p.Tyr294* (FVII294X) was also decreased in the cells (32.0%), but was not secreted into the medium. Pulse-chase experiments revealed that both mutants were extensively degraded intracellularly compared to wild type. The majority of FVII13P cannot translocate into endoplasmic reticulum (ER). However, a small amount of FVII13P was processed normally with post-translational modifications and was secreted into the medium. The fact that FVII294X was observed only in ER suggests that it is retained in ER. Proteasome apparently plays a central role in these degradations. These findings demonstrate that both mutant FVIIs impaired secretion through ineffective translocation to and retention in ER with extensive intracellular degradation, resulting in an insufficient phenotype. Copyright © 2012 Elsevier Ltd. All rights reserved.

  8. [Utility and validity of indicators from the Nursing Outcomes Classification as a support tool for diagnosing Ineffective Self Health Management in patients with chronic conditions in primary health care].

    Science.gov (United States)

    Morilla-Herrera, J C; Morales-Asencio, J M; Fernández-Gallego, M C; Cobos, E Berrobianco; Romero, A Delgado

    2011-01-01

    Self-care and management of therapeutic regime (drugs adherence, preventive behaviours and development of healthy life-styles) are key components for managing chronic diseases. Nursing has standardized languages which describe many of these situations, such as the diagnosis "Ineffective Self Health Management" (ISHM) or many of the Nursing Outcomes Classification (NOC) indicators. The aims of this study were to determine the interobserver reliability of a NOC-based instrument for assessment and aid in diagnosis of the ISHM in patients with chronic conditions in Primary Health Care, to determine its diagnostic validity and to describe the prevalence of patients with this problem. Cross-sectional validation study developed in the provinces of Málaga, Cádiz and Almería from 2006 to 2009. Each patient was assessed by 3 independent observers: the first two observers evaluated scoring of the NOC indicators and the third one acted as the "gold-standard". Two hundred and twenty-eight patients were included, 37.7% of them with more than one chronic condition. NOC indicators showed a high interobserver reliability (ICC > 0.70) and consistency (Cronbach's alpha: 0.81). With a cut-point of 10.5, sensitivity was 61% and specificity 85%, and the area under the curve was 0.81 (95% CI: 0.77 to 0.85). The prevalence of patients with ISHM was 36% (95% CI: 34 to 40). The use of NOC indicators allows evaluation of management of the therapeutic regime in people with chronic conditions with a satisfactory validity and it provides new approaches for dealing with this problem.
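    The reported sensitivity and specificity can be reconstructed approximately from the study's figures (228 patients at roughly 36% prevalence). The confusion-matrix counts below are an illustrative back-calculation consistent with those figures, not published data:

```python
# Illustrative counts consistent with 228 patients at ~36% prevalence
tp, fn = 50, 32   # of ~82 gold-standard ISHM-positive patients
tn, fp = 124, 22  # of ~146 gold-standard negatives

sensitivity = tp / (tp + fn)  # fraction of true cases above the cut-point
specificity = tn / (tn + fp)  # fraction of non-cases below the cut-point
# round(sensitivity, 2) -> 0.61 and round(specificity, 2) -> 0.85,
# matching the 61% / 85% reported at the 10.5 cut-point
```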

  9. Moderate-intensity statin therapy seems ineffective in primary cardiovascular prevention in patients with type 2 diabetes complicated by nephropathy. A multicenter prospective 8 years follow up study.

    Science.gov (United States)

    Sasso, Ferdinando Carlo; Lascar, Nadia; Ascione, Antonella; Carbonara, Ornella; De Nicola, Luca; Minutolo, Roberto; Salvatore, Teresa; Rizzo, Maria Rosaria; Cirillo, Plinio; Paolisso, Giuseppe; Marfella, Raffaele

    2016-10-13

    Although numerous studies and meta-analyses have shown the beneficial effect of statin therapy in CVD secondary prevention, there is still controversy about the use of statins for primary CVD prevention in patients with DM. The purpose of this study was to evaluate the occurrence of total major adverse cardiovascular events (MACE) in a cohort of patients with type 2 diabetes complicated by nephropathy treated with statins, in order to verify the real-life effect of statins on CVD primary prevention. We conducted an observational prospective multicenter study on 564 patients with type 2 diabetic nephropathy free of cardiovascular disease attending 21 national outpatient diabetes clinics and followed them up for 8 years. 169 of them were treated with statins (group A) while 395 were not on statins (group B). Notably, none of the patients was treated with high-intensity statin therapy according to the last ADA position statement. Total MACE occurred in 32 patients from group A and in 68 patients from group B. Fatal MACE occurred in 13 patients from group A and in 30 from group B; nonfatal MACE occurred in 19 patients from group A and in 38 patients from group B. The analysis of the Kaplan-Meier survival curves showed no statistically significant difference in the incidence of total (p = 0.758), fatal (p = 0.474) and nonfatal (p = 0.812) MACE between the two groups. Only HbA1c showed a significant association with the incidence of MACE between the two groups (HR 1.201, CI 1.041-1.387, p = 0.012). These findings suggest that, in a real clinical setting, moderate-intensity statin treatment is ineffective in cardiovascular primary prevention for patients with diabetic nephropathy. Trial registration: ClinicalTrials.gov Identifier NCT00535925. Date of registration: September 24, 2007, retrospectively registered.

  10. In situ analysis of soil at an open burning/open detonation disposal facility: J-Field, Aberdeen Proving Ground, Maryland

    International Nuclear Information System (INIS)

    Martino, L.; Cho, E.; Wrobel, J.

    1994-01-01

    Investigators have used a field-portable X-Ray Fluorescence (XRF) Analyzer to screen soils for a suite of metals indicative of the open burning and open detonation (OB/OD) activities that occurred at the J-Field site at Aberdeen Proving Ground, Maryland. The field XRF results were incorporated into a multiphase investigation of contaminants at the Toxic Burning Pits Area of Concern at J-Field. The authors determined that the field-portable XRF unit used for the study and the general concept of field XRF screening are invaluable tools for investigating an OB/OD site where intrusive sampling techniques could present unacceptable hazards to site workers

  11. Proceedings of the Scientific Conference on Obscuration and Aerosol Research Held in Aberdeen Proving Ground, Maryland on 17-21 June 1985.

    Science.gov (United States)

    1986-07-01


  12. Supernova Remnants with Fermi Large Area Telescope

    Directory of Open Access Journals (Sweden)

    Caragiulo M.

    2017-01-01

    Full Text Available The Large Area Telescope (LAT), on board the Fermi satellite, has proved to be, after 8 years of data taking, an excellent instrument to detect and observe Supernova Remnants (SNRs) in a range of energies running from a few hundred MeV up to a few hundred GeV. It provides essential information on physical processes that occur at the source, involving both accelerated leptons and hadrons, in order to understand the mechanisms responsible for primary Cosmic Ray (CR) acceleration. We show the latest results in the observation of Galactic SNRs by Fermi-LAT.

  13. Concepts for Large Scale Hydrogen Production

    OpenAIRE

    Jakobsen, Daniel; Åtland, Vegar

    2016-01-01

    The objective of this thesis is to perform a techno-economic analysis of large-scale, carbon-lean hydrogen production in Norway, in order to evaluate various production methods and estimate a break-even price level. Norway possesses vast energy resources, and the export of oil and gas is vital to the country's economy. The results of this thesis indicate that hydrogen represents a viable, carbon-lean opportunity to utilize these resources, which can prove key in the future of Norwegian energy e…

  14. Large Neighborhood Search

    DEFF Research Database (Denmark)

    Pisinger, David; Røpke, Stefan

    2010-01-01

    Heuristics based on large neighborhood search have recently shown outstanding results in solving various transportation and scheduling problems. Large neighborhood search methods explore a complex neighborhood by use of heuristics. Using large neighborhoods makes it possible to find better candidate solutions in each iteration and hence traverse a more promising search path. Starting from the large neighborhood search method, we give an overview of very large scale neighborhood search methods and discuss recent variants and extensions like variable depth search and adaptive large neighborhood search.
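    The destroy-and-repair idea behind large neighborhood search can be sketched on a toy routing problem in pure Python. The removal size, iteration count and greedy cheapest insertion below are illustrative choices for the sketch, not any specific published variant:

```python
import random

def route_cost(route, dist):
    """Total length of a route visiting the given city indices in order."""
    return sum(dist[route[i]][route[i + 1]] for i in range(len(route) - 1))

def lns(points, iters=200, k=3, seed=1):
    """Minimal large neighborhood search for a 1-D routing toy problem."""
    rng = random.Random(seed)
    n = len(points)
    dist = [[abs(points[a] - points[b]) for b in range(n)] for a in range(n)]
    best = list(range(n))                 # start from the identity route
    best_cost = route_cost(best, dist)
    cur = best[:]
    for _ in range(iters):
        # Destroy: remove k random cities from the current route
        removed = rng.sample(cur, k)
        partial = [c for c in cur if c not in removed]
        # Repair: greedily reinsert each city at its cheapest position
        for c in removed:
            best_pos, best_delta = 0, float('inf')
            for i in range(len(partial) + 1):
                trial = partial[:i] + [c] + partial[i:]
                cost = route_cost(trial, dist)
                if cost < best_delta:
                    best_delta, best_pos = cost, i
            partial.insert(best_pos, c)
        # Accept only non-worsening candidates (a simple acceptance rule)
        cand_cost = route_cost(partial, dist)
        if cand_cost <= best_cost:
            best, best_cost = partial[:], cand_cost
            cur = partial
    return best, best_cost
```

Destroying a large part of the solution and repairing it heuristically is what lets the method jump to distant, better candidate solutions in each iteration.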

  15. Il Servizio Nazionale di Valutazione e le prove Invalsi. Stato dell’arte e proposte per una valutazione come agente di cambiamento

    Directory of Open Access Journals (Sweden)

    Roberto Trinchero

    2014-12-01

    Full Text Available What is the function of the Servizio Nazionale di Valutazione in the formative evaluation of schools? What purpose do the Invalsi tests really serve? Are the criticisms often levelled at these tests actually well founded? How can the evaluation of a school's educational offering really act as an agent of improvement? This article provides some answers to these questions, starting from the principles that inspired school autonomy and offering suggestions for a non-partisan use of evaluation. Evaluation can truly be an agent of change provided that: (i) the data are given their correct meaning; and (ii) the school is able to understand the suggestions that evaluation can offer and opens itself to positive change. Evaluation applied to a 'school that defends itself' can only produce pointless window dressing; evaluation applied to a 'school that learns' can genuinely help it realize its full potential.

  16. Volume measurement system for plutonium nitrate solution and its uncertainty to be used for nuclear materials accountancy proved by demonstration over fifteen years

    International Nuclear Information System (INIS)

    Hosoma, Takashi

    2010-10-01

    An accurate volume measurement system for plutonium nitrate solution stored in an accountability tank with dip-tubes has been developed and demonstrated over fifteen years at the Plutonium Conversion Development Facility of the Japan Atomic Energy Agency. As a result of calibrations during the demonstration, it was proved that the measurement uncertainty practically achieved and maintained was less than 0.1% (systematic) and 0.15% (random) at one sigma, which is half of the current internationally accepted target uncertainty. It was also proved that the discrepancy between measured density and analytically determined density was less than 0.002 g·cm⁻³ at one sigma. These uncertainties include effects of long-term use of the accountability tank, whose cumulative plutonium throughput is six tons. The system consists of high-precision differential pressure transducers and a dead-weight tester, sequentially controlled valves for periodic zero adjustment, dampers to reduce pressure oscillation, and a procedure to correct measurement biases. The sequence was also useful for carrying out maintenance safely without contamination. The longevity of the transducers was longer than 15 years. Principles and essentials of determining solution volume and the weight of plutonium, measurement biases and corrections, the accurate pressure measurement system, maintenance and diagnostics, operational experiences, and the evaluation of measurement uncertainty are described. (author)
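    If the two quoted uncertainty components are combined in quadrature (the usual GUM-style treatment for independent components; the abstract itself does not state how they are combined), the total is still comfortably below the quoted target level:

```python
import math

systematic = 0.10  # %, one sigma, as quoted in the abstract
random_u = 0.15    # %, one sigma, as quoted in the abstract

# Root-sum-of-squares combination of independent uncertainty components
total = math.sqrt(systematic**2 + random_u**2)
# ~0.18% combined at one sigma
```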

  17. Large scale electrolysers

    International Nuclear Information System (INIS)

    B Bello; M Junker

    2006-01-01

    Hydrogen production by water electrolysis represents nearly 4% of world hydrogen production. Future development of hydrogen vehicles will require large quantities of hydrogen, so large scale hydrogen production plants will be needed. In this context, the development of low cost large scale electrolysers that can run on 'clean power' seems necessary. ALPHEA HYDROGEN, a European network and center of expertise on hydrogen and fuel cells, performed a study for its members in 2005 to evaluate the potential of large scale electrolysers for future hydrogen production. The different electrolysis technologies were compared, and a state-of-the-art review of currently available electrolysis modules was compiled. A review of the large scale electrolysis plants installed around the world was also carried out, and the main projects related to large scale electrolysis were listed. The economics of large scale electrolysers were discussed, and the influence of energy prices on the cost of hydrogen produced by large scale electrolysis was evaluated. (authors)
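
The dependence of hydrogen cost on energy prices that the study evaluated can be illustrated with a back-of-the-envelope calculation. The specific-energy figure below is a typical order-of-magnitude assumption for alkaline electrolysers, not a number from the study:

```python
# Hedged sketch: electricity is the dominant cost driver for electrolytic
# hydrogen, so the production cost scales nearly linearly with power price.
# The 53 kWh/kg specific energy is an illustrative assumption.
SPECIFIC_ENERGY_KWH_PER_KG = 53.0

def electricity_cost_per_kg_h2(power_price_eur_per_kwh,
                               specific_energy=SPECIFIC_ENERGY_KWH_PER_KG):
    """Electricity share of the hydrogen production cost, EUR per kg H2."""
    return power_price_eur_per_kwh * specific_energy

for price in (0.03, 0.06, 0.10):  # EUR/kWh
    print(f"{price:.2f} EUR/kWh -> "
          f"{electricity_cost_per_kg_h2(price):.2f} EUR/kg H2")
```

Even this crude model shows why the study ties the viability of large scale electrolysis to access to cheap 'clean power'.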

  18. Large Pelagics Intercept Survey

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — The Large Pelagics Intercept Survey (LPIS) is a dockside survey of private and charterboat captains who have just completed fishing trips directed at large pelagic...

  19. Large electrostatic accelerators

    International Nuclear Information System (INIS)

    Jones, C.M.

    1984-01-01

    The paper is divided into four parts: a discussion of the motivation for the construction of large electrostatic accelerators, a description and discussion of several large electrostatic accelerators which have been recently completed or are under construction, a description of several recent innovations which may be expected to improve the performance of large electrostatic accelerators in the future, and a description of an innovative new large electrostatic accelerator whose construction is scheduled to begin next year.

  20. The use of peracetic acid in drinking water systems: flow tests; L'acido peracetico in potabilizzazione: prove in flusso

    Energy Technology Data Exchange (ETDEWEB)

    Ragazzo, P. [Consorzio per l' Acquedotto del Basso Piave, San Dona' di Piave, VE (Italy); Navazio, G. [Padua Univ., Padua (Italy). Dipt. dei Processi Chimici dell' Ingegneria; Cavadone, A. [Solvay Chimica Italia S.p.A., Milan (Italy)

    2000-09-01

    In a previous study, the disinfection efficiency of peracetic acid (PAA) was examined in comparison with other commonly used disinfectants, in batch tests with dosages ranging from 0.5 to 5 ppm. The study was carried out on samples of water collected from several significant points of the treatment process at the main water treatment plant in Jesolo (Venice, Italy). On the basis of the essentially positive results of these tests, a 400 litre/hour pilot plant was built as a lower scale reproduction of the drinking-water treatment system mentioned earlier, in order to study the characteristics of PAA in tests that more realistically simulate the flow of water along the process. These tests essentially confirmed the kinetics of the spontaneous hydrolysis to CH{sub 3}COOH+H{sub 2}O{sub 2} and of the dismutation to CH{sub 3}COOH+O{sub 2}, with half-life values ranging from 3 to 12 hours, depending on the characteristics of the water (especially the pH) and the PAA concentration. [Italian] In a previous work, a preliminary study was carried out on the disinfection efficiency of peracetic acid, also in comparison with the other more common disinfectants, in batch tests with dosages between 0.5 and 5 ppm, on water samples taken from the various significant points of the treatment line of the Jesolo plant (Torre Caligo), operated by the Consorzio Acquedottistico del Basso Piave of S. Dona' di Piave (Venice). On the basis of the essentially positive results, a 400 l/h pilot plant was built, reproducing at scale the above-mentioned drinking-water treatment line, in order to study the characteristics of PAA in more demanding flow tests. These tests essentially reconfirmed the kinetics of the spontaneous reactions of hydrolysis to CH{sub 3}COOH+H{sub 2}O{sub 2} and of
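
The reported half-lives translate directly into residual disinfectant concentrations. A minimal sketch, assuming the overall spontaneous loss (hydrolysis plus dismutation) behaves as a single first-order process, which the record's use of half-life values suggests but does not state explicitly:

```python
# Hedged sketch: first-order decay of PAA, with half-lives of 3-12 hours
# depending on pH and initial concentration (per the record). Dose and
# contact time below are illustrative assumptions.
def residual_paa(c0_ppm, t_hours, half_life_hours):
    """Residual PAA concentration after t hours of first-order decay."""
    return c0_ppm * 0.5 ** (t_hours / half_life_hours)

# Example: a 2 ppm dose with a 6 h half-life leaves 1 ppm after 6 h
print(residual_paa(2.0, 6.0, 6.0))   # -> 1.0
print(residual_paa(2.0, 12.0, 3.0))  # worst case: fast decay, long contact
```

Under the fastest reported decay (3 h half-life), most of the dose is gone within half a day, which is why flow tests along the actual treatment line matter.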

  1. The Efficacy and Safety of Chinese Herbal Medicine Jinlida as Add-On Medication in Type 2 Diabetes Patients Ineffectively Managed by Metformin Monotherapy: A Double-Blind, Randomized, Placebo-Controlled, Multicenter Trial.

    Directory of Open Access Journals (Sweden)

    Fengmei Lian

    Full Text Available Metformin plays an important role in diabetes treatment. Studies have shown that the combined use of oral hypoglycemic medications is more effective than metformin monotherapy. In this double-blind, randomized, placebo-controlled, multicenter trial, we evaluated whether Jinlida, a Chinese herbal medicine, enhances the glycemic control of metformin in type 2 diabetes patients whose HbA1c was inadequately controlled with metformin alone. A total of 186 diabetes patients were enrolled. Subjects were randomly allocated to receive either Jinlida (9 g) or placebo TID for 12 consecutive weeks. All subjects in both groups continued their metformin without any dose change. During this 12-week period, HbA1c, FPG, 2 h PG, body weight, and BMI were assessed. HOMA insulin resistance (HOMA-IR) and β-cell function (HOMA-β) were also evaluated. At week 12, the HbA1c level of the Jinlida group was reduced from baseline by 0.92 ± 1.09% (95% CI 0.69-1.14) and that of the placebo group by 0.53 ± 0.94% (95% CI 0.34-0.72); the between-group difference in HbA1c reduction after 12 weeks was highly significant (p < 0.01). FPG and 2 h PG levels were likewise reduced from baseline in both groups, with highly significant between-group differences after 12 weeks (both p < 0.01). The Jinlida group also showed improved β-cell function, with an increase in HOMA-β (p < 0.05). No statistically significant changes were observed in body weight or BMI. No serious adverse events were reported. Jinlida significantly enhanced the hypoglycemic action of metformin; this Chinese herbal medicine may have clinical value as an add-on medication to metformin monotherapy. Chinese Clinical Trial Register: ChiCTR-TRC-13003159.
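
The reported intervals are consistent with a standard normal-approximation confidence interval. A hedged check, assuming roughly equal arm sizes (about 93 patients per arm, a split the record itself does not state):

```python
# Hedged sketch (not the trial's stated statistical method): a
# normal-approximation 95% CI for a sample mean, applied to the Jinlida
# arm's HbA1c reduction of 0.92 +/- 1.09 with an assumed n of 93.
import math

def ci95(mean, sd, n):
    """Normal-approximation 95% confidence interval for a sample mean."""
    half_width = 1.96 * sd / math.sqrt(n)
    return mean - half_width, mean + half_width

lo, hi = ci95(0.92, 1.09, 93)
print(f"({lo:.2f}, {hi:.2f})")  # close to the reported 0.69-1.14
```

The small discrepancy at the lower bound (0.70 vs. the reported 0.69) is within what a t-distribution or a slightly different per-arm count would produce.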

  2. NOAA's Joint Polar Satellite System's (JPSS) Proving Ground and Risk Reduction (PGRR) Program - Bringing JPSS Science into Support of Key NOAA Missions!

    Science.gov (United States)

    Sjoberg, W.; McWilliams, G.

    2017-12-01

    This presentation focuses on the continuity of the NOAA Joint Polar Satellite System (JPSS) Program's Proving Ground and Risk Reduction (PGRR) effort and on key activities of the PGRR Initiatives. The PGRR Program was established in 2012, following the launch of the Suomi National Polar-orbiting Partnership (SNPP) satellite. The JPSS Program Office has used two rounds of PGRR project proposals to establish an effective approach to managing its science and algorithm teams so that they focus on key NOAA missions. The presenter will provide details of the Initiatives and of the processes that have proven so successful. Details of the new 2017 PGRR Call-for-Proposals and the status of project selections will also be discussed.

  3. Focused feasibility study for surface soil at the main pits and pushout area, J-field toxic burning pits area, Aberdeen Proving Ground, Maryland

    Energy Technology Data Exchange (ETDEWEB)

    Patton, T.; Benioff, P.; Biang, C.; Butler, J. [and others

    1996-06-01

    The Environmental Management Division of Aberdeen Proving Ground (APG), Maryland, is conducting a remedial investigation and feasibility study of the J-Field area at APG pursuant to the Comprehensive Environmental Response, Compensation, and Liability Act, as amended (CERCLA). J-Field is located within the Edgewood Area of APG in Harford County, Maryland. Since World War II, activities in the Edgewood Area have included the development, manufacture, testing, and destruction of chemical agents and munitions. These materials were destroyed at J-Field by open burning/open detonation. Portions of J-Field continue to be used for the detonation and disposal of unexploded ordnance (UXO) by open burning/open detonation under authority of the Resource Conservation and Recovery Act.

  4. Rationale and methods of the Prospective Study of Biomarkers, Symptom Improvement, and Ventricular Remodeling During Sacubitril/Valsartan Therapy for Heart Failure (PROVE-HF).

    Science.gov (United States)

    Januzzi, James L; Butler, Javed; Fombu, Emmanuel; Maisel, Alan; McCague, Kevin; Piña, Ileana L; Prescott, Margaret F; Riebman, Jerome B; Solomon, Scott

    2018-05-01

    Sacubitril/valsartan is an angiotensin receptor-neprilysin inhibitor indicated for the treatment of patients with chronic heart failure (HF) with reduced ejection fraction; however, its mechanism of benefit remains unclear. Biomarkers that are linked to ventricular remodeling, myocardial injury, and fibrosis may provide mechanistic insight and important clinical guidance regarding sacubitril/valsartan use. This 52-week, multicenter, open-label, single-arm study is designed to (1) correlate biomarker changes with cardiac remodeling parameters, cardiovascular outcomes, and patient-reported outcome data and (2) determine short- and long-term changes in concentrations of biomarkers related to potential mechanisms of action and effects of sacubitril/valsartan therapy. Approximately 830 patients with HF with reduced ejection fraction will be initiated and titrated on sacubitril/valsartan according to United States prescribing information. Primary efficacy end points include the changes in N-terminal pro-B-type natriuretic peptide concentrations and cardiac remodeling from baseline to 1 year. Secondary end points include changes in concentrations of N-terminal pro-B-type natriuretic peptide and remodeling to 6 months, and changes in patient-reported outcomes using the Kansas City Cardiomyopathy Questionnaire-23 from baseline to 1 year. In addition, several other relevant biomarkers will be measured. Biomarker changes relative to the number of cardiovascular events in 12 months will also be assessed as exploratory end points. Results from the Prospective Study of Biomarkers, Symptom Improvement, and Ventricular Remodeling During Sacubitril/Valsartan Therapy for Heart Failure (PROVE-HF) will help establish a mechanistic understanding of angiotensin receptor-neprilysin inhibitor therapeutic benefits and provide clinicians with clarity on how to interpret information on biomarkers during treatment (PROVE-HF ClinicalTrials.gov identifier: NCT02887183).

  5. Cyber security risk management: public policy implications of correlated risk, imperfect ability to prove loss, and observability of self-protection.

    Science.gov (United States)

    Oğüt, Hulisi; Raghunathan, Srinivasan; Menon, Nirup

    2011-03-01

    The correlated nature of security breach risks, the imperfect ability to prove loss from a breach to an insurer, and the inability of insurers and external agents to observe firms' self-protection efforts have posed significant challenges to cyber security risk management. Our analysis finds that a firm invests less than the social optimal levels in self-protection and in insurance when risks are correlated and the ability to prove loss is imperfect. We find that the appropriate social intervention policy to induce a firm to invest at socially optimal levels depends on whether insurers can verify a firm's self-protection levels. If self-protection of a firm is observable to an insurer so that it can design a contract that is contingent on the self-protection level, then self-protection and insurance behave as complements. In this case, a social planner can induce a firm to choose the socially optimal self-protection and insurance levels by offering a subsidy on self-protection. We also find that providing a subsidy on insurance does not provide a similar inducement to a firm. If self-protection of a firm is not observable to an insurer, then self-protection and insurance behave as substitutes. In this case, a social planner should tax the insurance premium to achieve socially optimal results. The results of our analysis hold regardless of whether the insurance market is perfectly competitive or not, implying that solely reforming the currently imperfect insurance market is insufficient to achieve the efficient outcome in cyber security risk management. © 2010 Society for Risk Analysis.

  6. Large mass storage facility

    International Nuclear Information System (INIS)

    Peskin, A.M.

    1978-01-01

    The report of a committee to study the questions surrounding possible acquisition of a large mass-storage device is presented. The current computing environment at BNL and justification for an online large mass storage device are briefly discussed. Possible devices to meet the requirements of large mass storage are surveyed, including future devices. The future computing needs of BNL are prognosticated. 2 figures, 4 tables

  7. Characterization of Preferential Ground-Water Seepage From a Chlorinated Hydrocarbon-Contaminated Aquifer to West Branch Canal Creek, Aberdeen Proving Ground, Maryland, 2002-04

    Science.gov (United States)

    Majcher, Emily H.; Phelan, Daniel J.; Lorah, Michelle M.; McGinty, Angela L.

    2007-01-01

    Wetlands act as natural transition zones between ground water and surface water, characterized by the complex interdependency of hydrology, chemical and physical properties, and biotic effects. Although field and laboratory demonstrations have shown efficient natural attenuation processes in the non-seep wetland areas and stream bottom sediments of West Branch Canal Creek, chlorinated volatile organic compounds are present in a freshwater tidal creek at Aberdeen Proving Ground, Maryland. Volatile organic compound concentrations in surface water indicate that in some areas of the wetland, preferential flow paths or seeps allow transport of organic compounds from the contaminated sand aquifer to the overlying surface water without undergoing natural attenuation. From 2002 through 2004, the U.S. Geological Survey, in cooperation with the Environmental Conservation and Restoration Division of the U.S. Army Garrison, Aberdeen Proving Ground, characterized preferential ground-water seepage as part of an ongoing investigation of contaminant distribution and natural attenuation processes in wetlands at this site. Seep areas were discrete and spatially consistent during thermal infrared surveys in 2002, 2003, and 2004 throughout West Branch Canal Creek wetlands. In these seep areas, temperature measurements in shallow pore water and sediment more closely resembled those in ground water than those in nearby surface water. Generally, pore water in seep areas contaminated with chlorinated volatile organic compounds had lower methane and greater volatile organic compound concentrations than pore water in non-seep wetland sediments. The volatile organic compounds detected in shallow pore water in seeps were spatially similar to the dominant volatile organic compounds in the underlying Canal Creek aquifer, with both parent and anaerobic daughter compounds detected. Seep locations characterized as focused seeps contained the highest concentrations of chlorinated parent compounds.

  8. Large N Scalars

    DEFF Research Database (Denmark)

    Sannino, Francesco

    2016-01-01

    We construct effective Lagrangians, and corresponding counting schemes, valid to describe the dynamics of the lowest lying large N stable massive composite state emerging in strongly coupled theories. The large N counting rules can now be employed when computing quantum corrections via an effective...

  9. Large bowel resection

    Science.gov (United States)

    ... blockage in the intestine due to scar tissue, colon cancer, and diverticular disease (disease of the large bowel). Other reasons for bowel resection are: familial polyposis (polyps are growths on the lining of the colon or rectum) and injuries that damage the large bowel ...

  10. Adaptive Large Neighbourhood Search

    DEFF Research Database (Denmark)

    Røpke, Stefan

    Large neighborhood search is a metaheuristic that has gained popularity in recent years. The heuristic repeatedly moves from solution to solution by first partially destroying the current solution and then repairing it; the best solution observed during this search is returned as the final solution. This tutorial introduces the large neighborhood search metaheuristic and the variant, adaptive large neighborhood search, which dynamically tunes the parameters of the heuristic while it is running. Both heuristics belong to a broader class of heuristics that search a solution space using very large neighborhoods. The tutorial also presents applications of adaptive large neighborhood search, mostly related to vehicle routing problems, for which the heuristic has been extremely successful. We discuss how the heuristic can be parallelized to take advantage of modern desktop computers.
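
The destroy-and-repair loop with adaptive operator weights can be sketched compactly. The operator set, the reward and decay constants, and the toy TSP objective below are illustrative assumptions, not the tutorial's exact formulation:

```python
# Hedged sketch of adaptive large neighborhood search on a toy TSP:
# destroy operators remove cities, a greedy repair reinserts them, and
# operator weights are adapted based on success (all constants illustrative).
import math
import random

def tour_length(tour, pts):
    return sum(math.dist(pts[tour[i]], pts[tour[(i + 1) % len(tour)]])
               for i in range(len(tour)))

def destroy_random(tour, rng, k=3):
    removed = rng.sample(tour, k)                  # k random cities
    return [c for c in tour if c not in removed], removed

def destroy_segment(tour, rng, k=3):
    i = rng.randrange(len(tour) - k)               # contiguous segment
    return tour[:i] + tour[i + k:], tour[i:i + k]

def repair_greedy(partial, removed, pts):
    tour = partial[:]
    for c in removed:                              # cheapest-insertion repair
        best_pos = min(range(len(tour) + 1),
                       key=lambda i: tour_length(tour[:i] + [c] + tour[i:], pts))
        tour.insert(best_pos, c)
    return tour

def alns(pts, iters=200, seed=0):
    rng = random.Random(seed)
    destroys = [destroy_random, destroy_segment]
    weights = [1.0, 1.0]                           # adaptive operator weights
    best = cur = list(range(len(pts)))
    for _ in range(iters):
        i = rng.choices(range(len(destroys)), weights)[0]  # roulette wheel
        partial, removed = destroys[i](cur, rng)
        cand = repair_greedy(partial, removed, pts)
        if tour_length(cand, pts) <= tour_length(cur, pts):
            cur = cand
            weights[i] += 0.1                      # reward successful operator
        if tour_length(cand, pts) < tour_length(best, pts):
            best = cand
        weights[i] *= 0.99                         # decay toward uniformity
    return best
```

A full implementation would typically add an acceptance criterion such as simulated annealing and periodic weight normalization, but the accept-if-not-worse rule above keeps the sketch short.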

  11. New and improved methods for monitoring air quality and the terrestrial environment: Applications at Aberdeen Proving Ground-Edgewood area. Annual report, 1 April--14 November 1997

    Energy Technology Data Exchange (ETDEWEB)

    Bromenshenk, J.J.; Smith, G.C.

    1998-03-01

    Honey bees (Apis mellifera L.) have been shown to be multi-media monitors of chemical exposures and resultant effects. This five-year project has developed an automated system to assess in real-time colony behavioral responses to stressors, both anthropogenic and natural, including inclement weather. Field trials at the Aberdeen Proving Ground-Edgewood included the Old O Field and J field landfills, the Canal Creek and Bush River areas, and a Churchville, MD reference site. Preliminary results show varying concentrations of bioavailable inorganic elements and chlorinated hydrocarbons in bee colonies from all Maryland sites. Industrial solvents in the air inside beehives exhibited the greatest between site differences, with the highest levels occurring in hives near landfills at Old O Field, J Field, and at some sites in the Bush River and Canal Creek areas. Compared to 1996, the 1997 levels of solvents in Old O Field hives decreased by an order of magnitude, and colony performance significantly improved, probably as a consequence of capping the landfill. Recent chemical monitoring accomplishments include development of a new apparatus to quantitatively calibrate TD/GC/MS analysis, a QA/QC assessment of factors that limit the precision of these analyses, and confirmation of transport of aqueous contaminants into the hive. Real-time effects monitoring advances include development of an extensive array of software tools for automated data display, inspection, and numerical analysis and the ability to deliver data from remote locations in real time through Internet or Intranet connections.

  12. The Apache Longbow-Hellfire Missile Test at Yuma Proving Ground: Introduction and Problem Formulation for a Multiple Stressor Risk Assessment

    International Nuclear Information System (INIS)

    Efroymson, Rebecca Ann; Peterson, Mark J.; Jones, Daniel Steven; Suter, Glenn

    2008-01-01

    An ecological risk assessment was conducted at Yuma Proving Ground, Arizona, as a demonstration of the Military Ecological Risk Assessment Framework (MERAF). The focus of the assessment was a testing program at Cibola Range, which involved an Apache Longbow helicopter firing Hellfire missiles at moving targets, i.e., M60-A1 tanks. The problem formulation for the assessment included conceptual models for three component activities of the test, helicopter overflight, missile firing, and tracked vehicle movement, and two ecological endpoint entities, woody desert wash communities and desert mule deer (Odocoileus hemionus crooki) populations. An activity-specific risk assessment framework was available to provide guidance for assessing risks associated with aircraft overflights. Key environmental features of the study area include barren desert pavement and tree-lined desert washes. The primary stressors associated with helicopter overflights were sound and the view of the aircraft. The primary stressor associated with Hellfire missile firing was sound. The principal stressor associated with tracked vehicle movement was soil disturbance, and a resulting, secondary stressor was hydrological change. Water loss to washes and wash vegetation was expected to result from increased ponding, infiltration and/or evaporation associated with disturbances to desert pavement. A plan for estimating integrated risks from the three military activities was included in the problem formulation.

  13. X-ray fluorescence investigation of heavy-metal contamination on metal surfaces in the Pilot Plant Complex, Aberdeen Proving Ground, Maryland

    Energy Technology Data Exchange (ETDEWEB)

    Brubaker, K.L.; Draugelis, A.K.; Schneider, J.F.; Billmark, K.A.; Zimmerman, R.E.

    1995-07-01

    A field program using a portable x-ray fluorescence (XRF) instrument was carried out to obtain data on loadings of RCRA-regulated heavy metals in paint on metal surfaces within the Pilot Plant Complex at Aberdeen Proving Ground, Maryland. Measured loadings of heavy metals were sufficiently small that they do not present problems for either human exposure or the disposition of building demolition rubble. An attempt to develop an external calibration of the XRF instrument for cadmium, chromium, and lead was unsuccessful. Significant substrate effects were observed for cadmium and chromium; for accurate results for these elements, it appears necessary to calibrate by using a sample of the actual metal substrate on which the paint is located. No substrate effects were observed for lead, but the use of lead L-shell x-ray emission lines in the instrument mode utilized in this study appears to result in a significant underestimate of the lead loading due to self-absorption of these emissions.

  14. Environmental geophysics: Buildings E5485, E5487, and E5489 decommissioning - the "Ghost Town" complex, Aberdeen Proving Ground, Maryland

    International Nuclear Information System (INIS)

    McGinnis, L.D.; Thompson, M.D.; Miller, S.F.

    1994-06-01

    Buildings E5485, E5487, and E5489, referred to informally as the "Ghost Town" complex, are potentially contaminated sites in the Edgewood section of Aberdeen Proving Ground. Noninvasive geophysical surveys, including magnetics, EM-31, EM-61, and ground-penetrating radar, were conducted to assist a sampling and monitoring program prior to decommissioning and dismantling of the buildings. The buildings are located on a marginal wetland bordering the west branch of Canal Creek. The dominant geophysical signature in the "Ghost Town" complex is a pattern of northeast-southwest and northwest-southeast anomalies that appear to be associated with a trench/pipe/sewer system, documented by the presence of a manhole. Combinations of anomalies suggest that line sources include nonmetallic and ferromagnetic materials in trenches. On the basis of anomaly associations, the sewer lines probably rest in a trench back-filled with conductive, amphibolitic, crushed rock. Where the sewer lines connect manholes or junctions with other lines, ferromagnetic materials are present. Isolated, unidentified magnetic anomalies litter the area around Building E5487, particularly to the north. Three small magnetic sources are located east of Building E5487.

  15. Privacy as Personality Right: Why the ECtHR’s Focus on Ulterior Interests Might Prove Indispensable in the Age of “Big Data”

    Directory of Open Access Journals (Sweden)

    Bart van der Sloot

    2015-02-01

    Full Text Available Article 8 ECHR was adopted as a classic negative right, which provides the citizen protection from unlawful and arbitrary interference by the state with his private and family life, home and communication. The ECtHR, however, has gradually broadened its scope so that the right to privacy encroaches upon other provisions embodied in the Convention, includes rights and freedoms explicitly left out of the ECHR by the drafters of the Convention and functions as the main pillar on which the Court has built its practice of opening up the Convention for new rights and freedoms. Consequently, Article 8 ECHR has been transformed from a classic privacy right to a personality right, providing protection to the personal development of individuals. Apart from its theoretical significance, this shift might prove indispensable in the age of Big Data, as personality rights protect a different type of interest, which is far easier to substantiate in the new technological paradigm than those associated with the right to privacy.

  16. Long-term ground-water monitoring program and performance-evaluation plan for the extraction system at the former Nike Missile Battery Site, Aberdeen Proving Ground, Maryland

    Science.gov (United States)

    Senus, Michael P.; Tenbus, Frederick J.

    2000-01-01

    This report presents lithologic and ground-water-quality data collected during April and May 2000 in the remote areas of the tidal wetland of West Branch Canal Creek, Aberdeen Proving Ground, Maryland. Contamination of the Canal Creek aquifer with volatile organic compounds has been documented in previous investigations of the area. This study was conducted to investigate areas that were previously inaccessible because of deep mud and shallow water, and to support ongoing investigations of the fate and transport of volatile organic compounds in the Canal Creek aquifer. A unique vibracore drill rig mounted on a hovercraft was used for drilling and ground-water sampling. Continuous cores of the wetland sediment and of the Canal Creek aquifer were collected at five sites. Attempts to sample ground water were made by use of a continuous profiler at 12 sites, without well installation, at a total of 81 depths within the aquifer. Of those 81 attempts, only 34 sampling depths produced enough water to collect samples. Ground-water samples from two sites had the highest concentrations of volatile organic compounds, with total volatile organic compound concentrations in the upper part of the aquifer ranging from about 15,000 to 50,000 micrograms per liter. Ground-water samples from five sites had much lower total volatile organic compound concentrations (95 to 2,100 micrograms per liter), whereas two sites were essentially not contaminated, with total volatile organic compound concentrations less than or equal to 5 micrograms per liter.

  17. Contamination of ground water, surface water, and soil, and evaluation of selected ground-water pumping alternatives in the Canal Creek area of Aberdeen Proving Ground, Maryland

    Science.gov (United States)

    Lorah, Michelle M.; Clark, Jeffrey S.

    1996-01-01

    Chemical manufacturing, munitions filling, and other military-support activities have resulted in the contamination of ground water, surface water, and soil in the Canal Creek area of Aberdeen Proving Ground, Maryland. Chlorinated volatile organic compounds, including 1,1,2,2-tetrachloroethane and trichloroethylene, are widespread ground-water contaminants in two aquifers that are composed of unconsolidated sand and gravel. Distribution and fate of chlorinated organic compounds in the ground water has been affected by the movement and dissolution of solvents in their dense immiscible phase and by microbial degradation under anaerobic conditions. Detection of volatile organic contaminants in adjacent surface water indicates that shallow contaminated ground water discharges to surface water. Semivolatile organic compounds, especially polycyclic aromatic hydrocarbons, are the most prevalent organic contaminants in soils. Various trace elements, such as arsenic, cadmium, lead, and zinc, were found in elevated concentrations in ground water, surface water, and soil. Simulations with a ground-water-flow model and particle tracker postprocessor show that, without remedial pumpage, the contaminants will eventually migrate to Canal Creek and Gunpowder River. Simulations indicate that remedial pumpage of 2.0 million gallons per day from existing wells is needed to capture all particles originating in the contaminant plumes. Simulated pumpage from offsite wells screened in a lower confined aquifer does not affect the flow of contaminated ground water in the Canal Creek area.

  18. Concepts and procedures required for successful reduction of tensor magnetic gradiometer data obtained from an unexploded ordnance detection demonstration at Yuma Proving Grounds, Arizona

    Science.gov (United States)

    Bracken, Robert E.; Brown, Philip J.

    2006-01-01

    On March 12, 2003, data were gathered at Yuma Proving Grounds, in Arizona, using a Tensor Magnetic Gradiometer System (TMGS). This report shows how these data were processed and explains concepts required for successful TMGS data reduction. Important concepts discussed include extreme attitudinal sensitivity of vector measurements, low attitudinal sensitivity of gradient measurements, leakage of the common-mode field into gradient measurements, consequences of thermal drift, and effects of field curvature. Spatial-data collection procedures and a spin-calibration method are addressed. Discussions of data-reduction procedures include tracking of axial data by mathematically matching transfer functions among the axes, derivation and application of calibration coefficients, calculation of sensor-pair gradients, thermal-drift corrections, and gradient collocation. For presentation, the magnetic tensor at each data station is converted to a scalar quantity, the I2 tensor invariant, which is easily found by calculating the determinant of the tensor. At important processing junctures, the determinants for all stations in the mapped area are shown in shaded relief map-view. Final processed results are compared to a mathematical model to show the validity of the assumptions made during processing and the reasonableness of the ultimate answer obtained.
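
The record's final step, converting the measured tensor to the determinant-based scalar invariant, is easy to illustrate. A minimal NumPy sketch, following the record's statement that the I2 invariant is found by taking the determinant; the example tensor and rotation are assumptions:

```python
# Hedged sketch: reduce a 3x3 magnetic gradient tensor to a scalar via its
# determinant (the invariant the record maps in shaded relief). Because the
# determinant is rotation invariant, the mapped quantity does not depend on
# sensor orientation, which is why it is attractive for presentation.
import numpy as np

def tensor_invariant(g):
    """Scalar invariant of a 3x3 magnetic gradient tensor: its determinant."""
    return np.linalg.det(g)

# An illustrative symmetric, traceless gradient tensor, as expected for a
# magnetic field in a source-free region
g = np.array([[ 2.0,  0.5,  0.3],
              [ 0.5, -1.0,  0.2],
              [ 0.3,  0.2, -1.0]])

# Rotating the sensor frame leaves the invariant unchanged
theta = 0.7
r = np.array([[np.cos(theta), -np.sin(theta), 0.0],
              [np.sin(theta),  np.cos(theta), 0.0],
              [0.0,            0.0,           1.0]])
g_rot = r @ g @ r.T
assert np.isclose(tensor_invariant(g), tensor_invariant(g_rot))
```

This orientation independence is what makes the invariant a convenient map-view quantity despite the extreme attitudinal sensitivity of the underlying vector measurements noted in the abstract.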

  19. Biodegradable Magnetic Silica@Iron Oxide Nanovectors with Ultra-Large Mesopores for High Protein Loading, Magnetothermal Release, and Delivery

    KAUST Repository

    Omar, Haneen; Croissant, Jonas G.; Alamoudi, Kholod; Alsaiari, Shahad K.; Alradwan, Ibrahim; Majrashi, Majed A.; Anjum, Dalaver H.; Martins, Patricia; Moosa, Basem; Almalik, Abdulaziz; Khashab, Niveen M.

    2016-01-01

    The delivery of large cargos of diameter above 15 nm for biomedical applications has proved challenging since it requires biocompatible, stably-loaded, and biodegradable nanomaterials. In this study, we describe the design of biodegradable silica

  20. ERP inside Large Organizations

    Directory of Open Access Journals (Sweden)

    Constantin Daniel AVRAM

    2010-01-01

    Full Text Available Many large companies in Romania still operate without an ERP system, instead using traditional application systems built around the strong boundaries of specific functions: finance, sales, HR, and production. An ERP offers many advantages, among them the integration of functionalities and support for top-management decisions. Although the total cost of ownership is not small and implementing an ERP inside large and very large organizations carries risks, having such a system is mandatory. Choosing the right product and vendor, and applying a sound risk-management strategy, will ensure a successful implementation.

  1. Large Pelagics Telephone Survey

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — The Large Pelagics Telephone Survey (LPTS) collects fishing effort information directly from captains holding Highly Migratory Species (HMS) permits (required by...

  2. Large Customers (DR Sellers)

    Energy Technology Data Exchange (ETDEWEB)

    Kiliccot, Sila [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States)

    2011-10-25

    State of the large customers for demand response integration of solar and wind into electric grid; openADR; CAISO; DR as a pseudo generation; commercial and industrial DR strategies; California regulations

  3. Large Pelagics Biological Survey

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — The Large Pelagics Biological Survey (LPBS) collects additional length and weight information and body parts such as otoliths, caudal vertebrae, dorsal spines, and...

  4. Large Rotor Test Apparatus

    Data.gov (United States)

    Federal Laboratory Consortium — This test apparatus, when combined with the National Full-Scale Aerodynamics Complex, produces a thorough, full-scale test capability. The Large Rotor Test Apparatus...

  5. Large transverse momentum phenomena

    International Nuclear Information System (INIS)

    Brodsky, S.J.

    1977-09-01

    It is pointed out that it is particularly significant that the quantum numbers of the leading particles are strongly correlated with the quantum numbers of the incident hadrons, indicating that the valence quarks themselves are transferred to large p_t. The crucial question is how they get there. Various hadron reactions are discussed covering the structure of exclusive reactions, inclusive reactions, normalization of inclusive cross sections, charge correlations, and jet production at large transverse momentum. 46 references

  6. Large Retailers’ Financial Services

    OpenAIRE

    Risso, Mario

    2010-01-01

    Over the last few years, large retailers offering financial services have considerably grown in the financial services sector. Retailers are increasing the wideness and complexity of their offer of financial services. Large retail companies provide financial services to their customers following different strategic ways. The provision of financial services in the retailers offer is implemented in several different ways related to the strategies, the structures and the degree of financial know...

  7. Large momentum transfer phenomena

    International Nuclear Information System (INIS)

    Imachi, Masahiro; Otsuki, Shoichiro; Matsuoka, Takeo; Sawada, Shoji.

    1978-01-01

    Large momentum transfer phenomena in hadron reactions differ drastically from small momentum transfer phenomena, and are described in this paper. A brief review of the features of large transverse momentum transfer reactions is given in relation to two-body reactions, single particle productions, particle ratios, two jet structure, two particle correlations, jet production cross section, the component of momentum perpendicular to the plane defined by the incident protons and the triggered pions, and transverse momentum relative to the jet axis. In the case of two-body processes, the exponent N of the power law of the differential cross section has a value between 10 and 11.5 in the large momentum transfer region. Breaks from exponential to power-law behavior are observed in the large momentum transfer region. The break makes it possible to estimate the order of a critical length. Large momentum transfer phenomena strongly suggest an important role of the constituents of hadrons in the hard region. Hard rearrangement of constituents from different initial hadrons induces large momentum transfer reactions. Several rules for counting constituents in the hard region have been proposed so far to explain the power behavior. Scale invariant quark interaction and hard reactions are explained, and a summary of the possible types of hard subprocess is presented. (Kato, T.)

  8. Report of the large solenoid detector group

    International Nuclear Information System (INIS)

    Hanson, G.G.; Mori, S.; Pondrom, L.G.

    1987-09-01

    This report presents a conceptual design of a large solenoid for studying physics at the SSC. The parameters and nature of the detector have been chosen based on present estimates of what is required to allow the study of heavy quarks, supersymmetry, heavy Higgs particles, WW scattering at large invariant masses, new W and Z bosons, and very large momentum transfer parton-parton scattering. Simply stated, the goal is to obtain optimum detection and identification of electrons, muons, neutrinos, jets, W's and Z's over a large rapidity region. The primary region of interest extends over +-3 units of rapidity, although the calorimetry must extend to +-5.5 units if optimal missing energy resolution is to be obtained. A magnetic field was incorporated because of the importance of identifying the signs of the charges for both electrons and muons and because of the added possibility of identifying tau leptons and secondary vertices. In addition, the existence of a magnetic field may prove useful for studying new physics processes about which we currently have no knowledge. Since hermeticity of the calorimetry is extremely important, the entire central and endcap calorimeters were located inside the solenoid. This does not at the moment seem to produce significant problems (although many issues remain to be resolved) and in fact leads to a very effective muon detector in the central region

  9. Design and Performance of an Enhanced Bioremediation Pilot Test in a Tidal Wetland Seep, West Branch Canal Creek, Aberdeen Proving Ground, Maryland

    Science.gov (United States)

    Majcher, Emily H.; Lorah, Michelle M.; Phelan, Daniel J.; McGinty, Angela L.

    2009-01-01

    Because of a lack of available in situ remediation methods for sensitive wetland environments where contaminated groundwater discharges, the U.S. Geological Survey, in cooperation with the U.S. Army Garrison, Aberdeen Proving Ground, Maryland, conceived, designed, and pilot tested a permeable reactive mat that can be placed horizontally at the groundwater/surface-water interface. Development of the reactive mat was part of an enhanced bioremediation study in a tidal wetland area along West Branch Canal Creek at Aberdeen Proving Ground, where localized areas of preferential discharge (seeps) transport groundwater contaminated with carbon tetrachloride, chloroform, tetrachloroethene, trichloroethene, and 1,1,2,2-tetrachloroethane from the Canal Creek aquifer to land surface. The reactive mat consisted of a mixture of commercially available organic- and nutrient-rich peat and compost that was bioaugmented with a dechlorinating microbial consortium, WBC-2, developed for this study. Due to elevated chlorinated methane concentrations in the pilot test site, a layer of zero-valent iron mixed with the peat and compost was added at the base of the reactive mat to promote simultaneous abiotic and biotic degradation. The reactive mat for the pilot test area was designed to optimize chlorinated volatile organic compound degradation efficiency without altering the geotechnical and hydraulic characteristics, or creating undesirable water quality in the surrounding wetland area, which is referred to in this report as achieving geotechnical, hydraulic, and water-quality compatibility. Optimization of degradation efficiency was achieved through the selection of a sustainable organic reactive matrix, electron donor, and bioaugmentation method. 
Consideration of geotechnical compatibility through design calculations of bearing capacity, settlement, and geotextile selection showed that a 2- to 3-feet tolerable thickness of the mat was possible, with 0.17 feet settlement predicted for

  10. The Apache Longbow-Hellfire Missile Test at Yuma Proving Ground: Ecological Risk Assessment for Tracked Vehicle Movement across Desert Pavement

    International Nuclear Information System (INIS)

    Peterson, Mark J; Efroymson, Rebecca Ann; Hargrove, William Walter

    2008-01-01

    A multiple stressor risk assessment was conducted at Yuma Proving Ground, Arizona, as a demonstration of the Military Ecological Risk Assessment Framework. The focus was a testing program at Cibola Range, which involved an Apache Longbow helicopter firing Hellfire missiles at moving targets, M60-A1 tanks. This paper describes the ecological risk assessment for the tracked vehicle movement component of the testing program. The principal stressor associated with tracked vehicle movement was soil disturbance, and a resulting, secondary stressor was hydrological change. Water loss to washes and wash vegetation was expected to result from increased infiltration and/or evaporation associated with disturbances to desert pavement. The simulated exposure of wash vegetation to water loss was quantified using estimates of exposed land area from a digital ortho quarter quad aerial photo and field observations, a 30 × 30 m digital elevation model, the flow accumulation feature of ESRI ArcInfo, and a two-step process in which runoff was estimated from direct precipitation to a land area and from water that flowed from upgradient to a land area. In all simulated scenarios, absolute water loss decreased with distance from the disturbance, downgradient in the washes; however, percentage water loss was greatest in land areas immediately downgradient of a disturbance. Potential effects on growth and survival of wash trees were quantified by using an empirical relationship derived from a local unpublished study of water infiltration rates. The risk characterization concluded that neither risk to wash vegetation growth or survival nor risk to mule deer abundance and reproduction was expected. The risk characterization was negative for both the incremental risk of the test program and the combination of the test and pretest disturbances

  11. Standard and biological treatment in large vessel vasculitis: guidelines and current approaches.

    Science.gov (United States)

    Muratore, Francesco; Pipitone, Nicolò; Salvarani, Carlo

    2017-04-01

    Giant cell arteritis and Takayasu arteritis are the two major forms of idiopathic large vessel vasculitis. High doses of glucocorticoids are effective in inducing remission in both conditions, but relapses and recurrences are common, requiring prolonged glucocorticoid treatment with the risk of the related adverse events. Areas covered: In this article, we will review the standard and biological treatment strategies in large vessel vasculitis, and we will focus on the current approaches to these diseases. Expert commentary: The results of treatment trials with conventional immunosuppressive agents such as methotrexate, azathioprine, mycophenolate mofetil, and cyclophosphamide have overall been disappointing. TNF-α blockers are ineffective in giant cell arteritis, while observational evidence and a phase 2 randomized trial support the use of tocilizumab in relapsing giant cell arteritis. Observational evidence strongly supports the use of anti-TNF-α agents and tocilizumab in Takayasu patients with relapsing disease. However biological agents are not curative, and relapses remain common.

  12. Large-scale stochasticity in Hamiltonian systems

    International Nuclear Information System (INIS)

    Escande, D.F.

    1982-01-01

    Large scale stochasticity (L.S.S.) in Hamiltonian systems is defined on the paradigm Hamiltonian H(v,x,t) = v²/2 - M cos x - P cos k(x-t), which describes the motion of one particle in two electrostatic waves. A renormalization transformation T_r is described which acts as a microscope that focusses on a given KAM (Kolmogorov-Arnold-Moser) torus in phase space. Though approximate, T_r yields the threshold of L.S.S. in H with an error of 5-10%. The universal behaviour of KAM tori is predicted: for instance the scale invariance of KAM tori and the critical exponent of the Lyapunov exponent of Cantori. The Fourier expansion of KAM tori is computed and several conjectures by L. Kadanoff and S. Shenker are proved. Chirikov's standard mapping for stochastic layers is derived in a simpler way and the width of the layers is computed. A simpler renormalization scheme for these layers is defined. A Mathieu equation describing the stability of a discrete family of cycles is derived. When combined with T_r, it allows one to prove the link between KAM tori and nearby cycles, conjectured by J. Greene, and in particular to compute the mean residue of a torus. The fractal diagrams defined by G. Schmidt are computed. A sketch of a methodology for computing the L.S.S. threshold in any two-degree-of-freedom Hamiltonian system is given. (Auth.)
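The paradigm Hamiltonian H(v,x,t) = v²/2 - M cos x - P cos k(x-t) from the abstract yields the equations of motion dx/dt = v, dv/dt = -M sin x - Pk sin k(x-t). A minimal numerical sketch of one trajectory is below; the parameter values, initial conditions, and the leapfrog integrator are illustrative choices, not taken from the paper.

```python
import math

# Illustrative two-wave parameters (well below any stochasticity
# threshold); not values from Escande's paper.
M, P, k = 0.02, 0.02, 2.0

def force(x, t):
    # -dH/dx = -(M sin x + P k sin k(x - t))
    return -(M * math.sin(x) + P * k * math.sin(k * (x - t)))

def integrate(x, v, dt=0.01, steps=10000):
    # Simple leapfrog (kick-drift-kick) time stepping.
    t = 0.0
    for _ in range(steps):
        v += 0.5 * dt * force(x, t)
        x += dt * v
        t += dt
        v += 0.5 * dt * force(x, t)
    return x, v

x, v = integrate(x=0.1, v=0.5)
print(x, v)
```

Scanning many initial conditions of this kind and looking for trajectories that wander between the two resonances is the brute-force counterpart of the renormalization threshold estimate described in the abstract.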

  13. Implementing an energetic life cycle analysis to prove the benefits of lignocellulosic feedstocks with protein separation for the chemical industry from the existing bioethanol industry

    NARCIS (Netherlands)

    Brehmer, B.; Sanders, J.P.M.

    2009-01-01

    The biofuel ethanol is currently being produced in large quantities from corn in the US and from wheat in the EU and further capacity expansion is expected. Relying on the so-called 1st generation technology, only the starch contained in the edible portion of the crops (ears/grains) is subjected to

  14. Manufacture of large monoblock LP rotor forgings and their quality

    International Nuclear Information System (INIS)

    Suzuki, Akira; Kinoshita, Shushi; Kohno, Masayoshi; Miyakawa, Mutsuhiro; Kikuchi, Hideo

    1986-01-01

    This paper describes the manufacturing and the quality of large monoblock low pressure rotors forged from 360 ton and 420 ton ingots. To obtain good and homogeneous mechanical properties throughout a rotor, a computer was used to determine the heat treatment conditions. It was found that the technique was very effective at predicting mechanical properties of a monoblock rotor. Mechanical properties including the fracture toughness and fatigue crack propagation characteristics of monoblock rotor forgings proved satisfactory. (author)

  15. Large electrostatic accelerators

    Energy Technology Data Exchange (ETDEWEB)

    Jones, C.M.

    1984-01-01

    The increasing importance of energetic heavy ion beams in the study of atomic physics, nuclear physics, and materials science has partially or wholly motivated the construction of a new generation of large electrostatic accelerators designed to operate at terminal potentials of 20 MV or above. In this paper, the author briefly discusses the status of these new accelerators and also discusses several recent technological advances which may be expected to further improve their performance. The paper is divided into four parts: (1) a discussion of the motivation for the construction of large electrostatic accelerators, (2) a description and discussion of several large electrostatic accelerators which have been recently completed or are under construction, (3) a description of several recent innovations which may be expected to improve the performance of large electrostatic accelerators in the future, and (4) a description of an innovative new large electrostatic accelerator whose construction is scheduled to begin next year. Due to time and space constraints, discussion is restricted to consideration of only tandem accelerators.

  16. Large electrostatic accelerators

    International Nuclear Information System (INIS)

    Jones, C.M.

    1984-01-01

    The increasing importance of energetic heavy ion beams in the study of atomic physics, nuclear physics, and materials science has partially or wholly motivated the construction of a new generation of large electrostatic accelerators designed to operate at terminal potentials of 20 MV or above. In this paper, the author briefly discusses the status of these new accelerators and also discusses several recent technological advances which may be expected to further improve their performance. The paper is divided into four parts: (1) a discussion of the motivation for the construction of large electrostatic accelerators, (2) a description and discussion of several large electrostatic accelerators which have been recently completed or are under construction, (3) a description of several recent innovations which may be expected to improve the performance of large electrostatic accelerators in the future, and (4) a description of an innovative new large electrostatic accelerator whose construction is scheduled to begin next year. Due to time and space constraints, discussion is restricted to consideration of only tandem accelerators

  17. Developing Large Web Applications

    CERN Document Server

    Loudon, Kyle

    2010-01-01

    How do you create a mission-critical site that provides exceptional performance while remaining flexible, adaptable, and reliable 24/7? Written by the manager of a UI group at Yahoo!, Developing Large Web Applications offers practical steps for building rock-solid applications that remain effective even as you add features, functions, and users. You'll learn how to develop large web applications with the extreme precision required for other types of software. Avoid common coding and maintenance headaches as small websites add more pages, more code, and more programmers. Get comprehensive soluti

  18. Choice of large projects

    Energy Technology Data Exchange (ETDEWEB)

    Harris, R

    1978-08-01

    Conventional cost/benefit or project analysis has generally not taken into account circumstances in which the project under consideration is large enough that its introduction to the economy would have significant general equilibrium effects. In this paper, rules are examined that would indicate whether such large projects should be accepted or rejected. The rules utilize information yielded by before-project and after-project equilibrium prices and production data. Rules are developed for the undistorted ''first-best'' case, the case in which the fixed costs of the project are covered by distortionary taxation, and for the case of projects producing public goods. 34 references.

  19. Large volumes and spectroscopy of walking theories

    DEFF Research Database (Denmark)

    Del Debbio, L.; Lucini, B.; Patella, A.

    2016-01-01

    A detailed investigation of finite-size effects is performed for SU(2) gauge theory with two fermions in the adjoint representation, which previous lattice studies have shown to be inside the conformal window. The system is investigated with different spatial and temporal boundary conditions...... and the spatial lattice size L satisfy the relation LMPS≥15. This bound, which is at least a factor of three higher than what is observed in QCD, is a likely consequence of the different spectral signatures of the two theories, with the scalar isosinglet (0++ glueball) being the lightest particle in our model....... In addition to stressing the importance of simulating large lattice sizes, our analysis emphasizes the need to understand quantitatively the full spectrum of the theory rather than just the spectrum in the mesonic isotriplet sector. While for the lightest fermion measuring masses from gluonic operators proves...

  20. Large Core Three Branch Polymer Power Splitters

    Directory of Open Access Journals (Sweden)

    V. Prajzler

    2015-12-01

    Full Text Available We report on three branch large core polymer power splitters optimized for connecting standard plastic optical fibers. A new element of the design is the insertion of a rectangle-shaped spacing between the input and the central part of the splitter, which ensures a more even distribution of the output optical power. The splitters were designed by the beam propagation method using BeamPROP software. Acrylic-based polymers were used as optical waveguides, poured into Y-grooves realized by computer numerical controlled engraving on a poly(methyl methacrylate) substrate. Measurements of the insertion losses showed that the insertion loss could be lowered to 2.1 dB at 650 nm and the optical power coupling ratio could reach 31.8% : 37.3% : 30.9%.
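The two figures of merit quoted in the abstract are related by standard definitions: insertion loss is -10·log10 of total output power over launched power, and the coupling ratio is each branch's share of the total output. The sketch below checks the arithmetic; the branch powers are made-up values chosen to reproduce the reported 31.8% : 37.3% : 30.9% split, not measured data.

```python
import math

def insertion_loss_db(p_in, p_out_total):
    # Insertion loss in dB: -10 log10(P_out / P_in)
    return -10.0 * math.log10(p_out_total / p_in)

def coupling_ratio(branch_powers):
    # Each branch's fraction of the total output power
    total = sum(branch_powers)
    return [p / total for p in branch_powers]

branches = [0.318, 0.373, 0.309]   # arbitrary units, illustrative
print([round(100 * r, 1) for r in coupling_ratio(branches)])
# A 2.1 dB loss implies ~62% of launched power reaches the outputs;
# here the (hypothetical) launched power is chosen as 1.62 units.
print(round(insertion_loss_db(1.62, sum(branches)), 2))
```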

  1. Authentication of Primordial Characteristics of the CLBL-1 Cell Line Prove the Integrity of a Canine B-Cell Lymphoma in a Murine In Vivo Model

    OpenAIRE

    Rütgen, Barbara C.; Willenbrock, Saskia; Reimann-Berg, Nicola; Walter, Ingrid; Fuchs-Baumgartinger, Andrea; Wagner, Siegfried; Kovacic, Boris; Essler, Sabine E.; Schwendenwein, Ilse; Nolte, Ingo; Saalmüller, Armin; Escobar, Hugo Murua

    2012-01-01

    Cell lines are key tools in cancer research allowing the generation of neoplasias in animal models resembling the initial tumours able to mimic the original neoplasias closely in vivo. Canine lymphoma is the major hematopoietic malignancy in dogs and considered as a valuable spontaneous large animal model for human Non-Hodgkin's Lymphoma (NHL). Herein we describe the establishment and characterisation of an in vivo model using the canine B-cell lymphoma cell line CLBL-1 analysing the stabilit...

  2. LARGE SCALE GLAZED

    DEFF Research Database (Denmark)

    Bache, Anja Margrethe

    2010-01-01

    of selected existing buildings in and around Copenhagen covered with mosaic tiles, unglazed or glazed clay tiles. It is buildings which have qualities that I would like applied, perhaps transformed or, most preferably, interpreted anew, for the large glazed concrete panels I am developing. Keywords: color, light...

  3. Large hydropower generating units

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    1997-07-01

    This document presents the Brazilian experience with the design, fabrication, construction, commissioning and operation of large-scale, high-capacity generating units. The experience was acquired through the implementation of the Itumbiara, Paulo Afonso IV, Tucurui, Itaipu and Xingo power plants, which are among the largest units in the world.

  4. Large Data Set Mining

    NARCIS (Netherlands)

    Leemans, I.B.; Broomhall, Susan

    2017-01-01

    Digital emotion research has yet to make history. Until now large data set mining has not been a very active field of research in early modern emotion studies. This is indeed surprising since first, the early modern field has such rich, copyright-free, digitized data sets and second, emotion studies

  5. Representing Large Virtual Worlds

    NARCIS (Netherlands)

    Kol, T.R.

    2018-01-01

    The ubiquity of large virtual worlds and their growing complexity in computer graphics require efficient representations. This means that we need smart solutions for the underlying storage of these complex environments, but also for their visualization. How the virtual world is best stored and how

  6. The large hadron computer

    CERN Multimedia

    Hirstius, Andreas

    2008-01-01

    Plans for dealing with the torrent of data from the Large Hadron Collider's detectors have made the CERN particle-physics lab, yet again, a pioneer in computing as well as physics. The author describes the challenges of processing and storing data in the age of petabyte science. (4 pages)

  7. LARGE BUILDING HVAC SIMULATION

    Science.gov (United States)

    The report discusses the monitoring and collection of data relating to indoor pressures and radon concentrations under several test conditions in a large school building in Bartow, Florida. The Florida Solar Energy Center (FSEC) used an integrated computational software, FSEC 3.0...

  8. Large Hadron Collider

    CERN Multimedia

    2007-01-01

    "In the spring 2008, the Large Hadron Collider (LHC) machine at CERN (the European Particle Physics laboratory) will be switched on for the first time. The huge machine is housed in a circular tunnel, 27 km long, excavated deep under the French-Swiss border near Geneva." (1,5 page)

  9. Large reservoirs: Chapter 17

    Science.gov (United States)

    Miranda, Leandro E.; Bettoli, Phillip William

    2010-01-01

    Large impoundments, defined as those with surface area of 200 ha or greater, are relatively new aquatic ecosystems in the global landscape. They represent important economic and environmental resources that provide benefits such as flood control, hydropower generation, navigation, water supply, commercial and recreational fisheries, and various other recreational and esthetic values. Construction of large impoundments was initially driven by economic needs, and ecological consequences received little consideration. However, in recent decades environmental issues have come to the forefront. In the closing decades of the 20th century societal values began to shift, especially in the developed world. Society is no longer willing to accept environmental damage as an inevitable consequence of human development, and it is now recognized that continued environmental degradation is unsustainable. Consequently, construction of large reservoirs has virtually stopped in North America. Nevertheless, in other parts of the world construction of large reservoirs continues. The emergence of systematic reservoir management in the early 20th century was guided by concepts developed for natural lakes (Miranda 1996). However, we now recognize that reservoirs are different and that reservoirs are not independent aquatic systems inasmuch as they are connected to upstream rivers and streams, the downstream river, other reservoirs in the basin, and the watershed. Reservoir systems exhibit longitudinal patterns both within and among reservoirs. Reservoirs are typically arranged sequentially as elements of an interacting network, filter water collected throughout their watersheds, and form a mosaic of predictable patterns. Traditional approaches to fisheries management such as stocking, regulating harvest, and in-lake habitat management do not always produce desired effects in reservoirs. As a result, managers may expend resources with little benefit to either fish or fishing. 
Some locally

  10. Existence of Mott-Schwinger interaction proved by means of p-¹²C elastic scattering [450 to 600 keV

    Energy Technology Data Exchange (ETDEWEB)

    Krause, H H; Arnold, W; Berg, H; Ulbricht, J; Clausnitzer, G [Giessen Univ. (Germany, F.R.). Inst. fuer Kernphysik

    1979-01-01

    The aim of this work was the unambiguous proof of the existence of the Mott-Schwinger interaction. The analyzing power of the p-¹²C elastic scattering was measured in the energy range from 450 to 600 keV for scattering angles θ_Lab = 90° and 120° with an overall accuracy up to ΔA = 1 × 10⁻⁴. The data can be described very well with the R-matrix formalism including Mott-Schwinger interaction. Omitting this interaction results in large discrepancies.

  11. Large Hadron Collider manual

    CERN Document Server

    Lavender, Gemma

    2018-01-01

    What is the universe made of? How did it start? This Manual tells the story of how physicists are seeking answers to these questions using the world’s largest particle smasher – the Large Hadron Collider – at the CERN laboratory on the Franco-Swiss border. Beginning with the first tentative steps taken to build the machine, the digestible text, supported by color photographs of the hardware involved, along with annotated schematic diagrams of the physics experiments, covers the particle accelerator’s greatest discoveries – from both the perspective of the writer and the scientists who work there. The Large Hadron Collider Manual is a full, comprehensive guide to the most famous, record-breaking physics experiment in the world, which continues to capture the public imagination as it provides new insight into the fundamental laws of nature.

  12. [Large benign prostatic hyperplasia].

    Science.gov (United States)

    Soria-Fernández, Guillermo René; Jungfermann-Guzman, José René; Lomelín-Ramos, José Pedro; Jaspersen-Gastelum, Jorge; Rosas-Nava, Jesús Emmanuel

    2012-01-01

    The term prostatic hyperplasia is most frequently used to describe benign prostatic growth, a widely prevalent, age-associated disorder that affects most men as they age. The association between prostate growth and urinary obstruction in older adults is well documented. Large benign prostatic hyperplasia is rare; few cases have been published, and it should be taken into account during the study of tumors of the pelvic cavity. We report the case of an 81-year-old man who had significant storage and bladder-emptying symptoms, with no significant elevation of prostate specific antigen. This is a rare condition, but it is still important to diagnose and treat, as it may be related to severe obstructive uropathy and chronic renal failure. In our institution, cases of large prostatic hyperplasia that are solved by suprapubic adenomectomy account for less than 3%.

  13. [Large vessel vasculitides].

    Science.gov (United States)

    Morović-Vergles, Jadranka; Puksić, Silva; Gracanin, Ana Gudelj

    2013-01-01

    Large vessel vasculitis includes giant cell arteritis and Takayasu arteritis. Giant cell arteritis is the most common form of vasculitis, affecting patients aged 50 years or over. The diagnosis should be considered in older patients who present with new onset of headache, visual disturbance, polymyalgia rheumatica and/or fever of unknown cause. Glucocorticoids remain the cornerstone of therapy. Takayasu arteritis is a chronic panarteritis of the aorta and its major branches, commonly presenting at a young age. Although all large arteries can be affected, the aorta and the subclavian and carotid arteries are most commonly involved. The most common symptoms include upper extremity claudication, hypertension, pain over the carotid arteries (carotidynia), dizziness and visual disturbances. Early diagnosis and treatment have improved the outcome in patients with TA.

  14. Large tandem accelerators

    International Nuclear Information System (INIS)

    Jones, C.M.

    1976-01-01

    The increasing importance of energetic heavy ion beams in the study of atomic physics, nuclear physics, and materials science has partially or wholly motivated the construction of a new generation of tandem accelerators designed to operate at maximum terminal potentials in the range 14 to 30 MV. In addition, a number of older tandem accelerators are now being significantly upgraded to improve their heavy ion performance. Both of these developments have reemphasized the importance of negative heavy ion sources. The new large tandem accelerators are described, and the requirements placed on negative heavy ion source technology by these and other tandem accelerators used for the acceleration of heavy ions are discussed. First, a brief description is given of the large tandem accelerators which have been completed recently, are under construction, or are funded for construction; second, the motivation for construction of these accelerators is discussed; and last, criteria for negative ion sources for use with these accelerators are presented

  15. Large scale reflood test

    International Nuclear Information System (INIS)

    Hirano, Kemmei; Murao, Yoshio

    1980-01-01

    The large-scale reflood test with a view to ensuring the safety of light water reactors was started in fiscal 1976 based on the special account act for power source development promotion measures by the entrustment from the Science and Technology Agency. Thereafter, to establish the safety of PWRs in loss-of-coolant accidents by joint international efforts, the Japan-West Germany-U.S. research cooperation program was started in April, 1980. Thereupon, the large-scale reflood test is now included in this program. It consists of two tests using a cylindrical core testing apparatus for examining the overall system effect and a plate core testing apparatus for testing individual effects. Each apparatus is composed of the mock-ups of pressure vessel, primary loop, containment vessel and ECCS. The testing method, the test results and the research cooperation program are described. (J.P.N.)

  16. Large scale model testing

    International Nuclear Information System (INIS)

    Brumovsky, M.; Filip, R.; Polachova, H.; Stepanek, S.

    1989-01-01

    Fracture mechanics and fatigue calculations for WWER reactor pressure vessels were checked by large scale model testing performed using the large testing machine ZZ 8000 (with a maximum load of 80 MN) at the SKODA WORKS. The results are described from testing the material resistance to fracture (non-ductile). The testing included the base materials and welded joints. The rated specimen thickness was 150 mm, with defects of a depth between 15 and 100 mm. Results are also presented for nozzles of 850 mm inner diameter at a scale of 1:3; static, cyclic, and dynamic tests were performed without and with surface defects (15, 30 and 45 mm deep). During cyclic tests the crack growth rate in the elastic-plastic region was also determined. (author). 6 figs., 2 tabs., 5 refs.

  17. Large mass storage facility

    Energy Technology Data Exchange (ETDEWEB)

    Peskin, Arnold M.

    1978-08-01

    This is the final report of a study group organized to investigate questions surrounding the acquisition of a large mass storage facility. The programmatic justification for such a system at Brookhaven is reviewed. Several candidate commercial products are identified and discussed. A draft of a procurement specification is developed. Some thoughts on possible new directions for computing at Brookhaven are also offered, although this topic was addressed outside of the context of the group's deliberations. 2 figures, 3 tables.

  18. The Large Hadron Collider

    CERN Document Server

    Juettner Fernandes, Bonnie

    2014-01-01

    What really happened during the Big Bang? Why did matter form? Why do particles have mass? To answer these questions, scientists and engineers have worked together to build the largest and most powerful particle accelerator in the world: the Large Hadron Collider. Includes glossary, websites, and bibliography for further reading. Perfect for STEM connections. Aligns to the Common Core State Standards for Language Arts. Teachers' Notes available online.

  19. Large granular lymphocyte leukemia: natural history and response to treatment.

    LENUS (Irish Health Repository)

    Fortune, Anne F

    2012-02-01

    Large granular lymphocyte leukemia (T-LGL) is an indolent T lymphoproliferative disorder that was difficult to diagnose with certainty until clonality testing of the T cell receptor gene became routinely available. We studied the natural history and response to treatment in 25 consecutive patients with T-LGL diagnosed between 2004 and 2008, in whom the diagnosis was confirmed by molecular analysis, to define an effective treatment algorithm. The median age at diagnosis was 61 years (range 27-78), with a male to female ratio of 1:1.8 and presenting features of fatigue (n = 13), recurrent infections (n = 9), and/or abnormal blood counts (n = 5). Thirteen patients with symptomatic disease were treated as follows: pentostatin (nine patients), cyclosporine (six patients), methotrexate (three patients), and alemtuzumab in two patients in whom pentostatin was ineffective. Pentostatin was the single most effective therapy, with a response rate of 75% and minimal toxicity. The overall survival (OS) and progression-free survival (PFS) at 37 months from diagnosis were 80% and 52%, respectively. Treatment of T-LGL should be reserved for patients with symptomatic disease, but in this series, pentostatin treatment was less toxic and more effective than cyclosporine or methotrexate.

  20. Large Right Pleural Effusion

    Directory of Open Access Journals (Sweden)

    Robert Rowe

    2016-09-01

    Full Text Available History of present illness: An 83-year-old male with a distant history of tuberculosis status post treatment and resection approximately fifty years prior presented with two days of worsening shortness of breath. He denied any chest pain, and reported his shortness of breath was worse with exertion and lying flat. Significant findings: Chest x-ray and bedside ultrasound revealed a large right pleural effusion, estimated to be greater than two and a half liters in size. Discussion: The incidence of pleural effusion is estimated to be at least 1.5 million cases annually in the United States.1 Erect posteroanterior and lateral chest radiography remains the mainstay for diagnosis of a pleural effusion; on upright chest radiography small effusions (>400cc) will blunt the costophrenic angles, and as the size of an effusion grows it will begin to obscure the hemidiaphragm.1 Large effusions will cause mediastinal shift away from the affected side (seen in effusions >1000cc).1 Lateral decubitus chest radiography can detect effusions greater than 50cc.1 Ultrasonography can help differentiate large pulmonary masses from effusions and can be instrumental in guiding thoracentesis.1 The patient above was comfortable at rest and was admitted for a non-emergent thoracentesis. The pulmonology team removed 2500cc of fluid, and unfortunately the patient subsequently developed re-expansion pulmonary edema and pneumothorax ex-vacuo. It is generally recommended that no more than 1500cc be removed to minimize the risk of re-expansion pulmonary edema.2

  1. Large litter sizes

    DEFF Research Database (Denmark)

    Sandøe, Peter; Rutherford, K.M.D.; Berg, Peer

    2012-01-01

    This paper presents some key results and conclusions from a review (Rutherford et al. 2011) undertaken regarding the ethical and welfare implications of breeding for large litter size in the domestic pig and about different ways of dealing with these implications. Focus is primarily on the direct...... possible to achieve a drop in relative piglet mortality and the related welfare problems. However, there will be a growing problem with the need to use foster or nurse sows which may have negative effects on both sows and piglets. This gives rise to new challenges for management....

  2. Large lithium loop experience

    International Nuclear Information System (INIS)

    Kolowith, R.; Owen, T.J.; Berg, J.D.; Atwood, J.M.

    1981-10-01

    An engineering design and operating experience of a large, isothermal, lithium-coolant test loop are presented. This liquid metal coolant loop is called the Experimental Lithium System (ELS) and has operated safely and reliably for over 6500 hours through September 1981. The loop is used for full-scale testing of components for the Fusion Materials Irradiation Test (FMIT) Facility. Main system parameters include coolant temperatures up to 430 °C and flows up to 0.038 m³/s (600 gal/min). Performance of the main pump, vacuum system, and control system is discussed. Unique test capabilities of the ELS are also discussed.
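As a quick consistency check, the two flow figures quoted above can be reconciled with a one-line unit conversion (this sketch is illustrative only and is not from the report; 264.172 gal/m³ is the standard US-gallon definition):

```python
def m3_per_s_to_gal_per_min(q_m3_per_s):
    """Convert a volumetric flow from m^3/s to US gallons per minute.

    1 m^3 = 264.172 US gallons, 1 minute = 60 s.
    """
    return q_m3_per_s * 264.172 * 60.0

# The ELS figure of 0.038 m^3/s comes out near the quoted 600 gal/min.
flow_gpm = m3_per_s_to_gal_per_min(0.038)  # ~602 gal/min
```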

  3. Large coil test facility

    International Nuclear Information System (INIS)

    Nelms, L.W.; Thompson, P.B.

    1980-01-01

    Final design of the facility is nearing completion, and 20% of the construction has been accomplished. A large vacuum chamber houses the test assembly, which is coupled to the appropriate cryogenic, electrical, instrumentation, and diagnostic systems. Adequate assembly/disassembly areas, shop space, test control center, offices, and test support laboratories are located in the same building. Assembly and installation operations are accomplished with an overhead crane. The major subsystems are the vacuum system, the test stand assembly, the cryogenic system, the experimental electric power system, the instrumentation and control system, and the data acquisition system.

  4. Hierarchies in Quantum Gravity: Large Numbers, Small Numbers, and Axions

    Science.gov (United States)

    Stout, John Eldon

    Our knowledge of the physical world is mediated by relatively simple, effective descriptions of complex processes. By their very nature, these effective theories obscure any phenomena outside their finite range of validity, discarding information crucial to understanding the full, quantum gravitational theory. However, we may gain enormous insight into the full theory by understanding how effective theories with extreme characteristics--for example, those which realize large-field inflation or have disparate hierarchies of scales--can be naturally realized in consistent theories of quantum gravity. The work in this dissertation focuses on understanding the quantum gravitational constraints on these "extreme" theories in well-controlled corners of string theory. Axion monodromy provides one mechanism for realizing large-field inflation in quantum gravity. These models spontaneously break an axion's discrete shift symmetry and, assuming that the corrections induced by this breaking remain small throughout the excursion, create a long, quasi-flat direction in field space. This weakly-broken shift symmetry has been used to construct a dynamical solution to the Higgs hierarchy problem, dubbed the "relaxion." We study this relaxion mechanism and show that--without major modifications--it can not be naturally embedded within string theory. In particular, we find corrections to the relaxion potential--due to the ten-dimensional backreaction of monodromy charge--that conflict with naive notions of technical naturalness and render the mechanism ineffective. The super-Planckian field displacements necessary for large-field inflation may also be realized via the collective motion of many aligned axions. However, it is not clear that string theory provides the structures necessary for this to occur. 
We search for these structures by explicitly constructing the leading order potential for C4 axions and computing the maximum possible field displacement in all compactifications of

  5. Resonance power supplies for large accelerator

    International Nuclear Information System (INIS)

    Karady, G.; Schneider, E.J.

    1993-01-01

    The resonance power supply has been proposed as an efficient power supply for a future 6 GeV, kaon-producing accelerator. This report presents a detailed analysis of the circuit operation. Based on these analyses each component is designed, a one-line diagram is developed, component requirements are determined and a detailed cost estimate is prepared. The major components of the system are: the magnet power supply; a high-voltage bypass thyristor switch with 10 kA repetitive interruption capability; capacitor banks; a capacitor bank thyristor switch; and an energy make-up device. The most important components are the bypass thyristor switch and the energy injection device. The bypass thyristor switch is designed to turn on and interrupt a 10 kA DC current with a recovery voltage of 20 kV and a repetition frequency of 3 Hz. The switch consists of a large array of series- and parallel-connected thyristors and gate turn-off (GTO) devices. The energy make-up device is designed to replace the circuit energy losses. A capacitor bank is charged with constant current and discharged during the acceleration period. One of the advantages of the developed circuit is that it can be supplied directly from the local power network. In order to prove the validity of the assumptions, a scaled-down model circuit was thoroughly tested. These tests proved that the engineering design of critical components is correct and that this resonant power supply can be properly controlled by an inverter/rectifier connected in series with the magnet and by the energy make-up device. This finding reduces the system cost.
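The basic sizing relation behind such a circuit, the LC resonance condition f = 1/(2π√(LC)), can be sketched as follows; the 1 H magnet inductance in the example is a hypothetical value chosen purely for illustration, not a parameter from the report:

```python
import math

def capacitance_for_resonance(f_hz, inductance_h):
    """Capacitor bank value C (farads) that resonates with a magnet
    inductance L (henries) at frequency f, from f = 1/(2*pi*sqrt(L*C))."""
    return 1.0 / ((2.0 * math.pi * f_hz) ** 2 * inductance_h)

# Hypothetical example: a 1 H magnet string cycled at the 3 Hz repetition rate.
c_bank = capacitance_for_resonance(3.0, 1.0)
```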

  6. Large orbit neoclassical transport

    International Nuclear Information System (INIS)

    Lin, Z.; Tang, W.M.; Lee, W.W.

    1997-01-01

    Neoclassical transport in the presence of large ion orbits is investigated. The study is motivated by recent experimental results showing that ion thermal transport levels in enhanced confinement tokamak plasmas fall below the "irreducible minimum level" predicted by standard neoclassical theory. This apparent contradiction is resolved in the present analysis by relaxing the basic neoclassical assumption that the ions' orbital excursions are much smaller than the local toroidal minor radius and the equilibrium scale lengths of the system. Analytical and simulation results are in agreement with trends from experiments. The development of a general formalism for neoclassical transport theory with finite orbit width is also discussed. copyright 1997 American Institute of Physics
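For context, the smallness assumption being relaxed is usually stated through the banana orbit width; a commonly quoted estimate (standard textbook background, not taken from this paper) is

```latex
\Delta_b \sim \frac{q\,\rho_i}{\sqrt{\epsilon}}, \qquad \epsilon = \frac{r}{R},
\qquad \text{standard neoclassical ordering: } \Delta_b \ll r,
```

where ρ_i is the ion gyroradius, q the safety factor, and ε the inverse aspect ratio; the finite-orbit-width analysis described above applies when this ordering fails.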

  7. Large Superconducting Magnet Systems

    CERN Document Server

    Védrine, P.

    2014-07-17

    The increase of energy in accelerators over the past decades has led to the design of superconducting magnets for both accelerators and the associated detectors. The use of Nb−Ti superconducting materials allows an increase in the dipole field by up to 10 T compared with the maximum field of 2 T in a conventional magnet. The field bending of the particles in the detectors and generated by the magnets can also be increased. New materials, such as Nb$_{3}$Sn and high temperature superconductor (HTS) conductors, can open the way to higher fields, in the range 13–20 T. The latest generations of fusion machines producing hot plasma also use large superconducting magnet systems.

  8. Large Scale Solar Heating

    DEFF Research Database (Denmark)

    Heller, Alfred

    2001-01-01

    The main objective of the research was to evaluate large-scale solar heating connected to district heating (CSDHP), to build up a simulation tool and to demonstrate the application of the simulation tool for design studies and on a local energy planning case. The evaluation was mainly carried out...... model is designed and validated on the Marstal case. Applying the Danish Reference Year, a design tool is presented. The simulation tool is used for proposals for application of alternative designs, including high-performance solar collector types (trough solar collectors, vacuum pipe collectors......). Simulation programs are proposed as a control supporting tool for daily operation and performance prediction of central solar heating plants. Finally the CSDHP technology is put into perspective with respect to alternatives, and a short discussion on the barriers and breakthrough of the technology is given....

  9. Large Superconducting Magnet Systems

    Energy Technology Data Exchange (ETDEWEB)

    Védrine, P [Saclay (France)

    2014-07-01

    The increase of energy in accelerators over the past decades has led to the design of superconducting magnets for both accelerators and the associated detectors. The use of Nb−Ti superconducting materials allows an increase in the dipole field by up to 10 T compared with the maximum field of 2 T in a conventional magnet. The field bending of the particles in the detectors and generated by the magnets can also be increased. New materials, such as Nb3Sn and high temperature superconductor (HTS) conductors, can open the way to higher fields, in the range 13–20 T. The latest generations of fusion machines producing hot plasma also use large superconducting magnet systems.

  10. Mediastinal large cell lymphoma with sclerosis

    International Nuclear Information System (INIS)

    Franco, Sergio; Pulcheri, Wolmar; Spector, Nelson; Nucci, Marcio; Oliveira, Halley P. de; Morais, Jose Carlos; Romano, Sergio

    1995-01-01

    Five cases of primary mediastinal large-cell lymphoma with sclerosis diagnosed at the University Hospital Clementino Fraga Filho (Federal University of Rio de Janeiro) between 1986 and 1994 were identified. They were studied on clinical, morphological and immuno-histochemical grounds. Clinically, the disease was characterized by the young age of the patients, mediastinal involvement by bulky disease, and compressive symptoms. None of the patients had evidence of extra-thoracic disease at presentation. On morphological grounds they showed a mixture of immunoblasts and large follicular center cells with sclerosis. Three of five cases proved to be of B-cell origin. Four of five patients were treated with chemotherapy: cases 1 and 2 with MACOP-B, and cases 3 and 4 with Pro-MACE-cytaBOM and consolidation radiation therapy. All the patients achieved a complete remission, and are alive, free of disease, with a follow-up of 1 to 8 years. (author). 28 refs., 8 figs., 2 tabs.

  11. Seasonal changes in background levels of deuterium and oxygen-18 prove water drinking by harp seals, which affects the use of the doubly labelled water method.

    Science.gov (United States)

    Nordøy, Erling S; Lager, Anne R; Schots, Pauke C

    2017-12-01

    The aim of this study was to monitor seasonal changes in stable isotopes of pool freshwater and harp seal (Phoca groenlandica) body water, and to study whether these potential seasonal changes might bias results obtained using the doubly labelled water (DLW) method when measuring energy expenditure in animals with access to freshwater. Seasonal changes in the background levels of deuterium and oxygen-18 in the body water of four captive harp seals and in the freshwater pool in which they were kept were measured over a time period of 1 year. The seals were offered daily amounts of capelin and kept under a seasonal photoperiod of 69°N. Large seasonal variations of deuterium and oxygen-18 in the pool water were measured, and the isotope abundance in the body water showed similar seasonal changes to the pool water. This shows that the seals were continuously equilibrating with the surrounding water as a result of significant daily water drinking. Variations in background levels of deuterium and oxygen-18 in freshwater sources may be due to seasonal changes in physical processes such as precipitation and evaporation that cause fractionation of isotopes. Rapid and abrupt changes in the background levels of deuterium and oxygen-18 may complicate calculation of energy expenditure by use of the DLW method. It is therefore strongly recommended that analysis of seasonal changes in background levels of isotopes is performed before the DLW method is applied on (free-ranging) animals, and to use a control group in order to correct for changes in background levels. © 2017. Published by The Company of Biologists Ltd.
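The background sensitivity described above can be illustrated with a minimal sketch. The log-ratio turnover formula below is a generic simplification of DLW analysis (energy expenditure scales with the difference between the oxygen-18 and deuterium elimination rates), and all enrichment numbers are invented for illustration:

```python
import math

def elimination_rate(e_initial, e_final, background, days):
    """Isotope elimination rate k (per day) from enrichments measured
    above an assumed background level.

    In DLW analysis this is computed separately for oxygen-18 (k_o) and
    deuterium (k_d); CO2 production, and hence energy expenditure,
    scales with (k_o - k_d), so an error in the assumed background
    propagates directly into the energy estimate.
    """
    return math.log((e_initial - background) / (e_final - background)) / days

# Invented enrichments (ppm): the same two samples yield different
# turnover rates depending on which background is assumed.
k_low_bg = elimination_rate(2000.0, 500.0, 145.0, 7.0)
k_high_bg = elimination_rate(2000.0, 500.0, 155.0, 7.0)
```

With a drifting background, as measured in the seals' pool water, the appropriate value differs between dosing and recapture, which is why a control group is recommended.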

  12. Authentication of primordial characteristics of the CLBL-1 cell line prove the integrity of a canine B-cell lymphoma in a murine in vivo model.

    Directory of Open Access Journals (Sweden)

    Barbara C Rütgen

    Full Text Available Cell lines are key tools in cancer research, allowing the generation of neoplasias in animal models resembling the initial tumours and able to mimic the original neoplasias closely in vivo. Canine lymphoma is the major hematopoietic malignancy in dogs and is considered a valuable spontaneous large animal model for human Non-Hodgkin's Lymphoma (NHL). Herein we describe the establishment and characterisation of an in vivo model using the canine B-cell lymphoma cell line CLBL-1, analysing the stability of the induced tumours and the ability to resemble the original material. CLBL-1 was injected into Rag2(-/-)γ(c)(-/-) mice. The generated tumor material was analysed by immunophenotyping and histopathology and used to establish the cell line CLBL-1M. Both cell lines were karyotyped for detection of chromosomal aberrations. Additionally, CLBL-1 was stimulated with IL-2 and DSP30 as described for primary canine B-cell lymphomas and NHL to examine the stimulatory effect on cell proliferation. CLBL-1 in vivo application resulted in lymphoma-like disease and tumor formation. Immunophenotypic analysis of tumorous material showed expression of CD45(+), MHCII(+), CD11a(+) and CD79αcy(+). PARR analysis showed positivity for IgH, indicating a monoclonal character. These cytogenetic, molecular, immunophenotypical and histological characterisations of the in vivo model reveal that the induced tumours and the cell line generated from them closely resemble the original material. After DSP30 and IL-2 stimulation, CLBL-1 responded in the same way as primary material. The herein described CLBL-1 in vivo model provides a highly stable tool for B-cell lymphoma research in veterinary and human medicine, allowing various further in vivo studies.

  13. Authentication of primordial characteristics of the CLBL-1 cell line prove the integrity of a canine B-cell lymphoma in a murine in vivo model.

    Science.gov (United States)

    Rütgen, Barbara C; Willenbrock, Saskia; Reimann-Berg, Nicola; Walter, Ingrid; Fuchs-Baumgartinger, Andrea; Wagner, Siegfried; Kovacic, Boris; Essler, Sabine E; Schwendenwein, Ilse; Nolte, Ingo; Saalmüller, Armin; Murua Escobar, Hugo

    2012-01-01

    Cell lines are key tools in cancer research allowing the generation of neoplasias in animal models resembling the initial tumours able to mimic the original neoplasias closely in vivo. Canine lymphoma is the major hematopoietic malignancy in dogs and considered as a valuable spontaneous large animal model for human Non-Hodgkin's Lymphoma (NHL). Herein we describe the establishment and characterisation of an in vivo model using the canine B-cell lymphoma cell line CLBL-1 analysing the stability of the induced tumours and the ability to resemble the original material. CLBL-1 was injected into Rag2(-/-)γ(c) (-/-) mice. The generated tumor material was analysed by immunophenotyping and histopathology and used to establish the cell line CLBL-1M. Both cell lines were karyotyped for detection of chromosomal aberrations. Additionally, CLBL-1 was stimulated with IL-2 and DSP30 as described for primary canine B-cell lymphomas and NHL to examine the stimulatory effect on cell proliferation. CLBL-1 in vivo application resulted in lymphoma-like disease and tumor formation. Immunophenotypic analysis of tumorous material showed expression of CD45(+), MHCII(+), CD11a(+) and CD79αcy(+). PARR analysis showed positivity for IgH indicating a monoclonal character. These cytogenetic, molecular, immunophenotypical and histological characterisations of the in vivo model reveal that the induced tumours and thereof generated cell line resemble closely the original material. After DSP30 and IL-2 stimulation, CLBL-1 showed to respond in the same way as primary material. The herein described CLBL-1 in vivo model provides a highly stable tool for B-cell lymphoma research in veterinary and human medicine allowing various further in vivo studies.

  14. Gravitation on large scales

    Science.gov (United States)

    Giraud, E.

    found to be E(Γ, r) = (Γ²/G) r. 8) A quantized model is deduced from a Schrödinger-type equation -D² d²Ψ(r)/dr² = [E - GM/r] Ψ(r), where D² is the product of the energy Γ M^(1/2) by the square of the radius r at which GM/r = Γ_f M^(1/2). The boundary conditions are given by Ψ(0) = 0 and the effective potential. 9) The data are in agreement with the hypothesis of quantization, but that hypothesis is not proved because, the mass-to-light ratio being a ''free'' variable, it is always possible to shift a Γ-curve out of its best ''energy level''. However, if one moves a Γ-fit from an ''energy level'' to the next, the fitting of the curve becomes clearly poorer. 10) The Newtonian mass-to-light ratios of Class I galaxies range from ~7 to ~75. The mass-to-light ratios of the same objects deduced from the Γ-dynamics are reduced to 1.1. The Γ-dynamics is sensitive to the integrated mass through the term Γ M^(1/2), and to the mass and density through the Newtonian term GM/r. This kind of coupling is particularly efficient in galaxies like NGC 1560, whose rotation curve shows conspicuous structure.

  15. Large capacity storage of integrated objects before change blindness.

    Science.gov (United States)

    Landman, Rogier; Spekreijse, Henk; Lamme, Victor A F

    2003-01-01

    Normal people have a strikingly low ability to detect changes in a visual scene. This has been taken as evidence that the brain represents only a few objects at a time, namely those currently in the focus of attention. In the present study, subjects were asked to detect changes in the orientation of rectangular figures in a textured display across a 1600 ms gray interval. In the first experiment, change detection improved when the location of a possible change was cued during the interval. The cue remained effective during the entire interval, but after the interval, it was ineffective, suggesting that an initially large representation was overwritten by the post-change display. To control for an effect of light intensity during the interval on the decay of the representation, we compared performance with a gray or a white interval screen in a second experiment. We found no difference between these conditions. In the third experiment, attention was occasionally misdirected during the interval by first cueing the wrong figure, before cueing the correct figure. This did not compromise performance compared to a single cue, indicating that when an item is attentionally selected, the representation of yet unchosen items remains available. In the fourth experiment, the cue was shown to be effective when changes in figure size and orientation were randomly mixed. At the time the cue appeared, subjects could not know whether size or orientation would change, therefore these results suggest that the representation contains features in their 'bound' state. Together, these findings indicate that change blindness involves overwriting of a large capacity representation by the post-change display.

  16. Large forging manufacturing process

    Science.gov (United States)

    Thamboo, Samuel V.; Yang, Ling

    2002-01-01

    A process for forging large components of Alloy 718 material so that the components do not exhibit abnormal grain growth includes the steps of: a) providing a billet with an average grain size between ASTM 0 and ASTM 3; b) heating the billet to a temperature of between 1750 °F and 1800 °F; c) upsetting the billet to obtain a component part with a minimum strain of 0.125 in at least selected areas of the part; d) reheating the component part to a temperature between 1750 °F and 1800 °F; e) upsetting the component part to a final configuration such that said selected areas receive no strains between 0.01 and 0.125; f) solution treating the component part at a temperature of between 1725 °F and 1750 °F; and g) aging the component part over predetermined times at different temperatures. A modified process achieves abnormal grain growth in selected areas of a component where desirable.

  17. Large scale tracking algorithms

    Energy Technology Data Exchange (ETDEWEB)

    Hansen, Ross L. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Love, Joshua Alan [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Melgaard, David Kennett [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Karelitz, David B. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Pitts, Todd Alan [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Zollweg, Joshua David [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Anderson, Dylan Z. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Nandy, Prabal [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Whitlow, Gary L. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Bender, Daniel A. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Byrne, Raymond Harry [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States)

    2015-01-01

    Low signal-to-noise data processing algorithms for improved detection, tracking, discrimination and situational threat assessment are a key research challenge. As sensor technologies progress, the number of pixels will increase significantly. This will result in increased resolution, which could improve object discrimination, but unfortunately, will also result in a significant increase in the number of potential targets to track. Many tracking techniques, like multi-hypothesis trackers, suffer from a combinatorial explosion as the number of potential targets increases. As the resolution increases, the phenomenology applied towards detection algorithms also changes. For low resolution sensors, "blob" tracking is the norm. For higher resolution data, additional information may be employed in the detection and classification steps. The most challenging scenarios are those where the targets cannot be fully resolved, yet must be tracked and distinguished from neighboring closely spaced objects. Tracking vehicles in an urban environment is an example of such a challenging scenario. This report evaluates several potential tracking algorithms for large-scale tracking in an urban environment.
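As a concrete baseline for the combinatorial issue mentioned above, a greedy nearest-neighbor associator (a deliberately simple stand-in for the multi-hypothesis trackers discussed in the report; all names here are illustrative) compares every track against every detection, so even this simplest scheme costs O(n·m) per frame:

```python
import math

def greedy_nearest_neighbor(tracks, detections, gate=5.0):
    """Associate each track with its nearest unused detection within a gate.

    tracks, detections: lists of (x, y) positions.
    Returns {track_index: detection_index or None}.  This is a greedy
    one-pass association; multi-hypothesis trackers instead keep many
    competing association hypotheses alive, which is what explodes
    combinatorially as the number of potential targets grows.
    """
    assignments = {}
    used = set()
    for ti, t in enumerate(tracks):
        best, best_dist = None, gate
        for di, d in enumerate(detections):
            if di in used:
                continue
            dist = math.hypot(t[0] - d[0], t[1] - d[1])
            if dist < best_dist:
                best, best_dist = di, dist
        assignments[ti] = best
        if best is not None:
            used.add(best)
    return assignments
```

For closely spaced objects the greedy choice is exactly where this baseline fails, motivating the hypothesis-based approaches evaluated in the report.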

  18. The large binocular telescope.

    Science.gov (United States)

    Hill, John M

    2010-06-01

    The Large Binocular Telescope (LBT) Observatory is a collaboration among institutions in Arizona, Germany, Italy, Indiana, Minnesota, Ohio, and Virginia. The telescope on Mount Graham in Southeastern Arizona uses two 8.4 m diameter primary mirrors mounted side by side. A unique feature of the LBT is that the light from the two Gregorian telescope sides can be combined to produce phased-array imaging of an extended field. This cophased imaging along with adaptive optics gives the telescope the diffraction-limited resolution of a 22.65 m aperture and a collecting area equivalent to an 11.8 m circular aperture. This paper describes the design, construction, and commissioning of this unique telescope. We report some sample astronomical results with the prime focus cameras. We comment on some of the technical challenges and solutions. The telescope uses two F/15 adaptive secondaries to correct atmospheric turbulence. The first of these adaptive mirrors has completed final system testing in Firenze, Italy, and is planned to be at the telescope by Spring 2010.

  19. Large Format Radiographic Imaging

    International Nuclear Information System (INIS)

    Rohrer, J. S.; Stewart, Lacey; Wilke, M. D.; King, N. S.; Baker A, S.; Lewis, Wilfred

    1999-01-01

    Radiographic imaging continues to be a key diagnostic in many areas at Los Alamos National Laboratory (LANL). Radiographic recording systems have taken on many forms, from high repetition-rate, gated systems to film recording and storage phosphors. Some systems are designed for synchronization to an accelerator while others may be single shot or may record a frame sequence in a dynamic radiography experiment. While film recording remains a reliable standby in the radiographic community, there is growing interest in investigating electronic recording for many applications. The advantages of real-time access to remote data acquisition are highly attractive. Cooled CCD camera systems are capable of providing greater sensitivity with improved signal-to-noise ratio. This paper begins with a review of performance characteristics of the Bechtel Nevada large format imaging system, a gated system capable of viewing scintillators up to 300 mm in diameter. We then examine configuration alternatives in lens-coupled and fiber-optically coupled electro-optical recording systems. Areas of investigation include tradeoffs between fiber optic and lens coupling, methods of image magnification, and spectral matching from scintillator to CCD camera. Key performance features discussed include field of view, resolution, sensitivity, dynamic range, and system noise characteristics.

  20. Large Ventral Hernia

    Directory of Open Access Journals (Sweden)

    Meryl Abrams, MD

    2018-04-01

    Full Text Available History of present illness: A 46-year-old female presented to the emergency department (ED) with diffuse abdominal pain and three days of poor oral intake associated with non-bilious, non-bloody vomiting. Initial vital signs consisted of a mild resting tachycardia of 111 with a temperature of 38.0 degrees Celsius (°C). On examination, the patient had a large pannus extending to the knees, which contained a hernia. She was tender in this region on examination. Laboratory values included normal serum chemistries and a mild leukocytosis of 12.2. The patient reports that her abdomen had been enlarging over the previous 8 years but had not been painful until 3 days prior to presentation. The patient had no associated fever, chills, diarrhea, constipation, chest pain or shortness of breath. Significant findings: Computed tomography (CT) scan with intravenous (IV) contrast of the abdomen and pelvis demonstrated a large pannus containing a ventral hernia with abdominal contents extending below the knees (white circle), elongation of mesenteric vessels to accommodate abdominal contents outside of the abdomen (white arrow), and air-fluid levels (white arrow) indicating a small bowel obstruction. Discussion: Hernias are a common chief complaint seen in the emergency department. The estimated lifetime risk of a spontaneous abdominal hernia is 5%.1 The most common type of hernia is inguinal, while the next most common type of hernia is femoral, which are more common in women.1 Ventral hernias can be epigastric, incisional, or primary abdominal. An asymptomatic, reducible hernia can be followed up as an outpatient with a general surgeon for elective repair.2 Hernias become problematic when they are either incarcerated or strangulated. A hernia is incarcerated when the hernia is irreducible and strangulated when its blood supply is compromised.
A complicated hernia, especially strangulated, can have a mortality of greater than 50%.1 It is key to perform a thorough history

  1. Large-scale agent-based social simulation : A study on epidemic prediction and control

    NARCIS (Netherlands)

    Zhang, M.

    2016-01-01

    Large-scale agent-based social simulation is gradually proving to be a versatile methodological approach for studying human societies, which could make contributions from policy making in social science, to distributed artificial intelligence and agent technology in computer science, and to theory

  2. Large Deviations for the Annealed Ising Model on Inhomogeneous Random Graphs: Spins and Degrees

    Science.gov (United States)

    Dommers, Sander; Giardinà, Cristian; Giberti, Claudio; Hofstad, Remco van der

    2018-04-01

    We prove a large deviations principle for the total spin and the number of edges under the annealed Ising measure on generalized random graphs. We also give detailed results on how the annealing over the Ising model changes the degrees of the vertices in the graph and show how it gives rise to interesting correlated random graphs.
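For readers less familiar with the terminology, the generic statement of a large deviations principle can be written as follows (this is the standard textbook definition, not the paper's specific result):

```latex
% A sequence of probability measures (\mu_n) satisfies an LDP with
% rate function I and speed n if, for every Borel set A,
-\inf_{x \in A^\circ} I(x)
  \;\le\; \liminf_{n\to\infty} \frac{1}{n} \log \mu_n(A)
  \;\le\; \limsup_{n\to\infty} \frac{1}{n} \log \mu_n(A)
  \;\le\; -\inf_{x \in \bar{A}} I(x).
```

Informally, μ_n(A) ≍ e^{−n inf_A I}; in this paper's setting, μ_n is the law of the suitably normalized total spin (or edge count) under the annealed Ising measure on the random graph.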

  3. Some effects of integrated production planning in large-scale kitchens

    DEFF Research Database (Denmark)

    Engelund, Eva Høy; Friis, Alan; Jacobsen, Peter

    2005-01-01

    Integrated production planning in large-scale kitchens proves advantageous for increasing the overall quality of the food produced and the flexibility in terms of a diverse food supply. The aim is to increase the flexibility and the variability in the production as well as the focus on freshness ...

  4. Large Deviations for Stochastic Tamed 3D Navier-Stokes Equations

    International Nuclear Information System (INIS)

    Roeckner, Michael; Zhang, Tusheng; Zhang Xicheng

    2010-01-01

    In this paper, using weak convergence method, we prove a large deviation principle of Freidlin-Wentzell type for the stochastic tamed 3D Navier-Stokes equations driven by multiplicative noise, which was investigated in (Roeckner and Zhang in Probab. Theory Relat. Fields 145(1-2), 211-267, 2009).

  5. Large deviations for the Fleming-Viot process with neutral mutation and selection

    OpenAIRE

    Dawson, Donald; Feng, Shui

    1998-01-01

    Large deviation principles are established for the Fleming-Viot processes with neutral mutation and selection, and the corresponding equilibrium measures as the sampling rate goes to 0. All results are first proved for the finite allele model, and then generalized, through the projective limit technique, to the infinite allele model. Explicit expressions are obtained for the rate functions.

  6. Energy Dynamics of an Infinitely Large Offshore Wind Farm

    DEFF Research Database (Denmark)

    Frandsen, Sten Tronæs; Barthelmie, R.J.; Pryor, S.C.

    … particularly in the near-term, can be expected in the higher resource, moderate water depths of the North Sea rather than the Mediterranean. There should therefore be significant interest in understanding the energy dynamics of the infinitely large wind farm – how wakes behave and whether the extraction of energy by wind turbines over a large area has a significant and lasting impact on the atmospheric boundary layer. Here we focus on developing understanding of the infinite wind farm through a combination of theoretical considerations, data analysis and modeling. Initial evaluation of power losses due … is of about the same magnitude as for the infinitely large wind farm. We will examine whether this can be proved theoretically or is indicated by data currently available. We will also evaluate whether energy extraction at the likely scale of development in European Seas can be expected to modulate …

  7. Multipodal Structure and Phase Transitions in Large Constrained Graphs

    Science.gov (United States)

    Kenyon, Richard; Radin, Charles; Ren, Kui; Sadun, Lorenzo

    2017-07-01

    We study the asymptotics of large, simple, labeled graphs constrained by the densities of two subgraphs. It was recently conjectured that for all feasible values of the densities most such graphs have a simple structure. Here we prove this in the special case where the densities are those of edges and of k-star subgraphs, k ≥ 2 fixed. We prove that under such constraints graphs are "multipodal": asymptotically in the number of vertices there is a partition of the vertices into M < ∞ subsets V_1, V_2, …, V_M, and a set of well-defined probabilities g_{ij} of an edge between any v_i ∈ V_i and v_j ∈ V_j. For 2 ≤ k ≤ 30 we determine the phase space: the combinations of edge and k-star densities achievable asymptotically. For these models there are special points on the boundary of the phase space with nonunique asymptotic (graphon) structure; for the 2-star model we prove that the nonuniqueness extends to entropy maximizers in the interior of the phase space.

  8. Solving Large Clustering Problems with Meta-Heuristic Search

    DEFF Research Database (Denmark)

    Turkensteen, Marcel; Andersen, Kim Allan; Bang-Jensen, Jørgen

    In Clustering Problems, groups of similar subjects are to be retrieved from data sets. In this paper, Clustering Problems with the frequently used Minimum Sum-of-Squares Criterion are solved using meta-heuristic search. Tabu search has proved to be a successful methodology for solving optimization problems, but applications to large clustering problems are rare. The simulated annealing heuristic has mainly been applied to relatively small instances. In this paper, we implement tabu search and simulated annealing approaches and compare them to the commonly used k-means approach. We find that the meta-heuristic …
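As a concrete reference point, the minimum sum-of-squares criterion and the k-means baseline the authors compare against can be sketched as follows (a toy 2D implementation with random restarts; not the authors' code):

```python
import random

def kmeans(points, k, iters=100, seed=0):
    """Lloyd's k-means for the minimum sum-of-squares criterion (2D points)."""
    rng = random.Random(seed)
    centers = rng.sample(points, k)
    for _ in range(iters):
        # assignment step: each point goes to its nearest center
        clusters = [[] for _ in range(k)]
        for p in points:
            j = min(range(k),
                    key=lambda c: (p[0] - centers[c][0]) ** 2 + (p[1] - centers[c][1]) ** 2)
            clusters[j].append(p)
        # update step: move each center to the centroid of its cluster
        new = [(sum(p[0] for p in cl) / len(cl), sum(p[1] for p in cl) / len(cl))
               if cl else centers[j] for j, cl in enumerate(clusters)]
        if new == centers:  # converged to a local optimum
            break
        centers = new
    # minimum sum-of-squares cost of the final assignment
    cost = sum(min((p[0] - c[0]) ** 2 + (p[1] - c[1]) ** 2 for c in centers)
               for p in points)
    return centers, cost

def kmeans_restarts(points, k, restarts=10):
    """Best of several random initializations; meta-heuristics aim to beat this."""
    return min((kmeans(points, k, seed=s) for s in range(restarts)),
               key=lambda t: t[1])
```

k-means only finds a local optimum of the sum-of-squares objective; tabu search and simulated annealing accept temporarily worse solutions in order to escape such local optima, which is the comparison the abstract describes.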

  9. Central limit theorems for large graphs: Method of quantum decomposition

    International Nuclear Information System (INIS)

    Hashimoto, Yukihiro; Hora, Akihito; Obata, Nobuaki

    2003-01-01

    A new method is proposed for investigating spectral distribution of the combinatorial Laplacian (adjacency matrix) of a large regular graph on the basis of quantum decomposition and quantum central limit theorem. General results are proved for Cayley graphs of discrete groups and for distance-regular graphs. The Coxeter groups and the Johnson graphs are discussed in detail by way of illustration. In particular, the limit distributions obtained from the Johnson graphs are characterized by the Meixner polynomials which form a one-parameter deformation of the Laguerre polynomials

  10. Large Time Behavior of the Vlasov-Poisson-Boltzmann System

    Directory of Open Access Journals (Sweden)

    Li Li

    2013-01-01

    Full Text Available The motion of dilute charged particles can be modeled by the Vlasov-Poisson-Boltzmann (VPB) system. We study the large time stability of the VPB system. To be precise, we prove that as time goes to infinity, the solution of the VPB system tends to the global Maxwellian state at a rate O(t^{-∞}), using a method developed for the Boltzmann equation without force in the work of Desvillettes and Villani (2005). The improvement of the present paper is the removal of the condition on the parameter λ as in the work of Li (2008).

  11. A project of X-ray hardening correction in large ICT

    International Nuclear Information System (INIS)

    Fang Min; Liu Yinong; Ni Jianping

    2005-01-01

    This paper presents a means of polychromatic X-ray beam hardening correction using a standard function to transform the polychromatic projection to monochromatic projection in large Industrial Computed Tomography (ICT). Some parameters were defined to verify the validity of hardening correction in large ICT and optimized. Simulated experiments were used to prove that without prior knowledge of the composition of the scanned object, the correction method using monochromatic reconstruction arithmetic could remove beam hardening artifact greatly. (authors)
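The linearization idea behind such a correction can be illustrated with a toy two-energy beam (all numbers here are hypothetical; the paper's actual standard function and spectrum are not reproduced):

```python
import numpy as np

# hypothetical two-energy spectrum: weights and attenuation coefficients (1/cm)
w = np.array([0.6, 0.4])
mu = np.array([0.8, 0.3])

def poly_projection(t):
    """Polychromatic projection -ln(I/I0) for thickness t; nonlinear in t (beam hardening)."""
    return -np.log(w[0] * np.exp(-mu[0] * t) + w[1] * np.exp(-mu[1] * t))

# calibration with a step wedge of known thicknesses
t_cal = np.linspace(0.0, 10.0, 200)
p_cal = poly_projection(t_cal)          # monotonically increasing in t

mu_ref = 0.5                            # reference monochromatic coefficient
m_cal = mu_ref * t_cal                  # target monochromatic projection

def correct(p):
    """Map a measured polychromatic projection to its monochromatic equivalent."""
    return np.interp(p, p_cal, m_cal)

t_test = np.array([2.0, 5.0, 8.0])
p_corr = correct(poly_projection(t_test))   # approximately mu_ref * t_test, i.e. linear in t
```

Here the "standard function" is realized as a tabulated calibration transform; the paper fits an analytic function instead, but the role is the same: after correction the projection is linear in material thickness, so monochromatic reconstruction arithmetic removes cupping and streak artifacts.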

  12. Unification, small and large

    Energy Technology Data Exchange (ETDEWEB)

    Fritzsch, Harald

    1993-04-15

    Full text: Fruitful exchanges between particle physics, astrophysics and cosmology have become a common feature in the last decade. In January, Coral Gables near Miami was the stage for a 'Unified Symmetry in the Small and the Large' meeting. Coral Gables is a famous physics venue. In January 1964, the year that the quark model of hadrons emerged, Behram Kursunoglu initiated a series of particle physics meetings that continued for 20 years and formed a regular focus for this development. The final such meeting was in 1983, coinciding with both the 80th birthday of field theory pioneer Paul Dirac, who worked in Florida towards the end of his career, and the discovery of the W bosons at CERN. The resurrected Coral Gables meeting began with historical accounts of the emergence of Big Bang cosmology, by Ralph Alpher and Robert Herman, while Andrei Linde proposed our expanding universe as a small part of a stationary system, infinite both in space and in time. The observational status of Big Bang cosmology was reviewed by Bruce Partridge, John Mather and Martin Harwit, emphasizing the cosmic background radiation, whose temperature is now measured by the COBE satellite detectors to 2.726 ± 0.010 K. The tiny fluctuations observed by COBE pose problems for standard cold dark matter models. Edward ('Rocky') Kolb reported on new studies of the electroweak phase transition, based on an analogy with the physics of liquid crystals. Richard Holman discussed the fate of global symmetries at energies near the Planck (grand unification) energy, and Paul Steinhardt talked about tensorial and scalar metric fluctuations in the light of the COBE results. Anthony Tyson gave an impressive description of dark matter studies using gravitational lensing, now emerging as a unique tool for indirectly observing intervening dark matter. A neutrino mass of 10 electronvolts could account for observed dark matter distributions, but fails to provide the necessary seeds for galaxy formation.

  14. Laboratory for Large Data Research

    Data.gov (United States)

    Federal Laboratory Consortium — FUNCTION: The Laboratory for Large Data Research (LDR) addresses a critical need to rapidly prototype shared, unified access to large amounts of data across both the...

  15. Large-D gravity and low-D strings.

    Science.gov (United States)

    Emparan, Roberto; Grumiller, Daniel; Tanabe, Kentaro

    2013-06-21

    We show that in the limit of a large number of dimensions a wide class of nonextremal neutral black holes has a universal near-horizon limit. The limiting geometry is the two-dimensional black hole of string theory with a two-dimensional target space. Its conformal symmetry explains the properties of massless scalars found recently in the large-D limit. For black branes with string charges, the near-horizon geometry is that of the three-dimensional black strings of Horne and Horowitz. The analogies between the α' expansion in string theory and the large-D expansion in gravity suggest a possible effective string description of the large-D limit of black holes. We comment on applications to several subjects, in particular to the problem of critical collapse.

  16. Neural networks prove effective at NOx reduction

    Energy Technology Data Exchange (ETDEWEB)

    Radl, B.J. [Pegasus Technologies, Mentor, OH (USA)

    2000-05-01

    The availability of low cost computer hardware and software is opening up possibilities for the use of artificial intelligence concepts, notably neural networks, in power plant control applications, delivering lower costs, greater efficiencies and reduced emissions. One example of a neural network system is the NeuSIGHT combustion optimisation system, developed by Pegasus Technologies, a subsidiary of KFx Inc. It can help reduce NOx emissions, improve heat rate and enable either deferral or elimination of capital expenditures on other NOx control technologies, such as low NOx burners, SNCR and SCR. This paper illustrates these benefits using three recent case studies. 4 figs.

  17. How to prove the existence of metabolons?

    DEFF Research Database (Denmark)

    Bassard, Jean-Étienne André; Halkier, Barbara Ann

    2017-01-01

    Sequential enzymes in biosynthetic pathways are organized in metabolons. It is challenging to provide experimental evidence for the existence of metabolons as biosynthetic pathways are composed of highly dynamic protein–protein interactions. Many different methods are being applied, each with str...

  18. Proving maintenance practices at France's CETIC facility

    International Nuclear Information System (INIS)

    Anon.

    1987-01-01

    CETIC, a PWR maintenance testing, training and validation centre became operational in September 1986. It is designed to meet the following basic requirements: development of plant maintenance processes to reduce work time, validation of tools for use during maintenance, training and qualification of teams for performing high-technology, high-risk operations in nuclear power plants. (U.K.)

  19. Pre-wired systems prove their worth.

    Science.gov (United States)

    2012-03-01

    The 'new generation' of modular wiring systems from Apex Wiring Solutions have been specified for two of the world's foremost teaching hospitals - the Royal London and St Bartholomew's Hospital, as part of a pounds sterling 1 billion redevelopment project, to cut electrical installation times, reduce on-site waste, and provide a pre-wired, factory-tested, power and lighting system. HEJ reports.

  20. NUPEC proves reliability of LWR fuel assemblies

    International Nuclear Information System (INIS)

    Anon.

    1987-01-01

    It is very important in assuring the safety of nuclear reactors to confirm the reliability of fuel assemblies. The test program of the Nuclear Power Engineering Center on the reliability of fuel assemblies has verified the high performance and reliability of Japanese LWR fuels, and confirmed the propriety of their design and fabrication. This claim is based on the data obtained from the fuel assemblies irradiated in commercial reactors. The NUPEC program includes irradiation test which has been conducted for 11 years since fiscal 1976, and the maximum thermal loading test using the out of pile test facilities simulating a real reactor which has been continued since fiscal 1978. The irradiation test on BWR fuel assemblies in No.3 reactor in Fukushima No.1 Nuclear Power Station, Tokyo Electric Power Co., Inc., and on PWR fuel assemblies in No.3 reactor in Mihama Power Station, Kansai Electric Power Co., Inc., and the maximum thermal loading test on BWR and PWR fuel assemblies are reported. The series of postirradiation examination of the fuel assemblies used for commercial reactors was conducted for the first time in Japan, and the highly systematic data on 27 items were obtained. (Kako, I.)

  1. Vehicle Test Facilities at Aberdeen Proving Ground

    Science.gov (United States)

    1981-07-06

    warehouse and rough terrain forklifts. Two 5-ton-capacity manual chain hoists at the rear of the table regulate its slope from 0 to 40 percent. The overall...Capacity at 24-Inch Load Center. 5. TOP/HTP 2-2-608, Braking, Wheeled Vehicles, 15 January 1971. 6. TOP 2-2-603, Vehicle Fuel Consumption, 1 November 1977.

  3. Has the Effect of Mesotherapy Been Proved?

    Directory of Open Access Journals (Sweden)

    Gonca Gökdemir

    2009-06-01

    Full Text Available Mesotherapy is a medical technique that consists of intracutaneous or subcutaneous injection into the diseased area. It has recently become a popular treatment method in cosmetic dermatology. Mesotherapy has been used for skin rejuvenation, cellulite treatment and localized fat reduction. Substances used in mesotherapy include plant extracts, homeopathic agents, vitamins, and some pharmaceuticals. The effects of these agents are not completely known, and there are few experimental and clinical studies evaluating the efficacy of mesotherapy in any form. This report reviews the literature on the effects of compounds commonly used in mesotherapy.

  4. Proving the ecosystem value through hydrological modelling

    International Nuclear Information System (INIS)

    Dorner, W; Spachinger, K; Metzka, R; Porter, M

    2008-01-01

    Ecosystems provide valuable functions. Natural floodplains and river structures likewise offer different types of ecosystem functions, such as habitat, recreation and natural detention. From an economic standpoint, the loss (or rehabilitation) of these natural systems and the services they provide can be valued as a damage (or benefit). Consequently these natural goods and services must be economically valued in project assessments, e.g. cost-benefit analysis or cost comparison. Especially in smaller catchments and river systems, there is significant evidence that natural flood detention reduces flood risk and contributes to flood protection. Several research projects have evaluated the mitigating effect of land use, river training and the loss of natural floodplains on the development, peak and volume of floods. The presented project analyses the hypothesis that ignoring natural detention and hydrological ecosystem services could result in economically inefficient solutions for flood protection and mitigation. In test areas, subcatchments of the Danube in Germany, a combination of hydrological and hydrodynamic models with economic evaluation techniques was applied. Different forms of land use, river structure and flood protection measures were assessed and compared from a hydrological and economic point of view. A hydrodynamic model was used to simulate flows, to assess the extent of flood-affected areas and damages to buildings and infrastructure, and to investigate the impacts of levees and river structure on a local scale. These model results provided the basis for an economic assessment. Different economic valuation techniques, such as flood damage functions, the cost comparison method and the substitution approach, were used to compare the outcomes of different hydrological scenarios from an economic point of view and to value the ecosystem service. The results give significant evidence that natural detention must be evaluated as part of flood mitigation projects. In addition, it can be stated that the loss of detention due to land use and dikes is an externality and results in economic inefficiencies.
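A minimal sketch of the flood-damage-function step described above (the depth-damage curve, flood depths and building value are hypothetical, not the project's data):

```python
import numpy as np

# hypothetical depth-damage curve: flood depth (m) -> damaged fraction of value
depth = np.array([0.0, 0.5, 1.0, 2.0, 3.0])
frac = np.array([0.0, 0.15, 0.35, 0.60, 0.75])

def damage(d, value):
    """Direct damage to a building of given value at flood depth d (linear interpolation)."""
    return value * np.interp(d, depth, frac)

# modeled flood depths at one building for several return periods (years)
T = np.array([10.0, 50.0, 100.0, 500.0])
d_T = np.array([0.2, 0.8, 1.4, 2.5])
value = 200_000.0

p = 1.0 / T                      # annual exceedance probabilities (decreasing)
dmg = damage(d_T, value)
# expected annual damage (EAD): trapezoidal rule over exceedance probability
ead = float(np.sum(0.5 * (dmg[:-1] + dmg[1:]) * (p[:-1] - p[1:])))
```

Running the hydrological scenarios with and without natural detention each yields an EAD; the difference between the two is one way to put an annual monetary value on the detention service.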

  5. Interval logic. Proof theory and theorem proving

    DEFF Research Database (Denmark)

    Rasmussen, Thomas Marthedal

    2002-01-01

    of a direction of an interval, and present a sound and complete Hilbert proof system for it. Because of its generality, SIL can conveniently act as a general formalism in which other interval logics can be encoded. We develop proof theory for SIL including both a sequent calculus system and a labelled natural...

  6. Historic Building Inventory, Aberdeen Proving Ground, Maryland

    Science.gov (United States)

    1982-01-01

    installation into compliance with the National Historic Preservation Act of 1966 and its amendments, and related federal laws and regulations. To this end, the...century. OLD BALTIMORE The first formal authorization for the establishment of a Court House was the 1674 Act of Assembly for the construction of a Court...official recorded meeting at the Court House was in 1692, at which Thomas Heath, innkeeper, filed suit for expenses incurred by the Justices at the 1687

  7. PROVE IDRAULICHE SU UN SEMOVENTE IRRIGUO (Hydraulic Tests on a Self-Propelled Hose-Reel Irrigation Machine)

    Directory of Open Access Journals (Sweden)

    Giuseppe Taglioli

    2007-09-01

    Full Text Available The correct management of irrigation is a well-known success factor in agriculture, also with regard to water saving. Consequently, a very common medium-sized hose-reel irrigation machine, known in Italy as a rotolone, was tested here. Several hydraulic and technical aspects were investigated in two tests on a 3 ha field: – the hydraulic distribution on the field, measured by means of 55 rain gauges; – the uniformity of the forward speed and the effectiveness of the controller; – the effect of standstill times on distribution quality; – the hydraulic performance declared by the manufacturer; – the water consumption and the manpower requirements; – the quality of irrigation. Moreover, a theoretical analysis of the best overlapping between the ranges of two adjacent sprinklers was developed. The tests have shown the importance of regulating the backward speed: without this, variations can reach 70%. The value of 85% overlap of wetted areas, recommended by manufacturers, was theoretically justified. The measured mean range, at the recommended pressure, was 10% lower than declared by the manufacturer. The hourly rain intensity was too high for the needs of the clay soil of the field. The jet spraying was coarse in relation to the clay fraction of the soil, but not for the crop (maize). The hose-reel irrigation machine examined here can maintain a high level of feasibility if some improvements are adopted.

  8. Eternity Variables to Prove Simulation of Specifications

    NARCIS (Netherlands)

    Hesselink, Wim H.

    2005-01-01

    Simulations of specifications are introduced as a unification and generalization of refinement mappings, history variables, forward simulations, prophecy variables, and backward simulations. A specification implements another specification if and only if there is a simulation from the first one to

  9. L’Inquisizione, gli indizi, le prove

    Directory of Open Access Journals (Sweden)

    Guido Dall’Olio

    2012-11-01

    Full Text Available This essay focuses on some characteristics of the inquisitorial trial, comparing it with the accusatory procedure. The inquisitorial trial, which resulted in almost inevitable torture of the accused, is certainly in our eyes an injustice and a violation of the legal rights that are essential for us in a proper procedure. However, the care with which the evidence was produced and examined by the institutions that adopted the inquisitorial procedure deserves to be emphasized, especially for the consequences it had in certain types of imaginary crime such as witchcraft.

  10. The Importance of Proving the Null

    Science.gov (United States)

    Gallistel, C. R.

    2009-01-01

    Null hypotheses are simple, precise, and theoretically important. Conventional statistical analysis cannot support them; Bayesian analysis can. The challenge in a Bayesian analysis is to formulate a suitably vague alternative, because the vaguer the alternative is (the more it spreads out the unit mass of prior probability), the more the null is…
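A minimal worked example of this point (with hypothetical numbers): for a binomial experiment, the Bayes factor for the null p = 0.5 against a maximally vague uniform alternative can be computed in closed form.

```python
from math import comb

def bayes_factor_null(k, n):
    """BF_01 for H0: p = 0.5 vs H1: p ~ Uniform(0, 1), given k successes in n trials."""
    marginal_null = comb(n, k) * 0.5 ** n      # likelihood under the point null
    # marginal likelihood under H1: integral of C(n,k) p^k (1-p)^(n-k) dp = 1 / (n + 1)
    marginal_alt = 1.0 / (n + 1)
    return marginal_null / marginal_alt

# 52 heads in 100 flips: the data favor the null over the vague alternative
bf = bayes_factor_null(52, 100)
```

A BF_01 greater than 1 quantifies support *for* the null, which a frequentist p-value can never provide; and the result depends on how vague the alternative prior is made, which is exactly the challenge the abstract describes.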

  11. Proving correctness of compilers using structured graphs

    DEFF Research Database (Denmark)

    Bahr, Patrick

    2014-01-01

    it into a compiler implementation using a graph type along with a correctness proof. The implementation and correctness proof of a compiler using a tree type without explicit jumps is simple, but yields code duplication. Our method provides a convenient way of improving such a compiler without giving up the benefits...

  12. Thirteen proves lucky number at Ipswich.

    Science.gov (United States)

    Coppard, Mark

    2008-10-01

    At one of East Anglia's largest hospitals, OCS Healthcare claims to have contributed to a "remarkable climate of change" with 13 specialist services, bringing tangible benefits to the half million people served by The Ipswich Hospital NHS Trust. The head of OCS Healthcare, Mark Coppard, describes what the company dubs an "exemplar of private sector expertise supporting public healthcare excellence". The services form part of a £1 billion redevelopment programme.

  13. Using NFC phones for proving credentials

    NARCIS (Netherlands)

    Alpár, G.; Batina, L.; Verdult, R.

    2012-01-01

    In this paper we propose a new solution for mobile payments called Tap2 technology. To use it, users need only their NFC-enabled mobile phones and credentials implemented on their smart cards. An NFC device acts like a bridge between service providers and secure elements and the secure credentials

  14. What Carroll’s Tortoise Actually Proves

    NARCIS (Netherlands)

    Wieland, J.J.W.

    2013-01-01

    Rationality requires us to have certain propositional attitudes (beliefs, intentions, etc.) given certain other attitudes that we have. Carroll's Tortoise repeatedly shows up in this discussion. Following up on Brunero (Ethical Theory Moral Pract 8:557-569, 2005), I ask what Carroll-style

  15. Quenches in large superconducting magnets

    International Nuclear Information System (INIS)

    Eberhard, P.H.; Alston-Garnjost, M.; Green, M.A.; Lecomte, P.; Smits, R.G.; Taylor, J.D.; Vuillemin, V.

    1977-08-01

    The development of large high current density superconducting magnets requires an understanding of the quench process by which the magnet goes normal. A theory which describes the quench process in large superconducting magnets is presented and compared with experimental measurements. The use of a quench theory to improve the design of large high current density superconducting magnets is discussed

  16. Large-scale sequential quadratic programming algorithms

    Energy Technology Data Exchange (ETDEWEB)

    Eldersveld, S.K.

    1992-09-01

    The problem addressed is the general nonlinear programming problem: finding a local minimizer for a nonlinear function subject to a mixture of nonlinear equality and inequality constraints. The methods studied are in the class of sequential quadratic programming (SQP) algorithms, which have previously proved successful for problems of moderate size. Our goal is to devise an SQP algorithm that is applicable to large-scale optimization problems, using sparse data structures and storing less curvature information but maintaining the property of superlinear convergence. The main features are: 1. The use of a quasi-Newton approximation to the reduced Hessian of the Lagrangian function. Only an estimate of the reduced Hessian matrix is required by our algorithm. The impact of not having available the full Hessian approximation is studied and alternative estimates are constructed. 2. The use of a transformation matrix Q. This allows the QP gradient to be computed easily when only the reduced Hessian approximation is maintained. 3. The use of a reduced-gradient form of the basis for the null space of the working set. This choice of basis is more practical than an orthogonal null-space basis for large-scale problems. The continuity condition for this choice is proven. 4. The use of incomplete solutions of quadratic programming subproblems. Certain iterates generated by an active-set method for the QP subproblem are used in place of the QP minimizer to define the search direction for the nonlinear problem. An implementation of the new algorithm has been obtained by modifying the code MINOS. Results and comparisons with MINOS and NPSOL are given for the new algorithm on a set of 92 test problems.
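For orientation, SciPy's SLSQP routine is a classic dense SQP implementation of the kind this thesis extends to large sparse problems; a small constrained example (illustrative only, not the thesis's algorithm):

```python
import numpy as np
from scipy.optimize import minimize

# minimize (x0 - 1)^2 + (x1 - 2.5)^2  subject to  x0 + x1 = 3  and  x >= 0
fun = lambda x: (x[0] - 1.0) ** 2 + (x[1] - 2.5) ** 2
cons = [{"type": "eq", "fun": lambda x: x[0] + x[1] - 3.0}]
res = minimize(fun, x0=[2.0, 0.0], method="SLSQP",
               bounds=[(0.0, None), (0.0, None)], constraints=cons)
# the minimizer is the projection of (1, 2.5) onto the line x0 + x1 = 3: (0.75, 2.25)
```

Each SQP iteration solves a quadratic programming subproblem built from a quasi-Newton approximation to the Hessian of the Lagrangian; the abstract's contribution is to keep only a reduced Hessian estimate, use sparse data structures, and accept incomplete QP solves so the approach scales to large problems.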

  17. Health impacts of large dams

    International Nuclear Information System (INIS)

    Lerer, L.B.

    1999-01-01

    Large dams have been criticized because of their negative environmental and social impacts. Public health interest largely has focused on vector-borne diseases, such as schistosomiasis, associated with reservoirs and irrigation projects. Large dams also influence health through changes in water and food security, increases in communicable diseases, and the social disruption caused by construction and involuntary resettlement. Communities living in close proximity to large dams often do not benefit from water transfer and electricity generation revenues. A comprehensive health component is required in environmental and social impact assessments for large dam projects

  18. Large deviations and idempotent probability

    CERN Document Server

    Puhalskii, Anatolii

    2001-01-01

    In the view of many probabilists, author Anatolii Puhalskii's research results stand among the most significant achievements in the modern theory of large deviations. In fact, his work marked a turning point in the depth of our understanding of the connections between the large deviation principle (LDP) and well-known methods for establishing weak convergence results. Large Deviations and Idempotent Probability expounds upon the recent methodology of building large deviation theory along the lines of weak convergence theory. The author develops an idempotent (or maxitive) probability theory, introduces idempotent analogues of martingales (maxingales), Wiener and Poisson processes, and Ito differential equations, and studies their properties. The large deviation principle for stochastic processes is formulated as a certain type of convergence of stochastic processes to idempotent processes. The author calls this large deviation convergence. The approach to establishing large deviation convergence uses novel com...

  19. Global and exponential attractors of the three dimensional viscous primitive equations of large-scale moist atmosphere

    OpenAIRE

    You, Bo; Li, Fang

    2016-01-01

    This paper is concerned with the long-time behavior of solutions for the three dimensional viscous primitive equations of large-scale moist atmosphere. We prove the existence of a global attractor for the three dimensional viscous primitive equations of large-scale moist atmosphere by asymptotic a priori estimate and construct an exponential attractor by using the smoothing property of the semigroup generated by the three dimensional viscous primitive equations of large-scale moist atmosphere...

  20. Divergence of perturbation theory in large scale structures

    Science.gov (United States)

    Pajer, Enrico; van der Woude, Drian

    2018-05-01

    We make progress towards an analytical understanding of the regime of validity of perturbation theory for large scale structures and the nature of some non-perturbative corrections. We restrict ourselves to 1D gravitational collapse, for which exact solutions before shell crossing are known. We review the convergence of perturbation theory for the power spectrum, recently proven by McQuinn and White [1], and extend it to non-Gaussian initial conditions and the bispectrum. In contrast, we prove that perturbation theory diverges for the real space two-point correlation function and for the probability density function (PDF) of the density averaged in cells and all the cumulants derived from it. We attribute these divergences to the statistical averaging intrinsic to cosmological observables, which, even on very large and "perturbative" scales, gives non-vanishing weight to all extreme fluctuations. Finally, we discuss some general properties of non-perturbative effects in real space and Fourier space.

  1. Large Hadron Collider (LHC) phenomenology, operational challenges and theoretical predictions

    CERN Document Server

    Gilles, Abelin R

    2013-01-01

    The Large Hadron Collider (LHC) is the highest-energy particle collider ever constructed and is considered "one of the great engineering milestones of mankind." It was built by the European Organization for Nuclear Research (CERN) from 1998 to 2008, with the aim of allowing physicists to test the predictions of different theories of particle physics and high-energy physics, and particularly prove or disprove the existence of the theorized Higgs boson and of the large family of new particles predicted by supersymmetric theories. In this book, the authors study the phenomenology, operational challenges and theoretical predictions of LHC. Topics discussed include neutral and charged black hole remnants at the LHC; the modified statistics approach for the thermodynamical model of multiparticle production; and astroparticle physics and cosmology in the LHC era.

  2. Explicit constructivism: a missing link in ineffective lectures?

    Science.gov (United States)

    Prakash, E S

    2010-06-01

    This study tested the possibility that interactive lectures explicitly based on activating learners' prior knowledge and driven by a series of logical questions might enhance the effectiveness of lectures. A class of 54 students doing the respiratory system course in the second year of the Bachelor of Medicine and Bachelor of Surgery program in my university was randomized to two groups to receive one of two types of lectures, "typical" lectures (n = 28, 18 women and 10 men) or "constructivist" lectures (n = 26, 19 women and 7 men), on the same topic: the regulation of respiration. Student pretest scores in the two groups were comparable (P > 0.1). Students that received the constructivist lectures did much better in the posttest conducted immediately after the lectures (6.8 +/- 3.4 for constructivist lectures vs. 4.2 +/- 2.3 for typical lectures, means +/- SD, P = 0.004). Although both types of lectures were well received, students that received the constructivist lectures appeared to have been more satisfied with their learning experience. However, on a posttest conducted 4 mo later, scores obtained by students in the two groups were not any different (6.9 +/- 3 for constructivist lectures vs. 6.9 +/- 3.7 for typical lectures, P = 0.94). This study adds to the increasing body of evidence that there is a case for the use of interactive lectures that make the construction of knowledge and understanding explicit, easy, and enjoyable to learners.

  3. Ineffective Communication in Nigeria: A Problem Associated with ...

    African Journals Online (AJOL)

    Communication is one thing. Effective communication is another. Specifically, for one to express an idea in speech or writing is one thing and expressing it as it is in one's mind is another thing. Furthermore, receiving the idea through listening or reading is one thing while understanding it, as it is intended, to be able to make ...

  4. Nevada v. Herrington: an ineffective check on the DOE

    International Nuclear Information System (INIS)

    Karkut, J.E.

    1988-01-01

    In this decision, the United States Court of Appeals for the Ninth Circuit held that Nevada was entitled to Department of Energy (DOE) funding for certain hydrologic and geologic studies of the Yucca Mountain site. This site is located in Nye County, Nevada and could be selected as America's first high-level nuclear-waste repository. The studies' purpose is to provide independent state examination of the area's repository suitability. The court applied statutory construction principles to the Nuclear Waste Policy Act of 1982 (NWPA) to reach its decision. The decision has significance for its support of states' pre-site characterization funding rights, for the manner in which the court determined that DOE was not acting within the scope of the NWPA, and for underlying concerns left unaddressed. This Note provides background for and analysis of this decision. Factors necessitating the NWPA's passage are outlined, followed by a sketch of the events leading to this lawsuit. The court's review standard and NWPA analysis based on the statute's language and underlying congressional intent are explained. The decision is then analyzed and critiqued. Finally, a perspective viewing DOE as dangerously out of touch with NWPA statutory mandates and unrestrained in the repository selection process is expressed

  5. Why do ineffective treatments seem helpful? A brief review

    Directory of Open Access Journals (Sweden)

    Hartman Steve E

    2009-10-01

Full Text Available Abstract After any therapy, when symptoms improve, healthcare providers (and patients) are tempted to award credit to treatment. Over time, a particular treatment can seem so undeniably helpful that scientific verification of efficacy is judged an inconvenient waste of time and resources. Unfortunately, practitioners' accumulated, day-to-day, informal impressions of diagnostic reliability and clinical efficacy are of limited value. To help clarify why even treatments entirely lacking in direct effect can seem helpful, I will explain why real signs and symptoms often improve, independent of treatment. Then, I will detail quirks of human perception, interpretation, and memory that often make symptoms seem improved, when they are not. I conclude that healthcare will grow to full potential only when judgments of clinical efficacy routinely are based in properly scientific, placebo-controlled, outcome analysis.

  6. Amino Acids Are an Ineffective Fertilizer for Dunaliella spp. Growth

    Directory of Open Access Journals (Sweden)

    Colin A. Murphree

    2017-05-01

Full Text Available Autotrophic microalgae are a promising bioproducts platform. However, the fundamental requirements these organisms have for nitrogen fertilizer severely limit the impact and scale of their cultivation. As an alternative to inorganic fertilizers, we investigated the possibility of using amino acids from deconstructed biomass as a nitrogen source in the genus Dunaliella. We found that only four amino acids (glutamine, histidine, cysteine, and tryptophan) rescue Dunaliella spp. growth in nitrogen depleted media, and that supplementation of these amino acids altered the metabolic profile of Dunaliella cells. Our investigations revealed that histidine is transported across the cell membrane, and that glutamine and cysteine are not transported. Rather, glutamine, cysteine, and tryptophan are degraded in solution by a set of oxidative chemical reactions, releasing ammonium that in turn supports growth. Utilization of biomass-derived amino acids is therefore not a suitable option unless additional amino acid nitrogen uptake is enabled through genetic modifications of these algae.

  7. Examining the (in)effectiveness of personalized communication

    NARCIS (Netherlands)

    Maslowska, E.; Smit, E.; van den Putte, B.; Eisend, M.; Langner, T.

    2011-01-01

    Personalized communication has become a very popular marketing strategy, but the research on its effectiveness is still limited. This study examined the persuasiveness of personalized digital newsletters in terms of increased attention, cognitive activity, evaluation, attitude, intention, and

  8. Ineffectiveness of sun awareness posters in dermatology clinics.

    Science.gov (United States)

    Jung, G W; Senthilselvan, A; Salopek, T G

    2010-06-01

Although sun awareness posters have been used in doctors' offices and clinics for decades to promote sun protective behaviour, there is no evidence of their usefulness. To investigate whether sun awareness posters lead to inquiry about skin cancer and sun protection measures, patients considered at risk for skin cancer seen at a dermatology clinic were randomly asked to complete a questionnaire designed to assess the effectiveness of three different sun awareness posters placed in patient rooms. The posters were selected on the basis of their catchy slogans and eye-appealing images, and included those featuring parental interest, sex appeal and informative advice. Only half of the patients noticed the posters (50.6%). The poster with sex appeal garnered the most attention (67.8%), followed by the informative poster (49.2%) and the parental interest poster (35.8%). Although patients who noticed a poster inquired about cutaneous cancers and sun protection practices twice as often as those who did not notice the poster, only one-tenth of such inquiries were attributed to the poster (approximately 5% of the target population). As reported in the questionnaire, the posters themselves were less effective than the advice of physicians in influencing patient attitudes towards sun protection measures. Organizations that produce and disseminate posters should look beyond focus groups when they design their posters and should consider field testing their products to ensure that they are reaching the targeted audience and are having the expected beneficial effect; otherwise their posters are simply decorative.

  9. Ineffective programme management on the delivery of health ...

    African Journals Online (AJOL)

outsourced to the Department of Public Works and the Independent Development ... achieve a common strategic or business goal ... Since programme management ... civil and structural engineering together with quantity surveying ...

  10. Ineffective crypsis in a crab spider: a prey community perspective.

    Science.gov (United States)

    Brechbühl, Rolf; Casas, Jérôme; Bacher, Sven

    2010-03-07

    Cryptic coloration is assumed to be beneficial to predators because of an increased encounter rate with unwary prey. This hypothesis is, however, very rarely, if ever, studied in the field. The aim of this study was to quantify the encounter rate and capture success of an ambush predator, in the field, as a function of its level of colour-matching with the background. We used the crab spider Misumena vatia, which varies its body colour and can thereby match the colour of the flower it hunts upon. We carried out a manipulative field experiment using a complete factorial design resulting in six different colour combinations of crab spiders and flowers differing in their degree of colour-matching. A rich and diverse set of naturally occurring insects visited the flowers while we continuously video-recorded the spider's foraging activity. This enabled us to test the crypsis, the spider avoidance and the flower visitor attraction hypotheses, all three supported by previous studies. Flower visitors of different groups either avoided crab spiders independent of colour-matching, such as solitary bees and syrphid flies, or ignored them, such as bumble-bees and honeybees. Moreover, colour-matched spiders did not have a higher encounter rate and capture success compared to the visually apparent ones. Thus, our results support the spider avoidance hypothesis, reject the two other hypotheses and uncovered a fourth behaviour: indifference to predators. Because flower visitors reacted differently, a community approach is mandatory in order to understand the function of background colour-matching in generalist predators. We discuss our results in relation to the size and sociality of the prey and in relation to the functional significance of colour change in this predator.

  11. Ineffective crypsis in a crab spider: a prey community perspective

    OpenAIRE

    Brechbühl, Rolf; Casas, Jérôme; Bacher, Sven

    2009-01-01

    Cryptic coloration is assumed to be beneficial to predators because of an increased encounter rate with unwary prey. This hypothesis is, however, very rarely, if ever, studied in the field. The aim of this study was to quantify the encounter rate and capture success of an ambush predator, in the field, as a function of its level of colour-matching with the background. We used the crab spider Misumena vatia, which varies its body colour and can thereby match the colour of the flower it hunts u...

  12. Diphtheria in Lao PDR: Insufficient Coverage or Ineffective Vaccine?

    Science.gov (United States)

    Nanthavong, Naphavanh; Black, Antony P; Nouanthong, Phonethipsavanh; Souvannaso, Chanthasone; Vilivong, Keooudomphone; Muller, Claude P; Goossens, Sylvie; Quet, Fabrice; Buisson, Yves

    2015-01-01

During late 2012 and early 2013 several outbreaks of diphtheria were notified in the North of the Lao People's Democratic Republic. The aim of this study was to determine whether the re-emergence of this vaccine-preventable disease was due to insufficient vaccination coverage or reduction of vaccine effectiveness within the affected regions. A serosurvey was conducted in the Huaphan Province on a cluster sampling of 132 children aged 12-59 months. Serum samples, socio-demographic data, nutritional status and vaccination history were collected when available. Anti-diphtheria and anti-tetanus IgG antibody levels were measured by ELISA. Overall, 63.6% of participants had detectable diphtheria antibodies and 71.2% tetanus antibodies. Factors independently associated with non-vaccination against diphtheria were the distance from the health centre (OR: 6.35 [95% CI: 1.4-28.8], p = 0.01), the Lao Theung ethnicity (OR: 12.2 [95% CI: 1.74-85.4], p = 0.01) and the lack of advice on vaccination given at birth (OR: 9.8 [95% CI: 1.5-63.8], p = 0.01), while the level of maternal education was a protective factor (OR: 0.08 [95% CI: 0.008-0.81], p = 0.03). Most respondents claimed financial difficulties as the main reason for non-vaccination. Of 55 children whose vaccination certificates stated that they were given all 3 doses of diphtheria-containing vaccine, 83.6% had diphtheria antibodies and 92.7% had tetanus antibodies. Furthermore, despite a high prevalence of stunted and underweight children (53% and 25.8%, respectively), the low levels of anti-diphtheria antibodies were not correlated to the nutritional status. Our data highlight a significant deficit in both the vaccination coverage and diphtheria vaccine effectiveness within the Huaphan Province. Technical deficiencies in the methods of storage and distribution of vaccines as well as unreliability of vaccination cards are discussed. Several hypotheses are advanced to explain such a decline in immunity against diphtheria and recommendations are provided to prevent future outbreaks.

  13. Organisational ineffectiveness: environmental shifts and the transition to crisis

    OpenAIRE

    Fischbacher-Smith, Denis

    2014-01-01

    Purpose:\\ud – The purpose of this paper is to explore the notion of effectiveness in the context of organisational crisis. It considers the “darker” side of organisational effectiveness by exploring the processes by which effectiveness can be eroded as an organisation moves from an ordered state, through a complex one, and into a state of chaos, or crisis. It brings together complementary literatures on risk, crisis management, and complexity, and uses those lenses to frame some of the key pr...

  14. Internalization of Ineffective Platinum Complex in Nanocapsules Renders It Cytotoxic

    Czech Academy of Sciences Publication Activity Database

    Vrána, Oldřich; Novohradský, Vojtěch; Medrikova, Z.; Burdikova, J.; Stuchlíková, O.; Kašpárková, Jana; Brabec, Viktor

    2016-01-01

    Roč. 22, č. 8 (2016), s. 2728-2735 ISSN 0947-6539 R&D Projects: GA ČR(CZ) GA14-21053S Institutional support: RVO:68081707 Keywords : interstrand cross-links * dna-adducts * cisplatin nanocapsules Subject RIV: BO - Biophysics Impact factor: 5.317, year: 2016

  15. Corruption and legal (in)effectiveness: an empirical investigation

    NARCIS (Netherlands)

    Herzfeld, T.; Weiss, Ch.

    2003-01-01

    Numerous studies have investigated the causes and measured the consequences of differences in corruption among countries. An effective legal system has been viewed as a key component in reducing corruption. However, estimating cross-sectional as well as panel data models, we find a significant

  16. Preoperative pain measures ineffective in outpatient abdominal surgeries.

    Science.gov (United States)

    Wright, Robert; Wright, Julia; Perry, Kyler; Wright, Daniel

    2018-05-01

The multimodality addition of preoperative gabapentin, acetaminophen, and celecoxib (GAC) and postoperative TENS has been recommended to diminish narcotics. We predict that GAC-TENS implementation will reduce recovery room time, improve pain control, reduce narcotic refills, and demonstrate usefulness of TENS treatment. A prospective study compared a control group of patients not utilizing the GAC-TENS protocol during 2015 to patients using the GAC-TENS protocol during 2016. There was less recovery room time in the control group compared to the protocol group. Postoperative day one pain control was similar between the groups. Fewer narcotic refills were noted. TENS unit satisfaction level was rated "very helpful" by 63% of patients. The results call into question the efficacy of the American Pain Society recommendations, as they increase time in the recovery room but do not decrease the quantity of narcotics used in the recovery room, nor do they improve pain satisfaction responses. Copyright © 2018 Elsevier Inc. All rights reserved.

  17. Analysis of Ineffectiveness Arising in “Investor-government” Relations

    Directory of Open Access Journals (Sweden)

    Dmytro B. Sokolovskyi

    2015-09-01

Full Text Available Purpose: This article deals with the problem of forming Pareto non-optimal norms of mutual behavior of investors and government in the process of decision-making related to financing designed to reduce risks in investment activity. Methodology: Considering the interdependent nature of interactions between related parties, game theory tools were used to model such interactions. Much attention was directed to the search for parameters of interaction leading to certain Nash equilibria in pure strategies. The formal results obtained with the model were verified by statistical analysis. Findings: Analysis showed that the rational behavior of related parties can lead to unexpected results. Powerful investors will aim to work in socially-oriented economies, whereas primarily small investors will operate in the most liberal economies with a minimum tax burden but with a higher level of risk. As for governments' behaviors, the picture is the same: small economies tend to liberalize their tax systems and to secure investment faster than powerful ones. Empirical verification based on statistical data of groups of countries generally confirmed the conclusions. These formal and logical conclusions were verified by statistical analysis of 124 countries divided into 5 groups: OECD countries, post-socialist countries, Latin American countries, APAC countries and ACP countries. The more powerful the economies covered, the stronger the interdependence between the size of economies and the tax burden, and also between total investment and the tax burden; this dependence is positive. Originality: The results obtained used Nash equilibria in pure strategies as models of behavioral norms to define the behaviors of related parties and also to explain assumptions concerning the behaviors of investors and government.

  18. The Ineffectiveness of Manual Treatment of Swimming Pools NNAJI ...

    African Journals Online (AJOL)

    Michael Horsfall

    there was a level of dissatisfaction among the swimmers. Some of ... period, the COD was above 80mg/l, the pH was between 6.2 and 7.1 as against 7.2 to 7.8 recommended by .... Fig 6: Chemical oxygen Demand of the Pool for Three Months.

  19. Large-scale grid management

    International Nuclear Information System (INIS)

    Langdal, Bjoern Inge; Eggen, Arnt Ove

    2003-01-01

The network companies in the Norwegian electricity industry now have to establish a large-scale network management, a concept essentially characterized by (1) broader focus (Broad Band, Multi Utility,...) and (2) bigger units with large networks and more customers. Research done by SINTEF Energy Research shows so far that the approaches within large-scale network management may be structured according to three main challenges: centralization, decentralization and outsourcing. The article is part of a planned series

  20. Large-scale solar purchasing

    International Nuclear Information System (INIS)

    1999-01-01

The principal objective of the project was to participate in the definition of a new IEA task concerning solar procurement ("the Task") and to assess whether involvement in the task would be in the interest of the UK active solar heating industry. The project also aimed to assess the importance of large scale solar purchasing to UK active solar heating market development and to evaluate the level of interest in large scale solar purchasing amongst potential large scale purchasers (in particular housing associations and housing developers). A further aim of the project was to consider means of stimulating large scale active solar heating purchasing activity within the UK. (author)

  1. Physics with large extra dimensions

    Indian Academy of Sciences (India)

    can then be accounted by the existence of large internal dimensions, in the sub- ... strongly coupled heterotic theory with one large dimension is described by a weakly ..... one additional U(1) factor corresponding to an extra 'U(1)' D-brane is ...

  2. MPQS with three large primes

    NARCIS (Netherlands)

    Leyland, P.; Lenstra, A.K.; Dodson, B.; Muffett, A.; Wagstaff, S.; Fieker, C.; Kohel, D.R.

    2002-01-01

    We report the factorization of a 135-digit integer by the triple-large-prime variation of the multiple polynomial quadratic sieve. Previous workers [6][10] had suggested that using more than two large primes would be counterproductive, because of the greatly increased number of false reports from

  3. Querying Large Biological Network Datasets

    Science.gov (United States)

    Gulsoy, Gunhan

    2013-01-01

New experimental methods have resulted in an increasing amount of genetic interaction data being generated every day. Biological networks are used to store the genetic interaction data gathered. The increasing amount of available data requires fast, large-scale analysis methods. Therefore, we address the problem of querying large biological network datasets.…

  4. Large-scale data analytics

    CERN Document Server

    Gkoulalas-Divanis, Aris

    2014-01-01

    Provides cutting-edge research in large-scale data analytics from diverse scientific areas Surveys varied subject areas and reports on individual results of research in the field Shares many tips and insights into large-scale data analytics from authors and editors with long-term experience and specialization in the field

  5. Modern Sorters for Soil Segregation on Large Scale Remediation Projects

    International Nuclear Information System (INIS)

    Shonka, J.J.; Kelley, J.E.; O'Brien, J.M.

    2008-01-01

In the mid-1940's, Dr. C. Lapointe developed a Geiger tube based uranium ore scanner and picker to replace hand-cobbing. In the 1990's, a modern version of the Lapointe Picker for soil sorting was developed around the need to clean the Johnston Atoll of plutonium. It worked well with sand, but these systems are ineffective with soil, especially in wet conditions. Additionally, several other constraints limited throughput: slow-moving belts and thin layers of material on the belt, coupled with the use of multiple small detectors and small sorting gates, make these systems ineffective for high throughput. Soil sorting of clay-bearing soils and building debris requires a new look at both the material handling equipment and the radiation detection methodology. A new class of Super-Sorters has attained throughput of one hundred times that of the old designs. Higher throughput means shorter schedules, which reduce costs substantially. The planning, cost, implementation, and other site considerations for these new Super-Sorters are discussed. Modern soil segregation was developed by Ed Bramlitt of the Defense Nuclear Agency for clean-up at Johnston Atoll. The process eventually became the Segmented Gate System (SGS). This system uses an array of small sodium iodide (NaI) detectors, each viewing a small volume (segment), that control a gate. The volume in the gate is approximately one kg. This system works well when the material to be processed is sand; however, when the material is wet and sticky (soils with clays) the system has difficulty moving the material through the gates. Super-Sorters are a new class of machine designed to take advantage of high-throughput aggregate processing conveyors, large acquisition volumes, and large NaI detectors using gamma spectroscopy. By using commercially available material handling equipment, the system can attain processing rates of up to 400 metric tons/hr with spectrum acquisition approximately every 0.5 sec, so the acquisition
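As a rough sanity check on the throughput figures quoted above, the stated rate of 400 metric tons/hr with a spectrum acquired about every 0.5 s implies that each acquisition covers far more material than the roughly 1 kg gate volume of the older segmented-gate systems. A minimal sketch of that arithmetic:

```python
# Rough arithmetic check on the throughput figures quoted in the abstract:
# 400 metric tons/hour processed, with one spectrum acquired about every 0.5 s,
# versus the ~1 kg gate volume of the older segmented-gate systems.
tons_per_hour = 400
kg_per_second = tons_per_hour * 1000 / 3600   # 400 t/hr expressed in kg/s
kg_per_acquisition = kg_per_second * 0.5      # material seen per 0.5 s spectrum

print(round(kg_per_acquisition, 1))  # ~55.6 kg per acquisition, vs ~1 kg gates
```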

  6. Mobility and powering of large detectors. Moving large detectors

    International Nuclear Information System (INIS)

    Thompson, J.

    1977-01-01

    The possibility is considered of moving large lepton detectors at ISABELLE for readying new experiments, detector modifications, and detector repair. A large annex (approximately 25 m x 25 m) would be built adjacent to the Lepton Hall separated from the Lepton Hall by a wall of concrete 11 m high x 12 m wide (for clearance of the detector) and approximately 3 m thick (for radiation shielding). A large pad would support the detector, the door, the cryogenic support system and the counting house. In removing the detector from the beam hall, one would push the pad into the annex, add a dummy beam pipe, bake out the beam pipe, and restack and position the wall on a small pad at the door. The beam could then operate again while experimenters could work on the large detector in the annex. A consideration and rough price estimate of various questions and proposed solutions are given

  7. Global solubility of the three-dimensional Navier-Stokes equations with uniformly large initial vorticity

    International Nuclear Information System (INIS)

    Makhalov, A S; Nikolaenko, V P

    2003-01-01

This paper is a survey of results concerning the three-dimensional Navier-Stokes and Euler equations with initial data characterized by uniformly large vorticity. The existence of regular solutions of the three-dimensional Navier-Stokes equations on an unbounded time interval is proved for large initial data both in R^3 and in bounded cylindrical domains. Moreover, the existence of smooth solutions on large finite time intervals is established for the three-dimensional Euler equations. These results are obtained without additional assumptions on the behaviour of solutions for t>0. Any smooth solution is not close to any two-dimensional manifold. Our approach is based on the computation of singular limits of rapidly oscillating operators, non-linear averaging, and a consideration of the mutual absorption of non-linear oscillations of the vorticity field. The use of resonance conditions, methods from the theory of small divisors, and non-linear averaging of almost periodic functions leads to the limit resonant Navier-Stokes equations. Global solubility of these equations is proved without any conditions on the three-dimensional initial data. The global regularity of weak solutions of three-dimensional Navier-Stokes equations with uniformly large vorticity at t=0 is proved by using the regularity of weak solutions and the strong convergence

  8. Measuring happiness in large population

    Science.gov (United States)

    Wenas, Annabelle; Sjahputri, Smita; Takwin, Bagus; Primaldhi, Alfindra; Muhamad, Roby

    2016-01-01

The ability to know the emotional states of large numbers of people is important, for example, to ensure the effectiveness of public policies. In this study, we propose a measure of happiness that can be used with large populations and that is based on the analysis of Indonesian language lexicons. Here, we incorporate human assessment of Indonesian words, then quantify happiness over a large collection of texts gathered from twitter conversations. We used two psychological constructs to measure happiness: valence and arousal. We found that Indonesian words have a tendency towards positive emotions. We also identified several happiness patterns during days of the week, hours of the day, and selected conversation topics.
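The lexicon-based approach summarized above can be illustrated with a toy sketch. The word list and its valence/arousal values below are invented placeholders, not the authors' data, and only the valence dimension is used; the score is simply the average valence of the known words in a text:

```python
# Illustrative sketch of a lexicon-based happiness score. The words and
# their (valence, arousal) values are hypothetical placeholders on a 1-9
# scale, not the human-assessed Indonesian lexicon used in the study.
LEXICON = {
    "senang": (8.0, 5.5),   # "happy"
    "sedih": (2.0, 4.0),    # "sad"
    "biasa": (5.0, 2.0),    # "ordinary"
}

def happiness(text):
    """Average valence of the lexicon words found in a text; None if none match."""
    scores = [LEXICON[w][0] for w in text.lower().split() if w in LEXICON]
    return sum(scores) / len(scores) if scores else None

print(happiness("hari ini senang sekali"))  # only "senang" matches -> 8.0
```

At scale, the same averaging would be applied over many texts (e.g. tweets) and then binned by hour, weekday, or topic to reveal the temporal patterns the abstract describes.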

  9. Techniques for extracting single-trial activity patterns from large-scale neural recordings

    Science.gov (United States)

    Churchland, Mark M; Yu, Byron M; Sahani, Maneesh; Shenoy, Krishna V

    2008-01-01

Large, chronically-implanted arrays of microelectrodes are an increasingly common tool for recording from primate cortex, and can provide extracellular recordings from many (order of 100) neurons. While the desire for cortically-based motor prostheses has helped drive their development, such arrays also offer great potential to advance basic neuroscience research. Here we discuss the utility of array recording for the study of neural dynamics. Neural activity often has dynamics beyond that driven directly by the stimulus. While governed by those dynamics, neural responses may nevertheless unfold differently for nominally identical trials, rendering many traditional analysis methods ineffective. We review recent studies – some employing simultaneous recording, some not – indicating that such variability is indeed present both during movement generation, and during the preceding premotor computations. In such cases, large-scale simultaneous recordings have the potential to provide an unprecedented view of neural dynamics at the level of single trials. However, this enterprise will depend not only on techniques for simultaneous recording, but also on the use and further development of analysis techniques that can appropriately reduce the dimensionality of the data, and allow visualization of single-trial neural behavior. PMID:18093826
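The dimensionality-reduction step described above can be sketched in miniature. The simulated data, the dimensions, and the use of plain PCA (via SVD) are illustrative assumptions here, not the specific methods of the reviewed studies:

```python
import numpy as np

# Illustrative sketch: reducing simulated "multi-neuron" activity to a
# low-dimensional trajectory with PCA. A single shared sinusoidal latent
# drives 100 hypothetical neurons; the noise level is an invented parameter.
rng = np.random.default_rng(0)
T, N = 200, 100                      # time points, "neurons"
latent = np.sin(np.linspace(0, 4 * np.pi, T))[:, None]   # shared 1-D dynamic
X = latent @ rng.standard_normal((1, N)) + 0.1 * rng.standard_normal((T, N))

Xc = X - X.mean(axis=0)              # center each neuron's activity
_, s, Vt = np.linalg.svd(Xc, full_matrices=False)
traj = Xc @ Vt[:2].T                 # single-trial trajectory in the top 2 PCs

var_explained = s[0] ** 2 / (s ** 2).sum()
print(round(var_explained, 2))       # most variance lies on one component
```

Plotting `traj` over time gives exactly the kind of low-dimensional, single-trial view of population dynamics the abstract argues for.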

  10. Large Hadron Collider nears completion

    CERN Multimedia

    2008-01-01

    Installation of the final component of the Large Hadron Collider particle accelerator is under way along the Franco-Swiss border near Geneva, Switzerland. When completed this summer, the LHC will be the world's largest and most complex scientific instrument.

  11. Phenomenology of large Nc QCD

    International Nuclear Information System (INIS)

    Lebed, R.F.

    1999-01-01

    These lectures are designed to introduce the methods and results of large N_c QCD in a presentation intended for nuclear and particle physicists alike. Beginning with definitions and motivations of the approach, we demonstrate that all quark and gluon Feynman diagrams are organized into classes based on powers of 1/N_c. We then show that this result can be translated into definite statements about mesons and baryons containing arbitrary numbers of constituents. In the mesons, numerous well-known phenomenological properties follow as immediate consequences of simply counting powers of N_c, while for the baryons, quantitative large N_c analyses of masses and other properties are seen to agree with experiment, even when 'large' N_c is set equal to its observed value of 3. Large N_c reasoning is also used to explain some simple features of nuclear interactions. (author)

  12. Phenomenology of large Nc QCD

    International Nuclear Information System (INIS)

    Richard Lebed

    1998-01-01

    These lectures are designed to introduce the methods and results of large N_c QCD in a presentation intended for nuclear and particle physicists alike. Beginning with definitions and motivations of the approach, they demonstrate that all quark and gluon Feynman diagrams are organized into classes based on powers of 1/N_c. They then show that this result can be translated into definite statements about mesons and baryons containing arbitrary numbers of constituents. In the mesons, numerous well-known phenomenological properties follow as immediate consequences of simply counting powers of N_c, while for the baryons, quantitative large N_c analyses of masses and other properties are seen to agree with experiment, even when 'large' N_c is set equal to its observed value of 3. Large N_c reasoning is also used to explain some simple features of nuclear interactions

  13. Community Detection for Large Graphs

    KAUST Repository

    Peng, Chengbin; Kolda, Tamara G.; Pinar, Ali; Zhang, Zhihua; Keyes, David E.

    2014-01-01

    Many real world networks have inherent community structures, including social networks, transportation networks, biological networks, etc. For large scale networks with millions or billions of nodes in real-world applications, accelerating current

  14. Physics with large extra dimensions

    Indian Academy of Sciences (India)

    Pramana – Journal of Physics, Volume 62, Issue 2. The recent understanding of string theory opens the possibility that the string scale can be as ... by the existence of large internal dimensions, in the sub-millimeter region.

  15. Utility unbundling : large consumer's perspective

    International Nuclear Information System (INIS)

    Block, C.

    1997-01-01

    The perspectives of Sunoco as a large user of electric power on utility unbundling were presented. Sunoco's Sarnia refinery runs up an energy bill of over $60 million per year for electricity, natural gas (used both as a feedstock as well as a fuel), natural gas liquids and steam. As a large customer Sunoco advocates unbundling of all services, leaving only the 'pipes and wires' as true monopolies. In their view, regulation distorts the market place and prevents the lower prices that would result from competition as has been seen in the airline and telephone industries. Sunoco's expectation is that in the post-deregulated environment large and small consumers will have a choice of energy supplier, and large consumers will increasingly turn to co-generation as the most desirable way of meeting their power needs

  16. LSD: Large Survey Database framework

    Science.gov (United States)

    Juric, Mario

    2012-09-01

    The Large Survey Database (LSD) is a Python framework and DBMS for distributed storage, cross-matching and querying of large survey catalogs (>10^9 rows, >1 TB). The primary driver behind its development is the analysis of Pan-STARRS PS1 data. It is specifically optimized for fast queries and parallel sweeps of positionally and temporally indexed datasets. It transparently scales to more than 10^2 nodes, and can be made to function in "shared nothing" architectures.

  17. Large-scale solar heat

    Energy Technology Data Exchange (ETDEWEB)

    Tolonen, J.; Konttinen, P.; Lund, P. [Helsinki Univ. of Technology, Otaniemi (Finland). Dept. of Engineering Physics and Mathematics

    1998-12-31

    In this project a large domestic solar heating system was built and a solar district heating system was modelled and simulated. Objectives were to improve the performance and reduce costs of a large-scale solar heating system. As a result of the project the benefit/cost ratio can be increased by 40 % through dimensioning and optimising the system at the designing stage. (orig.)

  18. Optimization theory for large systems

    CERN Document Server

    Lasdon, Leon S

    2002-01-01

    Important text examines most significant algorithms for optimizing large systems and clarifying relations between optimization procedures. Much data appear as charts and graphs and will be highly valuable to readers in selecting a method and estimating computer time and cost in problem-solving. Initial chapter on linear and nonlinear programming presents all necessary background for subjects covered in rest of book. Second chapter illustrates how large-scale mathematical programs arise from real-world problems. Appendixes. List of Symbols.

  19. Hidden supersymmetry and large N

    International Nuclear Information System (INIS)

    Alfaro, J.

    1988-01-01

    In this paper we present a new method to deal with the leading order in the large-N expansion of a quantum field theory. The method uses explicitly the hidden supersymmetry that is present in the path-integral formulation of a stochastic process. In addition to this we derive a new relation that is valid in the leading order of the large-N expansion of the hermitian-matrix model for any spacetime dimension. (orig.)

  20. Observations of large-amplitude MHD waves in Jupiter's foreshock in connection with a quasi-perpendicular shock structure

    Science.gov (United States)

    Bavassano-Cattaneo, M. B.; Moreno, G.; Scotto, M. T.; Acuna, M.

    1987-01-01

    Plasma and magnetic field observations performed onboard the Voyager 2 spacecraft have been used to investigate Jupiter's foreshock. Large-amplitude waves have been detected in association with the quasi-perpendicular structure of the Jovian bow shock, thus proving that the upstream turbulence is not a characteristic signature of the quasi-parallel shock.

  1. Aphrodisiac pheromones from the wings of the Small Cabbage White and Large Cabbage White butterflies, Pieris rapae and Pieris brassicae

    NARCIS (Netherlands)

    Yildizhan, S.; Loon, van J.J.A.; Sramkova, A.; Ayasse, M.; Arsene, C.; Broeke, ten C.J.M.; Schulz, S.

    2009-01-01

    The small and large cabbage butterflies, Pieris rapae and P. brassicae, are found worldwide and are of considerable economic importance. The composition of the male scent-producing organs present on the wings was investigated. More than 120 components were identified, but only a small portion proved

  2. Automation of Survey Data Processing, Documentation and Dissemination: An Application to Large-Scale Self-Reported Educational Survey.

    Science.gov (United States)

    Shim, Eunjae; Shim, Minsuk K.; Felner, Robert D.

    Automation of the survey process has proved successful in many industries, yet it is still underused in educational research. This is largely due to two facts: (1) number crunching is usually carried out using software that was developed before modern information technology existed, and (2) educational research is to a great extent trapped…

  3. Large deviations of heavy-tailed random sums with applications in insurance and finance

    NARCIS (Netherlands)

    Kluppelberg, C; Mikosch, T

    We prove large deviation results for the random sum S(t) = Σ_{i=1}^{N(t)} X_i, t ≥ 0, where (N(t))_{t ≥ 0} are non-negative integer-valued random variables and (X_n)_{n ∈ N} are i.i.d. non-negative random variables with common distribution

  4. An economical device for carbon supplement in large-scale micro-algae production.

    Science.gov (United States)

    Su, Zhenfeng; Kang, Ruijuan; Shi, Shaoyuan; Cong, Wei; Cai, Zhaoling

    2008-10-01

    One simple but efficient carbon-supplying device was designed and developed, and the correlative carbon-supplying technology was described. The absorbing characterization of this device was studied. The carbon-supplying system proved to be economical for large-scale cultivation of Spirulina sp. in an outdoor raceway pond, and the gaseous carbon dioxide absorptivity was enhanced above 78%, which could reduce the production cost greatly.

  5. Quenched Large Deviations for Simple Random Walks on Percolation Clusters Including Long-Range Correlations

    Science.gov (United States)

    Berger, Noam; Mukherjee, Chiranjib; Okamura, Kazuki

    2018-03-01

    We prove a quenched large deviation principle (LDP) for a simple random walk on a supercritical percolation cluster (SRWPC) on Z^d (d ≥ 2). The models under interest include classical Bernoulli bond and site percolation as well as models that exhibit long range correlations, like the random cluster model, the random interlacement and the vacant set of random interlacements (for d ≥ 3) and the level sets of the Gaussian free field (d ≥ 3). Inspired by the methods developed by Kosygina et al. (Commun Pure Appl Math 59:1489-1521, 2006) for proving quenched LDP for elliptic diffusions with a random drift, and by Yilmaz (Commun Pure Appl Math 62(8):1033-1075, 2009) and Rosenbluth (Quenched large deviations for multidimensional random walks in a random environment: a variational formula. Ph.D. thesis, NYU, arXiv:0804.1444v1) for similar results regarding elliptic random walks in random environment, we take the point of view of the moving particle and prove a large deviation principle for the quenched distribution of the pair empirical measures of the environment Markov chain in the non-elliptic case of SRWPC. Via a contraction principle, this reduces easily to a quenched LDP for the distribution of the mean velocity of the random walk, and both rate functions admit explicit variational formulas. The main difficulty in our set-up lies in the inherent non-ellipticity as well as the lack of translation-invariance stemming from conditioning on the fact that the origin belongs to the infinite cluster. We develop a unifying approach for proving quenched large deviations for SRWPC based on exploiting coercivity properties of the relative entropies in the context of convex variational analysis, combined with input from ergodic theory and invoking geometric properties of the supercritical percolation cluster.
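    For orientation, the standard textbook form of a large deviation principle, stated generically rather than in the percolation setting above: a family of probability measures (μ_n) satisfies an LDP with rate function I if, for measurable sets A with interior A° and closure Ā,

```latex
-\inf_{x \in A^{\circ}} I(x)
\;\le\; \liminf_{n\to\infty} \frac{1}{n}\log \mu_n(A)
\;\le\; \limsup_{n\to\infty} \frac{1}{n}\log \mu_n(A)
\;\le\; -\inf_{x \in \overline{A}} I(x).
```

    Informally, μ_n(A) ≈ exp(−n inf_{x∈A} I(x)); the variational formulas mentioned in the abstract are explicit expressions for the rate function I.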

  6. Large transverse momentum hadronic processes

    International Nuclear Information System (INIS)

    Darriulat, P.

    1977-01-01

    The possible relations between deep inelastic leptoproduction and large transverse momentum (p_t) processes in hadronic collisions are usually considered in the framework of the quark-parton picture. Experiments observing the structure of the final state in proton-proton collisions producing at least one large transverse momentum particle have led to the following conclusions: a large fraction of produced particles are unaffected by the large p_t process. The other products are correlated to the large p_t particle. Depending upon the sign of the scalar product they can be separated into two groups of ''towards-movers'' and ''away-movers''. The experimental evidence favouring such a picture is reviewed and the properties of each of the three groups (underlying normal event, towards-movers and away-movers) are discussed. Some phenomenological interpretations are presented. The exact nature of away- and towards-movers must be further investigated. Their apparent jet structure has to be confirmed. Angular correlations between leading away- and towards-movers are very informative. Quantum number flow, both within the set of away- and towards-movers, and between it and the underlying normal event, is predicted to behave very differently in different models

  7. The large-s field-reversed configuration experiment

    International Nuclear Information System (INIS)

    Hoffman, A.L.; Carey, L.N.; Crawford, E.A.; Harding, D.G.; DeHart, T.E.; McDonald, K.F.; McNeil, J.L.; Milroy, R.D.; Slough, J.T.; Maqueda, R.; Wurden, G.A.

    1993-01-01

    The Large-s Experiment (LSX) was built to study the formation and equilibrium properties of field-reversed configurations (FRCs) as the scale size increases. The dynamic, field-reversed theta-pinch method of FRC creation produces axial and azimuthal deformations and makes formation difficult, especially in large devices with large s (number of internal gyroradii) where it is difficult to achieve initial plasma uniformity. However, with the proper technique, these formation distortions can be minimized and are then observed to decay with time. This suggests that the basic stability and robustness of FRCs formed, and in some cases translated, in smaller devices may also characterize larger FRCs. Elaborate formation controls were included on LSX to provide the initial uniformity and symmetry necessary to minimize formation disturbances, and stable FRCs could be formed up to the design goal of s = 8. For s ≤ 4, the formation distortions decayed away completely, resulting in symmetric equilibrium FRCs with record confinement times up to 0.5 ms, agreeing with previous empirical scaling laws (τ∝sR). Above s = 4, reasonably long-lived (up to 0.3 ms) configurations could still be formed, but the initial formation distortions were so large that they never completely decayed away, and the equilibrium confinement was degraded from the empirical expectations. The LSX was only operational for 1 yr, and it is not known whether s = 4 represents a fundamental limit for good confinement in simple (no ion beam stabilization) FRCs or whether it simply reflects a limit of present formation technology. Ideally, s could be increased through flux buildup from neutral beams. Since the addition of kinetic or beam ions will probably be desirable for heating, sustainment, and further stabilization of magnetohydrodynamic modes at reactor-level s values, neutral beam injection is the next logical step in FRC development. 24 refs., 21 figs., 2 tabs

  8. Large aperture optical switching devices

    International Nuclear Information System (INIS)

    Goldhar, J.; Henesian, M.A.

    1983-01-01

    We have developed a new approach to constructing large aperture optical switches for next generation inertial confinement fusion lasers. A transparent plasma electrode formed in low pressure ionized gas acts as a conductive coating to allow the uniform charging of the optical faces of an electro-optic material. In this manner large electric fields can be applied longitudinally to large aperture, high aspect ratio Pockels cells. We propose a four-electrode geometry to create the necessary high conductivity plasma sheets, and have demonstrated fast (less than 10 nsec) switching in a 5x5 cm aperture KD*P Pockels cell with such a design. Detailed modelling of Pockels cell performance with plasma electrodes has been carried out for 15 and 30 cm aperture designs

  9. Assembling large, complex environmental metagenomes

    Energy Technology Data Exchange (ETDEWEB)

    Howe, A. C. [Michigan State Univ., East Lansing, MI (United States). Microbiology and Molecular Genetics, Plant Soil and Microbial Sciences; Jansson, J. [USDOE Joint Genome Institute (JGI), Walnut Creek, CA (United States); Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States). Earth Sciences Division; Malfatti, S. A. [USDOE Joint Genome Institute (JGI), Walnut Creek, CA (United States); Tringe, S. G. [USDOE Joint Genome Institute (JGI), Walnut Creek, CA (United States); Tiedje, J. M. [Michigan State Univ., East Lansing, MI (United States). Microbiology and Molecular Genetics, Plant Soil and Microbial Sciences; Brown, C. T. [Michigan State Univ., East Lansing, MI (United States). Microbiology and Molecular Genetics, Computer Science and Engineering

    2012-12-28

    The large volumes of sequencing data required to sample complex environments deeply pose new challenges to sequence analysis approaches. De novo metagenomic assembly effectively reduces the total amount of data to be analyzed but requires significant computational resources. We apply two pre-assembly filtering approaches, digital normalization and partitioning, to make large metagenome assemblies more computationally tractable. Using a human gut mock community dataset, we demonstrate that these methods result in assemblies nearly identical to assemblies from unprocessed data. We then assemble two large soil metagenomes from matched Iowa corn and native prairie soils. The predicted functional content and phylogenetic origin of the assembled contigs indicate significant taxonomic differences despite similar function. The assembly strategies presented are generic and can be extended to any metagenome; full source code is freely available under a BSD license.
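    The digital normalization idea mentioned above can be sketched in a few lines: a read is kept only if the median abundance of its k-mers, counted over previously kept reads, is still below a cutoff. This is a toy exact-counting version for illustration; production tools use probabilistic counting structures and handle reverse complements.

```python
from collections import Counter

def digital_normalize(reads, k=4, cutoff=2):
    """Discard reads whose k-mer content is already well sampled.

    Keeps a read when the median abundance of its k-mers (counted over
    previously kept reads) is below `cutoff`; discarded reads do not
    update the counts, so redundant coverage is removed while novel
    sequence is retained.
    """
    counts = Counter()
    kept = []
    for read in reads:
        kmers = [read[i:i + k] for i in range(len(read) - k + 1)]
        abundances = sorted(counts[km] for km in kmers)
        if not abundances or abundances[len(abundances) // 2] < cutoff:
            kept.append(read)
            counts.update(kmers)
    return kept

# Five identical reads collapse to two kept copies; a novel read survives.
reads = ["ACGTACGT"] * 5 + ["TTTTGGGG"]
print(digital_normalize(reads))  # → ['ACGTACGT', 'ACGTACGT', 'TTTTGGGG']
```

    Because the count table only grows with kept reads, memory scales with the distinct sequence content rather than the raw coverage, which is what makes the downstream assembly tractable.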

  10. Large scale structure and baryogenesis

    International Nuclear Information System (INIS)

    Kirilova, D.P.; Chizhov, M.V.

    2001-08-01

    We discuss a possible connection between the large scale structure formation and the baryogenesis in the universe. An updated review of the observational indications for the presence of a very large scale of 120 h^-1 Mpc in the distribution of the visible matter of the universe is provided. The possibility to generate a periodic distribution with the characteristic scale 120 h^-1 Mpc through a mechanism producing quasi-periodic baryon density perturbations during the inflationary stage is discussed. The evolution of the baryon charge density distribution is explored in the framework of a low temperature boson condensate baryogenesis scenario. Both the observed very large scale of the visible matter distribution in the universe and the observed baryon asymmetry value could naturally appear as a result of the evolution of a complex scalar field condensate formed at the inflationary stage. Moreover, for some model parameters a natural separation of matter superclusters from antimatter ones can be achieved. (author)

  11. Enhanced safety features of CHASHMA NPP UNIT-2 to encounter selected severe accidents, various challenges involved to prove the adequacy of severe accidents prevention/mitigation measures and to write management guidelines with one possible solution to these challenges

    International Nuclear Information System (INIS)

    Iqbal, Z.; Minhaj, A.

    2007-01-01

    This paper describes enhanced safety features of Chashma Nuclear Power Plant Unit-2 (C-2), a 325 MWe PWR, to encounter selected severe accidents, and discusses various challenges involved in proving the adequacy of severe accident prevention/mitigation measures and in writing severe accident management guidelines (SAMGs) in compliance with the recently introduced national regulations based on the new IAEA nuclear safety standards. C-2 is being built by China National Nuclear Corporation (CNNC) for Pakistan Atomic Energy Commission (PAEC). Its twin, Unit-1 (C-1), also a 325 MWe PWR, was commissioned in 2000. Nuclear power safety with reference to severe accidents should be treated as a global issue, and therefore the developed countries should include the people of developing countries in the nuclear power industry's various severe-accident research and development programs. The implementation of this idea may also deliver a few other useful and mutually beneficial byproducts. (author)

  12. Effectiveness and Cost Efficiency of Different Surveillance Components for Proving Freedom and Early Detection of Disease: Bluetongue Serotype 8 in Cattle as Case Study for Belgium, France and the Netherlands.

    Science.gov (United States)

    Welby, S; van Schaik, G; Veldhuis, A; Brouwer-Middelesch, H; Peroz, C; Santman-Berends, I M; Fourichon, C; Wever, P; Van der Stede, Y

    2017-12-01

    Quick detection and recovery of a country's freedom status remain a constant challenge in animal health surveillance. The efficacy and cost efficiency of different surveillance components in proving the absence of infection or (early) detection of bluetongue serotype 8 in cattle populations within different countries (the Netherlands, France, Belgium), using surveillance data from the years 2006 and 2007, were investigated using an adapted scenario tree model approach. First, surveillance components (sentinel, yearly cross-sectional and passive clinical reporting) within each country were evaluated in terms of efficacy for substantiating freedom of infection. The yearly cross-sectional survey and passive clinical reporting performed well within each country, with sensitivity of detection values ranging around 0.99. The sentinel component had a sensitivity of detection around 0.7. Secondly, the effectiveness of the components for (early) detection of bluetongue serotype 8 was evaluated, along with whether syndromic surveillance on reproductive performance, milk production and mortality data available from the Netherlands and Belgium could be of added value. Epidemic curves were used to estimate the timeliness of detection. Sensitivity analysis revealed that expected within-herd prevalence and the number of herds processed were the most influential parameters for proving freedom and early detection. Looking at the assumed direct costs, although total costs were low for the sentinel and passive clinical surveillance components, passive clinical surveillance together with syndromic surveillance (based on reproductive performance data) turned out most cost-efficient for the detection of bluetongue serotype 8. To conclude, for an emerging or re-emerging vectorborne disease that behaves such as bluetongue serotype 8, it is recommended to use passive clinical and syndromic surveillance as early detection systems for maximum cost efficiency and sensitivity. Once an infection is detected and eradicated

  13. Large ceramics for fusion applications

    International Nuclear Information System (INIS)

    Hauth, W.E.; Stoddard, S.D.

    1979-01-01

    Prominent ceramic raw materials and products manufacturers were surveyed to determine the state of the art for alumina ceramic fabrication. This survey emphasized current capabilities and limitations for fabrication of large, high-density, high-purity, complex shapes. Some directions are suggested for future needs and development. Ceramic-to-ceramic sealing has applications for several technologies that require large and/or complex vacuum-tight ceramic shapes. Information is provided concerning the assembly of complex monolithic ceramic shapes by bonding of subassemblies at temperatures ranging from 450 to 1500 0 C. Future applications and fabrication techniques for various materials are presented

  14. Dijets at large rapidity intervals

    CERN Document Server

    Pope, B G

    2001-01-01

    Inclusive dijet production at large pseudorapidity intervals (Δη) between the two jets has been suggested as a regime for observing BFKL dynamics. We have measured the dijet cross section for large Δη in pp̄ collisions at √s = 1800 and 630 GeV using the DØ detector. The partonic cross section increases strongly with the size of Δη. The observed growth is even stronger than expected on the basis of BFKL resummation in the leading logarithmic approximation. The growth of the partonic cross section can be accommodated with an effective BFKL intercept of α_BFKL(20 GeV) = 1.65 ± 0.07.

  15. Acquisition of reliable vacuum hardware for large accelerator systems

    International Nuclear Information System (INIS)

    Welch, K.M.

    1995-01-01

    Credible and effective communications prove to be the major challenge in the acquisition of reliable vacuum hardware. Technical competence is necessary but not sufficient. The authors must effectively communicate with management, sponsoring agencies, project organizations, service groups, staff and with vendors. Most of Deming's 14 quality assurance tenets relate to creating an enlightened environment of good communications. All projects progress along six distinct, closely coupled, dynamic phases. All six phases are in a state of perpetual change. These phases and their elements are discussed, with emphasis given to the acquisition phase and its related vocabulary. Large projects require great clarity and rigor, as poor communications can be costly. For rigor to be cost effective, it can't be pedantic. Clarity thrives best in a low-risk, team environment

  16. Conservation Laws for Gyrokinetic Equations for Large Perturbations and Flows

    Science.gov (United States)

    Dimits, Andris

    2017-10-01

    Gyrokinetic theory has proved to be very useful for the understanding of magnetized plasmas, both to simplify analytical treatments and as a basis for efficient numerical simulations. Gyrokinetic theories were previously developed in two extended orderings that are applicable to large fluctuations and flows as may arise in the tokamak edge and scrape-off layer. In the present work, we cast the resulting equations in a field-theoretical variational form, and derive, up to second order in the respective orderings, the associated global and local energy and (linear and toroidal) momentum conservation relations that result from Noether's theorem. The consequences of these for the various possible choices of numerical discretization used in gyrokinetic simulations are considered. Prepared for US DOE by LLNL under Contract DE-AC52-07NA27344 and supported by the U.S. DOE, OFES.

  17. The case for a large heavy oil stream

    International Nuclear Information System (INIS)

    Reimer, P.

    2005-01-01

    EnCana Corporation markets significant proprietary and third party crude oil production in North America. This presentation presented details of EnCana's projected resources as well as estimated proved reserves in Canadian oil sands. Details of the Western Canadian heavy oil market were presented. Issues concerning Western Canadian Select (WCS) were also presented, including details of distillation and asphalt characteristics. Details of the WCS synthetic bitumen synergy were examined, as well as quality management issues. It was suggested that further optimization of WCS facilities include reduced operating complexity; less tank proliferation; delivery quality consistency; and reliability. WCS refiner advantages were also evaluated. Shipping and ramping details were discussed, along with growth potential. It was noted that WCS satisfies all the criteria for a benchmark crude. It was concluded that the case for a large Canadian heavy oil stream includes reduced operating complexity; optimized logistics; delivery quality consistency; improved stream liquidity; and enhanced price discovery. tabs., figs

  18. In the loop Large Hadron Collider project - UK engineering firms

    CERN Document Server

    Wilks, N

    2004-01-01

    This paper presents the latest measures being taken to boost the level of UK engineering firms' involvement in research at CERN (the European Organization for Nuclear Research), including its 27 km circular Large Hadron Collider (LHC) project. Virtually all of the components on this complex project have had to be custom-made, usually in the form of collaboration. It is as part of these collaborations that some UK firms have proved they can shine. However, despite the proven capabilities, the financial return continues to be less than the government's funding. Each of the 20 CERN member states provides funds in proportion to its GDP, and the UK is the second largest financial contributor. UK firms become price-competitive where a contract calls for a degree of customisation or product development, project management and tight quality control. Development of the Particle Physics Grid, for dissemination and analysis of data from the LHC, continues to provide major supply opportunities for UK manufacturers.

  19. Large Eddy Simulation of turbulence

    International Nuclear Information System (INIS)

    Poullet, P.; Sancandi, M.

    1994-12-01

    Results of Large Eddy Simulation of 3D isotropic homogeneous turbulent flows are presented. A computer code developed on the Connection Machine (CM-5) has allowed comparison of two turbulent viscosity models (Smagorinsky and structure function). The influence of the numerical scheme on the energy density spectrum is also studied [fr

  20. The Large Vector Multiplet Action

    OpenAIRE

    Ryb, Itai

    2007-01-01

    We discuss possible actions for the d=2, N=(2,2) large vector multiplet that gauges isometries of generalized Kähler geometries. We explore two scenarios that allow us to write kinetic and superpotential terms for the scalar field-strengths, and write kinetic terms for the spinor invariants that can introduce topological terms for the connections.

  1. Qatar - large capital investment planned

    International Nuclear Information System (INIS)

    Roberts, J.

    1996-01-01

    Large capital investments are planned throughout Qatar's energy industry over the next five years totalling $25 billion. This article describes the successful commissioning of Qatar's first liquefied natural gas (LNG) project on time and within budget. The second LNG plant is well underway and plans for a third are under negotiation. (UK)

  2. Large deviations and portfolio optimization

    Science.gov (United States)

    Sornette, Didier

    Risk control and optimal diversification constitute a major focus in the finance and insurance industries as well as, more or less consciously, in our everyday life. We present a discussion of the characterization of risks and of the optimization of portfolios that starts from a simple illustrative model and ends by a general functional integral formulation. A major item is that risk, usually thought of as one-dimensional in the conventional mean-variance approach, has to be addressed by the full distribution of losses. Furthermore, the time-horizon of the investment is shown to play a major role. We show the importance of accounting for large fluctuations and use the theory of Cramér for large deviations in this context. We first treat a simple model with a single risky asset that exemplifies the distinction between the average return and the typical return and the role of large deviations in multiplicative processes, and the different optimal strategies for the investors depending on their size. We then analyze the case of assets whose price variations are distributed according to exponential laws, a situation that is found to describe daily price variations reasonably well. Several portfolio optimization strategies are presented that aim at controlling large risks. We end by extending the standard mean-variance portfolio optimization theory, first within the quasi-Gaussian approximation and then using a general formulation for non-Gaussian correlated assets in terms of the formalism of functional integrals developed in the field theory of critical phenomena.
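    The Cramér result invoked above, in its classical one-dimensional form (a standard statement, independent of the portfolio application): for i.i.d. X_i with finite exponential moments and a above the mean of X_1,

```latex
\lim_{n\to\infty}\frac{1}{n}\log
\mathbb{P}\!\left(\frac{1}{n}\sum_{i=1}^{n} X_i \ge a\right) = -I(a),
\qquad
I(a) = \sup_{\theta\in\mathbb{R}}\left(\theta a - \log \mathbb{E}\,e^{\theta X_1}\right).
```

    The distinction between typical and average return in multiplicative processes is governed by exactly such exponentially small probabilities, which is why the full distribution of losses, not just the mean and variance, matters for risk control.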

  3. Inconsistency in large pharmacogenomic studies

    DEFF Research Database (Denmark)

    Haibe-Kains, Benjamin; El-Hachem, Nehme; Birkbak, Nicolai Juul

    2013-01-01

    Two large-scale pharmacogenomic studies were published recently in this journal. Genomic data are well correlated between studies; however, the measured drug response data are highly discordant. Although the source of inconsistencies remains uncertain, it has potential implications for using...

  4. Strategic Management of Large Projects

    Institute of Scientific and Technical Information of China (English)

    Wang Yingluo; Liu Yi; Li Yuan

    2004-01-01

    The strategic management of large projects is both theoretically and practically important. Some scholars in China have advanced flexible strategy theory. The difference between strategic flexibility and flexible strategy is pointed out. The supporting system and characteristics of flexible strategy are analyzed. The changes of flexible strategy and the integration of strategic management are discussed.

  5. Mass spectrometry of large molecules

    International Nuclear Information System (INIS)

    Facchetti, S.

    1985-01-01

    The lectures in this volume were given at a course on mass spectrometry of large molecules, organized within the framework of the Training and Education programme of the Joint Research Centre of the European Communities. Although first presented in 1983, most of the lectures have since been updated by their authors. (orig.)

  6. Large for Gestational Age (LGA)

    Science.gov (United States)

    ... mother Other risk factors for having large-for-gestational-age newborns include Maternal obesity Having had previous LGA babies Genetic abnormalities or syndromes (for example, Beckwith-Wiedemann syndrome or Sotos syndrome) Excessive weight gain during pregnancy (the fetus gets more calories as ...

  7. Large-scale pool fires

    Directory of Open Access Journals (Sweden)

    Steinhaus Thomas

    2007-01-01

    A review of research into the burning behavior of large pool fires and fuel spill fires is presented. The features which distinguish such fires from smaller pool fires are mainly associated with the fire dynamics at low source Froude numbers and the radiative interaction with the fire source. In hydrocarbon fires, higher soot levels at increased diameters result in radiation blockage effects around the perimeter of large fire plumes; this yields lower emissive powers and a drastic reduction in the radiative loss fraction. Whilst there are simplifying factors with these phenomena, arising from the fact that soot yield can saturate, there are other complications deriving from the intermittency of the behavior, with luminous regions of efficient combustion appearing randomly in the outer surface of the fire according to the turbulent fluctuations in the fire plume. Knowledge of the fluid flow instabilities, which lead to the formation of large eddies, is also key to understanding the behavior of large-scale fires. Here modeling tools can be effectively exploited in order to investigate the fluid flow phenomena, including RANS- and LES-based computational fluid dynamics codes. The latter are well suited to representation of the turbulent motions, but a number of challenges remain with their practical application. Massively parallel computational resources are likely to be necessary in order to adequately address the complex coupled phenomena to the level of detail that is necessary.
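The low source Froude numbers mentioned above can be made concrete with a quick calculation; the burning rate and fuel-vapor density below are illustrative values, not data from the review:

```python
import math

def source_froude(burn_rate_kg_m2s, vapor_density_kg_m3, diameter_m, g=9.81):
    """Source Froude number Fr = u / sqrt(g * D), where u is the fuel
    vapor velocity at the pool surface (mass burning rate per unit area
    divided by vapor density). Hypothetical inputs for illustration;
    real vapor densities depend on fuel and temperature."""
    u = burn_rate_kg_m2s / vapor_density_kg_m3  # surface vapor velocity, m/s
    return u / math.sqrt(g * diameter_m)

# A 10 m hydrocarbon pool fire with plausible (assumed) numbers:
fr = source_froude(burn_rate_kg_m2s=0.055, vapor_density_kg_m3=2.0,
                   diameter_m=10.0)
# Fr comes out on the order of 1e-3: the flow is buoyancy-dominated,
# unlike momentum-driven jet flames.
```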

  8. Active Learning in Large Classes

    DEFF Research Database (Denmark)

    Gørtz, Inge Li

    2011-01-01

    teaching large classes (more than 50 students), and describe how we have successfully done this in a second-semester course in the Bachelor of Engineering (BEng) and Bachelor of Science Engineering (BSc Eng) programs at the Technical University of Denmark (DTU). Approximately 200 students are attending...

  9. Adding large EM stack support

    KAUST Repository

    Holst, Glendon

    2016-12-01

    Serial section electron microscopy (SSEM) image stacks generated using high throughput microscopy techniques are an integral tool for investigating brain connectivity and cell morphology. FIB or 3View scanning electron microscopes easily generate gigabytes of data. In order to produce analyzable 3D datasets from the imaged volumes, efficient and reliable image segmentation is crucial. Classical manual approaches to segmentation are time-consuming and labour-intensive. Semiautomatic seeded watershed segmentation algorithms, such as those implemented by the ilastik image processing software, are a very powerful alternative, substantially speeding up segmentation times. We have used ilastik effectively for small EM stacks – on a laptop, no less; however, ilastik was unable to carve the large EM stacks we needed to segment because its memory requirements grew too large – even for the biggest workstations we had available. For this reason, we refactored the carving module of ilastik to scale it up to large EM stacks on large workstations, and tested its efficiency. We modified the carving module, building on existing blockwise processing functionality to process data in manageable chunks that can fit within RAM (main memory). We review this refactoring work, highlighting the software architecture, design choices, modifications, and issues encountered.
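The blockwise strategy of processing data in RAM-sized chunks can be sketched as follows; this is a generic illustration, not ilastik's actual carving API:

```python
import numpy as np

def process_blockwise(volume, block_shape, func):
    """Apply `func` to a 3D volume one block at a time, so only a single
    RAM-sized chunk is resident during processing (illustrative sketch
    of the blockwise idea; real pipelines also handle block halos)."""
    out = np.empty_like(volume)
    bz, by, bx = block_shape
    for z in range(0, volume.shape[0], bz):
        for y in range(0, volume.shape[1], by):
            for x in range(0, volume.shape[2], bx):
                sl = (slice(z, z + bz), slice(y, y + by), slice(x, x + bx))
                out[sl] = func(volume[sl])  # each call sees one chunk
    return out

# Threshold a small synthetic "EM volume" in 32^3 chunks:
vol = np.random.rand(64, 64, 64).astype(np.float32)
mask = process_blockwise(vol, (32, 32, 32),
                         lambda b: (b > 0.5).astype(np.float32))
```

For a purely per-voxel operation like this threshold, the chunked result is identical to processing the whole array at once; operations with spatial context additionally need overlapping block borders.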

  10. Protection of large capacitor banks

    International Nuclear Information System (INIS)

    Sprott, J.C.; Lovell, T.W.

    1982-06-01

    Large capacitor banks, as used in many pulsed plasma experiments, are subject to catastrophic failure in the event of a short in the output or in an individual capacitor. Methods are described for protecting such banks to minimize the damage and down-time caused by such a failure.
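The scale of the protection problem follows from the stored energy E = ½CV², which must be absorbed somewhere during a fault; the bank parameters below are hypothetical:

```python
def bank_energy_joules(capacitance_f, voltage_v):
    """Energy stored in a capacitor bank: E = 1/2 * C * V^2.
    In a short-circuit fault, this energy is dumped into the fault
    unless protection (fuses, crowbars, series resistors) intervenes."""
    return 0.5 * capacitance_f * voltage_v ** 2

# A hypothetical 10 mF bank charged to 20 kV stores 2 MJ:
e = bank_energy_joules(10e-3, 20e3)  # 2.0e6 J
```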

  11. CERN's Large Hadron Collider project

    Science.gov (United States)

    Fearnley, Tom A.

    1997-03-01

    The paper gives a brief overview of CERN's Large Hadron Collider (LHC) project. After an outline of the physics motivation, we describe the LHC machine, interaction rates, experimental challenges, and some important physics channels to be studied. Finally we discuss the four experiments planned at the LHC: ATLAS, CMS, ALICE and LHC-B.

  12. CERN's Large Hadron Collider project

    International Nuclear Information System (INIS)

    Fearnley, Tom A.

    1997-01-01

    The paper gives a brief overview of CERN's Large Hadron Collider (LHC) project. After an outline of the physics motivation, we describe the LHC machine, interaction rates, experimental challenges, and some important physics channels to be studied. Finally we discuss the four experiments planned at the LHC: ATLAS, CMS, ALICE and LHC-B

  13. Large area CMOS image sensors

    International Nuclear Information System (INIS)

    Turchetta, R; Guerrini, N; Sedgwick, I

    2011-01-01

    CMOS image sensors, also known as CMOS Active Pixel Sensors (APS) or Monolithic Active Pixel Sensors (MAPS), are today the dominant imaging devices. They are omnipresent in our daily life, as image sensors in cellular phones, web cams, digital cameras, ... In these applications, the pixels can be very small, in the micron range, and the sensors themselves tend to be limited in size. However, many scientific applications, like particle or X-ray detection, require large formats, often with large pixels, as well as other specific performance characteristics, like low noise, radiation hardness or very fast readout. The sensors are also required to be sensitive to a broad spectrum of radiation: photons from the silicon cut-off in the IR down to UV and X- and gamma-rays through the visible spectrum, as well as charged particles. This requirement calls for modifications to the substrate to provide optimized sensitivity. This paper will review existing CMOS image sensors, whose size can be as large as a single CMOS wafer, and analyse the technical requirements and specific challenges of large format CMOS image sensors.

  14. BILGO: Bilateral greedy optimization for large scale semidefinite programming

    KAUST Repository

    Hao, Zhifeng

    2013-10-03

    Many machine learning tasks (e.g. metric and manifold learning problems) can be formulated as convex semidefinite programs. To enable the application of these tasks at a large scale, scalability and computational efficiency are considered as desirable properties for a practical semidefinite programming algorithm. In this paper, we theoretically analyze a new bilateral greedy optimization (denoted BILGO) strategy in solving general semidefinite programs on large-scale datasets. Compared to existing methods, BILGO employs a bilateral search strategy during each optimization iteration. In such an iteration, the current semidefinite matrix solution is updated as a bilateral linear combination of the previous solution and a suitable rank-1 matrix, which can be efficiently computed from the leading eigenvector of the descent direction at this iteration. By optimizing for the coefficients of the bilateral combination, BILGO reduces the cost function in every iteration until the KKT conditions are fully satisfied; thus, it tends to converge to a global optimum. In fact, we prove that BILGO converges to the global optimal solution at a rate of O(1/k), where k is the iteration counter. The algorithm thus successfully combines the efficiency of conventional rank-1 update algorithms and the effectiveness of gradient descent. Moreover, BILGO can be easily extended to handle low rank constraints. To validate the effectiveness and efficiency of BILGO, we apply it to two important machine learning tasks, namely Mahalanobis metric learning and maximum variance unfolding. Extensive experimental results clearly demonstrate that BILGO can solve large-scale semidefinite programs efficiently.
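The bilateral rank-1 update can be sketched on a toy PSD least-squares problem; this illustrates the idea of combining the current iterate with a rank-1 term from the leading eigenvector of the descent direction, with a hypothetical objective, not the authors' implementation or their O(1/k) analysis:

```python
import numpy as np

def bilgo_step(X, A):
    """One bilateral update for the toy problem min ||X - A||_F^2 over PSD X.

    The new iterate is alpha*X + beta*(v v^T), where v is the leading
    eigenvector of the descent direction -grad f(X), and (alpha, beta)
    are chosen by least squares (a sketch of the BILGO idea only)."""
    G = 2.0 * (X - A)                      # gradient of ||X - A||_F^2
    _, V = np.linalg.eigh(-G)              # eigh sorts eigenvalues ascending
    v = V[:, -1]                           # leading eigenvector of -G
    R = np.outer(v, v)                     # rank-1 candidate
    # Best (alpha, beta) for alpha*X + beta*R in Frobenius norm:
    M = np.array([[np.sum(X * X), np.sum(X * R)],
                  [np.sum(X * R), np.sum(R * R)]])
    b = np.array([np.sum(A * X), np.sum(A * R)])
    alpha, beta = np.linalg.lstsq(M, b, rcond=None)[0]
    alpha, beta = max(alpha, 0.0), max(beta, 0.0)  # keep the iterate PSD
    return alpha * X + beta * R

A = np.diag([3.0, 1.0, 0.0])   # target matrix (PSD, so it is the optimum)
X = np.eye(3)                  # PSD starting point
for _ in range(20):
    X = bilgo_step(X, A)
# X approaches A while remaining PSD throughout, since each iterate is
# a nonnegative combination of PSD matrices.
```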

  15. Parallel Algorithm for Incremental Betweenness Centrality on Large Graphs

    KAUST Repository

    Jamour, Fuad Tarek

    2017-10-17

    Betweenness centrality quantifies the importance of nodes in a graph in many applications, including network analysis, community detection and identification of influential users. Typically, graphs in such applications evolve over time; thus, the computation of betweenness centrality should be performed incrementally. This is challenging because updating even a single edge may trigger the computation of all-pairs shortest paths in the entire graph. Existing approaches cannot scale to large graphs: they either require excessive memory (i.e., quadratic in the size of the input graph) or perform unnecessary computations, rendering them prohibitively slow. We propose iCentral, a novel incremental algorithm for computing betweenness centrality in evolving graphs. We decompose the graph into biconnected components and prove that processing can be localized within the affected components. iCentral is the first algorithm to support incremental betweenness centrality computation within a graph component. This is done efficiently, in linear space; consequently, iCentral scales to large graphs. We demonstrate with real datasets that the serial implementation of iCentral is up to 3.7 times faster than existing serial methods. Our parallel implementation, which scales to large graphs, is an order of magnitude faster than the state-of-the-art parallel algorithm, while using an order of magnitude less computational resources.
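The localization idea, that an edge update only matters inside the biconnected component containing it, can be illustrated with networkx (a sketch of the decomposition step only, not the iCentral algorithm; the graph is hypothetical):

```python
import networkx as nx

def affected_component(G, u, v):
    """Return the biconnected component touched by inserting edge (u, v).

    Illustrates the localization behind incremental betweenness: new
    shortest paths created by (u, v) stay within this component, so
    recomputation can be restricted to it (contributions of nodes
    outside are handled via the component tree in the real algorithm)."""
    H = G.copy()
    H.add_edge(u, v)
    for comp in nx.biconnected_components(H):
        if u in comp and v in comp:
            return comp
    return set()

# A path 0-1-2 attached to a cycle 2-3-4-5-2; add a chord (3, 5):
G = nx.Graph([(0, 1), (1, 2), (2, 3), (3, 4), (4, 5), (5, 2)])
comp = affected_component(G, 3, 5)
# Only the cycle {2, 3, 4, 5} is affected; the bridge edges 0-1 and 1-2
# form separate biconnected components.
```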

  16. BILGO: Bilateral greedy optimization for large scale semidefinite programming

    KAUST Repository

    Hao, Zhifeng; Yuan, Ganzhao; Ghanem, Bernard

    2013-01-01

    Many machine learning tasks (e.g. metric and manifold learning problems) can be formulated as convex semidefinite programs. To enable the application of these tasks at a large scale, scalability and computational efficiency are considered as desirable properties for a practical semidefinite programming algorithm. In this paper, we theoretically analyze a new bilateral greedy optimization (denoted BILGO) strategy in solving general semidefinite programs on large-scale datasets. Compared to existing methods, BILGO employs a bilateral search strategy during each optimization iteration. In such an iteration, the current semidefinite matrix solution is updated as a bilateral linear combination of the previous solution and a suitable rank-1 matrix, which can be efficiently computed from the leading eigenvector of the descent direction at this iteration. By optimizing for the coefficients of the bilateral combination, BILGO reduces the cost function in every iteration until the KKT conditions are fully satisfied; thus, it tends to converge to a global optimum. In fact, we prove that BILGO converges to the global optimal solution at a rate of O(1/k), where k is the iteration counter. The algorithm thus successfully combines the efficiency of conventional rank-1 update algorithms and the effectiveness of gradient descent. Moreover, BILGO can be easily extended to handle low rank constraints. To validate the effectiveness and efficiency of BILGO, we apply it to two important machine learning tasks, namely Mahalanobis metric learning and maximum variance unfolding. Extensive experimental results clearly demonstrate that BILGO can solve large-scale semidefinite programs efficiently.

  17. Utilization of Large Scale Surface Models for Detailed Visibility Analyses

    Science.gov (United States)

    Caha, J.; Kačmařík, M.

    2017-11-01

    This article demonstrates the utilization of large scale surface models with small spatial resolution and high accuracy, acquired from Unmanned Aerial Vehicle scanning, for visibility analyses. The importance of large scale data for visibility analyses on the local scale, where the detail of the surface model is the most defining factor, is described. The focus is not only on the classic Boolean visibility that is usually determined within GIS, but also on so-called extended viewsheds that aim to provide more information about visibility. The case study with examples of visibility analyses was performed on the river Opava, near the city of Ostrava (Czech Republic). The multiple Boolean viewshed analysis and the global horizon viewshed were calculated to determine the most prominent features and visibility barriers of the surface. In addition, an extended viewshed showing the angle difference above the local horizon, which describes the angular height of the target area above the barrier, is presented. The case study proved that large scale models are an appropriate data source for visibility analyses at the local level. The discussion summarizes possible future applications and further development directions of visibility analyses.
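The core of a Boolean viewshed computation is a line-of-sight test against intermediate terrain; here is a minimal 1-D sketch over a hypothetical height profile, not the Opava case-study data:

```python
import math

def line_of_sight(heights, observer_idx, observer_h=1.8):
    """Boolean visibility along a 1-D terrain profile: a cell is visible
    if its sightline slope from the observer exceeds the steepest slope
    seen so far (a minimal sketch of viewshed analysis on a surface model;
    2-D GIS viewsheds sweep such rays in all directions)."""
    ox = observer_idx
    oz = heights[ox] + observer_h  # eye level above the terrain
    visible = [False] * len(heights)
    visible[ox] = True
    for direction in (-1, 1):
        max_slope = -math.inf  # running local horizon
        x = ox + direction
        while 0 <= x < len(heights):
            slope = (heights[x] - oz) / abs(x - ox)
            if slope > max_slope:
                visible[x] = True
                max_slope = slope
            x += direction
    return visible

# A 5 m hill hides the cells behind it; a taller 12 m hill farther away
# rises above the nearer hill's horizon and becomes visible again:
profile = [0, 0, 5, 0, 0, 12, 0]
vis = line_of_sight(profile, 0)
```

The running `max_slope` plays the role of the "local horizon" of the extended viewshed: the angle difference between a cell's slope and this horizon is exactly the angular height above the barrier described in the abstract.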

  18. Endoscopically assisted enucleation of a large mandibular periapical cyst.

    Science.gov (United States)

    Nestal Zibo, Heleia; Miller, Ene

    2011-01-01

    Enucleation of large cysts in the jaws is an invasive method that may be associated with complications. Marsupialization is a less invasive alternative method, but it involves a prolonged and uncomfortable healing period. This study addresses a contemporary, less invasive surgical technique for treating large mandibular cysts. MATERIALS AND METHODS. A 48-year-old woman presented with a large mandibular periapical cyst involving the left parasymphysis, body, ramus and condylar neck, with involvement of the inferior alveolar nerve. The cystic lesion was enucleated using a 30° 4.0 mm endoscopic scope and endoscopic instruments through two small accesses: the ostectomy site of a previously performed marsupialization and the alveolus of the involved third molar, extracted at the time of enucleation of the cyst. RESULTS. The endoscopic scope provided good visualization of the whole cystic cavity, allowing the removal of any residual pathologic tissue and preservation of the integrity of the involved inferior alveolar nerve. The morbidity of the surgical procedure was greatly reduced. At a 6-month follow-up the patient did not present any symptoms of inflammation, and a panoramic X-ray showed good bone repair and remodelling. CONCLUSIONS. Endoscopically assisted enucleation proved to be an effective method of treating a large mandibular cyst, providing total enucleation with a minimally invasive technique.

  19. Large core plastic planar optical splitter fabricated by 3D printing technology

    Science.gov (United States)

    Prajzler, Václav; Kulha, Pavel; Knietel, Marian; Enser, Herbert

    2017-10-01

    We report on the design, fabrication and optical properties of a large-core multimode polymer optical splitter, fabricated by filling the core polymer into a substrate made by 3D printing technology. The splitter was designed by the beam propagation method and is intended for assembly with large-core waveguide fibers of 735 μm diameter. The waveguide core layers were made of an optically clear liquid adhesive, and VeroClear polymer was used for the substrate and cover layers. Measurement of optical losses showed that the insertion loss was lower than 6.8 dB in the visible spectrum.
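Insertion loss is simply the logarithmic ratio of input to output power; the power values below are hypothetical, chosen to illustrate the reported sub-6.8 dB figure:

```python
import math

def insertion_loss_db(p_in_mw, p_out_mw):
    """Insertion loss in decibels: 10 * log10(P_in / P_out).
    For a 1x2 splitter, an ideal lossless device would already show
    about 3 dB per branch from the power split alone."""
    return 10.0 * math.log10(p_in_mw / p_out_mw)

# Hypothetical measurement: 1.0 mW launched, 0.21 mW out of one branch:
loss = insertion_loss_db(1.0, 0.21)  # ≈ 6.78 dB, i.e. below 6.8 dB
```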

  20. The large hadron collider project

    International Nuclear Information System (INIS)

    Maiani, L.

    1999-01-01

    Knowledge of the fundamental constituents of matter has greatly advanced over the last decades. The standard theory of fundamental interactions presents us with a theoretically sound picture, which describes with great accuracy known physical phenomena on the most diverse energy and distance scales. These range from 10^-16 cm, inside the nucleons, up to large-scale astrophysical bodies, including the early Universe at some nanosecond after the Big Bang and temperatures of the order of 10^2 GeV. The picture is not yet complete, however, as we lack the observation of the Higgs boson, predicted in the 100-500 GeV range - a particle associated with the generation of particle masses and with the quantum fluctuations in the primordial Universe. In addition, the standard theory is expected to undergo a change of regime in the 10^3 GeV region, with the appearance of new families of particles, most likely associated with the onset of a new symmetry (supersymmetry). In 1994, the CERN Council approved the construction of the Large Hadron Collider (LHC), a proton-proton collider of a new design to be installed in the existing LEP tunnel, with an energy of 7 TeV per beam and an extremely large luminosity of ≈10^34 cm^-2 s^-1. Construction was started in 1996, with the additional support of the US, Japan, Russia, Canada and other European countries, making the LHC a truly global project, the first one in particle physics. After a short review of the physics scenario, I report on the present status of the LHC construction. Special attention is given to technological problems such as the realization of the superconducting dipoles, following an extensive R and D program with European industries. The construction of the large LHC detectors has required a vast R and D program by a large international community, to overcome the problems posed by the complexity of the collisions and by the large luminosity of the machine. (orig.)