WorldWideScience

Sample records for sources rigorous tests

  1. Long persistence of rigor mortis at constant low temperature.

    Science.gov (United States)

    Varetto, Lorenzo; Curto, Ombretta

    2005-01-06

We studied the persistence of rigor mortis by using physical manipulation. We tested the mobility of the knee on 146 corpses kept under refrigeration at Torino's city mortuary at a constant temperature of +4 degrees C. We found a persistence of complete rigor lasting for 10 days in all the cadavers we kept under observation; in one case, rigor lasted for 16 days. Between the 11th and the 17th days, a progressively increasing number of corpses showed a change from complete into partial rigor (characterized by partial bending of the articulation). After the 17th day, all the remaining corpses showed partial rigor, and in the two cadavers that were kept under observation "à outrance" we found that the absolute resolution of rigor mortis occurred on the 28th day. Our results prove that it is possible to find a persistence of rigor mortis that is much longer than expected when environmental conditions resemble average outdoor winter temperatures in temperate zones. This must therefore be considered when a corpse is found in such environmental conditions, so that the long persistence of rigor mortis does not mislead the estimation of the time of death.

  2. Scientific rigor through videogames.

    Science.gov (United States)

    Treuille, Adrien; Das, Rhiju

    2014-11-01

    Hypothesis-driven experimentation - the scientific method - can be subverted by fraud, irreproducibility, and lack of rigorous predictive tests. A robust solution to these problems may be the 'massive open laboratory' model, recently embodied in the internet-scale videogame EteRNA. Deploying similar platforms throughout biology could enforce the scientific method more broadly. Copyright © 2014 Elsevier Ltd. All rights reserved.

  3. Putrefactive rigor: apparent rigor mortis due to gas distension.

    Science.gov (United States)

    Gill, James R; Landi, Kristen

    2011-09-01

    Artifacts due to decomposition may cause confusion for the initial death investigator, leading to an incorrect suspicion of foul play. Putrefaction is a microorganism-driven process that results in foul odor, skin discoloration, purge, and bloating. Various decompositional gases including methane, hydrogen sulfide, carbon dioxide, and hydrogen will cause the body to bloat. We describe 3 instances of putrefactive gas distension (bloating) that produced the appearance of inappropriate rigor, so-called putrefactive rigor. These gases may distend the body to an extent that the extremities extend and lose contact with their underlying support surface. The medicolegal investigator must recognize that this is not true rigor mortis and the body was not necessarily moved after death for this gravity-defying position to occur.

  4. A rigorous test for a new conceptual model for collisions

    International Nuclear Information System (INIS)

    Peixoto, E.M.A.; Mu-Tao, L.

    1979-01-01

A rigorous theoretical foundation for the previously proposed model is formulated and applied to electron scattering by H2 in the gas phase. A rigorous treatment of the interaction potential between the incident electron and the hydrogen molecule is carried out to calculate differential cross sections for 1 keV electrons, using the Glauber approximation and Wang's molecular wave function for the ground electronic state of H2. Moreover, it is shown for the first time that, when adequately done, the omission of two-center terms does not adversely influence the results of molecular calculations. It is shown that the new model is far superior to the Independent Atom Model (or Independent Particle Model). The accuracy and simplicity of the new model suggest that it may be fruitfully applied to the description of other collision phenomena (e.g., in molecular beam experiments and nuclear physics). A new technique is presented for calculations involving two-center integrals within the framework of the Glauber approximation for scattering. (Author) [pt

  5. A new look at the statistical assessment of approximate and rigorous methods for the estimation of stabilized formation temperatures in geothermal and petroleum wells

    International Nuclear Information System (INIS)

    Espinoza-Ojeda, O M; Santoyo, E; Andaverde, J

    2011-01-01

Approximate and rigorous solutions of seven heat transfer models were statistically examined, for the first time, to estimate stabilized formation temperatures (SFT) of geothermal and petroleum boreholes. Constant linear and cylindrical heat source models were used to describe the heat flow (either conductive or conductive/convective) involved during borehole drilling. A comprehensive statistical assessment of the major error sources associated with the use of these models was carried out. The mathematical methods (based on approximate and rigorous solutions of heat transfer models) were thoroughly examined by using four statistical analyses: (i) the use of linear and quadratic regression models to infer the SFT; (ii) the application of statistical tests of linearity to evaluate the actual relationship between bottom-hole temperatures and time function data for each selected method; (iii) the comparative analysis of SFT estimates between the approximate and rigorous predictions of each analytical method, using a β ratio parameter to evaluate the similarity of both solutions; and (iv) the evaluation of accuracy in each method using statistical tests of significance and deviation percentages between 'true' formation temperatures and SFT estimates (predicted from approximate and rigorous solutions). The present study also enabled us to determine the sensitivity parameters that should be considered for a reliable calculation of SFT, as well as to define the main physical and mathematical constraints under which the approximate and rigorous methods could provide consistent SFT estimates.
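
    The regression-based inference described in analysis (i) can be illustrated with the classic Horner-plot method, one widely used approximate SFT estimator of this family. The sketch below is a minimal illustration with hypothetical shut-in data; the function name and numbers are not taken from the paper.

```python
# Minimal sketch (not the paper's code): Horner-plot estimation of a
# stabilized formation temperature (SFT). Bottom-hole temperature (BHT) is
# regressed against the Horner time function ln((tc + dt)/dt), where tc is
# the circulation time and dt the shut-in time; the intercept (x -> 0,
# i.e. infinite shut-in time) is the SFT estimate.
import numpy as np

def horner_sft(bht, tc, dt):
    x = np.log((tc + dt) / dt)            # Horner time function
    slope, intercept = np.polyfit(x, bht, 1)
    return intercept                      # extrapolated SFT

# Hypothetical shut-in temperature build-up data (hours, degrees C)
tc = 5.0
dt = np.array([6.0, 12.0, 18.0, 24.0, 36.0])
bht = np.array([95.2, 101.8, 105.1, 107.3, 110.0])

print(f"Estimated SFT: {horner_sft(bht, tc, dt):.1f} degrees C")
```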

  6. The Relationship between Project-Based Learning and Rigor in STEM-Focused High Schools

    Science.gov (United States)

    Edmunds, Julie; Arshavsky, Nina; Glennie, Elizabeth; Charles, Karen; Rice, Olivia

    2016-01-01

    Project-based learning (PjBL) is an approach often favored in STEM classrooms, yet some studies have shown that teachers struggle to implement it with academic rigor. This paper explores the relationship between PjBL and rigor in the classrooms of ten STEM-oriented high schools. Utilizing three different data sources reflecting three different…

  7. A Rigorous Test of the Fit of the Circumplex Model to Big Five Personality Data: Theoretical and Methodological Issues and Two Large Sample Empirical Tests.

    Science.gov (United States)

    DeGeest, David Scott; Schmidt, Frank

    2015-01-01

    Our objective was to apply the rigorous test developed by Browne (1992) to determine whether the circumplex model fits Big Five personality data. This test has yet to be applied to personality data. Another objective was to determine whether blended items explained correlations among the Big Five traits. We used two working adult samples, the Eugene-Springfield Community Sample and the Professional Worker Career Experience Survey. Fit to the circumplex was tested via Browne's (1992) procedure. Circumplexes were graphed to identify items with loadings on multiple traits (blended items), and to determine whether removing these items changed five-factor model (FFM) trait intercorrelations. In both samples, the circumplex structure fit the FFM traits well. Each sample had items with dual-factor loadings (8 items in the first sample, 21 in the second). Removing blended items had little effect on construct-level intercorrelations among FFM traits. We conclude that rigorous tests show that the fit of personality data to the circumplex model is good. This finding means the circumplex model is competitive with the factor model in understanding the organization of personality traits. The circumplex structure also provides a theoretically and empirically sound rationale for evaluating intercorrelations among FFM traits. Even after eliminating blended items, FFM personality traits remained correlated.

  8. Experimental evaluation of rigor mortis. V. Effect of various temperatures on the evolution of rigor mortis.

    Science.gov (United States)

    Krompecher, T

    1981-01-01

Objective measurements were carried out to study the evolution of rigor mortis in rats at various temperatures. Our experiments showed that: (1) at 6 degrees C rigor mortis reaches full development between 48 and 60 hours post mortem, and is resolved at 168 hours post mortem; (2) at 24 degrees C rigor mortis reaches full development at 5 hours post mortem, and is resolved at 16 hours post mortem; (3) at 37 degrees C rigor mortis reaches full development at 3 hours post mortem, and is resolved at 6 hours post mortem; (4) the intensity of rigor mortis grows with increase in temperature (difference between values obtained at 24 degrees C and 37 degrees C); and (5) at 6 degrees C a "cold rigidity" was found, in addition to and independent of rigor mortis.

  9. The Rigor Mortis of Education: Rigor Is Required in a Dying Educational System

    Science.gov (United States)

    Mixon, Jason; Stuart, Jerry

    2009-01-01

In an effort to answer the "Educational Call to Arms", our national public schools have turned to Advanced Placement (AP) courses as the predominant vehicle used to address the lack of academic rigor in our public high schools. Advanced Placement is believed by many to provide students with the rigor and work ethic necessary to…

  10. Realizing rigor in the mathematics classroom

    CERN Document Server

    Hull, Ted H (Henry); Balka, Don S

    2014-01-01

Rigor put within reach! Rigor: The Common Core has made it policy, and this first-of-its-kind guide takes math teachers and leaders through the process of making it reality. Using the Proficiency Matrix as a framework, the authors offer proven strategies and practical tools for successful implementation of the CCSS mathematical practices, with rigor as a central objective. You'll learn how to: define rigor in the context of each mathematical practice; identify and overcome potential issues, including differentiating instruction and using data

  11. A rigorous approach to facilitate and guarantee the correctness of the genetic testing management in human genome information systems.

    Science.gov (United States)

    Araújo, Luciano V; Malkowski, Simon; Braghetto, Kelly R; Passos-Bueno, Maria R; Zatz, Mayana; Pu, Calton; Ferreira, João E

    2011-12-22

Recent medical and biological technology advances have stimulated the development of new testing systems that have been providing huge, varied amounts of molecular and clinical data. Growing data volumes pose significant challenges for information processing systems in research centers. Additionally, the routines of a genomics laboratory are typically characterized by high parallelism in testing and constant procedure changes. This paper describes a formal approach to address this challenge through the implementation of a genetic testing management system applied to a human genome laboratory. We introduce the Human Genome Research Center Information System (CEGH) in Brazil, a system that is able to support constant changes in human genome testing and can provide patients with updated results based on the most recent and validated genetic knowledge. Our approach uses a common repository for process planning to ensure reusability, specification, instantiation, monitoring, and execution of processes, which are defined using a relational database and rigorous control flow specifications based on process algebra (ACP). The main difference between our approach and related works is that we were able to join two important aspects: 1) process scalability achieved through relational database implementation, and 2) correctness of processes using process algebra. Furthermore, the software allows end users to define genetic testing without requiring any knowledge about business process notation or process algebra. This paper presents the CEGH information system, a Laboratory Information Management System (LIMS) based on a formal framework to support genetic testing management for Mendelian disorder studies. We have proved the feasibility and shown the usability benefits of a rigorous approach that is able to specify, validate, and perform genetic testing using easy-to-use end-user interfaces.

  12. Increased scientific rigor will improve reliability of research and effectiveness of management

    Science.gov (United States)

    Sells, Sarah N.; Bassing, Sarah B.; Barker, Kristin J.; Forshee, Shannon C.; Keever, Allison; Goerz, James W.; Mitchell, Michael S.

    2018-01-01

    Rigorous science that produces reliable knowledge is critical to wildlife management because it increases accurate understanding of the natural world and informs management decisions effectively. Application of a rigorous scientific method based on hypothesis testing minimizes unreliable knowledge produced by research. To evaluate the prevalence of scientific rigor in wildlife research, we examined 24 issues of the Journal of Wildlife Management from August 2013 through July 2016. We found 43.9% of studies did not state or imply a priori hypotheses, which are necessary to produce reliable knowledge. We posit that this is due, at least in part, to a lack of common understanding of what rigorous science entails, how it produces more reliable knowledge than other forms of interpreting observations, and how research should be designed to maximize inferential strength and usefulness of application. Current primary literature does not provide succinct explanations of the logic behind a rigorous scientific method or readily applicable guidance for employing it, particularly in wildlife biology; we therefore synthesized an overview of the history, philosophy, and logic that define scientific rigor for biological studies. A rigorous scientific method includes 1) generating a research question from theory and prior observations, 2) developing hypotheses (i.e., plausible biological answers to the question), 3) formulating predictions (i.e., facts that must be true if the hypothesis is true), 4) designing and implementing research to collect data potentially consistent with predictions, 5) evaluating whether predictions are consistent with collected data, and 6) drawing inferences based on the evaluation. Explicitly testing a priori hypotheses reduces overall uncertainty by reducing the number of plausible biological explanations to only those that are logically well supported. Such research also draws inferences that are robust to idiosyncratic observations and

  13. Experimental evaluation of rigor mortis. VI. Effect of various causes of death on the evolution of rigor mortis.

    Science.gov (United States)

    Krompecher, T; Bergerioux, C; Brandt-Casadevall, C; Gujer, H R

    1983-07-01

    The evolution of rigor mortis was studied in cases of nitrogen asphyxia, drowning and strangulation, as well as in fatal intoxications due to strychnine, carbon monoxide and curariform drugs, using a modified method of measurement. Our experiments demonstrated that: (1) Strychnine intoxication hastens the onset and passing of rigor mortis. (2) CO intoxication delays the resolution of rigor mortis. (3) The intensity of rigor may vary depending upon the cause of death. (4) If the stage of rigidity is to be used to estimate the time of death, it is necessary: (a) to perform a succession of objective measurements of rigor mortis intensity; and (b) to verify the eventual presence of factors that could play a role in the modification of its development.

  14. Rigorous Science: a How-To Guide

    Directory of Open Access Journals (Sweden)

    Arturo Casadevall

    2016-11-01

Proposals to improve the reproducibility of biomedical research have emphasized scientific rigor. Although the word “rigor” is widely used, there has been little specific discussion as to what it means and how it can be achieved. We suggest that scientific rigor combines elements of mathematics, logic, philosophy, and ethics. We propose a framework for rigor that includes redundant experimental design, sound statistical analysis, recognition of error, avoidance of logical fallacies, and intellectual honesty. These elements lead to five actionable recommendations for research education.

  15. Large source test stand for H⁻(D⁻) ion source

    International Nuclear Information System (INIS)

    Larson, R.; McKenzie-Wilson, R.

    1981-01-01

The Brookhaven National Laboratory Neutral Beam Group has constructed a large source test stand for testing of the various source modules under development. The first objective of the BNL program is to develop a source module capable of delivering 10 A of H⁻ (D⁻) at 25 kV operating in the steady-state mode with satisfactory gas and power efficiency. The large source test stand contains gas supply and vacuum pumping systems, source cooling systems, magnet power supplies and magnet cooling systems, two arc power supplies rated at 25 kW and 50 kW, a large battery-driven power supply and an extractor electrode power supply. Figure 1 is a front view of the vacuum vessel showing the control racks with the 36'' vacuum valves and refrigerated baffles mounted behind. Figure 2 shows the rear view of the vessel with a BNL Mk V magnetron source mounted in the source aperture and also shows the cooled magnet coils. Currently two types of sources are under test: a large magnetron source and a hollow cathode discharge source.

  16. Study Design Rigor in Animal-Experimental Research Published in Anesthesia Journals.

    Science.gov (United States)

    Hoerauf, Janine M; Moss, Angela F; Fernandez-Bustamante, Ana; Bartels, Karsten

    2018-01-01

Lack of reproducibility of preclinical studies has been identified as an impediment for translation of basic mechanistic research into effective clinical therapies. Indeed, the National Institutes of Health has revised its grant application process to require more rigorous study design, including sample size calculations, blinding procedures, and randomization steps. We hypothesized that the reporting of such metrics of study design rigor has increased over time for animal-experimental research published in anesthesia journals. PubMed was searched for animal-experimental studies published in 2005, 2010, and 2015 in primarily English-language anesthesia journals. A total of 1466 publications were graded on the performance of sample size estimation, randomization, and blinding. The Cochran-Armitage test was used to assess linear trends over time for the primary outcome of whether or not a metric was reported. Interrater agreement for each of the 3 metrics (power, randomization, and blinding) was assessed using the weighted κ coefficient in a 10% random sample of articles rerated by a second investigator blinded to the ratings of the first investigator. A total of 1466 manuscripts were analyzed. Reporting for all 3 metrics of experimental design rigor increased over time (2005 to 2010 to 2015): for power analysis, from 5% (27/516), to 12% (59/485), to 17% (77/465); for randomization, from 41% (213/516), to 50% (243/485), to 54% (253/465); and for blinding, from 26% (135/516), to 38% (186/485), to 47% (217/465). The weighted κ coefficients and 98.3% confidence intervals indicate almost perfect agreement between the 2 raters beyond that which occurs by chance alone (power, 0.93 [0.85, 1.0]; randomization, 0.91 [0.85, 0.98]; and blinding, 0.90 [0.84, 0.96]). Our hypothesis that reported metrics of rigor in animal-experimental studies in anesthesia journals have increased during the past decade was confirmed. More consistent reporting, or explicit justification for absence
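
    As an illustration of the trend analysis named above, the sketch below applies the standard Cochran-Armitage test for trend to the reported power-analysis counts (27/516, 59/485, 77/465). It is a generic textbook implementation using the record's summary counts, not the authors' analysis code.

```python
# Minimal sketch of a Cochran-Armitage trend test applied to the reported
# power-analysis counts (27/516, 59/485, 77/465 for 2005, 2010, 2015).
# Standard textbook statistic; not the authors' analysis code.
import numpy as np
from scipy.stats import norm

def cochran_armitage(successes, totals, scores):
    r = np.asarray(successes, float)
    n = np.asarray(totals, float)
    t = np.asarray(scores, float)
    N, p_bar = n.sum(), r.sum() / n.sum()
    stat = np.sum(t * (r - n * p_bar))                       # trend statistic
    var = p_bar * (1 - p_bar) * (np.sum(t**2 * n) - np.sum(t * n)**2 / N)
    z = stat / np.sqrt(var)
    return z, 2 * norm.sf(abs(z))                            # two-sided p-value

z, p = cochran_armitage([27, 59, 77], [516, 485, 465], [2005, 2010, 2015])
print(f"z = {z:.2f}, two-sided p = {p:.2g}")
```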

  17. Experimental evaluation of rigor mortis. VII. Effect of ante- and post-mortem electrocution on the evolution of rigor mortis.

    Science.gov (United States)

    Krompecher, T; Bergerioux, C

    1988-01-01

The influence of electrocution on the evolution of rigor mortis was studied in rats. Our experiments showed that: (1) Electrocution hastens the onset of rigor mortis. After an electrocution of 90 s, complete rigor develops as early as 1 h post-mortem (p.m.), compared with 5 h p.m. in the controls. (2) Electrocution hastens the passing of rigor mortis. After an electrocution of 90 s, the first significant decrease occurs at 3 h p.m. (8 h p.m. in the controls). (3) These modifications in rigor mortis evolution are less pronounced in the limbs not directly touched by the electric current. (4) In cases of post-mortem electrocution, the changes are slightly less pronounced, the resistance is higher and the absorbed energy is lower compared with the ante-mortem electrocution cases. The results are complemented by two practical observations on human electrocution cases.

  18. A Rigorous Temperature-Dependent Stochastic Modelling and Testing for MEMS-Based Inertial Sensor Errors

    Directory of Open Access Journals (Sweden)

    Spiros Pagiatakis

    2009-10-01

In this paper, we examine the effect of changing the temperature points on MEMS-based inertial sensor random error. We collect static data under different temperature points using a MEMS-based inertial sensor mounted inside a thermal chamber. Rigorous stochastic models, namely Autoregressive-based Gauss-Markov (AR-based GM) models, are developed to describe the random error behaviour. The proposed AR-based GM model is initially applied to short stationary inertial data to develop the stochastic model parameters (correlation times). It is shown that the stochastic model parameters of a MEMS-based inertial unit, namely the ADIS16364, are temperature dependent. In addition, field kinematic test data collected at about 17 °C are used to test the performance of the stochastic models at different temperature points in the filtering stage using the Unscented Kalman Filter (UKF). It is shown that the stochastic model developed at 20 °C provides a more accurate inertial navigation solution than the ones obtained from the stochastic models developed at −40 °C, −20 °C, 0 °C, +40 °C, and +60 °C. The temperature dependence of the stochastic model is significant and should be considered at all times to obtain optimal navigation solution for MEMS-based INS/GPS integration.

  19. A Rigorous Temperature-Dependent Stochastic Modelling and Testing for MEMS-Based Inertial Sensor Errors.

    Science.gov (United States)

    El-Diasty, Mohammed; Pagiatakis, Spiros

    2009-01-01

    In this paper, we examine the effect of changing the temperature points on MEMS-based inertial sensor random error. We collect static data under different temperature points using a MEMS-based inertial sensor mounted inside a thermal chamber. Rigorous stochastic models, namely Autoregressive-based Gauss-Markov (AR-based GM) models are developed to describe the random error behaviour. The proposed AR-based GM model is initially applied to short stationary inertial data to develop the stochastic model parameters (correlation times). It is shown that the stochastic model parameters of a MEMS-based inertial unit, namely the ADIS16364, are temperature dependent. In addition, field kinematic test data collected at about 17 °C are used to test the performance of the stochastic models at different temperature points in the filtering stage using Unscented Kalman Filter (UKF). It is shown that the stochastic model developed at 20 °C provides a more accurate inertial navigation solution than the ones obtained from the stochastic models developed at -40 °C, -20 °C, 0 °C, +40 °C, and +60 °C. The temperature dependence of the stochastic model is significant and should be considered at all times to obtain optimal navigation solution for MEMS-based INS/GPS integration.
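
    Both of these records describe Gauss-Markov stochastic models whose correlation times are estimated from static sensor data. The sketch below shows, for a first-order GM (AR(1)) model only and with synthetic data, how a correlation time can be recovered from the lag-1 autocorrelation; it is an illustrative simplification under stated assumptions, not the authors' higher-order AR modelling.

```python
# Minimal sketch: correlation time of a first-order Gauss-Markov (AR(1))
# process estimated from static sensor data. For sampling interval dt,
# phi = exp(-dt/tau), so tau = -dt / ln(phi). Synthetic data; the paper
# fits higher-order AR-based GM models at each temperature point.
import numpy as np
from scipy.signal import lfilter

def gm_correlation_time(x, dt):
    x = np.asarray(x, float) - np.mean(x)
    phi = np.dot(x[:-1], x[1:]) / np.dot(x[:-1], x[:-1])   # lag-1 AR coefficient
    return -dt / np.log(phi)

rng = np.random.default_rng(1)
dt, tau_true = 0.01, 50.0                 # 100 Hz sampling, 50 s correlation time
phi = np.exp(-dt / tau_true)
w = rng.standard_normal(500_000)
x = lfilter([np.sqrt(1.0 - phi**2)], [1.0, -phi], w)       # simulated GM noise

print(f"estimated correlation time ~ {gm_correlation_time(x, dt):.0f} s "
      f"(true value {tau_true} s)")
```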

  20. Development of rigor mortis is not affected by muscle volume.

    Science.gov (United States)

    Kobayashi, M; Ikegaya, H; Takase, I; Hatanaka, K; Sakurada, K; Iwase, H

    2001-04-01

    There is a hypothesis suggesting that rigor mortis progresses more rapidly in small muscles than in large muscles. We measured rigor mortis as tension determined isometrically in rat musculus erector spinae that had been cut into muscle bundles of various volumes. The muscle volume did not influence either the progress or the resolution of rigor mortis, which contradicts the hypothesis. Differences in pre-rigor load on the muscles influenced the onset and resolution of rigor mortis in a few pairs of samples, but did not influence the time taken for rigor mortis to reach its full extent after death. Moreover, the progress of rigor mortis in this muscle was biphasic; this may reflect the early rigor of red muscle fibres and the late rigor of white muscle fibres.

  1. Increasing rigor in NMR-based metabolomics through validated and open source tools.

    Science.gov (United States)

    Eghbalnia, Hamid R; Romero, Pedro R; Westler, William M; Baskaran, Kumaran; Ulrich, Eldon L; Markley, John L

    2017-02-01

The metabolome, the collection of small molecules associated with an organism, is a growing subject of inquiry, with the data utilized for data-intensive systems biology, disease diagnostics, biomarker discovery, and the broader characterization of small molecules in mixtures. Owing to their close proximity to the functional endpoints that govern an organism's phenotype, metabolites are highly informative about functional states. The field of metabolomics identifies and quantifies endogenous and exogenous metabolites in biological samples. Information acquired from nuclear magnetic resonance (NMR) spectroscopy, mass spectrometry (MS), and the published literature, as processed by statistical approaches, is driving increasingly wider applications of metabolomics. This review focuses on the role of databases and software tools in advancing the rigor, robustness, reproducibility, and validation of metabolomics studies. Copyright © 2016. Published by Elsevier Ltd.

  2. Studying the co-evolution of production and test code in open source and industrial developer test processes through repository mining

    NARCIS (Netherlands)

    Zaidman, A.; Van Rompaey, B.; Van Deursen, A.; Demeyer, S.

    2010-01-01

    Many software production processes advocate rigorous development testing alongside functional code writing, which implies that both test code and production code should co-evolve. To gain insight in the nature of this co-evolution, this paper proposes three views (realized by a tool called TeMo)

  3. A case of instantaneous rigor?

    Science.gov (United States)

    Pirch, J; Schulz, Y; Klintschar, M

    2013-09-01

The question of whether instantaneous rigor mortis (IR), the hypothetical sudden occurrence of stiffening of the muscles upon death, actually exists has been controversially debated over the last 150 years. While modern German forensic literature rejects this concept, the contemporary British literature is more willing to embrace it. We present the case of a young woman who suffered from diabetes and who was found dead in an upright standing position with back and shoulders leaning against a punchbag and a cupboard. Rigor mortis was fully established, and livor mortis was strong and consistent with the position in which the body was found. After autopsy and toxicological analysis, it was concluded that death most probably occurred due to a ketoacidotic coma, with markedly increased values of glucose and lactate in the cerebrospinal fluid as well as acetone in blood and urine. Whereas the position of the body is most unusual, a detailed analysis revealed that it is a stable position even without rigor mortis. Therefore, this case does not further support the controversial concept of IR.

  4. Mathematical Rigor in Introductory Physics

    Science.gov (United States)

    Vandyke, Michael; Bassichis, William

    2011-10-01

    Calculus-based introductory physics courses intended for future engineers and physicists are often designed and taught in the same fashion as those intended for students of other disciplines. A more mathematically rigorous curriculum should be more appropriate and, ultimately, more beneficial for the student in his or her future coursework. This work investigates the effects of mathematical rigor on student understanding of introductory mechanics. Using a series of diagnostic tools in conjunction with individual student course performance, a statistical analysis will be performed to examine student learning of introductory mechanics and its relation to student understanding of the underlying calculus.

  5. "Rigor mortis" in a live patient.

    Science.gov (United States)

    Chakravarthy, Murali

    2010-03-01

Rigor mortis is conventionally a postmortem change. Its occurrence suggests that death occurred at least a few hours earlier. The authors report a case of "rigor mortis" in a live patient after cardiac surgery. The factors that may have predisposed to such premortem muscle stiffening in the reported patient are intense low-cardiac-output status, the use of unusually high doses of inotropic and vasopressor agents, and likely sepsis. Such an event may be of importance when determining the time of death in individuals such as the one described in this report. It may also suggest a requirement for careful examination of patients with muscle stiffening prior to the declaration of death. This report is being published to point out the likely controversies that might arise out of muscle stiffening, which should not always be termed rigor mortis and/or postmortem.

  6. Classroom Talk for Rigorous Reading Comprehension Instruction

    Science.gov (United States)

    Wolf, Mikyung Kim; Crosson, Amy C.; Resnick, Lauren B.

    2004-01-01

    This study examined the quality of classroom talk and its relation to academic rigor in reading-comprehension lessons. Additionally, the study aimed to characterize effective questions to support rigorous reading comprehension lessons. The data for this study included 21 reading-comprehension lessons in several elementary and middle schools from…

  7. Fast and Rigorous Assignment Algorithm Multiple Preference and Calculation

    Directory of Open Access Journals (Sweden)

    Ümit Çiftçi

    2010-03-01

The goal of this paper is to develop an algorithm that evaluates students and then places them depending on their desired choices according to dependent preferences. The developed algorithm is also used to implement software. The success and accuracy of the software, as well as of the algorithm, are tested by applying it to the ability test at Beykent University. This ability test is repeated several times in order to fill all available places in the Fine Art Faculty departments in every academic year. It has been shown that this algorithm is very fast and rigorous after application in the 2008-2009 and 2009-2010 academic years. Key words: assignment algorithm, student placement, ability test
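
    To make the placement problem concrete, the sketch below implements one common scheme of this kind: students are processed in descending score order and each takes a seat in the highest-ranked department that still has capacity. This is a hedged illustration with invented names and capacities, not the paper's specific algorithm for dependent preferences.

```python
# Minimal sketch of a score-ordered placement scheme with ranked preferences
# and department capacities (serial-dictatorship style). Illustration of the
# general problem only; not the paper's algorithm for dependent preferences.
from dataclasses import dataclass

@dataclass
class Student:
    name: str
    score: float          # ability-test score
    prefs: list           # department names, most preferred first
    placed: str = None

def assign(students, capacity):
    remaining = dict(capacity)                          # seats left per department
    for s in sorted(students, key=lambda s: s.score, reverse=True):
        for dept in s.prefs:
            if remaining.get(dept, 0) > 0:
                s.placed = dept
                remaining[dept] -= 1
                break
    return students

students = [
    Student("A", 88.5, ["Painting", "Sculpture"]),
    Student("B", 92.0, ["Painting", "Graphics"]),
    Student("C", 75.0, ["Painting", "Graphics"]),
]
for s in assign(students, {"Painting": 1, "Graphics": 1, "Sculpture": 2}):
    print(s.name, "->", s.placed)
```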

  8. Supersymmetry and the Parisi-Sourlas dimensional reduction: A rigorous proof

    International Nuclear Information System (INIS)

    Klein, A.; Landau, L.J.; Perez, J.F.

    1984-01-01

    Functional integrals that are formally related to the average correlation functions of a classical field theory in the presence of random external sources are given a rigorous meaning. Their dimensional reduction to the Schwinger functions of the corresponding quantum field theory in two fewer dimensions is proven. This is done by reexpressing those functional integrals as expectations of a supersymmetric field theory. The Parisi-Sourlas dimensional reduction of a supersymmetric field theory to a usual quantum field theory in two fewer dimensions is proven. (orig.)

  9. Experimental evaluation of rigor mortis. III. Comparative study of the evolution of rigor mortis in different sized muscle groups in rats.

    Science.gov (United States)

    Krompecher, T; Fryc, O

    1978-01-01

The use of new methods and an appropriate apparatus has allowed us to make successive measurements of rigor mortis and to study its evolution in the rat. By a comparative examination of the front and hind limbs, we determined the following: (1) The muscular mass of the hind limbs is 2.89 times greater than that of the front limbs. (2) In the initial phase rigor mortis is more pronounced in the front limbs. (3) The front and hind limbs reach maximum rigor mortis at the same time, and this state is maintained for 2 hours. (4) Resolution of rigor mortis is accelerated in the front limbs during the initial phase, but both front and hind limbs reach complete resolution at the same time.

  10. [Experimental study of restiffening of the rigor mortis].

    Science.gov (United States)

    Wang, X; Li, M; Liao, Z G; Yi, X F; Peng, X M

    2001-11-01

To observe changes in sarcomere length in the rat during restiffening, we measured the sarcomere length of the quadriceps in 40 rats under different conditions by scanning electron microscopy. The sarcomere length in unbroken rigor mortis is clearly shorter than that after restiffening. Sarcomere length is negatively correlated with the intensity of rigor mortis. Measuring sarcomere length can determine the intensity of rigor mortis and provide evidence for the estimation of time since death.

  11. [Rigor mortis -- a definite sign of death?].

    Science.gov (United States)

    Heller, A R; Müller, M P; Frank, M D; Dressler, J

    2005-04-01

In recent years there has been an ongoing controversial debate in Germany regarding the quality of the coroner's inquest and the declaration of death by physicians. We report the case of a 90-year-old woman who was found an unknown time after a suicide attempt with benzodiazepines. The examination of the patient showed livores (mortis?) on the left forearm and left lower leg. Moreover, rigor (mortis?) of the left arm was apparent, which prevented arm flexion and extension. The hypothermic patient with insufficient respiration was intubated and mechanically ventilated. Chest compressions were not performed, because central pulses were (hardly) palpable and a sinus bradycardia of 45/min (second-degree AV block and isolated premature ventricular complexes) was present. After placement of an intravenous line (17 G, external jugular vein) the hemodynamic situation was stabilized with intermittent boluses of epinephrine and with sodium bicarbonate. With improved circulation, livores and rigor disappeared. In the present case a minimal central circulation was noted, which could be stabilized, despite the presence of apparently certain signs of death (livores and rigor mortis). Considering the finding of abrogated peripheral perfusion (livores), we postulate a centripetal collapse of glycogen and ATP supply in the patient's left arm (rigor), which was restored after resuscitation and reperfusion. Thus, it appears that livores and rigor are not sensitive enough to exclude a vita minima, in particular in hypothermic patients with intoxications. Consequently, a careful ABC check should be performed even in the presence of apparently certain signs of death, to avoid underdiagnosing a vita minima. Additional ECG monitoring is required to reduce the rate of false-positive declarations of death. To what extent paramedics should commence basic life support when rigor and livores are present, until a physician issues a do-not-resuscitate order, deserves further discussion.

  12. Parent Management Training-Oregon Model: Adapting Intervention with Rigorous Research.

    Science.gov (United States)

    Forgatch, Marion S; Kjøbli, John

    2016-09-01

Parent Management Training-Oregon Model (PMTO®) is a set of theory-based parenting programs with status as evidence-based treatments. PMTO has been rigorously tested in efficacy and effectiveness trials in different contexts, cultures, and formats. Parents, the presumed agents of change, learn core parenting practices, specifically skill encouragement, limit setting, monitoring/supervision, interpersonal problem solving, and positive involvement. The intervention effectively prevents and ameliorates children's behavior problems by replacing coercive interactions with positive parenting practices. Delivery formats include sessions with individual families in agencies or families' homes, parent groups, and web-based and telehealth communication. Mediational models have tested parenting practices as mechanisms of change for children's behavior and found support for the theory underlying PMTO programs. Moderating effects include children's age, maternal depression, and social disadvantage. The Norwegian PMTO implementation is presented as an example of how PMTO has been tailored to reach diverse populations, as delivered by multiple systems of care throughout the nation. An implementation and research center in Oslo provides infrastructure and promotes collaboration between practitioners and researchers to conduct rigorous intervention research. Although evidence-based and tested within a wide array of contexts and populations, PMTO must continue to adapt to an ever-changing world. © 2016 Family Process Institute.

  13. Monitoring muscle optical scattering properties during rigor mortis

    Science.gov (United States)

    Xia, J.; Ranasinghesagara, J.; Ku, C. W.; Yao, G.

    2007-09-01

The sarcomere is the fundamental functional unit for force generation in skeletal muscle. Sarcomere structure is also an important factor that affects the eating quality of muscle food, i.e., meat. The sarcomere structure is altered significantly during rigor mortis, which is the critical stage involved in transforming muscle to meat. In this paper, we investigated optical scattering changes during the rigor process in Sternomandibularis muscles. The measured optical scattering parameters were analyzed along with the simultaneously measured passive tension, pH value, and histology analysis. We found that the temporal changes of optical scattering, passive tension, pH value and fiber microstructure were closely correlated during the rigor process. These results suggest that sarcomere structure changes during rigor mortis can be monitored and characterized by optical scattering, which may find practical applications in predicting meat quality.

  14. Neutron Sources for Standard-Based Testing

    Energy Technology Data Exchange (ETDEWEB)

    Radev, Radoslav [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); McLean, Thomas [Los Alamos National Lab. (LANL), Los Alamos, NM (United States)

    2014-11-10

The DHS TC Standards and the consensus ANSI Standards use ²⁵²Cf as the neutron source for performance testing because its energy spectrum is similar to the ²³⁵U and ²³⁹Pu fission sources used in nuclear weapons. An emission rate of 20,000 ± 20% neutrons per second is used for testing of the radiological requirements both in the ANSI standards and the TCS. Determination of the accurate neutron emission rate of the test source is important for maintaining consistency and agreement between testing results obtained at different testing facilities. Several characteristics in the manufacture and the decay of the source need to be understood and accounted for in order to make an accurate measurement of the performance of the neutron detection instrument. Additionally, neutron response characteristics of the particular instrument need to be known and taken into account, as well as neutron scattering in the testing environment.
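
    The decay correction alluded to above is straightforward to sketch. The example below decay-corrects a certificate emission rate to a test date, assuming the commonly quoted 2.645-year half-life of ²⁵²Cf; the rate and dates are hypothetical, and the certificate values for the actual source should always be used.

```python
# Minimal sketch: decay-correcting a 252Cf neutron emission rate from the
# calibration date to the test date. The 2.645 y half-life is the commonly
# quoted value; the rate and dates are hypothetical, and the certificate
# values for the actual source should be used.
from datetime import date

CF252_HALF_LIFE_Y = 2.645

def emission_rate(r0, calib_date, test_date, half_life_y=CF252_HALF_LIFE_Y):
    years = (test_date - calib_date).days / 365.25
    return r0 * 2.0 ** (-years / half_life_y)

r0 = 25_000.0   # n/s on the calibration certificate (hypothetical)
rate = emission_rate(r0, date(2013, 6, 1), date(2014, 11, 10))
print(f"emission rate on test date: {rate:.0f} n/s "
      f"(target band 16,000-24,000 n/s)")
```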

  15. New rigorous asymptotic theorems for inverse scattering amplitudes

    International Nuclear Information System (INIS)

    Lomsadze, Sh.Yu.; Lomsadze, Yu.M.

    1984-01-01

The rigorous asymptotic theorems, both of integral and local types, obtained earlier and establishing logarithmic and in some cases even power correlations between the real and imaginary parts of the scattering amplitudes F± are extended to the inverse amplitudes 1/F±. One also succeeds in establishing power correlations of a new type between the real and imaginary parts, both for the amplitudes themselves and for the inverse ones. All the assertions obtained can conveniently be tested in high-energy experiments when the amplitudes exhibit asymptotic behaviour.

  16. Evaluating Rigor in Qualitative Methodology and Research Dissemination

    Science.gov (United States)

    Trainor, Audrey A.; Graue, Elizabeth

    2014-01-01

    Despite previous and successful attempts to outline general criteria for rigor, researchers in special education have debated the application of rigor criteria, the significance or importance of small n research, the purpose of interpretivist approaches, and the generalizability of qualitative empirical results. Adding to these complications, the…

  17. An ultramicroscopic study on rigor mortis.

    Science.gov (United States)

    Suzuki, T

    1976-01-01

Gastrocnemius muscles taken from decapitated mice at various intervals after death, and from mice killed by 2,4-dinitrophenol or mono-iodoacetic acid injection to induce rigor mortis soon after death, were observed by electron microscopy. The prominent appearance of many fine cross striations in the myofibrils (occurring about every 400 Å) was considered to be characteristic of rigor mortis. These striations were caused by minute granules studded along the surfaces of both thick and thin filaments; they appeared to be the bridges connecting the two kinds of filaments and accounted for the hardness and rigidity of the muscle.

  18. Tenderness of pre- and post rigor lamb longissimus muscle.

    Science.gov (United States)

    Geesink, Geert; Sujang, Sadi; Koohmaraie, Mohammad

    2011-08-01

Lamb longissimus muscle (n=6) sections were cooked at different times post mortem (prerigor, at rigor, 1 day p.m., and 7 days p.m.) using two cooking methods. Using a boiling water bath, samples were either cooked to a core temperature of 70 °C or boiled for 3 h. The latter method was meant to reflect the traditional cooking method employed in countries where preparation of prerigor meat is practiced. The time postmortem at which the meat was prepared had a large effect on the tenderness (shear force) of the meat. Cooking prerigor and at-rigor meat to 70 °C resulted in higher shear force values than their post rigor counterparts at 1 and 7 days p.m. (9.4 and 9.6 vs. 7.2 and 3.7 kg, respectively). The differences in tenderness between the treatment groups could be largely explained by a difference in contraction status of the meat after cooking and by the effect of ageing on tenderness. Cooking pre- and at-rigor meat resulted in severe muscle contraction, as evidenced by the differences in sarcomere length of the cooked samples. Mean sarcomere lengths in the pre- and at-rigor samples ranged from 1.05 to 1.20 μm. The mean sarcomere length in the post rigor samples was 1.44 μm. Cooking for 3 h at 100 °C did improve the tenderness of pre- and at-rigor prepared meat as compared to cooking to 70 °C, but not to the extent that ageing did. It is concluded that additional intervention methods are needed to improve the tenderness of prerigor cooked meat. Copyright © 2011 Elsevier B.V. All rights reserved.

  19. Emergency cricothyrotomy for trismus caused by instantaneous rigor in cardiac arrest patients.

    Science.gov (United States)

    Lee, Jae Hee; Jung, Koo Young

    2012-07-01

Instantaneous rigor, muscle stiffening occurring at the moment of death (or cardiac arrest), can be confused with rigor mortis. If trismus is caused by instantaneous rigor, orotracheal intubation is impossible and a surgical airway should be secured. Here, we report 2 patients who underwent emergency cricothyrotomy for trismus caused by instantaneous rigor. This case report aims to help physicians understand instantaneous rigor and to emphasize the importance of quickly securing a surgical airway when trismus occurs. Copyright © 2012 Elsevier Inc. All rights reserved.

  20. Rigor or mortis: best practices for preclinical research in neuroscience.

    Science.gov (United States)

    Steward, Oswald; Balice-Gordon, Rita

    2014-11-05

    Numerous recent reports document a lack of reproducibility of preclinical studies, raising concerns about potential lack of rigor. Examples of lack of rigor have been extensively documented and proposals for practices to improve rigor are appearing. Here, we discuss some of the details and implications of previously proposed best practices and consider some new ones, focusing on preclinical studies relevant to human neurological and psychiatric disorders. Copyright © 2014 Elsevier Inc. All rights reserved.

  1. Statistical mechanics rigorous results

    CERN Document Server

    Ruelle, David

    1999-01-01

    This classic book marks the beginning of an era of vigorous mathematical progress in equilibrium statistical mechanics. Its treatment of the infinite system limit has not been superseded, and the discussion of thermodynamic functions and states remains basic for more recent work. The conceptual foundation provided by the Rigorous Results remains invaluable for the study of the spectacular developments of statistical mechanics in the second half of the 20th century.

  2. High and low rigor temperature effects on sheep meat tenderness and ageing.

    Science.gov (United States)

    Devine, Carrick E; Payne, Steven R; Peachey, Bridget M; Lowe, Timothy E; Ingram, John R; Cook, Christian J

    2002-02-01

Immediately after electrical stimulation, the paired m. longissimus thoracis et lumborum (LT) of 40 sheep were boned out and wrapped tightly with a polyethylene cling film. One of the paired LTs was chilled in 15°C air to reach a rigor mortis (rigor) temperature of 18°C and the other side was placed in a water bath at 35°C and achieved rigor at this temperature. Wrapping reduced rigor shortening and mimicked meat left on the carcass. After rigor, the meat was aged at 15°C for 0, 8, 26 and 72 h and then frozen. The frozen meat was cooked to 75°C in an 85°C water bath and shear force values were obtained from a 1×1 cm cross-section. The shear force values of meat for 18 and 35°C rigor were similar at zero ageing, but as ageing progressed, the 18°C rigor meat aged faster and became more tender than meat that went into rigor at 35°C; the shear force values for the two rigor temperatures at each ageing time were significantly different, and those for 35°C rigor were still significantly greater. Thus the toughness of 35°C meat was not a consequence of muscle shortening and appears to be due to both a faster rate of tenderisation and the meat tenderising to a greater extent at the lower temperature. The cook loss at 35°C rigor (30.5%) was greater than that at 18°C rigor (28.4%) (P<0.01) and the colour Hunter L values were higher at 35°C (P<0.01) compared with 18°C, but there were no significant differences in a or b values.

  3. Physiological studies of muscle rigor mortis in the fowl

    International Nuclear Information System (INIS)

    Nakahira, S.; Kaneko, K.; Tanaka, K.

    1990-01-01

A simple system was developed for continuous measurement of muscle contraction during rigor mortis. Longitudinal muscle strips dissected from the Peroneus Longus were suspended in a plastic tube containing liquid paraffin. Mechanical activity was transmitted to a strain-gauge transducer connected to a potentiometric pen-recorder. At the onset of measurement a load of 1.2 g was applied to the muscle strip. This model was used to study the muscle response to various treatments during rigor mortis. All measurements were carried out under the anaerobic condition at 17°C, unless otherwise stated. 1. The present system was found to be quite useful for continuous measurement of the course of muscle rigor. 2. Muscle contraction under the anaerobic condition at 17°C reached a peak about 2 hours after the onset of measurement and thereafter relaxed at a slow rate. In contrast, the aerobic condition under a high humidity resulted in a strong rigor, about three times stronger than that in the anaerobic condition. 3. Ultrasonic treatment (37,000-47,000 Hz) at 25°C for 10 minutes resulted in a moderate muscle rigor. 4. Treatment of the muscle strip with 2 mM EGTA at 30°C for 30 minutes led to relaxation of the muscle. 5. Muscle from birds killed during anesthesia with pentobarbital sodium showed a slow rate of rigor, whereas muscle from birds killed one day after hypophysectomy developed rigor quickly, as seen in intact controls. 6. A slight muscle rigor was observed when the muscle strip was placed in a refrigerator at 0°C for 18.5 hours and the temperature was thereafter kept at 17°C. (author)

  4. A CUMULATIVE MIGRATION METHOD FOR COMPUTING RIGOROUS TRANSPORT CROSS SECTIONS AND DIFFUSION COEFFICIENTS FOR LWR LATTICES WITH MONTE CARLO

    Energy Technology Data Exchange (ETDEWEB)

    Zhaoyuan Liu; Kord Smith; Benoit Forget; Javier Ortensi

    2016-05-01

A new method for computing homogenized assembly neutron transport cross sections and diffusion coefficients that is both rigorous and computationally efficient is proposed in this paper. In the limit of a homogeneous hydrogen slab, the new method is equivalent to the long-used, and only-recently-published CASMO transport method. The rigorous method is used to demonstrate the sources of inaccuracy in the commonly applied “out-scatter” transport correction. It is also demonstrated that the newly developed method is directly applicable to lattice calculations performed by Monte Carlo and is capable of computing rigorous homogenized transport cross sections for arbitrarily heterogeneous lattices. Comparisons of several common transport cross section approximations are presented for a simple problem of infinite medium hydrogen. The new method has also been applied in computing 2-group diffusion data for an actual PWR lattice from the BEAVRS benchmark.
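
    For reference, the conventional “out-scatter” transport correction that the abstract critiques can be sketched in a few lines: the transport cross section is approximated as Σtr = Σt − μ̄Σs, and the diffusion coefficient as D = 1/(3Σtr). The numbers below are hypothetical one-group, hydrogen-like placeholders; this illustrates the common approximation, not the cumulative migration method proposed in the paper.

```python
# Minimal sketch of the conventional "out-scatter" transport correction that
# the paper critiques (not the proposed cumulative migration method):
#   Sigma_tr = Sigma_t - mu_bar * Sigma_s,   D = 1 / (3 * Sigma_tr)
# One-group, hydrogen-like numbers below are hypothetical placeholders.

def outscatter_diffusion(sigma_t, sigma_s, mu_bar):
    sigma_tr = sigma_t - mu_bar * sigma_s        # transport cross section (1/cm)
    return sigma_tr, 1.0 / (3.0 * sigma_tr)      # diffusion coefficient (cm)

sigma_t = 1.10               # total macroscopic cross section, 1/cm
sigma_s = 1.05               # scattering macroscopic cross section, 1/cm
mu_bar = 2.0 / (3.0 * 1.0)   # average scattering cosine ~ 2/(3A), A = 1 for hydrogen

sigma_tr, D = outscatter_diffusion(sigma_t, sigma_s, mu_bar)
print(f"Sigma_tr = {sigma_tr:.3f} 1/cm, D = {D:.3f} cm")
```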

  5. Estimation of the breaking of rigor mortis by myotonometry.

    Science.gov (United States)

    Vain, A; Kauppila, R; Vuori, E

    1996-05-31

Myotonometry was used to detect breaking of rigor mortis. The myotonometer is a new instrument which measures the decaying oscillations of a muscle after a brief mechanical impact. The method gives two numerical parameters for rigor mortis, namely the period and decrement of the oscillations, both of which depend on the time elapsed after death. When rigor mortis was broken by muscle lengthening, both the oscillation period and the decrement decreased, whereas shortening the muscle caused the opposite changes. Fourteen hours after breaking, the stiffness characteristics (oscillation periods) of the right and left m. biceps brachii had become similar. However, the values of the decrement of the muscle, reflecting the dissipation of mechanical energy, maintained their differences.
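
    The two instrument outputs named above, oscillation period and (logarithmic) decrement, can be extracted from a sampled decaying oscillation with simple peak picking, as sketched below on a synthetic signal. This is an illustrative reconstruction of the quantities, not the myotonometer's actual algorithm, and the frequency and damping values are invented.

```python
# Minimal sketch: period and logarithmic decrement of a decaying oscillation,
# the two myotonometer outputs described above. Synthetic signal and simple
# three-point peak picking; not the instrument's actual algorithm.
import numpy as np

def period_and_decrement(signal, dt):
    peaks = [i for i in range(1, len(signal) - 1)
             if signal[i] > signal[i - 1] and signal[i] > signal[i + 1]]
    periods = np.diff(peaks) * dt                          # peak-to-peak spacing
    decrements = np.log(signal[peaks[:-1]] / signal[peaks[1:]])
    return periods.mean(), decrements.mean()

# Synthetic damped oscillation: 15 Hz, decay time constant 0.12 s, 2 kHz sampling
dt = 0.0005
t = np.arange(0.0, 0.5, dt)
x = np.exp(-t / 0.12) * np.cos(2 * np.pi * 15.0 * t)

T, dec = period_and_decrement(x, dt)
print(f"period = {T * 1000:.1f} ms, logarithmic decrement = {dec:.3f}")
```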

  6. Rigorous simulation: a tool to enhance decision making

    Energy Technology Data Exchange (ETDEWEB)

    Neiva, Raquel; Larson, Mel; Baks, Arjan [KBC Advanced Technologies plc, Surrey (United Kingdom)

    2012-07-01

The world refining industries continue to be challenged by population growth (increased demand), regional market changes and the pressure of regulatory requirements to operate a 'green' refinery. Environmental regulations are reducing the value and use of heavy fuel oils, and leading refiners to convert more of the heavier products, or even heavier crudes, into lighter products while meeting increasingly stringent transportation fuel specifications. As a result, actions are required to establish a sustainable advantage for future success. Rigorous simulation provides a key advantage, improving the timing and efficient use of capital investment and maximizing profitability. Sustainably maximizing profit through rigorous modeling is achieved through enhanced performance monitoring and improved Linear Programme (LP) model accuracy. This paper contains examples of these two items. The combination of both increases overall rates of return. As refiners consider optimizing existing assets and expanding projects, the process agreed upon to achieve these goals is key to successful profit improvement. Rigorous kinetic simulation with detailed fractionation allows optimization of existing asset utilization while focusing capital investment on the new unit(s), thereby optimizing the overall strategic plan and return on investment. Monitoring of individual process units works as a mechanism for validating and optimizing plant performance. Unit monitoring is important to rectify poor performance and increase profitability. The quality of an LP relies upon the accuracy of the data used to generate the LP sub-models. The value of rigorous unit monitoring is that the results are consistently heat- and mass-balanced and are unique to a refiner's unit/refinery. With the improved match of the refinery operation, the rigorous simulation models will capture more accurately the nonlinearity of those process units and therefore provide correct
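
    To show where the LP sub-model data mentioned above enter, the toy linear programme below chooses crude throughputs to maximize margin subject to unit capacities. The yield, capacity and margin coefficients are invented placeholders of exactly the kind that rigorous simulation would be used to calibrate; this is not a model from the paper.

```python
# Toy refinery-style LP: choose crude throughputs to maximize margin subject
# to unit capacity. Yield, capacity and margin coefficients are hypothetical,
# but they are exactly the LP sub-model data that rigorous simulation is used
# to calibrate. scipy.optimize.linprog minimizes, so margins are negated.
from scipy.optimize import linprog

margins = [12.0, 9.5]            # $/bbl margin for crude A and crude B
c = [-m for m in margins]

A_ub = [[1.0, 1.0],              # crude distillation capacity constraint
        [0.62, 0.48]]            # gasoline pool yield per bbl of each crude
b_ub = [100.0, 55.0]             # kbbl/d limits

res = linprog(c, A_ub=A_ub, b_ub=b_ub, bounds=[(0, None), (0, None)])
print("crude A, crude B (kbbl/d):", res.x, "| margin ($k/d):", -res.fun)
```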

  7. Rigorous solution to Bargmann-Wigner equation for integer spin

    CERN Document Server

    Huang Shi Zhong; Wu Ning; Zheng Zhi Peng

    2002-01-01

A rigorous method is developed to solve the Bargmann-Wigner equation for arbitrary integer spin in the coordinate representation, step by step. The Bargmann-Wigner equation is first transformed to a form easier to solve; the new equations are then solved rigorously in the coordinate representation, and the wave functions are thus derived in closed form.

  8. RIGOR MORTIS AND THE INFLUENCE OF CALCIUM AND MAGNESIUM SALTS UPON ITS DEVELOPMENT.

    Science.gov (United States)

    Meltzer, S J; Auer, J

    1908-01-01

Calcium salts hasten and magnesium salts retard the development of rigor mortis, that is, when these salts are administered subcutaneously or intravenously. When injected intra-arterially, concentrated solutions of both kinds of salts cause nearly an immediate onset of a strong stiffness of the muscles which is apparently a contraction, brought on by a stimulation caused by these salts and due to osmosis. This contraction, if strong, passes over without a relaxation into a real rigor. This form of rigor may be classed as work-rigor (Arbeitsstarre). In animals, at least in frogs, with intact cords, the early contraction and the following rigor are stronger than in animals with destroyed cord. If M/8 solutions (nearly equimolecular with "physiological" solutions of sodium chloride) are used, even when injected intra-arterially, calcium salts hasten and magnesium salts retard the onset of rigor. The hastening and retardation in this case, as well as in the cases of subcutaneous and intravenous injections, are ion effects and essentially due to the cations, calcium and magnesium. In the rigor hastened by calcium the effects of the extensor muscles mostly prevail; in the rigor following magnesium injection, on the other hand, either the flexor muscles prevail or the muscles become stiff in the original position of the animal at death. There seems to be no difference in the degree of stiffness in the final rigor; only the onset and development of the rigor is hastened in the case of the one salt and retarded in the other. Calcium hastens also the development of heat rigor. No positive facts were obtained with regard to the effect of magnesium upon heat rigor. Calcium also hastens and magnesium retards the onset of rigor in the left ventricle of the heart. No definite data were gathered with regard to the effects of these salts upon the right ventricle.

  9. Guidelines for testing sealed radiation sources

    International Nuclear Information System (INIS)

    1989-01-01

    These guidelines are based on article 16(1) of the Ordinance on the Implementation of Atomic Safety and Radiation Protection dated 11 October 1984 (VOAS), in connection with article 36 of the Executory Provision to the VOAS, of 11 October 1984. They apply to the testing of sealed sources to verify their intactness, tightness and non-contamination as well as observance of their fixed service time. The type, scope and intervals of testing as well as the evaluation of test results are determined. These guidelines also apply to the testing of radiation sources forming part of radiation equipment, unless otherwise provided for in the type license or permit. These guidelines enter into force on 1 January 1990

  10. Biomedical text mining for research rigor and integrity: tasks, challenges, directions.

    Science.gov (United States)

    Kilicoglu, Halil

    2017-06-13

    An estimated quarter of a trillion US dollars is invested in the biomedical research enterprise annually. There is growing alarm that a significant portion of this investment is wasted because of problems in reproducibility of research findings and in the rigor and integrity of research conduct and reporting. Recent years have seen a flurry of activities focusing on standardization and guideline development to enhance the reproducibility and rigor of biomedical research. Research activity is primarily communicated via textual artifacts, ranging from grant applications to journal publications. These artifacts can be both the source and the manifestation of practices leading to research waste. For example, an article may describe a poorly designed experiment, or the authors may reach conclusions not supported by the evidence presented. In this article, we pose the question of whether biomedical text mining techniques can assist the stakeholders in the biomedical research enterprise in doing their part toward enhancing research integrity and rigor. In particular, we identify four key areas in which text mining techniques can make a significant contribution: plagiarism/fraud detection, ensuring adherence to reporting guidelines, managing information overload and accurate citation/enhanced bibliometrics. We review the existing methods and tools for specific tasks, if they exist, or discuss relevant research that can provide guidance for future work. With the exponential increase in biomedical research output and the ability of text mining approaches to perform automatic tasks at large scale, we propose that such approaches can support tools that promote responsible research practices, providing significant benefits for the biomedical research enterprise. Published by Oxford University Press 2017. This work is written by a US Government employee and is in the public domain in the US.

  11. Rigorous bounds on the free energy of electron-phonon models

    NARCIS (Netherlands)

    Raedt, Hans De; Michielsen, Kristel

    1997-01-01

    We present a collection of rigorous upper and lower bounds to the free energy of electron-phonon models with linear electron-phonon interaction. These bounds are used to compare different variational approaches. It is shown rigorously that the ground states corresponding to the sharpest bounds do

  12. Aspects related to the testing of sealed radioactive sources

    International Nuclear Information System (INIS)

    Olteanu, C. M.; Nistor, V.; Valeca, S. C.

    2016-01-01

    Sealed radioactive sources are commonly used in a wide range of applications, such as medical, industrial, agricultural and scientific research. The radioactive material is contained within the sealed source, and the device allows the radiation to be used in a controlled way. Accidents can result if control over even a small fraction of those sources is lost. Sealed nuclear sources fall under the category of special form radioactive material; therefore they must meet safety requirements during transport according to regulations. Testing sealed radioactive sources is an important step in the conformity assessment process in order to obtain design approval. In ICN Pitesti, the Reliability and Testing Laboratory is notified by CNCAN to perform tests on sealed radioactive sources. This paper presents aspects of the verification tests on sealed capsules for Iridium-192 sources, in order to demonstrate compliance with the regulatory requirements, and describes the quality assurance program for the tests performed. (authors)

  13. Study design elements for rigorous quasi-experimental comparative effectiveness research.

    Science.gov (United States)

    Maciejewski, Matthew L; Curtis, Lesley H; Dowd, Bryan

    2013-03-01

    Quasi-experiments are likely to be the workhorse study design used to generate evidence about the comparative effectiveness of alternative treatments, because of their feasibility, timeliness, affordability and external validity compared with randomized trials. In this review, we outline potential sources of discordance in results between quasi-experiments and experiments, review study design choices that can improve the internal validity of quasi-experiments, and outline innovative data linkage strategies that may be particularly useful in quasi-experimental comparative effectiveness research. There is an urgent need to resolve the debate about the evidentiary value of quasi-experiments since equal consideration of rigorous quasi-experiments will broaden the base of evidence that can be brought to bear in clinical decision-making and governmental policy-making.

  14. Accelerating Biomedical Discoveries through Rigor and Transparency.

    Science.gov (United States)

    Hewitt, Judith A; Brown, Liliana L; Murphy, Stephanie J; Grieder, Franziska; Silberberg, Shai D

    2017-07-01

    Difficulties in reproducing published research findings have garnered a lot of press in recent years. As a funder of biomedical research, the National Institutes of Health (NIH) has taken measures to address underlying causes of low reproducibility. Extensive deliberations resulted in a policy, released in 2015, to enhance reproducibility through rigor and transparency. We briefly explain what led to the policy, describe its elements, provide examples and resources for the biomedical research community, and discuss the potential impact of the policy on translatability with a focus on research using animal models. Importantly, while increased attention to rigor and transparency may lead to an increase in the number of laboratory animals used in the near term, it will lead to more efficient and productive use of such resources in the long run. The translational value of animal studies will be improved through more rigorous assessment of experimental variables and data, leading to better assessments of the translational potential of animal models, for the benefit of the research community and society. Published by Oxford University Press on behalf of the Institute for Laboratory Animal Research 2017. This work is written by (a) US Government employee(s) and is in the public domain in the US.

  15. Recent Development in Rigorous Computational Methods in Dynamical Systems

    OpenAIRE

    Arai, Zin; Kokubu, Hiroshi; Pilarczyk, Paweł

    2009-01-01

    We highlight selected results of recent development in the area of rigorous computations which use interval arithmetic to analyse dynamical systems. We describe general ideas and selected details of different ways of approach and we provide specific sample applications to illustrate the effectiveness of these methods. The emphasis is put on a topological approach, which combined with rigorous calculations provides a broad range of new methods that yield mathematically rel...

  16. The effect of rigor mortis on the passage of erythrocytes and fluid through the myocardium of isolated dog hearts.

    Science.gov (United States)

    Nevalainen, T J; Gavin, J B; Seelye, R N; Whitehouse, S; Donnell, M

    1978-07-01

    The effect of normal and artificially induced rigor mortis on the vascular passage of erythrocytes and fluid through isolated dog hearts was studied. Increased rigidity of 6-mm thick transmural sections through the centre of the posterior papillary muscle was used as an indication of rigor. The perfusibility of the myocardium was tested by injecting 10 ml of 1% sodium fluorescein in Hanks solution into the circumflex branch of the left coronary artery. In prerigor hearts (20 minute incubation) fluorescein perfused the myocardium evenly whether or not it was preceded by an injection of 10 ml of heparinized dog blood. Rigor mortis developed in all hearts after 90 minutes incubation or within 20 minutes of perfusing the heart with 50 ml of 5 mM iodoacetate in Hanks solution. Fluorescein injected into hearts in rigor did not enter the posterior papillary muscle and adjacent subendocardium whether or not it was preceded by heparinized blood. Thus the vascular occlusion caused by rigor in the dog heart appears to be so effective that it prevents flow into the subendocardium of small soluble ions such as fluorescein.

  17. Trends: Rigor Mortis in the Arts.

    Science.gov (United States)

    Blodget, Alden S.

    1991-01-01

    Outlines how past art education provided a refuge for students from the rigors of other academic subjects. Observes that in recent years art education has become "discipline based." Argues that art educators need to reaffirm their commitment to a humanistic way of knowing. (KM)

  18. Using grounded theory as a method for rigorously reviewing literature

    NARCIS (Netherlands)

    Wolfswinkel, J.; Furtmueller-Ettinger, Elfriede; Wilderom, Celeste P.M.

    2013-01-01

    This paper offers guidance to conducting a rigorous literature review. We present this in the form of a five-stage process in which we use Grounded Theory as a method. We first probe the guidelines explicated by Webster and Watson, and then we show the added value of Grounded Theory for rigorously

  19. Rigor, vigor, and the study of health disparities.

    Science.gov (United States)

    Adler, Nancy; Bush, Nicole R; Pantell, Matthew S

    2012-10-16

    Health disparities research spans multiple fields and methods and documents strong links between social disadvantage and poor health. Associations between socioeconomic status (SES) and health are often taken as evidence for the causal impact of SES on health, but alternative explanations, including the impact of health on SES, are plausible. Studies showing the influence of parents' SES on their children's health provide evidence for a causal pathway from SES to health, but have limitations. Health disparities researchers face tradeoffs between "rigor" and "vigor" in designing studies that demonstrate how social disadvantage becomes biologically embedded and results in poorer health. Rigorous designs aim to maximize precision in the measurement of SES and health outcomes through methods that provide the greatest control over temporal ordering and causal direction. To achieve precision, many studies use a single SES predictor and single disease. However, doing so oversimplifies the multifaceted, entwined nature of social disadvantage and may overestimate the impact of that one variable and underestimate the true impact of social disadvantage on health. In addition, SES effects on overall health and functioning are likely to be greater than effects on any one disease. Vigorous designs aim to capture this complexity and maximize ecological validity through more complete assessment of social disadvantage and health status, but may provide less-compelling evidence of causality. Newer approaches to both measurement and analysis may enable enhanced vigor as well as rigor. Incorporating both rigor and vigor into studies will provide a fuller understanding of the causes of health disparities.

  20. Development and application of test apparatus for classification of sealed source

    International Nuclear Information System (INIS)

    Kim, Dong Hak; Seo, Ki Seog; Bang, Kyoung Sik; Lee, Ju Chan; Son, Kwang Je

    2007-01-01

    Sealed sources have to undergo tests according to the classification requirements for their typical usages, in accordance with the relevant domestic notice standard and ISO 2919. After each test, the source shall be examined visually for loss of integrity and pass an appropriate leakage test. The tests used to class a sealed source are temperature, external pressure, impact, vibration and puncture tests. The environmental test conditions corresponding to the class numbers are arranged in increasing order of severity. In this study, apparatus for these tests, except the vibration test, was developed and applied to three kinds of sealed source. The conditions of the tests used to class a sealed source are stated, and the differences between the domestic notice standard and ISO 2919 are considered. Using the developed apparatus, we conducted tests on a 192Ir brachytherapy sealed source and two kinds of sealed source for industrial radiography. The 192Ir brachytherapy sealed source is classified as temperature class 5, external pressure class 3, impact class 2, and vibration and puncture class 1. The two kinds of sealed source for industrial radiography are classified as temperature class 4, external pressure class 2, impact and puncture class 5, and vibration class 1. After the tests, a liquid nitrogen bubble test and a vacuum bubble test were performed to evaluate the safety of the sealed sources

  1. How Individual Scholars Can Reduce the Rigor-Relevance Gap in Management Research

    OpenAIRE

    Wolf, Joachim; Rosenberg, Timo

    2012-01-01

    This paper discusses a number of avenues management scholars could follow to reduce the existing gap between scientific rigor and practical relevance without relativizing the importance of the first goal dimension. Such changes are necessary because many management studies do not fully exploit the possibilities to increase their practical relevance while maintaining scientific rigor. We argue that this rigor-relevance gap is not only the consequence of the currently prevailing institutional c...

  2. Onset of rigor mortis is earlier in red muscle than in white muscle.

    Science.gov (United States)

    Kobayashi, M; Takatori, T; Nakajima, M; Sakurada, K; Hatanaka, K; Ikegaya, H; Matsuda, Y; Iwase, H

    2000-01-01

    Rigor mortis is thought to be related to falling ATP levels in muscles postmortem. We measured rigor mortis as tension determined isometrically in three rat leg muscles in liquid paraffin kept at 37 degrees C or 25 degrees C--two red muscles, red gastrocnemius (RG) and soleus (SO) and one white muscle, white gastrocnemius (WG). Onset, half and full rigor mortis occurred earlier in RG and SO than in WG both at 37 degrees C and at 25 degrees C even though RG and WG were portions of the same muscle. This suggests that rigor mortis directly reflects the postmortem intramuscular ATP level, which decreases more rapidly in red muscle than in white muscle after death. Rigor mortis was more retarded at 25 degrees C than at 37 degrees C in each type of muscle.

  3. Enhanced H- ion source testing capabilities at LANSCE

    International Nuclear Information System (INIS)

    Ingalls, W.B.; Hardy, M.W.; Prichard, B.A.; Sander, O.R.; Stelzer, J.E.; Stevens, R.R.; Leung, K.N.; Williams, M.D.

    1998-01-01

    As part of the on-going beam-current upgrade in the Proton Storage Ring (PSR) at the Los Alamos Neutron Science Center (LANSCE), the current available from the H - injector will be increased from the present 16 to 18 mA to as much as 40 mA. A collaboration between the Ion Beam Technology Group at Lawrence Berkeley National Laboratory (LBNL) and the Ion Sources and Injectors section of LANSCE-2 at Los Alamos National Laboratory (LANL) has been formed to develop and evaluate a new ion source. A new Ion Source Test Stand (ISTS) has been constructed at LANSCE to evaluate candidate ion sources. The ISTS has been constructed to duplicate as closely as possible the beam transport and ancillary systems presently in use in the LANSCE H - injector, while incorporating additional beam diagnostics for source testing. The construction and commissioning of the ISTS will be described, preliminary results for the proof-of-principle ion source developed by the Berkeley group will be presented, and future plans for the extension of the test stand will be presented

  4. Rate-control algorithms testing by using video source model

    DEFF Research Database (Denmark)

    Belyaev, Evgeny; Turlikov, Andrey; Ukhanova, Anna

    2008-01-01

    In this paper, a method for testing rate-control algorithms by the use of a video source model is suggested. The proposed method allows algorithm testing over a big test set to be significantly improved.

  5. Basic Testing of the DUCHAMP Source Finder

    Science.gov (United States)

    Westmeier, T.; Popping, A.; Serra, P.

    2012-01-01

    This paper presents and discusses the results of basic source finding tests in three dimensions (using spectroscopic data cubes) with DUCHAMP, the standard source finder for the Australian Square Kilometre Array Pathfinder. For this purpose, we generated different sets of unresolved and extended Hi model sources. These models were then fed into DUCHAMP, using a range of different parameters and methods provided by the software. The main aim of the tests was to study the performance of DUCHAMP on sources with different parameters and morphologies and assess the accuracy of DUCHAMP's source parametrisation. Overall, we find DUCHAMP to be a powerful source finder capable of reliably detecting sources down to low signal-to-noise ratios and accurately measuring their position and velocity. In the presence of noise in the data, DUCHAMP's measurements of basic source parameters, such as spectral line width and integrated flux, are affected by systematic errors. These errors are a consequence of the effect of noise on the specific algorithms used by DUCHAMP for measuring source parameters in combination with the fact that the software only takes into account pixels above a given flux threshold and hence misses part of the flux. In scientific applications of DUCHAMP these systematic errors would have to be corrected for. Alternatively, DUCHAMP could be used as a source finder only, and source parametrisation could be done in a second step using more sophisticated parametrisation algorithms.

  6. Photoconductivity of amorphous silicon-rigorous modelling

    International Nuclear Information System (INIS)

    Brada, P.; Schauer, F.

    1991-01-01

    It is our great pleasure to express our gratitude to Prof. Grigorovici, the pioneer of the exciting field of the amorphous state, by making our modest contribution to this area. In this paper we present an outline of a rigorous modelling program for the steady-state photoconductivity in amorphous silicon and related materials. (Author)

  7. 10 CFR 39.35 - Leak testing of sealed sources.

    Science.gov (United States)

    2010-01-01

    Title 10, Energy: § 39.35 Leak testing of sealed sources. (a) Testing and recordkeeping requirements. Each licensee who uses... record of leak test results in units of microcuries and retain the record for inspection by the...

  8. Reframing Rigor: A Modern Look at Challenge and Support in Higher Education

    Science.gov (United States)

    Campbell, Corbin M.; Dortch, Deniece; Burt, Brian A.

    2018-01-01

    This chapter describes the limitations of the traditional notions of academic rigor in higher education, and brings forth a new form of rigor that has the potential to support student success and equity.

  9. Toxicity testing: the search for an in vitro alternative to animal testing.

    Science.gov (United States)

    May, J E; Xu, J; Morse, H R; Avent, N D; Donaldson, C

    2009-01-01

    Prior to introduction to the clinic, pharmaceuticals must undergo rigorous toxicity testing to ensure their safety. Traditionally, this has been achieved using in vivo animal models. However, besides ethical reasons, there is a continual drive to reduce the number of animals used for this purpose due to concerns such as the lack of concordance seen between animal models and toxic effects in humans. Adequate testing to ensure any toxic metabolites are detected can be further complicated if the agent is administered in a prodrug form, requiring a source of cytochrome P450 enzymes for metabolism. A number of sources of metabolic enzymes have been utilised in in vitro models, including cell lines, primary human tissue and liver extracts such as S9. This review examines current and new in vitro models for toxicity testing, including a new model developed within the authors' laboratory utilising HepG2 liver spheroids within a co-culture system to examine the effects of chemotherapeutic agents on other cell types.

  10. Memory sparing, fast scattering formalism for rigorous diffraction modeling

    Science.gov (United States)

    Iff, W.; Kämpfe, T.; Jourlin, Y.; Tishchenko, A. V.

    2017-07-01

    The basics and algorithmic steps of a novel scattering formalism suited for memory sparing and fast electromagnetic calculations are presented. The formalism, called ‘S-vector algorithm’ (by analogy with the known scattering-matrix algorithm), allows the calculation of the collective scattering spectra of individual layered micro-structured scattering objects. A rigorous method of linear complexity is applied to model the scattering at individual layers; here the generalized source method (GSM) resorting to Fourier harmonics as basis functions is used as one possible method of linear complexity. The concatenation of the individual scattering events can be achieved sequentially or in parallel, both having pros and cons. The present development will largely concentrate on a consecutive approach based on the multiple reflection series. The latter will be reformulated into an implicit formalism which will be associated with an iterative solver, resulting in improved convergence. The examples will first refer to 1D grating diffraction for the sake of simplicity and intelligibility, with a final 2D application example.
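
    The reformulation of the multiple reflection series into an implicit equation can be sketched schematically (generic notation, not the authors'): rather than summing the series term by term, the total field u is characterized as the fixed point

        \[
          u \;=\; u_0 + K\,u
          \qquad\Longleftrightarrow\qquad
          u \;=\; \sum_{k \ge 0} K^{k} u_0 ,
        \]

    and this implicit form is handed to an iterative linear solver, whose convergence behaviour is typically better than that of the explicit series.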

  11. The Source Inversion Validation (SIV) Initiative: A Collaborative Study on Uncertainty Quantification in Earthquake Source Inversions

    Science.gov (United States)

    Mai, P. M.; Schorlemmer, D.; Page, M.

    2012-04-01

    Earthquake source inversions image the spatio-temporal rupture evolution on one or more fault planes using seismic and/or geodetic data. Such studies are critically important for earthquake seismology in general, and for advancing seismic hazard analysis in particular, as they reveal earthquake source complexity and help (i) to investigate earthquake mechanics; (ii) to develop spontaneous dynamic rupture models; (iii) to build models for generating rupture realizations for ground-motion simulations. In applications (i - iii), the underlying finite-fault source models are regarded as "data" (input information), but their uncertainties are essentially unknown. After all, source models are obtained from solving an inherently ill-posed inverse problem to which many a priori assumptions and uncertain observations are applied. The Source Inversion Validation (SIV) project is a collaborative effort to better understand the variability between rupture models for a single earthquake (as manifested in the finite-source rupture model database) and to develop robust uncertainty quantification for earthquake source inversions. The SIV project highlights the need to develop a long-standing and rigorous testing platform to examine the current state-of-the-art in earthquake source inversion, and to develop and test novel source inversion approaches. We will review the current status of the SIV project, and report the findings and conclusions of the recent workshops. We will briefly discuss several source-inversion methods, how they treat uncertainties in data, and assess the posterior model uncertainty. Case studies include initial forward-modeling tests on Green's function calculations, and inversion results for synthetic data from spontaneous dynamic crack-like strike-slip earthquake on steeply dipping fault, embedded in a layered crustal velocity-density structure.

  12. Fast rigorous numerical method for the solution of the anisotropic neutron transport problem and the NITRAN system for fusion neutronics application. Pt. 1

    International Nuclear Information System (INIS)

    Takahashi, A.; Rusch, D.

    1979-07-01

    Some recent neutronics experiments for fusion reactor blankets show that the precise treatment of anisotropic secondary emissions for all types of neutron scattering is needed for neutron transport calculations. In the present work new rigorous methods, i.e. based on non-approximative microscopic neutron balance equations, are applied to treat the anisotropic collision source term in transport equations. The collision source calculation is free from approximations except for the discretization of energy, angle and space variables and includes the rigorous treatment of nonelastic collisions, as far as nuclear data are given. Two methods are presented: first the Ii-method, which relies on existing nuclear data files and then, as an ultimate goal, the I*-method, which aims at the use of future double-differential cross section data, but which is also applicable to the present single-differential data basis to allow a smooth transition to the new data type. An application of the Ii-method is given in the code system NITRAN which employs the Ssub(N)-method to solve the transport equations. Both rigorous methods, the Ii- and the I*-method, are applicable to all radiation transport problems and they can be used also in the Monte-Carlo-method to solve the transport problem. (orig./RW) [de
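
    As a reminder of the quantity these methods treat without approximation, the anisotropic collision source in the transport equation has the generic form (standard notation, independent of the Ii- or I*-method)

        \[
          Q(\mathbf{r},E,\boldsymbol{\Omega}) \;=\;
          \int_{0}^{\infty}\! dE' \int_{4\pi}\! d\Omega'\;
          \Sigma_s\big(\mathbf{r};\,E'\!\to\!E,\ \boldsymbol{\Omega}'\!\cdot\!\boldsymbol{\Omega}\big)\,
          \phi(\mathbf{r},E',\boldsymbol{\Omega}') ,
        \]

    the two methods differing in how the scattering kernel is represented from single- or double-differential nuclear data.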

  13. Rigor force responses of permeabilized fibres from fast and slow skeletal muscles of aged rats.

    Science.gov (United States)

    Plant, D R; Lynch, G S

    2001-09-01

    1. Ageing is generally associated with a decline in skeletal muscle mass and strength and a slowing of muscle contraction, factors that impact upon the quality of life for the elderly. The mechanisms underlying this age-related muscle weakness have not been fully resolved. The purpose of the present study was to determine whether the decrease in muscle force as a consequence of age could be attributed partly to a decrease in the number of cross-bridges participating during contraction. 2. Given that the rigor force is proportional to the approximate total number of interacting sites between the actin and myosin filaments, we tested the null hypothesis that the rigor force of permeabilized muscle fibres from young and old rats would not be different. 3. Permeabilized fibres from the extensor digitorum longus (fast-twitch; EDL) and soleus (predominantly slow-twitch) muscles of young (6 months of age) and old (27 months of age) male F344 rats were activated in Ca2+-buffered solutions to determine force-pCa characteristics (where pCa = -log(10)[Ca2+]) and then in solutions lacking ATP and Ca2+ to determine rigor force levels. 4. The rigor forces for EDL and soleus muscle fibres were not different between young and old rats, indicating that the approximate total number of cross-bridges that can be formed between filaments did not decline with age. We conclude that the age-related decrease in force output is more likely attributed to a decrease in the force per cross-bridge and/or decreases in the efficiency of excitation-contraction coupling.

  14. Single-case synthesis tools I: Comparing tools to evaluate SCD quality and rigor.

    Science.gov (United States)

    Zimmerman, Kathleen N; Ledford, Jennifer R; Severini, Katherine E; Pustejovsky, James E; Barton, Erin E; Lloyd, Blair P

    2018-03-03

    Tools for evaluating the quality and rigor of single case research designs (SCD) are often used when conducting SCD syntheses. Preferred components include evaluations of design features related to the internal validity of SCD to obtain quality and/or rigor ratings. Three tools for evaluating the quality and rigor of SCD (Council for Exceptional Children, What Works Clearinghouse, and Single-Case Analysis and Design Framework) were compared to determine if conclusions regarding the effectiveness of antecedent sensory-based interventions for young children changed based on choice of quality evaluation tool. Evaluation of SCD quality differed across tools, suggesting selection of quality evaluation tools impacts evaluation findings. Suggestions for selecting an appropriate quality and rigor assessment tool are provided and across-tool conclusions are drawn regarding the quality and rigor of studies. Finally, authors provide guidance for using quality evaluations in conjunction with outcome analyses when conducting syntheses of interventions evaluated in the context of SCD. Copyright © 2018 Elsevier Ltd. All rights reserved.

  15. Studies on the estimation of the postmortem interval. 3. Rigor mortis (author's transl).

    Science.gov (United States)

    Suzutani, T; Ishibashi, H; Takatori, T

    1978-11-01

    The authors have devised a method for classifying rigor mortis into 10 types based on its appearance and strength in various parts of a cadaver. By applying the method to the findings of 436 cadavers which were subjected to medico-legal autopsies in our laboratory during the last 10 years, it has been demonstrated that the classifying method is effective for analyzing the phenomenon of onset, persistence and disappearance of rigor mortis statistically. The investigation of the relationship between each type of rigor mortis and the postmortem interval has demonstrated that rigor mortis may be utilized as a basis for estimating the postmortem interval but the values have greater deviation than those described in current textbooks.

  16. Rigor mortis in an unusual position: Forensic considerations.

    Science.gov (United States)

    D'Souza, Deepak H; Harish, S; Rajesh, M; Kiran, J

    2011-07-01

    We report a case in which the dead body was found with rigor mortis in an unusual position. The dead body was lying on its back with limbs raised, defying gravity. Direction of the salivary stains on the face was also defying the gravity. We opined that the scene of occurrence of crime is unlikely to be the final place where the dead body was found. The clues were revealing a homicidal offence and an attempt to destroy the evidence. The forensic use of 'rigor mortis in an unusual position' is in furthering the investigations, and the scientific confirmation of two facts - the scene of death (occurrence) is different from the scene of disposal of dead body, and time gap between the two places.

  17. Effect of Pre-rigor Salting Levels on Physicochemical and Textural Properties of Chicken Breast Muscles

    Science.gov (United States)

    Choi, Yun-Sang

    2015-01-01

    This study was conducted to evaluate the effect of pre-rigor salting level (0-4% NaCl concentration) on physicochemical and textural properties of pre-rigor chicken breast muscles. The pre-rigor chicken breast muscles were de-boned 10 min post-mortem and salted within 25 min post-mortem. An increase in pre-rigor salting level led to the formation of a high ultimate pH of chicken breast muscles at post-mortem 24 h. The addition of a minimum of 2% NaCl significantly improved water holding capacity, cooking loss, protein solubility, and hardness when compared to the non-salted chicken breast muscle (p<0.05). However, an increased pre-rigor salting level caused the inhibition of myofibrillar protein degradation and the acceleration of lipid oxidation. Nevertheless, the difference in NaCl concentration between 3% and 4% led to no great differences in the physicochemical and textural properties attributable to pre-rigor salting effects (p>0.05). Therefore, our study confirmed the pre-rigor salting effect of chicken breast muscle salted with 2% NaCl when compared to post-rigor muscle salted with an equal NaCl concentration, and suggests that a 2% NaCl concentration is minimally required to ensure a definite pre-rigor salting effect on chicken breast muscle. PMID:26761884

  18. Effect of Pre-rigor Salting Levels on Physicochemical and Textural Properties of Chicken Breast Muscles.

    Science.gov (United States)

    Kim, Hyun-Wook; Hwang, Ko-Eun; Song, Dong-Heon; Kim, Yong-Jae; Ham, Youn-Kyung; Yeo, Eui-Joo; Jeong, Tae-Jun; Choi, Yun-Sang; Kim, Cheon-Jei

    2015-01-01

    This study was conducted to evaluate the effect of pre-rigor salting level (0-4% NaCl concentration) on physicochemical and textural properties of pre-rigor chicken breast muscles. The pre-rigor chicken breast muscles were de-boned 10 min post-mortem and salted within 25 min post-mortem. An increase in pre-rigor salting level led to the formation of a high ultimate pH of chicken breast muscles at post-mortem 24 h. The addition of a minimum of 2% NaCl significantly improved water holding capacity, cooking loss, protein solubility, and hardness when compared to the non-salted chicken breast muscle (p<0.05). However, an increased pre-rigor salting level caused the inhibition of myofibrillar protein degradation and the acceleration of lipid oxidation. Nevertheless, the difference in NaCl concentration between 3% and 4% led to no great differences in the physicochemical and textural properties attributable to pre-rigor salting effects (p>0.05). Therefore, our study confirmed the pre-rigor salting effect of chicken breast muscle salted with 2% NaCl when compared to post-rigor muscle salted with an equal NaCl concentration, and suggests that a 2% NaCl concentration is minimally required to ensure a definite pre-rigor salting effect on chicken breast muscle.

  19. Moving beyond Data Transcription: Rigor as Issue in Representation of Digital Literacies

    Science.gov (United States)

    Hagood, Margaret Carmody; Skinner, Emily Neil

    2015-01-01

    Rigor in qualitative research has been based upon criteria of credibility, dependability, confirmability, and transferability. Drawing upon articles published during our editorship of the "Journal of Adolescent & Adult Literacy," we illustrate how the use of digital data in research study reporting may enhance these areas of rigor,…

  20. Evaluation of methods to leak test sealed radiation sources

    International Nuclear Information System (INIS)

    Arbeau, N.D.; Scott, C.K.

    1987-04-01

    The methods for the leak testing of sealed radiation sources were reviewed. One hundred and thirty-one equipment vendors were surveyed to identify commercially available leak test instruments. The equipment is summarized in tabular form by radiation type and detector type for easy reference. The radiation characteristics of the licensed sources were reviewed and summarized in a format that can be used to select the most suitable detection method. A test kit is proposed for use by inspectors when verifying a licensee's test procedures. The general elements of leak test procedures are discussed

  1. A negative ion source test facility

    Energy Technology Data Exchange (ETDEWEB)

    Melanson, S.; Dehnel, M., E-mail: morgan@d-pace.com; Potkins, D.; Theroux, J.; Hollinger, C.; Martin, J.; Stewart, T.; Jackle, P.; Withington, S. [D-Pace, Inc., P.O. Box 201, Nelson, British Columbia V1L 5P9 (Canada); Philpott, C.; Williams, P.; Brown, S.; Jones, T.; Coad, B. [Buckley Systems Ltd., 6 Bowden Road, Mount Wellington, Auckland 1060 (New Zealand)

    2016-02-15

    Progress is being made in the development of an Ion Source Test Facility (ISTF) by D-Pace Inc. in collaboration with Buckley Systems Ltd. in Auckland, NZ. The first phase of the ISTF is to be commissioned in October 2015 with the second phase being commissioned in March 2016. The facility will primarily be used for the development and the commercialization of ion sources. It will also be used to characterize and further develop various D-Pace Inc. beam diagnostic devices.

  2. Trends in Methodological Rigor in Intervention Research Published in School Psychology Journals

    Science.gov (United States)

    Burns, Matthew K.; Klingbeil, David A.; Ysseldyke, James E.; Petersen-Brown, Shawna

    2012-01-01

    Methodological rigor in intervention research is important for documenting evidence-based practices and has been a recent focus in legislation, including the No Child Left Behind Act. The current study examined the methodological rigor of intervention research in four school psychology journals since the 1960s. Intervention research has increased…

  3. An Extensible Open-Source Compiler Infrastructure for Testing

    Energy Technology Data Exchange (ETDEWEB)

    Quinlan, D; Ur, S; Vuduc, R

    2005-12-09

    Testing forms a critical part of the development process for large-scale software, and there is growing need for automated tools that can read, represent, analyze, and transform the application's source code to help carry out testing tasks. However, the support required to compile applications written in common general purpose languages is generally inaccessible to the testing research community. In this paper, we report on an extensible, open-source compiler infrastructure called ROSE, which is currently in development at Lawrence Livermore National Laboratory. ROSE specifically targets developers who wish to build source-based tools that implement customized analyses and optimizations for large-scale C, C++, and Fortran90 scientific computing applications (on the order of a million lines of code or more). However, much of this infrastructure can also be used to address problems in testing, and ROSE is by design broadly accessible to those without a formal compiler background. This paper details the interactions between testing of applications and the ways in which compiler technology can aid in the understanding of those applications. We emphasize the particular aspects of ROSE, such as support for the general analysis of whole programs, that are particularly well-suited to the testing research community and the scale of the problems that community solves.
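
    ROSE itself is a C++ infrastructure; purely as a language-neutral illustration of the underlying idea (walking an application's AST to locate constructs of interest to a testing tool), the sketch below uses Python's standard ast module to list function definitions and call sites in a source file. It is an analogy only and does not use the ROSE API.

        # Toy AST walk in the spirit of source-based test tooling (not ROSE itself):
        # report function definitions and call sites found in a Python source file.
        import ast
        import sys

        tree = ast.parse(open(sys.argv[1]).read())
        for node in ast.walk(tree):
            if isinstance(node, ast.FunctionDef):
                print(f"def  {node.name} at line {node.lineno}")
            elif isinstance(node, ast.Call) and isinstance(node.func, ast.Name):
                print(f"call {node.func.id} at line {node.lineno}")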

  4. Effects of rigor status during high-pressure processing on the physical qualities of farm-raised abalone (Haliotis rufescens).

    Science.gov (United States)

    Hughes, Brianna H; Greenberg, Neil J; Yang, Tom C; Skonberg, Denise I

    2015-01-01

    High-pressure processing (HPP) is used to increase meat safety and shelf-life, with conflicting quality effects depending on rigor status during HPP. In the seafood industry, HPP is used to shuck and pasteurize oysters, but its use on abalones has only been minimally evaluated and the effect of rigor status during HPP on abalone quality has not been reported. Farm-raised abalones (Haliotis rufescens) were divided into 12 HPP treatments and 1 unprocessed control treatment. Treatments were processed pre-rigor or post-rigor at 2 pressures (100 and 300 MPa) and 3 processing times (1, 3, and 5 min). The control was analyzed post-rigor. Uniform plugs were cut from adductor and foot meat for texture profile analysis, shear force, and color analysis. Subsamples were used for scanning electron microscopy of muscle ultrastructure. Texture profile analysis revealed that post-rigor processed abalone was significantly (P abalone meat was more tender than pre-rigor processed meat, and post-rigor processed foot meat was lighter in color than pre-rigor processed foot meat, suggesting that waiting for rigor to resolve prior to processing abalones may improve consumer perceptions of quality and market value. © 2014 Institute of Food Technologists®

  5. Rigor or Reliability and Validity in Qualitative Research: Perspectives, Strategies, Reconceptualization, and Recommendations.

    Science.gov (United States)

    Cypress, Brigitte S

    Issues are still raised even now in the 21st century by the persistent concern with achieving rigor in qualitative research. There is also a continuing debate about the analogous terms reliability and validity in naturalistic inquiries as opposed to quantitative investigations. This article presents the concept of rigor in qualitative research using a phenomenological study as an exemplar to further illustrate the process. Elaborating on epistemological and theoretical conceptualizations by Lincoln and Guba, strategies congruent with qualitative perspective for ensuring validity to establish the credibility of the study are described. A synthesis of the historical development of validity criteria evident in the literature during the years is explored. Recommendations are made for use of the term rigor instead of trustworthiness and the reconceptualization and renewed use of the concept of reliability and validity in qualitative research, that strategies for ensuring rigor must be built into the qualitative research process rather than evaluated only after the inquiry, and that qualitative researchers and students alike must be proactive and take responsibility in ensuring the rigor of a research study. The insights garnered here will move novice researchers and doctoral students to a better conceptual grasp of the complexity of reliability and validity and its ramifications for qualitative inquiry.

  6. Quantum key distribution with an unknown and untrusted source

    Science.gov (United States)

    Zhao, Yi; Qi, Bing; Lo, Hoi-Kwong

    2009-03-01

    The security of a standard bi-directional "plug & play" quantum key distribution (QKD) system has been an open question for a long time. This is mainly because its source is effectively controlled by an eavesdropper, which means the source is unknown and untrusted. Qualitative discussions of this subject have been made previously. In this paper, we present the first quantitative security analysis of a general class of QKD protocols whose sources are unknown and untrusted. The security of the standard BB84 protocol, the weak+vacuum decoy state protocol, and the one-decoy decoy state protocol, with unknown and untrusted sources, is rigorously proved. We derive rigorous lower bounds on the secure key generation rates of the above three protocols. Our numerical simulation results show that QKD with an untrusted source gives a key generation rate that is close to that with a trusted source. Our work is published in [1]. [1] Y. Zhao, B. Qi, and H.-K. Lo, Phys. Rev. A 77, 052327 (2008).
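
    For orientation, the lower bounds in such decoy-state analyses are typically of the GLLP form, shown here schematically (the untrusted-source corrections are derived in the cited paper):

        \[
          R \;\ge\; q\Big\{ -\,Q_\mu f(E_\mu)\,H_2(E_\mu) \;+\; Q_1\big[\,1 - H_2(e_1)\big] \Big\},
        \]

    where Q_mu and E_mu are the overall gain and quantum bit error rate, Q_1 and e_1 the single-photon gain and error rate, f the error-correction inefficiency, and H_2 the binary entropy function.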

  7. Some rigorous results concerning spectral theory for ideal MHD

    International Nuclear Information System (INIS)

    Laurence, P.

    1986-01-01

    Spectral theory for linear ideal MHD is laid on a firm foundation by defining appropriate function spaces for the operators associated with both the first- and second-order (in time and space) partial differential operators. Thus, it is rigorously established that a self-adjoint extension of F(xi) exists. It is shown that the operator L associated with the first-order formulation satisfies the conditions of the Hille--Yosida theorem. A foundation is laid thereby within which the domains associated with the first- and second-order formulations can be compared. This allows future work in a rigorous setting that will clarify the differences (in the two formulations) between the structure of the generalized eigenspaces corresponding to the marginal point of the spectrum ω = 0
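
    For readers unfamiliar with the theorem invoked here: in its contraction-semigroup form, the Hille-Yosida theorem states that a closed, densely defined operator L generates a strongly continuous contraction semigroup if and only if every lambda > 0 belongs to the resolvent set of L and

        \[
          \big\| (\lambda I - L)^{-1} \big\| \;\le\; \frac{1}{\lambda}, \qquad \lambda > 0 .
        \]

    This is quoted as standard background; the paper verifies these conditions for the operator of the first-order MHD formulation.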

  8. Some rigorous results concerning spectral theory for ideal MHD

    International Nuclear Information System (INIS)

    Laurence, P.

    1985-05-01

    Spectral theory for linear ideal MHD is laid on a firm foundation by defining appropriate function spaces for the operators associated with both the first and second order (in time and space) partial differential operators. Thus, it is rigorously established that a self-adjoint extension of F(xi) exists. It is shown that the operator L associated with the first order formulation satisfies the conditions of the Hille-Yosida theorem. A foundation is laid thereby within which the domains associated with the first and second order formulations can be compared. This allows future work in a rigorous setting that will clarify the differences (in the two formulations) between the structure of the generalized eigenspaces corresponding to the marginal point of the spectrum ω = 0

  9. Using Project Complexity Determinations to Establish Required Levels of Project Rigor

    Energy Technology Data Exchange (ETDEWEB)

    Andrews, Thomas D.

    2015-10-01

    This presentation discusses the project complexity determination process developed by National Security Technologies, LLC, for the U.S. Department of Energy, National Nuclear Security Administration Nevada Field Office for implementation at the Nevada National Security Site (NNSS). The complexity determination process was developed to address the diversity of NNSS project types, sizes, and complexity; to fill the need for a single procedure with provision for tailoring the level of rigor to the project type, size, and complexity; to provide consistent, repeatable, effective application of project management processes across the enterprise; and to achieve higher levels of efficiency in project delivery. These needs are illustrated by the wide diversity of NNSS projects: Defense Experimentation, Global Security, weapons tests, military training areas, sensor development and testing, training in realistic environments, intelligence community support, environmental restoration/waste management, and disposal of radioactive waste, among others.

  10. Upgrading geometry conceptual understanding and strategic competence through implementing rigorous mathematical thinking (RMT)

    Science.gov (United States)

    Nugraheni, Z.; Budiyono, B.; Slamet, I.

    2018-03-01

    To reach higher-order thinking skills, conceptual understanding and strategic competence need to be mastered, as they are two basic parts of higher-order thinking skills (HOTS). RMT is a unique realization of the cognitive conceptual construction approach based on Feuerstein's theory of Mediated Learning Experience (MLE) and Vygotsky's sociocultural theory. This was a quasi-experimental study which compared an experimental class that was given Rigorous Mathematical Thinking (RMT) as the learning method and a control class that was given Direct Learning (DL) as the conventional learning activity. The study examined whether the two learning models had different effects on the conceptual understanding and strategic competence of junior high school students. The data were analyzed using Multivariate Analysis of Variance (MANOVA), which showed a significant difference between the experimental and control classes when mathematics conceptual understanding and strategic competence were considered jointly (Wilks' Λ = 0.84). Further, independent t-tests showed significant differences between the two classes in both mathematical conceptual understanding and strategic competence. These results indicate that Rigorous Mathematical Thinking (RMT) had a positive impact on mathematics conceptual understanding and strategic competence.
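
    A minimal sketch of the kind of analysis reported (a two-group MANOVA yielding Wilks' lambda), using statsmodels; the data frame below is a hypothetical placeholder, not the study's data.

        # Hypothetical two-outcome MANOVA sketch with statsmodels (not the study's data).
        import pandas as pd
        from statsmodels.multivariate.manova import MANOVA

        df = pd.DataFrame({
            "conceptual": [72, 80, 68, 85, 60, 64, 58, 66],
            "strategic":  [70, 78, 65, 88, 61, 59, 55, 63],
            "group":      ["RMT", "RMT", "RMT", "RMT", "DL", "DL", "DL", "DL"],
        })
        fit = MANOVA.from_formula("conceptual + strategic ~ group", data=df)
        print(fit.mv_test())   # table includes Wilks' lambda for the group effect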

  11. Layout optimization of DRAM cells using rigorous simulation model for NTD

    Science.gov (United States)

    Jeon, Jinhyuck; Kim, Shinyoung; Park, Chanha; Yang, Hyunjo; Yim, Donggyu; Kuechler, Bernd; Zimmermann, Rainer; Muelders, Thomas; Klostermann, Ulrich; Schmoeller, Thomas; Do, Mun-hoe; Choi, Jung-Hoe

    2014-03-01

    scanning electron microscope (SEM) measurements. The high resist impact and difficult model data acquisition demand a simulation model that is capable of extrapolating reliably beyond its calibration dataset. We use rigorous simulation models to provide that predictive performance. We discuss the need for a rigorous mask optimization process for DRAM contact cell layouts, yielding mask layouts that are optimal in process performance, mask manufacturability and accuracy. In this paper, we show the step-by-step process from analytical illumination source derivation, through an NTD- and application-tailored model calibration, to layout optimization such as OPC and SRAF placement. Finally, the work is verified with simulation and experimental results on wafer.

  12. Critical Analysis of Strategies for Determining Rigor in Qualitative Inquiry.

    Science.gov (United States)

    Morse, Janice M

    2015-09-01

    Criteria for determining the trustworthiness of qualitative research were introduced by Guba and Lincoln in the 1980s when they replaced terminology for achieving rigor, reliability, validity, and generalizability with dependability, credibility, and transferability. Strategies for achieving trustworthiness were also introduced. This landmark contribution to qualitative research remains in use today, with only minor modifications in format. Despite the significance of this contribution over the past four decades, the strategies recommended to achieve trustworthiness have not been critically examined. Recommendations for where, why, and how to use these strategies have not been developed, and how well they achieve their intended goal has not been examined. We do not know, for example, what impact these strategies have on the completed research. In this article, I critique these strategies. I recommend that qualitative researchers return to the terminology of social sciences, using rigor, reliability, validity, and generalizability. I then make recommendations for the appropriate use of the strategies recommended to achieve rigor: prolonged engagement, persistent observation, and thick, rich description; inter-rater reliability, negative case analysis; peer review or debriefing; clarifying researcher bias; member checking; external audits; and triangulation. © The Author(s) 2015.

  13. Student’s rigorous mathematical thinking based on cognitive style

    Science.gov (United States)

    Fitriyani, H.; Khasanah, U.

    2017-12-01

    The purpose of this research was to determine the rigorous mathematical thinking (RMT) of mathematics education students in solving math problems in terms of reflective and impulsive cognitive styles. The research used a descriptive qualitative approach. The subjects were four students with reflective or impulsive cognitive styles, each style represented by a male and a female subject. Data were collected through a problem-solving test and interviews. The data were analyzed using the Miles and Huberman model: data reduction, data presentation, and drawing conclusions. The results showed that the impulsive male subject used all three levels of the cognitive function required for RMT, namely qualitative thinking, quantitative thinking with precision, and relational thinking, while the other three subjects were only able to use cognitive functions at the qualitative thinking level of RMT. Therefore, the impulsive male subject has a better RMT ability than the other three research subjects.

  14. Rigorous approximation of stationary measures and convergence to equilibrium for iterated function systems

    International Nuclear Information System (INIS)

    Galatolo, Stefano; Monge, Maurizio; Nisoli, Isaia

    2016-01-01

    We study the problem of the rigorous computation of the stationary measure and of the rate of convergence to equilibrium of an iterated function system described by a stochastic mixture of two or more dynamical systems that are either all uniformly expanding on the interval or all contracting. In the expanding case, the associated transfer operators satisfy a Lasota–Yorke inequality, and we show how to compute a rigorous approximation of the stationary measure in the L¹ norm together with an estimate for the rate of convergence. The rigorous computation requires a computer-aided proof of the contraction of the transfer operators for the maps, and we show that this property propagates to the transfer operators of the IFS. In the contracting case we perform a rigorous approximation of the stationary measure in the Wasserstein–Kantorovich distance, with a rate of convergence, using the same functional-analytic approach. We show that a finite computation can produce a realistic computation of all contraction rates for the whole parameter space. We conclude with a description of the implementation and numerical experiments. (paper)
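
    As a non-rigorous counterpoint to the certified approach described above, the stationary measure of a contracting IFS can be estimated empirically by iterating randomly chosen maps (the classical "chaos game") and histogramming the orbit; the two affine contractions below are arbitrary examples and the output carries no error bound.

        # Empirical (non-certified) estimate of an IFS stationary measure by random
        # iteration; contrast with the rigorous, error-bounded approach of the paper.
        import random

        maps = [lambda x: 0.5 * x, lambda x: 0.5 * x + 0.5]   # example contractions on [0, 1]
        weights = [0.5, 0.5]
        bins, n = [0] * 50, 200_000
        x = random.random()
        for _ in range(n):
            x = random.choices(maps, weights)[0](x)
            bins[min(int(x * 50), 49)] += 1
        print([round(b / n, 4) for b in bins[:10]])   # mass of the first few bins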

  15. Aviation Flight Test

    Data.gov (United States)

    Federal Laboratory Consortium — Redstone Test Center provides an expert workforce and technologically advanced test equipment to conduct the rigorous testing necessary for U.S. Army acquisition and...

  16. A Test Stand for Ion Sources of Ultimate Reliability

    International Nuclear Information System (INIS)

    Enparantza, R.; Uriarte, L.; Romano, P.; Alonso, J.; Ariz, I.; Egiraun, M.; Bermejo, F. J.; Etxebarria, V.; Lucas, J.; Del Rio, J. M.; Letchford, A.; Faircloth, D.; Stockli, M.

    2009-01-01

    The rationale behind the ITUR project is to perform a comparison between different kinds of H- ion sources using the same beam diagnostics setup. In particular, a direct comparison will be made in terms of the emittance characteristics of Penning-type sources, such as those currently in use in the injector for the ISIS (UK) Pulsed Neutron Source, and those of volumetric type, such as that driving the injector for the ORNL Spallation Neutron Source (TN, U.S.A.). The endeavour pursued here is thus to build an Ion Source Test Stand where virtually any type of source can be tested and its features measured, and thus compared with the results of other sources under the same gauge. It would then be possible to establish a common ground for effectively comparing different ion sources. The long-term objectives are to contribute towards building compact sources of minimum emittance, maximum performance, high reliability and availability, high percentage of desired particle production, stability and high brightness. The project consortium is led by the Tekniker-IK4 research centre, and the partners are the companies Elytt Energy and Jema Group. The technical viability is guaranteed by the collaboration between the project consortium and several scientific institutions, such as the CSIC (Spain), the University of the Basque Country (Spain), ISIS (STFC-UK), SNS (ORNL-USA) and CEA Saclay (France).

  17. 10 CFR 34.27 - Leak testing and replacement of sealed sources.

    Science.gov (United States)

    2010-01-01

    Title 10, Energy: Safety Requirements for Industrial Radiographic Operations, Equipment. § 34.27 Leak testing and replacement of sealed sources... radiographic exposure device and leak testing of any sealed source must be performed by persons authorized to...

  18. Study of the quality characteristics in cold-smoked salmon (Salmo salar) originating from pre- or post-rigor raw material.

    Science.gov (United States)

    Birkeland, S; Akse, L

    2010-01-01

    Improved slaughtering procedures in the salmon industry have caused a delayed onset of rigor mortis and, thus, a potential for pre-rigor secondary processing. The aim of this study was to investigate the effect of rigor status at time of processing on quality traits color, texture, sensory, microbiological, in injection salted, and cold-smoked Atlantic salmon (Salmo salar). Injection of pre-rigor fillets caused a significant (Prigor processed fillets; however, post-rigor (1477 ± 38 g) fillets had a significant (P>0.05) higher fracturability than pre-rigor fillets (1369 ± 71 g). Pre-rigor fillets were significantly (Prigor fillets (37.8 ± 0.8) and had significantly lower (Prigor processed fillets. This study showed that similar quality characteristics can be obtained in cold-smoked products processed either pre- or post-rigor when using suitable injection salting protocols and smoking techniques. © 2010 Institute of Food Technologists®

  19. Experimental evaluation of rigor mortis. VIII. Estimation of time since death by repeated measurements of the intensity of rigor mortis on rats.

    Science.gov (United States)

    Krompecher, T

    1994-10-21

    The development of the intensity of rigor mortis was monitored in nine groups of rats. The measurements were initiated after 2, 4, 5, 6, 8, 12, 15, 24, and 48 h post mortem (p.m.) and lasted 5-9 h, which ideally should correspond to the usual procedure after the discovery of a corpse. The experiments were carried out at an ambient temperature of 24 degrees C. Measurements initiated early after death resulted in curves with a rising portion, a plateau, and a descending slope. Delaying the initial measurement translated into shorter rising portions, and curves initiated 8 h p.m. or later are comprised of a plateau and/or a downward slope only. Three different phases were observed suggesting simple rules that can help estimate the time since death: (1) if an increase in intensity was found, the initial measurements were conducted not later than 5 h p.m.; (2) if only a decrease in intensity was observed, the initial measurements were conducted not earlier than 7 h p.m.; and (3) at 24 h p.m., the resolution is complete, and no further changes in intensity should occur. Our results clearly demonstrate that repeated measurements of the intensity of rigor mortis allow a more accurate estimation of the time since death of the experimental animals than the single measurement method used earlier. A critical review of the literature on the estimation of time since death on the basis of objective measurements of the intensity of rigor mortis is also presented.
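
    The three phases reported above amount to a simple decision rule; the sketch below merely restates them for the study's conditions (rats at an ambient temperature of 24 degrees C) and is not a general forensic tool.

        # Restatement of the paper's three observations (rats, 24 degrees C ambient):
        # repeated intensity measurements narrow the estimated postmortem interval.
        def estimate_pmi(intensity_increased: bool, intensity_decreased: bool) -> str:
            if intensity_increased:
                return "initial measurement made no later than about 5 h post mortem"
            if intensity_decreased:
                return "initial measurement made no earlier than about 7 h post mortem"
            return "resolution complete: about 24 h post mortem or later"

        print(estimate_pmi(False, True))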

  20. The MIXED framework: A novel approach to evaluating mixed-methods rigor.

    Science.gov (United States)

    Eckhardt, Ann L; DeVon, Holli A

    2017-10-01

    Evaluation of rigor in mixed-methods (MM) research is a persistent challenge due to the combination of inconsistent philosophical paradigms, the use of multiple research methods which require different skill sets, and the need to combine research at different points in the research process. Researchers have proposed a variety of ways to thoroughly evaluate MM research, but each method fails to provide a framework that is useful for the consumer of research. In contrast, the MIXED framework is meant to bridge the gap between an academic exercise and practical assessment of a published work. The MIXED framework (methods, inference, expertise, evaluation, and design) borrows from previously published frameworks to create a useful tool for the evaluation of a published study. The MIXED framework uses an experimental eight-item scale that allows for comprehensive integrated assessment of MM rigor in published manuscripts. Mixed methods are becoming increasingly prevalent in nursing and healthcare research requiring researchers and consumers to address issues unique to MM such as evaluation of rigor. © 2017 John Wiley & Sons Ltd.

  1. An Open Source Tool to Test Interoperability

    Science.gov (United States)

    Bermudez, L. E.

    2012-12-01

    Scientists interact with information at various levels, from gathering raw observed data to accessing portrayed, quality-controlled processed data. Geoinformatics tools help scientists in the acquisition, storage, processing, dissemination and presentation of geospatial information. Most of these interactions occur in a distributed environment between software components that take the role of either client or server. The communication between components includes protocols, encodings of messages and management of errors. Testing of these communication components is important to guarantee proper implementation of standards. The communication between clients and servers can be ad hoc or follow standards. By following standards, interoperability between components increases while the time needed to develop new software is reduced. The Open Geospatial Consortium (OGC) not only coordinates the development of standards but also, within the Compliance Testing Program (CITE), provides a testing infrastructure to test clients and servers. The OGC Web-based Test Engine Facility, based on TEAM Engine, allows developers to test Web services and clients for correct implementation of OGC standards. TEAM Engine is a Java open-source facility, available at SourceForge, that can be run via the command line, deployed in a web servlet container or integrated into a developer's environment via Maven. TEAM Engine uses the Compliance Test Language (CTL) and TestNG to test HTTP requests, SOAP services and XML instances against schemas and Schematron-based assertions of any type of web service, not only OGC services. For example, the OGC Web Feature Service (WFS) 1.0.0 test has more than 400 test assertions. Some of these assertions include conformance of HTTP responses, conformance of GML-encoded data, proper values for elements and attributes in the XML, and correct error responses. This presentation will provide an overview of TEAM Engine, an introduction to how to test via the OGC Testing web site and
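    The kind of check such a facility automates can be illustrated with a minimal, hypothetical client-side sketch in plain Python (not TEAM Engine's own CTL/TestNG machinery): it requests a WFS 1.0.0 capabilities document and applies two simple assertions on the HTTP status and the XML root element. The endpoint URL is a placeholder.

```python
# Minimal sketch of an interoperability-style check (not TEAM Engine/CTL itself):
# request a WFS capabilities document and apply two simple assertions.
import urllib.request
import xml.etree.ElementTree as ET

# Hypothetical endpoint; replace with a real WFS 1.0.0 service URL.
URL = "http://example.org/wfs?service=WFS&version=1.0.0&request=GetCapabilities"

def check_capabilities(url: str) -> None:
    with urllib.request.urlopen(url, timeout=30) as resp:
        assert resp.status == 200, f"unexpected HTTP status {resp.status}"
        body = resp.read()
    root = ET.fromstring(body)  # response must be well-formed XML
    # WFS 1.0.0 capabilities documents use the WFS_Capabilities root element.
    assert root.tag.endswith("WFS_Capabilities"), f"unexpected root element {root.tag}"
    print("basic capabilities assertions passed")

if __name__ == "__main__":
    check_capabilities(URL)
```

    A real compliance suite layers hundreds of such assertions (schema validity, mandatory elements, error behaviour) on top of this basic request/assert pattern.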

  2. ATP, IMP, and glycogen in cod muscle at onset and during development of rigor mortis depend on the sampling location

    DEFF Research Database (Denmark)

    Cappeln, Gertrud; Jessen, Flemming

    2002-01-01

    Variation in glycogen, ATP, and IMP contents within individual cod muscles was studied in ice-stored fish during the progress of rigor mortis. The rigor index was determined before muscle samples for chemical analyses were taken at 16 different positions on the fish. During development of rigor..., the contents of glycogen and ATP decreased differently in relation to rigor index depending on sampling location. Although fish were considered to be in strong rigor according to the rigor index method, parts of the muscle were not in rigor, as high ATP concentrations were found in dorsal and tail muscle....

  3. Disciplining Bioethics: Towards a Standard of Methodological Rigor in Bioethics Research

    Science.gov (United States)

    Adler, Daniel; Shaul, Randi Zlotnik

    2012-01-01

    Contemporary bioethics research is often described as multi- or interdisciplinary. Disciplines are characterized, in part, by their methods. Thus, when bioethics research draws on a variety of methods, it crosses disciplinary boundaries. Yet each discipline has its own standard of rigor—so when multiple disciplinary perspectives are considered, what constitutes rigor? This question has received inadequate attention, as there is considerable disagreement regarding the disciplinary status of bioethics. This disagreement has presented five challenges to bioethics research. Addressing them requires consideration of the main types of cross-disciplinary research, and consideration of proposals aiming to ensure rigor in bioethics research. PMID:22686634

  4. Infrared source test

    Energy Technology Data Exchange (ETDEWEB)

    Ott, L.

    1994-11-15

    The purpose of the Infrared Source Test (IRST) is to demonstrate the ability to track a ground target with an infrared sensor from an airplane. The system is being developed within the Advanced Technology Program's Theater Missile Defense/Unmanned Aerial Vehicle (UAV) section. The IRST payload consists of an Amber Radiance 1 infrared camera system, a computer, a gimbaled mirror, and a hard disk. The processor is a custom R3000 CPU board made by Risq Modular Systems, Inc. for LLNL. The board has Ethernet, SCSI, parallel I/O, and serial ports, a DMA channel, a video (frame buffer) interface, and eight MBytes of main memory. The real-time operating system VxWorks has been ported to the processor. The application code is written in C on a host SUN 4 UNIX workstation. The IRST is the result of a combined effort by physicists, electrical and mechanical engineers, and computer scientists.

  5. MASTERS OF ANALYTICAL TRADECRAFT: CERTIFYING THE STANDARDS AND ANALYTIC RIGOR OF INTELLIGENCE PRODUCTS

    Science.gov (United States)

    2016-04-01

    AU/ACSC/2016 AIR COMMAND AND STAFF COLLEGE AIR UNIVERSITY MASTERS OF ANALYTICAL TRADECRAFT: CERTIFYING THE STANDARDS AND ANALYTIC RIGOR OF...establishing unit level certified Masters of Analytic Tradecraft (MAT) analysts to be trained and entrusted to evaluate and rate the standards and...cues) ideally should meet or exceed effective rigor (based on analytical process).4 To accomplish this, decision makers should not be left to their

  6. 1+-n+ ECR ION SOURCE DEVELOPMENT TEST STAND

    International Nuclear Information System (INIS)

    Donald P. May

    2006-01-01

    A test stand for the investigation of 1+-n+ charge boosting using an ECR ion source is currently being assembled at the Texas A&M Cyclotron Institute. The ultimate goal is to relate the charge-boosting of ions of stable species to possible charge-boosting of ions of radioactive species extracted from the diverse, low-charge-state ion sources developed for radioactive ion beams

  7. Electrocardiogram artifact caused by rigors mimicking narrow complex tachycardia: a case report.

    Science.gov (United States)

    Matthias, Anne Thushara; Indrakumar, Jegarajah

    2014-02-04

    The electrocardiogram (ECG) is useful in the diagnosis of cardiac and non-cardiac conditions. Rigors due to shivering can cause electrocardiogram artifacts mimicking various cardiac rhythm abnormalities. We describe an 80-year-old Sri Lankan man with an abnormal electrocardiogram mimicking narrow complex tachycardia during the immediate post-operative period. Electrocardiogram changes caused by muscle tremor during rigors could mimic a narrow complex tachycardia. Identification of muscle tremor as a cause of electrocardiogram artifact can avoid unnecessary pharmacological and non-pharmacological intervention to prevent arrhythmias.

  8. Effects of Pre and Post-Rigor Marinade Injection on Some Quality Parameters of Longissimus Dorsi Muscles

    Science.gov (United States)

    Fadıloğlu, Eylem Ezgi; Serdaroğlu, Meltem

    2018-01-01

    This study was conducted to evaluate the effects of pre- and post-rigor marinade injections on some quality parameters of Longissimus dorsi (LD) muscles. Three marinade formulations were prepared with 2% NaCl, 2% NaCl + 0.5 M lactic acid, and 2% NaCl + 0.5 M sodium lactate. Marinade uptake, pH, free water, cooking loss, drip loss and color properties were analyzed. Injection time had a significant effect on the marinade uptake levels of samples. Regardless of marinade formulation, marinade uptake of pre-rigor samples injected with marinade solutions was higher than that of post-rigor samples. Injection of sodium lactate increased the pH values of samples, whereas lactic acid injection decreased pH. Marinade treatment and storage period had a significant effect on cooking loss. At each evaluation period, the interaction between marinade treatment and injection time had a different effect on free water content. Storage period and marinade application had a significant effect on drip loss values. Drip loss in all samples increased during storage. During all storage days, the lowest CIE L* value was found in pre-rigor samples injected with sodium lactate. Lactic acid injection caused color fading in both pre-rigor and post-rigor samples. The interaction between marinade treatment and storage period was statistically significant (p<0.05). At days 0 and 3, the lowest CIE b* values were obtained in pre-rigor samples injected with sodium lactate, and no differences were found among the other samples. At day 6, no significant differences were found in the CIE b* values of the samples. PMID:29805282

  9. Use of the Rigor Mortis Process as a Tool for Better Understanding of Skeletal Muscle Physiology: Effect of the Ante-Mortem Stress on the Progression of Rigor Mortis in Brook Charr (Salvelinus fontinalis).

    Science.gov (United States)

    Diouf, Boucar; Rioux, Pierre

    1999-01-01

    Presents the rigor mortis process in brook charr (Salvelinus fontinalis) as a tool for better understanding skeletal muscle metabolism. Describes an activity that demonstrates how rigor mortis is related to the post-mortem decrease of muscular glycogen and ATP, how glycogen degradation produces lactic acid that lowers muscle pH, and how…

  10. Measurements of the degree of development of rigor mortis as an indicator of stress in slaughtered pigs.

    Science.gov (United States)

    Warriss, P D; Brown, S N; Knowles, T G

    2003-12-13

    The degree of development of rigor mortis in the carcases of slaughter pigs was assessed subjectively on a three-point scale 35 minutes after they were exsanguinated, and related to the levels of cortisol, lactate and creatine kinase in blood collected at exsanguination. Earlier rigor development was associated with higher concentrations of these stress indicators in the blood. This relationship suggests that the mean rigor score, and the frequency distribution of carcases that had or had not entered rigor, could be used as an index of the degree of stress to which the pigs had been subjected.

  11. Challenges in defining a radiologic and hydrologic source term for underground nuclear test centers, Nevada Test Site, Nye County, Nevada

    International Nuclear Information System (INIS)

    Smith, D.K.

    1995-06-01

    The compilation of a radionuclide inventory for long-lived radioactive contaminants residual from nuclear testing provides a partial measure of the radiologic source term at the Nevada Test Site. The radiologic source term also includes potentially mobile short-lived radionuclides excluded from the inventory. The radiologic source term for tritium is known with accuracy and is equivalent to the hydrologic source term within the saturated zone. Definition of the total hydrologic source term for fission and activation products that have high activities for decades following underground testing involves knowledge and assumptions which are presently unavailable. Systematic investigation of the behavior of fission products, activation products and actinides under saturated or partially saturated conditions is imperative to define a representative total hydrologic source term. This is particularly important given the heterogeneous distribution of radionuclides within testing centers. Data quality objectives which emphasize a combination of measurements and credible estimates of the hydrologic source term are a priority for near-field investigations at the Nevada Test Site

  12. Einstein's Theory A Rigorous Introduction for the Mathematically Untrained

    CERN Document Server

    Grøn, Øyvind

    2011-01-01

    This book provides an introduction to the theory of relativity and the mathematics used in its processes. Three elements of the book make it stand apart from previously published books on the theory of relativity. First, the book starts at a lower mathematical level than standard books with tensor calculus of sufficient maturity to make it possible to give detailed calculations of relativistic predictions of practical experiments. Self-contained introductions are given, for example, to vector calculus, differential calculus and integration. Second, in-between calculations have been included, making it possible for the non-technical reader to follow step-by-step calculations. Third, the conceptual development is gradual and rigorous in order to provide the inexperienced reader with a philosophically satisfying understanding of the theory. Einstein's Theory: A Rigorous Introduction for the Mathematically Untrained aims to provide the reader with a sound conceptual understanding of both the special and genera...

  13. Sonoelasticity to monitor mechanical changes during rigor and ageing.

    Science.gov (United States)

    Ayadi, A; Culioli, J; Abouelkaram, S

    2007-06-01

    We propose the use of sonoelasticity as a non-destructive method to monitor changes in the resistance of muscle fibres, unaffected by connective tissue. Vibrations were applied at low frequency to induce oscillations in soft tissues and an ultrasound transducer was used to detect the motions. The experiments were carried out on the M. biceps femoris muscles of three beef cattle. In addition to the sonoelasticity measurements, the changes in meat during rigor and ageing were followed by measurements of both the mechanical resistance of myofibres and pH. The variations of mechanical resistance and pH were compared to those of the sonoelastic variables (velocity and attenuation) at two frequencies. The relationships between pH and velocity or attenuation and between the velocity or attenuation and the stress at 20% deformation were highly correlated. We concluded that sonoelasticity is a non-destructive method that can be used to monitor mechanical changes in muscle fibres during rigor mortis and ageing.

  14. A rigorous proof for the Landauer-Büttiker formula

    DEFF Research Database (Denmark)

    Cornean, Horia Decebal; Jensen, Arne; Moldoveanu, V.

    Recently, Avron et al. shed new light on the question of quantum transport in mesoscopic samples coupled to particle reservoirs by semi-infinite leads. They rigorously treat the case when the sample undergoes an adiabatic evolution, thus generating a current through the leads, and prove the so-called...
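    For orientation, the textbook two-terminal Landauer-Büttiker relations that such rigorous treatments underpin are, for the current and the linear-response conductance in the spin-degenerate case:

```latex
% Two-terminal Landauer-Büttiker current and conductance (spin-degenerate)
I = \frac{2e}{h}\int \mathrm{d}E\; T(E)\,\bigl[f_{L}(E) - f_{R}(E)\bigr],
\qquad
G = \frac{2e^{2}}{h}\sum_{n} T_{n}
```

    where T(E) is the transmission through the sample and f_L, f_R are the Fermi functions of the two leads; the cited work concerns placing such formulas on a mathematically rigorous footing.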

  15. Type testing of devices with inserted radioactive sources

    International Nuclear Information System (INIS)

    Rolle, A.; Droste, B.; Dombrowski, H.

    2006-01-01

    In Germany devices with inserted radioactive sources can get a type approval if they comply with specific requirements. Whoever operates a device whose type has been approved in accordance with the German Radiation Protection Ordinance does not need an individual authorization. Such type approvals for free use are granted by the Federal Office for Radiation Protection (B.f.S.) on the basis of type testing performed by the Physikalisch-Technische Bundesanstalt (P.T.B.), the national metrology institute, and the Bundesanstalt für Materialforschung und -prüfung (B.A.M.), the Federal Institute for Materials Research and Testing. Main aspects of the assessment are the activity of the radioactive sources, the dose equivalent rate near the devices, the tamper-proofness and leak-tightness of the sources and the safety of the construction of the devices. With the new Radiation Protection Ordinance in 2001, more stringent requirements for a type approval were established. Experiences with the new regulations and the relevant assessment criteria applied by P.T.B. and B.A.M. will be presented. (authors)

  16. Statistically rigorous calculations do not support common input and long-term synchronization of motor-unit firings

    Science.gov (United States)

    Kline, Joshua C.

    2014-01-01

    Over the past four decades, various methods have been implemented to measure synchronization of motor-unit firings. In this work, we provide evidence that prior reports of the existence of universal common inputs to all motoneurons and the presence of long-term synchronization are misleading, because they did not use sufficiently rigorous statistical tests to detect synchronization. We developed a statistically based method (SigMax) for computing synchronization and tested it with data from 17,736 motor-unit pairs containing 1,035,225 firing instances from the first dorsal interosseous and vastus lateralis muscles—a data set one order of magnitude greater than that reported in previous studies. Only firing data, obtained from surface electromyographic signal decomposition with >95% accuracy, were used in the study. The data were not subjectively selected in any manner. Because of the size of our data set and the statistical rigor inherent to SigMax, we have confidence that the synchronization values that we calculated provide an improved estimate of physiologically driven synchronization. Compared with three other commonly used techniques, ours revealed three types of discrepancies that result from failing to use sufficient statistical tests necessary to detect synchronization. 1) On average, the z-score method falsely detected synchronization at 16 separate latencies in each motor-unit pair. 2) The cumulative sum method missed one out of every four synchronization identifications found by SigMax. 3) The common input assumption method identified synchronization from 100% of motor-unit pairs studied. SigMax revealed that only 50% of motor-unit pairs actually manifested synchronization. PMID:25210152
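    The general idea of a statistically thresholded synchronization test can be illustrated with the sketch below; it is only an illustration of the principle, not the SigMax algorithm itself: build a cross-correlation histogram of two firing-time trains and flag a latency bin as synchronized only if its count exceeds what a flat, independent-firing baseline would predict, with a correction for the number of bins inspected.

```python
# Illustrative sketch (not the SigMax algorithm): test whether the zero-lag bin
# of a cross-correlation histogram of two motor-unit firing trains exceeds a
# flat-baseline expectation at a corrected significance level.
import numpy as np
from scipy.stats import poisson

def sync_test(t1, t2, bin_ms=1.0, max_lag_ms=100.0, alpha=0.01):
    """t1, t2: firing times in ms. Returns (is_significant, zero_lag_count, threshold)."""
    lags = []
    for t in t1:
        d = t2 - t
        lags.extend(d[np.abs(d) <= max_lag_ms])
    edges = np.arange(-max_lag_ms, max_lag_ms + bin_ms, bin_ms)
    counts, _ = np.histogram(lags, bins=edges)
    baseline = counts.mean()                      # flat (independent-firing) expectation per bin
    # Bonferroni-style correction over the number of latency bins inspected.
    threshold = poisson.isf(alpha / len(counts), baseline)
    zero_bin = len(counts) // 2                   # bin containing zero lag
    return counts[zero_bin] > threshold, counts[zero_bin], threshold

# Example with synthetic, independent firing times (should usually be non-significant).
rng = np.random.default_rng(0)
t1 = np.sort(rng.uniform(0, 60_000, 600))   # ~10 Hz for 60 s
t2 = np.sort(rng.uniform(0, 60_000, 600))
print(sync_test(t1, t2))
```

    The Poisson baseline and the bin-count correction here stand in for the more elaborate statistical machinery that the paper describes.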

  17. Characterization of rigor mortis of longissimus dorsi and triceps ...

    African Journals Online (AJOL)

    24 h) of the longissimus dorsi (LD) and triceps brachii (TB) muscles as well as the shear force (meat tenderness) and colour were evaluated, aiming at characterizing the rigor mortis in the meat during industrial processing. Data statistic treatment demonstrated that carcass temperature and pH decreased gradually during ...

  18. SOURCE IST 2.0: development and beta testing

    International Nuclear Information System (INIS)

    Barber, D.H.; Iglesias, F.C.; Hoang, Y.; Dickson, L.W.; Dickson, R.S.; Richards, M.J.; Gibb, R.A.

    1999-01-01

    SOURCE IST 2.0 is the Industry Standard fission product release code that is being developed by Ontario Power Generation, New Brunswick Power, Hydro-Quebec, and Atomic Energy of Canada Ltd. This paper is a report on recent progress on requirement specification, code development, and module verification and validation activities. The theoretical basis for each model in the code is described in a module Software Theory Manual. The development of SOURCE IST 2.0 has required code design decisions about how to implement the software requirements. Development and module testing of the β1 release of SOURCE IST 2.0 (released in July 1999) have led to some interesting insights into fission product release modelling. The beta testing process has allowed code developers and analysts to refine the software requirements for the code. The need to verify physical reference data has guided some decisions on the code and data structure design. Examples of these design decisions are provided. Module testing, and verification and validation activities are discussed. These activities include code-targeted testing, stress testing, code inspection, comparison of code with requirements, and comparison of code results with independent algebraic, numerical, or semi-algebraic calculations. The list of isotopes to be modelled by SOURCE IST 2.0 provides an example of a subset of a reference data set. Isotopes are present on the list for a variety of reasons: personnel or public dose, equipment dose (for environmental qualification), fission rate and actinide modelling, or stable (or long-lived) targets for activation processes. To accommodate controlled changes to the isotope list, the isotope list and associated nuclear data are contained in a reference data file. The questions of multiple computing platforms, and of Year 2000 compliance have been addressed by programming rules for the code. By developing and testing modules on most of the different platforms on which the code is intended
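    One of the verification activities listed above, comparing code results with independent algebraic calculations, can be illustrated with a generic module test; the example below is hypothetical and not part of SOURCE IST 2.0, and simply checks a numerical decay integration against the exact analytic solution.

```python
# Hypothetical module-verification test in the spirit described above:
# compare a numerical integration of dN/dt = -lambda*N against the exact
# analytic solution N(t) = N0 * exp(-lambda*t).
import math
import unittest

def decay_step(n0: float, decay_const: float, dt: float, substeps: int = 10_000) -> float:
    """Simple explicit-Euler integration of radioactive decay over time dt."""
    n, h = n0, dt / substeps
    for _ in range(substeps):
        n -= decay_const * n * h
    return n

class TestDecayModule(unittest.TestCase):
    def test_against_analytic_solution(self):
        n0, lam, t = 1.0e6, math.log(2) / 5.27, 10.0   # half-life of 5.27 time units
        expected = n0 * math.exp(-lam * t)
        self.assertAlmostEqual(decay_step(n0, lam, t), expected, delta=1e-3 * expected)

if __name__ == "__main__":
    unittest.main()
```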

  19. Design and qualification testing of a strontium-90 fluoride heat source

    International Nuclear Information System (INIS)

    Fullam, H.T.

    1981-12-01

    The Strontium Heat Source Development Program began at the Pacific Northwest Laboratory (PNL) in 1972 and is scheduled to be completed by the end of FY-1981. The program is currently funded by the US Department of Energy (DOE) By-Product Utilization Program. The primary objective of the program has been to develop the data and technology required to permit the licensing of power systems for terrestrial applications that utilize 90 SrF 2 -fueled radioisotope heat sources. A secondary objective of the program has been to design and qualification-test a general purpose 90 SrF 2 -fueled heat source. The effort expended in the design and testing of the heat source is described. Detailed information is included on: heat source design, licensing requirements, and qualification test requirements; the qualification test procedures; and the fabrication and testing of capsules of various materials. The results obtained in the qualification tests show that the outer capsule design proposed for the 90 SrF 2 heat source is capable of meeting current licensing requirements when Hastelloy S is used as the outer capsule material. The data also indicate that an outer capsule of Hastelloy C-4 would probably also meet licensing requirements, although Hastelloy S is the preferred material. Therefore, based on the results of this study, the general purpose 90 SrF 2 heat source will consist of a standard WESF Hastelloy C-276 inner capsule filled with 90 SrF 2 and a Hastelloy S outer capsule having a 2.375-in. inner diameter and 0.500-in. wall thickness. The end closures for the outer capsule will utilize an interlocking joint design requiring a 0.1-in. penetration closure weld

  20. Software metrics a rigorous and practical approach

    CERN Document Server

    Fenton, Norman

    2014-01-01

    A Framework for Managing, Measuring, and Predicting Attributes of Software Development Products and ProcessesReflecting the immense progress in the development and use of software metrics in the past decades, Software Metrics: A Rigorous and Practical Approach, Third Edition provides an up-to-date, accessible, and comprehensive introduction to software metrics. Like its popular predecessors, this third edition discusses important issues, explains essential concepts, and offers new approaches for tackling long-standing problems.New to the Third EditionThis edition contains new material relevant

  1. [A new formula for the measurement of rigor mortis: the determination of the FRR-index (author's transl)].

    Science.gov (United States)

    Forster, B; Ropohl, D; Raule, P

    1977-07-05

    The manual examination of rigor mortis as currently used and its often subjective evaluation frequently produced highly incorrect deductions. It is therefore desirable that such inaccuracies should be replaced by objective measurement of rigor mortis at the extremities. For that purpose, a method is described which can also be applied in on-the-spot investigations, and a new formula for the determination of rigor mortis indices (FRR) is introduced.

  2. Design and tests of a package for the transport of radioactive sources

    International Nuclear Information System (INIS)

    Santos, Paulo de Oliveira

    2011-01-01

    The Type A package was designed for the transportation of seven cobalt-60 sources with a total activity of 1 GBq. The shield thickness needed to meet the dose rate and transport index established by the radioactive material transport regulations was calculated with the code MCNP (Monte Carlo N-Particle Transport Code, Version 5). The sealed cobalt-60 sources were tested for leakage according to the standard ISO 9978:1992 (E). The package was tested according to the CNEN regulation for the transport of radioactive material. The leakage test results for the sources and the package tests demonstrate that the transport can be safely performed from the CDTN to the steelmaking industries
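    The kind of shield sizing that a transport code such as MCNP refines can be sketched with a simple point-source, narrow-beam estimate; the constants below are approximate and the neglect of buildup and real geometry is exactly why a Monte Carlo calculation is used for the actual design.

```python
# Back-of-the-envelope lead shield sizing for a Co-60 point source.
# Narrow-beam model with approximate constants; a transport code such as
# MCNP is needed for a defensible design (buildup, geometry, scattering).
import math

GAMMA_CONST = 0.35      # mSv*m^2/(h*GBq) for Co-60, approximate
MU_LEAD = 0.66          # 1/cm, approximate linear attenuation coefficient at ~1.25 MeV

def dose_rate(activity_gbq: float, distance_m: float, lead_cm: float) -> float:
    """Point-source dose rate times exponential attenuation (mSv/h)."""
    return GAMMA_CONST * activity_gbq / distance_m**2 * math.exp(-MU_LEAD * lead_cm)

def required_thickness(activity_gbq: float, distance_m: float, limit_msv_h: float) -> float:
    """Lead thickness (cm) needed to bring the dose rate down to the limit."""
    unshielded = GAMMA_CONST * activity_gbq / distance_m**2
    return max(0.0, math.log(unshielded / limit_msv_h) / MU_LEAD)

# Example: 1 GBq total activity, an assumed 0.1 m reference distance,
# target of 2 mSv/h at the package surface.
print(required_thickness(1.0, 0.1, 2.0))
```

    For the example values this prints a few centimetres of lead, which would then be checked and adjusted with the full transport calculation.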

  3. Reconciling the Rigor-Relevance Dilemma in Intellectual Capital Research

    Science.gov (United States)

    Andriessen, Daniel

    2004-01-01

    This paper raises the issue of research methodology for intellectual capital and other types of management research by focusing on the dilemma of rigour versus relevance. The more traditional explanatory approach to research often leads to rigorous results that are not of much help to solve practical problems. This paper describes an alternative…

  4. Assessment of the Methodological Rigor of Case Studies in the Field of Management Accounting Published in Journals in Brazil

    Directory of Open Access Journals (Sweden)

    Kelly Cristina Mucio Marques

    2015-04-01

    This study aims to assess the methodological rigor of case studies in management accounting published in Brazilian journals. The study is descriptive. The data were collected using documentary research and content analysis, and 180 papers published from 2008 to 2012 in accounting journals rated as A2, B1, and B2 that were classified as case studies were selected. Based on the literature, we established a set of 15 criteria that we expected to be identified (either explicitly or implicitly) in the case studies to classify those case studies as appropriate from the standpoint of methodological rigor. These criteria were partially met by the papers analyzed. The aspects less aligned with those proposed in the literature were the following: little emphasis on justifying the need to understand phenomena in context; lack of explanation of the reason for choosing the case study strategy; the predominant use of questions that do not enable deeper analysis; many studies based on only one source of evidence; little use of data and information triangulation; little emphasis on the data collection method; a high number of cases in which confusion between case study as a research strategy and as a data collection method was detected; a low number of papers reporting the method of data analysis; few reports on a study's contributions; and a minority highlighting the issues requiring further research. In conclusion, the method used to apply case studies to management accounting must be improved because few studies showed rigorous application of the procedures that this strategy requires.

  5. Open source innovation phenomenon, participant behaviour, impact

    CERN Document Server

    Herstatt, Cornelius

    2015-01-01

    Open Source Innovation (OSI) has gained considerable momentum in recent years. Academic and management practice interest grows as more and more end-users consider and even participate in Open Source product development like Linux, Android, or Wikipedia. Open Source Innovation: Phenomenon, Participant Behaviour, Impact brings together rigorous academic research and business importance in scrutinizing OSI from three perspectives: The Phenomenon, Participants' Behavior, and Business Implications. The first section introduces OSI artefacts, including who is participating and why, and provides

  6. Quality properties of pre- and post-rigor beef muscle after interventions with high frequency ultrasound.

    Science.gov (United States)

    Sikes, Anita L; Mawson, Raymond; Stark, Janet; Warner, Robyn

    2014-11-01

    The delivery of a consistent quality product to the consumer is vitally important for the food industry. The aim of this study was to investigate the potential for using high frequency ultrasound applied to pre- and post-rigor beef muscle on the metabolism and subsequent quality. High frequency ultrasound (600kHz at 48kPa and 65kPa acoustic pressure) applied to post-rigor beef striploin steaks resulted in no significant effect on the texture (peak force value) of cooked steaks as measured by a Tenderometer. There was no added benefit of ultrasound treatment above that of the normal ageing process after ageing of the steaks for 7days at 4°C. Ultrasound treatment of post-rigor beef steaks resulted in a darkening of fresh steaks but after ageing for 7days at 4°C, the ultrasound-treated steaks were similar in colour to that of the aged, untreated steaks. High frequency ultrasound (2MHz at 48kPa acoustic pressure) applied to pre-rigor beef neck muscle had no effect on the pH, but the calculated exhaustion factor suggested that there was some effect on metabolism and actin-myosin interaction. However, the resultant texture of cooked, ultrasound-treated muscle was lower in tenderness compared to the control sample. After ageing for 3weeks at 0°C, the ultrasound-treated samples had the same peak force value as the control. High frequency ultrasound had no significant effect on the colour parameters of pre-rigor beef neck muscle. This proof-of-concept study showed no effect of ultrasound on quality but did indicate that the application of high frequency ultrasound to pre-rigor beef muscle shows potential for modifying ATP turnover and further investigation is warranted. Crown Copyright © 2014. Published by Elsevier B.V. All rights reserved.

  7. Rigorous derivation from Landau-de Gennes theory to Ericksen-Leslie theory

    OpenAIRE

    Wang, Wei; Zhang, Pingwen; Zhang, Zhifei

    2013-01-01

    Starting from Beris-Edwards system for the liquid crystal, we present a rigorous derivation of Ericksen-Leslie system with general Ericksen stress and Leslie stress by using the Hilbert expansion method.

  8. Striation Patterns of Ox Muscle in Rigor Mortis

    Science.gov (United States)

    Locker, Ronald H.

    1959-01-01

    Ox muscle in rigor mortis offers a selection of myofibrils fixed at varying degrees of contraction from sarcomere lengths of 3.7 to 0.7 µ. A study of this material by phase contrast and electron microscopy has revealed four distinct successive patterns of contraction, including besides the familiar relaxed and contracture patterns, two intermediate types (2.4 to 1.9 µ, 1.8 to 1.5 µ) not previously well described. PMID:14417790

  9. Explosion overpressure test series: General-Purpose Heat Source development: Safety Verification Test program

    International Nuclear Information System (INIS)

    Cull, T.A.; George, T.G.; Pavone, D.

    1986-09-01

    The General-Purpose Heat Source (GPHS) is a modular, radioisotope heat source that will be used in radioisotope thermoelectric generators (RTGs) to supply electric power for space missions. The first two uses will be the NASA Galileo and the ESA Ulysses missions. The RTG for these missions will contain 18 GPHS modules, each of which contains four 238 PuO 2 -fueled clads and generates 250 W(t). A series of Safety Verification Tests (SVTs) was conducted to assess the ability of the GPHS modules to contain the plutonia in accident environments. Because a launch pad or postlaunch explosion of the Space Transportation System vehicle (space shuttle) is a conceivable accident, the SVT plan included a series of tests that simulated the overpressure exposure the RTG and GPHS modules could experience in such an event. Results of these tests, in which we used depleted UO 2 as a fuel simulant, suggest that exposure to overpressures as high as 15.2 MPa (2200 psi), without subsequent impact, does not result in a release of fuel

  10. Mathematical framework for fast and rigorous track fit for the ZEUS detector

    Energy Technology Data Exchange (ETDEWEB)

    Spiridonov, Alexander

    2008-12-15

    In this note we present a mathematical framework for a rigorous approach to a common track fit for trackers located in the inner region of the ZEUS detector. The approach makes use of the Kalman filter and offers a rigorous treatment of magnetic field inhomogeneity, multiple scattering and energy loss. We describe mathematical details of the implementation of the Kalman filter technique with a reduced amount of computations for a cylindrical drift chamber, barrel and forward silicon strip detectors and a forward straw drift chamber. Options with homogeneous and inhomogeneous field are discussed. The fitting of tracks in one ZEUS event takes about 20 ms on a standard PC. (orig.)
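    The predict/update structure that such a track fit builds on can be illustrated with a generic one-dimensional sketch; this is an assumed straight-line model for illustration only, not the ZEUS implementation, which additionally treats field inhomogeneity, multiple scattering and energy loss.

```python
# Generic Kalman-filter fit of a straight-line track in one projection,
# shown only to illustrate the predict/update structure such a framework
# formalises (no magnetic field, no scattering, no energy loss).
import numpy as np

def kalman_line_fit(z_planes, x_hits, sigma_hit):
    """State = [x at the current plane, slope dx/dz]; returns (state, covariance)."""
    state = np.zeros(2)
    cov = np.diag([1e6, 1e6])            # vague prior
    R = np.array([[sigma_hit**2]])       # hit resolution
    H = np.array([[1.0, 0.0]])           # only x is measured
    z_prev = z_planes[0]
    for z, x in zip(z_planes, x_hits):
        # Predict: transport the state from z_prev to z (straight line, no process noise).
        F = np.array([[1.0, z - z_prev], [0.0, 1.0]])
        state = F @ state
        cov = F @ cov @ F.T
        # Update: weigh in the measured hit position x at this plane.
        S = H @ cov @ H.T + R
        K = cov @ H.T @ np.linalg.inv(S)
        state = state + (K @ (np.array([x]) - H @ state)).ravel()
        cov = (np.eye(2) - K @ H) @ cov
        z_prev = z
    return state, cov

# Example: hits from x = 1 + 0.5*z smeared by 0.1; the fitted state at the last
# plane (z = 10) should be close to [6.0, 0.5].
rng = np.random.default_rng(1)
z = np.linspace(0.0, 10.0, 11)
x = 1.0 + 0.5 * z + rng.normal(0.0, 0.1, z.size)
print(kalman_line_fit(z, x, 0.1)[0])
```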

  11. Installation and Characterization of Charged Particle Sources for Space Environmental Effects Testing

    Science.gov (United States)

    Skevington, Jennifer L.

    2010-01-01

    Charged particle sources are integral devices used by Marshall Space Flight Center's Environmental Effects Branch (EM50) in order to simulate space environments for accurate testing of materials and systems. By using these sources inside custom vacuum systems, materials can be tested to determine charging and discharging properties as well as resistance to sputter damage. This knowledge can enable scientists and engineers to choose proper materials that will not fail in harsh space environments. This paper combines the steps utilized to build a low energy electron gun (the "Skevington 3000") as well as the methods used to characterize the output of both the Skevington 3000 and a manufactured Xenon ion source. Such characterizations include beam flux, beam uniformity, and beam energy. Both sources were deemed suitable for simulating environments in future testing.

  12. Learning from Science and Sport - How we, Safety, "Engage with Rigor"

    Science.gov (United States)

    Herd, A.

    2012-01-01

    As the world of spaceflight safety is relatively small and potentially inward-looking, we need to be aware of the "outside world". We should then try to remind ourselves to be open to the possibility that data, knowledge or experience from outside of the spaceflight community may provide some constructive alternate perspectives. This paper will assess aspects from two seemingly tangential fields, science and sport, and align these with the world of safety. In doing so some useful insights will be given to the challenges we face and may provide solutions relevant in our everyday (of safety engineering). Sport, particularly a contact sport such as rugby union, requires direct interaction between members of two (opposing) teams. Professional, accurately timed and positioned interaction for a desired outcome. These interactions, whilst an essential part of the game, are however not without their constraints. The rugby scrum has constraints as to the formation and engagement of the two teams. The controlled engagement provides for an interaction between the two teams in a safe manner. The constraints arising from the reality that an incorrect engagement could cause serious injury to members of either team. In academia, scientific rigor is applied to assure that the arguments provided and the conclusions drawn in academic papers presented for publication are valid, legitimate and credible. The scientific goal of the need for rigor may be expressed in the example of achieving a statistically relevant sample size, n, in order to assure analysis validity of the data pool. A failure to apply rigor could then place the entire study at risk of failing to have the respective paper published. This paper will consider the merits of these two different aspects, scientific rigor and sports engagement, and offer a reflective look at how this may provide a "modus operandi" for safety engineers at any level whether at their desks (creating or reviewing safety assessments) or in a

  13. Source effects on surface waves from Nevada Test Site explosions

    International Nuclear Information System (INIS)

    Patton, H.J.; Vergino, E.S.

    1981-11-01

    Surface waves recorded on the Lawrence Livermore National Laboratory (LLNL) digital network have been used to study five underground nuclear explosions detonated in Yucca Valley at the Nevada Test Site. The purpose of this study is to characterize the reduced displacement potential (RDP) at low frequencies and to test secondary source models of underground explosions. The observations consist of Rayleigh- and Love-wave amplitude and phase spectra in the frequency range 0.03 to 0.16 Hz. We have found that Rayleigh-wave spectral amplitudes are modeled well by a RDP with little or no overshoot for explosions detonated in alluvium and tuff. On the basis of comparisons between observed and predicted source phase, the spall closure source proposed by Viecelli does not appear to be a significant source of Rayleigh waves that reach the far field. We tested two other secondary source models, the strike-slip, tectonic strain release model proposed by Toksoez and Kehrer and the dip-slip thrust model of Masse. The surface-wave observations do not provide sufficient information to discriminate between these models at the low F-values (0.2 to 0.8) obtained for these explosions. In the case of the strike-slip model, the principal stress axes inferred from the fault slip angle and strike angle are in good agreement with the regional tectonic stress field for all but one explosion, Nessel. The results of the Nessel explosion suggest a mechanism other than tectonic strain release

  14. Differential rigor development in red and white muscle revealed by simultaneous measurement of tension and stiffness.

    Science.gov (United States)

    Kobayashi, Masahiko; Takemori, Shigeru; Yamaguchi, Maki

    2004-02-10

    Based on the molecular mechanism of rigor mortis, we have proposed that stiffness (elastic modulus evaluated with tension response against minute length perturbations) can be a suitable index of post-mortem rigidity in skeletal muscle. To trace the developmental process of rigor mortis, we measured stiffness and tension in both red and white rat skeletal muscle kept in liquid paraffin at 37 and 25 degrees C. White muscle (in which type IIB fibres predominate) developed stiffness and tension significantly more slowly than red muscle, except for soleus red muscle at 25 degrees C, which showed disproportionately slow rigor development. In each of the examined muscles, stiffness and tension developed more slowly at 25 degrees C than at 37 degrees C. In each specimen, tension always reached its maximum level earlier than stiffness, and then decreased more rapidly and markedly than stiffness. These phenomena may account for the sequential progress of rigor mortis in human cadavers.

  15. Sources of Variation in Creep Testing

    Science.gov (United States)

    Loewenthal, William S.; Ellis, David L.

    2011-01-01

    Creep rupture is an important material characteristic for the design of rocket engines. It was observed during the characterization of GRCop-84 that the complete data set had nearly 4 orders of magnitude of scatter. This scatter likely confounded attempts to determine how creep performance was influenced by manufacturing. It was unclear if this variation was from the testing, the material, or both. Sources of variation were examined by conducting tests on identically processed specimens at the same specified stresses and temperatures. Significant differences existed between the five constant-load creep frames. The specimen temperature was higher than the desired temperature by as much as 43 C. It was also observed that the temperature gradient was up to 44 C. Improved specimen temperature control minimized temperature variations. The data from additional tests demonstrated that the results from all five frames were comparable. The variation decreased to 1/2 order of magnitude from 2 orders of magnitude for the baseline data set. Independent determination of creep rates in a reference load frame closely matched the creep rates determined after the modifications. Testing in helium tended to decrease the sample temperature gradient, but helium was not a significant improvement over vacuum.

  16. Evaluation of physical dimension changes as nondestructive measurements for monitoring rigor mortis development in broiler muscles.

    Science.gov (United States)

    Cavitt, L C; Sams, A R

    2003-07-01

    Studies were conducted to develop a non-destructive method for monitoring the rate of rigor mortis development in poultry and to evaluate the effectiveness of electrical stimulation (ES). In the first study, 36 male broilers in each of two trials were processed at 7 wk of age. After being bled, half of the birds received electrical stimulation (400 to 450 V, 400 to 450 mA, for seven pulses of 2 s on and 1 s off), and the other half were designated as controls. At 0.25 and 1.5 h postmortem (PM), carcasses were evaluated for the angles of the shoulder, elbow, and wing tip and the distance between the elbows. Breast fillets were harvested at 1.5 h PM (after chilling) from all carcasses. Fillet samples were excised and frozen for later measurement of pH and R-value, and the remainder of each fillet was held on ice until 24 h postmortem. Shear value and pH means were significantly lower, and R-value means were higher (P < 0.05), indicating an acceleration of rigor mortis by ES. The physical dimensions of the shoulder and elbow changed (P < 0.05) with rigor mortis development and with ES. These results indicate that physical measurements of the wings may be useful as a nondestructive indicator of rigor development and for monitoring the effectiveness of ES. In the second study, 60 male broilers in each of two trials were processed at 7 wk of age. At 0.25, 1.5, 3.0, and 6.0 h PM, carcasses were evaluated for the distance between the elbows. At each time point, breast fillets were harvested from each carcass. Fillet samples were excised and frozen for later measurement of pH and sarcomere length, whereas the remainder of each fillet was held on ice until 24 h PM. Shear value and pH means decreased (P < 0.05) with rigor mortis development. Elbow distance decreased (P < 0.05) with rigor development and was correlated (P < 0.05) with rigor mortis development in broiler carcasses.

  17. A Framework for Rigorously Identifying Research Gaps in Qualitative Literature Reviews

    DEFF Research Database (Denmark)

    Müller-Bloch, Christoph; Kranz, Johann

    2015-01-01

    Identifying research gaps is a fundamental goal of literature reviewing. While it is widely acknowledged that literature reviews should identify research gaps, there are no methodological guidelines for how to identify research gaps in qualitative literature reviews ensuring rigor and replicability... Our study addresses this gap and proposes a framework that should help scholars in this endeavor without stifling creativity. To develop the framework we thoroughly analyze the state-of-the-art procedure of identifying research gaps in 40 recent literature reviews using a grounded theory approach... Based on the data, we subsequently derive a framework for identifying research gaps in qualitative literature reviews and demonstrate its application with an example. Our results provide a modus operandi for identifying research gaps, thus enabling scholars to conduct literature reviews more rigorously...

  18. Rigorous Analysis of a Randomised Number Field Sieve

    OpenAIRE

    Lee, Jonathan; Venkatesan, Ramarathnam

    2018-01-01

    Factorisation of integers $n$ is of number theoretic and cryptographic significance. The Number Field Sieve (NFS) introduced circa 1990, is still the state of the art algorithm, but no rigorous proof that it halts or generates relationships is known. We propose and analyse an explicitly randomised variant. For each $n$, we show that these randomised variants of the NFS and Coppersmith's multiple polynomial sieve find congruences of squares in expected times matching the best-known heuristic e...
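    For context, the relationship both the NFS and Coppersmith's multiple polynomial sieve ultimately hunt for is the classical congruence of squares, which yields a nontrivial factor of n:

```latex
% Congruence of squares: the end product of the sieving stage
x^{2} \equiv y^{2} \pmod{n}, \qquad x \not\equiv \pm y \pmod{n}
\;\Longrightarrow\; 1 < \gcd(x - y,\, n) < n
```

    Collecting enough smooth relations and combining them into such a congruence is the part of the algorithm whose running time the randomised analysis addresses.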

  19. Testing methods of ECR ion source experimental platform

    International Nuclear Information System (INIS)

    Zhou Changgeng; Hu Yonghong; Li Yan

    2006-12-01

    The principle and structure of the ECR ion source experimental platform are introduced. The methods for testing the parameters of each main component, and the comprehensive parameters under conditions of a given beam current and beam spot diameter, are summarized for the manufacturing process, and some representative test data are given. The outstanding problems (the plasma density in the discharge chamber and the exact hydrogen flow cannot be measured during operation) and their possible resolutions are also put forward. (authors)

  20. Study designs for identifying risk compensation behavior among users of biomedical HIV prevention technologies: balancing methodological rigor and research ethics.

    Science.gov (United States)

    Underhill, Kristen

    2013-10-01

    The growing evidence base for biomedical HIV prevention interventions - such as oral pre-exposure prophylaxis, microbicides, male circumcision, treatment as prevention, and eventually prevention vaccines - has given rise to concerns about the ways in which users of these biomedical products may adjust their HIV risk behaviors based on the perception that they are prevented from infection. Known as risk compensation, this behavioral adjustment draws on the theory of "risk homeostasis," which has previously been applied to phenomena as diverse as Lyme disease vaccination, insurance mandates, and automobile safety. Little rigorous evidence exists to answer risk compensation concerns in the biomedical HIV prevention literature, in part because the field has not systematically evaluated the study designs available for testing these behaviors. The goals of this Commentary are to explain the origins of risk compensation behavior in risk homeostasis theory, to reframe risk compensation as a testable response to the perception of reduced risk, and to assess the methodological rigor and ethical justification of study designs aiming to isolate risk compensation responses. Although the most rigorous methodological designs for assessing risk compensation behavior may be unavailable due to ethical flaws, several strategies can help investigators identify potential risk compensation behavior during Phase II, Phase III, and Phase IV testing of new technologies. Where concerns arise regarding risk compensation behavior, empirical evidence about the incidence, types, and extent of these behavioral changes can illuminate opportunities to better support the users of new HIV prevention strategies. This Commentary concludes by suggesting a new way to conceptualize risk compensation behavior in the HIV prevention context. Copyright © 2013 Elsevier Ltd. All rights reserved.

  1. A rigorous pole representation of multilevel cross sections and its practical applications

    International Nuclear Information System (INIS)

    Hwang, R.N.

    1987-01-01

    In this article a rigorous method for representing the multilevel cross sections and its practical applications are described. It is a generalization of the rationale suggested by de Saussure and Perez for the s-wave resonances. A computer code WHOPPER has been developed to convert the Reich-Moore parameters into the pole and residue parameters in momentum space. Sample calculations have been carried out to illustrate that the proposed method preserves the rigor of the Reich-Moore cross sections exactly. An analytical method has been developed to evaluate the pertinent Doppler-broadened line shape functions. A discussion is presented on how to minimize the number of pole parameters so that the existing reactor codes can be best utilized
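    Schematically, and with the normalisation and sign conventions as defined in the paper itself, a pole representation of this kind writes the multilevel cross section as a sum of simple poles in the momentum-like variable sqrt(E), which is what makes analytic Doppler broadening via complex line-shape functions possible:

```latex
% Schematic pole expansion in momentum space; p_k are poles and r_k residues
% obtained from the Reich-Moore parameters (normalisation per the original work)
\sigma(E) \;\simeq\; \frac{1}{E}\,\mathrm{Re}\!\sum_{k}\frac{r_{k}}{\sqrt{E}-p_{k}}
```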

  2. Effect of pre-rigor stretch and various constant temperatures on the rate of post-mortem pH fall, rigor mortis and some quality traits of excised porcine biceps femoris muscle strips.

    Science.gov (United States)

    Vada-Kovács, M

    1996-01-01

    Porcine biceps femoris strips of 10 cm original length were stretched by 50% and fixed within 1 hr post mortem then subjected to temperatures of 4 °, 15 ° or 36 °C until they attained their ultimate pH. Unrestrained control muscle strips, which were left to shorten freely, were similarly treated. Post-mortem metabolism (pH, R-value) and shortening were recorded; thereafter ultimate meat quality traits (pH, lightness, extraction and swelling of myofibrils) were determined. The rate of pH fall at 36 °C, as well as ATP breakdown at 36 and 4 °C, were significantly reduced by pre-rigor stretch. The relationship between R-value and pH indicated cold shortening at 4 °C. Myofibrils isolated from pre-rigor stretched muscle strips kept at 36 °C showed the most severe reduction of hydration capacity, while paleness remained below extreme values. However, pre-rigor stretched myofibrils - when stored at 4 °C - proved to be superior to shortened ones in their extractability and swelling.

  3. Experimental evaluation of rigor mortis IX. The influence of the breaking (mechanical solution) on the development of rigor mortis.

    Science.gov (United States)

    Krompecher, Thomas; Gilles, André; Brandt-Casadevall, Conception; Mangin, Patrice

    2008-04-07

    Objective measurements were carried out to study the possible re-establishment of rigor mortis on rats after "breaking" (mechanical solution). Our experiments showed that: (1) cadaveric rigidity can re-establish after breaking; (2) a significant rigidity can reappear if the breaking occurs before the process is complete; (3) rigidity will be considerably weaker after the breaking; (4) the time course of the intensity does not change in comparison to the controls: the re-establishment begins immediately after the breaking, maximal values are reached at the same time as in the controls, and the course of the resolution is the same as in the controls.

  4. Orthodontic brackets removal under shear and tensile bond strength resistance tests - a comparative test between light sources

    Science.gov (United States)

    Silva, P. C. G.; Porto-Neto, S. T.; Lizarelli, R. F. Z.; Bagnato, V. S.

    2008-03-01

    We investigated whether a new LED system delivers sufficient energy to promote adequate shear and tensile bond strength under standardized tests. LEDs at 470 ± 10 nm can be used to photocure composite during bracket fixation. Advantages in resistance to tensile and shear bond strength when these systems are used are necessary to justify their clinical use. Forty-eight extracted human premolars and two light sources were selected, one halogen lamp and one LED system. Brackets for premolars were bonded with composite resin. Samples were submitted to standardized tests. A comparison between the two light sources under the shear bond strength test gave similar results; however, the tensile bond test showed distinct results: a statistical difference at the 1% level between exposure times (40 and 60 seconds) and also an interaction between light source and exposure time. The best result was obtained with the halogen lamp used for 60 seconds, even during re-bonding; however, the LED system can be used for bonding and re-bonding brackets if the power density can be increased.

  5. Upgrade of the BATMAN test facility for H- source development

    Science.gov (United States)

    Heinemann, B.; Fröschle, M.; Falter, H.-D.; Fantz, U.; Franzen, P.; Kraus, W.; Nocentini, R.; Riedl, R.; Ruf, B.

    2015-04-01

    The development of a radio frequency (RF) driven source for negative hydrogen ions for the neutral beam heating devices of fusion experiments has been successfully carried out at IPP since 1996 on the test facility BATMAN. The required ITER parameters have been achieved with the prototype source consisting of a cylindrical driver on the back side of a racetrack like expansion chamber. The extraction system, called "Large Area Grid" (LAG) was derived from a positive ion accelerator from ASDEX Upgrade (AUG) using its aperture size (ø 8 mm) and pattern but replacing the first two electrodes and masking down the extraction area to 70 cm2. BATMAN is a well diagnosed and highly flexible test facility which will be kept operational in parallel to the half size ITER source test facility ELISE for further developments to improve the RF efficiency and the beam properties. It is therefore planned to upgrade BATMAN with a new ITER-like grid system (ILG) representing almost one ITER beamlet group, namely 5 × 14 apertures (ø 14 mm). Additionally to the standard three grid extraction system a repeller electrode upstream of the grounded grid can optionally be installed which is positively charged against it by 2 kV. This is designated to affect the onset of the space charge compensation downstream of the grounded grid and to reduce the backstreaming of positive ions from the drift space backwards into the ion source. For magnetic filter field studies a plasma grid current up to 3 kA will be available as well as permanent magnets embedded into a diagnostic flange or in an external magnet frame. Furthermore different source vessels and source configurations are under discussion for BATMAN, e.g. using the AUG type racetrack RF source as driver instead of the circular one or modifying the expansion chamber for a more flexible position of the external magnet frame.

  6. Upgrade of the BATMAN test facility for H− source development

    International Nuclear Information System (INIS)

    Heinemann, B.; Fröschle, M.; Falter, H.-D.; Fantz, U.; Franzen, P.; Kraus, W.; Nocentini, R.; Riedl, R.; Ruf, B.

    2015-01-01

    The development of a radio frequency (RF) driven source for negative hydrogen ions for the neutral beam heating devices of fusion experiments has been successfully carried out at IPP since 1996 on the test facility BATMAN. The required ITER parameters have been achieved with the prototype source consisting of a cylindrical driver on the back side of a racetrack like expansion chamber. The extraction system, called “Large Area Grid” (LAG) was derived from a positive ion accelerator from ASDEX Upgrade (AUG) using its aperture size (ø 8 mm) and pattern but replacing the first two electrodes and masking down the extraction area to 70 cm2. BATMAN is a well diagnosed and highly flexible test facility which will be kept operational in parallel to the half size ITER source test facility ELISE for further developments to improve the RF efficiency and the beam properties. It is therefore planned to upgrade BATMAN with a new ITER-like grid system (ILG) representing almost one ITER beamlet group, namely 5 × 14 apertures (ø 14 mm). Additionally to the standard three grid extraction system a repeller electrode upstream of the grounded grid can optionally be installed which is positively charged against it by 2 kV. This is designated to affect the onset of the space charge compensation downstream of the grounded grid and to reduce the backstreaming of positive ions from the drift space backwards into the ion source. For magnetic filter field studies a plasma grid current up to 3 kA will be available as well as permanent magnets embedded into a diagnostic flange or in an external magnet frame. Furthermore different source vessels and source configurations are under discussion for BATMAN, e.g. using the AUG type racetrack RF source as driver instead of the circular one or modifying the expansion chamber for a more flexible position of the external magnet frame

  7. Next Generation of Leaching Tests

    Science.gov (United States)

    A corresponding abstract has been cleared for this presentation. The four methods comprising the Leaching Environmental Assessment Framework are described along with the tools to support implementation of the more rigorous and accurate source terms that are developed using LEAF ...

  8. Effects of post mortem temperature on rigor tension, shortening and ...

    African Journals Online (AJOL)

    Fully developed rigor mortis in muscle is characterised by maximum loss of extensibility. The course of post mortem changes in ostrich muscle was studied by following isometric tension, shortening and change in pH during the first 24 h post mortem within muscle strips from the muscularis gastrocnemius, pars interna at ...

  9. Project of a test stand for cyclotron ion sources

    International Nuclear Information System (INIS)

    Buettig, H.; Dietrich, J.; Merker, H.; Odrich, H.; Preusche, S.; Weissig, J.

    1978-10-01

    This work describes the design and construction of a test stand for testing and optimizing the ion sources of the Rossendorf U-120 cyclotron. The design procedure and the construction of the electromagnet, the vacuum chamber with monant, the vacuum system, the power supply and the detection system are described. The results of calculations of the motion of ions in the magnetic field are presented. (author)

  10. Pre-rigor temperature and the relationship between lamb tenderisation, free water production, bound water and dry matter.

    Science.gov (United States)

    Devine, Carrick; Wells, Robyn; Lowe, Tim; Waller, John

    2014-01-01

    The M. longissimus from lambs electrically stimulated at 15 min post-mortem were removed after grading, wrapped in polythene film and held at 4 (n=6), 7 (n=6), 15 (n=6, n=8) and 35°C (n=6) until rigor mortis, then aged at 15°C for 0, 4, 24 and 72 h post-rigor. Centrifuged free water increased exponentially, and bound water, dry matter and shear force decreased exponentially over time. Decreases in shear force and increases in free water were closely related (r²=0.52) and were unaffected by pre-rigor temperatures. © 2013.

  11. Acoustic emission non-destructive testing of structures using source location techniques.

    Energy Technology Data Exchange (ETDEWEB)

    Beattie, Alan G.

    2013-09-01

    The technology of acoustic emission (AE) testing has been advanced and used at Sandia for the past 40 years. AE has been used on structures including pressure vessels, fire bottles, wind turbines, gas wells, nuclear weapons, and solar collectors. This monograph begins with background topics in acoustics and instrumentation and then focuses on current acoustic emission technology. It covers the overall design and system setups for a test, with a wind turbine blade as the object. Test analysis is discussed with an emphasis on source location. Three test examples are presented, two on experimental wind turbine blades and one on aircraft fire extinguisher bottles. Finally, the code for a FORTRAN source location program is given as an example of a working analysis program. Throughout the document, the stress is on actual testing of real structures, not on laboratory experiments.
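
    The source-location step emphasized above can be illustrated with a small planar example: given arrival times at several sensors and an assumed wave speed, the event position and emission time follow from a least-squares fit. This is not the FORTRAN program from the monograph, only a hedged sketch; the sensor layout, wave speed and event below are invented.

```python
# Generic planar acoustic-emission source location from arrival times.
# NOT the FORTRAN program described in the monograph -- just a minimal
# illustration of the least-squares idea behind AE source location.
import numpy as np
from scipy.optimize import least_squares

SENSORS = np.array([[0.0, 0.0], [1.0, 0.0], [1.0, 1.0], [0.0, 1.0]])  # sensor positions, m (assumed)
WAVE_SPEED = 3000.0  # assumed plate-wave speed, m/s

def residuals(params, arrivals):
    """Measured minus predicted arrival times for a trial source."""
    x, y, t0 = params                      # source position and emission time
    dist = np.hypot(SENSORS[:, 0] - x, SENSORS[:, 1] - y)
    return arrivals - (t0 + dist / WAVE_SPEED)

def locate(arrivals):
    """Fit source position (x, y) and emission time t0 to the arrival times."""
    fit = least_squares(residuals, x0=[0.5, 0.5, 0.0], args=(arrivals,))
    return fit.x

if __name__ == "__main__":
    # Synthetic event at (0.3, 0.7) m emitted at t0 = 0 s.
    true_xy = np.array([0.3, 0.7])
    t_meas = np.hypot(*(SENSORS - true_xy).T) / WAVE_SPEED
    print(locate(t_meas))   # ~ [0.3, 0.7, 0.0]
```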

  12. Double phosphorylation of the myosin regulatory light chain during rigor mortis of bovine Longissimus muscle.

    Science.gov (United States)

    Muroya, Susumu; Ohnishi-Kameyama, Mayumi; Oe, Mika; Nakajima, Ikuyo; Shibata, Masahiro; Chikuni, Koichi

    2007-05-16

    To investigate changes in myosin light chains (MyLCs) during postmortem aging of the bovine longissimus muscle, we performed two-dimensional gel electrophoresis followed by identification with matrix-assisted laser desorption ionization time-of-flight mass spectrometry. The results of fluorescent differential gel electrophoresis showed that two spots of the myosin regulatory light chain (MyLC2) at pI values of 4.6 and 4.7 shifted toward those at pI values of 4.5 and 4.6, respectively, by 24 h postmortem, when rigor mortis was completed. Meanwhile, the MyLC1 and MyLC3 spots did not change during the 14 days postmortem. Phosphoprotein-specific staining of the gels demonstrated that the MyLC2 proteins at pI values of 4.5 and 4.6 were phosphorylated. Furthermore, possible N-terminal region peptides containing one and two phosphoserine residues were detected in each mass spectrum of the MyLC2 spots at pI values of 4.5 and 4.6, respectively. These results demonstrated that MyLC2 became doubly phosphorylated during rigor formation of the bovine longissimus, suggesting involvement of MyLC2 phosphorylation in the progression of beef rigor mortis. Keywords: bovine; myosin regulatory light chain (RLC, MyLC2); phosphorylation; rigor mortis; skeletal muscle.

  13. A Rigorous Methodology for Analyzing and Designing Plug-Ins

    DEFF Research Database (Denmark)

    Fasie, Marieta V.; Haxthausen, Anne Elisabeth; Kiniry, Joseph

    2013-01-01

    . This paper addresses these problems by describing a rigorous methodology for analyzing and designing plug-ins. The methodology is grounded in the Extended Business Object Notation (EBON) and covers informal analysis and design of features, GUI, actions, and scenarios, formal architecture design, including...... behavioral semantics, and validation. The methodology is illustrated via a case study whose focus is an Eclipse environment for the RAISE formal method's tool suite....

  14. General-Purpose Heat Source Development: Safety Test Program. Postimpact evaluation, Design Iteration Test 3

    International Nuclear Information System (INIS)

    Schonfeld, F.W.; George, T.G.

    1984-07-01

    The General-Purpose Heat Source (GPHS) provides power for space missions by transmitting the heat of ²³⁸PuO₂ decay to thermoelectric elements. Because of the inevitable return of certain aborted missions, the heat source must be designed and constructed to survive both re-entry and Earth impact. The Design Iteration Test (DIT) series is part of an ongoing test program. In the third test (DIT-3), a full GPHS module was impacted at 58 m/s and 930°C. The module impacted the target at an angle of 30° to the pole of the large faces. The four capsules used in DIT-3 survived impact with minimal deformation; no internal cracks other than in the regions indicated by Savannah River Plant (SRP) preimpact nondestructive testing were observed in any of the capsules. The 30° impact orientation used in DIT-3 was considerably less severe than the flat-on impact utilized in DIT-1 and DIT-2. The four capsules used in DIT-1 survived, while two of the capsules used in DIT-2 breached; a small quantity (approximately 50 μg) of ²³⁸PuO₂ was released from the capsules breached in the DIT-2 impact. All of the capsules used in DIT-1 and DIT-2 were severely deformed and contained large internal cracks. Postimpact analyses of the DIT-3 test components are described, with emphasis on weld structure and the behavior of defects identified by SRP nondestructive testing.

  15. Simulations of Liners and Test Objects for a New Atlas Advanced Radiography Source

    International Nuclear Information System (INIS)

    Morgan, D. V.; Iversen, S.; Hilko, R. A.

    2002-01-01

    The Advanced Radiographic Source (ARS) will improve the data significantly due to its smaller source width. Because of the enhanced ARS output, larger source-to-object distances are a reality. The harder ARS source will allow radiography of thick high-Z targets. The five different spectral simulations resulted in similar imaging detector weighted transmission. This work used a limited set of test objects and imaging detectors. Other test objects and imaging detectors could possibly change the MVp-sensitivity result. The effect of material motion blur must be considered for the ARS due to the expected smaller X-ray source size. This study supports the original 1.5-MVp value

  16. Safety quality classification test of the sealed neutron sources used in start-up neutron source rods for Qinshan Nuclear Power Plant

    International Nuclear Information System (INIS)

    Yao Chunbing; Guo Gang; Chao Jinglan; Duan Liming

    1992-01-01

    According to the regulations listed in GB4075, safety quality classification tests have been carried out for the neutron sources. The test items include temperature, external pressure, impact, vibration and puncture. Two dummy sealed sources were used for each test item. The testing equipment used has been examined and verified as qualified by a measuring department accredited by the National Standard Bureau. The leak rate of each tested sample was measured with a UL-100 helium leak detector (minimum detectable leak rate 1 × 10⁻¹⁰ Pa·m³·s⁻¹). Samples with a leak rate of less than 1.33 × 10⁻⁸ Pa·m³·s⁻¹ are considered up to the standard. The test results show that the safety quality classification of the neutron sources has reached the class of GB/E66545, which exceeds the preset class.

  17. Estimating and Testing the Sources of Evoked Potentials in the Brain.

    Science.gov (United States)

    Huizenga, Hilde M.; Molenaar, Peter C. M.

    1994-01-01

    The source of an event-related brain potential (ERP) is estimated from multivariate measures of ERP on the head under several mathematical and physical constraints on the parameters of the source model. Statistical aspects of estimation are discussed, and new tests are proposed. (SLD)
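
    As a hedged illustration of the estimation idea summarized above (not the authors' method or constraints): for a fixed candidate source location the scalp potentials are linear in the dipole moment, so the moment can be fitted by least squares and candidate locations ranked by residual. The lead-field matrices below are random placeholders, not a real head model.

```python
# Toy dipole-fitting sketch: rank candidate source locations by how well a
# least-squares dipole moment explains the measured ERP topography.
import numpy as np

rng = np.random.default_rng(0)
n_electrodes, n_candidates = 32, 50
lead_fields = rng.normal(size=(n_candidates, n_electrodes, 3))  # placeholder gain matrices

def fit_candidate(L, v):
    """Least-squares dipole moment and squared residual for one candidate location."""
    moment, *_ = np.linalg.lstsq(L, v, rcond=None)
    residual = v - L @ moment
    return moment, float(residual @ residual)

def best_source(v):
    """Index and moment of the candidate explaining the topography best."""
    fits = [fit_candidate(L, v) for L in lead_fields]
    idx = int(np.argmin([r for _, r in fits]))
    return idx, fits[idx][0]

# Synthetic ERP topography generated from candidate 7.
true_moment = np.array([1.0, -0.5, 0.2])
v_obs = lead_fields[7] @ true_moment + 0.01 * rng.normal(size=n_electrodes)
print(best_source(v_obs))  # -> (7, moment close to true_moment)
```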

  18. The effect of temperature on the mechanical aspects of rigor mortis in a liquid paraffin model.

    Science.gov (United States)

    Ozawa, Masayoshi; Iwadate, Kimiharu; Matsumoto, Sari; Asakura, Kumiko; Ochiai, Eriko; Maebashi, Kyoko

    2013-11-01

    Rigor mortis is an important phenomenon to estimate the postmortem interval in forensic medicine. Rigor mortis is affected by temperature. We measured stiffness of rat muscles using a liquid paraffin model to monitor the mechanical aspects of rigor mortis at five temperatures (37, 25, 10, 5 and 0°C). At 37, 25 and 10°C, the progression of stiffness was slower in cooler conditions. At 5 and 0°C, the muscle stiffness increased immediately after the muscles were soaked in cooled liquid paraffin and then muscles gradually became rigid without going through a relaxed state. This phenomenon suggests that it is important to be careful when estimating the postmortem interval in cold seasons. Copyright © 2013 Elsevier Ireland Ltd. All rights reserved.

  19. A rigorous treatment of uncertainty quantification for Silicon damage metrics

    International Nuclear Information System (INIS)

    Griffin, P.

    2016-01-01

    This report summarizes the contributions made by Sandia National Laboratories in support of the International Atomic Energy Agency (IAEA) Nuclear Data Section (NDS) Technical Meeting (TM) on Nuclear Reaction Data and Uncertainties for Radiation Damage. The work focused on a rigorous treatment of the uncertainties affecting the characterization of the displacement damage seen in silicon semiconductors. (author)

  20. Paper 3: Content and Rigor of Algebra Credit Recovery Courses

    Science.gov (United States)

    Walters, Kirk; Stachel, Suzanne

    2014-01-01

    This paper describes the content, organization and rigor of the f2f and online summer algebra courses that were delivered in summers 2011 and 2012. Examining the content of both types of courses is important because research suggests that algebra courses with certain features may be better than others in promoting success for struggling students.…

  1. Beam Profile Measurement of 300 kV Ion Source Test Stand for 1 MV Electrostatic Accelerator

    International Nuclear Information System (INIS)

    Park, Sae-Hoon; Kim, Yu-Seok; Kim, Dae-Il; Kwon, Hyeok-Jung; Cho, Yong-Sub

    2015-01-01

    In this paper, the RF ion source, the ion source test stand and its test results are presented. The beam profile was measured downstream from the accelerating tube and at the beam dump using a BPM and a wire scanner. The RF ion source of the test stand was verified by measuring the total beam current with a Faraday cup in the chamber. The KOMAC (KOrea Multi-purpose Accelerator Complex) has been developing a 300 kV ion source test stand for a 1 MV electrostatic accelerator. An ion source and accelerating tube will be installed in a high pressure vessel. The ion source in a high pressure vessel requires high reliability. To confirm the stable operation of the ion source, a test stand was proposed and developed. The ion source will be tested at the test stand to verify its long-term operation conditions. The test stand consists of a 300 kV high voltage terminal, a battery for the ion source power, a 60 Hz inverter, a 200 MHz RF power supply, a 5 kV extraction power supply, a 300 kV accelerating tube, and a vacuum system. The beam profile monitor was installed downstream from the accelerating tube. A wire scanner and a Faraday cup were installed at the end of the chamber.
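
    A hedged sketch of how a wire-scanner profile of the kind mentioned above is typically reduced to a beam width: fit a Gaussian to position-versus-signal data. The positions, signal values and units are synthetic; the abstract does not describe the actual KOMAC analysis.

```python
# Gaussian fit of synthetic wire-scanner data to estimate beam centre and width.
import numpy as np
from scipy.optimize import curve_fit

def gaussian(x, amp, mu, sigma, offset):
    return amp * np.exp(-0.5 * ((x - mu) / sigma) ** 2) + offset

# Synthetic wire positions (mm) and measured signal (arbitrary units).
x = np.linspace(-20, 20, 81)
rng = np.random.default_rng(1)
y = gaussian(x, amp=1.0, mu=2.0, sigma=4.5, offset=0.02) + 0.01 * rng.normal(size=x.size)

popt, _ = curve_fit(gaussian, x, y, p0=[1.0, 0.0, 5.0, 0.0])
amp, mu, sigma, offset = popt
print(f"beam centre ~ {mu:.2f} mm, rms width ~ {abs(sigma):.2f} mm")
```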

  2. Beam Profile Measurement of 300 kV Ion Source Test Stand for 1 MV Electrostatic Accelerator

    Energy Technology Data Exchange (ETDEWEB)

    Park, Sae-Hoon; Kim, Yu-Seok [Dongguk University, Gyeonju (Korea, Republic of); Kim, Dae-Il; Kwon, Hyeok-Jung; Cho, Yong-Sub [Korea Multipurpose Accelerator Complex, Gyeongju (Korea, Republic of)

    2015-10-15

    In this paper, the RF ion source, the ion source test stand and its test results are presented. The beam profile was measured downstream from the accelerating tube and at the beam dump using a BPM and a wire scanner. The RF ion source of the test stand was verified by measuring the total beam current with a Faraday cup in the chamber. The KOMAC (KOrea Multi-purpose Accelerator Complex) has been developing a 300 kV ion source test stand for a 1 MV electrostatic accelerator. An ion source and accelerating tube will be installed in a high pressure vessel. The ion source in a high pressure vessel requires high reliability. To confirm the stable operation of the ion source, a test stand was proposed and developed. The ion source will be tested at the test stand to verify its long-term operation conditions. The test stand consists of a 300 kV high voltage terminal, a battery for the ion source power, a 60 Hz inverter, a 200 MHz RF power supply, a 5 kV extraction power supply, a 300 kV accelerating tube, and a vacuum system. The beam profile monitor was installed downstream from the accelerating tube. A wire scanner and a Faraday cup were installed at the end of the chamber.

  3. Test Method for High β Particle Emission Rate of 63Ni Source Plate

    OpenAIRE

    ZHANG Li-feng

    2015-01-01

    To address the difficulty of measuring the β particle emission rate of the Ni-63 source plates used in Ni-63 betavoltaic batteries, a relative test method based on the scintillation current was established according to the measurement principle of the scintillation detector. The β particle emission rate of a homemade Ni-63 source plate was tested by this method, and the test results were analysed and evaluated; it was initially concluded that the scintillation current method is a feasible way of testing β particle emi...

  4. A New 500-kV Ion Source Test Stand for HIF

    International Nuclear Information System (INIS)

    Sangster, T.C.; Ahle, L.E.; Halaxa, E.F.; Karpenko, V.P.; Oldaker, M. E.; Mitchell, J.W.; Beck, D.N.; Bieniosek, F.M.; Henestroza, E.; Kwan, J.W.

    2000-01-01

    One of the most challenging aspects of ion beam driven inertial fusion energy is the reliable and efficient generation of low emittance, high current ion beams. The primary ion source requirements include a rise time of order 1-msec, a pulse width of at least 20-msec, a flattop ripple of less than 0.1% and a repetition rate of at least 5-Hz. Naturally, at such a repetition rate, the duty cycle of the source must be greater than 10⁸ pulses. Although these specifications do not appear to exceed the state-of-the-art for pulsed power, considerable effort remains to develop a suitable high current ion source. Therefore, we are constructing a 500-kV test stand specifically for studying various ion source concepts including surface, plasma and metal vapor arc. This paper will describe the test stand design specifications as well as the details of the various subsystems and components.

  5. A Test Beamline on Diamond Light Source

    International Nuclear Information System (INIS)

    Sawhney, K. J. S.; Dolbnya, I. P.; Tiwari, M. K.; Alianelli, L.; Scott, S. M.; Preece, G. M.; Pedersen, U. K.; Walton, R. D.

    2010-01-01

    A test beamline, B16, has been built on the 3 GeV Diamond synchrotron radiation source. The beamline covers a wide photon energy range from 2 to 25 keV. It is highly flexible and versatile in terms of the available beam size (from a micron to 100 mm) and the range of energy resolution and photon flux, by virtue of its several operational modes and the different interchangeable instruments available in the experiment hutch. Diverse experimental configurations can be flexibly set up using a five-circle diffractometer, a versatile optics test bench, and a suite of detectors. Several experimental techniques, including reflectivity, diffraction and imaging, are routinely available. Details of the beamline and its measured performance are presented.

  6. Re-establishment of rigor mortis: evidence for a considerably longer post-mortem time span.

    Science.gov (United States)

    Crostack, Chiara; Sehner, Susanne; Raupach, Tobias; Anders, Sven

    2017-07-01

    Re-establishment of rigor mortis following mechanical loosening is used as part of the complex method for the forensic estimation of the time since death in human bodies and has formerly been reported to occur up to 8-12 h post-mortem (hpm). We recently described our observation of the phenomenon in up to 19 hpm in cases with in-hospital death. Due to the case selection (preceding illness, immobilisation), transfer of these results to forensic cases might be limited. We therefore examined 67 out-of-hospital cases of sudden death with known time points of death. Re-establishment of rigor mortis was positive in 52.2% of cases and was observed up to 20 hpm. In contrast to the current doctrine that a recurrence of rigor mortis is always of a lesser degree than its first manifestation in a given patient, muscular rigidity at re-establishment equalled or even exceeded the degree observed before dissolving in 21 joints. Furthermore, this is the first study to describe that the phenomenon appears to be independent of body or ambient temperature.

  7. Environmental assessment of general-purpose heat source safety verification testing

    International Nuclear Information System (INIS)

    1995-02-01

    This Environmental Assessment (EA) was prepared to identify and evaluate potential environmental, safety, and health impacts associated with the Proposed Action to test General-Purpose Heat Source (GPHS) Radioisotope Thermoelectric Generator (RTG) assemblies at the Sandia National Laboratories (SNL) 10,000-Foot Sled Track Facility, Albuquerque, New Mexico. RTGs are used to provide a reliable source of electrical power on board some spacecraft when solar power is inadequate during long duration space missions. These units are designed to convert heat from the natural decay of radioisotope fuel into electrical power. Impact test data are required to support DOE's mission to provide radioisotope power systems to NASA and other user agencies. The proposed tests will expand the available safety database regarding RTG performance under postulated accident conditions. Direct observations and measurements of GPHS/RTG performance upon impact with hard, unyielding surfaces are required to verify model predictions and to ensure the continual evolution of the RTG designs that perform safely under varied accident environments. The Proposed Action is to conduct impact testing of RTG sections containing GPHS modules with simulated fuel. End-On and Side-On impact test series are planned

  8. Force Limited Vibration Testing: Computation C2 for Real Load and Probabilistic Source

    Science.gov (United States)

    Wijker, J. J.; de Boer, A.; Ellenbroek, M. H. M.

    2014-06-01

    To prevent over-testing of the test item during random vibration testing, Scharton proposed and discussed force limited random vibration testing (FLVT) in a number of publications, in which the factor C2 is, besides the random vibration specification, the total mass, and the turnover frequency of the load (test item), a very important parameter. A number of computational methods to estimate C2 are described in the literature, i.e. the simple and the complex two-degrees-of-freedom systems, STDFS and CTDFS, respectively. Both the STDFS and the CTDFS describe in a very reduced (simplified) manner the load and the source (the adjacent structure transferring the excitation forces to the test item, e.g. a spacecraft supporting an instrument). The motivation of this work is to establish a method for the computation of a realistic value of C2 to perform a representative random vibration test based on force limitation, when the adjacent structure (source) description is more or less unknown. Marchand formulated a conservative estimation of C2 based on the maximum modal effective mass and damping of the test item (load), when no description of the supporting structure (source) is available [13]. Marchand discussed the formal description of obtaining C2, using the maximum PSD of the acceleration and the maximum PSD of the force, both at the interface between load and source, in combination with the apparent mass and the total mass of the load. This method is very convenient for computing the factor C2. However, finite element models are needed to compute the spectra of the PSD of both the acceleration and the force at the interface between load and source. Stevens presented the coupled systems modal approach (CSMA), where simplified asparagus-patch models (parallel-oscillator representation) of load and source are connected, consisting of modal effective masses and the spring stiffnesses associated with the natural frequencies. When the random acceleration vibration specification is given the CMSA
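
    One hedged reading of the C2 estimate sketched in the abstract (interface force and acceleration PSDs combined with the total mass of the load) is the semi-empirical relation S_FF = C2·M0²·S_AA evaluated at the spectral maxima. The function and the spectra below are placeholders, not data or code from the paper.

```python
# Estimate the force-limit factor C2 from interface PSDs and the total mass.
import numpy as np

def c2_from_interface_psds(S_FF, S_AA, total_mass):
    """C2 from the maxima of the interface force (N^2/Hz) and acceleration
    ((m/s^2)^2/Hz) PSDs and the total mass of the load (kg)."""
    return np.max(S_FF) / (total_mass ** 2 * np.max(S_AA))

freq = np.linspace(20, 2000, 500)            # Hz
S_AA = 0.04 * np.ones_like(freq)             # flat placeholder acceleration PSD
S_FF = 2.0 * (50.0 ** 2) * 0.04 * np.exp(-((freq - 120) / 60) ** 2)  # peaked force PSD

print(c2_from_interface_psds(S_FF, S_AA, total_mass=50.0))  # -> ~2.0
```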

  9. Differential algebras with remainder and rigorous proofs of long-term stability

    International Nuclear Information System (INIS)

    Berz, Martin

    1997-01-01

    It is shown how in addition to determining Taylor maps of general optical systems, it is possible to obtain rigorous interval bounds for the remainder term of the n-th order Taylor expansion. To this end, the three elementary operations of addition, multiplication, and differentiation in the Differential Algebraic approach are augmented by suitable interval operations in such a way that a remainder bound of the sum, product, and derivative is obtained from the Taylor polynomial and remainder bound of the operands. The method can be used to obtain bounds for the accuracy with which a Taylor map represents the true map of the particle optical system. In a more general sense, it is also useful for a variety of other numerical problems, including rigorous global optimization of highly complex functions. Combined with methods to obtain pseudo-invariants of repetitive motion and extensions of the Lyapunov- and Nekhoroshev stability theory, the latter can be used to guarantee stability for storage rings and other weakly nonlinear systems
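
    A toy version of the bookkeeping described above: a truncated Taylor polynomial carries an interval remainder bound, and addition and multiplication propagate both parts. This sketch is one-dimensional, bounds polynomial ranges crudely, and ignores floating-point rounding, so it only illustrates the idea, not the Differential Algebraic implementation referenced in the abstract.

```python
# Minimal 1-D "Taylor model": truncated polynomial on [-1, 1] plus a symmetric
# interval remainder bound, with + and * propagating both parts.
import numpy as np

ORDER = 4          # truncation order

def poly_bound(coeffs):
    """Crude bound on |sum c_i x^i| for |x| <= 1."""
    return float(np.sum(np.abs(coeffs)))

class TaylorModel:
    def __init__(self, coeffs, remainder):
        self.coeffs = np.zeros(ORDER + 1)
        self.coeffs[:len(coeffs)] = coeffs
        self.remainder = float(remainder)   # bound r of the remainder interval [-r, r]

    def __add__(self, other):
        return TaylorModel(self.coeffs + other.coeffs,
                           self.remainder + other.remainder)

    def __mul__(self, other):
        full = np.convolve(self.coeffs, other.coeffs)
        kept, spilled = full[:ORDER + 1], full[ORDER + 1:]
        # remainder: truncated high-order terms plus cross terms with remainders
        r = (poly_bound(spilled)
             + poly_bound(self.coeffs) * other.remainder
             + poly_bound(other.coeffs) * self.remainder
             + self.remainder * other.remainder)
        return TaylorModel(kept, r)

# (1 + x)^3 stays within order 4, so the remainder remains zero here;
# higher powers would start filling it.
x = TaylorModel([0.0, 1.0], 0.0)
one = TaylorModel([1.0], 0.0)
p = (one + x) * (one + x) * (one + x)
print(p.coeffs, p.remainder)    # [1. 3. 3. 1. 0.] 0.0
```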

  10. The effects of free recall testing on subsequent source memory.

    Science.gov (United States)

    Brewer, Gene A; Marsh, Richard L; Meeks, Joseph T; Clark-Foos, Arlo; Hicks, Jason L

    2010-05-01

    The testing effect is the finding that prior retrieval of information from memory will result in better subsequent memory for that material. One explanation for these effects is that initial free recall testing increases the recollective details for tested information, which then becomes more available during a subsequent test phase. In three experiments we explored this hypothesis using a source-monitoring test phase after the initial free recall tests. We discovered that memory is differentially enhanced for certain recollective details depending on the nature of the free recall task. Thus further research needs to be conducted to specify how different kinds of memorial details are enhanced by free recall testing.

  11. Rigorous upper bounds for transport due to passive advection by inhomogeneous turbulence

    International Nuclear Information System (INIS)

    Krommes, J.A.; Smith, R.A.

    1987-05-01

    A variational procedure, due originally to Howard and explored by Busse and others for self-consistent turbulence problems, is employed to determine rigorous upper bounds for the advection of a passive scalar through an inhomogeneous turbulent slab with arbitrary generalized Reynolds number R and Kubo number K. In the basic version of the method, the steady-state energy balance is used as a constraint; the resulting bound, though rigorous, is independent of K. A pedagogical reference model (one dimension, K = ∞) is described in detail; the bound compares favorably with the exact solution. The direct-interaction approximation is also worked out for this model; it is somewhat more accurate than the bound, but requires considerably more labor to solve. For the basic bound, a general formalism is presented for several dimensions, finite correlation length, and reasonably general boundary conditions. Part of the general method, in which a Green's function technique is employed, applies to self-consistent as well as to passive problems, and thereby generalizes previous results in the fluid literature. The formalism is extended for the first time to include time-dependent constraints, and a bound is deduced which explicitly depends on K and has the correct physical scalings in all regimes of R and K. Two applications from the theory of turbulent plasmas are described: flux in velocity space, and test particle transport in stochastic magnetic fields. For the velocity space problem the simplest bound reproduces Dupree's original scaling for the strong turbulence diffusion coefficient. For the case of stochastic magnetic fields, the scaling of the bounds is described for the magnetic diffusion coefficient as well as for the particle diffusion coefficient in the so-called collisionless, fluid, and double-streaming regimes.

  12. Orthodontic brackets removal under shear and tensile bond strength resistance tests – a comparative test between light sources

    International Nuclear Information System (INIS)

    Silva, P C G; Porto-Neto, S T; Lizarelli, R F Z; Bagnato, V S

    2008-01-01

    We investigated whether a new LED system delivers sufficient energy to promote adequate shear and tensile bond strength under standardized tests. LEDs emitting at 470 ± 10 nm can be used to photocure composite during bracket fixation. Advantages in tensile and shear bond strength when these systems are used are necessary to justify their clinical use. Forty-eight extracted human premolar teeth and two light sources, a halogen lamp and an LED system, were selected. Premolar brackets were bonded with composite resin. Samples were submitted to standardized tests. The two light sources gave similar results in the shear bond strength test; however, the tensile bond test showed distinct results: a statistically significant difference at the 1% level between exposure times (40 and 60 seconds) and an interaction between light source and exposure time. The best result was obtained with the halogen lamp at 60 seconds, even during re-bonding; however, the LED system can be used for bonding and re-bonding brackets if its power density is increased.

  13. Rigorous high-precision enclosures of fixed points and their invariant manifolds

    Science.gov (United States)

    Wittig, Alexander N.

    The well-established concept of Taylor models is introduced, which offers highly accurate C⁰ enclosures of functional dependencies, combining high-order polynomial approximation of functions with rigorous estimates of the truncation error, performed using verified arithmetic. The focus of this work is on the application of Taylor models in algorithms for strongly non-linear dynamical systems. A method is proposed to extend the existing implementation of Taylor models in COSY INFINITY from double precision coefficients to arbitrary precision coefficients. Great care is taken to maintain the highest possible efficiency by adaptively adjusting the precision of higher order coefficients in the polynomial expansion. High precision operations are based on clever combinations of elementary floating point operations yielding exact values for round-off errors. An experimental high precision interval data type is developed and implemented. Algorithms for the verified computation of intrinsic functions based on the high precision interval data type are developed and described in detail. The application of these operations in the implementation of high precision Taylor models is discussed. An application of Taylor model methods to the verification of fixed points is presented by verifying the existence of a period 15 fixed point in a near-standard Hénon map. Verification is performed using different verified methods such as double precision Taylor models, high precision intervals and high precision Taylor models. Results and performance of each method are compared. An automated rigorous fixed point finder is implemented, allowing the fully automated search for all fixed points of a function within a given domain. It returns a list of verified enclosures of each fixed point, optionally verifying uniqueness within these enclosures. An application of the fixed point finder to the rigorous analysis of beam transfer maps in accelerator physics is presented. Previous work done by
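
    The fixed-point verification mentioned above rests on a containment argument that can be shown in one dimension: if an interval maps into itself under a continuous map, Brouwer's theorem guarantees a fixed point inside it. The sketch below uses the cosine map and ignores outward rounding, which a genuinely verified implementation must control.

```python
# Toy containment test: verify existence of a fixed point of cos(x) in an interval.
import math

def cos_interval(lo, hi):
    """Interval extension of cos on [lo, hi] contained in [0, pi]
    (cos is monotone decreasing there)."""
    assert 0.0 <= lo <= hi <= math.pi
    return math.cos(hi), math.cos(lo)

def verify_fixed_point(lo, hi):
    """True if cos([lo, hi]) is contained in [lo, hi]."""
    flo, fhi = cos_interval(lo, hi)
    return lo <= flo and fhi <= hi

# The Dottie number (~0.739) is the unique fixed point of cos.
print(verify_fixed_point(0.70, 0.78))   # True: existence verified in [0.70, 0.78]
print(verify_fixed_point(0.80, 0.90))   # False: containment fails
```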

  14. Data from thermal testing of the Open Source Cryostage

    DEFF Research Database (Denmark)

    Buch, Johannes Lørup; Ramløv, Hans

    2016-01-01

    The data presented here is related to the research article "An open source cryostage and software analysis method for detection of antifreeze activity" (Buch and Ramløv, 2016) [1]. The design of the Open Source Cryostage (OSC) is tested in terms of thermal limits, thermal efficiency and electrical...... efficiency. This article furthermore includes an overview of the electrical circuitry and a flowchart of the software program controlling the temperature of the OSC. The thermal efficiency data is presented here as degrees per volt and maximum cooling capacity....

  15. Interval-parameter chance-constraint programming model for end-of-life vehicles management under rigorous environmental regulations.

    Science.gov (United States)

    Simic, Vladimir

    2016-06-01

    As the number of end-of-life vehicles (ELVs) is estimated to increase to 79.3 million units per year by 2020 (e.g., 40 million units were generated in 2010), there is strong motivation to manage this fast-growing waste flow effectively. Intensive work on the management of ELVs is necessary in order to tackle this important environmental challenge more successfully. This paper proposes an interval-parameter chance-constraint programming model for end-of-life vehicle management under rigorous environmental regulations. The proposed model can incorporate various kinds of uncertainty information in the modeling process. The complex relationships between different ELV management sub-systems are successfully addressed. In particular, the formulated model can help identify optimal patterns of procurement from multiple sources of ELV supply, production and inventory planning in multiple vehicle recycling factories, and allocation of sorted material flows to multiple final destinations under rigorous environmental regulations. A case study is conducted in order to demonstrate the potential and applicability of the proposed model. Various constraint-violation probability levels are examined in detail. Influences of parameter uncertainty on model solutions are thoroughly investigated. Useful solutions for the management of ELVs are obtained under different probabilities of violating the system constraints. The formulated model is able to tackle a hard ELV management problem involving uncertainty. The presented model has advantages in providing a basis for determining long-term ELV management plans with the desired compromises between the economic efficiency of the vehicle recycling system and system-reliability considerations. The results are helpful for supporting the generation and improvement of ELV management plans. Copyright © 2016 Elsevier Ltd. All rights reserved.
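
    A hedged sketch of two ingredients named in the abstract, applied to a toy problem unrelated to the ELV model itself: a chance constraint with a normally distributed right-hand side is replaced by its deterministic quantile equivalent, and an interval-valued objective coefficient is handled by solving the model at both interval ends.

```python
# Toy interval-parameter chance-constrained LP: quantile equivalent of the
# chance constraint, solved once per interval end of the profit coefficients.
import numpy as np
from scipy.optimize import linprog
from scipy.stats import norm

# Require P(x1 + 2*x2 <= capacity) >= 1 - p with capacity ~ N(100, 10^2):
# the deterministic equivalent uses the p-quantile of the capacity.
p = 0.05
cap_quantile = norm.ppf(p, loc=100.0, scale=10.0)   # ~ 83.6

profit_interval = {"lower": np.array([3.0, 4.0]),   # interval ends of unit profits
                   "upper": np.array([5.0, 6.0])}

solutions = {}
for label, profit in profit_interval.items():
    res = linprog(c=-profit,                        # maximise profit
                  A_ub=[[1.0, 2.0]], b_ub=[cap_quantile],
                  bounds=[(0, 60), (0, 60)])
    solutions[label] = (res.x, -res.fun)

print(solutions)   # optimal plans and profits for both interval ends
```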

  16. A Draft Conceptual Framework of Relevant Theories to Inform Future Rigorous Research on Student Service-Learning Outcomes

    Science.gov (United States)

    Whitley, Meredith A.

    2014-01-01

    While the quality and quantity of research on service-learning has increased considerably over the past 20 years, researchers as well as governmental and funding agencies have called for more rigor in service-learning research. One key variable in improving rigor is using relevant existing theories to improve the research. The purpose of this…

  17. Building an Evidence Base to Inform Interventions for Pregnant and Parenting Adolescents: A Call for Rigorous Evaluation

    Science.gov (United States)

    Burrus, Barri B.; Scott, Alicia Richmond

    2012-01-01

    Adolescent parents and their children are at increased risk for adverse short- and long-term health and social outcomes. Effective interventions are needed to support these young families. We studied the evidence base and found a dearth of rigorously evaluated programs. Strategies from successful interventions are needed to inform both intervention design and policies affecting these adolescents. The lack of rigorous evaluations may be attributable to inadequate emphasis on and sufficient funding for evaluation, as well as to challenges encountered by program evaluators working with this population. More rigorous program evaluations are urgently needed to provide scientifically sound guidance for programming and policy decisions. Evaluation lessons learned have implications for other vulnerable populations. PMID:22897541

  18. Iterative and range test methods for an inverse source problem for acoustic waves

    International Nuclear Information System (INIS)

    Alves, Carlos; Kress, Rainer; Serranho, Pedro

    2009-01-01

    We propose two methods for solving an inverse source problem for time-harmonic acoustic waves. Based on the reciprocity gap principle a nonlinear equation is presented for the locations and intensities of the point sources that can be solved via Newton iterations. To provide an initial guess for this iteration we suggest a range test algorithm for approximating the source locations. We give a mathematical foundation for the range test and exhibit its feasibility in connection with the iteration method by some numerical examples
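
    A hedged sketch of the fitting step behind the iteration described above: the location and intensity of a single point source are recovered from boundary data by nonlinear least squares (a Gauss-Newton-type iteration). A toy real-valued 1/r kernel stands in for the time-harmonic fundamental solution and the reciprocity gap functional used in the paper.

```python
# Recover one point source (position and intensity) from synthetic boundary data.
import numpy as np
from scipy.optimize import least_squares

angles = np.linspace(0.0, 2 * np.pi, 32, endpoint=False)
receivers = np.column_stack([2.0 * np.cos(angles), 2.0 * np.sin(angles)])  # measurement circle

def field(src_xy, intensity):
    d = np.linalg.norm(receivers - src_xy, axis=1)
    return intensity / d                    # toy radial kernel ~ 1/r

def misfit(params, data):
    x, y, q = params
    return field(np.array([x, y]), q) - data

data = field(np.array([0.4, -0.3]), 1.5)    # synthetic measurements
fit = least_squares(misfit, x0=[0.0, 0.0, 1.0], args=(data,))
print(fit.x)                                # ~ [0.4, -0.3, 1.5]
```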

  19. Source passing test in Vesivehmaa air field - STUK/HUT team

    International Nuclear Information System (INIS)

    Honkamaa, T.; Tiilikainen, H.; Aarnio, P.; Nikkinen, M.

    1997-01-01

    Carborne radiation monitors were tested for point source response at distances of 10 m, 20 m, 50 m, 100 m, 150 m, and 200 m at speeds of 20 km h⁻¹ and 50 km h⁻¹. A large pressurised ionisation chamber (PIC), an HPGe detector (relative efficiency 36.9%) and a NaI(Tl) scintillation detector (size 5'x5') were used. The sources had nominal activities of 22 MBq (⁶⁰Co) and 1.85 GBq (¹³⁷Cs). The ⁶⁰Co source strength was below the detection limit in all measurements. The detection of the ¹³⁷Cs source is visually clear up to 50 m for the spectrometers and up to 20 m for the PIC. Statistical analysis shows that the ¹³⁷Cs source could be detected up to 100 m with the spectrometers and up to 50 m with the PIC if the background is well known. (au)
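
    The "detectable if the background is well known" statement can be illustrated with a Currie-style decision rule: compare the net counts with a critical level set by the background fluctuation. The counts and the 1.645 coverage factor below are illustrative assumptions, not values from the test.

```python
# Simple detection decision for a counting measurement with well-known background.
import math

def critical_level(background_counts, k=1.645):
    """Critical net-count level L_C for ~5% false-alarm rate, known mean background."""
    return k * math.sqrt(background_counts)

def detected(gross_counts, background_counts):
    return (gross_counts - background_counts) > critical_level(background_counts)

print(detected(gross_counts=1180, background_counts=1100))  # True:  net 80 > ~54.6
print(detected(gross_counts=1130, background_counts=1100))  # False: net 30 < ~54.6
```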

  20. Feedback for relatedness and competence : Can feedback in blended learning contribute to optimal rigor, basic needs, and motivation?

    NARCIS (Netherlands)

    Bombaerts, G.; Nickel, P.J.

    2017-01-01

    We inquire how peer and tutor feedback influences students' optimal rigor, basic needs and motivation. We analyze questionnaires from two courses in two subsequent years. We conclude that feedback in blended learning can contribute to rigor and basic needs, but it is not clear from our data what

  1. Feed Preparation for Source of Alkali Melt Rate Tests

    International Nuclear Information System (INIS)

    Stone, M. E.; Lambert, D. P.

    2005-01-01

    The purpose of the Source of Alkali testing was to prepare feed for melt rate testing in order to determine the maximum melt-rate for a series of batches where the alkali was increased from 0% Na₂O in the frit (low washed sludge) to 16% Na₂O in the frit (highly washed sludge). This document summarizes the feed preparation for the Source of Alkali melt rate testing. The Source of Alkali melt rate results will be issued in a separate report. Five batches of Sludge Receipt and Adjustment Tank (SRAT) product and four batches of Slurry Mix Evaporator (SME) product were produced to support Source of Alkali (SOA) melt rate testing. Sludge Batch 3 (SB3) simulant and frit 418 were used as targets for the 8% Na₂O baseline run. For the other four cases (0% Na₂O, 4% Na₂O, 12% Na₂O, and 16% Na₂O in frit), special sludge and frit preparations were necessary. The sludge preparations mimicked washing of the SB3 baseline composition, while frit adjustments consisted of increasing or decreasing Na and then re-normalizing the remaining frit components. For all batches, the target glass compositions were identical. The five SRAT products were prepared for testing in the dry fed melt-rate furnace and the four SME products were prepared for the Slurry-fed Melt-Rate Furnace (SMRF). At the same time, the impacts of washing on a baseline composition from a Chemical Process Cell (CPC) perspective could also be investigated. Five process simulations (0% Na₂O in frit, 4% Na₂O in frit, 8% Na₂O in frit or baseline, 12% Na₂O in frit, and 16% Na₂O in frit) were completed in three identical 4-L apparatus to produce the five SRAT products. The SRAT products were later dried and combined with the complementary frits to produce identical glass compositions. All five batches were produced with identical processing steps, including off-gas measurement using online gas chromatographs. Two slurry-fed melter feed batches, a 4% Na₂O in frit run (less washed sludge combined with

  2. Reciprocity relations in transmission electron microscopy: A rigorous derivation.

    Science.gov (United States)

    Krause, Florian F; Rosenauer, Andreas

    2017-01-01

    A concise derivation of the principle of reciprocity applied to realistic transmission electron microscopy setups is presented, making use of the multislice formalism. The equivalence of images acquired in conventional and scanning mode is thereby rigorously shown. The conditions for the applicability of the derived reciprocity relations are discussed. Furthermore, the positions of apertures in relation to the corresponding lenses are considered, a subject which has scarcely been addressed in previous publications. Copyright © 2016 Elsevier Ltd. All rights reserved.
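
    For readers unfamiliar with the multislice formalism the derivation relies on, the core loop alternates transmission through a thin slice with Fresnel propagation in Fourier space. The sketch below is schematic: potentials, sampling and constants are arbitrary placeholders, and no aberrations, apertures or partial coherence are included.

```python
# Schematic 2-D multislice propagation: transmit through each slice, then
# apply the Fresnel propagator in Fourier space.
import numpy as np

N, dx = 256, 0.1                 # grid size and sampling (arbitrary units)
wavelength, dz = 0.0025, 2.0     # "electron" wavelength and slice thickness (placeholders)
sigma = 0.01                     # interaction constant (placeholder)

k = np.fft.fftfreq(N, d=dx)
kx, ky = np.meshgrid(k, k)
propagator = np.exp(-1j * np.pi * wavelength * dz * (kx**2 + ky**2))

rng = np.random.default_rng(2)
potentials = rng.normal(size=(10, N, N))          # one projected potential per slice

psi = np.ones((N, N), dtype=complex)              # incident plane wave
for v in potentials:
    psi *= np.exp(1j * sigma * v * dz)            # transmission through the slice
    psi = np.fft.ifft2(propagator * np.fft.fft2(psi))  # Fresnel propagation to next slice

print(abs(psi).mean())                            # exit-wave amplitude (illustrative)
```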

  3. Statistics for mathematicians a rigorous first course

    CERN Document Server

    Panaretos, Victor M

    2016-01-01

    This textbook provides a coherent introduction to the main concepts and methods of one-parameter statistical inference. Intended for students of Mathematics taking their first course in Statistics, the focus is on Statistics for Mathematicians rather than on Mathematical Statistics. The goal is not to focus on the mathematical/theoretical aspects of the subject, but rather to provide an introduction to the subject tailored to the mindset and tastes of Mathematics students, who are sometimes turned off by the informal nature of Statistics courses. This book can be used as the basis for an elementary semester-long first course on Statistics with a firm sense of direction that does not sacrifice rigor. The deeper goal of the text is to attract the attention of promising Mathematics students.

  4. Source Location of Noble Gas Plumes

    International Nuclear Information System (INIS)

    Hoffman, I.; Ungar, K.; Bourgouin, P.; Yee, E.; Wotawa, G.

    2015-01-01

    In radionuclide monitoring, one of the most significant challenges from a verification or surveillance perspective is the source location problem. Modern monitoring/surveillance systems employ meteorological source reconstruction — for example, the Fukushima accident, CRL emissions analysis and even radon risk mapping. These studies usually take weeks to months to conduct, involving multidisciplinary teams representing meteorology; dispersion modelling; radionuclide sampling and metrology; and, when relevant, proper representation of source characteristics (e.g., reactor engineering expertise). Several different approaches have been tried in an attempt to determine useful techniques to apply to the source location problem and to develop rigorous methods that combine all potentially relevant observations and models to identify a most probable source location and size with uncertainties. The ultimate goal is to understand the utility and limitations of these techniques so they can transition from R&D to operational tools. (author)
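
    One common ingredient of such reconstructions, shown here only as a hedged sketch, is scoring candidate source cells with a precomputed source-receptor matrix: for each cell the best-fitting release rate is obtained by least squares and the cell with the smallest misfit is retained. All matrices and measurements below are synthetic; real systems add transport uncertainty, timing and priors.

```python
# Grid-search source reconstruction with a synthetic source-receptor matrix.
import numpy as np

rng = np.random.default_rng(3)
n_candidates, n_samples = 200, 12
srm = rng.lognormal(mean=-18.0, sigma=1.0, size=(n_candidates, n_samples))  # dilution factors, s/m^3

true_idx, true_release = 42, 2.0e12        # synthetic "truth" (Bq/s)
obs = srm[true_idx] * true_release * rng.normal(1.0, 0.1, size=n_samples)   # Bq/m^3

def score(candidate):
    """Best non-negative release rate and least-squares misfit for one candidate cell."""
    g = srm[candidate]
    q = max(0.0, float(g @ obs / (g @ g)))
    return q, float(np.sum((obs - q * g) ** 2))

costs = [score(i)[1] for i in range(n_candidates)]
best = int(np.argmin(costs))
print(best, score(best)[0])   # usually recovers candidate 42 and a release near 2e12
```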

  5. Special aerosol sources for certification and test of aerosol radiometers

    International Nuclear Information System (INIS)

    Belkina, S.K.; Zalmanzon, Y.E.; Kuznetsov, Y.V.; Rizin, A.I.; Fertman, D.E.

    1991-01-01

    The results of the development and practical application of new radionuclide source types (Special Aerosol Sources, SAS) are presented. These sources meet the international standard recommendations and are used for the certification and testing of aerosol radiometers (monitors) using model aerosols of plutonium-239, strontium-yttrium-90 or uranium of natural isotopic composition, certified against the USSR national radioactive aerosol standard or by means of a reference radiometer. The original source production technology allows the particular features of sampling, as well as the geometry and conditions of radiation detection in the sample, to be taken into account for the given type of radiometer. (author)

  6. Special aerosol sources for certification and test of aerosol radiometers

    Energy Technology Data Exchange (ETDEWEB)

    Belkina, S.K.; Zalmanzon, Y.E.; Kuznetsov, Y.V.; Rizin, A.I.; Fertman, D.E. (Union Research Institute of Instrumentation, Moscow (USSR))

    1991-01-01

    The results of the development and practical application of new radionuclide source types (Special Aerosol Sources, SAS) are presented. These sources meet the international standard recommendations and are used for the certification and testing of aerosol radiometers (monitors) using model aerosols of plutonium-239, strontium-yttrium-90 or uranium of natural isotopic composition, certified against the USSR national radioactive aerosol standard or by means of a reference radiometer. The original source production technology allows the particular features of sampling, as well as the geometry and conditions of radiation detection in the sample, to be taken into account for the given type of radiometer. (author).

  7. Evaluation and Testing of Several Free/Open Source Web Vulnerability Scanners

    OpenAIRE

    Suteva, Natasa; Zlatkovski, Dragi; Mileva, Aleksandra

    2013-01-01

    Web Vulnerability Scanners (WVSs) are software tools for identifying vulnerabilities in web applications. There are commercial WVSs, free/open source WVSs, and some companies offer them as Software-as-a-Service. In this paper, we test and evaluate six free/open source WVSs using the web application WackoPicko, which contains many known vulnerabilities, primarily for false negative rates.

  8. Preliminary Tests Of The Decris-sc Ion Source

    CERN Document Server

    Efremov, A; Bechterev, V; Bogomolov, S L; Bondarenko, P G; Datskov, V I; Dmitriev, S; Drobin, V; Lebedev, A; Leporis, M; Malinowski, H; Nikiforov, A; Paschenko, S V; Seleznev, V; Shishov, Yu A; Smirnov, Yu; Tsvineva, G; Yakovlev, B; Yazvitsky, N Yu

    2004-01-01

    A new "liquid He-free" superconducting Electron Cyclotron Resonance Ion Source DECRIS-SC, to be used as injector for the IC-100 small cyclotron, has been designed by FLNR and LHE JINR. The main feature is that a compact refrigerator of Gifford-McMahon type is used to cool the solenoid coils. For the reason of very small cooling power at 4.2 K (about 1 W) our efforts were to optimize the magnetic structure and minimize an external heating of the coils. The maximum magnetic field strength is 3 T and 2 T in injection and extraction region respectively. For the radial plasma confinement a hexapole made of NdFeB permanent magnet is used. The source will be capable of ECR plasma heating using different frequencies (14 GHz or 18 GHz). To be able to deliver usable intensities of solids, the design is also allow axial access for evaporation oven and metal samples using the plasma sputtering technique. Very preliminary results of the source test are presented.

  9. Atlantic salmon skin and fillet color changes effected by perimortem handling stress, rigor mortis, and ice storage.

    Science.gov (United States)

    Erikson, U; Misimi, E

    2008-03-01

    The changes in skin and fillet color of anesthetized and exhausted Atlantic salmon were determined immediately after killing, during rigor mortis, and after ice storage for 7 d. Skin color (CIE L*, a*, b*, and related values) was determined by a Minolta Chroma Meter. Roche SalmoFan Lineal and Roche Color Card values were determined by a computer vision method and a sensory panel. Before color assessment, the stress levels of the 2 fish groups were characterized in terms of white muscle parameters (pH, rigor mortis, and core temperature). The results showed that perimortem handling stress initially significantly affected several color parameters of skin and fillets. Significant transient fillet color changes also occurred in the prerigor phase and during the development of rigor mortis. Our results suggested that fillet color was affected by postmortem glycolysis (pH drop, particularly in anesthetized fillets), then by onset and development of rigor mortis. The color change patterns during storage were different for the 2 groups of fish. The computer vision method was considered suitable for automated (online) quality control and grading of salmonid fillets according to color.

  10. Source passing test in Vesivehmaa air field - STUK/HUT team

    Energy Technology Data Exchange (ETDEWEB)

    Honkamaa, T.; Tiilikainen, H. [Finnish Centre for Radiation and Nuclear Safety, Helsinki (Finland); Aarnio, P.; Nikkinen, M. [Helsinki Univ. of Technology, Espoo (Finland)

    1997-12-31

    Carborne radiation monitors were tested for point source response at distances of 10 m, 20 m, 50 m, 100 m, 150 m, and 200 m at speeds of 20 km h⁻¹ and 50 km h⁻¹. A large pressurised ionisation chamber (PIC), an HPGe detector (relative efficiency 36.9%) and a NaI(Tl) scintillation detector (size 5'x5') were used. The sources had nominal activities of 22 MBq (⁶⁰Co) and 1.85 GBq (¹³⁷Cs). The ⁶⁰Co source strength was below the detection limit in all measurements. The detection of the ¹³⁷Cs source is visually clear up to 50 m for the spectrometers and up to 20 m for the PIC. Statistical analysis shows that the ¹³⁷Cs source could be detected up to 100 m with the spectrometers and up to 50 m with the PIC if the background is well known. (au).

  11. Source passing test in Vesivehmaa air field - STUK/HUT team

    Energy Technology Data Exchange (ETDEWEB)

    Honkamaa, T; Tiilikainen, H [Finnish Centre for Radiation and Nuclear Safety, Helsinki (Finland); Aarnio, P; Nikkinen, M [Helsinki Univ. of Technology, Espoo (Finland)

    1998-12-31

    Carborne radiation monitors were tested for point source response at distances of 10 m, 20 m, 50 m, 100 m, 150 m, and 200 m at speeds of 20 km h⁻¹ and 50 km h⁻¹. A large pressurised ionisation chamber (PIC), an HPGe detector (relative efficiency 36.9%) and a NaI(Tl) scintillation detector (size 5'x5') were used. The sources had nominal activities of 22 MBq (⁶⁰Co) and 1.85 GBq (¹³⁷Cs). The ⁶⁰Co source strength was below the detection limit in all measurements. The detection of the ¹³⁷Cs source is visually clear up to 50 m for the spectrometers and up to 20 m for the PIC. Statistical analysis shows that the ¹³⁷Cs source could be detected up to 100 m with the spectrometers and up to 50 m with the PIC if the background is well known. (au).

  12. Design of the 'half-size' ITER neutral beam source for the test facility ELISE

    International Nuclear Information System (INIS)

    Heinemann, B.; Falter, H.; Fantz, U.; Franzen, P.; Froeschle, M.; Gutser, R.; Kraus, W.; Nocentini, R.; Riedl, R.; Speth, E.; Staebler, A.; Wuenderlich, D.; Agostinetti, P.; Jiang, T.

    2009-01-01

    In 2007 the radio frequency driven negative hydrogen ion source developed at IPP in Garching was chosen by the ITER board as the new reference source for the ITER neutral beam system. In order to support the design, commissioning and operating phases of the ITER test facilities ISTF and NBTF in Padua, IPP is presently constructing a new test facility, ELISE (Extraction from a Large Ion Source Experiment). ELISE will be operated with the so-called 'half-size ITER source', which is an intermediate step between the present small IPP RF sources (1/8 ITER size) and the full-size ITER source. The source will have approximately the width but only half the height of the ITER source. The modular concept with 4 drivers will allow an easy extrapolation to the full ITER size with 8 drivers. Pulsed beam extraction and acceleration up to 60 kV (corresponding to the pre-acceleration voltage of SINGAP) are foreseen. The aim of the design of the ELISE source and extraction system was to be as close as possible to the ITER design; however, it has some modifications allowing better diagnostic access as well as more flexibility for exploring open questions. Therefore one major difference compared to the source of ITER, NBTF or ISTF is the possible operation in air. Specific requirements for RF sources as found on the IPP test facilities BATMAN and MANITU are implemented [A. Staebler, et al., Development of a RF-driven ion source for the ITER NBI system, SOFT Conference 2008, Fusion Engineering and Design, 84 (2009) 265-268].

  13. Rigorous results on measuring the quark charge below color threshold

    International Nuclear Information System (INIS)

    Lipkin, H.J.

    1979-01-01

    Rigorous theorems are presented showing that contributions from a color nonsinglet component of the current to matrix elements of a second order electromagnetic transition are suppressed by factors inversely proportional to the energy of the color threshold. Parton models which obtain matrix elements proportional to the color average of the square of the quark charge are shown to neglect terms of the same order of magnitude as terms kept. (author)

  14. Induced over voltage test on transformers using enhanced Z-source inverter based circuit

    Science.gov (United States)

    Peter, Geno; Sherine, Anli

    2017-09-01

    The normal life of a transformer is well above 25 years. The economical operation of the distribution system has its roots in the equipment being used, and the economics are such that it is financially advantageous to replace transformers with more than 15 years of service in the secondary market. Testing of transformers is required, as it indicates the extent to which a transformer can comply with the customer's specified requirements and the respective standards (IEC 60076-3). In this paper, induced overvoltage testing of transformers using an enhanced Z-source inverter based circuit is discussed. Power electronic circuits are now essential for a whole array of industrial electronic products. The bulky motor-generator set that used to be employed to generate the frequency required for induced overvoltage testing of transformers is nowadays replaced by a static frequency converter. First a conventional Z-source inverter and then an enhanced Z-source inverter are used to generate the voltage and frequency required to test the transformer in the induced overvoltage test, and their characteristics are analysed.
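
    As a numeric aside on the conventional Z-source relations (the enhanced topology studied in the paper reaches a higher gain at the same shoot-through duty, which this sketch does not model): the boost factor is B = 1/(1 − 2·D_sh) and the peak phase output is roughly M·B·V_dc/2. The input values below are illustrative only.

```python
# Peak AC output of a conventional Z-source inverter from DC link voltage,
# modulation index and shoot-through duty ratio.
def z_source_gain(vdc, modulation_index, shoot_through_duty):
    """Peak AC phase voltage of a conventional Z-source inverter."""
    if not 0.0 <= shoot_through_duty < 0.5:
        raise ValueError("shoot-through duty must be in [0, 0.5)")
    boost = 1.0 / (1.0 - 2.0 * shoot_through_duty)
    return modulation_index * boost * vdc / 2.0

# Example: 600 V DC link, M = 0.8, 20% shoot-through duty.
print(z_source_gain(vdc=600.0, modulation_index=0.8, shoot_through_duty=0.2))  # 400.0 V peak
```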

  15. APL/JHU free flight tests of the General Purpose Heat Source module. Testing: 5-7 March 1984

    International Nuclear Information System (INIS)

    Baker, W.M. II.

    1984-01-01

    Purpose of the test was to obtain statistical information on the dynamics of the General Purpose Heat Source (GPHS) module at terminal speeds. Models were designed to aerodynamically and dynamically represent the GPHS module. Normal and high speed photographic coverage documented the motion of the models. This report documents test parameters and techniques for the free-spin tests. It does not include data analysis

  16. Avery Island heater tests: measured data for 1000 days of heating

    International Nuclear Information System (INIS)

    Van Sambeek, L.L.; Stickney, R.G.; DeJong, K.B.

    1983-10-01

    Three heater tests were conducted in the Avery Island salt mine. The measurements of temperature and displacement, and the calculation of stress in the vicinity of each heater are of primary importance in the understanding of the thermal and thermomechanical response of the salt to an emplaced heat source. This report presents the temperature, displacement, and calculated stress data gathered during the heating phase of the three heater tests. The data presented have application in the ongoing studies of the response of geologic media to an emplaced heat source. Specifically, electric heaters, which simulate canisters of heat-generating nuclear waste, were placed in the floor of the Avery Island salt mine, and measurements were made of the response of the salt caused by the heating. The purpose of this report is to transmit the data to the scientific community; rigorous analysis and interpretation of the data are considered beyond the scope of this data report. 11 references, 46 figures

  17. Rigor mortis development in turkey breast muscle and the effect of electrical stunning.

    Science.gov (United States)

    Alvarado, C Z; Sams, A R

    2000-11-01

    Rigor mortis development in turkey breast muscle and the effect of electrical stunning on this process are not well characterized. Some electrical stunning procedures have been known to inhibit postmortem (PM) biochemical reactions, thereby delaying the onset of rigor mortis in broilers. Therefore, this study was designed to characterize rigor mortis development in stunned and unstunned turkeys. A total of 154 turkey toms in two trials were conventionally processed at 20 to 22 wk of age. Turkeys were either stunned with a pulsed direct current (500 Hz, 50% duty cycle) at 35 mA (40 V) in a saline bath for 12 seconds or left unstunned as controls. At 15 min and 1, 2, 4, 8, 12, and 24 h PM, pectoralis samples were collected to determine pH, R-value, L* value, sarcomere length, and shear value. In Trial 1, the samples obtained for pH, R-value, and sarcomere length were divided into surface and interior samples. There were no significant differences between the surface and interior samples among any parameters measured. Muscle pH significantly decreased over time in stunned and unstunned birds through 2 h PM. The R-values increased to 8 h PM in unstunned birds and 24 h PM in stunned birds. The L* values increased over time, with no significant differences after 1 h PM for the controls and 2 h PM for the stunned birds. Sarcomere length increased through 2 h PM in the controls and 12 h PM in the stunned fillets. Cooked meat shear values decreased through the 1 h PM deboning time in the control fillets and 2 h PM in the stunned fillets. These results suggest that stunning delayed the development of rigor mortis through 2 h PM, but had no significant effect on the measured parameters at later time points, and that deboning turkey breasts at 2 h PM or later will not significantly impair meat tenderness.

  18. Rigorous Integration of Non-Linear Ordinary Differential Equations in Chebyshev Basis

    Czech Academy of Sciences Publication Activity Database

    Dzetkulič, Tomáš

    2015-01-01

    Roč. 69, č. 1 (2015), s. 183-205 ISSN 1017-1398 R&D Projects: GA MŠk OC10048; GA ČR GD201/09/H057 Institutional research plan: CEZ:AV0Z10300504 Keywords : Initial value problem * Rigorous integration * Taylor model * Chebyshev basis Subject RIV: IN - Informatics, Computer Science Impact factor: 1.366, year: 2015

  19. Make-up of injector test stand (ITS-1) and preliminary results with Model-I ion source

    International Nuclear Information System (INIS)

    Matsuda, S.; Ito, T.; Kondo, U.; Ohara, Y.; Oga, T.; Shibata, T.; Shirakata, H.; Sugawara, T.; Tanaka, S.

    Constitution of the 1-st injector test stand (ITS-1) in the Thermonuclear Division, JAERI, and the performance of the Model-I ion source are described. Heating a plasma by neutral beam injection is one of the promising means in the thermonuclear fusion devices. Purpose of the test stand is to develop the ion sources used in such injection systems. The test stand was completed in February 1975, which is capable of testing the ion sources up to 12 amps at 30 kV. A hydrogen ion beam of 5.5 amps at 25 kV was obtained in the Model-I ion source

  20. Rigorous quantum limits on monitoring free masses and harmonic oscillators

    Science.gov (United States)

    Roy, S. M.

    2018-03-01

    There are heuristic arguments proposing that the accuracy of monitoring position of a free mass m is limited by the standard quantum limit (SQL): σ2( X (t ) ) ≥σ2( X (0 ) ) +(t2/m2) σ2( P (0 ) ) ≥ℏ t /m , where σ2( X (t ) ) and σ2( P (t ) ) denote variances of the Heisenberg representation position and momentum operators. Yuen [Phys. Rev. Lett. 51, 719 (1983), 10.1103/PhysRevLett.51.719] discovered that there are contractive states for which this result is incorrect. Here I prove universally valid rigorous quantum limits (RQL), viz. rigorous upper and lower bounds on σ2( X (t ) ) in terms of σ2( X (0 ) ) and σ2( P (0 ) ) , given by Eq. (12) for a free mass and by Eq. (36) for an oscillator. I also obtain the maximally contractive and maximally expanding states which saturate the RQL, and use the contractive states to set up an Ozawa-type measurement theory with accuracies respecting the RQL but beating the standard quantum limit. The contractive states for oscillators improve on the Schrödinger coherent states of constant variance and may be useful for gravitational wave detection and optical communication.

  1. The Chandra Source Catalog: Algorithms

    Science.gov (United States)

    McDowell, Jonathan; Evans, I. N.; Primini, F. A.; Glotfelty, K. J.; McCollough, M. L.; Houck, J. C.; Nowak, M. A.; Karovska, M.; Davis, J. E.; Rots, A. H.; Siemiginowska, A. L.; Hain, R.; Evans, J. D.; Anderson, C. S.; Bonaventura, N. R.; Chen, J. C.; Doe, S. M.; Fabbiano, G.; Galle, E. C.; Gibbs, D. G., II; Grier, J. D.; Hall, D. M.; Harbo, P. N.; He, X.; Lauer, J.; Miller, J. B.; Mitschang, A. W.; Morgan, D. L.; Nichols, J. S.; Plummer, D. A.; Refsdal, B. L.; Sundheim, B. A.; Tibbetts, M. S.; van Stone, D. W.; Winkelman, S. L.; Zografou, P.

    2009-09-01

    Creation of the Chandra Source Catalog (CSC) required adjustment of existing pipeline processing, adaptation of existing interactive analysis software for automated use, and development of entirely new algorithms. Data calibration was based on the existing pipeline, but more rigorous data cleaning was applied and the latest calibration data products were used. For source detection, a local background map was created including the effects of ACIS source readout streaks. The existing wavelet source detection algorithm was modified and a set of post-processing scripts used to correct the results. To analyse the source properties we ran the SAO Traceray trace code for each source to generate a model point spread function, allowing us to find encircled energy correction factors and estimate source extent. Further algorithms were developed to characterize the spectral, spatial and temporal properties of the sources and to estimate the confidence intervals on count rates and fluxes. Finally, sources detected in multiple observations were matched, and best estimates of their merged properties derived. In this paper we present an overview of the algorithms used, with more detailed treatment of some of the newly developed algorithms presented in companion papers.

  2. Rigorous simulations of a helical core fiber by the use of transformation optics formalism.

    Science.gov (United States)

    Napiorkowski, Maciej; Urbanczyk, Waclaw

    2014-09-22

    We report for the first time on rigorous numerical simulations of a helical-core fiber by using a full vectorial method based on the transformation optics formalism. We modeled the dependence of circular birefringence of the fundamental mode on the helix pitch and analyzed the effect of a birefringence increase caused by the mode displacement induced by a core twist. Furthermore, we analyzed the complex field evolution versus the helix pitch in the first order modes, including polarization and intensity distribution. Finally, we show that the use of the rigorous vectorial method allows to better predict the confinement loss of the guided modes compared to approximate methods based on equivalent in-plane bending models.

  3. Comparison of General Purpose Heat Source testing with the ANSI N43.6-1977 (R 1989) sealed source standard

    International Nuclear Information System (INIS)

    Grigsby, C.O.

    1998-01-01

    This analysis provides a comparison of the testing of Radioisotope Thermoelectric Generators (RTGs) and RTG components with the testing requirements of ANSI N43.6-1977 (R1989) ''Sealed Radioactive Sources, Categorization''. The purpose of this comparison is to demonstrate that the RTGs meet or exceed the requirements of the ANSI standard, and thus can be excluded from the radioactive inventory of the Chemistry and Metallurgy Research (CMR) building in Los Alamos per Attachment 1 of DOE STD 1027-92. The approach used in this analysis is as follows: (1) describe the ANSI sealed source classification methodology; (2) develop sealed source performance requirements for the RTG and/or RTG components based on criteria from the accident analysis for CMR; (3) compare the existing RTG or RTG component test data to the CMR requirements; and (4) determine the appropriate ANSI classification for the RTG and/or RTG components based on CMR performance requirements. The CMR requirements for treating RTGs as sealed sources are derived from the radiotoxicity of the isotope ( 238 P7) and amount (13 kg) of radioactive material contained in the RTG. The accident analysis for the CMR BIO identifies the bounding accidents as wing-wide fire, explosion and earthquake. These accident scenarios set the requirements for RTGs or RTG components stored within the CMR

  4. Low-Energy Microfocus X-Ray Source for Enhanced Testing Capability in the Stray Light Facility

    Science.gov (United States)

    Gaskin, Jessica; O'Dell, Stephen; Kolodziejczak, Jeff

    2015-01-01

    Research toward high-resolution, soft x-ray optics (mirrors and gratings) necessary for the next generation large x-ray observatories requires x-ray testing using a low-energy x-ray source with fine angular size (energy microfocus (approximately 0.1 mm spot) x-ray source from TruFocus Corporation that mates directly to the Stray Light Facility (SLF). MSFC X-ray Astronomy team members are internationally recognized for their expertise in the development, fabrication, and testing of grazing-incidence optics for x-ray telescopes. One of the key MSFC facilities for testing novel x-ray instrumentation is the SLF. This facility is an approximately 100-m-long beam line equipped with multiple x-ray sources and detectors. This new source adds to the already robust compliment of instrumentation, allowing MSFC to support additional internal and community x-ray testing needs.

  5. Rationale for a spallation neutron source target system test facility at the 1-MW Long-Pulse Spallation Source

    International Nuclear Information System (INIS)

    Sommer, W.F.

    1995-12-01

    The conceptual design study for a 1-MW Long-Pulse Spallation Source at the Los Alamos Neutron Science Center has shown the feasibility of including a spallation neutron test facility at a relatively low cost. This document presents a rationale for developing such a test bed. Currently, neutron scattering facilities operate at a maximum power of 0.2 MW. Proposed new designs call for power levels as high as 10 MW, and future transmutation activities may require as much as 200 MW. A test bed will allow assessment of target neutronics; thermal hydraulics; remote handling; mechanical structure; corrosion in aqueous, non-aqueous, liquid metal, and molten salt systems; thermal shock on systems and system components; and materials for target systems. Reliable data in these areas are crucial to the safe and reliable operation of new high-power facilities. These tests will provide data useful not only to spallation neutron sources proposed or under development, but also to other projects in accelerator-driven transmutation technologies such as the production of tritium

  6. Failure analysis of radioisotopic heat source capsules tested under multi-axial conditions

    International Nuclear Information System (INIS)

    Zielinski, R.E.; Stacy, E.; Burgan, C.E.

    In order to qualify small radioisotopic heat sources for a 25-yr design life, multi-axial mechanical tests were performed on the structural components of the heat source. The results of these tests indicated that failure predominantly occurred in the middle of the weld ramp-down zone. Examination of the failure zone by standard metallographic techniques failed to indicate the true cause of failure. A modified technique utilizing chemical etching, scanning electron microscopy, and energy dispersive x-ray analysis was employed and dramatically indicated the true cause of failure, impurity concentration in the ramp-down zone. As a result of the initial investigation, weld parameters for the heat sources were altered. Example welds made with a pulse arc technique did not have this impurity buildup in the ramp-down zone

  7. Preparation of tracing source layer in simulation test of nuclide migration

    International Nuclear Information System (INIS)

    Zhao Yingjie; Ni Shiwei; Li Weijuan; Yamamoto, T.; Tanaka, T.; Komiya, T.

    1993-01-01

    In cooperative research between CIRP and JAERI on safety assessment for shallow land disposal of low level radioactive waste, a laboratory simulation test of nuclide migration was carried out, in which the undisturbed loess soil column sampled from CIRP' s field test site was used as testing material, three nuclides, Sr-85, Cs-137 and Co-60 were used as tracers. Special experiment on tracing method was carried out, which included measuring pH value of quartz sand in HCl solution, determining the eligible water content of quartz sand as tracer carrier, measuring distribution uniformity of nuclides in the tracing quartz sand, determining elution rate of nuclides from the tracing quartz sand and detecting activity uniformity of tracing source layer. The experiment results showed that the tracing source layer, in which fine quartz sand was used as tracer carrier, satisfied expected requirement. (1 fig.)

  8. Prediction of Near-Field Wave Attenuation Due to a Spherical Blast Source

    Science.gov (United States)

    Ahn, Jae-Kwang; Park, Duhee

    2017-11-01

    Empirical and theoretical far-field attenuation relationships, which do not capture the near-field response, are most often used to predict the peak amplitude of blast wave. Jiang et al. (Vibration due to a buried explosive source. PhD Thesis, Curtin University, Western Australian School of Mines, 1993) present rigorous wave equations that simulates the near-field attenuation to a spherical blast source in damped and undamped media. However, the effect of loading frequency and velocity of the media have not yet been investigated. We perform a suite of axisymmetric, dynamic finite difference analyses to simulate the propagation of stress waves induced by spherical blast source and to quantify the near-field attenuation. A broad range of loading frequencies, wave velocities, and damping ratios are used in the simulations. The near-field effect is revealed to be proportional to the rise time of the impulse load and wave velocity. We propose an empirical additive function to the theoretical far-field attenuation curve to predict the near-field range and attenuation. The proposed curve is validated against measurements recorded in a test blast.

  9. International Test Comparisons: Reviewing Translation Error in Different Source Language-Target Language Combinations

    Science.gov (United States)

    Zhao, Xueyu; Solano-Flores, Guillermo; Qian, Ming

    2018-01-01

    This article addresses test translation review in international test comparisons. We investigated the applicability of the theory of test translation error--a theory of the multidimensionality and inevitability of test translation error--across source language-target language combinations in the translation of PISA (Programme of International…

  10. Ultracold neutron source at the PULSTAR reactor: Engineering design and cryogenic testing

    Energy Technology Data Exchange (ETDEWEB)

    Korobkina, E., E-mail: ekorobk@ncsu.edu [Department of Nuclear Engineering, North Carolina State University, 2500 Stinson Drive, Box 7909, Raleigh, NC 27695 (United States); Medlin, G. [Department of Physics, North Carolina State University, 2401 Stinson Drive, Box 8202, Raleigh, NC 27695 (United States); Triangle Universities Nuclear Laboratory, 116 Science Drive, Box 90308, Durham, NC 27708 (United States); Wehring, B.; Hawari, A.I. [Department of Nuclear Engineering, North Carolina State University, 2500 Stinson Drive, Box 7909, Raleigh, NC 27695 (United States); Huffman, P.R.; Young, A.R. [Department of Physics, North Carolina State University, 2401 Stinson Drive, Box 8202, Raleigh, NC 27695 (United States); Triangle Universities Nuclear Laboratory, 116 Science Drive, Box 90308, Durham, NC 27708 (United States); Beaumont, B. [Department of Physics, North Carolina State University, 2401 Stinson Drive, Box 8202, Raleigh, NC 27695 (United States); Palmquist, G. [Department of Physics, North Carolina State University, 2401 Stinson Drive, Box 8202, Raleigh, NC 27695 (United States); Triangle Universities Nuclear Laboratory, 116 Science Drive, Box 90308, Durham, NC 27708 (United States)

    2014-12-11

    Construction is completed and commissioning is in progress for an ultracold neutron (UCN) source at the PULSTAR reactor on the campus of North Carolina State University. The source utilizes two stages of neutron moderation, one in heavy water at room temperature and the other in solid methane at ∼40K, followed by a converter stage, solid deuterium at 5 K, that allows a single down scattering of cold neutrons to provide UCN. The UCN source rolls into the thermal column enclosure of the PULSTAR reactor, where neutrons will be delivered from a bare face of the reactor core by streaming through a graphite-lined assembly. The source infrastructure, i.e., graphite-lined assembly, heavy-water system, gas handling system, and helium liquefier cooling system, has been tested and all systems operate as predicted. The research program being considered for the PULSTAR UCN source includes the physics of UCN production, fundamental particle physics, and material surface studies of nanolayers containing hydrogen. In the present paper we report details of the engineering and cryogenic design of the facility as well as results of critical commissioning tests without neutrons.

  11. A rigorous proof of the Landau-Peierls formula and much more

    DEFF Research Database (Denmark)

    Briet, Philippe; Cornean, Horia; Savoie, Baptiste

    2012-01-01

    We present a rigorous mathematical treatment of the zero-field orbital magnetic susceptibility of a non-interacting Bloch electron gas, at fixed temperature and density, for both metals and semiconductors/insulators. In particular, we obtain the Landau-Peierls formula in the low temperature and d...... and density limit as conjectured by Kjeldaas and Kohn (Phys Rev 105:806–813, 1957)....

  12. Unmet Need: Improving mHealth Evaluation Rigor to Build the Evidence Base.

    Science.gov (United States)

    Mookherji, Sangeeta; Mehl, Garrett; Kaonga, Nadi; Mechael, Patricia

    2015-01-01

    mHealth-the use of mobile technologies for health-is a growing element of health system activity globally, but evaluation of those activities remains quite scant, and remains an important knowledge gap for advancing mHealth activities. In 2010, the World Health Organization and Columbia University implemented a small-scale survey to generate preliminary data on evaluation activities used by mHealth initiatives. The authors describe self-reported data from 69 projects in 29 countries. The majority (74%) reported some sort of evaluation activity, primarily nonexperimental in design (62%). The authors developed a 6-point scale of evaluation rigor comprising information on use of comparison groups, sample size calculation, data collection timing, and randomization. The mean score was low (2.4); half (47%) were conducting evaluations with a minimum threshold (4+) of rigor, indicating use of a comparison group, while less than 20% had randomized the mHealth intervention. The authors were unable to assess whether the rigor score was appropriate for the type of mHealth activity being evaluated. What was clear was that although most data came from mHealth projects pilots aimed for scale-up, few had designed evaluations that would support crucial decisions on whether to scale up and how. Whether the mHealth activity is a strategy to improve health or a tool for achieving intermediate outcomes that should lead to better health, mHealth evaluations must be improved to generate robust evidence for cost-effectiveness assessment and to allow for accurate identification of the contribution of mHealth initiatives to health systems strengthening and the impact on actual health outcomes.

  13. Testing earthquake source inversion methodologies

    KAUST Repository

    Page, Morgan T.

    2011-01-01

    Source Inversion Validation Workshop; Palm Springs, California, 11-12 September 2010; Nowadays earthquake source inversions are routinely performed after large earthquakes and represent a key connection between recorded seismic and geodetic data and the complex rupture process at depth. The resulting earthquake source models quantify the spatiotemporal evolution of ruptures. They are also used to provide a rapid assessment of the severity of an earthquake and to estimate losses. However, because of uncertainties in the data, assumed fault geometry and velocity structure, and chosen rupture parameterization, it is not clear which features of these source models are robust. Improved understanding of the uncertainty and reliability of earthquake source inversions will allow the scientific community to use the robust features of kinematic inversions to more thoroughly investigate the complexity of the rupture process and to better constrain other earthquakerelated computations, such as ground motion simulations and static stress change calculations.

  14. Estimation of the time since death--reconsidering the re-establishment of rigor mortis.

    Science.gov (United States)

    Anders, Sven; Kunz, Michaela; Gehl, Axel; Sehner, Susanne; Raupach, Tobias; Beck-Bornholdt, Hans-Peter

    2013-01-01

    In forensic medicine, there is an undefined data background for the phenomenon of re-establishment of rigor mortis after mechanical loosening, a method used in establishing time since death in forensic casework that is thought to occur up to 8 h post-mortem. Nevertheless, the method is widely described in textbooks on forensic medicine. We examined 314 joints (elbow and knee) of 79 deceased at defined time points up to 21 h post-mortem (hpm). Data were analysed using a random intercept model. Here, we show that re-establishment occurred in 38.5% of joints at 7.5 to 19 hpm. Therefore, the maximum time span for the re-establishment of rigor mortis appears to be 2.5-fold longer than thought so far. These findings have major impact on the estimation of time since death in forensic casework.

  15. College Readiness in California: A Look at Rigorous High School Course-Taking

    Science.gov (United States)

    Gao, Niu

    2016-01-01

    Recognizing the educational and economic benefits of a college degree, education policymakers at the federal, state, and local levels have made college preparation a priority. There are many ways to measure college readiness, but one key component is rigorous high school coursework. California has not yet adopted a statewide college readiness…

  16. Rigor in Qualitative Supply Chain Management Research

    DEFF Research Database (Denmark)

    Goffin, Keith; Raja, Jawwad; Claes, Björn

    2012-01-01

    , reliability, and theoretical saturation. Originality/value – It is the authors' contention that the addition of the repertory grid technique to the toolset of methods used by logistics and supply chain management researchers can only enhance insights and the building of robust theories. Qualitative studies......Purpose – The purpose of this paper is to share the authors' experiences of using the repertory grid technique in two supply chain management studies. The paper aims to demonstrate how the two studies provided insights into how qualitative techniques such as the repertory grid can be made more...... rigorous than in the past, and how results can be generated that are inaccessible using quantitative methods. Design/methodology/approach – This paper presents two studies undertaken using the repertory grid technique to illustrate its application in supply chain management research. Findings – The paper...

  17. How to Map Theory: Reliable Methods Are Fruitless Without Rigorous Theory.

    Science.gov (United States)

    Gray, Kurt

    2017-09-01

    Good science requires both reliable methods and rigorous theory. Theory allows us to build a unified structure of knowledge, to connect the dots of individual studies and reveal the bigger picture. Some have criticized the proliferation of pet "Theories," but generic "theory" is essential to healthy science, because questions of theory are ultimately those of validity. Although reliable methods and rigorous theory are synergistic, Action Identification suggests psychological tension between them: The more we focus on methodological details, the less we notice the broader connections. Therefore, psychology needs to supplement training in methods (how to design studies and analyze data) with training in theory (how to connect studies and synthesize ideas). This article provides a technique for visually outlining theory: theory mapping. Theory mapping contains five elements, which are illustrated with moral judgment and with cars. Also included are 15 additional theory maps provided by experts in emotion, culture, priming, power, stress, ideology, morality, marketing, decision-making, and more (see all at theorymaps.org ). Theory mapping provides both precision and synthesis, which helps to resolve arguments, prevent redundancies, assess the theoretical contribution of papers, and evaluate the likelihood of surprising effects.

  18. Heat and mass release for some transient fuel source fires: A test report

    International Nuclear Information System (INIS)

    Nowlen, S.P.

    1986-10-01

    Nine fire tests using five different trash fuel source packages were conducted by Sandia National Laboratories. This report presents the findings of these tests. Data reported includes heat and mass release rates, total heat and mass release, plume temperatures, and average fuel heat of combustion. These tests were conducted as a part of the US Nuclear Regulatory Commission sponsored fire safety research program. Data from these tests were intended for use in nuclear power plant probabilistic risk assessment fire analyses. The results were also used as input to a fire test program at Sandia investigating the vulnerability of electrical control cabinets to fire. The fuel packages tested were chosen to be representative of small to moderately sized transient trash fuel sources of the type that would be found in a nuclear power plant. The highest fire intensity encountered during these tests was 145 kW. Plume temperatures did not exceed 820 0 C

  19. Optimal correction and design parameter search by modern methods of rigorous global optimization

    International Nuclear Information System (INIS)

    Makino, K.; Berz, M.

    2011-01-01

    Frequently the design of schemes for correction of aberrations or the determination of possible operating ranges for beamlines and cells in synchrotrons exhibit multitudes of possibilities for their correction, usually appearing in disconnected regions of parameter space which cannot be directly qualified by analytical means. In such cases, frequently an abundance of optimization runs are carried out, each of which determines a local minimum depending on the specific chosen initial conditions. Practical solutions are then obtained through an often extended interplay of experienced manual adjustment of certain suitable parameters and local searches by varying other parameters. However, in a formal sense this problem can be viewed as a global optimization problem, i.e. the determination of all solutions within a certain range of parameters that lead to a specific optimum. For example, it may be of interest to find all possible settings of multiple quadrupoles that can achieve imaging; or to find ahead of time all possible settings that achieve a particular tune; or to find all possible manners to adjust nonlinear parameters to achieve correction of high order aberrations. These tasks can easily be phrased in terms of such an optimization problem; but while mathematically this formulation is often straightforward, it has been common belief that it is of limited practical value since the resulting optimization problem cannot usually be solved. However, recent significant advances in modern methods of rigorous global optimization make these methods feasible for optics design for the first time. The key ideas of the method lie in an interplay of rigorous local underestimators of the objective functions, and by using the underestimators to rigorously iteratively eliminate regions that lie above already known upper bounds of the minima, in what is commonly known as a branch-and-bound approach. Recent enhancements of the Differential Algebraic methods used in particle

  20. Volume Holograms in Photopolymers: Comparison between Analytical and Rigorous Theories

    Directory of Open Access Journals (Sweden)

    Augusto Beléndez

    2012-08-01

    Full Text Available There is no doubt that the concept of volume holography has led to an incredibly great amount of scientific research and technological applications. One of these applications is the use of volume holograms as optical memories, and in particular, the use of a photosensitive medium like a photopolymeric material to record information in all its volume. In this work we analyze the applicability of Kogelnik’s Coupled Wave theory to the study of volume holograms recorded in photopolymers. Some of the theoretical models in the literature describing the mechanism of hologram formation in photopolymer materials use Kogelnik’s theory to analyze the gratings recorded in photopolymeric materials. If Kogelnik’s theory cannot be applied is necessary to use a more general Coupled Wave theory (CW or the Rigorous Coupled Wave theory (RCW. The RCW does not incorporate any approximation and thus, since it is rigorous, permits judging the accurateness of the approximations included in Kogelnik’s and CW theories. In this article, a comparison between the predictions of the three theories for phase transmission diffraction gratings is carried out. We have demonstrated the agreement in the prediction of CW and RCW and the validity of Kogelnik’s theory only for gratings with spatial frequencies higher than 500 lines/mm for the usual values of the refractive index modulations obtained in photopolymers.

  1. Volume Holograms in Photopolymers: Comparison between Analytical and Rigorous Theories

    Science.gov (United States)

    Gallego, Sergi; Neipp, Cristian; Estepa, Luis A.; Ortuño, Manuel; Márquez, Andrés; Francés, Jorge; Pascual, Inmaculada; Beléndez, Augusto

    2012-01-01

    There is no doubt that the concept of volume holography has led to an incredibly great amount of scientific research and technological applications. One of these applications is the use of volume holograms as optical memories, and in particular, the use of a photosensitive medium like a photopolymeric material to record information in all its volume. In this work we analyze the applicability of Kogelnik’s Coupled Wave theory to the study of volume holograms recorded in photopolymers. Some of the theoretical models in the literature describing the mechanism of hologram formation in photopolymer materials use Kogelnik’s theory to analyze the gratings recorded in photopolymeric materials. If Kogelnik’s theory cannot be applied is necessary to use a more general Coupled Wave theory (CW) or the Rigorous Coupled Wave theory (RCW). The RCW does not incorporate any approximation and thus, since it is rigorous, permits judging the accurateness of the approximations included in Kogelnik’s and CW theories. In this article, a comparison between the predictions of the three theories for phase transmission diffraction gratings is carried out. We have demonstrated the agreement in the prediction of CW and RCW and the validity of Kogelnik’s theory only for gratings with spatial frequencies higher than 500 lines/mm for the usual values of the refractive index modulations obtained in photopolymers.

  2. The rigorous bound on the transmission probability for massless scalar field of non-negative-angular-momentum mode emitted from a Myers-Perry black hole

    Energy Technology Data Exchange (ETDEWEB)

    Ngampitipan, Tritos, E-mail: tritos.ngampitipan@gmail.com [Faculty of Science, Chandrakasem Rajabhat University, Ratchadaphisek Road, Chatuchak, Bangkok 10900 (Thailand); Particle Physics Research Laboratory, Department of Physics, Faculty of Science, Chulalongkorn University, Phayathai Road, Patumwan, Bangkok 10330 (Thailand); Boonserm, Petarpa, E-mail: petarpa.boonserm@gmail.com [Department of Mathematics and Computer Science, Faculty of Science, Chulalongkorn University, Phayathai Road, Patumwan, Bangkok 10330 (Thailand); Chatrabhuti, Auttakit, E-mail: dma3ac2@gmail.com [Particle Physics Research Laboratory, Department of Physics, Faculty of Science, Chulalongkorn University, Phayathai Road, Patumwan, Bangkok 10330 (Thailand); Visser, Matt, E-mail: matt.visser@msor.vuw.ac.nz [School of Mathematics, Statistics, and Operations Research, Victoria University of Wellington, PO Box 600, Wellington 6140 (New Zealand)

    2016-06-02

    Hawking radiation is the evidence for the existence of black hole. What an observer can measure through Hawking radiation is the transmission probability. In the laboratory, miniature black holes can successfully be generated. The generated black holes are, most commonly, Myers-Perry black holes. In this paper, we will derive the rigorous bounds on the transmission probabilities for massless scalar fields of non-negative-angular-momentum modes emitted from a generated Myers-Perry black hole in six, seven, and eight dimensions. The results show that for low energy, the rigorous bounds increase with the increase in the energy of emitted particles. However, for high energy, the rigorous bounds decrease with the increase in the energy of emitted particles. When the black holes spin faster, the rigorous bounds decrease. For dimension dependence, the rigorous bounds also decrease with the increase in the number of extra dimensions. Furthermore, as comparison to the approximate transmission probability, the rigorous bound is proven to be useful.

  3. The rigorous bound on the transmission probability for massless scalar field of non-negative-angular-momentum mode emitted from a Myers-Perry black hole

    International Nuclear Information System (INIS)

    Ngampitipan, Tritos; Boonserm, Petarpa; Chatrabhuti, Auttakit; Visser, Matt

    2016-01-01

    Hawking radiation is the evidence for the existence of black hole. What an observer can measure through Hawking radiation is the transmission probability. In the laboratory, miniature black holes can successfully be generated. The generated black holes are, most commonly, Myers-Perry black holes. In this paper, we will derive the rigorous bounds on the transmission probabilities for massless scalar fields of non-negative-angular-momentum modes emitted from a generated Myers-Perry black hole in six, seven, and eight dimensions. The results show that for low energy, the rigorous bounds increase with the increase in the energy of emitted particles. However, for high energy, the rigorous bounds decrease with the increase in the energy of emitted particles. When the black holes spin faster, the rigorous bounds decrease. For dimension dependence, the rigorous bounds also decrease with the increase in the number of extra dimensions. Furthermore, as comparison to the approximate transmission probability, the rigorous bound is proven to be useful.

  4. Evaluation of the Non-Transient Hydrologic Source Term from the CAMBRIC Underground Nuclear Test in Frenchman Flat, Nevada Test Site

    International Nuclear Information System (INIS)

    Tompson, A B; Maxwell, R M; Carle, S F; Zavarin, M; Pawloski, G A.; Shumaker, D E

    2005-01-01

    Hydrologic Source Term (HST) calculations completed in 1998 at the CAMBRIC underground nuclear test site were LLNL's first attempt to simulate a hydrologic source term at the NTS by linking groundwater flow and transport modeling with geochemical modeling (Tompson et al., 1999). Significant effort was applied to develop a framework that modeled in detail the flow regime and captured all appropriate chemical processes that occurred over time. However, portions of the calculations were simplified because of data limitations and a perceived need for generalization of the results. For example: (1) Transient effects arising from a 16 years of pumping at the site for a radionuclide migration study were not incorporated. (2) Radionuclide fluxes across the water table, as derived from infiltration from a ditch to which pumping effluent was discharged, were not addressed. (3) Hydrothermal effects arising from residual heat of the test were not considered. (4) Background data on the ambient groundwater flow direction were uncertain and not represented. (5) Unclassified information on the Radiologic Source Term (RST) inventory, as tabulated recently by Bowen et al. (2001), was unavailable; instead, only a limited set of derived data were available (see Tompson et al., 1999). (6) Only a small number of radionuclides and geochemical reactions were incorporated in the work. (7) Data and interpretation of the RNM-2S multiple well aquifer test (MWAT) were not available. As a result, the current Transient CAMBRIC Hydrologic Source Term project was initiated as part of a broader Phase 2 Frenchman Flat CAU flow and transport modeling effort. The source term will be calculated under two scenarios: (1) A more specific representation of the transient flow and radionuclide release behavior at the site, reflecting the influence of the background hydraulic gradient, residual test heat, pumping experiment, and ditch recharge, and taking into account improved data sources and modeling

  5. Dataset for Testing Contamination Source Identification Methods for Water Distribution Networks

    Data.gov (United States)

    U.S. Environmental Protection Agency — This dataset includes the results of a simulation study using the source inversion techniques available in the Water Security Toolkit. The data was created to test...

  6. Effects of chopping time, meat source and storage temperature on the colour of New Zealand type fresh beef sausages.

    Science.gov (United States)

    Boles, J A; Mikkelsen, V L; Swan, J E

    1998-05-01

    The colour stability of finely chopped fresh sausages made from post-rigor, pre-rigor salt added (1.5% w/w) or pre-rigor no salt added beef mince was evaluated using a Hunter Miniscan (L (∗) a (∗) b (∗)) and sensory colour panel. Batters were chopped for various times and sausages stored at -1.5 °, + 4.0 ° and + 8.0 °C. Regardless of meat source or chopping time, colour stability was greatest at -1.5 °C. Panellists found the colour of all sausages stored at -1.5 °C acceptable for at least six days. Sausages made from unsalted pre-rigor mince had markedly better colour stability than those made from the other meats, especially when stored at 4 °C or 8 °C.

  7. Some comments on rigorous quantum field path integrals in the analytical regularization scheme

    Energy Technology Data Exchange (ETDEWEB)

    Botelho, Luiz C.L. [Universidade Federal Fluminense (UFF), Niteroi, RJ (Brazil). Dept. de Matematica Aplicada]. E-mail: botelho.luiz@superig.com.br

    2008-07-01

    Through the systematic use of the Minlos theorem on the support of cylindrical measures on R{sup {infinity}}, we produce several mathematically rigorous path integrals in interacting euclidean quantum fields with Gaussian free measures defined by generalized powers of the Laplacian operator. (author)

  8. Some comments on rigorous quantum field path integrals in the analytical regularization scheme

    International Nuclear Information System (INIS)

    Botelho, Luiz C.L.

    2008-01-01

    Through the systematic use of the Minlos theorem on the support of cylindrical measures on R ∞ , we produce several mathematically rigorous path integrals in interacting euclidean quantum fields with Gaussian free measures defined by generalized powers of the Laplacian operator. (author)

  9. A plea for rigorous conceptual analysis as central method in transnational law design

    NARCIS (Netherlands)

    Rijgersberg, R.; van der Kaaij, H.

    2013-01-01

    Although shared problems are generally easily identified in transnational law design, it is considerably more difficult to design frameworks that transcend the peculiarities of local law in a univocal fashion. The following exposition is a plea for giving more prominence to rigorous conceptual

  10. Testing earthquake source inversion methodologies

    KAUST Repository

    Page, Morgan T.; Mai, Paul Martin; Schorlemmer, Danijel

    2011-01-01

    Source Inversion Validation Workshop; Palm Springs, California, 11-12 September 2010; Nowadays earthquake source inversions are routinely performed after large earthquakes and represent a key connection between recorded seismic and geodetic data

  11. Hypothesis tests for the detection of constant speed radiation moving sources

    Energy Technology Data Exchange (ETDEWEB)

    Dumazert, Jonathan; Coulon, Romain; Kondrasovs, Vladimir; Boudergui, Karim; Sannie, Guillaume; Gameiro, Jordan; Normand, Stephane [CEA, LIST, Laboratoire Capteurs Architectures Electroniques, 99 Gif-sur-Yvette, (France); Mechin, Laurence [CNRS, UCBN, Groupe de Recherche en Informatique, Image, Automatique et Instrumentation de Caen, 4050 Caen, (France)

    2015-07-01

    Radiation Portal Monitors are deployed in linear network to detect radiological material in motion. As a complement to single and multichannel detection algorithms, inefficient under too low signal to noise ratios, temporal correlation algorithms have been introduced. Test hypothesis methods based on empirically estimated mean and variance of the signals delivered by the different channels have shown significant gain in terms of a tradeoff between detection sensitivity and false alarm probability. This paper discloses the concept of a new hypothesis test for temporal correlation detection methods, taking advantage of the Poisson nature of the registered counting signals, and establishes a benchmark between this test and its empirical counterpart. The simulation study validates that in the four relevant configurations of a pedestrian source carrier under respectively high and low count rate radioactive background, and a vehicle source carrier under the same respectively high and low count rate radioactive background, the newly introduced hypothesis test ensures a significantly improved compromise between sensitivity and false alarm, while guaranteeing the stability of its optimization parameter regardless of signal to noise ratio variations between 2 to 0.8. (authors)

  12. Design and tests of a package for the transport of radioactive sources; Projeto e testes de uma embalagem para o transporte de fontes radioativas

    Energy Technology Data Exchange (ETDEWEB)

    Santos, Paulo de Oliveira, E-mail: pos@cdtn.b [Centro de Desenvolvimento da Tecnologia Nuclear (CDTN/CNEN-MG), Belo Horizonte, MG (Brazil)

    2011-10-26

    The Type A package was designed for transportation of seven cobalt-60 sources with total activity of 1 GBq. The shield thickness to accomplish the dose rate and the transport index established by the radioactive transport regulation was calculated by the code MCNP (Monte Carlo N-Particle Transport Code Version 5). The sealed cobalt-60 sources were tested for leakages. according to the regulation ISO 9978:1992 (E). The package was tested according to regulation Radioactive Material Transport CNEN. The leakage tests results pf the sources, and the package tests demonstrate that the transport can be safe performed from the CDTN to the steelmaking industries

  13. OpenSR: An Open-Source Stimulus-Response Testing Framework

    Directory of Open Access Journals (Sweden)

    Carolyn C. Matheus

    2015-01-01

    Full Text Available Stimulus–response (S–R tests provide a unique way to acquire information about human perception by capturing automatic responses to stimuli and attentional processes. This paper presents OpenSR, a user-centered S–R testing framework providing a graphical user interface that can be used by researchers to customize, administer, and manage one type of S–R test, the implicit association test. OpenSR provides an extensible open-source Web-based framework that is platform independent and can be implemented on most computers using any operating system. In addition, it provides capabilities for automatically generating and assigning participant identifications, assigning participants to different condition groups, tracking responses, and facilitating collecting and exporting of data. The Web technologies and languages used in creating the OpenSR framework are discussed, namely, HTML5, CSS3, JavaScript, jQuery, Twitter Bootstrap, Python, and Django. OpenSR is available for free download.

  14. Report on the engineering test of the LBL 30 second neutral beam source for the MFTF-B project

    International Nuclear Information System (INIS)

    Vella, M.C.; Pincosy, P.A.; Hauck, C.A.; Pyle, R.V.

    1984-08-01

    Positive ion based neutral beam development in the US has centered on the long pulse, Advanced Positive Ion Source (APIS). APIS eventually focused on development of 30 second sources for MFTF-B. The Engineering Test was part of competitive testing of the LBL and ORNL long pulse sources carried out for the MFTF-B Project. The test consisted of 500 beam shots with 80 kV, 30 second deuterium, and was carried out on the Neutral Beam Engineering Test Facility (NBETF). This report summarizes the results of LBL testing, in which the LBL APIS demonstrated that it would meet the requirements for MFTF-B 30 second sources. In part as a result of this test, the LBL design was found to be suitable as the baseline for a Common Long Pulse Source design for MFTF-B, TFTR, and Doublet Upgrade

  15. A new method of testing space-based high-energy electron detectors with radioactive electron sources

    Science.gov (United States)

    Zhang, S. Y.; Shen, G. H.; Sun, Y.; Zhou, D. Z.; Zhang, X. X.; Li, J. W.; Huang, C.; Zhang, X. G.; Dong, Y. J.; Zhang, W. J.; Zhang, B. Q.; Shi, C. Y.

    2016-05-01

    Space-based electron detectors are commonly tested using radioactive β-sources which emit a continuous spectrum without spectral lines. Therefore, the tests are often to be considered only qualitative. This paper introduces a method, which results in more than a qualitative test even when using a β-source. The basic idea is to use the simulated response function of the instrument to invert the measured spectrum and compare this inverted spectrum with a reference spectrum obtained from the same source. Here we have used Geant4 to simulate the instrument response function (IRF) and a 3.5 mm thick Li-drifted Si detector to obtain the reference 90Sr/90Yi source spectrum to test and verify the geometric factors of the Omni-Direction Particle Detector (ODPD) on the Tiangong-1 (TG-1) and Tiangong-2 (TG-2) spacecraft. The TG spacecraft are experimental space laboratories and prototypes of the Chinese space station. The excellent agreement between the measured and reference spectra demonstrates that this test method can be used to quantitatively assess the quality of the instrument. Due to its simplicity, the method is faster and therefore more efficient than traditional full calibrations using an electron accelerator.

  16. The feasibility of 10 keV X-ray as radiation source in total dose response radiation test

    International Nuclear Information System (INIS)

    Li Ruoyu; Li Bin; Luo Hongwei; Shi Qian

    2005-01-01

    The standard radiation source utilized in traditional total dose response radiation test is 60 Co, which is environment-threatening. X-rays, as a new radiation source, has the advantages such as safety, precise control of dose rate, strong intensity, possibility of wafer-level test or even on-line test, which greatly reduce cost for package, test and transportation. This paper discussed the feasibility of X-rays replacing 60 Co as the radiation source, based on the radiation mechanism and the effects of radiation on gate oxide. (authors)

  17. Application of the rigorous method to x-ray and neutron beam scattering on rough surfaces

    International Nuclear Information System (INIS)

    Goray, Leonid I.

    2010-01-01

    The paper presents a comprehensive numerical analysis of x-ray and neutron scattering from finite-conducting rough surfaces which is performed in the frame of the boundary integral equation method in a rigorous formulation for high ratios of characteristic dimension to wavelength. The single integral equation obtained involves boundary integrals of the single and double layer potentials. A more general treatment of the energy conservation law applicable to absorption gratings and rough mirrors is considered. In order to compute the scattering intensity of rough surfaces using the forward electromagnetic solver, Monte Carlo simulation is employed to average the deterministic diffraction grating efficiency due to individual surfaces over an ensemble of realizations. Some rules appropriate for numerical implementation of the theory at small wavelength-to-period ratios are presented. The difference between the rigorous approach and approximations can be clearly seen in specular reflectances of Au mirrors with different roughness parameters at wavelengths where grazing incidence occurs at close to or larger than the critical angle. This difference may give rise to wrong estimates of rms roughness and correlation length if they are obtained by comparing experimental data with calculations. Besides, the rigorous approach permits taking into account any known roughness statistics and allows exact computation of diffuse scattering.

  18. AutoLens: Automated Modeling of a Strong Lens's Light, Mass and Source

    Science.gov (United States)

    Nightingale, J. W.; Dye, S.; Massey, Richard J.

    2018-05-01

    This work presents AutoLens, the first entirely automated modeling suite for the analysis of galaxy-scale strong gravitational lenses. AutoLens simultaneously models the lens galaxy's light and mass whilst reconstructing the extended source galaxy on an adaptive pixel-grid. The method's approach to source-plane discretization is amorphous, adapting its clustering and regularization to the intrinsic properties of the lensed source. The lens's light is fitted using a superposition of Sersic functions, allowing AutoLens to cleanly deblend its light from the source. Single component mass models representing the lens's total mass density profile are demonstrated, which in conjunction with light modeling can detect central images using a centrally cored profile. Decomposed mass modeling is also shown, which can fully decouple a lens's light and dark matter and determine whether the two component are geometrically aligned. The complexity of the light and mass models are automatically chosen via Bayesian model comparison. These steps form AutoLens's automated analysis pipeline, such that all results in this work are generated without any user-intervention. This is rigorously tested on a large suite of simulated images, assessing its performance on a broad range of lens profiles, source morphologies and lensing geometries. The method's performance is excellent, with accurate light, mass and source profiles inferred for data sets representative of both existing Hubble imaging and future Euclid wide-field observations.

  19. Various quantum nonlocality tests with a commercial two-photon entanglement source

    International Nuclear Information System (INIS)

    Pomarico, Enrico; Bancal, Jean-Daniel; Sanguinetti, Bruno; Rochdi, Anas; Gisin, Nicolas

    2011-01-01

    Nonlocality is a fascinating and counterintuitive aspect of nature, revealed by the violation of a Bell inequality. The standard and easiest configuration in which Bell inequalities can be measured has been proposed by Clauser-Horne-Shimony-Holt (CHSH). However, alternative nonlocality tests can also be carried out. In particular, Bell inequalities requiring multiple measurement settings can provide deeper fundamental insights about quantum nonlocality, as well as offering advantages in the presence of noise and detection inefficiency. In this paper we show how these nonlocality tests can be performed using a commercially available source of entangled photon pairs. We report the violation of a series of these nonlocality tests (I 3322 , I 4422 , and chained inequalities). With the violation of the chained inequality with 4 settings per side we put an upper limit at 0.49 on the local content of the states prepared by the source (instead of 0.63 attainable with CHSH). We also quantify the amount of true randomness that has been created during our experiment (assuming fair sampling of the detected events).

  20. Efficiency versus speed in quantum heat engines: Rigorous constraint from Lieb-Robinson bound

    Science.gov (United States)

    Shiraishi, Naoto; Tajima, Hiroyasu

    2017-08-01

    A long-standing open problem whether a heat engine with finite power achieves the Carnot efficiency is investgated. We rigorously prove a general trade-off inequality on thermodynamic efficiency and time interval of a cyclic process with quantum heat engines. In a first step, employing the Lieb-Robinson bound we establish an inequality on the change in a local observable caused by an operation far from support of the local observable. This inequality provides a rigorous characterization of the following intuitive picture that most of the energy emitted from the engine to the cold bath remains near the engine when the cyclic process is finished. Using this description, we prove an upper bound on efficiency with the aid of quantum information geometry. Our result generally excludes the possibility of a process with finite speed at the Carnot efficiency in quantum heat engines. In particular, the obtained constraint covers engines evolving with non-Markovian dynamics, which almost all previous studies on this topic fail to address.

  1. Rigorous Screening Technology for Identifying Suitable CO2 Storage Sites II

    Energy Technology Data Exchange (ETDEWEB)

    George J. Koperna Jr.; Vello A. Kuuskraa; David E. Riestenberg; Aiysha Sultana; Tyler Van Leeuwen

    2009-06-01

    This report serves as the final technical report and users manual for the 'Rigorous Screening Technology for Identifying Suitable CO2 Storage Sites II SBIR project. Advanced Resources International has developed a screening tool by which users can technically screen, assess the storage capacity and quantify the costs of CO2 storage in four types of CO2 storage reservoirs. These include CO2-enhanced oil recovery reservoirs, depleted oil and gas fields (non-enhanced oil recovery candidates), deep coal seems that are amenable to CO2-enhanced methane recovery, and saline reservoirs. The screening function assessed whether the reservoir could likely serve as a safe, long-term CO2 storage reservoir. The storage capacity assessment uses rigorous reservoir simulation models to determine the timing, ultimate storage capacity, and potential for enhanced hydrocarbon recovery. Finally, the economic assessment function determines both the field-level and pipeline (transportation) costs for CO2 sequestration in a given reservoir. The screening tool has been peer reviewed at an Electrical Power Research Institute (EPRI) technical meeting in March 2009. A number of useful observations and recommendations emerged from the Workshop on the costs of CO2 transport and storage that could be readily incorporated into a commercial version of the Screening Tool in a Phase III SBIR.

  2. Post mortem rigor development in the Egyptian goose (Alopochen aegyptiacus) breast muscle (pectoralis): factors which may affect the tenderness.

    Science.gov (United States)

    Geldenhuys, Greta; Muller, Nina; Frylinck, Lorinda; Hoffman, Louwrens C

    2016-01-15

    Baseline research on the toughness of Egyptian goose meat is required. This study therefore investigates the post mortem pH and temperature decline (15 min-4 h 15 min post mortem) in the pectoralis muscle (breast portion) of this gamebird species. It also explores the enzyme activity of the Ca(2+)-dependent protease (calpain system) and the lysosomal cathepsins during the rigor mortis period. No differences were found for any of the variables between genders. The pH decline in the pectoralis muscle occurs quite rapidly (c = -0.806; ultimate pH ∼ 5.86) compared with other species and it is speculated that the high rigor temperature (>20 °C) may contribute to the increased toughness. No calpain I was found in Egyptian goose meat and the µ/m-calpain activity remained constant during the rigor period, while a decrease in calpastatin activity was observed. The cathepsin B, B & L and H activity increased over the rigor period. Further research into the connective tissue content and myofibrillar breakdown during aging is required in order to know if the proteolytic enzymes do in actual fact contribute to tenderisation. © 2015 Society of Chemical Industry.

  3. A methodology for the rigorous verification of plasma simulation codes

    Science.gov (United States)

    Riva, Fabio

    2016-10-01

    The methodology used to assess the reliability of numerical simulation codes constitutes the Verification and Validation (V&V) procedure. V&V is composed by two separate tasks: the verification, which is a mathematical issue targeted to assess that the physical model is correctly solved, and the validation, which determines the consistency of the code results, and therefore of the physical model, with experimental data. In the present talk we focus our attention on the verification, which in turn is composed by the code verification, targeted to assess that a physical model is correctly implemented in a simulation code, and the solution verification, that quantifies the numerical error affecting a simulation. Bridging the gap between plasma physics and other scientific domains, we introduced for the first time in our domain a rigorous methodology for the code verification, based on the method of manufactured solutions, as well as a solution verification based on the Richardson extrapolation. This methodology was applied to GBS, a three-dimensional fluid code based on a finite difference scheme, used to investigate the plasma turbulence in basic plasma physics experiments and in the tokamak scrape-off layer. Overcoming the difficulty of dealing with a numerical method intrinsically affected by statistical noise, we have now generalized the rigorous verification methodology to simulation codes based on the particle-in-cell algorithm, which are employed to solve Vlasov equation in the investigation of a number of plasma physics phenomena.

  4. RIGOROUS PHOTOGRAMMETRIC PROCESSING OF CHANG'E-1 AND CHANG'E-2 STEREO IMAGERY FOR LUNAR TOPOGRAPHIC MAPPING

    OpenAIRE

    K. Di; Y. Liu; B. Liu; M. Peng

    2012-01-01

    Chang'E-1(CE-1) and Chang'E-2(CE-2) are the two lunar orbiters of China's lunar exploration program. Topographic mapping using CE-1 and CE-2 images is of great importance for scientific research as well as for preparation of landing and surface operation of Chang'E-3 lunar rover. In this research, we developed rigorous sensor models of CE-1 and CE-2 CCD cameras based on push-broom imaging principle with interior and exterior orientation parameters. Based on the rigorous sensor model, the 3D c...

  5. Source Country Differences in Test Score Gaps: Evidence from Denmark

    Science.gov (United States)

    Rangvid, Beatrice Schindler

    2010-01-01

    We combine data from three studies for Denmark in the PISA 2000 framework to investigate differences in the native-immigrant test score gap by country of origin. In addition to the controls available from PISA data sources, we use student-level data on home background and individual migration histories linked from administrative registers. We find…

  6. A new method of testing space-based high-energy electron detectors with radioactive electron sources

    Energy Technology Data Exchange (ETDEWEB)

    Zhang, S.Y. [National Space Science Center, Chinese Academy of Sciences, Beijing (China); Beijing Key Laboratory of Space Environment Exploration, Beijing (China); Shen, G.H., E-mail: shgh@nssc.ac.cn [National Space Science Center, Chinese Academy of Sciences, Beijing (China); Beijing Key Laboratory of Space Environment Exploration, Beijing (China); Sun, Y., E-mail: sunying@nssc.ac.cn [National Space Science Center, Chinese Academy of Sciences, Beijing (China); Beijing Key Laboratory of Space Environment Exploration, Beijing (China); Zhou, D.Z., E-mail: dazhuang.zhou@gmail.com [National Space Science Center, Chinese Academy of Sciences, Beijing (China); Beijing Key Laboratory of Space Environment Exploration, Beijing (China); Zhang, X.X., E-mail: xxzhang@cma.gov.cn [National Center for Space Weather, Beijing (China); Li, J.W., E-mail: lijw@cma.gov.cn [National Center for Space Weather, Beijing (China); Huang, C., E-mail: huangc@cma.gov.cn [National Center for Space Weather, Beijing (China); Zhang, X.G., E-mail: zhangxg@nssc.ac.cn [National Space Science Center, Chinese Academy of Sciences, Beijing (China); Beijing Key Laboratory of Space Environment Exploration, Beijing (China); Dong, Y.J., E-mail: dyj@nssc.ac.cn [National Space Science Center, Chinese Academy of Sciences, Beijing (China); Beijing Key Laboratory of Space Environment Exploration, Beijing (China); Zhang, W.J., E-mail: zhangreatest@163.com [National Space Science Center, Chinese Academy of Sciences, Beijing (China); Beijing Key Laboratory of Space Environment Exploration, Beijing (China); Zhang, B.Q., E-mail: zhangbinquan@nssc.ac.cn [National Space Science Center, Chinese Academy of Sciences, Beijing (China); Beijing Key Laboratory of Space Environment Exploration, Beijing (China); Shi, C.Y., E-mail: scy@nssc.ac.cn [National Space Science Center, Chinese Academy of Sciences, Beijing (China); Beijing Key Laboratory of Space Environment Exploration, Beijing (China)

    2016-05-01

    Space-based electron detectors are commonly tested using radioactive β-sources which emit a continuous spectrum without spectral lines. Therefore, the tests are often to be considered only qualitative. This paper introduces a method, which results in more than a qualitative test even when using a β-source. The basic idea is to use the simulated response function of the instrument to invert the measured spectrum and compare this inverted spectrum with a reference spectrum obtained from the same source. Here we have used Geant4 to simulate the instrument response function (IRF) and a 3.5 mm thick Li-drifted Si detector to obtain the reference ⁹⁰Sr/⁹⁰Y source spectrum to test and verify the geometric factors of the Omni-Direction Particle Detector (ODPD) on the Tiangong-1 (TG-1) and Tiangong-2 (TG-2) spacecraft. The TG spacecraft are experimental space laboratories and prototypes of the Chinese space station. The excellent agreement between the measured and reference spectra demonstrates that this test method can be used to quantitatively assess the quality of the instrument. Due to its simplicity, the method is faster and therefore more efficient than traditional full calibrations using an electron accelerator.
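
    The central step described above, unfolding the measured spectrum through the simulated instrument response function, can be cast as a constrained linear inversion. The sketch below illustrates that formulation with a toy Gaussian response matrix standing in for the Geant4-computed IRF; the binning, response values and noise level are invented and do not refer to the actual ODPD analysis.

```python
import numpy as np
from scipy.optimize import nnls

# Toy instrument response: measured counts m = R @ s, where s is the incident
# electron spectrum binned in energy and R is the simulated response matrix
# (here a smeared identity as a stand-in for the Geant4-computed IRF).
n_bins = 40
energies = np.linspace(0.1, 2.5, n_bins)            # MeV, illustrative binning
R = np.array([[np.exp(-0.5 * ((e - ep) / 0.08) ** 2) for ep in energies]
              for e in energies])
R /= R.sum(axis=1, keepdims=True)

true_spectrum = np.exp(-energies / 0.6)              # stand-in for a beta spectrum
measured = R @ true_spectrum + np.random.default_rng(0).normal(0, 1e-3, n_bins)

# Invert the measurement with a non-negativity constraint; in the test described
# above the unfolded spectrum would then be compared with the reference spectrum.
unfolded, _ = nnls(R, measured)
print(float(np.max(np.abs(unfolded - true_spectrum))))
```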

  7. A new method of testing space-based high-energy electron detectors with radioactive electron sources

    International Nuclear Information System (INIS)

    Zhang, S.Y.; Shen, G.H.; Sun, Y.; Zhou, D.Z.; Zhang, X.X.; Li, J.W.; Huang, C.; Zhang, X.G.; Dong, Y.J.; Zhang, W.J.; Zhang, B.Q.; Shi, C.Y.

    2016-01-01

    Space-based electron detectors are commonly tested using radioactive β-sources which emit a continuous spectrum without spectral lines. Therefore, the tests are often to be considered only qualitative. This paper introduces a method, which results in more than a qualitative test even when using a β-source. The basic idea is to use the simulated response function of the instrument to invert the measured spectrum and compare this inverted spectrum with a reference spectrum obtained from the same source. Here we have used Geant4 to simulate the instrument response function (IRF) and a 3.5 mm thick Li-drifted Si detector to obtain the reference ⁹⁰Sr/⁹⁰Y source spectrum to test and verify the geometric factors of the Omni-Direction Particle Detector (ODPD) on the Tiangong-1 (TG-1) and Tiangong-2 (TG-2) spacecraft. The TG spacecraft are experimental space laboratories and prototypes of the Chinese space station. The excellent agreement between the measured and reference spectra demonstrates that this test method can be used to quantitatively assess the quality of the instrument. Due to its simplicity, the method is faster and therefore more efficient than traditional full calibrations using an electron accelerator.

  8. Endangered Butterflies as a Model System for Managing Source Sink Dynamics on Department of Defense Lands

    Science.gov (United States)

    used three species of endangered butterflies as a model system to rigorously investigate the source-sink dynamics of species being managed on military...lands. Butterflies have numerous advantages as models for source-sink dynamics, including rapid generation times and relatively limited dispersal, but...they are subject to the same processes that determine source-sink dynamics of longer-lived, more vagile taxa. 1.2 Technical Approach: For two of our

  9. 75 FR 29732 - Career and Technical Education Program-Promoting Rigorous Career and Technical Education Programs...

    Science.gov (United States)

    2010-05-27

    ... rigorous knowledge and skills in English-language arts and mathematics that employers and colleges expect... specialists and to access the student outcome data needed to meet annual evaluation and reporting requirements...

  10. Rigorous Line-Based Transformation Model Using the Generalized Point Strategy for the Rectification of High Resolution Satellite Imagery

    Directory of Open Access Journals (Sweden)

    Kun Hu

    2016-09-01

    Full Text Available High precision geometric rectification of High Resolution Satellite Imagery (HRSI is the basis of digital mapping and Three-Dimensional (3D modeling. Taking advantage of line features as basic geometric control conditions instead of control points, the Line-Based Transformation Model (LBTM provides a practical and efficient way of image rectification. It is competent to build the mathematical relationship between image space and the corresponding object space accurately, while it reduces the workloads of ground control and feature recognition dramatically. Based on generalization and the analysis of existing LBTMs, a novel rigorous LBTM is proposed in this paper, which can further eliminate the geometric deformation caused by sensor inclination and terrain variation. This improved nonlinear LBTM is constructed based on a generalized point strategy and resolved by least squares overall adjustment. Geo-positioning accuracy experiments with IKONOS, GeoEye-1 and ZiYuan-3 satellite imagery are performed to compare rigorous LBTM with other relevant line-based and point-based transformation models. Both theoretic analysis and experimental results demonstrate that the rigorous LBTM is more accurate and reliable without adding extra ground control. The geo-positioning accuracy of satellite imagery rectified by rigorous LBTM can reach about one pixel with eight control lines and can be further improved by optimizing the horizontal and vertical distribution of control lines.

  11. Improved Thermal-Vacuum Compatible Flat Plate Radiometric Source For System-Level Testing Of Optical Sensors

    Science.gov (United States)

    Schwarz, Mark A.; Kent, Craig J.; Bousquet, Robert; Brown, Steven W.

    2016-01-01

    In this work, we describe an improved thermal-vacuum compatible flat plate radiometric source which has been developed and utilized for the characterization and calibration of remote optical sensors. This source is unique in that it can be used in situ, in both ambient and thermal-vacuum environments, allowing it to follow the sensor throughout its testing cycle. The performance of the original flat plate radiometric source was presented at SPIE in 2009 [1]. Following the original efforts, design upgrades were incorporated into the source to improve both radiometric throughput and uniformity. The pre-thermal-vacuum (pre-TVAC) testing results of a spacecraft-level optical sensor with the improved flat plate illumination source, both in ambient and vacuum environments, are presented. We also briefly discuss potential FPI configuration changes in order to improve its radiometric performance.

  12. Rigorous patient-prosthesis matching of Perimount Magna aortic bioprosthesis.

    Science.gov (United States)

    Nakamura, Hiromasa; Yamaguchi, Hiroki; Takagaki, Masami; Kadowaki, Tasuku; Nakao, Tatsuya; Amano, Atsushi

    2015-03-01

    Severe patient-prosthesis mismatch, defined as effective orifice area index ≤0.65 cm² m⁻², has demonstrated poor long-term survival after aortic valve replacement. Reported rates of severe mismatch involving the Perimount Magna aortic bioprosthesis range from 4% to 20% in patients with a small annulus. Between June 2008 and August 2011, 251 patients (mean age 70.5 ± 10.2 years; mean body surface area 1.55 ± 0.19 m²) underwent aortic valve replacement with a Perimount Magna bioprosthesis, with or without concomitant procedures. We performed our procedure with rigorous patient-prosthesis matching to implant a valve appropriately sized to each patient, and carried out annular enlargement when a 19-mm valve did not fit. The bioprosthetic performance was evaluated by transthoracic echocardiography predischarge and at 1 and 2 years after surgery. Overall hospital mortality was 1.6%. Only 5 (2.0%) patients required annular enlargement. The mean follow-up period was 19.1 ± 10.7 months with a 98.4% completion rate. Predischarge data showed a mean effective orifice area index of 1.21 ± 0.20 cm² m⁻². Moderate mismatch, defined as effective orifice area index ≤0.85 cm² m⁻², developed in 4 (1.6%) patients. None developed severe mismatch. Data at 1 and 2 years showed only two cases of moderate mismatch; neither was severe. Rigorous patient-prosthesis matching maximized the performance of the Perimount Magna, and no severe mismatch resulted in this Japanese population of aortic valve replacement patients. © The Author(s) 2014 Reprints and permissions: sagepub.co.uk/journalsPermissions.nav.
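
    Using the definitions quoted in the abstract (severe mismatch: effective orifice area index ≤ 0.65 cm² m⁻²; moderate: ≤ 0.85 cm² m⁻²), the indexing and grading step can be written in a few lines. The helper below is a minimal sketch; the effective orifice area and body surface area values would in practice come from echocardiography and patient data.

```python
def eoa_index(effective_orifice_area_cm2: float, body_surface_area_m2: float) -> float:
    """Effective orifice area indexed to body surface area (cm^2/m^2)."""
    return effective_orifice_area_cm2 / body_surface_area_m2

def mismatch_grade(eoai: float) -> str:
    """Grade patient-prosthesis mismatch with the thresholds used in the study."""
    if eoai <= 0.65:
        return "severe"
    if eoai <= 0.85:
        return "moderate"
    return "none"

# Example: a 1.9 cm^2 prosthesis in a patient with BSA 1.55 m^2 (illustrative values)
print(mismatch_grade(eoa_index(1.9, 1.55)))  # -> "none" (EOAI ~ 1.23 cm^2/m^2)
```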

  13. Manufacturing cost study on the ion sources for the Mirror Fusion Test Facility

    International Nuclear Information System (INIS)

    A study of the cost of manufacturing 48 ion sources for the Mirror Fusion Test Facility is described. The estimate is built up from individual part costs and assembly operation times for the 80 kV prototype source constructed by LLL and described by LLL drawings furnished during December 1978. Recommendations for cost reduction are made

  14. Rigorously testing multialternative decision field theory against random utility models.

    Science.gov (United States)

    Berkowitsch, Nicolas A J; Scheibehenne, Benjamin; Rieskamp, Jörg

    2014-06-01

    Cognitive models of decision making aim to explain the process underlying observed choices. Here, we test a sequential sampling model of decision making, multialternative decision field theory (MDFT; Roe, Busemeyer, & Townsend, 2001), on empirical grounds and compare it against 2 established random utility models of choice: the probit and the logit model. Using a within-subject experimental design, participants in 2 studies repeatedly choose among sets of options (consumer products) described on several attributes. The results of Study 1 showed that all models predicted participants' choices equally well. In Study 2, in which the choice sets were explicitly designed to distinguish the models, MDFT had an advantage in predicting the observed choices. Study 2 further revealed the occurrence of multiple context effects within single participants, indicating an interdependent evaluation of choice options and correlations between different context effects. In sum, the results indicate that sequential sampling models can provide relevant insights into the cognitive process underlying preferential choices and thus can lead to better choice predictions. PsycINFO Database Record (c) 2014 APA, all rights reserved.
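
    For readers unfamiliar with the random utility benchmarks, the multinomial logit model referred to above assigns choice probabilities through a softmax over option utilities. The snippet below shows that textbook rule only; it is not the authors' fitted model, and MDFT itself additionally involves sequential sampling of attention-weighted comparisons.

```python
import math

def logit_choice_probabilities(utilities: list[float]) -> list[float]:
    """Multinomial logit: P(i) = exp(u_i) / sum_j exp(u_j)."""
    m = max(utilities)                       # subtract max for numerical stability
    exps = [math.exp(u - m) for u in utilities]
    z = sum(exps)
    return [e / z for e in exps]

# Three consumer products with deterministic utilities derived from their attributes
print(logit_choice_probabilities([1.2, 0.8, 0.5]))
```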

  15. Advanced photon source low-energy undulator test line

    International Nuclear Information System (INIS)

    Milton, S.V.

    1997-01-01

    The injector system of the Advanced Photon Source (APS) consists of a linac capable of producing 450-MeV positrons or > 650-MeV electrons, a positron accumulator ring (PAR), and a booster synchrotron designed to accelerate particles to 7 GeV. There are long periods of time when these machines are not required for filling the main storage ring and instead can be used for synchrotron radiation research. We describe here an extension of the linac beam transport called the Low-Energy Undulator Test Line (LEUTL). The LEUTL will have a twofold purpose. The first is to fully characterize innovative, future generation undulators, some of which may prove difficult or impossible to measure by traditional techniques. These might include small-gap and superconducting undulators, very long undulators, undulators with designed-in internal focusing, and helical undulators. This technique also holds the promise of extending the magnetic measurement sensitivity beyond that presently attainable. This line will provide the capability to directly test undulators before their possible insertion into operating storage rings. A second use for the test line will be to investigate the generation of coherent radiation at wavelengths down to a few tens of nanometers

  16. 10 CFR 34.67 - Records of leak testing of sealed sources and devices containing depleted uranium.

    Science.gov (United States)

    2010-01-01

    ... 10 Energy 1 2010-01-01 2010-01-01 false Records of leak testing of sealed sources and devices containing depleted uranium. 34.67 Section 34.67 Energy NUCLEAR REGULATORY COMMISSION LICENSES FOR INDUSTRIAL... Requirements § 34.67 Records of leak testing of sealed sources and devices containing depleted uranium. Each...

  17. Interplanetary electrons: what is the strength of the Jupiter source

    International Nuclear Information System (INIS)

    Fillius, W.; Ip, Wing-Huen; Knickerbocker, P.

    1977-01-01

    Because there is not enough information to support a rigorous answer, we use a phenomenological approach and conservative assumptions to address the source strength of Jupiter for interplanetary electrons. We estimate that Jupiter emits approximately 10²⁴-10²⁶ electrons s⁻¹ of energy > 6 MeV, which source may be compared with the population of approximately 3 × 10²⁸ electrons of the same energy in Jupiter's outer magnetosphere. We conclude that Jupiter accelerates particles at a rate exceeding that of ordinary trapped particle dynamical processes. (author)

  18. Bringing scientific rigor to community-developed programs in Hong Kong

    Directory of Open Access Journals (Sweden)

    Fabrizio Cecilia S

    2012-12-01

    Full Text Available Abstract Background This paper describes efforts to generate evidence for community-developed programs to enhance family relationships in the Chinese culture of Hong Kong, within the framework of community-based participatory research (CBPR. Methods The CBPR framework was applied to help maximize the development of the intervention and the public health impact of the studies, while enhancing the capabilities of the social service sector partners. Results Four academic-community research teams explored the process of designing and implementing randomized controlled trials in the community. In addition to the expected cultural barriers between teams of academics and community practitioners, with their different outlooks, concerns and languages, the team navigated issues in utilizing the principles of CBPR unique to this Chinese culture. Eventually the team developed tools for adaptation, such as an emphasis on building the relationship while respecting role delineation and an iterative process of defining the non-negotiable parameters of research design while maintaining scientific rigor. Lessons learned include the risk of underemphasizing the size of the operational and skills shift between usual agency practices and research studies, the importance of minimizing non-negotiable parameters in implementing rigorous research designs in the community, and the need to view community capacity enhancement as a long term process. Conclusions The four pilot studies under the FAMILY Project demonstrated that nuanced design adaptations, such as wait list controls and shorter assessments, better served the needs of the community and led to the successful development and vigorous evaluation of a series of preventive, family-oriented interventions in the Chinese culture of Hong Kong.

  19. Constitutional and legal development and case law of the principle of subsidiary rigor (rigor subsidiario)

    Directory of Open Access Journals (Sweden)

    Germán Eduardo Cifuentes Sandoval

    2013-09-01

    Full Text Available In Colombia, state administration of the environment is entrusted to the National Environmental System (SINA). SINA is made up of state entities that coexist under a mixed scheme of centralization and decentralization. SINA's decentralization is expressed at the administrative and territorial levels, and the entities operating under this structure are expected to act in a coordinated way in order to achieve the objectives set out in the national environmental policy. To achieve coordinated environmental administration among the entities that make up SINA, Colombian environmental legislation includes three basic principles: 1. the principle of regional harmony ("armonía regional"); 2. the principle of normative gradation ("gradación normativa"); 3. the principle of subsidiary rigor ("rigor subsidiario"). These principles are set out in article 63 of Law 99 of 1993, and although for the first two it is possible to find equivalents in other norms of the Colombian legal system, this is not the case for subsidiary rigor, because its elements are unique to environmental law and do not appear to coincide with those of the principle of subsidiarity ("subsidiariedad") found in article 288 of the Political Constitution. Subsidiary rigor gives decentralized entities a special power to make the environmental legislation in force more demanding in order to defend the local ecological patrimony. It is an administrative power, grounded in the autonomy that comes with decentralization, that allows them to step into the regulatory role of the legislative power, on the condition that the new rules be more demanding than those issued at the central level.

  20. Bringing scientific rigor to community-developed programs in Hong Kong.

    Science.gov (United States)

    Fabrizio, Cecilia S; Hirschmann, Malia R; Lam, Tai Hing; Cheung, Teresa; Pang, Irene; Chan, Sophia; Stewart, Sunita M

    2012-12-31

    This paper describes efforts to generate evidence for community-developed programs to enhance family relationships in the Chinese culture of Hong Kong, within the framework of community-based participatory research (CBPR). The CBPR framework was applied to help maximize the development of the intervention and the public health impact of the studies, while enhancing the capabilities of the social service sector partners. Four academic-community research teams explored the process of designing and implementing randomized controlled trials in the community. In addition to the expected cultural barriers between teams of academics and community practitioners, with their different outlooks, concerns and languages, the team navigated issues in utilizing the principles of CBPR unique to this Chinese culture. Eventually the team developed tools for adaptation, such as an emphasis on building the relationship while respecting role delineation and an iterative process of defining the non-negotiable parameters of research design while maintaining scientific rigor. Lessons learned include the risk of underemphasizing the size of the operational and skills shift between usual agency practices and research studies, the importance of minimizing non-negotiable parameters in implementing rigorous research designs in the community, and the need to view community capacity enhancement as a long term process. The four pilot studies under the FAMILY Project demonstrated that nuanced design adaptations, such as wait list controls and shorter assessments, better served the needs of the community and led to the successful development and vigorous evaluation of a series of preventive, family-oriented interventions in the Chinese culture of Hong Kong.

  1. Robust source and mask optimization compensating for mask topography effects in computational lithography.

    Science.gov (United States)

    Li, Jia; Lam, Edmund Y

    2014-04-21

    Mask topography effects need to be taken into consideration for a more accurate solution of source mask optimization (SMO) in advanced optical lithography. However, rigorous 3D mask models generally involve intensive computation and conventional SMO fails to manipulate the mask-induced undesired phase errors that degrade the usable depth of focus (uDOF) and process yield. In this work, an optimization approach incorporating pupil wavefront aberrations into SMO procedure is developed as an alternative to maximize the uDOF. We first design the pupil wavefront function by adding primary and secondary spherical aberrations through the coefficients of the Zernike polynomials, and then apply the conjugate gradient method to achieve an optimal source-mask pair under the condition of aberrated pupil. We also use a statistical model to determine the Zernike coefficients for the phase control and adjustment. Rigorous simulations of thick masks show that this approach provides compensation for mask topography effects by improving the pattern fidelity and increasing uDOF.
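
    The pupil-wavefront construction described above adds primary and secondary spherical aberration through Zernike coefficients. The sketch below builds such a rotationally symmetric wavefront from the standard radial polynomials R_4^0 and R_6^0; the coefficient values are placeholders, not the ones obtained from the statistical model in the paper.

```python
import numpy as np

def pupil_wavefront(rho: np.ndarray, c_primary: float, c_secondary: float) -> np.ndarray:
    """Rotationally symmetric pupil wavefront built from spherical-aberration
    Zernike terms (radial polynomials R_4^0 and R_6^0); rho is the normalized
    pupil radius in [0, 1] and the coefficients are in waves."""
    z_primary = 6 * rho**4 - 6 * rho**2 + 1                     # R_4^0: primary spherical
    z_secondary = 20 * rho**6 - 30 * rho**4 + 12 * rho**2 - 1   # R_6^0: secondary spherical
    return c_primary * z_primary + c_secondary * z_secondary

# Evaluate an illustrative aberrated pupil phase (in radians) on a radial grid
rho = np.linspace(0.0, 1.0, 101)
phase = 2 * np.pi * pupil_wavefront(rho, c_primary=0.05, c_secondary=-0.02)
print(phase[-1])
```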

  2. An accelerated electrochemical MIC test for stainless alloys

    International Nuclear Information System (INIS)

    Gendron, T.S.; Cleland, R.D.

    1994-01-01

    Previous work in our laboratory and elsewhere has suggested that MIC of stainless steels and nickel-base alloys occurs in locally anaerobic regions that support the growth of sulfate reducing bacteria (SRB). The cathodic reaction is provided by oxygen reduction at remote sites. Such a coupling between anode and cathode is difficult to reproduce in the laboratory, but can be simulated indirectly using a double electrochemical cell, as in previous work. A more realistic simulation using a single aerated electrochemical cell has now been developed, in which a second organism (P. aeruginosa) is used to provide an anoxic habitat for SRB growth and possibly a source of organic carbon, within a layer of silt. A bare alloy electrode is used as the oxygen cathode. Tests of this kind using rigorous microbiological procedures have generated pitting corrosion of several alloys in low chloride media simulating freshwater heat exchanger conditions. Similar test procedures are applicable to other environments of interest to this symposium

  3. Controlled source electromagnetic data analysis with seismic constraints and rigorous uncertainty estimation in the Black Sea

    Science.gov (United States)

    Gehrmann, R. A. S.; Schwalenberg, K.; Hölz, S.; Zander, T.; Dettmer, J.; Bialas, J.

    2016-12-01

    In 2014 an interdisciplinary survey was conducted as part of the German SUGAR project in the Western Black Sea targeting gas hydrate occurrences in the Danube Delta. Marine controlled source electromagnetic (CSEM) data were acquired with an inline seafloor-towed array (BGR), and a two-polarization horizontal ocean-bottom source and receiver configuration (GEOMAR). The CSEM data are co-located with high-resolution 2-D and 3-D seismic reflection data (GEOMAR). We present results from 2-D regularized inversion (MARE2DEM by Kerry Key), which provides a smooth model of the electrical resistivity distribution beneath the source and multiple receivers. The 2-D approach includes seafloor topography and structural constraints from seismic data. We estimate uncertainties from the regularized inversion and compare them to 1-D Bayesian inversion results. The probabilistic inversion for a layered subsurface treats the parameter values and the number of layers as unknown by applying reversible-jump Markov-chain Monte Carlo sampling. A non-diagonal data covariance matrix obtained from residual error analysis accounts for correlated errors. The resulting resistivity models show generally high resistivity values between 3 and 10 Ωm on average which can be partly attributed to depleted pore water salinities due to sea-level low stands in the past, and locally up to 30 Ωm which is likely caused by gas hydrates. At the base of the gas hydrate stability zone resistivities rise up to more than 100 Ωm which could be due to gas hydrate as well as a layer of free gas underneath. However, the deeper parts also show the largest model parameter uncertainties. Archie's Law is used to derive estimates of the gas hydrate saturation, which vary between 30 and 80% within the anomalous layers considering salinity and porosity profiles from a distant DSDP bore hole.
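
    Archie's Law, as invoked above for the saturation estimate, can be rearranged to give the hydrate saturation from the inverted resistivity once porosity and pore-water resistivity are known. The sketch below uses a common form of that rearrangement; the Archie parameters and the example inputs are assumptions, whereas in the study they would follow from the salinity and porosity profiles of the distant DSDP borehole.

```python
def hydrate_saturation(rt_ohm_m: float, rw_ohm_m: float, porosity: float,
                       a: float = 1.0, m: float = 2.0, n: float = 2.0) -> float:
    """Gas hydrate saturation from Archie's law:
    R_t = a * R_w * phi**(-m) * S_w**(-n), with S_h = 1 - S_w."""
    s_w = (a * rw_ohm_m / (porosity**m * rt_ohm_m)) ** (1.0 / n)
    return max(0.0, 1.0 - s_w)

# Example: 30 ohm-m anomaly, 0.3 ohm-m pore water, 50% porosity (all assumed values)
print(hydrate_saturation(rt_ohm_m=30.0, rw_ohm_m=0.3, porosity=0.5))
```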

  4. General-Purpose Heat Source Safety Verification Test program: Edge-on flyer plate tests

    International Nuclear Information System (INIS)

    George, T.G.

    1987-03-01

    The radioisotope thermoelectric generator (RTG) that will supply power for the Galileo and Ulysses space missions contains 18 General-Purpose Heat Source (GPHS) modules. The GPHS modules provide power by transmitting the heat of ²³⁸Pu α-decay to an array of thermoelectric elements. Each module contains four ²³⁸PuO₂-fueled clads and generates 250 W(t). Because the possibility of a launch vehicle explosion always exists, and because such an explosion could generate a field of high-energy fragments, the fueled clads within each GPHS module must survive fragment impact. The edge-on flyer plate tests were included in the Safety Verification Test series to provide information on the module/clad response to the impact of high-energy plate fragments. The test results indicate that the edge-on impact of a 3.2-mm-thick, aluminum-alloy (2219-T87) plate traveling at 915 m/s causes the complete release of fuel from capsules contained within a bare GPHS module, and that the threshold velocity sufficient to cause the breach of a bare, simulant-fueled clad impacted by a 3.5-mm-thick, aluminum-alloy (5052-T0) plate is approximately 140 m/s

  5. General-Purpose Heat Source development: Safety Verification Test Program. Bullet/fragment test series

    Energy Technology Data Exchange (ETDEWEB)

    George, T.G.; Tate, R.E.; Axler, K.M.

    1985-05-01

    The radioisotope thermoelectric generator (RTG) that will provide power for space missions contains 18 General-Purpose Heat Source (GPHS) modules. Each module contains four ²³⁸PuO₂-fueled clads and generates 250 W(t). Because a launch-pad or post-launch explosion is always possible, we need to determine the ability of GPHS fueled clads within a module to survive fragment impact. The bullet/fragment test series, part of the Safety Verification Test Plan, was designed to provide information on clad response to impact by a compact, high-energy, aluminum-alloy fragment and to establish a threshold value of fragment energy required to breach the iridium cladding. Test results show that a velocity of 555 m/s (1820 ft/s) with an 18-g bullet is at or near the threshold value of fragment velocity that will cause a clad breach. Results also show that an exothermic Ir/Al reaction occurs if aluminum and hot iridium are in contact, a contact that is possible and most damaging to the clad within a narrow velocity range. The observed reactions between the iridium and the aluminum were studied in the laboratory and are reported in the Appendix.

  6. Safety Test Program Summary SNAP 19 Pioneer Heat Source Safety Program

    Energy Technology Data Exchange (ETDEWEB)

    None,

    1971-07-01

    Sixteen heat source assemblies have been tested in support of the SNAP 19 Pioneer Safety Test Program. Seven were subjected to simulated reentry heating in various plasma arc facilities followed by impact on earth or granite. Six assemblies were tested under abort accident conditions of overpressure, shrapnel impact, and solid and liquid propellant fires. Three capsules were hot impacted under Transit capsule impact conditions to verify comparability of test results between the two similar capsule designs, thus utilizing both Pioneer and Transit Safety Test results to support the Safety Analysis Report for Pioneer. The tests have shown that the fuel is contained under all nominal accident environments, with the exception of minor capsule cracks under severe impact and solid fire environments. No catastrophic capsule failures that would release large quantities of fuel occurred in these tests. In no test was fuel visible to the eye following impact or fire. Breached capsules were defined as those exhibiting thoria contamination on their surface following a test, or visible cracks in the post-test metallographic analyses.

  7. Detailed design of the RF source for the 1 MV neutral beam test facility

    International Nuclear Information System (INIS)

    Marcuzzi, D.; Palma, M. Dalla; Pavei, M.; Heinemann, B.; Kraus, W.; Riedl, R.

    2009-01-01

    In the framework of the EU activities for the development of the Neutral Beam Injector for ITER, the detailed design of the Radio Frequency (RF) driven negative ion source to be installed in the 1 MV ITER Neutral Beam Test Facility (NBTF) has been carried out. Results coming from ongoing R and D on IPP test beds [A. Staebler et al., Development of a RF-Driven Ion Source for the ITER NBI System, this conference] and the design of the new ELISE facility [B. Heinemann et al., Design of the Half-Size ITER Neutral Beam Source Test Facility ELISE, this conference] brought several modifications to the solution based on the previous design. An assessment was carried out regarding the Back-Streaming positive Ions (BSI+) that impinge on the back plates of the ion source and cause high and localized heat loads. This led to the redesign of most heated components to increase cooling, and to different choices for the plasma facing materials to reduce the effects of sputtering. The design of the electric circuit, gas supply and the other auxiliary systems has been optimized. Integration with other components of the beam source has been revised, with regards to the interfaces with the supporting structure, the plasma grid and the flexible connections. In the paper the design will be presented in detail, as well as the results of the analyses performed for the thermo-mechanical verification of the components.

  8. Long-term storage life of light source modules by temperature cycling accelerated life test

    International Nuclear Information System (INIS)

    Sun Ningning; Tan Manqing; Li Ping; Jiao Jian; Guo Xiaofeng; Guo Wentao

    2014-01-01

    Light source modules are the most crucial and fragile devices that affect the life and reliability of the interferometric fiber optic gyroscope (IFOG). While the light emitting chips were stable in most cases, the module packaging proved to be less satisfactory. In long-term storage or the working environment, the ambient temperature changes constantly and thus the packaging and coupling performance of light source modules are more likely to degrade slowly due to different materials with different coefficients of thermal expansion in the bonding interface. A constant temperature accelerated life test cannot evaluate the impact of temperature variation on the performance of a module package, so the temperature cycling accelerated life test was studied. The main failure mechanism affecting light source modules is package failure due to solder fatigue failure including a fiber coupling shift, loss of cooling efficiency and thermal resistor degradation, so the Norris-Landzberg model was used to model solder fatigue life and determine the activation energy related to solder fatigue failure mechanism. By analyzing the test data, activation energy was determined and then the mean life of light source modules in different storage environments with a continuously changing temperature was simulated, which has provided direct reference data for the storage life prediction of IFOG. (semiconductor devices)
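
    The Norris-Landzberg model mentioned above relates solder fatigue life under accelerated thermal cycling to life under use conditions through the cycle frequency, the temperature swing and the peak temperature. The sketch below implements the commonly quoted form with illustrative SnPb-era constants (m = 1/3, n = 1.9, Ea/k ≈ 1414 K); the paper determines its own activation energy from the test data, so these numbers are placeholders.

```python
import math

def norris_landzberg_af(f_use_per_day: float, f_test_per_day: float,
                        dT_use: float, dT_test: float,
                        Tmax_use_K: float, Tmax_test_K: float,
                        m: float = 1.0 / 3.0, n: float = 1.9,
                        Ea_over_k: float = 1414.0) -> float:
    """Acceleration factor AF = N_use / N_test for thermal-cycling solder fatigue:
    AF = (f_use/f_test)**m * (dT_test/dT_use)**n
         * exp(Ea/k * (1/Tmax_use - 1/Tmax_test))."""
    return ((f_use_per_day / f_test_per_day) ** m
            * (dT_test / dT_use) ** n
            * math.exp(Ea_over_k * (1.0 / Tmax_use_K - 1.0 / Tmax_test_K)))

# One storage cycle per day vs. 24 accelerated cycles per day (illustrative numbers)
af = norris_landzberg_af(f_use_per_day=1, f_test_per_day=24,
                         dT_use=30.0, dT_test=100.0,
                         Tmax_use_K=313.0, Tmax_test_K=398.0)
print(f"estimated mean life ~ AF x test life, AF ~ {af:.1f}")
```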

  9. The influence of low temperature, type of muscle and electrical stimulation on the course of rigor mortis, ageing and tenderness of beef muscles.

    Science.gov (United States)

    Olsson, U; Hertzman, C; Tornberg, E

    1994-01-01

    The course of rigor mortis, ageing and tenderness have been evaluated for two beef muscles, M. semimembranosus (SM) and M. longissimus dorsi (LD), when entering rigor at constant temperatures in the cold-shortening region (1, 4, 7 and 10°C). The influence of electrical stimulation (ES) was also examined. Post-mortem changes were registered by shortening and isometric tension and by following the decline of pH, ATP and creatine phosphate. The effect of ageing on tenderness was recorded by measuring shear-force (2, 8 and 15 days post mortem) and the sensory properties were assessed 15 days post mortem. It was found that shortening increased with decreasing temperature, resulting in decreased tenderness. Tenderness for LD, but not for SM, was improved by ES at 1 and 4°C, whereas ES did not give rise to any decrease in the degree of shortening during rigor mortis development. This suggests that ES influences tenderization more than it prevents cold-shortening. The samples with a pre-rigor mortis temperature of 1°C could not be tenderized, when stored up to 15 days, whereas this was the case for the muscles entering rigor mortis at the other higher temperatures. The results show that under the conditions used in this study, the course of rigor mortis is more important for the ultimate tenderness than the course of ageing. Copyright © 1994. Published by Elsevier Ltd.

  10. Deuterium results at the negative ion source test facility ELISE

    Science.gov (United States)

    Kraus, W.; Wünderlich, D.; Fantz, U.; Heinemann, B.; Bonomo, F.; Riedl, R.

    2018-05-01

    The ITER neutral beam system will be equipped with large radio frequency (RF) driven negative ion sources, with a cross section of 0.9 m × 1.9 m, which have to deliver extracted D- ion beams of 57 A at 1 MeV for 1 h. At the Extraction from a Large Ion Source Experiment (ELISE) test facility, a source of half this size has been operational since 2013. The goal of this experiment is to demonstrate high operational reliability and to achieve the extracted current densities and beam properties required for ITER. Technical improvements of the source design and the RF system were necessary to provide reliable operation in steady state with an RF power of up to 300 kW. While in short pulses the required D- current density has almost been reached, the performance in long pulses is limited, particularly in deuterium, by inhomogeneous and unstable currents of co-extracted electrons. By applying refined caesium evaporation and distribution procedures, and by reducing and symmetrizing the electron currents, considerable progress has been made and up to 190 A/m2 D-, corresponding to 66% of the value required for ITER, has been extracted for 45 min.

  11. Performance Test of the Microwave Ion Source with the Multi-layer DC Break

    International Nuclear Information System (INIS)

    Kim, Dae Il; Kwon, Hyeok Jung; Kim, Han Sung; Seol, Kyung Tae; Cho, Yong Sub

    2012-01-01

    A microwave proton source has been developed as a proton injector for the 100-MeV proton linac of the PEFP (Proton Engineering Frontier Project). In a microwave ion source, the high voltage for beam extraction is applied to the plasma chamber as well as to the microwave components, such as the 2.45 GHz magnetron, the 3-stub tuner and the waveguides. If the microwave components can instead be installed on the ground side, the ion source can be operated and maintained more easily. For this purpose, the multi-layer DC break has been developed. The multi-layer insulation consists of an arrangement of conductors and insulators, as shown in Fig. 1. To verify stable operation of the multi-layer DC break, we checked the radiation of the insulator for different materials and performed a high-voltage test of a fabricated multi-layer insulation. In this report, the details of the performance test of the multi-layer DC break will be presented

  12. Rigor index, fillet yield and proximate composition of cultured striped catfish (Pangasianodon hypophthalmus for its suitability in processing industries in Bangladesh

    Directory of Open Access Journals (Sweden)

    Salma Noor-E Islami

    2014-12-01

    Full Text Available The rigor index of market-size striped catfish (Pangasianodon hypophthalmus), locally called Thai-Pangas, was determined to assess fillet yield for the production of value-added products. In whole fish, rigor started within 1 hr after death under both iced and room temperature conditions, while the rigor index reached a maximum of 72.23% within 8 hr at room temperature and 85.5% within 5 hr in iced condition, and rigor was fully resolved after 22 hr under both storage conditions. Post-mortem muscle pH decreased to 6.8 after 2 hr and 6.2 after 8 hr, and then increased sharply to 6.9 after 9 hr. There was a positive correlation between rigor progress and the pH shift in fish fillets. Hand filleting was done post-rigor, and the fillet yield experiment showed that 50.4±2.1% fillet, 8.0±0.2% viscera, 8.0±1.3% skin and 32.0±3.2% carcass could be obtained from Thai-Pangas. Proximate composition analysis of four regions of Thai-Pangas, viz. head region, middle region, tail region and viscera, revealed moisture 78.36%, 81.14%, 81.45% and 57.33%; protein 15.83%, 15.97%, 16.14% and 17.20%; lipid 4.61%, 1.82%, 1.32% and 24.31%; and ash 1.09%, 0.96%, 0.95% and 0.86%, respectively, indicating the suitability of Thai-Pangas for the production of value-added products such as fish fillets.
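
    The rigor index reported above is conventionally obtained by the tail-drop (Cutting's) method, in which the vertical droop of the tail is tracked over time. The calculation is a one-liner, sketched below with invented droop values; the formula is the standard one, not a value or procedure taken from this paper.

```python
def rigor_index(droop_initial_cm: float, droop_at_time_cm: float) -> float:
    """Rigor index (%) by the tail-drop (Cutting's) method:
    RI = 100 * (L0 - Lt) / L0, where L is the vertical droop of the tail
    when half the body is held over a table edge."""
    return 100.0 * (droop_initial_cm - droop_at_time_cm) / droop_initial_cm

# Illustrative values: droop falls from 8.0 cm pre-rigor to 1.2 cm in full rigor
print(f"{rigor_index(8.0, 1.2):.1f} %")   # -> 85.0 %
```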

  13. Influence of transport-induced stress and slaughter method on the rigor mortis of tambaqui (Colossoma macropomum)

    Directory of Open Access Journals (Sweden)

    Joana Maia Mendes

    2015-06-01

    Full Text Available Abstract: The present work evaluated the influence of pre-slaughter stress and of the slaughter method on the rigor mortis of tambaqui during storage in ice. The physiological responses of tambaqui to stress were studied during the pre-slaughter period, which was divided into four stages: harvest, transport, recovery for 24 h and recovery for 48 h. At the end of each stage, fish were sampled to characterize pre-slaughter stress through analysis of the plasma parameters glucose, lactate and ammonia, and were then slaughtered by hypothermia or by asphyxia with carbon dioxide for the rigor mortis study. The physiological stress state of the fish was most acute immediately after transport, resulting in a faster onset of rigor mortis: 60 minutes for tambaqui slaughtered by hypothermia and 120 minutes for tambaqui slaughtered by carbon dioxide asphyxia. In the ponds, fish slaughtered immediately after harvest showed an intermediate stress state, with no difference in the time of onset of rigor mortis between slaughter methods (135 minutes). Fish that were allowed to recover from transport stress under simulated industrial conditions showed a later onset of rigor mortis: 225 minutes (after 24 h of recovery) and 255 minutes (after 48 h of recovery), likewise with no difference between the slaughter methods tested. The resolution of rigor mortis was fastest in fish slaughtered after transport, at 12 days. In fish slaughtered immediately after harvest, resolution occurred at 16 days and, in fish slaughtered after recovery, at 20 days for 24 h of recovery from pre-slaughter stress and 24 days for 48 h of recovery, with no influence of the slaughter method on the resolution of rigor mortis. It is therefore desirable that tambaqui destined for industrial processing be slaughtered after a period of recovery from stress, with a view to increasing its

  14. Lifetime test on a high-performance dc microwave proton source

    International Nuclear Information System (INIS)

    Sherman, J.D.; Hodgkins, D.J.; Lara, P.D.; Schneider, J.D.; Stevens, R.R. Jr.

    1995-01-01

    Powerful CW proton linear accelerators (100 mA at 0.5-1 GeV) are being proposed for spallation neutron source applications. These production accelerators require high availability and reliability. A microwave proton source, which has already demonstrated several key beam requirements, was operated for one week (170 hours) in a dc mode to test the reliability and lifetime of its plasma generator. The source was operated with 570 W of microwave (2.45 GHz) discharge power and with a 47-kV extraction voltage. This choice of operating parameters gave a proton current density of 250 mA/cm² at 83% proton fraction, which is sufficient for a conservative dc injector design. The beam current was 60-65 mA over most of the week, and was sufficiently focused for RFQ injection. Total beam availability, defined as 47-keV beam-on time divided by elapsed time, was 96.2%. Spark-downs in the high-voltage column and a gas flow control problem caused all the downtime; no plasma generator failures were observed

  15. Rigorous Performance Evaluation of Smartphone GNSS/IMU Sensors for ITS Applications

    Directory of Open Access Journals (Sweden)

    Vassilis Gikas

    2016-08-01

    Full Text Available With the rapid growth in smartphone technologies and improvement in their navigation sensors, an increasing amount of location information is now available, opening the road to the provision of new Intelligent Transportation System (ITS) services. Current smartphone devices embody miniaturized Global Navigation Satellite System (GNSS), Inertial Measurement Unit (IMU) and other sensors capable of providing user position, velocity and attitude. However, it is hard to characterize their actual positioning and navigation performance capabilities due to the disparate sensor and software technologies adopted among manufacturers and the high influence of environmental conditions, and therefore, a unified certification process is missing. This paper presents the analysis results obtained from the assessment of two modern smartphones regarding their positioning accuracy (i.e., precision and trueness) capabilities (i.e., potential and limitations) based on a practical but rigorous methodological approach. Our investigation relies on the results of several vehicle tracking (i.e., cruising and maneuvering) tests realized through comparing smartphone obtained trajectories and kinematic parameters to those derived using a high-end GNSS/IMU system and advanced filtering techniques. Performance testing is undertaken for the HTC One S (Android) and iPhone 5s (iOS). Our findings indicate that the deviation of the smartphone locations from ground truth (trueness) deteriorates by a factor of two in obscured environments compared to those derived in open sky conditions. Moreover, it appears that iPhone 5s produces relatively smaller and less dispersed error values compared to those computed for HTC One S. Also, the navigation solution of the HTC One S appears to adapt faster to changes in environmental conditions, suggesting a somewhat different data filtering approach for the iPhone 5s. Testing the accuracy of the accelerometer and gyroscope sensors for a number of
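
    The trueness/precision split used above can be made explicit by comparing time-synchronized smartphone positions with the reference trajectory: the mean offset measures trueness and the dispersion about that mean measures precision. The sketch below shows that decomposition for horizontal error samples; the numbers are placeholders, not results from the HTC One S or iPhone 5s tests.

```python
import numpy as np

def trueness_and_precision(errors_m: np.ndarray) -> tuple[float, float]:
    """Split horizontal position errors (smartphone minus reference, in metres)
    into trueness (magnitude of the mean offset) and precision (spread about it)."""
    mean_offset = errors_m.mean(axis=0)            # systematic part (trueness)
    spread = errors_m.std(axis=0, ddof=1)          # random part (precision)
    return float(np.linalg.norm(mean_offset)), float(np.linalg.norm(spread))

# Placeholder east/north error samples for one test run
errors = np.array([[1.2, -0.4], [1.5, -0.1], [0.9, -0.6], [1.4, -0.3]])
trueness, precision = trueness_and_precision(errors)
print(f"trueness ~ {trueness:.2f} m, precision ~ {precision:.2f} m")
```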

  16. Manufacturing, assembly and tests of SPIDER Vacuum Vessel to develop and test a prototype of ITER neutral beam ion source

    Energy Technology Data Exchange (ETDEWEB)

    Zaccaria, Pierluigi, E-mail: pierluigi.zaccaria@igi.cnr.it [Consorzio RFX (CNR, ENEA, INFN, Università di Padova, Acciaierie Venete S.p.A.), Padova (Italy); Valente, Matteo; Rigato, Wladi; Dal Bello, Samuele; Marcuzzi, Diego; Agostini, Fabio Degli; Rossetto, Federico; Tollin, Marco [Consorzio RFX (CNR, ENEA, INFN, Università di Padova, Acciaierie Venete S.p.A.), Padova (Italy); Masiello, Antonio [Fusion for Energy F4E, Barcelona (Spain); Corniani, Giorgio; Badalocchi, Matteo; Bettero, Riccardo; Rizzetto, Dario [Ettore Zanon S.p.A., Schio (VI) (Italy)

    2015-10-15

    Highlights: • The SPIDER experiment aims to qualify and optimize the ion source for ITER injectors. • The large SPIDER Vacuum Vessel was built and it is under testing at the supplier. • The main working and assembly steps for production are presented in the paper. - Abstract: The SPIDER experiment (Source for the Production of Ions of Deuterium Extracted from an RF plasma) aims to qualify and optimize the full size prototype of the negative ion source foreseen for MITICA (full size ITER injector prototype) and the ITER Heating and Current Drive Injectors. Both SPIDER and MITICA experiments are presently under construction at Consorzio RFX in Padova (I), with the financial support from IO (ITER Organization), Fusion for Energy, Italian research institutions and contributions from Japan and India Domestic Agencies. The vacuum vessel hosting the SPIDER in-vessel components (Beam Source and calorimeters) has been manufactured, assembled and tested during the last two years 2013–2014. The cylindrical vessel, about 6 m long and 4 m in diameter, is composed of two cylindrical modules and two torispherical lids at the ends. All the parts are made by AISI 304 L stainless steel. The possibility of opening/closing the vessel for monitoring, maintenance or modifications of internal components is guaranteed by bolted junctions and suitable movable support structures running on rails fixed to the building floor. A large number of ports, about one hundred, are present on the vessel walls for diagnostic and service purposes. The main working steps for construction and specific technological issues encountered and solved for production are presented in the paper. Assembly sequences and tests on site are furthermore described in detail, highlighting all the criteria and requirements for correct positioning and testing of performances.

  17. Manufacturing, assembly and tests of SPIDER Vacuum Vessel to develop and test a prototype of ITER neutral beam ion source

    International Nuclear Information System (INIS)

    Zaccaria, Pierluigi; Valente, Matteo; Rigato, Wladi; Dal Bello, Samuele; Marcuzzi, Diego; Agostini, Fabio Degli; Rossetto, Federico; Tollin, Marco; Masiello, Antonio; Corniani, Giorgio; Badalocchi, Matteo; Bettero, Riccardo; Rizzetto, Dario

    2015-01-01

    Highlights: • The SPIDER experiment aims to qualify and optimize the ion source for ITER injectors. • The large SPIDER Vacuum Vessel was built and it is under testing at the supplier. • The main working and assembly steps for production are presented in the paper. - Abstract: The SPIDER experiment (Source for the Production of Ions of Deuterium Extracted from an RF plasma) aims to qualify and optimize the full size prototype of the negative ion source foreseen for MITICA (full size ITER injector prototype) and the ITER Heating and Current Drive Injectors. Both SPIDER and MITICA experiments are presently under construction at Consorzio RFX in Padova (I), with the financial support from IO (ITER Organization), Fusion for Energy, Italian research institutions and contributions from Japan and India Domestic Agencies. The vacuum vessel hosting the SPIDER in-vessel components (Beam Source and calorimeters) has been manufactured, assembled and tested during the last two years 2013–2014. The cylindrical vessel, about 6 m long and 4 m in diameter, is composed of two cylindrical modules and two torispherical lids at the ends. All the parts are made by AISI 304 L stainless steel. The possibility of opening/closing the vessel for monitoring, maintenance or modifications of internal components is guaranteed by bolted junctions and suitable movable support structures running on rails fixed to the building floor. A large number of ports, about one hundred, are present on the vessel walls for diagnostic and service purposes. The main working steps for construction and specific technological issues encountered and solved for production are presented in the paper. Assembly sequences and tests on site are furthermore described in detail, highlighting all the criteria and requirements for correct positioning and testing of performances.

  18. An Analysis of COSPA – A Consortium for Open Source in the Public Administration

    OpenAIRE

    Morgan, Lorraine

    2005-01-01

    peer-reviewed This paper reflects on a two-year EU funded specific research targeted project that officially began in January 2004 entitled COSPA, a Consortium for studying, evaluating and supporting the introduction of Open Source Software and Open Data Standards in the Public Administration. COSPA focuses on office automation and desktop system software and aims at rigorously measuring the effort, costs and benefits of a transition to Open Source. The project invo...

  19. Managing the Testing Process Practical Tools and Techniques for Managing Hardware and Software Testing

    CERN Document Server

    Black, Rex

    2011-01-01

    New edition of one of the most influential books on managing software and hardware testing. In this new edition of his top-selling book, Rex Black walks you through the steps necessary to manage rigorous testing programs of hardware and software. The preeminent expert in his field, Mr. Black draws upon years of experience as president of both the International and American Software Testing Qualifications boards to offer this extensive resource of all the standards, methods, and tools you'll need. The book covers core testing concepts and thoroughly examines the best test management practices.

  20. A Rigorous Treatment of Energy Extraction from a Rotating Black Hole

    Science.gov (United States)

    Finster, F.; Kamran, N.; Smoller, J.; Yau, S.-T.

    2009-05-01

    The Cauchy problem is considered for the scalar wave equation in the Kerr geometry. We prove that by choosing a suitable wave packet as initial data, one can extract energy from the black hole, thereby putting superradiance, the wave analogue of the Penrose process, into a rigorous mathematical framework. We quantify the maximal energy gain. We also compute the infinitesimal change of mass and angular momentum of the black hole, in agreement with Christodoulou’s result for the Penrose process. The main mathematical tool is our previously derived integral representation of the wave propagator.
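
    For context, the energy extraction quantified in the paper is tied to the standard superradiance condition for a mode of frequency ω and azimuthal number m, and the Penrose-process bound is usually expressed through Christodoulou's irreducible mass. The formulas below (in geometric units, with Ω_H the horizon angular velocity) are the textbook statements, not expressions copied from the article.

```latex
% Superradiance condition for a scalar mode of frequency \omega and azimuthal number m
0 < \omega < m\,\Omega_H , \qquad
\Omega_H = \frac{a}{r_+^{2} + a^{2}} , \qquad r_+ = M + \sqrt{M^{2}-a^{2}} .

% Christodoulou: the irreducible mass cannot decrease, bounding the extractable energy
M_{\mathrm{irr}}^{2} = \tfrac{1}{2}\left(M^{2} + \sqrt{M^{4}-J^{2}}\right) ,
\qquad \Delta M_{\mathrm{max}} = M - M_{\mathrm{irr}} .
```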

  1. From the Kirsch-Kress potential method via the range test to the singular sources method

    International Nuclear Information System (INIS)

    Potthast, R; Schulz, J

    2005-01-01

    We review three reconstruction methods for inverse obstacle scattering problems. We will analyse the relation between the Kirsch-Kress potential method (1986), the range test of Kusiak, Potthast and Sylvester (2003) and the singular sources method of Potthast (2000). In particular, we show that the range test is a logical extension of the Kirsch-Kress method into the category of sampling methods employing the tool of domain sampling. Then we will show how a multi-wave version of the range test can be set up and we will work out its relation to the singular sources method. Numerical examples and demonstrations will be provided

  2. AGN outflows as neutrino sources: an observational test

    Science.gov (United States)

    Padovani, P.; Turcati, A.; Resconi, E.

    2018-04-01

    We test the recently proposed idea that outflows associated with Active Galactic Nuclei (AGN) could be neutrino emitters in two complementary ways. First, we cross-correlate a list of 94 "bona fide" AGN outflows with the most complete and updated repository of IceCube neutrinos currently publicly available, assembled by us for this purpose. It turns out that AGN with outflows matched to an IceCube neutrino have outflow and kinetic energy rates, and bolometric powers larger than those of AGN with outflows not matched to neutrinos. Second, we carry out a statistical analysis on a catalogue of [O III] λ5007 line profiles using a sample of 23,264 AGN at z values (˜6 and 18 per cent respectively, pre-trial) for relatively high velocities and luminosities. Our results are consistent with a scenario where AGN outflows are neutrino emitters but at present do not provide a significant signal. This can be tested with better statistics and source stacking. A predominant role of AGN outflows in explaining the IceCube data appears in any case to be ruled out.

  3. Dependence of the source performance on plasma parameters at the BATMAN test facility

    Science.gov (United States)

    Wimmer, C.; Fantz, U.

    2015-04-01

    The investigation of the dependence of the source performance (high jH-, low je) for optimum Cs conditions on the plasma parameters at the BATMAN (Bavarian Test MAchine for Negative hydrogen ions) test facility is desirable in order to find key parameters for the operation of the source as well as to deepen the physical understanding. The most relevant source physics takes place in the extended boundary layer, which is the plasma layer with a thickness of several cm in front of the plasma grid: the production of H-, its transport through the plasma and its extraction, inevitably accompanied by the co-extraction of electrons. Hence, a link of the source performance with the plasma parameters in the extended boundary layer is expected. In order to characterize electron and negative hydrogen ion fluxes in the extended boundary layer, Cavity Ring-Down Spectroscopy and Langmuir probes have been applied for the measurement of the H- density and the determination of the plasma density, the plasma potential and the electron temperature, respectively. The plasma potential is of particular importance as it determines the sheath potential profile at the plasma grid: depending on the plasma grid bias relative to the plasma potential, a transition in the plasma sheath from an electron repelling to an electron attracting sheath takes place, influencing strongly the electron fraction of the bias current and thus the amount of co-extracted electrons. Dependencies of the source performance on the determined plasma parameters are presented for the comparison of two source pressures (0.6 Pa, 0.45 Pa) in hydrogen operation. The higher source pressure of 0.6 Pa is a standard point of operation at BATMAN with external magnets, whereas the lower pressure of 0.45 Pa is closer to the ITER requirements (p ≤ 0.3 Pa).

  4. Dependence of the source performance on plasma parameters at the BATMAN test facility

    International Nuclear Information System (INIS)

    Wimmer, C.; Fantz, U.

    2015-01-01

    The investigation of the dependence of the source performance (high j H − , low j e ) for optimum Cs conditions on the plasma parameters at the BATMAN (Bavarian Test MAchine for Negative hydrogen ions) test facility is desirable in order to find key parameters for the operation of the source as well as to deepen the physical understanding. The most relevant source physics takes place in the extended boundary layer, which is the plasma layer with a thickness of several cm in front of the plasma grid: the production of H − , its transport through the plasma and its extraction, inevitably accompanied by the co-extraction of electrons. Hence, a link of the source performance with the plasma parameters in the extended boundary layer is expected. In order to characterize electron and negative hydrogen ion fluxes in the extended boundary layer, Cavity Ring-Down Spectroscopy and Langmuir probes have been applied for the measurement of the H − density and the determination of the plasma density, the plasma potential and the electron temperature, respectively. The plasma potential is of particular importance as it determines the sheath potential profile at the plasma grid: depending on the plasma grid bias relative to the plasma potential, a transition in the plasma sheath from an electron repelling to an electron attracting sheath takes place, influencing strongly the electron fraction of the bias current and thus the amount of co-extracted electrons. Dependencies of the source performance on the determined plasma parameters are presented for the comparison of two source pressures (0.6 Pa, 0.45 Pa) in hydrogen operation. The higher source pressure of 0.6 Pa is a standard point of operation at BATMAN with external magnets, whereas the lower pressure of 0.45 Pa is closer to the ITER requirements (p ≤ 0.3 Pa)

  5. Procedure to carry out leakage test in beta radiation sealed sources emitters of 90Sr/90Y

    International Nuclear Information System (INIS)

    Alvarez R, J. T.

    2010-09-01

    In the alpha-beta room of the Secondary Laboratory of Dosimetric Calibration of the Metrology Department of Ionizing Radiations, ophthalmic applicators are calibrated in terms of absorbed dose to water, D w; these applicators are basically sealed sources emitting pure beta radiation from 90 Sr / 90 Y. The laboratory quality system requires the use of the established calibration procedure for these sources, which includes the requirement to carry out a leakage test before calibrating the source. However, the Laboratory has received leakage test certificates, issued by companies specialized in radiological protection services, in which gamma spectrometry equipment was used for the beta radiation leakage tests; detecting pure beta radiation with a NaI scintillation detector is not reliable, because the detector may only register the bremsstrahlung produced in it. The Laboratory therefore had to verify the results of the tests with a correct technique, in order to detect sources with compromised integrity and radioactive material leakage. The objective of this work is to describe a technique for beta activity measurement - from the standard ISO 7503, part 1 (1988) - and its application with a planar GM detector (pancake type) to leakage tests of pure beta radiation emitting sources, within the quality assurance framework indicated by report ICRU 76. (Author)

  6. Prototype tests on the ion source power supplies of the TEXTOR NI-system

    International Nuclear Information System (INIS)

    Goll, O.; Braunsberger, U.; Schwarz, U.

    1987-01-01

    The PINI ion source for the TEXTOR neutral injector is fed by a new modular transistorized power supply. All modules are located in a high voltage cage on 55 kV dc against ground. The normal operation of the injectors includes frequent grid breakdowns causing transient high voltage stresses on the ion source power supplies. These stresses must not disturb the safe operation of the power supplies. The paper describes the set up for extensive testing of a supply prototype module under the expected operating conditions. The main features of this test program are reviewed and the measures taken for a safe operation are discussed. As a result of the investigations, recommendations for the installation of the power supplies at the TEXTOR NI system are given

  7. Seizing the Future: How Ohio's Career-Technical Education Programs Fuse Academic Rigor and Real-World Experiences to Prepare Students for College and Careers

    Science.gov (United States)

    Guarino, Heidi; Yoder, Shaun

    2015-01-01

    "Seizing the Future: How Ohio's Career and Technical Education Programs Fuse Academic Rigor and Real-World Experiences to Prepare Students for College and Work," demonstrates Ohio's progress in developing strong policies for career and technical education (CTE) programs to promote rigor, including college- and career-ready graduation…

  8. Chemical Explosion Experiments to Improve Nuclear Test Monitoring - Developing a New Paradigm for Nuclear Test Monitoring with the Source Physics Experiments (SPE)

    International Nuclear Information System (INIS)

    Snelson, Catherine M.; Abbott, Robert E.; Broome, Scott T.; Mellors, Robert J.; Patton, Howard J.; Sussman, Aviva J.; Townsend, Margaret J.; Walter, William R.

    2013-01-01

    A series of chemical explosions, called the Source Physics Experiments (SPE), is being conducted under the auspices of the U.S. Department of Energy's National Nuclear Security Administration (NNSA) to develop a new, more physics-based paradigm for nuclear test monitoring. Currently, monitoring relies on semi-empirical models to discriminate explosions from earthquakes and to estimate key parameters such as yield. While these models have been highly successful in monitoring established test sites, there is concern that future tests could occur in media and at scaled depths of burial outside of our empirical experience. This is highlighted by the North Korean tests, which exhibit poor performance of a reliable discriminant, mb:Ms (Selby et al., 2012), possibly due to source emplacement and differences in seismic responses between nascent and established test sites. The goal of SPE is to replace these semi-empirical relationships with numerical techniques grounded in a physical basis and thus applicable to any geologic setting or depth.

  9. Effect of muscle restraint on sheep meat tenderness with rigor mortis at 18°C.

    Science.gov (United States)

    Devine, Carrick E; Payne, Steven R; Wells, Robyn W

    2002-02-01

    The effect on shear force of skeletal restraint, and of removing muscles from lamb m. longissimus thoracis et lumborum (LT) immediately after slaughter and electrical stimulation, was investigated at a rigor temperature of 18°C (n=15). The temperature of 18°C was achieved by chilling electrically stimulated sheep carcasses in air at 12°C, with an air flow of 1-1.5 m s(-1). In other groups, the muscle was removed at 2.5 h post-mortem and either wrapped or left non-wrapped before being placed back on the carcass to follow carcass cooling regimes. Following rigor mortis, the meat was aged for 0, 16, 40 and 65 h at 15°C and frozen. For the non-stimulated samples, the meat was aged for 0, 12, 36 and 60 h before being frozen. The frozen meat was cooked to 75°C in an 85°C water bath and shear force values were obtained from a 1 × 1 cm cross-section. Commencement of ageing was considered to take place at rigor mortis and this was taken as zero-aged meat. There were no significant differences in the rate of tenderisation and initial shear force for all treatments. The 23% cook loss was similar for all wrapped and non-wrapped situations and the values decreased slightly with longer ageing durations. Wrapping was shown to mimic meat left intact on the carcass, as it prevented significant pre-rigor shortening. Such techniques allow muscles to be removed and placed in a controlled temperature environment to enable precise studies of ageing processes.

  10. ASTM Committee C28: International Standards for Properties and Performance of Advanced Ceramics-Three Decades of High-Quality, Technically-Rigorous Normalization

    Science.gov (United States)

    Jenkins, Michael G.; Salem, Jonathan A.

    2016-01-01

    Physical and mechanical properties and performance of advanced ceramics and glasses are difficult to measure correctly without the proper techniques. For over three decades, ASTM Committee C28 on Advanced Ceramics has developed high-quality, technically-rigorous, full-consensus standards (e.g., test methods, practices, guides, terminology) to measure properties and performance of monolithic and composite ceramics that may be applied to glasses in some cases. These standards contain testing particulars for many mechanical, physical and thermal properties and performance characteristics of these materials. As a result, these standards are used to generate accurate, reliable, repeatable and complete data. Within Committee C28, users, producers, researchers, designers, academicians, etc. have written, continually updated, and validated through round-robin test programs, 50 standards since the Committee's founding in 1986. This paper provides a detailed retrospective of the 30 years of ASTM Committee C28, including a graphical pictogram listing of C28 standards along with examples of the tangible benefits of standards for advanced ceramics to demonstrate their practical applications.

  11. ASTM Committee C28: International Standards for Properties and Performance of Advanced Ceramics, Three Decades of High-quality, Technically-rigorous Normalization

    Science.gov (United States)

    Jenkins, Michael G.; Salem, Jonathan A.

    2016-01-01

    Physical and mechanical properties and performance of advanced ceramics and glasses are difficult to measure correctly without the proper techniques. For over three decades, ASTM Committee C28 on Advanced Ceramics has developed high-quality, technically-rigorous, full-consensus standards (e.g., test methods, practices, guides, terminology) to measure properties and performance of monolithic and composite ceramics that may be applied to glasses in some cases. These standards contain testing particulars for many mechanical, physical and thermal properties and performance characteristics of these materials. As a result, these standards provide accurate, reliable, repeatable and complete data. Within Committee C28, users, producers, researchers, designers, academicians, etc. have written, continually updated, and validated through round-robin test programs, nearly 50 standards since the Committee's founding in 1986. This paper provides a retrospective review of the 30 years of ASTM Committee C28, including a graphical pictogram listing of C28 standards along with examples of the tangible benefits of advanced ceramics standards to demonstrate their practical applications.

  12. Rigorous time slicing approach to Feynman path integrals

    CERN Document Server

    Fujiwara, Daisuke

    2017-01-01

    This book proves that Feynman's original definition of the path integral actually converges to the fundamental solution of the Schrödinger equation at least in the short term if the potential is differentiable sufficiently many times and its derivatives of order equal to or higher than two are bounded. The semi-classical asymptotic formula up to the second term of the fundamental solution is also proved by a method different from that of Birkhoff. A bound of the remainder term is also proved. The Feynman path integral is a method of quantization using the Lagrangian function, whereas Schrödinger's quantization uses the Hamiltonian function. These two methods are believed to be equivalent. But equivalence is not fully proved mathematically, because, compared with Schrödinger's method, there is still much to be done concerning rigorous mathematical treatment of Feynman's method. Feynman himself defined a path integral as the limit of a sequence of integrals over finite-dimensional spaces which is obtained by...

  13. Test and evaluation of the Navy half-watt RTG

    International Nuclear Information System (INIS)

    Rosell, F.E. Jr.; Eggers, P.E.; Gawthrop, W.E.; Rouklove, P.G.; Truscello, V.C.

    1976-01-01

    In the autumn of 1975 the Navy took delivery of eight small-sized, plutonium-fueled radioisotope thermoelectric generators (RTGs) from four contractors (each contractor provided two RTGs). The purpose of these demonstration models is to prove conclusively that it is possible with state-of-the-art technology and materials to produce a super-battery with a 15-year life for use in the form of distributed power sources for remote undersea applications. It is easy to determine the RTG's beginning-of-life performance by actual measurements, but to forecast accurately its end-of-life performance requires rigorous determination of its reliability. This article discusses the test and evaluation program (TEP) used and the results obtained in determining that reliability. The TEP is divided into three general areas: mechanical-electrical, thermochemical-physical, and thermoelectric

  14. Development and tests of molybdenum armored copper components for MITICA ion source

    Science.gov (United States)

    Pavei, Mauro; Böswirth, Bernd; Greuner, Henri; Marcuzzi, Diego; Rizzolo, Andrea; Valente, Matteo

    2016-02-01

    In order to prevent detrimental material erosion of components impinged by back-streaming positive D or H ions in the megavolt ITER injector and concept advancement beam source, a solution based on explosion bonding technique has been identified for producing a 1 mm thick molybdenum armour layer on copper substrate, compatible with ITER requirements. Prototypes have been recently manufactured and tested in the high heat flux test facility Garching Large Divertor Sample Test Facility (GLADIS) to check the capability of the molybdenum-copper interface to withstand several thermal shock cycles at high power density. This paper presents both the numerical fluid-dynamic analyses of the prototypes simulating the test conditions in GLADIS as well as the experimental results.

  15. Development and tests of molybdenum armored copper components for MITICA ion source

    International Nuclear Information System (INIS)

    Pavei, Mauro; Marcuzzi, Diego; Rizzolo, Andrea; Valente, Matteo; Böswirth, Bernd; Greuner, Henri

    2016-01-01

    In order to prevent detrimental material erosion of components impinged by back-streaming positive D or H ions in the megavolt ITER injector and concept advancement beam source, a solution based on explosion bonding technique has been identified for producing a 1 mm thick molybdenum armour layer on copper substrate, compatible with ITER requirements. Prototypes have been recently manufactured and tested in the high heat flux test facility Garching Large Divertor Sample Test Facility (GLADIS) to check the capability of the molybdenum-copper interface to withstand several thermal shock cycles at high power density. This paper presents both the numerical fluid-dynamic analyses of the prototypes simulating the test conditions in GLADIS as well as the experimental results

  16. Development and tests of molybdenum armored copper components for MITICA ion source

    Energy Technology Data Exchange (ETDEWEB)

    Pavei, Mauro, E-mail: mauro.pavei@igi.cnr.it; Marcuzzi, Diego; Rizzolo, Andrea; Valente, Matteo [Consorzio RFX, Corso Stati Uniti, 4, I-35127 Padova (Italy); Böswirth, Bernd; Greuner, Henri [Max-Planck-Institut für Plasmaphysik, Boltzmannstrasse 2, D-85748 Garching (Germany)

    2016-02-15

    In order to prevent detrimental material erosion of components impinged by back-streaming positive D or H ions in the megavolt ITER injector and concept advancement beam source, a solution based on explosion bonding technique has been identified for producing a 1 mm thick molybdenum armour layer on copper substrate, compatible with ITER requirements. Prototypes have been recently manufactured and tested in the high heat flux test facility Garching Large Divertor Sample Test Facility (GLADIS) to check the capability of the molybdenum-copper interface to withstand several thermal shock cycles at high power density. This paper presents both the numerical fluid-dynamic analyses of the prototypes simulating the test conditions in GLADIS as well as the experimental results.

  17. Leakage test evaluation used for qualification of iodine-125 seeds sealing

    International Nuclear Information System (INIS)

    Feher, Anselmo; Rostelato, Maria E.C.M.; Zeituni, Carlos A.; Calvo, Wilson A.P.; Somessari, Samir L.; Moura, Joao A.; Moura, Eduardo S.; Souza, Carla D.; Rela, Paulo R.

    2009-01-01

    Prostate cancer is a public health problem in Brazil and the second cause of cancer deaths in men, exceeded only by lung cancer. Among the treatments available for prostate cancer is brachytherapy, in which small seeds containing the radioisotope Iodine-125 are implanted in the prostate. The seed consists of a sealed titanium tube measuring 0.8 mm external diameter and 4.5 mm in length, containing a central silver wire with adsorbed Iodine-125. The tube is sealed with titanium at the ends, using electric arc or laser welding. This sealing must be leakage-resistant and free of cracks, preventing the Iodine-125 adsorbed on the silver wire from escaping and spreading into the human body. To ensure this does not occur, rigorous leakage tests must be applied, in accordance with the standard Radiation Protection - Sealed Radioactive Sources - Leakage Test Methods - ISO 9978. The aim of this study is to determine, implement and evaluate the leakage test to be used in Iodine-125 seed production, in order to qualify the sealing procedure. The standard ISO 9978 presents a list of tests to be carried out according to the type of source; the preferred methods for brachytherapy sources are soaking and helium. To assess seed leakage, the method of immersion at room temperature was applied. The seeds are considered leakage-free if the detected activity does not exceed 185 Bq (5 nCi). An iodine standard was prepared and its value determined in a sodium iodide detector. A liquid scintillation counter was calibrated with the standard for the seed leakage tests. Forty-eight seeds were welded for these tests. (author)

  18. Biclustering via optimal re-ordering of data matrices in systems biology: rigorous methods and comparative studies

    Directory of Open Access Journals (Sweden)

    Feng Xiao-Jiang

    2008-10-01

    Abstract Background: The analysis of large-scale data sets via clustering techniques is utilized in a number of applications. Biclustering in particular has emerged as an important problem in the analysis of gene expression data since genes may only jointly respond over a subset of conditions. Biclustering algorithms also have important applications in sample classification where, for instance, tissue samples can be classified as cancerous or normal. Many of the methods for biclustering, and clustering algorithms in general, utilize simplified models or heuristic strategies for identifying the "best" grouping of elements according to some metric and cluster definition and thus result in suboptimal clusters. Results: In this article, we present a rigorous approach to biclustering, OREO, which is based on the Optimal RE-Ordering of the rows and columns of a data matrix so as to globally minimize the dissimilarity metric. The physical permutations of the rows and columns of the data matrix can be modeled as either a network flow problem or a traveling salesman problem. Cluster boundaries in one dimension are used to partition and re-order the other dimensions of the corresponding submatrices to generate biclusters. The performance of OREO is tested on (a) metabolite concentration data, (b) an image reconstruction matrix, (c) synthetic data with implanted biclusters, and gene expression data for (d) colon cancer, (e) breast cancer, and (f) yeast segregant data to validate the ability of the proposed method and compare it to existing biclustering and clustering methods. Conclusion: We demonstrate that this rigorous global optimization method for biclustering produces clusters with more insightful groupings of similar entities, such as genes or metabolites sharing common functions, than other clustering and biclustering algorithms and can reconstruct underlying fundamental patterns in the data for several distinct sets of data matrices arising
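
    The quantity OREO minimises is the total dissimilarity between adjacent rows (and columns) after re-ordering. The fragment below only illustrates that objective with a greedy nearest-neighbour pass; it is not OREO, which solves the re-ordering globally via network-flow or travelling-salesman formulations.

      # Greedy illustration of row re-ordering to reduce adjacent-row dissimilarity.
      import numpy as np

      def greedy_reorder(rows):
          rows = np.asarray(rows, dtype=float)
          unused = set(range(1, len(rows)))
          order = [0]                       # start arbitrarily from the first row
          while unused:
              last = rows[order[-1]]
              # pick the unused row closest (Euclidean) to the current last row
              nxt = min(unused, key=lambda i: np.linalg.norm(rows[i] - last))
              order.append(nxt)
              unused.remove(nxt)
          return order

      data = np.array([[0.0, 1.0], [5.0, 5.0], [0.1, 1.1], [5.2, 4.9]])
      print(greedy_reorder(data))           # similar rows end up next to each other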

  19. Modular Electric Propulsion Test Bed Aircraft, Phase I

    Data.gov (United States)

    National Aeronautics and Space Administration — An all electric aircraft test bed is proposed to provide a dedicated development environment for the rigorous study and advancement of electrically powered aircraft....

  20. Journal of Open Source Software (JOSS): design and first-year review

    Science.gov (United States)

    Smith, Arfon M.

    2018-01-01

    JOSS is a free and open-access journal that publishes articles describing research software across all disciplines. It has the dual goals of improving the quality of the software submitted and providing a mechanism for research software developers to receive credit. While designed to work within the current merit system of science, JOSS addresses the dearth of rewards for key contributions to science made in the form of software. JOSS publishes articles that encapsulate scholarship contained in the software itself, and its rigorous peer review targets the software components: functionality, documentation, tests, continuous integration, and the license. A JOSS article contains an abstract describing the purpose and functionality of the software, references, and a link to the software archive. JOSS published more than 100 articles in its first year, many from the scientific Python ecosystem (including a number of articles related to astronomy and astrophysics). JOSS is a sponsored project of the nonprofit organization NumFOCUS and is an affiliate of the Open Source Initiative. In this presentation, I'll describe the motivation, design, and progress of the Journal of Open Source Software (JOSS) and how it compares to other avenues for publishing research software in astronomy.

  1. Source-to-incident flux relation for a tokamak fusion test reactor blanket module

    International Nuclear Information System (INIS)

    Imel, G.R.

    1982-01-01

    The source-to-incident 14-MeV flux relation for a blanket module on the Tokamak Fusion Test Reactor is derived. It is shown that assumptions can be made that allow an analytical expression to be derived, using point kernel methods. In addition, the effect of a nonuniform source distribution is derived, again by relatively simple point kernel methods. It is thought that the methodology developed is valid for a variety of blanket modules on tokamak reactors
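
    A point-kernel estimate of the kind referred to above sums the uncollided contribution of each source element, weighted by the inverse square of its distance to the detector point. The sketch below is a crude numerical stand-in for the analytic derivation in the paper: the ring geometry, total source strength and detector position are invented for illustration only.

      # Illustrative point-kernel estimate of the uncollided 14-MeV flux at a
      # blanket-module location from a discretised plasma source.
      import numpy as np

      def uncollided_flux(source_points, source_strengths, detector):
          """Sum S_i / (4*pi*r_i^2) over discretised source points (n/cm^2/s)."""
          r = np.linalg.norm(source_points - detector, axis=1)
          return np.sum(source_strengths / (4.0 * np.pi * r**2))

      # toy ring source approximating a plasma of major radius 250 cm
      theta = np.linspace(0.0, 2.0 * np.pi, 360, endpoint=False)
      ring = np.c_[250.0 * np.cos(theta), 250.0 * np.sin(theta), np.zeros_like(theta)]
      S_total = 1.0e19                                  # assumed total source (n/s)
      strengths = np.full(len(ring), S_total / len(ring))
      detector = np.array([350.0, 0.0, 0.0])            # assumed module location (cm)
      print(f"14-MeV flux ~ {uncollided_flux(ring, strengths, detector):.3e} n/cm^2/s")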

  2. Rigorous modelling of light's intensity angular-profile in Abbe refractometers with absorbing homogeneous fluids

    International Nuclear Information System (INIS)

    García-Valenzuela, A; Contreras-Tello, H; Márquez-Islas, R; Sánchez-Pérez, C

    2013-01-01

    We derive an optical model for the light intensity distribution around the critical angle in a standard Abbe refractometer when used on absorbing homogenous fluids. The model is developed using rigorous electromagnetic optics. The obtained formula is very simple and can be used suitably in the analysis and design of optical sensors relying on Abbe type refractometry.

  3. A rigorous phenomenological analysis of the ππ scattering lengths

    International Nuclear Information System (INIS)

    Caprini, I.; Dita, P.; Sararu, M.

    1979-11-01

    The constraining power of the present experimental data, combined with the general theoretical knowledge about ππ scattering, upon the scattering lengths of this process, is investigated by means of a rigorous functional method. We take as input the experimental phase shifts and make no hypotheses about the high energy behaviour of the amplitudes, using only absolute bounds derived from axiomatic field theory and exact consequences of crossing symmetry. In the simplest application of the method, involving only the π 0 π 0 S-wave, we explored numerically a number of values proposed by various authors for the scattering lengths a 0 and a 2 and found that no one appears to be especially favoured. (author)

  4. A new H2+ source: Conceptual study and experimental test of an upgraded version of the VIS—Versatile ion source

    Science.gov (United States)

    Castro, G.; Torrisi, G.; Celona, L.; Mascali, D.; Neri, L.; Sorbello, G.; Leonardi, O.; Patti, G.; Castorina, G.; Gammino, S.

    2016-08-01

    The versatile ion source is an off-resonance microwave discharge ion source which produces a slightly overdense plasma at a pumping wave frequency of 2.45 GHz, extracting more than 60 mA proton beams and 50 mA He+ beams. The DAEδALUS and IsoDAR experiments require high-intensity H2+ beams to be accelerated by high-power cyclotrons for neutrino generation. In order to fulfill the new requirements, a new plasma chamber and injection system have been designed and manufactured to increase the H2+ beam intensity. In this paper the studies on increasing the H2+/p ratio and on the design of the new plasma chamber and injection system are shown and discussed, together with the experimental tests carried out at Istituto Nazionale di Fisica Nucleare-Laboratori Nazionali del Sud (INFN-LNS) and at the Best Cyclotron Systems test-bench in Vancouver, Canada.

  5. Modular Electric Propulsion Test Bed Aircraft, Phase II

    Data.gov (United States)

    National Aeronautics and Space Administration — A hybrid electric aircraft simulation system and test bed is proposed to provide a dedicated development environment for the rigorous study and advancement of hybrid...

  6. A Generalized Method for the Comparable and Rigorous Calculation of the Polytropic Efficiencies of Turbocompressors

    Science.gov (United States)

    Dimitrakopoulos, Panagiotis

    2018-03-01

    The calculation of polytropic efficiencies is a very important task, especially during the development of new compression units, like compressor impellers, stages and stage groups. Such calculations are also crucial for the determination of the performance of a whole compressor. As processors and computational capacities have improved substantially in recent years, the need has emerged for a new, rigorous, robust, accurate and at the same time standardized method for computing polytropic efficiencies, especially one based on the thermodynamics of real gases. The proposed method is based on the rigorous definition of the polytropic efficiency. The input consists of pressure and temperature values at the end points of the compression path (suction and discharge), for a given working fluid. The average relative error for the studied cases was 0.536 %. Thus, this high-accuracy method is proposed for efficiency calculations related to turbocompressors and their compression units, especially when they are operating at high power levels, for example in jet engines and high-power plants.
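
    To make the idea concrete, the sketch below estimates a polytropic efficiency from suction and discharge (p, T) states using a real-gas equation of state. It assumes the CoolProp package is available and uses one common small-step scheme (entropy taken to vary linearly with pressure along the path); it is not necessarily the specific generalized method of the paper, and the compressor state points are hypothetical.

      # Stepwise real-gas polytropic efficiency from suction/discharge states.
      from CoolProp.CoolProp import PropsSI
      import numpy as np

      def polytropic_efficiency(fluid, p1, T1, p2, T2, n_steps=200):
          h1 = PropsSI('H', 'P', p1, 'T', T1, fluid)
          h2 = PropsSI('H', 'P', p2, 'T', T2, fluid)
          s1 = PropsSI('S', 'P', p1, 'T', T1, fluid)
          s2 = PropsSI('S', 'P', p2, 'T', T2, fluid)
          p = np.linspace(p1, p2, n_steps + 1)
          s = np.linspace(s1, s2, n_steps + 1)      # assumed path in (p, s)
          dh_ideal = 0.0
          for i in range(n_steps):
              h_a = PropsSI('H', 'P', p[i], 'S', s[i], fluid)
              # isentropic enthalpy rise of the small step at the local entropy
              h_b = PropsSI('H', 'P', p[i + 1], 'S', s[i], fluid)
              dh_ideal += h_b - h_a
          return dh_ideal / (h2 - h1)

      # hypothetical air compressor stage: 1 bar / 293 K  ->  4 bar / 475 K
      print(polytropic_efficiency('Air', 1e5, 293.15, 4e5, 475.0))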

  7. Joint optimization of source, mask, and pupil in optical lithography

    Science.gov (United States)

    Li, Jia; Lam, Edmund Y.

    2014-03-01

    Mask topography effects need to be taken into consideration for more advanced resolution enhancement techniques in optical lithography. However, rigorous 3D mask model achieves high accuracy at a large computational cost. This work develops a combined source, mask and pupil optimization (SMPO) approach by taking advantage of the fact that pupil phase manipulation is capable of partially compensating for mask topography effects. We first design the pupil wavefront function by incorporating primary and secondary spherical aberration through the coefficients of the Zernike polynomials, and achieve optimal source-mask pair under the condition of aberrated pupil. Evaluations against conventional source mask optimization (SMO) without incorporating pupil aberrations show that SMPO provides improved performance in terms of pattern fidelity and process window sizes.
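
    The pupil manipulation mentioned above amounts to adding controlled amounts of primary and secondary spherical aberration to the pupil wavefront. The sketch below builds such an aberrated pupil function from the corresponding Zernike radial polynomials; the grid size and coefficient values are invented, not those optimised in the paper.

      # Pupil function with primary (Z 4,0) and secondary (Z 6,0) spherical aberration.
      import numpy as np

      def aberrated_pupil(n=256, c_sph1=0.05, c_sph2=0.02):
          """Return exp(i*2*pi*W) on an n x n grid; W in waves, coefficients assumed."""
          y, x = np.mgrid[-1:1:1j * n, -1:1:1j * n]
          rho = np.hypot(x, y)
          inside = rho <= 1.0
          z40 = 6 * rho**4 - 6 * rho**2 + 1                     # primary spherical
          z60 = 20 * rho**6 - 30 * rho**4 + 12 * rho**2 - 1     # secondary spherical
          wavefront = c_sph1 * z40 + c_sph2 * z60
          return inside * np.exp(2j * np.pi * wavefront)

      pupil = aberrated_pupil()
      print(pupil.shape, np.abs(pupil).max())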

  8. Quality of nuchal translucency measurements correlates with broader aspects of program rigor and culture of excellence.

    Science.gov (United States)

    Evans, Mark I; Krantz, David A; Hallahan, Terrence; Sherwin, John; Britt, David W

    2013-01-01

    To determine if nuchal translucency (NT) quality correlates with the extent to which clinics vary in rigor and quality control. We correlated NT performance quality (bias and precision) for 246,000 patients with two alternative measures of clinic culture - the percentage of cases for whom nasal bone (NB) measurements were performed and the percentage of requisitions correctly filled in for race-ethnicity and weight. When requisition errors were frequent (above 33% of cases, compared with below 5%), the NT median curve was depressed to 0.93 MoM; when NB was measured in more than 90% of cases, the median was 0.99 MoM, compared with lower values in clinics measuring NB less often. Program-wide quality thus exists independently of individual variation in NT quality, and two divergent indices of program rigor are associated with NT quality. Quality control must be program wide, and to effect continued improvement in the quality of NT results across time, the cultures of clinics must become a target for intervention. Copyright © 2013 S. Karger AG, Basel.

  9. Development of interface between MCNP-FISPACT-MCNP (IPR-MFM) based on rigorous two step method

    International Nuclear Information System (INIS)

    Shaw, A.K.; Swami, H.L.; Danani, C.

    2015-01-01

    In this work we present the development of an interface tool between MCNP, FISPACT and MCNP (MFM), based on the Rigorous Two Step method, for shutdown dose rate (SDDR) calculation. The MFM links the MCNP radiation transport code and the FISPACT inventory code through a coupling scheme with three steps. In the first step it extracts the neutron spectrum and total flux from the MCNP output file for use as input parameters for FISPACT. In the second step it prepares the FISPACT input files from the irradiation history, neutron flux and neutron spectrum, and then executes them. The third step of the MFM coupling scheme extracts the decay gammas from the FISPACT output file, prepares the MCNP input file for decay gamma transport, executes it, and estimates the SDDR. The MFM methodology and flow scheme are described in detail. The programming language PYTHON has been chosen for the development of the coupling scheme. A complete MCNP-FISPACT-MCNP loop has been developed to handle simplified geometrical problems. For validation of the MFM interface a manual cross-check has been performed, which shows good agreement. The MFM interface has also been validated against the existing MCNP-D1S method for a simple geometry with a 14 MeV cylindrical neutron source. (author)
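
    The three-step loop can be expressed as a thin Python skeleton like the one below. It is only an outline of the control flow described above: the parsing routines, file formats, tally layouts and executable names are placeholders, since the real IPR-MFM tool is tied to the specific MCNP and FISPACT output structures.

      # Skeleton of the MCNP -> FISPACT -> MCNP (Rigorous Two Step) coupling loop.
      def step1_extract_flux(mcnp_output_path):
          """Placeholder: parse an MCNP tally for the neutron spectrum and total flux."""
          spectrum = [0.0] * 175          # e.g. a multigroup spectrum (assumed structure)
          total_flux = 1.0e10             # n/cm^2/s, dummy value
          return spectrum, total_flux

      def step2_run_fispact(spectrum, total_flux, irradiation_history):
          """Placeholder: write the FISPACT flux/input files and run the inventory code,
          e.g. via subprocess.run([...]) with the locally installed executable."""
          return "fispact_output_path"

      def step3_decay_gamma_transport(fispact_output_path):
          """Placeholder: extract the decay-gamma source, build and run the photon
          transport MCNP input, and return the shutdown dose rate."""
          return 0.0

      def shutdown_dose_rate(mcnp_output_path, irradiation_history):
          spectrum, total_flux = step1_extract_flux(mcnp_output_path)
          fispact_out = step2_run_fispact(spectrum, total_flux, irradiation_history)
          return step3_decay_gamma_transport(fispact_out)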

  10. Comparison of rigorous modelling of different structure profiles on photomasks for quantitative linewidth measurements by means of UV- or DUV-optical microscopy

    Science.gov (United States)

    Ehret, Gerd; Bodermann, Bernd; Woehler, Martin

    2007-06-01

    Optical microscopy is an important instrument for dimensional characterisation and calibration of micro- and nanostructures, e.g. chrome structures on photomasks. In comparison to scanning electron microscopy (possible contamination of the sample) and atomic force microscopy (slow, risk of damage), optical microscopy is a fast and non-destructive metrology method. The precise quantitative determination of the linewidth from the microscope image is, however, only possible with knowledge of the geometry of the structures and its consideration in the optical modelling. We compared two different rigorous model approaches, the Rigorous Coupled Wave Analysis (RCWA) and the Finite Elements Method (FEM), for modelling structures with different edge angles, linewidths, line-to-space ratios and polarisations. The RCWA method can represent inclined edge profiles only by a staircase approximation, leading to increased modelling errors. Even today's sophisticated rigorous methods still show problems with TM polarisation; therefore both rigorous methods are compared in terms of their convergence for TE and TM polarisation. Beyond that, the influence of typical illumination wavelengths (365 nm, 248 nm and 193 nm) on the microscope images and their contribution to the measurement uncertainty budget is discussed.

  11. The Chandra Source Catalog: Source Variability

    Science.gov (United States)

    Nowak, Michael; Rots, A. H.; McCollough, M. L.; Primini, F. A.; Glotfelty, K. J.; Bonaventura, N. R.; Chen, J. C.; Davis, J. E.; Doe, S. M.; Evans, J. D.; Evans, I.; Fabbiano, G.; Galle, E. C.; Gibbs, D. G., II; Grier, J. D.; Hain, R.; Hall, D. M.; Harbo, P. N.; He, X.; Houck, J. C.; Karovska, M.; Lauer, J.; McDowell, J. C.; Miller, J. B.; Mitschang, A. W.; Morgan, D. L.; Nichols, J. S.; Plummer, D. A.; Refsdal, B. L.; Siemiginowska, A. L.; Sundheim, B. A.; Tibbetts, M. S.; van Stone, D. W.; Winkelman, S. L.; Zografou, P.

    2009-09-01

    The Chandra Source Catalog (CSC) contains fields of view that have been studied with individual, uninterrupted observations that span integration times ranging from 1 ksec to 160 ksec, a large number of which have received (multiple) repeat observations days to years later. The CSC thus offers an unprecedented look at the variability of the X-ray sky over a broad range of time scales, and across a wide diversity of variable X-ray sources: stars in the local galactic neighborhood, galactic and extragalactic X-ray binaries, Active Galactic Nuclei, etc. Here we describe the methods used to identify and quantify source variability within a single observation, and the methods used to assess the variability of a source when detected in multiple, individual observations. Three tests are used to detect source variability within a single observation: the Kolmogorov-Smirnov test and its variant, the Kuiper test, and a Bayesian approach originally suggested by Gregory and Loredo. The latter test not only provides an indicator of variability, but is also used to create a best estimate of the variable lightcurve shape. We assess the performance of these tests via simulation of statistically stationary, variable processes with arbitrary input power spectral densities (here we concentrate on results of red noise simulations) at a variety of mean count rates and fractional root mean square variabilities relevant to CSC sources. We also assess the false positive rate via simulations of constant sources whose sole source of fluctuation is Poisson noise. We compare these simulations to an assessment of the variability found in real CSC sources, and estimate the variability sensitivities of the CSC.
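
    A minimal single-observation variability check in the spirit of the first of these tests compares the photon arrival times against the uniform distribution expected for a constant source. The sketch below does this with a Kolmogorov-Smirnov test on simulated event lists (the catalogue additionally applies the Kuiper and Gregory-Loredo tests, which are not reproduced here); the exposure length and photon counts are invented.

      # KS-based variability check on simulated photon arrival times.
      import numpy as np
      from scipy.stats import kstest

      rng = np.random.default_rng(1)
      T = 50_000.0                                   # exposure length in seconds
      constant = rng.uniform(0.0, T, size=500)       # Poisson-constant source
      flaring = np.concatenate([rng.uniform(0.0, T, 400),
                                rng.uniform(0.4 * T, 0.45 * T, 100)])  # short flare

      for name, times in [("constant", constant), ("flaring", flaring)]:
          stat, p = kstest(times, "uniform", args=(0.0, T))
          print(f"{name:9s}  KS statistic = {stat:.3f}, p-value = {p:.3g}")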

  12. Sealed source and device design safety testing. Volume 4: Technical report on the findings of Task 4, Investigation of sealed source for paper mill digester

    International Nuclear Information System (INIS)

    Benac, D.J.; Iddings, F.A.

    1995-10-01

    This report covers the Task 4 activities for the Sealed Source and Device Safety testing program. SwRI was contracted to investigate a suspected leaking radioactive source that was installed in a gauge on a paper mill digester. The actual source that was leaking was not available; therefore, SwRI examined another source. SwRI concluded that the encapsulated source it examined was not leaking. However, the presence of Cs-137 on the interior and exterior of the outer encapsulation and handling tube suggests that contamination probably occurred when the source was first manufactured and then installed in the handling tube.

  13. From everyday communicative figurations to rigorous audience news repertoires

    DEFF Research Database (Denmark)

    Kobbernagel, Christian; Schrøder, Kim Christian

    2016-01-01

    In the last couple of decades there has been an unprecedented explosion of news media platforms and formats, as a succession of digital and social media have joined the ranks of legacy media. We live in a 'hybrid media system' (Chadwick, 2013), in which people build their cross-media news repertoires from the ensemble of old and new media available. This article presents an innovative mixed-method approach with considerable explanatory power for exploring patterns of news media consumption. This approach tailors Q-methodology in the direction of a qualitative study of news consumption, in which a card sorting exercise serves to translate the participants' news media preferences into a form that enables the researcher to undertake a rigorous factor-analytical construction of their news consumption repertoires. This interpretive, factor-analytical procedure, which results in the building...

  14. Rapid detection and E-test antimicrobial susceptibility testing of Vibrio parahaemolyticus isolated from seafood and environmental sources in Malaysia.

    Science.gov (United States)

    Al-Othrubi, Saleh M; Hanafiah, Alfizah; Radu, Son; Neoh, Humin; Jamal, Rahaman

    2011-04-01

    To determine the prevalence and antimicrobial susceptibility of Vibrio parahaemolyticus in seafood and environmental sources. The study was carried out at the Center of Excellence for Food Safety Research, University Putra Malaysia; Universiti Kebangsaan Malaysia; the Medical Molecular Biology Institute; and Universiti Kebangsaan Malaysia Hospital, Malaysia, between January 2006 and August 2008. One hundred and forty-four isolates from 400 samples of seafood (122 isolates) and seawater sources (22 isolates) were investigated for the presence of the thermostable direct hemolysin (tdh+) and TDH-related hemolysin (trh+) genes using standard methods. The E-test method was used to test antimicrobial susceptibility. The study indicates a low occurrence of tdh+ (0.69%) and trh+ (8.3%) isolates. None of the isolates tested possess both virulence genes. High sensitivity was observed against tetracycline (98%). The mean minimum inhibitory concentration (MIC) of the isolates toward ampicillin increased from 4 ug/ml in 2004 to 24 ug/ml in 2007. The current study demonstrates a low occurrence of pathogenic Vibrio parahaemolyticus in the marine environment and seafood. Nonetheless, the potential risk of Vibrio infection due to consumption of Vibrio parahaemolyticus-contaminated seafood in Malaysia should not be neglected.

  15. Thermal hydraulic tests of a liquid hydrogen cold neutron source. NISTIR 5026

    International Nuclear Information System (INIS)

    Siegwarth, J.D.; Olson, D.A.; Lewis, M.A.; Rowe, J.M.; Williams, R.E.; Kopetka, P.

    1995-01-01

    The liquid hydrogen cold neutron source designed for the NBSR contains a neutron moderator chamber. The NIST-B electrically heated glass moderator chamber, used to test the NBSR chamber design, showed the following results: stable operation is possible up to at least 2200 watts with two-phase flow; the LH 2 mass quickly reaches a new, stable value after a heat load change; the void fraction is well below 20% at the anticipated power and pressure; restart of the H 2 flow was verified after extending the supply line; and visual inspection showed no dryout or unexpected voids

  16. Test stand for magnetron H negative ion source at IPP-Nagoya

    Energy Technology Data Exchange (ETDEWEB)

    Okamura, H; Kuroda, T; Miyahara, A

    1981-02-01

    The test facilities for the development of the magnetron H(-) ion source consist of the vacuum system, power supplies, diagnostic equipment, and their controlling electronics. Schematics are presented and relevant items described, including sequence control, optical links, the charged pulse forming network, the extractor power supply, the magnet power supply, temperature control of the cesium feeder, and the pulsed valve driver. Noise problems and diagnostics are also considered.

  17. 42 CFR 493.901 - Approval of proficiency testing programs.

    Science.gov (United States)

    2010-10-01

    ...) Distribute the samples, using rigorous quality control to assure that samples mimic actual patient specimens... gynecologic cytology and on individual laboratory performance on testing events, cumulative reports and scores...

  18. Rigorous lower bound on the dynamic critical exponent of some multilevel Swendsen-Wang algorithms

    International Nuclear Information System (INIS)

    Li, X.; Sokal, A.D.

    1991-01-01

    We prove the rigorous lower bound z_exp ≥ α/ν for the dynamic critical exponent of a broad class of multilevel (or ''multigrid'') variants of the Swendsen-Wang algorithm. This proves that such algorithms do suffer from critical slowing down. We conjecture that such algorithms in fact lie in the same dynamic universality class as the standard Swendsen-Wang algorithm.

  19. Beyond the RCT: Integrating Rigor and Relevance to Evaluate the Outcomes of Domestic Violence Programs

    Science.gov (United States)

    Goodman, Lisa A.; Epstein, Deborah; Sullivan, Cris M.

    2018-01-01

    Programs for domestic violence (DV) victims and their families have grown exponentially over the last four decades. The evidence demonstrating the extent of their effectiveness, however, often has been criticized as stemming from studies lacking scientific rigor. A core reason for this critique is the widespread belief that credible evidence can…

  20. Tests of MVD prototype pad detector with a β- source

    International Nuclear Information System (INIS)

    Yeol Kim, Sang; Gook Kim, Young; Su Ryu, Sang; Hwan Kang, Ju; Simon-Gillo, Jehanne; Sullivan, John P.; Heck, Hubert W. van; Xu Guanghua

    1999-01-01

    The MVD group has been testing two versions of silicon pad detectors. One design uses a single metal layer for readout trace routing. The second type uses two layers of metal, allowing for greatly simplified signal routing. However, because the readout traces for the pads pass over the other pads in the same column (separated by an oxide layer), the double-metal design introduces crosstalk into the system. A simple test stand using a 90Sr β- source with scintillator triggers was made to estimate the crosstalk. The crosstalk between pads in the same column of the pad detector was 1.6-3.1%. The values measured between pads in different columns were very close to zero. The measured crosstalk was below our maximum allowed value of 7.8%

  1. Dosimetric effects of edema in permanent prostate seed implants: a rigorous solution

    International Nuclear Information System (INIS)

    Chen Zhe; Yue Ning; Wang Xiaohong; Roberts, Kenneth B.; Peschel, Richard; Nath, Ravinder

    2000-01-01

    Purpose: To derive a rigorous analytic solution to the dosimetric effects of prostate edema so that its impact on the conventional pre-implant and post-implant dosimetry can be studied for any given radioactive isotope and edema characteristics. Methods and Materials: The edema characteristics observed by Waterman et al (Int. J. Rad. Onc. Biol. Phys, 41:1069-1077; 1998) was used to model the time evolution of the prostate and the seed locations. The total dose to any part of prostate tissue from a seed implant was calculated analytically by parameterizing the dose fall-off from a radioactive seed as a single inverse power function of distance, with proper account of the edema-induced time evolution. The dosimetric impact of prostate edema was determined by comparing the dose calculated with full consideration of prostate edema to that calculated with the conventional dosimetry approach where the seed locations and the target volume are assumed to be stationary. Results: A rigorous analytic solution on the relative dosimetric effects of prostate edema was obtained. This solution proved explicitly that the relative dosimetric effects of edema, as found in the previous numerical studies by Yue et. al. (Int. J. Radiat. Oncol. Biol. Phys. 43, 447-454, 1999), are independent of the size and the shape of the implant target volume and are independent of the number and the locations of the seeds implanted. It also showed that the magnitude of relative dosimetric effects is independent of the location of dose evaluation point within the edematous target volume. It implies that the relative dosimetric effects of prostate edema are universal with respect to a given isotope and edema characteristic. A set of master tables for the relative dosimetric effects of edema were obtained for a wide range of edema characteristics for both 125 I and 103 Pd prostate seed implants. Conclusions: A rigorous analytic solution of the relative dosimetric effects of prostate edema has been
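
    The ingredients of the analysis above can be illustrated numerically: a single inverse-power dose fall-off (as in the paper), a seed-to-point distance that shrinks as the edema resolves exponentially, and radioactive decay of the source. The sketch below computes the ratio of the dose delivered with edema to the dose predicted from the static post-resolution geometry; the edema magnitude, resolution half-time, fall-off exponent and the isotropic-expansion assumption are illustrative choices, not the paper's fitted values.

      # Toy single-seed estimate of the relative dosimetric effect of resolving edema.
      import numpy as np
      from scipy.integrate import quad

      def dose_ratio(half_life_d, edema_magnitude=0.2, edema_halftime_d=10.0, n=2.0):
          """Dose with edema / dose from the post-resolution (static) geometry."""
          lam_src = np.log(2.0) / half_life_d
          lam_ede = np.log(2.0) / edema_halftime_d
          # linear expansion factor: cube root of the assumed isotropic volume expansion
          def scale(t):
              return (1.0 + edema_magnitude * np.exp(-lam_ede * t)) ** (1.0 / 3.0)
          with_edema, _ = quad(lambda t: np.exp(-lam_src * t) / scale(t) ** n, 0.0, np.inf)
          static, _ = quad(lambda t: np.exp(-lam_src * t), 0.0, np.inf)
          return with_edema / static

      print("I-125 :", dose_ratio(59.4))   # small reduction for the long-lived isotope
      print("Pd-103:", dose_ratio(17.0))   # larger reduction for the short-lived isotope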

  2. "Snow White" Coating Protects SpaceX Dragon's Trunk Against Rigors of Space

    Science.gov (United States)

    McMahan, Tracy

    2013-01-01

    He described it as "snow white." But NASA astronaut Don Pettit was not referring to the popular children's fairy tale. Rather, he was talking about the white coating of the Space Exploration Technologies Corp. (SpaceX) Dragon spacecraft, which reflected the International Space Station's light. As it approached the station for the first time in May 2012, the Dragon's trunk might have been described as the "fairest of them all" for its pristine coating, allowing Pettit to see clearly enough to maneuver the robotic arm to grab the Dragon for a successful nighttime berthing. This protective thermal control coating, developed by Alion Science and Technology Corp., based in McLean, Va., made its bright appearance again with the March 1 launch of SpaceX's second commercial resupply mission. Named Z-93C55, the coating was applied to the cargo portion of the Dragon to protect it from the rigors of space. "For decades, Alion has produced coatings to protect against the rigors of space," said Michael Kenny, senior chemist with Alion. "As space missions evolved, there was a growing need to dissipate electrical charges that build up on the exteriors of spacecraft, or there could be damage to the spacecraft's electronics. Alion's research led us to develop materials that would meet this goal while also providing thermal controls. The outcome of this research was Alion's proprietary Z-93C55 coating."

  3. Development and performance test of a continuous source of nitrous acid (HONO)

    Energy Technology Data Exchange (ETDEWEB)

    Ammann, M.; Roessler, E.; Kalberer, M.; Bruetsch, S.; Schwikowski, M.; Baltensperger, U.; Zellweger, C.; Gaeggeler, H.W. [Paul Scherrer Inst. (PSI), Villigen (Switzerland)

    1997-09-01

    Laboratory investigations involving nitrous acid (HONO) require a stable, continuous source of HONO at ppb levels. A flow type generation system based on the reaction of sodium nitrite with sulfuric acid has been developed. Performance and speciation of gaseous products were tested with denuder and chemiluminescence techniques. (author) 2 figs., 2 refs.

  4. Effects of well-boat transportation on the muscle pH and onset of rigor mortis in Atlantic salmon.

    Science.gov (United States)

    Gatica, M C; Monti, G; Gallo, C; Knowles, T G; Warriss, P D

    2008-07-26

    During the transport of salmon (Salmo salar) in a well-boat, 10 fish were sampled at each of six stages: in cages after crowding at the farm (stage 1), in the well-boat after loading (stage 2), in the well-boat after eight hours of transport and before unloading (stage 3), in the resting cages immediately after unloading (stage 4), after 24 hours of resting in cages (stage 5) and in the processing plant after pumping from the resting cages (stage 6). The water in the well-boat was at ambient temperature with recirculation to the sea. At each stage the fish were stunned percussively and bled by gill cutting. Immediately after death, and then every three hours for 18 hours, the muscle pH and rigor index of the fish were measured. At successive stages the initial muscle pH of the fish decreased, except for a slight gain at stage 5, after they had been rested for 24 hours. The lowest initial muscle pH was observed at stage 6. The fishes' rigor index showed that rigor developed more quickly at each successive stage, except for a slight decrease in rate at stage 5, attributable to the recovery of muscle reserves.

  5. Rigorous Training of Dogs Leads to High Accuracy in Human Scent Matching-To-Sample Performance.

    Directory of Open Access Journals (Sweden)

    Sophie Marchal

    Human scent identification is based on a matching-to-sample task in which trained dogs are required to compare a scent sample collected from an object found at a crime scene to that of a suspect. Based on dogs' greater olfactory ability to detect and process odours, this method has been used in forensic investigations to identify the odour of a suspect at a crime scene. The excellent reliability and reproducibility of the method largely depend on rigor in dog training. The present study describes the various steps of training that lead to high sensitivity scores, with dogs matching samples with 90% efficiency when the complexity of the scents presented in the sample is similar to that presented in the lineups, and with specificity reaching a ceiling, with no false alarms, in human scent matching-to-sample tasks. This high level of accuracy ensures reliable results in judicial human scent identification tests. Our data should also convince law enforcement authorities to use these results as official forensic evidence when dogs are trained appropriately.

  6. Rigorous Combination of GNSS and VLBI: How it Improves Earth Orientation and Reference Frames

    Science.gov (United States)

    Lambert, S. B.; Richard, J. Y.; Bizouard, C.; Becker, O.

    2017-12-01

    The current reference series (C04) of the International Earth Rotation and Reference Systems Service (IERS) are produced by a weighted combination of Earth orientation parameter (EOP) time series built up by the combination centers of each technique (VLBI, GNSS, laser ranging, DORIS). In the future, we plan to derive EOP from a rigorous combination of the normal equation systems of the four techniques. We present here the results of a rigorous combination of VLBI and GNSS pre-reduced, constraint-free normal equations with the DYNAMO geodetic analysis software package developed and maintained by the French GRGS (Groupe de Recherche en Géodésie Spatiale). The normal equations used are those produced separately by the IVS and IGS combination centers, to which we apply our own minimal constraints. We address the usefulness of such a method with respect to the classical, a posteriori combination method, and we show whether EOP determinations are improved. In particular, we implement external validations of the EOP series based on comparison with geophysical excitation and examination of the covariance matrices. Finally, we address the potential of the technique for the next generation of celestial reference frames, which are currently determined by VLBI only.
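
    The distinction between a rigorous combination and an a posteriori combination can be shown with a toy example: when two techniques share parameters, their normal equation systems are added before a single inversion, rather than averaging the separately estimated parameters afterwards. The matrices below are random stand-ins, not real VLBI or GNSS systems.

      # Toy combination at the normal-equation level versus averaging of solutions.
      import numpy as np

      rng = np.random.default_rng(2)
      n_par = 5
      x_true = rng.standard_normal(n_par)

      def normal_equations(n_obs, noise):
          A = rng.standard_normal((n_obs, n_par))
          l = A @ x_true + noise * rng.standard_normal(n_obs)
          return A.T @ A, A.T @ l

      N_vlbi, b_vlbi = normal_equations(40, 0.5)
      N_gnss, b_gnss = normal_equations(200, 0.2)

      x_combined = np.linalg.solve(N_vlbi + N_gnss, b_vlbi + b_gnss)
      x_vlbi = np.linalg.solve(N_vlbi, b_vlbi)
      x_gnss = np.linalg.solve(N_gnss, b_gnss)
      print("combined error:", np.linalg.norm(x_combined - x_true))
      print("averaged error:", np.linalg.norm(0.5 * (x_vlbi + x_gnss) - x_true))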

  7. Rigorous vector wave propagation for arbitrary flat media

    Science.gov (United States)

    Bos, Steven P.; Haffert, Sebastiaan Y.; Keller, Christoph U.

    2017-08-01

    Precise modelling of the (off-axis) point spread function (PSF) to identify geometrical and polarization aberrations is important for many optical systems. In order to characterise the PSF of the system in all Stokes parameters, an end-to-end simulation of the system has to be performed in which Maxwell's equations are rigorously solved. We present the first results of a python code that we are developing to perform multiscale end-to-end wave propagation simulations that include all relevant physics. Currently we can handle plane-parallel near- and far-field vector diffraction effects of propagating waves in homogeneous isotropic and anisotropic materials, refraction and reflection of flat parallel surfaces, interference effects in thin films and unpolarized light. We show that the code has a numerical precision on the order of 10^-16 for non-absorbing isotropic and anisotropic materials. For absorbing materials the precision is on the order of 10^-8. The capabilities of the code are demonstrated by simulating a converging beam reflecting from a flat aluminium mirror at normal incidence.
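
    A basic building block of this kind of simulation is the complex Fresnel reflection coefficient of a flat interface, which is what governs the aluminium-mirror demonstration mentioned above. The sketch below is a generic textbook implementation, not the authors' code; the aluminium refractive index is an approximate literature value near 633 nm, used only for illustration.

      # Complex Fresnel amplitude reflection coefficients at a flat interface.
      import numpy as np

      def fresnel(n1, n2, theta_i):
          """Return (r_s, r_p) for incidence from medium n1 onto medium n2."""
          cos_i = np.cos(theta_i)
          sin_t = n1 * np.sin(theta_i) / n2          # Snell's law (complex-safe)
          cos_t = np.sqrt(1.0 - sin_t**2)
          r_s = (n1 * cos_i - n2 * cos_t) / (n1 * cos_i + n2 * cos_t)
          r_p = (n2 * cos_i - n1 * cos_t) / (n2 * cos_i + n1 * cos_t)
          return r_s, r_p

      n_al = 1.4 + 7.6j                              # approximate Al index near 633 nm
      r_s, r_p = fresnel(1.0, n_al, np.deg2rad(0.0))
      print("normal-incidence reflectance:", abs(r_s)**2)   # |r_s| == |r_p| at 0 deg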

  8. Dynamics of harmonically-confined systems: Some rigorous results

    Energy Technology Data Exchange (ETDEWEB)

    Wu, Zhigang, E-mail: zwu@physics.queensu.ca; Zaremba, Eugene, E-mail: zaremba@sparky.phy.queensu.ca

    2014-03-15

    In this paper we consider the dynamics of harmonically-confined atomic gases. We present various general results which are independent of particle statistics, interatomic interactions and dimensionality. Of particular interest is the response of the system to external perturbations which can be either static or dynamic in nature. We prove an extended Harmonic Potential Theorem which is useful in determining the damping of the centre of mass motion when the system is prepared initially in a highly nonequilibrium state. We also study the response of the gas to a dynamic external potential whose position is made to oscillate sinusoidally in a given direction. We show in this case that either the energy absorption rate or the centre of mass dynamics can serve as a probe of the optical conductivity of the system. -- Highlights: •We derive various rigorous results on the dynamics of harmonically-confined atomic gases. •We derive an extension of the Harmonic Potential Theorem. •We demonstrate the link between the energy absorption rate in a harmonically-confined system and the optical conductivity.

  9. Rigorous theory of molecular orientational nonlinear optics

    International Nuclear Information System (INIS)

    Kwak, Chong Hoon; Kim, Gun Yeup

    2015-01-01

    Classical statistical mechanics of the molecular optics theory proposed by Buckingham [A. D. Buckingham and J. A. Pople, Proc. Phys. Soc. A 68, 905 (1955)] has been extended to describe the field induced molecular orientational polarization effects on nonlinear optics. In this paper, we present the generalized molecular orientational nonlinear optical processes (MONLO) through the calculation of the classical orientational averaging using the Boltzmann type time-averaged orientational interaction energy in the randomly oriented molecular system under the influence of applied electric fields. The focal points of the calculation are (1) the derivation of rigorous tensorial components of the effective molecular hyperpolarizabilities, (2) the molecular orientational polarizations and the electronic polarizations including the well-known third-order dc polarization, dc electric field induced Kerr effect (dc Kerr effect), optical Kerr effect (OKE), dc electric field induced second harmonic generation (EFISH), degenerate four wave mixing (DFWM) and third harmonic generation (THG). We also present some of the new predictive MONLO processes. For second-order MONLO, second-order optical rectification (SOR), Pockels effect and difference frequency generation (DFG) are described in terms of the anisotropic coefficients of first hyperpolarizability. And, for third-order MONLO, third-order optical rectification (TOR), dc electric field induced difference frequency generation (EFIDFG) and pump-probe transmission are presented
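
    The classical orientational averaging at the heart of this treatment can be illustrated with a simple Boltzmann-weighted average of cos^n(theta) for a permanent dipole in a static field. The sketch below is a generic numerical version of that average, not the tensorial MONLO derivation itself, and the dipole moment, field strength and temperature are illustrative values.

      # Boltzmann-weighted orientational averages <cos^n theta> for a dipole in a field.
      import numpy as np
      from scipy.integrate import quad

      def orientational_average(n, mu, E, T, kB=1.380649e-23):
          """<cos^n theta> with weight exp(mu*E*cos(theta)/(kB*T)) over the sphere."""
          x = mu * E / (kB * T)
          def weight(th):
              return np.exp(x * np.cos(th)) * np.sin(th)
          num, _ = quad(lambda th: np.cos(th) ** n * weight(th), 0.0, np.pi)
          den, _ = quad(weight, 0.0, np.pi)
          return num / den

      mu = 5.0e-30      # dipole moment, C*m (roughly 1.5 Debye, assumed)
      E = 1.0e8         # applied field, V/m (assumed)
      print("<cos theta>   =", orientational_average(1, mu, E, 300.0))
      print("<cos^3 theta> =", orientational_average(3, mu, E, 300.0))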

  10. Uncertainty Analysis of OC5-DeepCwind Floating Semisubmersible Offshore Wind Test Campaign: Preprint

    Energy Technology Data Exchange (ETDEWEB)

    Robertson, Amy N [National Renewable Energy Laboratory (NREL), Golden, CO (United States)

    2017-07-26

    This paper examines how to assess the uncertainty levels for test measurements of the Offshore Code Comparison, Continued, with Correlation (OC5)-DeepCwind floating offshore wind system, examined within the OC5 project. The goal of the OC5 project was to validate the accuracy of ultimate and fatigue load estimates from a numerical model of the floating semisubmersible using data measured during scaled tank testing of the system under wind and wave loading. The examination of uncertainty was done after the test, and it was found that the limited amount of data available did not allow for an acceptable uncertainty assessment. Therefore, this paper instead qualitatively examines the sources of uncertainty associated with this test to start a discussion of how to assess uncertainty for these types of experiments and to summarize what should be done during future testing to acquire the information needed for a proper uncertainty assessment. Foremost, future validation campaigns should initiate numerical modeling before testing to guide the test campaign, which should include a rigorous assessment of uncertainty, and perform validation during testing to ensure that the tests address all of the validation needs.

  11. RIGOROUS GEOREFERENCING OF ALSAT-2A PANCHROMATIC AND MULTISPECTRAL IMAGERY

    Directory of Open Access Journals (Sweden)

    I. Boukerch

    2013-04-01

    Full Text Available Exploiting the full geometric capabilities of High-Resolution Satellite Imagery (HRSI) requires the development of an appropriate sensor orientation model. Several authors have studied this problem; in general there are two categories of geometric models: physical and empirical. Based on an analysis of the metadata provided with ALSAT-2A, a rigorous pushbroom camera model can be developed; this type of model has been successfully applied to many very high resolution imagery systems. The relation between image and ground coordinates, expressed by the time-dependent collinearity equations and involving several coordinate systems, has been tested. The interior orientation parameters must be integrated in the model; they can be estimated from the viewing angles corresponding to the pointing directions of each detector, and these values are derived from cubic polynomials provided in the metadata. The developed model integrates all the necessary elements with 33 unknowns. Approximate values of the 33 unknown parameters may be derived from the information contained in the metadata files provided with the imagery technical specifications, or they are simply fixed to zero; the condition equation is then linearized and solved using SVD in a least-squares sense in order to correct the initial values using a suitable number of well-distributed GCPs. Using ALSAT-2A images over the town of Toulouse in the south-west of France, three experiments were carried out: the first concerns 2D accuracy analysis using several sets of parameters; the second concerns the number and distribution of GCPs; the third concerns georeferencing the multispectral image by applying the model calculated from the panchromatic image.
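
    As a minimal sketch of the least-squares refinement step described above (correcting approximate orientation parameters from well-distributed GCPs by linearizing and solving with SVD), the following uses a simplified stand-in projection model, a plain affine mapping with 8 parameters, rather than the actual 33-parameter time-dependent collinearity model:

      import numpy as np

      def project(params, ground_xyz):
          """Hypothetical stand-in projection: ground (X, Y, Z) -> image (row, col)."""
          a = params.reshape(2, 4)
          g = np.hstack([ground_xyz, np.ones((len(ground_xyz), 1))])
          return g @ a.T

      def refine(params0, ground_xyz, image_rc, n_iter=10, eps=1e-6):
          params = np.asarray(params0, float).copy()
          for _ in range(n_iter):
              pred = project(params, ground_xyz)
              resid = (image_rc - pred).ravel()
              J = np.empty((resid.size, params.size))
              for j in range(params.size):                      # numerical Jacobian
                  dp = np.zeros_like(params)
                  dp[j] = eps
                  J[:, j] = (project(params + dp, ground_xyz) - pred).ravel() / eps
              U, s, Vt = np.linalg.svd(J, full_matrices=False)  # SVD least-squares correction
              delta = Vt.T @ ((U.T @ resid) / s)
              params += delta
              if np.linalg.norm(delta) < 1e-10:
                  break
          return params

      rng = np.random.default_rng(0)
      gcp_xyz = rng.uniform(0.0, 1000.0, (12, 3))               # well-distributed GCPs
      true_params = rng.normal(size=8)
      obs_rc = project(true_params, gcp_xyz) + rng.normal(scale=0.1, size=(12, 2))
      print(refine(true_params + 0.3 * rng.normal(size=8), gcp_xyz, obs_rc))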

  12. Analysis of the Variability of Classified and Unclassified Radiological Source term Inventories in the Frenchman Flat Area, Nevada test Site

    International Nuclear Information System (INIS)

    Zhao, P.; Zavarin, M.

    2008-01-01

    It has been proposed that unclassified source terms used in the reactive transport modeling investigations at NTS CAUs should be based on yield-weighted source terms calculated using the average source term from Bowen et al. (2001) and the unclassified announced yields reported in DOE/NV-209. This unclassified inventory is likely to be used in unclassified contaminant boundary calculations and is, thus, relevant to compare to the classified inventory. The authors examined the classified radionuclide inventory produced by 10 underground nuclear tests conducted in the Frenchman Flat (FF) area of the Nevada Test Site. The goals were to (1) evaluate the variability in classified radiological source terms among the 10 tests and (2) compare that variability and the inventory uncertainties to an average unclassified inventory (e.g. Bowen et al., 2001). To evaluate source term variability among the 10 tests, radiological inventories were compared on two relative scales: geometric mean and yield-weighted geometric mean. Furthermore, radiological inventories were decay corrected either to a common date (9/23/1992) or to the time zero (t_0) of each test. Thus, a total of four data sets were produced. The date of 9/23/1992 was chosen based on the date of the last underground nuclear test at the Nevada Test Site.
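
    A small sketch of the two relative scales mentioned above, decay correction to a common date followed by geometric and yield-weighted geometric means, is given below. All activities, yields and the half-life are illustrative placeholders (not actual test data), and the yield weighting shown (normalizing each inventory by its announced yield before averaging) is only one plausible reading of 'yield-weighted':

      import numpy as np

      LN2 = np.log(2.0)

      def decay_correct(a0, t_years, half_life_years):
          """Activity remaining after t_years of decay."""
          return a0 * np.exp(-LN2 * t_years / half_life_years)

      # hypothetical per-test inventories of one nuclide (Bq) at each test's time zero,
      # elapsed time to the common date (years), and announced yields (kt)
      a_t0      = np.array([2.0e12, 5.0e11, 8.0e12, 1.2e12])
      dt        = np.array([30.0, 25.0, 28.0, 22.0])
      yields    = np.array([10.0, 2.0, 20.0, 5.0])
      half_life = 12.32                                   # e.g. tritium, years

      a_common = decay_correct(a_t0, dt, half_life)

      geo_mean = np.exp(np.mean(np.log(a_common)))
      yw_geo_mean = np.exp(np.mean(np.log(a_common / yields)))   # per unit yield

      print(f"geometric mean: {geo_mean:.3e} Bq")
      print(f"yield-weighted geometric mean: {yw_geo_mean:.3e} Bq/kt")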

  13. Community historians and the dilemma of rigor vs relevance : A comment on Danziger and van Rappard

    NARCIS (Netherlands)

    Dehue, Trudy

    1998-01-01

    Since the transition from finalism to contextualism, the history of science seems to be caught up in a basic dilemma. Many historians fear that with the new contextualist standards of rigorous historiography, historical research can no longer be relevant to working scientists themselves. The present

  14. A test of unification towards the radio source PKS1413+135

    International Nuclear Information System (INIS)

    Ferreira, M.C.; Julião, M.D.; Martins, C.J.A.P.; Monteiro, A.M.R.V.L.

    2013-01-01

    We point out that existing astrophysical measurements of combinations of the fine-structure constant α, the proton-to-electron mass ratio μ and the proton gyromagnetic ratio g_p towards the radio source PKS1413+135 can be used to individually constrain each of these fundamental couplings. While the accuracy of the available measurements is not yet sufficient to test the spatial dipole scenario, our analysis serves as a proof of concept as new observational facilities will soon allow significantly more robust tests. Moreover, these measurements can also be used to obtain constraints on certain classes of unification scenarios, and we compare the constraints obtained for PKS1413+135 with those previously obtained from local atomic clock measurements.
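
    The idea of constraining the individual couplings from measured combinations can be sketched as a weighted least-squares inversion: each combination Q ∝ α^a μ^b g_p^c gives ΔQ/Q = a Δα/α + b Δμ/μ + c Δg_p/g_p. The sensitivity exponents and 'measurements' below are placeholders, not the actual PKS1413+135 data set:

      import numpy as np

      K = np.array([[2.0, -1.0, 1.0],        # exponents (a, b, c) of each measured combination
                    [2.0,  1.0, 1.0],
                    [1.0,  1.0, 0.0]])
      y = np.array([1.0e-6, -0.5e-6, 0.2e-6])      # measured fractional shifts dQ/Q (illustrative)
      sigma = np.array([2.0e-6, 1.5e-6, 1.0e-6])   # 1-sigma uncertainties (illustrative)

      W = np.diag(1.0 / sigma**2)
      cov = np.linalg.inv(K.T @ W @ K)       # covariance of (da/a, dmu/mu, dg_p/g_p)
      x = cov @ (K.T @ W @ y)                # best-fit individual variations

      for name, val, err in zip(["dalpha/alpha", "dmu/mu", "dg_p/g_p"],
                                x, np.sqrt(np.diag(cov))):
          print(f"{name:13s} = {val:+.2e} +/- {err:.2e}")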

  15. A test of unification towards the radio source PKS1413+135

    Energy Technology Data Exchange (ETDEWEB)

    Ferreira, M.C., E-mail: up200802537@fc.up.pt [Centro de Astrofísica, Universidade do Porto, Rua das Estrelas, 4150-762 Porto (Portugal); Faculdade de Ciências, Universidade do Porto, Rua do Campo Alegre, 4150-007 Porto (Portugal); Julião, M.D., E-mail: meinf12013@fe.up.pt [Centro de Astrofísica, Universidade do Porto, Rua das Estrelas, 4150-762 Porto (Portugal); Faculdade de Engenharia, Universidade do Porto, Rua Dr Roberto Frias, 4200-465 Porto (Portugal); Martins, C.J.A.P., E-mail: Carlos.Martins@astro.up.pt [Centro de Astrofísica, Universidade do Porto, Rua das Estrelas, 4150-762 Porto (Portugal); Monteiro, A.M.R.V.L., E-mail: mmonteiro@fc.up.pt [Centro de Astrofísica, Universidade do Porto, Rua das Estrelas, 4150-762 Porto (Portugal); Faculdade de Ciências, Universidade do Porto, Rua do Campo Alegre, 4150-007 Porto (Portugal); Department of Applied Physics, Delft University of Technology, P.O. Box 5046, 2600 GA Delft (Netherlands)

    2013-07-09

    We point out that existing astrophysical measurements of combinations of the fine-structure constant α, the proton-to-electron mass ratio μ and the proton gyromagnetic ratio g_p towards the radio source PKS1413+135 can be used to individually constrain each of these fundamental couplings. While the accuracy of the available measurements is not yet sufficient to test the spatial dipole scenario, our analysis serves as a proof of concept as new observational facilities will soon allow significantly more robust tests. Moreover, these measurements can also be used to obtain constraints on certain classes of unification scenarios, and we compare the constraints obtained for PKS1413+135 with those previously obtained from local atomic clock measurements.

  16. JRR-3 cold neutron source facility H2-O2 explosion safety proof testing

    International Nuclear Information System (INIS)

    Hibi, T.; Fuse, H.; Takahashi, H.; Akutsu, C.; Kumai, T.; Kawabata, Y.

    1990-01-01

    A Cold Neutron Source (CNS) will be installed in Japan Research Reactor-3 (JRR-3) at the Japan Atomic Energy Research Institute (JAERI) during its remodeling project. The CNS holds liquid hydrogen at about 20 K as a cold neutron moderator in the heavy-water region of the reactor, moderating thermal neutrons from the reactor to cold neutrons of about 5 meV energy. In the hydrogen circuit of the CNS, safety measures are taken to prevent an oxygen/hydrogen reaction (H2-O2 explosion). The circuit is also designed so that, should an H2-O2 explosion take place, the integrity of all components is maintained and reactor safety is not compromised. A test hydrogen circuit identical to that of the CNS (real components designed by TECHNICATOME of France) was manufactured to conduct the H2-O2 explosion test. In this test, a detonation, the severest form of the oxygen/hydrogen reaction, was induced in the test hydrogen circuit in order to measure the pressure exerted on the components and their strain, deformation, leakage, cracking, etc. Based on these measurements, the structural strength of the test hydrogen circuit was analyzed. The results show that the hydrogen circuit components have sufficient structural strength to withstand an oxygen/hydrogen reaction.

  17. Assessment of the gas dynamic trap mirror facility as intense neutron source for fusion material test irradiations

    International Nuclear Information System (INIS)

    Fischer, U.; Moeslang, A.; Ivanov, A.A.

    2000-01-01

    The gas dynamic trap (GDT) mirror machine has been proposed by the Budker Institute of Nuclear Physics, Novosibirsk, as a volumetric neutron source for fusion material test irradiations. On the basis of the GDT plasma confinement concept, 14 MeV neutrons are generated at high production rates in the two end sections of the axially symmetric central mirror cell, which serve as suitable irradiation test regions. In this paper, we present an assessment of the GDT as an intense neutron source for fusion material test irradiations. This includes comparisons to irradiation conditions in fusion reactor systems (ITER, Demo) and the International Fusion Material Irradiation Facility (IFMIF), as well as a conceptual design for a helium-cooled tubular test assembly elaborated for the larger of the two test zones, taking proper account of neutronics, thermal-hydraulic and mechanical aspects. This tubular test assembly incorporates ten rigs of about 200 cm length used for inserting instrumented test capsules with miniaturized specimens, taking advantage of the 'small specimen test technology'. The proposed design allows individual temperatures in each of the rigs, and active heating systems inside the capsules ensure specimen temperature stability even during beam-off periods. The major concern is the maximum achievable dpa accumulation of less than 15 dpa per full-power year on the basis of the present design parameters of the GDT neutron source. A design upgrade is proposed to allow for higher neutron wall loadings in the material test regions.

  18. Characterizing the Performance of the Princeton Advanced Test Stand Ion Source

    Science.gov (United States)

    Stepanov, A.; Gilson, E. P.; Grisham, L.; Kaganovich, I.; Davidson, R. C.

    2012-10-01

    The Princeton Advanced Test Stand (PATS) is a compact experimental facility for studying the physics of intense beam-plasma interactions relevant to the Neutralized Drift Compression Experiment - II (NDCX-II). The PATS facility consists of a multicusp RF ion source mounted on a 2 m-long vacuum chamber with numerous ports for diagnostic access. Ar+ beams are extracted from the source plasma with three-electrode (accel-decel) extraction optics. The RF power and extraction voltage (30 - 100 kV) are pulsed to produce 100 μsec duration beams at 0.5 Hz with excellent shot-to-shot repeatability. Diagnostics include Faraday cups, a double-slit emittance scanner, and scintillator imaging. This work reports measurements of beam parameters for a range of beam energies (30 - 50 keV) and currents to characterize the behavior of the ion source and extraction optics. Emittance scanner data are used to calculate the beam trace-space distribution and the corresponding transverse emittance. When the plasma density changes during a beam pulse, time-resolved emittance scanner data are taken to study the corresponding evolution of the beam trace-space distribution.

  19. An open-source framework for stress-testing non-invasive foetal ECG extraction algorithms.

    Science.gov (United States)

    Andreotti, Fernando; Behar, Joachim; Zaunseder, Sebastian; Oster, Julien; Clifford, Gari D

    2016-05-01

    Over the past decades, many studies have been published on the extraction of non-invasive foetal electrocardiogram (NI-FECG) from abdominal recordings. Most of these contributions claim to obtain excellent results in detecting foetal QRS (FQRS) complexes in terms of location. A small subset of authors have investigated the extraction of morphological features from the NI-FECG. However, due to the shortage of available public databases, the large variety of performance measures employed and the lack of open-source reference algorithms, most contributions cannot be meaningfully assessed. This article attempts to address these issues by presenting a standardised methodology for stress testing NI-FECG algorithms, including absolute data, as well as extraction and evaluation routines. To that end, a large database of realistic artificial signals was created, totaling 145.8 h of multichannel data and over one million FQRS complexes. An important characteristic of this dataset is the inclusion of several non-stationary events (e.g. foetal movements, uterine contractions and heart rate fluctuations) that are critical for evaluating extraction routines. To demonstrate our testing methodology, three classes of NI-FECG extraction algorithms were evaluated: blind source separation (BSS), template subtraction (TS) and adaptive methods (AM). Experiments were conducted to benchmark the performance of eight NI-FECG extraction algorithms on the artificial database focusing on: FQRS detection and morphological analysis (foetal QT and T/QRS ratio). The overall median FQRS detection accuracies (i.e. considering all non-stationary events) for the best performing methods in each group were 99.9% for BSS, 97.9% for AM and 96.0% for TS. Both FQRS detections and morphological parameters were shown to heavily depend on the extraction techniques and signal-to-noise ratio. Particularly, it is shown that their evaluation in the source domain, obtained after using a BSS technique, should be
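
    A minimal sketch of the kind of FQRS detection scoring used in such benchmarks is shown below: detections are matched to reference annotations within a tolerance window, and sensitivity, positive predictive value and F1 are reported. The 50 ms tolerance and the greedy matching rule are common choices and not necessarily those of the framework described above:

      import numpy as np

      def score_fqrs(ref_samples, det_samples, fs=1000, tol_ms=50):
          """Match detections to reference FQRS locations and return (Se, PPV, F1)."""
          tol = tol_ms * fs / 1000.0
          ref = np.sort(np.asarray(ref_samples, float))
          det = np.sort(np.asarray(det_samples, float))
          used = np.zeros(det.size, dtype=bool)
          tp = 0
          for r in ref:
              idx = np.where(~used & (np.abs(det - r) <= tol))[0]
              if idx.size:                            # greedy match to the closest detection
                  used[idx[np.argmin(np.abs(det[idx] - r))]] = True
                  tp += 1
          fn = ref.size - tp
          fp = det.size - tp
          se = tp / (tp + fn) if tp + fn else 0.0
          ppv = tp / (tp + fp) if tp + fp else 0.0
          f1 = 2 * se * ppv / (se + ppv) if se + ppv else 0.0
          return se, ppv, f1

      # toy example at 1 kHz: 10 reference beats, one missed beat and one false detection
      ref = np.arange(0, 10) * 430
      det = np.append(np.delete(ref, 4) + 12, 2015)
      print(score_fqrs(ref, det))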

  20. Rigorous Multicomponent Reactive Separations Modelling: Complete Consideration of Reaction-Diffusion Phenomena

    International Nuclear Information System (INIS)

    Ahmadi, A.; Meyer, M.; Rouzineau, D.; Prevost, M.; Alix, P.; Laloue, N.

    2010-01-01

    This paper presents the first step in the development of a rigorous multicomponent reactive separation model. Such a model is essential for further optimization of acid-gas removal plants (CO2 capture, gas treating, etc.) in terms of size and energy consumption, since chemical solvents are conventionally used. Firstly, the two main modelling approaches are presented: the equilibrium-based and the rate-based approaches. Secondly, an extended rate-based model with a rigorous modelling methodology for diffusion-reaction phenomena is proposed. Film theory and the generalized Maxwell-Stefan equations are used to characterize multicomponent interactions. The complete chain of chemical reactions is taken into account; the reactions can be kinetically controlled or at chemical equilibrium, and they are considered in both the liquid film and the liquid bulk. Thirdly, the method of numerical resolution is described. Coupling the generalized Maxwell-Stefan equations with the chemical equilibrium equations leads to a highly non-linear differential-algebraic equation system of index 3 (DAE index 3). The set of equations is discretized with finite differences, since its direct integration by the Gear method is complex, and the resulting algebraic system is solved by the Newton-Raphson method. Finally, the model and the associated numerical methods are validated on the example of the esterification of methanol. This archetypal non-electrolytic system permits an interesting analysis of the impact of reaction on mass transfer, especially near the phase interface. The numerical resolution of the model by the Newton-Raphson method gives good results in terms of calculation time and convergence. The simulations show that the impact on mass transfer of reactions at chemical equilibrium and that of kinetically controlled reactions with fast kinetics are relatively similar. Moreover, Fick's law is less suitable for multicomponent mixtures, where some anomalies such as counter
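
    The Newton-Raphson resolution of the discretized algebraic system can be illustrated with a toy stand-in residual (a small mass-balance/chemical-equilibrium system, not the full Maxwell-Stefan film model); the solution loop with a finite-difference Jacobian is the same in spirit:

      import numpy as np

      def residual(x):
          c_a, c_b, c_c = x
          return np.array([
              c_a + c_c - 1.0,            # mass balance on A
              c_b + c_c - 1.0,            # mass balance on B
              c_c - 2.0 * c_a * c_b,      # chemical equilibrium A + B <-> C with K = 2
          ])

      def newton_raphson(x0, tol=1e-12, max_iter=50, eps=1e-8):
          x = np.asarray(x0, float).copy()
          for _ in range(max_iter):
              f = residual(x)
              if np.linalg.norm(f) < tol:
                  break
              J = np.empty((x.size, x.size))
              for j in range(x.size):                 # finite-difference Jacobian
                  dx = np.zeros_like(x)
                  dx[j] = eps
                  J[:, j] = (residual(x + dx) - f) / eps
              x -= np.linalg.solve(J, f)              # Newton-Raphson update
          return x

      print(newton_raphson([0.8, 0.8, 0.2]))          # converges to (0.5, 0.5, 0.5)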

  1. Rigor mortis development at elevated temperatures induces pale exudative turkey meat characteristics.

    Science.gov (United States)

    McKee, S R; Sams, A R

    1998-01-01

    Development of rigor mortis at elevated post-mortem temperatures may contribute to turkey meat characteristics that are similar to those found in pale, soft, exudative pork. To evaluate this effect, 36 Nicholas tom turkeys were processed at 19 wk of age and placed in water at 40, 20, and 0 C immediately after evisceration. Pectoralis muscle samples were taken at 15 min, 30 min, 1 h, 2 h, and 4 h post-mortem and analyzed for R-value (an indirect measure of adenosine triphosphate), glycogen, pH, color, and sarcomere length. At 4 h, the remaining intact Pectoralis muscle was harvested, aged on ice for 23 h, and analyzed for drip loss, cook loss, shear values, and sarcomere length. By 15 min post-mortem, the 40 C treatment had higher R-values, which persisted through 4 h. By 1 h, the pH and glycogen levels of the 40 C treatment were lower than those of the 0 C treatment; however, they did not differ from those of the 20 C treatment. Increased L* values indicated that color became more pale by 2 h post-mortem in the 40 C treatment when compared to the 20 and 0 C treatments. Drip loss, cook loss, and shear value increased, whereas sarcomere lengths decreased, as a result of the 40 C treatment. These findings suggested that elevated post-mortem temperatures during processing accelerated rigor mortis and produced biochemical changes in the muscle that resulted in pale, exudative meat characteristics in turkey.

  2. Rigorous Quantum Field Theory A Festschrift for Jacques Bros

    CERN Document Server

    Monvel, Anne Boutet; Iagolnitzer, Daniel; Moschella, Ugo

    2007-01-01

    Jacques Bros has greatly advanced our present understanding of rigorous quantum field theory through numerous fundamental contributions. This book arose from an international symposium held in honour of Jacques Bros on the occasion of his 70th birthday, at the Department of Theoretical Physics of the CEA in Saclay, France. The impact of the work of Jacques Bros is evident in several articles in this book. Quantum fields are regarded as genuine mathematical objects, whose various properties and relevant physical interpretations must be studied in a well-defined mathematical framework. The key topics in this volume include analytic structures of Quantum Field Theory (QFT), renormalization group methods, gauge QFT, stability properties and extension of the axiomatic framework, QFT on models of curved spacetimes, QFT on noncommutative Minkowski spacetime. Contributors: D. Bahns, M. Bertola, R. Brunetti, D. Buchholz, A. Connes, F. Corbetta, S. Doplicher, M. Dubois-Violette, M. Dütsch, H. Epstein, C.J. Fewster, K....

  3. A new method for deriving rigorous results on ππ scattering

    International Nuclear Information System (INIS)

    Caprini, I.; Dita, P.

    1979-06-01

    We develop a new approach to the problem of constraining the ππ scattering amplitudes by means of the axiomatically proved properties of unitarity, analyticity and crossing symmetry. The method is based on the solution of an extremal problem on a convex set of analytic functions and provides a global description of the domain of values taken by any finite number of partial waves at an arbitrary set of unphysical energies, compatible with unitarity, the bounds at complex energies derived from generalized dispersion relations and the crossing integral relations. From this domain we obtain new absolute bounds for the amplitudes as well as rigorous correlations between the values of various partial waves. (author)

  4. Information Sources Influencing Soil Testing Innovation Adoption by Grape Farmers in the Khorramdarreh Township

    Directory of Open Access Journals (Sweden)

    Seyedeh Shirin Golbaz

    2015-08-01

    Full Text Available Soil testing is recognized as an important practice for the sustainable use of nutrients and has been introduced to Iranian grape farmers as an innovation for over a decade. Its adoption and utilization may be influenced by information received from different sources. This study identifies the information sources that may influence the adoption of the soil testing innovation by grape farmers. Using a survey, a sample of 260 out of 3942 grape farmers in the Khorramdarreh Township was selected with a stratified sampling technique, and data were collected through structured interviews using a questionnaire. The content and face validity of the questionnaire were reviewed by a panel of experts consisting of university staff and agricultural professionals. Its reliability was assessed through a pilot study, and its main constructs were found to be reliable using Cronbach's alpha (values between 0.71 and 0.84). Less than half of the grape farmers conducted soil testing in their vineyards. A regression analysis showed that contact with model grape producers, posters received, publications, listening to radio programs, and farmers' education had a significant positive impact on adoption of the soil testing innovation. Therefore, both interpersonal channels and mass media can encourage farmers to adopt this innovation.

  5. Rigorous covariance propagation of geoid errors to geodetic MDT estimates

    Science.gov (United States)

    Pail, R.; Albertella, A.; Fecher, T.; Savcenko, R.

    2012-04-01

    The mean dynamic topography (MDT) is defined as the difference between the mean sea surface (MSS) derived from satellite altimetry, averaged over several years, and the static geoid. Assuming geostrophic conditions, ocean surface velocities, an important component of the global ocean circulation, can be derived from the MDT. Thanks to the availability of GOCE gravity field models, MDT can now for the first time be derived solely from satellite observations (altimetry and gravity) down to spatial length-scales of 100 km and even below. Global gravity field models, parameterized in terms of spherical harmonic coefficients, are complemented by the full variance-covariance matrix (VCM). Therefore, a realistic statistical error estimate is available for the geoid component, while the error description of the altimetric component is still an open issue and is, at best, addressed empirically. In this study we attempt to perform, based on the full gravity VCM, rigorous error propagation to the derived geostrophic surface velocities, thereby also taking all correlations into account. For the definition of the static geoid we use the third release of the time-wise GOCE model as well as the satellite-only combination model GOCO03S. In detail, we investigate the velocity errors resulting from the geoid component as a function of the harmonic degree, and the impact of using or not using the covariances on the MDT errors and their correlations. When deriving an MDT, it is spectrally filtered to a certain maximum degree, usually driven by the signal content of the geoid model, by applying isotropic or non-isotropic filters. Since this filtering also acts on the geoid component, the consistent integration of this filter process into the covariance propagation is performed and its impact quantified. The study is performed for MDT estimates in specific test areas of particular oceanographic interest.
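
    The principle of rigorous (full-covariance) propagation can be sketched as follows: if the derived quantities are a linear functional v = A x of the spherical-harmonic coefficients x with variance-covariance matrix Cov(x), then Cov(v) = A Cov(x) A^T, and dropping the off-diagonal terms of Cov(x) ignores exactly the correlations discussed above. The matrices below are random placeholders standing in for the GOCE/GOCO VCM and the functional (including any filter weights):

      import numpy as np

      rng = np.random.default_rng(1)

      n_coeff, n_points = 60, 5
      B = rng.normal(size=(n_coeff, n_coeff))
      cov_x = B @ B.T / n_coeff                  # placeholder symmetric positive-definite "VCM"
      A = rng.normal(size=(n_points, n_coeff))   # placeholder linear functional (incl. filter weights)

      cov_v_full = A @ cov_x @ A.T               # rigorous propagation, correlations included
      cov_v_diag = (A * np.diag(cov_x)) @ A.T    # variance-only propagation, correlations ignored

      print("sigma, with correlations    :", np.sqrt(np.diag(cov_v_full)))
      print("sigma, ignoring correlations:", np.sqrt(np.diag(cov_v_diag)))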

  6. The HAW-Project. Test disposal of highly radioactive radiation sources in the Asse salt mine. Final report

    International Nuclear Information System (INIS)

    Rothfuchs, T.; Cuevas, C. de las; Donker, H.; Feddersen, H.K.; Garcia-Celma, A.; Gies, H.; Goreychi, M.; Graefe, V.; Heijdra, J.; Hente, B.; Jockwer, N.; LeMeur, R.; Moenig, J.; Mueller, K.; Prij, J.; Regulla, D.; Smailos, E.; Staupendahl, G.; Till, E.; Zankl, M.

    1995-01-01

    In order to improve the final concept for the disposal of high-level radioactive waste (HAW) in boreholes drilled into salt formations, plans were developed several years ago for full-scale testing of the complete technical system of an underground repository. To satisfy the test objectives, thirty highly radioactive radiation sources were to be emplaced in six boreholes located in two test galleries at the 800-m level in the Asse salt mine. A test duration of approximately five years was envisaged. Because of licensing uncertainties, the German Federal Government decided on December 3rd, 1992 to stop immediately all activities for the preparation of the test disposal. In the course of the preparation of the test disposal, however, a system necessary for handling the radiation sources was developed and installed in the Asse salt mine, and two non-radioactive reference tests with electrical heaters were started in November 1988. These tests served to investigate thermal effects for comparison with the planned radioactive tests. An accompanying scientific investigation programme performed in situ and in the laboratory comprises the estimation and observation of the thermal, radiation-induced, and mechanical interactions between the rock salt and the electrical heaters and radiation sources, respectively. The laboratory investigations are carried out at Braunschweig (FRG), Petten (NL), Saclay (F) and Barcelona (E). As a consequence of the premature termination of the project, the working programme was revised. The new programme agreed to by the project partners included a controlled shutdown of the heater tests in 1993 and a continuation of the laboratory activities until the end of 1994. (orig.)

  7. Rigorous Photogrammetric Processing of CHANG'E-1 and CHANG'E-2 Stereo Imagery for Lunar Topographic Mapping

    Science.gov (United States)

    Di, K.; Liu, Y.; Liu, B.; Peng, M.

    2012-07-01

    Chang'E-1(CE-1) and Chang'E-2(CE-2) are the two lunar orbiters of China's lunar exploration program. Topographic mapping using CE-1 and CE-2 images is of great importance for scientific research as well as for preparation of landing and surface operation of Chang'E-3 lunar rover. In this research, we developed rigorous sensor models of CE-1 and CE-2 CCD cameras based on push-broom imaging principle with interior and exterior orientation parameters. Based on the rigorous sensor model, the 3D coordinate of a ground point in lunar body-fixed (LBF) coordinate system can be calculated by space intersection from the image coordinates of conjugate points in stereo images, and the image coordinates can be calculated from 3D coordinates by back-projection. Due to uncertainties of the orbit and the camera, the back-projected image points are different from the measured points. In order to reduce these inconsistencies and improve precision, we proposed two methods to refine the rigorous sensor model: 1) refining EOPs by correcting the attitude angle bias, 2) refining the interior orientation model by calibration of the relative position of the two linear CCD arrays. Experimental results show that the mean back-projection residuals of CE-1 images are reduced to better than 1/100 pixel by method 1 and the mean back-projection residuals of CE-2 images are reduced from over 20 pixels to 0.02 pixel by method 2. Consequently, high precision DEM (Digital Elevation Model) and DOM (Digital Ortho Map) are automatically generated.
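
    The space-intersection step mentioned above can be sketched as a least-squares intersection of image rays: each ray is given by a camera position and a unit pointing direction in the body-fixed frame (obtained in practice from the rigorous pushbroom sensor model), and the ground point minimizes the summed squared distances to the rays. The rays below are synthetic:

      import numpy as np

      def intersect_rays(origins, directions):
          """Least-squares 3D point closest to a set of rays (origin + direction)."""
          origins = np.asarray(origins, float)
          d = np.asarray(directions, float)
          d = d / np.linalg.norm(d, axis=1, keepdims=True)
          A = np.zeros((3, 3))
          b = np.zeros(3)
          for c, u in zip(origins, d):
              P = np.eye(3) - np.outer(u, u)      # projector orthogonal to the ray direction
              A += P
              b += P @ c
          return np.linalg.solve(A, b)

      # two synthetic rays that (nearly) intersect at the ground point (10, 20, 5)
      target = np.array([10.0, 20.0, 5.0])
      c1, c2 = np.array([0.0, 0.0, 100.0]), np.array([50.0, 0.0, 100.0])
      rays = [target - c1, target - c2 + np.array([0.0, 1e-3, 0.0])]   # small residual on ray 2
      print(intersect_rays([c1, c2], rays))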

  8. Test of Effective Solid Angle code for the efficiency calculation of volume source

    Energy Technology Data Exchange (ETDEWEB)

    Kang, M. Y.; Kim, J. H.; Choi, H. D. [Seoul National Univ., Seoul (Korea, Republic of); Sun, G. M. [Korea Atomic Energy Research Institute, Daejeon (Korea, Republic of)

    2013-10-15

    It is hard to determine a full-energy (FE) absorption peak efficiency curve for an arbitrary volume source by experiment. For this reason, simulation and semi-empirical methods have been preferred so far, and many studies have progressed in various directions. Moens et al. introduced the concept of the effective solid angle, which accounts for the attenuation of γ-rays in the source, intervening media and detector; this concept underlies a semi-empirical method. An Effective Solid Angle code (ESA code) has been developed over several years by the Applied Nuclear Physics Group at Seoul National University. The ESA code converts an experimental FE efficiency curve, determined using a standard point source, into the curve for a volume source. To test the performance of the ESA code, we measured standard point sources and voluminous certified reference material (CRM) γ-ray sources and compared the results with the efficiency curves obtained in this study. The 200-1500 keV energy region is fitted well. NIST X-ray mass attenuation coefficient data are currently used to account for linear attenuation only; in the future we will use interaction cross-section data obtained from the XCOM code to check each contributing factor, such as the photoelectric effect, incoherent scattering and coherent scattering. To minimize the calculation time and simplify the code, optimization of the algorithm is needed.
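
    The effective-solid-angle idea can be sketched as scaling a measured point-source FE efficiency by the ratio of attenuation-weighted (effective) solid angles of the volume source and the point source. The Monte Carlo estimate below uses a cylindrical source above a circular detector face with a deliberately crude self-attenuation term; all dimensions, the attenuation coefficient and the reference efficiency are placeholders, and this is not the ESA code's actual algorithm:

      import numpy as np

      rng = np.random.default_rng(2)

      def omega_eff(points, mu, det_radius=3.0, n_dir=5000):
          """Attenuation-weighted fractional solid angle, averaged over source points."""
          total = 0.0
          for p in points:
              cost = rng.uniform(-1.0, 1.0, n_dir)            # isotropic emission directions
              phi = rng.uniform(0.0, 2.0 * np.pi, n_dir)
              sint = np.sqrt(1.0 - cost**2)
              down = cost < 0.0                                # rays heading towards the detector plane z = 0
              t = -p[2] / cost[down]
              hit = np.hypot(p[0] + t * sint[down] * np.cos(phi[down]),
                             p[1] + t * sint[down] * np.sin(phi[down])) <= det_radius
              atten = np.exp(-mu * p[2])                       # crude self-attenuation: vertical path only
              total += atten * hit.sum() / n_dir
          return total / len(points)

      # cylindrical source (radius 3, height 4) resting on the detector plane z = 0
      n_src = 300
      r = 3.0 * np.sqrt(rng.uniform(0.0, 1.0, n_src))
      a = rng.uniform(0.0, 2.0 * np.pi, n_src)
      src = np.stack([r * np.cos(a), r * np.sin(a), rng.uniform(0.0, 4.0, n_src)], axis=1)
      ref_point = np.array([[0.0, 0.0, 2.0]])                  # reference point-source position

      eff_point = 0.01                                         # measured point-source FE efficiency (placeholder)
      eff_vol = eff_point * omega_eff(src, mu=0.02) / omega_eff(ref_point, mu=0.0)
      print(eff_vol)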

  9. Rigorous force field optimization principles based on statistical distance minimization

    Energy Technology Data Exchange (ETDEWEB)

    Vlcek, Lukas, E-mail: vlcekl1@ornl.gov [Chemical Sciences Division, Geochemistry & Interfacial Sciences Group, Oak Ridge National Laboratory, Oak Ridge, Tennessee 37831-6110 (United States); Joint Institute for Computational Sciences, University of Tennessee, Oak Ridge National Laboratory, Oak Ridge, Tennessee 37831-6173 (United States); Chialvo, Ariel A. [Chemical Sciences Division, Geochemistry & Interfacial Sciences Group, Oak Ridge National Laboratory, Oak Ridge, Tennessee 37831-6110 (United States)

    2015-10-14

    We use the concept of statistical distance to define a measure of distinguishability between a pair of statistical mechanical systems, i.e., a model and its target, and show that its minimization leads to general convergence of the model’s static measurable properties to those of the target. We exploit this feature to define a rigorous basis for the development of accurate and robust effective molecular force fields that are inherently compatible with coarse-grained experimental data. The new model optimization principles and their efficient implementation are illustrated through selected examples, whose outcome demonstrates the higher robustness and predictive accuracy of the approach compared to other currently used methods, such as force matching and relative entropy minimization. We also discuss relations between the newly developed principles and established thermodynamic concepts, which include the Gibbs-Bogoliubov inequality and the thermodynamic length.
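
    A toy sketch of the optimization principle: tune a model parameter so that the model's distribution of an observable is as close as possible, in the sense of a statistical distance (here the Bhattacharyya angle between discretized distributions), to the target's. The 'model' (a Gaussian of adjustable width) and the target histogram are made up for illustration:

      import numpy as np

      bins = np.linspace(-5.0, 5.0, 101)
      centers = 0.5 * (bins[:-1] + bins[1:])

      def model_hist(sigma):
          p = np.exp(-0.5 * (centers / sigma) ** 2)
          return p / p.sum()

      target = model_hist(1.3)           # stand-in for a coarse-grained experimental distribution

      def stat_distance(sigma):
          fidelity = np.sum(np.sqrt(target * model_hist(sigma)))
          return np.arccos(np.clip(fidelity, 0.0, 1.0))     # Bhattacharyya angle

      sigmas = np.linspace(0.5, 3.0, 501)
      best = sigmas[np.argmin([stat_distance(s) for s in sigmas])]
      print(best)                        # recovers sigma close to 1.3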

  10. Beam experiments with the Grenoble test electron cyclotron resonance ion source at iThemba LABS

    Energy Technology Data Exchange (ETDEWEB)

    Thomae, R., E-mail: rthomae@tlabs.ac.za; Conradie, J.; Fourie, D.; Mira, J.; Nemulodi, F. [iThemba LABS, P.O. Box 722, Somerset West 7130 (South Africa); Kuechler, D.; Toivanen, V. [CERN, BE/ABP/HSL, 1211 Geneva 23 (Switzerland)

    2016-02-15

    At the iThemba Laboratory for Accelerator Based Sciences (iThemba LABS), an electron cyclotron resonance ion source was installed and commissioned. This source is a copy of the Grenoble Test Source (GTS) for the production of highly charged ions; it is similar to the GTS-LHC at CERN and is named GTS2. A collaboration between the Accelerators and Beam Physics Group of CERN and the Accelerator and Engineering Department of iThemba LABS was proposed in which the development of high-intensity argon and xenon beams is envisaged. In this paper, we present beam experiments with the GTS2 at iThemba LABS, reporting results of continuous-wave and afterglow operation of xenon ion beams with oxygen as supporting gas.

  11. Accurate shear measurement with faint sources

    International Nuclear Information System (INIS)

    Zhang, Jun; Foucaud, Sebastien; Luo, Wentao

    2015-01-01

    For cosmic shear to become an accurate cosmological probe, systematic errors in the shear measurement method must be unambiguously identified and corrected for. Previous work of this series has demonstrated that cosmic shears can be measured accurately in Fourier space in the presence of background noise and finite pixel size, without assumptions on the morphologies of galaxy and PSF. The remaining major source of error is source Poisson noise, due to the finiteness of source photon number. This problem is particularly important for faint galaxies in space-based weak lensing measurements, and for ground-based images of short exposure times. In this work, we propose a simple and rigorous way of removing the shear bias from the source Poisson noise. Our noise treatment can be generalized for images made of multiple exposures through MultiDrizzle. This is demonstrated with the SDSS and COSMOS/ACS data. With a large ensemble of mock galaxy images of unrestricted morphologies, we show that our shear measurement method can achieve sub-percent level accuracy even for images of signal-to-noise ratio less than 5 in general, making it the most promising technique for cosmic shear measurement in the ongoing and upcoming large scale galaxy surveys

  12. Rigorous RG Algorithms and Area Laws for Low Energy Eigenstates in 1D

    Science.gov (United States)

    Arad, Itai; Landau, Zeph; Vazirani, Umesh; Vidick, Thomas

    2017-11-01

    One of the central challenges in the study of quantum many-body systems is the complexity of simulating them on a classical computer. A recent advance (Landau et al. in Nat Phys, 2015) gave a polynomial time algorithm to compute a succinct classical description for unique ground states of gapped 1D quantum systems. Despite this progress many questions remained unsolved, including whether there exist efficient algorithms when the ground space is degenerate (and of polynomial dimension in the system size), or for the polynomially many lowest energy states, or even whether such states admit succinct classical descriptions or area laws. In this paper we give a new algorithm, based on a rigorously justified RG type transformation, for finding low energy states for 1D Hamiltonians acting on a chain of n particles. In the process we resolve some of the aforementioned open questions, including giving a polynomial time algorithm for poly(n) degenerate ground spaces and an n^{O(log n)} algorithm for the poly(n) lowest energy states (under a mild density condition). For these classes of systems the existence of a succinct classical description and area laws were not rigorously proved before this work. The algorithms are natural and efficient, and for the case of finding unique ground states for frustration-free Hamiltonians the running time is Õ(n M(n)), where M(n) is the time required to multiply two n × n matrices.

  13. Development of quality assurance programme for prescribed ionizing radiation source testing. Recommendations

    International Nuclear Information System (INIS)

    1999-01-01

    The document gives guidance to those applying for a licence to perform ionizing radiation source acceptance tests and long-term stability tests, and provides information that should be known when introducing quality assurance systems in compliance with legislative requirements. It is envisaged that this document ('Recommendations') will form a basis for final Safety Guides to be issued by the State Office for Nuclear Safety, the Czech nuclear regulatory authority. The publication is organized as follows. Part 1 gives a glossary of basic terms in quality systems. Part 2 explains quality system principles, paying special attention to radiation safety issues, and describes the structure and scope of quality system documentation. Part 3 explains the individual elements of the quality system and gives practical examples. Part 4 deals with the quality assurance programme; using instructions and practical examples, it shows how the quality system elements should be applied to long-term stability testing and acceptance testing. A model structure of 2nd-degree documentation (guidelines) and a model testing protocol are given in annexes. (P.A.)

  14. The Researchers' View of Scientific Rigor-Survey on the Conduct and Reporting of In Vivo Research.

    Science.gov (United States)

    Reichlin, Thomas S; Vogt, Lucile; Würbel, Hanno

    2016-01-01

    Reproducibility in animal research is alarmingly low, and a lack of scientific rigor has been proposed as a major cause. Systematic reviews found low reporting rates of measures against risks of bias (e.g., randomization, blinding), and a correlation between low reporting rates and overstated treatment effects. Reporting rates of measures against bias are thus used as a proxy measure for scientific rigor, and reporting guidelines (e.g., ARRIVE) have become a major weapon in the fight against risks of bias in animal research. Surprisingly, animal scientists have never been asked about their use of measures against risks of bias and how they report these in publications. Whether poor reporting reflects poor use of such measures, and whether reporting guidelines may effectively reduce risks of bias has therefore remained elusive. To address these questions, we asked in vivo researchers about their use and reporting of measures against risks of bias and examined how self-reports relate to reporting rates obtained through systematic reviews. An online survey was sent out to all registered in vivo researchers in Switzerland (N = 1891) and was complemented by personal interviews with five representative in vivo researchers to facilitate interpretation of the survey results. Return rate was 28% (N = 530), of which 302 participants (16%) returned fully completed questionnaires that were used for further analysis. According to the researchers' self-report, they use measures against risks of bias to a much greater extent than suggested by reporting rates obtained through systematic reviews. However, the researchers' self-reports are likely biased to some extent. Thus, although they claimed to be reporting measures against risks of bias at much lower rates than they claimed to be using these measures, the self-reported reporting rates were considerably higher than reporting rates found by systematic reviews. Furthermore, participants performed rather poorly when asked to

  15. Fabrication and test of prototype ring magnets for the ALS [Advanced Light Source

    International Nuclear Information System (INIS)

    Tanabe, J.; Avery, R.; Caylor, R.; Green, M.I.; Hoyer, E.; Halbach, K.; Hernandez, S.; Humphries, D.; Kajiyama, Y.; Keller, R.; Low, W.; Marks, S.; Milburn, J.; Yee, D.

    1989-03-01

    Prototype Models for the Advanced Light Source (ALS) Booster Dipole, Quadrupole and Sextupole and the Storage Ring Gradient Magnet, Quadrupole and Sextupole have been constructed. The Booster Magnet Prototypes have been tested. The Storage Ring Magnets are presently undergoing tests and magnetic measurements. This paper reviews the designs and parameters for these magnets, briefly describes features of the magnet designs which respond to the special constraints imposed by the requirements for both accelerator rings, and reviews some of the results of magnet measurements for the prototype. 13 refs., 7 figs., 1 tab

  16. Single-crate stand-alone CAMAC control system for a negative ion source test facility

    International Nuclear Information System (INIS)

    Juras, R.C.; Ziegler, N.F.

    1979-01-01

    A single-crate CAMAC system was configured to control a negative ion source development facility at ORNL and control software was written for the crate microcomputer. The software uses inputs from a touch panel and a shaft encoder to control the various operating parameters of the test facility and uses the touch panel to display the operating status. Communication to and from the equipment at ion source potential is accomplished over optical fibers from an ORNL-built CAMAC module. A receiver at ion source potential stores the transmitted data and some of these stored values are then used to control discrete parameters of the ion source (i.e., power supply on or off). Other stored values are sent to a multiplexed digital-to-analog converter to provide analog control signals. A transmitter at ion source potential transmits discrete status information and several channels of analog data from an analog-to-digital converter back to the ground-potential receiver where it is stored to be read and displayed by the software

  17. The preliminary tests of the superconducting electron cyclotron resonance ion source DECRIS-SC2.

    Science.gov (United States)

    Efremov, A; Bekhterev, V; Bogomolov, S; Drobin, V; Loginov, V; Lebedev, A; Yazvitsky, N; Yakovlev, B

    2012-02-01

    A new compact version of the "liquid-He-free" superconducting ECR ion source, to be used as an injector of highly charged heavy ions for the MC-400 cyclotron, has been designed and built at the Flerov Laboratory of Nuclear Reactions in collaboration with the Laboratory of High Energy Physics of JINR. The axial magnetic field of the source is created by the superconducting magnet, and an NdFeB hexapole is used for radial plasma confinement. A microwave frequency of 14 GHz is used for ECR plasma heating. During the first tests, the source showed sufficiently good performance for the production of medium-charge-state ions. In this paper, we present the design parameters and the preliminary results with gaseous ions.

  18. Testing and intercomparison of model predictions of radionuclide migration from a hypothetical area source

    International Nuclear Information System (INIS)

    O'Brien, R.S.; Yu, C.; Zeevaert, T.; Olyslaegers, G.; Amado, V.; Setlow, L.W.; Waggitt, P.W.

    2008-01-01

    This work was carried out as part of the International Atomic Energy Agency's EMRAS program. One aim of the work was to develop scenarios for testing computer models designed for simulating radionuclide migration in the environment, and to use these scenarios for testing the models and comparing predictions from different models. This paper presents the results of the development and testing of a hypothetical area source of NORM waste/residue using two complex computer models and one screening model. There are significant differences between the complex models in the methods used to model groundwater flow. The hypothetical source was used because of its relative simplicity and because of difficulties encountered in finding comprehensive, well-validated data sets for real sites. The source consisted of a simple repository of uniform thickness, with 1 Bq/g of uranium-238 (238U), in secular equilibrium with its decay products, distributed uniformly throughout the waste. This approximates real situations, such as engineered repositories, waste rock piles, tailings piles and landfills. Specification of the site also included the physical layout, vertical stratigraphic details, soil type for each layer of material, precipitation and runoff details, groundwater flow parameters, and meteorological data. Calculations were carried out with and without a cover layer of clean soil above the waste, for people working and living at different locations relative to the waste. The predictions of the two complex models showed several differences which need more detailed examination. The scenario is available for testing by other modelers. It can also be used as a planning tool for remediation work or for repository design, by changing the scenario parameters and running the models for a range of different inputs. Further development will include applying models to real scenarios and integrating environmental impact assessment methods with the safety assessment tools currently

  19. Smoothing of Transport Plans with Fixed Marginals and Rigorous Semiclassical Limit of the Hohenberg-Kohn Functional

    Science.gov (United States)

    Cotar, Codina; Friesecke, Gero; Klüppelberg, Claudia

    2018-06-01

    We prove rigorously that the exact N-electron Hohenberg-Kohn density functional converges in the strongly interacting limit to the strictly correlated electrons (SCE) functional, and that the absolute value squared of the associated constrained search wavefunction tends weakly in the sense of probability measures to a minimizer of the multi-marginal optimal transport problem with Coulomb cost associated to the SCE functional. This extends our previous work for N = 2 (Cotar et al. in Commun Pure Appl Math 66:548-599, 2013). The correct limit problem had been derived in the physics literature by Seidl (Phys Rev A 60:4387-4395, 1999) and Seidl, Gori-Giorgi and Savin (Phys Rev A 75:042511, 1-12, 2007); in these papers the lack of a rigorous proof was pointed out. We also give a mathematical counterexample to this type of result, by replacing the constraint of given one-body density (an infinite-dimensional quadratic expression in the wavefunction) by an infinite-dimensional quadratic expression in the wavefunction and its gradient. Connections with the Lawrentiev phenomenon in the calculus of variations are indicated.

  20. Muscle pH, rigor mortis and blood variables in Atlantic salmon transported in two types of well-boat.

    Science.gov (United States)

    Gatica, M C; Monti, G E; Knowles, T G; Gallo, C B

    2010-01-09

    Two systems for transporting live salmon (Salmo salar) were compared in terms of their effects on blood variables, muscle pH and rigor index: an 'open system' well-boat with recirculated sea water at 13.5 degrees C and a stocking density of 107 kg/m3 during an eight-hour journey, and a 'closed system' well-boat with water chilled from 16.7 to 2.1 degrees C and a stocking density of 243.7 kg/m3 during a seven-hour journey. Groups of 10 fish were sampled at each of four stages: in cages at the farm, in the well-boat after loading, in the well-boat after the journey and before unloading, and in the processing plant after they were pumped from the resting cages. At each sampling, the fish were stunned and bled by gill cutting. Blood samples were taken to measure lactate, osmolality, chloride, sodium, cortisol and glucose, and their muscle pH and rigor index were measured at death and three hours later. In the open system well-boat, the initial muscle pH of the fish decreased at each successive stage, and at the final stage they had a significantly lower initial muscle pH and more rapid onset of rigor than the fish transported on the closed system well-boat. At the final stage all the blood variables except glucose were significantly affected in the fish transported on both types of well-boat.

  1. Metrological tests of a 200 L calibration source for HPGE detector systems for assay of radioactive waste drums

    International Nuclear Information System (INIS)

    Boshkova, T.; Mitev, K.

    2016-01-01

    In this work we present test procedures, approval criteria and results from two metrological inspections of a certified large volume 152Eu source (drum of about 200 L) intended for calibration of HPGe gamma assay systems used for activity measurement of radioactive waste drums. The aim of the inspections was to prove the stability of the calibration source during its working life. The large volume source was designed and produced in 2007. It consists of 448 identical sealed radioactive sources (modules) apportioned in 32 transparent plastic tubes which were placed in a wooden matrix which filled the drum. During the inspections the modules were subjected to tests for verification of their certified characteristics. The results show a perfect compliance with the NIST basic guidelines for the properties of a radioactive certified reference material (CRM) and demonstrate the stability of the large volume CRM-drum after 7 years of operation. - Highlights: • Large (200 L) volume drum source designed, produced and certified as CRM in 2007. • Source contains 448 identical sealed radioactive 152Eu sources (modules). • Two metrological inspections in 2011 and 2014. • No statistically significant changes of the certified characteristics over time. • Stable calibration source for HPGe-gamma radioactive waste assay systems.

  2. The front end test stand high performance H- ion source at Rutherford Appleton Laboratory.

    Science.gov (United States)

    Faircloth, D C; Lawrie, S; Letchford, A P; Gabor, C; Wise, P; Whitehead, M; Wood, T; Westall, M; Findlay, D; Perkins, M; Savage, P J; Lee, D A; Pozimski, J K

    2010-02-01

    The aim of the front end test stand (FETS) project is to demonstrate that chopped low energy beams of high quality can be produced. FETS consists of a 60 mA Penning Surface Plasma Ion Source, a three solenoid low energy beam transport, a 3 MeV radio frequency quadrupole, a chopper, and a comprehensive suite of diagnostics. This paper details the design and initial performance of the ion source and the laser profile measurement system. Beam current, profile, and emittance measurements are shown for different operating conditions.

  3. Rigorous Numerics for ill-posed PDEs: Periodic Orbits in the Boussinesq Equation

    Science.gov (United States)

    Castelli, Roberto; Gameiro, Marcio; Lessard, Jean-Philippe

    2018-04-01

    In this paper, we develop computer-assisted techniques for the analysis of periodic orbits of ill-posed partial differential equations. As a case study, our proposed method is applied to the Boussinesq equation, which has been investigated extensively because of its role in the theory of shallow water waves. The idea is to use the symmetry of the solutions and a Newton-Kantorovich type argument (the radii polynomial approach) to obtain rigorous proofs of existence of the periodic orbits in a weighted ℓ1 Banach space of space-time Fourier coefficients with exponential decay. We present several computer-assisted proofs of the existence of periodic orbits at different parameter values.

  4. Thorium molecular negative ion production in a cesium sputter source at BARC-TIFR pelletron accelerator ion source test set up

    International Nuclear Information System (INIS)

    Gupta, A.K.; Mehrotra, N.; Kale, R.M.; Alamelu, D.; Aggarwal, S.K.

    2005-01-01

    The ion source test setup at the Pelletron Accelerator facility has been used extensively for the production and characterization of negative ions, with particular emphasis placed on the species of interest to experimental users. Attention has been focused on the formation of rare-earth negative ions, owing to their importance in the ongoing accelerator mass spectrometry program and in isotopic abundance measurements using secondary negative ion mass spectrometry.

  5. Unclassified Source Term and Radionuclide Data for Corrective Action Unit 98: Frenchman Flat Nevada Test Site, Nevada

    International Nuclear Information System (INIS)

    Farnham, Irene

    2005-01-01

    Frenchman Flat is one of several areas of the Nevada Test Site (NTS) used for underground nuclear testing (Figure 1-1). These nuclear tests resulted in groundwater contamination in the vicinity of the underground test areas. As a result, the U.S. Department of Energy (DOE), National Nuclear Security Administration Nevada Site Office (NNSA/NSO) is currently conducting a corrective action investigation (CAI) of the Frenchman Flat underground test areas. Since 1996, the Nevada Division of Environmental Protection (NDEP) has regulated NNSA/NSO corrective actions through the "Federal Facility Agreement and Consent Order" ([FFACO], 1996). Appendix VI of the FFACO agreement, "Corrective Action Strategy", was revised on December 7, 2000, and describes the processes that will be used to complete corrective actions, including those in the Underground Test Area (UGTA) Project. The individual locations covered by the agreement are known as corrective action sites (CASs), which are grouped into corrective action units (CAUs). The UGTA CASs are grouped geographically into five CAUs: Frenchman Flat, Central Pahute Mesa, Western Pahute Mesa, Yucca Flat/Climax Mine, and Rainier Mesa/Shoshone Mountain (Figure 1-1). These CAUs have distinctly different contaminant source, geologic, and hydrogeologic characteristics related to their location (FFACO, 1996). The Frenchman Flat CAU consists of 10 CASs located in the northern part of Area 5 and the southern part of Area 11 (Figure 1-1). This report documents the evaluation of the information and data available on the unclassified source term and radionuclide contamination for Frenchman Flat, CAU 98. The methodology used to estimate hydrologic source terms (HSTs) for the Frenchman Flat CAU is also documented. The HST of an underground nuclear test is the portion of the total inventory of radionuclides that is released over time into the groundwater following the test. The total residual inventory of radionuclides associated with one or

  6. Testing Procedures and Results of the Prototype Fundamental Power Coupler for the Spallation Neutron Source

    International Nuclear Information System (INIS)

    M. Stirbet; I.E. Campisi; E.F. Daly; G.K. Davis; M. Drury; P. Kneisel; G. Myneni; T. Powers; W.J. Schneider; K.M. Wilson; Y. Kang; K.A. Cummings; T. Hardek

    2001-01-01

    High-power RF testing with peak power in excess of 500 kW has been performed on prototype Fundamental Power Couplers (FPCs) for the Spallation Neutron Source (SNS) superconducting cavities. The testing followed the development of procedures for cleaning, assembling and preparing the FPCs for installation in the test stand. For the time being, the qualification of the couplers has taken place only under a limited set of conditions (travelling wave, 20 pps), as the available RF system and control instrumentation are under improvement.

  7. The rigorous stochastic matrix multiplication scheme for the calculations of reduced equilibrium density matrices of open multilevel quantum systems

    International Nuclear Information System (INIS)

    Chen, Xin

    2014-01-01

    Understanding the roles of the temporal and spatial structures of quantum functional noise in open multilevel quantum molecular systems has attracted considerable theoretical interest. I want to establish a rigorous and general framework for functional quantum noises from the constructive and computational perspectives, i.e., how to generate the random trajectories to reproduce the kernel and path ordering of the influence functional with effective Monte Carlo methods for arbitrary spectral densities. This construction approach aims to unify the existing stochastic models to rigorously describe the temporal and spatial structure of Gaussian quantum noises. In this paper, I review the Euclidean imaginary time influence functional and propose the stochastic matrix multiplication scheme to calculate reduced equilibrium density matrices (REDM). In addition, I review and discuss the Feynman-Vernon influence functional according to the Gaussian quadratic integral, particularly its imaginary part, which is critical to the rigorous description of the quantum detailed balance. As a result, I establish the conditions under which the influence functional can be interpreted as the average of an exponential functional operator over real-valued Gaussian processes for open multilevel quantum systems. I also show the difference between the local and nonlocal phonons within this framework. With the stochastic matrix multiplication scheme, I compare the normalized REDM with the Boltzmann equilibrium distribution for open multilevel quantum systems.
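
    A toy numerical illustration of interpreting the influence functional as an average of an exponential functional over Gaussian processes: the unnormalized reduced density matrix of a two-level system is approximated by a Trotter product of short imaginary-time factors exp(-Δτ (H + ξ_k V)) averaged over sampled noise trajectories ξ. Here ξ is simple white noise for illustration; the scheme described above requires colored Gaussian noise whose covariance reproduces the influence-functional kernel:

      import numpy as np
      from scipy.linalg import expm

      rng = np.random.default_rng(3)

      sx = np.array([[0.0, 1.0], [1.0, 0.0]])
      sz = np.array([[1.0, 0.0], [0.0, -1.0]])

      H = 0.5 * sx                   # two-level system Hamiltonian (tunnelling term)
      V = sz                         # system operator coupled to the noise
      beta, n_slices, n_traj = 2.0, 32, 500
      dtau = beta / n_slices
      noise_strength = 0.3

      rho = np.zeros((2, 2))
      for _ in range(n_traj):
          # white-noise trajectory with Wiener-increment scaling (toy choice)
          xi = noise_strength * rng.normal(size=n_slices) / np.sqrt(dtau)
          M = np.eye(2)
          for k in range(n_slices):
              M = expm(-dtau * (H + xi[k] * V)) @ M    # imaginary-time-ordered product
          rho += M
      rho /= n_traj
      rho /= np.trace(rho)           # normalized reduced equilibrium density matrix
      print(rho)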

  8. RIGOROUS PHOTOGRAMMETRIC PROCESSING OF CHANG'E-1 AND CHANG'E-2 STEREO IMAGERY FOR LUNAR TOPOGRAPHIC MAPPING

    Directory of Open Access Journals (Sweden)

    K. Di

    2012-07-01

    Full Text Available Chang'E-1 (CE-1) and Chang'E-2 (CE-2) are the two lunar orbiters of China's lunar exploration program. Topographic mapping using CE-1 and CE-2 images is of great importance for scientific research as well as for preparation of landing and surface operation of Chang'E-3 lunar rover. In this research, we developed rigorous sensor models of CE-1 and CE-2 CCD cameras based on push-broom imaging principle with interior and exterior orientation parameters. Based on the rigorous sensor model, the 3D coordinate of a ground point in the lunar body-fixed (LBF) coordinate system can be calculated by space intersection from the image coordinates of conjugate points in stereo images, and the image coordinates can be calculated from 3D coordinates by back-projection. Due to uncertainties of the orbit and the camera, the back-projected image points are different from the measured points. In order to reduce these inconsistencies and improve precision, we proposed two methods to refine the rigorous sensor model: (1) refining EOPs by correcting the attitude angle bias, (2) refining the interior orientation model by calibration of the relative position of the two linear CCD arrays. Experimental results show that the mean back-projection residuals of CE-1 images are reduced to better than 1/100 pixel by method 1 and the mean back-projection residuals of CE-2 images are reduced from over 20 pixels to 0.02 pixel by method 2. Consequently, high precision DEM (Digital Elevation Model) and DOM (Digital Ortho Map) are automatically generated.

  9. Direct integration of the S-matrix applied to rigorous diffraction

    International Nuclear Information System (INIS)

    Iff, W; Lindlein, N; Tishchenko, A V

    2014-01-01

    A novel Fourier method for rigorous diffraction computation at periodic structures is presented. The procedure is based on a differential equation for the S-matrix, which allows direct integration of the S-matrix blocks. This results in a new method in Fourier space, which can be considered as a numerically stable and well-parallelizable alternative to the conventional differential method based on T-matrix integration and subsequent conversions from the T-matrices to S-matrix blocks. Integration of the novel differential equation in an implicit manner is expounded. The applicability of the new method is shown on the basis of 1D periodic structures. It is clear, however, that the new technique can also be applied to arbitrary 2D periodic or periodized structures. The complexity of the new method is O(N³), similar to the conventional differential method, with N being the number of diffraction orders. (fast track communication)

  10. Cooperative effort between Consorcio European Spallation Source--Bilbao and Oak Ridge National Laboratory spallation neutron source for manufacturing and testing of the JEMA-designed modulator system

    Energy Technology Data Exchange (ETDEWEB)

    Anderson, David E [ORNL

    2017-01-02

    The JEMA modulator was originally developed for the European Spallation Source (ESS) when Spain was under consideration as a location for the ESS facility. Discussions ensued and the Spallation Neutron Source Research Accelerator Division agreed to form a collaboration with the ESS-Bilbao (ESS-B) consortium to provide services for specifying the requirements for a version of the modulator capable of operating twelve 550 kW klystrons, monitoring the technical progress on the contract with JEMA, installing and commissioning the modulator at SNS, and performing a 30-day full-power test. This work was recently completed, and this report discusses those activities with primary emphasis on the installation and testing activities.

  11. Cooperative effort between Consorcio European Spallation Source--Bilbao and Oak Ridge National Laboratory spallation neutron source for manufacturing and testing of the JEMA-designed modulator system

    International Nuclear Information System (INIS)

    Anderson, David E.

    2017-01-01

    The JEMA modulator was originally developed for the European Spallation Source (ESS) when Spain was under consideration as a location for the ESS facility. Discussions ensued and the Spallation Neutron Source Research Accelerator Division agreed to form a collaboration with the ESS-Bilbao (ESS-B) consortium to provide services for specifying the requirements for a version of the modulator capable of operating twelve 550 kW klystrons, monitoring the technical progress on the contract with JEMA, installing and commissioning the modulator at SNS, and performing a 30-day full-power test. This work was recently completed, and this report discusses those activities with primary emphasis on the installation and testing activities.

  12. General-purpose heat source: Research and development program, radioisotope thermoelectric generator/thin fragment impact test

    International Nuclear Information System (INIS)

    Reimus, M.A.H.; Hinckley, J.E.

    1996-11-01

    The general-purpose heat source provides power for space missions by transmitting the heat of ²³⁸Pu decay to an array of thermoelectric elements in a radioisotope thermoelectric generator (RTG). Because the potential for a launch abort or return from orbit exists for any space mission, the heat source response to credible accident scenarios is being evaluated. This test was designed to provide information on the response of a loaded RTG to impact by a fragment similar to the type of fragment produced by breakup of the spacecraft propulsion module system. The results of this test indicated that impact by a thin aluminum fragment traveling at 306 m/s may result in significant damage to the converter housing, failure of one fueled clad, and release of a small quantity of fuel

  13. Advancing Explosion Source Theory through Experimentation: Results from Seismic Experiments Since the Moratorium on Nuclear Testing

    Science.gov (United States)

    Bonner, J. L.; Stump, B. W.

    2011-12-01

    On 23 September 1992, the United States conducted the nuclear explosion DIVIDER at the Nevada Test Site (NTS). It would become the last US nuclear test when a moratorium ended testing the following month. Many of the theoretical explosion seismic models used today were developed from observations of hundreds of nuclear tests at NTS and around the world. Since the moratorium, researchers have turned to chemical explosions as a possible surrogate for continued nuclear explosion research. This talk reviews experiments since the moratorium that have used chemical explosions to advance explosion source models. The 1993 Non-Proliferation Experiment examined single-point, fully contained chemical-nuclear equivalence by detonating over a kiloton of chemical explosive at NTS in close proximity to previous nuclear explosion tests. When compared with data from these nearby nuclear explosions, the regional and near-source seismic data were found to be essentially identical after accounting for different yield scaling factors for chemical and nuclear explosions. The relationship between contained chemical explosions and large production mining shots was studied at the Black Thunder coal mine in Wyoming in 1995. The research led to an improved source model for delay-fired mining explosions and a better understanding of mining explosion detection by the International Monitoring System (IMS). The effect of depth was examined in a 1997 Kazakhstan Depth of Burial experiment. Researchers used local and regional seismic observations to conclude that the dominant mechanism for enhanced regional shear waves was local Rg scattering. Travel-time calibration for the IMS was the focus of the 1999 Dead Sea Experiment where a 10-ton shot was recorded as far away as 5000 km. The Arizona Source Phenomenology Experiments provided a comparison of fully- and partially-contained chemical shots with mining explosions, thus quantifying the reduction in seismic amplitudes associated with partial

  14. Rigorous derivation of porous-media phase-field equations

    Science.gov (United States)

    Schmuck, Markus; Kalliadasis, Serafim

    2017-11-01

    The evolution of interfaces in Complex heterogeneous Multiphase Systems (CheMSs) plays a fundamental role in a wide range of scientific fields such as thermodynamic modelling of phase transitions, materials science, or as a computational tool for interfacial flow studies or material design. Here, we focus on phase-field equations in CheMSs such as porous media. To the best of our knowledge, we present the first rigorous derivation of error estimates for fourth order, upscaled, and nonlinear evolution equations. For CheMSs with heterogeneity ɛ, we obtain the convergence rate ɛ^(1/4), which governs the error between the solution of the new upscaled formulation and the solution of the microscopic phase-field problem. This error behaviour has recently been validated computationally. Due to the wide range of application of phase-field equations, we expect this upscaled formulation to allow for new modelling, analytic, and computational perspectives for interfacial transport and phase transformations in CheMSs. This work was supported by EPSRC, UK, through Grant Nos. EP/H034587/1, EP/L027186/1, EP/L025159/1, EP/L020564/1, EP/K008595/1, and EP/P011713/1 and from ERC via Advanced Grant No. 247031.

  15. Introduction to an open source internet-based testing program for medical student examinations.

    Science.gov (United States)

    Lee, Yoon-Hwan

    2009-12-20

    The author developed a freely available open source internet-based testing program for medical examinations. PHP and JavaScript were used as the programming languages and PostgreSQL as the database management system, on an Apache web server and Linux operating system. The system is designed so that a super user inputs the items, each school administrator inputs the examinees' information, and examinees access the system. The examinee's score is displayed immediately after the examination, together with an item analysis. The set-up of the system, beginning with installation, is described. This may help medical professors to easily adopt an internet-based testing system for medical education.
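
    The "item analysis" mentioned here typically reduces to two per-item statistics: a difficulty index (proportion of examinees answering correctly) and a discrimination index. The record does not give the program's PHP internals, so the sketch below only illustrates those two computations on a made-up 0/1 response matrix in Python.

        import numpy as np

        # rows = examinees, columns = items; 1 = correct, 0 = incorrect (made-up data)
        rng = np.random.default_rng(1)
        responses = (rng.random((200, 10)) < np.linspace(0.3, 0.9, 10)).astype(float)

        difficulty = responses.mean(axis=0)      # proportion answering each item correctly

        total = responses.sum(axis=1)
        discrimination = np.empty(responses.shape[1])
        for j in range(responses.shape[1]):
            rest = total - responses[:, j]       # total score excluding item j (corrected point-biserial)
            discrimination[j] = np.corrcoef(responses[:, j], rest)[0, 1]

        for j, (p, d) in enumerate(zip(difficulty, discrimination), start=1):
            print(f"item {j:2d}: difficulty = {p:.2f}, discrimination = {d:.2f}")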

  16. Heavy ion beams from an Alphatross source for use in calibration and testing of diagnostics

    Science.gov (United States)

    Ward, R. J.; Brown, G. M.; Ho, D.; Stockler, B. F. O. F.; Freeman, C. G.; Padalino, S. J.; Regan, S. P.

    2016-10-01

    Ion beams from the 1.7 MV Pelletron Accelerator at SUNY Geneseo have been used to test and calibrate many inertial confinement fusion (ICF) diagnostics and high energy density physics (HEDP) diagnostics used at the Laboratory for Laser Energetics (LLE). The ion source on this accelerator, a radio-frequency (RF) alkali-metal charge exchange source called an Alphatross, is designed to produce beams of hydrogen and helium isotopes. There is interest in accelerating beams of carbon, oxygen, argon, and other heavy ions for use in testing several diagnostics, including the Time Resolved Tandem Faraday Cup (TRTF). The feasibility of generating these heavy ion beams using the Alphatross source will be reported. Small amounts of various gases are mixed into the helium plasma in the ion source bottle. A velocity selector is used to allow the desired ions to pass into the accelerator. As the heavy ions pass through the stripper canal of the accelerator, they emerge in a variety of charge states. The energy of the ion beam at the high-energy end of the accelerator will vary as a function of the charge state; however, the maximum energy deliverable to the target is limited by the maximum achievable magnetic field produced by the accelerator's steering magnet. This material is based upon work supported by the Department of Energy National Nuclear Security Administration under Award Number DE-NA0001944.
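
    The charge-state dependence of the final beam energy follows from the usual two-stage tandem relation: a singly charged negative ion gains e·V_T on the way up to the terminal and, after stripping to charge state q+, a further q·e·V_T on the way down, so E ≈ E_inj + (1 + q)·V_T in MeV per MV of terminal voltage. A minimal sketch follows; the 30 keV injection energy is an assumed illustrative value, not a figure from the abstract.

        # Final kinetic energy of a heavy ion from a tandem (Pelletron) accelerator
        # as a function of the post-stripping charge state q.
        V_T = 1.7        # terminal voltage, MV (1.7 MV Pelletron)
        E_inj = 0.030    # assumed injection energy, MeV (illustrative)

        for q in range(1, 7):                    # e.g. carbon charge states 1+ .. 6+
            E = E_inj + (1 + q) * V_T            # MeV: gain V_T as a 1- ion, then q*V_T after stripping
            print(f"q = {q}+ :  E ~ {E:.2f} MeV")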

  17. Trends in software testing

    CERN Document Server

    Mohanty, J; Balakrishnan, Arunkumar

    2017-01-01

    This book is focused on the advancements in the field of software testing and the innovative practices that the industry is adopting. Considering the widely varied nature of software testing, the book addresses contemporary aspects that are important for both academia and industry. There are dedicated chapters on seamless high-efficiency frameworks, automation of regression testing, software by search, and system evolution management. There are a host of mathematical models that are promising for software quality improvement by model-based testing. There are three chapters addressing this concern. Students and researchers in particular will find these chapters useful for their mathematical strength and rigor. Other topics covered include uncertainty in testing, software security testing, testing as a service, test technical debt (or test debt), disruption caused by digital advancement (social media, cloud computing, mobile application and data analytics), and challenges and benefits of outsourcing. The book w...

  18. ERP Reliability Analysis (ERA) Toolbox: An open-source toolbox for analyzing the reliability of event-related brain potentials.

    Science.gov (United States)

    Clayson, Peter E; Miller, Gregory A

    2017-01-01

    Generalizability theory (G theory) provides a flexible, multifaceted approach to estimating score reliability. G theory's approach to estimating score reliability has important advantages over classical test theory that are relevant for research using event-related brain potentials (ERPs). For example, G theory does not require parallel forms (i.e., equal means, variances, and covariances), can handle unbalanced designs, and provides a single reliability estimate for designs with multiple sources of error. This monograph provides a detailed description of the conceptual framework of G theory using examples relevant to ERP researchers, presents the algorithms needed to estimate ERP score reliability, and provides a detailed walkthrough of newly-developed software, the ERP Reliability Analysis (ERA) Toolbox, that calculates score reliability using G theory. The ERA Toolbox is open-source, Matlab software that uses G theory to estimate the contribution of the number of trials retained for averaging, group, and/or event types on ERP score reliability. The toolbox facilitates the rigorous evaluation of psychometric properties of ERP scores recommended elsewhere in this special issue. Copyright © 2016 Elsevier B.V. All rights reserved.
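
    The ERA Toolbox itself is MATLAB software; purely to illustrate the underlying idea, the Python sketch below estimates variance components for a single-facet persons × trials design from ANOVA mean squares and forms a generalizability coefficient, i.e. how score reliability depends on the number of trials retained for averaging. The data and the single-facet design are simplifying assumptions, not the toolbox's full model.

        import numpy as np

        # scores: persons (rows) x trials (columns), made-up data
        rng = np.random.default_rng(2)
        n_p, n_t = 40, 30
        true_person = rng.normal(0.0, 1.0, (n_p, 1))
        scores = true_person + rng.normal(0.0, 1.5, (n_p, n_t))

        grand = scores.mean()
        person_means = scores.mean(axis=1, keepdims=True)
        trial_means = scores.mean(axis=0, keepdims=True)

        # ANOVA mean squares for the crossed p x t design
        ms_p = n_t * ((person_means - grand) ** 2).sum() / (n_p - 1)
        ms_res = ((scores - person_means - trial_means + grand) ** 2).sum() / ((n_p - 1) * (n_t - 1))

        var_res = ms_res                           # sigma^2(pt,e): interaction + residual error
        var_p = max((ms_p - ms_res) / n_t, 0.0)    # sigma^2(p): true person variance

        for k in (1, 5, 10, 30):                   # number of trials retained for averaging
            g = var_p / (var_p + var_res / k)      # relative generalizability coefficient
            print(f"{k:2d} trials retained: G = {g:.2f}")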

  19. A simplified model of the source channel of the Leksell GammaKnife tested with PENELOPE

    OpenAIRE

    Al-Dweri, Feras M. O.; Lallena, Antonio M.; Vilches, Manuel

    2004-01-01

    Monte Carlo simulations using the code PENELOPE have been performed to test a simplified model of the source channel geometry of the Leksell GammaKnife®. The characteristics of the radiation passing through the treatment helmets are analysed in detail. We have found that only primary particles emitted from the source with polar angles smaller than 3° with respect to the beam axis are relevant for the dosimetry of the Gamma Knife. The photon trajectories reaching the out...

  20. A 14-MeV beam-plasma neutron source for materials testing

    International Nuclear Information System (INIS)

    Futch, A.H.; Coensgen, F.H.; Damm, C.C.; Molvik, A.W.

    1989-01-01

    The design and performance of 14-MeV beam-plasma neutron sources for accelerated testing of fusion reactor materials are described. 14-MeV neutron fluxes in the range of 5 to 10 MW/m² at the plasma surface are produced continuously by D-T reactions in a two-component plasma. In the present designs, 14-MeV neutrons result from collisions of energetic deuterium ions created by transverse injection of 150-keV deuterium atoms on a fully ionized tritium target plasma. The beam energy, which is deposited at the center of the tritium column, is transferred by electron drag to the warm plasma, which flows axially to the end regions. Neutral gas at high pressure absorbs the energy of the tritium plasma and transfers the heat to the walls of the vacuum vessel. The plasma parameters of the neutron source, in dimensionless units, have been achieved in the 2XIIB high-β plasma. The larger magnetic field of the present design permits scaling to the higher energy and density of the neutron source design. In the extrapolation, care has been taken to preserve the scaling and plasma attributes that contributed to equilibrium, magnetohydrodynamic (MHD) stability, and microstability in 2XIIB. The performance and scaling characteristics are described for several designs chosen to enhance the thermal isolation of the two-component plasmas. 11 refs., 3 figs., 3 tabs

  1. Standards and Methodological Rigor in Pulmonary Arterial Hypertension Preclinical and Translational Research.

    Science.gov (United States)

    Provencher, Steeve; Archer, Stephen L; Ramirez, F Daniel; Hibbert, Benjamin; Paulin, Roxane; Boucherat, Olivier; Lacasse, Yves; Bonnet, Sébastien

    2018-03-30

    Despite advances in our understanding of the pathophysiology and the management of pulmonary arterial hypertension (PAH), significant therapeutic gaps remain for this devastating disease. Yet, few innovative therapies beyond the traditional pathways of endothelial dysfunction have reached clinical trial phases in PAH. Although there are inherent limitations of the currently available models of PAH, the leaky pipeline of innovative therapies relates, in part, to flawed preclinical research methodology, including lack of rigour in trial design, incomplete invasive hemodynamic assessment, and lack of careful translational studies that replicate randomized controlled trials in humans with attention to adverse effects and benefits. Rigorous methodology should include the use of prespecified eligibility criteria, sample sizes that permit valid statistical analysis, randomization, blinded assessment of standardized outcomes, and transparent reporting of results. Better design and implementation of preclinical studies can minimize inherent flaws in the models of PAH, reduce the risk of bias, and enhance external validity and our ability to distinguish truly promising therapies from many false-positive or overstated leads. Ideally, preclinical studies should use advanced imaging, study several preclinical pulmonary hypertension models, or correlate rodent and human findings and consider the fate of the right ventricle, which is the major determinant of prognosis in human PAH. Although these principles are widely endorsed, empirical evidence suggests that such rigor is often lacking in pulmonary hypertension preclinical research. The present article discusses the pitfalls in the design of preclinical pulmonary hypertension trials and discusses opportunities to create preclinical trials with improved predictive value in guiding early-phase drug development in patients with PAH, which will need support not only from researchers, peer reviewers, and editors but also from

  2. Ion beam pellet fusion as a CTR neutron test source

    International Nuclear Information System (INIS)

    Arnold, R.; Martin, R.

    1975-07-01

    Pellet fusion, driven by nanosecond pulses containing α particles with 200 MeV energy, is being developed as a neutron source. A prototype system is in the conceptual design stage. During the coming year, engineering design of required accelerator components, storage rings, and pellet configurations, as well as experiments on energy deposition mechanisms, should be accomplished. Successful construction and tests of prototype rings, followed by two years of full scale system construction, would give a source producing a useful flux of fusion neutrons for materials testing. The system as currently envisioned would employ 100 small superconducting high field storage rings (15 cm radius, 140 kG field) which would be synchronously filled with circulating 1 nsec pulses from a 200 MeV linear accelerator over a period of 3 × 10⁻⁴ sec. These ion pulses would all be simultaneously extracted, forming a total current of 10 kA, and focussed from all directions on a deuterium and tritium (DT) pellet with 0.17 mm radius, surrounded by a heavier (metal) coating to increase confinement time and aid compression efficiency. The overall repetition rate, limited principally by physical transport of the pellets, could reach 100/sec. Spacing between pellet and focussing elements would be about 1 m. The predominant engineering problems are the fast extraction mechanism and beam transport devices for the storage rings. Additional theoretical and experimental studies are required on the crucial energy deposition and transport mechanisms in pellets with ion beam heating before firm estimates can be given. Preliminary estimates suggest fusion neutron yields of at least 10¹⁴/sec and possibly 10¹⁶/sec are possible, with optimal pellet dynamics, but without the necessity for any large advances in the state-of-the-art in accelerator and storage ring design. (auth)
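
    A quick consistency check of the quoted ring parameters: the bending radius of a 200 MeV α particle follows from its magnetic rigidity Bρ = pc/(qc). The sketch below uses standard relativistic kinematics with rounded constants and reproduces a radius of roughly 15 cm at a 140 kG (14 T) field.

        import math

        m_alpha = 3727.38      # alpha-particle rest energy, MeV
        T = 200.0              # kinetic energy, MeV
        q = 2                  # charge state (fully stripped He)
        B = 14.0               # dipole field, tesla (140 kG)

        E = T + m_alpha                         # total energy, MeV
        pc = math.sqrt(E**2 - m_alpha**2)       # momentum * c, MeV
        brho = pc / (299.792458 * q)            # magnetic rigidity, T*m (pc[MeV] / (299.79 * q))
        rho = brho / B                          # bending radius, m

        print(f"pc    = {pc:.1f} MeV")
        print(f"B*rho = {brho:.3f} T*m")
        print(f"rho   = {rho*100:.1f} cm")      # ~15 cm, consistent with the ring radius quoted above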

  3. Testing the Standard Model

    CERN Document Server

    Riles, K

    1998-01-01

    The Large Electron-Positron collider (LEP) near Geneva, more than any other instrument, has rigorously tested the predictions of the Standard Model of elementary particles. LEP measurements have probed the theory from many different directions and, so far, the Standard Model has prevailed. The rigour of these tests has allowed LEP physicists to determine unequivocally the number of fundamental 'generations' of elementary particles. These tests also allowed physicists to ascertain the mass of the top quark in advance of its discovery. Recent increases in the accelerator's energy allow new measurements to be undertaken, measurements that may uncover directly or indirectly the long-sought Higgs particle, believed to impart mass to all other particles.

  4. Is Collaborative, Community-Engaged Scholarship More Rigorous than Traditional Scholarship? On Advocacy, Bias, and Social Science Research

    Science.gov (United States)

    Warren, Mark R.; Calderón, José; Kupscznk, Luke Aubry; Squires, Gregory; Su, Celina

    2018-01-01

    Contrary to the charge that advocacy-oriented research cannot meet social science research standards because it is inherently biased, the authors of this article argue that collaborative, community-engaged scholarship (CCES) must meet high standards of rigor if it is to be useful to support equity-oriented, social justice agendas. In fact, they…

  5. Retrieval Can Increase or Decrease Suggestibility Depending on How Memory Is Tested: The Importance of Source Complexity

    Science.gov (United States)

    Chan, Jason C. K.; Wilford, Miko M.; Hughes, Katharine L.

    2012-01-01

    Taking an intervening test between learning episodes can enhance later source recollection. Paradoxically, testing can also increase people's susceptibility to the misinformation effect--a finding termed retrieval-enhanced suggestibility (RES, Chan, Thomas, & Bulevich, 2009). We conducted three experiments to examine this apparent contradiction.…

  6. Guidelines for conducting rigorous health care psychosocial cross-cultural/language qualitative research.

    Science.gov (United States)

    Arriaza, Pablo; Nedjat-Haiem, Frances; Lee, Hee Yun; Martin, Shadi S

    2015-01-01

    The purpose of this article is to synthesize and chronicle the authors' experiences as four bilingual and bicultural researchers, each experienced in conducting cross-cultural/cross-language qualitative research. Through narrative descriptions of experiences with Latinos, Iranians, and Hmong refugees, the authors discuss their rewards, challenges, and methods of enhancing rigor, trustworthiness, and transparency when conducting cross-cultural/cross-language research. The authors discuss and explore how to effectively manage cross-cultural qualitative data, how to effectively use interpreters and translators, how to identify best methods of transcribing data, and the role of creating strong community relationships. The authors provide guidelines for health care professionals to consider when engaging in cross-cultural qualitative research.

  7. Reconsideration of the sequence of rigor mortis through postmortem changes in adenosine nucleotides and lactic acid in different rat muscles.

    Science.gov (United States)

    Kobayashi, M; Takatori, T; Iwadate, K; Nakajima, M

    1996-10-25

    We examined the changes in adenosine triphosphate (ATP), lactic acid, adenosine diphosphate (ADP) and adenosine monophosphate (AMP) in five different rat muscles after death. Rigor mortis has been thought to develop at the same rate in all muscles after death and hence to become complete in small muscles sooner than in large muscles. In this study we found that the rate of decrease in ATP was significantly different in each muscle. The greatest drop in ATP was observed in the masseter muscle. These findings contradict the conventional theory of rigor mortis. Similarly, the rates of change in ADP and lactic acid, which are thought to be related to the consumption or production of ATP, were different in each muscle. However, the rate of change of AMP was the same in each muscle.

  8. Testing Hubbert

    International Nuclear Information System (INIS)

    Brandt, Adam R.

    2007-01-01

    The Hubbert theory of oil depletion, which states that oil production in large regions follows a bell-shaped curve over time, has been cited as a method to predict the future of global oil production. However, the assumptions of the Hubbert method have never been rigorously tested with a large, publicly available data set. In this paper, three assumptions of the modern Hubbert theory are tested using data from 139 oil producing regions. These regions are sub-national (United States state-level, United States regional-level), national, and multi-national (subcontinental and continental) in scale. We test the assumption that oil production follows a bell-shaped curve by generating best-fitting curves for each region using six models and comparing the quality of fit across models. We also test the assumptions that production over time in a region tends to be symmetric, and that production is more bell-shaped in larger regions than in smaller regions
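
    The bell-shaped production profile referred to above is usually written as the time derivative of a logistic cumulative-production curve, P(t) = U k e^{-k(t-t_m)} / (1 + e^{-k(t-t_m)})², with ultimate recovery U, steepness k and peak year t_m. The sketch below fits that single symmetric form to synthetic data; the paper itself compares six models on real regional data, which this does not attempt to reproduce.

        import numpy as np
        from scipy.optimize import curve_fit

        def hubbert(t, U, k, tm):
            """Derivative of a logistic curve: annual production for ultimate recovery U."""
            e = np.exp(-k * (t - tm))
            return U * k * e / (1.0 + e) ** 2

        # synthetic 'production history' with multiplicative noise (illustrative only)
        years = np.arange(1900, 2001)
        rng = np.random.default_rng(3)
        true = hubbert(years, U=200.0, k=0.08, tm=1970.0)
        data = true * (1.0 + 0.1 * rng.normal(size=years.size))

        popt, _ = curve_fit(hubbert, years, data, p0=(150.0, 0.05, 1965.0))
        U_fit, k_fit, tm_fit = popt
        print(f"fitted U = {U_fit:.0f}, k = {k_fit:.3f}, peak year = {tm_fit:.1f}")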

  9. Development of the front end test stand and vessel for extraction and source plasma analyses negative hydrogen ion sources at the Rutherford Appleton Laboratory

    Energy Technology Data Exchange (ETDEWEB)

    Lawrie, S. R., E-mail: scott.lawrie@stfc.ac.uk [STFC ISIS Pulsed Spallation Neutron and Muon Facility, Rutherford Appleton Laboratory, Harwell Oxford, Harwell (United Kingdom); John Adams Institute of Accelerator Science, University of Oxford, Oxford (United Kingdom); Faircloth, D. C.; Letchford, A. P.; Perkins, M.; Whitehead, M. O.; Wood, T. [STFC ISIS Pulsed Spallation Neutron and Muon Facility, Rutherford Appleton Laboratory, Harwell Oxford, Harwell (United Kingdom); Gabor, C. [ASTeC Intense Beams Group, Rutherford Appleton Laboratory, Harwell Oxford, Harwell (United Kingdom); Back, J. [High Energy Physics Department, University of Warwick, Coventry (United Kingdom)

    2014-02-15

    The ISIS pulsed spallation neutron and muon facility at the Rutherford Appleton Laboratory (RAL) in the UK uses a Penning surface plasma negative hydrogen ion source. Upgrade options for the ISIS accelerator system demand a higher current, lower emittance beam with longer pulse lengths from the injector. The Front End Test Stand is being constructed at RAL to meet the upgrade requirements using a modified ISIS ion source. A new 10% duty cycle 25 kV pulsed extraction power supply has been commissioned and the first meter of 3 MeV radio frequency quadrupole has been delivered. Simultaneously, a Vessel for Extraction and Source Plasma Analyses is under construction in a new laboratory at RAL. The detailed measurements of the plasma and extracted beam characteristics will allow a radical overhaul of the transport optics, potentially yielding a simpler source configuration with greater output and lifetime.

  10. Metrological tests of a 200 L calibration source for HPGE detector systems for assay of radioactive waste drums.

    Science.gov (United States)

    Boshkova, T; Mitev, K

    2016-03-01

    In this work we present test procedures, approval criteria and results from two metrological inspections of a certified large-volume ¹⁵²Eu source (a drum of about 200 L) intended for calibration of HPGe gamma assay systems used for activity measurement of radioactive waste drums. The aim of the inspections was to prove the stability of the calibration source during its working life. The large volume source was designed and produced in 2007. It consists of 448 identical sealed radioactive sources (modules) apportioned in 32 transparent plastic tubes which were placed in a wooden matrix which filled the drum. During the inspections the modules were subjected to tests for verification of their certified characteristics. The results show a perfect compliance with the NIST basic guidelines for the properties of a radioactive certified reference material (CRM) and demonstrate the stability of the large volume CRM-drum after 7 years of operation. Copyright © 2015 Elsevier Ltd. All rights reserved.

  11. Useful, Used, and Peer Approved: The Importance of Rigor and Accessibility in Postsecondary Research and Evaluation. WISCAPE Viewpoints

    Science.gov (United States)

    Vaade, Elizabeth; McCready, Bo

    2012-01-01

    Traditionally, researchers, policymakers, and practitioners have perceived a tension between rigor and accessibility in quantitative research and evaluation in postsecondary education. However, this study indicates that both producers and consumers of these studies value high-quality work and clear findings that can reach multiple audiences. The…

  12. Improved rigorous upper bounds for transport due to passive advection described by simple models of bounded systems

    International Nuclear Information System (INIS)

    Kim, Chang-Bae; Krommes, J.A.

    1988-08-01

    The work of Krommes and Smith on rigorous upper bounds for the turbulent transport of a passively advected scalar [Ann. Phys. 177:246 (1987)] is extended in two directions: (1) For their "reference model," improved upper bounds are obtained by utilizing more sophisticated two-time constraints which include the effects of cross-correlations up to fourth order. Numerical solutions of the model stochastic differential equation are also obtained; they show that the new bounds compare quite favorably with the exact results, even at large Reynolds and Kubo numbers. (2) The theory is extended to take account of a finite spatial autocorrelation length L_c. As a reasonably generic example, the problem of particle transport due to statistically specified stochastic magnetic fields in a collisionless turbulent plasma is revisited. A bound is obtained which reduces for small L_c to the quasilinear limit and for large L_c to the strong turbulence limit, and which provides a reasonable and rigorous interpolation for intermediate values of L_c. 18 refs., 6 figs

  13. Proof testing of CANDU concrete containment structures

    International Nuclear Information System (INIS)

    Pandey, M.D.

    1996-05-01

    Prior to commissioning of a CANDU reactor, a proof pressure test is required to demonstrate the structural integrity of the containment envelope. The test pressure specified by AECB Regulatory Document R-7 (1991) was selected without a rigorous consideration of uncertainties associated with estimates of accident pressure and containment resistance. This study was undertaken to develop a reliability-based philosophy for defining proof testing requirements that are consistent with the current limit states design code for concrete containments (CSA N287.3). It was shown that the updated probability of failure after a successful test is always less than the original estimate
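
    The closing statement — that the updated failure probability after a successful proof test is never larger than the prior estimate — is a standard consequence of conditioning the resistance distribution on survival of the proof pressure. A minimal Monte Carlo sketch of that argument follows; the lognormal parameters are arbitrary illustrative choices, not values from R-7 or CSA N287.3.

        import numpy as np

        rng = np.random.default_rng(4)
        n = 1_000_000

        # illustrative distributions (arbitrary parameters): containment resistance and accident pressure
        resistance = rng.lognormal(mean=np.log(600.0), sigma=0.15, size=n)   # kPa
        demand = rng.lognormal(mean=np.log(300.0), sigma=0.30, size=n)       # kPa
        proof_pressure = 400.0                                               # kPa

        pf_prior = np.mean(resistance < demand)

        survived = resistance > proof_pressure            # condition on a successful proof test
        pf_post = np.mean(resistance[survived] < demand[survived])

        print(f"prior   P(failure) = {pf_prior:.2e}")
        print(f"updated P(failure) = {pf_post:.2e}   (<= prior, as expected)")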

  14. Identification of Noise Sources During Rocket Engine Test Firings and a Rocket Launch Using a Microphone Phased-Array

    Science.gov (United States)

    Panda, Jayanta; Mosher, Robert N.; Porter, Barry J.

    2013-01-01

    A 70-microphone, 10-foot by 10-foot phased array was built for use in the harsh environment of rocket launches. The array was set up at NASA Wallops launch pad 0A during a static test firing of Orbital Sciences' Antares engines, and again during the first launch of the Antares vehicle. It was placed 400 feet away from the pad, and was hoisted on a scissor lift 40 feet above ground. The data sets provided unprecedented insight into rocket noise sources. The duct exit was found to be the primary source during the static test firing; the large amount of water injected beneath the nozzle exit and inside the plume duct quenched all other sources. The maps of the noise sources during launch were found to be time-dependent. As the engines came to full power and became louder, the primary source switched from the duct inlet to the duct exit. Further elevation of the vehicle caused spilling of the hot plume, resulting in a distributed noise map covering most of the pad. As the entire plume emerged from the duct, and the on-deck water system came to full power, the plume itself became the loudest noise source. These maps of the noise sources provide vital insight for optimization of sound suppression systems for future Antares launches.

  15. Introduction to an Open Source Internet-Based Testing Program for Medical Student Examinations

    Directory of Open Access Journals (Sweden)

    Yoon-Hwan Lee

    2009-12-01

    Full Text Available The author developed a freely available open source internet-based testing program for medical examinations. PHP and JavaScript were used as the programming languages and PostgreSQL as the database management system, on an Apache web server and Linux operating system. The system is designed so that a super user inputs the items, each school administrator inputs the examinees’ information, and examinees access the system. The examinee’s score is displayed immediately after the examination, together with an item analysis. The set-up of the system, beginning with installation, is described. This may help medical professors to easily adopt an internet-based testing system for medical education.

  16. 8th RILEM International Symposium on Testing and Characterization of Sustainable and Innovative Bituminous Materials

    CERN Document Server

    Partl, Manfred

    2016-01-01

    This work presents the results of RILEM TC 237-SIB (Testing and characterization of sustainable innovative bituminous materials and systems). The papers have been selected for publication after a rigorous peer review process and will be an invaluable source to outline and clarify the main directions of present and future research and standardization for bituminous materials and pavements. The following topics are covered: - Characterization of binder-aggregate interaction - Innovative testing of bituminous binders, additives and modifiers - Durability and aging of asphalt pavements - Mixture design and compaction analysis - Environmentally sustainable materials and technologies - Advances in laboratory characterization of bituminous materials - Modeling of road materials and pavement performance prediction - Field measurement and in-situ characterization - Innovative materials for reinforcement and interlayer systems - Cracking and damage characterization of asphalt pavements - Rec...

  17. Compact X-ray source at STF (Super Conducting Accelerator Test Facility)

    International Nuclear Information System (INIS)

    Urakawa, J

    2012-01-01

    KEK-STF is a superconducting linear accelerator test facility for developing accelerator technologies for the ILC (International Linear Collider). Development of advanced accelerator technologies at STF for a compact high-brightness X-ray source is supported by the Japanese Ministry of Education, Culture, Sports, Science and Technology (MEXT). Since we are required to demonstrate the generation of high-brightness X-rays based on inverse Compton scattering, using superconducting linear accelerator and laser storage cavity technologies, by October of next year (2012), the design has been fixed and the installation of accelerator components is under way. The necessary technology developments and the planned experiment are explained.
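
    For orientation only: in the Thomson limit, a head-on inverse-Compton collision backscatters a laser photon of energy E_L up to roughly 4γ²E_L. The electron beam energy and laser wavelength below are assumed illustrative values, not parameters quoted in the abstract.

        import math

        m_e = 0.511e6                 # electron rest energy, eV
        E_beam = 40e6                 # assumed electron beam energy, eV (illustrative)
        lambda_laser = 1064e-9        # assumed laser wavelength, m (illustrative)

        h_c = 1239.84e-9              # h*c in eV*m
        E_laser = h_c / lambda_laser  # laser photon energy, eV (~1.17 eV)

        gamma = E_beam / m_e
        E_x_max = 4 * gamma**2 * E_laser   # maximum backscattered photon energy (Thomson limit), eV

        print(f"gamma = {gamma:.0f}, E_laser = {E_laser:.2f} eV")
        print(f"max X-ray energy ~ {E_x_max/1e3:.1f} keV")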

  18. Analysis of Earthquake Catalogs for CSEP Testing Region Italy

    International Nuclear Information System (INIS)

    Peresan, A.; Romashkova, L.; Nekrasova, A.; Kossobokov, V.; Panza, G.F.

    2010-07-01

    A comprehensive analysis shows that the set of catalogs provided by the Istituto Nazionale di Geofisica e Vulcanologia (INGV, Italy) as the authoritative database for the Collaboratory for the Study of Earthquake Predictability - Testing Region Italy (CSEP-TRI), is hardly a unified one acceptable for the necessary tuning of models/algorithms, as well as for running rigorous prospective predictability tests at intermediate- or long-term scale. (author)

  19. Facility for fast neutron irradiation tests of electronics at the ISIS spallation neutron source

    International Nuclear Information System (INIS)

    Andreani, C.; Pietropaolo, A.; Salsano, A.; Gorini, G.; Tardocchi, M.; Paccagnella, A.; Gerardin, S.; Frost, C. D.; Ansell, S.; Platt, S. P.

    2008-01-01

    The VESUVIO beam line at the ISIS spallation neutron source was set up for neutron irradiation tests in the neutron energy range above 10 MeV. The neutron flux and energy spectrum were shown, in benchmark activation measurements, to provide a neutron spectrum similar to the ambient one at sea level, but with an enhancement in intensity of a factor of 10⁷. Such conditions are suitable for accelerated testing of electronic components, as was demonstrated here by measurements of soft error rates in recent-technology field-programmable gate arrays

  20. Design, manufacture and factory testing of the Ion Source and Extraction Power Supplies for the SPIDER experiment

    International Nuclear Information System (INIS)

    Bigi, Marco; Rinaldi, Luigi; Simon, Muriel; Sita, Luca; Taddia, Giuseppe; Carrozza, Saverino; Decamps, Hans; Luchetta, Adriano; Meddour, Abdelraouf; Moressa, Modesto; Morri, Cristiano; Musile Tanzi, Antonio; Recchia, Mauro; Wagner, Uwe; Zamengo, Andrea; Toigo, Vanni

    2015-01-01

    Highlights: • 5 MVA ion source power supplies effectively integrated in 150 m² Faraday cage. • Load protection and performance requirements met of custom design high voltage power supplies. • 200 kW tetrode oscillator with 200 kHz frequency range successfully tested. - Abstract: The SPIDER experiment, currently under construction at the Neutral Beam Test Facility in Padua, Italy, is a full-size prototype of the ion source for the ITER Neutral Beam Injectors. The Ion Source and Extraction Power Supplies (ISEPS) for SPIDER are supplied by OCEM Energy Technology s.r.l. (OCEM) under a procurement contract with Fusion for Energy (F4E) covering also the units required for MITICA and ITER injectors. The detailed design of SPIDER ISEPS was finalized in 2011 and manufacture of most components completed by end 2013. The Factory Acceptance Tests took place early 2014. ISEPS, with an overall power rating of 5 MVA, form a heterogeneous set of items including solid state power converters and 1 MHz radiofrequency generators of 200 kW output power. The paper presents the main features of the detailed design developed by OCEM, focusing in particular on the high output voltage pulse step modulators, the high output current resonant converters, the radiofrequency generators by HIMMELWERK GmbH and the architecture and implementation of the complex control system. Details are given on non-standard factory tests verifying the insulation requirements specific to this application. Performance of ISEPS during the factory acceptance tests is described, with emphasis on demonstration of the load protection requirements, a crucial point for all neutral beam power supplies. Finally, key dates of SPIDER ISEPS installation and site testing schedule are provided.

  1. Design, manufacture and factory testing of the Ion Source and Extraction Power Supplies for the SPIDER experiment

    Energy Technology Data Exchange (ETDEWEB)

    Bigi, Marco, E-mail: marco.bigi@igi.cnr.it [Consorzio RFX (CNR, ENEA, INFN, Università di Padova, Acciaierie Venete SpA), Corso Stati Uniti 4, 35127 Padova (Italy); Rinaldi, Luigi [OCEM Energy Technology, Via della Solidarietà 2/1, 40056 Valsamoggia (località Crespellano), Bologna (Italy); Simon, Muriel [Fusion for Energy, Josep Pla 2, 08019 Barcelona (Spain); Sita, Luca; Taddia, Giuseppe; Carrozza, Saverino [OCEM Energy Technology, Via della Solidarietà 2/1, 40056 Valsamoggia (località Crespellano), Bologna (Italy); Decamps, Hans [ITER Organization, Route de Vinon sur Verdon, 13115 Saint Paul lez Durance (France); Luchetta, Adriano [Consorzio RFX (CNR, ENEA, INFN, Università di Padova, Acciaierie Venete SpA), Corso Stati Uniti 4, 35127 Padova (Italy); Meddour, Abdelraouf [HIMMELWERK Hoch- und Mittelfrequenzanlagen GmbH, Jopestr. 10, 72072 Tübingen (Germany); Moressa, Modesto [Consorzio RFX (CNR, ENEA, INFN, Università di Padova, Acciaierie Venete SpA), Corso Stati Uniti 4, 35127 Padova (Italy); Morri, Cristiano; Musile Tanzi, Antonio [OCEM Energy Technology, Via della Solidarietà 2/1, 40056 Valsamoggia (località Crespellano), Bologna (Italy); Recchia, Mauro [Consorzio RFX (CNR, ENEA, INFN, Università di Padova, Acciaierie Venete SpA), Corso Stati Uniti 4, 35127 Padova (Italy); Wagner, Uwe [HIMMELWERK Hoch- und Mittelfrequenzanlagen GmbH, Jopestr. 10, 72072 Tübingen (Germany); Zamengo, Andrea; Toigo, Vanni [Consorzio RFX (CNR, ENEA, INFN, Università di Padova, Acciaierie Venete SpA), Corso Stati Uniti 4, 35127 Padova (Italy)

    2015-10-15

    Highlights: • 5 MVA ion source power supplies effectively integrated in 150 m² Faraday cage. • Load protection and performance requirements met of custom design high voltage power supplies. • 200 kW tetrode oscillator with 200 kHz frequency range successfully tested. - Abstract: The SPIDER experiment, currently under construction at the Neutral Beam Test Facility in Padua, Italy, is a full-size prototype of the ion source for the ITER Neutral Beam Injectors. The Ion Source and Extraction Power Supplies (ISEPS) for SPIDER are supplied by OCEM Energy Technology s.r.l. (OCEM) under a procurement contract with Fusion for Energy (F4E) covering also the units required for MITICA and ITER injectors. The detailed design of SPIDER ISEPS was finalized in 2011 and manufacture of most components completed by end 2013. The Factory Acceptance Tests took place early 2014. ISEPS, with an overall power rating of 5 MVA, form a heterogeneous set of items including solid state power converters and 1 MHz radiofrequency generators of 200 kW output power. The paper presents the main features of the detailed design developed by OCEM, focusing in particular on the high output voltage pulse step modulators, the high output current resonant converters, the radiofrequency generators by HIMMELWERK GmbH and the architecture and implementation of the complex control system. Details are given on non-standard factory tests verifying the insulation requirements specific to this application. Performance of ISEPS during the factory acceptance tests is described, with emphasis on demonstration of the load protection requirements, a crucial point for all neutral beam power supplies. Finally, key dates of SPIDER ISEPS installation and site testing schedule are provided.

  2. A study into first-year engineering education success using a rigorous mixed methods approach

    DEFF Research Database (Denmark)

    van den Bogaard, M.E.D.; de Graaff, Erik; Verbraek, Alexander

    2015-01-01

    The aim of this paper is to combine qualitative and quantitative research methods into rigorous research into student success. Research methods have weaknesses that can be overcome by clever combinations. In this paper we use a situated study into student success as an example of how methods...... using statistical techniques. The main elements of the model were student behaviour and student disposition, which were influenced by the students’ perceptions of the education environment. The outcomes of the qualitative studies were useful in interpreting the outcomes of the structural equation...

  3. Open source non-invasive prenatal testing platform and its performance in a public health laboratory

    DEFF Research Database (Denmark)

    Johansen, Peter; Richter, Stine R; Balslev-Harder, Marie

    2016-01-01

    OBJECTIVE: The objective of this study was to introduce non-invasive prenatal testing (NIPT) for fetal autosomal trisomies and gender in a Danish public health setting, using semi-conductor sequencing and published open source scripts for analysis. METHODS: Plasma-derived DNA from a total of 375...... correlation (R² = 0.72) to Y-chromosomal content of the male fetus samples. DISCUSSION: We have implemented NIPT into Danish health care using published open source scripts for autosomal aneuploidy detection and fetal DNA fraction estimation showing excellent false negative and false positive rates. Seq...

  4. The development of an Infrared Environmental System for TOPEX Solar Panel Testing

    Science.gov (United States)

    Noller, E.

    1994-01-01

    Environmental testing and flight qualification of the TOPEX/POSEIDON spacecraft solar panels were performed with infrared (IR) lamps and a control system that were newly designed and integrated. The basic goal was more rigorous testing of the costly panels' new composite-structure design without jeopardizing their safety. The technique greatly reduces the costs and high risks of testing flight solar panels.

  5. Concrete ensemble Kalman filters with rigorous catastrophic filter divergence.

    Science.gov (United States)

    Kelly, David; Majda, Andrew J; Tong, Xin T

    2015-08-25

    The ensemble Kalman filter and ensemble square root filters are data assimilation methods used to combine high-dimensional, nonlinear dynamical models with observed data. Ensemble methods are indispensable tools in science and engineering and have enjoyed great success in geophysical sciences, because they allow for computationally cheap low-ensemble-state approximation for extremely high-dimensional turbulent forecast models. From a theoretical perspective, the dynamical properties of these methods are poorly understood. One of the central mysteries is the numerical phenomenon known as catastrophic filter divergence, whereby ensemble-state estimates explode to machine infinity, despite the true state remaining in a bounded region. In this article we provide a breakthrough insight into the phenomenon, by introducing a simple and natural forecast model that transparently exhibits catastrophic filter divergence under all ensemble methods and a large set of initializations. For this model, catastrophic filter divergence is not an artifact of numerical instability, but rather a true dynamical property of the filter. The divergence is not only validated numerically but also proven rigorously. The model cleanly illustrates mechanisms that give rise to catastrophic divergence and confirms intuitive accounts of the phenomena given in past literature.
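
    For readers unfamiliar with the method under discussion, a bare-bones analysis step of the (perturbed-observation) ensemble Kalman filter looks like the sketch below; it is a generic textbook form, not the specific forecast model studied in the paper.

        import numpy as np

        def enkf_analysis(X, y, H, R, rng):
            """One stochastic EnKF analysis step.
            X : (n, N) forecast ensemble, y : (m,) observation,
            H : (m, n) observation operator, R : (m, m) observation-error covariance."""
            n, N = X.shape
            Xm = X.mean(axis=1, keepdims=True)
            A = X - Xm                                         # ensemble anomalies
            P = A @ A.T / (N - 1)                              # sample forecast covariance
            S = H @ P @ H.T + R
            K = P @ H.T @ np.linalg.solve(S, np.eye(len(y)))   # Kalman gain
            Y = y[:, None] + rng.multivariate_normal(np.zeros(len(y)), R, size=N).T  # perturbed obs
            return X + K @ (Y - H @ X)                         # analysis ensemble

        rng = np.random.default_rng(5)
        n, m, N = 3, 2, 20
        X = rng.normal(size=(n, N))
        H = np.eye(m, n)
        R = 0.1 * np.eye(m)
        y = np.array([1.0, -0.5])
        Xa = enkf_analysis(X, y, H, R, rng)
        print("analysis ensemble mean:", Xa.mean(axis=1))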

  6. PRO development: rigorous qualitative research as the crucial foundation.

    Science.gov (United States)

    Lasch, Kathryn Eilene; Marquis, Patrick; Vigneux, Marc; Abetz, Linda; Arnould, Benoit; Bayliss, Martha; Crawford, Bruce; Rosa, Kathleen

    2010-10-01

    Recently published articles have described criteria to assess qualitative research in the health field in general, but very few articles have delineated qualitative methods to be used in the development of Patient-Reported Outcomes (PROs). In fact, how PROs are developed with subject input through focus groups and interviews has been given relatively short shrift in the PRO literature when compared to the plethora of quantitative articles on the psychometric properties of PROs. If documented at all, most PRO validation articles give little for the reader to evaluate the content validity of the measures and the credibility and trustworthiness of the methods used to develop them. Increasingly, however, scientists and authorities want to be assured that PRO items and scales have meaning and relevance to subjects. This article was developed by an international, interdisciplinary group of psychologists, psychometricians, regulatory experts, a physician, and a sociologist. It presents rigorous and appropriate qualitative research methods for developing PROs with content validity. The approach described combines an overarching phenomenological theoretical framework with grounded theory data collection and analysis methods to yield PRO items and scales that have content validity.

  7. Source Test Report for the 205 Delayed Coking Unit Drum 205-1201 and Drum 205-1202 Depressurization Vents (Marathon Petroleum Company LLC)

    Science.gov (United States)

    The 2010 Source Test was performed during the atmospheric depressurization step of the delayed coking process prior to the removal of petroleum coke from the coke drum. The 205 DCU was operated under a variety of conditions during the 2010 Source Test.

  8. Determination of the direction to a source of antineutrinos via inverse beta decay in Double Chooz

    Science.gov (United States)

    Nikitenko, Ya.

    2016-11-01

    Determining the direction to a source of neutrinos (and antineutrinos) is an important problem in the physics of supernovae and of the Earth. The direction to a source of antineutrinos can be estimated through the reaction of inverse beta decay. We show that the reactor neutrino experiment Double Chooz has unique capabilities to study the antineutrino signal from point-like sources. Contemporary experimental data on antineutrino directionality are presented. A rigorous mathematical approach for neutrino direction studies has been developed. Exact expressions have been obtained for the precision of the simple mean estimator of the neutrino direction, for normal and exponential distributions, both for a finite sample and in the limiting case of many events.

  9. Designing and testing a wearable, wireless fNIRS patch.

    Science.gov (United States)

    Abtahi, Mohammadreza; Cay, Gozde; Saikia, Manob Jyoti; Mankodiya, Kunal

    2016-08-01

    Optical brain monitoring using near infrared (NIR) light has attracted considerable attention as a way to study the complexity of the brain, owing to several advantages over other methods such as EEG, fMRI and PET. A few functional NIR spectroscopy (fNIRS) brain monitoring systems are commercially available, but they are still non-wearable and make it difficult to scan the brain while participants are in motion. In this work, we present our efforts to design and test a low-cost, wireless fNIRS patch using NIR light sources at wavelengths of 770 and 830 nm, photodetectors, and a microcontroller to trigger the light sources, read the photodetector outputs, and transfer data wirelessly (via Bluetooth) to a smartphone. The patch is essentially a 3-D printed wearable system that records and displays the brain hemodynamic responses on a smartphone, and it also eliminates the need for complicated wiring of electrodes. We have performed rigorous lab experiments on the presented system to verify its functionality. In a proof-of-concept experiment, the patch detected the NIR absorption on the arm. Another experiment revealed that the patch's battery could last up to several hours of continuous fNIRS recording with and without wireless data transfer.
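
    Converting the two-wavelength intensity readings of such a patch into hemodynamic signals is normally done with the modified Beer–Lambert law: the change in optical density at each wavelength is a weighted sum of oxy- and deoxy-hemoglobin concentration changes. The sketch below solves that 2×2 system; the extinction coefficients, path length, DPF and intensity values are placeholders to be replaced with tabulated and measured values, and this is not code from the described system.

        import numpy as np

        # Modified Beer-Lambert law for a two-wavelength (770 / 830 nm) fNIRS channel:
        # delta_OD = E @ [dHbO, dHbR] * d * DPF  ->  solve for the concentration changes.

        # placeholder extinction coefficients [1/(mM*cm)]; rows = (770, 830) nm,
        # columns = (HbO, HbR). Replace with tabulated values from the literature.
        E = np.array([[0.6, 1.1],
                      [1.0, 0.7]])

        d = 3.0       # source-detector separation, cm (assumed)
        DPF = 6.0     # differential pathlength factor (assumed)

        I_baseline = np.array([1.00, 1.00])   # detected intensities at rest (arbitrary units)
        I_task = np.array([0.97, 0.95])       # detected intensities during activation

        delta_OD = np.log10(I_baseline / I_task)        # change in optical density per wavelength
        conc = np.linalg.solve(E * d * DPF, delta_OD)   # [dHbO, dHbR] in mM

        print(f"dHbO = {conc[0]*1e3:+.3f} uM, dHbR = {conc[1]*1e3:+.3f} uM")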

  10. Performance test of electron cyclotron resonance ion sources for the Hyogo Ion Beam Medical Center

    Science.gov (United States)

    Sawada, K.; Sawada, J.; Sakata, T.; Uno, K.; Okanishi, K.; Harada, H.; Itano, A.; Higashi, A.; Akagi, T.; Yamada, S.; Noda, K.; Torikoshi, M.; Kitagawa, A.

    2000-02-01

    Two electron cyclotron resonance (ECR) ion sources were manufactured for the accelerator facility at the Hyogo Ion Beam Medical Center. H2+, He2+, and C4+ were chosen as the accelerating ions because they have the highest charge to mass ratio among ion states which satisfy the required intensity and quality. The sources have the same structure as the 10 GHz ECR source at the Heavy Ion Medical Accelerator in Chiba except for a few improvements in the magnetic structure. Their performance was investigated at the Sumitomo Heavy Industries factory before shipment. The maximum intensity was 1500 μA for H2+, 1320 μA for He2+, and 580 μA for C4+ at the end of the ion source beam transport line. These are several times higher than required. Sufficient performance was also observed in the flatness and long-term stability of the pulsed beams. These test results satisfy the requirements for medical use.

  11. Paulo Leminski : um estudo sobre o rigor e o relaxo em suas poesias

    OpenAIRE

    Dhynarte de Borba e Albuquerque

    2005-01-01

    This work examines the trajectory of Paulo Leminski's poetry, seeking to establish the terms of its humor, its metalinguistic inquiry and its lyrical self, while still displaying traces of the marginal poetry of the 1970s. He was an author who pursued concretist rigor through the procedures of more or less relaxed everyday speech. The poetic effort of the Curitiba-born Leminski is a “line that never ends” – he wrote poems, novels, advertising pieces, song lyrics and translations. ...

  12. Design, fabrication, installation and shielding integrity testing of source storage container for automatic source movement system used in TLD calibration facility

    International Nuclear Information System (INIS)

    Subramanian, V.; Baskar, S.; Annalakshmi, O.; Jose, M.T.; Jayshree, C.P.; Choudry, Shreelatha

    2012-01-01

    A state-of-the-art TLD laboratory was commissioned in January 2000 at the Radiological Safety Division of the Indira Gandhi Centre for Atomic Research (IGCAR). The laboratory provides a personnel monitoring service to 2000 occupational workers from Indira Gandhi Centre for Atomic Research and Bhabha Atomic Research Centre facilities. The laboratory has been accredited by the Radiation Safety Systems Division (RSSD), Bhabha Atomic Research Centre (BARC) since the year 2002. The laboratory has an exclusive facility for the calibration of the TLD cards. As a part of the accreditation procedure, and taking into account the geometry effect, the dose rate at the card position is determined by the accreditation authorities using a graphite chamber (secondary or national standard instrument) and is often re-estimated by our laboratory with a condenser R meter (M/s Victoreen, Germany). As per the regulatory requirement, the exposure protocols should be automated. Towards this, an automatic source movement system has been added to the calibration facility. With this system, the source is brought to the irradiation position pneumatically, and exposures are terminated by a counter, timer and triggering system. To accomplish this task, a lead container for source storage has been designed, fabricated and installed beneath the calibration table. The container was designed to hold a 3 Ci ¹³⁷Cs source, but the present activity of the source is 1.2 Ci; hence, the shielding integrity was tested with a source of higher activity (1.7 Ci ⁶⁰Co). The dose rate measured outside, on the circumference of the container at the level of the source, is found to be the same as that calculated using QAD-CGGP. The top plug is designed to avoid inadvertent upward movement of the source. Although the shielding was not adequate on top of the top plug, it does

  13. Derivation of basic equations for rigorous dynamic simulation of cryogenic distillation column for hydrogen isotope separation

    International Nuclear Information System (INIS)

    Kinoshita, Masahiro; Naruse, Yuji

    1981-08-01

    The basic equations are derived for rigorous dynamic simulation of cryogenic distillation columns for hydrogen isotope separation. The model accounts for such factors as differences in latent heat of vaporization among the six isotopic species of molecular hydrogen, decay heat of tritium, heat transfer through the column wall and nonideality of the solutions. Provision is also made for simulation of columns with multiple feeds and multiple sidestreams. (author)
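
    The abstract does not reproduce the equations themselves; for orientation, a generic dynamic stage model of the kind such simulations are built on (component holdup, energy and equilibrium relations for stage j and isotopic species i) looks like the following, where the activity coefficient γ carries the non-ideality mentioned above. This is a textbook MESH-type sketch, not the authors' exact formulation.

        \frac{d\,(M_j x_{i,j})}{dt} = L_{j-1}\,x_{i,j-1} + V_{j+1}\,y_{i,j+1} - L_j\,x_{i,j} - V_j\,y_{i,j} + F_j\,z_{i,j},

        \frac{d\,(M_j h_j)}{dt} = L_{j-1}\,h_{j-1} + V_{j+1}\,H_{j+1} - L_j\,h_j - V_j\,H_j + Q_j,

        y_{i,j} = \frac{\gamma_{i,j}\,P_i^{\mathrm{sat}}(T_j)}{P}\,x_{i,j}, \qquad \sum_i x_{i,j} = \sum_i y_{i,j} = 1 .

    For the hydrogen-isotope case, the species index i runs over the six isotopologues H₂, HD, HT, D₂, DT and T₂, and the stage heat term Q_j can absorb the tritium decay heat and wall heat-transfer contributions mentioned in the abstract.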

  14. Broadband Liner Optimization for the Source Diagnostic Test Fan

    Science.gov (United States)

    Nark, Douglas M.; Jones, Michael G.

    2012-01-01

    The broadband component of fan noise has grown in relevance with the utilization of increased bypass ratio and advanced fan designs. Thus, while the attenuation of fan tones remains paramount, the ability to simultaneously reduce broadband fan noise levels has become more appealing. This paper describes a broadband acoustic liner optimization study for the scale model Source Diagnostic Test fan. Specifically, in-duct attenuation predictions with a statistical fan source model are used to obtain optimum impedance spectra over a number of flow conditions for three liner locations in the bypass duct. The predicted optimum impedance information is then used with acoustic liner modeling tools to design liners aimed at producing impedance spectra that most closely match the predicted optimum values. Design selection is based on an acceptance criterion that provides the ability to apply increased weighting to specific frequencies and/or operating conditions. Typical tonal liner designs targeting single frequencies at one operating condition are first produced to provide baseline performance information. These are followed by multiple broadband design approaches culminating in a broadband liner targeting the full range of frequencies and operating conditions. The broadband liner is found to satisfy the optimum impedance objectives much better than the tonal liner designs. In addition, the broadband liner is found to provide better attenuation than the tonal designs over the full range of frequencies and operating conditions considered. Thus, the current study successfully establishes a process for the initial design and evaluation of novel broadband liner concepts for complex engine configurations.

  15. Effect of dietary iron source and iron status on iron bioavailability tests in the rat

    International Nuclear Information System (INIS)

    Zhang, D.; Hendricks, D.G.; Mahoney, A.W.

    1986-01-01

    Weanling male rats were made anemic in 7 days by feeding a low iron diet and bleeding. Healthy rats were fed the low iron diet supplemented with ferrous sulfate (29 ppm Fe). Each group was subdivided and fed for 10 days on test diets containing about 29 ppm iron that were formulated with meat:spinach mixtures or meat:soy mixtures to provided 100:0, 75:25, 50:50, 25:75, or 0:100% of the dietary iron from these sources or from a ferrous sulfate diet. After 3 days on the diets all rats were dosed orally with 2 or 5 micro curries of 59 Fe after a 18 hour fast and refeeding for 1.5 hours. Iron status influenced liver iron, carcass iron, liver radio activity and percent of radioactive dose retained. Diet influenced fecal iron and apparent absorption of iron. In iron bioavailability studies assessment methodology and iron status of the test subject greatly influences the estimates of the value of dietary sources of iron

  16. Phase 1 Validation Testing and Simulation for the WEC-Sim Open Source Code

    Science.gov (United States)

    Ruehl, K.; Michelen, C.; Gunawan, B.; Bosma, B.; Simmons, A.; Lomonaco, P.

    2015-12-01

    WEC-Sim is an open source code to model wave energy converters performance in operational waves, developed by Sandia and NREL and funded by the US DOE. The code is a time-domain modeling tool developed in MATLAB/SIMULINK using the multibody dynamics solver SimMechanics, and solves the WEC's governing equations of motion using the Cummins time-domain impulse response formulation in 6 degrees of freedom. The WEC-Sim code has undergone verification through code-to-code comparisons; however validation of the code has been limited to publicly available experimental data sets. While these data sets provide preliminary code validation, the experimental tests were not explicitly designed for code validation, and as a result are limited in their ability to validate the full functionality of the WEC-Sim code. Therefore, dedicated physical model tests for WEC-Sim validation have been performed. This presentation provides an overview of the WEC-Sim validation experimental wave tank tests performed at the Oregon State University's Directional Wave Basin at Hinsdale Wave Research Laboratory. Phase 1 of experimental testing was focused on device characterization and completed in Fall 2015. Phase 2 is focused on WEC performance and scheduled for Winter 2015/2016. These experimental tests were designed explicitly to validate the performance of WEC-Sim code, and its new feature additions. Upon completion, the WEC-Sim validation data set will be made publicly available to the wave energy community. For the physical model test, a controllable model of a floating wave energy converter has been designed and constructed. The instrumentation includes state-of-the-art devices to measure pressure fields, motions in 6 DOF, multi-axial load cells, torque transducers, position transducers, and encoders. The model also incorporates a fully programmable Power-Take-Off system which can be used to generate or absorb wave energy. Numerical simulations of the experiments using WEC-Sim will be

  17. Invalid Permutation Tests

    Directory of Open Access Journals (Sweden)

    Mikel Aickin

    2010-01-01

    Full Text Available Permutation tests are often presented in a rather casual manner, in both introductory and advanced statistics textbooks. The appeal of the cleverness of the procedure seems to replace the need for a rigorous argument that it produces valid hypothesis tests. The consequence of this educational failing has been a widespread belief in a “permutation principle”, which is supposed invariably to give tests that are valid by construction, under an absolute minimum of statistical assumptions. Several lines of argument are presented here to show that the permutation principle itself can be invalid, concentrating on the Fisher-Pitman permutation test for two means. A simple counterfactual example illustrates the general problem, and a slightly more elaborate counterfactual argument is used to explain why the main mathematical proof of the validity of permutation tests is mistaken. Two modifications of the permutation test are suggested to be valid in a very modest simulation. In instances where simulation software is readily available, investigating the validity of a specific permutation test can be done easily, requiring only a minimum understanding of statistical technicalities.

  18. Unclassified Source Term and Radionuclide Data for Corrective Action Unit 97: Yucca Flat/Climax Mine, Nevada Test Site, Nevada, Revision 1

    Energy Technology Data Exchange (ETDEWEB)

    Peter Martian

    2009-05-01

    This report documents the evaluation of the information and data available on the unclassified source term and radionuclide contamination for CAU 97: Yucca Flat/Climax Mine. The total residual inventory of radionuclides associated with one or more tests is known as the radiologic source term (RST). The RST is comprised of radionuclides in water, glass, or other phases or mineralogic forms. The hydrologic source term (HST) of an underground nuclear test is the portion of the total RST that is released into the groundwater over time following the test. In this report, the HST represents radionuclide release some time after the explosion and does not include the rapidly evolving mechanical, thermal, and chemical processes during the explosion. The CAU 97: Yucca Flat/Climax Mine has many more detonations and a wider variety of settings to consider compared to other CAUs. For instance, the source term analysis and evaluation performed for CAUs 101 and 102: Central and Western Pahute Mesa and CAU 98: Frenchman Flat did not consider vadose zone attenuation because many detonations were located near or below the water table. However, the large number of Yucca Flat/Climax Mine tests and the location of many tests above the water table warrant a more robust analysis of the unsaturated zone.

  19. On the Reliability of Source Time Functions Estimated Using Empirical Green's Function Methods

    Science.gov (United States)

    Gallegos, A. C.; Xie, J.; Suarez Salas, L.

    2017-12-01

    The Empirical Green's Function (EGF) method (Hartzell, 1978) has been widely used to extract source time functions (STFs). In this method, seismograms generated by collocated events with different magnitudes are deconvolved. Under a fundamental assumption that the STF of the small event is a delta function, the deconvolved Relative Source Time Function (RSTF) yields the large event's STF. While this assumption can be empirically justified by examination of differences in event size and frequency content of the seismograms, there can be a lack of rigorous justification of the assumption. In practice, a small event might have a finite duration when the RSTF is retrieved and interpreted as the large event STF with a bias. In this study, we rigorously analyze this bias using synthetic waveforms generated by convolving a realistic Green's function waveform with pairs of finite-duration triangular or parabolic STFs. The RSTFs are found using a time-domain based matrix deconvolution. We find when the STFs of smaller events are finite, the RSTFs are a series of narrow non-physical spikes. Interpreting these RSTFs as a series of high-frequency source radiations would be very misleading. The only reliable and unambiguous information we can retrieve from these RSTFs is the difference in durations and the moment ratio of the two STFs. We can apply a Tikhonov smoothing to obtain a single-pulse RSTF, but its duration is dependent on the choice of weighting, which may be subjective. We then test the Multi-Channel Deconvolution (MCD) method (Plourde & Bostock, 2017) which assumes that both STFs have finite durations to be solved for. A concern about the MCD method is that the number of unknown parameters is larger, which would tend to make the problem rank-deficient. Because the kernel matrix is dependent on the STFs to be solved for under a positivity constraint, we can only estimate the rank-deficiency with a semi-empirical approach. Based on the results so far, we find that the

  20. Laboratory test of source encapsulation for decreasing PCB concentrations

    DEFF Research Database (Denmark)

    Kolarik, Barbara; Andersen, Helle Vibeke; Markowicz, Pawel

    2016-01-01

    This study investigates the effect of encapsulation of tertiary PCB sources with PERMASORB™ Adsorber Wallpaper and the surface emissions trap (cTrap) on indoor air concentration of PCBs and on the PCB content in the source. The 40 weeks long laboratory investigation shows reduction of the air...... concentration by approx. 90% for both wallpapers, a level comparable to source removal. The potential for extraction of PCBs from the contaminated materials stays unclear for both wallpapers. The cTrap has shown potential to accumulate PCBs, however the total content of PCB in investigated sources has...... apparently increased. The opposite was observed for the PERMASORB™, where the total PCB content in the sources has decreased, with however only small concentration of PCBs in the wallpaper measured at the end of the experiment....

  1. Association Between Maximal Skin Dose and Breast Brachytherapy Outcome: A Proposal for More Rigorous Dosimetric Constraints

    International Nuclear Information System (INIS)

    Cuttino, Laurie W.; Heffernan, Jill; Vera, Robyn; Rosu, Mihaela; Ramakrishnan, V. Ramesh; Arthur, Douglas W.

    2011-01-01

    Purpose: Multiple investigations have used the skin distance as a surrogate for the skin dose and have shown that distances 4.05 Gy/fraction. Conclusion: The initial skin dose recommendations have been based on safe use and the avoidance of significant toxicity. The results from the present study have suggested that patients might further benefit if more rigorous constraints were applied and if the skin dose were limited to 120% of the prescription dose.

  2. Note: Simulation and test of a strip source electron gun.

    Science.gov (United States)

    Iqbal, Munawar; Islam, G U; Misbah, I; Iqbal, O; Zhou, Z

    2014-06-01

    We present simulation and test of an indirectly heated strip source electron beam gun assembly using Stanford Linear Accelerator Center (SLAC) electron beam trajectory program. The beam is now sharply focused with 3.04 mm diameter in the post anode region at 15.9 mm. The measured emission current and emission density were 1.12 A and 1.15 A/cm(2), respectively, that corresponds to power density of 11.5 kW/cm(2), at 10 kV acceleration potential. The simulated results were compared with then and now experiments and found in agreement. The gun is without any biasing, electrostatic and magnetic fields; hence simple and inexpensive. Moreover, it is now more powerful and is useful for accelerators technology due to high emission and low emittance parameters.

  3. pytc: Open-Source Python Software for Global Analyses of Isothermal Titration Calorimetry Data.

    Science.gov (United States)

    Duvvuri, Hiranmayi; Wheeler, Lucas C; Harms, Michael J

    2018-05-08

    Here we describe pytc, an open-source Python package for global fits of thermodynamic models to multiple isothermal titration calorimetry experiments. Key features include simplicity, the ability to implement new thermodynamic models, a robust maximum likelihood fitter, a fast Bayesian Markov-Chain Monte Carlo sampler, rigorous implementation, extensive documentation, and full cross-platform compatibility. pytc fitting can be done using an application program interface or via a graphical user interface. It is available for download at https://github.com/harmslab/pytc .

  4. An accelerated electrochemical MIC test for stainless alloys

    International Nuclear Information System (INIS)

    Gendron, T.S.; Cleland, R.D.

    1994-11-01

    Previous work in our laboratory and elsewhere has suggested that microbially influenced corrosion (MIC) of stainless steels and nickel-base alloys occurs in locally anaerobic regions that support the growth of sulfate-reducing bacteria (SRB). The cathodic reaction is provided by oxygen reduction at remote sites. Such a coupling between anode and cathode is difficult to reproduce in the laboratory, but can be simulated indirectly using a double electrochemical cell, as in previous work. A more realistic simulation using a single aerated electrochemical cell has now been developed, in which a second organism (P. aeruginosa) is used to provide an anoxic habitat for SRB growth and possible a source of organic carbon, within a layer of silt. A bare alloy electrode is used as the oxygen cathode. Tests of this kind using rigorous microbiological procedures have generated pitting corrosion of several alloys in low chloride media simulating freshwater heat exchanger conditions. This report discusses the adaption of these procedures to study corrosion of nuclear waste containers. (author). 20 refs., 2 tabs., 7 figs

  5. General-purpose heat source safety verification test series: SVT-11 through SVT-13

    International Nuclear Information System (INIS)

    George, T.G.; Pavone, D.

    1986-05-01

    The General-Purpose Heat Source (GPHS) is a modular component of the radioisotope thermoelectric generator that will provide power for the Galileo and Ulysses (formerly ISPM) space missions. The GPHS provides power by transmitting the heat of 238 Pu α-decay to an array of thermoelectric elements. Because the possibility of an orbital abort always exists, the heat source was designed and constructed to minimize plutonia release in any accident environment. The Safety Verification Test (SVT) series was formulated to evaluate the effectiveness of GPHS plutonia containment after atmospheric reentry and Earth impact. The first two reports (covering SVT-1 through SVT-10) described the results of flat, side-on, and angular module impacts against steel targets at 54 m/s. This report describes flat-on module impacts against concrete and granite targets, at velocities equivalent to or higher than previous SVTs

  6. Rigorous spin-spin correlation function of Ising model on a special kind of Sierpinski Carpets

    International Nuclear Information System (INIS)

    Yang, Z.R.

    1993-10-01

    We have exactly calculated the rigorous spin-spin correlation function of Ising model on a special kind of Sierpinski Carpets (SC's) by means of graph expansion and a combinatorial approach and investigated the asymptotic behaviour in the limit of long distance. The result show there is no long range correlation between spins at any finite temperature which indicates no existence of phase transition and thus finally confirms the conclusion produced by the renormalization group method and other physical arguments. (author). 7 refs, 6 figs

  7. Valuing goodwill: not-for-profits prepare for annual impairment testing.

    Science.gov (United States)

    Heuer, Christian; Travers, Mary Ann K

    2011-02-01

    Accounting standards for valuing goodwill and intangible assets are becoming more rigorous for not-for-profit organizations: Not-for-profit healthcare organizations need to test for goodwill impairment at least annually. Impairment testing is a two-stage process: initial analysis to determine whether impairment exists and subsequent calculation of the magnitude of impairment. Certain "triggering" events compel all organizations--whether for-profit or not-for-profit--to perform an impairment test for goodwill or intangible assets.

  8. Hypothesis testing of scientific Monte Carlo calculations

    Science.gov (United States)

    Wallerberger, Markus; Gull, Emanuel

    2017-11-01

    The steadily increasing size of scientific Monte Carlo simulations and the desire for robust, correct, and reproducible results necessitates rigorous testing procedures for scientific simulations in order to detect numerical problems and programming bugs. However, the testing paradigms developed for deterministic algorithms have proven to be ill suited for stochastic algorithms. In this paper we demonstrate explicitly how the technique of statistical hypothesis testing, which is in wide use in other fields of science, can be used to devise automatic and reliable tests for Monte Carlo methods, and we show that these tests are able to detect some of the common problems encountered in stochastic scientific simulations. We argue that hypothesis testing should become part of the standard testing toolkit for scientific simulations.

  9. SUPER-FMIT, an accelerator-based neutron source for fusion components irradiation testing

    International Nuclear Information System (INIS)

    Burke, R.J.; Holmes, J.J.; Johnson, D.L.; Mann, F.M.; Miles, R.R.

    1984-01-01

    The SUPER-FMIT facility is proposed as an advanced accelerator based neutron source for high flux irradiation testing of large-sized fusion reactor components. The facility would require only small extensions to existing accelerator and target technology originally developed for the Fusion Materials Irradiation Test (FMIT) facility. There, neutrons would be produced by a 0.1 ampere beam of 35 MeV deuterons incident upon a liquid lithium target. The volume available for high flux (> 10 14 n/cm 2 -s) testing in SUPER-FMIT would be 14 liters, about a factor of 30 larger than in the FMIT facility. This is because the effective beam current of 35 MeV deuterons on target can be increased by a factor of ten to 1.0 amperes or more. Such a large increase can be accomplished by acceleration of multiple beams of molecular deuterium ions (D 2 +) to 70 MeV in a common accelerator sructure. The availability of multiple beams and large total current allows great variety in the testing that can be done. For example, fluxes greater than 10 16 n/cm 2 -s, multiple simultaneous experiments, and great flexibility in tailoring of spatial distributions of flux and spectra can be achieved

  10. Cost evaluation of cellulase enzyme for industrial-scale cellulosic ethanol production based on rigorous Aspen Plus modeling.

    Science.gov (United States)

    Liu, Gang; Zhang, Jian; Bao, Jie

    2016-01-01

    Cost reduction on cellulase enzyme usage has been the central effort in the commercialization of fuel ethanol production from lignocellulose biomass. Therefore, establishing an accurate evaluation method on cellulase enzyme cost is crucially important to support the health development of the future biorefinery industry. Currently, the cellulase cost evaluation methods were complicated and various controversial or even conflict results were presented. To give a reliable evaluation on this important topic, a rigorous analysis based on the Aspen Plus flowsheet simulation in the commercial scale ethanol plant was proposed in this study. The minimum ethanol selling price (MESP) was used as the indicator to show the impacts of varying enzyme supply modes, enzyme prices, process parameters, as well as enzyme loading on the enzyme cost. The results reveal that the enzyme cost drives the cellulosic ethanol price below the minimum profit point when the enzyme is purchased from the current industrial enzyme market. An innovative production of cellulase enzyme such as on-site enzyme production should be explored and tested in the industrial scale to yield an economically sound enzyme supply for the future cellulosic ethanol production.

  11. Release of major ions during rigor mortis development in kid Longissimus dorsi muscle.

    Science.gov (United States)

    Feidt, C; Brun-Bellut, J

    1999-01-01

    Ionic strength plays an important role in post mortem muscle changes. Its increase is due to ion release during the development of rigor mortis. Twelve alpine kids were used to study the effects of chilling and meat pH on ion release. Free ions were measured in Longissimus dorsi muscle by capillary electrophoresis after water extraction. All free ion concentrations increased after death, but there were differences between ions. Temperature was not a factor affecting ion release in contrast to ultimate pH value. Three release mechanisms are believed to coexist: a passive binding to proteins, which stops as pH decreases, an active segregation which stops as ATP disappears and the production of metabolites due to anaerobic glycolysis.

  12. Hypnotherapy and Test Anxiety: Two Cognitive-Behavioral Constructs. The Effects of Hypnosis in Reducing Test Anxiety and Improving Academic Achievement in College Students.

    Science.gov (United States)

    Sapp, Marty

    A two-group randomized multivariate analysis of covariance (MANCOVA) was used to investigate the effects of cognitive-behavioral hypnosis in reducing test anxiety and improving academic performance in comparison to a Hawthorne control group. Subjects were enrolled in a rigorous introductory psychology course which covered an entire text in one…

  13. Nitrogen-isotopes and multi-parameter sewage water test for identification of nitrate sources: Groundwater body Marchfeld East of Vienna

    Science.gov (United States)

    Kralik, Martin

    2017-04-01

    The application of nitrogen and oxygen isotopes in nitrate allows, under favourable circumstances, to identify potential sources such as precipitation, chemical fertilisers and manure or sewage water. Without any additional tracer, the source distinction of nitrate from manure or sewage water is still difficult. Even the application of boron isotopes can in some cases not avoid ambiguous interpretation. Therefore, the Environment Agency Austria developed a new multi parametrical indicator test to allow the identification and quantification of pollution by domestic sewage water. The test analyses 8 substances well known to occur in sewage water: Acesulfame and sucralose (two artificial, calorie-free sweeteners), benzotriazole and tolyltriazole (two industrial chemicals/corrosion inhibitors), metoprolol, sotalol, carbamazepine and the metabolite 10,11-Dihydro-10,11-dihydroxycarbamazepine (pharmaceuticals) [1]. These substances are polar and degradation in the aquatic system by microbiological processes is not documented. These 8 Substances do not occur naturally which make them ideal tracers. The test can detect wastewater in the analysed water sample down to 0.1 %. This ideal coupling of these analytic tests helps to identify the nitrogen sources in the groundwater body Marchfeld East of Vienna to a high confidence level. In addition, the results allow a reasonable quantification of nitrogen sources from different types of fertilizers as well as sewage water contributions close to villages and in wells recharged by bank filtration. Recent investigations of groundwater in selected wells in Marchfeld [2] indicated a clear nitrogen contribution by wastewater leakages (sewers or septic tanks) to the total nitrogen budget. However, this contribution is shrinking and the main source comes still from agricultural activities. [1] Humer, F.; Weiss, S.; Reinnicke, S.; Clara, M.; Grath, J.; Windhofer, G. (2013): Multi parametrical indicator test for urban wastewater influence

  14. The test beamline of the European Spallation Source – Instrumentation development and wavelength frame multiplication

    International Nuclear Information System (INIS)

    Woracek, R.; Hofmann, T.; Bulat, M.; Sales, M.; Habicht, K.; Andersen, K.; Strobl, M.

    2016-01-01

    The European Spallation Source (ESS), scheduled to start operation in 2020, is aiming to deliver the most intense neutron beams for experimental research of any facility worldwide. Its long pulse time structure implies significant differences for instrumentation compared to other spallation sources which, in contrast, are all providing short neutron pulses. In order to enable the development of methods and technology adapted to this novel type of source well in advance of the first instruments being constructed at ESS, a test beamline (TBL) was designed and built at the BER II research reactor at Helmholtz-Zentrum Berlin (HZB). Operating the TBL shall provide valuable experience in order to allow for a smooth start of operations at ESS. The beamline is capable of mimicking the ESS pulse structure by a double chopper system and provides variable wavelength resolution as low as 0.5% over a wide wavelength band between 1.6 Å and 10 Å by a dedicated wavelength frame multiplication (WFM) chopper system. WFM is proposed for several ESS instruments to allow for flexible time-of-flight resolution. Hence, ESS will benefit from the TBL which offers unique possibilities for testing methods and components. This article describes the main capabilities of the instrument, its performance as experimentally verified during the commissioning, and its relevance to currently starting ESS instrumentation projects.

  15. The test beamline of the European Spallation Source – Instrumentation development and wavelength frame multiplication

    Energy Technology Data Exchange (ETDEWEB)

    Woracek, R., E-mail: robin.woracek@esss.se [European Spallation Source ESS ERIC, P.O. Box 176, SE-22100 Lund (Sweden); Hofmann, T.; Bulat, M. [Helmholtz-Zentrum Berlin für Materialien und Energie, Hahn-Meitner Platz 1, 14109 Berlin (Germany); Sales, M. [Technical University of Denmark, Fysikvej, 2800 Kgs. Lyngby (Denmark); Habicht, K. [Helmholtz-Zentrum Berlin für Materialien und Energie, Hahn-Meitner Platz 1, 14109 Berlin (Germany); Andersen, K. [European Spallation Source ESS ERIC, P.O. Box 176, SE-22100 Lund (Sweden); Strobl, M. [European Spallation Source ESS ERIC, P.O. Box 176, SE-22100 Lund (Sweden); Technical University of Denmark, Fysikvej, 2800 Kgs. Lyngby (Denmark)

    2016-12-11

    The European Spallation Source (ESS), scheduled to start operation in 2020, is aiming to deliver the most intense neutron beams for experimental research of any facility worldwide. Its long pulse time structure implies significant differences for instrumentation compared to other spallation sources which, in contrast, are all providing short neutron pulses. In order to enable the development of methods and technology adapted to this novel type of source well in advance of the first instruments being constructed at ESS, a test beamline (TBL) was designed and built at the BER II research reactor at Helmholtz-Zentrum Berlin (HZB). Operating the TBL shall provide valuable experience in order to allow for a smooth start of operations at ESS. The beamline is capable of mimicking the ESS pulse structure by a double chopper system and provides variable wavelength resolution as low as 0.5% over a wide wavelength band between 1.6 Å and 10 Å by a dedicated wavelength frame multiplication (WFM) chopper system. WFM is proposed for several ESS instruments to allow for flexible time-of-flight resolution. Hence, ESS will benefit from the TBL which offers unique possibilities for testing methods and components. This article describes the main capabilities of the instrument, its performance as experimentally verified during the commissioning, and its relevance to currently starting ESS instrumentation projects.

  16. Testing statistical hypotheses

    CERN Document Server

    Lehmann, E L

    2005-01-01

    The third edition of Testing Statistical Hypotheses updates and expands upon the classic graduate text, emphasizing optimality theory for hypothesis testing and confidence sets. The principal additions include a rigorous treatment of large sample optimality, together with the requisite tools. In addition, an introduction to the theory of resampling methods such as the bootstrap is developed. The sections on multiple testing and goodness of fit testing are expanded. The text is suitable for Ph.D. students in statistics and includes over 300 new problems out of a total of more than 760. E.L. Lehmann is Professor of Statistics Emeritus at the University of California, Berkeley. He is a member of the National Academy of Sciences and the American Academy of Arts and Sciences, and the recipient of honorary degrees from the University of Leiden, The Netherlands and the University of Chicago. He is the author of Elements of Large-Sample Theory and (with George Casella) he is also the author of Theory of Point Estimat...

  17. Modding a free and open source software video game: "Play testing is hard work"

    Directory of Open Access Journals (Sweden)

    Giacomo Poderi

    2014-03-01

    Full Text Available Video game modding is a form of fan productivity in contemporary participatory culture. We see modding as an important way in which modders experience and conceptualize their work. By focusing on modding in a free and open source software video game, we analyze the practice of modding and the way it changes modders' relationship with their object of interest. The modders' involvement is not always associated with fun and creativity. Indeed, activities such as play testing often undermine these dimensions of modding. We present a case study of modding that is based on ethnographic research done for The Battle for Wesnoth, a free and open source software strategy video game entirely developed by a community of volunteers.

  18. Change Detection for Remote Monitoring of Underground Nuclear Testing: Comparison with Seismic and Associated Explosion Source Phenomenological Data

    DEFF Research Database (Denmark)

    Canty, M.; Jahnke, G.; Nielsen, Allan Aasbjerg

    2005-01-01

    The analysis of open-source satellite imagery is in process of establishing itself as an important tool for monitoring nuclear activities throughout the world which are relevant to disarmament treaties, like e. g. the Comprehensive Nuclear-Test-Ban Treaty (CTBT). However, the detection of anthrop......The analysis of open-source satellite imagery is in process of establishing itself as an important tool for monitoring nuclear activities throughout the world which are relevant to disarmament treaties, like e. g. the Comprehensive Nuclear-Test-Ban Treaty (CTBT). However, the detection...... of conventional multispectral satellite platforms with moderate ground resolution (Landsat TM, ASTER) to detect changes over wide areas.We chose the Nevada Test Site (NTS), USA, for a case study because of the large amount of available ground truth information. The analysis is based on the multivariate alteration...

  19. A rigorous derivation of gravitational self-force

    International Nuclear Information System (INIS)

    Gralla, Samuel E; Wald, Robert M

    2008-01-01

    There is general agreement that the MiSaTaQuWa equations should describe the motion of a 'small body' in general relativity, taking into account the leading order self-force effects. However, previous derivations of these equations have made a number of ad hoc assumptions and/or contain a number of unsatisfactory features. For example, all previous derivations have invoked, without proper justification, the step of 'Lorenz gauge relaxation', wherein the linearized Einstein equation is written in the form appropriate to the Lorenz gauge, but the Lorenz gauge condition is then not imposed-thereby making the resulting equations for the metric perturbation inequivalent to the linearized Einstein equations. (Such a 'relaxation' of the linearized Einstein equations is essential in order to avoid the conclusion that 'point particles' move on geodesics.) In this paper, we analyze the issue of 'particle motion' in general relativity in a systematic and rigorous way by considering a one-parameter family of metrics, g ab (λ), corresponding to having a body (or black hole) that is 'scaled down' to zero size and mass in an appropriate manner. We prove that the limiting worldline of such a one-parameter family must be a geodesic of the background metric, g ab (λ = 0). Gravitational self-force-as well as the force due to coupling of the spin of the body to curvature-then arises as a first-order perturbative correction in λ to this worldline. No assumptions are made in our analysis apart from the smoothness and limit properties of the one-parameter family of metrics, g ab (λ). Our approach should provide a framework for systematically calculating higher order corrections to gravitational self-force, including higher multipole effects, although we do not attempt to go beyond first-order calculations here. The status of the MiSaTaQuWa equations is explained

  20. What Does a Verbal Test Measure? A New Approach to Understanding Sources of Item Difficulty.

    Science.gov (United States)

    Berk, Eric J. Vanden; Lohman, David F.; Cassata, Jennifer Coyne

    Assessing the construct relevance of mental test results continues to present many challenges, and it has proven to be particularly difficult to assess the construct relevance of verbal items. This study was conducted to gain a better understanding of the conceptual sources of verbal item difficulty using a unique approach that integrates…

  1. Rigorous derivation of the perimeter generating functions for the mean-squared radius of gyration of rectangular, Ferrers and pyramid polygons

    International Nuclear Information System (INIS)

    Lin, Keh Ying

    2006-01-01

    We have derived rigorously the perimeter generating functions for the mean-squared radius of gyration of rectangular, Ferrers and pyramid polygons. These functions were found by Jensen recently. His nonrigorous results are based on the analysis of the long series expansions. (comment)

  2. Rigorous decoupling between edge states in frustrated spin chains and ladders

    Science.gov (United States)

    Chepiga, Natalia; Mila, Frédéric

    2018-05-01

    We investigate the occurrence of exact zero modes in one-dimensional quantum magnets of finite length that possess edge states. Building on conclusions first reached in the context of the spin-1/2 X Y chain in a field and then for the spin-1 J1-J2 Heisenberg model, we show that the development of incommensurate correlations in the bulk invariably leads to oscillations in the sign of the coupling between edge states, and hence to exact zero energy modes at the crossing points where the coupling between the edge states rigorously vanishes. This is true regardless of the origin of the frustration (e.g., next-nearest-neighbor coupling or biquadratic coupling for the spin-1 chain), of the value of the bulk spin (we report on spin-1/2, spin-1, and spin-2 examples), and of the value of the edge-state emergent spin (spin-1/2 or spin-1).

  3. Rigorous constraints on the matrix elements of the energy–momentum tensor

    Directory of Open Access Journals (Sweden)

    Peter Lowdon

    2017-11-01

    Full Text Available The structure of the matrix elements of the energy–momentum tensor play an important role in determining the properties of the form factors A(q2, B(q2 and C(q2 which appear in the Lorentz covariant decomposition of the matrix elements. In this paper we apply a rigorous frame-independent distributional-matching approach to the matrix elements of the Poincaré generators in order to derive constraints on these form factors as q→0. In contrast to the literature, we explicitly demonstrate that the vanishing of the anomalous gravitomagnetic moment B(0 and the condition A(0=1 are independent of one another, and that these constraints are not related to the specific properties or conservation of the individual Poincaré generators themselves, but are in fact a consequence of the physical on-shell requirement of the states in the matrix elements and the manner in which these states transform under Poincaré transformations.

  4. Diffraction-based overlay measurement on dedicated mark using rigorous modeling method

    Science.gov (United States)

    Lu, Hailiang; Wang, Fan; Zhang, Qingyun; Chen, Yonghui; Zhou, Chang

    2012-03-01

    Diffraction Based Overlay (DBO) is widely evaluated by numerous authors, results show DBO can provide better performance than Imaging Based Overlay (IBO). However, DBO has its own problems. As well known, Modeling based DBO (mDBO) faces challenges of low measurement sensitivity and crosstalk between various structure parameters, which may result in poor accuracy and precision. Meanwhile, main obstacle encountered by empirical DBO (eDBO) is that a few pads must be employed to gain sufficient information on overlay-induced diffraction signature variations, which consumes more wafer space and costs more measuring time. Also, eDBO may suffer from mark profile asymmetry caused by processes. In this paper, we propose an alternative DBO technology that employs a dedicated overlay mark and takes a rigorous modeling approach. This technology needs only two or three pads for each direction, which is economic and time saving. While overlay measurement error induced by mark profile asymmetry being reduced, this technology is expected to be as accurate and precise as scatterometry technologies.

  5. Standards for Radiation Effects Testing: Ensuring Scientific Rigor in the Face of Budget Realities and Modern Device Challenges

    Science.gov (United States)

    Lauenstein, J M.

    2015-01-01

    An overview is presented of the space radiation environment and its effects on electrical, electronic, and electromechanical parts. Relevant test standards and guidelines are listed. Test standards and guidelines are necessary to ensure best practices, minimize and bound systematic and random errors, and to ensure comparable results from different testers and vendors. Test standards are by their nature static but exist in a dynamic environment of advancing technology and radiation effects research. New technologies, failure mechanisms, and advancement in our understanding of known failure mechanisms drive the revision or development of test standards. Changes to standards must be weighed against their impact on cost and existing part qualifications. There must be consensus on new best practices. The complexity of some new technologies exceeds the scope of existing test standards and may require development of a guideline specific to the technology. Examples are given to illuminate the value and limitations of key radiation test standards as well as the challenges in keeping these standards up to date.

  6. An efficient and rigorous thermodynamic library and optimal-control of a cryogenic air separation unit

    DEFF Research Database (Denmark)

    Gaspar, Jozsef; Ritschel, Tobias Kasper Skovborg; Jørgensen, John Bagterp

    2017-01-01

    -linear model based control to achieve optimal techno-economic performance. Accordingly, this work presents a computationally efficient and novel approach for solving a tray-by-tray equilibrium model and its implementation for open-loop optimal-control of a cryogenic distillation column. Here, the optimisation...... objective is to reduce the cost of compression in a volatile electricity market while meeting the production requirements, i.e. product flow rate and purity. This model is implemented in Matlab and uses the ThermoLib rigorous thermodynamic library. The present work represents a first step towards plant...

  7. First in situ operation performance test of ground source heat pump in Tunisia

    International Nuclear Information System (INIS)

    Naili, Nabiha; Attar, Issam; Hazami, Majdi; Farhat, Abdelhamid

    2013-01-01

    Highlights: • Evaluate the geothermal energy in Tunisia. • Study of the performance of GSHP system for cooling space. • GSHP is a promising alternative for building cooling in Tunisia. - Abstract: The main purpose of this paper is to study the energetic potential of the deployment in Tunisia of the Ground Source Heat Pump (GSHP) system for cooling mode application. Therefore, a pilot GSHP system using horizontal Ground Heat Exchanger (GHE) was installed and experimented in the Research and Technology Center of Energy (CRTEn), Borj Cédria. The experiment is conducted in a test room with a floor area of about 12 m 2 . In the floor of the tested room is integrated a polyethylene exchanger (PEX) used as a radiant floor cooling (RFC) system. The experimental setup mainly includes the ground temperature, the temperature and flow rate of water circulating in the heat pump and the GHE, as well as the power consumption of the heat pump and circulating pumps. These experimental data are essentially used to evaluate the coefficient of performance of the heat pump (COP hp ) and the overall system (COP sys ) for continuous operation mode. The COP hp and the COP sys were found to be 4.25 and 2.88, respectively. These results reveal that the use of the ground source heat pump is very appropriate for Tunisian building cooling

  8. Microscopic assessment of bone toughness using scratch tests

    Directory of Open Access Journals (Sweden)

    Amrita Kataruka

    2017-06-01

    Full Text Available Bone is a composite material with five distinct structural levels: collagen molecules, mineralized collagen fibrils, lamellae, osteon and whole bone. However, most fracture testing methods have been limited to the macroscopic scale and there is a need for advanced characterization methods to assess toughness at the osteon level and below. The goal of this investigation is to present a novel framework to measure the fracture properties of bone at the microscopic scale using scratch testing. A rigorous experimental protocol is articulated and applied to examine cortical bone specimens from porcine femurs. The observed fracture behavior is very complex: we observe a strong anisotropy of the response with toughening mechanisms and a competition between plastic flow and brittle fracture. The challenge consists then in applying nonlinear fracture mechanics methods such as the J-integral or the energetic Size Effect Law to quantify the fracture toughness in a rigorous fashion. Our result suggests that mixed-mode fracture is instrumental in determining the fracture resistance. There is also a pronounced coupling between fracture and elasticity. Our methodology opens the door to fracture assessment at multiple structural levels, microscopic and potentially nanometer length scale, due to the scalability of scratch tests.

  9. Testing adaptive toolbox models: a Bayesian hierarchical approach.

    Science.gov (United States)

    Scheibehenne, Benjamin; Rieskamp, Jörg; Wagenmakers, Eric-Jan

    2013-01-01

    Many theories of human cognition postulate that people are equipped with a repertoire of strategies to solve the tasks they face. This theoretical framework of a cognitive toolbox provides a plausible account of intra- and interindividual differences in human behavior. Unfortunately, it is often unclear how to rigorously test the toolbox framework. How can a toolbox model be quantitatively specified? How can the number of toolbox strategies be limited to prevent uncontrolled strategy sprawl? How can a toolbox model be formally tested against alternative theories? The authors show how these challenges can be met by using Bayesian inference techniques. By means of parameter recovery simulations and the analysis of empirical data across a variety of domains (i.e., judgment and decision making, children's cognitive development, function learning, and perceptual categorization), the authors illustrate how Bayesian inference techniques allow toolbox models to be quantitatively specified, strategy sprawl to be contained, and toolbox models to be rigorously tested against competing theories. The authors demonstrate that their approach applies at the individual level but can also be generalized to the group level with hierarchical Bayesian procedures. The suggested Bayesian inference techniques represent a theoretical and methodological advancement for toolbox theories of cognition and behavior.

  10. Description of comprehensive pump test change to ASME OM code, subsection ISTB

    International Nuclear Information System (INIS)

    Hartley, R.S.

    1994-01-01

    The American Society of Mechanical Engineers (ASME) Operations and Maintenance (OM) Main Committee and Board on Nuclear Codes and Standards (BNCS) recently approved changes to ASME OM Code-1990, Subsection ISTB, Inservice Testing of Pumps in Light-Water Reactor Power Plants. The changes will be included in the 1994 addenda to ISTB. The changes, designated as the comprehensive pump test, incorporate a new, improved philosophy for testing safety-related pumps in nuclear power plants. An important philosophical difference between the open-quotes old codeclose quotes inservice testing (IST) requirements and these changes is that the changes concentrate on less frequent, more meaningful testing while minimizing damaging and uninformative low-flow testing. The comprehensive pump test change establishes a more involved biannual test for all pumps and significantly reduces the rigor of the quarterly test for standby pumps. The increased rigor and cost of the biannual comprehensive tests are offset by the reduced cost of testing and potential damage to the standby pumps, which comprise a large portion of the safety-related pumps at most plants. This paper provides background on the pump testing requirements, discusses potential industry benefits of the change, describes the development of the comprehensive pump test, and gives examples and reasons for many of the specific changes. This paper also describes additional changes to ISTB that will be included in the 1994 addenda that are associated with, but not part of, the comprehensive pump test

  11. Early rigorous control interventions can largely reduce dengue outbreak magnitude: experience from Chaozhou, China.

    Science.gov (United States)

    Liu, Tao; Zhu, Guanghu; He, Jianfeng; Song, Tie; Zhang, Meng; Lin, Hualiang; Xiao, Jianpeng; Zeng, Weilin; Li, Xing; Li, Zhihao; Xie, Runsheng; Zhong, Haojie; Wu, Xiaocheng; Hu, Wenbiao; Zhang, Yonghui; Ma, Wenjun

    2017-08-02

    Dengue fever is a severe public heath challenge in south China. A dengue outbreak was reported in Chaozhou city, China in 2015. Intensified interventions were implemented by the government to control the epidemic. However, it is still unknown the degree to which intensified control measures reduced the size of the epidemics, and when should such measures be initiated to reduce the risk of large dengue outbreaks developing? We selected Xiangqiao district as study setting because the majority of the indigenous cases (90.6%) in Chaozhou city were from this district. The numbers of daily indigenous dengue cases in 2015 were collected through the national infectious diseases and vectors surveillance system, and daily Breteau Index (BI) data were reported by local public health department. We used a compartmental dynamic SEIR (Susceptible, Exposed, Infected and Removed) model to assess the effectiveness of control interventions, and evaluate the control effect of intervention timing on dengue epidemic. A total of 1250 indigenous dengue cases was reported from Xiangqiao district. The results of SEIR modeling using BI as an indicator of actual control interventions showed a total of 1255 dengue cases, which is close to the reported number (n = 1250). The size and duration of the outbreak were highly sensitive to the intensity and timing of interventions. The more rigorous and earlier the control interventions implemented, the more effective it yielded. Even if the interventions were initiated several weeks after the onset of the dengue outbreak, the interventions were shown to greatly impact the prevalence and duration of dengue outbreak. This study suggests that early implementation of rigorous dengue interventions can effectively reduce the epidemic size and shorten the epidemic duration.

  12. Marketing the HIV test to MSM: ethnic differences in preferred venues and sources.

    Science.gov (United States)

    Lechuga, Julia; Owczarzak, Jill T; Petroll, Andrew E

    2013-05-01

    Lack of awareness of HIV status is associated with an increased likelihood of HIV transmission. We surveyed 633 men who have sex with men (MSM) from diverse ethnic groups recruited from a variety of community venues in a U.S. Midwestern city with rising HIV infection rates. Our first aim was to describe patterns of sexual risk, annual HIV testing frequency, and venues where information about HIV and HIV testing could be disseminated to inner-city MSM. Our second aim was to identify preferred sources to receive information about HIV testing and determine whether these preferences differed by ethnic background. Results indicated that despite similar proportions of high-sexual risk behaviors, compared with African American and Latino MSM, smaller proportions of non-Hispanic White MSM had received an HIV test in the last 12 months. Despite ethnic differences in health care access, a physician's office was the most common HIV testing site. Overall, a majority conveyed a preference to see advertisements in mainstream media outlets. However, when preferences were stratified by ethnicity, African American MSM were the least likely to prefer receiving information from mainstream media and conveyed a stronger preference to receive information from authority figures than non-Hispanic White and Hispanic MSM.

  13. Hydrologic Source Term Processes and Models for the Clearwater and Wineskin Tests, Rainier Mesa, Nevada National Security Site

    Energy Technology Data Exchange (ETDEWEB)

    Carle, Steven F. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States)

    2011-05-04

    This report describes the development, processes, and results of a hydrologic source term (HST) model for the CLEARWATER (U12q) and WINESKIN (U12r) tests located on Rainier Mesa, Nevada National Security Site, Nevada (Figure 1.1). Of the 61 underground tests (involving 62 unique detonations) conducted on Rainier Mesa (Area 12) between 1957 and 1992 (USDOE, 2015), the CLEARWATER and WINESKIN tests present many unique features that warrant a separate HST modeling effort from other Rainier Mesa tests.

  14. [Incorporation of an organic MAGIC (Model of Acidification of Groundwater in Catchments) and testing of the revised model using independent data sources]. [MAGIC Model

    Energy Technology Data Exchange (ETDEWEB)

    Sullivan, T.J.

    1992-09-01

    A project was initiated in March, 1992 to (1) incorporate a rigorous organic acid representation, based on empirical data and geochemical considerations, into the MAGIC model of acidification response, and (2) test the revised model using three sets of independent data. After six months of performance, the project is on schedule and the majority of the tasks outlined for Year 1 have been successfully completed. Major accomplishments to data include development of the organic acid modeling approach, using data from the Adirondack Lakes Survey Corporation (ALSC), and coupling the organic acid model with MAGIC for chemical hindcast comparisons. The incorporation of an organic acid representation into MAGIC can account for much of the discrepancy earlier observed between MAGIC hindcasts and paleolimnological reconstructions of preindustrial pH and alkalinity for 33 statistically-selected Adirondack lakes. Additional work is on-going for model calibration and testing with data from two whole-catchment artificial acidification projects. Results obtained thus far are being prepared as manuscripts for submission to the peer-reviewed scientific literature.

  15. IFMIF [International Fusion Materials Irradiation Facility], an accelerator-based neutron source for fusion components irradiation testing: Materials testing capabilities

    International Nuclear Information System (INIS)

    Mann, F.M.

    1988-08-01

    The International Fusion Materials Irradiation Facility (IFMIF) is proposed as an advanced accelerator-based neutron source for high-flux irradiation testing of large-sized fusion reactor components. The facility would require only small extensions to existing accelerator and target technology originally developed for the Fusion Materials Irradiation Test (FMIT) facility. At the extended facility, neutrons would be produced by a 0.1-A beam of 35-MeV deuterons incident upon a liquid lithium target. The volume available for high-flux (>10/sup 15/ n/cm/sup 2/-s) testing in IFMITF would be over a liter, a factor of about three larger than in the FMIT facility. This is because the effective beam current of 35-MeV deuterons on target can be increased by a factor of ten to 1A or more. Such an increase can be accomplished by funneling beams of deuterium ions from the radio-frequency quadruple into a linear accelerator and by taking advantage of recent developments in accelerator technology. Multiple beams and large total current allow great variety in available testing. For example, multiple simultaneous experiments, and great flexibility in tailoring spatial distributions of flux and spectra can be achieved. 5 refs., 2 figs., 1 tab

  16. Vacuum tests of a beamline front-end mock-up at the Advanced Photon Source

    International Nuclear Information System (INIS)

    Liu, C.; Nielsen, R.W.; Kruy, T.L.; Shu, D.; Kuzay, T.M.

    1994-01-01

    A-mock-up has been constructed to test the functioning and performance of the Advanced Photon Source (APS) front ends. The mock-up consists of all components of the APS insertion-device beamline front end with a differential pumping system. Primary vacuum tests have been performed and compared with finite element vacuum calculations. Pressure distribution measurements using controlled leaks demonstrate a better than four decades of pressure difference between the two ends of the mock-up. The measured pressure profiles are consistent with results of finite element analyses of the system. The safety-control systems are also being tested. A closing time of ∼20 ms for the photon shutter and ∼7 ms for the fast closing valve have been obtained. Experiments on vacuum protection systems indicate that the front end is well protected in case of a vacuum breach

  17. Standard Practice for Minimizing Dosimetry Errors in Radiation Hardness Testing of Silicon Electronic Devices Using Co-60 Sources

    CERN Document Server

    American Society for Testing and Materials. Philadelphia

    2010-01-01

    1.1 This practice covers recommended procedures for the use of dosimeters, such as thermoluminescent dosimeters (TLD's), to determine the absorbed dose in a region of interest within an electronic device irradiated using a Co-60 source. Co-60 sources are commonly used for the absorbed dose testing of silicon electronic devices. Note 1—This absorbed-dose testing is sometimes called “total dose testing” to distinguish it from “dose rate testing.” Note 2—The effects of ionizing radiation on some types of electronic devices may depend on both the absorbed dose and the absorbed dose rate; that is, the effects may be different if the device is irradiated to the same absorbed-dose level at different absorbed-dose rates. Absorbed-dose rate effects are not covered in this practice but should be considered in radiation hardness testing. 1.2 The principal potential error for the measurement of absorbed dose in electronic devices arises from non-equilibrium energy deposition effects in the vicinity o...

  18. Rigorous description of holograms of particles illuminated by an astigmatic elliptical Gaussian beam

    Energy Technology Data Exchange (ETDEWEB)

    Yuan, Y J; Ren, K F; Coetmellec, S; Lebrun, D, E-mail: fang.ren@coria.f [UMR 6614/CORIA, CNRS and Universite et INSA de Rouen Avenue de l' Universite BP 12, 76801 Saint Etienne du Rouvray (France)

    2009-02-01

    The digital holography is a non-intrusive optical metrology and well adapted for the measurement of the size and velocity field of particles in the spray of a fluid. The simplified model of an opaque disk is often used in the treatment of the diagrams and therefore the refraction and the third dimension diffraction of the particle are not taken into account. We present in this paper a rigorous description of the holographic diagrams and evaluate the effects of the refraction and the third dimension diffraction by comparison to the opaque disk model. It is found that the effects are important when the real part of the refractive index is near unity or the imaginary part is non zero but small.

  19. CMS Forward Pixel Upgrade Electronics and System Testing

    CERN Document Server

    Weber, Hannsjorg Artur

    2016-01-01

    This note discusses results of electronics and system testing of the CMS forward pixel (FPIX) detector upgrade for Phase 1. The FPIX detector is comprised of four stand-alone half cylinders, each of which contains frontend readout electronic boards, power regulators, cables and fibers in addition to the pixel modules. All of the components undergo rigorous testing and quality assurance before assembly into the half cylinders. Afterwards, we perform full system tests on the completely assembled half cylinders, including calibrations at final operating temperatures, characterization of the realistic readout chain, and system grounding and noise studies. The results from all these tests are discussed.

  20. Pearce element ratios: A paradigm for testing hypotheses

    Science.gov (United States)

    Russell, J. K.; Nicholls, Jim; Stanley, Clifford R.; Pearce, T. H.

    Science moves forward with the development of new ideas that are encapsulated by hypotheses whose aim is to explain the structure of data sets or to expand existing theory. These hypotheses remain conjecture until they have been tested. In fact, Karl Popper advocated that a scientist's job does not finish with the creation of an idea but, rather, begins with the testing of the related hypotheses. Implicit in Popper's [1959] view is that there must be tools with which we can test our hypotheses. Consequently, the development of rigorous tests for conceptual models plays a major role in maintaining the integrity of scientific endeavor [e.g., Greenwood, 1989].

  1. The influence of testing apparatus stiffness on the source properties of laboratory stick-slip

    Science.gov (United States)

    Kilgore, B. D.; McGarr, A.; Beeler, N. M.; Lockner, D. A.

    2016-12-01

    Stick-slip experiments were performed to determine the influence of the testing apparatus stiffness on source properties, to develop methods to relate stick-slip to natural earthquakes, and to examine the hypothesis of McGarr [2012] that the product of unloading stiffness, k, and slip duration, T, is both scale-independent and approximately constant for both laboratory and natural earthquakes. A double-direct shear load frame was used with Sierra White Granite samples at 2 MPa normal stress, and a remote loading rate of 0.2 µm/s. The stiffness of the test apparatus was varied by more than an order of magnitude by inserting disk springs into the shear loading column adjacent to the granite samples. Servo-controlling slip at a point between the forcing ram and the shear force load cell produced repeatable slip events. Slip and slip duration decrease as k increases, as they do for natural earthquakes. In contrast to earthquakes, stress drop and slip rate decrease with increasing k, and the product kT for these experiments is not constant, but decreases with k. These data, collected over a range of k, do not conform to McGarr's [2012] hypothesis. However, analysis of stick-slip studies from other testing apparatuses is consistent with McGarr's hypothesis; kT is scale-independent, similar to that of earthquakes, equal to the ratio of static stress drop to average slip velocity, and similar to the ratio of shear modulus to wavespeed of rock. These properties result from conducting experiments over a range of sample sizes, using rock samples with the same elastic properties as the Earth, and using testing machines whose stiffnesses decrease and characteristic periods increase with scale. A consequence of our experiments and analysis is that extrapolation of lab scale earthquake source properties to the Earth is more difficult than previously thought, requiring an accounting for the properties of the testing machines and additional research beyond that reported here.
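
    As a quick plausibility check of the scale-independence argument summarized above, the two ratios quoted in the abstract (static stress drop over average slip velocity, and shear modulus over shear wavespeed) can be compared for typical crustal values; the numbers below are illustrative assumptions, not values from the study.

```python
# Back-of-envelope check (not from the paper) that the two ratios quoted in the
# abstract (static stress drop / average slip velocity, and shear modulus /
# shear wavespeed) are of the same order of magnitude for typical crustal values.

mu = 30e9          # shear modulus of crustal rock [Pa]   (assumed typical value)
beta = 3000.0      # shear wavespeed [m/s]                (assumed typical value)
stress_drop = 3e6  # static stress drop [Pa]              (assumed typical value)
slip_rate = 0.3    # average slip velocity [m/s]          (assumed typical value)

kT_from_modulus = mu / beta                 # ~1.0e7 Pa*s/m
kT_from_stress = stress_drop / slip_rate    # ~1.0e7 Pa*s/m

print(f"mu/beta              = {kT_from_modulus:.2e} Pa*s/m")
print(f"stress drop / slip   = {kT_from_stress:.2e} Pa*s/m")
```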

  2. Impact of post-rigor high pressure processing on the physicochemical and microbial shelf-life of cultured red abalone (Haliotis rufescens).

    Science.gov (United States)

    Hughes, Brianna H; Perkins, L Brian; Yang, Tom C; Skonberg, Denise I

    2016-03-01

    High pressure processing (HPP) of post-rigor abalone at 300 MPa for 10 min extended the refrigerated shelf-life to four times that of unprocessed controls. Shucked abalone meats were processed at 100 or 300 MPa for 5 or 10 min, and stored at 2 °C for 35 days. Treatments were analyzed for aerobic plate count (APC), total volatile base nitrogen (TVBN), K-value, biogenic amines, color, and texture. APC did not exceed 10^6 and TVBN levels remained below 35 mg/100 g for 35 days for the 300 MPa treatments. No biogenic amines were detected in the 300 MPa treatments, but putrescine and cadaverine were detected in the control and 100 MPa treatments. Color and texture were not affected by HPP or storage time. These results indicate that post-rigor processing at 300 MPa for 10 min can significantly increase refrigerated shelf-life of abalone without affecting chemical or physical quality characteristics important to consumers. Copyright © 2015 Elsevier Ltd. All rights reserved.

  3. Why so many "rigorous" evaluations fail to identify unintended consequences of development programs: How mixed methods can contribute.

    Science.gov (United States)

    Bamberger, Michael; Tarsilla, Michele; Hesse-Biber, Sharlene

    2016-04-01

    Many widely-used impact evaluation designs, including randomized control trials (RCTs) and quasi-experimental designs (QEDs), frequently fail to detect what are often quite serious unintended consequences of development programs. This seems surprising as experienced planners and evaluators are well aware that unintended consequences frequently occur. Most evaluation designs are intended to determine whether there is credible evidence (statistical, theory-based or narrative) that programs have achieved their intended objectives, and the logic of many evaluation designs, even those that are considered the most "rigorous," does not permit the identification of outcomes that were not specified in the program design. We take the example of RCTs as they are considered by many to be the most rigorous evaluation designs. We present a number of cases to illustrate how infusing RCTs with a mixed-methods approach (sometimes called an "RCT+" design) can strengthen the credibility of these designs and can also capture important unintended consequences. We provide a Mixed Methods Evaluation Framework that identifies 9 ways in which unintended consequences (UCs) can occur, and we apply this framework to two of the case studies. Copyright © 2016 Elsevier Ltd. All rights reserved.

  4. Variability in source sediment contributions by applying different statistic test for a Pyrenean catchment.

    Science.gov (United States)

    Palazón, L; Navas, A

    2017-06-01

    Information on sediment contribution and transport dynamics from the contributing catchments is needed to develop management plans to tackle environmental problems related to the effects of fine sediment, such as reservoir siltation. In this respect, the fingerprinting technique is an indirect technique known to be valuable and effective for sediment source identification in river catchments. Large variability in sediment delivery was found in previous studies in the Barasona catchment (1509 km², Central Spanish Pyrenees). Simulation results with SWAT and fingerprinting approaches identified badlands and agricultural uses as the main contributors to sediment supply in the reservoir. In this study, different statistical procedures for selecting the optimum composite fingerprint were compared, including the Kruskal-Wallis H-test, discriminant function analysis and principal components analysis. Source contribution results differed between the assessed options, with the greatest differences observed for the option using the two-step process of principal components analysis followed by discriminant function analysis. The characteristics of the solutions given by the applied mixing model and the conceptual understanding of the catchment showed that the most reliable solution was achieved using the two-step process of Kruskal-Wallis H-test followed by discriminant function analysis. The assessment showed the importance of the statistical procedure used to define the optimum composite fingerprint for sediment fingerprinting applications. Copyright © 2016 Elsevier Ltd. All rights reserved.
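
    The two-step tracer-selection procedure favoured above (Kruskal-Wallis H-test followed by discriminant function analysis) can be sketched as follows; this is a generic illustration, not the authors' code, and the data-frame columns, significance threshold and use of linear discriminant analysis as the discriminant step are assumptions.

```python
# Minimal sketch of a two-step composite-fingerprint selection:
# (1) Kruskal-Wallis H-test discards tracers that do not discriminate among
#     source groups, (2) discriminant function analysis (here LDA) is fitted
#     on the surviving tracers. Column names and alpha are illustrative.
import pandas as pd
from scipy.stats import kruskal
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis

def select_composite_fingerprint(df: pd.DataFrame, tracers, source_col="source", alpha=0.05):
    # Step 1: Kruskal-Wallis H-test per tracer across the source groups.
    groups = [g for _, g in df.groupby(source_col)]
    kept = []
    for tracer in tracers:
        h_stat, p_value = kruskal(*[g[tracer].values for g in groups])
        if p_value < alpha:
            kept.append(tracer)

    # Step 2: discriminant function analysis on the retained tracers.
    lda = LinearDiscriminantAnalysis()
    lda.fit(df[kept].values, df[source_col].values)
    accuracy = lda.score(df[kept].values, df[source_col].values)
    return kept, lda, accuracy
```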

  5. Summary test results of the particle-beam diagnostics for the Advanced Photon Source (APS) subsystems

    International Nuclear Information System (INIS)

    Lumpkin, A.; Wang, X.; Sellyey, W.; Patterson, D.; Kahana, E.

    1994-01-01

    During the first half of 1994, a number of the diagnostic systems for measurement of the charged-particle beam parameters throughout the subsystems of the Advanced Photon Source (APS) were installed and tested. The particle beams eventually will involve 450-MeV to 7-GeV positrons with different pulse formats. The first test and commissioning results for beam profiles, beam position monitors, loss rate monitors, current monitors, and synchrotron radiation photon monitors have been obtained using 200- to 350-MeV electron beams injected into the subsystems. Data presented are principally from the transport lines and the positron accumulator ring.

  6. Potential Functional Embedding Theory at the Correlated Wave Function Level. 2. Error Sources and Performance Tests.

    Science.gov (United States)

    Cheng, Jin; Yu, Kuang; Libisch, Florian; Dieterich, Johannes M; Carter, Emily A

    2017-03-14

    Quantum mechanical embedding theories partition a complex system into multiple spatial regions that can use different electronic structure methods within each, to optimize trade-offs between accuracy and cost. The present work incorporates accurate but expensive correlated wave function (CW) methods for a subsystem containing the phenomenon or feature of greatest interest, while self-consistently capturing quantum effects of the surroundings using fast but less accurate density functional theory (DFT) approximations. We recently proposed two embedding methods [for a review, see: Acc. Chem. Res. 2014 , 47 , 2768 ]: density functional embedding theory (DFET) and potential functional embedding theory (PFET). DFET provides a fast but non-self-consistent density-based embedding scheme, whereas PFET offers a more rigorous theoretical framework to perform fully self-consistent, variational CW/DFT calculations [as defined in part 1, CW/DFT means subsystem 1(2) is treated with CW(DFT) methods]. When originally presented, PFET was only tested at the DFT/DFT level of theory as a proof of principle within a planewave (PW) basis. Part 1 of this two-part series demonstrated that PFET can be made to work well with mixed Gaussian type orbital (GTO)/PW bases, as long as optimized GTO bases and consistent electron-ion potentials are employed throughout. Here in part 2 we conduct the first PFET calculations at the CW/DFT level and compare them to DFET and full CW benchmarks. We test the performance of PFET at the CW/DFT level for a variety of types of interactions (hydrogen bonding, metallic, and ionic). By introducing an intermediate CW/DFT embedding scheme denoted DFET/PFET, we show how PFET remedies different types of errors in DFET, serving as a more robust type of embedding theory.

  7. Unclassified Source Term and Radionuclide Data for Corrective Action Unit 98: Frenchman Flat Nevada Test Site, Nevada, Rev. No.: 0

    Energy Technology Data Exchange (ETDEWEB)

    Farnham, Irene

    2005-09-01

    Frenchman Flat is one of several areas of the Nevada Test Site (NTS) used for underground nuclear testing (Figure 1-1). These nuclear tests resulted in groundwater contamination in the vicinity of the underground test areas. As a result, the U.S. Department of Energy (DOE), National Nuclear Security Administration Nevada Site Office (NNSA/NSO) is currently conducting a corrective action investigation (CAI) of the Frenchman Flat underground test areas. Since 1996, the Nevada Division of Environmental Protection (NDEP) has regulated NNSA/NSO corrective actions through the "Federal Facility Agreement and Consent Order" (FFACO, 1996). Appendix VI of the FFACO agreement, "Corrective Action Strategy", was revised on December 7, 2000, and describes the processes that will be used to complete corrective actions, including those in the Underground Test Area (UGTA) Project. The individual locations covered by the agreement are known as corrective action sites (CASs), which are grouped into corrective action units (CAUs). The UGTA CASs are grouped geographically into five CAUs: Frenchman Flat, Central Pahute Mesa, Western Pahute Mesa, Yucca Flat/Climax Mine, and Rainier Mesa/Shoshone Mountain (Figure 1-1). These CAUs have distinctly different contaminant source, geologic, and hydrogeologic characteristics related to their location (FFACO, 1996). The Frenchman Flat CAU consists of 10 CASs located in the northern part of Area 5 and the southern part of Area 11 (Figure 1-1). This report documents the evaluation of the information and data available on the unclassified source term and radionuclide contamination for Frenchman Flat, CAU 98. The methodology used to estimate hydrologic source terms (HSTs) for the Frenchman Flat CAU is also documented. The HST of an underground nuclear test is the portion of the total inventory of radionuclides that is released over time into the groundwater following the test. The total residual inventory

  8. Beam diagnostic tools for the negative hydrogen ion source test facility ELISE

    International Nuclear Information System (INIS)

    Nocentini, Riccardo; Fantz, Ursel; Franzen, Peter; Froeschle, Markus; Heinemann, Bernd; Riedl, Rudolf; Ruf, Benjamin; Wuenderlich, Dirk

    2013-01-01

    Highlights: ► We present an overview of beam diagnostic tools foreseen for the new testbed ELISE. ► A sophisticated diagnostic calorimeter allows beam profile measurement. ► A tungsten wire mesh in the beam path provides a qualitative picture of the beam. ► Stripping losses and beam divergence are measured by H α Doppler shift spectroscopy. -- Abstract: The test facility ELISE, presently being commissioned at IPP, is a first step in the R and D roadmap for the RF driven ion source and extraction system of the ITER NBI system. The “half-size” ITER-like test facility includes a negative hydrogen ion source that can be operated for 1 h. ELISE is expected to extract an ion beam of 20 A at 60 kV for 10 s every 3 min, therefore delivering a total power of 1.2 MW. The extraction area has a geometry that closely reproduces the ITER design, with the same width and half the height, i.e. 1 m × 1 m. This paper presents an overview of beam diagnostic tools foreseen for ELISE. For the commissioning phase, a simple beam dump with basic diagnostic capabilities has been installed. In the second phase, the beam dump will be substituted by a more sophisticated diagnostic calorimeter to allow beam profile measurement. Additionally, a tungsten wire mesh will be introduced in the beam path to provide a qualitative picture of beam size and position. Stripping losses and beam divergence will be measured by means of H α Doppler shift spectroscopy. An absolute calibration is foreseen in order to measure beam intensity

  9. Unclassified Sources Term and Radionuclide Data for Corrective Action Unit 97: Yucca Flat/Climax Mine, Nevada Test Site, Nevada, Revision 2

    Energy Technology Data Exchange (ETDEWEB)

    Peter Martian

    2009-08-01

    This report documents the evaluation of the information and data available on the unclassified source term and radionuclide contamination for CAU 97: Yucca Flat/Climax Mine. The total residual inventory of radionuclides associated with one or more tests is known as the radiologic source term (RST). The RST is comprised of radionuclides in water, glass, or other phases or mineralogic forms. The hydrologic source term (HST) of an underground nuclear test is the portion of the total RST that is released into the groundwater over time following the test. In this report, the HST represents radionuclide release some time after the explosion and does not include the rapidly evolving mechanical, thermal, and chemical processes during the explosion. The CAU 97: Yucca Flat/Climax Mine has many more detonations and a wider variety of settings to consider compared to other CAUs. For instance, the source term analysis and evaluation performed for CAUs 101 and 102: Central and Western Pahute Mesa and CAU 98: Frenchman Flat did not consider vadose zone attenuation because many detonations were located near or below the water table. However, the large number of Yucca Flat/Climax Mine tests and the location of many tests above the water table warrant a more robust analysis of the unsaturated zone. The purpose of this report is to develop and document conceptual models of the Yucca Flat/Climax Mine HST for use in implementing source terms for the Yucca Flat/Climax Mine models. This document presents future plans to incorporate the radionuclide attenuation mechanisms due to unsaturated/multiphase flow and transport within the Yucca Flat CAU scale modeling. The important processes that influence radionuclide migration for the unsaturated and saturated tests in alluvial, volcanic, and carbonate settings are identified. Many different flow and transport models developed by Lawrence Livermore National Laboratory (LLNL) and Los Alamos National Laboratory (LANL), including original

  10. Radiation sources and technical services

    International Nuclear Information System (INIS)

    Stonek, K.; Satorie, Z.; Vyskocil, I.

    1981-01-01

    The work of the Institute's department for sealed source production is briefly described, including leak testing and surface contamination checks of sealed sources. The department also provides technical services, including the inspection of sealed sources used in medicine and geology and the repair of damaged sources. It carries out research on the mechanical and thermal strength of sealed sources and on the possibility of reprocessing used 226Ra sources. The despatch department is responsible for supplying the entire country with domestic and imported radionuclides. The department of technical services is responsible for testing imported radionuclides, for assembling materials-testing, industrial and medical irradiation devices, and for the collection and storage of low-level wastes on a national scale. (M.D.)

  11. Early rigorous control interventions can largely reduce dengue outbreak magnitude: experience from Chaozhou, China

    Directory of Open Access Journals (Sweden)

    Tao Liu

    2017-08-01

    Full Text Available Background: Dengue fever is a severe public health challenge in south China. A dengue outbreak was reported in Chaozhou city, China in 2015. Intensified interventions were implemented by the government to control the epidemic. However, it is still unknown to what degree the intensified control measures reduced the size of the epidemic, and when such measures should be initiated to reduce the risk of large dengue outbreaks developing. Methods: We selected Xiangqiao district as the study setting because the majority of the indigenous cases in Chaozhou city (90.6%) were from this district. The numbers of daily indigenous dengue cases in 2015 were collected through the national infectious diseases and vectors surveillance system, and daily Breteau Index (BI) data were reported by the local public health department. We used a compartmental dynamic SEIR (Susceptible, Exposed, Infected and Removed) model to assess the effectiveness of control interventions and to evaluate the effect of intervention timing on the dengue epidemic. Results: A total of 1250 indigenous dengue cases was reported from Xiangqiao district. SEIR modeling using BI as an indicator of the actual control interventions yielded a total of 1255 dengue cases, which is close to the reported number (n = 1250). The size and duration of the outbreak were highly sensitive to the intensity and timing of the interventions: the more rigorous and the earlier the control interventions were implemented, the more effective they were. Even when the interventions were initiated several weeks after the onset of the dengue outbreak, they still greatly reduced the prevalence and duration of the outbreak. Conclusions: This study suggests that early implementation of rigorous dengue interventions can effectively reduce the epidemic size and shorten the epidemic duration.
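
    A generic SEIR sketch of the kind of intervention-timing experiment described above is shown below; it is not the authors' dengue model (which couples human cases to vector abundance through the Breteau Index), and all parameter values are illustrative assumptions.

```python
# Generic SEIR sketch: an intervention that reduces the transmission rate beta
# after a start day shrinks the final outbreak size. All parameters are assumed.
import numpy as np
from scipy.integrate import solve_ivp

def seir_rhs(t, y, beta0, reduction, t_intervene, sigma, gamma, N):
    S, E, I, R = y
    beta = beta0 * (1.0 - reduction) if t >= t_intervene else beta0
    dS = -beta * S * I / N
    dE = beta * S * I / N - sigma * E
    dI = sigma * E - gamma * I
    dR = gamma * I
    return [dS, dE, dI, dR]

N = 500_000                              # district population (assumed)
y0 = [N - 1, 0, 1, 0]                    # one initial infectious case
# beta0, 60% transmission reduction starting day 30, 6-day incubation, 7-day infectious period
args = (0.5, 0.6, 30.0, 1 / 6, 1 / 7, N)
sol = solve_ivp(seir_rhs, (0, 200), y0, args=args)
print(f"final epidemic size: {int(N - sol.y[0, -1])} cases")
```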

  12. RF power source for the compact linear collider test facility (CTF3)

    CERN Document Server

    McMonagle, G; Brown, Peter; Carron, G; Hanni, R; Mourier, J; Rossat, G; Syratchev, I V; Tanner, L; Thorndahl, L

    2004-01-01

    The CERN CTF3 facility will test and demonstrate many vital components of CLIC (Compact Linear Collider). This paper describes the pulsed RF power source at 2998.55 MHz for the drive-beam accelerator (DBA), which produces a beam with an energy of 150 MeV and a current of 3.5 Amps. Where possible, existing equipment from the LEP preinjector, especially the modulators and klystrons, is being used and upgraded to achieve this goal. A high power RF pulse compression system is used at the output of each klystron, which requires sophisticated RF phase programming on the low level side to achieve the required RF pulse. In addition to the 3 GHz system two pulsed RF sources operating at 1.5 GHz are being built. The first is a wide-band, low power, travelling wave tube (TWT) for the subharmonic buncher (SHB) system that produces a train of "phase coded" subpulses as part of the injector scheme. The second is a high power narrow band system to produce 20 MW RF power to the 1.5 GHz RF deflectors in the delay loop situate...

  13. ERP correlates of source memory: unitized source information increases familiarity-based retrieval.

    Science.gov (United States)

    Diana, Rachel A; Van den Boom, Wijnand; Yonelinas, Andrew P; Ranganath, Charan

    2011-01-07

    Source memory tests typically require subjects to make decisions about the context in which an item was encoded and are thought to depend on recollection of details from the study episode. Although it is generally believed that familiarity does not contribute to source memory, recent behavioral studies have suggested that familiarity may also support source recognition when item and source information are integrated, or "unitized," during study (Diana, Yonelinas, and Ranganath, 2008). However, an alternative explanation of these behavioral findings is that unitization affects the manner in which recollection contributes to performance, rather than increasing familiarity-based source memory. To discriminate between these possibilities, we conducted an event-related potential (ERP) study testing the hypothesis that unitization increases the contribution of familiarity to source recognition. Participants studied associations between words and background colors using tasks that either encouraged or discouraged unitization. ERPs were recorded during a source memory test for background color. The results revealed two distinct neural correlates of source recognition: a frontally distributed positivity that was associated with familiarity-based source memory in the high-unitization condition only and a parietally distributed positivity that was associated with recollection-based source memory in both the high- and low-unitization conditions. The ERP and behavioral findings provide converging evidence for the idea that familiarity can contribute to source recognition, particularly when source information is encoded as an item detail. Copyright © 2010 Elsevier B.V. All rights reserved.

  14. A Systematic Literature Review on relationship between agile methods and Open Source Software Development methodology

    OpenAIRE

    Gandomani, Taghi Javdani; Zulzalil, Hazura; Ghani, Abdul Azim Abdul; Sultan, Abu Bakar Md

    2013-01-01

    Agile software development methods (ASD) and open source software development methods (OSSD) are two different approaches that were introduced in the last decade, and both have their fervent advocates. Yet the relation and interface between ASD and OSSD remain a fertile area in which few rigorous studies have been done. The major goal of this study was to assess the relation and integration of ASD and OSSD. Analysis of the collected data shows that ASD and OSSD are able t...

  15. Application of a generalized Leibniz rule for calculating electromagnetic fields within continuous source regions

    International Nuclear Information System (INIS)

    Silberstein, M.

    1991-01-01

    In deriving the electric and magnetic fields in a continuous source region by differentiating the vector potential, Yaghjian (1985) explains that the central obstacle is the dependence of the integration limits on the differentiation variable. Since it is not mathematically rigorous to assume the curl and integral signs are interchangeable, he uses an integration variable substitution to circumvent this problematic dependence. Here, an alternative derivation is presented, which evaluates the curl of the vector potential volume integral directly, retaining the dependence of the limits of integration on the differentiation variable. It involves deriving a three-dimensional version of Leibniz' rule for differentiating an integral with variable limits of integration, and using the generalized rule to find the Maxwellian and cavity fields in the source region. 7 refs
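
    For reference, the one-dimensional Leibniz rule that the paper generalizes to a volume integral with variable limits reads:

```latex
% One-dimensional Leibniz rule for differentiating an integral with variable
% limits; the paper derives a three-dimensional (volume-integral) analogue.
\[
\frac{d}{dx}\int_{a(x)}^{b(x)} f(x,t)\,dt
  = f\bigl(x,b(x)\bigr)\,\frac{db}{dx}
  - f\bigl(x,a(x)\bigr)\,\frac{da}{dx}
  + \int_{a(x)}^{b(x)} \frac{\partial f}{\partial x}(x,t)\,dt .
\]
```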

  16. Radiative Transfer in a Translucent Cloud Illuminated by an Extended Background Source

    Science.gov (United States)

    Biganzoli, Davide; Potenza, Marco A. C.; Robberto, Massimo

    2017-05-01

    We discuss the radiative transfer theory for translucent clouds illuminated by an extended background source. First, we derive a rigorous solution based on the assumption that multiple scatterings produce an isotropic flux. Then we derive a more manageable analytic approximation showing that it nicely matches the results of the rigorous approach. To validate our model, we compare our predictions with accurate laboratory measurements for various types of well-characterized grains, including purely dielectric and strongly absorbing materials representative of astronomical icy and metallic grains, respectively, finding excellent agreement without the need to add free parameters. We use our model to explore the behavior of an astrophysical cloud illuminated by a diffuse source with dust grains having parameters typical of the classic ISM grains of Draine & Lee and protoplanetary disks, with an application to the dark silhouette disk 114-426 in Orion Nebula. We find that the scattering term modifies the transmitted radiation, both in terms of intensity (extinction) and shape (reddening) of the spectral distribution. In particular, for small optical thickness, our results show that scattering makes reddening almost negligible at visible wavelengths. Once the optical thickness increases enough and the probability of scattering events becomes close to or larger than 1, reddening becomes present but is appreciably modified with respect to the standard expression for line-of-sight absorption. Moreover, variations of the grain refractive index, in particular the amount of absorption, also play an important role in changing the shape of the spectral transmission curve, with dielectric grains showing the minimum amount of reddening.

  17. Radiative Transfer in a Translucent Cloud Illuminated by an Extended Background Source

    Energy Technology Data Exchange (ETDEWEB)

    Biganzoli, Davide [Università degli Studi dell’Insubria Dept. of Science and High Technology Via Valleggio, 11, I-22100 Como (Italy); Potenza, Marco A. C. [Universitá degli Studi di Milano Dept. of Physics Via Celoria 16, I-20133 Milano (Italy); Robberto, Massimo, E-mail: robberto@stsci.edu [Space Telescope Science Institute Baltimore, MD 21218 (United States)

    2017-05-01

    We discuss the radiative transfer theory for translucent clouds illuminated by an extended background source. First, we derive a rigorous solution based on the assumption that multiple scatterings produce an isotropic flux. Then we derive a more manageable analytic approximation showing that it nicely matches the results of the rigorous approach. To validate our model, we compare our predictions with accurate laboratory measurements for various types of well-characterized grains, including purely dielectric and strongly absorbing materials representative of astronomical icy and metallic grains, respectively, finding excellent agreement without the need to add free parameters. We use our model to explore the behavior of an astrophysical cloud illuminated by a diffuse source with dust grains having parameters typical of the classic ISM grains of Draine and Lee and protoplanetary disks, with an application to the dark silhouette disk 114–426 in Orion Nebula. We find that the scattering term modifies the transmitted radiation, both in terms of intensity (extinction) and shape (reddening) of the spectral distribution. In particular, for small optical thickness, our results show that scattering makes reddening almost negligible at visible wavelengths. Once the optical thickness increases enough and the probability of scattering events becomes close to or larger than 1, reddening becomes present but is appreciably modified with respect to the standard expression for line-of-sight absorption. Moreover, variations of the grain refractive index, in particular the amount of absorption, also play an important role in changing the shape of the spectral transmission curve, with dielectric grains showing the minimum amount of reddening.

  18. Design And Tests Of A Superconducting Magnet With A Cryocooler For The Ion Source Decris-sc

    CERN Document Server

    Datskov, V I; Bekhterev, V V; Bogomolov, S L; Bondarenko, P G; Dmitriev, S N; Drobin, V M; Efremov, A A; Iakovlev, B I; Leporis, M; Malinowski, H; Nikiforov, S A; Paschenko, S V; Seleznev, V V; Shishov, Yu A; Tsvineva, G P; Yazvitsky, N Yu

    2004-01-01

    A superconducting magnet system (SMS) for the multicharged ion source DECRIS-SC was designed and manufactured at the Joint Institute for Nuclear Research. Successful tests of the SMS were conducted in late 2003 - early 2004. The peculiarities of this system stem from the use of a 1 W cryocooler for cryostabilization of the magnet, and from the special magnetic field configuration required by the ion source. Four coils provide a magnetic field of up to 3 T on the source axis (mirror ratio of ~6), which considerably extends the capability of the ion source to produce intense highly charged ion beams. The problem of compensating the large interaction forces between the coils and the surrounding iron yoke has been successfully solved, and a reliable suspension of the magnet in the cryostat realized. Prepreg is used for compounding the windings, which operate in vacuum with indirect cryostabilization. There has been applied a new techno...

  19. Heuristic derivation of the Rossi-alpha formula for a pulsed neutron source

    International Nuclear Information System (INIS)

    Baeten, P.

    2004-01-01

    Expressions for the Rossi-alpha distribution for a pulsed neutron source were derived using a heuristic derivation based on the method of joint detection probability. This heuristic technique was chosen over the more rigorous master equation method due to its simplicity and the complementarity of the two techniques. The derived equations also take into account the presence of delayed neutrons and intrinsic neutron sources, which often cannot be neglected in source-driven subcritical cores. The obtained expressions showed that the ratio of the correlated to the uncorrelated signal in the Rossi-Alpha distribution for a Pulsed Source (RAPS) was strongly increased compared to the case for a standard Rossi-alpha distribution for a continuous source. It was also demonstrated that by using this RAPS technique four independent measurement quantities, instead of three with the standard Rossi-alpha technique, can be determined. Hence, it is no longer necessary to combine the Rossi-alpha technique with another method to measure the reactivity expressed in dollars. Both properties, the increased signal-to-noise ratio of the correlated signal and the measurement of a fourth quantity, make the RAPS technique an excellent candidate for the measurement of kinetic parameters in source-driven subcritical assemblies.
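
    For context, the standard Rossi-alpha form for a continuous source is recalled below; the paper derives the analogous pulsed-source (RAPS) expressions, including delayed neutrons and intrinsic sources.

```latex
% Standard Rossi-alpha form for a continuous source (shown for context only):
\[
  p(\tau) \;=\; C \;+\; A\,e^{-\alpha \tau},
  \qquad
  \alpha \;=\; \frac{\beta_{\mathrm{eff}} - \rho}{\Lambda},
\]
% where C is the uncorrelated (accidental) coincidence rate, the term
% A e^{-\alpha\tau} is the correlated (chain-related) contribution, and
% \Lambda is the neutron generation time.
```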

  20. Source-Type Inversion of the September 03, 2017 DPRK Nuclear Test

    Science.gov (United States)

    Dreger, D. S.; Ichinose, G.; Wang, T.

    2017-12-01

    On September 3, 2017, the DPRK announced a nuclear test at their Punggye-ri site. This explosion registered an mb of 6.3 and was well recorded by global and regional seismic networks. We apply the source-type inversion method (e.g. Ford et al., 2012; Nayak and Dreger, 2015), and the MDJ2 seismic velocity model (Ford et al., 2009) to invert low frequency (0.02 to 0.05 Hz) complete three-component waveforms, and first-motion polarities to map the goodness of fit in source-type space. We have used waveform data from the New China Digital Seismic Network (BJT, HIA, MDJ), Korean Seismic Network (TJN), and the Global Seismograph Network (INCN, MAJO). From this analysis, the event discriminates as an explosion. For a pure explosion model, we find a scalar seismic moment of 5.77e+16 Nm (Mw 5.1); however, this model fails to fit the large Love waves registered on the transverse components. The best fitting complete solution finds a total moment of 8.90e+16 Nm (Mw 5.2) that is decomposed as 53% isotropic, 40% double-couple, and 7% CLVD, although the range of isotropic moment from the source-type analysis indicates that it could be as high as 60-80%. The isotropic moment in the source-type inversion is 4.75e16 Nm (Mw 5.05). Assuming elastic moduli from model MDJ2, the explosion cavity radius is approximately 51 m, and the yield estimated using Denny and Johnson (1991) is 246 kt. Approximately 8.5 minutes after the blast, a second seismic event was registered, which is best characterized as a vertically closing horizontal crack, perhaps representing the partial collapse of the blast cavity, and/or a service tunnel. The total moment of the collapse is 3.34e+16 Nm (Mw 4.95). The volumetric moment of the collapse is 1.91e+16 Nm, approximately 1/3 to 1/2 of the explosive moment. German TerraSAR-X observations of deformation (Wang et al., 2017) reveal large radial outward motions consistent with expected deformation for an explosive source, but lack significant vertical motions above the
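
    The quoted moment magnitudes can be reproduced from the scalar moments with the standard moment-magnitude relation (Kanamori scaling, an assumption not stated in the abstract itself); a minimal consistency check:

```python
# Consistency check of the Mw values quoted above, using the standard relation
# Mw = (2/3) * (log10 M0 - 9.1) for M0 in N*m (an assumed, conventional scaling).
import math

def moment_magnitude(m0_newton_meters: float) -> float:
    return (2.0 / 3.0) * (math.log10(m0_newton_meters) - 9.1)

for label, m0 in [("pure explosion model", 5.77e16),
                  ("best fitting full solution", 8.90e16),
                  ("isotropic part", 4.75e16),
                  ("collapse event", 3.34e16)]:
    print(f"{label:28s} M0 = {m0:.2e} N*m -> Mw = {moment_magnitude(m0):.2f}")
```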

  1. Rigorous lower bounds on the imaginary parts of the scattering amplitudes and the positions of their zeros

    CERN Document Server

    Uchiyama, T

    1974-01-01

    Rigorous lower bounds are derived from axiomatic field theory, by invoking analyticity and unitarity of the S-matrix. The bounds are expressed in terms of the total cross section and the slope parameter, and are found to be compatible with CERN experimental pp scattering data. It is also shown that the calculated lower-bound values imply non-existence of zeros for -t

  2. Rigorous approach to the comparison between experiment and theory in Casimir force measurements

    International Nuclear Information System (INIS)

    Klimchitskaya, G L; Chen, F; Decca, R S; Fischbach, E; Krause, D E; Lopez, D; Mohideen, U; Mostepanenko, V M

    2006-01-01

    In most experiments on the Casimir force the comparison between measurement data and theory was done using the concept of the root-mean-square deviation, a procedure that has been criticized in the literature. Here we propose a special statistical analysis which should be performed separately for the experimental data and for the results of the theoretical computations. In so doing, the random, systematic and total experimental errors are found as functions of separation, taking into account the distribution laws for each error at 95% confidence. Independently, all theoretical errors are combined to obtain the total theoretical error at the same confidence. Finally, the confidence interval for the differences between theoretical and experimental values is obtained as a function of separation. This rigorous approach is applied to two recent experiments on the Casimir effect

  3. Agile Acceptance Test-Driven Development of Clinical Decision Support Advisories: Feasibility of Using Open Source Software.

    Science.gov (United States)

    Basit, Mujeeb A; Baldwin, Krystal L; Kannan, Vaishnavi; Flahaven, Emily L; Parks, Cassandra J; Ott, Jason M; Willett, Duwayne L

    2018-04-13

    Moving to electronic health records (EHRs) confers substantial benefits but risks unintended consequences. Modern EHRs consist of complex software code with extensive local configurability options, which can introduce defects. Defects in clinical decision support (CDS) tools are surprisingly common. Feasible approaches to prevent and detect defects in EHR configuration, including CDS tools, are needed. In complex software systems, use of test-driven development and automated regression testing promotes reliability. Test-driven development encourages modular, testable design and expanding regression test coverage. Automated regression test suites improve software quality, providing a "safety net" for future software modifications. Each automated acceptance test serves multiple purposes, as requirements (prior to build), acceptance testing (on completion of build), regression testing (once live), and "living" design documentation. Rapid-cycle development or "agile" methods are being successfully applied to CDS development. The agile practice of automated test-driven development is not widely adopted, perhaps because most EHR software code is vendor-developed. However, key CDS advisory configuration design decisions and rules stored in the EHR may prove amenable to automated testing as "executable requirements." We aimed to establish feasibility of acceptance test-driven development of clinical decision support advisories in a commonly used EHR, using an open source automated acceptance testing framework (FitNesse). Acceptance tests were initially constructed as spreadsheet tables to facilitate clinical review. Each table specified one aspect of the CDS advisory's expected behavior. Table contents were then imported into a test suite in FitNesse, which queried the EHR database to automate testing. Tests and corresponding CDS configuration were migrated together from the development environment to production, with tests becoming part of the production regression test
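
    The "one table row = one expected advisory behavior" idea described above can be illustrated with a table-driven test in any framework; the sketch below uses pytest rather than FitNesse, and the advisory rules, patient contexts and evaluate_advisory() adapter are hypothetical stand-ins, not the study's actual configuration.

```python
# Illustrative sketch only: the study used FitNesse decision tables that query the
# EHR database. This pytest parametrization shows the same table-driven idea with
# a hypothetical, hard-coded rule standing in for the CDS engine.
import pytest

def evaluate_advisory(problem: str, medication: str) -> bool:
    # Hypothetical stand-in for a call into the CDS rule engine; a real acceptance
    # test would query the EHR configuration instead of this hard-coded rule.
    contraindicated = {("pregnancy", "warfarin"), ("renal impairment", "metformin")}
    return (problem, medication) in contraindicated

ADVISORY_TABLE = [
    # (patient_problem, ordered_medication, advisory_should_fire)
    ("atrial fibrillation", "warfarin", False),
    ("pregnancy", "warfarin", True),
    ("renal impairment", "metformin", True),
]

@pytest.mark.parametrize("problem,med,expected", ADVISORY_TABLE)
def test_cds_advisory_fires_as_specified(problem, med, expected):
    assert evaluate_advisory(problem=problem, medication=med) == expected
```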

  4. Cytogenotoxicity screening of source water, wastewater and treated water of drinking water treatment plants using two in vivo test systems: Allium cepa root based and Nile tilapia erythrocyte based tests.

    Science.gov (United States)

    Hemachandra, Chamini K; Pathiratne, Asoka

    2017-01-01

    Biological effect directed in vivo tests with model organisms are useful in assessing potential health risks associated with chemical contaminations in surface waters. This study examined the applicability of two in vivo test systems viz. plant, Allium cepa root based tests and fish, Oreochromis niloticus erythrocyte based tests for screening cytogenotoxic potential of raw source water, water treatment waste (effluents) and treated water of drinking water treatment plants (DWTPs) using two DWTPs associated with a major river in Sri Lanka. Measured physico-chemical parameters of the raw water, effluents and treated water samples complied with the respective Sri Lankan standards. In the in vivo tests, raw water induced statistically significant root growth retardation, mitodepression and chromosomal abnormalities in the root meristem of the plant and micronuclei/nuclear buds evolution and genetic damage (as reflected by comet scores) in the erythrocytes of the fish compared to the aged tap water controls signifying greater genotoxicity of the source water especially in the dry period. The effluents provoked relatively high cytogenotoxic effects on both test systems but the toxicity in most cases was considerably reduced to the raw water level with the effluent dilution (1:8). In vivo tests indicated reduction of cytogenotoxic potential in the tested drinking water samples. The results support the potential applications of practically feasible in vivo biological test systems such as A. cepa root based tests and the fish erythrocyte based tests as complementary tools for screening cytogenotoxicity potential of the source water and water treatment waste reaching downstream of aquatic ecosystems and for evaluating cytogenotoxicity eliminating efficacy of the DWTPs in different seasons in view of human and ecological safety. Copyright © 2016 Elsevier Ltd. All rights reserved.

  5. Alternative pre-rigor foreshank positioning can improve beef shoulder muscle tenderness.

    Science.gov (United States)

    Grayson, A L; Lawrence, T E

    2013-09-01

    Thirty beef carcasses were harvested and the foreshank of each side was independently positioned (cranial, natural, parallel, or caudal) 1h post-mortem to determine the effect of foreshank angle at rigor mortis on the sarcomere length and tenderness of six beef shoulder muscles. The infraspinatus (IS), pectoralis profundus (PP), serratus ventralis (SV), supraspinatus (SS), teres major (TM) and triceps brachii (TB) were excised 48 h post-mortem for Warner-Bratzler shear force (WBSF) and sarcomere length evaluations. All muscles except the SS had altered (P<0.05) sarcomere lengths between positions; the cranial position resulted in the longest sarcomeres for the SV and TB muscles whilst the natural position had longer sarcomeres for the PP and TM muscles. The SV from the cranial position had lower (P<0.05) shear than the caudal position and TB from the natural position had lower (P<0.05) shear than the parallel or caudal positions. Sarcomere length was moderately correlated (r=-0.63; P<0.01) to shear force. Copyright © 2013 Elsevier Ltd. All rights reserved.

  6. The test beamline of the European Spallation Source - Instrumentation development and wavelength frame multiplication

    DEFF Research Database (Denmark)

    Woracek, R.; Hofmann, T.; Bulat, M.

    2016-01-01

    which, in contrast, are all providing short neutron pulses. In order to enable the development of methods and technology adapted to this novel type of source well in advance of the first instruments being constructed at ESS, a test beamline (TBL) was designed and built at the BER II research reactor...... wavelength band between 1.6 A and 10 A by a dedicated wavelength frame multiplication (WFM) chopper system. WFM is proposed for several ESS instruments to allow for flexible time-of-flight resolution. Hence, ESS will benefit from the TBL which offers unique possibilities for testing methods and components....... This article describes the main capabilities of the instrument, its performance as experimentally verified during the commissioning, and its relevance to currently starting ESS instrumentation projects....

  7. Low power microwave tests on RF gun prototype of the Iranian Light Source Facility

    Directory of Open Access Journals (Sweden)

    A Sadeghipanah

    2017-08-01

    Full Text Available In this paper, we introduce RF electron gun of Iranian Light Source Facility (ILSF pre-injection system. Design, fabrication and low-power microwave tests results of the prototype RF electron gun have been described in detail. This paper also explains the tuning procedure of the prototype RF electron gun to the desired resonant frequency. The outcomes of this project brighten the path to the fabrication of the RF electron gun by the local industries  

  8. Stochastic Geometry and Quantum Gravity: Some Rigorous Results

    Science.gov (United States)

    Zessin, H.

    The aim of these lectures is a short introduction into some recent developments in stochastic geometry which have one of its origins in simplicial gravity theory (see Regge Nuovo Cimento 19: 558-571, 1961). The aim is to define and construct rigorously point processes on spaces of Euclidean simplices in such a way that the configurations of these simplices are simplicial complexes. The main interest then is concentrated on their curvature properties. We illustrate certain basic ideas from a mathematical point of view. An excellent representation of this area can be found in Schneider and Weil (Stochastic and Integral Geometry, Springer, Berlin, 2008. German edition: Stochastische Geometrie, Teubner, 2000). In Ambjørn et al. (Quantum Geometry Cambridge University Press, Cambridge, 1997) you find a beautiful account from the physical point of view. More recent developments in this direction can be found in Ambjørn et al. ("Quantum gravity as sum over spacetimes", Lect. Notes Phys. 807. Springer, Heidelberg, 2010). After an informal axiomatic introduction into the conceptual foundations of Regge's approach the first lecture recalls the concepts and notations used. It presents the fundamental zero-infinity law of stochastic geometry and the construction of cluster processes based on it. The second lecture presents the main mathematical object, i.e. Poisson-Delaunay surfaces possessing an intrinsic random metric structure. The third and fourth lectures discuss their ergodic behaviour and present the two-dimensional Regge model of pure simplicial quantum gravity. We terminate with the formulation of basic open problems. Proofs are given in detail only in a few cases. In general the main ideas are developed. Sufficiently complete references are given.

  9. Performance of the CERN plasma lens in laboratory and beam tests at the Antiproton Source

    International Nuclear Information System (INIS)

    Kowalewicz, R.; Lubrano di Scampamorte, M.; Milner, S.; Pedersen, F.; Riege, H.; Christiansen, J.; Frank, K.; Stetter, M.; Tkotz, R.; Boggasch, E.

    1991-01-01

    The CERN plasma lens is based on a dynamic z-pinch which creates during 500 ns a cylindrical plasma current conductor of 290 mm length and 38 to 45 mm diameter. The lens is designed for pulsed pinched currents of 400 kA and magnetic field gradients of 200 T/m produced with stored energies of 56 kJ. Life tests of different lens components were carried through at a repetition rate of 4.8 s/pulse. The results of the first beam tests of the plasma lens at the CERN antiproton source are very encouraging in view of other potential plasma lens applications

  10. Reduction of sources of error and simplification of the Carbon-14 urea breath test

    International Nuclear Information System (INIS)

    Bellon, M.S.

    1997-01-01

    Full text: Carbon-14 urea breath testing is established in the diagnosis of H. pylori infection. The aim of this study was to investigate possible further simplification and identification of error sources in the 14 C urea kit extensively used at the Royal Adelaide Hospital. Thirty six patients with validated H. pylon status were tested with breath samples taken at 10,15, and 20 min. Using the single sample value at 15 min, there was no change in the diagnostic category. Reduction or errors in analysis depends on attention to the following details: Stability of absorption solution, (now > 2 months), compatibility of scintillation cocktail/absorption solution. (with particular regard to photoluminescence and chemiluminescence), reduction in chemical quenching (moisture reduction), understanding counting hardware and relevance, and appropriate response to deviation in quality assurance. With this experience, we are confident of the performance and reliability of the RAPID-14 urea breath test kit now available commercially

  11. Accurate source location from waves scattered by surface topography: Applications to the Nevada and North Korean test sites

    Science.gov (United States)

    Shen, Y.; Wang, N.; Bao, X.; Flinders, A. F.

    2016-12-01

    Scattered waves generated near the source contain energy converted from the near-field waves to the far-field propagating waves, which can be used to achieve location accuracy beyond the diffraction limit. In this work, we apply a novel full-wave location method that combines a grid-search algorithm with the 3D Green's tensor database to locate the Non-Proliferation Experiment (NPE) at the Nevada test site and the North Korean nuclear tests. We use the first arrivals (Pn/Pg) and their immediate codas, which are likely dominated by waves scattered at the surface topography near the source, to determine the source location. We investigate seismograms in the frequency band of [1.0, 2.0] Hz to reduce noise in the data and highlight topography-scattered waves. High resolution topographic models constructed from 10 and 90 m grids are used for Nevada and North Korea, respectively. The reference velocity model is based on CRUST 1.0. We use the collocated-grid finite difference method on curvilinear grids to calculate the strain Green's tensor and obtain synthetic waveforms using source-receiver reciprocity. The 'best' solution is found based on the least-squares misfit between the observed and synthetic waveforms. To suppress random noise, an optimal weighting method for three-component seismograms is applied in the misfit calculation. Our results show that the scattered waves are crucial in improving resolution and allow us to obtain accurate solutions with a small number of stations. Since the scattered waves depend on topography, which is known at the wavelengths of regional seismic waves, our approach yields absolute, instead of relative, source locations. We compare our solutions with those of USGS and other studies. Moreover, we use differential waveforms to locate pairs of the North Korea tests from years 2006, 2009, 2013 and 2016 to further reduce the effects of unmodeled heterogeneities and errors in the reference velocity model.
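
    The grid-search step described above can be sketched as a weighted least-squares comparison of observed records against synthetics precomputed from a Green's tensor database; the array shapes, weighting scheme and random test data below are illustrative assumptions, not the authors' implementation.

```python
# Minimal sketch of a waveform grid search: for each candidate source node,
# compare observed and precomputed synthetic seismograms with a weighted L2
# misfit and keep the best-fitting node.
import numpy as np

def locate_by_grid_search(observed, synthetics, weights):
    """
    observed   : (n_stations, 3, n_samples) three-component records
    synthetics : (n_nodes, n_stations, 3, n_samples) precomputed for each node
    weights    : (n_stations, 3) per-trace weights (e.g., inverse noise variance)
    """
    residuals = synthetics - observed[np.newaxis, ...]        # broadcast over nodes
    misfit = np.einsum("nsct,sc->n", residuals ** 2, weights)  # weighted L2 per node
    best = int(np.argmin(misfit))
    return best, misfit

# Tiny usage example with random data: the "observations" are built from node 17.
rng = np.random.default_rng(0)
syn = rng.normal(size=(50, 4, 3, 200))                 # 50 candidate nodes, 4 stations
obs = syn[17] + 0.01 * rng.normal(size=syn[17].shape)
best_node, misfits = locate_by_grid_search(obs, syn, np.ones((4, 3)))
print("best-fitting node:", best_node)                 # expected: 17
```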

  12. Partial discharge tests and characterisation of the Advanced Photon Source linac modulator cables

    International Nuclear Information System (INIS)

    Cours, A.

    2007-01-01

    The Advanced Photon Source (APS) linac modulators are PFN-type pulsers with switch-mode charging power supplies (PSs). The PS and the PFN are connected to each other by 15 feet of 100-kV x-ray cable, with the PFN end of the cable terminated with a connector that was confirmed partial-discharge (PD)-free up to 38 kV ac (53.5 kV peak). Another end of the cable is terminated with a connector that was designed by the PS manufacturer and cannot easily be replaced with another type of connector, since part of it is located inside the densely packed PS. PD tests of the cables with this type of connector show that the PD inception voltages (PDIVs) in different cables turn out to be located within a wide voltage range: 21 to 27 kV ac that corresponds to 29 to 38 kV peak. In order to evaluate the insulation condition of the modulator cables, detect insulation deterioration, and ensure failure-preventing equipment maintenance, over the last two years the PDIVs of all high-voltage (HV) cables in use in the modulators have been tested about every three and a half months. Before the tests, all cables were removed from the equipment, carefully cleaned, inspected, and regreased. The tests were performed using a 40-kV PD detector. The test results show that: (1) the PDIVs remain almost unchanged in all tested cables; (2) from test to test, the PDIV of any particular cable may slightly oscillate around some average value, which possibly depends on the connector regreasing technique; (3) there is no direct evidence of cable insulation deterioration during more than two years of operation under voltage higher than the PD inception level.

  13. Rigorous Results for the Distribution of Money on Connected Graphs

    Science.gov (United States)

    Lanchier, Nicolas; Reed, Stephanie

    2018-05-01

    This paper is concerned with general spatially explicit versions of three stochastic models for the dynamics of money that have been introduced and studied numerically by statistical physicists: the uniform reshuffling model, the immediate exchange model and the model with saving propensity. All three models consist of systems of economical agents that consecutively engage in pairwise monetary transactions. Computer simulations performed in the physics literature suggest that, when the number of agents and the average amount of money per agent are large, the limiting distribution of money as time goes to infinity approaches the exponential distribution for the first model, the gamma distribution with shape parameter two for the second model and a distribution similar but not exactly equal to a gamma distribution whose shape parameter depends on the saving propensity for the third model. The main objective of this paper is to give rigorous proofs of these conjectures and also extend these conjectures to generalizations of the first two models and a variant of the third model that include local rather than global interactions, i.e., instead of choosing the two interacting agents uniformly at random from the system, the agents are located on the vertex set of a general connected graph and can only interact with their neighbors.
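
    A minimal mean-field simulation of the uniform reshuffling model (without the spatial graph structure analyzed in the paper) illustrates the conjectured exponential limit; the run parameters below are arbitrary choices.

```python
# Mean-field Monte Carlo sketch of the uniform reshuffling model: two agents are
# picked at random and their pooled money is split uniformly between them. The
# empirical wealth distribution approaches an exponential with the same mean,
# as the physics conjecture discussed in the paper states.
import numpy as np

rng = np.random.default_rng(1)
n_agents, mean_money, n_steps = 5_000, 10.0, 500_000
money = np.full(n_agents, mean_money)

for _ in range(n_steps):
    i, j = rng.choice(n_agents, size=2, replace=False)
    pot = money[i] + money[j]
    u = rng.uniform()                     # uniform split of the pooled amount
    money[i], money[j] = u * pot, (1.0 - u) * pot

# For an exponential distribution with mean m, the variance equals m**2.
print("mean:", money.mean(), "variance:", money.var(),
      "(exponential target:", mean_money ** 2, ")")
```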

  14. Which Interventions Have the Greatest Effect on Student Learning in Sub-Saharan Africa? "A Meta-Analysis of Rigorous Impact Evaluations"

    Science.gov (United States)

    Conn, Katharine

    2014-01-01

    In the last three decades, there has been a large increase in the number of rigorous experimental and quasi-experimental evaluations of education programs in developing countries. These impact evaluations have taken place all over the globe, including a large number in Sub-Saharan Africa (SSA). The fact that the developing world is socially and…

  15. Leakage of caesium braquitherapy sources

    International Nuclear Information System (INIS)

    Lozada, J.A.

    1998-01-01

    In several Venezuelan public hospitals where cervix uteri tumours are treated by intracavitary radiotherapy using the manual afterloading Fletcher method with Caesium-137 sources, a combination of factors led to serious problems: the use of improper source holders, locally manufactured from pieces of drainage plastic tubing, which deteriorated and created a corrosive environment around the sources; disregard of the manufacturer's recommendations regarding corrosion, source storage, inspection and testing; violation of International Atomic Energy Agency radiation protection procedures; and lack of proper regulatory control. These resulted in integrity damage to about sixty special form sources (ISO 2919 C 63322), leakage of Cs-137 from a supposedly insoluble refractory active content (caesium silicoaluminate), and contamination of applicators, floors and bedding. When the situation was detected by means of removal contamination tests after routine inspections, the sources were removed from the hospitals, decontaminated by immersion in 3% EDTA solution in an ultrasonic bath, and subjected to leakage assessment tests; the ones that passed were placed in low-cost stainless steel source holders, designed and built by the Instituto Venezolano de Investigaciones Cientificas (IVIC), and returned to the hospitals. The leaking sources were removed from use and treated as radioactive waste. In order to avoid the occurrence of similar situations, all importers of such sources are now required to send them to IVIC for testing and placement in proper source holders before they are shipped to the hospitals. (author)

  16. Challenges in Regulating Radiation Sources and Associated Waste Management

    International Nuclear Information System (INIS)

    Shehzad, A.

    2016-01-01

    Radiation sources are widely used in medicine, industry, agriculture, research, etc. Owing to the inherent risk of exposure to ionizing radiation while using radiation sources and managing the associated waste, safety measures, including robust regulatory control, are of utmost importance. The Pakistan Nuclear Regulatory Authority (PNRA) is responsible for supervising all matters pertaining to nuclear safety and radiation protection in the country. Since its inception, PNRA has made rigorous efforts to regulate radiation facilities, for which the regulatory framework was further strengthened by taking into account international norms and practices and was subsequently implemented. However, owing to the widespread use of these facilities, numerous challenges are faced while implementing the regulatory framework. These challenges pertain to the shielding design of some facilities, control over service providers for QC/repair maintenance of radiation equipment, assessment of patient doses, and establishment of national diagnostic reference levels for radiological procedures. Further, the regulatory framework also delineates requirements to keep the generation of associated radioactive waste as low as practicable. The requirements also necessitate that certain sealed radioactive sources (SRS) be returned to the supplier upon completion of their useful life, while other radioactive sources must be transported for storage at designated radioactive waste storage facilities in the country, which requires commitment from the licensee. This paper briefly describes the challenges in regulating radiation sources and the issues related to waste management associated with these facilities. (author)

  17. Effects of Source RDP Models and Near-source Propagation: Implication for Seismic Yield Estimation

    Science.gov (United States)

    Saikia, C. K.; Helmberger, D. V.; Stead, R. J.; Woods, B. B.

    It has proven difficult to uniquely untangle the source and propagation effects on the observed seismic data from underground nuclear explosions, even when large quantities of near-source, broadband data are available for analysis. This leads to uncertainties in our ability to quantify the nuclear seismic source function and, consequently, in the accuracy of seismic yield estimates for underground explosions. Extensive deterministic modeling analyses of the seismic data recorded from underground explosions at a variety of test sites have been conducted over the years, and the results of these studies suggest that variations in the seismic source characteristics between test sites may contribute to the observed differences in the magnitude/yield relations applicable at those sites. This adds to the uncertainty in seismic yield estimates for explosions at previously uncalibrated test sites. In this paper we review issues involving the relationship of Nevada Test Site (NTS) source scaling laws to those at other sites. The Joint Verification Experiment (JVE) indicates that a magnitude (mb) bias (δmb) exists between the Semipalatinsk test site (STS) in the former Soviet Union (FSU) and the Nevada Test Site (NTS) in the United States. Generally this δmb is attributed to differential attenuation in the upper mantle beneath the two test sites. This assumption results in rather large yield estimates for large-mb tunnel shots at Novaya Zemlya. A re-examination of the US testing experiments suggests that this δmb bias can partly be explained by anomalous NTS (Pahute) source characteristics. This interpretation is based on the modeling of US events at a number of test sites. Using a modified Haskell source description, we investigated the influence of the source Reduced Displacement Potential (RDP) parameters ψ∞, K and B by fitting short- and long-period data simultaneously, including the near-field body and surface waves. In general
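
    For readers unfamiliar with the RDP parameterization, the short sketch below illustrates one common modified-Haskell form, ψ(t) = ψ∞[1 − e^(−Kt)(1 + Kt + (Kt)²/2 − B(Kt)³)], in the style of Helmberger and Hadley. This is not the authors' code, and the exact functional form and parameter values used in the paper may differ; the numbers below are placeholders chosen only to show how ψ∞ sets the long-term level while K and B shape the time history.

```python
# Minimal sketch of a modified-Haskell reduced displacement potential (RDP).
# Assumed Helmberger-Hadley-style form (may differ from the paper's):
#   psi(t) = psi_inf * [1 - exp(-K t) * (1 + K t + (K t)^2 / 2 - B (K t)^3)]
import numpy as np

def rdp(t, psi_inf, K, B):
    """Reduced displacement potential psi(t) for t >= 0."""
    kt = K * t
    return psi_inf * (1.0 - np.exp(-kt) * (1.0 + kt + 0.5 * kt**2 - B * kt**3))

def rdp_rate(t, psi_inf, K, B, dt=1e-4):
    """Numerical time derivative of psi(t); the far-field P-wave displacement
    pulse scales with this rate function."""
    return (rdp(t + dt, psi_inf, K, B) - rdp(t - dt, psi_inf, K, B)) / (2.0 * dt)

if __name__ == "__main__":
    t = np.linspace(0.0, 2.0, 2001)          # seconds
    psi_inf, K, B = 1.0e10, 8.0, 2.0         # placeholder values, not from the paper
    psi = rdp(t, psi_inf, K, B)
    print("psi(t -> large)/psi_inf ~", psi[-1] / psi_inf)   # approaches 1
    t_peak = t[np.argmax(np.abs(rdp_rate(t, psi_inf, K, B)))]
    print("peak |d(psi)/dt| near t =", round(float(t_peak), 3), "s")
```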

  18. Applicability and interpretation of fracture test methods for metals

    International Nuclear Information System (INIS)

    Langford, W.J.

    1978-05-01

    Fracture tests are usually conducted out of a conviction (sometimes only vaguely defined) that they will guarantee a certain level of protection from metal failure. Qualitative tests, such as the Charpy V-notch, produce results which cannot be rigorously related to a measure of fracture tolerance; rather, they indicate metal quality so that fracture tolerance may be inferred. Quantitative tests, on the other hand, provide parameters which may be used directly in equations to determine the likelihood of fracture. Both types of test have limitations which should be understood: the paper tries to provide guidance on the relative merits of either approach for a particular purpose, and gives an insight into near-future test methods which will extend the range of usefulness of quantitative tests. (author)
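
    As a hedged illustration of how a quantitative fracture parameter is "used directly in equations" (this example is not taken from the paper), the sketch below applies the standard linear-elastic fracture mechanics criterion K_I = Yσ√(πa) ≥ K_IC. The geometry factor, applied stress, crack depth, and toughness values are illustrative placeholders only.

```python
# Minimal sketch: comparing an applied stress intensity factor against a
# measured fracture toughness K_IC (standard LEFM criterion, not from the paper).
import math

def stress_intensity(sigma_mpa: float, a_m: float, Y: float = 1.0) -> float:
    """Mode-I stress intensity K_I in MPa*sqrt(m) for crack depth a_m (m)
    under remote stress sigma_mpa (MPa); Y is a geometry correction factor."""
    return Y * sigma_mpa * math.sqrt(math.pi * a_m)

def fracture_predicted(sigma_mpa: float, a_m: float, k_ic: float, Y: float = 1.0) -> bool:
    """True if the applied K_I reaches or exceeds the material toughness K_IC."""
    return stress_intensity(sigma_mpa, a_m, Y) >= k_ic

if __name__ == "__main__":
    # Illustrative numbers: 200 MPa applied stress, 5 mm crack, Y = 1.12,
    # K_IC = 60 MPa*sqrt(m) (placeholder toughness, not a measured value).
    k_applied = stress_intensity(200.0, 0.005, Y=1.12)
    print(f"K_I = {k_applied:.1f} MPa*sqrt(m); fracture predicted:",
          fracture_predicted(200.0, 0.005, k_ic=60.0, Y=1.12))
```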

  19. Analysis on Dangerous Source of Large Safety Accident in Storage Tank Area

    Science.gov (United States)

    Wang, Tong; Li, Ying; Xie, Tiansheng; Liu, Yu; Zhu, Xueyuan

    2018-01-01

    The difference between a large safety accident and a general accident is that the consequences of a large safety accident are particularly serious. This study examines which factors in the storage tank area directly or indirectly lead to the occurrence of large safety accidents. Drawing on the three kinds of hazard source theory and on consequence-cause analysis of major safety accidents, the paper analyzes the hazard sources of major safety accidents in the tank area from four aspects: energy sources, direct accident causes, missing management, and environmental impact. Based on the analysis of the three kinds of hazard sources and of the environment, the main risk factors are derived and an AHP evaluation model is established; after rigorous and scientific calculation, the weights of the four categories of risk factors and of the factors within each category are obtained. The results of the analytic hierarchy process show that management causes are the most important, followed by environmental factors, direct causes, and energy sources. It should be noted that although the direct causes are of relatively low overall importance, the direct-cause factors "failure of emergency measures" and "failure of prevention and control facilities" carry greater weight.
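
    As a hedged illustration of the weighting step (the pairwise-comparison judgments below are placeholders, not the matrices used in the paper), the sketch computes AHP criterion weights from a reciprocal comparison matrix via the principal eigenvector and checks Saaty's consistency ratio. The placeholder judgments are chosen only to reproduce the qualitative ranking reported above (management > environment > direct causes > energy sources).

```python
# Minimal AHP sketch: principal-eigenvector weights and consistency ratio.
# The comparison matrix below is illustrative, not the authors' data.
import numpy as np

RI = {1: 0.0, 2: 0.0, 3: 0.58, 4: 0.90, 5: 1.12}  # Saaty's random index

def ahp_weights(A: np.ndarray):
    """Return (weights, consistency_ratio) for a reciprocal comparison matrix A."""
    eigvals, eigvecs = np.linalg.eig(A)
    k = int(np.argmax(eigvals.real))
    w = np.abs(eigvecs[:, k].real)
    w /= w.sum()
    n = A.shape[0]
    lam_max = eigvals[k].real
    ci = (lam_max - n) / (n - 1)          # consistency index
    return w, ci / RI[n]                  # consistency ratio CR = CI / RI

if __name__ == "__main__":
    # Criteria order: management, environment, direct causes, energy sources.
    A = np.array([
        [1.0, 3.0, 5.0, 5.0],
        [1/3, 1.0, 3.0, 3.0],
        [1/5, 1/3, 1.0, 2.0],
        [1/5, 1/3, 1/2, 1.0],
    ])
    w, cr = ahp_weights(A)
    print("weights:", np.round(w, 3), "CR:", round(cr, 3))  # CR < 0.1 is acceptable
```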

  20. Campylobacter species in animal, food, and environmental sources, and relevant testing programs in Canada.

    Science.gov (United States)

    Huang, Hongsheng; Brooks, Brian W; Lowman, Ruff; Carrillo, Catherine D

    2015-10-01

    Campylobacter species, particularly thermophilic campylobacters, have emerged as a leading cause of human foodborne gastroenteritis worldwide, with Campylobacter jejuni, Campylobacter coli, and Campylobacter lari responsible for the majority of human infections. Although most cases are self-limiting, campylobacteriosis represents a significant public health burden. Human illness caused by infection with campylobacters has been reported across Canada since the early 1970s. Many studies have shown that dietary sources, including food, particularly raw poultry and other meat products, raw milk, and contaminated water, have contributed to outbreaks of campylobacteriosis in Canada. Campylobacter spp. have also been detected in a wide range of animal and environmental sources, including water, in Canada. The purpose of this article is to review (i) the prevalence of Campylobacter spp. in animals, food, and the environment, and (ii) the relevant testing programs in Canada, with a focus on the potential links between campylobacters and human health in Canada.